A Look at Modus and the Future of Model-Native Apps - Part II
Modus Local Development
In Part I we scratched the surface of Modus and how its Model-Native apps concept aims to shift our designs by embedding AI/ML models as foundational components when developing intelligent APIs. Now we're going to create a deliberately simple GraphQL endpoint that receives a request and, based on its content, responds with an over-the-top sarcastic excuse.
Let's go through the high-level flow of this Modus app, the gist of which can be summarized in the following screenshot:
- This is a Modus app, and it's running locally, as you can tell
- It has a GraphQL Query type
- Along with a generateExcuses method that returns a String and takes a parameter called event, which in this case appears to be an invite to a wedding of some sort
- In the response portion of the UI, we see the result of this call: a couple of legit excuses
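The call shown in the screenshot corresponds to a GraphQL query along these lines (the event string here is illustrative, not the exact one from the screenshot):

```graphql
query {
  generateExcuses(event: "Alice and Bob's wedding, Saturday at 4pm")
}
```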
Behind the scenes
As we mentioned in Part I of this series, modus.json serves as the manifest of your Modus app:
{
  "$schema": "https://schema.hypermode.com/modus.json",
  "endpoints": {
    "default": {
      "type": "graphql",
      "path": "/graphql",
      "auth": "bearer-token"
    }
  },
  "connections": {
    "openai": {
      "type": "http",
      "baseUrl": "https://api.openai.com/",
      "headers": {
        "Authorization": "Bearer {{API_KEY}}"
      }
    }
  },
  "models": {
    "llm": {
      "sourceModel": "gpt-4o",
      "connection": "openai",
      "path": "v1/chat/completions"
    }
  }
}
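The {{API_KEY}} placeholder is resolved from an environment variable at runtime rather than being hard-coded in the manifest. In local development, Modus reads it from a .env.dev.local file in the project root, with the variable name following the MODUS_&lt;CONNECTION-NAME&gt;_API_KEY convention (check this against the Modus docs for your SDK version):

```shell
# .env.dev.local (keep out of source control)
MODUS_OPENAI_API_KEY=sk-...
```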
and our index.ts, which exports our AssemblyScript function generateExcuses() and makes it available in our app's generated API:
import { models } from "@hypermode/modus-sdk-as";
import {
  OpenAIChatModel,
  SystemMessage,
  UserMessage,
} from "@hypermode/modus-sdk-as/models/openai/chat";

export function generateExcuses(event: string): string {
  const modelName: string = "llm";
  const model = models.getModel<OpenAIChatModel>(modelName);

  const prompt = `Generate 2 absurd, sarcastic, over-the-top and dark excuses for why I can't attend "${event}".
Make them elaborate, ridiculous, and completely unbelievable.
Each excuse should be at least 2 sentences long.
Format the response as a JSON array of strings, with each excuse as a separate element.`;

  const input = model.createInput([
    new SystemMessage(
      "You are a creative, dark and sarcastic excuse generator. Your excuses should be outlandish and humorous."
    ),
    new UserMessage(prompt),
  ]);

  // Use a higher temperature for more creative responses.
  input.temperature = 0.9;

  const response = model.invoke(input);
  return response.choices[0].message.content.trim();
}
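Since the prompt asks the model to return a JSON array of strings, a caller consuming this field will typically parse it before display. Here is a minimal client-side sketch in TypeScript; parseExcuses and the sample payload are illustrative, not part of the app, and the fallback accounts for the model occasionally returning plain prose instead of valid JSON:

```typescript
// Parse the JSON-array-of-strings payload that generateExcuses asks the model for.
// Falls back to wrapping the raw text if the model returned something else.
function parseExcuses(raw: string): string[] {
  try {
    const parsed: unknown = JSON.parse(raw);
    if (Array.isArray(parsed) && parsed.every((e) => typeof e === "string")) {
      return parsed as string[];
    }
  } catch {
    // Not valid JSON; fall through to the fallback below.
  }
  return [raw];
}

const sample =
  '["My pet rock is having an existential crisis.","My shadow filed a restraining order."]';
console.log(parseExcuses(sample).length); // 2
```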
To run the app locally, we issue:
npx modus dev
and we will see output like the following:

Your local endpoint is ready!
GraphQL (default): http://localhost:8686/graphql
View endpoint: http://localhost:8686/explorer

You can then use either of these endpoints to interact with your app.
The diagram below depicts the high-level interaction and flow of this simple Modus app:
The codebase is available here. Follow its README to set up your local environment.
In Part III, we will cover importing our Modus app into the Hypermode platform. While local development offers more immediate control and testing capabilities, deploying to Hypermode provides a more robust, production-ready environment with features for managing, securing, and observing your app.
All opinions are my own.