It takes only a few minutes if you know what you are doing. Otherwise, it may take a frustrating half day to a full day. The key is preparation, just like with any cooking recipe.
Prerequisites
You will need Deno installed, as well as Ollama if you want to run the example locally. Otherwise, you can connect to OpenAI's endpoint, but you will need a paid API key.
Once the above is installed, run these commands to verify:
seandotau@aseandotaus-MBP ~ % deno --version
deno 2.1.6 (stable, release, aarch64-apple-darwin)
v8 13.0.245.12-rusty
typescript 5.6.2
seandotau@aseandotaus-MBP ~ % ollama --version
ollama version is 0.5.11
Deno
Once Deno itself is installed, run the following command to install the subql-ai CLI:
deno install -g -f --allow-env --allow-net --allow-import --allow-read --allow-write --allow-ffi --allow-run --unstable-worker-options -n subql-ai jsr:@subql/ai-app-framework/cli
A breakdown of each part of the command:

- deno install: installs a script or tool.
- -g: stands for "global"; the tool is installed globally and can be run from anywhere.
- -f: forces the install, overwriting the tool even if a version is already installed.
- --allow-env: grants the script permission to access environment variables.
- --allow-net: grants the script permission to access the network.
- --allow-import: allows the script to import modules or other files.
- --allow-read: grants the script permission to read files from the file system.
- --allow-write: grants the script permission to write files to the file system.
- --allow-ffi: allows the script to use foreign function interface (FFI) calls to interact with native code.
- --allow-run: allows the script to run subprocesses.
- --unstable-worker-options: enables experimental worker options (Deno may evolve, and this flag may not always exist).
- -n subql-ai: the name the installed tool or binary is given (subql-ai in this case).
- jsr:@subql/ai-app-framework/cli: the package being installed. The jsr: prefix tells Deno to fetch the package (@subql/ai-app-framework/cli) from the JSR registry.
Ollama
See the installation instructions here.
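Once Ollama itself is installed, the standard Ollama CLI commands below pull and verify the model used later in this guide (they require a running local Ollama installation):

```shell
# Download the llama3.2 model used in this tutorial
ollama pull llama3.2

# Confirm the model is now available locally
ollama list
```
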
Create a new application
Run subql-ai init, providing a project name, then select the LLM model. You should have downloaded a model when installing Ollama. If not, download one by following the instructions here and restart this process. I will use llama3.2.
seandotau@aseandotaus-MBP subquery % subql-ai init
Enter a project name: testai
Enter a LLM model llama3.2
You should now have a folder structure like the following:
Here is the tricky part. Update the manifest file to:
endpoints: ["localhost:11434"],
This is the endpoint that Ollama serves the local llama3.2 model on.
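For context, the endpoints entry lives inside the manifest object that subql-ai init generates. The sketch below shows roughly where it sits; apart from endpoints, the field names and values are assumptions based on the generated template, so keep whatever init produced for your project:

```typescript
// manifest.ts — illustrative sketch only; keep the fields your
// subql-ai init template generated and change just `endpoints`.
const project = {
  model: "llama3.2", // the Ollama model chosen during init
  entry: "./project.ts", // assumed entry point from the template
  endpoints: ["localhost:11434"], // Ollama's default local API address
};

export default project;
```
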

Running the application
Assuming you already have Ollama running in the background and the correct model version downloaded, run the application via: subql-ai -p ./manifest.ts. You should see:
seandotau@aseandotaus-MBP testai % subql-ai -p ./manifest.ts
(12:54:33.357) INFO (app): Subql AI Framework (0.0.5)
✔ Loaded project manifest
✔ Loaded project source
Listening on http://0.0.0.0:7827/
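The server is now listening on port 7827. As far as I know, the framework exposes an OpenAI-compatible API, so you should also be able to query it directly with curl rather than the repl; the route and payload below are a sketch based on that assumption:

```shell
# Sketch: query the running app, assuming it exposes an
# OpenAI-compatible chat completions route.
curl http://localhost:7827/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{ "role": "user", "content": "hi" }]
  }'
```
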
Now, in another terminal, run: subql-ai repl. You should see:
seandotau@aseandotaus-MBP testai % subql-ai repl
Special messages:
/bye: to exit (ctrl + c also works)
/clear: to remove all previous chat history
Enter a message: hi
That was quick! It seems like I have the user's name already. Let me try something a bit more personalized.
What's your favorite hobby or interest?
Enter a message:

Summary
There you have it. You can run the SubQuery AI App Framework with a local instance of Ollama within minutes. There are other options for running the application, such as via Docker or connecting to OpenAI, which I will explore in another post.