docs/15_endpoint_apis/02_ollama_endpoint.ipynb
+1 -1 (1 addition & 1 deletion)
@@ -9,7 +9,7 @@
 "[Ollama](https://ollama.com/download) is a tool that downloads models to our computer and allows us to run them locally. Before executing the following code, you need to run `ollama run llama3:8b` once, if you didn't do this during setup. Also, depending on how you installed ollama, you may have to execute it in a terminal window using this command, before executing this notebook:\n",
 "\n",
 "```\n",
-"ollam serve\n",
+"ollama serve\n",
 "```\n",
 "\n",
 "As you will see, we access the local models offered via ollama using the OpenAI API as shown before. We just exchange the `base_url` and we do not need to provide an API-Key."