  
ollama create is used to create a model from a Modelfile.
<code>
ollama create mymodel -f ./Modelfile
</code>
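
A Modelfile declares the base model plus any parameters and system prompt to bake into the new model. A minimal sketch (the base model, parameter value, and system prompt here are illustrative, not required):
<code>
# Modelfile: build a custom model on top of a locally available base model
FROM llama3.2

# Sampling temperature; higher values give more creative output (illustrative value)
PARAMETER temperature 0.7

# System prompt embedded in the resulting model
SYSTEM "You are a concise technical assistant."
</code>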

Pull a model
<code>
ollama pull llama3.2
</code>
This command can also be used to update a local model. Only the diff will be pulled.
  
Remove a model
<code>
ollama rm llama3.2
</code>

Copy a model
<code>
ollama cp llama3.2 my-model
</code>

Multiline input

For multiline input, you can wrap text with """:
<code>
>>> """Hello,
... world!
... """
I'm a basic program that prints the famous "Hello, world!" message to the console.
</code>

Multimodal models
<code>
ollama run llava "What's in this image? /Users/jmorgan/Desktop/smile.png"
</code>
Output: The image features a yellow smiley face, which is likely the central focus of the picture.
  
Pass the prompt as an argument
<code>
ollama run llama3.2 "Summarize this file: $(cat README.md)"
</code>
Output: Ollama is a lightweight, extensible framework for building and running language models on the local machine. It provides a simple API for creating, running, and managing models, as well as a library of pre-built models that can be easily used in a variety of applications.
  
Show model information
<code>
ollama show llama3.2
</code>

List models on your computer
<code>
ollama list
</code>

List which models are currently loaded
<code>
ollama ps
</code>

Stop a model which is currently running
<code>
ollama stop llama3.2
</code>

Start Ollama

ollama serve is used when you want to start ollama without running the desktop application.
<code>
ollama serve
</code>
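
When starting the server manually this way, you can also set the bind address inline for a single session rather than system-wide (a sketch; 0.0.0.0 listens on all interfaces, as discussed in the next section):
<code>
OLLAMA_HOST=0.0.0.0 ollama serve
</code>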

===== How to Run Ollama and Connect to the Service API Through Internal Network or Internet =====

Setting Environment Variables on Linux

If Ollama is run as a systemd service, environment variables should be set using systemctl:

Edit the Ollama Service File: Open the Ollama service configuration file with the following command:

    sudo systemctl edit ollama.service

Add the Environment Variable: In the editor, add the following lines under the [Service] section:

    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"

Note #1: Sometimes, 0.0.0.0 does not work due to your environment setup. Instead, you can try setting it to your local IP address, such as 10.0.0.x, or a hostname such as xxx.local.
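
For example, binding to one specific interface instead of all of them (10.0.0.5 is an illustrative address; substitute your machine's own):

    [Service]
    Environment="OLLAMA_HOST=10.0.0.5"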

Note #2: You should put this above the line that reads "### Lines below this comment will be discarded". It should look something like this:

    ### Editing /etc/systemd/system/ollama.service.d/override.conf
    ### Anything between here and the comment below will become the new contents of the file
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    ### Lines below this comment will be discarded
    ### /etc/systemd/system/ollama.service
    # [Unit]
    # Description=Ollama Service
    # After=network-online.target
    #
    # [Service]
    # ExecStart=/usr/local/bin/ollama serve
    # User=ollama
    # Group=ollama
    # Restart=always
    # RestartSec=3
    # Environment="PATH=/home/kimi/.nvm/versions/node/v20.5.0/bin:/home/kimi/.local/share/pnpm:/usr/local/sbin:/usr/local/bin:/usr/s>
    #
    # [Install]
    # WantedBy=default.target

Restart the Service: After editing the file, reload the systemd daemon and restart the Ollama service:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama
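
Once the service is back up, you can verify that the API is reachable over the network. A minimal check from another machine, assuming the server sits at the illustrative address 10.0.0.5 and Ollama's default port 11434:

    curl http://10.0.0.5:11434/api/generate -d '{
      "model": "llama3.2",
      "prompt": "Why is the sky blue?",
      "stream": false
    }'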
  
  