<code>
ollama create mymodel -f ./Modelfile
</code>
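
For reference, a Modelfile is a plain-text recipe for building a custom model; a minimal sketch (the base model, parameter value, and system prompt below are illustrative):

<code>
# Start from an existing local model
FROM llama3.2
# Sampling temperature; higher values are more creative
PARAMETER temperature 1
# System prompt baked into the new model
SYSTEM """You are a concise assistant. Answer in one or two sentences."""
</code>

Running ollama run mymodel then starts the customized model.
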
Pull a model
<code>
ollama pull llama3.2
</code>
This command can also be used to update a local model. Only the diff will be pulled.
  
Remove a model
<code>
ollama rm llama3.2
</code>
Copy a model
<code>
ollama cp llama3.2 my-model
</code>
Multiline input
<code>
>>> """Hello,
... world!
... """
I'm a basic program that prints the famous "Hello, world!" message to the console.
</code>
Multimodal models
<code>
ollama run llava "What's in this image? /Users/jmorgan/Desktop/smile.png"
</code>

Start Ollama
<code>
ollama serve
</code>
ollama serve is used when you want to start ollama without running the desktop application.
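
If you start the server manually rather than through the desktop application or a service, you can expose it on the network for a single run by setting the address inline (a minimal sketch using the same OLLAMA_HOST variable described in the next section):

<code>
OLLAMA_HOST=0.0.0.0 ollama serve
</code>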

===== How to Run Ollama and Connect to the Service API Through an Internal Network or the Internet =====

==== Setting Environment Variables on Linux ====

If Ollama is run as a systemd service, environment variables should be set using systemctl:

Edit the Ollama Service File: Open an override for the service configuration with the following command:

    sudo systemctl edit ollama.service

Add the Environment Variable: In the editor, add the following lines under the [Service] section:

    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"

Note #1: If 0.0.0.0 does not work in your environment, try setting OLLAMA_HOST to the machine's local IP address (for example, 10.0.0.x) or its hostname (for example, xxx.local) instead.
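
For example, to bind Ollama to one specific interface instead of all of them (10.0.0.5 below is a placeholder; substitute your server's actual address):

    [Service]
    Environment="OLLAMA_HOST=10.0.0.5"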

Note #2: Put these lines above the marker ### Lines below this comment will be discarded; anything after that marker is ignored. The override file should look something like this:

    ### Editing /etc/systemd/system/ollama.service.d/override.conf
    ### Anything between here and the comment below will become the new contents of the file
    [Service]
    Environment="OLLAMA_HOST=0.0.0.0"
    ### Lines below this comment will be discarded
    ### /etc/systemd/system/ollama.service
    # [Unit]
    # Description=Ollama Service
    # After=network-online.target
    #
    # [Service]
    # ExecStart=/usr/local/bin/ollama serve
    # User=ollama
    # Group=ollama
    # Restart=always
    # RestartSec=3
    # Environment="PATH=/home/kimi/.nvm/versions/node/v20.5.0/bin:/home/kimi/.local/share/pnpm:/usr/local/sbin:/usr/local/bin:/usr/s>
    #
    # [Install]
    # WantedBy=default.target

Restart the Service: After editing the file, reload the systemd daemon and restart the Ollama service:

    sudo systemctl daemon-reload
    sudo systemctl restart ollama
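
Once the service is listening on the network interface, you can verify that the API is reachable from another machine. A quick check (10.0.0.5 is a placeholder for your server's address; Ollama's API listens on port 11434 by default):

    curl http://10.0.0.5:11434/api/tags

A successful response is a JSON list of the models available on the server.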
  
===== Learn More =====
For more detailed information and tutorials, visit [[https://ollama.com/|Ollama's official website]] or check out this [[https://www.youtube.com/watch?v=wxyDEqR4KxM|video overview]].
  