ollama in podman for ai
  $ podman exec -it ollama ollama run llama2
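
For reference, the ollama container is assumed here to have been started with the API port published; image name, port and volume path below are the upstream defaults, not taken from this section:

  $ podman run -d --name ollama -p 11434:11434 -v ollama:/root/.ollama docker.io/ollama/ollama

With the port published on the host, other machines such as a Home Assistant instance can reach the Ollama API on port 11434.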
  
You can add the Ollama integration to Home Assistant. If you run into connection issues, check your firewall configuration. If you do not use the default config, make sure "conversation:" and "assist_pipeline:" are in configuration.yaml before trying to set up assist pipelines.
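
With "default_config:" these components are already loaded; otherwise, a minimal sketch of the extra configuration.yaml entries looks like this:

  # configuration.yaml (only needed when default_config: is not used)
  conversation:
  assist_pipeline:

For the firewall part, on a firewalld-based host the usual fix is to allow TCP port 11434 (Ollama's default API port), e.g. firewall-cmd --permanent --add-port=11434/tcp followed by firewall-cmd --reload; adapt this to whatever firewall your distribution uses.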
  
TODO: GPU support. Need to use nvidia drivers first...