diff --git a/options.xhtml b/options.xhtml
index b2953a096..c8c173ec1 100644
--- a/options.xhtml
+++ b/options.xhtml
@@ -70148,6 +70148,171 @@ package

services.ollama.enable

Whether to enable ollama server for local large language models.

Type: boolean

Default: false

Example: true

Declared by:
<home-manager/modules/services/ollama.nix>
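A minimal Home Manager configuration using just this option might look as follows (a sketch; the remaining services.ollama.* options documented in this section keep their defaults):

```nix
{
  # Run the ollama server as a user-level systemd service,
  # listening on the default 127.0.0.1:11434.
  services.ollama.enable = true;
}
```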
services.ollama.package

The ollama package to use.

Type: package

Default: pkgs.ollama

Declared by:
<home-manager/modules/services/ollama.nix>
services.ollama.acceleration

What interface to use for hardware acceleration.

Type: null or one of false, "rocm", "cuda"

Default: null

Example: "rocm"

Declared by:
<home-manager/modules/services/ollama.nix>
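For example, on an AMD GPU the acceleration backend can be pinned to ROCm (a sketch; "cuda" would be the NVIDIA equivalent, and the default null leaves the choice to the package):

```nix
{
  services.ollama = {
    enable = true;
    # One of null, false, "rocm" or "cuda".
    acceleration = "rocm";
  };
}
```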
services.ollama.environmentVariables

Set arbitrary environment variables for the ollama service.

Be aware that these are only seen by the ollama server (systemd service), not normal invocations like ollama run. Since ollama run is mostly a shell around the ollama server, this is usually sufficient.

Type: attribute set of string

Default: { }

Example:

{
  HIP_VISIBLE_DEVICES = "0,1";
  OLLAMA_LLM_LIBRARY = "cpu";
}

Declared by:
<home-manager/modules/services/ollama.nix>
services.ollama.host

The host address which the ollama server HTTP interface listens to.

Type: string

Default: "127.0.0.1"

Example: "[::]"

Declared by:
<home-manager/modules/services/ollama.nix>
services.ollama.port

Which port the ollama server listens to.

Type: 16 bit unsigned integer; between 0 and 65535 (both inclusive)

Default: 11434

Example: 11111

Declared by:
<home-manager/modules/services/ollama.nix>
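As a sketch of how the host and port options combine, the following (assumed) configuration moves the server off the loopback default and onto a non-default port:

```nix
{
  services.ollama = {
    enable = true;
    # Listen on every interface instead of loopback only; note that this
    # exposes the HTTP API to the local network.
    host = "0.0.0.0";
    # Any port in 0-65535; 11434 is the default.
    port = 11111;
  };
}
```

Clients such as ollama run find the server through the OLLAMA_HOST environment variable, so a non-default host/port generally needs to be mirrored there for command-line use.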
services.opensnitch-ui.enable