From a7a55fe074dae3b66f18779a49db002873947529 Mon Sep 17 00:00:00 2001
From: teto
Date: Fri, 10 Jan 2025 11:31:56 +0000
Subject: [PATCH] deploy: 2532b500c3ed2b8940e831039dcec5a5ea093afc

---
 options.xhtml | 165 ++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 165 insertions(+)

diff --git a/options.xhtml b/options.xhtml
index b2953a096..c8c173ec1 100644
--- a/options.xhtml
+++ b/options.xhtml
@@ -70148,6 +70148,171 @@ package

+

services.ollama.enable

Whether to enable the ollama server for local large language models.

Type: boolean

Default: false

Example: true

Declared by:
<home-manager/modules/services/ollama.nix>
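
A minimal sketch of turning the server on from a Home Manager configuration
(the surrounding module skeleton is assumed, not mandated by the option):

    { config, pkgs, ... }: {
      # Runs ollama as a user-level systemd service; by default it
      # listens on 127.0.0.1:11434 (see the host and port options below).
      services.ollama.enable = true;
    }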

services.ollama.package

The ollama package to use.

Type: package

Default: pkgs.ollama

Declared by:
<home-manager/modules/services/ollama.nix>
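
Recent nixpkgs revisions also ship GPU-specific variants; assuming the pinned
nixpkgs provides pkgs.ollama-rocm (check your revision), the package can be
swapped out directly:

    { pkgs, ... }: {
      services.ollama = {
        enable = true;
        # Assumption: pkgs.ollama-rocm exists in the pinned nixpkgs.
        package = pkgs.ollama-rocm;
      };
    }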

services.ollama.acceleration

What interface to use for hardware acceleration.

  • null: default behavior
    • if nixpkgs.config.rocmSupport is enabled, uses "rocm"
    • if nixpkgs.config.cudaSupport is enabled, uses "cuda"
    • otherwise defaults to false
  • false: disable GPU, use only the CPU
  • "rocm": supported by most modern AMD GPUs
    • may require overriding the GPU type with services.ollama.rocmOverrideGfx
      if ROCm does not detect your AMD GPU (see the sketch after this entry)
  • "cuda": supported by most modern NVIDIA GPUs

Type: null or one of false, “rocm”, “cuda”

Default: null

Example: "rocm"

Declared by:
<home-manager/modules/services/ollama.nix>
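
A sketch of forcing ROCm acceleration together with the GFX override mentioned
above; the "10.3.0" value is purely illustrative (it matches some RDNA 2
cards), and the right value depends on your GPU:

    {
      services.ollama = {
        enable = true;
        acceleration = "rocm";
        # Hypothetical override for a GPU that ROCm misdetects.
        rocmOverrideGfx = "10.3.0";
      };
    }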

services.ollama.environmentVariables

Set arbitrary environment variables for the ollama service.

Be aware that these are only seen by the ollama server (systemd service),
not by normal invocations such as ollama run. Since ollama run is mostly a
shell around the ollama server, this is usually sufficient.

Type: attribute set of string

Default: { }

Example:

{
  HIP_VISIBLE_DEVICES = "0,1";
  OLLAMA_LLM_LIBRARY = "cpu";
}

Declared by:
<home-manager/modules/services/ollama.nix>
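
For example, to keep models in memory longer and get verbose server logs
(OLLAMA_KEEP_ALIVE and OLLAMA_DEBUG are upstream ollama variables, not part
of this module):

    {
      services.ollama = {
        enable = true;
        environmentVariables = {
          # Keep a model loaded for 10 minutes after the last request.
          OLLAMA_KEEP_ALIVE = "10m";
          # Verbose logging, useful when debugging acceleration issues.
          OLLAMA_DEBUG = "1";
        };
      };
    }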

services.ollama.host

The host address on which the ollama server HTTP interface listens.

Type: string

Default: "127.0.0.1"

Example: "[::]"

Declared by:
<home-manager/modules/services/ollama.nix>
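
Note that the API is unauthenticated, so binding beyond loopback exposes it
to the network; a sketch for a trusted LAN only:

    {
      services.ollama = {
        enable = true;
        # "0.0.0.0" listens on every IPv4 interface; the default stays
        # on loopback only.
        host = "0.0.0.0";
      };
    }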

services.ollama.port

The port on which the ollama server listens.

Type: 16 bit unsigned integer; between 0 and 65535 (both inclusive)

Default: 11434

Example: 11111

Declared by:
<home-manager/modules/services/ollama.nix>
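
Moving the server to the example port, plus a quick liveness check (curl and
the /api/tags endpoint come from upstream ollama, not from this module):

    {
      services.ollama = {
        enable = true;
        port = 11111;
      };
    }

After a home-manager switch, the API should answer locally:

    curl http://127.0.0.1:11111/api/tags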
services.opensnitch-ui.enable