
download command

Download a release of llama.cpp and compile it

NOTE

node-llama-cpp ships with a git bundle of the llama.cpp release it was built with, so when you run the download command without specifying a release or repo, it uses the bundled git bundle instead of downloading the release from GitHub.

This is useful for building from source on machines that aren't connected to the internet.
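For example, the default invocation below uses the bundled release, while the `--noBundle` flag (documented under Options below) forces a download from GitHub even when a bundled release is available:

```shell
# uses the bundled git bundle when no repo or release is specified
npx --no node-llama-cpp download

# forces downloading the release from GitHub, ignoring the local git bundle
npx --no node-llama-cpp download --noBundle
```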

INFO

If the build fails on macOS with the error `"/usr/bin/cc" is not able to compile a simple test program`, try running `xcode-select --install` to install the Xcode Command Line Tools.

Usage

```shell
npx --no node-llama-cpp download
```
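For example, to download and build the latest llama.cpp release from GitHub instead of the bundled one (see the `--release` option below):

```shell
# downloads and builds the latest llama.cpp release instead of the bundled one
npx --no node-llama-cpp download --release latest
```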

Options

| Option | Description |
| --- | --- |
| `--repo [string]` | The GitHub repository to download a release of llama.cpp from. Can also be set via the `NODE_LLAMA_CPP_REPO` environment variable (default: `ggerganov/llama.cpp`) (string) |
| `--release [string]` | The tag of the llama.cpp release to download. Set to `"latest"` to download the latest release. Can also be set via the `NODE_LLAMA_CPP_REPO_RELEASE` environment variable (default: `<current build>`) (string) |
| `-a [string], --arch [string]` | The architecture to compile llama.cpp for (string) |
| `-t [string], --nodeTarget [string]` | The Node.js version to compile llama.cpp for. Example: `v18.0.0` (string) |
| `--metal` | Compile llama.cpp with Metal support. Enabled by default on macOS. Can be disabled with `--no-metal`. Can also be set via the `NODE_LLAMA_CPP_METAL` environment variable (default: `true`) (boolean) |
| `--cuda` | Compile llama.cpp with CUDA support. Can also be set via the `NODE_LLAMA_CPP_CUDA` environment variable (default: `false`) (boolean) |
| `--skipBuild, --sb` | Skip building llama.cpp after downloading it (default: `false`) (boolean) |
| `--noBundle, --nb` | Download a llama.cpp release only from GitHub, even if a local git bundle exists for the release (default: `false`) (boolean) |
| `-h, --help` | Show help |
| `-v, --version` | Show version number |
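Options can be combined in a single invocation. A sketch using the documented flags (the version shown is the example value from the table above):

```shell
# downloads the latest release and compiles it with CUDA support for Node.js v18.0.0
npx --no node-llama-cpp download --release latest --cuda --nodeTarget v18.0.0
```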

To set custom cmake options that are supported by llama.cpp's cmake build, set an environment variable named after the option, prefixed with `NODE_LLAMA_CPP_CMAKE_OPTION_`.
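For example, for a hypothetical cmake option named `EXAMPLE_OPTION` (substitute a real option supported by your llama.cpp release, and use whatever value format that option expects):

```shell
# EXAMPLE_OPTION is a placeholder; replace it with an actual llama.cpp cmake option name
NODE_LLAMA_CPP_CMAKE_OPTION_EXAMPLE_OPTION=ON npx --no node-llama-cpp download
```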