# `source download` command
Download a release of `llama.cpp` and compile it.
> **NOTE:** `node-llama-cpp` ships with a git bundle of the release of `llama.cpp` it was built with, so when you run the `source download` command without specifying a specific release or repo, it will use the bundled git bundle instead of downloading the release from GitHub.
>
> This is useful for building from source on machines that aren't connected to the internet.
> **INFO:** If the build fails on macOS with the error `"/usr/bin/cc" is not able to compile a simple test program`, try running `xcode-select --install` to install the Xcode command line tools.
## Programmatically calling the `source download` command in your code
To programmatically call this command in your code, call the `DownloadLlamaCppCommand` function:
```typescript
import {DownloadLlamaCppCommand} from "node-llama-cpp/commands";

await DownloadLlamaCppCommand({});
```
> **Note:** The `node-llama-cpp/commands` import is subject to change and is unsupported inside Electron.
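The options object can presumably also carry the same settings as the CLI flags listed under Options below; the property names in this sketch (`release`, `gpu`) are assumptions based on those flags, so check the typings of `DownloadLlamaCppCommand` in your installed version:

```typescript
import {DownloadLlamaCppCommand} from "node-llama-cpp/commands";

// Sketch only: property names are assumed to mirror the CLI flags
// (--release, --gpu) and may differ between versions.
await DownloadLlamaCppCommand({
    release: "latest", // download the latest llama.cpp release from GitHub
    gpu: "cuda"        // assumed value; the CLI default is "auto"
});
```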
## Usage
```shell
npx --no node-llama-cpp source download
```
## Options

| Option | Description |
| --- | --- |
| `--repo [string]` | The GitHub repository to download a release of `llama.cpp` from. Can also be set via the `NODE_LLAMA_CPP_REPO` environment variable (default: `ggerganov/llama.cpp`) (string) |
| `--release [string]` | The tag of the `llama.cpp` release to download. Set to `latest` to download the latest release. Can also be set via the `NODE_LLAMA_CPP_REPO_RELEASE` environment variable (default: `<current build>`) (string) |
| `-a [string]`, `--arch [string]` | The architecture to compile `llama.cpp` for (string) |
| `-t [string]`, `--nodeTarget [string]` | The Node.js version to compile `llama.cpp` for. Example: `v18.0.0` (string) |
| `--gpu [string]` | Compute layer implementation type to use for `llama.cpp` (default: `auto`) (string) |
| `--skipBuild`, `--sb` | Skip building `llama.cpp` after downloading it (default: `false`) (boolean) |
| `--noBundle`, `--nb` | Download a `llama.cpp` release only from GitHub, even if a local git bundle exists for the release (default: `false`) (boolean) |
| `--noUsageExample`, `--nu` | Don't print a code usage example after building (default: `false`) (boolean) |
| `-h`, `--help` | Show help |
| `-v`, `--version` | Show version number |
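For example, the flags above can be combined in a single invocation; note that `cuda` is used here only as an illustrative `--gpu` value (the default is `auto`):

```shell
# Download the latest llama.cpp release from GitHub and build it
# with an explicitly chosen compute layer ("cuda" is an example value)
npx --no node-llama-cpp source download --release latest --gpu cuda

# Download only, skipping the build step
npx --no node-llama-cpp source download --skipBuild
```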
To set custom cmake options that are supported by `llama.cpp`'s cmake build, set an environment variable of the option prefixed with `NODE_LLAMA_CPP_CMAKE_OPTION_`.
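For example (assuming `GGML_CUDA` is a cmake option supported by the `llama.cpp` version you're building):

```shell
# Sets the GGML_CUDA cmake option to "1" for the llama.cpp build
NODE_LLAMA_CPP_CMAKE_OPTION_GGML_CUDA=1 npx --no node-llama-cpp source download
```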