# `pull` command
Download models from URLs
A wrapper around `ipull` that downloads model files as fast as possible, using parallel connections and other optimizations.
It automatically handles split and binary-split model files, so pass only the URI of the first file of a model.
If a file already exists and its size matches the expected size, it will not be downloaded again unless the `--override` flag is used.
The supported URI schemes are:
- HTTP: `https://`, `http://`
- Hugging Face: `hf:<user>/<model>:<quant>` (`:<quant>` is optional, but recommended)
- Hugging Face: `hf:<user>/<model>/<file-path>#<branch>` (`#<branch>` is optional)
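For example, pulling a model by each kind of URI might look like this (the repository names and URLs below are placeholders, not real models):

```shell
# Pull by a direct HTTP URL (placeholder URL)
npx --no node-llama-cpp pull "https://example.com/some-model.gguf"

# Pull from Hugging Face by user/model and quantization (placeholder repo)
npx --no node-llama-cpp pull "hf:someUser/someModel-GGUF:Q4_K_M"

# Pull a specific file from a specific branch (placeholder repo)
npx --no node-llama-cpp pull "hf:someUser/someModel-GGUF/model.gguf#main"
```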
Learn more about using model URIs in the Downloading Models guide.
To programmatically download a model file in your code, use `createModelDownloader()`.
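A minimal sketch of programmatic downloading with `createModelDownloader()`; the model URI is a placeholder, and the exact option names shown (`modelUri`, `dirPath`) are based on the library's documented API, so check the Downloading Models guide for your installed version:

```typescript
import path from "path";
import {fileURLToPath} from "url";
import {createModelDownloader} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Create a downloader for a model URI (placeholder URI below)
// and save it under ./models next to this script
const downloader = await createModelDownloader({
    modelUri: "hf:<user>/<model>:<quant>",
    dirPath: path.join(__dirname, "models")
});

// Start the download; resolves with the path to the downloaded model file
const modelPath = await downloader.download();
console.log("Model saved to", modelPath);
```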
## Usage
```shell
npx --no node-llama-cpp pull [urls..]
```
## Options
### Required
| Option | Description |
|---|---|
| `--urls [string]`, `--url [string]`, `--uris [string]`, `--uri [string]` | A `.gguf` model URI to pull. Pass multiple URIs to download multiple models at once. (`string[]`) (required) |
### Optional
| Option | Description |
|---|---|
| `-H [string]`, `--header [string]` | Headers to use when downloading a model from a URL, in the format `key: value`. You can pass this option multiple times to add multiple headers. (`string[]`) |
| `-o`, `--override` | Override existing model files (default: `false`) (`boolean`) |
| `--noProgress` | Do not show a progress bar while downloading (default: `false`) (`boolean`) |
| `--noTempFile`, `--noTemp` | Delete the temporary file when canceling the download (default: `false`) (`boolean`) |
| `-d [string]`, `--directory [string]`, `--dir [string]` | Directory to save the model to (default: `~/.node-llama-cpp/models`) (`string`) |
| `-n [string]`, `--filename [string]`, `--name [string]` | Filename to save the model as. Can only be used if a single URL is passed (`string`) |
| `-p <number>`, `--parallel <number>` | Maximum parallel downloads (default: `4`) (`number`) |
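Putting a few of these options together, a combined invocation might look like this (the URIs are placeholders):

```shell
# Download two models to a custom directory, allowing up to
# 8 parallel downloads, without showing a progress bar
npx --no node-llama-cpp pull \
  --dir ./models \
  --parallel 8 \
  --noProgress \
  "hf:<user>/<model>:<quant>" "https://example.com/another-model.gguf"
```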
### Other
| Option | Description |
|---|---|
| `-h`, `--help` | Show help |
| `-v`, `--version` | Show version number |