
Function: combineModelDownloaders()

ts
function combineModelDownloaders(
    downloaders: (ModelDownloader | Promise<ModelDownloader>)[],
    options?: CombinedModelDownloaderOptions
): Promise<CombinedModelDownloader>

Defined in: utils/createModelDownloader.ts:192

Combines multiple model downloaders into a single downloader that downloads everything with as much parallelism as possible.

You can check each individual model downloader for its download progress, but only the onProgress handler passed to the combined downloader is called during the download.

When combining ModelDownloader instances, the following options on each individual ModelDownloader are ignored:

  • showCliProgress
  • onProgress
  • parallelDownloads

To set any of those options for the combined downloader, you have to pass them to the combined downloader instance.
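The aggregation behavior described above can be illustrated with a self-contained sketch. This is a conceptual model only, not the real node-llama-cpp API: the MockDownloader type, makeMockDownloader, and combineMockDownloaders are invented for illustration, showing why only the combined onProgress handler fires while each downloader still tracks its own progress.

```typescript
// Conceptual sketch (not the real node-llama-cpp API): each mock downloader
// tracks its own progress, but only the combined onProgress handler is called.
type Progress = {totalSize: number, downloadedSize: number};

type MockDownloader = {
    totalSize: number,
    downloadedSize: number,
    download(onChunk: () => void): Promise<void>
};

function makeMockDownloader(totalSize: number): MockDownloader {
    const downloader: MockDownloader = {
        totalSize,
        downloadedSize: 0,
        async download(onChunk) {
            // Simulate downloading in fixed-size chunks
            while (downloader.downloadedSize < totalSize) {
                downloader.downloadedSize += Math.min(10, totalSize - downloader.downloadedSize);
                onChunk();
            }
        }
    };
    return downloader;
}

// Run all downloads in parallel and report progress aggregated across all of them
async function combineMockDownloaders(
    downloaders: MockDownloader[],
    onProgress: (progress: Progress) => void
) {
    const report = () => onProgress({
        totalSize: downloaders.reduce((sum, d) => sum + d.totalSize, 0),
        downloadedSize: downloaders.reduce((sum, d) => sum + d.downloadedSize, 0)
    });
    await Promise.all(downloaders.map((d) => d.download(report)));
}

const downloaders = [makeMockDownloader(30), makeMockDownloader(50)];
const seen: Progress[] = [];
await combineMockDownloaders(downloaders, (p) => seen.push(p));
console.log(seen.at(-1)); // final report covers the combined total of all downloads
```

The key design point: the combined reporter always sums sizes across every downloader, which is why per-downloader onProgress options would produce fragmented, overlapping reports and are therefore ignored by the real combined downloader.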

Parameters

Parameter     Type
downloaders   (ModelDownloader | Promise<ModelDownloader>)[]
options?      CombinedModelDownloaderOptions

Returns

Promise<CombinedModelDownloader>

Example

typescript
import {fileURLToPath} from "url";
import path from "path";
import {createModelDownloader, combineModelDownloaders, getLlama} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

const downloaders = [
    createModelDownloader({
        modelUri: "https://example.com/model1.gguf",
        dirPath: path.join(__dirname, "models")
    }),
    createModelDownloader({
        modelUri: "hf:user/model2:quant",
        dirPath: path.join(__dirname, "models")
    }),
    createModelDownloader({
        modelUri: "hf:user/model/model3.gguf",
        dirPath: path.join(__dirname, "models")
    })
];
const combinedDownloader = await combineModelDownloaders(downloaders, {
    showCliProgress: true // show download progress in the CLI
});
const [model1Path, model2Path, model3Path] = await combinedDownloader.download();

const llama = await getLlama();
const model1 = await llama.loadModel({
    modelPath: model1Path!
});
const model2 = await llama.loadModel({
    modelPath: model2Path!
});
const model3 = await llama.loadModel({
    modelPath: model3Path!
});