Class: LlamaChatSession

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:519

See: the Using `LlamaChatSession` tutorial

Constructors

Constructor

```ts
new LlamaChatSession(options: LlamaChatSessionOptions): LlamaChatSession;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:535

Parameters

| Parameter | Type |
| --- | --- |
| `options` | `LlamaChatSessionOptions` |

Returns

LlamaChatSession
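As a sketch of typical construction (the model path is a placeholder; API names per the node-llama-cpp documentation, but treat the details as an assumption rather than a canonical recipe):

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Load a model and derive a context sequence for the session to use.
// "path/to/model.gguf" is a placeholder; substitute a local GGUF file.
const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"});
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});
```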

Properties

onDispose

```ts
readonly onDispose: EventRelay<void>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:533

Accessors

disposed

Get Signature

```ts
get disposed(): boolean;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:592

Returns

boolean


chatWrapper

Get Signature

```ts
get chatWrapper(): ChatWrapper;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:596

Returns

ChatWrapper


sequence

Get Signature

```ts
get sequence(): LlamaContextSequence;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:603

Returns

LlamaContextSequence


context

Get Signature

```ts
get context(): LlamaContext;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:610

Returns

LlamaContext


model

Get Signature

```ts
get model(): LlamaModel;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:614

Returns

LlamaModel

Methods

dispose()

```ts
dispose(__namedParameters?: {
    disposeSequence?: boolean;
}): void;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:577

Parameters

| Parameter | Type |
| --- | --- |
| `__namedParameters` | `{ disposeSequence?: boolean }` |
| `__namedParameters.disposeSequence?` | `boolean` |

Returns

void


prompt()

```ts
prompt<Functions>(prompt: string, options?: LLamaChatPromptOptions<Functions>): Promise<string>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:618

Type Parameters

| Type Parameter | Default type |
| --- | --- |
| `Functions` extends `ChatSessionModelFunctions \| undefined` | `undefined` |

Parameters

| Parameter | Type |
| --- | --- |
| `prompt` | `string` |
| `options?` | `LLamaChatPromptOptions<Functions>` |

Returns

Promise<string>
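A minimal usage sketch, assuming an already constructed `LlamaChatSession` named `session`; the option names used here (`maxTokens`, `onTextChunk`) are per `LLamaChatPromptOptions`, but treat the combination as illustrative:

```ts
import type {LlamaChatSession} from "node-llama-cpp";

// `session` is assumed to be a constructed LlamaChatSession.
declare const session: LlamaChatSession;

// Ask a question and await the full response text.
const answer = await session.prompt("What is the capital of France?");
console.log(answer);

// With options: bound the generation length and stream chunks as they arrive.
const bounded = await session.prompt("Summarize the plot of Hamlet.", {
    maxTokens: 128,
    onTextChunk(chunk) {
        process.stdout.write(chunk); // stream text as it is generated
    }
});
```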


promptWithMeta()

```ts
promptWithMeta<Functions>(prompt: string, options?: LLamaChatPromptOptions<Functions>): Promise<
    | {
        response: (string | ChatModelFunctionCall | ChatModelSegment)[];
        responseText: string;
        stopReason: "customStopTrigger";
        customStopTrigger: (string | Token)[];
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
    | {
        customStopTrigger?: undefined;
        response: (string | ChatModelFunctionCall | ChatModelSegment)[];
        responseText: string;
        stopReason: "abort" | "maxTokens" | "eogToken" | "stopGenerationTrigger" | "functionCalls";
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:669

Type Parameters

| Type Parameter | Default type |
| --- | --- |
| `Functions` extends `ChatSessionModelFunctions \| undefined` | `undefined` |

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `prompt` | `string` | - |
| `options?` | `LLamaChatPromptOptions<Functions>` | - |

Returns

```ts
Promise<
    | {
        response: (string | ChatModelFunctionCall | ChatModelSegment)[];
        responseText: string;
        stopReason: "customStopTrigger";
        customStopTrigger: (string | Token)[];
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
    | {
        customStopTrigger?: undefined;
        response: (string | ChatModelFunctionCall | ChatModelSegment)[];
        responseText: string;
        stopReason: "abort" | "maxTokens" | "eogToken" | "stopGenerationTrigger" | "functionCalls";
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
>
```
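The result is a discriminated union on `stopReason`: `customStopTrigger` is only present (and only typed as defined) in the branch where `stopReason` is `"customStopTrigger"`. A minimal sketch of narrowing on it, using abbreviated local stand-in types rather than the library's full result shape:

```ts
// Abbreviated local stand-ins for the documented result union (illustration only).
type PromptMeta =
    | {
        responseText: string;
        stopReason: "customStopTrigger";
        customStopTrigger: string[];
    }
    | {
        responseText: string;
        stopReason: "abort" | "maxTokens" | "eogToken" | "stopGenerationTrigger" | "functionCalls";
        customStopTrigger?: undefined;
    };

// Checking `stopReason` narrows the union, so `customStopTrigger`
// is accessible without an assertion inside that branch.
function describeStop(meta: PromptMeta): string {
    if (meta.stopReason === "customStopTrigger")
        return `stopped by custom trigger: ${meta.customStopTrigger.join("")}`;

    return `stopped due to: ${meta.stopReason}`;
}
```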


preloadPrompt()

```ts
preloadPrompt(prompt: string, options?: LLamaChatPreloadPromptOptions): Promise<void>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:963

Preload a user prompt into the current context sequence state to make later inference of the model response begin sooner and feel faster.

Note: Preloading a long user prompt can incur context shifts, so consider limiting the length of prompts you preload.

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `prompt` | `string` | the prompt to preload |
| `options?` | `LLamaChatPreloadPromptOptions` | - |

Returns

Promise<void>
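For example, preloading a draft while the user is still typing, so the eventual response starts sooner (a sketch; `session` is assumed to be a constructed `LlamaChatSession`):

```ts
import type {LlamaChatSession} from "node-llama-cpp";

// `session` is assumed to be a constructed LlamaChatSession.
declare const session: LlamaChatSession;

const draftText = "Tell me about llamas";

// Preload the draft so inference can begin sooner once the user submits.
await session.preloadPrompt(draftText);

// The subsequent prompt call reuses the preloaded context sequence state.
const response = await session.prompt(draftText);
```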


completePrompt()

```ts
completePrompt(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<string>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:981

Preload a user prompt into the current context sequence state and generate a completion for it.

Note: Preloading a long user prompt and completing a user prompt with a high number of maxTokens can incur context shifts, so consider limiting the length of prompts you preload.

Also, it's recommended to limit the number of tokens generated to a reasonable amount by configuring maxTokens.

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `prompt` | `string` | the prompt to preload |
| `options?` | `LLamaChatCompletePromptOptions` | - |

Returns

Promise<string>
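A sketch of suggesting a completion for a partially typed prompt, with `maxTokens` bounded as the note above recommends (`session` is assumed; the option name is per `LLamaChatCompletePromptOptions`):

```ts
import type {LlamaChatSession} from "node-llama-cpp";

// `session` is assumed to be a constructed LlamaChatSession.
declare const session: LlamaChatSession;

// Suggest a completion for what the user has typed so far.
// Bounding maxTokens keeps the generation short and avoids context shifts.
const completion = await session.completePrompt("How do I cook", {
    maxTokens: 24
});
console.log("Suggestion:", completion);
```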


createPromptCompletionEngine()

```ts
createPromptCompletionEngine(options?: LLamaChatPromptCompletionEngineOptions): LlamaChatSessionPromptCompletionEngine;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:994

Create a smart completion engine that caches prompt completions and reuses them when the user prompt matches the beginning of a cached prompt or completion.

Completions are generated against, and the cache is used for, only the current chat session state, so a single completion engine can serve an entire chat session.

Parameters

| Parameter | Type |
| --- | --- |
| `options?` | `LLamaChatPromptCompletionEngineOptions` |

Returns

LlamaChatSessionPromptCompletionEngine
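A sketch of wiring the engine to a typing UI; `session` is assumed, and the `onGeneration` option name and the engine's `complete()` method are taken from the node-llama-cpp documentation but should be treated as assumptions here:

```ts
import type {LlamaChatSession} from "node-llama-cpp";

// `session` is assumed to be a constructed LlamaChatSession.
declare const session: LlamaChatSession;

// One engine per session; it caches completions against the current
// chat session state.
const completionEngine = session.createPromptCompletionEngine({
    onGeneration(prompt, completion) {
        // Called when a completion becomes available for the given prompt,
        // e.g. to update an inline suggestion in the UI.
        console.log(`completion for "${prompt}": ${completion}`);
    }
});

// `complete()` returns whatever cached completion is already available
// and schedules background generation for the rest.
const suggestion = completionEngine.complete("Hi there! How");
```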


completePromptWithMeta()

```ts
completePromptWithMeta(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<
    | {
        completion: string;
        stopReason: "customStopTrigger";
        customStopTrigger: (string | Token)[];
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
    | {
        customStopTrigger?: undefined;
        completion: string;
        stopReason: "abort" | "maxTokens" | "eogToken" | "stopGenerationTrigger" | "functionCalls";
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
>;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:1003

See completePrompt for more information.

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `prompt` | `string` | - |
| `options?` | `LLamaChatCompletePromptOptions` | - |

Returns

```ts
Promise<
    | {
        completion: string;
        stopReason: "customStopTrigger";
        customStopTrigger: (string | Token)[];
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
    | {
        customStopTrigger?: undefined;
        completion: string;
        stopReason: "abort" | "maxTokens" | "eogToken" | "stopGenerationTrigger" | "functionCalls";
        remainingGenerationAfterStop: string | Token[] | undefined;
    }
>
```


getChatHistory()

```ts
getChatHistory(): ChatHistoryItem[];
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:1237

Returns

ChatHistoryItem[]


getLastEvaluationContextWindow()

```ts
getLastEvaluationContextWindow(): ChatHistoryItem[] | null;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:1241

Returns

ChatHistoryItem[] | null


setChatHistory()

```ts
setChatHistory(chatHistory: ChatHistoryItem[]): void;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:1248

Parameters

| Parameter | Type |
| --- | --- |
| `chatHistory` | `ChatHistoryItem[]` |

Returns

void
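A common pattern is snapshotting the history with `getChatHistory()` and restoring it later with `setChatHistory()` to continue a conversation (a sketch; `session` is assumed to be a constructed `LlamaChatSession`):

```ts
import type {LlamaChatSession} from "node-llama-cpp";

// `session` is assumed to be a constructed LlamaChatSession.
declare const session: LlamaChatSession;

// Snapshot the current history, e.g. to persist it as JSON.
const savedHistory = session.getChatHistory();

// ...later, restore it onto a session to continue where the chat left off.
session.setChatHistory(savedHistory);
```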


resetChatHistory()

```ts
resetChatHistory(): void;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:1256

Clear the chat history and reset it to the initial state.

Returns

void