# Class: LlamaChatSession

## Constructors

### new LlamaChatSession()

```ts
new LlamaChatSession(options: LlamaChatSessionOptions): LlamaChatSession
```

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `options` | `LlamaChatSessionOptions` |

#### Returns

`LlamaChatSession`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:303
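
#### Example

A minimal construction sketch, following the `getLlama()` → `loadModel()` → `createContext()` flow from node-llama-cpp; the model path is hypothetical:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "./models/my-model.gguf" // hypothetical path to a local GGUF model
});
const context = await model.createContext();

// a chat session operates on a single sequence of the context
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});
```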

## Properties

### onDispose

```ts
readonly onDispose: EventRelay<void>;
```

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:301

## Accessors

### disposed

```ts
get disposed(): boolean
```

#### Returns

`boolean`

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:360

### chatWrapper

```ts
get chatWrapper(): ChatWrapper
```

#### Returns

`ChatWrapper`

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:364

### sequence

```ts
get sequence(): LlamaContextSequence
```

#### Returns

`LlamaContextSequence`

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:371

### context

```ts
get context(): LlamaContext
```

#### Returns

`LlamaContext`

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:378

### model

```ts
get model(): LlamaModel
```

#### Returns

`LlamaModel`

#### Defined in

evaluator/LlamaChatSession/LlamaChatSession.ts:382

## Methods

### dispose()

```ts
dispose(__namedParameters: {
    disposeSequence: boolean;
}): void
```

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `__namedParameters` | `object` |
| `__namedParameters.disposeSequence?` | `boolean` |

#### Returns

`void`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:345
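
#### Example

A short sketch, assuming `session` was created as in the constructor example above; going by the option's name, `disposeSequence` controls whether the underlying context sequence is disposed along with the session:

```ts
// release the session; keep its context sequence usable for other consumers
session.dispose({disposeSequence: false});
console.log(session.disposed); // true
```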


### prompt()

```ts
prompt<Functions>(prompt: string, options: LLamaChatPromptOptions<Functions>): Promise<string>
```

#### Type Parameters

| Type Parameter | Default type |
| ------ | ------ |
| `Functions` *extends* `undefined` \| `ChatSessionModelFunctions` | `undefined` |

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `prompt` | `string` |
| `options` | `LLamaChatPromptOptions<Functions>` |

#### Returns

`Promise<string>`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:386
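
#### Example

A usage sketch, assuming `session` from the constructor example; the options shown (`maxTokens`, `temperature`, `onTextChunk`) are a subset of `LLamaChatPromptOptions`:

```ts
const answer = await session.prompt("What is the capital of France?", {
    maxTokens: 256,
    temperature: 0.7,
    onTextChunk(chunk) {
        process.stdout.write(chunk); // stream the response as it's generated
    }
});
console.log("\nFull response:", answer);
```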


### promptWithMeta()

```ts
promptWithMeta<Functions>(prompt: string, options?: LLamaChatPromptOptions<Functions>): Promise<{
    response: lastModelResponseItem.response;
    responseText: string;
    stopReason: metadata.stopReason;
    customStopTrigger: metadata.customStopTrigger;
    remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
    customStopTrigger: undefined;
    response: lastModelResponseItem.response;
    responseText: string;
    stopReason: metadata.stopReason;
    remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
```

#### Type Parameters

| Type Parameter | Default type |
| ------ | ------ |
| `Functions` *extends* `undefined` \| `ChatSessionModelFunctions` | `undefined` |

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `prompt` | `string` |
| `options?` | `LLamaChatPromptOptions<Functions>` |

#### Returns

`Promise<{ response: lastModelResponseItem.response; responseText: string; stopReason: metadata.stopReason; customStopTrigger: metadata.customStopTrigger; remainingGenerationAfterStop: metadata.remainingGenerationAfterStop; } | { customStopTrigger: undefined; response: lastModelResponseItem.response; responseText: string; stopReason: metadata.stopReason; remainingGenerationAfterStop: metadata.remainingGenerationAfterStop; }>`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:428
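
#### Example

A sketch of inspecting the result metadata; the `"maxTokens"` stop reason checked here is assumed from the option of the same name:

```ts
const {responseText, stopReason} = await session.promptWithMeta(
    "Summarize the plot of Hamlet in one sentence.",
    {maxTokens: 64}
);

if (stopReason === "maxTokens")
    console.warn("The response was cut off at the token limit");

console.log(responseText);
```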


### preloadPrompt()

```ts
preloadPrompt(prompt: string, options?: LLamaChatPreloadPromptOptions): Promise<void>
```

Preload a user prompt into the current context sequence state, so that later inference of the model response can begin sooner and feel faster.

Note: preloading a long user prompt can incur context shifts, so consider limiting the length of prompts you preload.

#### Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `prompt` | `string` | the prompt to preload |
| `options?` | `LLamaChatPreloadPromptOptions` | |

#### Returns

`Promise<void>`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:667
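
#### Example

For example, a UI could preload the user's draft while they're still typing, so the eventual response starts sooner. A sketch:

```ts
const draft = "Explain how transformers work";

// evaluate the user prompt ahead of time, while the user is still typing
await session.preloadPrompt(draft);

// when the user submits, generation starts from the already-evaluated state
const response = await session.prompt(draft);
```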


### completePrompt()

```ts
completePrompt(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<string>
```

Preload a user prompt into the current context sequence state and generate a completion for it.

Note: preloading a long user prompt and completing a user prompt with a high `maxTokens` value can incur context shifts, so consider limiting the length of prompts you preload.

Also, it's recommended to limit the number of tokens generated to a reasonable amount by configuring `maxTokens`.

#### Parameters

| Parameter | Type | Description |
| ------ | ------ | ------ |
| `prompt` | `string` | the prompt to preload |
| `options?` | `LLamaChatCompletePromptOptions` | |

#### Returns

`Promise<string>`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:684
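
#### Example

A sketch of generating an inline suggestion for a partially typed prompt; `maxTokens` comes from `LLamaChatCompletePromptOptions`:

```ts
const userInput = "I'm planning a trip to";

const suggestion = await session.completePrompt(userInput, {
    maxTokens: 24 // keep suggestions short
});
console.log(userInput + suggestion);
```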


### createPromptCompletionEngine()

```ts
createPromptCompletionEngine(options?: LLamaChatPromptCompletionEngineOptions): LlamaChatSessionPromptCompletionEngine
```

Create a smart completion engine that caches prompt completions and reuses them when the user prompt matches the beginning of a cached prompt or completion.

All completions are generated, and the cache is used, only for the current chat session state. You can create a single completion engine for an entire chat session.

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `options?` | `LLamaChatPromptCompletionEngineOptions` |

#### Returns

`LlamaChatSessionPromptCompletionEngine`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:697
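
#### Example

A sketch of the completion-engine flow, assuming the engine exposes a `complete()` method that returns whatever cached completion is currently available and reports freshly generated ones via an `onGeneration` callback (treat the exact callback shape as an assumption; check `LlamaChatSessionPromptCompletionEngine` for the precise API):

```ts
const completionEngine = session.createPromptCompletionEngine({
    onGeneration(prompt, completion) {
        // called when a new completion finishes generating for `prompt`
        console.log(`${prompt}|${completion}`);
    }
});

// returns a cached completion if one matches the typed text;
// may be empty until background generation finishes
const cached = completionEngine.complete("Hi there! How");
```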


### completePromptWithMeta()

```ts
completePromptWithMeta(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<{
    completion: completion;
    stopReason: metadata.stopReason;
    customStopTrigger: metadata.customStopTrigger;
    remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
    customStopTrigger: undefined;
    completion: completion;
    stopReason: metadata.stopReason;
    remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
```

See `completePrompt` for more information.

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `prompt` | `string` |
| `options?` | `LLamaChatCompletePromptOptions` |

#### Returns

`Promise<{ completion: completion; stopReason: metadata.stopReason; customStopTrigger: metadata.customStopTrigger; remainingGenerationAfterStop: metadata.remainingGenerationAfterStop; } | { customStopTrigger: undefined; completion: completion; stopReason: metadata.stopReason; remainingGenerationAfterStop: metadata.remainingGenerationAfterStop; }>`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:706
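
#### Example

A sketch of reading the completion together with its stop reason:

```ts
const {completion, stopReason} = await session.completePromptWithMeta(
    "The three primary colors are",
    {maxTokens: 16}
);
console.log(completion, `(stop reason: ${stopReason})`);
```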


### getChatHistory()

```ts
getChatHistory(): ChatHistoryItem[]
```

#### Returns

`ChatHistoryItem[]`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:805


### getLastEvaluationContextWindow()

```ts
getLastEvaluationContextWindow(): null | ChatHistoryItem[]
```

#### Returns

`null | ChatHistoryItem[]`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:809


### setChatHistory()

```ts
setChatHistory(chatHistory: ChatHistoryItem[]): void
```

#### Parameters

| Parameter | Type |
| ------ | ------ |
| `chatHistory` | `ChatHistoryItem[]` |

#### Returns

`void`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:816
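
#### Example

Together with `getChatHistory()`, this allows saving and restoring conversation state, e.g. to roll back an exchange. A sketch:

```ts
// snapshot the conversation before an experimental exchange
const savedHistory = session.getChatHistory();

await session.prompt("Let's explore a tangent...");

// roll the session back to the snapshot, discarding that exchange
session.setChatHistory(savedHistory);
```

Since the history is plain data, it can also be serialized (e.g. with `JSON.stringify`) and restored later.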


### resetChatHistory()

```ts
resetChatHistory(): void
```

Clear the chat history and reset it to the initial state.

#### Returns

`void`

#### Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:823