Class: LlamaChatSession
Constructors
new LlamaChatSession()
new LlamaChatSession(options: LlamaChatSessionOptions): LlamaChatSession
Parameters
Parameter | Type |
---|---|
options | LlamaChatSessionOptions |
Returns
LlamaChatSession
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:303
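A minimal setup sketch, assuming node-llama-cpp's `getLlama`/`loadModel`/`createContext` flow; the model path is a placeholder you must point at a real GGUF file:

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

// Load the runtime, a model, and create a context to get a sequence from.
// "model.gguf" is a placeholder path.
const llama = await getLlama();
const model = await llama.loadModel({modelPath: "model.gguf"});
const context = await model.createContext();

// The session wraps a context sequence and manages the chat state for you
const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    systemPrompt: "You are a helpful assistant"
});
```

The `contextSequence` option is the only required piece of `LlamaChatSessionOptions` in this sketch; `systemPrompt` is optional.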
Properties
onDispose
readonly onDispose: EventRelay<void>;
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:301
Accessors
disposed
get disposed(): boolean
Returns
boolean
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:360
chatWrapper
get chatWrapper(): ChatWrapper
Returns
ChatWrapper
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:364
sequence
get sequence(): LlamaContextSequence
Returns
LlamaContextSequence
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:371
context
get context(): LlamaContext
Returns
LlamaContext
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:378
model
get model(): LlamaModel
Returns
LlamaModel
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:382
Methods
dispose()
dispose(options?: {
    disposeSequence?: boolean;
}): void
Parameters
Parameter | Type |
---|---|
options ? | object |
options.disposeSequence ? | boolean |
Returns
void
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:345
prompt()
prompt<Functions>(prompt: string, options: LLamaChatPromptOptions<Functions>): Promise<string>
Type Parameters
Type Parameter | Default type |
---|---|
Functions extends undefined \| ChatSessionModelFunctions | undefined |
Parameters
Parameter | Type |
---|---|
prompt | string |
options | LLamaChatPromptOptions<Functions> |
Returns
Promise<string>
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:386
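For example, prompting with streamed output; `session` is assumed to be an already-constructed `LlamaChatSession`, and `maxTokens`/`onTextChunk` are options from `LLamaChatPromptOptions`:

```typescript
// Assumes `session` is an existing LlamaChatSession
const answer = await session.prompt("Summarize the plot of Hamlet in one sentence", {
    maxTokens: 128,               // cap the response length
    onTextChunk(chunk: string) {  // stream text as it's generated
        process.stdout.write(chunk);
    }
});
console.log("\nFull answer:", answer);
```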
promptWithMeta()
promptWithMeta<Functions>(prompt: string, options?: LLamaChatPromptOptions<Functions>): Promise<{
response: lastModelResponseItem.response;
responseText: string;
stopReason: metadata.stopReason;
customStopTrigger: metadata.customStopTrigger;
remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
customStopTrigger: undefined;
response: lastModelResponseItem.response;
responseText: string;
stopReason: metadata.stopReason;
remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
Type Parameters
Type Parameter | Default type |
---|---|
Functions extends undefined \| ChatSessionModelFunctions | undefined |
Parameters
Parameter | Type | Description |
---|---|---|
prompt | string | |
options ? | LLamaChatPromptOptions<Functions> | |
Returns
Promise<{
  response: lastModelResponseItem.response;
  responseText: string;
  stopReason: metadata.stopReason;
  customStopTrigger: metadata.customStopTrigger;
  remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
  customStopTrigger: undefined;
  response: lastModelResponseItem.response;
  responseText: string;
  stopReason: metadata.stopReason;
  remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:428
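The metadata variant is useful when you need to know why generation stopped. A sketch, assuming an existing `session`; the specific `stopReason` string values checked here are assumptions:

```typescript
const {responseText, stopReason} =
    await session.promptWithMeta("List three prime numbers");

console.log(responseText);
if (stopReason === "maxTokens") {
    // Generation hit the configured token limit before finishing
    console.log("Response was cut off by the token limit");
}
```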
preloadPrompt()
preloadPrompt(prompt: string, options?: LLamaChatPreloadPromptOptions): Promise<void>
Preload a user prompt into the current context sequence state to make later inference of the model response begin sooner and feel faster.
Note: Preloading a long user prompt can incur context shifts, so consider limiting the length of the prompts you preload.
Parameters
Parameter | Type | Description |
---|---|---|
prompt | string | the prompt to preload |
options ? | LLamaChatPreloadPromptOptions | |
Returns
Promise<void>
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:667
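A typical pattern is to preload the text while the user is still composing, then prompt with the same text once they submit (sketch, `session` assumed):

```typescript
// Start evaluating the user text before the user hits "send",
// so the response begins generating with less latency afterwards
const draft = "What's the weather usually like in Oslo in March?";
await session.preloadPrompt(draft);

// Later, when the user actually submits the same text,
// the already-evaluated state is reused:
const answer = await session.prompt(draft);
```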
completePrompt()
completePrompt(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<string>
Preload a user prompt into the current context sequence state and generate a completion for it.
Note: Preloading a long user prompt and completing a user prompt with a high number of maxTokens can incur context shifts, so consider limiting the length of the prompts you preload. Also, it's recommended to limit the number of tokens generated to a reasonable amount by configuring maxTokens.
Parameters
Parameter | Type | Description |
---|---|---|
prompt | string | the prompt to preload |
options ? | LLamaChatCompletePromptOptions | |
Returns
Promise<string>
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:684
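For example, generating a short continuation of the user's own text rather than a response to it (sketch, `session` assumed):

```typescript
// Complete the user's text — useful for "ghost text" suggestions
// in an input box rather than chat replies
const completion = await session.completePrompt("The three primary colors are", {
    maxTokens: 24 // keep completions short, as the note above recommends
});
console.log(completion); // the continuation text, without the original prompt
```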
createPromptCompletionEngine()
createPromptCompletionEngine(options?: LLamaChatPromptCompletionEngineOptions): LlamaChatSessionPromptCompletionEngine
Create a smart completion engine that caches the prompt completions and reuses them when the user prompt matches the beginning of the cached prompt or completion.
Completions are generated against, and the cache is scoped to, the current chat session state, so you can create a single completion engine for an entire chat session.
Parameters
Parameter | Type |
---|---|
options ? | LLamaChatPromptCompletionEngineOptions |
Returns
LlamaChatSessionPromptCompletionEngine
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:697
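A sketch of wiring the engine to a text input. The `complete()` method returning the currently cached completion and the `onGeneration` callback in `LLamaChatPromptCompletionEngineOptions` are assumptions, and `showGhostText` is a hypothetical UI helper:

```typescript
const engine = session.createPromptCompletionEngine({
    // Assumed callback: fired when a fresh completion finishes
    // generating in the background
    onGeneration(prompt: string, completion: string) {
        showGhostText(prompt, completion); // hypothetical UI helper
    }
});

// On every keystroke: return whatever cached completion matches the
// typed text so far, and let the engine generate more in the background
function onUserTyped(currentText: string) {
    const cached = engine.complete(currentText);
    if (cached !== "")
        showGhostText(currentText, cached);
}
```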
completePromptWithMeta()
completePromptWithMeta(prompt: string, options?: LLamaChatCompletePromptOptions): Promise<{
completion: completion;
stopReason: metadata.stopReason;
customStopTrigger: metadata.customStopTrigger;
remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
customStopTrigger: undefined;
completion: completion;
stopReason: metadata.stopReason;
remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
See completePrompt for more information.
Parameters
Parameter | Type | Description |
---|---|---|
prompt | string | |
options ? | LLamaChatCompletePromptOptions | |
Returns
Promise<{
  completion: completion;
  stopReason: metadata.stopReason;
  customStopTrigger: metadata.customStopTrigger;
  remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
} | {
  customStopTrigger: undefined;
  completion: completion;
  stopReason: metadata.stopReason;
  remainingGenerationAfterStop: metadata.remainingGenerationAfterStop;
}>
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:706
getChatHistory()
getChatHistory(): ChatHistoryItem[]
Returns
ChatHistoryItem[]
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:805
getLastEvaluationContextWindow()
getLastEvaluationContextWindow(): null | ChatHistoryItem[]
Returns
null | ChatHistoryItem[]
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:809
setChatHistory()
setChatHistory(chatHistory: ChatHistoryItem[]): void
Parameters
Parameter | Type |
---|---|
chatHistory | ChatHistoryItem [] |
Returns
void
Defined in
evaluator/LlamaChatSession/LlamaChatSession.ts:816
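Together, `getChatHistory()` and `setChatHistory()` let you snapshot and restore a conversation, for example to persist it between runs. A sketch, assuming `session` exists and that `ChatHistoryItem[]` is plain serializable data:

```typescript
// Snapshot the conversation
const history = session.getChatHistory();
const saved = JSON.stringify(history);

// ...later, possibly in a fresh session over the same model:
session.setChatHistory(JSON.parse(saved));
```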
resetChatHistory()
resetChatHistory(): void
Clear the chat history and reset it to the initial state.
Returns
void