# Type Alias: LlamaChatSessionOptions

```ts
type LlamaChatSessionOptions = {
    contextSequence: LlamaContextSequence;
    chatWrapper?: "auto" | ChatWrapper;
    systemPrompt?: string;
    forceAddSystemPrompt?: boolean;
    autoDisposeSequence?: boolean;
    contextShift?: LlamaChatSessionContextShiftOptions;
};
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:24
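A minimal usage sketch, assuming the `getLlama`/`loadModel` flow from node-llama-cpp and a placeholder model path; `contextSequencece` aside, every option shown here is optional and `contextSequence` is the only required one:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: "path/to/model.gguf" // placeholder path
});
const context = await model.createContext();

const session = new LlamaChatSession({
    // the only required option: the context sequence the chat runs on
    contextSequence: context.getSequence(),
    systemPrompt: "You are a helpful assistant."
});

const answer = await session.prompt("Hi there, how are you?");
console.log(answer);
```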
## Properties
### contextSequence

```ts
contextSequence: LlamaContextSequence;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:25
### chatWrapper?

```ts
optional chatWrapper: "auto" | ChatWrapper;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:28

`"auto"` is used by default.
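For example, a sketch of overriding the automatic resolution with an explicit wrapper (here `GeneralChatWrapper`, which node-llama-cpp exports), reusing the same model and context setup as in the sketch above:

```ts
import {getLlama, LlamaChatSession, GeneralChatWrapper} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),

    // bypass the automatic ("auto") wrapper resolution and use an explicit wrapper
    chatWrapper: new GeneralChatWrapper()
});
```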
### systemPrompt?

```ts
optional systemPrompt: string;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:30
### forceAddSystemPrompt?

```ts
optional forceAddSystemPrompt: boolean;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:40

Add the system prompt even on models that don't support a system prompt.

Each chat wrapper has its own workaround for adding a system prompt to a model that doesn't support one, but forcing a system prompt on an unsupported model may not always work as expected.

Use with caution.
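As a rough sketch, continuing from the `context` created in the first example, this forces a system prompt onto a model whose chat template has no system role:

```ts
const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    systemPrompt: "Answer only in French.",

    // inject the system prompt via the chat wrapper's workaround,
    // even if the model's template doesn't support a system role
    forceAddSystemPrompt: true
});
```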
### autoDisposeSequence?

```ts
optional autoDisposeSequence: boolean;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:47

Automatically dispose the sequence when the session is disposed.

Defaults to `false`.
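A sketch of tying the sequence's lifetime to the session, again continuing from the first example; the `dispose()` call is assumed to follow the library's usual disposal pattern:

```ts
const session = new LlamaChatSession({
    contextSequence: context.getSequence(),

    // dispose the sequence together with the session instead of keeping it alive for reuse
    autoDisposeSequence: true
});

await session.prompt("Hello!");

// frees the session; because of autoDisposeSequence, the underlying sequence is disposed too
session.dispose();
```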
### contextShift?

```ts
optional contextShift: LlamaChatSessionContextShiftOptions;
```

Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:49