Type Alias: LlamaChatSessionOptions
type LlamaChatSessionOptions = {
    contextSequence: LlamaContextSequence;
    chatWrapper?: "auto" | ChatWrapper;
    systemPrompt?: string;
    forceAddSystemPrompt?: boolean;
    autoDisposeSequence?: boolean;
    contextShift?: LlamaChatSessionContextShiftOptions;
};
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:25
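For orientation, here is a minimal sketch of how these options are typically passed when creating a chat session; the model path is a placeholder, and the surrounding setup assumes the usual getLlama / loadModel / createContext flow in an ESM module with top-level await:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    systemPrompt: "You are a concise, helpful assistant."
});

const answer = await session.prompt("Hi there!");
console.log(answer);
```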
Properties
contextSequence
contextSequence: LlamaContextSequence;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:26
chatWrapper?
optional chatWrapper: "auto" | ChatWrapper;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:29
Defaults to "auto".
systemPrompt?
optional systemPrompt: string;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:31
forceAddSystemPrompt?
optional forceAddSystemPrompt: boolean;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:41
Add the system prompt even on models that don't support one.
Each chat wrapper has its own workaround for injecting a system prompt into a model that doesn't support it, but forcing a system prompt on an unsupported model may not always work as expected.
Use with caution.
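A hedged sketch of what forcing a system prompt might look like; whether the prompt actually influences the model depends on the chat wrapper's workaround:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    systemPrompt: "Answer only in French.",
    // inject the system prompt even if the chat wrapper reports
    // that the model has no native system-prompt support
    forceAddSystemPrompt: true
});
```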
autoDisposeSequence?
optional autoDisposeSequence: boolean;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:48
Automatically dispose the sequence when the session is disposed.
Defaults to false.
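For illustration, a sketch of opting into automatic disposal so that disposing the session also frees the underlying sequence:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    autoDisposeSequence: true
});

await session.prompt("Hello!");

// with autoDisposeSequence: true, this also disposes the sequence,
// freeing its slot in the context
session.dispose();
```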
contextShift?
optional contextShift: LlamaChatSessionContextShiftOptions;
Defined in: evaluator/LlamaChatSession/LlamaChatSession.ts:50
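As a rough sketch only: the available fields are defined by LlamaChatSessionContextShiftOptions, so check that type for what your version supports; the size field used here (the number of tokens to free when the context window fills up) is an assumption based on recent versions of the library:

```ts
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder path
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence(),
    contextShift: {
        // assumed option: how many tokens to drop from the context
        // when it fills up, to make room for new ones
        size: 512
    }
});
```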