js-bao-wss-client / LlmAPI

# Interface: LlmAPI

## Methods
### chat()

`chat(options): Promise<{ annotations?: any; content: any; raw?: any; role: string; }>`
Sends a chat completion request to the configured LLM provider.
#### Parameters

##### options

Configuration for the chat request.
#### Returns

`Promise<{ annotations?: any; content: any; raw?: any; role: string; }>`

The assistant message, with `role`, `content`, optional `annotations`, and the raw provider response in `raw`.
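A minimal sketch of calling `chat()`. The shape of `options` (a `messages` array of role/content pairs, plus an optional `model`) is an assumption for illustration; the documentation above does not specify it. The stub implementation stands in for the real WSS-backed client.

```typescript
// Assumed message shape for the options object (not specified in the docs).
interface ChatMessage {
  role: string;
  content: string;
}

// Return type as documented for chat().
interface AssistantMessage {
  annotations?: any;
  content: any;
  raw?: any;
  role: string;
}

interface LlmAPI {
  chat(options: { model?: string; messages: ChatMessage[] }): Promise<AssistantMessage>;
}

// Illustrative stub: a real client would send the request to the
// configured LLM provider over the websocket connection.
const llm: LlmAPI = {
  async chat(options) {
    const last = options.messages[options.messages.length - 1];
    return { role: "assistant", content: `echo: ${last.content}` };
  },
};

async function main() {
  const reply = await llm.chat({
    messages: [{ role: "user", content: "Hello" }],
  });
  console.log(reply.role);    // "assistant"
  console.log(reply.content);
}

main();
```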
### models()

`models(): Promise<{ defaultModel: string; models: string[]; }>`
Lists available LLM models and returns the default model name.
#### Returns

`Promise<{ defaultModel: string; models: string[]; }>`

An object with the `models` array and the `defaultModel` name.
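A sketch of using `models()` to validate a preferred model name, falling back to the provider's default when it is unavailable. The model names and the stub implementation are illustrative only; the return shape matches the documentation above.

```typescript
interface LlmAPI {
  models(): Promise<{ defaultModel: string; models: string[] }>;
}

// Illustrative stub with made-up model names; a real client would query
// the configured LLM provider.
const llm: LlmAPI = {
  async models() {
    return { defaultModel: "model-a", models: ["model-a", "model-b"] };
  },
};

// Pick the preferred model if the provider offers it, else the default.
async function pickModel(api: LlmAPI, preferred?: string): Promise<string> {
  const { models, defaultModel } = await api.models();
  return preferred && models.includes(preferred) ? preferred : defaultModel;
}

pickModel(llm, "model-b").then(console.log); // "model-b"
```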