Constructors
new MockServer(options?: MockServerOptions): MockServer
Properties
nextError: { ... }
Queue a one-shot error for the very next request. Fires once, then removes itself.
when: { ... }
Register a matching rule. Call .reply() on the result to set the response.
whenTool: { ... }
Shorthand for when({ toolName }).
whenToolResult: { ... }
Shorthand for when({ toolCallId }).
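A minimal sketch of how these properties fit together. The import path, the matcher fields beyond toolName, and the shapes passed to .reply() and nextError are assumptions; only when()/.reply(), whenTool, and the one-shot nextError behavior are documented above.

```typescript
// Hypothetical import path; substitute the real package name.
import { MockServer } from "mock-llm-server";

const server = new MockServer();

// Register a matching rule, then set its response via .reply().
server.when({ toolName: "search" }).reply({ content: "no results" });

// Shorthand for when({ toolName }).
server.whenTool({ toolName: "get_weather" }).reply({ content: "sunny" });

// Queue a one-shot error: only the very next request sees it,
// then the entry removes itself. (Error shape is a guess.)
server.nextError = { status: 500, message: "simulated outage" };
```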
Accessors
get history(): RequestHistory
Every request the server has handled.
get routes(): readonly string[]
The API routes registered on this server, e.g. ["/v1/chat/completions", ...].
get rules(): readonly RuleSummary[]
A snapshot of all registered rules with their descriptions and remaining match counts.
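A sketch of inspecting these accessors, assuming a hypothetical import path; the accessor names and return types come from the entries above.

```typescript
// Hypothetical import path; substitute the real package name.
import { MockServer } from "mock-llm-server";

const server = new MockServer();
server.when({ toolName: "search" }).reply({ content: "ok" });

console.log(server.routes);  // e.g. ["/v1/chat/completions", ...]
console.log(server.rules);   // rule descriptions and remaining match counts
console.log(server.history); // every request handled so far
```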
Mock LLM server that handles OpenAI Chat Completions, Anthropic Messages, and OpenAI Responses API formats. Register rules with when(), point your SDK at url, and go. Supports await using for automatic cleanup.
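The description above can be sketched end to end. The import path, the .reply() payload shape, and the exact name of the url accessor are assumptions; when(), await using disposal, and pointing an SDK at the server's URL are all stated in the description.

```typescript
// Hypothetical import path; substitute the real package name.
import { MockServer } from "mock-llm-server";
import OpenAI from "openai";

// `await using` disposes the server automatically when this scope exits.
await using server = new MockServer();
server.when({ toolName: "search" }).reply({ content: "no results" });

// Point the SDK at the mock server instead of the real API.
const client = new OpenAI({ baseURL: server.url, apiKey: "test" });
const res = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "hi" }],
});
```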