# How to parse JSON output
While some model providers support built-in ways to return structured output, not all do. We can use an output parser to help users specify an arbitrary JSON schema via the prompt, query a model for outputs that conform to that schema, and finally parse that output as JSON.
Keep in mind that large language models are leaky abstractions! You’ll have to use an LLM with sufficient capacity to generate well-formed JSON.
The `JsonOutputParser` is one built-in option for prompting for and then parsing JSON output.
Pick your chat model:

### OpenAI

Install dependencies:

```bash
# npm
npm i @langchain/openai

# yarn
yarn add @langchain/openai

# pnpm
pnpm add @langchain/openai
```

Add environment variables:

```bash
OPENAI_API_KEY=your-api-key
```

Instantiate the model:

```typescript
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo-0125",
  temperature: 0,
});
```

### Anthropic

Install dependencies:

```bash
# npm
npm i @langchain/anthropic

# yarn
yarn add @langchain/anthropic

# pnpm
pnpm add @langchain/anthropic
```

Add environment variables:

```bash
ANTHROPIC_API_KEY=your-api-key
```

Instantiate the model:

```typescript
import { ChatAnthropic } from "@langchain/anthropic";

const model = new ChatAnthropic({
  model: "claude-3-sonnet-20240229",
  temperature: 0,
});
```

### FireworksAI

Install dependencies:

```bash
# npm
npm i @langchain/community

# yarn
yarn add @langchain/community

# pnpm
pnpm add @langchain/community
```

Add environment variables:

```bash
FIREWORKS_API_KEY=your-api-key
```

Instantiate the model:

```typescript
import { ChatFireworks } from "@langchain/community/chat_models/fireworks";

const model = new ChatFireworks({
  model: "accounts/fireworks/models/firefunction-v1",
  temperature: 0,
});
```

### MistralAI

Install dependencies:

```bash
# npm
npm i @langchain/mistralai

# yarn
yarn add @langchain/mistralai

# pnpm
pnpm add @langchain/mistralai
```

Add environment variables:

```bash
MISTRAL_API_KEY=your-api-key
```

Instantiate the model:

```typescript
import { ChatMistralAI } from "@langchain/mistralai";

const model = new ChatMistralAI({
  model: "mistral-large-latest",
  temperature: 0,
});
```
```typescript
import { JsonOutputParser } from "@langchain/core/output_parsers";
import { PromptTemplate } from "@langchain/core/prompts";

// Define your desired data structure.
interface Joke {
  setup: string;
  punchline: string;
}

// And a query intended to prompt a language model to populate the data structure.
const jokeQuery = "Tell me a joke.";
const formatInstructions =
  "Respond with a valid JSON object, containing two fields: 'setup' and 'punchline'.";

// Set up a parser + inject instructions into the prompt template.
const parser = new JsonOutputParser<Joke>();

const prompt = new PromptTemplate({
  template: "Answer the user query.\n{format_instructions}\n{query}\n",
  inputVariables: ["query"],
  partialVariables: { format_instructions: formatInstructions },
});

const chain = prompt.pipe(model).pipe(parser);

await chain.invoke({ query: jokeQuery });
```
```
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "Because it was two tired!"
}
```
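Since the model's raw output is not guaranteed to be valid JSON, parsing can fail at runtime. Here's a minimal sketch of one way to guard against that; the recovery logic is illustrative, not part of the parser's API:

```typescript
// A minimal sketch: if the model emits malformed JSON, parsing will throw,
// so wrap the call when you need to recover gracefully.
try {
  const joke = await chain.invoke({ query: jokeQuery });
  console.log(joke.punchline);
} catch (e) {
  // e.g. retry the call, fall back to a default, or surface the error.
  console.error("Model output could not be parsed as JSON:", e);
}
```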
## Streaming

The `JsonOutputParser` also supports streaming partial chunks. This is useful when the model returns partial JSON output in multiple chunks: the parser keeps track of the partial chunks and returns the final JSON output when the model finishes generating.
```typescript
for await (const s of await chain.stream({ query: jokeQuery })) {
  console.log(s);
}
```
```
{}
{ setup: "" }
{ setup: "Why" }
{ setup: "Why couldn" }
{ setup: "Why couldn't" }
{ setup: "Why couldn't the" }
{ setup: "Why couldn't the bicycle" }
{ setup: "Why couldn't the bicycle stand" }
{ setup: "Why couldn't the bicycle stand up" }
{ setup: "Why couldn't the bicycle stand up by" }
{ setup: "Why couldn't the bicycle stand up by itself" }
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: ""
}
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "It"
}
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "It was"
}
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "It was two"
}
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "It was two tired"
}
{
  setup: "Why couldn't the bicycle stand up by itself?",
  punchline: "It was two tired."
}
```
## Next steps

You've now learned one way to prompt a model to return structured JSON. Next, check out the broader guide on obtaining structured output for other techniques.
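One such technique, covered in that guide, is the `.withStructuredOutput()` method that many chat models expose, which handles both format instructions and parsing for you. A minimal sketch, assuming the `model` instantiated above supports it:

```typescript
import { z } from "zod";

// A minimal sketch, assuming the chosen chat model supports
// .withStructuredOutput(); see the structured output guide for details.
const structuredModel = model.withStructuredOutput(
  z.object({
    setup: z.string().describe("The setup of the joke"),
    punchline: z.string().describe("The punchline of the joke"),
  })
);

await structuredModel.invoke("Tell me a joke.");
```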