Tools extend what agents can do—letting them fetch real-time data, execute code, query external databases, and take actions in the world. Under the hood, tools are callable functions with well-defined inputs and outputs that get passed to a chat model. The model decides when to invoke a tool based on the conversation context, and what input arguments to provide.
For details on how models handle tool calls, see Tool calling.
The simplest way to create a tool is by importing the tool function from the langchain package. You can use zod to define the tool’s input schema:
```typescript
import * as z from "zod";
import { tool } from "langchain";

const searchDatabase = tool(
  ({ query, limit }) => `Found ${limit} results for '${query}'`,
  {
    name: "search_database",
    description: "Search the customer database for records matching the query.",
    schema: z.object({
      query: z.string().describe("Search terms to look for"),
      limit: z.number().describe("Maximum number of results to return"),
    }),
  }
);
```
Server-side tool use

Some chat models (e.g., OpenAI, Anthropic, and Gemini) feature built-in tools that are executed server-side, such as web search and code interpreters. Refer to the provider overview to learn how to access these tools with your specific chat model.
Why this matters: Tools are most powerful when they can access agent state, runtime context, and long-term memory. This enables tools to make context-aware decisions, personalize responses, and maintain information across conversations.

The runtime context provides a structured way to supply runtime data, such as DB connections, user IDs, or config, into your tools. This avoids global state and keeps tools testable and reusable.
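The underlying idea can be sketched in plain TypeScript, independent of any LangChain API: the tool reads nothing global; instead a context object is threaded in at call time. All names below are illustrative.

```typescript
// Illustrative sketch, not the LangChain API: runtime context is
// passed to the tool at call time instead of read from globals.
type RuntimeContext = {
  userId: string;
  db: Map<string, string>; // stands in for a real DB connection
};

// A context-aware "tool": a pure function of (input, context).
function lookupOrder(input: { orderId: string }, ctx: RuntimeContext): string {
  const record = ctx.db.get(`${ctx.userId}:${input.orderId}`);
  return record ?? "No matching order.";
}

// Because the context is a parameter, the tool is trivial to test:
const testCtx: RuntimeContext = {
  userId: "u1",
  db: new Map([["u1:o42", "2 widgets, shipped"]]),
};
console.log(lookupOrder({ orderId: "o42" }, testCtx)); // "2 widgets, shipped"
```

In an agent, the runtime supplies this context for you; the design benefit is the same: swapping in a test context or a different user requires no changes to the tool itself.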
Access persistent data across conversations using the store. The store is accessed via config.store and allows you to save and retrieve user-specific or application-specific data.
```typescript
import * as z from "zod";
import { createAgent, tool } from "langchain";
import { InMemoryStore } from "@langchain/langgraph";
import { ChatOpenAI } from "@langchain/openai";

const store = new InMemoryStore();

// Access memory
const getUserInfo = tool(
  async ({ user_id }) => {
    const value = await store.get(["users"], user_id);
    console.log("get_user_info", user_id, value);
    return value;
  },
  {
    name: "get_user_info",
    description: "Look up user info.",
    schema: z.object({
      user_id: z.string(),
    }),
  }
);

// Update memory
const saveUserInfo = tool(
  async ({ user_id, name, age, email }) => {
    console.log("save_user_info", user_id, name, age, email);
    await store.put(["users"], user_id, { name, age, email });
    return "Successfully saved user info.";
  },
  {
    name: "save_user_info",
    description: "Save user info.",
    schema: z.object({
      user_id: z.string(),
      name: z.string(),
      age: z.number(),
      email: z.string(),
    }),
  }
);

const agent = createAgent({
  model: new ChatOpenAI({ model: "gpt-4o" }),
  tools: [getUserInfo, saveUserInfo],
  store,
});

// First session: save user info
await agent.invoke({
  messages: [
    {
      role: "user",
      content:
        "Save the following user: userid: abc123, name: Foo, age: 25, email: foo@langchain.dev",
    },
  ],
});

// Second session: get user info
const result = await agent.invoke({
  messages: [
    { role: "user", content: "Get user info for user with id 'abc123'" },
  ],
});
console.log(result);
// Here is the user info for user with ID "abc123":
// - Name: Foo
// - Age: 25
// - Email: foo@langchain.dev
```
Stream custom updates from tools as they execute using config.streamWriter. This is useful for providing real-time feedback to users about what a tool is doing.
```typescript
import * as z from "zod";
import { tool } from "langchain";

const getWeather = tool(
  ({ city }, config) => {
    const writer = config.streamWriter;
    // Stream custom updates as the tool executes
    writer(`Looking up data for city: ${city}`);
    writer(`Acquired data for city: ${city}`);
    return `It's always sunny in ${city}!`;
  },
  {
    name: "get_weather",
    description: "Get weather for a given city.",
    schema: z.object({
      city: z.string(),
    }),
  }
);
```
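The mechanics of the stream-writer pattern can be sketched in plain TypeScript, independent of the agent runtime: the caller supplies a writer callback, the tool reports progress through it, and the return value stays a plain result. Names here are illustrative, not the LangChain API.

```typescript
// Illustrative sketch of the stream-writer pattern: progress updates
// are emitted through a callback, separate from the final return value.
type StreamWriter = (update: string) => void;

function getWeatherSketch(city: string, writer: StreamWriter): string {
  writer(`Looking up data for city: ${city}`);
  writer(`Acquired data for city: ${city}`);
  return `It's always sunny in ${city}!`;
}

// The caller decides what to do with updates: print, log, or forward
// to a UI. Here we just collect them.
const updates: string[] = [];
const answer = getWeatherSketch("Berlin", (u) => updates.push(u));
console.log(updates.length); // 2 — progress updates arrive before the result
console.log(answer);         // "It's always sunny in Berlin!"
```

In an agent, the runtime wires `config.streamWriter` to the active stream, so these updates surface to the client while the tool is still running rather than after it returns.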