🛠️ LangChain Tools: Teaching Your AI to Use Superpowers
Imagine you have a super-smart robot friend. This robot can think and talk really well. But what if it could also do things? Like check the weather, search the internet, or solve math problems?
That’s what Tools in LangChain do. They give your AI hands to interact with the world!
🎯 The Big Picture: A Universal Analogy
Think of your AI like a chef in a kitchen.
- The chef (AI) knows recipes (knowledge)
- But to actually cook, the chef needs kitchen tools: knives, pans, ovens
- Each tool does ONE specific job
- The chef decides WHICH tool to use and WHEN
LangChain Tools = Kitchen tools for your AI chef!
1. 📞 Function and Tool Calling
What Is It?
When your AI needs to DO something (not just talk), it “calls” a tool. Like saying:
“Hey Calculator Tool, what is 25 × 48?”
The tool does the work and sends back the answer.
Simple Example
# Your AI wants to know the weather
# Instead of guessing, it CALLS a tool
result = weather_tool("New York")
# Tool returns: "Sunny, 72°F"
Real Life Comparison
- You ask Siri “What’s the weather?”
- Siri doesn’t GUESS - it calls a weather service
- The service returns real data
- Siri tells you the answer
That’s tool calling!
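Here is a minimal sketch of that round trip in LangChain, assuming you have langchain-openai installed and an OpenAI API key set. The weather_tool here is hypothetical and returns fake data just for the demo:

from langchain_core.messages import HumanMessage, ToolMessage
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI

@tool
def weather_tool(city: str) -> str:
    """Look up the current weather for a city."""
    return f"Sunny, 72°F in {city}"  # fake data, just for the demo

llm_with_tools = ChatOpenAI().bind_tools([weather_tool])

# 1. The AI decides it needs the tool and emits a tool call
messages = [HumanMessage("What's the weather in New York?")]
ai_msg = llm_with_tools.invoke(messages)
messages.append(ai_msg)

# 2. We run each requested tool and send the result back
for call in ai_msg.tool_calls:
    result = weather_tool.invoke(call["args"])
    messages.append(ToolMessage(content=result, tool_call_id=call["id"]))

# 3. The AI turns the raw tool output into a friendly answer
final_answer = llm_with_tools.invoke(messages)
print(final_answer.content)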
2. 🔗 Binding Tools to Models
What Is It?
Before your AI can use tools, you must INTRODUCE them. Like:
“Hey AI, here are the tools you can use today!”
This is called binding.
The Kitchen Analogy
Before cooking, you show the chef:
- “Here’s your knife” 🔪
- “Here’s your pan” 🍳
- “Here’s your oven” 🔥
Now the chef KNOWS what’s available!
Simple Example
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool

# Create a simple tool
@tool
def add_numbers(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b

# Create AI model
llm = ChatOpenAI()

# BIND the tool to the model
llm_with_tools = llm.bind_tools([add_numbers])
What Happens?
graph TD
    A[AI Model] --> B[Bind Tools]
    B --> C[AI + Tools Ready!]
    C --> D[Can use add_numbers]
    C --> E[Can use other tools]
Now your AI KNOWS it can add numbers!
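To see the binding in action, ask a math question and peek at the response. A quick sketch, assuming the same OpenAI API key is configured (the example output is illustrative):

response = llm_with_tools.invoke("What is 12 + 30?")

# The AI doesn't answer directly - it asks for the tool instead
print(response.tool_calls)
# e.g. [{'name': 'add_numbers', 'args': {'a': 12, 'b': 30}, 'id': '...'}]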
3. 🎮 Tool Choice Control
What Is It?
Sometimes you want to CONTROL which tools your AI uses:
- “Use ANY tool you want” - AI picks freely
- “You MUST use this specific tool” - Force a choice
- “Don’t use any tools” - Just chat normally
The Kitchen Analogy
- “Cook however you want” = Any tool
- “You MUST use the oven” = Required tool
- “No cooking today, just talk” = No tools
Simple Examples
# Let AI choose any tool
llm.bind_tools(tools, tool_choice="auto")
# Force AI to use a specific tool
llm.bind_tools(tools, tool_choice="weather")
# Don't use any tools
llm.bind_tools(tools, tool_choice="none")
When To Use Each?
| Mode | When to Use |
|---|---|
| auto | Normal conversations |
| required | You KNOW which tool is needed |
| none | Just want to chat |
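Here is a small sketch of forcing and disabling tools, assuming tools contains a hypothetical tool named weather (as in the snippet above) and your provider supports forced tool choice (OpenAI does):

# Force the model to call the weather tool, even for vague questions
forced = llm.bind_tools(tools, tool_choice="weather")
print(forced.invoke("Tell me about Paris").tool_calls[0]["name"])  # "weather"

# With tool_choice="none" the model just chats - no tool calls at all
chatty = llm.bind_tools(tools, tool_choice="none")
print(chatty.invoke("Tell me about Paris").tool_calls)  # []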
4. ⚡ Parallel Tool Calling
What Is It?
Your AI can use multiple tools at the same time! Instead of:
- Get weather… wait…
- Get news… wait…
- Get stocks… wait…
It does:
- Get weather + news + stocks… all at once!
The Kitchen Analogy
A great chef doesn’t:
- Chop onions, WAIT, then boil water, WAIT, then preheat oven
A great chef does:
- Chops onions WHILE water boils WHILE oven preheats
Much faster!
Simple Example
# AI needs multiple pieces of info
# It requests ALL the tool calls at once
llm_with_tools = llm.bind_tools([weather_tool])
response = llm_with_tools.invoke(
    "What's the weather in NYC, Paris, and Tokyo?"
)
# response.tool_calls holds THREE weather_tool calls at once
The Magic
graph TD
    A[User Question] --> B[AI Decides]
    B --> C[Weather NYC]
    B --> D[Weather Paris]
    B --> E[Weather Tokyo]
    C --> F[Combine Results]
    D --> F
    E --> F
    F --> G[Final Answer]
All three happen at the same time!
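The model doesn't execute anything itself; it returns all three requests in response.tool_calls, and your code runs them. A minimal sketch that executes them concurrently, reusing the hypothetical weather_tool from section 1 (output values are illustrative):

from concurrent.futures import ThreadPoolExecutor

# One AI message, three separate tool calls
for call in response.tool_calls:
    print(call["name"], call["args"])  # e.g. weather_tool {'city': 'NYC'}

# Run all the requested calls at the same time with a thread pool
with ThreadPoolExecutor() as pool:
    results = list(pool.map(
        lambda call: weather_tool.invoke(call["args"]),
        response.tool_calls,
    ))
print(results)  # e.g. ['Sunny, 72°F in NYC', 'Sunny, 72°F in Paris', ...]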
5. 🔧 Defining Custom Tools
What Is It?
LangChain has built-in tools. But YOU can create your own!
It’s like building your own kitchen gadget that does exactly what you need.
The Simplest Way: @tool Decorator
from langchain_core.tools import tool

@tool
def multiply(a: int, b: int) -> int:
    """Multiply two numbers together."""
    return a * b
That’s it! You just created a tool!
Key Parts
| Part | Purpose |
|---|---|
| @tool | Magic decorator that makes it a tool |
| Function name | Becomes the tool name |
| Docstring | Tells AI what the tool does |
| Parameters | What info the tool needs |
| Return | What the tool gives back |
Another Example
@tool
def get_word_length(word: str) -> int:
    """Count letters in a word."""
    return len(word)

# Now AI can use this!
# "How long is 'elephant'?" → 8
6. 📋 Tool Schemas with Pydantic
What Is It?
Sometimes your tool needs COMPLEX inputs. Pydantic helps you describe exactly what’s needed.
The Kitchen Analogy
Simple order: “One pizza”
Complex order: “Large pizza, thin crust, extra cheese, no onions, well done”
Pydantic helps describe complex orders clearly!
Simple Example
from pydantic import BaseModel, Field
from langchain_core.tools import tool

class PizzaOrder(BaseModel):
    """A pizza order with all details."""
    size: str = Field(description="small, medium, or large")
    toppings: list[str] = Field(description="list of toppings")
    crust: str = Field(description="thin, regular, or thick")

@tool(args_schema=PizzaOrder)
def order_pizza(size: str, toppings: list[str], crust: str) -> str:
    """Place a pizza order."""
    return f"Ordered: {size} {crust}-crust pizza with {', '.join(toppings)}"
Why Use Pydantic?
- Clear descriptions - AI knows exactly what each field means
- Validation - Wrong data gets caught early
- Structure - Complex data stays organized
graph TD
    A[User Request] --> B[AI Understands]
    B --> C[Creates Pydantic Object]
    C --> D[Validation Check]
    D --> E{Valid?}
    E -->|Yes| F[Tool Executes]
    E -->|No| G[Error Message]
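You can watch the validation step by calling the tool directly. A small sketch using the order_pizza tool above (the exact error text depends on your Pydantic version):

# A valid order passes validation and the tool runs
print(order_pizza.invoke({
    "size": "large",
    "toppings": ["cheese", "mushrooms"],
    "crust": "thin",
}))

# A broken order (missing fields) is caught BEFORE the tool runs
try:
    order_pizza.invoke({"size": "large"})
except Exception as err:
    print(f"Rejected: {err}")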
7. ⚠️ Tool Execution and Errors
What Is It?
Tools can fail! Networks go down. APIs break. Inputs are wrong.
Your code needs to handle these gracefully.
The Kitchen Analogy
What if:
- The oven breaks mid-cooking?
- You run out of ingredients?
- The recipe has a typo?
A good chef has backup plans!
Simple Error Handling
from langchain_core.tools import ToolException

@tool
def divide(a: int, b: int) -> float:
    """Divide a by b."""
    if b == 0:
        raise ToolException("Cannot divide by zero!")
    return a / b
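Raising ToolException on its own still surfaces as an error when the tool runs; a quick check:

print(divide.invoke({"a": 10, "b": 2}))  # 5.0

try:
    divide.invoke({"a": 10, "b": 0})
except Exception as err:
    print(f"Tool failed: {err}")  # Tool failed: Cannot divide by zero!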
Handling Errors Gracefully
from langchain_core.tools import StructuredTool

def risky_operation(data: str) -> str:
    """Do something that might fail."""
    if not data:
        raise ToolException("No data was provided!")
    return data.upper()

# If an error happens, the AI gets a friendly message instead of a crash
safe_tool = StructuredTool.from_function(func=risky_operation, handle_tool_error=True)
Custom Error Messages
def handle_my_error(error: ToolException) -> str:
    return f"Oops! Something went wrong: {error}"

def my_tool_fn(x: int) -> int:
    """Double a number, but reject negatives."""
    if x < 0:
        raise ToolException("Negative input!")
    return x * 2

my_tool = StructuredTool.from_function(func=my_tool_fn, handle_tool_error=handle_my_error)
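With the handler attached, a failing call comes back as the friendly message instead of an exception:

print(my_tool.invoke({"x": 4}))   # 8
print(my_tool.invoke({"x": -1}))  # "Oops! Something went wrong: Negative input!"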
8. 🧰 Toolkits
What Is It?
A toolkit is a COLLECTION of related tools bundled together.
Instead of adding tools one by one, you get a whole set!
The Kitchen Analogy
Instead of buying:
- Knife
- Fork
- Spoon
- Ladle
- Spatula
You buy a “Complete Kitchen Set” - all tools included!
Simple Example
from langchain_community.agent_toolkits import SQLDatabaseToolkit
from langchain_community.utilities import SQLDatabase
# Connect to database
db = SQLDatabase.from_uri("sqlite:///mydata.db")
# Get the TOOLKIT (multiple tools at once!)
toolkit = SQLDatabaseToolkit(db=db, llm=llm)
# See what tools you got
tools = toolkit.get_tools()
# Returns: [QuerySQLDataBaseTool, InfoSQLDatabaseTool, ...]
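Each tool in the kit is a normal LangChain tool, so you can inspect it or hand the whole set to your model in one call (assuming llm is the chat model from earlier):

for t in tools:
    print(t.name, "-", t.description)

# Bind the entire toolkit to the model at once
llm_with_db_tools = llm.bind_tools(tools)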
Popular Toolkits
| Toolkit | What It Does |
|---|---|
| SQLDatabaseToolkit | Work with databases |
| FileManagementToolkit | Read/write files |
| PlayWrightBrowserToolkit | Browse the web |
| GmailToolkit | Send/read emails |
Why Toolkits?
graph TD
    A[Without Toolkit] --> B[Add Tool 1]
    B --> C[Add Tool 2]
    C --> D[Add Tool 3]
    D --> E[Configure Each]
    F[With Toolkit] --> G[Import Toolkit]
    G --> H[All Tools Ready!]
Toolkits save time and ensure tools work together!
🎬 Putting It All Together
Here’s how all these pieces work as a team:
graph TD
    A[Define Custom Tools] --> B[Add Pydantic Schemas]
    B --> C[Or Use a Toolkit]
    C --> D[Bind Tools to Model]
    D --> E[Set Tool Choice]
    E --> F[AI Uses Tools]
    F --> G[Parallel Calls if Needed]
    G --> H[Handle Any Errors]
    H --> I[Return Results!]
Complete Mini Example
from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
from pydantic import BaseModel, Field

# 1. Define tools with a Pydantic schema
class MathInput(BaseModel):
    x: int = Field(description="First number")
    y: int = Field(description="Second number")

@tool(args_schema=MathInput)
def add(x: int, y: int) -> int:
    """Add two numbers."""
    return x + y

@tool(args_schema=MathInput)
def multiply(x: int, y: int) -> int:
    """Multiply two numbers."""
    return x * y

# 2. Create model and bind tools
llm = ChatOpenAI()
llm_with_tools = llm.bind_tools(
    [add, multiply],
    tool_choice="auto"  # 3. Set choice
)
# 4. Use it!
response = llm_with_tools.invoke(
    "What is 5 + 3 and 5 × 3?"
)
# The AI requests BOTH tool calls in a single response
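To turn those tool calls into actual numbers, run each requested tool yourself (or send the results back to the model, as in section 1). A short sketch:

# Map tool names to tool objects so we can dispatch each request
tools_by_name = {"add": add, "multiply": multiply}

for call in response.tool_calls:
    answer = tools_by_name[call["name"]].invoke(call["args"])
    print(call["name"], "=", answer)
# add = 8
# multiply = 15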
🌟 Key Takeaways
- Tools give AI hands - They can DO things, not just talk
- Binding connects tools to AI - AI must know what’s available
- Tool choice gives control - You decide how AI picks tools
- Parallel = Speed - Multiple tools at once
- Custom tools = Your rules - Build exactly what you need
- Pydantic = Clarity - Complex inputs made simple
- Error handling = Safety - Things fail gracefully
- Toolkits = Convenience - Related tools bundled together
🚀 You Did It!
You now understand how to give your AI superpowers through tools!
Your AI chef now has a fully-equipped kitchen. It knows how to:
- Pick the right tool for the job
- Use multiple tools at once
- Handle kitchen disasters gracefully
- Use pre-made tool sets
Go build something amazing! 🎉