🎭 LangChain Prompts: Teaching Your AI What to Say
Imagine you have a super-smart robot friend. But here’s the thing—robots don’t just know what you want. You have to tell them! That’s what prompts are: the instructions you give to your AI.
🍳 The Kitchen Analogy
Think of your AI like a chef in a kitchen:
- You are the customer placing an order
- The prompt is your order slip
- The AI is the chef cooking what you ordered
If you write “make food” on your order slip, the chef is confused! But if you write “make a cheese pizza with mushrooms”—now the chef knows exactly what to do.
Prompts work the same way. The clearer your instructions, the better your AI responds.
📋 Prompt Templates Basics
What’s a Template?
A prompt template is like a fill-in-the-blank form.
Instead of writing a new prompt every time, you create a template with blanks (called variables) that you fill in later.
Without a template (writing everything fresh each time):
Tell me about dogs.
Tell me about cats.
Tell me about birds.
With a template (smart and reusable):
Tell me about {animal}.
Now you just plug in “dogs”, “cats”, or “birds”!
Why Templates Rock 🎸
- Save time – Write once, use many times
- Stay consistent – Same format every time
- Easy to change – Update one template, fix everything
Simple Example in LangChain
from langchain_core.prompts import PromptTemplate

# Create a template with a {topic} variable
template = PromptTemplate.from_template(
    "Explain {topic} to a 5-year-old"
)
# Fill in the blank
prompt = template.format(topic="gravity")
# Result: "Explain gravity to a 5-year-old"
The {topic} is a placeholder. When you call .format(), it gets replaced with your actual word!
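In recent LangChain versions, templates are also Runnables, so besides .format() you can call .invoke() with a dictionary. Here's a tiny sketch using the same template; .invoke() returns a PromptValue that can turn into either a string or a list of messages:
# .invoke() takes a dict and returns a PromptValue
prompt_value = template.invoke({"topic": "gravity"})

# A PromptValue can become plain text...
print(prompt_value.to_string())
# "Explain gravity to a 5-year-old"

# ...or a list of chat messages
print(prompt_value.to_messages())
# [HumanMessage(content="Explain gravity to a 5-year-old")]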
💬 ChatPromptTemplate
Chatting is Different!
Regular prompts are like leaving a note. But chat prompts are like having a conversation with roles:
- System: The invisible director (sets the rules)
- Human: That’s you (asking questions)
- AI: The assistant (responding)
graph TD
    A[System Message] --> B[Sets AI personality]
    C[Human Message] --> D[Your question]
    E[AI Message] --> F[AI's response]
    B --> G[Complete Conversation]
    D --> G
    F --> G
Building a Chat Template
from langchain_core.prompts import ChatPromptTemplate

# Create a chat prompt with roles
chat_template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful {role}"),
    ("human", "Help me with {task}")
])

# Fill in the variables
messages = chat_template.format_messages(
    role="math tutor",
    task="fractions"
)
What you get:
- System: “You are a helpful math tutor”
- Human: “Help me with fractions”
The AI now knows it should act like a math tutor!
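To get a real answer back, pipe the template into a chat model. A minimal sketch, assuming the langchain-openai package and an OpenAI API key; any chat model class slots in the same way, and the model name here is just a placeholder:
from langchain_openai import ChatOpenAI  # assumed provider; swap in your own chat model

model = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name

# Template | model makes a tiny chain
chain = chat_template | model

# The chain formats the messages, then sends them to the model
response = chain.invoke({"role": "math tutor", "task": "fractions"})
print(response.content)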
🎭 Chat Message Types
Think of message types like characters in a play. Each has a role:
| Message Type | Who’s Talking | Purpose |
|---|---|---|
| SystemMessage | Director (invisible) | Set rules & personality |
| HumanMessage | You | Ask questions |
| AIMessage | The AI | Give responses |
Code Example
from langchain_core.messages import (
    SystemMessage,
    HumanMessage,
    AIMessage
)

messages = [
    SystemMessage(content="You are a pirate"),
    HumanMessage(content="Hello!"),
    AIMessage(content="Ahoy, matey!")
]
- System: Tells AI “pretend you’re a pirate”
- Human: You say “Hello!”
- AI: Responds like a pirate: “Ahoy, matey!”
When to Use Each
- 🎬 System: At the start, set the stage
- 🙋 Human: Every time YOU speak
- 🤖 AI: When showing past AI responses
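Those message objects are exactly what chat models accept as input. Here's a short sketch that sends the pirate conversation above to a model and keeps the list growing; the model setup is an assumption, so use whichever chat model you have:
from langchain_openai import ChatOpenAI  # assumed provider

model = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name

# Send the whole conversation so far
reply = model.invoke(messages)  # returns an AIMessage
print(reply.content)

# Keep chatting: append the reply, then your next question
messages.append(reply)
messages.append(HumanMessage(content="Where's your treasure?"))
reply = model.invoke(messages)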
🔲 Message Placeholders
The Magic of Placeholders
What if you want to insert a whole conversation into your template? That’s where MessagesPlaceholder shines!
It’s like saving a spot for a list of messages—not just one word.
graph TD
    A[Template with Placeholder] --> B["#40;system#41; You're helpful"]
    B --> C["📦 MessagesPlaceholder"]
    C --> D["#40;human#41; New question"]
    E[Chat History] --> C
Why Use This?
Your AI needs memory! When chatting, you want it to remember what you talked about before.
Code Example
from langchain_core.prompts import (
    ChatPromptTemplate,
    MessagesPlaceholder
)
from langchain_core.messages import HumanMessage, AIMessage

template = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant"),
    MessagesPlaceholder(variable_name="history"),
    ("human", "{question}")
])

# Later, inject conversation history
messages = template.format_messages(
    history=[
        HumanMessage(content="Hi!"),
        AIMessage(content="Hello!")
    ],
    question="What's 2+2?"
)
Now your prompt includes:
- System message
- All the past conversation (history)
- Your new question
The AI remembers everything!
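In a real chat loop, you keep appending to that history list after every turn. Here's a rough sketch built on the template above; the model choice is an assumption:
from langchain_core.messages import HumanMessage, AIMessage
from langchain_openai import ChatOpenAI  # assumed provider

model = ChatOpenAI(model="gpt-4o-mini")  # placeholder model name
history = []  # grows as the conversation goes on

def ask(question):
    # Build the full prompt: system rules + past turns + new question
    messages = template.format_messages(history=history, question=question)
    reply = model.invoke(messages)
    # Remember both sides of this turn for next time
    history.append(HumanMessage(content=question))
    history.append(AIMessage(content=reply.content))
    return reply.content

print(ask("Hi, my name is Sam!"))
print(ask("What's my name?"))  # the saved history lets the AI answer "Sam"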
🎓 Few-Shot Prompting
Teaching by Example
Imagine teaching a child to draw a cat. You don’t explain every line—you show them examples!
Few-shot prompting works the same way. You give the AI examples of what you want, then ask it to follow the pattern.
Example 1:
Word: happy → Opposite: sad
Example 2:
Word: big → Opposite: small
Now you try:
Word: fast → Opposite: ???
The AI sees the pattern and answers: “slow”!
Few-Shot in LangChain
from langchain_core.prompts import (
    FewShotPromptTemplate,
    PromptTemplate
)

# Define your examples
examples = [
    {"word": "happy", "opposite": "sad"},
    {"word": "big", "opposite": "small"},
]

# Template for each example
example_template = PromptTemplate.from_template(
    "Word: {word} → Opposite: {opposite}"
)

# Build the few-shot prompt
few_shot = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_template,
    prefix="Find the opposite word:",
    suffix="Word: {input} → Opposite:",
    input_variables=["input"]
)

# Use it!
prompt = few_shot.format(input="fast")
Output:
Find the opposite word:
Word: happy → Opposite: sad
Word: big → Opposite: small
Word: fast → Opposite:
The AI learns from your examples and completes the pattern!
Pro Tips for Few-Shot
| Tip | Why It Helps |
|---|---|
| Use 2-5 examples | Enough to show pattern, not too many |
| Vary your examples | Cover different cases |
| Order matters | Put similar examples together |
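Few-shot works for chat prompts too. LangChain ships a FewShotChatMessagePromptTemplate that turns each example into a human/AI message pair; here's a short sketch reusing the opposite-word examples (the system wording is just an illustration):
from langchain_core.prompts import (
    ChatPromptTemplate,
    FewShotChatMessagePromptTemplate
)

examples = [
    {"word": "happy", "opposite": "sad"},
    {"word": "big", "opposite": "small"},
]

# How each example becomes a mini conversation
example_prompt = ChatPromptTemplate.from_messages([
    ("human", "{word}"),
    ("ai", "{opposite}")
])

few_shot = FewShotChatMessagePromptTemplate(
    example_prompt=example_prompt,
    examples=examples
)

# Drop the examples into a full chat prompt
final_prompt = ChatPromptTemplate.from_messages([
    ("system", "Give the opposite of each word."),
    few_shot,
    ("human", "{input}")
])

messages = final_prompt.format_messages(input="fast")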
🧩 Prompt Composition
Building Bigger from Smaller
What if your prompt is getting long and messy? Break it into pieces!
Prompt composition is like building with LEGO blocks:
- Build small pieces
- Snap them together
- Create something amazing
graph TD
    A[Piece 1: System Setup] --> D[Final Prompt]
    B[Piece 2: Examples] --> D
    C[Piece 3: User Input] --> D
The “+” Magic
In LangChain, you can combine prompts with +:
from langchain_core.prompts import ChatPromptTemplate

# Piece 1: System setup
system = ChatPromptTemplate.from_messages([
    ("system", "You are a {role}")
])

# Piece 2: The question
question = ChatPromptTemplate.from_messages([
    ("human", "{question}")
])

# Combine them!
full_prompt = system + question

# Use the combined prompt
messages = full_prompt.format_messages(
    role="chef",
    question="How do I boil water?"
)
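A nice side effect: the combined template keeps track of every blank it still needs, so you can check before filling anything in:
# The combined prompt tracks every variable that still needs a value
print(full_prompt.input_variables)
# contains both 'role' and 'question' (order may vary)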
PipelinePromptTemplate
For even more control, use PipelinePromptTemplate:
from langchain_core.prompts import (
    PipelinePromptTemplate,
    PromptTemplate
)

# Create reusable pieces
intro = PromptTemplate.from_template(
    "You are an expert in {topic}."
)
main = PromptTemplate.from_template(
    "{intro}\nQuestion: {question}"
)

# Combine them in a pipeline
pipeline = PipelinePromptTemplate(
    final_prompt=main,
    pipeline_prompts=[("intro", intro)]
)

result = pipeline.format(
    topic="cooking",
    question="Best way to chop onions?"
)
Output:
You are an expert in cooking.
Question: Best way to chop onions?
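One more composition trick worth knowing: partial formatting. You fill in some variables now and leave the rest for later, which often covers the same ground with less machinery. A minimal sketch:
from langchain_core.prompts import PromptTemplate

base = PromptTemplate.from_template(
    "You are an expert in {topic}.\nQuestion: {question}"
)

# Lock in the topic now; leave the question open
cooking_prompt = base.partial(topic="cooking")

print(cooking_prompt.format(question="Best way to chop onions?"))
# You are an expert in cooking.
# Question: Best way to chop onions?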
🎯 Quick Recap
| Concept | What It Does | Think of It Like |
|---|---|---|
| Prompt Template | Reusable text with blanks | Fill-in-the-blank form |
| ChatPromptTemplate | Template for conversations | Script with roles |
| Message Types | System/Human/AI messages | Characters in a play |
| MessagesPlaceholder | Insert conversation history | Memory holder |
| Few-Shot Prompting | Teach by examples | “Do it like this” |
| Prompt Composition | Combine smaller prompts | LEGO building |
🚀 You Did It!
You now understand how to:
- ✅ Create reusable prompt templates
- ✅ Build chat prompts with roles
- ✅ Use different message types
- ✅ Add memory with placeholders
- ✅ Teach AI with examples
- ✅ Compose complex prompts from simple pieces
Remember: The clearer your prompt, the smarter your AI seems. You’re not just coding—you’re teaching a robot how to think!
Now go build something amazing! 🎉