# 📦 NoSQL Batch Operations: The Pizza Delivery Story

Imagine you're running a pizza delivery service. Instead of sending one driver for each pizza, you load up a van with 50 pizzas and deliver them all at once. That's what batch operations do for your database!
## 🌍 The Big Picture

When you need to work with lots of data in NoSQL databases, doing things one-by-one is slow, like delivering pizzas one at a time on a bicycle.
Batch operations let you:
- Send many requests in one trip (Bulk Operations)
- Read large results piece by piece (Cursor Operations)
- Show data in manageable chunks (Pagination)
## 🚚 Part 1: Bulk Operations

### What Are Bulk Operations?
Think of bulk operations like a moving truck vs a backpack.
Without bulk operations:

```text
Save pizza order 1 → Wait → Done
Save pizza order 2 → Wait → Done
Save pizza order 3 → Wait → Done
... (100 more times)
```

With bulk operations:

```text
Load all 100 orders into the truck
Send the truck once → All done!
```
### Real MongoDB Example

```javascript
// 🐌 Slow way (one network round trip per document)
await db.orders.insertOne({ pizza: "Pepperoni" })
await db.orders.insertOne({ pizza: "Margherita" })
await db.orders.insertOne({ pizza: "Hawaiian" })

// 🚀 Fast way (one round trip for all three)
await db.orders.insertMany([
  { pizza: "Pepperoni" },
  { pizza: "Margherita" },
  { pizza: "Hawaiian" }
])
```
### Types of Bulk Operations

```mermaid
graph TD
  A["📦 Bulk Operations"] --> B["Insert Many"]
  A --> C["Update Many"]
  A --> D["Delete Many"]
  A --> E["Mixed Batch"]
  B --> F["Add 1000 users at once"]
  C --> G["Mark all orders as shipped"]
  D --> H["Remove old records"]
  E --> I["Insert + Update + Delete together"]
```
### Mixed Bulk Operations

Sometimes you need to do different things in one batch:

```javascript
// MongoDB bulkWrite example: three operation types, one network trip
await db.orders.bulkWrite([
  // Insert a new order
  { insertOne: {
      document: { pizza: "BBQ Chicken" }
  }},
  // Update an existing order
  { updateOne: {
      filter: { _id: "order123" },
      update: { $set: { status: "delivered" } }
  }},
  // Delete cancelled orders
  { deleteMany: {
      filter: { status: "cancelled" }
  }}
])
```
💡 **Why it matters:** One network trip instead of three. Faster. Cheaper. Better!
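You can sketch the savings without a driver at all. The `toBatches` helper below is made up for illustration (it is not a MongoDB API); it chunks a list of operations the way a driver groups a large `bulkWrite` payload into a handful of network trips:

```javascript
// Hypothetical helper: chunk a list of operations into batches,
// so 100 operations become 2 payloads instead of 100 round trips.
function toBatches(ops, batchSize) {
  const batches = []
  for (let i = 0; i < ops.length; i += batchSize) {
    batches.push(ops.slice(i, i + batchSize))
  }
  return batches
}

const orders = Array.from({ length: 100 }, (_, i) => ({ pizza: `Order ${i + 1}` }))
console.log(toBatches(orders, 50).length) // 2 trips instead of 100
```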
## 🎯 Part 2: Cursor Operations

### What is a Cursor?

Imagine you have a library with 10,000 books. You can't carry them all at once!
A cursor is like a bookmark that helps you:
- Read books a few at a time
- Remember where you stopped
- Continue from that spot later
```mermaid
graph TD
  A["📚 10,000 Documents"] --> B["Cursor Created"]
  B --> C["Fetch First 100"]
  C --> D["Process Them"]
  D --> E["Fetch Next 100"]
  E --> F["Process Them"]
  F --> G["... Continue Until Done"]
```
### Simple Cursor Example

```javascript
// Create a cursor (doesn't fetch data yet!)
const cursor = db.users.find({ active: true })

// Process documents one at a time
while (await cursor.hasNext()) {
  const user = await cursor.next()
  console.log(user.name)
}
```
### Cursor Methods You'll Use

| Method | What It Does | Pizza Example |
|---|---|---|
| `next()` | Get one document | Take one pizza from the box |
| `hasNext()` | Check if more exist | Is the box empty? |
| `toArray()` | Get all at once | Dump all pizzas on the table |
| `forEach()` | Process each one | Eat each pizza one by one |
| `close()` | Stop and clean up | Close the pizza box |
### Batch Size Control

```javascript
// Fetch 50 documents per network round trip
const cursor = db.orders
  .find({})
  .batchSize(50)

// Now each network call gets 50 docs
```

🧠 **Think of it like:** Taking 50 steps at a time instead of 1. You get there faster!
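The bookmark idea is easy to sketch in plain JavaScript. `ToyCursor` below is a made-up class, not the driver's cursor, but it follows the same `hasNext()`/`next()` contract over an in-memory array:

```javascript
// Toy in-memory cursor: tracks a position ("bookmark") in an array,
// mimicking the hasNext()/next() pattern of a real database cursor.
class ToyCursor {
  constructor(docs) {
    this.docs = docs
    this.pos = 0
  }
  hasNext() { return this.pos < this.docs.length }
  next() { return this.docs[this.pos++] }
}

const cursor = new ToyCursor(["Pepperoni", "Margherita", "Hawaiian"])
const eaten = []
while (cursor.hasNext()) eaten.push(cursor.next())
console.log(eaten.length) // 3
```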
## 📄 Part 3: Pagination

### What is Pagination?
Open your phone. Go to any shopping app. See how products appear on pages?
- Page 1: Products 1-20
- Page 2: Products 21-40
- Page 3: Products 41-60
That's pagination! Breaking big lists into small, easy pages.

```mermaid
graph TD
  A["1000 Products"] --> B["Page 1: 1-20"]
  A --> C["Page 2: 21-40"]
  A --> D["Page 3: 41-60"]
  A --> E["... Page 50: 981-1000"]
```
### Two Ways to Paginate

#### Method 1: Skip and Limit (Easy but Slow)

```javascript
// Page 1
db.products.find().skip(0).limit(20)

// Page 2
db.products.find().skip(20).limit(20)

// Page 3
db.products.find().skip(40).limit(20)
```
**Problem:** `skip()` gets slower as the page number grows, because the server still walks past every skipped document before returning results. Skipping 10,000 items takes time!
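A quick in-memory simulation makes that cost concrete. `skipLimit` is a hypothetical helper, not a driver call; it counts how many documents the server would have to touch to serve one page:

```javascript
// Simulate skip/limit: the server walks past every skipped document
// before it can return the page you asked for.
function skipLimit(docs, skip, limit) {
  let walked = 0
  const page = []
  for (const doc of docs) {
    walked++
    if (walked <= skip) continue
    page.push(doc)
    if (page.length === limit) break
  }
  return { page, walked }
}

const products = Array.from({ length: 10000 }, (_, i) => ({ _id: i + 1 }))
console.log(skipLimit(products, 9980, 20).walked) // 10000 docs touched for one 20-item page
```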
#### Method 2: Cursor-Based (Fast and Smart)

```javascript
// First page
db.products
  .find()
  .sort({ _id: 1 })
  .limit(20)

// Next page (lastSeenId is the last _id from the previous page)
db.products
  .find({ _id: { $gt: lastSeenId } })
  .sort({ _id: 1 })
  .limit(20)
```

**Why it's better:** The database jumps directly to the right spot!
### Comparison Table

| Approach | Speed | Best For |
|---|---|---|
| Skip/Limit | 🐌 Slow for big data | Small datasets, simple apps |
| Cursor-based | 🚀 Fast always | Large datasets, real apps |
### Complete Pagination Example

```javascript
async function getPage(lastId = null) {
  const pageSize = 20

  // Build the query: resume after the last _id we saw, or start fresh
  const query = lastId
    ? { _id: { $gt: lastId } }
    : {}

  // Fetch the page (+1 extra to check if more exist)
  const items = await db.products
    .find(query)
    .sort({ _id: 1 })
    .limit(pageSize + 1)
    .toArray()

  // Check if there's a next page
  const hasMore = items.length > pageSize
  if (hasMore) items.pop() // Remove the extra probe item

  return {
    items,
    hasMore,
    nextCursor: items[items.length - 1]?._id
  }
}
```
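To try the same pattern without a running database, here is an in-memory sketch (the `products` array and the `getPageInMemory` name are invented for illustration; the paging logic mirrors `getPage`):

```javascript
// In-memory cursor-based pagination: same +1 probe trick as getPage(),
// but over a plain sorted array instead of a MongoDB collection.
const products = Array.from({ length: 45 }, (_, i) => ({ _id: i + 1 }))

function getPageInMemory(lastId = null, pageSize = 20) {
  // "Jump" past everything we've seen, like { _id: { $gt: lastId } }
  const remaining = lastId === null
    ? products
    : products.filter(p => p._id > lastId)

  const items = remaining.slice(0, pageSize + 1) // +1 probe for hasMore
  const hasMore = items.length > pageSize
  if (hasMore) items.pop()

  return { items, hasMore, nextCursor: items[items.length - 1]?._id }
}

// Walk every page until the cursor runs out
let cursor = null
let pages = 0
while (true) {
  const page = getPageInMemory(cursor)
  pages++
  cursor = page.nextCursor
  if (!page.hasMore) break
}
console.log(pages) // 3 pages: 20 + 20 + 5 products
```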
## 📝 Quick Recap
| Concept | What It Does | Everyday Example |
|---|---|---|
| Bulk Ops | Many actions in one request | Moving truck vs backpack |
| Cursors | Read large data piece by piece | Bookmark in a library |
| Pagination | Show data in pages | Shopping app product pages |
## 🚀 Why This Matters
- Speed: Less waiting, faster apps
- Memory: Your app doesn't explode with big data
- User Experience: People see results quickly
- Cost: Fewer database calls = lower bills
## 🏆 Pro Tips

🔥 **Bulk Tip:** Always use `bulkWrite()` when doing 10+ operations

🔥 **Cursor Tip:** Set `batchSize()` based on your document size

🔥 **Pagination Tip:** Use cursor-based pagination for infinite scroll
Remember: Batch operations are like smart delivery routes. Do more with fewer trips! 🍕
