TensorFlow Advanced Tensors: The LEGO Master Builder’s Guide
Imagine you have the world’s biggest box of LEGO bricks. Today, we’ll learn how to join them together, split them apart, and use special magic bricks that can do amazing things!
The Big Picture: What Are We Learning?
Think of tensors as LEGO structures. Sometimes you want to:
- Stick two structures together (Joining)
- Break one big structure into smaller pieces (Splitting)
- Build with weird-shaped bricks (Ragged Tensors)
- Save space by only keeping important bricks (Sparse Tensors)
Let’s become master builders!
1. Tensor Joining and Splitting
Joining Tensors: Sticking LEGO Towers Together
Imagine you built two LEGO towers. Now you want to combine them into one amazing creation!
tf.concat: Stacking Towers End-to-End
Like putting two trains together to make one long train:
```python
import tensorflow as tf

# Two small towers
tower1 = tf.constant([[1, 2],
                      [3, 4]])
tower2 = tf.constant([[5, 6],
                      [7, 8]])

# Stack vertically (axis=0)
tall_tower = tf.concat([tower1, tower2], axis=0)
# Result: [[1, 2], [3, 4], [5, 6], [7, 8]]

# Stack horizontally (axis=1)
wide_tower = tf.concat([tower1, tower2], axis=1)
# Result: [[1, 2, 5, 6], [3, 4, 7, 8]]
```
Simple Rule:
- `axis=0` = Stack on TOP of each other (taller)
- `axis=1` = Stack NEXT TO each other (wider)
tf.stack: Creating a New Dimension
This is different! Instead of combining, you’re putting towers into separate display cases:
```python
# Two 1-D towers this time
tower1 = tf.constant([1, 2, 3])
tower2 = tf.constant([4, 5, 6])

# Put each in its own layer along a brand-new axis
stacked = tf.stack([tower1, tower2])
# Shape: (2, 3) - 2 towers, 3 bricks each
# Result: [[1, 2, 3], [4, 5, 6]]
```
concat = Merge into one | stack = Keep separate but organized
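The quickest way to feel the difference is to compare the resulting shapes. A minimal sketch (the `a` and `b` tensors here are just illustrative):

```python
import tensorflow as tf

a = tf.constant([[1, 2], [3, 4]])  # shape (2, 2)
b = tf.constant([[5, 6], [7, 8]])  # shape (2, 2)

# concat merges along an EXISTING axis - no new dimension appears
merged = tf.concat([a, b], axis=0)
print(merged.shape)   # (4, 2)

# stack adds a NEW axis - the inputs stay separate along it
grouped = tf.stack([a, b], axis=0)
print(grouped.shape)  # (2, 2, 2)
```

Same two inputs, but `concat` gives a rank-2 result while `stack` bumps the rank up to 3.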
Splitting Tensors: Breaking Apart Your Creation
Sometimes you need to take your big LEGO castle and divide it into smaller pieces for your friends.
tf.split: Equal Pieces
Like cutting a cake into equal slices:
```python
# One big tensor
big_tower = tf.constant([1, 2, 3, 4, 5, 6])

# Split into 3 equal parts
pieces = tf.split(big_tower, 3)
# pieces[0] = [1, 2]
# pieces[1] = [3, 4]
# pieces[2] = [5, 6]
```
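The slices don't have to be equal: pass a list of sizes instead of a count and `tf.split` cuts unequal pieces. A quick sketch (the sizes below are arbitrary, but they must sum to the tensor's length):

```python
import tensorflow as tf

big_tower = tf.constant([1, 2, 3, 4, 5, 6])

# A list of sizes gives unequal pieces (1 + 2 + 3 = 6)
pieces = tf.split(big_tower, [1, 2, 3])
# pieces[0] = [1]
# pieces[1] = [2, 3]
# pieces[2] = [4, 5, 6]
```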
tf.unstack: Remove the Display Cases
The opposite of stack - take items out of their cases:
```python
stacked = tf.constant([[1, 2, 3],
                       [4, 5, 6]])

# Unpack into separate tensors
tower1, tower2 = tf.unstack(stacked)
# tower1 = [1, 2, 3]
# tower2 = [4, 5, 6]
```
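`tf.unstack` can also peel off columns instead of rows if you pass `axis=1`; a small sketch:

```python
import tensorflow as tf

stacked = tf.constant([[1, 2, 3],
                       [4, 5, 6]])

# Unpack along axis 1: one tensor per COLUMN
col1, col2, col3 = tf.unstack(stacked, axis=1)
# col1 = [1, 4]
# col2 = [2, 5]
# col3 = [3, 6]
```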
2. Advanced Tensor Operations
Now let’s learn some superpowers for manipulating our LEGO creations!
Reshaping: Same Bricks, New Shape
Imagine rearranging 6 bricks from a 2x3 rectangle into a 3x2 rectangle:
```python
# Original: 2 rows, 3 columns
original = tf.constant([[1, 2, 3],
                        [4, 5, 6]])

# Reshape to 3 rows, 2 columns
new_shape = tf.reshape(original, [3, 2])
# Result: [[1, 2], [3, 4], [5, 6]]

# Flatten to 1D (one long line)
flat = tf.reshape(original, [-1])
# Result: [1, 2, 3, 4, 5, 6]
```
Pro Tip: Use `-1` to let TensorFlow figure out that dimension!
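To see the pro tip in action, pin down every dimension except one and let TensorFlow infer the rest. A minimal sketch:

```python
import tensorflow as tf

original = tf.constant([[1, 2, 3],
                        [4, 5, 6]])  # 6 elements total

# Fix the last dimension at 2; TensorFlow infers -1 must be 3
auto = tf.reshape(original, [-1, 2])
print(auto.shape)  # (3, 2)
```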
Transposing: Rotating Your Creation
Like rotating a photo from portrait to landscape:
```python
matrix = tf.constant([[1, 2, 3],
                      [4, 5, 6]])
# Shape: (2, 3)

rotated = tf.transpose(matrix)
# Shape: (3, 2)
# Result: [[1, 4], [2, 5], [3, 6]]
```
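For tensors with more than two dimensions, `tf.transpose` takes a `perm` argument listing the new axis order. A small sketch (the "batch of images" framing is just an illustration):

```python
import tensorflow as tf

# A batch of 2 images, each 3 pixels tall and 4 wide: shape (2, 3, 4)
batch = tf.zeros([2, 3, 4])

# Swap height and width, keep the batch axis first
swapped = tf.transpose(batch, perm=[0, 2, 1])
print(swapped.shape)  # (2, 4, 3)
```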
Squeezing and Expanding: Dimension Magic
tf.squeeze: Remove Empty Spaces
```python
# Has two unnecessary size-1 dimensions
bulky = tf.constant([[[1, 2, 3]]])
# Shape: (1, 1, 3)

slim = tf.squeeze(bulky)
# Shape: (3,)
# Result: [1, 2, 3]
```
tf.expand_dims: Add a Dimension
```python
simple = tf.constant([1, 2, 3])
# Shape: (3,)

expanded = tf.expand_dims(simple, axis=0)
# Shape: (1, 3)
# Result: [[1, 2, 3]]
```
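A common real use: models usually expect a batch dimension, so a single example gets one added before prediction. A quick sketch (negative axes count from the end):

```python
import tensorflow as tf

sample = tf.constant([1, 2, 3])            # one example, shape (3,)

batched = tf.expand_dims(sample, axis=0)   # shape (1, 3): a batch of one
channel = tf.expand_dims(sample, axis=-1)  # shape (3, 1): trailing axis added
```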
Tiling: Copy and Repeat
Like using a stamp to create a pattern:
```python
pattern = tf.constant([[1, 2],
                       [3, 4]])

# Repeat 2x vertically, 3x horizontally
tiled = tf.tile(pattern, [2, 3])
# Creates a 4x6 result:
# [[1, 2, 1, 2, 1, 2],
#  [3, 4, 3, 4, 3, 4],
#  [1, 2, 1, 2, 1, 2],
#  [3, 4, 3, 4, 3, 4]]
```
3. Ragged Tensors: The Flexible LEGO Box
The Problem with Regular Tensors
What if your friends built LEGO towers of different heights?
Friend 1: [🔵🔵🔵] (3 bricks)
Friend 2: [🔴🔴🔴🔴🔴] (5 bricks)
Friend 3: [🟢🟢] (2 bricks)
Regular tensors need all rows to be the same size. That’s annoying!
Enter Ragged Tensors: No Padding Needed!
```python
# Each row can have a different length!
ragged = tf.ragged.constant([
    [1, 2, 3],        # 3 items
    [4, 5, 6, 7, 8],  # 5 items
    [9, 10],          # 2 items
])

print(ragged.shape)
# (3, None) - 3 rows, variable columns
```
Working with Ragged Tensors
```python
# Access rows
first_row = ragged[0]  # [1, 2, 3]

# Get row lengths
lengths = ragged.row_lengths()
# Result: [3, 5, 2]

# Flatten to one regular 1-D tensor
flat = ragged.flat_values
# Result: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```
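When a layer insists on a rectangular input, a ragged tensor can be padded into a regular one with `to_tensor`. A quick sketch:

```python
import tensorflow as tf

ragged = tf.ragged.constant([[1, 2, 3],
                             [4, 5, 6, 7, 8],
                             [9, 10]])

# Pad short rows with a default value to make a rectangle
padded = ragged.to_tensor(default_value=0)
# [[1, 2, 3, 0, 0],
#  [4, 5, 6, 7, 8],
#  [9, 10, 0, 0, 0]]
```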
When to Use Ragged Tensors?
```mermaid
graph TD
    A[Do your sequences have different lengths?] -->|Yes| B[Use Ragged Tensors!]
    A -->|No| C[Regular Tensors are fine]
    B --> D[Examples: Sentences, Time Series, User Actions]
```
Real Examples:
- Sentences: “Hi” vs “How are you doing today?”
- Reviews: Short 2-word vs long 500-word reviews
- User clicks: Some users click 3 times, others 100 times
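The sentence example maps directly onto a ragged tensor: a batch of tokenized sentences of different lengths needs no padding. A small sketch (the words here are just illustrative):

```python
import tensorflow as tf

sentences = tf.ragged.constant([
    ["Hi"],
    ["How", "are", "you", "doing", "today?"],
])

print(sentences.row_lengths().numpy())  # [1 5]
```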
4. Sparse Tensors: The Smart Storage System
The Problem: Mostly Empty Data
Imagine a giant warehouse with 1 million shelves, but only 100 have items. Why track all the empty ones?
Regular storage (wasteful):

```
[0, 0, 5, 0, 0, 0, 0, 3, 0, 0, 0, 0, 0, 7, 0, ...]
```

Smart storage (sparse):

```
Position 2  → 5
Position 7  → 3
Position 13 → 7
```
Creating Sparse Tensors
```python
# Only store what matters!
sparse = tf.SparseTensor(
    indices=[[0, 2], [1, 3], [2, 0]],
    values=[5, 3, 7],
    dense_shape=[3, 4],
)
# This represents:
# [[0, 0, 5, 0],
#  [0, 0, 0, 3],
#  [7, 0, 0, 0]]
```
The Three Parts:
- indices = WHERE the values are (row, column)
- values = WHAT the values are
- dense_shape = Total SIZE of the grid
Converting Sparse to Dense (and back!)
```python
# Sparse to regular tensor
dense = tf.sparse.to_dense(sparse)

# Regular to sparse (keeps only the non-zero values)
dense_tensor = tf.constant([
    [0, 0, 5],
    [3, 0, 0],
])
sparse_again = tf.sparse.from_dense(dense_tensor)
```
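Sparse tensors also support math directly, without densifying first. A sketch using `tf.sparse.sparse_dense_matmul` (the numbers are arbitrary):

```python
import tensorflow as tf

# Represents [[0, 0, 5], [3, 0, 0]]
sparse = tf.SparseTensor(indices=[[0, 2], [1, 0]],
                         values=[5.0, 3.0],
                         dense_shape=[2, 3])

dense = tf.ones([3, 1])

# Multiply without ever building the full dense matrix
product = tf.sparse.sparse_dense_matmul(sparse, dense)
# [[5.0], [3.0]]
```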
Why Sparse Tensors Matter
```mermaid
graph TD
    A[Huge Matrix with Mostly Zeros] --> B{Use Sparse Tensor}
    B --> C[Save Memory: 100x less!]
    B --> D[Faster Math Operations]
    B --> E[Real Examples:]
    E --> F[User-Movie Ratings]
    E --> G[Word Embeddings]
    E --> H[Social Networks]
```
Memory Example:
- Regular 10,000 x 10,000 float32 matrix = 400 MB
- Same matrix stored sparsely with only 1,000 non-zeros ≈ 0.02 MB
- That's 20,000x smaller!
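The arithmetic behind those numbers is easy to check. A sketch assuming 4-byte float32 values and 8-byte int64 indices (the exact overhead varies by implementation):

```python
# Dense: every cell stores a 4-byte float32
dense_bytes = 10_000 * 10_000 * 4   # 400,000,000 B = 400 MB

# Sparse: per non-zero, a (row, col) pair of int64s plus one float32 value
nnz = 1_000
sparse_bytes = nnz * (2 * 8 + 4)    # 20,000 B = 0.02 MB

print(dense_bytes // sparse_bytes)  # 20000
```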
Quick Summary: Your New Superpowers
| Superpower | What It Does | When to Use |
|---|---|---|
| `tf.concat` | Join tensors along an axis | Combining batches |
| `tf.stack` | Create a new dimension | Grouping items |
| `tf.split` | Divide into equal parts | Distributing work |
| `tf.reshape` | Change shape, keep data | Model compatibility |
| `tf.transpose` | Swap dimensions | Matrix operations |
| Ragged | Variable-length rows | Text, sequences |
| Sparse | Store only non-zeros | Large, mostly empty matrices |
You’re Now a Tensor Master!
You’ve learned how to:
- Join tensors like connecting train cars
- Split tensors like sharing pizza slices
- Reshape data like molding clay
- Handle messy data with Ragged Tensors
- Save memory with Sparse Tensors
These aren’t just fancy tricks—they’re the foundation of real machine learning systems. Every recommendation engine, every chatbot, every image classifier uses these operations thousands of times per second.
You’re ready to build amazing things!