
How to Think Like an Interviewer: The 14-Pattern Recognition Framework

20 min read · Updated Mar 7, 2026 · Career & Skills

The Difference Between Grinding and Mastering

Here is a question that separates candidates who get offers from candidates who don't:

When you see a new problem you've never encountered before, what do you do in the first 60 seconds?

The grinding candidate scans their memory for a similar problem they've solved. If they find one, great. If not, they're stuck — cycling through half-remembered solutions hoping something clicks.

The pattern candidate reads the problem and asks a different question: "What class of problem is this?" Within 30 seconds they've identified the pattern, and they have a template to start from.

That's the difference between preparing by volume and preparing by understanding. You can solve 500 random LeetCode problems and still freeze on problem 501. Or you can master 14 patterns and walk into any interview with a systematic framework for approaching problems you've genuinely never seen before.

This post is that framework.


Why Patterns Work

This isn't a soft claim. Almost every question in FAANG onsite rounds maps to a well-known pattern. Amazon often asks scheduling problems that require Merge Intervals. Meta loves graph problems that fit DFS/BFS. Recognizing the pattern is usually what separates a good answer from a great one.

Focus on mastering these 14 core patterns thoroughly rather than superficially memorizing more. Deep understanding of the essential patterns is typically sufficient for most technical interviews at top companies. Quality trumps quantity — it's better to truly master 14 patterns than to have shallow knowledge of 50.

The 14 patterns below cover the overwhelming majority of interview problems at every major tech company. For each one, I'll give you:

  • What it is — the core idea in plain English

  • The trigger signals — what words, constraints, or problem shapes tell you this pattern applies

  • The template — the code skeleton you start from every time

  • Canonical problems — 2–3 LeetCode problems that are the clearest examples

Let's build the framework.


The 14 Patterns


Pattern 01 — Sliding Window

What It Is

A window (subarray or substring) that slides across the data, expanding or shrinking based on a condition. Instead of recomputing from scratch each step, you reuse the previous window's result — turning O(n²) into O(n).

Trigger Signals

  • "Longest/shortest subarray/substring that satisfies X"

  • "Maximum/minimum sum of k consecutive elements"

  • "Find a subarray containing all characters of Y"

  • The word contiguous in the problem statement

The Template

def sliding_window(arr, k):
    left = 0
    window_sum = 0
    result = 0

    for right in range(len(arr)):
        window_sum += arr[right]          # expand window

        while window_sum > k:             # shrink condition
            window_sum -= arr[left]
            left += 1

        result = max(result, right - left + 1)  # update result

    return result

Fixed window: move both pointers together, window size stays constant. Dynamic window: right pointer always moves forward; left pointer moves when the condition is violated.
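The fixed-window variant can be sketched like this (a minimal sketch, function name mine; assumes k ≤ len(arr)):

```python
def max_sum_fixed_window(arr, k):
    """Maximum sum of any k consecutive elements (fixed-size window)."""
    window_sum = sum(arr[:k])     # first window, computed once
    best = window_sum

    for right in range(k, len(arr)):
        window_sum += arr[right] - arr[right - k]  # slide: add new, drop old
        best = max(best, window_sum)

    return best
```

Note the payoff: each slide is O(1) instead of re-summing k elements, so the whole scan is O(n).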

Canonical Problems

  • Longest Substring Without Repeating Characters (LC 3)

  • Minimum Window Substring (LC 76)

  • Longest Repeating Character Replacement (LC 424)


Pattern 02 — Two Pointers

What It Is

Two pointers that move toward each other (or in the same direction) to avoid nested loops. When the array is sorted, two pointers from opposite ends can find pairs in O(n) instead of O(n²).

Trigger Signals

  • Sorted array + find a pair/triplet that sums to a target

  • "Remove duplicates in-place"

  • Comparing characters from both ends of a string (palindrome check)

  • "Reverse an array/string in-place"

The Template

# Opposite direction (sorted array, pair sum)
def two_sum_sorted(arr, target):
    left, right = 0, len(arr) - 1

    while left < right:
        current_sum = arr[left] + arr[right]
        if current_sum == target:
            return [left, right]
        elif current_sum < target:
            left += 1     # need larger sum → move left forward
        else:
            right -= 1    # need smaller sum → move right back

    return []

# Same direction (remove duplicates)
def remove_duplicates(arr):
    if not arr:
        return 0
    slow = 0
    for fast in range(1, len(arr)):
        if arr[fast] != arr[slow]:
            slow += 1
            arr[slow] = arr[fast]
    return slow + 1
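For the triplet trigger, here is a hedged 3Sum sketch built on the opposite-direction template: sort first, fix one element, then two-pointer the rest (names mine):

```python
def three_sum(nums, target=0):
    """All unique triplets summing to target: sort, fix one, two-pointer the rest."""
    nums = sorted(nums)
    triplets = []

    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:  # skip duplicate anchors
            continue
        left, right = i + 1, len(nums) - 1
        while left < right:
            s = nums[i] + nums[left] + nums[right]
            if s == target:
                triplets.append([nums[i], nums[left], nums[right]])
                left += 1
                while left < right and nums[left] == nums[left - 1]:
                    left += 1                 # skip duplicate second values
            elif s < target:
                left += 1
            else:
                right -= 1

    return triplets
```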

Canonical Problems

  • Two Sum II – Input Array Is Sorted (LC 167)

  • 3Sum (LC 15)

  • Valid Palindrome (LC 125)


Pattern 03 — Fast & Slow Pointers

What It Is

Two pointers moving at different speeds — one at 1x, one at 2x. In a cyclic structure, the fast pointer will eventually catch the slow pointer. This detects cycles and finds midpoints without extra space.

Trigger Signals

  • "Detect a cycle in a linked list"

  • "Find the middle of a linked list"

  • "Is this linked list a palindrome?"

  • "Find the start of the cycle"

  • The words linked list + cycle together

The Template

def has_cycle(head):
    slow = head
    fast = head

    while fast and fast.next:
        slow = slow.next          # moves 1 step
        fast = fast.next.next     # moves 2 steps

        if slow == fast:          # they met → cycle exists
            return True

    return False   # fast reached end → no cycle

def find_middle(head):
    slow = head
    fast = head

    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next

    return slow   # when fast reaches end, slow is at middle
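The "find the start of the cycle" signal uses a second phase of the same pattern: after the pointers meet, reset one to the head and advance both one step at a time. A sketch, with a minimal ListNode class added so it stands alone:

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def find_cycle_start(head):
    """Floyd phase 2: after meeting, head-to-entry distance equals
    meeting-point-to-entry distance, so single-stepping both finds the entry."""
    slow = fast = head
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
        if slow == fast:              # met inside the cycle
            slow = head
            while slow != fast:
                slow = slow.next
                fast = fast.next
            return slow               # cycle entry node
    return None                       # fast reached end: no cycle
```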

Canonical Problems

  • Linked List Cycle (LC 141)

  • Linked List Cycle II (LC 142)

  • Middle of the Linked List (LC 876)


Pattern 04 — Merge Intervals

What It Is

Sort intervals by start time, then merge any that overlap. The core insight: after sorting, two intervals overlap if and only if the start of the second is ≤ the end of the first.

Trigger Signals

  • "Merge overlapping intervals"

  • "Insert a new interval into a sorted list"

  • "Find gaps/free time between meetings"

  • Input contains pairs like [start, end]

  • Words: schedule, meetings, calendar, time ranges

The Template

def merge_intervals(intervals):
    if not intervals:
        return []
    intervals.sort(key=lambda x: x[0])   # sort by start time
    merged = [intervals[0]]

    for start, end in intervals[1:]:
        last_end = merged[-1][1]

        if start <= last_end:             # overlap → merge
            merged[-1][1] = max(last_end, end)
        else:                             # no overlap → add new
            merged.append([start, end])

    return merged
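The "insert a new interval" trigger is a close cousin that skips the sort entirely. A sketch (names mine), assuming the input list is already sorted and non-overlapping:

```python
def insert_interval(intervals, new):
    """Insert `new` into a sorted, non-overlapping list, merging overlaps."""
    result = []
    i = 0

    # 1. intervals that end before `new` starts
    while i < len(intervals) and intervals[i][1] < new[0]:
        result.append(intervals[i])
        i += 1

    # 2. absorb everything that overlaps `new`
    while i < len(intervals) and intervals[i][0] <= new[1]:
        new = [min(new[0], intervals[i][0]), max(new[1], intervals[i][1])]
        i += 1
    result.append(new)

    # 3. the rest are strictly after
    result.extend(intervals[i:])
    return result
```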

Canonical Problems

  • Merge Intervals (LC 56)

  • Insert Interval (LC 57)

  • Meeting Rooms II (LC 253)


Pattern 05 — Cyclic Sort

What It Is

When the input is an array of numbers in range [1, n], each number belongs at a specific index. Swap elements to their correct positions in O(n) — then a second pass finds misplaced elements (missing or duplicate numbers).

Trigger Signals

  • "Array contains numbers from 1 to n"

  • "Find missing/duplicate numbers in [1, n]"

  • "Find the smallest missing positive integer"

  • Constraints explicitly state the number range maps to array indices

The Template

def cyclic_sort(nums):
    i = 0
    while i < len(nums):
        correct_idx = nums[i] - 1         # where this number belongs
        if 0 < nums[i] <= len(nums) and nums[i] != nums[correct_idx]:
            nums[i], nums[correct_idx] = nums[correct_idx], nums[i]  # swap into place
        else:
            i += 1                        # correct, duplicate, or out of range → advance

def find_missing(nums):
    cyclic_sort(nums)
    for i, num in enumerate(nums):
        if num != i + 1:
            return i + 1                  # first index out of place
    return len(nums) + 1
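The same swap loop handles the duplicate trigger. A sketch for "find all duplicates" (name mine), assuming each number in [1, n] appears once or twice:

```python
def find_duplicates(nums):
    """Cyclic-sort in place; any value left at the wrong index is a duplicate."""
    i = 0
    while i < len(nums):
        correct_idx = nums[i] - 1
        if nums[i] != nums[correct_idx]:  # not home yet → swap it home
            nums[i], nums[correct_idx] = nums[correct_idx], nums[i]
        else:
            i += 1                        # home, or its home holds a duplicate

    return [num for i, num in enumerate(nums) if num != i + 1]
```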

Canonical Problems

  • Missing Number (LC 268)

  • Find All Duplicates in an Array (LC 442)

  • First Missing Positive (LC 41)


Pattern 06 — Tree BFS (Level Order Traversal)

What It Is

Traverse a tree level by level using a queue. Process all nodes at depth d before moving to depth d+1. This is the go-to pattern whenever the answer depends on levels, distances, or shortest paths in a tree.

Trigger Signals

  • "Level order traversal"

  • "Minimum depth of a tree"

  • "Connect nodes at the same level"

  • "Right side view of a tree"

  • "Shortest path" in an unweighted tree or graph

The Template

from collections import deque

def level_order(root):
    if not root:
        return []

    result = []
    queue = deque([root])

    while queue:
        level_size = len(queue)
        current_level = []

        for _ in range(level_size):       # process all nodes at this level
            node = queue.popleft()
            current_level.append(node.val)

            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)

        result.append(current_level)

    return result
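The "right side view" trigger is the same template with one line changed: keep only the last value seen on each level. A sketch, with a minimal TreeNode class so it runs standalone:

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def right_side_view(root):
    """The last node processed at each level is what you see from the right."""
    if not root:
        return []

    view = []
    queue = deque([root])
    while queue:
        for _ in range(len(queue)):       # process one full level
            node = queue.popleft()
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        view.append(node.val)             # last node popped on this level
    return view
```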

Canonical Problems

  • Binary Tree Level Order Traversal (LC 102)

  • Binary Tree Right Side View (LC 199)

  • Minimum Depth of Binary Tree (LC 111)


Pattern 07 — Tree DFS (Depth First Search)

What It Is

Traverse a tree by going as deep as possible before backtracking. Three orderings: pre-order (root → left → right), in-order (left → root → right), post-order (left → right → root). In-order on a BST gives sorted output.

Trigger Signals

  • "Path sum from root to leaf"

  • "Maximum depth/height of tree"

  • "Validate BST"

  • "All root-to-leaf paths"

  • "Lowest common ancestor"

  • Any problem where you need to explore all paths in a tree

The Template

def dfs_tree(node, path, result):
    if not node:
        return

    path.append(node.val)                 # pre-order: process before children

    if not node.left and not node.right:  # leaf node
        result.append(list(path))         # record complete path

    dfs_tree(node.left, path, result)
    dfs_tree(node.right, path, result)

    path.pop()                            # backtrack

def max_depth(root):
    if not root:
        return 0
    return 1 + max(max_depth(root.left), max_depth(root.right))
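The "validate BST" trigger is DFS carrying bounds down the tree instead of a path. A sketch (a minimal TreeNode is included for self-containment):

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def is_valid_bst(node, low=float('-inf'), high=float('inf')):
    """Every node must fall strictly inside the (low, high) window
    inherited from its ancestors."""
    if not node:
        return True
    if not (low < node.val < high):
        return False
    return (is_valid_bst(node.left, low, node.val) and   # left: tighten upper bound
            is_valid_bst(node.right, node.val, high))    # right: tighten lower bound
```

The common mistake this avoids: checking only parent vs child, which misses a deep node that violates a grandparent's bound.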

Canonical Problems

  • Path Sum (LC 112)

  • Maximum Depth of Binary Tree (LC 104)

  • Validate Binary Search Tree (LC 98)


Pattern 08 — Graph BFS / DFS

What It Is

Extends tree traversal to graphs, which can have cycles. The key addition: a visited set to avoid processing the same node twice. BFS for shortest paths; DFS for connectivity, cycle detection, and path exploration.

Trigger Signals

  • "Number of connected components"

  • "Shortest path between two nodes" (unweighted) → BFS

  • "Is there a path from A to B?"

  • Grid problems: "Number of islands," "flood fill," "rot oranges"

  • "Course schedule" (detect cycles in directed graph)

The Template

from collections import deque

# BFS — shortest path
def bfs(graph, start, target):
    visited = {start}
    queue = deque([(start, 0)])           # (node, distance)

    while queue:
        node, dist = queue.popleft()
        if node == target:
            return dist
        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1

# DFS — connectivity
def dfs(graph, node, visited):
    visited.add(node)
    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs(graph, neighbor, visited)

def count_components(n, edges):
    graph = {i: [] for i in range(n)}
    for u, v in edges:
        graph[u].append(v)
        graph[v].append(u)

    visited = set()
    count = 0
    for node in range(n):
        if node not in visited:
            dfs(graph, node, visited)
            count += 1
    return count
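The grid triggers treat cells as implicit graph nodes with 4-directional edges. A "number of islands" sketch, assuming '1' marks land and mutating the input grid is acceptable (it doubles as the visited set):

```python
def num_islands(grid):
    """Each unvisited land cell starts a DFS that sinks its whole island."""
    rows, cols = len(grid), len(grid[0])

    def sink(r, c):
        if 0 <= r < rows and 0 <= c < cols and grid[r][c] == '1':
            grid[r][c] = '0'              # mark visited in place
            sink(r + 1, c)
            sink(r - 1, c)
            sink(r, c + 1)
            sink(r, c - 1)

    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == '1':
                sink(r, c)                # one DFS consumes one island
                count += 1
    return count
```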

Canonical Problems

  • Number of Islands (LC 200)

  • Rotting Oranges (LC 994)

  • Clone Graph (LC 133)


Pattern 09 — Binary Search (Modified)

What It Is

Binary search extended far beyond sorted arrays. Any time the search space has a monotonic property — meaning one side satisfies a condition and the other doesn't — binary search can find the boundary in O(log n). This includes searching on answer space, not just array positions.

Trigger Signals

  • Sorted or rotated sorted array

  • "Find first/last occurrence of X"

  • "Minimum/maximum value that satisfies a condition" (binary search on answer)

  • "Search in a rotated sorted array"

  • The answer is in a range and the condition is monotonic

The Template

# Find leftmost position satisfying condition
def binary_search_left(arr, target):
    left, right = 0, len(arr) - 1
    result = -1

    while left <= right:
        mid = left + (right - left) // 2    # overflow-safe habit (Python ints can't overflow)

        if arr[mid] == target:
            result = mid
            right = mid - 1                 # keep searching left
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1

    return result

# Binary search on answer space
def min_capacity(weights, days):
    def can_ship(capacity):       # feasibility check — monotonic in capacity
        needed, load = 1, 0
        for w in weights:
            load += w
            if load > capacity:   # current ship full → start a new day
                needed, load = needed + 1, w
        return needed <= days

    left = max(weights)           # minimum possible answer
    right = sum(weights)          # maximum possible answer

    while left < right:
        mid = (left + right) // 2
        if can_ship(mid):
            right = mid           # feasible → try smaller
        else:
            left = mid + 1        # infeasible → need larger

    return left

Canonical Problems

  • Search in Rotated Sorted Array (LC 33)

  • Find First and Last Position of Element in Sorted Array (LC 34)

  • Capacity to Ship Packages Within D Days (LC 1011)


Pattern 10 — Top K Elements (Heap)

What It Is

Use a heap (priority queue) to efficiently track the K largest, K smallest, or K most frequent elements. A min-heap of size K maintains the K largest elements seen so far — when a new element is larger than the heap's minimum, swap it in.

Trigger Signals

  • "Find the K largest/smallest elements"

  • "K most frequent elements"

  • "Median of a data stream"

  • "Kth largest element in an array"

  • Any problem with the words top K, smallest K, frequent K

The Template

import heapq

# K largest elements using min-heap of size K
def k_largest(nums, k):
    heap = []
    for num in nums:
        heapq.heappush(heap, num)
        if len(heap) > k:
            heapq.heappop(heap)    # remove smallest → keep K largest
    return list(heap)

# K most frequent elements
from collections import Counter
def top_k_frequent(nums, k):
    count = Counter(nums)
    return heapq.nlargest(k, count.keys(), key=count.get)

# Median of data stream (two heaps)
class MedianFinder:
    def __init__(self):
        self.small = []   # max-heap (negate for Python)
        self.large = []   # min-heap

    def add_num(self, num):
        heapq.heappush(self.small, -num)
        if self.small and self.large and (-self.small[0] > self.large[0]):
            heapq.heappush(self.large, -heapq.heappop(self.small))
        if len(self.small) > len(self.large) + 1:
            heapq.heappush(self.large, -heapq.heappop(self.small))
        if len(self.large) > len(self.small):
            heapq.heappush(self.small, -heapq.heappop(self.large))

    def find_median(self):
        if len(self.small) > len(self.large):
            return -self.small[0]
        return (-self.small[0] + self.large[0]) / 2

Canonical Problems

  • Kth Largest Element in an Array (LC 215)

  • Top K Frequent Elements (LC 347)

  • Find Median from Data Stream (LC 295)


Pattern 11 — Backtracking

What It Is

Explore all possibilities by building a candidate solution incrementally and abandoning it as soon as a constraint is violated (pruning). The key insight: don't generate all possibilities first — prune invalid branches early to avoid unnecessary work.

Trigger Signals

  • "Generate all permutations/combinations/subsets"

  • "Solve a constraint puzzle" (Sudoku, N-Queens)

  • "Find all paths that satisfy condition X"

  • "Word search in a grid"

  • Problems asking for all possible solutions, not just one

The Template

def backtrack(result, current, start, nums):
    result.append(list(current))       # record current state

    for i in range(start, len(nums)):
        current.append(nums[i])        # make a choice
        backtrack(result, current, i + 1, nums)
        current.pop()                  # undo the choice (backtrack)

def subsets(nums):
    result = []
    backtrack(result, [], 0, nums)
    return result

# Permutations (rebuild the choice set each level)
def permute(nums):
    result = []
    def bt(current, remaining):
        if not remaining:
            result.append(list(current))
            return
        for i, num in enumerate(remaining):
            current.append(num)
            bt(current, remaining[:i] + remaining[i+1:])
            current.pop()
    bt([], nums)
    return result
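For pruning with real teeth, Combination Sum on sorted candidates is a good example: once one candidate overshoots the remaining target, every later candidate does too, so `break` discards the whole rest of the branch (a sketch, names mine):

```python
def combination_sum(candidates, target):
    """All unique combinations (reuse allowed) summing to target."""
    candidates = sorted(candidates)
    result = []

    def bt(start, remaining, current):
        if remaining == 0:
            result.append(list(current))
            return
        for i in range(start, len(candidates)):
            if candidates[i] > remaining:   # prune: all later candidates are larger
                break
            current.append(candidates[i])           # make a choice
            bt(i, remaining - candidates[i], current)  # i, not i+1: reuse allowed
            current.pop()                           # undo the choice

    bt(0, target, [])
    return result
```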

Canonical Problems

  • Subsets (LC 78)

  • Permutations (LC 46)

  • Word Search (LC 79)


Pattern 12 — Dynamic Programming

What It Is

Break a problem into overlapping subproblems, solve each once, and store the result. DP applies when: (1) the problem has optimal substructure (optimal solution uses optimal solutions to subproblems), and (2) there are overlapping subproblems (same subproblem computed repeatedly).

Trigger Signals

  • "Maximum/minimum value to achieve X"

  • "Number of ways to do X"

  • "Is it possible to reach/achieve X?"

  • You find yourself in a recursion tree where the same inputs repeat

  • The words: optimal, fewest, most, count the ways

The Template

# Top-down (memoization)
def dp_memo(n, memo=None):
    if memo is None:                      # avoid Python's shared mutable default pitfall
        memo = {}
    if n in memo:
        return memo[n]
    if n <= 1:                            # base case
        return n
    memo[n] = dp_memo(n-1, memo) + dp_memo(n-2, memo)
    return memo[n]

# Bottom-up (tabulation) — usually preferred in interviews
def coin_change(coins, amount):
    dp = [float('inf')] * (amount + 1)
    dp[0] = 0                             # base case: 0 coins for amount 0

    for i in range(1, amount + 1):
        for coin in coins:
            if coin <= i:
                dp[i] = min(dp[i], dp[i - coin] + 1)

    return dp[amount] if dp[amount] != float('inf') else -1

The DP decision questions:

  1. What are the subproblems? (define dp[i] precisely)

  2. What is the recurrence? (how does dp[i] depend on smaller subproblems?)

  3. What are the base cases?

  4. What order to fill the table?
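Applied to the classic House Robber problem, the four questions might look like this (a sketch; the dp indexing here is one choice among several):

```python
def rob(houses):
    """House Robber via the four questions:
    1. Subproblem: dp[i] = max loot from the first i houses.
    2. Recurrence: dp[i] = max(dp[i-1],                  # skip house i
                               dp[i-2] + houses[i-1])    # rob house i
    3. Base cases: dp[0] = 0, dp[1] = houses[0].
    4. Order: fill left to right.
    """
    dp = [0] * (len(houses) + 1)
    if houses:
        dp[1] = houses[0]
    for i in range(2, len(houses) + 1):
        dp[i] = max(dp[i - 1], dp[i - 2] + houses[i - 1])
    return dp[len(houses)]
```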

Canonical Problems

  • Climbing Stairs (LC 70)

  • Coin Change (LC 322)

  • Longest Increasing Subsequence (LC 300)


Pattern 13 — Monotonic Stack

What It Is

A stack that maintains elements in monotonically increasing or decreasing order. When a new element violates the ordering, pop elements off and process them. This gives O(n) solutions to problems that naively require O(n²) comparisons.

Trigger Signals

  • "Next greater element to the right/left"

  • "Next smaller element"

  • "Largest rectangle in histogram"

  • "Daily temperatures" — how many days until warmer

  • "Trapping rain water"

  • Any problem about spans or boundaries between elements

The Template

# Next greater element (monotonic decreasing stack)
def next_greater(nums):
    result = [-1] * len(nums)
    stack = []                            # stores indices

    for i, num in enumerate(nums):
        while stack and nums[stack[-1]] < num:  # current breaks monotonicity
            idx = stack.pop()
            result[idx] = num             # current is the "next greater" for idx
        stack.append(i)

    return result

# Daily temperatures
def daily_temperatures(temps):
    result = [0] * len(temps)
    stack = []

    for i, temp in enumerate(temps):
        while stack and temps[stack[-1]] < temp:
            idx = stack.pop()
            result[idx] = i - idx         # days until warmer
        stack.append(i)

    return result

Canonical Problems

  • Daily Temperatures (LC 739)

  • Next Greater Element I (LC 496)

  • Largest Rectangle in Histogram (LC 84)


Pattern 14 — Topological Sort

What It Is

Order nodes in a directed acyclic graph (DAG) such that for every edge A → B, A comes before B in the ordering. Uses BFS with in-degree tracking (Kahn's algorithm) or DFS with a post-order stack.

Trigger Signals

  • "Course prerequisites" — take A before B

  • "Build order" — compile X before Y

  • "Tasks with dependencies"

  • "Detect cycle in a directed graph"

  • Any problem about ordering with dependencies

The Template

from collections import deque

def topological_sort(n, prerequisites):
    graph = {i: [] for i in range(n)}
    in_degree = [0] * n

    for course, prereq in prerequisites:
        graph[prereq].append(course)
        in_degree[course] += 1

    # Start with all nodes that have no prerequisites
    queue = deque([i for i in range(n) if in_degree[i] == 0])
    order = []

    while queue:
        node = queue.popleft()
        order.append(node)

        for neighbor in graph[node]:
            in_degree[neighbor] -= 1
            if in_degree[neighbor] == 0:    # all prerequisites done
                queue.append(neighbor)

    return order if len(order) == n else []  # empty = cycle exists
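The DFS alternative (post-order stack) might be sketched like this, with white/gray/black coloring for cycle detection (names mine):

```python
def topo_sort_dfs(n, prerequisites):
    """DFS topological sort: append each node after all its dependents
    (post-order), then reverse. A gray neighbor means a back edge → cycle."""
    graph = {i: [] for i in range(n)}
    for course, prereq in prerequisites:
        graph[prereq].append(course)

    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / on current path / done
    color = [WHITE] * n
    order = []

    def dfs(node):
        color[node] = GRAY
        for neighbor in graph[node]:
            if color[neighbor] == GRAY:   # back edge → cycle
                return False
            if color[neighbor] == WHITE and not dfs(neighbor):
                return False
        color[node] = BLACK
        order.append(node)                # post-order: after all dependents
        return True

    for node in range(n):
        if color[node] == WHITE and not dfs(node):
            return []                     # empty = cycle exists
    return order[::-1]
```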

Canonical Problems

  • Course Schedule (LC 207)

  • Course Schedule II (LC 210)

  • Alien Dictionary (LC 269)


The Master Decision Framework

The 14 patterns are only useful if you can pick the right one quickly. Here is the decision framework — a series of questions to work through in the first 60 seconds of any new problem.

READ THE PROBLEM
      │
      ▼
Is the input a LINKED LIST?
  ├─ Yes + "cycle/middle/palindrome" → Fast & Slow Pointers (03)
  └─ No
      │
      ▼
Is the input a TREE?
  ├─ Yes + "level/shortest/minimum depth" → Tree BFS (06)
  ├─ Yes + "path/depth/LCA/all paths"     → Tree DFS (07)
  └─ No
      │
      ▼
Is the input a GRAPH or GRID?
  ├─ Yes + "shortest path (unweighted)"  → BFS (08)
  ├─ Yes + "connected components/paths"  → DFS (08)
  ├─ Yes + "ordering with dependencies"  → Topological Sort (14)
  └─ No
      │
      ▼
Is the input SORTED (or can be sorted)?
  ├─ Yes + "pair/triplet summing to X"   → Two Pointers (02)
  ├─ Yes + "search for element/boundary" → Binary Search (09)
  ├─ Yes + "merge overlapping intervals" → Merge Intervals (04)
  └─ No
      │
      ▼
Is the input an ARRAY/STRING with a SUBARRAY/SUBSTRING goal?
  ├─ "Longest/shortest/max contiguous"   → Sliding Window (01)
  └─ No
      │
      ▼
Does the problem involve OPTIMIZATION or COUNTING?
  ├─ Yes + "overlapping subproblems"     → Dynamic Programming (12)
  ├─ Yes + "top/smallest K elements"     → Top K / Heap (10)
  └─ No
      │
      ▼
Does the problem ask for ALL POSSIBILITIES?
  ├─ Yes → Backtracking (11)
  └─ No
      │
      ▼
Is the input numbers in range [1, n]?
  ├─ Yes + "missing/duplicate"           → Cyclic Sort (05)
  └─ No
      │
      ▼
Does the problem involve NEXT GREATER/SMALLER elements?
  └─ Yes → Monotonic Stack (13)

The Pattern Cheat Sheet

#    Pattern                 Complexity      Key Signal
01   Sliding Window          O(n)            Contiguous subarray/substring
02   Two Pointers            O(n)            Sorted array, pairs/triplets
03   Fast & Slow Pointers    O(n)            Linked list + cycle/middle
04   Merge Intervals         O(n log n)      Overlapping [start, end] pairs
05   Cyclic Sort             O(n)            Numbers in range [1, n]
06   Tree BFS                O(n)            Level order, min depth
07   Tree DFS                O(n)            Path sum, all paths, LCA
08   Graph BFS/DFS           O(V + E)        Islands, components, shortest path
09   Binary Search           O(log n)        Sorted input or monotonic condition
10   Top K Elements          O(n log k)      K largest/smallest/frequent
11   Backtracking            O(2ⁿ) to O(n!)  All permutations/combinations
12   Dynamic Programming     Varies          Optimize/count with overlapping subproblems
13   Monotonic Stack         O(n)            Next greater/smaller element
14   Topological Sort        O(V + E)        Dependencies, ordering, DAG


How to Practice: The Right Method

Knowing the patterns is the easy part. Building the habit of recognizing them automatically under pressure is the actual work.

The right practice loop:

  1. Open a new LeetCode problem you've never seen

  2. Read it once. Don't open a solution.

  3. Ask the decision framework questions out loud (yes, out loud)

  4. Write down which pattern you think applies and why

  5. Attempt a solution using that pattern's template

  6. Check the solution — did you pick the right pattern?

  7. If wrong: understand why the correct pattern applies, not just what it is

The wrong practice loop:

  1. Read a problem

  2. Look at the solution immediately when stuck

  3. Think "oh, I get it" and move to the next problem

  4. Repeat for 300 problems

  5. Freeze in interviews because recognition was never practiced

After covering all patterns, practice with mixed problems where you must identify the appropriate pattern yourself. Solve under timed conditions (45 minutes per problem) and conduct mock interviews with peers. The goal is to build pattern recognition speed and interview stamina.

The speed threshold to aim for: 30 seconds to pattern identification. If you can't name the pattern within 30 seconds of reading a problem you've seen before, your recognition isn't automatic yet. Keep practicing that specific pattern.


Completing Chapter 1: What You Now Have

With this post, Chapter 1 — Foundation — is complete. Here is what you've built:

Post 01 — The mindset: why algorithm interviews test reasoning under uncertainty, not code memorization, and how 2026 interviews have shifted toward judgment and AI collaboration.

Post 02 — The vocabulary: Big-O notation from O(1) to O(n!), the rules for complexity analysis, space complexity, amortized analysis, and the cheat sheet for interview day.

Post 03 (this post) — The framework: 14 patterns with templates, canonical problems, and a decision tree for identifying the right pattern within 30 seconds of reading any new problem.

These three components — mindset, vocabulary, framework — are the foundation that everything in Chapter 2 (Data Structures) and beyond builds on. Every data structure deep-dive, every algorithm explanation, and every advanced technique becomes significantly easier once these three things are in place.


🧭 What's Next: Chapter 2 — Data Structures

Chapter 1 gives you the how to think. Chapter 2 gives you the tools to think with:

  • Post 04: Arrays & Strings — the most common interview topic, with every technique that converts O(n²) problems to O(n)

  • Post 05: Hash Maps & Hash Sets — the single data structure that improves interview performance more than any other

  • Post 06–09: Linked Lists, Stacks & Queues, Trees, Graphs — deep dives into each with every interview pattern mapped


💡 Final Practice Challenge

Before moving to Chapter 2, take this pattern recognition test. For each problem description below, name the pattern and explain your reasoning — before reading the answer beneath it.

Problem A: Given an array of integers, find the length of the longest subarray with sum equal to k.

Answer: Sliding Window (or Prefix Sum + Hash Map if negative numbers are allowed)

Problem B: Given a list of meeting time intervals, find the minimum number of conference rooms required.

Answer: Merge Intervals (or a Min-Heap sorted by end time)

Problem C: Given a directed graph of course prerequisites, determine if you can finish all courses.

Answer: Topological Sort (detect a cycle in a directed graph)

Problem D: Given a linked list, return true if it has a cycle.

Answer: Fast & Slow Pointers

Problem E: Given an array, find the kth largest element.

Answer: Top K Elements (a Min-Heap of size k)

If you got all five in under 2 minutes: you're ready for Chapter 2. If you struggled with any: go back and practice 3–5 problems for that specific pattern before moving on.

The foundation is built. Now we build the tools.
