How Infinity Hides in Your Algorithms

Introduction

“You cannot put 10 socks in 9 drawers without two sharing a drawer. Obvious, right? But this trivial fact lets you prove infinite truths, detect collisions in cryptography, and defeat compression algorithms.”

The Pigeonhole Principle (PHP), “if you have more pigeons than holes, at least one hole must contain at least two pigeons,” feels like a kindergarten observation. Yet the sections below use exactly that idea to prove approximation theorems, force hash collisions, and rule out universal compression.

1. The Baby Pigeonhole Principle

1.1 Pigeonhole Principle (finite version)

Formal Statement. Suppose n objects (pigeons) are distributed among m containers (holes). If n > m, then at least one container must contain at least two objects.

Alternatively: if each container has at most one object, you can place at most m objects in total. Once you place an (m + 1)-th object, you force some container to hold two.


1.2 Birthday Paradox (Basic counting)

Setup: Consider 23 people in a room. There are 365 possible birthdays.

Application of PHP: Since 23 < 365, the basic PHP does not force a collision. But if we had 366 people, two would have to share a birthday.

Remark. The “birthday paradox” probability calculation shows that collisions become likely with far fewer people (around 23); but PHP gives a deterministic guarantee only once there are more than 365 participants.
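
The probabilistic side of this remark is easy to check numerically. Below is a minimal Python sketch (uniform 365-day model, ignoring leap years; the function name is just illustrative) that computes the exact collision probability:

```python
# Exact probability that at least two of n people share a birthday
# (uniform 365-day model, ignoring leap years).
def birthday_collision_prob(n: int, days: int = 365) -> float:
    p_all_distinct = 1.0
    for i in range(n):
        p_all_distinct *= (days - i) / days
    return 1.0 - p_all_distinct

print(round(birthday_collision_prob(23), 3))   # ~0.507: likely long before 366 people
print(birthday_collision_prob(366))            # 1.0: the pigeonhole guarantee kicks in
```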

1.3 Matching problems

If you place 5 red balls and 5 blue balls into 9 boxes, then whatever the arrangement, at least one box contains at least 2 balls.

More generally, distributing n items into k categories forces at least $$\left\lceil \frac{n}{k} \right\rceil$$ items into some category.
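
As a quick sanity check of this generalized bound, the following Python sketch throws n items into k boxes at random and confirms that the fullest box always holds at least ⌈n/k⌉ items (the particular values n = 10, k = 3 are arbitrary):

```python
import math
import random
from collections import Counter

# However n items are thrown into k boxes (here: uniformly at random),
# the fullest box always holds at least ceil(n/k) items.
n, k = 10, 3
for _ in range(5):
    counts = Counter(random.randrange(k) for _ in range(n))
    bound = math.ceil(n / k)
    assert max(counts.values()) >= bound
    print(dict(counts), "max =", max(counts.values()), ">=", bound)
```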

1.4 Why this simple idea matters

  • Universality. Any time you try to squeeze more objects into fewer “slots,” you get overlap.
  • Certainty. Unlike probabilistic arguments, PHP yields a strict, deterministic guarantee.
  • Foundation. Many elementary proofs (e.g., in combinatorics, graph theory, and number theory) rest on pigeonholes.
2. Going Deeper: Infinite Applications

    While the finite PHP is easy, there are “infinite” analogues that let you extract surprising conclusions from seemingly weak assumptions.

    2.1 Infinite Pigeonhole Principle

Infinite Pigeonhole Principle (weak form). If infinitely many objects (pigeons) are distributed among finitely many categories (holes), then at least one category must contain infinitely many objects.

Why. Suppose you have infinitely many items x1, x2, x3, ... and only m categories. If each category held only finitely many items, say at most N_i in category i, then the total number of items would be at most N_1 + N_2 + ... + N_m < ∞. That contradicts “infinitely many.” Hence some category is infinite.

    Diagram 1: Imagine we assign each natural number n ∈ N to one of 3 bins labeled A, B, C. If infinitely many numbers exist, one bin must collect infinitely many:

Bins: [ A ],     [ B ],     [ C ]

    Numbers: 1→B     2→A     3→C     4→B     5→C     6→B     7→A     8→C    9→B ...

    Since there are infinitely many natural numbers, at least one bin, say B, receives infinitely many.

    2.2 Dirichlet’s Approximation Theorem

    Dirichlet’s Theorem.

For any real α and any integer N ≥ 1, there exist integers p and q with 1 ≤ q ≤ N such that $$\bigl|\alpha - \tfrac{p}{q}\bigr| < \tfrac1{qN}. $$

(Interactive demo: drop n points from [0, 1) into k bins; the Pigeonhole Principle guarantees that at least one bin receives ⌈n/k⌉ or more of them.)

    Sketch using pigeonholes.

1. Consider the N + 1 real numbers { 0·α }, { α }, { 2α }, . . . , { Nα }, where { x } denotes the fractional part of x. Each of them lies in [0, 1).

    2. Partition the interval [0, 1) into N smaller half-open subintervals (“holes”): $$ [0,\tfrac{1}{N}), \;\; [\tfrac{1}{N},\tfrac{2}{N}), \;\dots,\; [\tfrac{N-1}{N},1). $$

3. By the (finite) PHP, two of these N + 1 fractional parts, say { iα } and { jα } with 0 ≤ i < j ≤ N, must land in the same subinterval of length 1/N. Therefore, $$ \bigl| \{\,j\alpha\} - \{\,i\alpha\} \bigr| < \frac{1}{N}. $$

4. But { jα } − { iα } = ( j − i )α − (⌊ jα ⌋ − ⌊ iα ⌋), so with q = j − i (1 ≤ q ≤ N) and p = ⌊ jα ⌋ − ⌊ iα ⌋ we obtain $$ \left|\, q\,\alpha - p \right| < \frac{1}{N} \quad\Longrightarrow\quad \left|\, \alpha - \frac{p}{q} \right| < \frac{1}{q\,N}. $$ Thus infinite conclusions (arbitrarily good rational approximations of irrationals) follow from partitioning a finite interval.
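
The proof sketch above translates almost line-for-line into code. The following Python sketch (the function name and the choice α = π, N = 500 are illustrative) pigeonholes the fractional parts {0·α}, ..., {Nα} into N bins and reads off p and q from the first shared bin; for π it should recover the famous approximation 355/113:

```python
import math

def dirichlet_approx(alpha: float, N: int):
    """Find integers p, q with 1 <= q <= N and |alpha - p/q| < 1/(q*N),
    by pigeonholing {0*alpha}, ..., {N*alpha} into N subintervals of [0, 1)."""
    bins = {}                           # bin index -> first k whose {k*alpha} landed there
    for k in range(N + 1):
        frac = (k * alpha) % 1.0
        b = min(int(frac * N), N - 1)   # guard against floating-point edge cases
        if b in bins:                   # collision: two fractional parts share a bin
            i, j = bins[b], k
            q = j - i
            p = math.floor(j * alpha) - math.floor(i * alpha)
            return p, q
        bins[b] = k
    raise RuntimeError("unreachable: N + 1 pigeons cannot avoid sharing one of N holes")

p, q = dirichlet_approx(math.pi, 500)
print(p, q, abs(math.pi - p / q), 1 / (q * 500))   # expect p/q = 355/113 here
```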

    2.3 Irrationality of √2

    A classic proof of √2 being irrational is not pigeonhole-style, but one can recast it via Dirichlet’s theorem:

1. Suppose √2 = a/b exactly, for positive integers a and b. Then for any rational p/q with p/q ≠ a/b, $$ \left| \sqrt{2} - \frac{p}{q} \right| \;=\; \frac{|aq - bp|}{bq} \;\ge\; \frac{1}{bq}, $$ because the numerator |aq − bp| is a nonzero integer: a rational number keeps every other rational at distance at least 1/(bq).
2. But √2 admits infinitely many distinct rational approximations p/q with |√2 − p/q| < 1/q² (Dirichlet’s theorem, pushed to larger and larger N, predicts such approximations; concretely, the fractions with p² − 2q² = ±1 supply an endless list of examples). For q > b this is impossible, since then 1/q² < 1/(bq) ≤ |√2 − p/q|.

    This view shows that no rational can equal √2, since rationals do not achieve these “too-good” approximations infinitely often.

    3. Pigeonholes in Algorithms and Computer Science

    Modern computer science often exploits pigeonhole reasoning to show collisions, impossibility of lossless compression, and security bounds. Below are key applications.

    3.1 Hashing and Collisions

Setup. A hash function $h : \{0, 1\}^n \to \{0, 1\}^m$ maps n-bit inputs to m-bit outputs, where m < n.

Pigeonhole argument. There are 2^n possible inputs (“pigeons”) but only 2^m hash values (“holes”). If 2^n > 2^m, then by PHP some hash value must be attained by at least two distinct inputs, i.e., a collision.

    Implication. No hash function can be injective once n > m. In practice, cryptographic hashes are designed to make collisions hard to find even if they must exist.

Diagram: Inputs vs. Hash Buckets

Inputs (2^n pigeons): x_1, x_2, x_3, ..., x_{2^n}

Hash buckets (2^m holes): [ 0 ]  [ 1 ]  ...  [ 2^m − 1 ]

    Since 2^n ≫ 2^m, many inputs are forced to map into the same bucket.
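
To make the collision guarantee concrete, here is a Python sketch that truncates SHA-256 to m = 16 bits (a toy setting, chosen only so 2^m is small) and feeds in distinct inputs until two of them land in the same bucket; by the pigeonhole argument this must happen within 2^16 + 1 inputs, and in practice it happens far sooner:

```python
import hashlib
from itertools import count

def h(x: bytes, m_bits: int = 16) -> int:
    """Toy hash: SHA-256 truncated to m_bits, so only 2**m_bits outputs exist."""
    digest = hashlib.sha256(x).digest()
    return int.from_bytes(digest, "big") >> (256 - m_bits)

seen = {}                       # hash value -> first input that produced it
for i in count():
    x = str(i).encode()         # distinct inputs: "0", "1", "2", ...
    v = h(x)
    if v in seen:               # pigeonhole guarantees this branch is reached
        print(f"collision after {i + 1} inputs: h({seen[v]!r}) == h({x!r}) == {v}")
        break
    seen[v] = x
```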

    3.2 Compression Paradox

    Claim. No lossless compression algorithm can compress every file to a shorter representation.

    Intuition. Suppose you have an algorithm C that maps any bit-string of length L into a shorter string of length < L.

Counting argument. There are 2^L distinct input strings of length L, but the algorithm can only output strings of length at most L − 1, so there are at most $$ 2^0 + 2^1 + 2^2 + \dots + 2^{L-1} \;=\; 2^{L}-1 $$ possible outputs.

    Conclusion (pigeonhole). By PHP, at least two distinct inputs of length L must map to the same output. The compression fails to be lossless.

    Corollary. Any “universal” compressor must expand some files; no magic algorithm compresses every file. In practice, compressors like ZIP exploit patterns in real-world data; but in the worst case (e.g., random data), they cannot shrink everything.
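
A quick experiment illustrates the corollary, under the assumption that zlib’s DEFLATE is a representative lossless compressor: random bytes (no exploitable pattern) come out slightly longer because of fixed overhead, while highly repetitive data shrinks dramatically.

```python
import os
import zlib

# The counting argument in practice: incompressible (random) data does not shrink.
data = os.urandom(10_000)          # "typical" string: no exploitable pattern
compressed = zlib.compress(data, 9)
print(len(data), "->", len(compressed))   # expect the output to be slightly LONGER

text = b"abc" * 3_000              # highly patterned data, by contrast, shrinks a lot
print(len(text), "->", len(zlib.compress(text, 9)))
```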

    3.3 Birthday Attacks in Cryptography

    Let H be a hash function producing m-bit outputs. If an adversary picks random inputs and computes hashes, then:

    Birthday Paradox.

After about 2^{m/2} random inputs, the probability of finding a collision among the hash values becomes significant (≈ 50%).

Pigeonhole argument (simplified). There are 2^m hash values (“holes”). When sampling N random inputs (“pigeons”), a collision becomes likely once N ≈ √(2^m) = 2^{m/2}. This is not strictly PHP, but the intuition is the same: place about √(2^m) pigeons into 2^m holes and collisions become likely.

Implication. To resist birthday attacks, choose m large (e.g., m = 256 for SHA-256). Then about 2^128 hash evaluations are required to find a collision by a birthday attack, which is currently computationally infeasible.
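
The 2^{m/2} scaling is easy to observe on a deliberately weakened hash. The Python sketch below (parameters chosen only for speed) truncates SHA-256 to m = 32 bits and counts how many random inputs are needed before a value repeats; the counts should land on the order of 2^{16} ≈ 65,536.

```python
import hashlib
import os
import statistics

def truncated_hash(x: bytes, m_bits: int) -> int:
    # Truncate SHA-256 to m_bits so collisions are cheap enough to observe.
    return int.from_bytes(hashlib.sha256(x).digest(), "big") >> (256 - m_bits)

def trials_until_collision(m_bits: int) -> int:
    seen = set()
    trials = 0
    while True:
        trials += 1
        v = truncated_hash(os.urandom(16), m_bits)
        if v in seen:
            return trials
        seen.add(v)

m = 32   # 2**32 "holes"; the birthday bound predicts roughly 2**16 samples
runs = [trials_until_collision(m) for _ in range(5)]
print("observed:", runs, "median:", statistics.median(runs), "vs 2**(m/2) =", 2 ** (m // 2))
```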

    3.4 Load Balancing and Cache Design

    Suppose you have k servers (or cache slots) and want to assign n tasks (or memory blocks) dynamically:

If n > k and all tasks are “hot,” then by simple pigeonhole at least one server must handle at least ⌈ n/k ⌉ tasks simultaneously, creating a bottleneck.

Cache lines. Consider a CPU cache with B cache lines and N distinct memory blocks. If a program accesses N > B blocks while reusing them in some pattern, eventually one block must evict another from a shared cache line (by PHP). This underlies cache-miss lower bounds in algorithms.

    Balanced hashing (Cuckoo, Consistent Hashing). These techniques try to distribute “pigeons” (requests) evenly among “holes” (servers), but ultimately if you have more requests than servers, some server must bear extra load.
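
The ⌈n/k⌉ bottleneck is unavoidable even for an idealized balancer. In the Python sketch below (a simplified model, not any particular scheduler), tasks are always sent to the currently least-loaded server, and the maximum load still matches the pigeonhole bound exactly:

```python
import math

# Even a perfectly informed balancer cannot beat the pigeonhole bound:
# with n tasks on k servers, some server ends up with at least ceil(n/k) tasks.
def greedy_least_loaded(n_tasks: int, k_servers: int) -> int:
    load = [0] * k_servers
    for _ in range(n_tasks):
        load[load.index(min(load))] += 1   # always pick the least-loaded server
    return max(load)

for n, k in [(10, 3), (100, 7), (1000, 16)]:
    best = greedy_least_loaded(n, k)
    print(f"n={n}, k={k}: max load {best} (pigeonhole bound {math.ceil(n / k)})")
    assert best == math.ceil(n / k)
```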

    4. Surprising Proofs

    Beyond hashing and algorithms, pigeonhole ideas often hide inside elegant combinatorial proofs.

    4.1 Erdős–Szekeres: Monotonic Subsequence

    Theorem (Erdős–Szekeres).

Any sequence of mn + 1 distinct real numbers contains either an increasing subsequence of length m + 1 or a decreasing subsequence of length n + 1.

(Interactive demo: a random sequence of distinct numbers is generated and its longest increasing subsequence is highlighted.)

Proof Sketch (pigeonhole argument).

1. Let the sequence be a_1, a_2, ..., a_{mn+1}.

    2. For each position i, define two numbers:
    L(i) = length of longest increasing subsequence starting at a_i.
    D(i) = length of longest decreasing subsequence starting at a_i.

    3. We know 1 ≤ L(i) ≤ m + 1 and 1 ≤ D(i) ≤ n + 1. Consider the set of ordered pairs {(L(i), D(i)) : 1 ≤ i ≤ mn + 1}.

    4. There are only m × n possible pairs (u, v) with 1 ≤ u ≤ m and 1 ≤ v ≤ n. Actually, if either L(i) = m+1 or D(i) = n+1 for some i, we are done (we found our long monotonic subsequence). Otherwise, each i corresponds to a pair in {1, . . . , m} × {1, . . . , n}.

Since we have mn + 1 indices but only mn possible (L, D) pairs with coordinates in {1, ..., m} × {1, ..., n}, by PHP at least two indices i < j must share the same pair: (L(i), D(i)) = (L(j), D(j)).

One then compares a_i and a_j: if a_i < a_j, prepending a_i to a longest increasing subsequence starting at a_j gives L(i) ≥ L(j) + 1; if a_i > a_j, the same trick gives D(i) ≥ D(j) + 1. Either way the two pairs cannot be equal, a contradiction. Hence the desired monotonic subsequence must exist.

    Grid of (L, D) values

    Once two indices map to the same grid cell (u, v) and neither coordinate is at the max, one of the monotonic subsequences can be extended—leading to a contradiction if you assumed no long subsequences existed.
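
The (L, D) labeling is also directly computable. The Python sketch below (an O(n²) dynamic program; the parameters m = k = 4 are arbitrary, with k playing the role of n in the theorem) computes, for a random sequence of mk + 1 distinct numbers, the longest increasing and decreasing subsequence lengths starting at each position, and checks the Erdős–Szekeres conclusion:

```python
import random

def monotone_labels(seq):
    """For each index i, compute L(i)/D(i): the length of the longest increasing /
    decreasing subsequence STARTING at position i (O(n^2) dynamic programming)."""
    n = len(seq)
    L = [1] * n
    D = [1] * n
    for i in range(n - 1, -1, -1):
        for j in range(i + 1, n):
            if seq[j] > seq[i]:
                L[i] = max(L[i], 1 + L[j])
            else:                      # entries are distinct, so seq[j] < seq[i]
                D[i] = max(D[i], 1 + D[j])
    return L, D

m, k = 4, 4                            # any sequence of m*k + 1 distinct numbers...
seq = random.sample(range(1000), m * k + 1)
L, D = monotone_labels(seq)
assert max(L) >= m + 1 or max(D) >= k + 1   # ...has a monotone run of m+1 or k+1
print(seq)
print("pairs:", list(zip(L, D)), "-> max L:", max(L), "max D:", max(D))
```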

4.2 Ramsey Theory: Triangles in K_6

    Fact. Any 2-coloring of the edges of the complete graph K6 on 6 vertices must contain a monochromatic triangle (either all-red or all-blue).

    Pigeonhole proof sketch

    1. Label vertices V = {v1, v2, v3, v4, v5, v6}. Consider vertex v1. It has 5 incident edges, each colored red (R) or blue (B).

    2. By PHP, at least three of those edges must share the same color, say red. Without loss of generality, edges {v1-v2, v1-v3, v1-v4} are all red.

    3. Now look at the triangle formed by vertices v2, v3, v4. If any of those edges is red, say {v2-v3}, then {v1, v2, v3} forms a red triangle. If none are red, all three edges {v2-v3, v3-v4, v2-v4} must be blue, forming a blue triangle among {v2, v3, v4}.

    4. In either case, we find a monochromatic triangle.

    From v1, three red edges go to v2, v3, v4. Among {v2, v3, v4}, either a red edge exists (closing a red triangle) or all are blue (forming a blue triangle).
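
Because K_6 has only 15 edges, the claim can also be verified by brute force. The Python sketch below enumerates all 2^15 = 32,768 two-colorings and confirms that every one of them contains a monochromatic triangle, complementing the pigeonhole proof above:

```python
from itertools import combinations

vertices = range(6)
edges = list(combinations(vertices, 2))          # K6 has 15 edges
triangles = list(combinations(vertices, 3))      # and 20 triangles

def has_mono_triangle(coloring):
    # coloring: dict mapping each edge to 0 (red) or 1 (blue)
    for a, b, c in triangles:
        colors = {coloring[(a, b)], coloring[(a, c)], coloring[(b, c)]}
        if len(colors) == 1:
            return True
    return False

# Exhaustively check all 2^15 = 32768 two-colorings of K6.
assert all(
    has_mono_triangle({e: (mask >> i) & 1 for i, e in enumerate(edges)})
    for mask in range(1 << len(edges))
)
print("Every 2-coloring of K6 contains a monochromatic triangle.")
```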

    4.3 Monochromatic Subsets (Ramsey-Flavored)

(Interactive game: color the edges of a complete graph on 6 vertices blue or red and try to avoid a monochromatic triangle; by the argument above, you cannot.)

    More generally, if you color the edges of a sufficiently large complete graph K_n in c colors, pigeonhole arguments often help find monochromatic cliques of a given size. While standard proofs use induction, the base cases rely on simple pigeonhole partitions of edges.

    5. The Art of Finding the Right Pigeonholes

    So far, we’ve applied pigeonhole in rather straightforward ways—grouping objects by obvious categories. In more advanced problems, the creative insight is in designing the “holes” so that the PHP yields a nontrivial conclusion.

    5.1 The “Duality” of Pigeonhole: Existence vs. Impossibility

  • Existence proofs. Show that a certain structure must exist. Example: Erdős–Szekeres guarantees a long monotonic subsequence.
  • Impossibility proofs. Show that a certain algorithm or mapping cannot exist. Example: compression paradox, hashing injectivity.
  • In both, you define a mapping from objects to categories, and then count objects vs. categories.

    5.2 Choosing Pigeonholes: Several Tips

(Interactive demo: 30 pigeons are spread along a line; drag the dividers to choose the pigeonholes and watch how the per-hole counts shift.)
    1. Map to a Finite Set. If you suspect an infinite conclusion (e.g., find infinitely many with some property), map infinitely many items to a finite set (e.g., basic types, intervals, residues modulo m).
    2. Colors, Types, Intervals
      • Colors. In graph theory or combinatorics, “color” edges or vertices into a small number of classes.
      • Residue classes. In number theory, assign integers to {0, 1, ..., d−1} by taking remainders mod d (a small code sketch follows this list).
      • Intervals. When dealing with real numbers, subdivide [0, 1] into intervals of length ε.
    3. Look for Quantitative Bounds. If a problem asks to prove “some subset of size k exists,” you often partition into fewer than k categories—forcing a large class by PHP.
    4. Symmetry Helps. When elements are exchangeable (e.g., vertices of a complete graph), you can assume one vertex is special and count its incident edges or neighbors.
    5. Antichains and Chains. In partially ordered sets (posets), map each element to the length of the longest chain (or antichain) through it. Comparing lengths forces either a long chain or long antichain. This is the classic Dilworth/Erdős–Szekeres approach.
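
To illustrate tip 2 with residue classes: among any d + 1 integers, two must share a remainder mod d, so d divides their difference. The Python sketch below (the helper name is hypothetical) finds such a pair by dropping numbers into the d residue “holes”:

```python
import random

# The "holes" are the d residue classes mod d. Among any d + 1 integers,
# two share a remainder, so d divides their difference.
def pair_with_difference_divisible_by(nums, d):
    seen = {}                       # remainder -> first number with that remainder
    for x in nums:
        r = x % d
        if r in seen:
            return seen[r], x       # same hole: difference is divisible by d
        seen[r] = x
    return None                     # only possible if len(nums) <= d

d = 7
nums = random.sample(range(10_000), d + 1)
a, b = pair_with_difference_divisible_by(nums, d)
print(nums, "->", a, b, "difference", b - a, "divisible by", d, (b - a) % d == 0)
```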

    5.3 Example: Covering Lattice Points

Problem. In an n × n grid of lattice points {1, 2, . . . , n}^2, color each point red or blue. Show there exists either a red “increasing diagonal” of length k or a blue “decreasing diagonal” of length k, provided n is large relative to k.

    Pigeonhole Sketch.

    1. Label each point (i, j) with its “type” (i+j) mod (k−1). This puts all n^2 points into k −1 residue classes. If n^2 > (k-1)· T (where T is the max number of points avoiding monochromatic diagonals of length k), then some residue class contains “too many” same-colored points, forcing a diagonal.

    2. The choice of residue (i+j) mod (k −1) aligns “diagonal direction” with categories. Each category represents a parallel family of diagonals. By forcing too many points in one category, you force a long diagonal of the same color.

    This shows how a residue-mod trick (designing pigeonholes) delivers a combinatorial result about monochromatic patterns.

6. Closing

From “you can’t put 10 socks in 9 drawers” to “no universal compressor can shrink every file,” the Pigeonhole Principle reveals astonishing power. By carefully choosing the “holes” (intervals, residue classes, colors, or substructures), one transforms a finite counting argument into far-reaching conclusions.

  • Infinite conclusions: Dirichlet’s approximation, the infinite pigeonhole principle, and the existence of infinite subsets.
  • Cryptographic bounds: collisions in hashing, birthday attacks on hash functions such as SHA-256, and the choice of security parameters.
  • Combinatorial certainties: monotonic subsequences, Ramsey-style triangles, and antichains in posets.
  • Impossibility theorems: the compression paradox, non-injective hashing, and cache lower bounds.

Every time you feel stuck and a problem hints at “too many items to fit in too few categories,” reach for the Pigeonhole Principle.

Call to Action. Next time you see “more objects than categories,” pause. Can you define the categories (pigeonholes) so that PHP gives you exactly the statement you need? Look around: every limitation hides a powerful truth waiting to be unearthed.

    6.1 Further Exploration

    If you want to dive deeper, consider these self-contained exercises:

    1. Dirichlet’s Bound on π. Show that there are infinitely many rational approximations p/q to π satisfying $$\left|\pi - \tfrac{p}{q}\right| < \tfrac{1}{q^2},$$ using a pigeonhole partition of {kπ} into intervals of length 1/N.

    2. Ramsey Number R(4, 4). By extending the “star-center” pigeonhole idea, prove that any 2-coloring of K_18 contains a monochromatic K4. (Hint: Partition edges incident to a vertex into two colors; then apply the R(3, 3) = 6 result on neighbors.)

3. Generalized PHP in Metric Spaces. Given infinitely many points in [0, 1]^2, partition the square into an n × n grid of small squares. Show that one small square contains infinitely many of the points, yielding a cluster point as in Bolzano–Weierstrass.

    Each of these tasks reinforces the key insight: finite partitions force infinite structure.

Remember: the simplest idea, “if there are more pigeons than holes, two share a hole,” is often the key to both elementary and deep mathematical and computational truths. Happy pigeonholing!
