MemoryTiles
For n pairs on a 2n-cell board, there are (2n)! / 2ⁿ
distinguishable layouts, so difficulty scales factorially with
the number of pairs. Optimal play relies on memory rather than
chance, mirroring partial matchings in graph theory.
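A minimal sketch of that layout count, assuming the two tiles in a pair are indistinguishable (the helper name layout_count is illustrative, not from the game's code):

    from math import factorial

    def layout_count(n_pairs: int) -> int:
        # (2n)! orderings of the tiles, divided by 2^n because
        # swapping the two tiles of a pair gives the same board
        return factorial(2 * n_pairs) // 2 ** n_pairs

    print(layout_count(3))  # 6! / 2^3 = 90 distinct 3-pair boards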
SumLeaves
A friendly form of the Subset Sum problem, an NP-complete
puzzle. A board with n cells hides 2ⁿ possible subsets, and the
player must find one summing to the target. Difficulty tuning
uses the number range and the target distribution.
Learn more about Subset Sum problem.
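A brute-force solver sketch under the same framing, trying every subset in increasing size (the cell values and target below are made up for illustration):

    from itertools import combinations

    def find_subset(cells, target):
        # try all 2^n subsets, smallest first; fine for small boards,
        # but the worst case is exponential, as expected for Subset Sum
        for size in range(1, len(cells) + 1):
            for combo in combinations(cells, size):
                if sum(combo) == target:
                    return combo
        return None

    print(find_subset([3, 7, 1, 8, 4], 12))  # (8, 4)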
SequenceForge
Exercises rule induction and algebraic reasoning. Players infer
the hidden function f(n), often linear (an + b), quadratic (an² + bn + c), or modular (n mod k). It’s informal function discovery and pattern fitting.
Learn more about Integer Sequences.
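A small sketch of the difference test a player implicitly runs, assuming the terms start at n = 1 (the classify helper is hypothetical, not part of the game):

    def classify(terms):
        d1 = [b - a for a, b in zip(terms, terms[1:])]   # first differences
        d2 = [b - a for a, b in zip(d1, d1[1:])]         # second differences
        if len(set(d1)) == 1:
            a, b = d1[0], terms[0] - d1[0]
            return f"linear: f(n) = {a}n + {b}"
        if len(set(d2)) == 1:
            return "quadratic: constant second difference"
        return "other: try a modular rule n mod k"

    print(classify([4, 7, 10, 13]))  # linear: f(n) = 3n + 1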
SequenceGarden
Classic memory growth curve: level k has sequence length k, so
clearing the first k levels takes k(k+1)/2 taps in total and the
challenge grows quadratically. Information load increases
linearly in bits with each added symbol. How much can you hold
in your working memory?
Learn more about Working Memory.
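The arithmetic in one place, plus the per-symbol information load assuming an alphabet of m distinct tiles (the alphabet size is an assumption, not stated above):

    from math import log2

    def total_taps(levels: int) -> int:
        # 1 + 2 + ... + levels = levels * (levels + 1) / 2
        return levels * (levels + 1) // 2

    def bits_per_sequence(length: int, alphabet: int) -> float:
        # each symbol adds log2(alphabet) bits, so load grows linearly
        return length * log2(alphabet)

    print(total_taps(10))           # 55 taps to clear levels 1 through 10
    print(bits_per_sequence(7, 4))  # 14.0 bits for a length-7 sequence over 4 tiles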
Constellation
Based on visual working memory. A pattern of illuminated
squares appears briefly on a grid, then disappears, and you must
reconstruct it from memory. It exercises short-term spatial
memory and pattern recall, similar to classic pattern-recall
tasks in cognitive psychology.
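A minimal sketch of one round, assuming a square grid and exact-match scoring (both are assumptions; names like make_pattern are illustrative only):

    import random

    def make_pattern(grid: int, lit: int) -> set:
        # choose `lit` distinct cells to illuminate on a grid x grid board
        cells = [(r, c) for r in range(grid) for c in range(grid)]
        return set(random.sample(cells, lit))

    def recall_score(shown: set, recalled: set) -> float:
        # fraction of shown cells the player reproduced correctly
        return len(shown & recalled) / len(shown)

    shown = make_pattern(4, 5)
    print(recall_score(shown, shown))  # 1.0 for a perfect recall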
PathLines
Grounded in Euclidean geometry. Path length ≈ Σ |p₍ᵢ₊₁₎ – pᵢ|.
Collision tests use distance inequalities. The concept parallels
shortest-path problems with geometric constraints.
Learn more about Shortest Path problem.
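A geometry sketch of both checks using Euclidean distance (the circle-collision test is one possible distance inequality, not necessarily the game's exact rule):

    from math import hypot

    def path_length(points):
        # sum of segment lengths |p_{i+1} - p_i| along the polyline
        return sum(hypot(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

    def circles_collide(c1, r1, c2, r2):
        # overlap iff the centre distance is less than the sum of radii
        return hypot(c1[0] - c2[0], c1[1] - c2[1]) < r1 + r2

    print(path_length([(0, 0), (3, 4), (3, 10)]))   # 5 + 6 = 11.0
    print(circles_collide((0, 0), 2, (3, 0), 1.5))  # True: 3 < 3.5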
ColourBloom
Colour space as 3-D vectors: (R, G, B). Blends are linear
interpolations: (1–t)c₁ + t c₂. Switching to HSL or HSV uses
cylindrical coordinates, with hue as angle, saturation as
radius, and lightness as height.
Learn more about Colour Space.
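A blend sketch in RGB as component-wise linear interpolation (rounding to integer channels is an assumption about the output format, not stated above):

    def blend(c1, c2, t):
        # (1 - t) * c1 + t * c2, applied to each channel
        return tuple(round((1 - t) * a + t * b) for a, b in zip(c1, c2))

    print(blend((255, 0, 0), (0, 0, 255), 0.5))  # (128, 0, 128), halfway from red to blue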
FractalGrow
Recursive branching with decay factor d and spread α. With
branching factor b, the total number of segments after depth k
is (b^(k+1) – 1)/(b – 1), so growth is exponential. Visuals
approximate self-similar fractals with non-integer dimension D
defined by N = s^(–D), where N copies each shrink by factor s.
Learn more about Fractals.
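Both formulas as a quick check, with b the branching factor and s the shrink factor of each self-similar piece, following the description above (the Koch numbers are just a worked example):

    from math import log

    def total_segments(b: int, k: int) -> int:
        # geometric series 1 + b + b^2 + ... + b^k = (b^(k+1) - 1) / (b - 1)
        return (b ** (k + 1) - 1) // (b - 1)

    def similarity_dimension(n_pieces: int, s: float) -> float:
        # solve N = s^(-D) for D, with s < 1 the per-piece scale factor
        return log(n_pieces) / log(1 / s)

    print(total_segments(2, 5))            # 63 segments in a depth-5 binary tree
    print(similarity_dimension(4, 1 / 3))  # ≈ 1.26, the Koch-curve dimension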