You are a company commander planning next quarter's training. Your unit has a limited budget of training hours and five possible events to resource. You can run 3 events on your own, or 4 if you coordinate with a sister company (at a cost). Each event has diminishing returns, some combinations unlock hidden synergies, unselected events suffer atrophy, a 4th event costs 5-15 points of overhead, and every evaluation is noisy. Your goal: allocate hours to maximize total readiness.
Individual Payoff Curves
Your Simulation Results
Score Breakdown

                  
◆ THE BLUEPRINT
What You're Looking At

A noisy resource allocation problem. Your unit has a limited training budget. Choose 3-4 of 5 events, allocate hours, and maximize total readiness.

The Setup

Each event follows a diminishing-returns payoff curve. Some event combinations unlock hidden synergies. Unselected events suffer atrophy. A 4th event costs 5-15 points of overhead. Every evaluation adds noise.
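The shape of a diminishing-returns curve can be sketched in a few lines of Python. This is illustrative only — a saturating exponential with hypothetical `cap` and `rate` constants, not the app's actual payoff function:

```python
import math

def payoff(hours: float, cap: float = 100.0, rate: float = 0.05) -> float:
    """Diminishing-returns payoff: fast early gains that flatten toward a cap."""
    return cap * (1.0 - math.exp(-rate * hours))

# Doubling hours from 20 to 40 yields far less than double the readiness.
print(round(payoff(20), 1), round(payoff(40), 1))  # → 63.2 86.5
```

Under this curve, the first 20 hours buy about 63 points but the next 20 buy only about 23 more — the flattening you see in the payoff charts.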

Why It's Hard

The search space is large (discrete event selection + continuous hour allocation), the fitness landscape is noisy, and synergies are hidden. Brute-force search is impractical. This is exactly the kind of problem a genetic algorithm can solve.

How to Interact

Select events and set hours in the sidebar. Press Sim x1/x10/x50 to see noisy outcomes. The payoff curves show what you can observe in isolation. Toggle atrophy and noise bands to see the full picture.

What is a Chromosome?

A genetic algorithm represents each candidate solution as a chromosome. Each chromosome has 10 genes:

  • Genes 1-5: binary (0 = no, 1 = yes). Exactly 3 or 4 must be selected (selecting 4 incurs overhead).
  • Genes 6-10: hours allocated to each event. Must sum to the budget.
  • Unselected events still carry weight genes, but these are zeroed out during decoding.
Example Chromosome
◆ THE BLUEPRINT
What You're Looking At

A 10-gene chromosome that encodes a candidate solution to the resource allocation problem.

The Encoding

Genes 1-5 are binary selection genes (0 = skip, 1 = select). Genes 6-10 are hour-weight genes. Decoding normalizes the active weights so hours sum to the budget. Unselected events have their weight genes zeroed out.
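This decoding step can be sketched in Python. The budget of 200 hours is a hypothetical value for illustration, and the sketch assumes at least one selected event has a positive weight:

```python
def decode(genes, budget=200.0):
    """Map a 10-gene chromosome to an hour allocation.

    genes[0:5]  -- binary selection bits (0 = skip, 1 = select)
    genes[5:10] -- raw hour-weight genes
    """
    select, weights = genes[:5], genes[5:]
    # Zero out the weights of unselected events, then normalize the
    # remaining weights so the allocated hours sum to the budget.
    active = [w if s else 0.0 for s, w in zip(select, weights)]
    total = sum(active)
    return [budget * w / total for w in active]
```

Note that only the ratios of the active weights matter: `[2, 5, 1, 1, 9]` and `[4, 10, 2, 2, 18]` decode to the same allocation, which is the equivalence the panel asks you to notice.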

Why This Encoding

Separating discrete choices (which events) from continuous choices (how many hours) lets crossover and mutation operate naturally on both. A repair operator enforces the 3-or-4 constraint after genetic operations.
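A repair operator of this kind can be sketched as follows — a hypothetical version that randomly toggles selection bits until the count lands in {3, 4}; the app's actual repair rule may differ:

```python
import random

def repair(select):
    """Force the number of selected events into {3, 4} after crossover/mutation."""
    select = list(select)
    on = [i for i, s in enumerate(select) if s]
    off = [i for i, s in enumerate(select) if not s]
    while len(on) < 3:      # too few events: switch a random one on
        i = off.pop(random.randrange(len(off)))
        select[i] = 1
        on.append(i)
    while len(on) > 4:      # too many events: switch a random one off
        i = on.pop(random.randrange(len(on)))
        select[i] = 0
        off.append(i)
    return select
```

Chromosomes that already satisfy the constraint pass through unchanged, so repair only perturbs invalid offspring.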

How to Interact

Press Generate New Example to see a random chromosome. Trace how the raw genes map to an allocation and a fitness score. Notice how two chromosomes with different weight genes can decode to the same allocation if their ratios are identical.

How Selection Works

Selection decides which individuals get to be parents for the next generation. Better solutions should be chosen more often, but not always. Some randomness preserves diversity and prevents premature convergence.

Population Fitness
◆ THE BLUEPRINT
What You're Looking At

Three selection methods for choosing parents from a population. Selection pressure controls the tradeoff between exploiting the best solutions and exploring new ones.

The Methods

Tournament: pick k random individuals; the fittest wins. Roulette: selection probability proportional to raw fitness. Rank: selection probability proportional to rank rather than raw fitness. Rank-based selection prevents a single dominant individual from monopolizing reproduction.
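The three methods can be sketched in Python. These are standard textbook versions that return a population index, not the app's actual code; `roulette` assumes all fitness values are positive:

```python
import random

def tournament(fitness, k=3):
    """Pick k random individuals; the fittest of the k wins."""
    return max(random.sample(range(len(fitness)), k), key=lambda i: fitness[i])

def roulette(fitness):
    """Selection probability proportional to raw fitness (fitness must be > 0)."""
    return random.choices(range(len(fitness)), weights=fitness)[0]

def rank_select(fitness):
    """Probability proportional to rank: worst gets weight 1, best gets weight n."""
    order = sorted(range(len(fitness)), key=lambda i: fitness[i])
    ranks = [0] * len(fitness)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return random.choices(range(len(fitness)), weights=ranks)[0]
```

Note how rank selection flattens the weights: an individual ten times fitter than the rest gets only the top rank, not ten times the selection probability.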

Selection Pressure

Stronger pressure (larger tournament size, or roulette with spread-out fitness) converges faster but risks losing diversity. Weaker pressure preserves exploration but converges slowly. Elitism guarantees the best individuals survive regardless of selection method.

How to Interact

Try each method and press Select a Parent. Watch how elites (blue) always pass through. Compare tournament sizes. Toggle elitism on and off to see its effect on the population bar chart.

Genetic Operators
How Crossover Works

Crossover combines two parents to produce one child.

Parent 1 + Parent 2 = Child
Decoded Allocations
How Mutation Works
Before and After
Where Did the Draws Land?
Fitness Impact
◆ THE BLUEPRINT
What You're Looking At

Two genetic operators that create variation. Crossover combines two parents into a child. Mutation perturbs individual genes.

Crossover

Single-point: one cut, child gets P1 before and P2 after. Two-point: two cuts, middle segment from P2. Uniform: each gene independently from P1 or P2 with equal probability. Purple genes show where the repair operator flipped selection bits to enforce the 3-or-4 constraint.
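The three crossover operators can be sketched in Python. These are generic versions, not the app's code; in the app, the repair operator would then run on the child to restore the 3-or-4 constraint:

```python
import random

def single_point(p1, p2):
    """One cut: child takes p1 before the cut and p2 after it."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def two_point(p1, p2):
    """Two cuts: child takes the middle segment from p2, the rest from p1."""
    a, b = sorted(random.sample(range(1, len(p1)), 2))
    return p1[:a] + p2[a:b] + p1[b:]

def uniform(p1, p2):
    """Each gene is drawn independently from p1 or p2 with equal probability."""
    return [random.choice(pair) for pair in zip(p1, p2)]
```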

Mutation

Each gene has a chance of being mutated. Selection genes flip on/off. Weight genes get Gaussian nudges controlled by sigma. Paired transfers move weight between two events, preserving the budget. Unpaired nudges are independent and can violate the budget.
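Both mutation styles can be sketched in Python. The `rate` and `sigma` defaults are hypothetical, and a real implementation would also clamp negative hours:

```python
import random

def mutate(select, hours, rate=0.1, sigma=5.0, paired=True):
    """Mutate one chromosome: flip selection bits, perturb hour weights."""
    select = [1 - s if random.random() < rate else s for s in select]
    hours = list(hours)
    if paired and random.random() < rate:
        # Paired transfer: shift weight from one event to another.
        # The sum of hours (the budget) is preserved exactly.
        i, j = random.sample(range(len(hours)), 2)
        delta = random.gauss(0.0, sigma)
        hours[i] += delta
        hours[j] -= delta
    elif not paired:
        # Unpaired nudges: independent Gaussian perturbations.
        # The sum can drift away from the budget.
        hours = [h + random.gauss(0.0, sigma) if random.random() < rate else h
                 for h in hours]
    return select, hours
```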

How to Interact

On the Crossover tab, try each method and watch how the child inherits genes. On the Mutation tab, compare paired vs unpaired. Watch the hour sums: paired transfers always match the budget; unpaired nudges can drift.

Convergence
Best Allocation So Far
Population Diversity
Current Generation's Best
◆ THE BLUEPRINT
What You're Looking At

A full GA run. The convergence plot tracks best, generation-best, and mean fitness over generations. The diversity plot shows how similar the population becomes over time.

The Ceiling

The dashed line is the theoretical optimum given noise. The GA estimates fitness by averaging N noisy simulations. Even the true-best allocation will score below its deterministic value when evaluated this way.
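The averaging step can be sketched as follows, using a hypothetical noisy simulator with a fixed true score of 80 (for illustration only; the app's simulator and noise model are not shown here):

```python
import random

def estimate_fitness(allocation, simulate, n=10):
    """Estimate an allocation's fitness by averaging n noisy simulation runs."""
    return sum(simulate(allocation) for _ in range(n)) / n

# Hypothetical noisy simulator: a fixed 'true' score of 80 plus Gaussian noise.
def noisy_sim(alloc):
    return 80.0 + random.gauss(0.0, 5.0)
```

Raising `n` shrinks the standard error of the estimate by a factor of the square root of `n`, at the cost of `n` simulations per fitness evaluation — the exact tradeoff the "simulations per evaluation" control exposes.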

What the Plots Show

Overall Best (blue) never decreases. Generation Best (green) can fluctuate. Generation Mean (purple, dotted) reflects population quality. When all three converge, the population has settled.

The allocation bar chart compares the GA's best solution to the true optimum. The breakdown chart shows per-event scores for the current generation's best individual.

How to Interact

Press Run and watch convergence. Press Pause to freeze and inspect. Use Step to advance one generation at a time. Reset reinitializes the population.

Try increasing mutation rate if the GA plateaus early. Try larger populations for smoother convergence. Reduce simulations per evaluation to speed up each generation at the cost of noisier fitness estimates.

What to Look For

Each config runs independently on the same problem. Compare:

  • How fast does each config approach the optimum?
  • Does it plateau early or keep improving?
Convergence Comparison
Results Summary
◆ THE BLUEPRINT
What You're Looking At

Multiple GA configurations racing on the same problem. Each config runs independently with its own population, selection, and mutation parameters.

What to Compare

Convergence speed: how quickly does each config approach the ceiling? Final score: how close does it get? Diversity: does the population collapse early or maintain variation? Higher mutation rates explore more. Larger populations converge more smoothly. More simulations per evaluation reduce noise but cost more compute per generation.

Quick Lessons

The preset buttons set up classic comparisons. Mutation Rate shows the exploration-exploitation tradeoff. Population Size shows the effect of population diversity. Elitism shows guaranteed survival vs. pure selection. Noise shows how simulation count affects fitness estimation.

How to Interact

Pick a preset or customize configs manually. Press Run All Configs and compare the convergence curves. The dashed line is the theoretical ceiling. The results table shows final scores as a percentage of that ceiling.