Cortical Labs grew 200,000 human neurons on a chip and taught them to play Doom in a week. The CL1 biological computer is now shipping at $35,000 per unit with a Python API. For prediction market agent builders, this is the first real glimpse of what Layer 4 intelligence looks like when it runs on actual biology instead of silicon.

200,000 Neurons, One Week, Doom

Australian biotech startup Cortical Labs has demonstrated its CL1 biological computer playing the 1993 first-person shooter Doom — using approximately 200,000 living human neurons grown on a microelectrode array. The company posted the demo on YouTube on February 25, 2026, with source code available on GitHub.

This is a massive leap from Cortical Labs’ 2022 experiment, in which a predecessor system called DishBrain used 800,000 neurons and took over 18 months of training to play Pong. The CL1 learned Doom — a dramatically more complex 3D environment with enemies, weapons, and real-time navigation decisions — in roughly one week. Independent developer Sean Cole, who had minimal prior experience with biological computing, built the interface using the Cortical Cloud Python API.

The neurons aren’t winning any esports tournaments. Brett Kagan, Cortical Labs’ Chief Scientific Officer, acknowledged that the cells play like beginners. But they demonstrate adaptive, real-time goal-directed learning: the neurons can seek out enemies, fire weapons, and navigate 3D space. They learned this faster than equivalent silicon-based machine learning systems, according to the company.

How It Works

The CL1 is a self-contained unit where lab-grown human neurons — derived from adult donor skin or blood samples, reprogrammed into induced pluripotent stem cells, then differentiated into cortical neurons — sit on a 59-electrode microelectrode array. An internal life-support system handles temperature control, waste filtration, gas mixing, and circulation, keeping the neurons alive for up to six months.

The system runs on what Cortical Labs calls biOS (Biological Intelligence Operating System), which simulates an environment for the neurons and mediates the bidirectional electrical communication between the biological and silicon layers.

For Doom, the challenge was translating visual game data into electrical patterns that neurons without eyes could interpret. CTO David Hogan explained that Sean Cole piped the video feed from the game into patterns of electrical stimulation. When neurons fire in specific learned patterns, those patterns map to in-game actions — shoot, move, turn. The neurons receive reward feedback for correct actions (targeting an enemy) and stronger rewards for successful kills, reinforcing those firing patterns over time.
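That closed loop (encode a frame into stimulation, read back activity, map it to an action, reward success) can be sketched in Python. Everything below is illustrative: `StubCulture` stands in for the living culture, and the encoding and reward rules are invented for the sketch rather than taken from the published Doom code.

```python
import random

N_ELECTRODES = 59  # stimulation/recording sites on the CL1's array
ACTIONS = ["shoot", "move_forward", "turn_left", "turn_right"]

def encode_frame(frame, n_electrodes=N_ELECTRODES):
    """Downsample a grayscale frame (flat list of 0-255 ints) into a
    binary stimulation pattern, one bit per electrode."""
    bucket = max(1, len(frame) // n_electrodes)
    pattern = []
    for i in range(n_electrodes):
        chunk = frame[i * bucket:(i + 1) * bucket]
        mean = sum(chunk) / len(chunk) if chunk else 0
        pattern.append(1 if mean > 127 else 0)
    return pattern

class StubCulture:
    """Stand-in for the neuron culture: a random linear readout whose
    weights get nudged toward rewarded firing patterns."""
    def __init__(self):
        self.weights = [[random.random() for _ in range(N_ELECTRODES)]
                        for _ in ACTIONS]

    def respond(self, pattern):
        # Activity per action channel = weighted sum of stimulated electrodes.
        return [sum(w * p for w, p in zip(row, pattern)) for row in self.weights]

    def reward(self, pattern, action_idx, amount):
        # Reinforce the stimulation pattern that led to the rewarded action.
        for j, bit in enumerate(pattern):
            if bit:
                self.weights[action_idx][j] += amount

def step(culture, frame, reward_signal):
    """One closed-loop cycle: stimulate, read, act, reward."""
    pattern = encode_frame(frame)
    activity = culture.respond(pattern)
    action_idx = activity.index(max(activity))
    if reward_signal > 0:
        culture.reward(pattern, action_idx, reward_signal)
    return ACTIONS[action_idx]
```

The key property the sketch preserves is that reward feedback only strengthens whatever pattern-to-action mapping actually fired, which is the reinforcement scheme the article describes.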

Cortical Labs calls this Synthetic Biological Intelligence (SBI) — not artificial intelligence, but actual neurons doing actual computing on actual silicon.

The Numbers

  • Neurons per CL1 unit: ~200,000 (the human brain has ~86 billion)
  • Unit price: $35,000 ($20,000 each in 30-unit server racks)
  • Power consumption: 850–1,000 watts per full rack
  • First shipments: 115 units in 2025
  • Neuron lifespan: Up to 6 months per culture
  • Training time for Doom: ~1 week (vs. 18 months for Pong on DishBrain)
  • Funding raised: Over $11 million (investors include Horizons Ventures and In-Q-Tel)
  • Cloud access: Cortical Cloud — deploy Python code to remote CL1 units via SDK

For comparison, a single Nvidia H100 GPU draws around 700 watts on its own. A CL1 rack with 30 biological compute units draws roughly the same power as one and a half H100s.
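The arithmetic behind that comparison, using the figures quoted above:

```python
# Power figures from the article's numbers list.
H100_WATTS = 700            # one Nvidia H100 GPU
RACK_WATTS_HIGH = 1000      # top of the quoted 850-1,000 W range
UNITS_PER_RACK = 30

watts_per_unit = RACK_WATTS_HIGH / UNITS_PER_RACK   # worst-case W per CL1 unit
h100_equivalents = RACK_WATTS_HIGH / H100_WATTS     # whole rack, in H100s
```

Even at the top of the quoted range, a single CL1 works out to about 33 W, under 5 percent of one H100's draw.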

What This Means for Agent Intelligence

Here’s where it gets interesting for the prediction market agent stack.

The AgentBets four-layer framework defines agent intelligence as Layer 4 — the decision-making brain that sits on top of identity, wallet, and trading infrastructure. Today, Layer 4 is dominated by LLMs like Claude and GPT, fine-tuned models, and statistical systems like Bayesian inference engines. These are powerful but expensive: they require massive training datasets, consume enormous energy, and struggle with true real-time adaptation in volatile environments.

Biological neurons offer a fundamentally different computational model:

Energy efficiency. The human brain runs on approximately 20 watts. Biological neurons operate in the milliwatt range. For prediction market agents that need to run continuously — monitoring odds, detecting arbitrage, executing trades — the energy cost of the intelligence layer matters. A biological compute layer could run adaptive pattern recognition at a fraction of the cost of a GPU inference cluster.

Minimal training data. Silicon-based AI needs millions of examples to learn a pattern. Biological neurons adapt with dramatically less input. For prediction markets, where regime changes happen fast (a regulation drops, a market structure shifts, a new platform launches), an intelligence layer that can adapt in real time without retraining on massive datasets is a structural advantage.

Native uncertainty handling. Neurons evolved to process noisy, incomplete, real-time information — exactly what prediction markets produce. The CL1’s Doom demo showed neurons making decisions under uncertainty (enemies appearing unexpectedly, navigating unknown environments) without the brittle failure modes that LLMs exhibit when inputs deviate from training distributions.

Programmable via Python. The Cortical Cloud API means developers can deploy code to living neurons using the same language they use to build Polymarket trading bots or connect to the Kalshi API. The interface problem — getting code to talk to biology — is solved at a basic level.
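As a structural sketch of what "deploy Python to a remote unit" could look like, here is a mock session object for local dry runs. The class and method names are assumptions invented for illustration; the article confirms a Python SDK exists, not this interface.

```python
class CorticalCloudSession:
    """Mock of a remote-unit session. All names here are hypothetical,
    not the real Cortical Cloud SDK."""
    def __init__(self, unit_id):
        self.unit_id = unit_id
        self.sent = []          # stimulation patterns "sent" to the unit

    def stimulate(self, pattern):
        # A real client would transmit this to the remote CL1.
        self.sent.append(pattern)

    def read(self):
        # A real client would return recorded spikes; the mock returns silence.
        return [0] * len(self.sent[-1]) if self.sent else []

def run_episode(session, patterns):
    """Drive one stimulate/read cycle per input pattern — the same
    event-loop shape a Polymarket or Kalshi bot uses for poll/act cycles."""
    readings = []
    for pattern in patterns:
        session.stimulate(pattern)
        readings.append(session.read())
    return readings
```

The point is the parity: the loop that drives a trading bot and the loop that drives a neuron culture are the same few lines of Python.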

The Agent Infrastructure Angle

Today, a prediction market agent’s intelligence layer typically looks like this:

Market Data → LLM Analysis → Strategy Engine → Trade Signal

A hybrid SBI architecture could look like this:

Market Data → Biological Pattern Recognition (CL1) → Strategy Validation (LLM) → Trade Signal

The biological layer handles the fast, adaptive, energy-efficient pattern recognition — detecting anomalies in odds movements, sensing liquidity shifts, recognizing arbitrage windows. The LLM layer handles the structured reasoning — evaluating whether the pattern matches a known strategy, checking risk parameters, generating human-readable explanations for audit logs.
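A minimal sketch of that hybrid pipeline, with both layers replaced by stand-ins: `biological_layer` is a simple 3-sigma anomaly detector playing the role of the CL1, and `llm_layer` is a rule stub playing the role of the model. All names, thresholds, and risk rules are assumptions for the sketch.

```python
from dataclasses import dataclass

@dataclass
class TradeSignal:
    market: str
    side: str          # "buy" / "sell" / "hold"
    confidence: float
    rationale: str     # human-readable explanation for the audit log

def biological_layer(odds_history):
    """Stand-in for CL1 pattern recognition: flag the latest odds move
    if it exceeds 3 standard deviations of the preceding moves."""
    moves = [b - a for a, b in zip(odds_history, odds_history[1:])]
    if len(moves) < 2:
        return False
    prior = moves[:-1]
    mean = sum(prior) / len(prior)
    std = (sum((m - mean) ** 2 for m in prior) / len(prior)) ** 0.5 or 1e-9
    return abs(moves[-1] - mean) > 3 * std

def llm_layer(market, anomaly):
    """Stand-in for LLM validation: check the flag against risk rules
    and produce the explanation."""
    if not anomaly:
        return TradeSignal(market, "hold", 0.0, "no anomaly detected")
    return TradeSignal(market, "buy", 0.6,
                       "odds moved >3 sigma vs recent history; within risk limits")

def pipeline(market, odds_history):
    return llm_layer(market, biological_layer(odds_history))
```

The division of labor matches the text: the fast layer only flags patterns, and the slow layer decides whether a flag becomes a trade.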

This isn’t production-ready. The CL1 has 200,000 neurons. You’d need orders of magnitude more for anything approaching useful market analysis. The neurons have a six-month lifespan, creating maintenance and continuity challenges for always-on trading agents. And the ethical questions around using human-derived neurons for financial speculation are unresolved.

But the trajectory matters. Cortical Labs went from 18 months to train Pong to one week to train Doom. The Cortical Cloud makes the technology remotely accessible to any developer with Python skills. And the company’s roadmap includes larger-scale neuron networks and CL1 server clusters at research institutions.

What Builders Should Watch

Three things for prediction market agent developers to track:

Cortical Cloud API capabilities. The API currently supports basic input/output stimulation and read operations. As the interface matures, watch for higher-level abstractions that would allow feeding market data streams directly to neuronal networks and reading back interpretable signals. The GitHub repo has the Doom source code — study it for the data encoding patterns.
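One way such a market-data encoding could work, purely as an assumption (the Doom repo's actual scheme may differ): map recent odds changes onto electrodes as bounded stimulation rates, and collapse spike counts back into a single deviation signal.

```python
N_ELECTRODES = 59

def encode_odds_ticks(ticks, n=N_ELECTRODES):
    """Map the last n odds changes onto electrodes. Magnitude sets a
    capped stimulation rate in Hz; sign selects one of two phases.
    The scaling constants are arbitrary choices for this sketch."""
    rates = []
    for delta in ticks[-n:]:
        hz = min(40, abs(delta) * 1000)          # cap at 40 Hz
        rates.append((hz, 0 if delta >= 0 else 1))
    rates += [(0.0, 0)] * (n - len(rates))       # pad unused electrodes
    return rates

def decode_spike_counts(counts, baseline):
    """Collapse per-electrode spike counts into one scalar: fractional
    deviation of total activity from a rolling baseline."""
    total = sum(counts)
    return (total - baseline) / baseline if baseline else 0.0
```

The interesting engineering questions live in exactly these two functions, which is why the Doom repo's encoding code is worth studying.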

Neuron network scale. 200,000 neurons is roughly equivalent to a simple insect nervous system. Cortical Labs’ roadmap targets larger networks. The inflection point for agent applications is likely in the millions-of-neurons range, where biological systems could handle the complexity of multi-market, multi-asset prediction analysis.

Regulatory frameworks. The CL1 currently operates under stem cell research ethical guidelines. As biological computing moves toward commercial applications in finance, new regulatory frameworks will emerge. Cortical Labs has already attracted investment from In-Q-Tel (the CIA’s venture arm) and Horizons Ventures — signals that institutional interest is ahead of regulatory clarity.

The Bigger Picture

Cortical Labs’ Doom demo is a milestone, not a product launch for trading agents. But it represents something fundamental: the first time a commercially available, developer-accessible biological computer has demonstrated real-time adaptive learning in a complex environment.

For the prediction market agent ecosystem, the intelligence layer is the biggest unsolved problem. LLMs are expensive, slow to adapt, and energy-intensive. Fine-tuned models are brittle. Statistical models are rigid. Biological computing offers a fourth path — one that runs on biology’s four-billion-year head start in processing uncertain, real-time information.

The agent betting stack has four layers. Identity, wallet, and trading infrastructure are maturing fast. Intelligence is where the next breakthrough lives. And as of March 2026, that breakthrough might have actual neurons in it.


Browse prediction market agents and tools in the AgentBets marketplace, or explore the full agent intelligence guide for a deep dive on current Layer 4 options.