Ethereum Foundation Shifts Focus: Security Trumps Speed with Hard 128-Bit Deadline for 2026

The Ethereum Foundation just dropped a bombshell: speed takes a backseat. The new mandate? Provable security, with a strict 128-bit cryptographic standard locked in for the end of 2026.
The New North Star
Forget chasing transaction throughput. The Foundation's core developers are now laser-focused on fortifying the network's cryptographic bedrock. The message is clear: raw speed is meaningless if the foundation is shaky.
The 128-Bit Rule: No Exceptions
This isn't a suggestion; it's a deadline. By the end of 2026, every zkEVM proof system headed for the L1 must clear the 128-bit security threshold, and the Layer-2s that plan to reuse that machinery inherit the same bar. It's a move that sidelines projects built on weaker cryptography, forcing a top-down security overhaul across the entire ecosystem.
Why the Sudden Pivot?
Pressure has been mounting. As institutional money eyes the chain, the tolerance for theoretical vulnerabilities plummets. The Foundation is preemptively slamming the door on future attack vectors, betting that long-term trust will outweigh short-term scaling hype.
The Ripple Effect
Expect a scramble. Development roadmaps will be torn up. Teams prioritizing slick user experiences over cryptographic rigor now face a brutal refactor. It's a win for purists and a nightmare for anyone who treated security as an afterthought—or worse, a marketing checkbox.
The move signals a mature, if less glamorous, phase for Ethereum. While traders chase the next shiny meme coin, the architects are quietly pouring concrete. After all, what's the point of building a digital Wall Street if the vault doors are made of plywood?
Three-Milestone Roadmap
The post lays out a clean roadmap with three hard stops. First, by the end of February 2026, every zkEVM team in the race must plug its proof system and circuits into “soundcalc,” an EF-maintained tool that computes security estimates based on current cryptanalytic bounds and the scheme's parameters.
The story here is “common ruler.” Instead of each team quoting their own bit security with bespoke assumptions, soundcalc becomes the canonical calculator and can be updated as new attacks emerge.
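The post doesn't spell out soundcalc's internals, but for hash-based schemes such estimates typically combine a per-query soundness term, any grinding bits, and a cap from the hash function, and the answer depends heavily on whether you count proven bounds or optimistic conjectures. The sketch below is a simplified stand-in under those assumptions; the names and formulas are illustrative, not the real tool's model.

```python
# Simplified stand-in for a soundcalc-style estimate; all names and formulas
# here are illustrative assumptions, not the real tool's model.
import math

def estimate_bits(rate: float, num_queries: int, grinding_bits: int,
                  hash_output_bits: int, proven: bool = True) -> float:
    """Rough bit-security estimate for a FRI/WHIR-style hash-based proof.

    In the proven (Johnson-bound) regime each query catches a cheating prover
    except with probability roughly sqrt(rate); the common aggressive
    conjecture uses rate itself. Grinding adds its proof-of-work bits, and the
    total is capped by collision resistance of the Merkle hash.
    """
    per_query_error = math.sqrt(rate) if proven else rate
    query_bits = num_queries * -math.log2(per_query_error)
    return min(query_bits + grinding_bits, hash_output_bits / 2)

# Same parameters, two rulers: exactly the ambiguity a shared calculator removes.
params = dict(rate=1 / 4, num_queries=80, grinding_bits=20, hash_output_bits=256)
print(estimate_bits(**params, proven=True))   # 100.0 bits provable
print(estimate_bits(**params, proven=False))  # 128.0 under the conjecture (hash-capped)
```

With these toy parameters the same configuration scores 100 bits on the proven ruler but maxes out the hash cap on the conjectured one, which is exactly the kind of gap that previously let similar systems quote very different numbers.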
Second, “Glamsterdam” by the end of May 2026 demands at least 100-bit provable security via soundcalc, final proofs at or below 600 kilobytes, and a compact public explanation of each team's recursion architecture with a sketch of why it should be sound.
That quietly walks back the original 128-bit requirement for early deployment and treats 100 bits as an interim target.
Third, “H-star” by the end of 2026 is the full bar: 128-bit provable security by soundcalc, proofs at or below 300 kilobytes, plus a formal security argument for the recursion topology. That is where this becomes less about engineering and more about formal methods and cryptographic proofs.
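To make those gates concrete, here is a hypothetical compliance check with the thresholds from the post baked in; the data structure, field names, and exact dates are my own reading, not anything the EF publishes.

```python
# Hypothetical compliance check; the thresholds come from the post, while the
# field names, dates, and the checker itself are illustrative.
from dataclasses import dataclass

@dataclass
class Milestone:
    name: str
    deadline: str               # end-of-month reading of the post's deadlines
    min_provable_bits: int      # provable security as reported by soundcalc
    max_proof_kb: int
    recursion_requirement: str

GLAMSTERDAM = Milestone("Glamsterdam", "2026-05-31", 100, 600,
                        "public explanation plus soundness sketch")
H_STAR = Milestone("H-star", "2026-12-31", 128, 300,
                   "formal security argument for the recursion topology")

def meets(m: Milestone, provable_bits: int, proof_kb: int) -> bool:
    return provable_bits >= m.min_provable_bits and proof_kb <= m.max_proof_kb

# A team at 110 provable bits with 450 kB proofs clears the first gate only.
print(meets(GLAMSTERDAM, 110, 450), meets(H_STAR, 110, 450))  # True False
```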
Technical Levers
The EF points to several concrete tools intended to make the 128-bit, sub-300-kilobyte target feasible. They highlight WHIR, a new Reed-Solomon proximity test that doubles as a multilinear polynomial commitment scheme.
WHIR offers transparent, post-quantum security and delivers smaller proofs and faster verification than older FRI-style schemes at the same security level.
Benchmarks at 128-bit security show proofs roughly 1.95 times smaller and verification several times faster than baseline constructions.
They reference “JaggedPCS,” a set of techniques for encoding execution traces as polynomials without excessive padding, letting provers skip wasted work while still producing succinct commitments.
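The post doesn't describe the construction itself, but the problem it targets is easy to quantify: when trace columns of very different lengths are all padded up to the tallest column's power-of-two height, much of the committed area is dead weight. The numbers below are invented for illustration; this is not the JaggedPCS scheme, just the overhead it aims to claw back.

```python
# Illustration of padding overhead only; the column layout is made up and this
# is not the JaggedPCS construction.
def next_pow2(n: int) -> int:
    return 1 << (n - 1).bit_length()

# Trace columns of very different lengths, as a zkEVM with many tables produces.
columns = {"main_cpu": 1_000_000, "keccak": 130_000, "memory": 600_000}

useful = sum(columns.values())
padded = len(columns) * next_pow2(max(columns.values()))  # everything padded to the tallest column

print(f"useful cells:    {useful:,}")
print(f"committed cells: {padded:,} ({padded / useful:.2f}x the useful work)")
```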
They mention “grinding,” a brute-force search over protocol randomness that buys extra soundness bits so proofs can be made cheaper or smaller without dropping below the security target, and “well-structured recursion topology,” meaning layered schemes in which many smaller proofs are aggregated into a single final proof with carefully argued soundness.
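In hash-based systems, grinding typically amounts to a small proof of work: the prover searches for a nonce whose hash against the Fiat-Shamir transcript clears a difficulty target, and each bit of difficulty adds roughly one bit to the soundness budget, which in turn allows fewer queries and a smaller proof. A minimal sketch of that idea, with the hash choice and transcript handling simplified:

```python
# Minimal grinding sketch: find a nonce whose hash has `bits` leading zero bits.
# The hash choice, transcript handling, and encoding here are simplified
# assumptions, not any team's actual prover code.
import hashlib

def grind(transcript: bytes, bits: int) -> int:
    target = 1 << (256 - bits)  # accept any digest below this threshold
    nonce = 0
    while True:
        digest = hashlib.sha256(transcript + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce        # the verifier re-hashes once to check the work
        nonce += 1

nonce = grind(b"fiat-shamir transcript so far", bits=16)
print(f"nonce {nonce} carries roughly 16 extra bits of soundness")
```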
Exotic polynomial math and recursion tricks are being used to shrink proofs back down after cranking security up to 128 bits.
Independent work like Whirlaway uses WHIR to build multilinear STARKs with improved efficiency, and more experimental polynomial-commitment constructions are being built from data-availability schemes.
The math is moving fast, but it's also moving away from assumptions that looked safe six months ago.
What Changes and the Open Questions
If proofs are consistently ready within 10 seconds and stay under 300 kilobytes, Ethereum can increase the gas limit without forcing validators to re-execute every transaction.
Validators would instead verify a small proof, letting block capacity grow while keeping home-staking realistic. This is why the EF's earlier real-time post tied latency and power explicitly to “home proving” budgets like 10 kilowatts and sub-$100,000 rigs.
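The shift is easiest to see as the validator's inner loop: today it re-executes every transaction, whereas with an L1 zkEVM it checks one succinct proof whose size is capped so it can still travel the gossip network. The sketch below is purely illustrative; the types, names, and checks are hypothetical, not drawn from any client.

```python
# Hypothetical sketch of the two validation paths; none of these names come
# from an actual Ethereum client.
from dataclasses import dataclass

MAX_PROOF_BYTES = 300 * 1024  # the end-of-2026 proof-size cap from the roadmap

@dataclass
class Block:
    transactions: list
    claimed_state_root: bytes
    execution_proof: bytes | None = None

def validate_by_reexecution(block: Block, execute) -> bool:
    # Status quo: validation cost grows with the gas every transaction burns.
    return execute(block.transactions) == block.claimed_state_root

def validate_by_proof(block: Block, verify) -> bool:
    # zkEVM path: cost is nearly constant, so the gas limit can rise without
    # pricing home stakers out of the network.
    if block.execution_proof is None or len(block.execution_proof) > MAX_PROOF_BYTES:
        return False
    return verify(block.execution_proof, block.claimed_state_root)
```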
The combination of large security margins and small proofs is what makes an “L1 zkEVM” a credible settlement layer. If those proofs are both fast and provably 128-bit secure, L2s and zk-rollups can reuse the same machinery via precompiles, and the distinction between “rollup” and “L1 execution” becomes more of a configuration choice than a rigid boundary.
Real-time proving is currently an off-chain benchmark, not an on-chain reality. The latency and cost numbers come from EthProofs' curated hardware setups and workloads.
There is still a gap between that and thousands of independent validators actually running these provers at home. The security story is in flux. The whole reason soundcalc exists is that STARK and hash-based SNARK security parameters keep moving as conjectures are disproven.
Recent results have redrawn the line between “definitely safe,” “conjecturally safe,” and “definitely unsafe” parameter regimes, meaning today's “100-bit” settings may be revised again as new attacks emerge.
It's not clear whether all major zkEVM teams will actually hit 100-bit provable security by May 2026 and 128-bit by December 2026 while staying under the proof-size caps, or whether some will quietly accept lower margins, rely on heavier assumptions, or push verification off-chain for longer.
The hardest part may not be math or GPUs, but formalizing and auditing the full recursion architectures.
The EF admits that different zkEVMs often compose many circuits with substantial “glue code” between them, and that documenting and proving soundness for those bespoke stacks is essential.
That opens a long tail of work for projects like Verified-zkEVM and formal verification frameworks, which are still early and uneven across ecosystems.
A year ago, the question was whether zkEVMs could prove fast enough. That question is answered.
The new question is whether they can prove soundly enough, at a security level that doesn't depend on conjectures that may break tomorrow, with proofs small enough to propagate across Ethereum's P2P network, and with recursion architectures formally verified enough to anchor hundreds of billions of dollars.
The performance sprint is over. The security race just started.