Glad to see this as a concrete proposal. The phased approach makes this workable — it gives validators time to plan and lets the community observe each step before the next one lands. A single abrupt cut would create unnecessary opposition from people who might otherwise support the direction.
The economics reinforce the case. Even under a reduced inflation model, per-validator rewards at 500 validators are roughly 13% above current levels, and at your final target of 120 they’re close to 5x. The “validators can’t pay their bills” problem gets better at every step of the reduction, not worse. That’s the argument that should win over the validators who might initially see this as a threat.
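To make the scaling concrete, here is a small sketch of the inverse relationship between set size and per-validator rewards. The current set size (1130) and the inflation cut (50%) are hypothetical values chosen only so the output lines up with the rough figures above; they are not the proposal's actual parameters.

```python
# Per-validator reward scales as (reduced pool) / (validator count).
# Both constants below are illustrative assumptions, not proposal values.
CURRENT_VALIDATORS = 1130   # assumed current set size
INFLATION_FACTOR = 0.5      # assumed reduced-inflation pool vs. today

def per_validator_multiplier(n_validators: int) -> float:
    """Per-validator reward relative to today's per-validator reward."""
    return INFLATION_FACTOR * CURRENT_VALIDATORS / n_validators

for n in (500, 120):
    print(f"{n} validators: {per_validator_multiplier(n):.2f}x current reward")
# 500 validators: 1.13x current reward
# 120 validators: 4.71x current reward
```

The key point the numbers illustrate: even with the reward pool cut in half, shrinking the set dominates, so individual validator economics improve at every phase.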
Starting at 24 cores rather than the 32 approved in WFC 573 makes sense given current demand. There is no point provisioning capacity nobody is using. Scaling back up to 32 — or beyond — when activity justifies it is a more honest approach than maintaining idle infrastructure on speculation.
On the decentralization question — this is where it gets hard. At 120 validators, the barrier to entry rises and the set could consolidate around exchanges and large holders who can absorb the operational costs. Something like an on-chain nodes programme that factors in entity diversity or geographic distribution could help, but the design matters a lot — poorly structured incentives would just create a new set of problems. Curious whether others in the validator community have thoughts on what that would look like.
The reduction schedule and the broader economic picture are connected. Fewer validators means lower security costs, which opens the door to adjusting inflation parameters without hurting anyone’s bottom line. The burn-based tokenomics work in this forum was designed with exactly this sequencing in mind — cost side first, supply side second. Seeing the cost side move forward matters.