Proposal discussion: Adapting Kusama to demand and its JAM future

Last year, when I asked the Kusama community what our take on JAM should be, whether we should try to share a home with Polkadot’s JAM deployment or be “lightweight and independent”, the community agreed on the latter option, a great choice IMO! Kusama has been underutilized for many years as a glorified testnet, but there is a strong desire to realize its full potential as an independent hub for innovation and the leading edge of Web3 and the cypherpunk movement, a spirit well aligned with W3F’s Kusama Vision.

The main highlight of this wish for change is that Kusama’s infrastructure would need to be reduced to 32 cores (along with a corresponding reduction in the number of validators), an estimate of the size required for a JAM chain to produce faster 1-second block times and an opportunity to bring new use cases, like real-time applications, to the ecosystem.

JAM is still under development and the reduction wouldn’t be needed for some time. However, I see an opportunity to propose this reduction earlier and in a phased way, not only to prepare for what’s next to come but also as a response to market conditions and to the reality that there is virtually zero coretime demand in the ecosystem. We need to scale down: validators can’t pay their bills, and there is no point in sustaining a huge infrastructure that no one (but the Virto team?) is using.

A gradual reduction

Suddenly cropping the supply of cores and “firing” most validators would bring a lot of unnecessary friction. I propose a gradual reduction over the course of 6 months, going from our current setup of 1000 validators (700 para-validators) and 140 cores to a final setup of 24 cores backed by 5 para-validators each, for a grand total of 120 validators. Later, when we prove there is demand for more than 24 cores, we can always scale back up to 32 cores, and even beyond if the technology allows it.

The referendum would be a single root proposal with a batch that schedules the following adjustments (a rough sketch of such a batch follows the list):

  • Immediately: set validators to 700 on the staking system, matching the current number of para-validators.
  • 1 month later: set cores to 120, validators to 600
  • 2 months later: set cores to 100, validators to 500
  • 3 months later: set cores to 80, validators to 400
  • 4 months later: set cores to 60, validators to 300
  • 5 months later: set cores to 40, validators to 200
  • 6 months later: set cores to 24, validators to 120
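
For illustration, here is a minimal sketch of what that batch could look like with polkadot-js. It assumes the relevant calls are staking.setValidatorCount, configuration.setCoretimeCores, scheduler.scheduleAfter and utility.batchAll, that staking still lives on the relay chain, and that a month is approximated in 6-second blocks; the exact calls, delays and origins would need to be checked against the live Kusama runtime before submission.

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';

// Rough sketch only: pallet/call names, delays and origins must be
// verified against the live Kusama runtime metadata before submission.
const BLOCKS_PER_MONTH = (30 * 24 * 60 * 60) / 6; // ~30 days of 6s blocks

async function buildReductionBatch() {
  const api = await ApiPromise.create({
    provider: new WsProvider('wss://kusama-rpc.polkadot.io'),
  });

  // Step 0: applied as soon as the referendum enacts.
  const immediate = api.tx.staking.setValidatorCount(700);

  // Remaining steps: [months after enactment, cores, validators].
  const steps: Array<[number, number, number]> = [
    [1, 120, 600],
    [2, 100, 500],
    [3, 80, 400],
    [4, 60, 300],
    [5, 40, 200],
    [6, 24, 120],
  ];

  const scheduled = steps.map(([months, cores, validators]) =>
    api.tx.scheduler.scheduleAfter(
      months * BLOCKS_PER_MONTH, // delay from enactment, in blocks
      null,                      // not periodic
      0,                         // priority
      api.tx.utility.batchAll([
        api.tx.configuration.setCoretimeCores(cores),
        api.tx.staking.setValidatorCount(validators),
      ]),
    ),
  );

  // Submitted as the proposal of a single root-track referendum.
  return api.tx.utility.batchAll([immediate, ...scheduled]);
}
```

Scheduling each step relative to enactment rather than at absolute block heights keeps the whole plan in one call and avoids having to predict the enactment block.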

Next steps and thoughts

I had initially been planning to propose this reduction since last year without asking many questions, but I believe taking the time to get feedback from the different affected parties was the better approach. It has taken even longer than it should have, as I was too busy (and lazy) to run the necessary tests, but the time has come! I would like to hear thoughts from the community before submitting this referendum; there might still be topics to consider, but the necessary calls are pretty much figured out.

I’m also curious to hear ideas about what comes next for validators. This change will reduce decentralization and could concentrate operations in the hands of a few big players like centralized exchanges… On-chain decentralized nodes program, anyone?


Glad to see this as a concrete proposal. The phased approach makes this workable — it gives validators time to plan and lets the community observe each step before the next one lands. A single abrupt cut would create unnecessary opposition from people who might otherwise support the direction.

The economics reinforce the case. Even under a reduced inflation model, per-validator rewards at 500 validators are roughly 13% above current levels, and at your final target of 120 they’re close to 5x. The “validators can’t pay their bills” problem gets better at every step of the reduction, not worse. That’s the argument that should win over the validators who might initially see this as a threat.
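
For anyone who wants to check the shape of that arithmetic: per-validator rewards are roughly the total staking payout divided by the active validator count, so shrinking the set raises individual rewards even when the overall payout also shrinks. A back-of-the-envelope sketch, where the 0.6 payout factor is an assumed reduced-inflation scenario rather than a figure from the proposal (the exact percentages quoted above depend on the inflation parameters chosen at each step):

```ts
// Illustrative only: payoutFactor is an assumed reduced-inflation scenario.
function perValidatorRewardRatio(
  currentValidators: number,
  newValidators: number,
  payoutFactor: number, // new total payout relative to today's
): number {
  return (payoutFactor * currentValidators) / newValidators;
}

console.log(perValidatorRewardRatio(1000, 500, 0.6)); // 1.2 -> ~20% above current
console.log(perValidatorRewardRatio(1000, 120, 0.6)); // 5.0 -> ~5x current
```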

Starting at 24 cores rather than the 32 approved in WFC 573 makes sense given current demand. No point provisioning for capacity nobody is using. Scaling back up to 32 — or beyond — when activity justifies it is a more honest approach than maintaining infrastructure on speculation.

On the decentralization question — this is where it gets hard. At 120 validators, the barrier to entry rises and the set could consolidate around exchanges and large holders who can absorb the operational costs. Something like an on-chain nodes programme that factors in entity diversity or geographic distribution could help, but the design matters a lot — poorly structured incentives would just create a new set of problems. Curious whether others in the validator community have thoughts on what that would look like.

The reduction schedule and the broader economic picture are connected. Fewer validators means lower security costs, which opens the door to adjusting inflation parameters without hurting anyone’s bottom line. The burn-based tokenomics work in this forum was designed with exactly this sequencing in mind — cost side first, supply side second. Seeing the cost side move forward matters.