Is it time to rethink the core pricing model for Polkadot?

The DOT pricing model has been widely discussed, yet with recent experience on both Kusama and Polkadot, we may need to reconsider the current approach.

The primary goal of the core pricing model should be to provide customers with predictable and stable pricing, a point underscored in a presentation at Polkadot Decoded 2024. However, the existing model fails to offer such stability, even for core renewals. For example, initial cores on Polkadot were sold at 69 DOT, while renewals climbed to 100 DOT. Prices on Kusama, meanwhile, are in free fall, dropping as low as 0.08 KSM.

This level of price fluctuation can deter potential customers, who are likely seeking a consistent, reliable, and easy-to-understand pricing structure.

Given the lackluster performance of core sales so far on both Polkadot and Kusama, it may be worth exploring a model with fixed pricing. For example, we could establish fixed rates either in DOT (e.g., 250 DOT per month) or pegged to USD (e.g., the equivalent of $1,000 in DOT per month). The price could be set by OpenGov, providing a structure that is:

  1. Easier to understand and navigate,
  2. More appealing to customers seeking price predictability.

Another alternative would be to consider models used in lending markets, such as Aave, where both stable and variable rates are available.

A move toward a simplified, stable, or dual-option pricing structure could make Polkadot’s core model more attractive and accessible to potential customers.


I agree the current price adapter is not ideal; at the very least, we should be able to configure a minimum core price that is updatable via governance.

The tokens burned through coretime sales should aim to counter (and exceed) the newly minted tokens going to validators to cover their operating costs. Setting some minimums can ensure the price won’t plummet; we saw on Kusama how quickly the current formula can drive the price to nothing. As a builder it seems nice that I can secure my blockchain for free, but as a holder it is surely worrying.


KSM and DOT are one united ecosystem, and the community is interested in balance.
I was not happy about KSM pricing when it started, but now it’s part of the game; we just have to factor it into all our calculations, and that’s it.


A minimum price is certainly a good change.

However, if I were a potential customer and asked, “How much does being secured by Polkadot cost?” I would prefer to hear a specific number or price range.

Hearing, “There is an auction, and there’s no upper limit to what you might pay,” would be a red flag for me. Who knows whether I could still afford to be connected to Polkadot a year from now, even if it’s cheap today?

It would be interesting to hear the perspectives of parachains, or, even more so, of teams who decided not to build on Polkadot, regarding this issue.


Crossposted from Kusama ref 469
I thought I’d weigh in here after talking about it a bit on the Fellowship call yesterday.

As I mentioned on the call, I 100% agree that the current pricing model falls short with a higher core count, and I have been exploring options for a next iteration of the pricing adapter that takes more information into consideration and aligns more closely with one of the goals (lowering the barrier to entry), so watch this space. However, I think that minimum pricing is not the way to solve the issues described here.

I would like to separate the cost of production into a different discussion and not cover it here. It was a useful starting point for estimating the initial price when we had no historical data on demand-driven pricing, but I don’t think it has a bearing on the analysis of the current problem.

I’d like to rephrase the problem statement as the following:

The current price adapter falls short with a higher core count: it fails to provide a target price for the next sale that represents market sentiment over time (rather than one-off variations due to temporary market conditions). The effect is that the cost spread decreases, even with a large lead-in factor, arguably allowing cores to fall into a “fastest finger first” market and becoming a non-deterministic barrier to teams who want to start a project on Kusama.

By non-deterministic I mean that even if you know you’re willing to pay 10,000 KSM (insert a ridiculously high number here) to get a core, if the lead-in period starts much lower you’re not able to show how much you value it. In the absence of secondary markets you are forced to offer the ceiling price, so you have no way to outbid people who value it less than you, and somebody could bulk-buy all the cores, beating you to it.

What I’d suggest is to use a part of the configuration that has been overlooked, the ideal_bulk_proportion. From the docs:

The proportion of cores available for sale which should be sold.
If more cores are sold than this, then further sales will no longer be considered in determining the sellout price. In other words the sellout price will be the last price paid, without going over this limit.

```rust
pub ideal_bulk_proportion: Perbill,
```

So we could establish a rough heuristic: when we add cores and are not aware of any increase in demand, we set ideal_bulk_proportion to the ratio of the old core count to the new core count, maintaining the previous price-finding behaviour.
If we do then get an increase in demand, we end up with the opposite problem of runaway upward pricing, but that can easily be corrected with a referendum to adjust the configuration.

Applying this heuristic with the benefit of hindsight: when we increased the core count from ~60 cores to 100, we should also have decreased ideal_bulk_proportion from 100% to 60% to get a naive equivalent. The increase was a useful test for the pricing adapter, but I think testing the heuristic I’ve proposed would now be good for Polkadot while also addressing some of the issues raised here. It could be achieved without code changes, with just a referendum to change this value in the broker configuration.
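As a minimal sketch of that heuristic, assuming the proportion is stored as parts-per-billion like the runtime’s Perbill type (the helper function and numbers below are illustrative only, not the broker pallet’s API):

```rust
/// Scale the current ideal_bulk_proportion (in parts-per-billion, mirroring
/// Perbill) by the ratio of the old core count to the new core count.
fn scaled_proportion(old_cores: u64, new_cores: u64, current_parts: u64) -> u64 {
    current_parts * old_cores / new_cores
}

fn main() {
    // The Kusama example above: ~60 cores -> 100 cores, starting at 100%.
    let parts = scaled_proportion(60, 100, 1_000_000_000);
    assert_eq!(parts, 600_000_000); // equivalent to Perbill::from_percent(60)
    println!("{}", parts);
}
```

A referendum would then simply set the broker configuration’s ideal_bulk_proportion to the resulting value.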

Since we have had several sales with the current setting in place, maybe we could consider a more brutal cut for a few sales, decreasing ideal_bulk_proportion below 60% to speed up the return to what governance deems an equilibrium point; a future referendum could then adjust it to a stable price position that balances our aims.

To be clear, I still consider this a temporary adjustment until a pricing model rework, and it could be largely mitigated with trustless secondary markets. Also it could be argued that Kusama should be allowed to be Kusama too, but I think that this is a useful test for when the Polkadot core count increases while also addressing some complaints.


Also this was the entire aim! Cheap coretime is a good thing, not a bad thing.

The bad thing for the ecosystem kicks in when the price drops so low that people who have no utility for it can get in before a team who value it more highly but have no opportunity to express that, due to market-mechanic limitations: their bid is above the market ceiling price, so somebody beats them on timing but not on price.
In any other situation where teams are outbid, they need to reconsider how much they’re willing to pay in the following auction, and if they need to be live immediately, then they can fall back to a secondary market or using on-demand coretime.

I disagree. Auctions typically do not aim to create extremely low prices. If cheap prices were the goal, one could simply sell cores for 0.01 KSM or 0.1 DOT and avoid overcomplicating the process with auctions altogether.

By “lackluster,” I’m referring to the low number of cores sold overall. Most cores on Kusama are acquired by “blockspace barons.” To my knowledge, only a single new rollup chain has onboarded on Polkadot since the inception of coretime sales, which is quite disappointing.

The design of the coretime pricing mechanism may be contributing to the low interest. There’s no price stability, not even for renewals. Using a complicated auction system to offer a service makes little sense to me. If I were a potential customer, this lack of price stability and this complexity would likely drive me away from the ecosystem.

I’m not talking about extremely low prices there; for that, see my previous comment. For the number of cores sold, see below.
I think you’re oversimplifying the utility of auctions. Here the Dutch auction acts as a price-finding mechanism: from the spread of possible prices within a sale, people can express how much they value coretime, the aim being that those who value it most will get it. This holds however the range of prices changes due to demand, project use cases, individual requirements (especially with elastic scaling coming), and exchange rates to fiat currencies. The price needs to be able to shift to adapt to these things, so that governance does not have to step in and attempt to estimate them.
If somebody has a use case for having a parachain on a core, they pay a premium to ensure they get it and can realise that value.

Aiming towards market efficiency: if the cores don’t sell out in one sale, we conclude that the floor of the price range is still above what people value them at, so the price range shifts downward in the next sale. Likewise, if all cores sell out very quickly without much spread, we shift the range upward, assuming part of the range is missing at the upper end. We can change the parameters of how the range shifts. As you’ve correctly noted, there currently isn’t the demand (registered parachains requiring long-term coretime) for the number of cores on Kusama at the minute.
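A toy model of that sale-to-sale adjustment looks something like the following. The real price adapter in the broker pallet is more involved, and the shift factors here are made up purely for illustration:

```rust
/// Toy sale-to-sale adjustment: a sold-out sale shifts the target price up,
/// assuming the upper end of the range was missing; an undersold sale shifts
/// it down in proportion to how far short of the offered cores it fell.
/// The 1.5 and 0.5 factors are invented for this sketch.
fn next_target(current: f64, sold: u32, offered: u32) -> f64 {
    if sold >= offered {
        current * 1.5
    } else {
        let sell_through = sold as f64 / offered as f64;
        current * (0.5 + 0.5 * sell_through)
    }
}

fn main() {
    // Undersold sale: 20 of 100 cores sold, so the target drops.
    println!("{:.2}", next_target(100.0, 20, 100)); // 60.00
    // Sold out: the target jumps for the next sale.
    println!("{:.2}", next_target(100.0, 100, 100)); // 150.00
}
```

The point of the sketch is only that the target tracks observed demand between sales, so no committee has to guess the market rate.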

From your comments here it seems that there are parts of the design that you’re not aware of, so I’ll respond inline below:

Renewals have price stability by definition - we have enshrined renewals at a 3% bump above the last price paid for parachains who satisfy some basic usage rules. I don’t know what led you to think that there was no price stability.

Is it complicated? To get on board you:

  1. Decide how much you’re willing to pay for coretime
  2. Call purchase when the market hits the price you’re willing to pay (the timing of which you can easily predict in advance)
  3. Assign the core to your parachain
  4. Renew each month at a 3% premium
  5. Profit (presumably)

Two extrinsics to set up, then one each month, for high-quality blockspace with price predictability; I don’t see the complexity in steps 2-4.
The small added complexity in step 1, determining the price you’re willing to pay, depends on step 5, and is necessary in any system with a finite resource unless your sole aim is to recoup a certain amount each month for some reason. When the price range for a sale shifts too low relative to the perceived market rate, you run the risk of cores selling out right at the start to whoever’s extrinsics happen to arrive first, removing the option from people who value them more than the top of the range.

I don’t think there’s any reason to think that. Did we have 40 parachains competing in the last few slot auctions? So why would we expect 40 extra parachains to leap on board when we drop extra cores into Kusama? Further, why would that put them off? Doesn’t that directly conflict with your previous point that cores are too cheap? Additionally, there is on-demand coretime, where people can produce blocks with low latency without needing to participate in the auction at all.

If this is disappointing, how disappointed were you in the three months before coretime launched? How many new teams joined the ecosystem then? Were there people building a parachain who decided to stop when they heard about Coretime’s launch? When assessing a protocol change, I think it’s important to separate the changes caused by a release vs changes which correlate with the release, or even things that haven’t changed at all and are totally orthogonal to the release. The latter points belong to a different conversation.

The last point I want to respond to is about the Kusama “blockspace barons”: what is inherently the issue with this? My second comment above covers a specific case where it becomes a problem, but in general, if people who want to hold the asset for zero utility value it more than people who have utility, then the people with utility need to start buying earlier in the curve; this is just another consideration when deciding the price you’re willing to pay.

Exactly. So renewal prices are not stable: they can increase over time if there’s demand for coretime, making planning difficult for customers, especially over time frames of 2-3 years. I don’t know what led you to think that the price was stable.

What? They increase at a fixed percentage each month indefinitely; what about that is not stable?

Planning is easy:
price(t) = starting_price * pow(1.03, t)
Where t is the integer number of months from when you first bought a core until the time you are predicting.
Buy a core at 15.11 DOT this month (the target price), and you know that on 18th November 2027 you will be paying ~47.55 DOT for the last month of your three years. You will have paid a total of 1139.31 DOT for the three years of uninterrupted, guaranteed, renewed coretime.
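The schedule follows directly from that formula. A minimal sketch, taking the 3% monthly bump from the post (the starting price and month counts below are just example inputs, not quoted sale prices):

```rust
/// Renewal price after `t` monthly renewals at the fixed 3% bump:
/// price(t) = starting_price * 1.03^t
fn renewal_price(starting_price: f64, t: u32) -> f64 {
    starting_price * 1.03_f64.powi(t as i32)
}

/// Total paid over `months` consecutive months of renewals.
fn total_paid(starting_price: f64, months: u32) -> f64 {
    (0..months).map(|t| renewal_price(starting_price, t)).sum()
}

fn main() {
    let start = 15.11;
    // Every future renewal price is fully determined at purchase time:
    println!("month 12: {:.2} DOT", renewal_price(start, 12));
    println!("month 35: {:.2} DOT", renewal_price(start, 35));
    println!("36-month total: {:.2} DOT", total_paid(start, 36));
}
```

Because the whole curve is known up front, a team can budget its coretime spend for any horizon at purchase time.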

You pay a premium for predictability, but if the market price increases over time you’re protected from the increases. If the price falls, you buy a core on the open market instead of renewing, and renew that core from then on, resetting the starting price to which the 1.03 is applied.


For me, an exponential growth of 3% per month, or approximately 43% per year, does not meet the definition of a stable price.
That said, everyone is entitled to their own opinion, I suppose.

Agree with the many points made by Seadanda. I just wanted to share my perspective as a prospective dev team that almost pulled the trigger, but remained on the sidelines to see how things played out after the coretime announcement.

The change to coretime was very concerning when it was first announced, as months of planning were tossed out the window and every plan seemed up in the air. At that time, the calculated risk was tolerable: my tech team needed to learn Substrate and build with confidence. But while the documentation was abundant, it lacked organization, and access to an official representative or point of contact who could tell us where to start and answer our list of questions was nonexistent. The amount of information is daunting, and not knowing which information was the most recent, where to begin, or which material was endorsed, created, or prepared by the core dev team / Polkadot was not the best experience for my grumpy dev team.

The unpredictable cost, especially when it is based on the DOT price, is one barrier that could be addressed by using USD as the base price; predicting crypto prices should never be part of a startup plan. If I were signing up with a service provider to build my business, I would expect competitive offers, knowing that if I use more, my costs will be even lower, with zero possibility of prices rising without my consent. The real question for core pricing is what the actual costs are, and whether the margins are in line with revenue projections that will scale with growth. For competitive pricing, factor in the current market rates (what you pay to operate) and cut that in half to gauge profit margins for this year and five years out. Growth in business or sales will not come from raising your price; your price should go down along with your costs while your customer base or market share grows.

Lastly, the uncontrollable factor that has been most concerning is having a largest shareholder who is not part of the founding team, has no record of investment for his ownership, and is very active in governance and in the direction and decisions of Polkadot. It would not be a concern if his influence added value or anything positive, but he takes on roles in which he clearly has no experience or success. The lack of leadership and ownership has been very disappointing, but hopefully the issue will be addressed before the new year.