We do probably need to set minimums of data/compute per candidate to account for the constant overheads of validation. Setting strictly specific sizes is probably unnecessary, though some rounding-up algorithm may be desirable. This topic deserves a forum thread of its own, but the problem should be solvable with either incentives or mandates. By incentives, I mean that chains using less than a full core for a candidate would be charged extra when they aren’t bundled with others (i.e. they didn’t make packing easy for backers). By mandates, I mean that the protocol would be very strict about the particular candidate sizes which are allowed. I’d like to lean towards the incentive design space rather than the mandate design space where possible.
The general issue we’re running into is that chains may not coordinate on which sizes of candidates they produce. For example, one chain might produce candidates which use 60% of a core. You can’t pack two of these together, and such a candidate only packs cleanly with candidates using 40% of a core or less.
We likely want some rounding schedule, where there are fixed amounts of resource utilization that candidates get rounded up to. A linear rounding schedule might result in bands like this (see the sketch after the list):
- Between 0-20% → 20%
- Between 20-40% → 40%
- Between 40-60% → 60%
- Between 60-80% → 80%
- Between 80-100% → 100%
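To make the linear schedule concrete, here’s a minimal Rust sketch that rounds a candidate’s utilization up to the next 20% band. The function name, the percentage representation, and the band width are illustrative assumptions here, not anything specified by the protocol:

```rust
/// Round a candidate's core utilization (as a percentage of one core)
/// up to the next 20% band. Illustrative only.
fn round_up_linear(utilization_pct: u32) -> u32 {
    const BAND: u32 = 20;
    // Clamp so that even tiny candidates land in the smallest band
    // and nothing exceeds a full core.
    let pct = utilization_pct.clamp(1, 100);
    // Integer ceiling-division, then scale back up to a percentage.
    ((pct + BAND - 1) / BAND) * BAND
}

fn main() {
    assert_eq!(round_up_linear(7), 20);   // 0-20% band
    assert_eq!(round_up_linear(20), 20);  // exactly on a band boundary
    assert_eq!(round_up_linear(21), 40);  // just over a boundary
    assert_eq!(round_up_linear(95), 100); // 80-100% band
}
```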
But it’s probably better to set an exponentially decaying rounding schedule with some fixed bottom size like 1/32 or 1/16 (sketched below the list):
- (1/2 + ε) to 1/1 → 1
- (1/4 + ε) to 1/2 → 1/2
- (1/8 + ε) to 1/4 → 1/4
- (1/16 + ε) to 1/8 → 1/8
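And a corresponding sketch for the exponential schedule, assuming a 1/16 bottom size and utilization measured in parts-per-1024 of a core (both choices, like the function name, are illustrative assumptions):

```rust
/// Round a candidate's core utilization up to the next band in the
/// exponential schedule: 1, 1/2, 1/4, 1/8, or 1/16 of a core.
/// Utilization is expressed in parts-per-1024 of one core.
fn round_up_exponential(parts_per_1024: u32) -> u32 {
    const CORE: u32 = 1024;
    const FLOOR: u32 = CORE / 16; // fixed bottom size of 1/16
    let used = parts_per_1024.clamp(1, CORE);
    // Start at the smallest band and keep doubling until the candidate fits.
    let mut band = FLOOR;
    while band < used {
        band *= 2;
    }
    band
}

fn main() {
    assert_eq!(round_up_exponential(614), 1024); // ~60% -> a full core
    assert_eq!(round_up_exponential(512), 512);  // exactly 1/2 stays at 1/2
    assert_eq!(round_up_exponential(513), 1024); // 1/2 + ε -> a full core
    assert_eq!(round_up_exponential(31), 64);    // ~3% -> the 1/16 floor
}
```

The nice property of these bands is that each one evenly divides every larger one, so whatever space is left in a partially packed core is itself a sum of allowed bands, and backers never face a gap that no permitted candidate size can fill.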
These are essentially “coretime frequency bands”, which are orthogonal to coretime consumption (amplitude, in the wave analogy).
Shall we start another topic to go over this?