Parity's Market Research: Getting ideas on where to look and how to approach it

At the beginning of Q4 last year, @joyce and I worked alongside a few ecosystem members on identifying key criteria for assessing different blockchain infrastructures on the market.

Hence, we surveyed 4 big target audiences (decision makers/C-level people spread across commercial, technical, investor, and infrastructure-provider roles) on which metrics they prioritize when evaluating their choices:

  1. where to deploy
  2. with whom to integrate
  3. whom to invest in (or in projects from which ecosystem), etc.

An additional important point to mention is that all of these respondents came from outside the Polkadot ecosystem.

How did we approach the project?

Based on the timeline and deadline we received for the project, we reached out to ∼50 decision makers (C-level) and managed to get around 30 of them on a call, going through a pre-defined survey of questions to gather the desired data. Additionally, we sent out a quantitative survey to a few dozen more, which brought us to a total of roughly 40-45 participants.

Deliverables and conclusions from vol.1:

  • Prioritization of performance criteria/metrics
    • Scalability (i.e. performance/speed) and cost came in as the most important (measured through KPIs such as TPS and time to finality)
    • Followed by DevX, ecosystem development, and security
    • Decentralization ranked lowest
  • Assessing which data/metrics are important as a part of each pillar
    • :+1: TPS, Block times, Finality, Gas costs
    • :+1: Total no. of wallets, txs, liquidity (stablecoin distribution)
    • :-1: (De-validated): Financial performance (one of the least prioritized)
    • :-1: (De-validated): Horizontal scaling (one of the least prioritized)
  • Blockchain of preference
    • Solana came in first place
    • Polkadot was mentioned only once (its tech–Polkadot SDK/Substrate)

For a full presentation/slide deck, visit this link

On another note, we have received a few feedback points that the learnings/conclusions presented are not adequate or reliable enough because of the small number of people we interviewed. However, it is important to mention that the project deadline and timeline left us a total of 1.5 months, while we were working on 3+ other projects at the same time.

What comes next?

We’d like to say that we learn from our mistakes and grow from our experiences.

Hence this time, for part 2 of the Market Research project, we would like to invite the Polkadot community to present ideas or requirements for continuing the project.

In order to inspire and provide examples, here are a few practical outcomes of the previous effort:

  • Parity decided to build a Reliability dashboard serving as a competitive-landscape tool comparing the prioritized KPIs (TPS and latency performance, cost predictions, etc.)

  • Follow-up projects from the feedback target two initiatives that Parity is focusing on: the Plaza upgrade (“Polkadot Hub”) and Parachains (“Polkadot Cloud”)

    • EVM developer requirements (prioritizing tooling integrations for Hub)
    • Polkadot’s Omni-Infrastructure (GTM positioning Omni-node and Hub)
    • Analysis of Tier 1 projects that went from dApp to launched appchain (positioning Polkadot Cloud)
    • (TBD) Revamping ‘Alpha program’ into a new Polkadot Builders program (Polkadot in general)

With that said, we hope to see many reasonable ideas and requirements, backed by quantitative and qualitative arguments from the projects you’re leading or working on.

Additionally, Joyce and I will welcome every solution proposal, and will gladly accept practical resources: time allocation, help sourcing networks/contacts, and more.

Let’s hear it :ear:


Cool research! Just to clarify, you asked this within, or outside of our Polkadot ecosystem?


Can you please provide more insight into the following?

You mention the most important things are scalability/performance and cost, and that TPS, block times, finality, and gas costs were validated.

So what does it mean to “de-validate” horizontal scaling? It is just a technical implementation detail on how we achieve the TPS, block times, finality, gas costs, and performance of Polkadot.

Can you describe how “horizontal scaling” is something that can be compared to “scalability”, “ecosystem”, “devx”, “security”, etc.? You seem to place this “technical implementation detail” in graphs alongside these other “general qualities” of a blockchain.

I asked the same question when this was presented internally at Parity, so I am interested to understand how this is still a bullet point in this forum post, and why it even made it into the original questionnaire.

Seeing that “decentralization” was the second “least important metric”, which would also “de-validate” it… can you tell me why we should even listen to any of the other results coming from this audience?

Like would you really try to extract insights about Web3 and blockchain technologies from a group of people who do not highly value decentralization of that technology?

Seems like it would be helpful to get a copy of the exact questionnaire given to the audience which led to these results.


@DotDotApe - outside of Polkadot ecosystem.


Sure, I can elaborate. Thanks for your feedback @shawntabrizi

  1. ‘De-validated’ means that a criterion fell short when we asked the user groups/interviewees to prioritize, i.e. after averaging the ratings/scores they used to mark its importance when evaluating blockchain infra/ecosystems.
    In practice: if, e.g., the CTO of a dApp looking for an infra/ecosystem to deploy on said that ‘scalability’ is important (rating it ‘4’ or ‘5’ on a scale of 1 to 5, where 1 is least important and 5 is most important), we would ask right afterwards whether HOW the scalability is achieved, specifically whether horizontal scalability influences their decision. They simply responded ‘no, we don’t care how, we care that it IS fast/scalable’, giving it a score of 1 or 2. So in short: yes, it was posed as a technical implementation detail.

  2. In terms of how we asked people about these criteria, we had these nine criteria (the last one conditional):

  • scalability/performance (speed)
  • devX/accessibility
  • ecosystem development
  • security
  • interoperability
  • developer activity
  • financial performance/kpis
  • decentralization
  • (conditional: asked only if the first one, scalability/performance, was ranked as important) horizontal scalability

and asked every interviewee to rate each one from 1 to 5 (1 least important, 5 most important) based on their use cases and personas.
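To make the scoring mechanism concrete, here is a minimal sketch of the averaging and ‘de-validation’ logic described above. The ratings and the cutoff value are invented for illustration; they are not the actual survey data, and the real analysis may have aggregated per user group rather than over everyone.

```python
# Sketch of the 1-5 criteria scoring described above.
# All numbers below are hypothetical, NOT the survey results.
from statistics import mean

# criterion -> list of 1-5 importance ratings from individual interviewees
ratings = {
    "scalability/performance": [5, 4, 5, 4],
    "devx/accessibility":      [4, 4, 3, 5],
    "decentralization":        [2, 1, 2, 2],
    "horizontal scalability":  [1, 2, 1, 2],  # conditional follow-up question
}

# Average each criterion's score across interviewees
averages = {criterion: mean(scores) for criterion, scores in ratings.items()}

# A criterion is "de-validated" when its average falls below a cutoff
# (the cutoff of 3.0 is an assumption for this illustration).
CUTOFF = 3.0
de_validated = [c for c, avg in averages.items() if avg < CUTOFF]

for criterion, avg in sorted(averages.items(), key=lambda kv: -kv[1]):
    print(f"{criterion}: {avg:.2f}")
print("de-validated:", de_validated)
```

With these made-up numbers, decentralization and horizontal scalability would land below the cutoff, mirroring how the low-averaging criteria in the survey were labelled ‘de-validated’.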

That’s why the screenshot shows how each user group rated these criteria, as well as the overall ‘average’ scores.

The interviewees were objectively selected for the survey, mostly based on their successful track records running companies in this space for years. They include L1s like Avail; dApps deployed on Avalanche, Solana, and Polygon; investors such as Omni-chain Capital; devshops like Ethernal and ConsenSys, building tooling for EVM infra; and infra providers such as DappRadar.

If I, or we, ignored these people just because we don’t like their responses/results, that would be a subjective attitude towards the current state of the market.

Should we turn a blind eye to Solana’s growth path over the last year, even though ‘most’ folks know it’s significantly centralized and unreliable (downtimes of 30-60 minutes in a day)? I don’t think so.

Their input can provide a lot of important and useful data for building our products in Polkadot, and furthermore for positioning them.

So, although we can think whatever we want (subjectively) about these results and data, they provide valuable, tangible, and measurable input on where the industry stands today.


Can you make this link public, please?


Yes please, or accept the access requests.


@et90266 @sepi thanks for flagging this - the access has been updated to public.
