Why Polkadot-API?

Introduction

It’s been 14 months since we published the first stable version of polkadot-api. The library has matured a lot since then, largely thanks to feedback from our users :person_bowing:. We’re now focused on delivering PAPI v2: leaner, easier to use, and more performant.

Equally important, though, is explaining why the ecosystem should migrate to PAPI. So I’m taking a short break from v2 work to share why teams currently using PJS should strongly consider moving to PAPI.

TL;DR:

PAPI is:

  • Interoperable

  • Composable

  • Extremely performant

  • Future‑ready

  • Equipped with great development tools

  • Incredibly stable

What is wrong with Polkadot.js?

PolkadotJS has several issues. Many assume the biggest problems are performance and bundle size. Those are symptoms. The root cause is deeper:

  • Tight coupling and leaky abstractions. PolkadotJS libraries are tightly coupled without clear contracts or boundaries. Abstractions bleed across layers, so a small change can cascade into major incidents. See this recent issue triggered by a tiny change. I wrote a post‑mortem in this comment. It’s one of many examples where poorly drawn boundaries cause random bugs and unexpected behavior.

  • Non‑interoperable public interfaces. The two most important public interfaces in PJS (the JSON-RPC Provider and the keyring) are not interoperable. They’re tightly coupled to PJS‑specific implementation details. Those fragile, leaky interfaces became de‑facto “standards,” limiting what a rich, healthy tooling ecosystem could build on Polkadot. In my view, this is PJS’s biggest problem.

  • By‑products of the above:

    • Large bundle sizes: still coupled to bn.js and @polkadot/wasm-*. The reason decoupling hasn’t landed cleanly is the same boundary problem, which makes it effectively impossible to make these kinds of changes.

    • Poor memory and CPU performance: a common outcome of tightly coupled software that fails to isolate complex problems into composable components.

    • Leaky subscription APIs: e.g., you don’t get the unsubscribe function synchronously when establishing a subscription (leaking an internal detail). There’s no way to register a callback for errors that happen after the subscription is established. Mixing a Promise with a subscription is the worst of both worlds and a root cause of memory leaks and unexpected behavior.

  • Ad‑hoc chain‑specific logic. PJS never fully leveraged modern metadata versions. Architectural constraints made it hard to offer compatibility APIs or protect DApps from unintended behavior after runtime upgrades. We’ve discussed this at length here.

  • Maintenance model. PJS was, for a long time, a one-person project. The problem isn’t the contributor count per se; it’s that the codebase wasn’t shaped for easy handoff or collective evolution. Credit to @Tarik and the BlockDeep team for keeping things running after Jaco left, but the structural issues remain.
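Of the issues above, the leaky subscription API is concrete enough to sketch. The TypeScript below is a hypothetical illustration (these are not PJS or PAPI types) contrasting the Promise-wrapped pattern with a contract that returns the unsubscribe handle synchronously and exposes an error channel:

```typescript
// Leaky PJS-style pattern: the unsubscribe function only arrives
// asynchronously, and there is no channel for errors raised after
// the subscription is established.
type LegacySubscribe<T> = (cb: (value: T) => void) => Promise<() => void>;

// A cleaner contract: the unsubscribe handle is returned synchronously,
// and the observer can also receive post-subscription errors.
interface Observer<T> {
  next: (value: T) => void;
  error?: (err: Error) => void;
}
type Subscribe<T> = (observer: Observer<T>) => () => void;

// Minimal in-memory implementation of the cleaner contract.
function makeEmitter<T>() {
  const observers = new Set<Observer<T>>();
  const subscribe: Subscribe<T> = (observer) => {
    observers.add(observer);
    // Synchronous unsubscribe: no Promise involved.
    return () => {
      observers.delete(observer);
    };
  };
  const emit = (value: T) => observers.forEach((o) => o.next(value));
  const fail = (err: Error) => observers.forEach((o) => o.error?.(err));
  return { subscribe, emit, fail };
}
```

With this shape, tearing down a subscription during setup is trivial, and errors that occur after establishment reach the consumer instead of being swallowed.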

The PAPI tenets

When we started polkadot-api, we set one paramount goal: design for long‑term health of the ecosystem.

Interoperability first:

We began by defining the public interfaces the library would use. They had to be simple and interoperable, so different tools and libraries could plug in without friction.

These interfaces are purposely decoupled from PAPI itself. They happen to live under @polkadot-api/* today, but ideally they’d live under polkadot/*. Put differently: these are the interfaces we wish the Web3 Foundation would champion to enable true interoperability. In the absence of a formal standard, we decided to pioneer this front.

There are two APIs that matter most:

JSON‑RPC Provider

This is the interface for interacting with a node, ideally a light client. The PAPI client consumes a straightforward interface aligned with the modern JSON‑RPC spec. It makes no assumptions about who’s behind it: smoldot, a WebSocket endpoint, a new light‑client, a web worker, Chopsticks… anything. If the producer presents a modern JSON‑RPC‑compliant interface, PAPI just works.

A big upside: the modern JSON‑RPC spec is load‑balancer‑friendly. The provider abstracts transport complexity so the PAPI client keeps operating even if a connection drops, endpoints rotate, or the producer crashes. With a WebSocket provider, for example, your app keeps working through disconnects or stale connections. And when an underlying transport isn’t fully compliant, our middleware translates legacy to modern.
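The provider contract is deliberately tiny. The sketch below follows the shape published in @polkadot-api/json-rpc-provider (simplified); the echoProvider is a toy of ours, for illustration only:

```typescript
// A connection sends JSON-RPC messages (as strings) and can be torn down.
type JsonRpcConnection = {
  send: (message: string) => void;
  disconnect: () => void;
};

// A provider is just a function: give it a callback for incoming messages,
// get back a connection. Nothing here assumes smoldot, WebSockets, a web
// worker, or Chopsticks — any transport can sit behind this.
type JsonRpcProvider = (
  onMessage: (message: string) => void,
) => JsonRpcConnection;

// Toy provider that echoes a canned response, for illustration only.
const echoProvider: JsonRpcProvider = (onMessage) => ({
  send: (message) => {
    const { id } = JSON.parse(message);
    onMessage(JSON.stringify({ jsonrpc: "2.0", id, result: "ok" }));
  },
  disconnect: () => {},
});
```

Because the contract is this small, wrapping a provider with reconnection logic or a legacy-to-modern translation middleware is just function composition.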

That’s why the PAPI team is committed to improving the modern JSON‑RPC APIs.

Polkadot Signer

This interface covers extrinsic creation. The ecosystem has long lacked standardized interfaces for:

  1. Discovering/requesting available signers

  2. Creating extrinsics

So we avoided the messy PJS keyring and introduced a generic PolkadotSigner interface, properly decoupled from the PJS signer. We’re also bringing this standardized interface to PJS and considering a Fellowship RFC. More context in this blogpost.

Because of this generic interface, teams building cutting‑edge PolkadotSDK chains can create transactions that otherwise couldn’t be created with PJS.
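To make the decoupling concrete, here is a simplified sketch of the signer contract. Parameter lists are approximated; see @polkadot-api/polkadot-signer for the real types. The dummySigner is a hypothetical stand-in:

```typescript
// Simplified sketch of the PolkadotSigner contract (not the exact types).
interface PolkadotSigner {
  // Public key of the signing account (raw bytes).
  publicKey: Uint8Array;
  // Signs a transaction payload; resolves to the signed bytes.
  signTx: (callData: Uint8Array, ...extra: unknown[]) => Promise<Uint8Array>;
  // Signs arbitrary raw bytes (e.g. message signing).
  signBytes: (data: Uint8Array) => Promise<Uint8Array>;
}

// Hypothetical stand-in that "signs" by echoing its input; a real signer
// (browser extension, keyring, hardware wallet) would produce actual
// signatures. Note there is no PJS keyring, bn.js, or wasm dependency here.
const dummySigner: PolkadotSigner = {
  publicKey: new Uint8Array(32),
  signTx: async (callData) => callData,
  signBytes: async (data) => data,
};
```

Anything that can satisfy this shape — an extension, a multisig coordinator, a remote HSM — plugs into the client the same way.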

Composability

Our second tenet is composability. Each PAPI library is as decoupled as possible from the others. We relentlessly break complex problems into small, well‑isolated packages with clean APIs, so consumers don’t need to understand how the hard parts are solved. “Good fences make good neighbors.”

That’s why some of our libraries (like scale-ts) are widely adopted. In fact, even PJS and Dedot use some @polkadot-api/* packages… which is great!

A practical upside: the PAPI client is resilient to network hiccups. If you use a WebSocket provider, you can safely reconnect every few minutes, even with in‑flight operations. Thanks to the modern JSON‑RPC spec (and our middleware), operations continue as if nothing happened.

Performance

The only sustainable way to make a complex library fast is to isolate complexity into small, composable pieces. As a result, Polkadot‑API is (at the time of writing) the most performant JS library in this space, both in memory and CPU usage. Keep reading for hard evidence.

Contribute upstream

Clear boundaries help everyone. That’s why the PAPI team actively contributes upstream (PolkadotSDK, smoldot, Chopsticks, and more). When we spot issues or opportunities at the boundary, we open issues, PRs, and discussions. The list is long; I’ve collected many of them in this gist.

Empower Polkadot developers

Our last tenet is empowering developers with great tools and DX. We invest heavily in:

  • Improving the PAPI development console.

  • Providing powerful Polkadot SDKs (Ink!, Solidity, Governance, Identity, Staking, etc.).

  • Shipping DApps that showcase those SDKs, like the popular https://bounties.usepapi.app/ and a new staking DApp we’ll release in a few weeks.

  • Building tools like diff.papi.how and the @polkadot-api/check-runtime CLI.

  • An upcoming JSON‑RPC analyzer to inspect all operations (storage requests, runtime calls, headers, bodies, etc.) from any compliant JSON‑RPC provider: so you can spot bugs, find potential improvements, and analyze how a DApp interacts with a chain.

Ok, so why PAPI?

First, PAPI has improved dramatically since the first stable release 14 months ago. Second, it’s uniquely stable: we haven’t shipped a single breaking change to the public API in that time, and the upcoming v2 changes are ridiculously small (see here and here).

PAPI is also, by a wide margin, the most performant JS library in this space (details below).

It’s the only library truly ready for the near future, thanks to being fully decoupled from the PJS keyring.

Finally, the core PAPI team consists of three exceptional developers who challenge and learn from one another. We each understand the responsibilities of every package deeply enough to jump in and fix issues anywhere in the codebase. The code is designed so others can fork or take over. In other words: the bus factor is healthy. PAPI is here to stay.

Choosing PAPI also means choosing interoperability, making it easier for other teams to plug in and add value.

What about Dedot?

It’s time to address the elephant in the room.

First and foremost, we applaud Sinzii’s effort to create an alternative to PJS. That said, we do have concerns about Dedot’s approach and share them here in a constructive spirit, especially given Dedot is still in major‑version zero.

Lack of interoperable interfaces

Earlier, we explained how Polkadot‑API invested in simple, interoperable public interfaces.

Unfortunately, Dedot hasn’t done the same:

Dedot’s signing interface:

Dedot adopted the problematic interfaces used by PolkadotJS; even PJS maintainers agree we should move away from them. Consequences include:

  • Inability to create extrinsics for modern chains.

  • Tight coupling of the signer to @polkadot/wasm-crypto and bn.js, which are heavy and inefficient.

It’s disappointing to see a new library doubling down on interfaces we’ve long known are problematic.

Dedot’s JSON‑RPC provider

Dedot didn’t leverage the modern JSON‑RPC APIs to abstract away reconnections, halted transports, etc. Support for modern JSON‑RPC appears to have been added later, without fully embracing its benefits. The provider leaks Dedot‑specific implementation details and isn’t truly compliant with the new spec, since it behaves differently depending on the provider. That connects to the next point.

Poorly defined boundaries

In Dedot, packages are tightly coupled. Complexity isn’t isolated behind clean APIs, so consumers inherit internal details.

For example, Dedot exposes two different clients depending on the provider (legacy vs. modern). Both miss common, useful APIs every client should have, like subscribing to all finalized events or to current best blocks. Users must roll their own, handle reconnect/disconnect edge cases, and manage operation persistence across reconnects… twice, for two clients.

Worse, the modern client is hard to use because several endpoints aren’t fully compliant with the modern JSON‑RPC spec. Instead of using generic middlewares (like those PAPI provides) to bridge gaps, Dedot delivers a broken DX on the modern client. Everyone ends up on the legacy client, which can’t offer the guarantees modern APIs provide. This cements legacy usage while still failing to offer minimum guarantees.

And Dedot inherited PJS’s leaky subscription patterns too.

Missing runtime safety

Another issue Dedot inherited from PJS is the lack of runtime type safety: ensuring that the chain you’re interacting with is compatible with the code and type definitions your DApp was built with.

PAPI solved this through “descriptors”: lightweight runtime checks that validate compatibility every time you interact with a chain. While we already have improvements planned post-v2, the current implementation is both highly reliable and performant.

This eliminates the dreaded Cannot read property [X] of undefined errors that sometimes appear after a runtime upgrade in PJS: errors that make debugging painful and can silently produce incorrect behavior. In the worst cases, these mismatches can result in a DApp displaying wrong info (e.g., due to a new enum variant) or even submitting transactions with wrong data.
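The idea behind descriptors can be modeled in a few lines. This is a conceptual sketch only, not PAPI's actual implementation: record a fingerprint of each entry's shape at build time, then compare it against the connected runtime before decoding.

```typescript
// Conceptual model of descriptor checks; NOT PAPI's real data structures.
// Build time: codegen records a fingerprint per storage entry / call.
// Runtime: the client compares it against the connected chain's metadata
// before decoding, instead of decoding blindly and failing later.
type Fingerprint = string;

// Hypothetical build-time output for a single storage entry.
const buildTimeDescriptors: Record<string, Fingerprint> = {
  "System.Account": "v1:AccountInfo(nonce,consumers,providers,sufficients,data)",
};

// Returns false when the runtime's shape no longer matches what the DApp
// was built against, so the mismatch surfaces as an explicit error.
function isCompatible(entry: string, runtimeFingerprint: Fingerprint): boolean {
  return buildTimeDescriptors[entry] === runtimeFingerprint;
}
```

With a check like this in the call path, a runtime upgrade that changes AccountInfo produces a clear compatibility error instead of an undefined read deep inside the DApp.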

We’ve encouraged the Dedot team to address this many times, but to this day, they’ve continued to dismiss it as not important enough to solve.

Subpar performance

Leaky abstractions often lead to performance issues. Dedot is no exception. Using the benchmark Dedot created:

Results

Results were run on a MacBook M3 Pro with Bun 1.2.7

Dedot:


================================================================================

📊 FINAL MEMORY CONSUMPTION REPORT

================================================================================

Total blocks processed: 200

Total time: 610.6s

Average time per block: 3.05s

Memory Statistics:

Initial heap used: 4.96 MB

Final heap used: 237.15 MB

Total memory growth: 232.19 MB

Average memory per block: 1.16 MB

Peak RSS: 386.09 MB

Peak heap used: 241.09 MB

⚠️ WARNING: High memory growth detected!

Average growth of 1.16 MB per block may indicate a memory leak.

PAPI:


================================================================================

📊 FINAL MEMORY CONSUMPTION REPORT

================================================================================

Total blocks processed: 200

Total time: 164.4s

Average time per block: 0.82s

Memory Statistics:

Initial heap used: 2.30 MB

Final heap used: 19.84 MB

Total memory growth: 17.54 MB

Average memory per block: 0.09 MB

Peak RSS: 175.91 MB

Peak heap used: 42.49 MB

✅ Memory growth appears normal.

We’ve set up the following repo so anyone can verify these results. Please note that we didn’t create these benchmarks; we simply took the ones Dedot created for their own repo.

PAPI processed the same number of blocks ~3.7× faster while using dramatically less memory. Note: Dedot’s numbers rely on the user periodically cleaning the cache by hand; without that manual cleanup, its memory usage would be even worse. That’s another example of a leaky abstraction. PAPI knows when to clean up safely because the client tracks when it’s correct to discard cached items.

Also, the PAPI client does more, continuing smoothly through connection hiccups.

Lack of upstream contributions

Given both libraries tackle the same problems, we’d love to see Dedot engage more upstream. For example, our team pushed hard so Pallet Revive exposed the instantiated event, something Dedot benefits from. Another example is this upstream issue, which we suspect they must have encountered.

To date, no one from the Dedot team has opened an issue or PR to PolkadotSDK. The same appears true for smoldot and the new JSON‑RPC spec. Upstream collaboration matters because that’s how the platform gets better for everyone.

Please help Dedot step up

We’re puzzled by how quickly some influential voices started to shill a project still in its early phase (v0.18.3), with leaky APIs and familiar architectural pitfalls.

If you support Dedot, please ask them, at a minimum, to embrace interoperable APIs for the JSON‑RPC provider and signing interfaces. That alone would unlock a healthier, more resilient ecosystem for all of us.

Closing thoughts

We’re not asking the ecosystem to “pick a side”. We’re asking it to pick standards. If libraries converge on interoperable provider and signer interfaces, everyone wins: wallets, DApps, infra, even competitors. PAPI is our contribution toward that future: fewer foot-guns, clearer contracts, and performance you can verify.

If you build on Polkadot, take this as an invitation: hold us to a high bar, file issues, challenge assumptions, and help refine the interfaces. Let’s make the future about shared foundations, not one-off patches.


I agree — at the end of the day, we need more standardization. Also, I’ve really enjoyed working with polkadot-api; it’s probably the best in Polkadot right now in terms of developer experience.


Bro, polkadot-api should of course be better than dedot, given that dedot has received ~$50k in grants working as a one-man project. And I am assuming PAPI has raised >$500k? That’s a >10x difference in funding. I’m also pretty sure Dedot is a side project.

…So, I think the comparisons are a little harsh. Seems you wanna play fisticuffs with whoever is near you.

Instead of competing, recognise that you are a key product-owner and thought leader in the space; you can just be giving the best suggestions and guidance, sugar-wrapped in welcoming chad vibes, to projects like dedot, which seems to be working mostly fuelled by enthusiasm.

High critique isn’t the only tool in the locker; it definitely has a toxic effect if incorrectly dosed.

Kudos to you though for really greatly improving the performance and design of the api. When I started building things with PJS, I was dumbfounded by its complexity and its assumptions about what I knew (or should know) as a Polkadot developer. It was indeed a hair-pulling exercise.

There are some aspects of Dedot which PAPI can learn from… and I think that’s in how welcoming it is to enthusiasts who come to play and try things.

some raw suggestions:

  • docs are decent, but they can be world class:
    • Make almost no assumptions about the skill level of the user. I find the best docs write as if the person is turning the computer on for the first time.
    • Have a narrative voice that reads like a story, with user journeys in the docs. So instead of just having “recipes”, weave recipes into the onboarding story.
    • Heavy focus on “getting started” and lots of examples, with one-click setup apps.
    • If you are focusing on converting PJS users to PAPI, make it insanely easy for a PJS user to grok the new type interfaces and nomenclature, with videos and hand-holds. I think this will speed up the process.
  • Attract more builders.
    • As an infrastructure/tooling project, I think there should be a duty to foster a community of building, not just rely on onboarding through the main Polkadot channels. The only reason I say this is because you built it, so you know it better than anyone.
    • Make different forms of content; ultimately, the API with the best content will win the majority of users.
    • …Maybe it’s worth hiring someone to focus on the onboarding journey and community elements of DevEx described above. Or perhaps this can be more of a Dedot thing.

On a funding note… I’m sure you will get the treasury funding 99% of the time. And I’m sure if you don’t like the direct treasury route, there are bounty funds like Velocity Labs that can take the stresses of treasury away.


Last time I tried papi was building this with an AI tool. The AI tool chose papi initially, which I was happy with. However, it ended up using some private methods (starting with an underscore) and making raw RPC calls directly. So I had to force it to switch to polkadot.js because I (and the AI) couldn’t figure out how to write chain-agnostic code that doesn’t depend on the metadata of any particular chain. Did I miss something? For example, how do you do api.query.timestamp.now() with papi in a way that works with all chains?

Hey @decentration, thank you for the comment, there’s a lot of valuable feedback here.

I want to clarify that this post is about promoting interoperable standards across the ecosystem, and moving past the problematic PJS interfaces. We all benefit from that direction, regardless of which library people choose.

We’ve put significant effort into making these interfaces interoperable and composable for a reason: to strengthen the ecosystem as a whole, from libraries to developers to users.

Instead of competing, recognise that you are a key product-owner and thought leader in the space; you can just be giving the best suggestions and guidance, sugar-wrapped in welcoming chad vibes, to projects like dedot, which seems to be working mostly fuelled by enthusiasm.

High critique isn’t the only tool in the locker; it definitely has a toxic effect if incorrectly dosed.

We believe in a healthy competition between libraries, one where we interoperate, collaborate and learn from each other.

We’ve always aimed to encourage Dedot to improve, on the points mentioned in the original post and more. For example, here’s a constructive PR review for Dedot from @josep.

It’s tricky to express those points without being direct, but the goal is to highlight why decoupled interfaces matter. This post is meant as a push toward a stronger, more resilient ecosystem overall.

On funding: different projects receive different levels of support at different times, and that inevitably shapes scope, team capacity, and delivery speed. From the outset, long before funding, PAPI has centered on interoperable interfaces. Our focus on a modular architecture made it easier to grow our team and ultimately attract more funding. We’re rooting for Dedot’s continued success, which is exactly why we’re offering this constructive feedback.

There are some aspects of Dedot which PAPI can learn from… and I think that’s in how welcoming it is to enthusiasts who come to play and try things.

Absolutely! Onboarding new devs is something we definitely need to improve. In fact, Dedot’s docs helped us realize that ours needed improvement. A lot of improvement.

And we have already started taking action: over the past few weeks we’ve made significant updates to our docs, and we’re bringing in someone external to help refine the docs with a fresh perspective.

And lastly, thank you very much for all your suggestions. They are genuinely helpful, and we’re committed to acting on them.


Hey @xlc, regarding this specific point, it’s actually simple:

const api = client.getUnsafeApi();
const now = await api.query.Timestamp.Now.getValue();

This ties back to the earlier comment about improving the first-time developer experience. It’s something we’re actively working on.

We’re also considering bringing in one thing Dedot does well to make onboarding smoother: pre-generated types for well-known chains. It’s another good example of how healthy competition helps everyone improve.

And please, whenever you run into an issue with PAPI, feel free to open a ticket on our GitHub (just like we do when using Chopsticks :wink:).
