Why Polkadot-API?

Introduction

It’s been 14 months since we published the first stable version of polkadot-api. The library has matured a lot since then, largely thanks to feedback from our users :person_bowing:. We’re now focused on delivering PAPI v2: leaner, easier to use, and more performant.

Equally important, though, is explaining why the ecosystem should migrate to PAPI. So I’m taking a short break from v2 work to share why teams currently using PJS should strongly consider moving to PAPI.

TL;DR:

PAPI is:

  • Interoperable

  • Composable

  • Extremely performant

  • Future‑ready

  • Equipped with great development tools

  • Incredibly stable

What is wrong with Polkadot.js?

PolkadotJS has several issues. Many assume the biggest problems are performance and bundle size. Those are symptoms. The root cause is deeper:

  • Tight coupling and leaky abstractions. PolkadotJS libraries are tightly coupled without clear contracts or boundaries. Abstractions bleed across layers, so a small change can cascade into major incidents. See this recent issue triggered by a tiny change. I wrote a post‑mortem in this comment. It’s one of many examples where poorly drawn boundaries cause random bugs and unexpected behavior.

  • Non‑interoperable public interfaces. The two most important public interfaces in PJS (the JSON-RPC provider and the keyring) are not interoperable. They’re tightly coupled to PJS‑specific implementation details. Those fragile, leaky interfaces became de‑facto “standards,” limiting what a rich, healthy tooling ecosystem could build on Polkadot. In my view, this is PJS’s biggest problem.

  • By‑products of the above:

    • Large bundle sizes: still coupled to bn.js and @polkadot/wasm-*. The reason decoupling hasn’t landed cleanly is the same boundary problem, which makes it effectively impossible to make these kinds of changes.

    • Poor memory and CPU performance: a common outcome of tightly coupled software that fails to isolate complex problems into composable components.

    • Leaky subscription APIs: e.g., you don’t get the unsubscribe function synchronously when establishing a subscription (leaking an internal detail). There’s no way to register a callback for errors that happen after the subscription is established. Mixing a Promise with a subscription is the worst of both worlds and a root cause of memory leaks and unexpected behavior.

  • Ad‑hoc chain‑specific logic. PJS never fully leveraged modern metadata versions. Architectural constraints made it hard to offer compatibility APIs or protect DApps from unintended behavior after runtime upgrades. We’ve discussed this at length here.

  • Maintenance model. PJS was, for a long time, a one-person project. The problem isn’t the contributor count per se, it’s that the codebase wasn’t shaped for easy handoff or collective evolution. Credit to @Tarik and the BlockDeep team for keeping things running after Jaco left, but the structural issues remain.
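To make the subscription point above concrete, here is a minimal sketch of a leak-free subscription contract (hypothetical names; this is neither the PJS nor the PAPI API): the unsubscribe function is returned synchronously, and errors occurring after the subscription is established flow through a dedicated callback rather than a rejected Promise.

```typescript
// Hypothetical subscription contract, for illustration only.
type Callback<T> = (value: T) => void

interface Subscribable<T> {
  // Returns the unsubscribe function synchronously; errors after
  // establishment are reported via the optional error callback.
  subscribe(next: Callback<T>, error?: Callback<Error>): () => void
}

// A tiny in-memory implementation to show the contract in action.
function createSubject<T>(): Subscribable<T> & {
  emit: Callback<T>
  fail: Callback<Error>
} {
  const listeners = new Set<{ next: Callback<T>; error?: Callback<Error> }>()
  return {
    subscribe(next, error) {
      const entry = { next, error }
      listeners.add(entry)
      // Available immediately: no Promise stands between the caller
      // and their ability to cancel.
      return () => {
        listeners.delete(entry)
      }
    },
    emit: (value) => listeners.forEach((l) => l.next(value)),
    fail: (err) => listeners.forEach((l) => l.error?.(err)),
  }
}
```

Because the unsubscribe function exists before any asynchronous work completes, a consumer can always cancel right away, closing the window where Promise-wrapped subscriptions tend to leak.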

The PAPI tenets

When we started polkadot-api, we set one paramount goal: design for long‑term health of the ecosystem.

Interoperability first:

We began by defining the public interfaces the library would use. They had to be simple and interoperable, so different tools and libraries could plug in without friction.

These interfaces are purposely decoupled from PAPI itself. They happen to live under @polkadot-api/* today, but ideally they’d live under polkadot/*. Put differently: these are the interfaces we wish the Web3 Foundation would champion to enable true interoperability. In the absence of a formal standard, we decided to pioneer this front.

There are two APIs that matter most:

JSON‑RPC Provider

This is the interface for interacting with a node, ideally a light client. The PAPI client consumes a straightforward interface aligned with the modern JSON‑RPC spec. It makes no assumptions about who’s behind it: smoldot, a WebSocket endpoint, a new light‑client, a web worker, Chopsticks… anything. If the producer presents a modern JSON‑RPC‑compliant interface, PAPI just works.

A big upside: the modern JSON‑RPC spec is load‑balancer‑friendly. The provider abstracts transport complexity so the PAPI client keeps operating even if a connection drops, endpoints rotate, or the producer crashes. With a WebSocket provider, for example, your app keeps working through disconnects or stale connections. And when an underlying transport isn’t fully compliant, our middleware translates legacy to modern.
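For reference, the provider contract PAPI consumes is tiny. The sketch below approximates its shape (the authoritative definition lives in the @polkadot-api/json-rpc-provider package; treat this as a rough rendering), together with a toy in-memory producer, to show that anything able to pass JSON-RPC strings back and forth can sit behind it:

```typescript
// Approximate shape of the provider contract; the authoritative
// definition lives in @polkadot-api/json-rpc-provider.
interface JsonRpcConnection {
  send: (message: string) => void
  disconnect: () => void
}
type JsonRpcProvider = (onMessage: (message: string) => void) => JsonRpcConnection

// Toy producer: echoes every request back as a JSON-RPC result.
// A real producer would be smoldot, a WebSocket endpoint, Chopsticks, etc.
const echoProvider: JsonRpcProvider = (onMessage) => ({
  send: (message) => {
    const { id } = JSON.parse(message)
    onMessage(JSON.stringify({ jsonrpc: "2.0", id, result: "echo" }))
  },
  disconnect: () => {},
})
```

Nothing in the contract mentions WebSockets, smoldot, or any PAPI internals, which is exactly what makes producers swappable.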

That’s why the PAPI team is committed to improving the modern JSON‑RPC APIs.

Polkadot Signer

This interface covers extrinsic creation. The ecosystem has long lacked standardized interfaces for:

  1. Discovering/requesting available signers

  2. Creating extrinsics

So we avoided the messy PJS keyring and introduced a generic PolkadotSigner interface, properly decoupled from the PJS signer. We’re also bringing this standardized interface to PJS and considering a Fellowship RFC. More context in this blogpost.

Because of this generic interface, teams building cutting‑edge PolkadotSDK chains can create transactions that otherwise couldn’t be created with PJS.
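As an illustration, the signer contract boils down to something like the following. This is a deliberately simplified sketch: the authoritative definition lives in the @polkadot-api/polkadot-signer package, and the real signTx receives more context (signed-extension values, the chain's metadata, and so on).

```typescript
// Simplified sketch of the PolkadotSigner idea; the real interface
// (in @polkadot-api/polkadot-signer) passes more context to signTx.
interface PolkadotSignerSketch {
  publicKey: Uint8Array
  // Sign an extrinsic's call data; resolves to the signature bytes.
  signTx: (callData: Uint8Array) => Promise<Uint8Array>
  // Sign arbitrary raw bytes.
  signBytes: (data: Uint8Array) => Promise<Uint8Array>
}

// Dummy implementation for demonstration: "signs" by reversing the input.
const dummySigner: PolkadotSignerSketch = {
  publicKey: new Uint8Array(32),
  signTx: async (callData) => Uint8Array.from(callData).reverse(),
  signBytes: async (data) => Uint8Array.from(data).reverse(),
}
```

The key property is that nothing here references the PJS keyring, bn.js, or wasm-crypto: any wallet, hardware device, or test harness can implement the contract.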

Composability

Our second tenet is composability. Each PAPI library is as decoupled as possible from the others. We relentlessly break complex problems into small, well‑isolated packages with clean APIs, so consumers don’t need to understand how the hard parts are solved. “Good fences make good neighbors.”

That’s why some of our libraries (like scale-ts) are widely adopted. In fact, even PJS and Dedot use some @polkadot-api/* packages… which is great!

A practical upside: the PAPI client is resilient to network hiccups. If you use a WebSocket provider, you can safely reconnect every few minutes, even with in‑flight operations. Thanks to the modern JSON‑RPC spec (and our middleware), operations continue as if nothing happened.

Performance

The only sustainable way to make a complex library fast is to isolate complexity into small, composable pieces. As a result, Polkadot‑API is (at the time of writing) the most performant JS library in this space, both in memory and CPU usage. Keep reading for hard evidence.

Contribute upstream

Clear boundaries help everyone. That’s why the PAPI team actively contributes upstream (PolkadotSDK, smoldot, Chopsticks, and more). When we spot issues or opportunities at the boundary, we open issues, PRs, and discussions. The list is long; I’ve collected many of them in this gist.

Empower Polkadot developers

Our last tenet is empowering developers with great tools and DX. We invest heavily in:

  • Improving the PAPI development console.

  • Providing powerful Polkadot SDKs (Ink!, Solidity, Governance, Identity, Staking, etc.).

  • Shipping DApps that showcase those SDKs: like the popular https://bounties.usepapi.app/ and a new staking DApp we’ll release in a few weeks.

  • Building tools like diff.papi.how and the @polkadot-api/check-runtime CLI.

  • An upcoming JSON‑RPC analyzer to inspect all operations (storage requests, runtime calls, headers, bodies, etc.) from any compliant JSON‑RPC provider: so you can spot bugs, find potential improvements, and analyze how a DApp interacts with a chain.

Ok, so why PAPI?

First, PAPI has improved dramatically since the first stable release 14 months ago. Second, it’s uniquely stable: we haven’t shipped a single breaking change to the public API in that time, and the upcoming v2 changes are ridiculously small (see here and here).

PAPI is also, by a wide margin, the most performant JS library in this space (details below).

It’s the only library truly ready for the near future, thanks to being fully decoupled from the PJS keyring.

Finally, the core PAPI team consists of three exceptional developers who challenge and learn from one another. We each understand the responsibilities of every package deeply enough to jump in and fix issues anywhere in the codebase. The code is designed so others can fork or take over. In other words: the bus factor is healthy. PAPI is here to stay.

Choosing PAPI also means choosing interoperability, making it easier for other teams to plug in and add value.

What about Dedot?

It’s time to address the elephant in the room.

First and foremost, we applaud Sinzii’s effort to create an alternative to PJS. That said, we do have concerns about Dedot’s approach and share them here in a constructive spirit, especially given Dedot is still in major‑version zero.

Lack of interoperable interfaces

Earlier, we explained how Polkadot‑API invested in simple, interoperable public interfaces.

Unfortunately, Dedot hasn’t done the same:

Dedot’s signing interface:

Dedot adopted the problematic interfaces used by PolkadotJS, even though PJS maintainers agree we should move away from them. Consequences include:

  • Inability to create extrinsics for modern chains.

  • Tight coupling of the signer to @polkadot/wasm-crypto and bn.js, which are heavy and inefficient.

It’s disappointing to see a new library doubling down on interfaces we’ve long known are problematic.

Dedot’s JSON‑RPC provider

Dedot didn’t leverage the modern JSON‑RPC APIs to abstract away reconnections, halted transports, etc. Support for modern JSON‑RPC appears to have been added later, without fully embracing its benefits. The provider leaks Dedot‑specific implementation details and isn’t truly compliant with the new spec, since it behaves differently depending on the provider. That connects to the next point.

Poorly defined boundaries

In Dedot, packages are tightly coupled. Complexity isn’t isolated behind clean APIs, so consumers inherit internal details.

For example, Dedot exposes two different clients depending on the provider (legacy vs. modern). Both miss common, useful APIs every client should have, such as subscribing to all finalized events or to current best blocks. Users must roll their own, handle reconnect/disconnect edge cases, and manage operation persistence across reconnects… twice, for two clients.

Worse, the modern client is hard to use because several endpoints aren’t fully compliant with the modern JSON‑RPC spec. Instead of using generic middlewares (like those PAPI provides) to bridge gaps, Dedot delivers a broken DX on the modern client. Everyone ends up on the legacy client, which can’t offer the guarantees modern APIs provide. This cements legacy usage while still failing to offer minimum guarantees.

And Dedot inherited PJS’s leaky subscription patterns too.

Missing runtime safety

Another issue Dedot inherited from PJS is the lack of runtime type safety: ensuring that the chain you’re interacting with is compatible with the code and type definitions your DApp was built with.

PAPI solved this through “descriptors”: lightweight runtime checks that validate compatibility every time you interact with a chain. While we already have improvements planned post-v2, the current implementation is both highly reliable and performant.

This eliminates the dreaded Cannot read property [X] of undefined errors that sometimes appear after a runtime upgrade in PJS: errors that make debugging painful and can silently produce incorrect behavior. In the worst cases, these mismatches can result in a DApp displaying wrong info (e.g., due to a new enum variant) or even submitting transactions with wrong data.
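A toy illustration of the idea (not PAPI's actual descriptor machinery, which works against the chain's SCALE metadata): before decoding, compare the shape the DApp was built against with the shape the chain currently reports, and fail loudly on mismatch instead of decoding garbage.

```typescript
// Toy illustration of descriptor-style compatibility checks;
// not PAPI's actual implementation.
type Shape =
  | { type: "primitive"; name: string }
  | { type: "struct"; fields: Record<string, Shape> }

// A chain value is compatible with the shape the DApp was built against
// if every expected field is still present with the same primitive type
// (extra on-chain fields are tolerated in this sketch).
function isCompatible(expected: Shape, onChain: Shape): boolean {
  if (expected.type === "primitive")
    return onChain.type === "primitive" && onChain.name === expected.name
  if (onChain.type !== "struct") return false
  return Object.entries(expected.fields).every(
    ([key, shape]) => key in onChain.fields && isCompatible(shape, onChain.fields[key]),
  )
}
```

PAPI performs checks in this spirit on every interaction, so an incompatible runtime upgrade surfaces as an explicit error rather than as undefined values deep inside the DApp.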

We’ve encouraged the Dedot team to address this many times, but to this day, they’ve continued to dismiss it as not important enough to solve.

Subpar performance

Leaky abstractions often lead to performance issues. Dedot is no exception. Using the benchmark Dedot created:

Results

Results were run on a MacBook M3 Pro with Bun 1.2.7.

Dedot:


================================================================================
📊 FINAL MEMORY CONSUMPTION REPORT
================================================================================
Total blocks processed: 200
Total time: 610.6s
Average time per block: 3.05s

Memory Statistics:
Initial heap used: 4.96 MB
Final heap used: 237.15 MB
Total memory growth: 232.19 MB
Average memory per block: 1.16 MB
Peak RSS: 386.09 MB
Peak heap used: 241.09 MB

⚠️ WARNING: High memory growth detected!
Average growth of 1.16 MB per block may indicate a memory leak.

PAPI:


================================================================================
📊 FINAL MEMORY CONSUMPTION REPORT
================================================================================
Total blocks processed: 200
Total time: 164.4s
Average time per block: 0.82s

Memory Statistics:
Initial heap used: 2.30 MB
Final heap used: 19.84 MB
Total memory growth: 17.54 MB
Average memory per block: 0.09 MB
Peak RSS: 175.91 MB
Peak heap used: 42.49 MB

✅ Memory growth appears normal.

We’ve set up the following repo so anyone can verify these results. Please note that we didn’t create these benchmarks; we simply took the ones Dedot created for their own repo.

PAPI processed the same number of blocks ~3.7× faster while using dramatically less memory. Note: Dedot’s memory usage would have been worse if the user weren’t responsible for periodically cleaning the cache, another example of a leaky abstraction. PAPI knows when to clean up safely because the client tracks when it’s correct to discard cached items.

Also, the PAPI client does more, continuing smoothly through connection hiccups.

Lack of upstream contributions

Given both libraries tackle the same problems, we’d love to see Dedot engage more upstream. For example, our team pushed hard so that Pallet Revive exposed the instantiated event, something Dedot benefits from. Another example is this upstream issue, which we suspect they must have encountered.

To date, no one from the Dedot team has opened an issue or PR to PolkadotSDK. The same appears true for smoldot and the new JSON‑RPC spec. Upstream collaboration matters because that’s how the platform gets better for everyone.

Please help Dedot step up

We’re puzzled by how quickly some influential voices started to shill a project still in its early phase (v0.18.3), with leaky APIs and familiar architectural pitfalls.

If you support Dedot, please ask them, at a minimum, to embrace interoperable APIs for the JSON‑RPC provider and signing interfaces. That alone would unlock a healthier, more resilient ecosystem for all of us.

Closing thoughts

We’re not asking the ecosystem to “pick a side”. We’re asking it to pick standards. If libraries converge on interoperable provider and signer interfaces, everyone wins: wallets, DApps, infra, even competitors. PAPI is our contribution toward that future: fewer foot-guns, clearer contracts, and performance you can verify.

If you build on Polkadot, take this as an invitation: hold us to a high bar, file issues, challenge assumptions, and help refine the interfaces. Let’s make the future about shared foundations, not one-off patches.


I agree — at the end of the day, we need more standardization. Also, I’ve really enjoyed working with polkadot-api; it’s probably the best in Polkadot right now in terms of developer experience.


Bro, polkadot-api should of course be better than dedot, given that dedot has received ~$50k in grants working as a one-man project. And I am assuming PAPI has raised >$500k? That’s a >10x difference in funding. I’m also pretty sure Dedot is a side project.

…So I think the comparison is a little harsh. Seems you wanna play fisticuffs with whoever is near you.

instead of competing, recognise that you are a key product-owner and thought leader in the space, you can just be giving the best suggestions and guidance sugar wrapped in welcoming chad vibes, to projects like dedot, who seems to be working mostly fuelled by enthusiasm.

High critique isn’t the only tool in the locker, it definitely has a toxic effect if incorrectly dosed.

Kudos to you though for really greatly improving the performance and design of the api. when i started building things with PJS, i was dumbfounded by its complexity, and assumptions about what i knew (or should know) as a polkadot developer. It was indeed a hair pulling exercise.

There are some aspects of Dedot which Papi can learn from… and i think that’s in how welcoming it is to enthusiasts to come and play and try things.

some raw suggestions:

  • docs are decent, but they can be world class:
    • make almost no assumptions about the skill level of the user. I find the best docs write as if the person is turning the computer on for the first time.
    • have a narrative voice that reads like stories, with user journeys in the docs. So instead of just having “recipes”, weave recipes into the onboarding story.
    • heavy focus on “getting started” and lots of examples, with one click set up apps.
    • If you are focussing on converting PJS users to PAPI, make it insanely easy for a PJS user to grok the new type interfaces and nomenclature. With videos and hand holds. I think this will speed the process.
  • Attract more builders.
    • As infrastructure/tooling project i think there should be a duty to foster a community of building and not just rely on the onboarding through polkadot main channels. One reason is you built it and so you know it better than anyone.
    • make different forms of content, ultimately the api with the best content will win the majority of users.
    • …Maybe its worth hiring someone who is going to focus on the above elements onboarding journey and community element of DevEx. Or Perhaps this can be more a Dedot thing.

on a funding note… I’m sure you will get the treasury funding 99% of the time. And I’m sure if you don’t like the direct treasury route there are bounty funds like Velocity labs that can take the stresses of treasury away.


Last time I tried papi was building this with an AI tool. The AI tool chose papi initially, which I was happy with. However, it ended up using some private methods (starting with underscore) and making raw RPC calls directly. So I had to force it to switch to polkadot.js because I (and the AI) couldn’t figure out how to write chain-agnostic code that doesn’t depend on the metadata of any particular chain. Did I miss something? For example, how to do api.query.timestamp.now() with papi in a way that works with all chains?

Hey @decentration, thank you for the comment, there’s a lot of valuable feedback here.

I want to clarify that this post is about promoting interoperable standards across the ecosystem, and moving past the problematic PJS interfaces. We all benefit from that direction, regardless of which library people choose.

We’ve put significant effort into making these interfaces interoperable and composable for a reason: To strengthen the ecosystem as a whole, libraries, developers and users.

Instead of competing, recognise that you are a key product-owner and thought leader in the space, you can just be giving the best suggestions and guidance sugar wrapped in welcoming chad vibes, to projects like dedot, who seems to be working mostly fuelled by enthusiasm.

High critique isn’t the only tool in the locker, it definitely has a toxic effect if incorrectly dosed.

We believe in a healthy competition between libraries, one where we interoperate, collaborate and learn from each other.

We’ve always aimed to encourage Dedot to improve, on the points mentioned in the original post and more. For example, here’s a constructive PR review for Dedot from @josep.

It’s tricky to express those points without being direct, but the goal is to highlight why decoupled interfaces matter. This post is meant as a push toward a stronger, more resilient ecosystem overall.

On funding: different projects receive different levels of support at different times, and that inevitably shapes scope, team capacity, and delivery speed. From the outset, long before funding, PAPI has centered on interoperable interfaces. Our focus on a modular architecture made it easier to grow our team and ultimately attract more funding. We’re rooting for Dedot’s continued success, which is exactly why we’re offering this constructive feedback.

There are some aspects of Dedot which Papi can learn from… and i think that’s in how welcoming it is to enthusiasts to come and play and try things.

Absolutely! Onboarding new devs is something we definitely need to improve. In fact, Dedot’s docs helped us realize that ours needed improvement. A lot of improvement.

And we have already started taking action: over the past few weeks we’ve made significant updates to our docs, and we’re bringing in someone external to help refine the docs with a fresh perspective.

And lastly, thank you very much for all your suggestions. They are genuinely helpful, and we’re committed to acting on them.


Hey @xlc, regarding this specific point, it’s actually simple:

const api = client.getUnsafeApi();
const now = await api.query.Timestamp.Now.getValue();

This ties back to the earlier comment about improving the first-time developer experience. It’s something we’re actively working on.

We’re also considering bringing in one thing Dedot does well to make onboarding smoother: pre-generated types for well-known chains. It’s another good example of how healthy competition helps everyone improve.

And please, whenever you run into an issue with PAPI, feel free to open a ticket on our GitHub (just like we do when using Chopsticks :wink:).


Thank you for the comparison post, I have been waiting for this. I also value how the communication has gotten more productive. I have worked with papi and dedot extensively and can only agree to what has been said. I value papi’s effort in shaping the ecosystem positively with upstream contributions, PRs, developing standards. I also value the onboarding and responsiveness of dedot. It has been very enjoyable to work with dedot and typink too! I also have seen that papi docs have an ai chat feature which is new?! great! In the end we have to attract much more builders and what I can say to this point is that the competition / collaboration between papi and dedot turns out to be very beneficial for that goal.


My 2½ cents:

  • People should be free to build and use whatever they want
  • Influential voices and leaders, however, should want what’s best for Polkadot
  • Shilling out of self-interest is cringe and counterproductive, e.g. this

I believe the elephant in the room here isn’t Dedot itself. I have nothing but respect for legitimate builders like Sinzii. Rather, it’s the apparent push for its adoption that seems driven by personal alignment and self-interest.

I try to stay impartial and open, but in this case, it unfortunately does seem that way.

What’s more concerning is that even the leaders responsible for maintaining our main source of truth don’t seem immune to this behaviour.


Race to the bottom needs to stop

  1. The PR author had a public fallout with the PAPI team and went on a crusade; that PR is pure cringe.
  2. The order should optimise for developer journey, reflecting on adoption and/or technical merit. That decision rests with the docs maintainers, whose sole goal should be attracting and keeping more developers.
  3. Alphabetical order is idiotic, especially given the current landscape, proven by point 1. What’s next, PAPI renaming itself to APIoP (API of Polkadot)?

Thanks for chiming in, @leonardocustodio.

That’s not the point we are making. We didn’t create PAPI because PJS entered maintenance mode; when we started PAPI, PJS wasn’t in that stage. The motivation is architectural: interoperability, clear boundaries, and modern JSON-RPC/signing interfaces. I think you may have missed this section: “What is wrong with Polkadot.js?”.

Can you point to what, specifically, you consider “unfair play”? The post critiques interfaces and boundaries, and asks Dedot to adopt interoperable provider/signer contracts. That’s pro-ecosystem: shared interfaces let tools compete on quality while remaining swappable. If anything, I’m worried we’re under-emphasizing how important those interfaces are.

Agreed. It has been in “active development” for about two years now. If that’s the positioning, it should be communicated as such so developers can set expectations around risk and API stability.

Totally your call! I’m genuinely happy you’re using it. I’d be even happier if you championed interoperable interfaces in Dedot (JSON-RPC + signer). That alone would make everyone’s tooling more resilient. Also, as noted, Dedot already relies on some @polkadot-api/* packages, so in a way you’re also using PAPI :wink:.

Our ask isn’t to “pick PAPI.” It’s to pick standards. If Dedot, PAPI, and others converge on the same minimal, stable interfaces, the entire Polkadot ecosystem wins. Happy to collaborate on making that happen.


Thanks for the reply, @leonardocustodio. Let me address your points directly and keep this grounded in facts.

I don’t think that’s accurate. We responded on GitHub immediately, and I also reached out on Matrix and later in person in Lucerne to share a solution. You stopped replying, and a call invitation went unanswered. Happy to continue that conversation any time, publicly on the issue or live, your choice.

This is categorically false. No one told you that you were “doing it wrong.” On the contrary: we explicitly acknowledged the need you raised and prioritized typed-codecs, which shipped a few weeks later despite a heavy workload. (As a side note, Dedot doesn’t have this feature.)

This is out of context. I was referring to a specific API: watchEntries, which is not intended for indexing very large storage maps like System.Account. (As a side note, Dedot doesn’t offer this API).

PAPI can be used for indexers. For example, https://xcscan.io/ uses PAPI for their indexer and saw substantial performance improvements after migrating. We’re building an indexer ourselves and we know that PAPI is a great choice for indexers. One reason being its ability to recover cleanly from network outages and reconnections (another area where Dedot currently struggles, btw).

That is a very weak strawman argument. We advocate interoperable interfaces and clean boundaries. That’s not “use PAPI or else”; it’s “use standards so tools can be swapped without pain.” Compete on implementation, align on interfaces.

In this case, the blocker you hit was a limitation of the modern JSON-RPC spec, the one maintained by Parity (the company that you work at, right?), not “my API.” We acknowledged it, proposed a workaround, and we’re contributing upstream to improve the spec. Since you graduated from PBA, you also know why deprecating legacy RPCs matters. It’d be great to have your help shaping the modern spec so these gaps close faster. Also: one limitation shouldn’t negate the many operational advantages of the modern spec.

They aren’t “my standards.” The minimal JSON-RPC provider interface PAPI uses was largely articulated by Pierre Krieger for substrate-connect, and further simplified precisely because the modern spec enables it, see this reminder we got back then. The signing interface work is similarly being discussed in the open with multiple stakeholders (e.g. here). These are ecosystem standards by design, decoupled from PAPI internals and meant to be adopted broadly.

Could you share the code, please? :folded_hands: From your numbers it looks roughly identical on that micro-case. It’d be useful to see results over a longer run, where memory behavior and reconnection handling matter. Also remember: the benchmark we used in the post wasn’t “crafted by us”, it was authored by Dedot; we simply ported it to PAPI.


Hey there, I want to share a few thoughts since Dedot is mentioned here in this thread.

First off, I want to acknowledge that there are still areas we need to improve, and we’ll continue doing so moving forward.

At the same time, we’re really happy to see that more and more people and projects are choosing to trust and start adopting Dedot. It’s also great to see a lot of users expressing their happiness when using Dedot. We feel thankful that Dedot has started to receive a lot of valuable feedback that is shaping its direction.

We welcome all feedback and, of course, we will have our own judgment on what’s suitable and what’s not for Dedot at a given time and within our current capacity.

Even though our funding situation is not in very good shape, we have continued to develop Dedot over the past 2 years, staying aligned with the latest changes on the protocol side, the movement of the ecosystem, and feedback from the community. This shows our commitment to improving and strengthening the overall DX of the ecosystem.

To some of the points in the post

The PJS signer interface is the most well-adopted signing interface in the ecosystem, and I don’t understand why you’re criticizing us for supporting it when PAPI still supports it in the first place.

I agree that there are limitations right now in the interface, and that is actually a good sign because Polkadot is moving forward and we need to improve it to move along in agreement with the whole ecosystem. I bet the original author would never have been able to imagine how Polkadot would evolve in the future when designing the first version of the interface, and I think that’s completely fine—what we have to do is continue improving it.

We started Dedot without awareness of the new JSON-RPC specs, and once we learned about them, we quickly began working on supporting them on top of the existing support for legacy JSON-RPC.

Our JSON-RPC provider is designed as a means to connect to the network and is not tied to any specific modern or legacy JSON-RPC specs. The interface is simple, with methods to send requests, subscribe, connect, and disconnect from the network. There is no knowledge about modern or legacy JSON-RPC specs within the JSON-RPC provider itself. Instead, higher-level abstractions or clients will use this provider to interact with the network through specific APIs depending on the specs.

Each package in Dedot has its own purpose. For example, one can use the WsProvider or SmoldotProvider from @dedot/providers to connect to the network, or use codecs from @dedot/codecs to encode/decode metadata, headers, etc.

While the two clients have different implementations, they both share the same generic interface, and therefore they both expose a common set of APIs for on-chain interactions. This allows developers to use the shared interface for interacting with the chain and easily swap out clients as needed when they want to switch to a different spec.

One of our goals with Dedot is that it has to be easy to migrate from PJS to Dedot, so we intentionally designed the interface to be familiar to PJS, which helps make the migration much faster and easier.

Dedot will never throw this kind of error when a DApp interacts with an unknown on-chain API. If an API is unavailable, Dedot will throw an error saying exactly which API is not found or unavailable. E.g., if client.query.pallet.method() is unavailable, Dedot will throw an error saying pallet is not found or storage item method is not found.

We also have instructions to deal with breaking runtime upgrades in our docs where devs can generate new chaintypes for upcoming runtime changes and act accordingly.

We agree that there are still things we can improve in this area to make it easier to deal with these situations.

Dedot caches data using LRU caching with a defined capacity, so the memory used for caching will always be under control, and developers never need to clear the cache manually. We also automatically clear cache items appropriately, e.g: when blocks are unpinned, etc.

The memory consumption difference here is mostly due to caching, where we sacrifice memory to gain data availability for operations and calls that are done repetitively. There is no right or wrong here — what we have chosen is a balance.

This is good feedback, and we take it constructively.

Totally agreed on this. The question now is which standard to pick and how to make it work with both libraries, and even with PJS as well. Even though PJS is in maintenance mode, something like the signer interface still needs to reach agreement within the community before we move forward. I’m happy to discuss this in detail.


I don’t quite understand what you mean by this. If you believe there is an issue with this decision, I think the appropriate place to address it is by opening an issue in the repository and sharing your thoughts there so it can be discussed and considered properly, rather than bringing it up here.

What you think is best for Polkadot may not be what I or others think is best for Polkadot. We all want Polkadot to succeed and that comes from each of us acting based on our own perspective and interests, which is completely normal.


Please check my comment here:

I prefer to stay impartial and avoid engaging in or furthering any more nonsensical “libs war”.

At no point did I dictate what I think is “best for Polkadot”. Text-based communication can be tricky; if that was the impression you got, I hope this clears it up.


Thanks for engaging in this conversation, @sinzii!

This is incorrect. PAPI also caches and reuses data so repeated operations are served from cache. The primary reason PAPI performs better in the benchmark is simpler: Dedot repeatedly downloads metadata for all historical blocks, whereas PAPI only fetches metadata when needed. When we hit a block that’s not connected to the tip, we first check whether we already have its metadata and avoid redundant downloads. Re-downloading metadata every time an operation targets an old block, as Dedot currently does, incurs substantial overhead, which your own benchmark highlights.

If it helps, feel free to port the PAPI approach, or ask follow-ups about our strategy for avoiding unnecessary metadata fetches. Let’s fix the root cause rather than inventing a non-existing “memory for availability” trade-off.

Choosing to download additional data that isn’t needed, slowing things down and increasing memory pressure, doesn’t look like a good balance from a performance standpoint.

If that were strictly true, there wouldn’t be a public clearCache API.

To clarify: I’m not criticizing supporting it; I’m criticizing adopting it as Dedot’s own public signing interface. That cements long-known limitations: difficulty creating extrinsics for modern chains, heavier signers, and awkwardness around custom signed-extensions, among others. Supporting legacy interfaces for compatibility is fine; elevating them to your primary public surface is a missed opportunity for a “modern” library.

Exactly. Which is why modern libraries should converge on better, interoperable interfaces rather than re-entrench old ones.

Agreed! That’s what we’re doing. It would be great to see Dedot decouple from the PJS signer interface and join this work.

The new JSON-RPC API mega Q&A came out a few months before Dedot’s development started, alongside other posts and public discussions. Library authors have a responsibility to track and leverage upstream changes, precisely to avoid repeating past mistakes.

It’s more opinionated than it appears, which makes it non-interoperable and coupled to Dedot internals. Neither generic JSON-RPC nor the modern Polkadot JSON-RPC define a standard “subscribe” primitive; subscriptions are conventions built on notifications. Returning a Promise from your subscribe function also leaks an implementation detail about how Dedot internally manages subscriptions.

By contrast, the JSON-RPC interface we proposed is deliberately minimal and library-agnostic. Smoldot’s public interface for Chains is another interoperable example. You can translate between smoldot’s and PAPI’s easily (because they are both interoperable).
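To make the contrast concrete, the proposed interface is roughly the following shape: a function that receives a message callback and returns send/disconnect handles, with raw strings in and raw strings out. This sketch is an approximation for illustration, not the normative definition:

```typescript
// Approximate shape of a deliberately minimal, library-agnostic
// JSON-RPC provider: no spec knowledge, no parsing, no subscriptions.
type JsonRpcConnection = {
  send: (message: string) => void;
  disconnect: () => void;
};
type JsonRpcProvider = (
  onMessage: (message: string) => void,
) => JsonRpcConnection;

// A toy provider that echoes every request back, showing how little
// the interface assumes about the transport or the spec in use.
const echoProvider: JsonRpcProvider = (onMessage) => ({
  send: (message) => onMessage(message),
  disconnect: () => {},
});
```

Because the contract is just "strings in, strings out", any transport (WebSocket, smoldot, a test harness) can implement it, and any client can consume it without coupling to the other side's internals.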

In fact, we could use any PAPI provider with Dedot’s modern client by “impersonating” a smoldot chain. The reverse isn’t feasible with Dedot’s current provider API.

If you don’t like PAPI’s interface, proposing a simpler/better interoperable one (or adopting smoldot’s) would still be a win. We’re not fully satisfied with smoldot’s because disconnects require rejecting the latest Promise, it limits synchronous message bursts, and it’s harder to ensure consumers drain all yielded promises. Even so, we’d be willing to compromise on it.

This interface is missing several strengths the modern JSON-RPC provides: subscribing to all finalized blocks, tracking current best blocks, automatic recovery across reconnections, etc.

And practically speaking, the modern client isn’t usable with Dedot today. As I wrote above:

This is exactly how a “modern” library inadvertently reinforces legacy JSON-RPC usage. Offering two clients, where one works and the other doesn’t, pushes users toward the legacy path. This is a solved problem: you can deliver a solid DX on the modern client.

It will. A recent example: at Polkadot People block-height 2188447, the structure of the Identity.IdentityOf storage value changed. Dedot can’t detect these structural changes across runtime upgrades on the fly. Without a compatibility API, tools like https://diff.papi.how aren’t achievable. We’ve explained this in multiple places; denying it doesn’t make the limitation go away.

Regarding the JSON-RPC provider: we proposed a super simple, minimalistic API, although for v2 we’ve realized it’s slightly better if the payloads are actually parsed, which helps performance. The two are essentially the same, though.

We’re open to either, or to a third option, so long as it’s simple, performant, and decoupled from any single library’s internals.

That agreement exists. The new interface we proposed is being added into PJS.
