Meta Transactions Support for Polkadot

Meta Transactions are coming to Polkadot and are now released on Westend for testing. The feature is ready on Westend’s Asset Hub, Collectives, Coretime, and People chains, and we’d like to share how to use it.

Introduction

The concept of a meta transaction is well established in the Ethereum ecosystem: a transaction authorised by one party (the signer) and executed by an untrusted third party that covers the transaction fees (the relayer). This concept proves useful in scenarios where the signer lacks the assets to cover the fee, or the incentive to do so.

Examples include:

  • dApp covering transaction fees for its users
  • proxy accounts lacking balance
  • transaction fees paid in any asset when a signer incentivizes a relayer to cover the fee, as demonstrated by a transaction such as batch([sendCustomToken(relayerAddr, amount), doTheActualWork()])
  • signer delegating voting power without covering the transaction fee.

For implementation details, refer to RFC polkadot-sdk/4123.

How It Works

Much like a regular transaction, the signer part of a meta transaction is constructed from the call the signer wishes to execute, the extension version, and a set of extensions, including one that carries the signer’s signature. The extensions concept is inherited from regular transactions, so many will already be familiar: CheckMortality, CheckGenesis, CheckNonce, and others. These extensions serve the same purpose here as they do in regular transactions, enforcing constraints such as expiration, chain-specific validity, and protection against double-spend attacks. One additional extension, MetaTxMarker, ensures that meta transactions cannot be submitted as regular transactions.

Once the signer has assembled their part of the meta transaction, rather than publishing it on-chain directly, they share it with any interested relayer. The relayer then takes the signer’s payload as an argument to the dedicated call meta_tx.dispatch(meta_tx, meta_tx_encoded_len), wraps it in a regular transaction, and submits it on-chain. Transaction fees are charged to the relayer, while the enclosed call is executed on behalf of the original signer.

/// Meta Transaction type.
///
/// The data that is provided and signed by the signer and shared with the relayer.
pub struct MetaTx<Call, Extension> {
	/// The target call to be executed on behalf of the signer.
	call: Call,
	/// The extension version.
	extension_version: ExtensionVersion,
	/// The extension/s for the meta transaction.
	extension: Extension, // example: (CheckMortality, CheckNonce, CheckGenesis, VerifySignature)
}

/// Processes and executes a meta-transaction on behalf of its signer.
fn dispatch( // exposed on-chain as the meta_tx.dispatch call
  meta_tx: MetaTx,
  meta_tx_encoded_len: u32,
);

Runtime Setup

If you want your runtime to support meta transactions, you’ll need to set up the pallet. Only one config parameter requires thought: Extensions, the set of extensions that the signer has to include in their part of the meta transaction.

Here is an example:

pub type MetaTxExtension = (
  // Verify Signature is used as an extension to ensure signature is valid
  pallet_verify_signature::VerifySignature<Runtime>,
  // Ensure that this will only execute as meta transaction, not as regular
  pallet_meta_tx::MetaTxMarker<Runtime>,
  frame_system::CheckNonZeroSender<Runtime>,
  frame_system::CheckSpecVersion<Runtime>,
  frame_system::CheckTxVersion<Runtime>,
  // Ensures the call is executed on the right chain
  frame_system::CheckGenesis<Runtime>,
  // Enables the expiration on signature validity
  frame_system::CheckMortality<Runtime>,
  // Ensures there is no double spending
  frame_system::CheckNonce<Runtime>,
  frame_metadata_hash_extension::CheckMetadataHash<Runtime>
);

Integration into a dApp

The following code and explanation show how to build meta transactions and integrate them into a custom dApp. Note that the example uses PAPI (Polkadot API) with the Bun runtime; other stacks will need some adaptation.

The scenario in the code: Alice is the signer and Bob is the relayer. Alice builds a System.remark_with_event call with the remark “i am a signer” and signs it together with the extension data; Bob then publishes it on-chain through the Meta Tx pallet.

Meta Tx Request

Check the full code here if you need more context.

1. Build the Call Data

Create the remark call using PAPI’s typed API and extract its encoded bytes:

const remark = stringToU8a("i am a signer");
const call = api.tx.System.remark_with_event({
  remark: Binary.fromBytes(remark),
});
const encodedCallData = (await call.getEncodedData()).asBytes();

2. Gather the Data Needed to Build Extensions

Retrieve the runtime version, genesis hash, and Alice’s nonce — these bind the signature to a specific chain and runtime:

const genesisHashHex = await client._request<string>(
  "chain_getBlockHash",
  [0]
);
const genesisHash = hexToU8a(genesisHashHex);

const rv = await client._request<{
  specVersion: number;
  transactionVersion: number;
}>("state_getRuntimeVersion", []);

const aliceAddress = ss58Address(alice.publicKey);
const accountInfo = await api.query.System.Account.getValue(aliceAddress);
const nonce = Number(accountInfo.nonce);

3. Encode Extension Data

Explicit and implicit extension data:

const extensionExplicit = u8aConcat(
  new Uint8Array([0x00]),       // Era::Immortal
  compactToU8a(nonce),          // nonce (compact encoded)
  new Uint8Array([0x00])        // metadata hash mode: disabled
);

const extensionImplicit = u8aConcat(
  stringToU8a("_meta_tx"),             // MetaTxMarker domain separator
  u32ToLeBytes(rv.specVersion),        // spec version
  u32ToLeBytes(rv.transactionVersion), // tx version
  genesisHash,                         // genesis hash
  genesisHash,                         // block hash (immortal → genesis)
  new Uint8Array([0x00])               // metadata hash: None
);
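Two of the helpers above are worth pinning down. compactToU8a is the SCALE compact encoder (available in @polkadot/util), while u32ToLeBytes is a small local helper assumed by this example rather than something PAPI exports. A self-contained sketch of both, covering the value ranges used here:

```typescript
// SCALE compact encoding for the single-byte (< 2^6), two-byte (< 2^14)
// and four-byte (< 2^30) modes; the real @polkadot/util helper also
// handles the big-integer mode.
function compactToU8a(value: number): Uint8Array {
  if (value < 1 << 6) {
    return new Uint8Array([value << 2]); // mode 0b00
  }
  if (value < 1 << 14) {
    const v = (value << 2) | 0b01; // mode 0b01
    return new Uint8Array([v & 0xff, (v >> 8) & 0xff]);
  }
  if (value < 1 << 30) {
    const v = (value << 2) | 0b10; // mode 0b10
    return new Uint8Array([
      v & 0xff, (v >> 8) & 0xff, (v >> 16) & 0xff, (v >> 24) & 0xff,
    ]);
  }
  throw new Error("big-integer mode not implemented in this sketch");
}

// Little-endian u32, as used for specVersion and transactionVersion.
function u32ToLeBytes(value: number): Uint8Array {
  const out = new Uint8Array(4);
  new DataView(out.buffer).setUint32(0, value, true); // true = little-endian
  return out;
}
```

For example, compactToU8a(69) yields the bytes 0x1501, and u32ToLeBytes(1) yields 0x01000000.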

4. Collect Data for Signature and Sign

Concatenate everything, hash with blake2-256, and sign:

const extensionDataToBeSigned = u8aConcat(
  new Uint8Array([0x00]),  // extension_version = 0
  encodedCallData,
  extensionExplicit,
  extensionImplicit
);

const hash = blake2AsU8a(extensionDataToBeSigned, 256);
const signature = alice.sign(hash);
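For readers not pulling in @polkadot/util: u8aConcat is a plain byte concatenation, with no separators or length prefixes inserted between the parts. A minimal equivalent:

```typescript
// Plain byte concatenation: the signed payload is just the parts
// laid out back to back; nothing is inserted between them.
function u8aConcat(...parts: Uint8Array[]): Uint8Array {
  const total = parts.reduce((n, p) => n + p.length, 0);
  const out = new Uint8Array(total);
  let offset = 0;
  for (const p of parts) {
    out.set(p, offset);
    offset += p.length;
  }
  return out;
}
```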

5. Build Final Meta Tx Request from Signer

// assemble explicit extensions
const metaTxExtension = [
  Enum("Signed", {               // VerifySignature
    signature: Enum("Sr25519", FixedSizeBinary.fromBytes(signature)),
    account: aliceAddress,
  }),
  undefined,                     // MetaTxMarker
  undefined,                     // CheckNonZeroSender
  undefined,                     // CheckSpecVersion
  undefined,                     // CheckTxVersion
  undefined,                     // CheckGenesis
  Enum("Immortal"),              // CheckMortality
  nonce,                         // CheckNonce
  Enum("Disabled"),              // CheckMetadataHash
];

const metaTx = {
  call: call,
  extension_version: 0,
  extension: metaTxExtension,
};

const codecs = await getTypedCodecs(localdev);
// Encode Meta Tx and share with the world
const metaTxEncoded = u8aToHex(codecs.tx.MetaTx.dispatch.inner.meta_tx.enc(metaTx));

Relay Transaction

1. Build the MetaTx.dispatch Call

Wrap the signer’s meta transaction into a MetaTx.dispatch extrinsic. Use getTypedCodecs to encode just the meta_tx field and measure its length:

const codecs = await getTypedCodecs(localdev);
const decodedMetaTx = codecs.tx.MetaTx.dispatch.inner.meta_tx.dec(hexToU8a(metaTxEncoded));
const metaTxEncodedLen = hexToU8a(metaTxEncoded).length;

const dispatchCall = api.tx.MetaTx.dispatch({
  meta_tx: decodedMetaTx,
  meta_tx_encoded_len: metaTxEncodedLen,
});
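Note that meta_tx_encoded_len is simply the byte length of the encoded signer payload; for a 0x-prefixed hex string that is (hex.length - 2) / 2. A minimal hexToU8a sketch (the real helper lives in @polkadot/util):

```typescript
// Parse a hex string (with or without a 0x prefix) into bytes.
function hexToU8a(hex: string): Uint8Array {
  const clean = hex.startsWith("0x") ? hex.slice(2) : hex;
  const out = new Uint8Array(clean.length / 2);
  for (let i = 0; i < out.length; i++) {
    out[i] = parseInt(clean.slice(i * 2, i * 2 + 2), 16);
  }
  return out;
}
```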

2. Sign and Submit

Bob signs the outer transaction with his own key and submits it. Bob pays the transaction fees:

const result = await dispatchCall.signAndSubmit(bobSigner);

console.log("Finalized in block:", result.block.hash);
console.log("Block number:", result.block.number);
console.log("Success:", result.ok);

What is the rationale behind the MetaTxMarker domain separator? Where is it actually defined? Is this supposed to be the AuthorizeCall extension defined in the metadata? Because if so, the metadata clearly states that both the implicit and explicit values of that extension must always be empty bytes. So why do I have to set it to arbitrary bytes for my transaction to be relayed?

Was it really necessary to use this super hacky signed extension? Why can’t the authorizer simply create a signed transaction like MetaTx.authorize({ call }), and then the relayer just relay it with something as simple as MetaTx.dispatch({ authorized_tx })?

The current implementation produces APIs that are frankly unusable, uncomposable, and leaky. It punches through far too many abstraction layers. Please, for the love of god, do not ship this to production as-is. Please change the implementation to something more ergonomic and usable.

Quite frankly, I have a hard time understanding why this was implemented in such a convoluted and unergonomic way. It’s not like I didn’t warn about this; I raised the concern 2 years ago when I wrote this.


If this had been implemented in a non-convoluted way, then the DX would be as simple as this with PAPI (with other libraries it would have also been just as simple, of course):

// --------------------------------------------------------------------------
// Create signers for dev accounts
// --------------------------------------------------------------------------
const entropy = mnemonicToEntropy(DEV_PHRASE);
const miniSecret = entropyToMiniSecret(entropy);
const derive = sr25519CreateDerive(miniSecret);
const alice = derive("//Alice");
const aliceSigner = getPolkadotSigner(alice.publicKey, "Sr25519", alice.sign);
const bob = derive("//Bob");
const bobSigner = getPolkadotSigner(bob.publicKey, "Sr25519", bob.sign);

// Create api
const api = createClient(getWsProvider("ws://127.0.0.1:9944"))
  .getTypedApi(localdev);

// Bob, who doesn't have funds, authorizes the tx
const authorizedTx = await api.tx.MetaTx.authorize({
  call: api.tx.System.remark_withEvent({
    remark: Binary.fromText("i am a signer")
  }).decodedCall
}).sign(bobSigner)

// Alice, who has funds, pays for Bob's authorized tx
const dispatchCall = api.tx.MetaTx.dispatch({ tx: authorizedTx })
const result = await dispatchCall.signAndSubmit(aliceSigner)

That’s just one of the many possible sane APIs that could have enabled using meta-transactions in a sane way. I am not married to this particular one.

Why do runtime engineers keep creating these super convoluted and over-engineered solutions? Why can’t we start from the APIs that we want to expose and then write the implementation afterwards?

Another major problem with the current approach is that, in practice, it is effectively unusable.

The reason is simple: current signers such as browser extensions, wallets, Polkadot Vault, Ledger devices, and similar tools do not sign arbitrary data directly.

When asked to sign arbitrary data, they typically wrap the payload with 0x3c42797465733e and 0x3c2f42797465733e (<Bytes> and </Bytes>), for well-understood security reasons.
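To make this concrete: the wrapper bytes quoted above decode to the ASCII strings “<Bytes>” and “</Bytes>” (note the capital B). A sketch of the wrapping, with wrapForRawSigning as an illustrative name rather than an actual library function:

```typescript
// PJS-style signers wrap arbitrary payloads before signing, so a raw
// signature can never be mistaken for a transaction signature.
const PREFIX = new TextEncoder().encode("<Bytes>");  // 0x3c42797465733e
const SUFFIX = new TextEncoder().encode("</Bytes>"); // 0x3c2f42797465733e

function wrapForRawSigning(payload: Uint8Array): Uint8Array {
  const out = new Uint8Array(PREFIX.length + payload.length + SUFFIX.length);
  out.set(PREFIX, 0);
  out.set(payload, PREFIX.length);
  out.set(SUFFIX, PREFIX.length + payload.length);
  return out;
}
```

Whatever payload the dApp submits, the signer actually signs <Bytes>…payload…</Bytes>, which is why a signature produced through a raw-signing API will not verify against the unwrapped meta-tx payload.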

So let’s look at this from the perspective of the account that is supposed to authorize the transaction. What options does that user actually have, depending on their signer?

  • Ledger device: a Ledger cannot sign arbitrary bytes at all, not even with blind signing (unless they are wrapped with the aforementioned <Bytes> wrapper). So this approach simply does not work for Ledger users. There is no straightforward way for them to sign the required MetaTxMarker domain separator payload. The only possible workaround would be to sign a transaction with the Authorized signed-extension “enabled” and then extract the signature from that. But even that is blocked, because the metadata explicitly says those bytes must be empty. In other words, a Ledger user cannot authorize a transaction to be relayed by someone else.
  • Browser extension: same issue. It does not work unless a new, hacky, non-standard signing API is introduced. I know what certain Parity devs must be thinking… Please don’t.
  • Polkadot Vault: same issue.
  • Anything else following the same signing model: same issue.

Given that, can anyone point to a real-world use case where this API is actually usable as designed?

This is exactly why I argued against implementing authorization through a signed extension in the first place. Not only was that concern ignored, but the final design also relies on a non-standard signed extension whose metadata says the payload must be empty. That combination eliminates any realistic path toward making this work in a sane, interoperable way.

To me, this is another example of runtime developers designing around theoretical flexibility while overlooking the constraints of actual signer implementations. Sigh…

@josep thank you for the feedback.

Yes, we discussed this two years ago (polkadot-sdk/4123). You were against Solution 1 and in favor of Solution 2.x. I agreed and went with Solution 2.1. The solution presented above is Solution 2.1.

The solution you’re proposing now - MetaTx.authorize({ call }) and MetaTx.dispatch({ authorized_tx }) - would be stateful and require a user deposit.

The MetaTxMarker extension carries only implicit data, for security: it ensures that a signer’s meta-transaction signature cannot be reused as a regular transaction. Nothing new here; we use a similar approach with the CheckGenesis extension, for example, in general transactions.

AuthorizeCall is relatively new, and I believe we should be able to use it, especially if there have been no major changes to its contract. We used TransactionExtension specifically to follow the framework’s contract and to utilize existing tools that are already familiar to clients. There were no hacks or deviations from the contract.

I’ll follow up with whether AuthorizeCall can be used and provide more details on the remaining concerns.

I would be grateful for more feedback from dApps engineers.

I completely disagree. Please provide some evidence. It seems to me that this is completely made up. There is no reason why this should be stateful, at all.

The solution I’m proposing also ensures that the signature cannot be reused as a regular transaction. What makes you think otherwise?

Wrong again! You see, if you have a look at the metadata, you will find that it clearly states that for the CheckGenesis extension the implicit data (also known as the “Additional Signed” field) must be 32 bytes. This is what allows signers like Ledger, PAPI, etc. to know what kind of data must be passed into the signature. The hacky, hidden signed extension that you came up with doesn’t declare anything in the metadata, which makes it effectively unusable. You can use the PAPI console to compare the info exposed in the metadata. Compare the info exposed for CheckGenesis:

with what this new signed extension exposes… Nothing, basically. This is, quite frankly, so incredibly wrong that I can’t believe you keep doubling down on this mistake.

Are you for real?

Dude, the proof is in the pudding! I would be grateful if you simply try to authorize a transaction using a normal signer: ledger, browser extension, etc. You won’t be able to!

This is not really correct. There is a reason transaction extensions expose a name as well: without it, you have no idea how to construct the implicit data. The actual types are useful for decoding, but to construct the data you need to know the semantics of the type, e.g. that this is the genesis hash.

I also read your example the same way as @Muharem, but I think we are just misunderstanding each other. We could put both into a batch or whatever, but this gets more and more hacky.

This is clearly the biggest problem with the current approach. IMO it would also be better if we could fit this into the normal transaction extension queue (in my head, the idea was always to have it working there). We somehow need some kind of “split” in the verification queue. Things like the genesis hash, tx version, etc. can all just be reused and don’t need to be added twice. However, the actual payment needs to be signed by someone else (the relayer).

I’m sorry, but I have to correct you here.

It is true that the consumer of the signer must know the semantics. For example, PAPI, PJS, Subxt, and similar tooling need to understand the semantics in order to construct the data for a given type. In that context, yes: both the extension name and its semantics are highly relevant.

However, that is not true for a generic and safe signer such as Ledger, Polkadot Vault, or a properly built signer extension. These components are the receivers of the inputs, and they should be as decoupled from the semantics as possible. In other words, we do not want them to depend on chain-specific knowledge of what those values should be.

What they must know, however, is the data type of what they are being asked to sign. Otherwise, an attacker could exploit that ambiguity and feed arbitrary data into the signing flow, potentially altering the inputs of other signed extensions. That is obviously a serious security problem.

This is precisely why it is paramount for the metadata to accurately describe the types of the values that are going to be signed. Without that, it is not possible to build secure and generic signers such as Ledger.

Also, as you said, the actual types are useful for “decoding”. But as you already know, once you only have the signed extrinsic, you cannot recover its implicit data in any reliable way (at least not without brute-forcing different permutations until one verifies). So the only point at which those implicit fields can actually be decoded is before signing, when the user is reviewing what they are about to authorize.

And this brings us back to the same distinction: the consumer of the signer must know the semantics in order to construct the payload, but the generic signer does not need to know those semantics. It only needs to know the type information, so it can generically decode the fields, display them to the user, and validate that the provided data actually matches the expected types.

So I insist: it is essential that the metadata always reflects the real data types of everything that is going to be signed. Otherwise, generic signers are simply not possible.

In fact, this is the main reason why it is currently not possible to authorize these transactions from a generic signer. If the metadata exposed the types of this “hidden” signed extension, then we could construct a normal transaction, extract its relevant parts, and pass them to the relayer.

That is, in fact, the crux of the issue here.

I know how they work, and I did not say anything different here :smiley: Also, the extension is setting the correct Implicit type, which should land in the metadata. So I’m not sure what the problem is?

The post claims that this is deployed on Westend AssetHub. In Westend AssetHub it ain’t there. Also, if it was there, then it would be MUCH MUCH MUCH easier to create the authorization, without going through all the hoops and loops that @cirko33 put in that gist. :person_shrugging:

The method shared by @cirko33 doesn’t work for normal signers, and it is incredibly convoluted. So the first thing that I did was to check whether the metadata reflected the type of that signed extension, because if it did, then I would be able to show how to do the exact same thing in a much, much simpler way. However, it wasn’t there… Then, while reviewing @Muharem’s comments on GH, I came across this:

One issue I’ve discovered with our solution is that the chain metadata won’t contain the identifiers for meta tx extensions, and clients won’t be able to rely on them as they do with transaction extensions.

And that’s why I’m making all this fuss, basically. This ain’t right.

By the way: what in the world is a “meta tx extension”? Is there an RFC somewhere that defines this concept?

I was just looking at the extension itself and not at the entire context. So, yeah you are right that it isn’t there right now.

@josep

I want to restate one important point, because it does not seem to have been reflected in your comments.

What is implemented is Solution 2.1 from polkadot-sdk/RFC/4123. The regular transaction extension pipeline is unchanged.

The flow is the following:

  • the signer authorizes the call by constructing a payload that contains the call, the meta-tx signer’s extension pipeline, and the signer’s signature;
  • the relayer then builds and signs a regular transaction, just like any other transaction today;
  • that regular transaction dispatches the signer’s authorization via meta_tx.dispatch(signer_authorization_with_call, ...).

This is the flow shown in the example shared in the post.

So the current design is not that meta-tx is integrated into the main transaction extension pipeline. The signer has a separate extension pipeline for its meta-tx payload, and the relayer later submits that payload via the dispatch call, in the form of a regular transaction.

The feature is released on Westend parachains, as stated in the post, and it was tested with PAPI by @cirko33.

Another important point: the signer’s meta-tx extension pipeline is currently not exposed in metadata. I think we may be able to expose it through custom metadata keys, but I would prefer to gather more feedback first, since I want to avoid introducing breaking changes within metadata without a clear direction.

Regarding MetaTxMarker: the idea is similar to CheckGenesis in the sense that it contributes fixed implicit data to the signed payload, so that the signer’s signature cannot be reused as a regular transaction signature.

When I looked at some client code examples, my impression was that extension payload construction was static enough that this would not be a problem even without metadata support. The Ledger example is a useful counterexample. I do not claim complete knowledge of all client constraints, and I am interested in concrete feedback on that point.

The signer-side meta-tx pipeline also uses the relatively new VerifySignature extension, introduced in polkadot-sdk/pull/2280. I used it because I understood it as aligned with the transaction model direction described in Extrinsic Horizon polkadot-sdk/issues/2415, where newer authorization patterns are expected to be handled through extensions.

Given this clarification, I would ask that the solution be assessed again based on the actual flow above, and that the discussion stay focused on the technical issues.

I am happy to iterate and to focus on converging on a reasonable solution.

Quite frankly, at this point I think it is pointless to keep trying to explain to you why the “solution” pushed to Westend is completely messed up.

It reminds me of that time when I tried to explain to you how incredibly problematic it was that a major breaking change was introduced into the Bounties pallet without any kind of downward communication or heads-up, and you doubled down on the idea that everything had been done by the book. Not only that, you dismissed all my recommendations for improving the workflow so that these kinds of incidents would stop happening.

So here we are again. :person_shrugging:

This is, once again, the same underlying problem: a runtime engineer looking at everything from an ivory tower, without being willing to put themselves in the shoes of users or dApp developers. Quite frankly, someone with that attitude and that kind of tunnel vision should not be in charge of this. So yes, I think Parity leadership should seriously reconsider who is entrusted with this kind of task. And I am starting to suspect that there is a broader issue at play here.

I do not say that lightly. In fact, it would be much easier for me not to say it at all. But it would be dishonest not to say things as they are. Problems do not get fixed unless they are first acknowledged.

Look @Muharem, this is very easy to understand. I will say it again, and maybe this time it lands. I am going to lay down some hard facts. These are not matters of opinion. These are facts that can be verified by anyone. So if you disagree with any of them, then please verify them yourself, and if any of them do not hold up, prove me wrong with evidence.

These are the hard facts about the solution Parity deployed to Westend:

  • Users with wallets on hardware devices like Ledger cannot authorize meta-transactions, as they cannot create the signature that is needed.
  • Users with wallets on Nova cannot authorize meta-transactions, as they cannot create the signature that is needed.
  • Users with wallets inside browser extension wallets such as Talisman, SubWallet, the polkadot-js extension, etc. cannot authorize meta-transactions, as they cannot create the signature that is needed.

And the list goes on.

The reason is very simple: dApps do not (and in most cases should not) have direct access to the raw signer, the private keys or the seed-phrases, because that would be incredibly dangerous and insecure. Therefore, dApps must communicate with signers through standardized APIs, which, by the way, my team has been trying to improve for some time with proposals like this one which aim at getting our ecosystem closer to the “Extrinsic Horizon”.

The solution that is currently on Westend:

  • does not define how dApps should securely interact with signers in order to create these “meta signatures”;
  • does not properly consider the security implications of how dApps are supposed to create those “meta signatures”;
  • does not propose an API for dApps to interact with signers and request those “meta signatures”;
  • makes it impossible to properly leverage the CheckMetadataHash extension, because one of the fields being signed is not even declared anywhere in the metadata. Ironically, CheckMetadataHash is still a required field when creating a “meta signature”.

Up to this point, these are simply facts that can be verified.

In fact, I invite you, @Muharem, to go ahead and try to authorize a meta-transaction with the account where you receive your Fellowship salary (5DMDbLHtrLhQfhvJWKRiixUeWtFTquHRvea3ccjbAueR86hS on Westend), and try to relay that authorized transaction from any other account. Perhaps then you will realize that the only way to do that is to be in direct control of the private key or seed phrase, which is obviously incredibly insecure and absolutely not something we should ever expect dApp users to do.

Now, based on all of the facts above, I think it is perfectly fair to conclude that this solution is, in practical terms, completely useless and insecure.

It is also my opinion that there was no need to create such a convoluted solution in the first place, and that the signature of a normal v4 transaction would have sufficed. But that is a discussion for another post, and for a proper RFC that I plan to submit to the Polkadot Fellowship, proposing a more robust, secure, and useful alternative to the atrocity that has been deployed to Westend.

And yes, I do think it is incredibly audacious for something this sensitive to be pushed as an ad hoc solution without proper Fellowship RFC discussion and approval. I hope we learn something from this, although I am not especially optimistic.

I am still shocked that someone could share that code and think: “yes, this is good stuff.”

It should be self-evident, just by reading the code, that the solution is fundamentally wrong. I mean, the mere fact that it only works when the dApp has direct access to the raw signer is already a massive red flag. And that is before even getting into the fact that a normal dApp developer should not be touching low-level concerns like manually assembling signed-extension values, especially without deriving them from metadata.

So no, sorry, but no: what was tested is obviously insecure and unrealistic.

I mean… how is that even relevant?

That is a completely irrelevant implementation detail for the criticism I am making here.

For starters:

  • It does not address the fact that the method for creating these “meta signatures” is insecure.
  • It would still be insecure even if we were already fully ready for “Extrinsic Horizon” and Extrinsic v5.
  • The reality is that we are not yet ready for “Extrinsic Horizon”.

Once we are actually there, and once Extrinsic v5 becomes the real transaction model in practice, then we can start talking about all the fancy things that new reality enables. But as of today, it is not even possible to create general transactions using Extrinsic v5 in the ecosystem as it actually exists.

In other words: we are not there yet.

Even the example that @cirko33 shared still uses an Extrinsic v4 transaction on the relayer side to broadcast the meta-transaction. So I genuinely do not understand why we are getting ahead of ourselves here.

We could have had a solution that works properly with the reality we have today, and then, once “Extrinsic Horizon” and Extrinsic v5 become real, we could evolve it using the new constructs that future reality enables. Instead, what we got is this insecure and unrealistic in-between solution that is neither here nor there, and ends up being the worst of both worlds.


PS: I previously said that it is not possible to create a “meta signature” using browser extensions. For all realistic purposes, that is still accurate. There is a hacky way to exploit a vulnerability in PJS-based extensions in combination with the legacy signPayload API that could make that possible, although I strongly doubt that any Parity engineer is even remotely aware of that. I am not going to describe it here, because that would amount to abusing an existing vulnerability that we are actively trying to remove with the new createTransaction interface. So if you happen to know about that, please keep it to yourself, because that should absolutely not be treated as a valid option.

I will review it and make it work for the described use cases.