Metadata for offline signers

Summary

As the person behind the messy verifier logic in metadata updates, the difficult-to-maintain metadata portals, and the slow-to-download long metadata QRs, I propose to finally fix this mess.

Previous related discussions can be found in the github issue and on the forum, and probably in other places; please throw in comments, and I'll be updating this list.

The proposal is to fund work to fix this issue. The plan is presented below.

Prelude

You can probably skip right to the proposed solution part if you think you know what this is about - you probably do.

Background

While all blockchain systems support, at least in some sense, the offline signing used in air-gapped wallets and lightweight embedded devices, only a few simultaneously allow complex upgradeable logic and full message decoding on the cold, off-line signer side; Substrate is one of these blessed ones, and we should rely on this feature. It greatly improves transaction security and thus, in general, network resilience. (I should be adding links to real-life attacks here; I'll do it later, feel free to suggest some.)

As we do rely on this feature, we should be very careful to make sure it works correctly every time, lest we create a false sense of security and open even more dangerous paths for attacks.

For decoding transactions, which are short and optimized for chain storage, a Metadata entity is used, which is nothing like short itself (on the order of half a MB for most networks). This is a dynamic chunk of data completely describing chain interfaces and properties, which can be rendered into a portable SCALE-encoded string for any given network version and passed along to an off-chain device to familiarize it with the latest network updates. Of course, compromising this metadata anywhere along the path could result in a disagreement between what the user sees and what they sign, so it is essential that we protect it.
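To make the shape of this object concrete, here is a minimal sketch of reading such a blob in Rust, assuming the frame-metadata and parity-scale-codec crates (the blob is what, e.g., the state_getMetadata RPC call returns):

```rust
// Minimal sketch: decode a SCALE-encoded metadata blob as fetched from a
// node (e.g. via the `state_getMetadata` RPC call). Assumes the
// `frame-metadata` and `parity-scale-codec` crates.
use frame_metadata::RuntimeMetadataPrefixed;
use parity_scale_codec::Decode;

fn decode_metadata(raw: &[u8]) -> Result<RuntimeMetadataPrefixed, parity_scale_codec::Error> {
    // The blob carries a reserved magic prefix ("meta") and a version,
    // followed by the full runtime description; this whole structure is
    // what an offline signer needs in order to show a call to the user.
    RuntimeMetadataPrefixed::decode(&mut &raw[..])
}
```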

Problem statement

Thus, we have two problems here:

  1. Metadata is large and takes a long time to pass into a cold storage device

  2. Metadata authenticity must be ensured

Current solutions

Dealing with the size of metadata

To send metadata into a cold storage device (the Signer app), we've used erasure-coded QR sequences. Smartphones hosting Signer have plenty of memory, and the performance of their QR readers is excellent, so the whole metadata can easily be transferred into Signer, and several versions can even be stored (cached) alongside each other.
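For illustration only, a naive framing sketch of such a multi-part transfer; the actual Signer transfer uses a proper fountain/erasure code rather than fixed indexing, so the layout below is an assumption for illustration, not the real wire format:

```rust
// Naive illustration: split a metadata blob into indexed QR frames.
// The real transfer uses an erasure (fountain) code so frames can be
// scanned in any order and with losses; this is not Signer's actual format.
const FRAME_PAYLOAD: usize = 1024; // bytes of payload per frame (hypothetical)

fn into_frames(metadata: &[u8]) -> Vec<Vec<u8>> {
    let total = metadata.chunks(FRAME_PAYLOAD).count() as u16;
    metadata
        .chunks(FRAME_PAYLOAD)
        .enumerate()
        .map(|(i, chunk)| {
            // 2-byte frame index + 2-byte frame count header, then payload.
            let mut frame = Vec::with_capacity(4 + chunk.len());
            frame.extend_from_slice(&(i as u16).to_be_bytes());
            frame.extend_from_slice(&total.to_be_bytes());
            frame.extend_from_slice(chunk);
            frame
        })
        .collect()
}
```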

Neither is the case for lighter and more secure dedicated devices like Kampela or Ledger - secure embedded memory limitations prevent those from storing large amounts of data, which rules out caching; thus a full transfer would have to be performed for each transaction. This would probably result in a less than pleasant user experience, although at least there would be no need to "hold still" as with QR transfers. Yet this is the current solution.

Dealing with metadata authenticity

This is where users complain the most, at least as I see it.

Of course, anyone can generate metadata locally with several tools (including polkadot-js calls, Signer's built-in metadata generator, etc.). The question is, how would one prove that the code that generated the metadata is not corrupted? When the current solution was designed, the most obvious approach was considered: use distributed consensus to check the metadata hash along with the call author's signature, by including it in the input of the signing function. Unfortunately, this requires the coordinated action of many actors, so at that point this approach was set aside until better times (now).

The solution we came up with is verifier certificates. It is both elegant and ugly; everyone who has used Signer knows why it is ugly - if you lose a certificate, a whole set of users is compromised and should wipe their devices completely - so I'll briefly explain the good part. Since we could not use distributed consensus to check metadata, we had to rely on some authority. Thus, we gave this trust to the issuer of the certificate, be it the users themselves or some trusted party. That party now carries the burden of checking authenticity. The beauty here was that not every user has to spend effort checking metadata - they can rely on peers and on web2-like distribution centers such as metadata portals. This removes some manual work requiring vigilance.

Yet it is ugly now.

So, for both problems we need a new solution.

Proposed solution

Include metadata into signature

This is the simple part - add some hash of the metadata to the signing inputs, then sign. The size of the data stored on chain does not change - except for a bit (or a byte?) indicating whether this new feature is used or the data is signed as before (so that cold signers can enforce a "with metadata always" rule if they are able to, or skip it if they are not). This was discussed in github issues; now I suggest that the whole wallet/signer community comment on this part.
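As a hypothetical sketch of the idea (names and layout here are mine, not a settled format): the mode flag travels with the transaction, while the metadata hash itself is only mixed into the bytes being signed, so the on-chain size barely changes:

```rust
// Hypothetical sketch of the signing-input extension. A one-byte mode flag
// is stored with the extrinsic; when it is set, the 32-byte metadata root
// hash is appended to the bytes being signed. The hash itself never travels
// on chain, so stored transaction size changes only by the flag.

/// One byte stored with the transaction.
enum MetadataHashMode {
    Disabled, // signed exactly as before (legacy)
    Enabled,  // signer committed to a metadata hash
}

fn signing_payload(
    call_data: &[u8],
    extra: &[u8],
    mode: MetadataHashMode,
    metadata_hash: Option<[u8; 32]>,
) -> Vec<u8> {
    let mut payload = Vec::new();
    payload.extend_from_slice(call_data);
    payload.extend_from_slice(extra);
    // Only the new mode commits to the metadata; verification on chain
    // would reconstruct the same bytes from its own known metadata hash.
    if let (MetadataHashMode::Enabled, Some(hash)) = (mode, metadata_hash) {
        payload.extend_from_slice(&hash);
    }
    payload
}
```

This is exactly the property cold signers want: a device that can check metadata refuses to sign in Disabled mode, while one that cannot simply keeps signing as before.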

Reduce transferred metadata size

This is where things get tricky - and it is what would mostly be addressed within the scope of this proposal. For small embedded devices like Kampela (64 kbit of internal RAM) and Ledger, shortening the metadata is essential. Only a small part of the metadata is really needed for decoding. The proposal by the Substrate team was to represent metadata hashes as a Merkle tree and use it to securely and scalably generate the metadata validity proof on the cold device.
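A minimal sketch of the merkleization idea follows; the chunking rule and the hash function (blake3 here, where Substrate convention would more likely suggest blake2) are assumptions - settling them is precisely part of the proposed work:

```rust
// Sketch: hash metadata chunks into leaves and fold pairs up to a single
// root the signer can commit to. Hash function and odd-node convention
// are illustrative assumptions, not the final protocol.
fn merkle_root(leaves: &[Vec<u8>]) -> [u8; 32] {
    let mut layer: Vec<[u8; 32]> = leaves
        .iter()
        .map(|leaf| *blake3::hash(leaf).as_bytes())
        .collect();
    if layer.is_empty() {
        return [0u8; 32];
    }
    while layer.len() > 1 {
        layer = layer
            .chunks(2)
            .map(|pair| {
                let mut hasher = blake3::Hasher::new();
                hasher.update(&pair[0]);
                // An odd node is paired with itself (one possible convention).
                hasher.update(pair.get(1).unwrap_or(&pair[0]));
                *hasher.finalize().as_bytes()
            })
            .collect();
    }
    layer[0]
}
```

With such a root committed to in the signature, the cold device only needs the few chunks relevant to the call at hand, plus the sibling hashes along the path to the root; that is what keeps both the transfer and the RAM footprint small.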

For this, metadata should be abbreviated in some deterministic manner, similar to what the native retain() operation from scale-info does, but harsher (a sketch of the pruning idea follows the list below). This procedure should have the following properties:

  1. Be deterministic across implementations and platforms
  2. Result in easily Merkleizable data
  3. Be associative - we should be able to reduce the metadata sequentially for optimized caching strategies, or go directly to the minimal form
  4. Result in minimal metadata size and require minimal RAM for its handling
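As a self-contained illustration of the pruning step, here is the reachability idea over a simplified stand-in registry; the real implementation would operate on scale-info's portable type registry, as retain() does:

```rust
use std::collections::{BTreeMap, BTreeSet};

// Simplified stand-in for a portable type registry: each type id points at
// the ids it references (fields, type parameters, etc.). Illustrative only.
type Registry = BTreeMap<u32, Vec<u32>>;

/// Keep only the types reachable from `roots` (e.g. the call being decoded).
fn prune(registry: &Registry, roots: &[u32]) -> Registry {
    let mut keep: BTreeSet<u32> = BTreeSet::new();
    let mut stack: Vec<u32> = roots.to_vec();
    // Depth-first walk of the type dependency graph.
    while let Some(id) = stack.pop() {
        if keep.insert(id) {
            if let Some(deps) = registry.get(&id) {
                stack.extend(deps.iter().copied());
            }
        }
    }
    // BTreeMap iteration order makes the output deterministic (property 1).
    registry
        .iter()
        .filter(|(id, _)| keep.contains(*id))
        .map(|(id, deps)| (*id, deps.clone()))
        .collect()
}
```

Because pruning to a larger root set and then to a smaller one gives the same result as pruning to the smaller set directly, a sketch like this also hints at how property 3 (associativity) can be satisfied.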

Course of actions

I propose coordinated joint action across the ecosystem to define and adopt the new standard and set of tools. This would involve the following actors:

  1. Research/coordination group (us, Alzymologist Oy)
  2. Substrate core developers
  3. Client developers
    3.1. Including polkadot-js ecosystem
  4. Offline signer developers (including us, Kampela developers, Alzymologist Oy)

With the following roles.

Research team

We would develop, optimize, characterize, and define the shortening protocol and deliver its reference implementation in Rust, with support for deeply no-std systems on the receiving side and at least wasm on the sending side.

Substrate core developers

They would implement metadata inclusion in the signature, based on the results of the research.

Client developers

We expect client developers to raise their voices to propose the requirements they have for this protocol, and to work in conjunction with the rest of the actors listed here. Together they would prepare a transition plan (it could be done in stages) and implement it. Some users would probably keep using the legacy scheme indefinitely, unless that is made impossible by the design we come up with (which would probably be the case with the metadata V15 release approaching). It is essential, though, that several client systems adopt this transition: even now, for example, there is incomplete support for the fully-featured Signer app across the ecosystem, which significantly hinders adoption.

Offline signer developers

These actors should work closely with the research team to confirm the compatibility of the standard with the specifics of their hardware environment, and then attempt to implement the new protocol while disallowing the legacy standard altogether.

Preliminary work

We've done some minimal analysis on the day of publishing this, trying to reliably shorten metadata using basic tools. The first attempt resulted in a reduction from about 200 kB to about 8 kB on a somewhat outdated snapshot of metadata - a modern result should look even more impressive, as while the metadata has grown significantly, the sizes of the individual call-related data substructures have changed little.

Resources

Here I list the envisioned resources for the completion of this project. Actors are welcome to add their input in the comments; I'll be editing this publication as we get more information.

From the Alzymologist team's side, I envision that a pair of good Rust developers/researchers would require 1-2 months to perform this analysis thoroughly. There might be a need for a community manager if the community proves too inert; let's measure this in this discussion thread.

Then we could make a multisignature proposal from several teams to work on this, or several small Gov2 initiatives to address the project in parts; the latter is probably better for simpler coordination.

I will be modifying this post to include all relevant inputs. I'll let it sit here for at least a week to refine the plan.
