Ink! analyzer: 🎉 300+ unique VS Code extension installs, preparing for ink! v5, moving to retroactive treasury funding and gauging interest in a FRAME equivalent

Hello from David Semakula, the creator of ink! analyzer.
I’m here with a few updates on adoption, short-term development plans, and future funding plans.

Adoption

The ink! analyzer VS Code extension recently hit an adoption milestone by reaching 300+ unique installs from the VS Code Marketplace :tada:.

I’d like to extend a big thank you to both the Web3 Foundation and the ink! lang team for sharing the project through their respective X/Twitter accounts.
And an extra thank you to the Web3 Foundation for funding all of ink! analyzer’s previous development with 2 generous grants, as well as other non-development-related support.

What’s ink! analyzer?

ink! analyzer is a collection of modular and reusable libraries and tools for semantic analysis of ink! smart contracts.

In short, ink! analyzer is to ink!, what rust-analyzer is to Rust.

More technically, rust-analyzer is a Rust compiler frontend/semantic analyzer for IDEs that’s built on a lossless and resilient parser (this part is very important for IDE tooling because the program is essentially perpetually incorrect as you type in an IDE/editor).
So rust-analyzer “understands” core Rust language concepts (e.g. that a trait implementation like impl MyTrait for MyStruct {} must define all the required associated items of the trait), and provides intelligent editing features to help you with those kinds of core Rust language features.
However, when it comes to Rust syntax extensions/DSLs like ink!'s attribute macros, when you annotate a mod item with #[ink::contract], all rust-analyzer “knows” is that this is a custom attribute. It doesn’t “know” that an ink! contract mod must have exactly one struct item annotated with #[ink(storage)], or any of the other semantic rules for ink! contracts. This is where ink! analyzer comes in :slightly_smiling_face:.
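To make that kind of rule concrete, here's a toy sketch (purely illustrative; not ink! analyzer's actual code, and the type names are made up) of what a check for the "exactly one #[ink(storage)] struct" rule could look like:

```rust
// Toy model of an ink! semantic rule: a contract `mod` must contain
// exactly one struct annotated with `#[ink(storage)]`.
// (Hypothetical simplified item model, for illustration only.)

#[derive(Debug)]
enum Item {
    StorageStruct(&'static str), // a struct annotated with #[ink(storage)]
    Other(&'static str),         // any other item in the contract mod
}

fn check_storage_rule(items: &[Item]) -> Result<(), String> {
    let storage_count = items
        .iter()
        .filter(|i| matches!(i, Item::StorageStruct(_)))
        .count();
    match storage_count {
        1 => Ok(()),
        0 => Err("missing `#[ink(storage)]` struct".to_string()),
        n => Err(format!("expected exactly 1 `#[ink(storage)]` struct, found {n}")),
    }
}

fn main() {
    let ok = vec![Item::StorageStruct("MyContract"), Item::Other("impl block")];
    let missing = vec![Item::Other("impl block")];
    assert!(check_storage_rule(&ok).is_ok());
    assert!(check_storage_rule(&missing).is_err());
    println!("storage rule checks passed");
}
```

The real tool runs many such rules over a syntax tree rather than a flat item list, but the diagnostic shape is the same: a semantic rule either passes or produces an error message.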

For a deeper dive on the problems ink! analyzer solves and technical details about its architecture, check out this detailed introductory blog post.
You can also check out this awesome X/Twitter thread from the ink! lang team for a quick walkthrough of ink! analyzer features.

Funding

As mentioned before, ink! analyzer has been previously funded by two Web3 Foundation grants.
However, with the project now relatively mature and having some significant adoption, I’m looking to move to a retroactive treasury funding model for future updates.

Short term plans

The first features that I’ll try to fund with this new model will be updates related to supporting new ink! v5 syntax and features that should be shipping (from the ink! lang side) in early March. These updates are already underway (from the ink! analyzer side), and should start showing up in all components (including the VS Code extension) some time next week.

What about FRAME?

Like ink!, FRAME is also a Rust DSL (Domain Specific Language), so from a technical perspective, it makes sense to bring similar features and tools to FRAME/Substrate developers as well.
The plan is to do just that, also using a retroactive treasury funding model for that development.

However, since treasury funding is not just a technical but also a political process IMO, I’d like to use this opportunity to also informally “check the pulse” on interest from FRAME/Substrate developers.
I already know that there’s some interest, based on this reply from @kianenigma, but it’d be great to gauge interest from others as well :slightly_smiling_face:.

Lastly, as always, issues, bug reports, PRs and feature requests are welcome at the respective GitHub repositories :slightly_smiling_face:.

(sorry for the late reply, but better late than never :slight_smile:)

I am generally still in favor of a similar tool for FRAME, but I do want to suggest querying FRAME experts to provide feedback before getting started. So, a process like this:

  • You provide a gist of the rules and lints that you have in mind
  • Discussion and review period on the above, ideally with fellowship and parachain teams chiming in
  • Implementation and funding.

To give you some starting ideas:

(sorry for the late reply, but better late than never :slight_smile:)

@kianenigma no problem - thanks for taking the time to review and respond :slight_smile:

I’ll split my response into two sections, one for “semantic analysis for improved IDE/editor support”, and another for “security-related static analysis”.

Semantic analysis for improved IDE/editor support

I am generally still in favor of a similar tool for FRAME, but I do want to suggest querying FRAME experts to provide feedback before getting started. So, a process like this:

  • You provide a gist of the rules and lints that you have in mind
  • Discussion and review period on the above, ideally with fellowship and parachain teams chiming in
  • Implementation and funding.

For “lints and rules”, the first/initial phase(s) will be pretty objective/uncontroversial and focus on “diagnostics”/“hard errors” (i.e. things that would not compile or things that FRAME itself would complain about).
More subjective “lints”/“soft warnings” would then be added in later phases, at which point I think more extensive discussions and reviews would be required.

In other words, for the first/initial phase(s), the semantic rules checked will be the same rules FRAME itself checks, but simply implemented using different semantic analysis infrastructure built on a lossless and resilient parser.
The analogy is how rust-analyzer runs similar diagnostic checks to rustc, but using different semantic analysis infrastructure that’s better suited to the IDE/editor environment (e.g. unlike a “batch” compiler, IDEs/editors can’t just quit and stop analysis when they hit errors or invalid syntax; they’re expected to keep providing intelligent editing features even in that case).
A “FRAME analyzer” would play a similar role for FRAME’s semantic rules and syntax extensions.
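The “lossless and resilient” property is the crux here, so here's a toy sketch of what it means (illustrative only; the real infrastructure is rust-analyzer’s ra_ap_syntax, not this code): every byte of the source, including invalid syntax, ends up in some token, so the tree can always reproduce the original text and analysis never has to bail out.

```rust
// Toy lossless + resilient tokenizer: unknown input becomes an `Error`
// token instead of aborting, and concatenating all token texts
// reproduces the source exactly. (Sketch only, not ra_ap_syntax.)

#[derive(Debug, PartialEq)]
enum Kind { Ident, Whitespace, Error }

fn tokenize(src: &str) -> Vec<(Kind, String)> {
    let mut tokens = Vec::new();
    let mut chars = src.chars().peekable();
    while let Some(&c) = chars.peek() {
        let kind = if c.is_alphanumeric() || c == '_' { Kind::Ident }
            else if c.is_whitespace() { Kind::Whitespace }
            else { Kind::Error }; // invalid input is kept, not dropped
        let mut text = String::new();
        while let Some(&c2) = chars.peek() {
            let same_class = match kind {
                Kind::Ident => c2.is_alphanumeric() || c2 == '_',
                Kind::Whitespace => c2.is_whitespace(),
                Kind::Error => !(c2.is_alphanumeric() || c2 == '_' || c2.is_whitespace()),
            };
            if !same_class { break; }
            text.push(c2);
            chars.next();
        }
        tokens.push((kind, text));
    }
    tokens
}

fn main() {
    let src = "fn broken ### mod";
    let tokens = tokenize(src);
    // Lossless: the token texts concatenate back to the exact input.
    let rebuilt: String = tokens.iter().map(|(_, t)| t.as_str()).collect();
    assert_eq!(rebuilt, src);
    // Resilient: the invalid `###` is an Error token; analysis continues.
    assert!(tokens.iter().any(|(k, _)| *k == Kind::Error));
    println!("lossless + resilient: ok");
}
```

A batch compiler can discard broken input and stop; an IDE analyzer has to keep the broken text around (to offer completions, highlights, and fixes on it), which is why the syntax tree must be lossless.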

To give you some starting ideas:

  • Probably many of the ink! rules and techniques are applicable to FRAME macros as well.
  • How do you detect syntax errors and report them? Ideally, your tool should be in sync with the actual FRAME semantics. The worst thing is if the analyzer says code is wrong, but it actually compiles :slight_smile: How does this work for ink!?

So yes, we’ll use a similar methodology to the one used in ink! analyzer when implementing FRAME semantic rules (i.e. the FRAME implementation will be used as a reference), so “FRAME analyzer” and FRAME should indeed be “in sync”. Similar to ink! analyzer, “FRAME analyzer” will analyze an IR (Intermediate Representation) built from a lossless syntax tree from rust-analyzer’s ra_ap_syntax library, instead of a syn syntax tree, but it will apply essentially the same semantic rules as FRAME.

Very useful, thanks for sharing - will definitely be reviewing.

  • Your extension can handle things like:
    • generate a shell pallet
    • add storage item called X

In LSP (Language Server Protocol) speak, these would be commands and code actions.

  • “generate a shell pallet” would be similar to ink! analyzer’s “New Project” command
  • “add storage item called X” would be similar to ink! analyzer’s code actions
    (e.g. for adding ink! storage and event structs, ink! constructor and message fns, etc.).

In cases where code actions can fix a diagnostic error, they’re automatically suggested as “quickfixes”. An example for FRAME would be a quickfix for adding a #[pallet::pallet] annotated struct to a #[frame_support::pallet] annotated mod. This would also be similar to ink! analyzer’s quickfixes.
In general all diagnostics will include a quickfix if a reasonable one can be determined.
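To illustrate the quickfix idea, here's a toy sketch of the mechanics (the shapes are hypothetical simplifications; a real language server would use the LSP CodeAction/WorkspaceEdit types rather than these structs): a diagnostic for a #[frame_support::pallet] mod missing its #[pallet::pallet] struct carries a text edit that inserts one.

```rust
// Toy quickfix model: a diagnostic that also knows how to fix itself
// via a text edit. (Illustrative only; not a real LSP implementation.)

struct TextEdit {
    offset: usize,    // byte offset at which to insert
    new_text: String, // text to insert
}

fn apply(source: &str, edit: &TextEdit) -> String {
    let mut out = String::with_capacity(source.len() + edit.new_text.len());
    out.push_str(&source[..edit.offset]);
    out.push_str(&edit.new_text);
    out.push_str(&source[edit.offset..]);
    out
}

// If the pallet mod has no `#[pallet::pallet]` struct, offer an edit
// that inserts one right after the module's opening brace.
fn missing_pallet_struct_quickfix(source: &str) -> Option<TextEdit> {
    if source.contains("#[pallet::pallet]") {
        return None; // nothing to fix
    }
    let offset = source.find('{')? + 1;
    Some(TextEdit {
        offset,
        new_text: "\n    #[pallet::pallet]\n    pub struct Pallet<T>(_);\n".into(),
    })
}

fn main() {
    let src = "#[frame_support::pallet]\npub mod pallet {\n}\n";
    let edit = missing_pallet_struct_quickfix(src).expect("diagnostic expected");
    let fixed = apply(src, &edit);
    assert!(fixed.contains("#[pallet::pallet]"));
    // Applying the quickfix resolves the diagnostic.
    assert!(missing_pallet_struct_quickfix(&fixed).is_none());
    println!("quickfix applied");
}
```

In the editor, this surfaces as the familiar lightbulb: the diagnostic underlines the mod, and accepting the quickfix applies the text edit.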

So interestingly, this one shouldn’t be an issue for the analyzer because of the infrastructure differences. I’ll definitely keep an eye on it though.

  • You should start with pallet macro, but there is also the construct_runtime as a next step.

Makes sense. In the end, the idea is to be exhaustive, but I’ll definitely prioritize the most frequently used macros at the beginning.

Security-related static analysis

I’m actually already working on a related project for this.
At the moment, it’s focused only on detecting potential panics and unsafe arithmetic (including those arising from reachable code in dependencies) using abstract interpretation, but the plan is definitely to expand to more vulnerability classes, so more suggestions are welcome.

However, it will be a CLI tool and essentially a Rust compiler plugin (invoked via a cargo subcommand), primarily analyzing MIR.
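For readers unfamiliar with abstract interpretation, here's a toy sketch of the core idea behind such overflow checks (this is my illustrative simplification, not the tool's implementation): track each value as an interval [lo, hi] rather than a concrete number, and flag an addition whose abstract result can exceed the type's maximum.

```rust
// Toy interval-domain abstract interpretation for u8 addition:
// each value is tracked as a [lo, hi] range, and an addition is
// flagged when the result range can exceed u8::MAX.
// (Sketch only; a real analysis works on MIR and widens/wraps intervals.)

#[derive(Clone, Copy, Debug)]
struct Interval {
    lo: u64,
    hi: u64,
}

// Abstract `+` over u8: returns the abstract result and whether the
// concrete operation may overflow for some values in the input ranges.
fn abstract_add_u8(a: Interval, b: Interval) -> (Interval, bool) {
    let (lo, hi) = (a.lo + b.lo, a.hi + b.hi);
    (Interval { lo, hi }, hi > u8::MAX as u64)
}

fn main() {
    let small = Interval { lo: 0, hi: 100 };
    let user_input = Interval { lo: 0, hi: 255 }; // unconstrained u8

    let (_, may_overflow_safe) = abstract_add_u8(small, small); // max 200: fine
    let (_, may_overflow_risky) = abstract_add_u8(small, user_input); // max 355: flag

    assert!(!may_overflow_safe);
    assert!(may_overflow_risky);
    println!("overflow analysis: ok");
}
```

Analyzing MIR means the tool sees the code after monomorphization-adjacent lowering, which is what makes it possible to follow reachable code into dependencies.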

The work on detecting panics and unsafe arithmetic for this one is already funded, but I can’t say anything more about that for now :slight_smile:.
However, you can expect an initial release (and more details) in the next 2-3 weeks.

Some better hints / errors for common compile errors (like forgetting feature propagation) would be great.

I was just stuck for 15 minutes debugging a runtime that wouldn’t compile with --features try-runtime; it turns out I just forgot to propagate the feature for a new pallet I added to the runtime :man_facepalming:t3:
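For anyone who hasn't hit this yet, the gotcha looks roughly like this (sketch with a hypothetical `pallet-example` crate): adding a pallet to `[dependencies]` isn't enough; the runtime's own `std` and `try-runtime` features must also forward to the pallet's features, and forgetting the `try-runtime` line is the easy mistake.

```toml
# runtime/Cargo.toml (hypothetical pallet name, for illustration)
[dependencies]
pallet-example = { path = "../pallets/example", default-features = false }

[features]
std = ["pallet-example/std"]
# The line that's easy to forget when adding a new pallet:
try-runtime = ["pallet-example/try-runtime"]
```

Without that last line, `--features try-runtime` builds the runtime with the feature enabled but the pallet with it disabled, which fails with confusing errors.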

See GitHub - ggwpez/zepter: Analyze, Fix and Format features in your Rust workspace.
