Figma’s canvas engine runs in your browser at desktop speed. Google Earth loads a 3D globe without a plugin. AutoCAD moved from a desktop app to a website.
They all use WebAssembly.
What is WebAssembly?
WebAssembly — Wasm — is a binary format that browsers can run directly. It is not a programming language. It is a compile target.
You write code in C++, Rust, Go, or another language. You compile it to a .wasm binary. The browser loads that binary and runs it at near-native speed.
Before Wasm, JavaScript was the only language browsers could run natively. JavaScript is fast for many tasks. But for heavy computation — image processing, 3D rendering, encryption, physics simulations — it struggles. The JIT compiler can only do so much.
Wasm changes that.
How fast is it?
In benchmarks, WebAssembly typically runs at around 95% of native speed, while JavaScript lands anywhere from 10% to 50% depending on the task.
For compute-heavy work, benchmarks commonly show Wasm running 5x to 26x faster than optimized JavaScript. Image processing, cryptographic hashing, audio processing — tasks that used to require a native app now run in the browser with almost no performance cost.
The speed comes from the binary format itself. Wasm arrives pre-compiled: there is no source text to parse and no JIT warm-up or deoptimization. The browser validates the binary and translates it straight to machine code, often while the file is still downloading.
How it works
There are three steps:
1. Write in any language
// Rust example
#[no_mangle]
pub fn add(a: i32, b: i32) -> i32 {
    a + b
}
2. Compile to .wasm
# Rust → Wasm (a plain #[no_mangle] export needs no glue code)
cargo build --target wasm32-unknown-unknown --release
3. Load in the browser
const { instance } = await WebAssembly.instantiateStreaming(
fetch("module.wasm")
);
console.log(instance.exports.add(1, 2)); // 3
The browser loads the .wasm file and executes it in a sandboxed environment. It has no access to the DOM or browser APIs directly — it must call JavaScript for that. But for pure computation, it runs at near-native speed without touching JavaScript at all.
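To make the loading step concrete without installing any toolchain, here is a sketch that instantiates a hand-assembled module exporting the same `add` function as the Rust example. The byte array is purely illustrative (a compiler emits these bytes for you); `WebAssembly.Module` and `WebAssembly.Instance` are the standard synchronous counterparts of the streaming call shown above:

```javascript
// A complete Wasm module, written out byte by byte:
// magic number, version, then type / function / export / code sections.
// It exports one function: add(a: i32, b: i32) -> i32.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                               // "\0asm" magic
  0x01, 0x00, 0x00, 0x00,                               // binary format version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function, using type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

// Synchronous compile + instantiate; fine for tiny modules like this one.
const module = new WebAssembly.Module(bytes);
const instance = new WebAssembly.Instance(module);

console.log(instance.exports.add(1, 2)); // 3
```

In production you would fetch a compiled .wasm file and prefer `instantiateStreaming`, which compiles while the file downloads; the synchronous constructors are handy for tiny modules and tests.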
Who uses it today?
Wasm is not experimental. It is in production at scale.
- Figma — the design tool uses Wasm for its entire canvas rendering engine. This is what makes Figma fast in a browser tab.
- Google Earth — the web version runs on Wasm. It renders a full 3D globe with satellite imagery.
- AutoCAD — Autodesk ported their desktop CAD software to the web using Wasm. Decades of C++ code, running in a browser.
- TensorFlow.js — uses Wasm for inference acceleration. AI models run faster in the browser when Wasm handles the math.
Wasm leaves the browser
The biggest shift in 2026 is that Wasm is no longer just a browser technology.
WASI — the WebAssembly System Interface — lets Wasm modules run on servers, edge networks, and IoT devices. WASI gives Wasm access to files, sockets, and environment variables in a safe, sandboxed way.
This makes Wasm interesting as a server-side runtime:
- Zero cold start — Wasm modules start in microseconds, not seconds like containers
- 1/10th the memory of a Node.js process
- 60–80% lower latency compared to traditional containers
- Single binary — one .wasm file runs on any WASI-compatible runtime
Cloudflare Workers, Fastly Compute, Vercel Edge, and Akamai all run Wasm at the edge today. In 2025, Akamai acquired Fermyon — a Wasm-first cloud platform — specifically to bring Wasm to their edge network.
Supported languages
WebAssembly is language-agnostic. As of 2026, these languages compile to Wasm:
| Language | Toolchain |
|---|---|
| Rust | wasm-pack, wasm32-unknown-unknown target |
| C / C++ | Emscripten |
| Go | Built-in GOARCH=wasm support (GOOS=js or GOOS=wasip1) |
| Python | Pyodide |
| TypeScript/JavaScript | Javy (Shopify) |
| Kotlin | Kotlin/Wasm (stable since 2024) |
Rust has the best Wasm story in 2026. Small binary sizes, no garbage collector, excellent tooling.
Wasm vs JavaScript — when to use which
Wasm does not replace JavaScript. They work together.
Use JavaScript for:
- DOM manipulation
- Event handling
- API calls
- Application logic
Use Wasm for:
- Image and video processing
- Cryptography
- Physics simulations
- Audio processing
- AI inference
- Porting existing native codebases
The boundary between JavaScript and Wasm has a cost. Numbers cross cheaply, but complex data like strings and arrays must be copied on every call. If you call Wasm thousands of times per frame, the overhead adds up. Keep hot code inside Wasm and minimize the number of crossings.
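One way to see the pattern: the sketch below hand-assembles a module exporting both an `add` function and a `sum` loop (the bytes are illustrative; a compiler would normally emit them). Summing via repeated `add` calls crosses the JS/Wasm boundary once per iteration, while `sum(n)` crosses once and keeps the whole loop inside Wasm:

```javascript
// Hand-assembled module exporting add(a, b) and sum(n) (sum of 1..n, looped in Wasm).
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,  // header
  0x01, 0x0c, 0x02,                                // type section: 2 types
  0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f,              //   (i32, i32) -> i32
  0x60, 0x01, 0x7f, 0x01, 0x7f,                    //   (i32) -> i32
  0x03, 0x03, 0x02, 0x00, 0x01,                    // two functions
  0x07, 0x0d, 0x02,                                // exports: "add", "sum"
  0x03, 0x61, 0x64, 0x64, 0x00, 0x00,
  0x03, 0x73, 0x75, 0x6d, 0x00, 0x01,
  0x0a, 0x2b, 0x02,                                // code section: 2 bodies
  0x07, 0x00, 0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,  //   add: return a + b
  0x21,                                            //   sum body: 33 bytes
  0x01, 0x01, 0x7f,                                //   one i32 local: acc
  0x02, 0x40, 0x03, 0x40,                          //   block { loop {
  0x20, 0x00, 0x45, 0x0d, 0x01,                    //     break if n == 0
  0x20, 0x01, 0x20, 0x00, 0x6a, 0x21, 0x01,        //     acc = acc + n
  0x20, 0x00, 0x41, 0x01, 0x6b, 0x21, 0x00,        //     n = n - 1
  0x0c, 0x00,                                      //     continue loop
  0x0b, 0x0b,                                      //   } }
  0x20, 0x01, 0x0b,                                //   return acc
]);

const { add, sum } = new WebAssembly.Instance(new WebAssembly.Module(bytes)).exports;

const n = 50_000;

// Chatty: one JS-to-Wasm crossing per iteration.
let chatty = 0;
for (let i = 1; i <= n; i++) chatty = add(chatty, i);

// Batched: a single crossing; the loop runs inside Wasm.
const batched = sum(n);

console.log(chatty === batched); // true
```

Both produce the same value; the batched call is the shape to aim for in hot paths, since per-call overhead and any argument copying happen only once instead of once per element.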
Quick start
Try Wasm with Rust in 5 minutes.
Install the toolchain:
curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh
Create a project:
cargo new --lib hello-wasm
cd hello-wasm
Add to Cargo.toml:
[lib]
crate-type = ["cdylib"]
[dependencies]
wasm-bindgen = "0.2"
Write the Wasm function in src/lib.rs:
use wasm_bindgen::prelude::*;

#[wasm_bindgen]
pub fn greet(name: &str) -> String {
    format!("Hello, {}!", name)
}
Build:
wasm-pack build --target web
This generates a .wasm binary and JavaScript glue code. For complex types like strings, wasm-pack handles the conversion automatically through wasm-bindgen. You can import the result directly into any web project.
Why it matters in 2026
WebAssembly solves a real problem. Applications that used to require native installs now run in a browser tab with near-native performance. Code written in any language can run anywhere — browsers, servers, edge networks, IoT devices.
It is a W3C standard. Every major browser supports it. The ecosystem is mature. WASI Preview 2 is the stable foundation as of 2026. WASI 0.3 — which adds async support and the Component Model — is in release candidate, expected to finalize by late 2026.
The Docker slogan was “build once, run anywhere.” Wasm delivers that promise with smaller binaries, faster starts, and stronger isolation.
What’s Next?
- Rust Tutorial — learn Rust, the best language for Wasm
- WebAssembly with Rust cheat sheet — WASM-specific Rust guide
- What is Docker? — containers vs Wasm tradeoffs explained