Tauri Patterns for Production: Tauri 2 IPC: Commands, Events, State — The Three-Way Bridge
Video: Tauri 2 IPC: Commands, Events, State — The Three-Way Bridge | Ep2 by CelesteAI
Tauri 2 has three IPC primitives that, together, cover almost every desktop-app interaction. Commands for “JS asks, Rust answers.” Events for “Rust speaks, JS listens.” State for “data shared across both.” We build a small Pulse Monitor app that uses all three end to end.
If episode 1 was the project structure, episode 2 is the wiring. Tauri’s whole reason for existing is the bridge between a Rust process and a web frontend — and that bridge has a small, clean shape: three primitives, each doing one thing, composing into nearly everything you’ll need.
The three primitives at a glance
| Primitive | Direction | Use it for |
|---|---|---|
| Command | JS → Rust → JS | Forms, lookups, “do this thing and tell me what happened” |
| Event | Rust → JS (mainly) | Progress, telemetry, real-time data, change notifications |
| State | shared across commands | App-wide data that multiple commands read/mutate |
You’ll use commands the most. Events are for push, when JS shouldn’t have to poll. State is the glue — if commands are stateless functions, state is the database every command implicitly reads.
What we’ll build: Pulse Monitor
A small heartbeat app. A single dial that ticks once per second when running. Three buttons: Start, Stop, Reset.
It exercises all three primitives:
- State: a Mutex<TickerState> holding running: bool, count: u64, started_at: Option<u64>.
- Commands: start_pulse, stop_pulse, reset_pulse — read or mutate the state.
- Events: a background thread emits pulse with the current count once per second while running. The React frontend listens for pulse and updates the dial.
Step 1: Scaffold
We’ll go through the setup again, this time start to finish on the CLI, with no interactive prompts:
pnpm create tauri-app pulse-monitor \
  --manager pnpm \
  --template react-ts \
  --identifier com.codegiz.pulsemonitor \
  --tauri-version 2 \
  -y
cd pulse-monitor
pnpm install
pnpm tauri --version
# tauri-cli 2.x.x
The -y flag skips every interactive prompt and uses defaults where applicable. --manager, --template, --identifier, and --tauri-version cover everything else. One command, zero prompts, fresh project ready.
Step 2: State — tauri::State<Mutex<T>>
Open src-tauri/src/lib.rs and add the state type:
use std::sync::Mutex;

struct TickerState {
    running: bool,
    count: u64,
    started_at: Option<u64>,
}
Then register it on the builder:
tauri::Builder::default()
    .manage(Mutex::new(TickerState {
        running: false,
        count: 0,
        started_at: None,
    }))
    .invoke_handler(/* ... */)
    .run(tauri::generate_context!())
    .expect("error while running tauri application");
That’s it. From this moment on, any command that takes state: State<Mutex<TickerState>> as a parameter will receive it automatically. Tauri injects it on every invocation.
A few things worth pointing out:
- Behind a Mutex. Commands can run on different threads. If your state is &mut-ed by anyone, you need a synchronization primitive. Mutex is the obvious default; RwLock is better if you have many readers and rare writers.
- One-shot registration. .manage() is called once on the builder. There’s no equivalent of “add this state later” — design the state up front. (For state that genuinely arrives later, wrap it in Mutex<Option<T>> and start with None.)
- No globals. You won’t see lazy_static! or OnceCell in idiomatic Tauri code. The .manage() + State<T> pattern replaces all of that.
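The “arrives later” workaround from the list above can be sketched with plain std types. DbHandle and the URL here are hypothetical stand-ins for whatever your app would actually manage; in a real app the Mutex<Option<...>> would be registered with .manage() and the two functions would be commands:

```rust
use std::sync::Mutex;

// Hypothetical payload that only becomes available after startup.
struct DbHandle {
    url: String,
}

// Stand-in for filling managed state that was registered as Mutex::new(None).
fn init_later(slot: &Mutex<Option<DbHandle>>, url: &str) {
    *slot.lock().unwrap() = Some(DbHandle { url: url.to_string() });
}

// Commands see None until the value arrives, then Some(...) afterwards.
fn read(slot: &Mutex<Option<DbHandle>>) -> Option<String> {
    slot.lock().unwrap().as_ref().map(|db| db.url.clone())
}

fn main() {
    let slot: Mutex<Option<DbHandle>> = Mutex::new(None);
    assert_eq!(read(&slot), None); // before init: nothing there yet
    init_later(&slot, "postgres://localhost");
    assert_eq!(read(&slot), Some("postgres://localhost".to_string()));
}
```

The Option layer is the price of late initialization: every reader has to decide what “not there yet” means for it.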
Step 3: Commands — #[tauri::command]
Three commands that read or mutate state:
use tauri::State;
use std::time::{SystemTime, UNIX_EPOCH};

// Seconds since the Unix epoch; used to record started_at.
fn now() -> u64 {
    SystemTime::now().duration_since(UNIX_EPOCH).unwrap().as_secs()
}

#[tauri::command]
fn start_pulse(state: State<Mutex<TickerState>>) {
    let mut s = state.lock().unwrap();
    if !s.running {
        s.running = true;
        if s.started_at.is_none() {
            s.started_at = Some(now());
        }
    }
}

#[tauri::command]
fn stop_pulse(state: State<Mutex<TickerState>>) {
    state.lock().unwrap().running = false;
}

#[tauri::command]
fn reset_pulse(state: State<Mutex<TickerState>>) {
    let mut s = state.lock().unwrap();
    s.running = false;
    s.count = 0;
    s.started_at = None;
}
Then register them:
.invoke_handler(tauri::generate_handler![start_pulse, stop_pulse, reset_pulse])
The generate_handler! macro expands at compile time into a routing table, but the registration list itself isn’t checked: forget to add a command and calling invoke() for it from JS fails at runtime, not at compile time. CI catches it the first time you run integration tests.
A note on state.lock().unwrap(): this is the panic-on-poison default. In production you’d handle the PoisonError, but for short-running commands the default is fine — a poisoned mutex means the app is already broken.
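If you do want to survive a poisoned lock instead of panicking, std’s PoisonError::into_inner hands back the guard and lets you keep working with the data. A minimal std-only sketch, no Tauri types involved:

```rust
use std::sync::{Mutex, PoisonError};
use std::thread;

// Recover the guard from a poisoned mutex instead of panicking.
// into_inner() unwraps the PoisonError and returns the MutexGuard inside it.
fn increment(counter: &Mutex<u64>) -> u64 {
    let mut guard = counter.lock().unwrap_or_else(PoisonError::into_inner);
    *guard += 1;
    *guard
}

fn main() {
    static COUNTER: Mutex<u64> = Mutex::new(0);
    // Poison the mutex by panicking while the lock is held.
    let _ = thread::spawn(|| {
        let _guard = COUNTER.lock().unwrap();
        panic!("poison the lock");
    })
    .join();
    assert!(COUNTER.lock().is_err()); // lock() now reports the poison...
    assert_eq!(increment(&COUNTER), 1); // ...but the data is still usable
}
```

Whether recovering is the right call depends on the invariant: if the panicking thread left the data half-updated, carrying on can be worse than crashing.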
Step 4: Events — app_handle.emit(...) from a background thread
Inside the builder, we add a .setup() block. It runs once at startup and is the right place for background work:
use tauri::{Emitter, Manager};

.setup(|app| {
    let app_handle = app.handle().clone();
    std::thread::spawn(move || loop {
        std::thread::sleep(std::time::Duration::from_secs(1));
        let state: State<Mutex<TickerState>> = app_handle.state();
        let pulse = {
            let mut s = state.lock().unwrap();
            if !s.running { continue; }
            s.count += 1;
            Pulse {
                count: s.count,
                elapsed_secs: s.started_at.map(|t| now().saturating_sub(t)).unwrap_or(0),
            }
        };
        let _ = app_handle.emit("pulse", pulse);
    });
    Ok(())
})
The pattern is:
1. Clone the AppHandle before moving it into the thread (the move closure must own everything it captures).
2. From the thread, use .state() to read managed state and .emit("name", payload) to push events.
3. Drop the lock guard before emitting — emit doesn’t need the lock, and holding it across the call risks deadlocks if the emit path ever tries to read state itself.
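The guard-scoping rule in step 3 can be demonstrated with std alone. In this sketch an mpsc channel stands in for app_handle.emit, and the payload is computed inside a narrow block so the MutexGuard drops before the send:

```rust
use std::sync::{mpsc, Mutex};

struct Ticker {
    running: bool,
    count: u64,
}

// Stand-in for the emit loop body: compute under the lock, send outside it.
fn tick(state: &Mutex<Ticker>, tx: &mpsc::Sender<u64>) {
    // The guard lives only inside this block...
    let payload = {
        let mut s = state.lock().unwrap();
        if !s.running {
            return;
        }
        s.count += 1;
        s.count
    }; // ...and is dropped here, before the send below.
    let _ = tx.send(payload);
}

fn main() {
    let state = Mutex::new(Ticker { running: true, count: 0 });
    let (tx, rx) = mpsc::channel();
    tick(&state, &tx);
    tick(&state, &tx);
    assert_eq!(rx.try_recv(), Ok(1));
    assert_eq!(rx.try_recv(), Ok(2));
}
```

If the send (or emit) path ever needed the same lock, this scoping is the difference between a working loop and a deadlock.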
The Pulse struct just needs to be Serialize + Clone:
use serde::Serialize;

#[derive(Serialize, Clone)]
struct Pulse {
    count: u64,
    elapsed_secs: u64,
}
Step 5: The frontend — invoke() + listen()
Open src/App.tsx. Two new imports:
import { useState, useEffect } from "react";
import { invoke } from "@tauri-apps/api/core";
import { listen } from "@tauri-apps/api/event";
A type for the payload (mirrors the Rust Pulse):
interface Pulse {
  count: number;
  elapsed_secs: number;
}
Subscribe in useEffect. The cleanup is mandatory:
useEffect(() => {
  const unlisten = listen<Pulse>("pulse", (e) => setPulse(e.payload));
  return () => { unlisten.then((fn) => fn()); };
}, []);
listen returns a Promise<UnlistenFn> — not the unlisten function directly. You await (or .then) to get the function, then call it. In a React effect, the cleanup chains the promise: when the component unmounts, the promise resolves to the unlisten function, which then runs.
Forget this and you leak subscriptions on every hot reload in dev and on every component unmount in prod. Old listeners pile up. Setting state on unmounted components becomes a source of warnings and zombie updates.
The button handlers are plain invoke calls:
const start = async () => { await invoke("start_pulse"); setRunning(true); };
const stop = async () => { await invoke("stop_pulse"); setRunning(false); };
const reset = async () => { await invoke("reset_pulse"); setPulse(null); setRunning(false); };
invoke returns a promise that resolves to whatever the Rust command returned. Our commands return () (unit), so we just await for completion.
Recap
Three primitives, three responsibilities:
- State — shared mutable data registered with .manage(Mutex::new(...)), injected into commands as State<Mutex<T>>.
- Commands — #[tauri::command] functions registered via tauri::generate_handler![...]. JS calls them with await invoke("name", { args }).
- Events — app_handle.emit("name", payload) from Rust; listen<T>("name", cb) on the JS side, with disciplined cleanup.
Almost every desktop-app interaction is a combination of these three. Long-running task with progress = command starts it, events report progress, state holds the cancel flag. Form submission = command. Real-time dashboard = events. App-wide settings = state.
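That long-running-task recipe can be sketched with std stand-ins: an Arc<AtomicBool> plays the managed cancel flag, and a channel plays the event stream. run_task and its names are illustrative, not Tauri API:

```rust
use std::sync::atomic::{AtomicBool, Ordering};
use std::sync::{mpsc, Arc};
use std::thread;

// "Command starts it, events report progress, state holds the cancel flag":
// the command would spawn this, the flag would live in managed state, and
// the channel sends would be app_handle.emit calls.
fn run_task(cancel: Arc<AtomicBool>, progress: mpsc::Sender<u8>) -> thread::JoinHandle<bool> {
    thread::spawn(move || {
        for pct in (10..=100).step_by(10) {
            if cancel.load(Ordering::Relaxed) {
                return false; // a stop command flipped the flag
            }
            let _ = progress.send(pct as u8); // "emit" a progress event
        }
        true // ran to completion
    })
}

fn main() {
    let cancel = Arc::new(AtomicBool::new(false));
    let (tx, rx) = mpsc::channel();
    let completed = run_task(cancel.clone(), tx).join().unwrap();
    assert!(completed);
    assert_eq!(rx.iter().count(), 10); // one progress event per step
}
```

The stop command in this scheme is one line: store true into the flag. The worker notices on its next iteration, which is why coarse polling of an AtomicBool is usually enough for cancellation.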
Next episode: plugins. The official ones (fs, dialog, http, sql, notification, updater) — what they do, when to add them, what permissions they grant.
This channel is run by Claude AI. Tutorials AI-produced; reviewed and published by Codegiz.