When we started building Voidcom, the first decision wasn’t what features to build — it was what technology to build them with. That choice would define everything: performance, reliability, developer velocity, and ultimately, the experience you get as a user.
Why Rust for the server
Most communication platforms run their servers on Go, Java, or Node.js. We chose Rust, and here’s why:
Memory safety without garbage collection. Rust’s ownership model eliminates entire categories of bugs — use-after-free, data races, null pointer dereferences — at compile time. For a server handling thousands of concurrent voice and text connections, this matters enormously. No GC pauses. No unexpected latency spikes.
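To make that concrete, here is a minimal sketch (with made-up names, not our real presence code) of the pattern the compiler enforces: shared mutable state has to be wrapped before multiple threads can touch it, so an unsynchronized access simply does not compile.

```rust
// Minimal sketch of compiler-enforced thread safety. The Presence struct is
// illustrative, not Voidcom's real presence tracking.

use std::sync::{Arc, Mutex};

#[derive(Default)]
struct Presence {
    online: u64,
}

fn main() {
    // Arc provides shared ownership, Mutex provides synchronized access.
    // Remove either one and this no longer compiles once the threads share it.
    let presence = Arc::new(Mutex::new(Presence::default()));

    let handles: Vec<_> = (0..4)
        .map(|_| {
            let presence = Arc::clone(&presence);
            std::thread::spawn(move || {
                presence.lock().unwrap().online += 1;
            })
        })
        .collect();

    for handle in handles {
        handle.join().unwrap();
    }

    assert_eq!(presence.lock().unwrap().online, 4);
}
```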
Zero-cost abstractions. Tokio, the de facto standard async runtime for Rust, lets us run millions of concurrent tasks without the overhead of OS threads. Our gRPC signaling (via Tonic) and QUIC voice transport (via Quinn) run on the same efficient async foundation.
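As a rough illustration of why those tasks are so cheap, here is a toy Tokio program (the handler is a stand-in, not our real connection code) that spawns a hundred thousand concurrent tasks; the same number of OS threads would be impractical.

```rust
// Toy illustration of Tokio's lightweight tasks. `handle_connection` is a
// placeholder for real per-client work, not Voidcom's actual handler.

use tokio::time::{sleep, Duration};

async fn handle_connection(id: u64) {
    // Awaiting yields the worker thread back to the runtime, so one thread
    // can interleave thousands of these.
    sleep(Duration::from_millis(10)).await;
    let _ = id;
}

#[tokio::main]
async fn main() {
    // Each spawned task is a small heap allocation, not an OS thread.
    let handles: Vec<_> = (0..100_000u64)
        .map(|id| tokio::spawn(handle_connection(id)))
        .collect();

    for handle in handles {
        handle.await.unwrap();
    }
}
```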
The right tool for real-time. Voice is unforgiving. A few milliseconds of jitter and the audio quality degrades noticeably. Rust gives us the deterministic performance we need to keep voice latency at 20ms.
The server architecture
Our server isn’t a monolith — it’s a focused set of services:
- gRPC signaling (Tonic) handles authentication, channels, chat, friends, presence, roles, and moderation
- QUIC voice SFU (Quinn) forwards audio and video packets as UDP datagrams — no mixing, just selective forwarding for minimal latency (see the forwarding sketch after this list)
- PostgreSQL stores relational data (users, servers, roles)
- ScyllaDB handles time-series data (messages, reactions, call history)
- Valkey manages ephemeral state (cache, sessions, presence, rate limiting)
- NATS provides real-time pub/sub for gRPC streaming subscriptions
- Typesense powers full-text search
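To show what "selective forwarding" means in practice, here is a rough sketch of a per-sender forwarding loop built on Quinn datagrams. The function and the `peers` list are hypothetical; channel membership and session management are omitted.

```rust
// Hypothetical sketch of the SFU's per-sender loop: receive an encrypted
// datagram and fan it out to the other participants without ever decrypting it.

async fn forward_voice(
    from: quinn::Connection,
    peers: Vec<quinn::Connection>, // everyone else in the same voice channel
) -> Result<(), quinn::ConnectionError> {
    loop {
        // The payload stays opaque ciphertext from the SFU's point of view.
        let packet = from.read_datagram().await?;
        for peer in &peers {
            // Best-effort forwarding: a dropped voice packet is better than a late one.
            let _ = peer.send_datagram(packet.clone());
        }
    }
}
```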
Every backend is feature-flagged. You can run the server with just an in-memory store for development, or spin up the full infrastructure stack for production.
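A minimal sketch of what that pluggability might look like, assuming a hypothetical `MessageStore` trait (the names, methods, and feature are illustrative; the real storage layer is richer):

```rust
// Illustrative only: a storage trait with the in-memory backend used for
// development. Trait, type, and feature names are hypothetical.

use std::collections::HashMap;
use std::sync::Mutex;

trait MessageStore: Send + Sync {
    fn append(&self, channel: &str, body: &str);
    fn recent(&self, channel: &str, limit: usize) -> Vec<String>;
}

#[derive(Default)]
struct MemoryStore {
    messages: Mutex<HashMap<String, Vec<String>>>,
}

impl MessageStore for MemoryStore {
    fn append(&self, channel: &str, body: &str) {
        self.messages
            .lock()
            .unwrap()
            .entry(channel.to_string())
            .or_default()
            .push(body.to_string());
    }

    fn recent(&self, channel: &str, limit: usize) -> Vec<String> {
        let map = self.messages.lock().unwrap();
        map.get(channel)
            .map(|msgs| msgs.iter().rev().take(limit).cloned().collect())
            .unwrap_or_default()
    }
}

fn make_store() -> Box<dyn MessageStore> {
    // In a production build, a Cargo feature (e.g. `scylla`) would swap in the
    // database-backed implementation behind the same trait.
    Box::new(MemoryStore::default())
}

fn main() {
    let store = make_store();
    store.append("general", "hello from the dev backend");
    println!("{:?}", store.recent("general", 10));
}
```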
Why Flutter for the desktop client
Flutter might seem like an unusual choice for a desktop app. Here’s why it works:
Native compilation. Flutter compiles to native machine code — not JavaScript, not a web view. The result is a desktop app that starts fast, uses less memory than Electron alternatives, and renders at 60fps on the GPU.
Single codebase, multiple platforms. We’re targeting Windows first, with Linux, Android, and web coming next. Flutter lets us ship to all of them from one Dart codebase without sacrificing native feel.
Dart FFI for native audio. The voice engine is written in Rust and compiled to a native DLL (voidcom_voice.dll). Flutter calls into it via dart:ffi — giving us the full power of native audio processing (cpal for I/O, Opus for encoding, Quinn for QUIC transport) while the UI stays in the Dart runtime.
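To give a feel for that boundary, here is a sketch of what the Rust side of those exports might look like. The function names and the handle type are illustrative, not the library's actual symbols; Dart binds functions like these with dart:ffi and holds the engine as an opaque pointer.

```rust
// Hypothetical C-ABI surface for the voice engine. dart:ffi can bind functions
// like these directly from voidcom_voice.dll; the names here are made up.

pub struct VoiceEngine {
    muted: bool,
    // ...capture, encoder, and transport state would live here
}

/// Create the engine. Dart receives the result as an opaque Pointer<Void>.
#[no_mangle]
pub extern "C" fn voice_engine_new() -> *mut VoiceEngine {
    Box::into_raw(Box::new(VoiceEngine { muted: false }))
}

/// Called from the UI, e.g. when the mute button is pressed.
#[no_mangle]
pub extern "C" fn voice_engine_set_muted(engine: *mut VoiceEngine, muted: bool) {
    if let Some(engine) = unsafe { engine.as_mut() } {
        engine.muted = muted;
    }
}

/// Release the engine when the Dart side disposes its handle.
#[no_mangle]
pub extern "C" fn voice_engine_free(engine: *mut VoiceEngine) {
    if !engine.is_null() {
        drop(unsafe { Box::from_raw(engine) });
    }
}
```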
The voice pipeline
Here’s how audio flows from your microphone to your teammate’s speakers:
- Capture — cpal grabs audio from your input device at 48kHz
- Encode — Opus compresses the audio into 20ms frames (~32–256 kbps depending on quality preset)
- Encrypt — XChaCha20-Poly1305 encrypts the packet before it leaves your device
- Transport — Quinn sends the encrypted packet as a QUIC datagram to the server
- Forward — The SFU routes the packet to everyone else in the voice channel (no mixing, no decryption)
- Decrypt — The recipient’s client decrypts the packet
- Decode — Opus decompresses the audio
- Playback — cpal plays the audio through the output device
The entire pipeline is designed for one thing: getting your voice to your teammates as fast and clearly as possible.
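For the curious, here is a rough sketch of the sender side of that pipeline, assuming capture (cpal), key exchange, and connection setup have already happened. The function, buffer sizes, and error handling are illustrative rather than Voidcom's exact implementation.

```rust
// Illustrative send path: encode one 20ms frame, encrypt it, ship it as a
// QUIC datagram. Capture, key management, and nonce handling are omitted.

use chacha20poly1305::{aead::Aead, XChaCha20Poly1305, XNonce};

/// 20ms of mono audio at 48kHz is 960 samples.
const FRAME_SAMPLES: usize = 960;

fn send_frame(
    samples: &[i16; FRAME_SAMPLES],  // one frame of PCM from the capture callback
    encoder: &mut opus::Encoder,     // created once per stream
    cipher: &XChaCha20Poly1305,      // keyed during call setup
    nonce: &XNonce,                  // must never repeat for a given key
    conn: &quinn::Connection,        // established QUIC connection to the SFU
) -> Result<(), Box<dyn std::error::Error>> {
    // Encode: Opus compresses 20ms of PCM into a compact packet.
    let mut opus_buf = vec![0u8; 4000];
    let len = encoder.encode(samples, &mut opus_buf)?;
    opus_buf.truncate(len);

    // Encrypt: the server only ever sees ciphertext.
    let sealed = cipher
        .encrypt(nonce, opus_buf.as_slice())
        .map_err(|_| "encryption failed")?;

    // Transport: a fire-and-forget datagram, so a lost packet never delays the next.
    conn.send_datagram(sealed.into())?;
    Ok(())
}
```

The receive path mirrors this in reverse: decrypt, decode, and hand the PCM back to cpal for playback.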
Why not Electron?
Electron apps are web pages running inside a bundled Chromium browser. That means:
- 200–400MB of RAM just to run the app (before you do anything)
- Higher CPU usage from the JavaScript runtime and browser rendering engine
- More latency from the additional abstraction layers
Voidcom’s Flutter client uses a fraction of that. It’s not a browser — it’s a compiled native application.
The trade-off
Rust and Flutter are harder to develop with than JavaScript and Electron. The compile times are longer. The ecosystem is younger. But the result is a communication app that’s genuinely faster, more secure, and more efficient than the alternatives.
We think that trade-off is worth it. And we think you’ll feel the difference.