The bellows move, but the CPU stays quiet — what’s powering this browser web harmonium?
Clicking into this harmonium-in-a-tab feels a bit surreal: a little bellows handle animates, reeds respond to keypresses, and the sound is surprisingly organic for something coming out of an AudioContext. I’m less interested in how musical it feels than in how they pulled off that “real instrument in a browser” illusion with so little lag.
Probably sample playback, not synthesis
The page claims “nine sampling points across three octaves,” which suggests pre‑recorded WAVs repitched via resampling rather than additive synthesis. My guess is each note name maps to an AudioBuffer loaded once and shifted with playbackRate. Because harmonium reeds are timbrally static, a few samples per octave are enough if you interpolate pitch carefully, maybe with detune in 25‑cent steps.
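A minimal sketch of how that sparse sample map could work. The sample points, function names, and note layout below are my assumptions, not anything pulled from the actual bundle; the only hard fact is the semitone ratio of 2^(1/12).

```javascript
// Hypothetical layout: nine sampled MIDI notes spanning three octaves.
const SAMPLE_NOTES = [48, 52, 57, 60, 64, 69, 72, 76, 81];

// Pick the closest sampled note to the one being played.
function nearestSample(midiNote) {
  return SAMPLE_NOTES.reduce((best, n) =>
    Math.abs(n - midiNote) < Math.abs(best - midiNote) ? n : best);
}

// Each semitone of pitch shift is a factor of 2^(1/12) in playback rate.
function playbackRateFor(midiNote) {
  const base = nearestSample(midiNote);
  return { base, rate: Math.pow(2, (midiNote - base) / 12) };
}

// In the browser this would drive an AudioBufferSourceNode:
//   const src = ctx.createBufferSource();
//   src.buffer = buffers.get(base);      // decoded once at load time
//   src.playbackRate.value = rate;
//   src.connect(masterGain).         // shared output chain
//   src.start();
```

With only a few semitones between sample points, the resampling artifacts stay small, which may be why nine points covers three octaves acceptably.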
What’s impressive is the lack of audible pops during bellows volume changes; that implies they’re smoothing gain adjustments through a shared GainNode envelope rather than setting volume directly on each AudioBufferSourceNode. The “pump handle” UI probably just sets a target amplitude, and a short exponential ramp handles the fade so the output doesn’t click.
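The shape of that smoothing is worth spelling out. setTargetAtTime on an AudioParam applies a one‑pole exponential approach toward the target; here is the same math as a pure function so it can be run anywhere. The 50 ms time constant is my guess, not a measured value.

```javascript
// One-pole exponential smoothing: each step closes a fixed fraction of
// the remaining distance to the target, which is exactly the curve
// AudioParam.setTargetAtTime produces.
function smoothStep(current, target, dt, tau = 0.05) {
  return target + (current - target) * Math.exp(-dt / tau);
}

// Browser equivalent on a shared master gain (tau = 0.05 s assumed):
//   masterGain.gain.setTargetAtTime(bellowsLevel, ctx.currentTime, 0.05);

// Simulate dropping the bellows level from 1.0 to 0.2 over 200 ms:
let g = 1.0;
for (let i = 0; i < 20; i++) g = smoothStep(g, 0.2, 0.01);
```

After 200 ms (four time constants) the gain has closed about 98% of the gap, close enough that no step is large enough to produce a click.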
MIDI velocity and the mystery of expression
The velocity‑sensitive response hints at simple amplitude mapping, but a harmonium doesn’t actually respond to velocity in the acoustic sense; it responds to air pressure. Maybe they map velocity onto the global bellows gain, so harder hits simulate stronger pressure. Without polyphonic aftertouch, that would still feel static on held notes. I’d bet they’re smoothing gain.value over time to fake air compression.
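One way that mapping could look. Everything here is invented for illustration: the square‑law curve, the decay half‑life, and the function names are all assumptions about how velocity might be repurposed as pressure.

```javascript
// Hypothetical: map MIDI velocity (0-127) onto a bellows "pressure"
// target. A square law feels steeper than linear, which suits the
// "harder hit = stronger pump" illusion.
function velocityToPressure(velocity) {
  const v = Math.min(127, Math.max(0, velocity)) / 127;
  return v * v;
}

// Fake air compression: pressure bleeds off while notes are held,
// unless the bellows are pumped again. Half-life of 2 s is a guess.
function decayPressure(pressure, dtSeconds, halfLife = 2.0) {
  return pressure * Math.pow(0.5, dtSeconds / halfLife);
}

// Per animation frame, the decayed pressure would be fed to the master
// gain via setTargetAtTime, so held notes slowly sag like a real reed.
```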
The navigator.requestMIDIAccess API is Chromium territory, and Safari still doesn’t ship Web MIDI, so I’d love to see what polyfill (if any) they use. The low latency suggests they rely on Chrome’s direct MIDI path and accept best‑effort elsewhere.
The “drone” is a separate audio graph
Switching between the reed drone and tanpura loop is instant, with no re‑buffering pause. That implies pre‑decoded looping buffers. The tanpura’s period sounds like about five seconds, so they likely run a looping AudioBufferSourceNode with a short crossfade at the seam. The harmonium drone could simply be the same reed sample layered under a low‑pass filter.
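For the seam, an equal‑power crossfade is the standard trick: the outgoing and incoming gains follow cosine/sine curves so their combined power stays constant and the loop point neither dips nor bumps. The fade length is my assumption; the math is standard.

```javascript
// Equal-power crossfade: at every point t in [0, 1] across the fade,
// out^2 + in^2 === 1, so perceived loudness stays flat at the seam.
function crossfadeGains(t) {
  return {
    out: Math.cos(t * Math.PI / 2), // gain of the ending loop pass
    in: Math.sin(t * Math.PI / 2),  // gain of the starting loop pass
  };
}

// In practice you'd schedule two overlapping AudioBufferSourceNodes
// (or use AudioBufferSourceNode.loopStart/loopEnd with a pre-baked
// crossfade in the sample itself) and apply these curves via
// setValueCurveAtTime on each source's GainNode.
```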
Given that it keeps playing while you record, the recorder must be grabbing the mix from a MediaStreamAudioDestinationNode feeding a MediaRecorder: straightforward but clever for an all‑client setup.
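Wiring that up is only a few lines. This is a browser‑only sketch of the general pattern, not their code; the single shared master gain is an assumption.

```javascript
// Browser-only sketch: one master GainNode feeds both the speakers and
// a MediaStreamAudioDestinationNode, so recording taps the same mix
// without interrupting playback.
const ctx = new AudioContext();
const master = ctx.createGain();
master.connect(ctx.destination);            // audible output

const streamDest = ctx.createMediaStreamDestination();
master.connect(streamDest);                 // parallel recording tap

const recorder = new MediaRecorder(streamDest.stream, {
  mimeType: 'audio/webm',
});
const chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
  const blob = new Blob(chunks, { type: 'audio/webm' });
  // hand the blob to a download link, or keep it client-side
};
recorder.start();
```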
Recording and share links
Recordings save as WebM, capped at two minutes, which reads more like an app‑imposed memory guard than a browser limit, since MediaRecorder itself has no two‑minute default. The “one‑hour share link” that supposedly stores the state is probably a compressed JSON blob passed via base64: about forty characters on my short test, so likely LZ‑compressed settings encoded into the query string. Interestingly, it’s playback settings, not audio, that travel in that link; the recipient just rebuilds the same instrument state client‑side.
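The skeleton of such a codec is simple enough to guess at. The field names below are invented, and the real app may LZ‑compress the JSON before encoding; this sketch shows only the base64url round trip.

```javascript
// Guesswork sketch: serialize instrument state to JSON and base64url-
// encode it for the URL. Field names are hypothetical.
function encodeState(state) {
  const json = JSON.stringify(state);
  // btoa output uses +, /, =; swap to URL-safe characters.
  return btoa(json).replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');
}

function decodeState(param) {
  const b64 = param.replace(/-/g, '+').replace(/_/g, '/');
  return JSON.parse(atob(b64));
}

const link = '#s=' + encodeState({ drone: 'tanpura', key: 'C#', octave: 3 });
// The recipient parses the fragment and calls decodeState to rebuild
// the same instrument settings locally; no audio travels in the link.
```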
PWA and caching puzzle
The thing installs and works offline, so they’ve gone the service worker route. But harmonium samples are big: nine sampling points at maybe 300–500 KB each is still several megabytes. Either Cloudflare is edge‑caching aggressively, or they lazy‑cache samples on first touch. The network panel shows delayed fetches per note, supporting the lazy hypothesis. Makes sense: cache only notes actually played rather than forcing a multi‑megabyte warm‑up.
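A cache‑on‑first‑fetch service worker matches that network behavior. This is a browser‑only sketch of the pattern; the cache name, file extension, and URL shape are all assumptions.

```javascript
// Service-worker sketch: intercept sample requests and cache each one
// the first time that note is played. Names are hypothetical.
const SAMPLE_CACHE = 'samples-v1';

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  if (!url.pathname.endsWith('.wav')) return; // only intercept samples

  event.respondWith(
    caches.open(SAMPLE_CACHE).then(async (cache) => {
      const hit = await cache.match(event.request);
      if (hit) return hit;                     // offline after first play
      const resp = await fetch(event.request);
      cache.put(event.request, resp.clone()); // cache the note just played
      return resp;
    })
  );
});
```

The trade-off is that a note you have never played stays silent offline, which is consistent with warming the cache only with notes actually touched.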
What I still wonder
The bellows animation tracks volume with almost zero frame lag on mid‑range phones. Are they syncing it via an AudioWorkletProcessor posting RMS amplitude values back to the main thread, or deducing handle position purely from UI state? If anyone has dug through the JavaScript bundle, I’m curious how tightly (or loosely) the visual and audio loops are coupled.
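If it is the worklet route, the core is just an RMS computation per render quantum. The pure function below is the testable part; the processor wrapper (commented, since it only exists on the audio thread) and the message shape are my assumptions.

```javascript
// Root-mean-square amplitude of one block of samples: the number the
// bellows animation would track if it follows real audio level.
function rms(samples) {
  let sum = 0;
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i];
  return Math.sqrt(sum / samples.length);
}

// Hypothetical worklet wrapper (browser audio thread only):
// class MeterProcessor extends AudioWorkletProcessor {
//   process(inputs) {
//     const ch = inputs[0][0];               // first channel, ~128 samples
//     if (ch) this.port.postMessage(rms(ch)); // main thread animates on this
//     return true;
//   }
// }
```

The alternative, driving the handle purely from UI state, is cheaper and avoids the message hop, but would drift whenever the audio envelope and the UI model disagree.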