A small Over-the-Air (OTA) system that keeps shipping in your hands
Shipping fast is easy. Shipping with control is harder. I wanted channels, rollback, and a history I could query without handing the pipeline away. So I built a Hono API on Cloudflare Workers backed by D1 and KV, two shell scripts, and a /check endpoint the app polls on launch. Everything runs from my self-hosted code server — no CI, no cloud build agent.
```
# build web bundle, zip it, upload to my worker
$ ./scripts/deploy-ota.sh admin mypassword 1.2.0
🚀 Starting Nuxt build (npm run generate)...
✅ Nuxt build complete.
📦 Generating bundle for version 1.2.0...
🔑 Logging in to https://worker.example.com/api/auth/login...
✅ Authentication successful. Token obtained.
⬆️ Uploading stable-1.2.0.zip...
✅ Done!
```
Control is the feature.
Updates are part of the product. I wanted channels, versioning, upload history, and rollback without living inside someone else's pricing or infra. I already had a Cloudflare Worker in the stack, so I taught it to ship.
D1 stores every bundle ever pushed — version, channel, filename, checksum, timestamp. KV holds the raw bytes under BUNDLES and the active manifest per channel under OTA_MANIFEST. The app polls /check on next launch; if the manifest version differs from the installed build, it downloads and applies the update.
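For reference, here is a minimal sketch of what that history table could look like. The column names come from the worker's INSERT shown later in this post; the types and the id column are my assumptions:

```ts
// hypothetical D1 migration; columns match the worker's INSERT, the rest is assumed
await c.env.RouteDB.prepare(`
  CREATE TABLE IF NOT EXISTS history (
    id          INTEGER PRIMARY KEY AUTOINCREMENT,
    channel     TEXT NOT NULL,
    version     TEXT NOT NULL,
    filename    TEXT NOT NULL,
    uploaded_at TEXT NOT NULL,
    checksum    TEXT NOT NULL
  )
`).run();
```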
Native builds run inside Docker via mingc/android-build-box, so no local SDK is needed. The pipeline is a loop, not a maze.
One script, one upload, one source of truth.
Everything starts in the terminal of my self-hosted code server — a VS Code instance on my ThinkPad E560, reachable from any browser over Tailscale. The deploy script builds the web bundle, zips it, authenticates, and uploads. On the next launch, the app polls /check and either updates or stays put.
Native builds stay in the same story.
For native builds, build-apk.sh spins up mingc/android-build-box inside Docker on the code server — no local Android SDK anywhere. Same JWT auth, same upload endpoint, different channel.
A worker that keeps the present and remembers the past.
The manifest is the present. D1 is the memory.
Admin routes sit behind JWT middleware. BUNDLES stores raw files, while OTA_MANIFEST stores the active manifest per channel. Three paths cover the lifecycle: upload, check, and rollback on delete.
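Roughly how that wiring might look. The binding names and the upload path come from the snippets in this post; the middleware setup, handler names, and the other paths are illustrative, not the actual code:

```ts
import { Hono } from 'hono';
import { jwt } from 'hono/jwt';

// bindings as used in the handler snippets below
type Bindings = {
  BUNDLES: KVNamespace;
  OTA_MANIFEST: KVNamespace;
  RouteDB: D1Database;
  JWT_SECRET: string;
};
const app = new Hono<{ Bindings: Bindings }>();

// every admin route requires a valid JWT
app.use('/api/ota/admin/*', (c, next) => jwt({ secret: c.env.JWT_SECRET })(c, next));

app.post('/api/ota/admin/ota/upload', uploadHandler);    // upload (first snippet below)
app.post('/api/ota/check', checkHandler);                // check (second snippet)
app.delete('/api/ota/admin/ota/delete', deleteHandler);  // rollback on delete (third snippet)
```

The three handler bodies follow.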
```ts
// upload: store the bytes, flip the manifest, record history
// (file, channel, version, checksum come from the multipart form)
const key = `${channel}-${version}.zip`;
await c.env.BUNDLES.put(key, await file.arrayBuffer());

const manifest = {
  version,
  key,
  checksum,
  url: `${origin}/api/ota/bundle/${key}`,
  updated: new Date().toISOString(),
};
await c.env.OTA_MANIFEST.put(`manifest:${channel}`, JSON.stringify(manifest));

// record in D1 for history / rollback
await c.env.RouteDB.prepare(
  "INSERT INTO history (channel, version, filename, uploaded_at, checksum) VALUES (?, ?, ?, datetime('now'), ?)"
).bind(channel, version, key, checksum).run();
```
```ts
// check: compare the installed build against the active manifest
const { version_build, channel = 'stable' } = await c.req.json();
const raw = await c.env.OTA_MANIFEST.get(`manifest:${channel}`);
if (!raw) return c.json({}); // no manifest yet, nothing to offer

const manifest = JSON.parse(raw);
if (manifest.version !== version_build) {
  return c.json({ version: manifest.version, url: manifest.url, checksum: manifest.checksum });
}
return c.json({}); // already up to date
```
```ts
// after deleting from BUNDLES + D1...
// also select checksum so the rebuilt manifest is complete
const { results } = await c.env.RouteDB.prepare(
  'SELECT version, filename, checksum FROM history WHERE channel = ? ORDER BY uploaded_at DESC LIMIT 1'
).bind(channel).all();

if (results.length > 0) {
  // rebuild the manifest from the previous row (same shape as the upload handler)
  const prev = results[0];
  const newManifest = {
    version: prev.version,
    key: prev.filename,
    checksum: prev.checksum,
    url: `${origin}/api/ota/bundle/${prev.filename}`,
    updated: new Date().toISOString(),
  };
  await c.env.OTA_MANIFEST.put(`manifest:${channel}`, JSON.stringify(newManifest));
} else {
  // nothing left on this channel: clients keep their installed build
  await c.env.OTA_MANIFEST.delete(`manifest:${channel}`);
}
```
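On the client side, the launch-time poll is just a POST against the check handler. A minimal sketch of what that could look like, assuming a plain fetch (the app-side code is not shown in this post, and /api/ota/check is my guess at the full route; applying the downloaded bundle would go through the Capacitor updater):

```ts
// hypothetical client-side poll on app launch
async function checkForUpdate(installedVersion: string) {
  const res = await fetch(`${import.meta.env.VITE_CF_API_URL}/api/ota/check`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ version_build: installedVersion, channel: 'stable' }),
  });
  const update = await res.json();
  // the worker returns {} when the installed build matches the manifest
  if (!update.version) return null;
  // otherwise: download update.url, verify update.checksum, then apply
  return update;
}
```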
Two scripts, one release pipeline.
deploy-ota.sh builds the web bundle, zips it with the @capgo/cli, pulls the checksum out of the CLI output, gets a JWT, and uploads to the worker.

```bash
# 1. build
npm run generate
npx cap sync

# 2. zip + extract checksum from capgo cli output
CAPGO_OUTPUT=$(npx @capgo/cli@latest bundle zip)
ZIP_PATH=$(echo "$CAPGO_OUTPUT" | grep "Saved to" | awk '{print $NF}')
CHECKSUM=$(echo "$CAPGO_OUTPUT" | grep "Checksum SHA256" | awk '{print $NF}')

# 3. login — get JWT
AUTH_TOKEN=$(curl -sS -X POST "$BASE_URL/api/auth/login" \
  -H "Content-Type: application/json" \
  -d '{"username":"...","password":"..."}' | jq -r '.token')

# 4. upload
curl -X POST "$BASE_URL/api/ota/admin/ota/upload" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "file=@$ZIP_PATH" \
  -F "version=$VERSION" \
  -F "channel=stable" \
  -F "checksum=$CHECKSUM"
```
build-apk.sh builds the APK with mingc/android-build-box in Docker, so no local Android SDK is needed.

```bash
# 1. build web layer
npm run generate
npx cap sync

# 2. build APK inside docker — Docker socket mounted on code server host
docker run --rm \
  -v "$(pwd):/project" \
  mingc/android-build-box \
  bash -c 'cd /project/android; ./gradlew :app:assembleDebug'

# 3. same JWT auth flow as deploy-ota.sh
AUTH_TOKEN=$(curl ... | jq -r '.token')

# 4. upload APK to its own channel
curl -X POST "$BASE_URL/api/ota/admin/apk/upload" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "file=@android/app/build/outputs/apk/debug/app-debug.apk" \
  -F "version=$VERSION"
```
Both scripts write to the same history table in D1. Channels keep the story clean (apk vs stable), and the admin endpoints filter by channel to match.

What the system taught me.
KV has a 25 MiB limit per value. KV is fast but small: 25 MiB is plenty for web bundles but tight for APKs. Stripping debug symbols and building with assembleRelease instead of assembleDebug keeps the size in check.
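If the worker should fail early instead of letting put() bounce, a small guard on the upload route is enough. A sketch, with the limit hardcoded from Cloudflare's documented 25 MiB cap:

```ts
// hypothetical guard on the upload route: KV rejects oversized values anyway,
// but failing early gives the deploy script a readable error
const KV_VALUE_LIMIT = 25 * 1024 * 1024;
const bytes = await file.arrayBuffer();
if (bytes.byteLength > KV_VALUE_LIMIT) {
  return c.json({ error: `bundle is ${bytes.byteLength} bytes, over the 25 MiB KV value limit` }, 413);
}
await c.env.BUNDLES.put(key, bytes);
```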
D1 makes rollback trivial. A real history table turns rollback into a delete. The worker walks back to the previous entry automatically. No manual manifest edits.
JWT auth in shell scripts needs care. The scripts build JSON with printf instead of string interpolation to avoid quoting issues. Small thing, but it matters when you are curling with credentials.
VITE_CF_API_URL comes from .env, never hardcoded. Credentials are passed as script arguments, not exported env vars, so they do not linger in the session environment.

The whole thing took a weekend to wire up. Running it all from my self-hosted code server means I never need to install anything locally or spin up a CI job. It fades into the background: open the browser, run the script, ship.