Rolling my own OTA update server for Capacitor apps
No third-party update service. A Hono API on Cloudflare Workers backed by D1 and KV, two shell scripts, and a /check endpoint the app polls on launch. Everything runs from my self-hosted code server: no CI, no cloud build agent.
```shell
# build web bundle, zip it, upload to my worker
$ ./scripts/deploy-ota.sh admin mypassword 1.2.0
Starting Nuxt build (npm run generate)...
Nuxt build complete.
Generating bundle for version 1.2.0...
Logging in to https://worker.example.com/api/auth/login...
Authentication successful. Token obtained.
Uploading stable-1.2.0.zip...
Done!
```
Full control over delivery.
I wanted channels, versioning, upload history, and rollback, without being locked into a third-party service's pricing or infra. I was already running a Cloudflare Worker for other things, so building on top of it was the obvious move.
D1 stores every bundle ever pushed: version, channel, filename, checksum, timestamp. KV holds the raw bundle bytes under BUNDLES and the active manifest per channel under OTA_MANIFEST. The app polls /check on next launch; if the manifest version differs from the installed build, it downloads and applies the update.
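The client-side decision is just a version comparison against the active manifest. Here's a toy sketch of that comparison in shell with jq; the manifest payload and the installed version are made up, and the field names assume the manifest shape the worker writes:

```shell
# Hypothetical manifest, shaped like what the worker stores under manifest:<channel>.
MANIFEST='{"version":"1.2.0","url":"https://worker.example.com/api/ota/bundle/stable-1.2.0.zip","checksum":"abc123"}'
INSTALLED="1.1.0"   # version the app currently has

# Compare the manifest version against the installed build.
LATEST=$(echo "$MANIFEST" | jq -r '.version')
if [ "$LATEST" != "$INSTALLED" ]; then
  echo "update available: $LATEST"
else
  echo "up to date"
fi
```

The real app does this inside the Capacitor runtime, of course; the sketch just shows how little logic the /check contract demands of the client.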
How the OTA pipeline flows end to end.
One script, one upload, one manifest.
Everything starts in the terminal of my self-hosted code server: a VS Code instance running on my ThinkPad E560, reachable from any browser over Tailscale. The deploy script builds the web bundle, zips it, authenticates with the worker, and uploads. The worker writes to KV and D1, then the app polls /check on next launch.
APK builds use the same flow.
For native builds, build-apk.sh spins up mingc/android-build-box inside Docker on the code server, so there's no local Android SDK anywhere. Same JWT auth, same upload endpoint, different channel.
Hono worker with KV and D1.
Uploads update the manifest; history stays in D1.
Admin routes are protected by JWT middleware. BUNDLES stores raw files, while OTA_MANIFEST stores the active manifest per channel. Three code paths cover the whole lifecycle: upload, check, and rollback-on-delete.
```typescript
const key = `${channel}-${version}.zip`;
await c.env.BUNDLES.put(key, await file.arrayBuffer());

const manifest = {
  version,
  key,
  checksum,
  url: `${origin}/api/ota/bundle/${key}`,
  updated: new Date().toISOString(),
};
await c.env.OTA_MANIFEST.put(`manifest:${channel}`, JSON.stringify(manifest));

// record in D1 for history / rollback
await c.env.RouteDB.prepare(
  'INSERT INTO history (channel, version, filename, uploaded_at, checksum) VALUES (?, ?, ?, datetime("now"), ?)'
).bind(channel, version, key, checksum).run();
```
```typescript
const { version_build, channel = 'stable' } = await req.json();
const manifest = JSON.parse(
  await c.env.OTA_MANIFEST.get(`manifest:${channel}`)
);
if (manifest.version !== version_build) {
  return c.json({
    version: manifest.version,
    url: manifest.url,
    checksum: manifest.checksum,
  });
}
return c.json({}); // already up to date
```
```typescript
// after deleting from BUNDLES + D1...
const { results } = await c.env.RouteDB.prepare(
  'SELECT version, filename FROM history WHERE channel = ? ORDER BY uploaded_at DESC LIMIT 1'
).bind(channel).all();

if (results.length > 0) {
  // promote the previous bundle to active
  await c.env.OTA_MANIFEST.put(`manifest:${channel}`, JSON.stringify(newManifest));
} else {
  // no more bundles left, so clear the manifest
  await c.env.OTA_MANIFEST.delete(`manifest:${channel}`);
}
```
Two scripts, no CI needed.
deploy-ota.sh builds the bundle, extracts the checksum from the @capgo/cli output, gets a JWT, then uploads to the worker.

```shell
# 1. build
npm run generate
npx cap sync

# 2. zip + extract checksum from capgo cli output
CAPGO_OUTPUT=$(npx @capgo/cli@latest bundle zip)
ZIP_PATH=$(echo "$CAPGO_OUTPUT" | grep "Saved to" | awk '{print $NF}')
CHECKSUM=$(echo "$CAPGO_OUTPUT" | grep "Checksum SHA256" | awk '{print $NF}')

# 3. login, get JWT
AUTH_TOKEN=$(curl -sS -X POST "$BASE_URL/api/auth/login" \
  -H "Content-Type: application/json" \
  -d '{"username":"...","password":"..."}' | jq -r '.token')

# 4. upload
curl -X POST "$BASE_URL/api/ota/admin/ota/upload" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "file=@$ZIP_PATH" \
  -F "version=$VERSION" \
  -F "channel=stable" \
  -F "checksum=$CHECKSUM"
```
build-apk.sh runs mingc/android-build-box in Docker, so no local Android SDK is needed; the Docker socket is mounted on the code server host.

```shell
# 1. build web layer
npm run generate
npx cap sync

# 2. build APK inside docker (Docker socket mounted on code server host)
docker run --rm \
  -v "$(pwd):/project" \
  mingc/android-build-box \
  bash -c 'cd /project/android; ./gradlew :app:assembleDebug'

# 3. same JWT auth flow as deploy-ota.sh
AUTH_TOKEN=$(curl ... | jq -r '.token')

# 4. upload APK to its own channel
curl -X POST "$BASE_URL/api/ota/admin/apk/upload" \
  -H "Authorization: Bearer $AUTH_TOKEN" \
  -F "file=@android/app/build/outputs/apk/debug/app-debug.apk" \
  -F "version=$VERSION"
```
Both artifact types land in the same history table in D1; they're just different channels (apk vs stable). The admin endpoints filter by channel so they stay separate.

What I learned along the way.
KV has a 25MB limit per value
Both zip bundles and APKs are stored in KV. 25MB is enough for web bundles but tight for APKs. Stripping debug symbols and building with assembleRelease instead of assembleDebug cuts size significantly.
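A guard in the upload script can catch oversized artifacts before they ever hit the worker. This is a hypothetical helper, not part of the actual scripts; only the 25MB ceiling comes from KV itself:

```shell
# KV rejects values over 25 MiB, so fail fast before uploading.
MAX_BYTES=$((25 * 1024 * 1024))

check_size() {
  # check_size <bytes> -> "ok" or "too large" (hypothetical helper)
  if [ "$1" -gt "$MAX_BYTES" ]; then
    echo "too large"
  else
    echo "ok"
  fi
}

check_size 30000000   # a debug APK can easily blow past the limit
check_size 8000000    # a stripped release build fits comfortably
```

In the real script you'd feed it `stat` output for the zip or APK and `exit 1` on the oversized branch.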
D1 makes rollback trivial
Having the full upload history in a queryable database means rollback is just a delete: the worker automatically walks back to the previous entry. No manual manifest editing needed.
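The walk-back is easy to illustrate outside the worker too. Here is a toy version of the same "newest remaining row wins" query in shell with jq, over a made-up two-row history:

```shell
# Made-up history rows for one channel.
HISTORY='[
  {"version":"1.1.0","filename":"stable-1.1.0.zip","uploaded_at":"2024-04-20"},
  {"version":"1.2.0","filename":"stable-1.2.0.zip","uploaded_at":"2024-05-02"}
]'

# Delete 1.2.0, then promote the newest remaining row --
# the same idea as the worker's ORDER BY uploaded_at DESC LIMIT 1.
PREV=$(echo "$HISTORY" \
  | jq -r 'map(select(.version != "1.2.0")) | sort_by(.uploaded_at) | last | .version')
echo "active after rollback: $PREV"
```

Deleting the newest bundle leaves 1.1.0 as the only candidate, which is exactly what the worker writes back into the manifest.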
JWT auth in shell scripts needs care
The scripts build the JSON payload with printf instead of string interpolation to avoid injection issues with special characters in passwords. Small thing but it matters when you're curling with credentials.
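To make the injection point concrete: the article's scripts use printf, but `jq -n --arg` is an even stricter variant that fully JSON-escapes quotes, backslashes, and control characters. A sketch with made-up credentials:

```shell
# Made-up credentials containing characters that break naive interpolation.
USERNAME='admin'
PASSWORD='p@ss"word$HOME'

# Naive form: -d "{\"password\":\"$PASSWORD\"}" would splice the quote
# straight into the JSON. jq -n --arg escapes every special character:
PAYLOAD=$(jq -n --arg u "$USERNAME" --arg p "$PASSWORD" \
  '{username: $u, password: $p}')

# Round-trip check: the password survives intact.
echo "$PAYLOAD" | jq -r '.password'
```

Since the login step already pipes through jq to pull the token out, building the payload with jq adds no new dependency.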
VITE_CF_API_URL is loaded from .env, never hardcoded. Credentials are passed as script arguments, not env vars, so they don't linger in shell history.

The whole thing took a weekend to wire up. Running it all from my self-hosted code server means I never need to install anything locally or spin up a CI job. Open the browser, run the script, done.