I spent three days last November trying to get a standard Fastify API running on Cloudflare Workers. I eventually gave up and threw it on a plain DigitalOcean droplet. You write fast code, then spend twice as long fighting edge adapters and missing Node APIs. It gets old quickly.
That calculus may have just shifted. Vike recently dropped Photon, a deployment layer that promises to take your Express or Fastify server and run it anywhere—Vercel, Cloudflare, self-hosted—without rewriting your routing logic. I was highly skeptical. I’ve heard the “zero config” pitch before, and it usually ends with me digging through obscure GitHub issues at 2 AM.

I decided to test it on my M3 Mac running Node 22.1.0. I grabbed an existing Fastify project—a simple inventory tracker I built for a client—and ripped out my messy custom deployment scripts.
Let’s look at the basic API setup I was working with. Nothing crazy, just standard asynchronous Fastify functions.
import Fastify from 'fastify';

const app = Fastify({ logger: true });

// Standard async API route
app.get('/api/inventory', async function (request, reply) {
  // Simulating a database call
  const items = [
    { id: 1, name: 'Mechanical Keyboard', stock: 42 },
    { id: 2, name: 'USB-C Hub', stock: 105 }
  ];
  return reply.send({ success: true, data: items });
});

export default app;
Normally, moving that from a Node environment to an edge worker requires a stack of shim libraries. Photon intercepts the standard Node HTTP requests and maps them to Web Fetch APIs automatically. I pushed the repo and Vercel picked it up perfectly. Cloudflare Pages worked just as cleanly. Our CI/CD pipeline build times actually dropped from 4m 12s to 38 seconds because we weren’t compiling massive custom adapter chains anymore.
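To make that “intercepts and maps” idea concrete, here’s a minimal sketch of the kind of translation an adapter like this performs. To be clear: this is not Photon’s actual API—the `toFetchHandler` name and the simplified handler shape are mine—it just illustrates wrapping a Node-style handler so it answers Web Fetch `Request` objects with `Response` objects, which is what edge runtimes expect.

```javascript
// Illustrative sketch only — not Photon's real internals.
// Wraps a Node-style (method/path/headers) handler so it can be
// called by an edge runtime that passes Web Fetch Request objects.
function toFetchHandler(nodeStyleHandler) {
  return async function fetchHandler(request) {
    const url = new URL(request.url);

    // Translate the Web Request into the plain shape a Node router expects
    const result = await nodeStyleHandler({
      method: request.method,
      path: url.pathname,
      headers: Object.fromEntries(request.headers),
    });

    // Translate the handler's JSON result back into a Web Response
    return new Response(JSON.stringify(result), {
      status: 200,
      headers: { 'content-type': 'application/json' },
    });
  };
}
```

The real adapter obviously has to cover bodies, streaming, and error paths too—which is exactly where things get hairy, as we’ll see below.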
To make sure the endpoints weren’t doing anything weird with CORS or headers in the new environments, I wired up a quick vanilla JS frontend using standard DOM manipulation.
// Quick frontend test to verify the edge deployment
async function fetchInventory() {
  const container = document.getElementById('inventory-list');
  container.innerHTML = 'Loading...';
  try {
    const response = await fetch('/api/inventory');
    const result = await response.json();
    if (result.success) {
      container.innerHTML = '';
      result.data.forEach(item => {
        const li = document.createElement('li');
        li.textContent = `${item.name} (${item.stock} in stock)`;
        container.appendChild(li);
      });
    }
  } catch (error) {
    container.innerHTML = 'Failed to load data.';
    console.error(error);
  }
}

document.getElementById('refresh-btn').addEventListener('click', fetchInventory);
Here is where my initial skepticism was validated. It’s not entirely magic.
If you rely heavily on Node-specific streams, you’re probably going to hit a wall. I benchmarked a heavy file-upload route that uses request.raw to pipe data directly. On a self-hosted Node instance, it flew. But on Cloudflare Workers via Photon, it threw a silent 500 error. The edge adapter chokes because request.raw doesn’t map perfectly to Web Streams yet. You have to rewrite those specific handlers to use the standard ReadableStream API if you want true portability. The documentation conveniently leaves this detail out.
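What does that rewrite look like in practice? Here’s a rough sketch of the approach I mean—my own illustration, not code from Photon or my actual upload route. The trick is to normalize whatever body you receive to a Web `ReadableStream` before consuming it: Node 17+ can convert its native streams with `Readable.toWeb()`, and edge runtimes hand you Web streams already.

```javascript
import { Readable } from 'node:stream';

// Hypothetical portable body reader. Instead of piping request.raw
// (a Node stream) directly, normalize to a Web ReadableStream first:
// edge runtimes provide Web streams natively, and Node can convert.
async function readUpload(rawBody) {
  // Web ReadableStreams have getReader(); Node streams don't, so convert them.
  const webStream =
    typeof rawBody.getReader === 'function' ? rawBody : Readable.toWeb(rawBody);

  // Consume the Web stream chunk by chunk with the standard reader API
  const chunks = [];
  const reader = webStream.getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return Buffer.concat(chunks.map((c) => Buffer.from(c)));
}
```

Since the handler only ever touches the Web stream API, the same code path runs on a self-hosted Node box and on Workers—no per-target branching in the route itself.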
Arguably, we’ll see native Web Stream support become the absolute default in Fastify core by early 2027, which should make these translation layers obsolete. Until then, you still need to know where your code is actually running. But for standard JSON APIs? I’m probably never writing a custom edge adapter again.
Common questions
Can Vike Photon really deploy a Fastify app to Cloudflare Workers without code changes?
For standard JSON APIs, yes. Photon intercepts Node HTTP requests and maps them to Web Fetch APIs automatically, so a basic Fastify inventory endpoint deployed to Vercel and Cloudflare Pages without rewriting routing logic. However, handlers that rely on Node-specific streams like request.raw still break on the edge and need to be rewritten using the standard ReadableStream API for true portability.
Why does Fastify request.raw throw a 500 error on Cloudflare Workers with Photon?
Photon’s edge adapter chokes on request.raw because it doesn’t map perfectly to Web Streams yet. A heavy file-upload route piping data directly via request.raw ran fine on a self-hosted Node instance but threw a silent 500 error on Cloudflare Workers. To fix it, the route has to be rewritten using the standard ReadableStream API, a detail the Photon documentation leaves out.
How much faster are CI/CD builds with Vike Photon compared to custom edge adapters?
In a real-world test on an M3 Mac running Node 22.1.0, CI/CD pipeline build times dropped from 4 minutes 12 seconds to just 38 seconds after switching to Photon. The speedup came from no longer compiling massive custom adapter chains, since Photon handles the Node-to-edge HTTP translation automatically instead of requiring shim libraries bundled into every build.
When will Fastify support Web Streams natively so edge adapters aren’t needed?
Native Web Stream support is expected to become the default in Fastify core by early 2027, which should make translation layers like Photon’s edge adapter obsolete for stream-heavy routes. Until then, developers still need to know where their code actually runs, since Node-specific APIs like request.raw won’t map cleanly to the Web Fetch and ReadableStream APIs that edge runtimes such as Cloudflare Workers require.
