
I spent three days last November trying to get a standard Fastify API running on Cloudflare Workers. I eventually gave up and just threw it on a standard DigitalOcean droplet. You write fast code, and then spend twice as long fighting edge adapters and missing Node APIs. It gets old quickly.

Or at least, that was the situation until recently. Vike just dropped Photon, a deployment layer that promises to take your Express or Fastify server and run it anywhere—Vercel, Cloudflare, self-hosted—without rewriting your routing logic. I was highly skeptical. I’ve heard the “zero config” pitch before, and it usually ends with me digging through obscure GitHub issues at 2 AM.


I decided to test it on my M3 Mac running Node 22.1.0. I grabbed an existing Fastify project—a simple inventory tracker I built for a client—and ripped out my messy custom deployment scripts.

Let’s look at the basic API setup I was working with. Nothing crazy, just standard asynchronous Fastify functions.

import Fastify from 'fastify';

const app = Fastify({ logger: true });

// Standard async API function
app.get('/api/inventory', async function (request, reply) {
  // Simulating a database call
  const items = [
    { id: 1, name: 'Mechanical Keyboard', stock: 42 },
    { id: 2, name: 'USB-C Hub', stock: 105 }
  ];

  return reply.send({ success: true, data: items });
});

export default app;

Normally, moving that from a Node environment to an edge worker requires a pile of shim libraries. Photon intercepts the standard Node HTTP requests and maps them to Web Fetch APIs automatically. I pushed the repo and Vercel picked it up perfectly. Cloudflare Pages worked just as cleanly. My CI/CD build times dropped from 4m 12s to 38 seconds because I was no longer compiling a chain of custom adapters.
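To get a feel for what that interception involves, here is a conceptual sketch of the kind of translation an adapter has to do: take the Web Fetch `Request` an edge runtime hands you and reshape it into the method/url/headers form a Node server expects. To be clear, this is my own illustration of the idea, not Photon’s actual implementation or API.

```javascript
// Conceptual sketch: translate a Web Fetch Request into the shape a
// Node-style handler expects. Illustrative only — not Photon internals.

function fetchRequestToNodeShape(request) {
  const url = new URL(request.url);
  return {
    method: request.method,
    // Node handlers see a path + query string, not an absolute URL
    url: url.pathname + url.search,
    // Fetch Headers is an iterable; flatten it into a plain object
    headers: Object.fromEntries(request.headers.entries()),
  };
}

// Simulate what an edge runtime would hand us (Request is global in Node 18+)
const req = new Request('https://example.com/api/inventory?limit=10', {
  method: 'GET',
  headers: { accept: 'application/json' },
});

const nodeShape = fetchRequestToNodeShape(req);
console.log(nodeShape.method); // 'GET'
console.log(nodeShape.url);    // '/api/inventory?limit=10'
```

The response side runs the same trick in reverse, wrapping whatever the Node handler writes into a Fetch `Response`.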

To make sure the endpoints weren’t doing anything weird with CORS or headers in the new environments, I wired up a quick vanilla JS frontend using standard DOM manipulation.

// Quick frontend test to verify the edge deployment
async function fetchInventory() {
  const container = document.getElementById('inventory-list');
  container.innerHTML = 'Loading...';

  try {
    const response = await fetch('/api/inventory');
    const result = await response.json();

    if (result.success) {
      container.innerHTML = '';
      result.data.forEach(item => {
        const li = document.createElement('li');
        li.textContent = `${item.name} (${item.stock} in stock)`;
        container.appendChild(li);
      });
    }
  } catch (error) {
    container.innerHTML = 'Failed to load data.';
    console.error(error);
  }
}

document.getElementById('refresh-btn').addEventListener('click', fetchInventory);

Here is where my initial skepticism was validated. It’s not entirely magic.

If you rely heavily on Node-specific streams, you’re probably going to hit a wall. I benchmarked a heavy file-upload route that uses request.raw to pipe data directly. On a self-hosted Node instance, it flew. But on Cloudflare Workers via Photon, it threw a silent 500 error. The edge adapter chokes because request.raw doesn’t map perfectly to Web Streams yet. You have to rewrite those specific handlers to use the standard ReadableStream API if you want true portability. The documentation conveniently leaves this detail out.
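Here’s the shape of the rewrite that made my upload route portable, stripped down to just the stream bridging. The handler context is mine and I’m simulating the incoming body with `Readable.from`, but the core move is real: Node ships `Readable.toWeb` (Node 17+), so you can convert `request.raw` into a Web `ReadableStream` and consume that everywhere.

```javascript
// Bridge a Node stream to a Web ReadableStream, then consume it with the
// standard reader API — the part that works on both Node and edge runtimes.
import { Readable } from 'node:stream';

async function readAllFromWebStream(webStream) {
  const chunks = [];
  const reader = webStream.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
  }
  return Buffer.concat(chunks).toString('utf8');
}

// On Node, you'd wrap request.raw here; on an edge runtime the body is
// already a Web stream and the conversion step disappears.
const nodeStream = Readable.from([Buffer.from('chunk-1 '), Buffer.from('chunk-2')]);
const webStream = Readable.toWeb(nodeStream);
```

The nice part is that the consuming code (`readAllFromWebStream` here, or a real pipe to storage) no longer cares which runtime produced the stream.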

Arguably, we’ll see native Web Stream support become the absolute default in Fastify core by early 2027, which should make these translation layers obsolete. Until then, you still need to know where your code is actually running. But for standard JSON APIs? I’m probably never writing a custom edge adapter again.
