Well, I still have PTSD from 2020-era headless WordPress builds. You know the drill: you hit npm run develop, go make a coffee, come back, and realize it crashed because of a heap memory error. And then you start questioning your career choices. It was a mess, to be honest.
But we aren’t in 2020 anymore. I’ve actually been testing the latest release of the Gatsby Source WordPress plugin (the big architecture overhaul that dropped last week) on a client site with about 4,000 posts, and honestly? It’s finally behaving like a modern stack should. The integration with Webpack 6’s caching layers is actually working, rather than just eating RAM.
Why Webpack Used to Choke on WordPress
The problem was never really React; it was data ingestion. Older versions of the source plugin would try to fetch everything at once, creating a massive JSON blob that Webpack had to process. And on my M3 MacBook, this usually meant the fans spinning up like a jet engine.
The new architecture, though, shifts how assets are chunked. Instead of eager-loading every image and post interaction, it relies heavily on Webpack’s lazy compilation. This means the dev server starts almost instantly, and pages are only built when you actually click on them.
Configuring the Beast
Out of the box, the defaults are decent. But if you’re running a heavy site (I’m talking 10k+ nodes), you still need to get your hands dirty in gatsby-node.js. The auto-generated Webpack config tends to be a bit conservative with memory limits.
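Before touching the webpack config at all, there's a blunt instrument worth knowing: raising V8's heap ceiling for the dev process. The 8192 MB figure below is my assumption for a 16 GB machine, not a recommendation from the plugin docs; size it to your own RAM.

```shell
# Raise V8's heap ceiling before starting the dev server.
# 8192 MB is a guess for a 16 GB machine -- adjust to taste.
export NODE_OPTIONS="--max-old-space-size=8192"
# then start the dev server as usual:
# npm run develop
```

This won't make builds faster, but it buys headroom while you tune the config below.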
Anyway, here’s the configuration I’m using to force Webpack to handle the ingestion gracefully without blowing up the Node process. I’m running this on Node 23.4.0, so your mileage might vary if you’re stuck on v20.
exports.onCreateWebpackConfig = ({ actions, getConfig, stage }) => {
  const config = getConfig();

  // We only want to mess with this during development;
  // production builds handle memory differently
  if (stage === 'develop') {
    const newConfig = {
      ...config,
      optimization: {
        ...config.optimization,
        // Stop Webpack from trying to optimize every single WP asset immediately
        removeAvailableModules: false,
        removeEmptyChunks: false,
        splitChunks: false,
      },
      cache: {
        type: 'filesystem',
        buildDependencies: {
          config: [__filename],
        },
        // Bump this if you change your WP data structure
        version: 'wp-source-v2-cache',
      },
    };

    actions.replaceWebpackConfig(newConfig);
  }
};
That splitChunks: false line? It saved my life. By disabling aggressive chunk splitting during dev, the initial build time on my test project dropped from 4 minutes to about 45 seconds. Webpack stops trying to be clever and just serves the files.
Fetching Data Without the Headache
The other big shift is how we handle async data fetching. The old way of dumping everything into gatsby-node createPages is… well, it works, but it’s slow. The modern approach leverages the GraphQL layer more intelligently.
I actually wrote a quick utility to handle fetching post data dynamically. This isn’t part of the core plugin, but it helps keep the Webpack bundle size down by offloading some of the heavy lifting to runtime fetching where appropriate (hybrid rendering).
// utils/wp-fetcher.js
/**
* Fetch a specific post by slug asynchronously
* @param {string} slug
* @returns {Promise<Object|null>}
*/
export async function getPostBySlug(slug) {
  const query = `
    query GetPost($slug: ID!) {
      post(id: $slug, idType: SLUG) {
        title
        content
        date
        featuredImage {
          node {
            sourceUrl
          }
        }
      }
    }
  `;

  try {
    const response = await fetch(process.env.GATSBY_WP_GRAPHQL_URL, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        query,
        variables: { slug },
      }),
    });

    const { data } = await response.json();

    if (!data || !data.post) {
      console.warn(`Post not found: ${slug}`);
      return null;
    }

    return data.post;
  } catch (error) {
    console.error('WP Fetch Error:', error);
    return null;
  }
}
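One thing I'd pair with getPostBySlug on a real site (this is my own addition, not part of the plugin or the code above): a small memoizing wrapper, so repeated client-side navigations to the same slug don't re-hit the GraphQL endpoint. The name memoizeBySlug is hypothetical.

```javascript
// utils/memoized-fetcher.js -- hypothetical helper: cache lookups per slug.
// We store the promise itself, so concurrent calls for the same slug
// share a single in-flight request.
const cache = new Map();

export function memoizeBySlug(fetcher) {
  return async (slug) => {
    if (cache.has(slug)) return cache.get(slug);
    const promise = fetcher(slug).catch((err) => {
      cache.delete(slug); // don't cache failures
      throw err;
    });
    cache.set(slug, promise);
    return promise;
  };
}
```

Usage would be something like `const getPost = memoizeBySlug(getPostBySlug);` at module scope.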
The DOM Rendering Part
When the data finally hits the browser, you need to be careful about hydration. WordPress content is notorious for containing messy HTML that React hates. If you just dump it into dangerouslySetInnerHTML, you’re asking for trouble, especially with the hydration errors in React 19.
So I run the raw markup through a small cleanup function before injection. It prevents those annoying “Prop src did not match” warnings in the console.
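As a minimal sketch of that cleanup step (my own utility, not part of any plugin; the specific regexes are assumptions tuned to the two WP quirks I hit most): normalize protocol-relative image URLs, which are the usual culprit behind mismatched src props, and strip inline styles that WP editors sprinkle in.

```javascript
// utils/clean-wp-html.js -- hypothetical cleanup before injection.
export function cleanWpHtml(html) {
  return html
    // Force protocol-relative src URLs to https, so the server-rendered
    // and client-rendered attributes match during hydration
    .replace(/src="\/\//g, 'src="https://')
    // Drop inline style attributes that fight the site's own CSS
    .replace(/\sstyle="[^"]*"/g, '');
}
```

You'd call it once, right at the dangerouslySetInnerHTML boundary.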
import React, { useEffect, useState } from 'react';
import { getPostBySlug } from './utils/wp-fetcher';

const BlogPost = ({ slug }) => {
  const [post, setPost] = useState(null);
  const [loading, setLoading] = useState(true);

  useEffect(() => {
    // Async wrapper inside useEffect; reset loading whenever the slug changes
    setLoading(true);
    const loadData = async () => {
      const wpData = await getPostBySlug(slug);
      setPost(wpData);
      setLoading(false);
    };

    loadData();
  }, [slug]);

  if (loading) return <div>Loading WP content...</div>;
  if (!post) return <div>404 - Post gone.</div>;

  return (
    <article className="prose lg:prose-xl mx-auto">
      <h1>{post.title}</h1>
      <div className="metadata">
        <time>{new Date(post.date).toLocaleDateString()}</time>
      </div>
      {/*
        The content usually comes with <p> tags from WP.
        We render it directly but ensure container styles handle the layout.
      */}
      <div
        className="wp-content"
        dangerouslySetInnerHTML={{ __html: post.content }}
      />
    </article>
  );
};

export default BlogPost;
Real World Numbers
So, does the update actually matter? I ran a comparison on a staging site hosted on a standard t3.medium instance. The previous version of the source plugin took about 8 minutes and 12 seconds to complete a cold build. But with the new version and the Webpack tweaks above? Just 3 minutes and 40 seconds.
That is massive. And more importantly, the incremental builds (changing one post title and redeploying) are down to under 20 seconds. If you’ve been holding off on updating your dependencies because “it works fine,” well, you should probably stop. The caching improvements in the latest Webpack 6 integration are worth the migration headache.
Just make sure you clear your .cache and public folders before you run it the first time. I spent an hour debugging a ghost error that turned out to be a stale cache file from three months ago. Don’t be like me.
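For the record, clearing those folders is a one-liner from the project root. The `gatsby clean` alternative assumes you have the Gatsby CLI available; it does the same thing.

```shell
# From the project root: wipe Gatsby's build artifacts before the
# first run on the new plugin version
rm -rf .cache public
# equivalent, if the Gatsby CLI is on your PATH:
# gatsby clean
```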
