-
Have you tried adding even more memory? Have you tried running it outside of a Docker container? Can you try the canary release with Docusaurus Faster, which uses Rspack instead of Webpack?
-
Upgrading to version 3.6 and turning on Rspack in the experimental options did the trick. I am now nowhere close to my former limit of 40 GB (I've tested by giving the build only 16 GB, with no issues). Very much appreciate this optimization from the Docusaurus team!
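For anyone landing here later, the experimental option mentioned above is the Docusaurus Faster flag introduced in v3.6. A minimal sketch, assuming the `future.experimental_faster` option name from the 3.6 release (option names may evolve in later versions):

```javascript
// docusaurus.config.js — sketch of enabling Docusaurus Faster (v3.6+).
// Setting `experimental_faster: true` turns on the Faster optimizations,
// including the Rspack bundler that replaces Webpack during builds.
module.exports = {
  // ...your existing site config (title, url, presets, etc.)...
  future: {
    experimental_faster: true,
  },
};
```

Since this is a site config fragment, it only takes effect on the next `docusaurus build`.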
-
I'm managing a large doc site built with Docusaurus, and I've been encountering out-of-memory (OOM) errors during the build process, specifically during the Webpack client and server compilation phases. The build runs in a CI environment (a 2xlarge+ container with ample memory), and while the markdown-to-HTML conversion appears to complete, the error occurs during the final Webpack compilation after roughly 40 minutes.
What I’ve Tried:
I’ve reviewed the markdown-to-HTML generation logs, and they indicate successful rendering with no significant memory issues.
The error seems to arise after the log: [info] [webpackbar] Compiling Client, which suggests the issue may be related to Webpack's memory usage.
I’ve also tried increasing the Node.js memory limit (max_old_space_size), but I still hit an OOM even when raising it to the full size of the CI container.
I would appreciate any guidance or suggestions on how to optimize the Webpack client/server compilation process for large documentation sites.
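For context, the Node heap limit mentioned above is typically raised via an environment variable before the build; a minimal sketch (the 8192 MB value is just an example, not a recommendation for this site):

```shell
# Raise the V8 old-space heap limit (value in megabytes) for the build.
# 8192 MB = 8 GB here is illustrative; size it below your container's RAM.
export NODE_OPTIONS="--max-old-space-size=8192"
npm run build   # or: yarn build / npx docusaurus build
```

Note that the limit should stay somewhat below the container's total memory, since the build also needs memory outside the V8 heap.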
Logs:
Webpack config in docusaurus config: