
[performance] Achieving a 100% mobile Lighthouse score seems impossible (using v5 and Next.js) #32103

Open
janus-reith opened this issue Apr 2, 2022 · 2 comments


janus-reith commented Apr 2, 2022

Summary 💡

Preface: To avoid too much variance, and since local Lighthouse tests usually perform better in my experience, I'm focusing on tests done with https://pagespeed.web.dev. Since even tests there don't always yield consistent results, I'll be using the best-of-five score for each test. All deployments are done on Vercel.

While trying to improve the Lighthouse score of my Next.js app, I ran into some hard limits on what scores seem to be possible.
After a lot of improvements, the only parts left were React, Next.js itself, and MUI (v5 using Emotion).
Even on an empty test page within my app, the best I was able to reach was 98 (with high variance compared to other tests).
I assume that the performance impact is caused by the MUI setup in _app.js (plus _document.js), since the actual page doesn't import any components.
With another simple test page that actually incorporated some MUI components, the score already dropped to 93. (Again with high variance; it sometimes dropped to 73, and I still wonder what's causing this.)
While I know that the React hydration process imposes some performance hit of its own, and so does Emotion, I was wondering whether that would be enough of an impact to actually pull the score down from 100.
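For reference, the setup in question looks roughly like this, modeled on the official Material UI Next.js example (the `theme` and `createEmotionCache` modules are assumed to exist as in that example; your exact files may differ):

```javascript
// pages/_app.js — sketch of the Emotion/MUI wiring that is included in the
// scores above, modeled on the official Material UI Next.js example.
import * as React from 'react';
import { CacheProvider } from '@emotion/react';
import { ThemeProvider } from '@mui/material/styles';
import CssBaseline from '@mui/material/CssBaseline';
import theme from '../src/theme';
import createEmotionCache from '../src/createEmotionCache';

// Client-side Emotion cache, shared for the whole browser session.
const clientSideEmotionCache = createEmotionCache();

export default function MyApp(props) {
  const { Component, emotionCache = clientSideEmotionCache, pageProps } = props;
  return (
    <CacheProvider value={emotionCache}>
      <ThemeProvider theme={theme}>
        {/* CssBaseline injects MUI's global reset styles at runtime. */}
        <CssBaseline />
        <Component {...pageProps} />
      </ThemeProvider>
    </CacheProvider>
  );
}
```

This runs on every page, even ones that import no MUI components, which is why an otherwise empty page still carries some of its cost.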

So I took the Next.js base example (essentially just React, Next.js, and a CSS Module), deployed it, and got a perfect 100 mobile Lighthouse score.

Next, I took the Material UI Next.js example with the Emotion setup, and the best score it gets is 98. Again, the variance is higher than with the non-MUI tests; the score sometimes dropped to 93.

Now, the Material UI example doesn't have the same page content as the Next.js starter, so it's not an exact comparison (although it actually has less content, since there's no image).

So I took the default example page and its small CSS Module from the Next.js example and recreated it as a separate page in the Material UI example.
Now the page content should be the same; the only difference affecting the result for that page would be the MUI setup in _document.js and _app.js, since the page itself doesn't even import any MUI packages.
The best score I was able to achieve was 96.
Next, to remove any external dependencies, I also removed the Roboto font import from _document.js to get closer to the default Next.js example.
Now I was actually able to reach 100, although the more frequent result was 99.
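Concretely, that step was just dropping the render-blocking font stylesheet from the document head (excerpt only; the href matches the one the MUI example uses, and the real example's _document.js also contains Emotion SSR logic that stays untouched):

```javascript
// pages/_document.js (excerpt) — the Roboto stylesheet that was removed for
// this test. Everything else in the file is left as in the MUI example.
<Head>
  {/* Removed: the external, render-blocking font import.
  <link
    rel="stylesheet"
    href="https://fonts.googleapis.com/css?family=Roboto:300,400,500,700&display=swap"
  />
  */}
</Head>
```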

The main page, however, which actually uses a few MUI components, did not improve from the font removal and still only reached a maximum of 98. Disabling the automatic prefetching that Next.js does by default for the /about link on that example actually helped raise the score to 99.
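Disabling that prefetch is a one-prop change on the link (sketch, assuming the Next.js 12-era `<Link>` with an `<a>` child as in the example at the time):

```javascript
// pages/index.js (excerpt) — opting the /about link out of Next.js's
// automatic in-viewport prefetching via the prefetch prop of next/link.
import Link from 'next/link';

export default function Home() {
  return (
    <Link href="/about" prefetch={false}>
      <a>Go to the about page</a>
    </Link>
  );
}
```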

So my takeaways/things I wonder about:

  • Using MUI somehow seems to cause greater variance in the performance test results compared to the Next.js example that doesn't use it. Sometimes, the tests were off by more than 10 points.

  • A mobile score of 100 with just the MUI/Emotion setup in _document.js/_app.js at least seems to be possible, although there is some small impact: the more common lower result I got was 97, while for the plain Next.js example without MUI the lower score was 98. (I guess that seems fair, given that MUI comes with some default styling.)

  • Importing actual MUI components seems to degrade that score quickly. I believe this is not a "static" penalty for using MUI, but one that grows depending both on how many components are used (which makes sense, since things seem to tree-shake properly) and on how many times each component is used.
    I wonder what a reasonable expectation might be here: whether 100 is simply utopian given the flexibility the components offer, or whether it's mainly the Emotion style calculation causing the performance hit rather than the remaining component logic. In the latter case, the current move to unstyled components might actually help solve this, if paired with some zero-runtime styling solution.
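On the tree-shaking point: with MUI v5, both import styles below should produce the same production bundle, so the per-component cost is runtime styling rather than dead code (sketch for illustration):

```javascript
// Path import: only the Button module is resolved, which also keeps
// dev-mode module loading lighter.
import Button from '@mui/material/Button';

// Equivalent after tree shaking in a production build:
// import { Button } from '@mui/material';

export default Button;
```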

To put things into the right perspective: a score of, say, 96 would be great too, and I'm not obsessing over the 100. My main concern here is the disproportionate drop when more components are used in a realistic use case.

Examples 🌈

Lighthouse-based test in PageSpeed Insights: https://pagespeed.web.dev/

Next.js starter: https://next-app-starter-my9ttgtj0-janusreith.vercel.app/
MUI Next.js starter: https://nextjs-material-ui-example-q7qtq7ocl-janusreith.vercel.app/
Next.js example page in the MUI Next.js starter: https://nextjs-material-ui-example-q7qtq7ocl-janusreith.vercel.app/compareToVanilla

Motivation 🔦

While optimizing the performance of your app, there are a lot of details to consider, and the path to improvement is not always straightforward; it involves some trial and error.
It would be good if the foundation your app is built upon provided performant defaults and didn't impose an insurmountable bottleneck.

I would've wanted to open a discussion instead, but they seem to be restricted.

@janus-reith janus-reith added the status: waiting for maintainer These issues haven't been looked at yet by a maintainer label Apr 2, 2022
@danilo-leal danilo-leal changed the title Achieving a 100% Mobile Lighthouse Score seems impossible (Using MUI v5 and Nextjs) [performance] Achieving a 100% mobile Lighthouse score seems impossible (using v5 and Next.js) Apr 4, 2022
mnajdova (Member) commented Apr 4, 2022

Using MUI somehow seems to cause greater variance in the performance test results compared to the Next.js example that doesn't use it. Sometimes, the tests were off by more than 10 points.

I am wondering whether it would be more realistic to compare by using a different UI library. I feel like the comparison is not really fair.

A mobile score of 100 with just the MUI/Emotion setup in _document.js/_app.js at least seems to be possible, although there is some small impact: the more common lower result I got was 97, while for the plain Next.js example without MUI the lower score was 98. (I guess that seems fair, given that MUI comes with some default styling.)

It would probably be worth adding an example using Next.js with Emotion only (without Material UI).
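A minimal version of that control could be a page styled with Emotion alone (hypothetical page name and styles, no MUI imports anywhere):

```javascript
// pages/emotion-only.js — hypothetical control page: Next.js + Emotion, no MUI.
// Comparing its score against the MUI page would isolate Emotion's runtime cost.
import styled from '@emotion/styled';

const Container = styled.main`
  max-width: 800px;
  margin: 0 auto;
  padding: 4rem 0;
  text-align: center;
`;

export default function EmotionOnly() {
  return (
    <Container>
      <h1>Next.js with Emotion only, no Material UI</h1>
    </Container>
  );
}
```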

Importing actual MUI components seems to degrade that score quickly. I believe this is not a "static" penalty for using MUI, but one that grows depending both on how many components are used (which makes sense, since things seem to tree-shake properly) and on how many times each component is used.

It would be great if this could be justified somehow (with an example, maybe?).

I wonder what a reasonable expectation might be here: whether 100 is simply utopian given the flexibility the components offer, or whether it's mainly the Emotion style calculation causing the performance hit rather than the remaining component logic.

As mentioned above, a project using only Next.js and Emotion would probably answer this.

In the latter case, the current move to unstyled components might actually help solve this, if paired with some zero-runtime styling solution.

This is basically the goal. In the end, it comes down to whether you want a packaged library to customize and use, or to build your own framework (using whatever you think is best for your specific use case, for example CSS Modules vs. Emotion).


It's great that you've started the discussion; I am curious whether other teams have already done some benchmarking like this.

@mnajdova mnajdova added discussion and removed status: waiting for maintainer These issues haven't been looked at yet by a maintainer labels Apr 4, 2022
janus-reith (Author) commented

I am wondering whether it would be more realistic to compare by using a different UI library. I feel like the comparison is not really fair.

Good point, and agreed; I also wouldn't assume that this is specific to MUI. I will see if I can come up with some good comparisons.

It would probably be worth adding an example using Next.js with Emotion only (without Material UI).

and

Would be great if this can be justified somehow (with an examples maybe?)

These are good ideas; I will perform some testing and try to give some more precise numbers.
