The number one rule of web performance best practice is "Minimize HTTP Requests". Yeah, it matters more than anything else.
So, I took the biggest offender from the collected sites' Number of Domains ranking and ran it through WebPageTest. The result is almost comical.
It's amazing! I'm not sure I could achieve numbers that bad even if I tried. It's reminiscent of those sickeningly toolbar-infested Internet Explorer installs people joke about.
Some numbers to blow your mind (on a DSL connection from San Jose, CA, USA)...
- Time to fully loaded: 85 seconds
- Bytes in: 9,851 KB
- Total requests: 1,390 (!!)
- SpeedIndex: 10028 (my own site's 950)
What's mindblowing about how bad this site is: even on the repeat view, where the browser cache is supposed to help a lot, it takes 30.6 seconds and still makes 347 requests. Just wow!
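Numbers like "total requests" and "bytes in" above come straight out of WebPageTest, which can export every run as a HAR (HTTP Archive) file. As a rough sketch, here's how you could tally those metrics yourself from such an export; the structure (`log.entries`, `request.url`, `response.bodySize`) follows the HAR 1.2 format, but the sample entries and values below are made up for illustration.

```python
from urllib.parse import urlparse

# Minimal stand-in for a WebPageTest HAR export. The shape follows the
# HAR 1.2 format; the URLs and sizes here are invented for illustration.
# With a real export you would json.load() the .har file instead.
har = {
    "log": {
        "entries": [
            {"request": {"url": "https://example.com/"},
             "response": {"bodySize": 5120}},
            {"request": {"url": "https://cdn.example.com/app.js"},
             "response": {"bodySize": 204800}},
            {"request": {"url": "https://ads.example.net/pixel.gif"},
             "response": {"bodySize": 43}},
        ]
    }
}

entries = har["log"]["entries"]
total_requests = len(entries)
total_kb = sum(e["response"]["bodySize"] for e in entries) / 1024
# Count distinct hostnames, i.e. the Number of Domains metric
domains = {urlparse(e["request"]["url"]).hostname for e in entries}

print(f"{total_requests} requests, {total_kb:.1f} KB, {len(domains)} domains")
# → 3 requests, 205.0 KB, 3 domains
```

Point the same few lines at the 1,390-request monster above and the counts speak for themselves.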
- News sites suck in terms of performance
08 August 2015