How Facebook Tweaked Browsers to Reduce Server Requests by 60 Percent – InApps 2022
Facebook has posted details on its work with Google Chrome and Firefox browsers to significantly improve web page loading times. Thanks to this joint effort, the companies have seen, according to Facebook, the following results:
- New features in both Chrome and Firefox that make their caches significantly more efficient.
- Facebook was able to reduce static resource requests made to its servers by 60 percent, freeing server CPU capacity to serve the rest of the page.
- An improvement to reload button behavior: Chrome, in particular, went from 63 percent of requests being conditional to just 24 percent.
Today, the Facebook Code site published a blog post detailing the joint work it has done with Google and Mozilla to significantly improve their browsers: modifying outdated, long-standing web behavior, reducing server traffic and boosting overall efficiency.
The collaboration resulted in new features that enable Facebook to reduce static resource requests to its servers by 60 percent, and that make Chrome's and Firefox's browser caches drastically more efficient. Chrome alone cut its share of conditional reload requests by 39 percentage points.
Faster load times and reduced server traffic, great stuff! But what does this mean in real life for us web devs working down in the trenches? Besides being able to see Facebook posts of funny cat videos that much faster, of course.
The most interesting outcome from this particular collaboration between three Internet behemoths is not so much the streamlined code, though the increased efficiencies it produced are indeed significant. Instead, the real value here is how it happened: stepping back to critically evaluate long-standing web behavior.
These are assumptions we all make about how things have to be, because that's how they've always been. They set the parameters we work within, both when developing new web applications and when improving existing ones. In the rush to launch the next project and fix everything that breaks, it's difficult to devote resources to what can feel like looking backward instead of ahead.
Yet this collaboration’s results are an impressive illustration of how stopping to ask a question about something that wasn’t actually broken can yield very worthwhile performance improvements.
Specifically, engineers at Facebook were on-point enough to wonder why 63 percent of Chrome's revalidation requests for static resources returned a response of "304 Not Modified," while Firefox's conditional request rate was only 14 percent (with Safari and Internet Explorer performing at similar levels). Given that static resources are called static precisely because they rarely change, those 304 responses indicated a completely unnecessary revalidate-request-return cycle that wasted resources.
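The revalidation round trip described above can be sketched as follows. This is a minimal, illustrative model of server-side conditional-request handling, not Facebook's or Chrome's code; the resource body and ETag scheme are made up for the example.

```python
# Sketch of a conditional GET: the browser sends If-None-Match with the
# ETag of its cached copy, and the server answers 304 when it still matches.
import hashlib

# A static resource as the server knows it.
RESOURCE_BODY = b"body { color: #333; }"
RESOURCE_ETAG = '"%s"' % hashlib.md5(RESOURCE_BODY).hexdigest()

def handle_get(if_none_match=None):
    """Return (status, body) for a GET, honoring conditional requests."""
    if if_none_match == RESOURCE_ETAG:
        # The cached copy is still valid: no body is re-sent, but a full
        # request/response round trip to the server still happened.
        return 304, b""
    return 200, RESOURCE_BODY

# First visit: nothing cached, full download.
status, body = handle_get()
assert (status, body) == (200, RESOURCE_BODY)

# Revisit: the browser revalidates and gets a 304 -- for resources that
# never change, this round trip is the wasted traffic Facebook measured.
status, body = handle_get(if_none_match=RESOURCE_ETAG)
assert (status, body) == (304, b"")
```

Note that a 304 saves bandwidth (no body) but not the request itself, which is why 60 percent of Facebook's static-resource requests were still hitting its servers.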
So the immediate question was “Why, Chrome, why???”
The Facebook engineers tracked the mystery down to code within Chrome that revalidated cached resources whenever a page was loaded from a POST request. That makes sense for mail or enterprise sites where users need the most up-to-date info. However, many sites (cough, Facebook, cough) use POST requests for login, meaning that Chrome ignored its cache and downloaded the whole enchilada every time.
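The decision the paragraph describes can be sketched like this. To be clear, this is a hypothetical illustration of the before/after logic, not Chrome's actual source; the function and parameter names are invented.

```python
# Illustrative-only sketch of the cache decision: treating any
# POST-initiated navigation as "revalidate everything" punishes sites
# that merely use POST for login.

def should_revalidate(navigation_via_post, resource_is_static,
                      fixed_behavior=True):
    """Decide whether to skip the cache and revalidate a subresource."""
    if not navigation_via_post:
        return False   # normal navigation: trust the cache
    if fixed_behavior and resource_is_static:
        return False   # after the fix: static assets stay cached
    return True        # old behavior: POST => revalidate everything

# Old behavior: logging in via POST forced revalidation of static assets.
assert should_revalidate(True, True, fixed_behavior=False) is True
# After the fix, the same login no longer busts the static-asset cache.
assert should_revalidate(True, True, fixed_behavior=True) is False
```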
Working together, the Facebook and Chrome teams determined that this behavior was (a) unique to Chrome and (b) unnecessary. With a few non-Earth-shattering changes to its code base, Chrome’s conditional request rate plummeted from 63 percent to 24 percent.
The team didn’t stop there, though. They figured there was still room for improvement in this process, for all browsers, and they were right. Read the Facebook blog (and a related Google blog post) for specific details, but in short, it involved taking a hard look at a long-established feature of web design: the reload button.
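One concrete outcome discussed in the linked posts is the `immutable` Cache-Control extension, which lets a site promise the browser that a versioned static resource will never change, so even a reload need not revalidate it. A hedged sketch of emitting such a header for a fingerprinted asset; the helper function and URL scheme are illustrative, not any particular framework's API.

```python
# Emit cache headers for a content-addressed asset
# (e.g. /static/app.3f9ab2.js, where the hash changes with the content).

def static_asset_headers(max_age=365 * 24 * 3600):
    """Headers for a fingerprinted static asset."""
    return {
        # Long lifetime plus `immutable`: the browser may serve this from
        # cache without any revalidation request, even on reload.
        "Cache-Control": "max-age=%d, immutable" % max_age,
    }

assert static_asset_headers()["Cache-Control"] == "max-age=31536000, immutable"
```

Because the asset's URL changes whenever its content does, the cache can never go stale, which is what makes the no-revalidation promise safe.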
https://youtu.be/FhgcPjM16TE
At the moment, these performance improvements happen for all developers and all websites without any other changes. Which is great. But the ultimate takeaway from this announcement for working web devs everywhere is to take the time to stop and sniff the browser.
It’s very likely that our collective habits and taken-for-granted assumptions about web behavior are hiding some very tasty low-hanging fruit — that is, simple changes resulting in potentially large improvements. This whole story is a reminder that technology moves forward every day, but it’s important to also look at where we came from and see if we are lugging unnecessary baggage with us into the future.
Source: InApps.net