Proof of concept: Automated stripping of 3rd party CSS and JS files to improve performance audit scores

Author : usitvhd
Publish Date : 2021-04-05 17:28:09



vendor-coverage-workflow

This is a sample workflow for generating "coverage" files out of vendor CSS, designed to work with mid-sized teams.
About this project

Google Lighthouse is a tool built into the Chrome browser that reports on various web-page quality metrics. Higher scores can translate directly into project profitability.

One major issue I kept running into on my projects was getting points deducted for any file with more than 20 KB of unused code.

Vendor libraries like Bootstrap and jQuery offer a tremendous leg-up during early development, but when it comes time to deliver the final code, your Lighthouse scores can suffer.

Alternatively, some legacy projects may find that arguing for a ground-up refactor in order to save 500 milliseconds on a page load doesn't go over well with the business types. (Shocking, I know.)

As an exercise in "can this even be done?" I thought I'd take a kick at the can. Can we design a relatively painless plug-in for existing and low-budget architectures which allows teams to leverage the power of vendor frameworks while squeezing out those sub-second performance gains?
On usage rights

I imagine this technique is easily transferable to other languages and frameworks, but for the purposes of the sample I'll be working on a default dotnet new mvc project.

There is no proprietary code in this, and I'm writing on the weekend so I can share these results with you. Under the MIT license, you can take this to your Fortune 500 day job and plug it in without fear.

I'm hoping that through community feedback and further insights we can get this to a point where we can be confident in its robustness and start using it in production.

On with the show...
Enter "COVERAGE"

If you look in Chrome developer tools under the "three dots" menu, you'll find a Coverage tab. Look at the bottom half of this image and we'll discuss what it means below.

lighthouse and coverage

See those large RED bars? That's a visualization of the unused portion of the CSS and JavaScript. In theory I can save 97.4% of my bootstrap load and realize massive gains.

Perhaps by automating the extraction of the used CSS into a new file we can do better. Looking around the internet, there are a number of posts on how to do this once, but it hasn't yet been compiled into a pluggable system.
The pseudo logic

    write your website
    launch it locally
    run a coverage report
    discard all the "red bits"
    save the subset and serve it as a static file
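Steps 4 and 5 boil down to a small pure function: given one coverage entry (the full stylesheet text plus the byte ranges the browser reported as used), keep only the used ranges. The entry shape below mirrors what Puppeteer's coverage API returns; the function name is a placeholder of mine.

```javascript
// Discard the "red bits": keep only the used byte ranges of one coverage
// entry. `entry` mirrors the shape Puppeteer's coverage API reports:
//   { text: '<full stylesheet>', ranges: [{ start, end }, ...] }
function discardRedBits(entry) {
  return entry.ranges
    .map((range) => entry.text.slice(range.start, range.end))
    .join('\n');
}
```

The joined result is what gets written out as the static "coverage" file.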

The interesting parts of the CSS code
Ensure you can serve "full fat" files on demand

To generate complete, robust trimmed files, we have to be able to pull the full version of the site on demand.

To accomplish this, we make sure our controllers inherit from a shared base controller with built-in query-string detection for the unoptimized site.

    public abstract class BaseMyWebsiteController : Controller
    {
        public override void OnActionExecuting(ActionExecutingContext context)
        {
            base.OnActionExecuting(context);
            // ?generatecoverage=true forces the un-trimmed vendor files
            Request.Query.TryGetValue("generatecoverage", out StringValues fullcss);
            ViewBag.GenerateCoverage = fullcss.Count > 0 && fullcss[0].ToLower() == "true";
        }
    }

In our _Layout.cshtml we then add a control statement.

    @if(ViewBag.GenerateCoverage){
        <link rel="stylesheet" href="~/lib/bootstrap/dist/css/bootstrap.min.css" />
    } else {
        <link rel="stylesheet" href="~/coverage/css/bootstrap.min.css" asp-append-version="true" />
    }

Next, we need to launch our site, run a coverage analysis in a headless browser (Puppeteer), and compile the trimmed-down CSS into a static file.

Turns out that is relatively easy and safe.

To get a sense of how that works, check out the extract_coverage function in this gulpfile.js.

In pseudo code we:

    launch the site locally using a gulp dotnet wrapper.
    pull the coverage in puppeteer
    process the coverage report
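A minimal sketch of step 2, assuming the site is already running locally (the URL is a placeholder) and wiring in the generatecoverage flag from the base-controller section; the coverage calls themselves are Puppeteer's standard API.

```javascript
// Build the URL that forces the "full fat" vendor files.
function coverageUrl(base) {
  return base + '?generatecoverage=true';
}

// Record which CSS bytes a page actually uses and return the raw coverage
// entries ({ url, text, ranges }) for later processing.
async function pullCssCoverage(base) {
  const puppeteer = require('puppeteer'); // dev dependency, loaded lazily
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.coverage.startCSSCoverage();
  await page.goto(coverageUrl(base), { waitUntil: 'networkidle0' });
  const entries = await page.coverage.stopCSSCoverage();
  await browser.close();
  return entries;
}
```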

Super easy! Yay! How did we do? See below!

css savings

We boosted our performance score from 95 to 98! We got the Bootstrap CSS down to 4.6 KB, a savings of around 145 KB per page load, and our team can still use Bootstrap.

In theory, as Bootstrap classes are added over the project's lifecycle, they will be detected and added dynamically by this vendor coverage system.

Kinda feels like a win doesn't it?

Unfortunately, like the parent who asks "what happened to the other 5%?" when their kid comes home with a 95% on a school paper, all I could see was that JavaScript.

Unacceptable! Can we do better?
The interesting parts of the JS code

Turns out JavaScript is a LOT harder to work with. You can't just rip out the unused parts without introducing syntax errors.

... or can you?! ...

Well, it turns out that if you have two thumbs and are an idiot, you won't let something like terrible decisions stop you from racking up Lighthouse points.
The JS pseudo code

    load your javascript coverage report
    build the inverse report in order to get a definitive list of code blocks that are not used.
    try to remove the block
    if the updated code compiles use the new code as your baseline
    repeat until you've trimmed all you can

In my work, I discovered that the easiest win here is to replace every unused function with an empty one.

You can see this in action within the strip_coverage_js function in the gulpfile.js.

// get the text you're going to remove
try_removing = entry.text.slice(range.start - 1, range.end + 1);
// regex match to detect it as a function
if(/^function\s*\([\s\S]+\}$/.test(try_removing)){
    // replace it with a stub
    this_attempt_js = last_good_js.replace(try_removing, "function(){}");
}

Now it's not enough to just swap out the code; you also have to run a sandboxed JavaScript environment to evaluate the result after each edit. Producing broken JavaScript doesn't do anyone any good at all!

For that we use the js-interpreter library and wrap the evaluation in a try/catch. If we don't throw an exception, we keep going! (When has something like that ever gone wrong? This plan is flawless!)
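The trim-and-verify loop can be sketched end to end. Two assumptions up front: the function name and range format are mine, and a plain `new Function(...)` parse check stands in for the post's js-interpreter sandbox (it catches syntax errors, not runtime breakage).

```javascript
// Trim unused functions from `source`, keeping an edit only if the result
// still parses. Range objects use { start, end } offsets into `source`.
function stubUnusedFunctions(source, unusedRanges) {
  let lastGood = source;
  for (const range of unusedRanges) {
    const candidate = source.slice(range.start, range.end);
    // only attempt the swap when the dead range is a whole function
    if (!/^function\b[\s\S]*\}$/.test(candidate)) continue;
    const attempt = lastGood.replace(candidate, 'function(){}');
    try {
      // `new Function` throws a SyntaxError on malformed code
      new Function(attempt);
      lastGood = attempt; // keep the edit only if it still parses
    } catch (e) {
      // broken output: discard this attempt and move on
    }
  }
  return lastGood;
}
```

Because each stub is shorter than the function it replaces, later ranges are looked up by text rather than by offset, mirroring the entry.text / last_good_js split in the snippet above.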

Anyways, how did we do?

js savings

We racked up another 1% on performance and shaved off roughly another 100 KB!

The JS pass reduced our page load time from 2.1s down to 1.7s! That's down from 2.4s overall when using both techniques combined.

Kinda cool right?

It's at the point where I think it's ready to share.
Future development and outstanding challenges
Challenge: make this run as a watch task

 


The original goal was to make this something that runs in a "watch" task. By monitoring developer files and automatically rebuilding the CSS and JS as functions are added, could we catch the unintended consequences while building our sites?

There are some challenges around getting the signal that the site has restarted out of dotnet-watch and back into gulp so we can execute our extractions. The first round of challenges would have been insurmountable if I hadn't gone into the abandoned npm module and fixed a few bugs that arose in .NET Core 3.1+.

Be warned: the version of dotnet-watch on NPM does not have the edits required to make this work; see dependencies/gulp-dotnet-watch-customized.

There are some things I don't know about how gulp works. How do I bubble an event from within dotnet-watch so that the extractions run once the views have finished recompiling? I feel like I'm really close on that one, but there's one little thing missing. Maybe a .bind on the function I pass to the watcher, or something like that.

Another quality-of-life improvement I'd want in a production setting: I'd prefer it if every execution of the extractor didn't spawn a new browser window. I know that is as simple as creating a launch.json file and passing in the parameter, but after a long stretch on the hard stuff, I didn't have the energy to keep going down that path. It's probably "super easy" though, I swear!

Can we further reduce the JavaScript? Some thoughts on techniques to try:
Method for investigation #1: shared stub function

Instead of replacing each function with the 12-character string function(){}, could we instead insert a single named stub?

Consider the resulting javascript code:

toArray:function(){},
get:function(){},
pushStack:function(){},
each:function(e){return S.each(this,e)},
map:function(){},
slice:function(){},
first:function(){},
last:function(){},
even:function(){},
odd:function(){},
eq:function(){},
end:function(){},

is functionally identical to:

function x(){};
/* [...] */
toArray:x,
get:x,
pushStack:x,
each:function(e){return S.each(this,e)},
map:x,
slice:x,
first:x,
last:x,
even:x,
odd:x,
eq:x,
end:x,

We could effectively save 11 more characters per function. I bet that adds up fast.
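A quick back-of-envelope confirms it (the stub name x and the 11-character figure are from the example above; only the one-time cost of the shared declaration is added here):

```javascript
// Each inline stub costs 12 characters; the named stub `x` costs 1.
const inlineStub = 'function(){}';    // 12 characters
const sharedDecl = 'function x(){};'; // one-time cost: 15 characters
const savedPerFunction = inlineStub.length - 'x'.length; // 11

// Net savings for n stubbed functions, after paying for the declaration.
const netSavings = (n) => savedPerFunction * n - sharedDecl.length;
```

Even jQuery's small core object above has a dozen stubbed members, so the declaration pays for itself almost immediately.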
Method for investigation #2: elim



Category : general
