Browsing the internet without uBlock Origin is bad for your health.
mod_pagespeed does
Raymond Hill is a hero of our times. Not even kidding.
*googles the name*
Raymond Earl Hill was an American tenor saxophonist and singer, best known as a member of Ike Turner’s Kings of Rhythm in the 1950s.
Well, I mean we all have our own ideas about the world.
True, true
Ad blockers don’t protect you against dumbass frontend devs who serve 5mb png files to be stuffed into 600x400 boxes.
No but the good blockers give you the power to disable elements on a case by case basis.
Removing the background image for fandom wiki articles makes them load 3x faster.
I especially hated wallpaper websites that load full-size pictures in their preview grids.
You can set a limit in uBO settings to prevent loading of files/elements above x KB in size. You will just see a red dotted blank space for elements comically large, and they will load if you click them, if needed.
Edit: here is what you need to change
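Roughly, from memory (double-check the exact labels in your uBO version): in the dashboard it's Settings → "Block media elements larger than ___ kB". There's also a per-site "no large media elements" switch in the popup panel, which in the "My rules" pane shows up as something like:
no-large-media: fandom.com true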
Shit, I might use that on a desktop with broadband.
Thanks kind stranger, I didn’t realize this was configurable.
Is 50 kB enough or should I go higher?
Also. Thanks!
Estimating the generally acceptable size for images or GIFs helps here. Images usually cap out under 600-700 KB, while GIFs can go over 1-2 MB. Experiment with the number, or maybe just set it at 1200 KB.
Thank you, this helps a lot
I have mine at 50kb, and most things load, but I do get some images that don’t. Just play with it and increase it until the frequency you need to tap pictures to render them doesn’t piss you off anymore.
Thank you.
I always think it’s unfair to compare things to video games. Video games are so inefficient they had to invent a separate processor with hundreds of cores just to run them. Of course they end up running well.
If cheap phones had a 128-core JavaScript Processing Unit, websites would probably run fast too.
Video game developer here. A lot of anti-optimization sentiment is just excuses and/or part of some dumb trend.
Oh no, compiled languages require you to choose between variable types! Better use Javascript.
Why should we develop a proper portable app environment when we have Electron? It can even run in browsers. Imagine if you didn’t have to go to your pops to install the word processor, and instead he just typed wordprocessor dot app into the browser?
What if code was so easy to understand you didn’t have to document it, and each macro-block of a function were instead a named function, so they’d be automatically documented?
And this is just the tip of the iceberg. I’m currently writing my own scripting VM, as most others have their own limitations and would introduce a barely usable build system to my game engine (which is its own can of worms). Code as data is a very useful feature, but having to include DLL files as scripts would be very complicated due to platform differences, although also very fast. The issue comes when people treat scripting languages as full-fledged programming languages, and even scare beginners away from compiled languages, because you have to compile them, you have to choose a type, etc.
I just want to point out that interpreted languages don’t have to be slow. For example, LuaJIT is competitive with Java in terms of performance, and not that much slower than C. Likewise, PyPy is consistently faster than CPython, and Python 3.13 will have a JIT. I’ve also used numba to improve performance in Python (got close enough to naive Rust to not be worth adding Rust to our pipeline).
If you want scripting languages to be fast, there are options, so the decision should instead be made based on the benefits of each. For example:
- scripting languages - generally better edit/reload experience, write once, run anywhere an interpreter exists
- compiled languages - catch common errors before running, lots of fixes for various platforms
I’m super interested in Rust because it catches way more common errors than most compiled languages, so you’re getting a lot more value for that compile step. My day job is Python + Javascript, though I have nearly 10 years with Go and most of my personal projects use Rust these days, so I feel like I’m fairly experienced here.
just excuses and/or part of some dumb trend
I agree. There are good reasons to prefer scripting languages to compiled languages and vice versa, but most people don’t seem to decide based on those reasons; they decide based on what’s easier to hire for, what they’re familiar with, or what’s already being used.
I’m super excited about Rust gaining traction because it’s basically the best case for a compiled language I’ve seen. Maybe it’ll reverse the trend toward higher-level languages and encourage a bit more provable correctness.
The real irony is that you can make games entirely with Javascript (no backend server needed) and I wouldn’t be surprised if some of those games, even with 3D rendering via three.js or babylonjs, performed better than certain websites
deleted by creator
Imagine that, people don’t mind when they have to wait 10-15 minutes every once in a while for their game to update, but they do mind waiting 15-30 seconds every time they navigate to a new webpage.
Then again, those 100 MB are usually mostly assets I want to look at or listen to. Certain websites contain 100 kB of text and pictures I want to look at and load 2 MB of JavaScript frameworks that add nothing to the usability of the site. Bonus points for automatically streaming a 20 MB video I don’t want to watch while I look for one sentence’s worth of information.
- I have a limited amount of bandwidth on my mobile plan
- if I know a game download is large, I go home where I have broadband; I don’t download large files over mobile internet
- a text-only website can be rather small, only a few KB. It’s only when you get ““Designers”” that things start to bloat, because the system fonts are not good enough and 2 MB of extra fonts no sane visitor will ever notice must be downloaded.
- the marketing department REALLY needs those 10 extra trackers and analytics scripts that take 5 s to load, even if they last looked at the visitor stats back in 2021 and the login has long been forgotten.
- the CEO wanted that animated AI-powered talking gorilla widget he saw at a local trade show, where customers can ask product or website questions (spoiler: they don’t!), which adds a few more megabytes to each page load before you even use it.
separate processor with hundreds of cores
Well, graphics rendering is very suited for parallelism. That’s why GPUs were invented.
Most other tasks are not. Most of the cores in a 128-core JPU would end up being unused. Also why JPU? It’s not like it’s significantly different from a normal CPU task.
I don’t think the person you replied to actually knows what they’re talking about.
You see, stuff like this is why I never understood the wave of “Android Go” and “Lite/Go” apps a couple of years ago.
On my old low end phone, the native Twitter app ran infinitely better than the Web based “Twitter Lite”. This applied to almost every “Lite” app compared to their regular versions.
I feel like whoever started that “Webapps are great for low end” concept never actually tried to run a modern webapp on a slow phone.
The lite apps also take up less user storage, which was a big issue for lower-end phones at that time. Once you ran out of storage, people struggled to install new apps, even with external SD cards, since those weren’t an easy concept for some people to get their heads around.
I only ever used the lite version of FB Messenger. Shit was much better than the full version, especially without all the bloated “features” that at best I didn’t use and at worst were annoying battery drains. It was noticeably snappier on both my old and new phones. Fortunately most of my friends started using Discord and/or Signal, with better features (and one less Meta app to have running).
I think that the idea of having smaller and less demanding versions of lots of apps is a good idea. So many apps are just unoptimized and bloated, coded to rely on higher specs to make up for the lack of effort in cleaning things up. The ads on ads on ads are part of the issue as well, which is only getting worse, with close buttons that don’t load until the ad has been up for however many seconds. It seems the “hit box” for the close buttons keeps getting smaller and smaller to guarantee the ads get clicked and open another app or a browser. Though optimizations and better coding won’t fix dirty underhanded grifts.
I think that the idea of having smaller and less demanding versions of lots of apps is a good idea.
I think that too!
I’m just not sure Webapps are the way to go about this over native, smaller, leaner apps.
Every justification made for webapps is a transparent lie to get access to more user data.
I am so fucking tired of corporations lying with no consequences to the detriment of their customers.
Of course an app that is compiled ahead of time to run natively on the CPU will run faster than a web app that compiles its bloated JavaScript code on the fly.
The web app versions were meant to avoid having to download large apps, not to be faster. They are slow because the companies tried to have feature parity with the native app and also stuffed them with trackers. Web apps are supposed to be barebones.
I have been just bewildered at the proliferation of excessive scripts and garbage on seemingly every webpage over the last decade. I’m no web-dev, but I’m pretty positive that the vast majority of websites could remove 99-some percent of their javascript bs and their websites would function just fine. So many are pretty much unusable these days. It’s atrocious.
I recommend using an ad blocker. It’s not a moral question anymore but pure self-defence; even multiple US security agencies say so.
I’m a web dev and yes they could. It’s annoying that web devs get blamed for it though, the reason for all the javascript is mostly business decisions out of our control.
Mainly the tracking scripts, which the marketing department adds against our will. But also, it’s a lot cheaper to have a client-rendered web app than a traditional website (with client-side rendering you can shut off all your web servers and just keep the API servers; our server-side processing went down 90% in the switchover). And it’s more efficient for the company to have one team working in one programming language and one framework that can run the backend and frontend, so the frontend ends up being a web app even if it’s not really necessary.
A bunch of websites operating as web apps would help explain the bloat. Great idea if somebody is navigating a good chunk of your website. Horrible idea if 99% of your traffic is people being linked to a news article and then leaving afterwards.
Fwiw, I don’t blame the devs. That’s just me saying I’m not an expert. I understand it’s a management/corporate decision.
And thanks for the explanation. That clarifies the changes I’ve been noticing.
I’ve been working at organizing a bunch of stuff I’ve been collecting over the years … data, writing, lists, ideas, whatever … I kept using all sorts of services, apps, websites, cloud services and all sorts of crap to maintain them all but eventually it all becomes too complicated and breaks down.
I’ve since discovered just using simple text files and services that just use plain text Markdown … no special service, nothing proprietary, easily transferable and interoperable.
I started looking at websites the same way … I don’t care what it looks like, I just want to read the information … you made it too hard for me to read your simple text info? You’re asking me to turn off my ad blockers and turn on JavaScript? All to read 200 words on your site? I’ll skip it and move on to the next site that will allow me.
I manage a web dev team. We try to optimise as much as possible but then there’s all sorts of tracking that gets tacked on by personalisation teams, opti teams, things like Tik Tok, Facebook, Twitter/X scripts inserted too… It’s pretty shit. And sometimes when things break it makes it super hard to debug too
REACT EVERYTHING
I made a stupid little page that pulls data from a Pathfinder 2e SRD API, then randomly combines an ancestry, background, and class from those lists and displays it on screen. It’s really nothing special, I hacked it together in an afternoon. But I showed it to a friend and they were blown away that I didn’t use a framework for it. I was like, “it does three things. Why would it need a framework? What would I even use a framework for?”
They still couldn’t believe I did it by hand.
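For anyone curious what “by hand” looks like, a rough vanilla-JS sketch of that kind of page (the endpoint URL, response fields, and element IDs are placeholders, not the actual SRD API):

// Hypothetical sketch: fetch the lists once, then pick one of each at random.
// The URL and the response shape are made up for illustration.
const pick = (list) => list[Math.floor(Math.random() * list.length)];

async function rollCharacter() {
  const res = await fetch("https://example.com/pf2e-srd.json"); // placeholder endpoint
  const { ancestries, backgrounds, classes } = await res.json();
  document.querySelector("#result").textContent =
    `${pick(ancestries)} ${pick(backgrounds)} ${pick(classes)}`;
}

document.querySelector("#roll").addEventListener("click", rollCharacter);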
I’ve chatted with a few experienced web devs, and from what I’ve heard, there’s a whole group of “web programmers” out there who just learn React and other frameworks but don’t actually know how to code anything themselves. So many places won’t even consider you if you don’t know React.
And here I am still thinking jQuery is an excessive amount of page bloat.
This is accurate. I’m a full stack dev, and a huge number of job postings I’ve seen over the past ten years or so have switched to React.
At work we have this timecard management system that’s an enormous pain to use. All the bottom rung employees hate it because it’s anything but intuitive. For example, it has stupid things like weekdays sorted alphabetically and a scroll bar to select the day of the month in a form. It’s like the interfaces were tested exactly one time and never visited again, so long as it works minimally.
What’s this crappy app have to do with big web pages? That application is awful for us worker bees, but management loves it because it produces nice reports. Management is the real customer the product is optimized for. Similarly, many web pages are awful because they’re mainly rated on how they look. Nobody is including how fast it loads in the contract, and at the product demo you can bet those resources are cached in the browser.
Ask yourself: who with the money in hand is actually looking at how fast the page loads on a slow connection or low-end devices?
TLDR: Looks > performance.
Friday Monday Saturday Sunday Thursday Tuesday Wednesday
In its defense, that also flows better if you’re trying to sing it.
At my current dev role I try to do optimizations to make new system area pages pretty lightweight, but it’s a bit of a struggle as I’m working with devs who have been in the same role for decades. WCAG is not prioritized, and they pull in a ton of JS libraries that usually aren’t even used. A lot of the practices I see in use are from 10 years ago, but slowly tidying up the horror show with each dev product meeting.
Admittedly could be much worse though, at least our pages aren’t 21MB large.
WCAG
Ours doesn’t even try at all, because we’re largely a B2B shop and we know our customers (in the low thousands). It’s still dumb, because we could totally hire a QA or developer who has some kind of need where accessibility would be helpful, and we even have a couple of colorblind people on the team, yet we don’t prioritize anything. It’s a little disappointing, but I guess the need hasn’t arisen yet.
We build a very interactive web app with tons of data, and a fresh load is still well under 21MB (looks like ~5MB transferred over the network, ~15MB total). I don’t understand how a typical website could use more than our app when we do lots of complex stuff (2D drawing library, lots of calculations, we’re adding in 3D soon, etc).
You could probably write a browser plug-in / extension to manipulate the DOM to fix these issues.
Also yes, some people look at page load times. Our team does this. But apparently not a lot of people do this, judging by the replies in this thread.
Yeah, I’m guessing a bunch of random workers are going to do that…
A UserScript or UserStyle could fix it up.
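Something along these lines, for example (the @match URL, selector, and option text are all guesses, since I’ve never seen that timecard app):

// ==UserScript==
// @name   Fix weekday sort
// @match  https://timecards.example.com/*
// @grant  none
// ==/UserScript==

// Hypothetical: re-order an alphabetically sorted weekday <select> into calendar order.
const ORDER = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"];
document.querySelectorAll("select.weekday").forEach((sel) => {
  Array.from(sel.options)
    .sort((a, b) => ORDER.indexOf(a.text) - ORDER.indexOf(b.text))
    // add() re-appends each existing option, which moves it to the end in sorted order
    .forEach((opt) => sel.add(opt));
});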
And make some clueless Facebook addict with seventeen toolbars scream “You haCKED OUR WEBSITE?!”
it has stupid things like weekdays sorted alphabetically
Holy shit, that’s stupid. Why would you even do this in the first place?! I can’t comprehend how anyone could come to the conclusion that that’s a good way to sort it.
Alphabetical was probably the default sorting method for an array of data and they didn’t bother to fix it, just my guess.
Then why not use labels?
Value="1Monday" Label="Monday"
That’s even more work than skipping the “weekdays” array wherever the sorting happens
And that’s why I believe uBlock Origin is needed for modern web browsing.
Yup. If websites respected me, I’d respect them back and not need uBlock Origin.
No corporate website respects you in the slightest, they are just greedy for your metrics.
Yup, hence why uBlock Origin stays on.
Agreed.
People seem to forget that in the past you didn’t run antivirus on your computer constantly; you would just occasionally run a scan if suspicious activity happened. At least until malicious web activity became so bad that everyone HAD to run antivirus.
We are approaching that point with ad blockers.
Wet Ass Pussy is clearly the answer.
I have a WAP phone, and to my surprise google loads
I’m delighted. I wonder if they still employ one lone engineer with the title “WAP Architect” :-)
Gemini
Capricorn
You made me look, alas, no dice to a web thingy.
Shh it’s a secret https://geminiprotocol.net/
More reason to use Brave, keep JS disabled by default and just re-enable on a site-by-site basis.
Ignoring the shady practices of Brave Software, this doesn’t really solve the problem. Sites will still use way too much scripting to be flashy, and that will continue to be a problem for everyone, because some of those sites will be needed by some people and will require all scripts to function properly.
What might help more in the long run is complaining to the site owners that their site, despite you having an up-to-date browser, does not work on your phone. Sure, some of those complaints will fall on deaf ears, but even a few sites changing means progress.
My simple home page is 10 KB now. And you might not think that’s such a big deal, but it has more content than Google’s search page and that rings in at a couple MB IIRC. 😁
How do I measure how much data my page loads? Now I’m curious
- Press F12 to open the Debugger.
- Click the “Network” tab
- Press Ctrl+F5 to reload the whole page (including previously cached files)
- Under the list of transferred files, in the greyish bar above the debug console (if enabled…), you see the total number of requests the site made and the total file size that was transferred; lower is better.
Picture: https://superuser.com/a/1718133
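If you’d rather script it, this rough console snippet sums roughly the same numbers via the Resource Timing API (cross-origin resources may report 0 bytes unless the server sends Timing-Allow-Origin):

// Paste into the console after a full (Ctrl+F5) reload; sizes are compressed transfer sizes.
const nav = performance.getEntriesByType("navigation")[0];
const resources = performance.getEntriesByType("resource");
const total = resources.reduce((sum, e) => sum + e.transferSize, nav ? nav.transferSize : 0);
console.log(`${resources.length + 1} requests, ${(total / 1024).toFixed(1)} kB transferred`);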
Thank you for the detailed answer!
In the developer tools in the Network tab. FF sums it up at the bottom of the list when you reload (e.g. for this page “24 requests, 4.74 MB transferred”). Chrome must have something similar. Be sure to check “Disable cache” in the devtools.
Chrome reports the memory a tab uses if you hover over the tab. Or look at the task manager within your browser: click the burger menu, then “More tools” and “Task Manager”.
Oh, I didn’t know I could do this in DevTools! I figured I would need some other tool
If you can’t answer this question you’re doing it wrong. It should be as simple as “how large are the files in my web hosting folder”. All this fucking tech stack bloat is so unnecessary.
On the contrary! I absolutely loathe how bloated webpages have become over the last few decades, so it’s very refreshing and laudable to see a webpage that tries to keep itself as small as possible.
I ended up using a static site generator for my personal site because I fucking hate JS and frameworks and WebComponents. The front page is 646 KB and it loads in 4 seconds. I’d love for it to be 1 second or less, but the fonts are a factor.
And I shrunk the shit out of that background too with pngcrush so miss me with that.
My front page is 613 KB with Wordpress. Moral of the story: you don’t have to use a static site generator to keep things light.
Can I achieve the same with vue.js or flutter? I need to learn this
And how do you plan to manage your posts, database etc. and render stuff in those? You still need some backend solution like Wordpress, you can use vue as a frontend library for it… or vanilla JS, or jQuery…
Ah, for that I’ll just throw in some FastAPI or Flask thing. Vue or Flutter will just handle the front end.
So… you are aware that FastAPI and Flask will always be significantly slower than Wordpress… because Python, always running processes etc.?
You’re building a simple website / blog, just use Wordpress; it will output most of the pages as plain, simple, and fast HTML, then add a few pieces of vanilla JS or Vue (if you’re into that) to make things “fluffier”. Why bother with constant XHR requests when you’re just serving simple text pages?
With Wordpress you’ll also get all the management, roles, permissions, backend for “free” and you can always, like sane people, cache the output of the most visited pages. Wordpress also provides a RESTful API if required.
No I mean, I wanna make a full project but without bloating the front end website
Yes. You can. I have a personal site that is using nuxt static site mode and it renders extremely fast and clean output.
Check out https://250kb.club, all performance-focused sites built for speed and small size.
Or maybe 512kb.club, a more reasonable balance between the 250 KB club and the 1 MB club.
Also worth a look: jankfree.org, for a similar focus on performance.
I have a pixel 6 and notice some lag in scrolling. Could it be that you don’t use srcsets but instead huge screenshots no matter the device screen?
The fonts can be loaded from a separate file that ends up in the cache, lowering load time the next time.
Haven’t done this type of optimizing in a long time, but I had a quick look at the network graph for your front page (F12 dev tools in a desktop browser). My understanding is that you are getting blocked from loading additional resources (fonts + background) until your style sheets are fully read (the pink line is document loaded, I believe).
It may be worthwhile to experiment with adding some preload links to the HTML template (or output), like the ones below, and assessing whether it makes things faster for you.
<link rel="preload" as="image" href="https://volcanolair.co/img/bg1-ultracompressed.webp" fetchpriority="high">
<link rel="preload" as="font" href="https://volcanolair.co/fonts/Inter-Regular.woff2">
<link rel="preload" as="font" href="https://volcanolair.co/fonts/Inter-Bold.woff2">
___
I love all your replies.
You wouldn’t get these responses from stackoverflow.
This isn’t even a programming or development community…it’s a general interest one.
You didn’t even ask for help.
I gotta say I came in here to flex and I learned so much. I am going to roll out some of these changes really soon once I figure out where best to add them to my Hugo template. I’m going to reply to some of them below to clarify some things:
It may be worthwhile to experiment with adding some preload links to the HTML template (or output), like the ones below, and assessing whether it makes things faster for you.
This is the most interesting because I didn’t even know this was possible with HTML5, so I want to add this right away.
I have a pixel 6 and notice some lag in scrolling. Could it be that you don’t use srcsets but instead huge screenshots no matter the device screen?
The background is a large image in the CSS via background-image, I don’t know how easy it would be to change it to a srcset but I will give it a shot
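For what it’s worth, background-image can’t take a srcset directly, but a media query (or image-set(), where supported) gets you most of the way. A rough sketch, with a made-up file name for the smaller crop:

/* Hypothetical: smaller crop by default, full image only on wide screens. */
body {
  background-image: url("/img/bg1-small.webp");
}
@media (min-width: 1024px) {
  body {
    background-image: url("/img/bg1-ultracompressed.webp");
  }
}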
The fonts can be loaded from a separate file that ends up in the cache, lowering load time the next time.
At the very least they need to load last because they are the largest burden
Not that you’d want to because you hate JS and web components and all that, and there’s nothing wrong with your website, but NextJS supports Static Site generation.
So, JS and frameworks and web components can get the job done for simple stuff nowadays. My portfolio page has a load time of 631 ms using the SSG built into NextJS, and it’s really similar to your website.
The entire Material Design framework in JS and Web Components, in 80 kB:
https://clshortfuse.github.io/materialdesignweb/components/buttons.html
JS and Web Components are not the problem. Poor design is.
Loaded pretty much instantaneously on my phone (a second at most). Then again, I block third party fonts.
Honestly, 4 seconds is really slow, especially with static HTML. I built my first company’s site myself; it includes a video on the front page and jQuery, is built with PHP, and on decent internet connections the front page will load in slightly over a second, other pages dip under that.
There are loads of tweaks you can make to -any- site, and total amount of bytes really isn’t the only speed factor here.
, but the fonts are a factor.
I’m not sure if the possibility is there depending on your use case (e.g. if you are exporting the fonts), nor if the cost of doing it would be worth the shot, but you can send minified variants of fonts, too.
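If it helps, a rough sketch of what that can look like in CSS: split the font into subsets and let unicode-range decide which file actually gets downloaded (the file names are assumptions), with font-display: swap so text isn’t blocked while the font loads:

/* Hypothetical subset files; the browser only fetches the ranges a page actually uses. */
@font-face {
  font-family: "Inter";
  src: url("/fonts/Inter-Regular-latin.woff2") format("woff2");
  unicode-range: U+0000-00FF;   /* basic Latin + Latin-1 */
  font-display: swap;           /* show fallback text immediately */
}
@font-face {
  font-family: "Inter";
  src: url("/fonts/Inter-Regular-latin-ext.woff2") format("woff2");
  unicode-range: U+0100-024F;   /* Latin Extended */
  font-display: swap;
}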
Whenever I used to have issues with my internet, I used news.com.au as a test to see if the issue was fixed; if that site loaded, then anything would.
if you watch steve jobs’ 2007 iphone keynote it’s incredibly depressing now. he brags about how the iphone can load full, rich webpages instead of awful mobile versions; he loads the NYT website and gets the whole lush landscape desktop version, and taps to zoom in on certain elements. i used to be such a dork and so into tech in high school, it seemed so promising and wondrous.
i bet jobs could’ve yelled at spez about the API changes and gotten him to relent
“Full rich webpages” on a 2007 iPhone meant bare HTML and a kilobyte of Javascript. Anything fancy would be in Flash because JS was slow as balls, and the iPhone never ran Flash.
the iPhone never ran Flash.
Ruffle has (recently, for me) entered the chat.
Not that this negates the performance concerns, but just that Flash on iPhone is becoming a possibility.
If we’re counting now and into the future, the EU has coerced them to finally tolerate other browsers.
… not that I’m aware of any current browser with Flash support.
Ruffle gives it support, no EU good-intention-poor-implementation regulation required. The demo link I shared above works with any browser, built-in Safari included.
Oh I know, I was just suggesting more-direct support was possible. Genuine (if stupid) coverage for a long-dead plugin.
Maybe someone could coerce Dolphin browser from Android to iOS.
I do have to say, Ruffle is the most boringly-named of the “let’s do Flash in JS” projects. The first big one was named Gordon, in an obvious pun. The follow-up was named Shumway, in a less-obvious pun. About ALF.
Speaking of historical Flash support, I actually forgot the old Puffin Browser, which I bought back in 2011 and which is apparently still around. They run a browser on their server and you get a VNC-like client to access that instance. So by no means native support, but it was super functional, at least back in the day. I haven’t used it for years, since I stopped buying iPads; my use cases are better suited to the Mac and the iPhone instead.
That is dedication I absolutely would not match. I bought Android for software freedom and mmmight have watched some pivotal Homestuck animations on a Droid 2 Global.
Even now, please don’t give Apple money.
i bet jobs could’ve yelled at spez about the API changes and gotten him to relent
Why would Jobs care? Reddit’s app goes through the App Store, and Apple gets a cut of any premium users buy through it.
And why would Spez relent to Jobs? Everything Spez is doing is to get maximum payout from the IPO and then cash out. He doesn’t give a shit about the actual site anymore.
the guy that was convinced apples could cure his cancer?
That’s mostly because of the overwhelming quantity of ads, trackers, plugins, integrations, etc. all websites have now. Using an ad blocker halves your bandwidth usage. If you have a data cap, an ad blocker is a must.
And then there’s optimization. As an Angular developer, and knowing many websites nowadays are Angular or similar, the lack of optimization is a big problem. Most don’t even use lazy loading, not to mention splitting module imports across different components. They import everything into the main component and don’t lazy load anything, leading to websites with 20-40 MB (!!!) of initial load (when you open the website). This is so common that I think junior Angular devs will slowly just kill Angular’s popularity and give it a bad name. It takes work to optimize Angular, and many devs don’t care enough and just rush it. And then there are companies that don’t understand that web frameworks need optimization and just underpay devs or rush the dev time.
Please don’t use Angular (or similarly complex web frameworks like Vue or React) if you don’t know how to correctly optimize it, or don’t have the time or care to. And don’t overload your pages with ads and integrations. You are ruining the web.
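For anyone reading along, lazy loading in Angular is mostly a routing change; a minimal sketch (the module and path names are made up):

// Hypothetical routes: the reports bundle is only downloaded when /reports is visited,
// instead of being imported into the main module and shipped with the initial load.
import { Routes } from '@angular/router';

export const routes: Routes = [
  {
    path: 'reports',
    loadChildren: () =>
      import('./reports/reports.module').then((m) => m.ReportsModule),
  },
];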
npm install everything --force