The average webpage is now the size of the original Doom

giggity

Expert Member
Joined
Feb 19, 2011
Messages
1,024
[)roi(];17842083 said:

If this didn't work, UX design wouldn't exist.

I understand the points you've made, but, applied correctly, analytics can be a great asset to a company and allow it to redesign the site in ways which benefit the user.

The only websites which really see a noticeable negative impact on performance are those which put the script in the head rather than before the closing body tag, and it only really impacts load time. It uses hardly any resources, especially at so small a scale (one user).
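The head-versus-body placement point can be illustrated with a toy timing model (a deliberate simplification of my own; real browsers overlap fetching with parsing, and `async`/`defer` attributes change the picture further):

```python
# Toy model of time-to-first-paint (illustrative assumption, not measured
# data): a synchronous script in <head> blocks the parser until it is
# fetched, while a script just before </body> lets the content render first.
def first_paint_ms(parse_ms, script_fetch_ms, script_in_head):
    """Return the modelled time until the page first renders."""
    if script_in_head:
        return parse_ms + script_fetch_ms  # parser stalls on the fetch
    return parse_ms  # content painted before the script is fetched

print(first_paint_ms(50, 200, True))   # head placement: 250
print(first_paint_ms(50, 200, False))  # body placement: 50
```

Under this model the analytics script costs the user nothing visible when placed at the end of the body, which is the point being made above.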
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
If this didn't work, UX design wouldn't exist.

I understand the points you've made, but, applied correctly, analytics can be a great asset to a company and allow it to redesign the site in ways which benefit the user.

The only websites which really see a noticeable negative impact on performance are those which put the script in the head rather than before the closing body tag, and it only really impacts load time. It uses hardly any resources, especially at so small a scale (one user).
We're going to disagree again; IMO you're wrong to tie UX to this. There is no proven correlation between this rubbish and UX betterment. We don't disagree on the benefit for the website, but we do on the benefit for the user; unless of course you have unequivocal evidence to the contrary, in which case please share.

As to an impact (whether marginal or not), you are utilising resources on someone's computer without prior or express approval, and you can't guarantee that every website which does this will always have a marginal impact. I could argue this is similar in behaviour to a virus.
The fact that it's been done for so long without reprisal doesn't make it acceptable; to the contrary, its very abuse is the reason we have ad and script blockers: the new-age equivalent of antivirus.

Imagine for a moment: I drive into a shopping centre for the purpose of window shopping and possibly to acquire something; whilst I'm browsing, you drive my car around and siphon petrol out of my tank, because in your words you want to improve my UX around your shopping centre.

It's a different situation, but hopefully it gives you a different perspective on this; today's situation is great for the websites and terrible for the users, hence blockers. We need a compromise.
 
Last edited:

giggity

Expert Member
Joined
Feb 19, 2011
Messages
1,024
[)roi(];17865055 said:
There is no proven correlation between [analytics] and UX betterment.

How do you go about determining whether or not something works without having any data on it? You need feedback. Would you rather have that ridiculously annoying "won't you PLEASE do a survey for us?" (which doesn't and never will work, since the data is immediately skewed towards people who are willing to waste their time to do online surveys) popup on every site, or a small script that uses a tiny amount of CPU cycles?

Something doesn't have to benefit only the user to be benefiting the user. If a shop rearranges its items such that it will make more money and allow buyers to find all of the items they want in a shorter time, this improves both the business and the user's experience.

[)roi(];17865055 said:
As to an impact (whether marginal or not), you are utilising resources on someone's computer without prior or express approval, and you can't guarantee that every website which does this will always have a marginal impact. I could argue this is similar in behaviour to a virus.
The fact that it's been done for so long without reprisal doesn't make it acceptable; to the contrary, its very abuse is the reason we have ad and script blockers: the new-age equivalent of antivirus.

So what you're saying is that everybody should be forced to look through the source of a page, with the ability to alter the code and remove scripts, before being allowed to view the page? Your web browser is the only reason that code runs automatically, and there's plenty of legal cover for this. It's definitely not a virus (it cannot copy itself, and it does not perform malicious activity).

I don't see how analytics is related to ad blocking. Tracking protection can be enabled through most browsers' settings.

[)roi(];17865055 said:
Imagine for a moment: I drive into a shopping centre for the purpose of window shopping and possibly to acquire something; whilst I'm browsing, you drive my car around and siphon petrol out of my tank, because in your words you want to improve my UX around your shopping centre.

This exists. It's called parking tickets. They use the funds to help pay for the maintenance of the shopping centre.


I'm not trying to start a flame war with you. There are far bigger problems than analytics scripts when it comes to the web. Things like improper image optimisation, un-minified code and lack of browser caching support are the major forces ruining the web. Services like Google Analytics enable developers to spot these issues: seeing people leave a site because it failed to load within two seconds is a good incentive to investigate, leading to more optimisation.
 

genetic

Honorary Master
Joined
Apr 26, 2008
Messages
37,594
Which Doom was 2.3mb?? Coz all early Dooms I played were far greater than that.
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
How do you go about determining whether or not something works without having any data on it? You need feedback. Would you rather have that ridiculously annoying "won't you PLEASE do a survey for us?" (which doesn't and never will work, since the data is immediately skewed towards people who are willing to waste their time to do online surveys) popup on every site, or a small script that uses a tiny amount of CPU cycles?

Something doesn't have to benefit only the user to be benefiting the user. If a shop rearranges its items such that it will make more money and allow buyers to find all of the items they want in a shorter time, this improves both the business and the user's experience.
Don't be ridiculous; nobody likes a survey, and scripts don't replace surveys (different needs), and yes, I'd rather have a choice.

As for the data collected by scripts: you forget browsers provide user information, and servers can collect data. Historically that's the way it was always done, until some idiot thought it would be a good idea to run scripts to collect data, as opposed to extending a standard that obviously didn't provide enough information or features.

Today we're in a middle ground where some users employ blockers and others don't. As with antivirus, if the problem goes unaddressed everyone will ultimately employ these blocks, or companies like Brave, Apple, etc. will expand the blocking features in their apps; Brave is already quite good at this, have a look.

Once blockers are prolific, sites will be forced to find an alternative; it's a pity we have to go this route, but as you've shown -- you're unwilling to acknowledge the problem.

So what you're saying is that everybody should be forced to look through the source of a page, with the ability to alter the code and remove scripts, before being allowed to view the page? Your web browser is the only reason that code runs automatically, and there's plenty of legal cover for this. It's definitely not a virus (it cannot copy itself, and it does not perform malicious activity).

I don't see how analytics is related to ad blocking. Tracking protection can be enabled through most browsers' settings.
Like adverts, you've never given me the opportunity to decide. You've taken the route that's easy for you, so get used to the consequence: today some browsers still let you in (mine doesn't); tomorrow there will be far more people blocking your unapproved practice.
This exists. It's called parking tickets. They use the funds to help pay for the maintenance of the shopping centre.
Parking tickets don't drive my car or steal my fuel. Your scripts run on my CPU and steal my bandwidth.
I'm not trying to start a flame war with you. There are far bigger problems than analytics scripts when it comes to the web. Things like improper image optimisation, un-minified code and lack of browser caching support are the major forces ruining the web. Services like Google Analytics enable developers to spot these issues: seeing people leave a site because it failed to load within two seconds is a good incentive to investigate, leading to more optimisation.
Things are going to change; best you start preparing for it. I will no longer allow anything to run that I haven't first approved, and that's not going to change; tomorrow more people will realise that the Internet is fast without all the unapproved crap you've been running on our computers.

PS. I got past most of that slow loading crap -- disable web fonts, 3rd party scripts, ads, .... If it's still slow after that, then I don't bother coming back.
 
Last edited:

genetic

Honorary Master
Joined
Apr 26, 2008
Messages
37,594
OK so a shareware version of the original Doom (1 level) was 2.3mb.

Click bait much? :erm:

That's like saying the entire Doom game can fit into the memory of a modern day graphics card. No sh|t!!!
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Which Doom was 2.3mb?? Coz all early Dooms I played were far greater than that.
Agreed,
Doom (wad) - 12.4mb
Doom2 (wad) - 14.6mb

...but quite a few sites are in the real Doom ballpark... worst I've seen is >70mb for a homepage.
 

KleinBoontjie

Honorary Master
Joined
Jul 30, 2010
Messages
14,607
Ad Muncher, Privacy Badger and uBlock Origin are working overtime, blocking everything that comes with every page I visit.
Just look below at privacy badger on this page alone, not to mention other sites and what Ad Muncher and uBlock encounter:

[screenshot: badger.png, Privacy Badger's blocker list for this page]
 

Monsta Graphics

Well-Known Member
Joined
Jul 20, 2015
Messages
442
[XC] Oj101;17827683 said:
The first Doom was 2D though, not 3D.

What the hell are you talking about? Doom was a pioneer in 3D engines.

The editor may have been top down 2D, but the engine ran in 3D.
 

genetic

Honorary Master
Joined
Apr 26, 2008
Messages
37,594
[)roi(];17865761 said:
Agreed,
Doom (wad) - 12.4mb
Doom2 (wad) - 14.6mb

...but quite a few sites are in the real Doom ballpark... worst I've seen is >70mb for a homepage.
Click bait it is.

What the title fails to state is that "part of the first level of the original Doom game is only 2.3mb"
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Ad Muncher, Privacy Badger and uBlock Origin are working overtime, blocking everything that comes with every page I visit.
Just look below at privacy badger on this page alone, not to mention other sites and what Ad Muncher and uBlock encounter:

Exactly... and they all run scripts, all wanting to collect information that you never approved.

I'd much rather have the HTML/HTTP standards extended to allow for strictly defined data collection (with privacy controls). That way a site like this doesn't get to compound the problem with every plugin they choose to add.

Btw on MyBB I get the following:
  • Mainpage: 10 Trackers + 20 Ads
  • Articles: 10 Trackers + 27/28 Ads
Good part is they use tiny web fonts: 60Kb
 
Last edited:

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Click bait it is.

What the title fails to state is that "part of the first level of the original Doom game is only 2.3mb"
Yip... well known formula:
  1. rebadge BS
  2. add a catchy title
  3. litter the page with ads.

PS... we're on a site that does that.
 

genetic

Honorary Master
Joined
Apr 26, 2008
Messages
37,594
[)roi(];17865899 said:
Yip... well known formula:
  1. rebadge BS
  2. add a catchy title
  3. litter the page with ads.

4. post it on Facebook / forums.
 

cguy

Executive Member
Joined
Jan 2, 2013
Messages
8,527
[XC] Oj101;17834639 said:
Call it what you want, but it was 100% a 2D engine.

It was a 3D engine that could only render 2D models that had been extruded into 3D, and it only supported camera directions in the y=0 plane (aka the xz plane). The models had volume and were 3D, the camera position could be an arbitrary 3D point, and the model-to-viewport transform was a 3D-to-2D perspective projection; so yes, it was most definitely a 3D engine.

The more general engines support arbitrary 3D models, and allow the camera direction to leave the y=0 plane. This doesn't make the new engines 3D vs 2D though, this makes the new engine 3D without constraints vs 3D with constraints. For constraints to be sufficient to make the engine 2D, there would have to be no model extrusion, only orthogonal projection, and a camera direction constrained to just a single vector (0,-1,0), and the camera position would effectively be constrained to a single plane.
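The constraint described above can be sketched numerically (a minimal reconstruction of the idea, not id Software's actual renderer; the function and parameter names are mine):

```python
import math

# Doom-style constrained perspective projection: the camera may sit at any
# 3D point, but it can only rotate about the vertical axis (yaw), keeping
# the view direction in the y=0 plane -- the constraint described above.
def project(point, camera, yaw, focal=1.0):
    """Project a 3D world point to 2D viewport coordinates."""
    x, y, z = (p - c for p, c in zip(point, camera))
    # rotate the world into camera space about the y axis only
    cx = x * math.cos(-yaw) - z * math.sin(-yaw)
    cz = x * math.sin(-yaw) + z * math.cos(-yaw)
    if cz <= 0:
        return None  # point is behind the camera
    return (focal * cx / cz, focal * y / cz)  # 3D-to-2D perspective divide

print(project((1, 2, 10), (0, 0, 0), 0.0))  # (0.1, 0.2)
```

Even with the rotation locked to yaw, the perspective divide by `cz` is the hallmark of a 3D projection: distant points shrink toward the centre of the screen, which no purely 2D engine does.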
 

FarligOpptreden

Executive Member
Joined
Mar 5, 2007
Messages
5,396
In my mind, web browsers should have APIs which replace common functionality on sites. Rather than loading jQuery and Google Analytics for every second site on the web, it should be built into web browsers and optimised as far as possible.
Less bandwidth needed for the user and server, fewer requests on load, fewer CPU cycles because the libraries would be implemented directly in the browser. It could also mean fewer security risks.

Google hosted libraries exist for that. If you reference a version of a script from there and it's already loaded in your browser cache, it won't be loaded again. Instant bandwidth and performance win!

People should just learn how to leverage browser caching properly and how to optimise images, CSS and scripts on a site. Instead of loading 100+ images and icons, combine them into a bigger image and position with CSS (aka CSS sprites). Instead of loading 10+ CSS and JavaScript files, bundle and minify them and try and reference them from hosted libraries or content delivery networks. Chances are your user has already visited a site using jQuery, Bootstrap or Angular, so reference those files from a popular host.
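The "bundle and minify" advice can be sketched as follows (a toy concatenator and whitespace stripper of my own; real tools like UglifyJS or cssnano do far more, such as renaming identifiers and dropping dead code):

```python
# Toy bundler: join several JS/CSS sources into one payload so the browser
# makes a single request instead of many.
def bundle(sources):
    return ";\n".join(src.strip() for src in sources)

# Toy minifier: drop blank lines and indentation (real minifiers also strip
# comments, shorten identifiers and rewrite syntax).
def naive_minify(source):
    lines = (line.strip() for line in source.splitlines())
    return "".join(line for line in lines if line)

print(naive_minify("  var a = 1;\n\n  var b = 2;"))  # var a = 1;var b = 2;
```

The saving comes less from the stripped bytes than from collapsing many HTTP requests into one, which is the same reason CSS sprites help with images.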
 

giggity

Expert Member
Joined
Feb 19, 2011
Messages
1,024
Google hosted libraries exist for that. If you reference a version of a script from there and it's already loaded in your browser cache, it won't be loaded again. Instant bandwidth and performance win!

People should just learn how to leverage browser caching properly and how to optimise images, CSS and scripts on a site. Instead of loading 100+ images and icons, combine them into a bigger image and position with CSS (aka CSS sprites). Instead of loading 10+ CSS and JavaScript files, bundle and minify them and try and reference them from hosted libraries or content delivery networks. Chances are your user has already visited a site using jQuery, Bootstrap or Angular, so reference those files from a popular host.

That is an option; however, loading libraries from external sites is expensive on load times, especially since you can't bundle those files with the rest of your scripts (unless you're using a CDN, in which case it can automatically minify your content, but I don't believe the caching will work across sites), and you have more requests to outside sources. There's also the problem that different sites use different versions of libraries.

This works after a while, once you've browsed sites and cached the libraries, but if you're loading multiple libraries from a CDN this can affect your load time a fair amount, hurting your site. You are benefiting the next person's website while disadvantaging your own. Of course, this is all just a couple of hundred milliseconds we're talking about.

Completely agree with the second paragraph.
 

[)roi(]

Executive Member
Joined
Apr 15, 2005
Messages
6,282
Google hosted libraries exist for that. If you reference a version of a script from there and it's already loaded in your browser cache, it won't be loaded again. Instant bandwidth and performance win!

People should just learn how to leverage browser caching properly and how to optimise images, CSS and scripts on a site. Instead of loading 100+ images and icons, combine them into a bigger image and position with CSS (aka CSS sprites). Instead of loading 10+ CSS and JavaScript files, bundle and minify them and try and reference them from hosted libraries or content delivery networks. Chances are your user has already visited a site using jQuery, Bootstrap or Angular, so reference those files from a popular host.
The browser is specifically designed to sandbox each site for security, so inter-site caching is not possible. The problem is that, in the absence of caching and a good stdlib, developers try to get too fancy with a lot of 3rd-party add-ons, neglecting performance and download size in favour of "eye candy", 3rd-party services, ... There's no excuse for this; performance/debugging tools are no longer completely rubbish, so it's easy to check how bad your site is.
For example:
Some sites take 2.5mb and ~4 seconds to load if you disable everything with blockers; with everything turned on, they take >4mb and >1 minute. Clearly the person who designed the site never considered execution time and resources for 3rd-party scripts. In comparison, on iOS apps are monitored by a watchdog timer: anything taking too long to start up crashes the app, and so too any process taking too long (if only).
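Measurements like these can be approximated with a quick probe (a single-resource sketch of mine; a real audit via browser devtools or Lighthouse also counts every sub-resource the page pulls in, so this underestimates total page weight):

```python
import time
import urllib.request

# Rough single-resource weight/latency probe. Real pages fan out into many
# requests (scripts, fonts, images), so this only measures the base document.
def weigh(url):
    """Return (bytes downloaded, seconds elapsed) for one URL."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
    return len(body), time.monotonic() - start
```

For example, `weigh("https://example.com/")` would return the homepage HTML's size and fetch time, excluding the third-party extras that blockers strip away.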
 
Last edited: