Jonathan Schofield

Not sure Tim Berners-Lee’s vision was to have 148 requests transfer 5.3 MB of assets to deliver 15 KB of text

#pollution

49 comments
Jonathan Schofield

…thanks for liking/boosting the thing at the top. Muting for now 💚

mastodon.social/@urlyman/11276

Rachel Evans

@urlyman Approximately a 1:352 signal-to-crud ratio. Impressive.

Jonathan Schofield

@rvedotrc in this case, they’re made-up numbers to express an experience [mastodon.social/@urlyman/11296], but in the general vicinity of tonsky.me/blog/js-bloat/

Devlin :farfetchd:

@urlyman and exfiltrate gods-only-know what kinds of data to its 878 partners

Mike Torr

@urlyman Indeed.

Software and comms tech have a tendency to bloat until they consume the available bandwidth. It reminds me of the way new roads attract more traffic rather than easing congestion. Why can't we value elegance more? Depressing.

Jonathan Schofield

@macronencer it’s what happens continually in an economics that does not care about what it externalises.

But it will care. Eventually.
mastodon.social/@urlyman/11296

janet_catcus

@macronencer @urlyman

highly related:

en.wikipedia.org/wiki/Jevons_p

tldr:
the more we get, the more we use. applicable to anything rly: land, coal, income, workforce, whatever influx there is.
if it's 'cheap' it will be used/claimed/exhausted as much as trivially possible :(

Paul Will Gamble :unverified:

@macronencer @urlyman “Wait, it has even more horsepower? Can’t stable them ponies!”

Ecolhombre ⏚

I understood it better when I arrived in Bosnia-Herzegovina and it cost me €60 to display the address of my Airbnb 😱 (at €7.50/MB)

hulver

@urlyman and involve 247 "partners" with "legitimate interest"

Friede Freude 《🕊🙃🥞》

@urlyman
And probably could not have imagined that the Internet would be hijacked by morons like Musk, Zuckerberg and the "we are the good"-gang.

Dan Cornell

@urlyman I got my career start in the Web 1.0 server-side Java world, but I had the opportunity to do some work with a couple of former Datapoint guys who invented ARCNET. We had some great conversations about how expectations about available resources and approaches to optimization have evolved over time. We all agreed that no web page should ever have more code embedded in it than the original UNIX source code base.

But here we are...

kaia

@urlyman your website is not truly modern and _responsive_ if it uses less than 50MB of JavaScript and assets :AuraSmug:

John Mierau

@urlyman aaaaaaaaaaarrrrrgggghhhhhh(screaming into the void and bigtech's ear)

m0xEE

@urlyman
*148 requests carried over UDP 🤦

To me this is one of the most bizarre parts!

Jonathan Schofield

@m0xee can you expand? I’m not well-versed in networking protocols (I’m more of a front-end bloke)

m0xEE

@urlyman
HTTP/3 is UDP-based, a strong departure from TCP-based HTTP/1.1.
It's not without advantages, mainly parallelism, and it's a further development of HTTP/2, which was still TCP-based but multiplexed.
Yet dealing with datagrams at the application level to work with streams of data seems a little controversial to me. Last but not least: it mostly makes sense when, in addition to 15 kB of main text content, you have to carry a large number of scripts, stylesheets, auxiliary data fragments, etc.

Mark Pauley

@m0xee @urlyman does HTTP/3 have back-off logic built in? A big part of TCP is making sure that normal clients don’t cause a storm when packets are being dropped

m0xEE

@unsaturated
TBH, I'm not knowledgeable enough myself about how QUIC, which HTTP/3 is based on, handles that. To me, the idea of introducing congestion control at the application level (and the implied added complexity of client implementations) seems bad enough, but you might be right: there might be even more to it than that.
@urlyman

m0xEE

@unsaturated
Maybe network engineers who had to deal with it in practice can shed some light. My "solution" is disabling it in clients where it is optional (it is in Firefox) and building software such as curl without support for it 😅
@urlyman
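
For anyone wanting to follow suit, a small sketch, assuming curl is installed: a QUIC-enabled curl build lists HTTP3 among its features, and `--http1.1` forces the older protocol for a single request. (In recent Firefox versions the about:config pref is `network.http.http3.enable`; the exact name has varied across releases.)

```shell
# Show curl's version line and feature flags; a build compiled with a QUIC
# backend lists "HTTP3" in its Features line.
curl --version | head -n 2

# Force HTTP/1.1 for one request regardless of what the server offers
# (uncomment to try; needs network access):
# curl --http1.1 -sI https://example.com
```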

Mark Pauley

@m0xee @urlyman it looks like QUIC and HTTP/3 have congestion-control RFCs. I totally agree with you wrt doing this at the app layer.

smeg

@urlyman I feel this constantly. News sites where all I want is the text are covered in auto-playing videos, irrelevant photos, structural elements that limit viewability, and of course ads.

I just want the text.

Peter Mount

@urlyman there's me, earlier in the year, writing the page for my weather station so it shows the current status graphically with real-time live updates, & I got it all down to about 64k in total.

At some point I'll try to cut that by a few more k as there's some white space that's not needed.

The development page: weather.area51.dev/dash/home

DELETED

@urlyman If a website doesn't work in Lynx over a 56K dialup connection, then it simply doesn't work.

#ProgressiveEnhancement or GTFO.

midzer

@urlyman When #webperf is not a consideration for sites, search engines should rank them even worse.

M. Forester

@midzer @urlyman which is what clew.se/ does, as I've learned today. 🙂
Only a vanishingly small number of sites are indexed, but it's a great discovery machine.

midzer

@mforester Thanks for sharing. Interesting project!

Bill Zaumen

@urlyman A program I wrote for debugging web servers lists headers including ones from HTTP redirects. Here's a partial list for https: // www . wsj . com:
---------
Status: 403 Forbidden
Content-Type: text/html; charset=UTF-8
Content-Length: 557155
--------
... and the content length just includes the HTML, not anything loaded in a separate transaction.

Over 1/2 MB for an error response? Really?

Vortec Space

@urlyman "If present trends continue, there is the real chance that articles warning about page bloat could exceed 5 megabytes in size by 2020" idlewords.com/talks/website_ob

kawazoe

@urlyman I'm currently working on an internal application that uses GraphQL to transfer data to a React front-end. In general, our queries are >1.5× larger than the responses. We could optimize them, if we had time and they didn't change all the time. Maybe, one day...
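
To illustrate the point with an entirely made-up query and response (not the application in question), the text of a GraphQL query can rival the payload it fetches in size:

```python
import json

# Hypothetical GraphQL query and response, for illustration only: the field
# selection text alone is comparable in size to the data that comes back.
query = """
query {
  user(id: "42") {
    id
    displayName
    avatar { url width height }
    posts(first: 1) { edges { node { id title } } }
  }
}
"""

response = json.dumps({
    "data": {
        "user": {
            "id": "42",
            "displayName": "Ada",
            "avatar": {"url": "/a.png", "width": 64, "height": 64},
            "posts": {"edges": [{"node": {"id": "7", "title": "Hi"}}]},
        }
    }
})

# Bytes of query text sent per byte of response received.
print(f"query/response size ratio: {len(query) / len(response):.2f}")
```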

pasta la vida

@urlyman okay,

a nice 15kb text article,

and a nice 1MB photo or two.

none of this JS waste and ad-gunk.

Kevin Karhan :verified:

@urlyman I'm pretty sure it wasn't, and I think that the #Enshittification of the #Web with #Trackers, #Ads, #Autoplay bs and the like should've been outlawed ages ago...

Artemesia

@urlyman

...and disable basic browser navigation with half-assed JavaScript intercepting the user's clicks, then blame the end user for being backwards when they complain about it.

ruurd@mastodon.social

@urlyman @jwildeboer while in the meantime consuming an inordinate amount of resources on your local computer…
