PureTryOut

@blacklight That article hits home indeed. I never understood why huge frameworks like Electron became popular, and I sincerely believe that it and the tools that come along with it are the reason the web is going to shit nowadays. You can't just render a simple HTML page anymore; now you need to pull in giant JS frameworks that slow your PC to a crawl.

I read an article a while ago that advocated for slowing down internet connections on purpose, just like we have speed limits for cars. I really agreed with it.

Fabio Manganiello

@bart@fam-ribbers.com The argument for purposefully introducing friction on the infrastructure to push software engineers to write more optimized software is compelling indeed - and it's often pushed by telecom companies that want software companies to pay their fair share for all the wasted MBs transferred over the wire.

But I'm still hesitant to embrace that argument, because it would be an acknowledgement of failure - the failure of the software industry to self-regulate its usage of resources without external constraints and an artificially limited supply.

> I never understood why huge frameworks like Electron became popular, and I sincerely believe that it and the tools that come along with it are the reason the web is going to shit nowadays

As I wrote in another post, I think that it's a failure on the language side.

JavaScript has never had a standard library like many other languages do. Even simple operations like making an HTTP request, parsing cookies or parsing the query string require either an external library or hand-written functions that reinvent the wheel again and again for different browsers/Web engines. Things have also been slowed down by Microsoft of course - the ECMAScript committee tried to push things forward, but the majority of folks kept using a browser that stubbornly refused to embrace anything new unless it was developed by its parent company.
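To give a rough idea of what that gap looked like in practice (just a sketch, not taken from any real codebase), these are the kinds of helpers that countless projects hand-rolled before fetch() and URLSearchParams finally became available everywhere:

```javascript
// Rough sketches of helpers that projects kept reinventing before the
// platform provided fetch() and URLSearchParams natively.

// HTTP GET through the old XMLHttpRequest API, wrapped in callbacks.
function httpGet(url, onSuccess, onError) {
  var xhr = new XMLHttpRequest();
  xhr.open('GET', url, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;
    if (xhr.status >= 200 && xhr.status < 300) onSuccess(xhr.responseText);
    else onError(new Error('HTTP ' + xhr.status));
  };
  xhr.send();
}

// Query string parsing, rewritten in slightly different ways everywhere.
function parseQueryString(qs) {
  var params = {};
  qs.replace(/^\?/, '').split('&').forEach(function (pair) {
    if (!pair) return;
    var idx = pair.indexOf('=');
    var key = idx < 0 ? pair : pair.slice(0, idx);
    var value = idx < 0 ? '' : pair.slice(idx + 1);
    params[decodeURIComponent(key)] = decodeURIComponent(value);
  });
  return params;
}

// Cookie parsing, same story.
function parseCookies(cookieString) {
  var cookies = {};
  (cookieString || document.cookie).split(';').forEach(function (pair) {
    var idx = pair.indexOf('=');
    if (idx < 0) return;
    cookies[pair.slice(0, idx).trim()] = decodeURIComponent(pair.slice(idx + 1).trim());
  });
  return cookies;
}

// Today the first two are one-liners on the standard platform:
// fetch(url).then(res => res.text());
// new URLSearchParams(location.search).get('q');
```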

Now the Microsoft problem is largely gone, but the consequences remain in the form of a language that is the most widely used for UI development, yet has been in a half-baked state for so long that too many libraries and frameworks have come in to fill the void.

In an ideal world, it should be possible to use vanilla JavaScript to write a frontend that works on any system, both in a browser and as a stand-alone app. It shouldn't involve running a Vue/React CLI init to download a few tens of MBs of dependencies, plus Babel, browserify and tons of other frameworks and libraries; I shouldn't have to download another few dozen MBs just to get static typing support, and yet another few dozen just so my code can run both stand-alone and in a browser. All that stuff should have been part of the standard language.
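And to be concrete about the browser half of that "ideal world" (a hypothetical sketch - the /api/items endpoint and the #list element are made up for illustration): browsers can already load native ES modules without any build step, so a zero-dependency frontend is, in principle, just a file like this loaded with `<script type="module" src="app.js">`:

```javascript
// app.js - loaded directly via <script type="module" src="app.js">:
// no CLI scaffolding, no Babel, no browserify, no node_modules.
// The /api/items endpoint and the #list element are made up for this example.

const res = await fetch('/api/items');   // top-level await works in ES modules
const items = await res.json();

document.querySelector('#list').innerHTML =
  items.map(item => `<li>${item.name}</li>`).join('');
```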


PureTryOut

@blacklight I see no problem with admitting defeat if it helps fix things. The "market" clearly can't regulate itself, so let's stop giving it the freedom to do so and start imposing artificial constraints. Maybe then we can finally start getting rid of the huge pile of e-waste and reducing power drain, rather than just continuously making more and more hardware.
