Lorq Von Ray

@blacklight "If the young but knew..." My first computer had a whopping 4K (yes, 4 Kilobytes) of main memory. Today I'm forced (occasionally) to deal with a circle (jerk) of Agile coders who are powerless when the IDE on their MacBook Pros won't launch, with absolutely no idea where to start in fixing the problem; the file servers, the datastores, their Macbook, the network, or a hundred other things they know nothing about. Oh, and trying to teach them about IPv6 is basically pointless.

33 comments
BJ Swope :verified: ➖

@lorq @blacklight trying to teach anybody about IPv6 is pointless, I say as somebody who's been doing this stuff for ~30 years now.

Lorq Von Ray

@cybeej @blacklight I find it both annoying and comical that my IPv6 plant is larger than the two largest broadband providers in my region, so I get you.

Natasha Nox 🇺🇦🇵🇸

@cybeej @lorq @blacklight Do you (both) think it has to do with the commercialisation of education itself?

I keep seeing all those certifications and courses for and by companies and their products, as well as universities and schools being filled with tech from either Google, Apple or Microsoft, and I wonder whether studying informatics is nowadays the only way to actually get taught the basic skills necessary to understand the technology you're working with.

BJ Swope :verified: ➖

@Natanox @lorq @blacklight I think it is due to the abstraction of technology. As products and services lower the barrier to using tech they usually remove the need to learn the underlying fundamentals. So more people use the tech but fewer understand how it actually works. It's both good and bad.

Fabio Manganiello

@cybeej @Natanox @lorq good point there.

I've looked at some of the Assembly code that was written at NASA in the 1960s and 1970s (the one for the Apollo programs was also open-sourced a while ago).

I was impressed by how "embedded" it was, with literally zero room for abstractions. The code was tightly coupled to the hardware. Each single bit was crafted and optimized for its specific purpose on that specific hardware.

You can do amazing things once you remove all the abstractions. You can get very creative on how to optimize resources if you know that your code will only run on one specific piece of hardware, for a very specific purpose.
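A minimal C sketch of the kind of bit-level frugality described above (hedged: this is not the Apollo code, and the flag names are invented for illustration). When memory is measured in kilobytes, eight boolean flags share a single byte instead of eight words.

#include <stdint.h>
#include <stdio.h>

/* Invented flag names, purely for illustration. */
#define FLAG_ENGINE_ON   (1u << 0)
#define FLAG_GUIDANCE_OK (1u << 1)
#define FLAG_ALARM       (1u << 2)

int main(void)
{
    uint8_t flags = 0;                  /* one byte carries up to eight boolean flags */

    flags |= FLAG_ENGINE_ON;            /* set a bit   */
    flags |= FLAG_ALARM;
    flags &= (uint8_t)~FLAG_ALARM;      /* clear a bit */

    printf("engine on:   %d\n", (flags & FLAG_ENGINE_ON)   != 0);
    printf("guidance ok: %d\n", (flags & FLAG_GUIDANCE_OK) != 0);
    printf("alarm:       %d\n", (flags & FLAG_ALARM)       != 0);
    return 0;
}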

Let's not forget that until at least the early 1980s most computers didn't even agree on the word size, or even on the byte as the minimum unit of addressable information.

As we started seeking compatibility, more general-purpose hardware and software, lower entry barriers for coding, etc., we started introducing abstractions.

Those abstractions came with many advantages - namely lower barriers for coders and greater portability. Also, decoupling software from hardware meant that you no longer have to be an electronic engineer (or an engineer who deeply understands the architecture the software is running on) in order to code. But I also feel like there's an inflection point, and once we pass it those advantages start to fade.

It's obvious why a compiler that can digest the same piece of C/C++ code and produce an executable for different architectures is a useful abstraction. Same goes for a virtual machine that can run the code on any architecture without recompiling it from scratch.
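A small hedged sketch of what that compiler abstraction buys: with the fixed-width types from stdint.h, the same C source behaves identically whether the native word is 16, 32 or 64 bits wide, and a cross toolchain (for example aarch64-linux-gnu-gcc on many Linux distributions) can build the very same file for another architecture.

#include <inttypes.h>
#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint32_t counter = UINT32_MAX;    /* exactly 32 bits on every conforming platform */
    counter += 1;                     /* unsigned overflow wraps to 0 everywhere, by definition */

    printf("wrapped counter: %" PRIu32 "\n", counter);
    printf("native int here: %zu bits\n", sizeof(int) * CHAR_BIT);
    return 0;
}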

But when building a basic website ends up creating a node_modules folder of 200 MB, requires the developer to be familiar with npm/yarn/gulp pipelines, transpiling etc., and maybe also with deploying their code to a Kubernetes cluster running on somebody's private cloud, I wonder if we've already passed that inflection point.


Lorq Von Ray

@blacklight @cybeej @Natanox I think it's a combination of the evolution of education (Pro Tip: Education is NOT permanent from generation to generation and we need to remember that as a society.) and lower barriers of access to many technologies all at once. And one of the core "learning to code" problems professionals face is that those abstractions get in the way of learning the fundamentals.
/1

Lorq Von Ray

@blacklight @cybeej @Natanox I learned TinyBasic and TinyPascal and TinyFortran and TinyCobol and TinyAssembler around the age of 10 or 11. Then I learned more of the full-blown languages again in college. My personal angle on the "learning to code" problem is this: A.) Not everyone is meant to be a deep thinker and analytical problem solver. We need all kinds of other professions as well, and we STILL dig holes with shovels; and B.) In order to teach youngsters to code, we also need...
/2

Lorq Von Ray

@blacklight @cybeej @Natanox - ...to teach them how computers "think", and therefore old tools like Z80 and 6502 Assembler, BASIC, Forth, and a few other contrived languages are helpful for instruction even if they're not used in the "real world" anymore. Then they also need to understand the fundamentals of how the Internet works; things like TCP/IP and ICMP, along with historically significant developments like AT&T T-Carrier, IS-IS, IPX, and Token Ring versus Ethernet.
/3

Lorq Von Ray

@blacklight @cybeej @Natanox - I think that Robert Heinlein pretty much addresses the core of this problem in Glory Road, and the character Rufo has most of those answers, for those of you who had the time and inclination to read while you were growing up.
/5

Lorq Von Ray

@Natanox @blacklight @cybeej A "must read"? Well, that's a hard one. Let's just say it's a "strong buy". Anything by Heinlein or Niven or Asprin or Asimov is a "must read" or a "strong buy" in my book.

DiscreetSecurity

@lorq @blacklight @cybeej @Natanox If you've not listened to "The Fall of Civilizations" Podcast, then you're missing some long-form/history context for that statement!

Lorq Von Ray

@blacklight @cybeej @Natanox So basically I'm a holistic technology teacher when I have the good fortune to teach. -- Now, all of that said, I still believe that, statistically speaking, the number of kids ditching class and smoking weed out behind the bleachers or getting drunk during and/or after the Friday Night Football Game in high school is still the same as it was back in the 1980s and 70s and 50s, and the smartest 3-10 percent are still out there learning, doing what they do.
/4

Natasha Nox 🇺🇦🇵🇸

@blacklight @cybeej @lorq There are multiple bad developments I see at work here. One would be "dependency hell", another the abundance of "jack of all trades" libraries and frameworks that can't be split up at all. Web development became quite monstrous in this regard, though some IDEs (or compilers?) are also quick to silently attach 3 to 4 megs of libraries onto your hello world program unless told otherwise, just in case you need them.

Lorq Von Ray

@Natanox @blacklight @cybeej You are not wrong. And that is also why I have to have "the talk" with devs before they put fresh code randomly into production. Then I attack it to see where it breaks. Then I give them hell if it's a large business entity I'm consulting with. Sometimes we learn by doing. Sometimes the dependencies teach us lessons we didn't know we needed.

Natasha Nox 🇺🇦🇵🇸

@lorq @blacklight @cybeej I feel like we should get back to the UNIX / GNU style of toolboxes (not sure right now where the philosophy originated from). Make a tool that does *one* thing, and does it exceptionally well.
There might be an extended philosophy to be articulated for frameworks or very complex tools that require massive flexibility (a simple example would be a browser).
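A toy illustration (in C, program and file names purely illustrative) of that single-purpose-filter idea: the program does exactly one thing, reading stdin and writing stdout, and the shell composes it with everything else through pipes, e.g. cat notes.txt | ./upper | sort.

#include <ctype.h>
#include <stdio.h>

/* upper: copy stdin to stdout, upper-casing every character - and nothing else. */
int main(void)
{
    int c;
    while ((c = getchar()) != EOF) {
        putchar(toupper(c));
    }
    return 0;
}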

Fabio Manganiello

@Natanox @lorq @cybeej yes, this one 👆

The UNIX philosophy was born at AT&T Bell Labs in the first half of the 1970s - another example of good symbiosis between public funding and clever engineering that I often mention, probably on par with the Apollo and Voyager programs.

Founding fathers of our industry like Thompson, Kernighan and Ritchie were behind it, and they had the right level of "eagle view" on what they wanted to achieve at a system level to be able to break it down into smaller and reusable components that communicated over simple messages (pipes and signals), and where everything was a file descriptor, so everything could leverage the same abstraction.

It's the textbook example of a good abstraction to me. It introduced something on top of the "bare metal", but it was lightweight enough to be conceptually easy to grasp, and easy to extend. And you could pick whichever component you needed, without being an all-or-none selection of a framework.
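A hedged C sketch of that uniform file-descriptor abstraction: the same read()/write() pair copies bytes whether descriptor 0 is a terminal, a file redirected with <, or the read end of a shell pipe, because the kernel exposes all of them through the same interface (the program name in the usage comment is illustrative).

#include <unistd.h>

/* copybytes: move bytes from fd 0 to fd 1.
 * Usage examples:  ./copybytes < notes.txt   or   dmesg | ./copybytes | gzip  */
int main(void)
{
    char buf[4096];
    ssize_t n;

    while ((n = read(0, buf, sizeof buf)) > 0) {   /* 0 = stdin, whatever backs it */
        if (write(1, buf, (size_t)n) != n) {       /* 1 = stdout, same uniform call */
            return 1;                              /* short write or write error */
        }
    }
    return n < 0 ? 1 : 0;                          /* non-zero if the final read failed */
}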

Do-it-all monolithic frameworks with dozens of abstractions like Spring, Django, React, Symfony or Blazor are exactly on the opposite side of the spectrum compared to the UNIX philosophy.


Panda | 판다

@blacklight @cybeej @Natanox @lorq Abstraction and hardware independence are two very separate concepts. E.g., Multics had many levels of abstraction but was so tied to the hardware that it was EOL'd when it became too expensive to maintain that hardware.

Fabio Manganiello

@panda @cybeej @Natanox @lorq true, but in general you build abstractions when you want to decouple the software from either the hardware or the OS. Otherwise it doesn't make much sense to bother introducing new concepts :)

Alan Langford

@blacklight @cybeej @Natanox @lorq Hey, they didn't even agree on character sets! (see EBCDIC vs ASCII).
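To make that disagreement concrete, here is a tiny C snippet printing the well-known code points for the same letters in ASCII and EBCDIC; note that the EBCDIC alphabet is not even contiguous (there is a gap between 'I' and 'J'), which broke code that assumed 'Z' - 'A' == 25.

#include <stdio.h>

int main(void)
{
    printf("'A' in ASCII : 0x%02X\n", 0x41);
    printf("'A' in EBCDIC: 0x%02X\n", 0xC1);
    printf("'J' in ASCII : 0x%02X\n", 0x4A);
    printf("'J' in EBCDIC: 0x%02X\n", 0xD1);
    return 0;
}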

theothertom

@cybeej @Natanox @lorq @blacklight It does also feel like not needing to understand (so many of) the fundamentals is a sign of maturity. Throughout my career, I've needed to understand things like networking/quirks of multi-socket systems/endianness etc. However, those things didn't have anything to do with "the problem that a bunch of people were being paid to solve".
If the abstractions are enabling people who understand the human/business aspects to code at all, it feels like mostly a win.

Lorq Von Ray

@tom @cybeej @Natanox @blacklight I have got to say that a body of knowledge can only be made more valuable (up to a certain point, of course) by expanding its depth and its periphery into other connected thoughts and concepts. That's why they always used to call me into the horribly damaged, difficult, lingering problems that would get 27 VPs and engineers onto a conference bridge for a couple of hours. And, to be honest, I reveled in solving the hard ones. -- Jack of all trades....

theothertom

@lorq @cybeej @Natanox @blacklight Oh, I'd fully agree with you - just that I think that depth etc. may be of a non-technical nature (or totally different areas of tech).
There's only so much brain space for this stuff, and it's good to have a mix between people who've got into the weeds of things like DB/network architecture and people who really know the business problems, as well as the legal/ethical issues with a system.

DiscreetSecurity

@cybeej @lorq @blacklight @tom @Natanox then it's "plus 1 to the list". Or more, depending on how hard it makes you think!

Dustin Rue

@Natanox honestly, this is one of the best explanations I've seen for the state of things.
