Ben Leggett

@lina

The longer you work in this stuff the more you realize a lot of people are just acting like children about code. That's all it is. It's not great skill or genius. It's just childish obsessiveness.

Sean

@dadregga @lina yeah my experience is that people tend to get defensive about their projects and suggesting that they might be able to improve on something or that you have a need they didn’t consider gets an immediate negative reaction, no matter what community the report comes from. This even happens between teams working at the same company.

argv minus one

@complexmath

As someone who has changed favorite languages several times in his life, each time tending toward greater type/memory/concurrency safety, I find it difficult to fathom these people's attachment to C.

@dadregga @lina

Ben Leggett

@argv_minus_one

They're *really, really* good at C. Built their entire careers around it, most likely.

The idea that someone could come along and negate that does terrify them, and should! I can't blame them for that, it's a terribly human reaction.

argv minus one

@dadregga

Is it? I would think they'd find it relieving that they can now write code without worrying about all the things C makes you worry about.

Now, if programmers in general were obsolete, that would be another story. 😬 That's what “AI” is trying to do, but so far, it's failing miserably.

Gina Peter Banyard

@argv_minus_one @dadregga I don't think that's how they see it.
They see it like their whole career has been invalidated, even if that is not the case, as Rust did not exist at the beginning of their career.

This ego and pride also affect academia: PhD applications get rejected by a panel of experts when the subject of the PhD would "invalidate" all their prior research (and I am not talking about CompSci or Maths here)

Ben Leggett

@Girgias

@argv_minus_one

Exactly. And it will happen to you! And me! The trick is being able to understand your own response, when it *does* happen to you, for what it is.

Gina Peter Banyard

@dadregga @argv_minus_one Honestly, at this point in time, my whole job is to invalidate prior decisions PHP has made, and I am dealing with those emotions constantly.
There are even some PHP RFCs that I voted in favour of that, not even 3 years later, I wonder were a mistake and something people after me will need to "clean up".

We make bad decisions all the time, it doesn't mean that everything we have done was pointless or not sensible at the time.

Sean

@argv_minus_one @dadregga @lina I dunno. It's comparatively easy to write a C compiler and toolchain, so the language is available literally everywhere. And it's very WYSIWYG, which is handy when you're writing low-level code. I think there's a strong argument for being good at C in various fields. But much less of an argument for having it be your favorite language. The simplicity of the language offloads the burden of complexity onto the programmer, which means maintenance issues and bugs.

argv minus one

@complexmath

I would not call C WYSIWYG unless you are using a non-optimizing compiler. Optimizing compilers can and will generate machine code drastically different from your source code. I've seen a case where dozens of lines were reduced to a single instruction.

For that matter, x86 assembly itself is a rather misrepresentative abstraction of the hardware's actual behavior. It looks sequential, but the CPU executing it is anything but sequential. queue.acm.org/detail.cfm?id=32
