Devine Lu Linvega

In 1949, Claude Shannon was characterizing information loss and needed a term for the degree to which information is scrambled. Visiting the mathematical physicist John von Neumann, he received the following advice:

"You should call it entropy... nobody knows what entropy really is, so in a debate you will always have the advantage."

3 comments
Ramin Honary

@neauoire Von Neumann was a genius on many levels.

cathos

@neauoire I've been thinking about this lately. I have this theory that entropy is just time. That time is just things falling apart. Energy... dissipating. And that knowledge and information are a way that we fight against it, just as all life fights by reproducing and continuing to build things even as they wear away.

Luigi

@neauoire in uni I took an information theory course right after statistical mechanics, and it was mind-blowing seeing the parallels
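The parallel the comment points to is direct. The Shannon entropy above has the same functional form as the Gibbs entropy of statistical mechanics (a standard correspondence, stated here for context rather than taken from the thread):

    S = -k_B \sum_i p_i \ln p_i

The two differ only by Boltzmann's constant k_B and the base of the logarithm; von Neumann's quip trades on exactly this resemblance.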
