@snarfmason @technomancy no I know, I mean, there’s not *really* a right answer there, but if there’s anything I’ve learned in over a decade of figuring out how to mistake-proof semantics, it’s that it’s better to fail / throw errors / prevent obviously stupid things from happening (the way pony does) than to ship obvious footguns
@snarfmason @technomancy I once had to build alternate number types into a schema to represent statistical averages so they couldn’t be added, because people were taking multiple averaging results, averaging *those*, & making decisions based on the result
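A minimal sketch of the idea in Rust (the `Average` type and its field names are hypothetical, not the actual schema from the thread): the mean carries its sample count, there is no `Add` impl, and the only way to combine two averages is a count-weighted merge.

```rust
// Hypothetical newtype: a mean tagged with the sample count it came from.
#[derive(Debug, Clone, Copy, PartialEq)]
struct Average {
    mean: f64,
    n: usize,
}

impl Average {
    /// Compute an average from raw samples; empty input yields None.
    fn of(samples: &[f64]) -> Option<Average> {
        if samples.is_empty() {
            return None;
        }
        let sum: f64 = samples.iter().sum();
        Some(Average { mean: sum / samples.len() as f64, n: samples.len() })
    }

    /// The only way to combine two Averages: a count-weighted merge.
    fn merge(self, other: Average) -> Average {
        let n = self.n + other.n;
        let mean = (self.mean * self.n as f64 + other.mean * other.n as f64) / n as f64;
        Average { mean, n }
    }
}

fn main() {
    let a = Average::of(&[1.0, 2.0, 3.0]).unwrap(); // mean 2.0 over 3 samples
    let b = Average::of(&[10.0]).unwrap();          // mean 10.0 over 1 sample
    // The naive bug, `(a.mean + b.mean) / 2.0`, would give 6.0.
    // Average deliberately implements no `Add`, so `a + b` doesn't compile;
    // the weighted merge is the only path, and it gives the true mean.
    let merged = a.merge(b); // (2.0 * 3 + 10.0 * 1) / 4 = 4.0
    println!("{:?}", merged);
}
```

The point is the absence of an operation: by not implementing `+`, the "average of averages" mistake becomes a compile error rather than a silent wrong number.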
people do stupid things like this all the time, yet we still haven’t really evolved our concept of numbers in computers to do things like making division by zero impossible by having distinct dividend / divisor types
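Rust's standard library actually ships a version of this: `std::num::NonZeroU32` is a divisor-shaped type whose constructor rejects zero, and `u32 / NonZeroU32` is a division that can never panic. A small sketch:

```rust
use std::num::NonZeroU32;

/// Division that cannot fail at the division site: the type system
/// guarantees the divisor is nonzero, so no runtime check is needed here.
fn safe_div(dividend: u32, divisor: NonZeroU32) -> u32 {
    dividend / divisor
}

fn main() {
    // Zero is rejected at construction time, not at division time:
    // NonZeroU32::new(0) returns None.
    assert!(NonZeroU32::new(0).is_none());

    let divisor = NonZeroU32::new(4).expect("4 is nonzero");
    println!("{}", safe_div(12, divisor)); // 3
}
```

The error surface moves from every division site (a crash you discover in production) to one construction site (an `Option` the caller is forced to handle), which is exactly the mistake-proofing trade described above.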