You'd think every computer should be able to divide two numbers, but early microprocessors didn't have division instructions. The Intel 8086 (1978) was one of the first microprocessors with a division instruction. Let's look at how it implemented division and why division is so hard.
Computers can divide by performing long division, just as in grade school except in binary. This requires a subtract-and-shift loop. On earlier microprocessors, you'd have to implement this loop yourself in assembly code; the 8086 implemented the loop in microcode, which was much faster and more convenient.
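To make the subtract-and-shift idea concrete, here's a minimal sketch in C of restoring long division for an unsigned 16-bit dividend. This is only an illustration of the algorithm, not the 8086's actual microcode: the real instruction works on the chip's internal registers and also handles signed operands and the divide-error interrupt, which this sketch ignores.

```c
#include <stdint.h>
#include <stdio.h>

/* Unsigned binary long division by repeated shift-and-subtract.
 * Illustrative sketch only; assumes divisor != 0. */
static void divide(uint16_t dividend, uint16_t divisor,
                   uint16_t *quotient, uint16_t *remainder) {
    uint16_t q = 0;
    uint32_t r = 0;  /* wider than 16 bits so the shift below can't overflow */
    for (int i = 15; i >= 0; i--) {
        r = (r << 1) | ((dividend >> i) & 1);  /* bring down the next dividend bit */
        if (r >= divisor) {                    /* does the divisor fit? */
            r -= divisor;                      /* subtract it out... */
            q |= (uint16_t)1 << i;             /* ...and set this quotient bit */
        }
    }
    *quotient = q;
    *remainder = (uint16_t)r;
}

int main(void) {
    uint16_t q, r;
    divide(1000, 7, &q, &r);
    printf("1000 / 7 = %u remainder %u\n", q, r);  /* prints 142 remainder 6 */
    return 0;
}
```

The loop runs once per quotient bit, so a 16-bit division takes 16 subtract-and-shift steps regardless of the values involved; that fixed, bit-at-a-time structure is why division is so much slower than addition or even multiplication.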