Dividing two signed (positive or negative) integers uses more microcode. This microcode makes both the divisor and the dividend positive, recording what the sign of the result should be in an internal flag called F1. After the unsigned division, the quotient's sign is adjusted according to F1.
Later chips use a faster algorithm called SRT division, which uses a lookup table to estimate quotient bits two or four at a time. Intel's Pentium chip (1993) was missing a few table entries, so it occasionally got the answer wrong: the famous FDIV bug. Replacing the faulty chips cost Intel $475 million.