Riley S. Faelan

@scruss As far as I know, no, with the possible exception of whoever worked it out. It was likely too computation-intensive for 1970s' OCR tech (that's when MICR was still the bleeding edge of writing that both humans and machines could read, as you recall), and by the time fuzzy recognition techniques ahem AI handwriting recognition ahem became feasible in common computers, the considerations had become slightly different. The closest, in concept (but not really in the quirks) would be Graffiti, the (non-OCR) handwriting recognition of Palm devices.

Stewart Russell

@riley I can still write in Graffiti, but yes, it was a compromise.

I guess the reverse trend was fonts like Data70 and Westminster that mimicked the look of MICR, but weren't.

In other bizarre standards-of-typography news, it seems that Adrian Frutiger's original tracings of OCR-B were accidentally thrown out at the standards agency that held them. OCR-B is required for use on passports, so it's kinda important.

Riley S. Faelan

@scruss FWIW, I'd still prefer Graffiti to the newfangled fancy handwriting recognition systems. Even though both Android and iOS have gotten pretty good at it.

Computeum Vilshofen

@riley @scruss

As usual, a measure overtaken by technology faster than expected. By ~1977/78, CGK/Siemens had perfected OCR of single-letter handwriting on forms, good enough to process cheques and the like at very high speed (fast enough that a dedicated real-mode mainframe was needed to record the data).

By 1980, small-scale systems built on a network of 8085s could do the same as desk-sized units.
