An earlier section used a decision tree for pattern matching, assuming that the binary input data is completely reliable. An earlier page also argued that correlation is the proper technique when the input data is noisy, but the relevant program was omitted. This section presents the missing code.
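A minimal sketch of the correlation matcher follows. It assumes each character is stored as 16 bytes (one byte per 8-pixel row) in a font table indexed by ASCII code; the table itself and the names correlate(), classify() and font are my own illustration, not the original program. Pixels are treated as +1 (set) and -1 (clear), so the correlation of a noisy block with a template is simply the number of matching pixels minus the number of mismatches, and the template with the largest sum wins.

#define ROWS   16      /* rows per character block        */
#define NCHARS 128     /* size of the (assumed) font table */
#define FIRST  ' '     /* first printable ASCII character  */
#define LAST   '~'     /* last printable ASCII character   */

/* Correlation of two 8x16 bitmaps: +1 per matching pixel, -1 per mismatch. */
static int correlate(const unsigned char a[ROWS], const unsigned char b[ROWS])
{
    int sum = 0;
    for (int r = 0; r < ROWS; r++)
        for (int bit = 0; bit < 8; bit++) {
            int pa = (a[r] >> bit) & 1;   /* pixel of the noisy block */
            int pb = (b[r] >> bit) & 1;   /* pixel of the template    */
            sum += (pa == pb) ? 1 : -1;
        }
    return sum;
}

/* Return the ASCII code whose template correlates best with the noisy block. */
int classify(const unsigned char block[ROWS], const unsigned char font[NCHARS][ROWS])
{
    int best_ch = FIRST;
    int best_score = -(ROWS * 8) - 1;     /* below the worst possible score */
    for (int ch = FIRST; ch <= LAST; ch++) {
        int score = correlate(block, font[ch]);
        if (score > best_score) {
            best_score = score;
            best_ch = ch;
        }
    }
    return best_ch;
}

Note that with this +1/-1 coding, maximising the correlation is the same as minimising the Hamming distance between the block and the template.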
The figures in the original article pertained to 8x8 character blocks, but this program actually uses 8x16. (Some of the pixels are always off for all standard ASCII characters, so they do not really count.) A success rate of about 81% is easily demonstrated for standard ASCII, from space to the tilde (~), when the noise level is 25%. A human viewer cannot make out the letters at that noise level.
(The worst case as far as the viewer is concerned is 50% noise; at 100%, the foreground and background colours are simply interchanged.)
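To reproduce a figure like the 81% quoted above, one can flip each pixel independently with the given probability and count how often the classifier still returns the right character. The following rough sketch assumes the same hypothetical font table and the classify() routine and macros from the sketch above.

#include <stdlib.h>
#include <string.h>

/* Flip each pixel of the block independently with probability `noise`. */
static void add_noise(unsigned char block[ROWS], double noise)
{
    for (int r = 0; r < ROWS; r++)
        for (int bit = 0; bit < 8; bit++)
            if ((double)rand() / RAND_MAX < noise)
                block[r] ^= (unsigned char)(1 << bit);   /* flip this pixel */
}

/* Fraction of printable ASCII characters recognised correctly after noise. */
double success_rate(const unsigned char font[NCHARS][ROWS], double noise, int trials)
{
    int correct = 0, total = 0;
    for (int t = 0; t < trials; t++)
        for (int ch = FIRST; ch <= LAST; ch++) {
            unsigned char block[ROWS];
            memcpy(block, font[ch], ROWS);   /* start from the clean glyph */
            add_noise(block, noise);
            if (classify(block, font) == ch)
                correct++;
            total++;
        }
    return (double)correct / total;
}

For example, success_rate(font, 0.25, 100) damages 100 copies of each of the 95 printable characters at 25% noise and reports the average recognition rate.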
You may try extended ASCII as well, but no figures are given for that case: for one thing, there are several local alphabets; more importantly, some letters in local alphabets may clash with standard ASCII characters.