information will be lost.
OK, I may be wrong here again, but the way I understand it, when Claude
Shannon was trying to sort out information compression he actually located
entropy as an essential (and positive) force within the information channel
itself. To me, this means that the apparent loss of information (or
increased noise) is actually central to the making of information itself.
(We never see a diagram of even simple information systems without a big
arrow pointing up labelled NOISE). So when Robinson encodes glitches and
the code 'fails', or when my CD burner once again produces a nice warm tea
coaster (grr!), information is not actually 'failing' but doing exactly
what it has always done. It is playing with noise.
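(A rough way to see what I mean about noise sitting inside the channel itself, rather than being bolted on afterwards: in Shannon's model the capacity of a simple noisy channel, the binary symmetric channel, is C = 1 - H(p), where p is the noise probability. The noise term is literally part of the formula. This little sketch is my own illustration, not anything from Shannon verbatim:)

```python
import math

def binary_entropy(p):
    # H(p) in bits; by convention H(0) = H(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover
    # (noise) probability p: C = 1 - H(p).
    # Noise appears as a term inside the definition of capacity.
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))   # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))   # pure noise: 0 bits per use
print(bsc_capacity(0.1))   # partial noise: somewhere in between
```

(So a channel with zero noise and a channel with total noise are both limit cases of the same equation; the interesting stuff happens in between.)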
So we don't need to 'see past' the typos in Robinson; to do that you would
have to be able to recognise them in the first place. The typos are essential. They
also mark this piece out as not simply there to 'be read', i.e. a symbolic
representation. It is only symbolic if I choose it to be. Instead, the noise
in the language allows for more gaps, glitches, noises, or material to enter
- that is, us.