[Simh] pdp11 - console input with high bit set

Paul Koning paulkoning at comcast.net
Sat Jul 25 13:47:45 EDT 2020



> On Jul 24, 2020, at 5:57 PM, Clem Cole <clemc at ccc.com> wrote:
> 
> ...
>>  If you set your UART for 8 bits with parity, it would send 11 bits total: start, 8 data, parity, stop.
> Yep ...  you are correct, that is exactly how the hardware works. Mea culpa.   Although, if we are going to get specific, there could also optionally be 2 stop bits.  As I said, the Model 33 used 2, but the 5-bit gear used either 1 or 1.5.  I think I remember that with 5-bit plus parity the A version generated 1.5 stop bits when the chip was programmed for 2, and the original sent a full 2 bits.
> 
> You are right that I should have been more specific, but in all fairness, I don't know of a DEC OS that supported that setup.  But I'm sure it was possible, and for people running PDP-11s as communications 'front ends' it was probably done.  The fact is, if you ran 8 bits back in the day, it was usually without parity; instead, transmission errors were detected and/or corrected by some higher-level protocol.
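The framing arithmetic in the exchange above is easy to sketch. This is a back-of-the-envelope illustration, not anything from a DEC manual; the helper name is made up:

```python
# Async serial framing: each character costs
#   1 start bit + data bits + optional parity bit + stop bits.
# A Model 33 at 110 baud (1 start, 7 data, 1 parity, 2 stop = 11 bits)
# therefore moves 110 / 11 = 10 characters per second.

def bits_per_char(data_bits, parity=True, stop_bits=2.0):
    """Total line bits per character for a given framing."""
    return 1 + data_bits + (1 if parity else 0) + stop_bits

assert bits_per_char(7, parity=True, stop_bits=2) == 11      # Model 33 framing
assert bits_per_char(8, parity=True, stop_bits=1) == 11      # 8 data + parity, as above
assert bits_per_char(5, parity=False, stop_bits=1.5) == 7.5  # 5-bit gear, 1.5 stop

print(110 / bits_per_char(7))  # -> 10.0 characters/second at 110 baud
```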

It's true that parity is a bit silly because it only does error detection on an unstructured data stream, which isn't all that helpful.  But it was fairly common practice at one time.  For example, the popular Friden Flexowriter machines -- somewhat like a TTY Model 35 but even more reliable -- used a 7-bit character code that is better thought of as 6-bit characters with odd parity.
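The limitation described above is easy to demonstrate: a parity bit catches any single flipped bit but misses any even number of flips. A minimal sketch (the function name is mine, and the 6-bit-plus-odd-parity framing follows the Flexowriter description above):

```python
def parity_bit(code, odd=True):
    """Parity bit for an n-bit code word.

    Even parity makes the total count of 1 bits even; odd parity
    makes it odd (so every character contains at least one 1 bit).
    """
    ones = bin(code).count("1")
    if odd:
        return 1 if ones % 2 == 0 else 0
    return ones % 2

# A 6-bit character with four 1 bits gets an odd-parity bit of 1.
char = 0b101101
framed = (parity_bit(char) << 6) | char
assert bin(framed).count("1") % 2 == 1        # odd total, as required

# A single-bit error changes the parity and is detected ...
one_flip = framed ^ 0b0000100
assert bin(one_flip).count("1") % 2 == 0      # parity check fails: error caught

# ... but a two-bit error restores the parity and slips through.
two_flips = framed ^ 0b0010100
assert bin(two_flips).count("1") % 2 == 1     # parity check passes: error missed
```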

I don't know about others, but RSTS definitely supports setting parity on 8-bit data.  Since the devices (like DH11 and DHU11) support it, that's just a matter of the software implementing an API and telling the driver to set the proper bits:

$ set ter kb6/perm/parity=even

> 
> If I remember McNamara's book, he had a whole chapter on errors and suggested that parity was popular (almost always used) in early devices: the 5-bit Baudot-based equipment such as the Teletype Model 28 and the Friden Flexowriter, and of course, as you pointed out, the infamous 7-bit Models 33 and 37.

Not 5-bit devices; none of those codes have parity.  Neither do 6-bit devices (typesetters).

>  But by the mid to late 70s, i.e. with the glass TTY, it started to fall from favor.   I don't know why, but I would suspect this was because dedicated lines started to supplant telephone circuit-based connections, and single-bit error detection was not useful: errors did not happen that often.

It could be that glass TTYs were computer peripherals, and typically close to the computer or connected by a modem that was pretty clean.  The older devices tended to be on current loops, possibly quite long ones with debatable signal quality.

>> I've even run into 10-bit UARTs (on PLATO terminals).  But that's not DEC stuff.
> 
> Not surprised, CDC did some very strange things with characters ;-) 

True, but not this.  CDC had nothing to do with that design; it came from the U of Illinois.  7-bit characters, with the high-order bits identifying one of several data streams (keyboard, touch panel, echo response, external device).

The amazing thing is that they managed to get standard chips to speak these weird signalling schemes, in the later 8080-based PPT (PLATO V terminal).  10-bit async one way, 21-bit sync the other way, both done with an 8251 USART.

	paul
