[Simh] pdp11 - console input with high bit set

Clem Cole clemc at ccc.com
Fri Jul 24 17:57:16 EDT 2020


On Fri, Jul 24, 2020 at 4:18 PM Paul Koning <paulkoning at comcast.net> wrote:

> Parity is something that comes in addition to the data.  DEC UARTS (and
> many others, I think)
>
Appendix D of McNamara's book, *a.k.a.* the 40-pin WD1402 and 1402A :-)  I
once knew it well and at one time had an official WD manual for it (I may
still have it in a filing cabinet).  I believe the chip was added to IEEE's
hall of fame.  I've long since forgotten the exact difference between the
original and the "A" version, but I think it had to do with the second stop
bit being the wrong size when 2 stop bits were defined.  IIRC, getting
the framing right was a common issue with a number of early UARTs
because shorter chars (5 bits) used 1.5 stop bits, not a full 2.   Sadly,
there was a time when I could probably have quoted stuff from that and the
EIA RS-232 >>B<< specs, not to mention my now dog-eared copy of his book.



> would let you set the data length (5, 6, 7, 8 bits) and the parity setting
> (none, even, odd).  So what you called "8 bits including parity" is
> technically "7 bits with parity".
>
Absolutely.  The WD chip could not do space or mark parity.  It was either
none or even/odd.  There were two pins on the chip: one that said whether
to use parity at all, and the other, if parity was in use, selecting which
of the two types.

Which comes back to the OP's question, and why I think Tim's comment is the
correct one: you don't want 'mark' parity.  If you need parity, I would
think you want to set it to 'even' (see Table D-1 on page 288 of the book),
since I don't think the UART in the KL11 could generate anything but those
three options.
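
To make that concrete, here is a quick sketch in C (illustrative only,
nothing DEC ever shipped) of how even and odd parity fall out for a 7-bit
character.  Note where the parity bit lands: the high bit of the received
byte, which is exactly the OP's symptom.

    /* Illustrative only -- computes the parity bit a UART would append
     * to a 7-bit character, for even or odd parity. */
    #include <stdio.h>

    static int parity_bit(unsigned char c, int odd)
    {
        int ones = 0;
        for (int i = 0; i < 7; i++)     /* count 1s in the 7 data bits */
            ones += (c >> i) & 1;
        /* even parity: bit makes the total count of 1s even;
         * odd parity:  bit makes it odd */
        return odd ? !(ones & 1) : (ones & 1);
    }

    int main(void)
    {
        unsigned char c = 'C';          /* 0103 octal, three 1 bits */
        printf("even parity bit: %d\n", parity_bit(c, 0));   /* 1 */
        printf("odd  parity bit: %d\n", parity_bit(c, 1));   /* 0 */
        printf("as received with even parity: 0%o\n",
               c | (parity_bit(c, 0) << 7));                 /* 0303 */
        return 0;
    }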




>  If you set your UART for 8 bits with parity, it would send 11 bits total:
> start, 8 data, parity, stop.
>
Yep ...  you are correct, that is exactly how the hardware works.  Mea
culpa.   Although, if we are going to get specific, there could also
optionally be 2 stop bits.  As I said, the Model 33 used 2, but 5-bit
gear used either 1 or 1.5.  I think I remember that with 5-bit plus parity
the "A" version generated 1.5 stop bits when the chip was programmed for 2,
while the original sent a full 2 bits.
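
If it helps, the frame arithmetic works out like this (a toy C sketch
under the framing rules just described; the half-stop-bit case is the WD
quirk I mentioned above):

    /* Back-of-the-envelope async framing: 1 start bit + data bits +
     * optional parity + stop bits.  Works in half-bit times so the
     * 1.5-stop-bit case for 5-bit codes stays an integer. */
    #include <stdio.h>

    static int frame_halfbits(int data_bits, int parity, int two_stop)
    {
        int stop_halves = two_stop ? (data_bits == 5 ? 3 : 4) : 2;
        return 2 * (1 + data_bits + (parity ? 1 : 0)) + stop_halves;
    }

    int main(void)
    {
        /* 8 data + parity + 1 stop = 11 bits total, as Paul said */
        printf("8E1:   %g bits\n", frame_halfbits(8, 1, 0) / 2.0);
        /* Model 33: 7 data + parity + 2 stop = 11 bits as well */
        printf("7E2:   %g bits\n", frame_halfbits(7, 1, 1) / 2.0);
        /* 5-bit code programmed for 2 stop bits -> 1.5 on the A part */
        printf("5N1.5: %g bits\n", frame_halfbits(5, 0, 1) / 2.0);
        return 0;
    }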

You are right that I should have been more specific, but in all fairness, I
don't know of a DEC OS that supported that setup.  I'm sure it was
possible, though, and it was probably done by people running PDP-11s as
communications 'front-ends'.  The fact is, if you ran 8 bits back in the
day, it was usually without parity.  Instead, the protection was done in a
higher-level protocol that detected and/or corrected transmission errors.
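
As a toy illustration of what I mean by higher-level protection, here is
the simplest form: an XOR "block check character" in the spirit of the LRC
used by character-oriented protocols (the real DEC protocols like DDCMP
used a proper CRC-16, which catches far more than a single parity bit can):

    /* Toy XOR block check over a message; the receiver recomputes it
     * and compares against the BCC sent with the block. */
    #include <stdio.h>
    #include <string.h>

    static unsigned char bcc(const unsigned char *buf, size_t n)
    {
        unsigned char x = 0;
        while (n--)
            x ^= *buf++;
        return x;
    }

    int main(void)
    {
        const unsigned char msg[] = "HELLO";
        printf("BCC = 0x%02x\n", bcc(msg, strlen((const char *)msg)));
        return 0;
    }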

If I remember McNamara's book correctly, he had a whole chapter on errors
and suggested that parity was popular (almost always used) in early
devices: the 5-bit Baudot-based equipment like the Teletype Model 28 and
the Friden Flexowriter, and of course, as you pointed out, the infamous
7-bit Models 33 and 37.  But by the mid-to-late 70s, i.e. with the glass
TTYs, it started to fall from favor.   I don't know why, but I would
suspect this was because dedicated lines started to supplant telephone
circuit-based connections, and single-bit error detection was not that
useful; errors did not happen that often.



>
> I've even run into 10-bit UARTs (on PLATO terminals).  But that's not DEC
> stuff.
>
Not surprised, CDC did some very strange things with characters ;-)

Clem