[Simh] Serial Console

Holger Veit holger.veit at iais.fraunhofer.de
Wed Jul 4 09:55:06 EDT 2012


Am 04.07.2012 12:28, schrieb Johnny Billquist:
> On 2012-07-04 11:00, Holger Veit wrote:
>> Am 03.07.2012 20:16, schrieb Johnny Billquist:
>>> On 2012-07-03 19:46, Jan-Benedict Glaw wrote:
>>>> On Tue, 2012-07-03 18:58:15 +0200, Peter Svensson <psvsimh at psv.nu>
>>>> wrote:
>>>>> On Tue, 3 Jul 2012, Jan-Benedict Glaw wrote:
>>>>>> What about other solutions? I haven't used the simulated DZ up to
>>>>>> now,
>>>>>> but if I get it, it maps the serial ports to (telnet'able) IP/Port
>>>>>> network sockets, right?
>>>>>>
>>>>>> If so, why not simply write a little program to configure the serial
>>>>>> port and splice it to the network socket? Or a small script using
>>>>>> `stty' and `nc'?
>>>>>
>>>>> Baud rate changes, modem control lines and so on. Break handling.
>>>>
>>>> Okay, you won't really get modem control lines. Baud rate changes
>>>> won't work, too, but were they *really* used in the wild? I doubt it.
>>>
>>> Baud rate changes were used a lot. At least on DEC systems. But even more
>>> importantly, if you don't implement it, you will have to make a decision
>>> somewhere as to what speed to use, and the terminal has to adapt to
>>> that. This might not always be possible.
>>
>> Surely, they were used. But why? For the same reason we nowadays use
>> DHCP, USB, Bonjour and other technology - harshly put, the users were
>> and are idiots who cannot figure out the right setting if more than two
>> alternatives are provided: this damn terminal does not work
>> out of the box (yes, it was built to be connectable to more than a
>> single machine!).
>
> No. A big reason was/is that different equipment have different
> capabilities.

Sure. A card reader is slower than a tape, and that slower than a disk.
But this swaps cause and effect.
As long as technical staff installs these systems, it should
not really be a problem to set matching characteristics on both sides.
Particularly when you need to wire-wrap pods on the MUX,
set DIP switches on the back of a terminal, and get a suitably
wired cable. A VT100 is not a laptop to be carried around every few
minutes (not in those days) and then plugged in here and there.

I don't deny that one needs to change terminal characteristics
once in a while; I just say it won't happen so often that it needs to
be emulated in every detail, including every anachronism the old
hardware had. Emulation is not real hardware; its purpose
(besides the hobbyist's fun) is to run old software. Baud rates and clock
speeds were characteristics the old hardware had to deal with somehow,
just to communicate with the devices of that time. They were not the real
purpose. The emulator needs to provide the expected response to a request
issued by the simulated OS; it is usually not necessary to emulate
hardware misbehaviour (such as power-fail interrupts or RAM ECC errors).
Simh does not need to run in core memory to emulate a PDP-8.

>> Emulators and glass terminals are nowadays no longer in the hands
>> of end users; and I have yet to find someone who does not set the
>> maximum baud rate that will work for a given terminal/emulator
>> connection, other than for the nostalgic demonstration "see how slow
>> computing was in those days" (and even in this case: in contrast to the
>> real iron, the emulator can, in principle, be artificially throttled
>> to give a "realistic feeling" if one actually wants that).
>
> Ha! And what is, pray tell, the "maximum baudrate"? It varies for the
> terminals that you use, and the interface they are connected to. Which
> means you must be able to adapt on both sides in order to have a working
> connection. The setting of baud rates is as important today as it was 30
> years ago.

No "Ha!" The "maximum baudrate" is the minimum of what:
1. the target system ("VMS") can handle,
2. the host system hardware (the PC's serial I/O port) can handle,
3. the attached glass terminal can handle.
There is also a minimum baud rate; contemporary PCs might no longer
accept 110 Bd or lower.

For each of the three components, the limits are known. If you want
a working system - anything else is ridiculous - you choose
a rate that is within these bounds. Usually one chooses the highest
possible value - we are impatient. And then we leave it at this value
and enjoy the virtual blinkenlights. If we want to demonstrate to
someone what a 1200 Bd line was about, we run a different experiment,
without needing to operate the big rocker switch at the back and
anxiously wait to see whether the disks spin up again.
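
The selection rule above amounts to intersecting the three sets of supported rates and taking the maximum. A minimal sketch, with purely illustrative rate lists (not taken from any real device table):

```python
# Hypothetical sketch: pick the fastest baud rate all three parties support.
# The rate sets below are made up for illustration.
VMS_RATES  = {300, 1200, 2400, 4800, 9600}           # what the target OS accepts
HOST_RATES = {1200, 2400, 4800, 9600, 19200, 38400}  # what the PC serial port offers
TERM_RATES = {2400, 4800, 9600, 19200}               # what the glass terminal can do

def max_common_baud(*rate_sets):
    """Return the highest rate present in every set."""
    common = set.intersection(*rate_sets)
    if not common:
        raise ValueError("no baud rate works on all sides")
    return max(common)

print(max_common_baud(VMS_RATES, HOST_RATES, TERM_RATES))  # -> 9600
```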

This is emulation, real hardware is another world.

> The OP wanted to hook up a real DEC terminal. Now, a real DEC terminal
> has a max baud rate that very much depends on which terminal it is.
> A VT220 can do 19200 bps. A VT320, I seem to remember, the same. A VT420
> can do 38400. A VT520 can do 38400 as well, but I think it can actually
> do 115K. A real VT100 I think is limited to 9600.

Yes. Okay. Again, I have never said in this discussion that the ability
to set the baud rate is unnecessary. I reacted to the proposition that
it is not possible in simh. David (Bryan) and I have run the HP2100 MUX
and the PDP-11 DZ on a VT220 and an HP2648. sim_serial.[ch] didn't make it
into 3.9.0, for reasons I don't know and won't investigate.

> On the system side, a DZ11 cannot be told to do more than 9600. A DH11
> is also limited to 9600, but also has two magic values (ExtA and ExtB).
> A DHU11 can do 19200.
>
> I hope you start seeing the problem. Being able to set the baud rate is
> necessary for a successful connection.

I think you missed my point. Besides, an emulator can run things much
faster than the original hardware. I could tell VMS that it has accepted
the command to set the DZ11 to 9600 while serving a real VT520 at
38400 on the physical PC serial port. And neither side knows there is
a converter between them. It is software - if you want it, you can
intercept the data stream, so that VMS can send its Get Device Settings
ESC sequence and receive the answer "yes, I am a VT100 at 9600 Bd" (where
it is actually a VT520).
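
As a sketch of that interception: a converter sitting between the simulated MUX and the real terminal can answer the Primary Device Attributes query (ESC [ c) itself, with the response a VT100 with AVO would give (ESC [ ? 1 ; 2 c). The function and its callback are hypothetical names, not part of simh:

```python
# Hypothetical converter fragment: answer VMS's "who are you?" query locally
# instead of passing it through to the real (different, faster) terminal.
DA_QUERY = b"\x1b[c"      # Primary Device Attributes request (CSI c)
DA_VT100 = b"\x1b[?1;2c"  # response of a VT100 with AVO

def filter_from_host(data, reply_to_host):
    """Forward host->terminal data, but intercept the DA query."""
    if DA_QUERY in data:
        reply_to_host(DA_VT100)             # claim to be a VT100
        data = data.replace(DA_QUERY, b"")  # the terminal never sees the query
    return data                             # everything else passes through
```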

>>> Auto baud detection. Not at all uncommon... And having different speeds
>>> on some terminals, for whatever reason... Depends on how you generate
>>> the system. The speeds of the terminal ports are set by the OS at boot
>>> time.
>>
>> Besides being an API matter - somehow the OS must itself set the
>> appropriate (emulated) hardware bits, and the emulator could react to
>> such changes - there is hardly any reason that the emulator actually
>> needs to react in the same way the hardware would do.
>
> No? Why not? If I have my VT320, and I want to connect it to a serial
> port that goes in to the simh system, I think it makes sense that I on
> the simh console can tell the OS that TT1: should be running at 19200
> bps. I then set my VT320 to 19200 bps, and I connect it to the serial
> port, and at that point I would expect it to work. No?

See above for a different scenario. The emulator needs to make the
target OS happy on one side and the terminal on the other side. No one
has carved it in stone that the baud rate needs to be the same on
both sides. It is not even clear that the VT100 the target OS sees
is actually one (it is just an illusion). This is usually called
terminal emulation.

>> The fall of mankind was already done with a telnet interface to
>> an emulated terminal multiplexer - TCP/IP has no concept of baudrates,
>> so the emulator has to take measures anyway in order not to flood the OS
>> with data it expects to arrive at lower rates.
>
> If we talk about tcp and telnet, it's a different story. There are no
> speeds, so that part can be ignored.

Oh, but the emulator needs to fake the DZ11. VMS on one side really
believes it has a VT100 attached, and will then start to program it
with its known ESC codes. And the mentioned SET TT10:/BAUD=9600
(I seem to remember this was RSX-11M speak) will arrive at
the emulator, which just stores it for a future Read Device Settings
ESC sequence, maybe happily acknowledges it, and does nothing.
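
That store-and-acknowledge behaviour is trivial to sketch. The class and method names here are invented for illustration; they do not correspond to simh's actual DZ11 implementation:

```python
# Hypothetical line-parameter stub: remember what the guest OS wrote,
# acknowledge it, and never touch the real host-side port speed.
class FakeMuxLine:
    def __init__(self):
        self.guest_baud = 9600  # what the simulated OS believes is set

    def write_speed(self, baud):
        self.guest_baud = baud  # store for later read-back
        return True             # "happily acknowledge it, and do nothing"

    def read_speed(self):
        return self.guest_baud  # answer a later settings query from the guest
```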

> Also, there is always flow control, so that part needs to be handled.
> Sometimes that actually makes the emulation break.

Is there a reason why this cannot be absorbed in the emulator? VMS sees
XON/XOFF, the attached terminal sees DTR/DSR and RTS/CTS? It is entirely
in the hands of the emulator's author.
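
One direction of such a translation, sketched under the assumption that the guest speaks in-band XON/XOFF while the physical port uses RTS for hardware handshaking; `set_rts` stands in for whatever call drives the real modem-control line:

```python
# Hypothetical flow-control translation: strip in-band XON/XOFF coming
# from the guest and drive the RTS line toward the terminal instead.
XON, XOFF = b"\x11", b"\x13"  # DC1 / DC3

def guest_bytes_to_terminal(data, set_rts):
    """Forward guest output, mapping its flow control onto RTS."""
    if XOFF in data:
        set_rts(False)  # guest asked the terminal to stop: drop RTS
    if XON in data:
        set_rts(True)   # guest is ready again: raise RTS
    # The terminal never sees the in-band control characters.
    return data.replace(XON, b"").replace(XOFF, b"")
```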

Maybe this is the real problem. sim_simulate(), the CPU emulation core,
is usually rather accurate and good; the framework itself is not the
dernier cri in software design.

-- 
Holger


