public inbox for ecos-discuss@sourceware.org
* [ECOS] RedBoot gets() problems
@ 2001-03-02  8:17 Grant Edwards
  2001-03-02  8:31 ` Gary Thomas
  0 siblings, 1 reply; 7+ messages in thread
From: Grant Edwards @ 2001-03-02  8:17 UTC (permalink / raw)
  To: ecos-discuss

I'm having problems with RedBoot due to the way the main loop
and gets() interact.  

If at some point a spurious byte comes in on one of the diag
ports, RedBoot effectively "locks up" until it sees an
end-of-line on that port: it ignores network packets, it
ignores commands on the other diag port.

I think I'm going to have to re-design the input scheme so that
Redboot still responds to the network and to all ports while
in the "middle" of reading an input line.

-- 
Grant Edwards
grante@visi.com


* RE: [ECOS] RedBoot gets() problems
  2001-03-02  8:17 [ECOS] RedBoot gets() problems Grant Edwards
@ 2001-03-02  8:31 ` Gary Thomas
  2001-03-02  8:38   ` Grant Edwards
  0 siblings, 1 reply; 7+ messages in thread
From: Gary Thomas @ 2001-03-02  8:31 UTC (permalink / raw)
  To: Grant Edwards; +Cc: ecos-discuss

On 02-Mar-2001 Grant Edwards wrote:
> 
> I'm having problems with RedBoot due to the way the main loop
> and gets() interact.  
> 
> If at some point a spurious byte comes in on one of the diag
> ports, RedBoot effectively "locks up" until it sees an
> end-of-line on that port: it ignores network packets, it
> ignores commands on the other diag port.
> 

What defines "spurious"?

> I think I'm going to have to re-design the input scheme so that
> Redboot still responds to the network and to all ports while
> in the "middle" of reading an input line.

I'm not convinced that this is the right thing to do.  Maybe
the check for network packets is OK (but I ruled out doing it all
the time because of overhead costs), but once data arrives on
one port, then RedBoot is designed to switch to that port
exclusively.

You can disable this using the CDL, which may be what you want.


* Re: [ECOS] RedBoot gets() problems
  2001-03-02  8:31 ` Gary Thomas
@ 2001-03-02  8:38   ` Grant Edwards
  2001-03-02  8:48     ` Gary Thomas
  0 siblings, 1 reply; 7+ messages in thread
From: Grant Edwards @ 2001-03-02  8:38 UTC (permalink / raw)
  To: Gary Thomas; +Cc: ecos-discuss

On Fri, Mar 02, 2001 at 09:31:11AM -0700, Gary Thomas wrote:

> > I'm having problems with RedBoot due to the way the main loop
> > and gets() interact.  
> > 
> > If at some point a spurious byte comes in on one of the diag
> > ports, RedBoot effectively "locks up" until it sees an
> > end-of-line on that port: it ignores network packets, it
> > ignores commands on the other diag port.
> 
> What defines "spurious"?

Spurious as in there's nothing connected to that port, so it's
floating.  On power-up, sometimes there seems to be noise that
generates input data on unconnected ports.  Flushing the
receive data when I initialize the port seems to help.
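
The flush itself is nothing fancy -- a sketch, with
port_getc_nonblock() standing in for a non-blocking HAL read:

extern int port_getc_nonblock(int port, char *c);   /* 1 if a char was read */

/* Drain any noise-induced characters sitting in the receiver before
 * the port is used, so a junk byte can't select it as the console. */
static void
flush_rx(int port)
{
    char junk;

    while (port_getc_nonblock(port, &junk))
        ;   /* read and discard until the receiver is empty */
}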

> > I think I'm going to have to re-design the input scheme so that
> > Redboot still responds to the network and to all ports while in
> > the "middle" of reading an input line.
> 
> I'm not convinced that this is the right thing to do.  Maybe
> the check for network packets is OK (but I ruled out doing it
> all the time because of overhead costs),

I'm not sure what you mean by "overhead costs".  Are you
concerned about not handling characters fast enough once they
start to arrive?  The minimum inter-character gap is already
defined by the length of time it takes to do a network poll.

> but once data arrives on one port, then RedBoot is designed to
> switch to that port exclusively.

Right.  My problem is that it sometimes switches exclusively to
a port with nothing connected -- at which point the board
becomes dead to the world.

> You can disable this using the CDL, which may be what you want.

That might be what I need to do.

-- 
Grant Edwards
grante@visi.com


* Re: [ECOS] RedBoot gets() problems
  2001-03-02  8:38   ` Grant Edwards
@ 2001-03-02  8:48     ` Gary Thomas
  2001-03-02  9:08       ` Grant Edwards
  0 siblings, 1 reply; 7+ messages in thread
From: Gary Thomas @ 2001-03-02  8:48 UTC (permalink / raw)
  To: Grant Edwards; +Cc: ecos-discuss

On 02-Mar-2001 Grant Edwards wrote:
>> > I think I'm going to have to re-design the input scheme so that
>> > Redboot still responds to the network and to all ports while in
>> > the "middle" of reading an input line.
>> 
>> I'm not convinced that this is the right thing to do.  Maybe
>> the check for network packets is OK (but I ruled out doing it
>> all the time because of overhead costs),
> 
> I'm not sure what you mean by "overhead costs".  Are you
> concerned about not handling characters fast enough once they
> start to arrive?  The minimum inter-character gap is already
> defined by the length of time it takes to do a network poll.
> 

I'm mostly concerned about the cost of checking the network
interface for data.  This involves seeing if any packets have
arrived, processing them if they have, and then checking to see
if a request to make a Telnet/TCP connection has been made.  This
is all quite expensive and should not be incurred for every input
character; hence the choice to only make such a check when the
"console" port is idle.

One change which might help (wrt network packets) is to treat
all characters the same, i.e. with a timeout; that would let you
check for network activity while input was being received on a
serial port.  I think the code would get very upset if a
Telnet/TCP connection arrived while a command was being entered,
though, which is another reason to only handle it at the pure
idle point, when no [serial] characters have arrived at all.


* Re: [ECOS] RedBoot gets() problems
  2001-03-02  8:48     ` Gary Thomas
@ 2001-03-02  9:08       ` Grant Edwards
  2001-03-03  5:28         ` Gary Thomas
  0 siblings, 1 reply; 7+ messages in thread
From: Grant Edwards @ 2001-03-02  9:08 UTC (permalink / raw)
  To: Gary Thomas; +Cc: ecos-discuss

On Fri, Mar 02, 2001 at 09:48:06AM -0700, Gary Thomas wrote:

> On 02-Mar-2001 Grant Edwards wrote:
> >> > I think I'm going to have to re-design the input scheme so that
> >> > Redboot still responds to the network and to all ports while in
> >> > the "middle" of reading an input line.
> >> 
> >> I'm not convinced that this is the right thing to do.  Maybe
> >> the check for network packets is OK (but I ruled out doing it
> >> all the time because of overhead costs),
> > 
> > I'm not sure what you mean by "overhead costs".  Are you
> > concerned about not handling characters fast enough once they
> > start to arrive?  The minimum inter-character gap is already
> > defined by the length of time it takes to do a network poll.
> 
> I'm mostly concerned about the cost of checking the network
> interface for data. This involves seeing if any packets have
> arrived, processing them if they have and then checking to see
> if a request to make a Telnet/TCP connection has been made.
> This is all quite expensive

Right.  But, it's not the absolute cost, only the opportunity
cost that counts.  It only matters if there's something else
that we could be doing but can't do because we're processing
network traffic.

> and should not be incurred for every input character; hence the
> choice to only make such a check when the "console" port is
> idle.

That cost is already incurred.  If the gap between rx
characters is smaller than the time it takes to do a network
poll, then initial characters in a command can be lost.  If I
send a command line to the board at 57600 baud, I generally
lose 2-3 characters, because rx characters are only processed
between network polls.

[I could probably get around that problem by setting up the DMA
controller to handle incoming serial data, but that's too
complicated.  This problem would also be alleviated by UARTs
with FIFOs]
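
For reference, the rough numbers behind this (assuming 8N1 framing,
i.e. 10 bits on the wire per character):

    character time at 57600 baud  =  10 / 57600  ~=  174 us
    characters per millisecond    ~=  5.7

So every millisecond spent polling the network covers roughly 5-6
character times, which is consistent with losing the first few
characters of a line that arrives back-to-back.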

Ignoring the network while reading a line isn't really gaining
me anything: I can't tolerate rx characters arriving any faster
than one per network poll anyway, so checking the network while
reading a line isn't imposing any additional restrictions on
baud rate.

> One change which might help (wrt network packets) is to treat
> all characters the same, i.e. with a timeout; that would let
> you check for network activity while input was being received
> on a serial port. I think the code would get very upset if a
> Telnet/TCP connection arrived while a command was being
> entered, though, which is another reason to only handle it at
> the pure idle point, when no [serial] characters have arrived
> at all.

I can see how the current scheme works well if there are FIFOs
in the UARTs -- we can tolerate the initial read delay due to
network polling (the FIFO starts to fill), but once we notice
characters are coming in, we stop polling the network until we
see the EOL.

This reflects (I think) the assumption that serial port traffic
is more important than network traffic.  For me, however,
handling network traffic is the top priority. I can tolerate
the serial ports going dead occasionally, but if the network is
ignored, I'm sunk.

-- 
Grant Edwards
grante@visi.com


* Re: [ECOS] RedBoot gets() problems
  2001-03-02  9:08       ` Grant Edwards
@ 2001-03-03  5:28         ` Gary Thomas
  2001-03-05 13:52           ` Grant Edwards
  0 siblings, 1 reply; 7+ messages in thread
From: Gary Thomas @ 2001-03-03  5:28 UTC (permalink / raw)
  To: Grant Edwards; +Cc: ecos-discuss

On 02-Mar-2001 Grant Edwards wrote:
>> and should not be incurred for every input character; hence the
>> choice to only make such a check when the "console" port is
>> idle.
> 
> That cost is already incurred.  If the gap between rx
> characters is smaller than the time it takes to do a network
> poll, then initial characters in a command can be lost.  If I
> send a command line to the board at 57600 baud, I generally
> lose 2-3 characters, because rx characters are only processed
> between network polls.
> 

This is probably not the problem.  The real problem is that the
timeouts were not being handled properly when you have multiple
serial channels and are still scanning for a console to use.  In
that case, the old code would wait 10/N ms (where N is the number
of channels) on each channel before deciding whether any
characters had arrived.  If N is 2, that is 5 ms.  At a baud rate
of 57600, 5 ms represents as many as 15 characters!  Thus, a
bunch of characters could arrive on one channel while the code
was still waiting on the other.

This patch will make it better, by forcing each channel to timeout
as fast as possible (sorry, but the granularity is only 1ms).

I'm still thinking about how [limited] network processing can take
place between characters.  However, I'm not convinced that RedBoot
is the place to be doing such processing.  It would seem to me that
if you need more processing and higher response rates, you should
really have a real application running, using interrupts, etc.
... but that's probably a whole different discussion.

 
Index: redboot/current/src/io.c
===================================================================
RCS file: /home/cvs/ecc/ecc/redboot/current/src/io.c,v
retrieving revision 1.20
diff -u -5 -p -r1.20 io.c
--- redboot/current/src/io.c    2001/01/17 13:50:59     1.20
+++ redboot/current/src/io.c    2001/03/03 13:20:26
@@ -88,32 +88,39 @@ mon_read_char(char *c)
         __chan = CYGACC_CALL_IF_DEBUG_PROCS();
         *c = CYGACC_COMM_IF_GETC(*__chan);
     }
 }
 
+#ifdef CYGPKG_REDBOOT_ANY_CONSOLE
+static int _mon_timeout;
+#endif
+
 static bool
 mon_read_char_with_timeout(char *c)
 {
-    bool res;
+    bool res = false;
     hal_virtual_comm_table_t *__chan;
 
 #ifdef CYGPKG_REDBOOT_ANY_CONSOLE
     if (!console_selected) {
         int cur = CYGACC_CALL_IF_SET_CONSOLE_COMM(CYGNUM_CALL_IF_SET_COMM_ID_QUERY_CURRENT);
-        int i;
+        int i, tot;
         // Try input from all channels
-        for (i = 0;  i < CYGNUM_HAL_VIRTUAL_VECTOR_COMM_CHANNELS;  i++) {
-            CYGACC_CALL_IF_SET_CONSOLE_COMM(i);
-            __chan = CYGACC_CALL_IF_CONSOLE_PROCS();
-            res = CYGACC_COMM_IF_GETC_TIMEOUT(*__chan, c);
-            if (res) {
-                // Input available on this channel, make it be the console
-                if (*c != '\0') {
-                    // Don't chose this unless real data have arrived
-                    console_selected = true;
-                    CYGACC_CALL_IF_SET_DEBUG_COMM(i);
-                    return res;
+        tot = 0;
+        while (tot < _mon_timeout) {
+            for (i = 0;  i < CYGNUM_HAL_VIRTUAL_VECTOR_COMM_CHANNELS;  i++, tot++) {
+                CYGACC_CALL_IF_SET_CONSOLE_COMM(i);
+                __chan = CYGACC_CALL_IF_CONSOLE_PROCS();
+                res = CYGACC_COMM_IF_GETC_TIMEOUT(*__chan, c);
+                if (res) {
+                    // Input available on this channel, make it be the console
+                    if (*c != '\0') {
+                        // Don't chose this unless real data have arrived
+                        console_selected = true;
+                        CYGACC_CALL_IF_SET_DEBUG_COMM(i);
+                        return res;
+                    }
                 }
             }
         }
         CYGACC_CALL_IF_SET_CONSOLE_COMM(cur);        
     } else 
@@ -137,12 +144,13 @@ mon_set_read_char_timeout(int ms)
 
 #ifdef CYGPKG_REDBOOT_ANY_CONSOLE
     if (!console_selected) {
         int cur = CYGACC_CALL_IF_SET_CONSOLE_COMM(CYGNUM_CALL_IF_SET_COMM_ID_QUERY_CURRENT);
         int i;
-        // Set timeout on each channel so total amounts to desired value
-        ms = ms / CYGNUM_HAL_VIRTUAL_VECTOR_COMM_CHANNELS;
+        // Set timeout to minimum on each channel; total amounts to desired value
+        _mon_timeout = ms;
+        ms = 1;
         for (i = 0;  i < CYGNUM_HAL_VIRTUAL_VECTOR_COMM_CHANNELS;  i++) {
             CYGACC_CALL_IF_SET_CONSOLE_COMM(i);
             if ((__chan = CYGACC_CALL_IF_CONSOLE_PROCS()) != 0) {
                 CYGACC_COMM_IF_CONTROL(*__chan, __COMMCTL_SET_TIMEOUT, ms);
             }


* Re: [ECOS] RedBoot gets() problems
  2001-03-03  5:28         ` Gary Thomas
@ 2001-03-05 13:52           ` Grant Edwards
  0 siblings, 0 replies; 7+ messages in thread
From: Grant Edwards @ 2001-03-05 13:52 UTC (permalink / raw)
  To: Gary Thomas; +Cc: ecos-discuss

On Sat, Mar 03, 2001 at 06:28:32AM -0700, Gary Thomas wrote:

> On 02-Mar-2001 Grant Edwards wrote:

> > That cost is already incurred.  If the gap between rx
> > characters is smaller than the time it takes to do a network
> > poll, then initial characters in a command can be lost.  If I
> > send a command line to the board at 57600 baud, I generally
> > lose 2-3 characters, because rx characters are only
> > processed between network polls.
> 
> This is probably not the problem.  The real problem is that
> the timeouts were not being handled properly when you have
> multiple serial channels and are still scanning for a console
> to use.  In that case, the old code would wait 10/N ms (where
> N is the number of channels) on each channel before deciding
> whether any characters had arrived.  If N is 2, that is 5 ms.
> At a baud rate of 57600, 5 ms represents as many as 15
> characters! Thus, a bunch of characters could arrive on one
> channel while the code was still waiting on the other.

Ah.  I hadn't spotted that.

> This patch will make it better, by forcing each channel to
> timeout as fast as possible (sorry, but the granularity is only
> 1ms).

The current performance is actually fine for a human typing
commands to RedBoot -- it's only a problem when the stuff sent
to RedBoot is buffered and sent a line at a time.  I can get
GDB to connect to RedBoot via TCP but not via serial port, and
I suspect it's because characters are getting lost when GDB
initiates the connection.

One thing I'm a little fuzzy on: why not set the timeout to
zero?  If there's no character there, don't sit and wait; check
the next port (or the network) right away.  The only issue I can
think of is that the system time no longer gets updated if nobody
is doing any 1 ms delays.  Adding a 100 us delay in the main loop
and updating the system time every 10th pass could alleviate that
problem.
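
Something along these lines is what I'm picturing (just a sketch;
delay_us(), update_system_time_ms(), handle_char() and the
port/network routines are made-up stand-ins, not the real RedBoot
or HAL names):

#define NUM_PORTS 2

extern int  port_getc_nonblock(int port, char *c);  /* 1 if a char was read */
extern void net_poll(void);
extern void handle_char(int port, char c);
extern void delay_us(int us);
extern void update_system_time_ms(int ms);

static void
main_poll_loop(void)
{
    int tick = 0;
    char c;
    int i;

    for (;;) {
        /* Zero-timeout scan: take a character if one is waiting,
         * otherwise move straight on to the next port. */
        for (i = 0; i < NUM_PORTS; i++)
            if (port_getc_nonblock(i, &c))
                handle_char(i, c);

        net_poll();

        /* Keep the 1 ms system clock ticking even though nothing
         * blocks for 1 ms any more: burn 100 us per pass and bump
         * the clock on every 10th pass. */
        delay_us(100);
        if (++tick == 10) {
            tick = 0;
            update_system_time_ms(1);
        }
    }
}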

> I'm still thinking about how [limited] network processing can
> take place between characters.  However, I'm not convinced that
> RedBoot is the place to be doing such processing.  It would
> seem to me that if you need more processing and higher response
> rates, you should really have a real application running, using
> interrupts, etc. ... but that's probably a whole different
> discussion.

The throughput is plenty good enough for what little network
processing I want to do.  Except for the GDB issue I just
mentioned, my only problem is when RedBoot switches exclusively
to a console port due to a noise glitch and ignores the network
completely.

-- 
Grant Edwards
grante@visi.com


end of thread

Thread overview: 7+ messages
2001-03-02  8:17 [ECOS] RedBoot gets() problems Grant Edwards
2001-03-02  8:31 ` Gary Thomas
2001-03-02  8:38   ` Grant Edwards
2001-03-02  8:48     ` Gary Thomas
2001-03-02  9:08       ` Grant Edwards
2001-03-03  5:28         ` Gary Thomas
2001-03-05 13:52           ` Grant Edwards
