Forums

UART RX interrupt handlers

Started by Alaric B Snell February 23, 2004

Has anyone got a working example of a UART0 receive interrupt handler
that I could study to see what I'm missing?

Mine works perfectly within Keil's ARM simulator, but in real life,
sending a character in on UART0 puts the system in a new random state
every time! The main loop is outputting characters via the UART, so my
first thought is that maybe I need to be more careful about disabling
interrupts to prevent reading the receive buffer register during my
main-loop code that loops until the transmit holding register is empty
then outputs a character.

It's all written in assembly language, so none of the usual
gcc-being-too-smart issues here...

An earlier run-in with code that worked fine in simulator and died in
real life concerned the simulator assuming I had more RAM than I had,
and not throwing a data abort when I accessed outside of it - so I now
have a separate handler for each ARM vector, plus a default VIC vector,
all of which put a different pattern on my test LEDs so I can tell it's
happened. My UART0 receive interrupt handler should put the received
char on the LEDs, yet somehow it always makes all the LEDs go on, like
when the chip is reset.

I'm starting to wonder if there *is* something in this JTAG debugging
lark after all ;-) This simulator seems too optimistic - it will
correctly reflect the behaviour of a correct program, but will not
always correctly reflect the behaviour of an incorrect program. Bah!
When I was designing a logic simulator for asynchronous processor
design, I took every opportunity to make it take worst-case estimates
for bus settling times and the like, and whenever anything was read from
in a potentially indeterminate state, it flagged it there and then, to
root out as many potential problems as possible...

Background: I'm writing my LPC21xx FORTH, as mentioned before. It's now
setting the system state up fine and starting executing after assembling
a few basic words (like EMIT, for now hardcoded to use UART0, although
at a later date to support swapping in different input/output drivers by
updating a pointer; I plan to support console over I2C, for
multi-processor setups) onto a dictionary list. But I want to have an
interrupt handler for UART0 that puts characters into a buffer for KEY
to read, except for Ctrl+C which will reset the stacks and drop into the
interpreter with the console driver switched to UART0, and Ctrl+R which
will do the same but also reset the wordlist pointer (for when you've REALLY
hosed the system) - these are needed because the system will, by
default, load a file from a FLASH filesystem on startup and begin
executing, so we need a way to force it into a clean working state that
cannot be overridden.

ABS



An Engineer's Guide to the LPC2100 Series

Alaric
Check my "Hello World on UART0 with a Blinky Light, an elapsed system
timer, and optional interrupts" code (UT040322A.zip) in the files section
on Yahoo.

Regards
-Bill Knight
R O SoftWare
On Mon, 23 Feb 2004 20:27:53 +0000, Alaric B Snell wrote:
> Has anyone got a working example of a UART0 receive interrupt handler
> that I could study to see what I'm missing?


--- In , Alaric B Snell <alaric@a...> wrote:
>
> Background: I'm writing my LPC21xx FORTH, as before mentioned. It's now

Dude! What would it take to get you to defect over to the OKI ARM
camp? I am looking to host Forth on the OKI 67Q5003 ARM mcu. I am a
newbie to the ARM and I have not worked with Forth at this level
before. So I was looking at using a commercial Forth like MPE. But
it will be pretty expensive to buy one complete with a support package
for a new chip like this.

> setting the system state up fine and starting executing after assembling
> a few basic words (like EMIT, for now hardcoded to use UART0, although
> at a later date to support swapping in different input/output drivers by
> updating a pointer; I plan to support console over I2C, for
> multi-processor setups) onto a dictionary list. But I want to have an
> interrupt handler for UART0 that puts characters into a buffer for KEY
> to read, except for Ctrl+C which will reset the stacks and drop into the
> interpreter with the console driver switched to UART0, and Ctrl+R which
> will do same but also reset the wordlist pointer (for when you've REALLY
> hosed the system) - these are needed because the system will, by
> default, load a file from a FLASH filesystem on startup and begin
> executing, so we need a way to force it into a clean working state that
> cannot be overriden.

I don't think I would use a serial port character for this; it would
be very dangerous in any real-world system. A wrong baud rate could
result in any given character being "received". Given time, the
system will get hosed. Instead, I suggest that an input be used to
indicate on reset what actions are to be taken. A single input pin can be
used to check for a jumper to ground, a jumper to a Vdd pullup or no
jumper. Then a push button reset will boot normally or have two
levels of recovery from a problem.



redsp@y... wrote:
> --- In , Alaric B Snell <alaric@a...> wrote:
>
>>Background: I'm writing my LPC21xx FORTH, as before mentioned. It's now
> Dude! What would it take to get you to defect over to the OKI ARM
> camp? I am looking to host Forth on the OKI 67Q5003 ARM mcu. I am a
> newbie to the ARM and I have not worked with Forth at this level
> before. So I was looking at using a commercial Forth like MPE. But
> it will be pretty expensive to buy one complete with a support package
> for a new chip like this.
>

What I'm doing probably won't take much porting, and most of that will be
in a small number of places, and I'll open source it, so feel free to port
away, or if you get really desperate, lend me an OKI devel board so I
can do it ;-)

> I don't think I would use a serial port character for this, it would
> be very dangerous in any real world system. A wrong baud rate could
> result in any given character being "received". Given time, the
> system will get hosed. Instead, I suggest that an input be used to
> indicate on reset what actions to be taken. A single input pin can be
> used to check for a jumper to ground, a jumper to a Vdd pullup or no
> jumper. Then a push button reset will boot normally or have two
> levels of recovery from a problem.

I was trying to avoid needing another I/O pin as being 'special' and, as
such, being awkward to use - if you want to use P0.14, you need to make
sure it will remain high for a few ms after /RST rises, if you want your
system to boot at all...

The Ctrl+C and Ctrl+R won't hose the entire system per se - a reset will
always bring it back again; it just clears up the current state of RAM.

As it stands I'm putting in a delay of a few ms after installing the
interrupt handler to allow for a Ctrl+C to get in there before execution
of code from Flash begins, just in case - one could make the interrupt
handler ignore Ctrl+C after this period, if so desired, meaning that RxD
is the special line to be kept in a known state for a few ms after
reset, rather than a GPIO somewhere.

ABS



--- In , Alaric B Snell <alaric@a...> wrote:
> redsp@y... wrote:
> > Dude! What would it take to get you to defect over to the OKI ARM
> > camp? I am looking to host Forth on the OKI 67Q5003 ARM mcu. I am a
> > newbie to the ARM and I have not worked with Forth at this level
> > before. So I was looking at using a commercial Forth like MPE. But
> > it will be pretty expensive to buy one complete with a support package
> > for a new chip like this.
> >
>
> What I'm doing probably won't take much porting, and most of that all in
> a small number of places, and I'll open source it so feel free to port
> away, or if you get really desperate, lend me an OKI devel board so I
> can do it ;-)

Hey, if you are at all serious, I can do that. I have a Cogent eval
board on the way, they called me today about sending the invoice. I
can get another one easily. It has an ML67Q5003 at 60 MHz with SDRAM,
Flash, Ethernet and I forget what all else on the board. It comes with a
monitor in Flash.

What are you using for debugging? Seems like most of the development
tools are not cheap either.



Hi,

I had some real big problems with version 1.0c from Keil. But
with the new version 1.1a it seems like the simulator is working
correctly. Send them an email to get the update link...

//Helge

--- In , Alaric B Snell <alaric@a...> wrote:
> Has anyone got a working example of a UART0 receive interrupt handler
> that I could study to see what I'm missing?




redsp@y... wrote:

>>a small number of places, and I'll open source it so feel free to port
>>away, or if you get really desperate, lend me an OKI devel board so I
>>can do it ;-)

> Hey, if you are at all serious, I can do that. I have a Cogent eval
> board on the way, they called me today about sending the invoice. I
> can get another one easily. It has an ML67Q5003 at 60 MHz with SDRAM,
> Flash, Ethernet and I forget what all on the board. It comes with a
> monitor in Flash,

When I've got it working on my LPC, remind me and I'll see what I can
do. Depends on my workload at the time!

> What are you using for debugging? Seems like most of the development
> tools are not cheap either.

I'm doing it the hard way - lacking a JTAG debugger (although I probably
could set one up since I have a parallel port JTAG thingy for my Lattice
ISP devices, although I gather it's wired differently to the Wigglers
and so on people round these parts use), I make my code set and unset
LEDs at key points.

Right now, I've found that my ISR works fine (OK, I wasn't acking the
interrupt properly, but that was easily detected with a 'scope and fixed)
as long as I'm not outputting characters in my main loop.

Looking at the really nice sample code in the files section that Bill
Knight pointed me to, I don't see any interrupt disabling or anything
around the non-interrupt UART output code. Now, since my test loop was
outputting characters at 9600 baud flat out, it was spending almost all
of its time looping for the THRE bit to go up so I could put out the
next byte (FIFO disabled).

Interestingly, the problem that made it crash seemed to be that the VIC
was returning a garbage vector - if I replace the *raw* IRQ vector with
code to set the LEDs to a known state, rather than all that LDR PC,
[PC, #-0xFF0] stuff, it would always do so. This is highly interesting,
especially since I have a valid default vector address in the VIC, too.
Some problem with overloading the VPB? I'm running the CPU at 60MHz (MAM
enabled, down to 2 wait states, for maximum performance, just for
kicks!) with a VPBDIV of 4.

And the same code works when not blatting out characters at 9600 baud.

I'm going to try:

1) Disabling IRQs while reading the THRE bit (although obviously
not for the entire wait-until-THRE-set loop), and/or while writing to
the THR, to see if either helps.

2) Compiling up Bill's code and getting it outputting at 9600 baud, non
stop, with receive interrupt handling enabled, then sending a character
and seeing what happens.

3) Running things slower - slowing the CPU, making it wait between
outputting characters, speeding up the VPB, etc.

Failing that, has anyone got an ETM setup? :-)

ABS



Check the new version of the UART code in the files section. It
now disables global interrupts very briefly in the non-interrupt
UART code.

Regards
-Bill Knight
R O SoftWare
On Tue, 24 Feb 2004 22:14:09 +0000, Alaric B Snell wrote:

> Looking at the really nice sample code in the files section that Bill
> Knight pointed me to, I don't see any interrupt disabling or anything
> around the non-interrupt UART output code.