EmbeddedRelated.com
Forums

ADC Burst mode with Interrupts Enabled... possible?

Started by robertfleury May 21, 2012
Hi all,

I'm trying to get the ADC working reliably on my LPC1763 board under
CrossWorks. I noticed some interesting behaviour and would like some
confirmation on it. My basic implementation is a Timer ISR which activates
burst mode, and an ADC interrupt, triggered on a chosen channel, where I
first disable burst mode and then read the data from the registers.

Currently, I have my ADC set up to convert 6 channels, and the interrupt set
to trigger on channel 4's done bit. What appears to happen is: the Timer ISR
occurs, burst mode gets activated, channels 0 through 4 are converted, 5 and
6 are unconverted (DONE and OVERRUN = 0), and, oddly, channel 0 is set to
overrun. How is this possible? Channels 5 and 6 never even got converted.
The datasheet says that burst converts all channels that are enabled in the
select bits, and 0-6 are enabled. Any known reason for this? Is there any
way to use interrupts without clobbering A/D channel 0? Thanks.


Hi Robert,

Quote: "the interrupt set to trigger on channel 4 done bit."

The ADC will sample from AD0 to AD7. If you want AD5 and AD6 to be ready when the ADC interrupt happens, set the ADC interrupt to trigger on channel 6.

Regards,
-daniel

The question is about the overrun bit on channel 0, not about AD5 and AD6.

Regards,
Bernardo Marques.

Hi Robert,


I'm not exactly familiar with the LPC1763, but I want to give some hints
regarding A/D conversion.
First, you write that you set up the ADC to convert 6 channels.
That is, channel 0 to channel 5, right?
But you write that you see channels 0-4 converted and channels 5 and 6
unconverted. That is 7 channels in total - did you set 6 or 7 channels to
be converted?

In my experience with ADCs, it is only possible to get an interrupt after
all desired channels have been converted. If you set the interrupt to a
channel prior to "all converted", the ADC might restart conversion from the
first channel again.
And this is what I expect happens in your case.
You said you set 6 channels to be converted; that is channels 0-5.
What happens now is:
channels 0-4 are converted (5 channels) -> the ISR fires and the ADC
restarts -> one more conversion completes -> channel 0 is overwritten,
setting its overrun bit.

Please check your setup again with these hints in mind - maybe then you will find what is wrong.

Greetings
Carsten Schmid


On 21/05/2012 17:31, robertfleury wrote:
>
> Currently, I have my ADC setup to convert 6 channels, and the
> interrupt set to trigger on channel 4 done bit.
>
This is not well done, and it is not the simplest way to do the job.
Burst mode conversion is useful to free the microcontroller from tedious
work such as continuously programming the next channel to convert.
The idea is:
1. set the channels to convert (adjacent channels work better than sparse
channels)
2. enable the IRQ of the last channel to convert (it means end of
conversion)
3. when the IRQ fires, you have about 5 microseconds to read and store the
1st result

So if you need to convert ADC data continuously, you should set up a DMA
channel to transfer the ADC data when the 6th conversion is finished. You
should use a queue or a ring buffer, making proper use of the
auto-increment feature of the DMA.

If you don't need to convert data continuously - maybe one burst every 10
milliseconds - you should also set the 5th channel's IRQ, because when it
fires the 6th conversion has started; so in the 5th channel's IRQ you
should disable the burst function, and in the 6th channel's IRQ you should
read the 6 channels' data. At a higher clock frequency a little delay may
be necessary to be sure that the 6th conversion has started; if it hasn't,
you should not use the 5th IRQ.




This is not necessarily the best way to do it.

Once a channel conversion has started it will run to completion, and the
burst mode will recycle the channels.

Interrupting after the last conversion results in a tight window to read
the first conversion before it gets overwritten. That is OK if you don't
care about conversion timing, but not if you are using samples where
(relative) sample time is important.

The simplest way is to halt the ADC whilst it is converting the last
channel; then, after a few microseconds, you can read the results.

I've tried both methods and this is much more reliable; I've been running
this mode on LPC11xx and LPC17xx devices reliably for a very long time.

In my case I let the conversion of channel n-1 generate the interrupt; the
ISR immediately clears the burst flag, does some general initialization,
then reads the ADC values, and I get no errors.

Regards

Phil.


On 22/05/2012 11:15, Phil Young wrote:
>
> The simplest way is to halt the ADC whilst it is converting the last
> channel, then after a few us you can read the results.
>
>
I said "if you need a continuous conversion" (meaning you can't stop the
ADC), and I was just following what seemed to be his intention, correcting
his main idea a little.
Personally, I prefer to use DMA for continuous conversions; it is my 1st
choice.
> I've tried both methods and this is much more reliable, I'm running this
> mode on LPC11xx and LPC17xx devices reliably now for a very long time.
>
> In my case I let the conversion of channel n-1 generate the interrupt then
> the ISR immediately clears the burst flag, does some general
> initialization
> then reads the ADC values and I get no errors.
>
Personally, in low-end applications I need no more than one burst every 10
milliseconds, so I read the previous conversions and then start one burst;
I read the results at the next 10-millisecond tick.
In more demanding applications I use DMA for continuous conversions.
Sometimes I do calculations after every DMA end-of-transfer interrupt, but
more commonly I do them when I have time, in a task or in the main loop.



I was assuming nothing; the conversion speed is a function of the ADC clock
frequency, but this is often not what is required.

If using an ISR to trigger conversions with a timer then presumably the rate
is important, and usually so is the relative timing of the samples.

The ADC burst mode is broken, as I have said in the past, because there is
no way to trigger a single burst conversion. This is a big mistake, as it
often complicates the software; IMHO NXP should add a flag to automatically
stop the burst on completion of one conversion set.

In the meantime, if you need to convert all channels, and the order of
conversion matters (as it often does), then stopping the ADC after the
penultimate channel's conversion is the easiest way; it will always
complete the conversion in progress.

But this always relies on the ISR latency being less than 1 conversion time.

It is possible to simplify things slightly by making the ISR for the
penultimate conversion a high priority and letting it re-trigger after the
final conversion, but this can be messy to debug.

Regards

Phil.



Seems like the hard way to do it.

Why not set the ADC to start conversion on a timer match, then enable the interrupt for the last channel converted? Note that the ADC does one conversion per match, so the timer will need to run 6X faster.

That way you get precise timing, unaffected by interrupt latency.

I do this with an LPC1759.


There is a flaw with this mechanism: it is in fact more sensitive to
interrupt latency. The issue is that the burst has to be stopped by
software, but even once stopped it will complete the conversion it has
started, so it will overwrite the first conversion fairly soon and you have
to read the value quickly.

This can work in some instances, but it is not ideal for many cases. For
instance, I need to convert at 20 kHz; I want one sample per channel at
exactly this rate and need the skew between channels to be as short as
possible, which means converting at the maximum ADC clock rate and then
shutting off the ADC.

If you interrupt on completion of the last channel, then you need to ensure
that the ADC ISR has the highest priority and can sample the data before
ADC0 is overwritten; but if you interrupt on channel 6's completion, then
the only time-critical part is clearing the burst mode bit, which is faster
and so relaxes the ISR latency requirements.

Triggering a single burst with a timer would be the ideal solution, but NXP
did not think to make a single burst possible, so interrupt timing becomes
more critical: stopping the ADC requires a low interrupt latency.

The easiest way to do this is with two ISRs. The first ISR runs from a
timer at the highest priority and starts the ADC; it takes just a couple of
instructions.

The second ISR is triggered by completion of channel 6 and immediately
stops the ADC; it runs at a lower priority and samples the ADC into an
array in the middle of the ISR (after processing the previous sample set).
Stopping the ADC during the conversion of channel 7 means that channel 7
will still complete, and by the end of the ISR the ADC has halted, so the
results can simply be copied to an array; the timing for sampling the ADC
is therefore more relaxed.

In my case there is a simple main loop, but all DSP is done in the ISR to
guarantee that sample processing is synchronous to data sampling. This
takes 85% of the CPU power and avoids any synchronization events or RTOS
requirements, but the ADC has to be restarted before the ADC ISR completes,
which is why the timer ISR has higher priority (but is extremely fast).
It's a simple solution that works perfectly, but it's really a workaround
for a deficiency in the ADC specification.

There are probably many more ways to achieve it; it all depends on what the
requirements are. But when somebody says they are using a timer to start
the ADC rather than just leaving it in burst mode, I would assume that
precise sample timing is important to them.

Regards

Phil.




