
LPC17xx ADC burst mode vs interrupt mode

Started by Jean-Sebastien Stoezel October 3, 2011
Hi:

I would like to identify the pros and cons of using the ADC of an LPC17xx in
burst vs. interrupt mode. The end application I'm working on is a typical
control system: analog channels are sampled at a fixed rate, then processed,
and eventually the outputs are updated.

I see at least 2 ways of implementing this:
- In interrupt mode, a timer would start the ADC conversions and an
interrupt would be generated when the conversion is over. The control loop
could be activated periodically by the completion interrupt.
- In burst mode, the ADC would continuously convert and update each channel
register. The control loop could be activated periodically by a timer, it
would then access the result of the continuous burst conversions.

Interrupt mode seems to be the solution that would provide the least
latency between sampling and the processing loop. It also limits the
amount of time the ADC is active, thus limiting power consumption.
Burst mode seems to be the solution that would generate the fewest
interrupts: no "conversion done" interrupt would be generated, since we're
pretty much polling the ADC periodically. However, there would be some
latency between the time a conversion is done and the moment the sample
is used by the control loop. Power consumption would also be higher,
since the ADC would be active continuously.

What are the other cons of using burst mode? Does this mode generate
more noise? I've read in several places that the LPC1768 ADC is very finicky
about noise. Any other issues with burst mode?

Regards,
Jean-Sebastien


Hi Jean-Sebastien,

Unfortunately this may be a bit more difficult than you imagine; it depends
on whether you care about the relative sampling times of the ADC channels.

I am currently using the ADC in this way and performing DSP on the outputs,
so I care about the sampling times, and it took a lot of effort to achieve a
working solution.

The problem is that you need to stop the burst conversion manually: there is
no single-burst mode, so once started, a burst continues forever, looping
round the enabled channels.

It also depends on how long you are prepared to wait in the ISR for
completion: you could busy-wait there for the conversions to complete, but
this is inefficient since the ADC conversion rate is relatively slow.

Since I use all 8 channels it's even more complicated.

The solution I came up with was to use two ISRs, as you suggest; the first
is triggered by a timer and enables burst mode.

The second is triggered by completion of the 7th ADC conversion, which
ensures the ADC interrupt is activated while the 8th channel is still
converting; this ISR immediately clears the burst bit.

The ADC will still complete the channel it has started, so you need to clear
the bit this way and then wait for that channel to finish converting before
reading the results.

If you require fewer channels, you can enable one more than you need and
simply read the required values immediately.

Failing to work this way results in the ADC looping back to the first
channel and overwriting its value with a new one sampled at a different
time.
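
To make that concrete, here is a minimal sketch of the two ISRs, using the
CMSIS register names from LPC17xx.h and assuming all 8 channels are enabled
in AD0CR with the interrupt enabled only for channel 6 (the 7th conversion)
in ADINTEN; clock, pin, and rate setup are omitted:

#include "LPC17xx.h"

/* Setup (elsewhere): AD0CR SEL = 0xFF for all 8 channels, CLKDIV and PDN
   set; ADINTEN = (1 << 6) so only channel 6 interrupts. */

#define ADC_BURST  (1UL << 16)          /* BURST bit in AD0CR */

/* Timer match ISR: runs at the sample rate and starts each burst. */
void TIMER0_IRQHandler(void)
{
    LPC_TIM0->IR = 1UL;                 /* acknowledge the MR0 match */
    LPC_ADC->ADCR |= ADC_BURST;         /* start looping round the enabled channels */
}

/* ADC ISR: enabled for channel 6 only, so it fires on the 7th
   conversion, while channel 7 is still converting. */
void ADC_IRQHandler(void)
{
    LPC_ADC->ADCR &= ~ADC_BURST;        /* stop the burst; channel 7 still completes */
    (void)LPC_ADC->ADDR6;               /* reading the data register clears the interrupt */
    /* Signal the processing loop here; it should check the DONE bit
       (bit 31) of ADDR7 before reading ADDR0..ADDR7. */
}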

Also, you may need to consider the phase offsets between the channels: there
is no simultaneous sample-and-hold, so each channel is sampled with a time
offset from the previous channel. For example, at the maximum 13 MHz ADC
clock a conversion takes 65 clocks (about 5 µs), so with all 8 channels
enabled the last channel is sampled roughly 35 µs after the first.

I had particular problems because all of my processing is done in the ISR
(the main thread does nothing), and there was insufficient time to do all
the processing and wait for the conversions of all 8 channels,

so the timer ISR has to operate at a higher priority than the ADC ISR.
However, there was a bug in the CMSIS code regarding the setting of
interrupt priority, which used the wrong shift factor for the interrupt
priority field.

This may have been fixed in the latest CMSIS code, I haven't checked.
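
For what it's worth, with a fixed CMSIS the setup is short; a minimal
sketch, assuming TIMER0 is the burst-start timer (on the Cortex-M3 lower
numbers mean higher priority, and the LPC17xx implements 5 priority bits):

#include "LPC17xx.h"

void nvic_setup(void)
{
    NVIC_SetPriority(TIMER0_IRQn, 1);   /* burst-start timer: higher priority */
    NVIC_SetPriority(ADC_IRQn, 2);      /* burst-stop ADC ISR: lower priority */

    /* If your CMSIS still has the shift bug, writing the priority into the
       top bits of the 8-bit field directly is a workaround, e.g.
       NVIC->IP[ADC_IRQn] = (2 << (8 - 5)) & 0xFF; */

    NVIC_EnableIRQ(TIMER0_IRQn);
    NVIC_EnableIRQ(ADC_IRQn);
}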

Regards

Phil.
It's also worth pointing out that there is an excellent application note
from NXP regarding board layout for the 12-bit ADC on the LPC17xx family.

Look for AN10974 on the NXP site.

Regards

Phil.
Have you considered a 1-channel burst mode that is triggered by a timer?
On each timer interrupt, save the last channel's result and set burst mode
on a new channel. Rinse and repeat.

DaveS
One channel is not a burst; in this case you would trigger a single
conversion.
I suppose you could use the ISR to adjust a compare output from a timer
and use that to trigger the next channel in single-conversion mode, but in
practice this will be equally complex and has a few additional drawbacks.

It causes an additional phase offset between channels, which may be
variable, and additional interrupts.
Both of these are undesirable.

If a truly constant sample rate and channel phase relationship are required,
the ADC needs to be used in burst mode, triggered by the highest-priority
interrupt in the application and then stopped again by software.

If only they had included a single-burst conversion mode it would have been
easy, but sadly this is not available. Maybe we will get a fix for this
deficiency in future devices.

Fortunately the ISR can be very fast and work within the few registers that
the Cortex saves automatically.
I didn't look at the user manual for the LPC17xx but have assumed the same
ADC device is used in it as in the LPC2148, which I'm familiar with. In
that device, the burst mode channels are selectable, and selecting just one
is possible. Why would NXP reduce the functionality?

DaveS
Hi David,
Selecting one channel for burst mode simply means that this channel will
keep converting continuously; why would you want to do that if you were
using an ISR to trigger it?
This would cause additional problems, because when you tried to set up the
second channel to convert you would need to wait for the first channel to
finish before changing the status flags.

So whilst you can set up a 1-channel burst, it makes no sense to do so. It
would make sense to trigger a single conversion; you could then get the ADC
to generate an interrupt on completion and trigger the next channel in
software, setting a flag when the final conversion is complete, but this
will cause an ISR storm.
Or you can schedule the next conversion with a timer, but again you would
get one interrupt per channel.
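
For illustration, a rough sketch of that software-chained, one-interrupt-
per-channel variant; it assumes ADGINTEN (bit 8 of ADINTEN) is set so the
global DONE flag interrupts after every conversion, and scan_complete() is
just a placeholder for whatever completion flag you use:

#include "LPC17xx.h"
#include <stdint.h>

extern void scan_complete(void);     /* hypothetical: flag/semaphore for the main loop */

static volatile uint16_t results[8];
static volatile uint8_t  ch;

static void start_channel(uint8_t n)
{
    uint32_t cr = LPC_ADC->ADCR & ~(0xFFUL | (7UL << 24));
    LPC_ADC->ADCR = cr | (1UL << n) | (1UL << 24);   /* select channel n, start now */
}

void adc_start_scan(void)            /* call this from the periodic timer ISR */
{
    ch = 0;
    start_channel(0);
}

void ADC_IRQHandler(void)            /* one interrupt per channel: the "ISR storm" */
{
    uint32_t gdr = LPC_ADC->ADGDR;   /* reading ADGDR clears DONE and the interrupt */
    results[(gdr >> 24) & 7] = (uint16_t)((gdr >> 4) & 0x0FFF);
    if (++ch < 8)
        start_channel(ch);           /* chain the next conversion in software */
    else
        scan_complete();
}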

So if you want multiple channels converted periodically, it is best to
simply use two ISRs: one timer-driven to start the burst, and one triggered
as the final channel starts converting, to disable the burst.

But the ideal solution would be a single-burst flag in the ADC, so that a
timer could trigger the burst without using an ISR and you would just get a
single periodic interrupt when all channels are converted.

I really hope that NXP will fix this problem at some point.

Regards

Phil.
On the LPC2148, I operate in burst mode. I have a timer that is used for
many different chores at different intervals. It interrupts 65,536 times a
second and updates a countdown chain. Some readings I grab at 4096 times a
second, others at 256. Other things in the system are triggered at other
intervals. For instance, switches being debounced are sampled at 32 Hz.

By grab a reading, I mean read the latest ADC register for the channel I
want and stuff it into the 'filter'.

Yes, the ADC is working all the time.
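
In rough outline it looks like the below; a minimal sketch of the
tick/countdown idea, with LPC17xx register names to match this thread (the
LPC2148 result field differs), and with filter_push_fast/filter_push_slow
standing in for the real filters. The ADC just free-runs in burst mode the
whole time; reading ADDRx grabs whatever the latest result is.

#include "LPC17xx.h"
#include <stdint.h>

extern void filter_push_fast(uint16_t sample);   /* hypothetical filter inputs */
extern void filter_push_slow(uint16_t sample);

static volatile uint32_t tick;

void TIMER0_IRQHandler(void)         /* fires 65,536 times a second */
{
    LPC_TIM0->IR = 1UL;              /* acknowledge the MR0 match */
    ++tick;

    if ((tick & 0x0FUL) == 0)        /* every 16th tick: 4096 Hz */
        filter_push_fast((uint16_t)((LPC_ADC->ADDR0 >> 4) & 0x0FFF));

    if ((tick & 0xFFUL) == 0)        /* every 256th tick: 256 Hz */
        filter_push_slow((uint16_t)((LPC_ADC->ADDR1 >> 4) & 0x0FFF));
}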

DaveS
Hi Phil:

Thank you for the detailed reply. I had read one of your previous posts in
the archive, where you described this technique of interrupting on the
second-to-last channel in burst mode.
I will be using all 8 channels, and I think it's going to be OK. This means
I will be interrupting on channel 7, and channel 1 will be overwritten...

I did implement your solution, which consists of triggering the start of a
burst periodically (with a timer) and stopping the burst with an interrupt
on the last channel sampled. From there I activate a task (FreeRTOS) which
handles the processing, so the waiting for the last channel to finish is
done at the task level. I am hoping to minimize the time spent in the
interrupts; the context switch should take long enough for the conversion
on the last channel to be done by the time the task is activated.
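
In outline, the ISR and task look roughly like this (a minimal sketch;
adc_sem and adc_task are placeholder names, the semaphore is created at
startup with xSemaphoreCreateBinary(), and the interrupt is enabled on the
second-to-last channel, ADDR6, as you described):

#include "FreeRTOS.h"
#include "semphr.h"
#include "LPC17xx.h"

static SemaphoreHandle_t adc_sem;    /* created at startup with xSemaphoreCreateBinary() */

void ADC_IRQHandler(void)
{
    BaseType_t woken = pdFALSE;
    LPC_ADC->ADCR &= ~(1UL << 16);   /* clear BURST; the channel in progress still finishes */
    (void)LPC_ADC->ADDR6;            /* reading ADDR6 clears the interrupt */
    xSemaphoreGiveFromISR(adc_sem, &woken);
    portYIELD_FROM_ISR(woken);       /* run the task immediately if it was waiting */
}

static void adc_task(void *params)
{
    (void)params;
    for (;;) {
        xSemaphoreTake(adc_sem, portMAX_DELAY);
        while ((LPC_ADC->ADDR7 & (1UL << 31)) == 0)
            ;                        /* wait for channel 7 DONE; normally already set */
        /* read ADDR0..ADDR7 and run the control processing here */
    }
}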
I'm still debugging issues where the second interrupt seems to be masking
the USB interrupt...

Back to "controlled" vs. "free-wheeling" bursts.
When running at the maximum conversion frequency, is it accurate to say that
the maximum delay between any two channels is 7 times the conversion time?
I.e., is there no delay between the last channel being sampled and the first
channel (no inter-burst delay)? Is this delay constant (no jitter), i.e. are
the conversion times constant?

If that's the case, then a free-wheeling burst generates the same delays as
interrupt mode. I'm more and more convinced that a free-wheeling burst with
a periodic poll of the channel results might just be more efficient than an
interrupt-controlled one. At least processing-wise; power-wise it would use
more.

Any opinion on this?

Regards,
Jean
Hi Jean,
I haven't measured the time at the end of a burst, as I perform bursts at a
fixed rate and then perform DSP on the samples. To minimize phase delay
between the samples, I run the ADC clock at its fastest rate rather than
setting the burst conversion rate to match the sample period.

If you interrupt on completion of channel 6, the ISR can stop the ADC; it
will have started processing channel 7 and will complete that before
stopping anyway, since when a burst is stopped the ADC always completes the
channel it is converting.
In this way the ADC will have completed all conversions by the time the RTOS
activates the task, so you should have a very efficient system.

The problem I discovered with the LPC devices was in the setting of
interrupt priorities: the CMSIS code had a bug and was not setting
priorities correctly, so I had exactly the same issue. It was just a wrong
constant for the shift on the priority field. In my case I was setting the
timer ISR to a higher priority than the ADC ISR, but at first it did not
work.

Since the interrupt hardware stacks R0-R3 (plus R12, LR, PC and xPSR)
automatically, and the ISR to stop the ADC takes only a couple of
instructions, you do not need to allocate stack space for it; and since the
call to wake the suspended task should be simple, the RTOS overhead should
also be negligible. In fact, in the RTOS I use it is just a few cycles,
since it only sets a semaphore the task has pended on; it does not itself
invoke the context switch to the thread running the ADC processing, so this
overhead is the same whether you use the ADC interrupt or suspend the task
on a timer event.

Regards

Phil.