EmbeddedRelated.com

interrupt latency

Started by Unknown June 4, 2007
Just a generic question regarding interrupt latency. I am using a
periodic timer that generates an interrupt at every clock tick. The
question I have is: will the interrupt latency cause the timer
interrupt service routine to run at a reduced frequency? What I mean
is as follows. Consider a simple ISR such as the one below,

void isr()
{
      t = timestamp();
}

Assume the above code runs twice. The question I have is: will the time
elapsed between the two calls to timestamp() in this ISR include the
interrupt latency, or will the time elapsed correspond to the
original timer period (i.e. time_elapsed = 1/timer_freq)? It would
make sense that the ISR still runs at the original timer frequency, and
that the interrupt latency only delays the calling of the ISR, not the
frequency at which the ISR runs. However, from my measurements that does
not appear to be the case. Any help is appreciated.

In article <1180927222.996612.46800@r19g2000prf.googlegroups.com>, says...
> Just a generic question regarding interrupt latency. I am using a
> periodic timer which generates an interrupt at every clock tick. The
> question I have is will the interrupt latency cause the timer
> interrupt service routine to run at a reduced frequency.
No, as long as the interrupt routine is short.
> What I mean here is as follows. Consider a simple isr routine as
> shown below,
>
> void isr()
> {
>       t = timestamp();
> }
>
> Assume the above code runs twice. The question i have is will the time
> elapsed between 2 calls of timestamp() in this isr include the
> interrupt latency value or will the time elapsed correspond to the
> original timer frequency (i.e. time_elapsed = 1/timer_freq).

Maybe. Over time it should, but there is no a priori reason for the
latency to be constant.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
On Jun 3, 8:30 pm, Robert Adsett <s...@aeolusdevelopment.com> wrote:
> In article <1180927222.996612.46...@r19g2000prf.googlegroups.com>, says...
> > Just a generic question regarding interrupt latency. [...] The
> > question I have is will the interrupt latency cause the timer
> > interrupt service routine to run at a reduced frequency.
>
> No, as long as the interrupt routine is short.
>
> > [...] Assume the above code runs twice. The question i have is will
> > the time elapsed between 2 calls of timestamp() in this isr include
> > the interrupt latency value or will the time elapsed correspond to
> > the original timer frequency (i.e. time_elapsed = 1/timer_freq).
>
> Maybe. Over time it should but there is no a-priori reason for the
> latency to be constant.
>
> Robert
Ok, so it's fair to say the following is not correct:

interrupt_latency = time elapsed between timestamp values in the isr - timer_period (1/timer_freq)
On Jun 3, 8:20 pm, wamba.ku...@gmail.com wrote:
> Just a generic question regarding interrupt latency. I am using a
> periodic timer which generates an interrupt at every clock tick. The
> question I have is will the interrupt latency cause the timer
> interrupt service routine to run at a reduced frequency. [...] It would
> make sense that the isr still runs at the original timer frequency and
> the interrupt latency only delays the calling of isr and not the
> frequency at which the isr runs. However from the measurements it
> doesn't look like the case. Any help is appreciated.
Another way of phrasing the above question is as follows:

interrupt_latency != time elapsed between two calls to timestamp() (in the code above) - timer_period (i.e. 1/timer_freq)
wamba.kuete@gmail.com wrote:
> Just a generic question regarding interrupt latency. I am using a
> periodic timer which generates an interrupt at every clock tick. [...]
> Assume the above code runs twice. The question i have is will the time
> elapsed between 2 calls of timestamp() in this isr include the
> interrupt latency value or will the time elapsed correspond to the
> original timer frequency (i.e. time_elapsed = 1/timer_freq).
That depends on the source of the interrupt. If it comes from a timer
that continues to run after an interrupt occurs and you do not change
the timer value, the interrupt rate should be fixed. If instead you
restart the timer within the interrupt, then the time between the timer
interrupt and restarting is lost (although it may be partially
compensated for).

For many applications I use a continuously running timer that generates
an interrupt on overflow. I form a high-resolution run timer from a
combination of the software interrupt count and the timer value. Care
has to be taken to handle timer rollovers properly.

--
Thad
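Thad's overflow-count scheme could be sketched roughly like this. It is
a minimal illustration, not code from any particular part: the register
name timer_reg, the 16-bit width, and the function names are all
assumptions. The re-read loop guards against an overflow landing between
the sample of the software count and the sample of the hardware counter.

```c
#include <stdint.h>

/* Hypothetical 16-bit free-running hardware counter (stand-in for a
 * memory-mapped register) and the overflow count kept by the ISR. */
static volatile uint16_t timer_reg;
static volatile uint32_t overflow_count;

/* Overflow interrupt: just extend the counter in software. */
void timer_overflow_isr(void)
{
    overflow_count++;
}

/* Combine software count and hardware counter into one monotonic
 * value.  Re-read until both samples are consistent, in case an
 * overflow occurs between the two reads. */
uint64_t timestamp(void)
{
    uint32_t hi;
    uint16_t lo;
    do {
        hi = overflow_count;
        lo = timer_reg;
    } while (hi != overflow_count);
    return ((uint64_t)hi << 16) | lo;
}
```

In a real ISR the re-read loop may need refinement (e.g. checking the
pending-overflow flag) depending on whether timestamp() can itself be
interrupted by the overflow handler.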
On Jun 4, 5:20 am, wamba.ku...@gmail.com wrote:
> Just a generic question regarding interrupt latency. I am using a
> periodic timer which generates an interrupt at every clock tick. The
> question I have is will the interrupt latency cause the timer
> interrupt service routine to run at a reduced frequency. [...] It would
> make sense that the isr still runs at the original timer frequency and
> the interrupt latency only delays the calling of isr and not the
> frequency at which the isr runs. However from the measurements it
> doesn't look like the case. Any help is appreciated.
If your system does not have to process other (higher-priority)
interrupts and has no areas in the code where interrupts are disabled,
then the time between two calls to timestamp() will be constant, as long
as you have a timer interrupt that restarts itself (as indicated by
others as well).

If you do have higher-priority interrupts and/or interrupt-disabled
areas, then the moment timestamp() is called can be delayed by a total
of (time spent in interrupt-disabled areas) + (total time of all
higher-priority ISRs). So depending on the system configuration you may
or may not see variation in the time between two calls to timestamp().

Kind regards,
Johan Borkhuis
In article <1180936642.272357.147260@q69g2000hsb.googlegroups.com>, says...
> On Jun 4, 5:20 am, wamba.ku...@gmail.com wrote:
> > Just a generic question regarding interrupt latency. [...]
>
> If your system does not have to process other (higher level)
> interrupts or has areas in the code where interrupts are disabled,
> then the time between 2 calls to timestamp() will be constant, as long
> as you have a timer interrupt that restart itself (as indicated by
> others as well).
Not necessarily. Many systems, maybe even most, have varying
instruction execution times depending on the instruction, what memory it
is executing from, and what memory is being accessed. That's without
considering instructions that automatically disable interrupts for the
next instruction or two.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
In article <1180928828.313834.210960@j4g2000prf.googlegroups.com>, says...
> On Jun 3, 8:30 pm, Robert Adsett <s...@aeolusdevelopment.com> wrote:
> > Maybe. Over time it should but there is no a-priori reason for the
> > latency to be constant.
> >
> > Robert
>
> Ok so its fair to say the following is not correct,
> interrupt latency = time_elapsed between time stamp value in the isr -
> timer_period (1/timer_freq)
Yes. It might give you the jitter in the period, though. I think that
would be simpler to measure on a storage oscilloscope.

As has been pointed out, if you use a free-running timer for your
timestamp and a comparator off of that same timer for the interrupt,
you can measure the latency directly.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
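The direct measurement Robert describes might look something like this
sketch. TIMER_COUNT, TIMER_COMPARE, and TICK_PERIOD are hypothetical
stand-ins for the hardware's free-running counter, its compare register,
and the desired tick spacing; on real hardware the first two would be
memory-mapped registers.

```c
#include <stdint.h>

static volatile uint16_t TIMER_COUNT;    /* free-running counter (assumed) */
static volatile uint16_t TIMER_COMPARE;  /* compare/match register (assumed) */
#define TICK_PERIOD 1000u                /* ticks between interrupts (assumed) */

/* Compare-match ISR: the latency is simply how far the counter has
 * advanced past the compare value that triggered the interrupt. */
uint16_t compare_isr(void)
{
    /* Unsigned subtraction handles counter wrap-around correctly. */
    uint16_t latency = (uint16_t)(TIMER_COUNT - TIMER_COMPARE);

    /* Schedule the next tick relative to the OLD compare value, not
     * "now", so latency does not accumulate into the period. */
    TIMER_COMPARE = (uint16_t)(TIMER_COMPARE + TICK_PERIOD);

    return latency;  /* record or log as needed */
}
```

Advancing the compare register by a fixed amount each time is what keeps
the average interrupt frequency locked to the timer, even when individual
interrupts are delayed.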
In article <46639201$0$56949$892e0abb@auth.newsreader.octanews.com>,
Thad Smith says...
> wamba.kuete@gmail.com wrote:
> > Just a generic question regarding interrupt latency. [...]
>
> That depends on the source of the interrupt. If it comes from a timer
> that continues to run after an interrupt occurs and you do not change
> the timer value, the interrupt rate should be fixed. If instead, you
> restart the timer within the interrupt, then the time between the timer
> interrupt and restarting is lost (although it may be partially
> compensated for).
Good point. I was assuming either an automatic reload or a free-running
timer with an adjustable comparator, without any justification.

Robert

--
Posted via a free Usenet account from http://www.teranews.com
Robert Adsett wrote:
> In article <1180936642.272357.147260@q69g2000hsb.googlegroups.com>, says...
>> If your system does not have to process other (higher level)
>> interrupts or has areas in the code where interrupts are disabled,
>> then the time between 2 calls to timestamp() will be constant, as long
>> as you have a timer interrupt that restart itself (as indicated by
>> others as well).
>
> Not necessarily. Many systems, maybe even most, have varying
> instruction execution times depending on the instruction, what memory it
> is executing from and what memory is being accessed. That's without
> considering instructions that automatically disable interrupts for the
> next instruction or two.
Just as a note: these latency effects are known as jitter. The frequency
of the timer interrupt remains constant, but the time between interrupts
varies. This assumes that the total latency is less than the period of
the interrupt.

--
Michael N. Moran           (h) 770 516 7918
5009 Old Field Ct.         (c) 678 521 5460
Kennesaw, GA, USA 30144    http://mnmoran.org

"So often times it happens, that we live our lives in chains
 and we never even know we have the key."
The Eagles, "Already Gone"

The Beatles were wrong: 1 & 1 & 1 is 1
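One way to see that jitter in the OP's setup: record successive values
of t from the ISR and look at how far each inter-arrival time strays
from the nominal period. A minimal sketch; max_jitter and its arguments
are illustrative names, not from the thread.

```c
#include <stdint.h>

/* Worst-case deviation of inter-arrival times from the nominal period.
 * timestamps[] holds n successive ISR timestamps in timer ticks;
 * unsigned subtraction tolerates counter wrap-around. */
uint32_t max_jitter(const uint32_t *timestamps, int n,
                    uint32_t nominal_period)
{
    uint32_t worst = 0;
    for (int i = 1; i < n; i++) {
        uint32_t delta = timestamps[i] - timestamps[i - 1];
        uint32_t dev = (delta > nominal_period)
                           ? delta - nominal_period
                           : nominal_period - delta;
        if (dev > worst)
            worst = dev;
    }
    return worst;
}
```

If the timer free-runs (is never restarted in the ISR), this deviation
reflects pure jitter and averages out; if the ISR restarts the timer,
each delay also shifts every later tick, and the deviations accumulate.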
