DMA losing sync on ST ARM processor
Started by ●June 20, 2012

I'm using an STM32F103VB, and one of the things I'm doing is a set of ADC reads which are then transferred via DMA to a buffer.

For some reason, when I'm debugging, the DMA transfer gets scrambled: channel 0 ends up where channel 1 is supposed to go, channel 1 where channel 2 should be, on up the line until finally the last channel gets written to channel 0.

I have the code set up for the ADC to run in "scan" mode, with the DMA (theoretically!) sucking the data off and putting it into memory.

Furthermore, I have the ADC ISR set up so that on an end of conversion interrupt (which is only supposed to happen at the end of the scan in scan mode) the DMA engine gets reinitialized to point to the base of the memory array it's supposed to write to.

Does anyone have any obvious clue to what I'm doing wrong?  Does ST have any good bits of sample code that I've missed?

Thanks in advance.

--
My liberal friends think I'm a conservative kook.
My conservative friends think I'm a liberal kook.
Why am I not happy that they have found common ground?

Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
Reply by ●June 20, 2012
On Wed, 20 Jun 2012 10:51:52 -0500, Tim Wescott wrote:

> I'm using an STM32F103VB, and one of the things that I'm doing is doing
> a set of ADC reads which are then being transferred via DMA to a buffer.
> [...]
> Does anyone have any obvious clue to what I'm doing wrong?  Does ST have
> any good bits of sample code that I've missed?

Whoops -- I meant to cross-post this to sed, 'cause there may be folks over there with useful information, too.

--
Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
Reply by ●June 20, 2012
On 20 Jun., 17:53, Tim Wescott <t...@seemywebsite.com> wrote:

> [...]
> Whoops -- I meant to cross-post this to sed, 'cause there may be folks
> over there with useful information, too.

a suggestion from a quick google:
http://www.micromouseonline.com/2009/05/26/simple-adc-use-on-the-stm32/#axzz1yLmVUCe5

-Lasse
Reply by ●June 20, 2012
On Jun 20, 10:51 am, Tim Wescott <t...@seemywebsite.com> wrote:

> I'm using an STM32F103VB, and one of the things that I'm doing is doing a
> set of ADC reads which are then being transferred via DMA to a buffer.
> [...]
> Does anyone have any obvious clue to what I'm doing wrong?  Does ST have
> any good bits of sample code that I've missed?

Do you have the DMA set up to operate in circular mode? If so then it sounds like the DMA is doing one transfer before the first ADC conversion is complete. Try changing your ISR to only set up the DMA but not start the ADC and see if the DMA counter drops by one as soon as you exit the ISR.

Also remember that the ADC has two interrupt sources, one for EOC and one for JEOC, so make sure you're using the right one.

If this ONLY happens while debugging through JTAG then it may be that your debugger halts the DMA but not the ADC. Some peripherals like the CAN controller can be selected to either keep running or halt during debug.
Reply by ●June 20, 2012
On Wed, 20 Jun 2012 09:11:07 -0700, langwadt@fonz.dk wrote:

> a suggestion from a quick google
> http://www.micromouseonline.com/2009/05/26/simple-adc-use-on-the-stm32/#axzz1yLmVUCe5

That's way too simple -- my problem isn't getting the right numbers out of the ADC. My problem is getting the right numbers out of the ADC, at the right times, and into the right spots in memory fast.

The _only_ way that I can see to do multiple consecutive ADC conversions with that part is to use the ADC scan mode and DMA -- so I pretty much must get it working.

It is very frustrating to halt the processor, restart, and see all the right numbers in all the wrong places.  Scary, too, considering that I'm controlling enough power to make things awfully toasty if I screw it up.

--
Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
Reply by ●June 20, 2012
On Wed, 20 Jun 2012 09:23:32 -0700, peter_gotkatov wrote:

> Do you have the DMA set up to operate in circular mode?

Initially I had it set up with the DMA in circular mode, and the interrupt coming from the DMA controller on the completion of a set of writes. That gave me this problem as soon as the processor started, so I axed that pretty quick!

> If so then it sounds like the DMA is doing one transfer before the first
> ADC conversion is complete. Try changing your ISR to only set up the DMA
> but not start the ADC and see if the DMA counter drops by one as soon as
> you exit the ISR.

Hmm. I'll try that, but it really shouldn't be doing so. Currently the DMA is set to terminate at the end of a transfer, and to start it up again you need to disable the DMA, re-initialize the length register, then turn it on again. It does make me think that carefully checking the order in which I turn things back on may bear fruit.

However, the way that the whole system is set up (and if I understand ST's documentation), the ADC gets triggered periodically from a timer, converts a bunch of channels which get squirreled away in memory by the DMA, then interrupts. So if the DMA is getting set up in the interval when the ADC is quiescent, even if it gets screwed up once when the processor starts up, it should still get back on track the next ISR -- or, at least that's how I read things. Yet this is obviously not the case.

> Also remember that the ADC has two interrupt sources, one for EOC and
> one for JEOC so make sure you're using the right one.

I'm only interrupting on EOC.

> If this ONLY happens while debugging through JTAG then it may be that
> your debugger halts the DMA but not the ADC. Some peripherals like the
> CAN controller can be selected to either keep running or halt during
> debug.

Good thought, I'll think about implications, and how to test.

--
Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
Reply by ●June 20, 2012
On 20 Jun., 18:36, Tim Wescott <t...@seemywebsite.com> wrote:

> That's way too simple -- my problem isn't getting the right numbers out
> of the ADC.  My problem is getting the right numbers out of the ADC, at
> the right times, and into the right spots in memory fast.
> [...]
> It is very frustrating to halt the processor, restart, and see all the
> right numbers in all the wrong places.  Scary, too, considering that I'm
> controlling enough power to make things awfully toasty if I screw it up.

should have mentioned I was looking at the post by "tony" two thirds down the page, looks like scan of two channels with dma

-Lasse
Reply by ●June 20, 2012
On Wed, 20 Jun 2012 10:51:52 -0500, Tim Wescott <tim@seemywebsite.com> wrote:

> I'm using an STM32F103VB, and one of the things that I'm doing is doing a
> set of ADC reads which are then being transferred via DMA to a buffer.
> [...]
> Does anyone have any obvious clue to what I'm doing wrong?  Does ST have
> any good bits of sample code that I've missed?

Here's how I've been doing it on an STM32F103RB. ADCStart() is called from the main loop to kick things off. DMADone and ADCActive are extern volatile so that main can keep an eye on progress.

----
// We'll get ADC_NUM_CHANS results in sequence and use DMA to save the
// results of the conversions to the output array. No ADC IRQ is needed;
// we'll interrupt at the end of the associated DMA cycle.

void ADCInit(void)
{
  RCC->APB2ENR |= RCC_APB2ENR_ADC1EN;   // give it a clock
  ADC1->CR2 |= ADC_CR2_ADON;            // turn it on

  RCC->CFGR &= ~RCC_CFGR_ADCPRE;        // ADC clock to 12 MHz (pclk2 / 6)
  RCC->CFGR |= RCC_CFGR_ADCPRE_DIV6;
  RCC->CFGR |= RCC_CFGR_ADCPRE_DIV6;    // stroke twice to ensure it's awake

  ADC1->CR2 |= ADC_CR2_CAL;             // since we're doing a cal cycle
  while ((ADC1->CR2 & ADC_CR2_CAL) != 0) {
    ;  // wait
  }

  // Note that we've already setup PA0-7; PB0,1; PC0-3 for analog input
  ADC1->CR1 |= ADC_CR1_SCAN;            // enable scan mode
  ADC1->CR2 |= ADC_CR2_EXTSEL;          // with SWSTART as the trigger
  ADC1->CR2 |= ADC_CR2_DMA;             // enable DMA at the end of scan

  ADC1->SQR3 = 0x0A418820;              // scan 0, 1, 2, 3, 4, 5,
  ADC1->SQR2 = 0x16A4A0E6;              // 6, 7, 8, 9, 10, 11,
  ADC1->SQR1 = 0x00D001AC;              // 12, 13; total 14 conversions
  ADC1->SMPR2 = 0x00B6DB6D;             // 55.5 clocks sample time

  // Setup the DMA peripheral to handle the storing of the ADC results.
  RCC->AHBENR |= RCC_AHBENR_DMA1EN;     // give it a clock

  // Tell DMA which peripheral register has the data to store.
  DMA1_Channel1->CPAR = (unsigned long)&ADC1->DR;
  // and then point to the destination in memory.
  DMA1_Channel1->CMAR = (unsigned long)ADCResult;
  DMA1_Channel1->CNDTR = ADC_NUM_CHANS; // how many to transfer

  // medium priority, 32 bits, peripheral to memory, increment memory,
  // not circular, and interrupt when all the transfers are complete.
  DMA1_Channel1->CCR = 0x00001A82;
  DMA1_Channel1->CCR |= 0x01;           // enable it

  NVIC_EnableIRQ(DMA1_Channel1_IRQn);
}

void DMA1_Channel1_IRQHandler(void)
{
  DMA1->IFCR = 0x03;            // reset channel 1 and global
  DMA1_Channel1->CCR &= ~0x01;  // turn the channel off
  DMADone = 1;
  ADCActive = 0;
}

void ADCStart(void)
{
  ADCActive = 1;
  // Prep the DMA channel
  DMA1_Channel1->CNDTR = ADC_NUM_CHANS;
  DMA1_Channel1->CCR |= 1;
  // and start the conversions
  ADC1->CR2 |= ADC_CR2_ADON;
}
----

--
Rich Webb     Norfolk, VA
Reply by ●June 20, 2012
On Jun 20, 11:44 am, Tim Wescott <t...@seemywebsite.com> wrote:

> So if the DMA is getting set up in the interval when the ADC is
> quiescent, even if it gets screwed up once when the processor starts up,
> it should still get back on track the next ISR -- or, at least that's
> how I read things.  Yet this is obviously not the case.

As long as ADC_SR_EOC is clear before you set DMA_CCR1_EN then there should be no problem. I don't know if JEOC needs to be cleared as well but it wouldn't hurt.
Reply by ●June 20, 2012
In article <0vidnfqFnZKEYXzSnZ2dnUVZ_uKdnZ2d@web-ster.com>, tim@seemywebsite.com says...

> That's way too simple -- my problem isn't getting the right numbers out
> of the ADC.  My problem is getting the right numbers out of the ADC, at
> the right times, and into the right spots in memory fast.
> [...]
> It is very frustrating to halt the processor, restart, and see all the
> right numbers in all the wrong places.  Scary, too, considering that I'm
> controlling enough power to make things awfully toasty if I screw it up.

I think the STM32 debug hardware has the option to stop the timer clocks when debugging. If you don't do that, perhaps your timer is triggering an ADC cycle while in the debug mode.

Mark Borgerson