EmbeddedRelated.com
Forums
The 2024 Embedded Online Conference

ST ARM versus Atmel ARM?

Started by John-Smith May 18, 2014
This is a bit of a simplistic Q and I am in the early stages of
deciding...

My background is a huge amount of assembler on Z80, Z180, Z280, 80x86,
8031, 8051, AT90S1200, H8/300, etc. Plus a bit of C. No C++.

Now I am starting on a new product which will need about 1MB of *data*
storage (factory initialised so can be on chip FLASH) plus some
floating point maths, and will be done mostly in C.

I have been using Atmel chips for many years but they seem to run each
one for only a few years nowadays, and that is my biggest negative on
Atmel. We were going to do this (and other upcoming projects, mostly
serial comms) with one of the ATmega128 type of chips but each time I
look at the range, they seem to have discontinued half of the specific
devices.

Whereas the H8/300 has only just gone on LTB, after some 25 years...
which leads me to think ST is the way to go as this is a "serious"
industrial product which will run for at least 10 years. Also, looking
at what other firms use (e.g. Pico) they seem to be often using the ST
32F103 or similar which comes in a huge family, from minuscule 6mm x
6mm chips with 16k FLASH (cost $1 1k+) to 1MB FLASH for about $10.

The devt kits seem to use GNU compilers and the Eclipse IDE, with IAR
charging even more for their compilers than they ever used to. IAR
compilers used to be good but they did contain some really bad details
e.g. an incredibly slow sscanf() where they were doing multiple
floating point ops for every single input digit. I spent ages
optimising that stuff. I think their runtimes were written in generic
C and that was it. The tool cost is not that important in this case
however.

I'd be very interested in hearing people's views on this. I really want
a 10 year life of the actual device.
John-Smith <noospam@noospam.com> wrote:

> I'd be very interested in hearing people's views on this. I really want
> a 10 year life of the actual device.
Freescale have Kinetis parts with both 10- and 15-year life spans in
their longevity program.

<http://www.freescale.com/webapp/sps/site/overview.jsp?code=PRDCT_LONGEVITY_HM>

-a
John-Smith <noospam@noospam.com> writes:

> [...]
> The devt kits seem to use GNU compilers and the Eclipse IDE, with IAR
> charging even more for their compilers than they ever used to.
> [...] The tool cost is not that important in this case however.
> [...]
> I' be very interested in hearing peoples' views on this. I really want
> a 10 year life of the actual device.
John, I can't help you on longevity, but for me the fact that TI offers
Code Composer Studio for free, or at least cheap ($800), is a big deal.
Several tool vendors want you to shell out $3000+ for _device-specific_
toolsets. Not happening with this dude. I also like very much that TI's
CCS runs under both Linux and Windows.

--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
On 18.5.14 13:18, John-Smith wrote:
> Now I am starting on a new product which will need about 1MB of *data*
> storage (factory initialised so can be on chip FLASH) plus some
> floating point maths, and will be done mostly in C.
>
> I have been using Atmel chips for many years but they seem to run each
> one for only a few years nowadays, and that is my biggest negative on
> Atmel. [...]
If your floating point is not in a hurry, the Atmel AT91SAM4SD32 seems to fit your bill. I have experience of Cortex-M processors from three vendors: Atmel, ST and TI/Luminary. The peripherals of Luminary are very well organized, the Atmel are a bit more disordered, and ST are pretty challenging.
> The devt kits seem to use GNU compilers and the Eclipse IDE, with IAR
> charging even more for their compilers than they ever used to. IAR
> compilers used to be good but they did contain some really bad details
> e.g. an incredibly slow sscanf() where they were doing multiple
> floating point ops for every single input digit. [...]
I dropped IAR years ago when I got a sick dongle with a compiler
upgrade. The version I upgraded from did not have a dongle.

The free toolchain (GCC, Eclipse and OpenOCD) has fit the bill on all
of the above processors well. The compiler has a bewildering number of
switches, and it takes some time to find the right combination for the
task at hand. Reading the generated assembler code helps a lot.

With GCC and Cortex-M, the need for assembler code is minimal. Even the
startup code can (and should) be written in C. The same applies to
interrupt handlers. In my applications, there are some small embedded
assembler functions, mainly in the task switcher.

--
Tauno Voipio
Tauno Voipio <tauno.voipio@notused.fi.invalid> wrote

>> Now I am starting on a new product which will need about 1MB of *data*
>> storage (factory initialised so can be on chip FLASH) plus some
>> floating point maths, and will be done mostly in C.
>> [...]
>
> If your floating point is not in a hurry, the Atmel AT91SAM4SD32
> seems to fit your bill.
The SD card interface is very interesting. We could use a few MB of
storage. It is damn hard to work out from their website which devices
have which ADC and DAC resolutions!
> I have experience of Cortex-M processors from three vendors:
> Atmel, ST and TI/Luminary. The peripherals of Luminary are
> very well organized, the Atmel are a bit more disordered,
> and ST are pretty challenging.
In what way are they challenging? I have programmed loads of weird
hardware, with the Zilog/AMD 85c30 (SDLC) being probably the weirdest
in terms of bizarre behaviour, and have seen loads of undocumented
stuff to do with the exact instruction sequences / bit manipulations
required to clear pending interrupts etc. One can waste a whole life on
crap like that. Is that what you meant?

My main requirement is driving a 16 bit DAC at about 100k samples/sec.
It will probably be an audio DAC and probably I2C rather than parallel.
But this should "just work" :) I did once bit-bang I2C on the H8/300
(for a Texas 12-bit ADC) and it was horrible, and I discovered some TI
data sheet "artefacts" which they had not known about... But these ARM
chips all come with I2C interfaces already done.
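Whether I2C can actually carry that DAC traffic is worth a quick sanity
check. A minimal sketch in plain Python; the speed grades are the
standard I2C SCL clock rates, and the per-byte overhead assumes 8 data
bits plus one ACK clock, ignoring addressing and start/stop overhead:

```python
# Sanity check: can I2C feed a 16-bit DAC at 100k samples/sec?
SAMPLE_RATE = 100_000            # samples per second
BITS_PER_SAMPLE = 16

payload_bps = SAMPLE_RATE * BITS_PER_SAMPLE     # raw data rate, bits/s

# Each I2C byte costs 9 SCL clocks (8 data bits + ACK); real transfers
# also spend clocks on addressing and start/stop, ignored here.
scl_hz_needed = SAMPLE_RATE * 2 * 9             # two data bytes/sample

# Standard I2C speed grades (maximum SCL clock):
I2C_STANDARD = 100_000           # Standard-mode
I2C_FAST = 400_000               # Fast-mode
I2C_FAST_PLUS = 1_000_000        # Fast-mode Plus

# 1.6 Mbit/s of payload needs at least 1.8 MHz of SCL, so even
# Fast-mode Plus falls short of the requirement.
```

So a plain I2C link cannot sustain this rate, which is consistent with
the I2S suggestion that comes up later in the thread.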
>> IAR compilers used to be good but they did contain some really bad
>> details e.g. an incredibly slow sscanf() where they were doing
>> multiple floating point ops for every single input digit. [...]
>
> I dropped IAR years ago when I got a sick dongle with a compiler
> upgrade. The version I upgraded from did not have a dongle.
I would never touch a dongled product. I lost the dongles on a $20k Xilinx kit (Viewlogic 4 + XACT6) many years ago, but luckily a Russian cracked it.
> The free toolchain, GCC, Eclipse and OpenOCD have fit the bill
> on all of the above processors well. The compiler has a bewildering
> number of switches, and it takes some time to find the right
> combination for the task at hand. [...]
>
> With GCC and Cortex-M, the need for assembler code is minimal.
> Even the startup code can (and should) be written in C. The same
> applies to interrupt handlers.
I had read about the GCC compiler switches but it seems that once this is set up, it can be left alone.
John-Smith <noospam@noospam.com> writes:

> It is damn hard from their website to find which devices do which ADC
> and DAC resolutions!
Usually the uC ADCs and DACs range from horrible to bad, so they
typically do not highlight the specs.
> My main requirement is driving a 16 bit DAC, at about 100k
> samples/sec. It will probably be an audio DAC and probably I2C rather
> than parallel. [...] But these ARM chips all come with I2C interfaces
> already done.
If an audio DAC is good enough, I recommend having a look at I2S, which
is a synchronous SPI-like interface used for audio. I've used this kind
of setup (LPC1768 + DAC) for outputting a complex calibration signal
from MCU flash.

--
Mikko OH2HVJ
Mikko OH2HVJ <mikko.syrjalahti@nospam.fi> wrote

>> It is damn hard from their website to find which devices do which ADC
>> and DAC resolutions!
>
> Usually the uC ADC and DAC range from horrible to bad, so they
> typically do not highlight the specs.
They all seem to be 12 bit. I agree that IME the last 2 bits or so are fiction, though adding up 100 samples and dividing by 100 seems to work :)
>> My main requirement is driving a 16 bit DAC, at about 100k
>> samples/sec. [...]
>
> If audio DAC is good enough, I recommend having a look at
> I2S, which is a synchronous SPI-like interface used for audio.
> I've used this kind of setup (LP1768+DAC) for outputting complex
> calibration signal from MCU flash.
Thanks - will check. Audio should be fine. I don't need full 16 bit linearity and certainly not 16-bit temp stability (hard to do anyway).
On Sun, 18 May 2014 12:18:47 +0200, John-Smith
<noospam@noospam.com> wrote:
> The devt kits seem to use GNU compilers and the Eclipse IDE, with IAR
> charging even more for their compilers than they ever used to.
The IAR Embedded Workbench also has more features, tools and example
projects than it ever used to. If there are many things that you don't
need, then indeed the cost might be too high.
> IAR compilers used to be good but they did contain some really bad
> details e.g. an incredibly slow sscanf() where they were doing
> multiple floating point ops for every single input digit.
I'm curious which target processor and which version that was, but
AFAICS recent libs don't use floating point math when dealing with
integers, and you can choose between different printf/scanf formatter
features.

--
(Remove the obvious prefix to reply privately.)
Made with Opera's e-mail client: http://www.opera.com/mail/
John-Smith <noospam@noospam.com> writes:
> [...]
> They all seem to be 12 bit. I agree that IME the last 2 bits or so are
> fiction, though adding up 100 samples and dividing by 100 seems to
> work :)
That's called oversampling, and it buys you log_4(M) more bits of
resolution, where M is the oversampling factor; with M = 100 that is
about 3.3 extra bits. See section 3 of
http://www.digitalsignallabs.com/presentation.pdf

--Randy
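That log_4(M) figure can be checked numerically. A minimal sketch in
plain Python, with synthetic uniform dither standing in for the noise a
real input signal would carry (all names and parameters here are
illustrative, not from any vendor library):

```python
import math
import random

# Averaging M samples buys log4(M) extra bits of resolution.
M = 100
extra_bits = 0.5 * math.log2(M)      # log4(M): ~3.32 bits for M = 100

BITS = 12                            # base converter resolution
LSB = 2.0 / 2 ** BITS                # step size over a [-1, 1) range

def quantize(x):
    """Quantize x in [-1, 1) to BITS bits."""
    steps = 2 ** (BITS - 1)
    return round(x * steps) / steps

def rms_error(avg_n, trials=2000):
    """RMS error of an avg_n-sample average of dithered conversions."""
    total = 0.0
    for _ in range(trials):
        x = random.uniform(-0.5, 0.5)
        acc = 0.0
        for _ in range(avg_n):
            # 1-LSB uniform dither decorrelates the quantization error
            acc += quantize(x + random.uniform(-LSB / 2, LSB / 2))
        err = acc / avg_n - x
        total += err * err
    return math.sqrt(total / trials)

random.seed(1)
e1 = rms_error(1)               # single conversion
e100 = rms_error(M)             # 100-sample average
gained = math.log2(e1 / e100)   # each halving of RMS error ~ 1 bit
```

With the dither in place, the averaged error comes out roughly ten
times smaller, i.e. close to the predicted 3.3-bit gain. Without noise
or dither on the input, averaging a static reading gains nothing, which
is why the trick works on real (noisy) sensor signals.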
--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
On 2014-05-18 12:18, John-Smith wrote:
> Now I am starting on a new product which will need about 1MB of *data*
> storage (factory initialised so can be on chip FLASH) plus some
> floating point maths, and will be done mostly in C.
> [...]
> I have been using Atmel chips for many years but they seem to run each
> one for only a few years nowadays, and that is my biggest negative on
> Atmel.
> [...]
> I' be very interested in hearing peoples' views on this. I really want
> a 10 year life of the actual device.
Hi John,

As a former Atmel FAE, I can give you a little info on the Atmel part.

As for longevity, the AT91M40800 was released in 1997-8, and is still
in production. (Actually this was the AT91M40400, but they added an
additional 4 kB of RAM.)

The ARM devices have had much longer lifetimes than the AVRs. This is
mostly because Atmel has released pin-compatible AVR devices with more
features, and once those are in production, the older devices have been
obsoleted. This philosophy was changed a couple of years ago, so expect
longer lifetimes even on AVRs. I would still expect longer lifetimes on
the ARMs.

With Atmel, you have the gcc based Atmel Studio, which is free of
charge and supports AVR, AVR32 and ARM. This is more like Visual Studio
than Eclipse. They connect to the low cost JTAG-ICE or SAM-ICE, which
is a lower priced version of the Segger J-Link used by IAR. The low
price is good, but they will only work with Atmel parts, and when Atmel
releases a new part, you will have to update the firmware.

If you need floating point, then the AVR32 UC3C is quite nice due to
its internal floating point unit. There are some SAM4s with a floating
point unit as well.

Both the SAM4s and the AVR32s have I2S support, and each I2S is
connected to dual buffered DMA on both the transmit and receive sides.
You should have ZERO problem maintaining 100 ksamples/s on I2S. Also,
each serial port is connected to dual buffered DMA controllers in each
direction, with some nice additions like idle detection. You can easily
run the serial ports at megabit speed without problems.

Flash is generally quite expensive on chip, and an external dataflash
or SPI flash will be quite small and much cheaper. The SPIs are
(surprisingly) supported by dual buffer DMA, so you can load data at
several MBytes per second in the background.

An interesting feature of the SAM4s and UC3C is the event system,
which allows peripherals to interact without CPU intervention.

Best Regards
Ulf Samuelsson
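The dual-buffered ("ping-pong") DMA pattern Ulf describes can be
sketched as follows. This is an illustration only: the DMA engine and
I2S sink are simulated with plain Python lists, and on real hardware
the buffer swap would happen in the transfer-complete interrupt rather
than inline:

```python
BUF_SIZE = 4

def make_samples(start, n):
    """Stand-in for the application generating the next n samples."""
    return list(range(start, start + n))

def dma_transmit(buf, sink):
    """Simulated DMA: drains the whole buffer to the peripheral."""
    sink.extend(buf)

def stream(total_samples):
    """Ping-pong streaming: one buffer drains while the other refills."""
    sink = []                                    # simulated I2S output
    bufs = [make_samples(0, BUF_SIZE),           # buffer being sent
            make_samples(BUF_SIZE, BUF_SIZE)]    # buffer queued next
    active = 0                                   # buffer owned by "DMA"
    sent = 0
    while sent < total_samples:
        dma_transmit(bufs[active], sink)         # peripheral drains it
        sent += BUF_SIZE
        # Meanwhile the CPU refills the just-drained buffer with the
        # samples that come after the already-queued buffer.
        bufs[active] = make_samples(sent + BUF_SIZE, BUF_SIZE)
        active = 1 - active                      # swap roles
    return sink
```

Because the CPU always has a full buffer's worth of time to prepare the
next block, output never stalls; that is why Ulf expects no trouble
holding the required sample rate.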
