
IDE for Atmel ARM processor

Started by Fred July 8, 2005
Chris Hills wrote:
> <toby@telegraphics.com.au> writes
>> CBFalconer wrote:
>>> Chris Hills wrote:
>>>>
>>> ... snip ...
>>>>
>>>> However, by coincidence today I have come across a company that
>>>> has found using GNU has cost it dearly.
>>>>
>>>> They have run out of code space. They are now in mid-project
>>>> going to have to port all the code to a commercial compiler that
>>>> is more efficient.
>>>
>>> Sounds like a silly characterization. They have obviously been
>>> able to produce in the past, with minimal toolchain expenses. The
>>> fact that they don't want to spend the effort building better code
>>> generators has nothing to do with that. If they wrote non-standard
>>> code and didn't keep the system-specific stuff separate, that is
>>> their own fault. For all we know their bloat problems may have to
>>> do with the size and organization of the system library, or the
>>> unnecessary usage of big modules such as scanf and printf.
>>
>> Yes, or just semi-competent programmers. I picked up a firmware
>> project recently (in C) and fully expect to be able to *halve* the
>> code size with a bit of obvious refactoring.
>
> It still doesn't get away from the fact that they reckon just by
> recompiling with a commercial compiler instead of GNU the code size
> was halved.
That may well be. However, that has not "cost it dearly". In fact, it
has cost them exactly nothing. They are way ahead of the game. If they
were doing the complaining I would consider them ungrateful boneheads.
I have no idea of your motives, but I am sure some exist.

--
"If you want to post a followup via groups.google.com, don't use the
broken "Reply" link at the bottom of the article. Click on "show
options" at the top of the article, then click on the "Reply" at the
bottom of the article headers." - Keith Thompson
>> Yes, or just semi-competent programmers. I picked up a firmware project
>> recently (in C) and fully expect to be able to *halve* the code size
>> with a bit of obvious refactoring.
>
> It still doesn't get away from the fact that they reckon just by
> recompiling with a commercial compiler instead of GNU the code size was
> halved.
If that's true (half the code size just by switching compilers) I have a
hard time imagining that being due only to missed optimizations. There's
a little-known fact about GCC: by default it puts all non-referenced
functions in a module into the final binary. This is due to its default
sectioning of the object modules. To enable 'garbage collection', you
have to specify:

-ffunction-sections -fdata-sections for the compiler and
--gc-sections for the linker

Do you know if they did that?

Regards,
Andras Tantos
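For illustration, a minimal build sketch using those flags (the arm-elf-
toolchain prefix and the file names are assumptions; substitute your own):

    arm-elf-gcc -Os -ffunction-sections -fdata-sections -c main.c -o main.o
    arm-elf-gcc -Os -ffunction-sections -fdata-sections -c util.c -o util.o
    arm-elf-gcc main.o util.o -Wl,--gc-sections -o firmware.elf
    arm-elf-size firmware.elf   # compare .text with and without --gc-sections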
Chris Hills wrote:

>>> Chris Hills wrote:
>>>
>>> ... snip ...
>>>> They have run out of code space. They are now in mid-project going
>>>> to have to port all the code to a commercial compiler that is more
>>>> efficient.
>>>
>
> It still doesn't get away from the fact that they reckon just by
> recompiling with a commercial compiler instead of GNU the code size was
> halved.
Can you make up your mind? One moment they are facing a costly porting
process, and 12 hours later they halve the code just by recompiling...
Am I missing something?
In article <CH7CCoB7ji1CFArW@phaedsys.demon.co.uk>, chris@phaedsys.org 
> Not all ARM compilers are equal.
>
> You need to look at:
>   code density,
>   speed of execution,
>   targets and hosts supported,
>   other tools that work with it,
>   support from vendor (et al).
>
> The headline cost of the purchase price is not all there is to it. In
> some cases Linux is the most expensive option!
>
> It is like buying automobiles. Someone may give you a Ferrari but can
> you afford to run or insure it?
I would like to point out that the OP's message was dated the 8th and
didn't start to get any responses until mine (on the 13th). While your
points may be valid, it might have been more useful if you had attempted
to answer the questions asked (by the OP) instead of providing a sales
pitch on why GNU tools may not be the best choice. I at least attempted
to answer some of his questions; I mentioned a compiler he COULD use
(and its up-front cost) and a potential IDE. What compilers did you
actually mention? Where is the associated cost? Where did you talk about
this compiler's speed of execution compared to another product?

HW.

Andras Tantos wrote:
> >> Yes, or just semi-competent programmers. I picked up a firmware project
> >> recently (in C) and fully expect to be able to *halve* the code size
> >> with a bit of obvious refactoring.
> >
> > It still doesn't get away from the fact that they reckon just by
> > recompiling with a commercial compiler instead of GNU the code size was
> > halved.
>
> If that's true (half the code size just by switching compilers) I have a
> hard time imagining that being due only to missed optimizations.
It's certainly implausible on any architecture I'm familiar with, but I defer to Mr Hills' ARM-specific knowledge.
> There's a little-known fact about GCC: by default it puts all
> non-referenced functions in a module into the final binary. This is due
> to its default sectioning of the object modules. To enable 'garbage
> collection', you have to specify:
>
> -ffunction-sections -fdata-sections for the compiler and
> --gc-sections for the linker
>
> Do you know if they did that?
>
> Regards,
> Andras Tantos
Chris Hills <chris@phaedsys.org> wrote in message news:<yY6+VkG+gs1CFAs3@phaedsys.demon.co.uk>...

> It still doesn't get away from the fact that they reckon just by
> recompiling with a commercial compiler instead of GNU the code size was
> halved.
It's hard to believe this; all our tests show that GCC produces almost
the same or slightly bigger code than the commercial compilers in most
cases. There are specific situations where GCC produces the same or
smaller code, and others where GCC produces bigger code - everything
depends on how much time you have spent learning how your compiler
assembles your code, where it puts the function parameters, how it
compiles "case" and "if" statements etc., and knowing what to use where.
I'm not such a good assembly writer, as assembly requires you to be
quite focused or you easily shoot yourself in the foot, but a few years
ago I had to write CRC code in AVR assembler, then out of curiosity
rewrote the same in C and compiled it with AVR GCC - it produced code
half the size of my original assembly(!).

On top of this, with some commercial compilers, when you turn on the
highest optimization possible your code will stop working or start
behaving weirdly ;) so you should use optimization with extra care (!)
and spend more time on testing.

So the bottom line is that GCC is doing just fine, but I can't say there
is a decent ARM debugger on the market which works flawlessly in a
Windows environment (or am I asking for too much... ;)

Best regards
Tsvetan

---
PCB prototypes for $26 at http://run.to/pcb (http://www.olimex.com/pcb)
PCB any volume assembly (http://www.olimex.com/pcb/protoa.html)
Development boards for ARM, AVR, PIC, MAXQ2000 and MSP430 (http://www.olimex.com/dev)
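For reference, a table-free bitwise CRC-16/CCITT routine of the kind
described (a generic sketch, not the original code; the 0x1021
polynomial and 0xFFFF initial value are assumptions):

    #include <stdint.h>

    /* Bitwise CRC-16/CCITT: fold in one input byte at a time, then
       shift the register and apply the polynomial bit by bit.
       Compact and table-free - the kind of tight loop where a good
       C compiler can rival or beat hand-written assembly. */
    uint16_t crc16_ccitt(const uint8_t *data, uint16_t len)
    {
        uint16_t crc = 0xFFFF;
        uint8_t bit;

        while (len--) {
            crc ^= (uint16_t)(*data++) << 8;
            for (bit = 0; bit < 8; bit++) {
                if (crc & 0x8000)
                    crc = (uint16_t)((crc << 1) ^ 0x1021);
                else
                    crc = (uint16_t)(crc << 1);
            }
        }
        return crc;
    }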
"Tsvetan Usunov" <tusunov@my-deja.com> wrote in message
news:dd52331e.0507150259.24d67db5@posting.google.com...
> Chris Hills <chris@phaedsys.org> wrote in message
> news:<yY6+VkG+gs1CFAs3@phaedsys.demon.co.uk>...
> > It still doesn't get away from the fact that they reckon just by
> > recompiling with a commercial compiler instead of GNU the code size was
> > halved.
>
> It's hard to believe this; all our tests show that GCC produces almost
> the same or slightly bigger code than the commercial compilers in most
> cases. There are specific situations where GCC produces the same or
On which architecture? GCC is embarrassingly behind the best commercial
compilers, especially on ARM, and it is questionable whether it could
ever reduce the gap. My figures show it is at least 8 years behind the
state of the art - the code the latest GCC produces for Thumb is worse
than that of 8+ year old ARM compilers, and Thumb code produced by GCC
is *larger* than ARM code produced by the ARM compiler... Performance is
about as bad as its code size. So I can believe halving the code is
quite feasible - you can save about 30% in the compiler, and the rest is
likely due to the libraries.
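One way to check the ARM-versus-Thumb figures on your own code (a
sketch; the arm-elf- toolchain prefix is an assumption, and the -marm
flag needs a reasonably recent GCC):

    arm-elf-gcc -Os -marm   -c foo.c -o foo_arm.o
    arm-elf-gcc -Os -mthumb -c foo.c -o foo_thumb.o
    arm-elf-size foo_arm.o foo_thumb.o   # compare the .text columns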
> smaller code, and others where GCC produces bigger code - everything
> depends on how much time you have spent learning how your compiler
> assembles your code, where it puts the function parameters, how it
> compiles "case" and "if" statements etc., and knowing what to use where.
You should not need to change your code to suit your compiler - a good compiler can produce high quality code from any source, whether good or bad. Of course well-written source will produce good code on any compiler.
> On top of this, with some commercial compilers, when you turn on the
> highest optimization possible your code will stop working or start
> behaving weirdly ;) so you should use optimization with extra care (!)
> and spend more time on testing.
This is not true in general. In many compilers all (or most)
optimizations are enabled by default - there is no point in adding
optimizations if you then don't enable them! Many of the problems with
optimizations are caused by programmers writing illegal C/C++ and
expecting the compiler not to break their program. Any optimizations
that go beyond the standard (e.g. assuming no aliasing) are typically
well documented, and are only for programmers who know what they are
doing.
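A classic example of such illegal C is type punning through an
incompatible pointer, which violates the aliasing rules; an optimizer is
then free to reorder or cache the accesses and "break" the code (a
generic sketch, not from any poster's project):

    #include <stdint.h>
    #include <string.h>

    /* Undefined behaviour: reading a float through a uint32_t pointer
       violates the aliasing rules, so an optimizing compiler may
       reorder or eliminate the access. */
    uint32_t float_bits_bad(float f)
    {
        return *(uint32_t *)&f;
    }

    /* Well-defined: memcpy tells the compiler the bytes really overlap,
       and it typically compiles down to the same single move anyway. */
    uint32_t float_bits_good(float f)
    {
        uint32_t u;
        memcpy(&u, &f, sizeof u);
        return u;
    }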
> So the bottom line is that GCC is doing just fine, but I can't say
> there is a decent ARM debugger on the market which works flawlessly in
> a Windows environment (or am I asking for too much... ;)
GCC is just fine if you don't require the best compiler. You get what
you pay for - it's that simple.

Wilco
"Andras Tantos" <andras_tantos@yahoo.com> wrote in message
news:42d6fc61$1@news.microsoft.com...

> If that's true (half the code size just by switching compilers) I have a
> hard time imagining that being due only to missed optimizations. There's
> a little-known fact about GCC: by default it puts all non-referenced
> functions in a module into the final binary. This is due to its default
> sectioning of the object modules. To enable 'garbage collection', you
> have to specify:
>
> -ffunction-sections -fdata-sections for the compiler and
> --gc-sections for the linker
Shouldn't the compiler and linker do this automatically? Removing
unused functions is an essential feature that should be on by default.

It is well known that the default settings of GCC are terrible. Last
year there was a paper at the GCC conference that showed how using 20+
options could reduce code size by 5% on ARM. I don't know whether they
have been made the default since then, but that would be the obvious
thing to do - I can't imagine anybody remembering even half of them! I
also wonder whether the latest GCC has finally dropped that expensive
'60s-style frame pointer...

Wilco
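For what it's worth, the frame pointer can be handed back to the
register allocator explicitly (a sketch; whether this is already the
default varies by GCC version and target, and the arm-elf- prefix is an
assumption):

    arm-elf-gcc -O2 -fomit-frame-pointer -S foo.c -o foo.s
    # inspect foo.s: fp/r11 should now be usable as an ordinary register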
Wilco Dijkstra wrote:
> "Andras Tantos" <andras_tantos@yahoo.com> wrote in message > news:42d6fc61$1@news.microsoft.com... > > >>If that's true (half the codesize just by switching compilers) I have a hard >>time imagining that only be due to miss-optimizations. There's a >>little-known fact of GCC: by default they put all non-referenced functions >>in modules into the final binary. This is due to their default sectioning of >>the object modules. To enable 'garbage-collection', you would have to >>specify: >> >>-fdata-sections -ffunction-sections for the compiler and >>--gc-sections for the linker > > > Shouldn't the compiler and linker do this automatically? Removing > unused functions is an essential feature that should be on by default. > > It is well known that the default settings of GCC are terrible. Last year > there was a paper on the GCC conference that showed how using 20+ > options could reduce codesize by 5% on ARM. I don't know whether > they have changed them to be the default since then, but that would be > the obvious thing to do - I can't imagine anybody remember even half > of them! I also wonder whether the latest GCC has finally dropped that > expensive 60's style frame pointer... >
I'd like to see how the other compiler compares with GCC and option -Os.
When looking at the generated assembly code (-S or -Wa,-ahlms), it's
hard to believe that a compiler could generate 50 % smaller code, if it
compiles the code at all.

--
Tauno Voipio
tauno voipio (at) iki fi
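A sketch of that comparison (tool names are assumptions for a typical
setup; armcc is shown as the commercial example, with -Ospace as its
size-optimization switch):

    arm-elf-gcc -Os -S foo.c -o foo_gcc.s    # read GCC's generated assembly
    arm-elf-gcc -Os -c foo.c -o foo_gcc.o
    arm-elf-size foo_gcc.o                   # note the .text size

    armcc -Ospace -S foo.c                   # commercial compiler's assembly
    armcc -Ospace -c foo.c -o foo_armcc.o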

Wilco Dijkstra wrote:
> ...
> It is well known that the default settings of GCC are terrible. Last
> year there was a paper at the GCC conference that showed how using 20+
> options could reduce code size by 5% on ARM.
Which default settings do you mean? Regarding optimisation, gcc
"defaults" to -O0 if one forgets to specify anything else. Switching to
-O2 should make at least a 5% difference, but even 5% is well within the
range of variation I'd expect from different vendors' compilers.

gcc targets an enormous number of architectures, and nobody can
reasonably expect it to be the best compiler for all of them.
Nonetheless it remains the de facto standard on many architectures
simply because it does a pretty good and reliable job. Few applications
could not tolerate 5% larger code, and those that cannot should perhaps
consider assembler, or finding smarter C programmers, since that factor
alone can certainly account for 50%+ code bloat even with the best
available compiler!
> I don't know whether they have been made the default since then, but
> that would be the obvious thing to do - I can't imagine anybody
> remembering even half of them! I also wonder whether the latest GCC has
> finally dropped that expensive '60s-style frame pointer...
>
> Wilco
