EmbeddedRelated.com

Gnu tools for ARM Cortex development

Started by Tim Wescott May 4, 2010
Oliver Betz wrote:
> David Brown wrote:
>
> [...]
>
>>>> Even their paid-for subscription versions have source tarballs - you can
>>>> re-compile them yourself with the node-locking code disabled if you want
>>> Sure? As far as I understand, you don't get the library sources in the
>>> personal version.
>> I was thinking here about the tools themselves - gcc, binutils, gdb,
>
> for these, there seems to be little or no difference between the
> versions.
There are three differences that I know about between the paid-for versions of these tools and the entirely free versions. One is that the paid-for versions include the node-lock and license validity checks (which can be removed if you are recompiling them, and feel you want to "get around" the licensing). Another is that the paid-for binaries go through a more thorough testing and validation process. But perhaps most importantly, CodeSourcery updates their paid-for versions faster than their fully free versions.
>> etc. I believe you get the sources to some parts of the library with
>> the personal version, but not all. As with many companies that make
>
> "Professional Edition also includes debuggable versions of the
> run-time libraries".
>
> I will ask them when I start again an evaluation of the differences.
> When I tried last time, urgent other work prevented me from finishing
> my tests.
That always seems to happen just after you've registered for your 30-day trial...
Tim Wescott wrote:
> Is anyone out there doing development for the ARM Cortex (specifically
> the m3) with the Gnu tools?
>
> Are you using the CodeSourcery set, or are you building your own?
>
> If so, how are things going? There seems to be a welter of "how to"
> pages on this, but nearly all of them seem to be as old as the hills.
>
> My spare-time job right now is bringing up a set of tools that'll work
> on Linux and will let me develop on the TI LM3S811. I'm trying to keep
> everything 100% open source; since CodeSourcery is exceedingly coy about
> coughing up source code (I certainly haven't found it) and because their
> install scripts don't seem to be terribly compatible with my Linux
> installation (Ubuntu Karmic) I'm building from scratch.
>
> Things seem to be going well, although not completely straightforward --
> my current task is to write or find the obligatory startup code to
> establish a C++ run-time environment so that the rest of my code will
> work, and to verify that OpenOCD really does function on this machine.
>
> Aside from "you're crazy, see a shrink!" does anyone have any useful
> observations on the process? Any known-fresh web pages?
>
> TIA
Would be interested to see some performance data vs commercial compilers. Some tests I found on the internet (ST's forum) indicate that older gcc really sucks for Cortex-M3 (~0.4 Dhrystone MIPS/MHz, IIRC).

Which gcc version do you need to get decent performance?
--
Best Regards
Ulf Samuelsson
These are my own personal opinions, which may or may not be shared by my employer, Atmel Nordic AB
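The startup code Tim mentions - establishing a C (and C++) run-time environment before main() - follows much the same pattern on any Cortex-M3. A minimal sketch, assuming newlib-style symbol names (_estack, _sidata, _sdata, _edata, _sbss, _ebss, __libc_init_array); your linker script may well define different ones, and a real vector table needs entries for all the Cortex-M3 exceptions and the LM3S811 interrupts:

```c
/* Minimal Cortex-M3 startup sketch for the GNU tools.  The symbol names
 * below are assumptions -- they must match what your linker script
 * actually defines. */
extern unsigned long _sidata, _sdata, _edata, _sbss, _ebss, _estack;
extern void __libc_init_array(void);   /* newlib: runs C++ static constructors */
extern int main(void);

void Reset_Handler(void)
{
    unsigned long *src = &_sidata, *dst = &_sdata;

    while (dst < &_edata)       /* copy initialised .data from flash to RAM */
        *dst++ = *src++;
    for (dst = &_sbss; dst < &_ebss; )
        *dst++ = 0;             /* zero the .bss section */

    __libc_init_array();        /* run C++ global constructors */
    main();
    for (;;) ;                  /* main() should not return on bare metal */
}

/* Vector table: initial stack pointer, then the reset vector.  Placed in a
 * dedicated section so the linker script can put it at address 0. */
__attribute__((section(".isr_vector")))
void (* const g_vectors[])(void) = {
    (void (*)(void)) &_estack,
    Reset_Handler,
};
```

This is only a sketch of the idea, not a drop-in file; the section name and the constructor-running helper depend on the library and linker script in use.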
On 19/05/2010 23:42, Ulf Samuelsson wrote:
> Tim Wescott wrote:
>> Is anyone out there doing development for the ARM Cortex (specifically
>> the m3) with the Gnu tools?
>>
>> Are you using the CodeSourcery set, or are you building your own?
>>
>> If so, how are things going? There seems to be a welter of "how to"
>> pages on this, but nearly all of them seem to be as old as the hills.
>>
>> My spare-time job right now is bringing up a set of tools that'll work
>> on Linux and will let me develop on the TI LM3S811. I'm trying to keep
>> everything 100% open source; since CodeSourcery is exceedingly coy
>> about coughing up source code (I certainly haven't found it) and
>> because their install scripts don't seem to be terribly compatible
>> with my Linux installation (Ubuntu Karmic) I'm building from scratch.
>>
>> Things seem to be going well, although not completely straightforward
>> -- my current task is to write or find the obligatory startup code to
>> establish a C++ run-time environment so that the rest of my code will
>> work, and to verify that OpenOCD really does function on this machine.
>>
>> Aside from "you're crazy, see a shrink!" does anyone have any useful
>> observations on the process? Any known-fresh web pages?
>>
>> TIA
>
> Would be interested to see some performance data vs commercial
> compilers. Some tests I found on the internet (ST's forum) indicate
> that older gcc really sucks for Cortex-M3 (~0.4 Dhrystone MIPS/MHz,
> IIRC).
>
> Which gcc version do you need to get decent performance?
As with any other compiler, early versions of gcc for a particular target have often been poor. One of the differences between gcc and commercial compilers is that gcc releases are often available even in their earliest versions, while a commercial company will probably not release its tools until it is happy with the performance. There are also many different versions of gcc around - for various reasons, people will sometimes choose old versions of the tools.

Another difference between gcc and commercial tools is that commercial tools often have EULAs restricting you from publishing any sort of benchmark information (this is understandable from the supplier's viewpoint). Finally, commercial tool vendors have a strong need for competitive marketing, and therefore to tell users how much better code their compiler produces. gcc, even from commercial companies, is not in the same situation - its main "competitor" is older versions of gcc.

All this adds up to it being very common to see "benchmarks" showing that brand X compiler generates faster code than brand Y or gcc. When you look at the details (if details are even shown), brand X is probably the latest version with the fastest choice of compiler flags, while brand Y and gcc are often older versions with poorer flag choices. The source code used for the tests is typically meaningless (who really wants to calculate lists of primes on a microcontroller? And why does "printf" turn up so often in a /compiler/ test?) and chosen to fit the results the tester wants.

If you want to know which compiler does a better job, the only way to find out is to get some evaluation copies and do the comparison yourself. It would be nice if there were such information available on a website, but it would take a lot of time and effort (and therefore money), especially to keep it updated, and would break the vendors' licensing agreements.
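The "do the comparison yourself" advice can be sketched concretely. The harness below is a minimal, portable example one could compile with each candidate toolchain at the same level of flag effort; work() is only a stand-in for your own application code, and on a real board you would read a hardware cycle counter rather than use clock():

```c
#include <time.h>

/* Stand-in for real application code -- benchmark YOUR code, not this. */
unsigned long long work(unsigned n)
{
    unsigned long long sum = 0;
    for (unsigned i = 0; i < n; i++)
        sum += (unsigned long long)i * i;   /* sum of squares 0..n-1 */
    return sum;
}

/* Time `reps` repetitions of work(n); returns elapsed seconds and writes
 * a checksum so the compiler cannot optimise the whole loop away. */
double bench(unsigned reps, unsigned n, unsigned long long *checksum)
{
    unsigned long long total = 0;
    clock_t t0 = clock();
    for (unsigned r = 0; r < reps; r++)
        total += work(n);
    clock_t t1 = clock();
    *checksum = total;
    return (double)(t1 - t0) / CLOCKS_PER_SEC;
}
```

Compile the same file with each compiler at comparable settings (e.g. -O2 plus the right -mcpu option) and compare both the timings and the generated code via a disassembly - that removes most of the flag-choice bias the post describes.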
In message <4bf529e2$0$2035$8404b019@news.wineasy.se>, David Brown
<david@westcontrol.removethisbit.com> writes
> Another difference between gcc and commercial tools is that commercial
> tools often have EULAs restricting you from publishing any sort of
> benchmark information (this is understandable from the supplier's
> viewpoint).
I agree... however it does not stop benchmarks being done. Most compiler companies test all the competitors' compilers they can get their hands on. I have seen some of these tests (I am under NDAs with various companies and have done compiler testing). They are all better than the GCC equivalents.

That is unless you spend a LOT of time (and time == money) improving the GCC compiler set-up. Then it gets better, but rarely as good. It is essentially a generic compiler system; it is not going to get anywhere close to the targeted commercial compilers.
> Finally, commercial tool vendors have a strong need for competitive
> marketing, and therefore to tell users how much better code their
> compiler produces.
What most commercial compiler companies do is tell their users how to get the best out of the tools. That is true.
> gcc, even from commercial companies, are not in the same situation -
> their main "competitor" is older versions of gcc.
Or other suppliers of the same(ish) version.
> All this adds up to it being very common to see "benchmarks" showing
> that brand X compiler generates faster code than brand Y or gcc. When
> you look at the details (if details are even shown), brand X is
> probably the latest version with the fastest choice of compiler flags,
> while brand Y and gcc are often older versions and poorer flag choices.
They always say that. However, the internal tests and benchmarking use the current and mainstream GCC compilers, with suitable flags set for all. There is no point in doing otherwise for internal testing and benchmarks you can't publish.
> The source code used for the tests is typically meaningless (who
> really wants to calculate lists of primes on a microcontroller? And
> why does "printf" turn up so often in a /compiler/ test?) and chosen to
> fit the results the tester wants.
There are many benchmarks, and each tests different things. Apart from the obvious Whetstones and Dhrystones, sieves and primes etc., there are a lot of other benchmarks used, certainly internally. Quite apart from language conformance tests.
> If you want to know which compiler does a better job, the only way to
> find out is to get some evaluation copies and do the comparison
> yourself.
Very true. I once saw someone who upgraded because the new version of a compiler said it could do *on average* a 10% reduction in code size. He complained because he got a 1% reduction.... When we looked into it, 90% of his code was look-up tables!!!!

So try the compiler on YOUR code. That is what eval versions are for. Most do a size-limited version and/or a time-limited unrestricted version.
> It would be nice if there were such information available
There is... but
> on a website,
Not a chance because.....
> but it would take a lot of time and effort (and therefore money),
> especially to keep it updated,
Time == money
> and would break the vendors' licensing agreements.
Yes.

Incidentally, I had a chat with a company whose legal department went through the licenses for some Open Source they wanted to use. Apparently it was "far too restrictive" and they refused to permit any Open Source in the company!!!!

They did not say which Open Source License(s) it was.
--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Chris H wrote:
> In message <4bf529e2$0$2035$8404b019@news.wineasy.se>, David Brown
> <david@westcontrol.removethisbit.com> writes
>> Another difference between gcc and commercial tools is that commercial
>> tools often have EULAs restricting you from publishing any sort of
>> benchmark information (this is understandable from the supplier's
>> viewpoint).
>
> I agree... however it does not stop benchmarks being done. Most compiler
> companies test all the competitors' compilers they can get their hands
> on. I have seen some of these tests. (I am under NDA's with various
> companies and have done compiler testing) They are all better than GCC
> equivalents.
>
> That is unless you spend a LOT of time (and time == money) improving the
> GCC compiler set up. Then it gets better but rarely as good. It is
> essentially a generic compiler system; it is not going to get anywhere
> close to the targeted commercial compilers.
I'm not going to argue here about how gcc /actually/ compares to other vendors - we've heard each other's opinions on that before. And I haven't done any comparisons myself on the ARM platform, so I don't have any facts on hand. What I am saying is that you cannot place much trust in a compiler vendor's benchmark publications. I am not trying to accuse them of dishonesty or anything (unless you consider all marketing dishonest) - it's just that it's somewhere between very hard and impossible to do a good job of such benchmarks, and there are too many conflicts of interest involved.
>> Finally, commercial tool vendors have a strong need for competitive
>> marketing, and therefore to tell users how much better code their
>> compiler produces.
>
> What most commercial compiler companies do is tell their users how to
> get the best out of the tools. That is true.
>
>> gcc, even from commercial companies, are not in the same situation -
>> their main "competitor" is older versions of gcc.
>
> Or other suppliers of the same(ish) version.
>
>> All this adds up to it being very common to see "benchmarks" showing
>> that brand X compiler generates faster code than brand Y or gcc. When
>> you look at the details (if details are even shown), brand X is
>> probably the latest version with the fastest choice of compiler flags,
>> while brand Y and gcc are often older versions and poorer flag choices.
>
> They always say that. However, the internal tests and benchmarking use
> the current and mainstream GCC compilers, with suitable flags set for
> all. There is no point in doing otherwise for internal testing and
> benchmarks you can't publish.
I'm sure that /internally/ vendors take gcc a lot more seriously. I'd be very surprised if they don't do close comparisons on all sorts of generated code using their own tools and gcc, and I'm sure there are times when they study the gcc source code for ideas (they won't copy anything, of course - it would not fit in their code structure. And it would be illegal).

But that's different from published benchmarks. Internal benchmarks are a development tool written and used by engineers. Published benchmarks are a marketing tool for salespeople.

I've often heard you say that this sort of benchmark proves that gcc can't compare to commercial compilers. But I'm sure you'll agree that a statement like that on its own is worthless - and your NDAs prevent you giving anything more. Even if you are able to provide your customers or potential customers with more details "off the record", anyone interested in checking the performance of a compiler has to do the tests themselves.
>> The source code used for the tests is typically meaningless (who
>> really wants to calculate lists of primes on a microcontroller? And
>> why does "printf" turn up so often in a /compiler/ test?) and chosen to
>> fit the results the tester wants.
>
> There are many benchmarks, and each tests different things. Apart from
> the obvious Whetstones and Dhrystones, sieves and primes etc., there
> are a lot of other benchmarks used, certainly internally. Quite apart
> from language conformance tests.
>
>> If you want to know which compiler does a better job, the only way to
>> find out is to get some evaluation copies and do the comparison
>> yourself.
>
> Very true. I once saw someone who upgraded because the new version of a
> compiler said it could do *on average* a 10% reduction in code size. He
> complained because he got a 1% reduction.... When we looked into it,
> 90% of his code was look-up tables!!!!
>
> So try the compiler on YOUR code. That is what eval versions are for.
> Most do a size-limited version and/or a time-limited unrestricted
> version.
>
>> It would be nice if there were such information available
>
> There is... but
>
>> on a website,
>
> Not a chance because.....
>
>> but it would take a lot of time and effort (and therefore money),
>> especially to keep it updated,
>
> Time == money
>
>> and would break the vendors' licensing agreements.
>
> Yes.
>
> Incidentally, I had a chat with a company whose legal department went
> through the licenses for some Open Source they wanted to use.
> Apparently it was "far too restrictive" and they refused to permit any
> Open Source in the company!!!!
>
> They did not say which Open Source License(s) it was.
Some open source licenses have a lot of restrictions, but these are mostly on use of the source code. There are few which have restrictions on /use/ of the program. In fact, to get OSI approval of a license as "open source" (tm), you are not allowed to have restrictions on who can use the software - see <http://www.opensource.org/docs/osd> for the rules.

Of course, there are legal departments, managers, etc., who come up with all sorts of bizarre rules based on their understanding or misunderstanding of things. I've heard of companies refusing to use free software (open source or otherwise) because if they haven't paid for it, there is no one to sue if it goes wrong!

If this particular story is referring to software development, then it's a different matter. Trying to make use of existing open source software in the development of your own products can be a legal minefield, especially if you want to mix and match code with different licenses. And in this context, people often consider the GPL to be very restrictive, especially compared to BSD licenses.
Chris H wrote:

[...]

> I agree... however it does not stop benchmarks being done. Most compiler
> companies test all the competitors' compilers they can get their hands
> on. I have seen some of these tests. (I am under NDA's with various
> companies and have done compiler testing) They are all better than GCC
> equivalents.
does this apply to well-maintained 32-bit targets such as m68k/Coldfire and ARM? General code generation problems, or library quality?

Oliver
--
Oliver Betz, Muenchen (oliverbetz.de)
On 21/05/2010 07:58, Oliver Betz wrote:
> Chris H wrote:
>
> [...]
>
>> I agree... however it does not stop benchmarks being done. Most compiler
>> companies test all the competitors' compilers they can get their hands
>> on. I have seen some of these tests. (I am under NDA's with various
>> companies and have done compiler testing) They are all better than GCC
>> equivalents.
>
> does this apply to well-maintained 32-bit targets such as m68k/Coldfire
> and ARM?
In my experience, gcc produces very good code for general C (and C++, Ada, Fortran, etc.) for the main 32-bit targets, such as m68k/Coldfire, ARM, MIPS, x86, and PPC, as well as the 64-bit targets PPC, MIPS, and amd64.

There are some aspects where top-rank commercial compilers will do a better job - their support for "cpu accelerators" such as extra DSP or vector units is often better, especially if there are few chips with these units. In general, the more specialised an add-on is, the less likely it is that gcc will fully support it. Support for that sort of thing takes time and money - commercial tool vendors will often get support from the chip vendors, and they have customers happy to pay money for exactly this sort of thing. With gcc, such features depend more on popularity and on the number of paying customers at commercial gcc developers like CodeSourcery. Hardware vendors may also sponsor such features.

On the other hand, more development is done on the front end of gcc than for most other compilers. The front end is shared across all ports, and is thus probably more used than all other compilers put together. So there is a lot in terms of language support and extensions, as well as front and middle end optimisations, in which gcc leads many commercial compilers.

One area in which gcc has been poor compared to commercial compilers is whole-program optimisation (a.k.a. link-time optimisation, inter-module optimisation, omniscient code generation, etc.). For several versions, gcc has supported this to a limited extent - basically, you compile all your C files at once with a few compiler flags. But you couldn't split the compilation up, you couldn't mix in C++, you couldn't use it on libraries, and it didn't scale well (that's less of a problem with many embedded systems, but limits the scope for development and testing since it is of little use on "big system" software). With gcc 4.5, there is now proper link-time optimisation. It remains to be seen just how good this will be in practice, and it will probably take time to mature, but the potential is huge, and it could lead to many changes to the way C code and modules are organised.
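For anyone wanting to try gcc 4.5's link-time optimisation: it is driven by the -flto flag at both compile and link time, rather than by any source change. A minimal sketch (the arm-none-eabi- prefix is an assumption; substitute whatever your cross-compiler is called):

```c
/* a.c -- with -flto, gcc can inline get_coeff() into callers that live in
 * other translation units, something that previously required putting the
 * function in a header or compiling everything in one gcc invocation. */
int get_coeff(void)
{
    return 42;
}

/* main.c would contain:
 *
 *   extern int get_coeff(void);
 *   int main(void) { return get_coeff(); }
 *
 * Build steps (gcc >= 4.5):
 *
 *   arm-none-eabi-gcc -O2 -flto -c a.c
 *   arm-none-eabi-gcc -O2 -flto -c main.c
 *   arm-none-eabi-gcc -O2 -flto a.o main.o -o app.elf
 *
 * The .o files carry the compiler's intermediate representation alongside
 * (or instead of) object code, and the cross-module optimisation happens
 * at the final link step. */
```

This is only a sketch of the mechanism; exact behaviour and limitations in the 4.5 release should be checked against the gcc documentation.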
> General code generation problems, or library quality?
Libraries also vary a lot in quality. There are also balances to be made - libraries aimed at desktop use will put more effort into flexibility and standards compliance (such as full IEEE floating point support), while those aimed at embedded systems emphasise size and speed. This is an area where the various commercial gcc vendors differentiate their products.
> Oliver
In message <9O2dndwkDrE9h2vWnZ2dnUVZ8vydnZ2d@lyse.net>, David Brown
<david.brown@hesbynett.removethisbit.no> writes
> Chris H wrote:
>> In message <4bf529e2$0$2035$8404b019@news.wineasy.se>, David Brown
>> <david@westcontrol.removethisbit.com> writes
>>> Another difference between gcc and commercial tools is that commercial
>>> tools often have EULAs restricting you from publishing any sort of
>>> benchmark information (this is understandable from the supplier's
>>> viewpoint).
>> I agree... however it does not stop benchmarks being done. Most
>> compiler companies test all the competitors' compilers they can get
>> their hands on. I have seen some of these tests. (I am under NDA's
>> with various companies and have done compiler testing) They are all
>> better than GCC equivalents.
>> That is unless you spend a LOT of time (and time == money) improving
>> the GCC compiler set up. Then it gets better but rarely as good. It is
>> essentially a generic compiler system; it is not going to get anywhere
>> close to the targeted commercial compilers.
>
> What I am saying is that you cannot place much trust in a compiler
> vendor's benchmark publications.
I agree. The published ones are not of any real use.
> I am not trying to accuse them of dishonesty or anything (unless you
> consider all marketing dishonest) - it's just that it's somewhere
> between very hard and impossible to do a good job of such benchmarks,
> and there are too many conflicts of interest involved.
I agree. And that applies to the non-commercial tools as well.

BTW, whilst the marketing from some companies is "close to the line" (and sometimes close on the wrong side :-), Open Source devotees can be just as bad and often far worse in their claims and arguments. They make religious zealots look sane.
>I'm sure that /internally/ vendors take gcc a lot more seriously.
Not technically. It is way behind most commercial compilers, and for many targets there are no open source or free compilers. One compiler designer I know was complaining last year that all GCC is doing is rearranging the deck chairs on the Titanic when it comes to compiler technology.
> I'd be very surprised if they don't do close comparisons on all sorts
> of generated code using their own tools and gcc,
They do.... but not "and gcc". Gcc is just one of many compilers a compiler company will test, starting with their main competitors.
> and I'm sure there are times when they study the gcc source code for
> ideas (they won't copy anything, of course - it would not fit in their
> code structure. And it would be illegal).
There is no need except for amusement. GCC is a LONG way behind the main commercial compilers.
> But that's different from published benchmarks. Internal benchmarks
> are a development tool written and used by engineers. Published
> benchmarks are a marketing tool for salespeople.
Yes and no. All benchmarks are benchmarks. The published ones tend to use well-known benchmarks where the source is public. Internal benchmarks use all sorts of code. I know one company who uses several very large projects from customers, as well as the sources for their own tools.
> I've often heard you say that this sort of benchmark proves that gcc
> can't compare to commercial compilers.
It does.
> But I'm sure you'll agree that a statement like that on its own is
> worthless - and your NDAs prevent you giving anything more.
Yes to a point.
> Even if you are able to provide your customers or potential customers
> with more details "off the record", anyone interested in checking the
> performance of a compiler has to do the tests themselves.
Yes. But who does? Apart from the standard published benchmarks, which are very narrow.
--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills Staffs England /\/\/\/\/
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/

David Brown wrote:
> I'm sure that /internally/ vendors take gcc a lot more seriously. I'd
> be very surprised if they don't do close comparisons on all sorts of
> generated code using their own tools and gcc, and I'm sure there are
> times when they study the gcc source code for ideas (they won't copy
> anything, of course - it would not fit in their code structure. And it
> would be illegal).
>
> But that's different from published benchmarks. Internal benchmarks are
> a development tool written and used by engineers. Published benchmarks
> are a marketing tool for salespeople.
>
> I've often heard you say that this sort of benchmark proves that gcc
> can't compare to commercial compilers.
In my experience that's true. The technical issue with gcc is that the fundamental design is very old, and it tries to support many targets with as much common code as possible. These are primary weaknesses when gcc is competing with commercial compilers designed for a specific target.

I don't think that many commercial compiler companies would be looking through gcc source code for ideas anymore. The approaches that commercial compiler developers are now using are so fundamentally different that gcc would be of little help.

Regards,

w..
--
Walter Banks
Byte Craft Limited
http://www.bytecraft.com
David Brown wrote:

[...]

> In my experience, gcc produces very good code for general C (and C++,
> Ada, Fortran, etc.) for the main 32-bit targets, such as m68k/Coldfire,
> ARM, MIPS, x86, and PPC as well as the 64-bit targets PPC, MIPS, and
> amd64.
what I have seen in my tests so far looked good, apart from a strange re-ordering of instructions making the generated code no faster but unreadable (e.g. in the debugger). And it could be that the re-ordering affects performance when accessing slow Coldfire V2 peripherals (consecutive accesses to peripherals cost more wait states), but I haven't investigated this yet.

[...]
> One area in which gcc has been poor compared to commercial compilers
> is whole-program optimisation (a.k.a. link-time optimisation, inter-
> module optimisation, omniscient code generation, etc.). For several
since this affects mainly code size, this is no problem for me. My applications are small and time critical, so I need only speed.

[...]
>> General code generation problems, or library quality?
>
> Libraries also vary a lot in quality. There are also balances to be
> made - libraries aimed at desktop use will put more effort into
> flexibility and standards compliance (such as full IEEE floating point
> support), while those aimed at embedded systems emphasise size and
> speed.
newlib, uClibc? IMO still bloated for small applications.
> This is an area where the various commercial gcc vendors differentiate
> their products.
At least CodeSourcery doesn't say much about specific advantages of their libraries. And since the libraries have to cover a broad range of applications, it might be necessary to compile them with specific settings - who provides sources?

Oliver
--
Oliver Betz, Munich
despammed.com might be broken, use Reply-To:
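On the instruction re-ordering Oliver observed earlier in the thread: gcc is free to re-order ordinary memory accesses, but it is not allowed to re-order volatile accesses with respect to each other, which is the usual way to pin down the order of peripheral accesses. A sketch with a made-up register block - the address and layout below are purely illustrative, not a real Coldfire peripheral:

```c
#include <stdint.h>

/* Hypothetical peripheral base address -- a real value must come from
 * your device's memory map. */
#define TIMER_BASE  0x40000000u

typedef struct {
    volatile uint16_t mode;     /* volatile: each access really happens,  */
    volatile uint16_t counter;  /* in program order, exactly once         */
} timer_regs_t;

#define TIMER  ((timer_regs_t *)TIMER_BASE)

void timer_setup(void)
{
    /* Because the fields are volatile, gcc must emit these two stores in
     * this order; without volatile it could merge, drop, or re-order
     * them during instruction scheduling. */
    TIMER->mode = 0x0005;
    TIMER->counter = 0;
}
```

Note that volatile constrains the compiler only; it says nothing about wait states or bus behaviour, so timing-sensitive back-to-back accesses still need checking against the chip's documentation.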

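On the newlib bloat point above: a good deal of what gets linked in arrives through stdio and the syscall layer. A common mitigation is to supply minimal syscall stubs so that only what you actually use is pulled in. A sketch assuming newlib's usual _write/_sbrk names; uart_putc() is a hypothetical output routine standing in for your board support, and the `end` symbol is the heap-start marker most GNU linker scripts provide:

```c
#include <sys/types.h>

/* Hypothetical board-specific output routine -- supply your own. */
extern void uart_putc(char c);

/* Heap start symbol, assumed to be defined by the linker script. */
extern char end;
static char *heap_ptr = &end;

/* Called by newlib when printf() etc. finally emit bytes. */
int _write(int fd, const char *buf, int len)
{
    (void)fd;                       /* everything goes to the UART here */
    for (int i = 0; i < len; i++)
        uart_putc(buf[i]);
    return len;
}

/* Called by newlib's malloc() to grow the heap. */
caddr_t _sbrk(int incr)
{
    char *prev = heap_ptr;
    heap_ptr += incr;               /* no stack-collision check in this sketch */
    return (caddr_t)prev;
}
```

This only reduces the retargeting glue; if code size still matters, avoiding full printf() in favour of a smaller formatter helps far more.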