Code Red for ARM Cortex M3 development - any good ?

Started by Mike July 5, 2012
On 7/10/2012 12:11 PM, Walter Banks wrote:
[...]
> In the automotive area, for example, not only are language standards being
> tested, but standardized regression tests are emerging that are used
> by tool vendors.
The automotive area is definitely showing interest in Open Source Software (even though they do not mention 'free' in the sense of 'libre'): http://www.autosar.org/
> > Most of the FOSS tools are using 20+ year old technology whose code
> > generation is weak.
I do not quite understand what 'whose code generation is weak' means. I compile my code on 20+ year old technology as well as on the newest technology. Except for minor warnings that are no longer flagged by new compilers - and that I fix nevertheless - I don't see any problem.
In article <20120709093351.241147a1@rg.highlandtechnology.com>,
Rob Gaddi  <rgaddi@technologyhighland.invalid> wrote:
<SNIP>
>Permanence. By using gcc, I know that I will always, always, always be
>able to get my exact toolchain back if I try hard enough. I won't have
>to try to contact a defunct company's license server to reinstall on a
>machine with a new MAC address or hard disk ID, or a still-existing
>company who will no longer provide licenses for "ancient" tools.
What about the situation where it is impossible to find out who actually owns the rights to the software? (I was once in that situation.) I ended up scrapping the C compiler and retaining the libraries. However, I had to adapt gcc to change its calling convention, so that the libraries could still be used. (Another poster said that "very skilled" people can do that; thanks for the compliment. I want to say just one thing: don't be too intimidated.)
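For a sense of what "adapting the calling convention" can involve: Albert changed the compiler itself, but on some targets gcc also lets you match a foreign convention per declaration, via function attributes. A minimal sketch, assuming an x86 target and a hypothetical binary-only library routine legacy_sum that expects its first two arguments in registers:

    /* legacy_abi.h - hypothetical shim for a binary-only legacy library.
     * On x86, gcc's regparm attribute makes calls through this
     * declaration pass the first two arguments in registers instead of
     * on the stack - one way to match a foreign calling convention
     * without rebuilding the library or patching the compiler. */
    #define LEGACY_CALL __attribute__((regparm(2)))

    LEGACY_CALL int legacy_sum(int a, int b);   /* matches the old ABI */

    int wrapper_sum(int a, int b)
    {
        /* gcc emits the register-based call sequence here. */
        return legacy_sum(a, b);
    }

Whether attributes suffice depends on how far the legacy convention diverges; Albert's case evidently needed a compiler-level change.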
>It might take me going all the way back to rebuilding gcc and newlib
>from source, and that might take me days, but it's not impossible.
In that same company, they allowed me to buy a new SUN to run my compilers on. I ordered a C compiler to go with the system. Installing gcc from source was a snap; getting the new C compiler to work with the license server over the network was not. After several calls to SUN's service department I gave up. gcc on the SUN did a good job of compiling the embedded gcc cross-compiler anyway. Who would expect otherwise? The license was never used.
Greetings, Albert
--
Albert van der Horst, UTRECHT, THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst

alb wrote:

> On 7/10/2012 12:11 PM, Walter Banks wrote:
>
> > Most of the FOSS tools are using 20+ year old technology whose code
> > generation is weak.
>
> I do not quite understand what 'whose code generation is weak' means. I
> compile my code on 20+ year old technology as well as on the newest
> technology. Except for minor warnings that are no longer flagged by
> new compilers - and that I fix nevertheless - I don't see any problem.
Weak in the optimization and code-creation sense. FOSS tools have a long way to go to become truly modern tools. To name a few shortcomings: most are still using technology that was designed to work around development systems with limited resources, such as separate linking and asm code generation instead of compiling directly to machine code. Most FOSS tools I have used are shockingly slow. There are no application-wide strategy passes before code generation. Optimization and code creation are primitive, using only a subset of the resources on the target processor.

w..
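For concreteness, "application-wide" here means optimizing across translation units. A small C illustration of the kind of opportunity a strictly per-file compiler misses (gcc later addressed this with link-time optimization, -flto, available since gcc 4.5; whether that counts as modern is exactly the point under debate):

    /* util.c - separate translation unit */
    int scale(int x)
    {
        return x * 8;            /* trivially a shift */
    }

    /* main.c - compiled alone, the compiler cannot see scale()'s body
     * and must emit a real call.  A whole-program strategy pass, or
     *     gcc -O2 -flto util.c main.c
     * can inline it and fold main() down to 'return 40;'. */
    extern int scale(int x);

    int main(void)
    {
        return scale(5);
    }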
Mark Borgerson wrote:
> In article <jtfbgj.o8.1@stefan.msgid.phost.de>, stefan.news@arcor.de
>>>>Any sensible editor has support for ctags/etags, otherwise it doesn't
>>>>deserve being called a sensible programmer's editor. This solves the
>>>>navigation problem for me most of the time. Plus, my emacs has 1000
>>>>files open anyway, so most of the time opening a file is C-x b
>>>>name-of-file.c RET, no need to navigate through a file system. Try that
>>>>with Eclipse.
>>>
>>>This raises a few questions:
>>>
>>>1. Why does your emacs have 1000 files open?
>>
>>Why not?
>>
>>>2. How do you pick the file you would like to edit among those 1000
>>>files?
>>
>>C-x b name-of-file.c RET
>
> WOW! You can keep track of 1000 file names in your head. I have
> trouble remembering the last 50 files I've used.
I had a few years to learn them...
>>Or, actually, F10 (my shortcut for C-x b) nam<TAB>fi<TAB>c RET, using
>>completion.
>>
>>>3. What projects do you work on that have 1000 files?
>>
>>Automotive infotainment, from low level (boot loader) through the whole
>>stack (drivers, file system, codec), to high level (HMI adaption).
>
> That's a much larger system than any I've worked on. Even the flight
> control system I worked on had only about 50-60 files---but we used
> pretty standard drivers for everything and depended on the OS (linux) to
> provide those, so we didn't have to recompile them.
What adds to my file count is that being a low-level guy, I get to debug the low-level problems the other developers have. Which means, over time, kernel and driver sources accumulate in my environment :-)
>>>4. How much system resources does it take to manage those 1000 files?
>>
>>The emacs process has a memory footprint of about 50-100 megabytes,
>>containing all those files, and their undo data for the last half year.
>
> Does the EMACS process have all the file structures, or does the OS have
> them and EMACS just keeps file handles?
It loads everything into RAM and then releases the file handles. It processes the files on first use (syntax coloring etc.).

Stefan
Walter Banks wrote:
> alb wrote:
>>- standardization: the FOSS community has always paid great attention to
>>the necessity of agreeing upon standards, from file formats to protocols
>>and much more. A closed tool has no such interest, and from time to
>>time even different versions of the same tool do not support file
>>formats they originally created.
>
> The FOSS community has a lot of ad hoc standards but doesn't
> participate in standards groups and generally doesn't support formal
> standards.
The Austin Group lists as members SuSE, RedHat, NetBSD, FreeBSD, Linux Standard Base. And if I look at C99, I find quite a number of gcc extensions in there.
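Two concrete examples of that direction of flow: designated initializers (older gcc spelling '{ x: 0 }') and compound literals both existed as gcc extensions before C99 standardized them. A small sketch, compilable with gcc -std=c99:

    #include <stdio.h>

    struct point { int x, y; };

    /* Designated initializers: gcc extension first, C99 syntax shown. */
    static struct point origin = { .x = 0, .y = 0 };

    int main(void)
    {
        /* Compound literals: likewise a gcc extension before C99. */
        int *p = (int[]){ 1, 2, 3 };
        printf("%d %d %d\n", origin.x, origin.y, p[2]);
        return 0;
    }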
> Commercial tools do and that makes them much more
> flexible to use in unusual combinations of code generation toolsets,
> simulators, emulators and debug devices.
Commercial tools put sentences into their manuals like "The compiler does not support C99.", or for C++: "The <complex> header and its functions are not included in the library. Two-phase name binding in templates, as described in [temp.res] and [temp.dep] of the standard, is not implemented. A typedef of a function type cannot include member function cv-qualifiers. A partial specialization of a class member template cannot be added outside of the class definition." (From document SPNU151G.)

And we're talking about C++98 only, in the year 2012! I would have expected 14 years to be enough to become conformant. gcc's standards conformance is much better than that compiler's, and no worse than any other compiler's I've used. The world is not as black and white as you draw it.
> This is especially true of language implementation in FOSS tools
> and conformance to IEC / ISO standards. I have been shocked that
> of all the work that has been done on tool sets very little has been
> done in the FOSS community on conformance testing.
For _formal_ conformance testing. This costs real money, and gives you a sticker for glossy brochures. FOSS has/needs neither. I don't have the impression that gcc is particularly untested. My current tally is about fifteen bugs in commercial tools vs. two in gcc, even though I've been using gcc much longer.

Stefan

Stefan Reuther wrote:

> Walter Banks wrote:
> > alb wrote:
> >>- standardization: the FOSS community has always paid great attention to
> >>the necessity of agreeing upon standards, from file formats to protocols
> >>and much more. A closed tool has no such interest, and from time to
> >>time even different versions of the same tool do not support file
> >>formats they originally created.
> >
> > The FOSS community has a lot of ad hoc standards but doesn't
> > participate in standards groups and generally doesn't support formal
> > standards.
>
> The Austin Group lists as members SuSE, RedHat, NetBSD, FreeBSD, Linux
> Standard Base. And if I look at C99, I find quite a number of gcc
> extensions in there.
FOSS groups did not actively participate in C99 / C11. I would say there are some GCC extensions and some extensions that GCC adopted from other development work.
> > This is especially true of language implementation in FOSS tools
> > and conformance to IEC / ISO standards. I have been shocked that
> > of all the work that has been done on tool sets very little has been
> > done in the FOSS community on conformance testing.
>
> For _formal_ conformance testing. This costs real money, and gives you a
> sticker for glossy brochures. FOSS has/needs neither.
Conformance testing is one way to establish with confidence the definition of a tool set. My surprise is that the FOSS community has not developed conformance testing tools, in part in response to their cost. The FOSS community has been making the argument for free tools for years.

w..
Walter Banks wrote:
> Stefan Reuther wrote:
>>Walter Banks wrote:
>>>This is especially true of language implementation in FOSS tools
>>>and conformance to IEC / ISO standards. I have been shocked that
>>>of all the work that has been done on tool sets very little has been
>>>done in the FOSS community on conformance testing.
>>
>>For _formal_ conformance testing. This costs real money, and gives you a
>>sticker for glossy brochures. FOSS has/needs neither.
>
> Conformance testing is one way to establish with confidence the
> definition of a tool set. My surprise is that the FOSS community
> has not developed conformance testing tools, in part in response
> to their cost. The FOSS community has been making the argument for
> free tools for years.
gcc comes with an extensive test suite; why does this one not count? I am actually quite confident in it, simply because it has failed me less often than "the others".

Stefan
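For reference, the suite Stefan mentions is driven by DejaGnu: a test case is a C file steered by directive comments, and execution tests conventionally call abort() on failure. A simplified sketch in that style (illustrative, not an actual test from gcc/testsuite):

    /* { dg-do run } */
    /* { dg-options "-O2" } */

    extern void abort(void);

    int main(void)
    {
        volatile int x = 5;
        if (x * 8 != 40)     /* regression check on a folded multiply */
            abort();
        return 0;
    }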

Stefan Reuther wrote:

> Walter Banks wrote:
>
> > Conformance testing is one way to establish with confidence the
> > definition of a tool set. My surprise is that the FOSS community
> > has not developed conformance testing tools, in part in response
> > to their cost. The FOSS community has been making the argument for
> > free tools for years.
>
> gcc comes with an extensive test suite; why does this one not count?
>
> I am actually quite confident in it, simply because it has failed me
> less often than "the others".
The GCC test suite is ad hoc; it tests some things and misses others. What it lacks is tests that can be directly linked back to specific requirements: tests whose names, for example, map back to standard reference numbers (section and paragraph numbering). The GCC test suite is like the collection of regression tests that we have built up over the years as part of our internal testing, from customer support, or just from interesting code fragments that get debated in newsgroups. Conformance tests should be written by someone who knows the requirements but has no knowledge of the implementation.

Regards,

Walter Banks
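To illustrate the naming scheme being asked for, here is a hypothetical requirement-linked test, keyed to C99 subclause 6.5.3.4 ("The sizeof operator"), whose paragraph 3 requires sizeof to yield 1 for character types. The file name alone tells you which clause a failure violates. This is my own sketch, not taken from any existing conformance suite:

    /* c99_6.5.3.4_p3.c - hypothetical conformance test named after
     * ISO/IEC 9899:1999 subclause 6.5.3.4 paragraph 3:
     * "When applied to an operand that has type char, unsigned char,
     *  or signed char, ... the result is 1." */
    #include <stdio.h>

    int main(void)
    {
        if (sizeof(char) != 1) {
            puts("FAIL: 6.5.3.4 p3");
            return 1;
        }
        puts("PASS: 6.5.3.4 p3");
        return 0;
    }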
In article <jti0lk.1qo.1@stefan.msgid.phost.de>, stefan.news@arcor.de 
says...
> Mark Borgerson wrote:
> > In article <jtfbgj.o8.1@stefan.msgid.phost.de>, stefan.news@arcor.de
> >>>>Any sensible editor has support for ctags/etags, otherwise it doesn't
> >>>>deserve being called a sensible programmer's editor. This solves the
> >>>>navigation problem for me most of the time. Plus, my emacs has 1000
> >>>>files open anyway, so most of the time opening a file is C-x b
> >>>>name-of-file.c RET, no need to navigate through a file system. Try that
> >>>>with Eclipse.
> >>>
> >>>This raises a few questions:
> >>>
> >>>1. Why does your emacs have 1000 files open?
> >>
> >>Why not?
> >>
> >>>2. How do you pick the file you would like to edit among those 1000
> >>>files?
> >>
> >>C-x b name-of-file.c RET
> >
> > WOW! You can keep track of 1000 file names in your head. I have
> > trouble remembering the last 50 files I've used.
>
> I had a few years to learn them...
>
> >>Or, actually, F10 (my shortcut for C-x b) nam<TAB>fi<TAB>c RET, using
> >>completion.
> >>
> >>>3. What projects do you work on that have 1000 files?
> >>
> >>Automotive infotainment, from low level (boot loader) through the whole
> >>stack (drivers, file system, codec), to high level (HMI adaption).
> >
> > That's a much larger system than any I've worked on. Even the flight
> > control system I worked on had only about 50-60 files---but we used
> > pretty standard drivers for everything and depended on the OS (linux) to
> > provide those, so we didn't have to recompile them.
>
> What adds to my file count is that being a low-level guy, I get to debug
> the low-level problems the other developers have. Which means, over
> time, kernel and driver sources accumulate in my environment :-)
I feel your pain. Luckily, I generally only have to solve my own problems.
> >>>4. How much system resources does it take to manage those 1000 files?
> >>
> >>The emacs process has a memory footprint of about 50-100 megabytes,
> >>containing all those files, and their undo data for the last half year.
> >
> > Does the EMACS process have all the file structures, or does the OS have
> > them and EMACS just keeps file handles?
>
> It loads everything into RAM and then releases the file handles. It
> processes the files on first use (syntax coloring etc.).
That's the way it should be done.
That mirrors a question I had a few years ago. I asked why a data reduction program was reading bytes and words from a file instead of reading the whole file (a few hundred megabytes) into memory and processing the data using a pointer to the data. A PC will give you hundreds of megabytes of data memory if you ask nicely---why should we be content to process data a few bytes at a time through a file handle?

Having gigabytes of RAM to handle tasks has certainly changed the paradigm from the time when we had tens of megabytes of disk storage and memory usage was measured in blocks of 64 KBytes. Alas, many tools and operating systems still think of processing data via file handles and KBytes of memory.

Mark Borgerson
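A minimal C sketch of the whole-file approach Mark describes (the file name is hypothetical and error handling is reduced for brevity):

    #include <stdio.h>
    #include <stdlib.h>

    /* Read an entire file into one heap buffer, so the data can be
     * processed through a pointer instead of per-byte file reads. */
    static char *slurp(const char *path, size_t *len)
    {
        FILE *f = fopen(path, "rb");
        if (!f)
            return NULL;

        fseek(f, 0, SEEK_END);      /* find the size... */
        long size = ftell(f);
        rewind(f);                  /* ...and go back to the start */

        char *buf = (size > 0) ? malloc((size_t)size) : NULL;
        if (buf && fread(buf, 1, (size_t)size, f) != (size_t)size) {
            free(buf);
            buf = NULL;
        }
        fclose(f);
        if (buf)
            *len = (size_t)size;
        return buf;
    }

    int main(void)
    {
        size_t len;
        char *data = slurp("capture.dat", &len);  /* hypothetical file */
        if (!data)
            return 1;
        /* ... walk 'data' with pointer arithmetic, no file handle ... */
        free(data);
        return 0;
    }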
Walter Banks wrote:
> Stefan Reuther wrote:
>>Walter Banks wrote:
>>>Conformance testing is one way to establish with confidence the
>>>definition of a tool set. My surprise is that the FOSS community
>>>has not developed conformance testing tools, in part in response
>>>to their cost. The FOSS community has been making the argument for
>>>free tools for years.
>>
>>gcc comes with an extensive test suite; why does this one not count?
>>
>>I am actually quite confident in it, simply because it has failed me
>>less often than "the others".
>
> The GCC test suite is ad hoc; it tests some things and misses others.
> What it lacks is tests that can be directly linked back to specific
> requirements: tests whose names, for example, map back to standard
> reference numbers (section and paragraph numbering).
The fact that commercial compilers have bugs indicates that commercial conformance test suites also just test some things and miss others. So what's the point?

Recent compiler bugs I've encountered include one tool claiming that

    char foo[] = { "bar" };

is not valid C++. It is. But probably the test linked to §8.5.2(1) did not catch it. Another one is a compiler crashing on

    class a { };
    class b { };
    namespace {
        class c : public a, public b { };
    }

Probably they tested multiple inheritance, and anonymous namespaces, but it seems they didn't test the combination of both.

A systematic test suite is no silver bullet. And a suite that tests for previous problems is pretty useful. Whether it's more or less useful remains open to debate; I consider it good enough.

Stefan