On 16/10/14 10:32, Anders.Montonen@kapsi.spam.stop.fi.invalid wrote:
> Wouter van Ooijen <wouter@voti.nl> wrote:
>> Can you suggest more reasons, or sources (pages, articles, books, etc.)
>> that mention such reasons?
>
> For one perspective, have a look at the book "Real-Time C++: Efficient
> Object-Oriented and Template Microcontroller Programming" by Christopher
> Kormanyos. For most of the book he uses an AVR.
>
> The biggest hurdles to using C++ are, in my opinion, poor compiler support
> and the fact that it is simply a vastly more complex language than C. It
> is also still rapidly evolving, and the things you learned ten years ago
> might no longer be valid.

Almost all of my embedded programming is in C (with older stuff in assembly); I've only done a small amount of embedded C++ for specific customers. But I think now is a good time to re-evaluate that and look towards C++ for more embedded applications. There are three reasons for that: C++11 is a significantly better language than C++98, C++ compilers are significantly better than they used to be, and modern small micros (e.g., the Cortex-M3/M4) are more powerful and deal better with pointers than old small micros (such as the AVR).

Yes, C++ is evolving, and I think the recent step to C++11 has moved it forward significantly. For comparison, how many people even noticed the new features of C11?
reasons for preferring C over C++
Started by ●October 15, 2014
Reply by ●October 16, 2014
Reply by ●October 16, 2014
On 16.10.2014 at 18:19, Stefan Reuther wrote:
> If that is a problem, it is a consequence of the semantic distance to
> the machine, as you put it: if a single line of code can generate a few
> hundred machine instructions, have a lot of fun debugging what exactly
> caused the program to generate a SIGSEGV in the middle of that.

... and consider yourself a lucky bastard if the problem was as brutally evident as a SIGSEGV. Now try debugging a tiny, yet unacceptable, numerical inaccuracy in such an environment. Or a random, about-once-every-three-hours timing infraction that has been known never to happen when the debugger is attached to the program.

Among computing tools, the programming language C has been said to be the analogue of a surgical scalpel: it's deceptively small, while ultimately very powerful, and it makes a world of difference whether the person wielding it knows how to handle it or not. One user may save a patient's life with just a few deft cuts, while another will kill someone, most likely himself, even faster, with a single clumsy one.

If I try to extend that analogy in the direction of C++, my imagination brings up some nightmarish contraption like a handheld, large-diameter buzz saw with scalpel blades for teeth, a 20 kW engine and nitroglycerine for fuel: I can accept that it might be a very useful tool to someone who could manage to operate it safely, but I can't make myself believe that anyone ever could. Not even with all kinds of safeguard mechanisms added to the design.
Reply by ●October 17, 2014
Hans-Bernhard Bröker wrote:
> On 16.10.2014 at 18:19, Stefan Reuther wrote:
>> If that is a problem, it is a consequence of the semantic distance to
>> the machine, as you put it: if a single line of code can generate a few
>> hundred machine instructions, have a lot of fun debugging what exactly
>> caused the program to generate a SIGSEGV in the middle of that.
>
> ... and consider yourself a lucky bastard if the problem was as brutally
> evident as a SIGSEGV. [...]
>
> If I try to extend that analogy in the direction of C++, my imagination
> brings up some nightmarish contraption like a handheld, large-diameter
> buzz saw with scalpel blades for teeth, a 20 kW engine and
> nitroglycerine for fuel [...]

It was Chuck Moore who said about his FORTH language that it is an amplifier for the abilities of the programmer, the good as well as the bad. C++ is just the bigger hammer. Given to someone who knows what to do, it's simply the stronger tool; given to the wrong people, it creates the bigger disaster.
C++ has more features to misuse, and people who barely understand the basic concepts immediately jump on the newest feature. For good reason, MISRA and DO-178 do not allow many of the fancy features. I use C, C++ and assembler for safety-critical code. Used wisely, the features of C++ help to write clear, correct programs, and the overhead is negligible. The main reason not to use C++ is the unavailability of a decent compiler for smaller micros. And not everything is an ARM with megabytes of RAM and ROM; sometimes there is just 1k of RAM.

-- Reinhardt
Reply by ●October 17, 2014
Hans-Bernhard Bröker wrote:
> On 16.10.2014 at 18:19, Stefan Reuther wrote:
>> If that is a problem, it is a consequence of the semantic distance to
>> the machine, as you put it: if a single line of code can generate a few
>> hundred machine instructions, have a lot of fun debugging what exactly
>> caused the program to generate a SIGSEGV in the middle of that.
>
> ... and consider yourself a lucky bastard if the problem was as brutally
> evident as a SIGSEGV. Now try debugging a tiny, yet unacceptable,
> numerical inaccuracy in such an environment. Or a random,
> about-once-every-three-hours timing infraction that's been known never
> to happen when the debugger is attached to the program.

Last time I had that, it was in C, and it was fixed by -ffloat-store :-)

> Among computing tools, the programming language C has been said to be
> the analogue of a surgical scalpel: [...]
>
> If I try to extend that analogy in the direction of C++, my imagination
> brings up some nightmarish contraption like a handheld,
> large-diameter buzz saw with scalpel blades for teeth, a 20 kW engine
> and nitroglycerine for fuel: [...]

"C makes it easy to shoot yourself in the foot; C++ makes it harder, but when you do it blows your whole leg off." -- Bjarne Stroustrup

  Stefan
Reply by ●October 18, 2014
"Stefan Reuther" <stefan.news@arcor.de> wrote in message news:m1mmar.23s.1@stefan.msgid.phost.de...
> Wouter van Ooijen wrote:
>> A student of mine is looking for reasons that are put forward for
>> preferring C over C++ on small chips. (Note: all reasons, not just
>> *valid* reasons).
>>
>> So far he found
>>
>> - tooling & experience (no C++ available for my chip, no C++ experience)
>
> That would be the killer argument for me.

Me too. A few weeks ago I was put forward for a job requiring: extensive low-level embedded experience writing code in resource-limited environments, specifically ARM Cortex chipsets (tick, tick, tick).

The project turned out to be a "bionic" watch: no, not one that turns into a human being, one that monitors your biometrics. I thought "great, that sounds really me", and then they revealed "we are a hardcore C++ shop (because some of our other products are advantaged by that) and you're going to have to do an online test".

My C++ experience is limited to the embedded environment, and I'm up to speed on creating OO class structures but not much else. I don't do standard library stuff (just as I don't carry the C library manual around in my head), streams or containers. These things don't seem to have much place in low-level embedded code, but if you believe differently, feel free to explain why.

And guess what: the test was 80% on these types of things and, predictably, I failed. Looks like they are going to get the employee (well, freelancer) that they deserve.

tim
Reply by ●October 18, 2014
"tim....." <tims_new_home@yahoo.co.uk> writes:
> My C++ experience is limited to the embedded environment and I'm up to
> speed on creating OO class structures but not much else. I don't do
> standard library stuff (just as I don't carry the C library manual
> around in my head), streams or containers. These things don't seem to
> have much place in low-level embedded code, but if you believe
> differently, feel free to explain why.

If you mean STL containers, yes, they are useful in all but the smallest environments and (to a limited extent) even in those. They will do things like automatic resizing and range checking, and they allow you to use containers of multiple element types without having to paste code all over the place. OO, on the other hand, has become somewhat unfashionable in C++.

I think it's sufficient to just get a recent C++ book, as the STL is not really hard to use. cppreference.com is also pretty good. And if it was an online test you took, unless it said otherwise I'd think it was OK to use reference materials while taking it.
Reply by ●October 18, 2014
On 18/10/14 19:44, Paul Rubin wrote:
> "tim....." <tims_new_home@yahoo.co.uk> writes:
>> My C++ experience is limited to the embedded environment [...]
>
> If you mean STL containers, yes they are useful in all but the smallest
> environments and (to a limited extent) even in those. They will do
> stuff like automatic resizing and range checking, and they allow you to
> use containers of multiple types without having to paste code all over
> the place.

My knowledge is very out of date and probably plain wrong, but although /you/ don't have to make duplications in the source code, doesn't the /compiler/ have to expand them in the object code?

> OO on the other hand has become somewhat unfashionable in C++.

Probably because they've seen "objects done right" in other languages! :)
Reply by ●October 18, 2014
Tom Gardner <spamjunk@blueyonder.co.uk> writes:
>> use containers of multiple types without having to paste code all
>> over the place.
> but although /you/ don't have to make duplications in the
> source code, doesn't the /compiler/ have to expand them in
> the object code?

C++ templates do work like that, so you can indeed get bloat in the object code (just as if you'd manually duplicated the code, like you'd have to in C), but the source code becomes more uniform and maintainable. The bloat isn't a law of nature but rather reflects C++'s design goal of zero-overhead abstraction.

Other languages like Haskell can avoid the bloat by supporting polymorphic functions implemented by passing type info at runtime, taking a slight penalty in speed. I'm not sure how Ada handles this; Ada has generics, but I don't know how they work. There's also some generic selection in C11 that looked nice, though I don't remember any details by now.

>> OO on the other hand has become somewhat unfashionable in C++.
> Probably because they've seen "objects done right" in other
> languages! :)

No, I mean that at least among some PL geeks, OO has become unfashionable in general, not just in C++. They see it as a 1990s thing that didn't fulfill its promises. Smalltalk has faded into obscurity and Java is a post-Cobol Cobol, etc.
Reply by ●October 18, 2014
tim..... <tims_new_home@yahoo.co.uk> wrote:
(snip)
> A few weeks ago I was put forward for a job requiring:
>
> extensive low-level embedded experience writing code in resource-limited
> environments, specifically ARM Cortex chipsets (tick, tick, tick)
>
> The project turned out to be a "bionic" watch: no, not one that turns
> into a human being, one that monitors your biometrics.
>
> [...]
>
> My C++ experience is limited to the embedded environment and I'm up to
> speed on creating OO class structures but not much else. I don't do
> standard library stuff (just as I don't carry the C library manual
> around in my head), streams or containers. These things don't seem to
> have much place in low-level embedded code, but if you believe
> differently, feel free to explain why.

I believe that when doing a programming test you should either have access to library documentation, or the problems should not need it. Others might disagree, though.

> And guess what, the test was 80% on these types of things and,
> predictably, I failed.
>
> Looks like they are going to get the employee (well, freelancer)
> that they deserve.

There are way too many stories about interviewing in general having unreasonable expectations. Not that the people aren't good enough, but that the problems don't test the right thing.

-- glen
Reply by ●October 19, 2014
On 14-10-19 01:42, Paul Rubin wrote:
> Tom Gardner <spamjunk@blueyonder.co.uk> writes:
>>> use containers of multiple types without having to paste code all
>>> over the place.
>> but although /you/ don't have to make duplications in the
>> source code, doesn't the /compiler/ have to expand them in
>> the object code?
>
> C++ templates do work like that, so you indeed get bloat in the object
> code [...] I'm not sure how Ada handles this.
> Ada has generics but I don't know how they work.

The rules for Ada generics are specified so that generic units can either share code between instances, or can use the "macro" approach and generate specific code for each instance. Some compilers use shared code, others use the macro approach. I believe there have been some compilers that let the programmer choose which method to use, but I don't know if that can be done in any current compiler.

The GNAT compiler uses the macro approach (specific code for each instance). This tends to produce faster code, at the cost of more code, of course.

It is sometimes possible to divide generic Ada units into two parts: a non-generic part that has most of the complex code, and a small generic wrapper that contains the code that is duplicated (and separately optimised) for each instance. One way to do that is to use tagged-type polymorphism, similar to the Haskell way.

-- Niklas Holsti
   Tidorum Ltd
   niklas holsti tidorum fi . @ .







