
validity of ... reasons for preferring C over C++

Started by Nobody October 16, 2014
David Brown wrote on 17-Oct-14 11:03 PM:
> On 17/10/14 19:15, Niklas Holsti wrote:
>> On 14-10-17 13:44, David Brown wrote:
>>> On 17/10/14 11:35, Paul Rubin wrote:
>>>> David Brown <david.brown@hesbynett.no> writes:
>>>>> I disagree, in the embedded world. In PC programming, I agree - very
>>>>> often you are better with a language like Python than with C or C++.
>>>>
>>>> I have to think that even in MCU programming there have to be better
>>>> alternatives to C or C++. Ada seems like a possibility except that the
>>>> toolchains are either obscure or very expensive. There have been some C
>>>> dialects like Cyclone that were clever but never got traction. And I
>>>> guess there are some specialty EDSL's like Atom that aren't so good for
>>>> general purpose development. Forth is interesting but it's from a
>>>> different world, and still unsafe though with fewer traps than C.
>>>
>>> In theory, one could do much better than C or C++ - but not in practice.
>>> For some systems, Ada is a possible choice. But while Ada is "safer"
>>> than C in some ways, it has its own problems
>>
>> Care to list a few of those? Just to have a good debate? Been a while...
>
> My experience with Ada is rather limited, so you might end up teaching
> me rather than getting a good debate. But of course others will no
> doubt join in.
>
> An obvious (but highly subjective) irritation with Ada is the verbosity
> of the language - lots of things need repeated, and many of the
> constructs are more wordy than necessary.
But do you find it more difficult to *read* Ada because of this verbosity? (One of the basic rules of software engineering is that you must optimize for the reader, not for the author.)

> Using words rather than
> symbols is not necessarily a bad thing, within limits - C++ arguably
> relies too much on symbols, as anyone trying to read a lambda function
> will know.
Funny you mention lambdas in this context. I just finished giving a C++ course, which I revised to include C++11 features, including lambdas. I had never written any lambda until 4 months ago. They still feel strange to me, and my colleague that assisted in the course still thinks they are something totally weird. But for the students a lambda seems to be much easier to understand than exceptions or virtual methods. I think experienced programmers (myself included) are often much less eager to adopt a new feature than novices.

Wouter van Ooijen
On 17/10/14 19:29, Tom Gardner wrote:
> On 17/10/14 08:27, David Brown wrote:
>> If you don't know what your C++ code is doing, that's /your/ problem,
>> not the language's problem.
> Well yes. But some languages do have more obscured traps
> that require higher levels of skill and more continual alertness.
Sharp tools cut. What's your point?
On 18/10/14 00:14, Clifford Heath wrote:
> On 17/10/14 19:29, Tom Gardner wrote:
>> On 17/10/14 08:27, David Brown wrote:
>>> If you don't know what your C++ code is doing, that's /your/ problem,
>>> not the language's problem.
>> Well yes. But some languages do have more obscured traps
>> that require higher levels of skill and more continual alertness.
>
> Sharp tools cut. What's your point?
Read the rest of my posting, i.e. the bit you snipped. If you can't be bothered to do that you are no better than a troll. Other people have succeeded and we've had a reasonable discussion. Please don't spoil it unless you have something to contribute.
On 14-10-18 02:14 , Clifford Heath wrote:
> On 17/10/14 19:29, Tom Gardner wrote:
>> On 17/10/14 08:27, David Brown wrote:
>>> If you don't know what your C++ code is doing, that's /your/ problem,
>>> not the language's problem.
>> Well yes. But some languages do have more obscured traps
>> that require higher levels of skill and more continual alertness.
>
> Sharp tools cut. What's your point?
Even a sharp tool should have a comfortable and safe handle. You don't want the handle to have surprising sharp edges or to collapse if you hold it in the wrong way.

As a tool, a programming language has both a "handle" part, which is the rules and principles of the language, for the programmer to "hold" as she guides the tool by writing programs, and a "sharp edge" part, which is the running program, which changes the world around it and carves out the nuggets of desired results from the default chaos.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi . @ .
On 18/10/14 06:52, Niklas Holsti wrote:
> On 14-10-18 02:14, Clifford Heath wrote:
>> On 17/10/14 19:29, Tom Gardner wrote:
>>> On 17/10/14 08:27, David Brown wrote:
>>>> If you don't know what your C++ code is doing, that's /your/ problem,
>>>> not the language's problem.
>>> Well yes. But some languages do have more obscured traps
>>> that require higher levels of skill and more continual alertness.
>>
>> Sharp tools cut. What's your point?
>
> Even a sharp tool should have a comfortable and safe handle. You don't
> want the handle to have surprising sharp edges or to collapse if you
> hold it in the wrong way.
>
> As a tool, a programming language has both a "handle" part, which is the
> rules and principles of the language, for the programmer to "hold" as she
> guides the tool by writing programs, and a "sharp edge" part, which is
> the running program, which changes the world around it and carves out
> the nuggets of desired results from the default chaos.
Nicely put, but I hope this doesn't degenerate into a lot of hot air about dubious analogies!
Rob Gaddi wrote:

> The last time I even thought about using Ada, I got scared off by the
> size of the runtime you needed to go with it (tens of KB). Anyone
> know if that's still the case?
I'm not sure that is quite the case today. You can do actual "work" with GNAT for Mindstorms NXT, where you have the application and run-time for the ARM Cortex-M series in 64 KB. (The run-time may still account for a few tens of KB.) When I target the 8-bit Atmel AVR MCUs I usually work with a "zero-footprint run-time".

Greetings, Jacob
--
"I just might be wrong."
Paul Rubin <no.email@nospam.invalid> writes:

> I've heard of an Ada to C translator and I've been meaning to look
> into it. I guess it would be like the early implementations of C++
> that translated C++ to C. You'd translate your Ada code and then run
> it through a C compiler, using C as a portable assembler.
The (still working) Vermont Technical College CubeSat was developed that way - except that they actually developed in SPARK to ensure the absence of run-time errors.

Greetings, Jacob
--
"Hungh. You see! More bear. Yellow snow is always dead give-away."
On 17/10/14 23:44, Wouter van Ooijen wrote:
> David Brown wrote on 17-Oct-14 11:03 PM:
>> On 17/10/14 19:15, Niklas Holsti wrote:
>>> On 14-10-17 13:44, David Brown wrote:
>>>> On 17/10/14 11:35, Paul Rubin wrote:
>>>>> David Brown <david.brown@hesbynett.no> writes:
>>>>>> I disagree, in the embedded world. In PC programming, I agree - very
>>>>>> often you are better with a language like Python than with C or C++.
>>>>>
>>>>> I have to think that even in MCU programming there have to be better
>>>>> alternatives to C or C++. Ada seems like a possibility except that the
>>>>> toolchains are either obscure or very expensive. There have been some C
>>>>> dialects like Cyclone that were clever but never got traction. And I
>>>>> guess there are some specialty EDSL's like Atom that aren't so good for
>>>>> general purpose development. Forth is interesting but it's from a
>>>>> different world, and still unsafe though with fewer traps than C.
>>>>
>>>> In theory, one could do much better than C or C++ - but not in practice.
>>>> For some systems, Ada is a possible choice. But while Ada is "safer"
>>>> than C in some ways, it has its own problems
>>>
>>> Care to list a few of those? Just to have a good debate? Been a while...
>>
>> My experience with Ada is rather limited, so you might end up teaching
>> me rather than getting a good debate. But of course others will no
>> doubt join in.
>>
>> An obvious (but highly subjective) irritation with Ada is the verbosity
>> of the language - lots of things need repeated, and many of the
>> constructs are more wordy than necessary.
>
> But do you find it more difficult to *read* Ada because of this
> verbosity? (One of the basic rules of software engineering is that you
> must optimize for the reader, not for the author.)
Yes, I find it harder to read (but again, I stress my limited experience - any language is easier to use after more practice). When reading C, I find the syntax and the common identifiers contrast with function names, variables, and other identifiers, making it easier to see the structure of the code. Ada just seems to have too many words for my liking - it reads like a school essay.

There is some evidence suggesting that the "errors per line of code" rate is fairly independent of the programming language - and with all other things being equal (which they seldom are), a more compact language will have a lower bug rate than a more verbose one. I believe this is simply a matter of the amount of information that you can easily see and process at a time - this is why there is a common rule of keeping your functions shorter than one screenful.

Of course I agree with you that making code easy to read and understand is important - languages (and identifiers in the language) should not be made short just to save keystrokes.
>> Using words rather than
>> symbols is not necessarily a bad thing, within limits - C++ arguably
>> relies too much on symbols, as anyone trying to read a lambda function
>> will know.
>
> Funny you mention lambdas in this context. I just finished giving a C++
> course, which I revised to include C++11 features, including lambdas.
> I had never written any lambda until 4 months ago. They still feel
> strange to me, and my colleague that assisted in the course still thinks
> they are something totally weird.
I use lambdas regularly in Python - where they are defined using the keyword "lambda". So I am quite happy with the concept of lambda functions - I just think that the C++ syntax for them is going to take quite a while to get used to.
> But for the students a lambda seems to be much easier to understand than
> exceptions or virtual methods. I think experienced programmers (myself
> included) are often much less eager to adopt a new feature than novices.
>
> Wouter van Ooijen
On 17/10/14 23:11, Paul Rubin wrote:
> David Brown <david.brown@hesbynett.no> writes:
>> Ada programming encourages the use of user-defined types for all sorts
>> of things. You are not supposed to hold a "day" in an "int" or
>> "uint8_t", you are supposed to define "type Day_type is range 1 .. 31;".
>> Sometimes this sort of thing can make code clearer, but it
>> can also make it harder to see what is really going on in the program.
>
> This is called "typeful programming" (search on the phrase) and the idea
> is it helps the compiler catch errors in the code.
Yes, and it can sometimes be helpful in that way - but it can also mean that you need extra code to deal with the conversions, and that means extra scope for errors. It works both ways.
>> And because type conversions have to be explicit in Ada, you need to
>> add lots of them when using these types. C and C++ do more of this
>> automatically, giving clearer code.
>
> After using Haskell for a while, I've gotten to hate automatic
> conversions. They make the code obscure and scary compared to explicit
> conversions. I never know if some unwanted conversion is going on in
> the background. If I passed the wrong type by accident and the compiler
> goes and converts it automatically, what's the point of having types?
>
>> The run-time overhead in Ada can be an issue - the larger run-time
>> library, the run-time checks, exceptions (I don't like them in C++
>> either).
>
> I think this is a matter of what runtime profile you choose. Ada has
> lots of different profiles including some intended for small embedded
> processors that don't support fancier features like tasking (not sure
> about exceptions). Runtime checks for stuff like integer range errors
> are typically compile-time options.
OK. I haven't tried Ada on embedded systems, and haven't looked at this in detail.
>> there are fewer Ada tools, fewer Ada developers, fewer libraries,
>> RTOS's, network stacks, etc., less example code, and so on.
>
> True.
>
>> We program 8051 microcontrollers, type with QWERTY keyboards, use
>> Windows systems, eat at McDonalds, and listen to Britny Spears - even
>> though they are all hopelessly bad compared to alternatives, they
>> exist because they are popular.
>
> Saved to quote file :)
I hope you corrected the spelling of "Britney" before saving it! But it's nice to know I've written something of interest to someone - it's a rare thing on Usenet.
Jacob Sparre Andersen <jacob@jacob-sparre.dk> writes:
>> I've heard of an Ada to C translator
> The (still working) Vermont Technical College CubeSat was developed that
> way - except that they actually developed in SPARK to ensure the
> absence of run-time errors.
Do you know what Ada-to-C translation tools they used? What do the tools do about the Ada runtime? Thanks.