EmbeddedRelated.com
The 2026 Embedded Online Conference

C# for Embedded ?...

Started by Chris November 4, 2016
On 05/11/16 09:17, Don Y wrote:
> What Ada brings to the table is specifically for the "big systems"
> that are too complex for "single brains" to accurately contain
Most brains involved in embedded programming cannot accurately contain many small problems, as I just pointed out elsewhere. Of course, having a better language guarantees nothing, but the choice of a better language is a good indication of awareness of the difficulty of even "small problems".
On 04/11/16 19:51, Paul Rubin wrote:
> Tom Gardner <spamjunk@blueyonder.co.uk> writes:
>> On 04/11/16 18:22, kalvin.news@gmail.com wrote:
>>> Some people might argue that C++ would be just fine,
>> Make sure such people can provide good answers ...
>> http://yosefk.com/c++fqa/
>
> That's an entertaining document but it mostly describes:
>
> 1) Hazards in C++ that are also present in C, so you don't escape from
> them by choosing C over C++.
>
> 2) Hazards of complicated C++ features that aren't used that frequently
> and that programmers can decide not to use in a given project.
>
> 3) Deficiencies in "old" (C++98) C++ that have been fixed in "modern"
> (C++11 and later) C++. C++11 was really a big improvement over earlier
> versions and I didn't get much interested in C++ til it came out.
>
> I've written tons of C and a (so far) smallish amount of "modern" C++
> and would consider both of them to be scary, but overall my current
> impression is that C++ is safer if you take #2 above into account.
>
> For example, in C++ it's idiomatic to say array.at(i) instead of
> array[i], and the .at() method on STL arrays checks that the subscript
> is in range (measurements I've done in my own programs have so far
> encountered almost no performance loss from this). In C, most people
> end up using unchecked subscripts. C++ RAII avoids a lot of
> dangling-resource bugs, etc. None of this has anything to do with C#,
> though.
>
> I've fooled with Ada a little bit and would consider it to be much safer
> than either C or C++, while being somewhere between the two of them in
> expressiveness, though much worse in terms of tooling. I can't possibly
> be saying "my language is better" in naming Ada, since my languages are
> C and C++, and I've only looked into Ada as a possible alternative to
> them.
>
> That C# allows escaping to unsafe operations doesn't seem worse to me
> than Java having a JNI. Even Haskell supports unsafe operations.
> Of course C# has other issues that make it not sound good for real time.
>
> Here are Bjarne Stroustrup's currently recommended C++ core guidelines,
> which spell out a much safer set of practices than I usually see in C
> programs:
>
> https://isocpp.org/blog/2015/09/bjarne-stroustrup-announces-cpp-core-guidelines
Your suggested actions w.r.t. C/C++ are ameliorations, not cures. But you know that. Even if (and it is a big if) they were a cure, how could you guarantee that /everybody/ (in whatever company) that produced any code that ends up in your product has fully followed the recommended practices? Using C/C++ for safety-critical systems is "building a castle on sand".

If you look back (to say 1993) in the archives, you will find cases where competent compiler writers have asked expert users (users with a record of knowing where the skeletons are likely to be buried) "what does the standard mean by X?" and "how do we simultaneously resolve the standard saying X and Y?". That doesn't warm the cockles of my heart.
Tom Gardner <spamjunk@blueyonder.co.uk> writes:
>> C++ core guidelines
>
> Your suggested actions w.r.t. C/C++ are ameliorations, not cures.
The guidelines basically say to use the current reasonably safe subset of C++ rather than the leftover legacy stuff. C++ itself can't get rid of the old stuff because old programs would break, but new code can avoid it straightforwardly.
> Even if (and it is a big if) they were a cure, how could you guarantee > that /everybody/ (in whatever company) that produced any code that > ends up in your product has fully followed the recommended practices.
1) Most (not all) of the guideline recommendations are statically machine checkable, and the guidelines discuss that for each item.

2) Critical code, even in Ada, usually has multi-person code reviews that should be able to spot departures from coding rules.

3) Stuff like excluding the use of certain runtime libraries can be enforced by replacing those libraries with versions that signal errors.

4) The same "amelioration" issues apply to C (MISRA guidelines for acceptable practices), Ada (SPARK subset and profile), etc.
> Using C/C++ for safety critical system is "building a castle on sand".
C and C++ these days are completely different languages, and Stroustrup likes to say "I take most uses of the compound C/C++ as an indication of ignorance." (http://www.stroustrup.com/bs_faq.html#C-slash)

The C++ FQA sometimes is used to support the view that C++ is more dangerous than C. Having used C++ for a while now, my current subjective impression is that if you follow the core guidelines and use traditional coding/debugging processes, C++ is safer than C, especially with modern (C++11 and later) dialects. Ada is probably safer than either. I definitely choked a little when I heard that the Tesla car and the SpaceX rocket are both programmed in C++.

C has better static analysis tools, like Frama-C, that I'd like to try using, which could be considered an advantage of C over C++. I haven't used them yet, though.
> "what does the standard mean by X?" and "how do we simultaneously
> resolve the standard saying X and Y?". That doesn't warm the cockles
> of my heart.
Yep. It's still like that. You can also see that type of thing in comp.lang.ada, so it's not C-specific.
On Fri, 4 Nov 2016 18:36:13 +0000, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

>On 04/11/16 16:07, George Neuner wrote:
>
>> C# is designed primarily to be a safe(r) version of C with the
>> addition of single inheritance objects [unlike C++, which has multiple
>> inheritance].
>
>No. It was designed as a response to Java's success, and
>was a traditional Microsoft attempt to keep developers on
>the Windows platform.
C# was designed to compete head-to-head with Java. They did this because Sun's lawsuit forced them to abandon J++. However, if you take a step back and look at the history, another picture emerges.

Managed C++ (the ancestor of C++/CLI and C#) was released just a few months after Java - proof that it was in development at the same time. When the Java (né Oak) project began, Microsoft already was working on "COM+", a project to unite COM and OLE with the runtimes of their popular managed languages: VisualBasic and FoxPro. When Java appeared, COM+ already supported VisualBasic, FoxPro and Managed C++. After Java appeared, the project morphed into the CLR and the aim became to (eventually) support all the Microsoft languages.

So it's true that Java arrived before C#. It's not true that the JVM arrived before (what became) the CLR.
>There are two, and only two, significant differences
>with Java: a different runtime philosophy ("ahead
>of time" vs "just in time"/Hotspot optimisation), and
>the addition of code marked "unsafe".
That isn't exactly true either. JVM does not directly support tail calls. CLR does directly support tail calls and so better supports languages that require them. F# is an example of this - there is no ML family language for JVM.
>You can work directly with hardware in Java
>using the JNI interface and C/assembler/ADA/etc.
C# does not require you to use anything but C#.
>OTOH, when I've needed to create clustered high
>availability soft realtime systems (e.g. telco),
>then Java is my language of choice.
The JVM is not bad, but I dislike Java the language and I avoid it whenever possible. There are better languages to use if you need to target the JVM. YMMV. I don't much like C# either, and I'm not particularly a fan of Microsoft, but I don't support bashing anyone with innuendo and half-truths.

George
George Neuner <gneuner2@comcast.net> writes:
> JVM does not directly support tail calls. CLR does directly support
> tail calls and so better supports languages that require them. F# is
> an example of this - there is no ML family language for JVM.
This is apparently a real issue, but I'm not sure why, since tail calls can be compiled into jumps. I wonder how Scala, Clojure, and Frege (all JVM functional languages) deal with it.
On 16-11-04 23:12 , Don Y wrote:
> On 11/4/2016 2:38 PM, Niklas Holsti wrote:
>> On 16-11-04 19:54 , Don Y wrote:
>>> "Big system language" and "memory requirements" and "runtime"
>>> don't preclude a particular choice. Consider Ada in your
>>> above assessment (hardly "small", economical, etc.).
>>
>> I challenge you to prove that facile slander of Ada. Ada-the-language is
>> certainly "larger" than C, but not larger than C++. Ada applications are
>> comparable in performance and memory needs to C applications. You can
>> prefer C if you like, but not for such reasons.
>
> Do you consider Ada a "SMALL system language"?
Certainly it is suitable for small systems. Remember, Ada was originally designed in the early 1980s, for applications including embedded systems, many of which were then quite small by current standards. The emphasis on embedded systems was so strong that some people believed that Ada wasn't good for anything else (IMO a false belief, of course).
> I'm not talking about
> the number of words/constructs in the language -- do you think it is
> designed with "small systems" in mind? Would you choose it to
> implement a microwave oven controller (a half page of "BASIC")?
If an Ada compiler were available, certainly. And a half page of BASIC would be a very simple microwave oven indeed (depending of course on how much of the functions, safety interlocks etc. are implemented in HW).
> Ada is inherently targeted towards "bigger systems" (bigger
> meaning more complex).
It was designed to *support* such systems -- that was often called "programming in the large". But those features of Ada -- in particular the strong modularity (packages) and the encapsulation and information hiding features -- are entirely source-level, compile-time features and imply no overhead at run time. One of the goals in the design of Ada was to avoid "distributed overhead" from language features that are not used in an application. If your application does not need multi-threading, fine, use a run-time system without tasking support. If you need only simple multi-threading, use a Ravenscar RTS, and so on.
> Why employ all those capabilities when
> your overall objective is a "small project"?
Small projects tend to grow, and to need maintenance, sometimes over a long time. Readability and maintainability are important and were also explicit design goals of the Ada language.
> Seriously, look at what it takes to implement YOUR microwave
> oven's controller
I've not seen the code in there, but I suspect that it is rather more than a half page of BASIC.

Still, this thread is (or was) not about microwave ovens, but about a safety-critical automotive application, which I suspect is larger than a microwave-oven controller.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi . @ .
On 11/4/2016 11:21 PM, Niklas Holsti wrote:
> On 16-11-04 23:12 , Don Y wrote:
>> On 11/4/2016 2:38 PM, Niklas Holsti wrote:
>>> On 16-11-04 19:54 , Don Y wrote:
>>>> "Big system language" and "memory requirements" and "runtime"
>>>> don't preclude a particular choice. Consider Ada in your
>>>> above assessment (hardly "small", economical, etc.).
>>>
>>> I challenge you to prove that facile slander of Ada. Ada-the-language is
>>> certainly "larger" than C, but not larger than C++. Ada applications are
>>> comparable in performance and memory needs to C applications. You can
>>> prefer C if you like, but not for such reasons.
>>
>> Do you consider Ada a "SMALL system language"?
>
> Certainly it is suitable for small systems. Remember, Ada was originally
> designed in the early 1980s, for applications including embedded systems,
> many of which were then quite small by current standards.
And assembly language is suitable for HUGE systems! As is COBOL, LISP, etc.

Would you *typically* advertise:

    Manufacturer of microwave ovens seeks Ada-fluent
    developer with N years experience?

And:

    Developer of nation-state missile defense systems
    seeks fluent ASM programmer?

You pick tools appropriate to the effort involved and the skillsets of the folks you expect to hire. E.g., if the local missile manufacturer goes out of business suddenly, you can probably pick up a slew of Ada developers "dirt cheap". But, barring that, you'd typically not ask for that expertise and have to retrain them in some other "commodity" language and development style.
> The emphasis on embedded systems was so strong that some people believed
> that Ada wasn't good for anything else (IMO a false belief, of course).
>
>> I'm not talking about
>> the number of words/constructs in the language -- do you think it is
>> designed with "small systems" in mind? Would you choose it to
>> implement a microwave oven controller (a half page of "BASIC")?
>
> If an Ada compiler were available, certainly. And a half page of BASIC
> would be a very simple microwave oven indeed (depending of course on how
> much of the functions, safety interlocks etc. are implemented in HW).
A microwave oven controller is a simple project. There are essentially four things that have to be done:
- keep track of time of day
- control the maggie's output power (duty-cycle modulate at just above DC)
- keep track of time remaining in the cook cycle
- update display/accept keypad input/overrides

Some might have a temperature probe (cook until a specific temperature is reached). Some might have shortcuts (defrost is power level P for time T). But the overall function is trivial.

[Note some have an electromechanical DIAL mechanism]
>> Ada is inherently targeted towards "bigger systems" (bigger
>> meaning more complex).
>
> It was designed to *support* such systems -- that was often called
> "programming in the large". But those features of Ada -- in particular
> the strong modularity (packages) and the encapsulation and information
> hiding features -- are entirely source-level, compile-time features and
> imply no overhead at run time.
>
> One of the goals in the design of Ada was to avoid "distributed overhead"
> from language features that are not used in an application. If your
> application does not need multi-threading, fine, use a run-time system
> without tasking support. If you need only simple multi-threading, use a
> Ravenscar RTS, and so on.
>
>> Why employ all those capabilities when
>> your overall objective is a "small project"?
>
> Small projects tend to grow, and to need maintenance, sometimes over a
> long time. Readability and maintainability are important and were also
> explicit design goals of the Ada language.
And many small projects are just one-shot deals. It is wiser to reuse *designs* than code. How much code will that next microwave inherit from the first?

"We want to allow the user to select a food TYPE to be defrosted..."
"We want to add a popcorn sensor..."
"We want to add a full graphic display so the user can surf the web..."
>> Seriously, look at what it takes to implement YOUR microwave
>> oven's controller
>
> I've not seen the code in there, but I suspect that it is rather more
> than a half page of BASIC.
Sit down and *write* the pseudo-code for YOUR microwave oven. Or your toaster. Or your HVAC thermostat. Or your furnace. Or your keyboard/mouse/etc. The number of "simple" designs out there greatly exceeds the number of designs that can significantly benefit from more capable languages -- especially when you consider the burdens those languages can impose on the unsuspecting developer (not all employers can afford or hire 'Experts').
> Still, this thread is (or was) not about microwave ovens, but about a
> safety-critical automotive application, which I suspect is larger than
> a microwave-oven controller.
No, the comment to which I replied was an utterance re: C#:

"So from what I can see, it's a big systems language in memory requirements for libraries and run time support. C# also uses dynamic memory allocation and garbage collection, another red line for this type of application."

The reference to the "application" was essentially an afterthought, therein. My reply, which you consider slander:

"'Big system language' and 'memory requirements' and 'runtime' don't preclude a particular choice. Consider Ada in your above assessment (hardly 'small', economical, etc.)."

I stand by that assertion.
Niklas Holsti <niklas.holsti@tidorum.invalid> writes:
> The emphasis on embedded systems was so strong that some people
> believed that Ada wasn't good for anything else (IMO a false belief,
> of course).
Famous quote by Larry Wall:

    Plus I remember being impressed with Ada because you could write
    an infinite loop without a faked up condition. The idea being
    that in Ada the typical infinite loop would normally be
    terminated by detonation.
        -- Larry Wall in <199911192212.OAA23621@kiev.wall.org>
> Still, this thread is (or was) not about microwave ovens, but about a
> safety-critical automotive application, which I suspect is larger than
> a microwave-oven controller.
It seems to me that microwave ovens these days would usually be programmed in C rather than BASIC, but Ada is perfectly well suited for almost anything C is suitable for. For that matter, the Arduino was built around AVR microcontrollers with 1K or 2K of RAM, and the "official" programming language for the Arduino environment was/is C++, which is even more complex than Ada.
On 11/4/2016 11:52 PM, Paul Rubin wrote:
>> Still, this thread is (or was) not about microwave ovens, but about a
>> safety-critical automotive application, which I suspect is larger than
>> a microwave-oven controller.
>
> It seems to me that microwave ovens these days would usually be
> programmed in C rather than BASIC, but Ada is perfectly well suited for
> almost anything C is suitable for. For that matter, the Arduino was
> built around AVR microcontrollers with 1K or 2K of RAM, and the
> "official" programming language for the Arduino environment was/is C++,
> which is even more complex than Ada.
Note that I'd not claimed microwave ovens WERE coded in BASIC but, rather, used BASIC as an example of how "complex (simple)" the application is.

(Too!) Many small applications are still coded in ASM in the belief that this gives better control over resources/application size. We have an MCU-controlled *toaster*; wanna bet the application is two pages of ASM?

    heating_element(ON)
    while (temperature < setpoint) {
        blink_LED()
    }
    eject_toast()

I'd wager the code in our furnace isn't much more complex -- and it's there primarily to report errors for troubleshooting (blinkenlites).

The complexity of the user interface seems to be a driving factor in the complexity of the codebase in many deeply embedded products. I.e., the control algorithms are simpler than the interactions with the user.
On 05/11/16 02:00, George Neuner wrote:
> On Fri, 4 Nov 2016 18:36:13 +0000, Tom Gardner
> <spamjunk@blueyonder.co.uk> wrote:
>
>> On 04/11/16 16:07, George Neuner wrote:
>>
>>> C# is designed primarily to be a safe(r) version of C with the
>>> addition of single inheritance objects [unlike C++, which has multiple
>>> inheritance].
>>
>> No. It was designed as a response to Java's success, and
>> was a traditional Microsoft attempt to keep developers on
>> the Windows platform.
>
> C# was designed to compete head-to-head with Java. They did this
> because Sun's lawsuit forced them to abandon J++.
Agreed. Don't forget that MS already had a reasonably good JVM and could have continued to offer Java, but decided not to on business grounds and out of hubris.
> However, if you take a step back and look at the history, another
> picture emerges.
>
> Managed C++ (the ancestor of C++/CLI and C#) was released just a few
> months after Java - proof that it was in development at the same time.
> When the Java (né Oak) project began, Microsoft already was working on
> "COM+", a project to unite COM and OLE with the runtimes of their
> popular managed languages: VisualBasic and FoxPro. When Java appeared,
> COM+ already supported VisualBasic, FoxPro and Managed C++. After Java
> appeared, the project morphed into CLR and the aim became to
> (eventually) support all the Microsoft languages.
My understanding is that COM+ is mainly a means of bolting together components and passing state (especially transactional state) between them. The components are usually much larger than a class, and can in theory be written in any language, provided that the /external/ semantics are preserved. That's very different in scope and objectives to the JVM.
> So it's true that Java arrived before C#. It's not true that JVM
> arrived before (what became) CLR.
Translation: something came before the JVM and turned into something else after the JVM arrived. Shrug :)

Most things are an evolution of what went before, and things are continually "repurposed" in the light of a changing environment. If you want to go down that route, you'll need to consider Smalltalk, Oak, ANDF and many other initiatives.

Java arrived and was remarkably usable very quickly. Within six months of its arrival, I was able to buy a library that instantly gave me interactive 2D and 3D charts/graphs - something that C++ didn't manage in 10 years!

C# and the CLR came years later, not that that is particularly important.
>> There are two, and only two significant differences
>> with Java: a different runtime philosophy ("ahead
>> of time" vs "just in time"/Hotspot optimisation), and
>> the addition of code marked "unsafe".
>
> That isn't exactly true either. JVM does not directly support tail
> calls. CLR does directly support tail calls and so better supports
> languages that require them. F# is an example of this - there is no
> ML family language for JVM.
There are many different languages that compile down to the JVM, with many different characteristics. Many are commercially important.
>> You can work directly with hardware in Java
>> using the JNI interface and C/assembler/ADA/etc.
>
> C# does not require you to use anything but C#.
That was primarily a business/marketing decision, based on the need to support historic code (an MS strength) and to claim ubiquitous applicability (poor, very debatable). Otherwise it is somewhat true, but the consequence is that you explicitly destroy all the valuable guarantees that the managed environment provides. That's a bad tradeoff, IMNSHO.
>> OTOH, when I've needed to create clustered high
>> availability soft realtime systems (e.g. telco),
>> then Java is my language of choice.
>
> The JVM is not bad, but I dislike Java the language and I avoid it
> whenever possible. There are better languages to use if you need to
> target the JVM. YMMV.
Depends on the problem at hand. IMNSHO, in the absence of any requirements and constraints, Java is the best application language. But people should always identify their requirements and constraints and use a screw, pop-rivet, nail, glue as appropriate.