EmbeddedRelated.com
Forums

Fundamental C question about "if" statements

Started by Oltimer September 20, 2015
On 22/09/15 23:41, Tim Wescott wrote:
> On Tue, 22 Sep 2015 23:01:14 +0200, David Brown wrote:
>
>> On 22/09/15 19:40, Tim Wescott wrote:
>>> On Tue, 22 Sep 2015 08:53:28 +0200, David Brown wrote:
>>>
>>>> On 21/09/15 19:24, Tim Wescott wrote:
>>>>
>>>>> But it is a point -- who IS the Egyptian god of computing?
>>>>
>>>> I don't know that one - but some candidates could be:
>>>>
>>>> Am-heh – A dangerous underworld god
>>>> Apep – A serpent deity who personified malevolent chaos
>>>> Heka – Personification of magic
>>>>
>>>> (I justify including Heka on the list from what customers expect of us.)
>>>>
>>>> When I develop my own programming language to be the perfect replacement for C and all its quirks, I shall call it Apep.
>>>
>>> There's a large population of programmers out there who believe it's been done -- they call the language "Ada".
>>>
>>> I've never programmed in Ada, but I've debugged it. We had a multi-million dollar contract stall because our box (programmed in C) and the prime's box (programmed in Ada) could not talk to each other. We went back and forth on it for months, until finally the prime decided that they needed to fly half a dozen people all the way across the continent to smack us stupid Oregonians until we saw the light and fixed our bug.
>>>
>>> I ended up getting sucked into it at the last minute (in retrospect I think it was because the project manager on our side knew that, when handled right, I can act like a demented terrier who's just coming down with second stage rabies -- and mange). So there I am, in a meeting room with half a dozen software engineers from two companies, plus half a dozen sales and project management types to lend weight and credence to the whole "this is serious business" air of the thing.
>>>
>>> Two things became apparent to me, in the order that I present them here.
>>> One, these weren't just Ada programmers, they were (as many Ada programmers seem to be) devout members of the Ada cult. Their simple rule for localizing bugs in a project that contained Ada code and C code was that the bug had to be in the C code -- end of story. On top of that, they had a complete and completely contemptuous ignorance of C -- trying to actually show them code was roughly equivalent to a devout pagan trying to explain what Aphrodite was really about to Jimmy Swaggart in his pre-caught days.
>>>
>>> Two, the bug had to be in the Ada code. It came about because things had devolved into a very genteel argument, with the Disciples of Ada adamantly insisting that all problems are C problems, with their management team backing them up and making veiled threats to our management team, and our management team fighting a courageous rearguard action.
>>>
>>> While this all was going on I sort of disengaged and started reading their code, line by line. Now, the problem was that a message was getting bit-reversed. I won't go into all the details, but eventually I saw the following lines:
>>>
>>> <something or other> this_message[0..31];
>>> <something or other> that_message[1..31];
>>> <something or other> trouble_message[31..0]
>>>
>>> So right in the middle of the discussion, which had been growing ever more heated in an ever more quiet and genteel way, I blurted "Hey! I think I found it!"
>>>
>>> You may imagine the sorts of looks I got.
>>>
>>> I explained the whole 0..31 vs. 31..0 business, and the Ada people _would not listen_ -- because Ada code is automatically bug-free, right? Particularly when you've got C code in the vicinity acting like fresh, extra-sticky fly paper.
>>>
>>> I tried explaining again -- have you ever tried to hold an intelligent conversation with a rock? A pissed-off rock? I got nowhere.
>>> So I had to resort to intellectual violence: "Hey, I know that I'm just a dumb-ass C programmer, and moreover that I'm Oregon born and bred. So could you Really Smart Ada people 'splain this here feature of your ever-so-wonderful language to me, in short words?"
>>>
>>> Then I went over it: "So this line says zero to thirty-one. And THIS line says 1 to thirty one." Then, resisting the urge to drool a bit, "and THIS HERE line says thirty one to 0 and DAMN but stupid little old me just can't unnerstand what it all MEANS!"
>>>
>>> At which point their chief Ada programmer and High Priestess of Blessed Code actually LOOKED at her code, slammed her printout on the table, and stomped out of the room.
>>>
>>> So, anyway -- Ada, because it's always bug free, and infinitely better than C in every possible way.
>>
>> /My/ language won't allow bugs like that - that would be a compile-time error!
>>
>> The lesson from your story is, of course, that one can write bugs in any language. Ada has features that can help write correct code, and it reduces the risk of some errors that are not uncommon in C - but a good Ada programmer and a good C programmer, sticking to a good coding standard, will produce similar quality code. (C++ has many features that can help write good code, and many features that can help write appallingly bad code - but when you know what features to use, it can give solid results.)
>
> My favorite quote about C vs. C++ is that C gives you lots of rope. C++ gives you lots of rope, and, in a few places in the STL, some pre-tied nooses.
The weird thing about C++ is how it combines features for good, safe, strongly typed programming with some incredibly incoherent nonsense to achieve apparently simple things. Just for a laugh, I was trying to comprehend the "safe boolean idiom" recently - I believe I understand it roughly, but it is pretty mind-numbing stuff. (C++11 allowed conversion operators to be "explicit", making the whole thing redundant - C++11 really was a big step forward.)

Here's a couple of "shoot yourself in the foot" jokes:

Ada
If you are dumb enough to actually use this language, the United States Department of Defense will kidnap you, stand you up in front of a firing squad, and tell the soldiers, "Shoot at his feet." After correctly packaging your foot, you attempt to concurrently load the gun, pull the trigger, scream, and shoot yourself in the foot. When you try, however, you discover that your foot is of the wrong type. You scour all 156e54 pages of the manuals, looking for references to foot, leg, or toe; then you get hopelessly confused and give up. You sneak in when the boss isn't around and finally write the damn thing in C. You turn in 7,689 pages of source code to the review committee, knowing they'll never look at it, and when the program needs maintenance, you quit.

C
You shoot yourself in the foot. You shoot yourself in the foot and then nobody else can figure out what you did.

C++
You accidentally create a dozen instances of yourself and shoot them all in the foot. Providing emergency medical assistance is impossible since you can't tell which are bitwise copies and which are just pointing at others and saying, "That's me, over there."

Assembly
You try to shoot yourself in the foot only to discover that you must first invent the gun, the bullet, the trigger, and your foot. You crash the OS and overwrite the root disk. The system administrator arrives and shoots you in the foot. After a moment of contemplation, the system administrator shoots himself in the foot and then hops around the room rapidly shooting at everyone in sight. By the time you've written the gun, you are dead, and don't have to worry about shooting your feet. Alternatively, you shoot and miss, but don't notice. Using only 7 bytes of code, you blow off your entire leg in only 2 CPU clock ticks.

Python
You shoot yourself in the foot and then brag for hours about how much more elegantly you did it than if you had been using C or (God forbid) Perl. You create a gun module, a gun class, a foot module, and a foot class. After realizing you can't point the gun at the foot, you pass a reference to the gun to a foot object. After the foot is blown up, the gun object remains alive for eternity, ready to shoot all future feet that may happen to appear.
>> There is one good, clear justification for claiming that Ada leads to fewer bugs and higher quality code - there are no amateur Ada programmers. People learn and use Ada because they care about code quality and correctness. Sure, there can be corporate cultural or personal reasons for not doing a good job of it, but you can be confident that they will at least understand the importance of quality coding. In the C world, there are countless people (coders and managers) who just do not see the point of quality - they are happy with something that seems to work okay during a quick test. So there is nothing wrong with good C (or C++) development - it's the low-end that brings down the average.
>
> I think you're largely right. I see very few newbie/hobbyist types seeking to learn how to program PICs using Ada, but lots that just want to learn how to code in C.
>
> I wonder if there's a way to test good responsible C or C++ code quality vs. Ada code quality.
There have been various attempts at such studies. The problem is that doing it properly requires an enormous amount of money - you've got to find two teams of programmers, each with roughly equal proficiency and experience in its language, and set them off doing independent high-quality implementations of the same task. At a minimum, you'd want perhaps four people on each team, and several months of work. Then you'd want separate judges to review the code quality afterwards.
On 23/09/15 08:11, David Brown wrote:
> The weird thing about C++ is how it combines features for good, safe, strongly typed programming with some incredibly incoherent nonsense to achieve apparently simple things. Just for a laugh, I was trying to comprehend the "safe boolean idiom" recently - I believe I understand it roughly, but it is pretty mind-numbing stuff. (C++11 allowed conversion operators to be "explicit", making the whole thing redundant - C++11 really was a big step forward.)
See http://yosefk.com/c++fqa/

I'm particularly fond of the "const correctness" section, since it relates to the endless unresolved "casting away constness" discussions from the early 90s.
On 2015-09-22, Dimiter_Popoff <dp@tgi-sci.com> wrote:
>
> Oh come on, it is time we stop commenting on wanting to use a DIP processor. I have not used one since the 6809 days, i.e. since the 80s. There is no sensible reason to use one in whatever design.
>
There must still be a significant market for them, however.

I can walk into the Farnell trade counter and buy these things over the counter from stock. Surely PDIP MCUs (and other devices) would not be held as in-stock inventory unless there was still a market for them.

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
On Wed, 23 Sep 2015 08:16:26 +0200, Jacob Sparre Andersen wrote:

> Tim Wescott wrote:
>
>> I wonder if there's a way to test good responsible C or C++ code quality vs. Ada code quality.
>
> A paper was published on the subject 20 years ago (not the freshest data):
>
> http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf
>
> How engineering students fare with the two languages does not map directly to real-world projects, but the difference is quite extreme:
>
> http://static1.1.sqspcdn.com/static/f/702523/9458053/1290008042427/200008-McCormick.pdf
>
> Greetings,
>
> Jacob
Thanks Jacob -- that's informative.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
On 23.9.2015 г. 20:08, Simon Clubley wrote:
> On 2015-09-22, Dimiter_Popoff <dp@tgi-sci.com> wrote:
>>
>> Oh come on, it is time we stop commenting on wanting to use a DIP processor. I have not used one since the 6809 days, i.e. since the 80s. There is no sensible reason to use one in whatever design.
>>
>
> There must still be a significant market for them however.
>
> I can walk into the Farnell trade counter and buy these things over the counter from stock. Surely PDIP MCUs (and other devices) would not be held as in-stock inventory unless there was still a market for them.
>
> Simon.
I said there is no sensible reason to use them in a design; I never claimed there were no people doing things which do not make sense :D :D

Dimiter
On 2015-09-22, Tim Wescott <seemywebsite@myfooter.really> wrote:
> On Tue, 22 Sep 2015 23:01:14 +0200, David Brown wrote:
>> There is one good, clear justification for claiming that Ada leads to fewer bugs and higher quality code - there are no amateur Ada programmers. People learn and use Ada because they care about code quality and correctness. Sure, there can be corporate cultural or personal reasons for not doing a good job of it, but you can be confident that they will at least understand the importance of quality coding. In the C world, there are countless people (coders and managers) who just do not see the point of quality - they are happy with something that seems to work okay during a quick test. So there is nothing wrong with good C (or C++) development - it's the low-end that brings down the average.
>
> I think you're largely right. I see very few newbie/hobbyist types seeking to learn how to program PICs using Ada, but lots that just want to learn how to code in C.
This hobbyist (at least in the embedded world; I'm a commercial programmer by day) would love to use Ada for embedded stuff but finally gave up on that a few years ago and went back to using C.

The main problem is the compilers - there's a freely available C compiler for all embedded platforms/environments out there. This isn't true for Ada, which is a problem when you want to write some common libraries to run on a range of MCU architectures.

Even when there's a free compiler available, it might be based on the free ACT version of gcc. As the Ada runtime library for this version of gcc is under the GPL, your binaries become covered by the GPL as well.

The FSF version of gcc doesn't have this problem, but this compiler situation with Ada is a real mess.
> I wonder if there's a way to test good responsible C or C++ code quality > vs. Ada code quality. >
A related example that gets quoted in Ada circles is an article from about 15 years ago, when a university course used C and then Ada to complete a train modeling project:

http://archive.adaic.com/projects/atwork/trains.html

The interesting bit is towards the end, in the "Software" section. When the code was written in C, no team successfully implemented the minimum project requirements. When the code was written in Ada, about 50% of them did.

I don't have anything more recent to hand, sorry. That's unfortunate, because it would be interesting to see what the results would be today if the project was done again using the options available these days.

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
On 2015-09-23, Tim Wescott <seemywebsite@myfooter.really> wrote:
> On Wed, 23 Sep 2015 08:16:26 +0200, Jacob Sparre Andersen wrote:
>>
>> A paper was published on the subject 20 years ago (not the freshest data):
>>
>> http://archive.adaic.com/intro/ada-vs-c/cada_art.pdf
>>
>> How engineering students fare with the two languages does not map directly to real-world projects, but the difference is quite extreme:
>>
>> http://static1.1.sqspcdn.com/static/f/702523/9458053/1290008042427/200008-McCormick.pdf
>>
> Thanks Jacob -- that's informative.
I missed Jacob's posting before my reply. :-( Oh well. Now you have a text-only HTML version as well. :-)

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world
On Wed, 23 Sep 2015 17:55:19 +0000, Simon Clubley wrote:

> On 2015-09-22, Tim Wescott <seemywebsite@myfooter.really> wrote:
>> On Tue, 22 Sep 2015 23:01:14 +0200, David Brown wrote:
>>> There is one good, clear justification for claiming that Ada leads to fewer bugs and higher quality code - there are no amateur Ada programmers. People learn and use Ada because they care about code quality and correctness. Sure, there can be corporate cultural or personal reasons for not doing a good job of it, but you can be confident that they will at least understand the importance of quality coding. In the C world, there are countless people (coders and managers) who just do not see the point of quality - they are happy with something that seems to work okay during a quick test. So there is nothing wrong with good C (or C++) development - it's the low-end that brings down the average.
>>
>> I think you're largely right. I see very few newbie/hobbyist types seeking to learn how to program PICs using Ada, but lots that just want to learn how to code in C.
>>
> This hobbyist (at least in the embedded world; I'm a commercial programmer by day) would love to use Ada for embedded stuff but finally gave up on that a few years ago and went back to using C.
>
> The main problem is the compilers - there's a freely available C compiler for all embedded platforms/environments out there. This isn't true for Ada, which is a problem when you want to write some common libraries to run on a range of MCU architectures.
>
> Even when there's a free compiler available, it might be based on the free ACT version of gcc. As the Ada runtime library for this version of gcc is under the GPL, your binaries become covered by the GPL as well.
>
> The FSF version of gcc doesn't have this problem, but this compiler situation with Ada is a real mess.
That's very interesting, and relevant to me because I use the gnu toolchain in my work -- so Ada would be ruled out unless I wanted to start by making a run-time library (and, presumably, selling it to recoup my investment).
>> I wonder if there's a way to test good responsible C or C++ code quality vs. Ada code quality.
>>
> A related example that gets quoted in Ada circles is an article from about 15 years ago, when a university course used C and then Ada to complete a train modeling project:
>
> http://archive.adaic.com/projects/atwork/trains.html
>
> The interesting bit is towards the end, in the "Software" section. When the code was written in C, no team successfully implemented the minimum project requirements. When the code was written in Ada, about 50% of them did.
>
> I don't have anything more recent to hand, sorry. That's unfortunate, because it would be interesting to see what the results would be today if the project was done again using the options available these days.
>
> Simon.
I believe the original paper is here:

http://static1.1.sqspcdn.com/static/f/702523/9458053/1290008042427/200008-McCormick.pdf

(Thanks Jacob)

At least one of the points he noted about C makes me wonder if it isn't a biased instructor, rather than a bias between the languages, that was the primary cause of the problem. The point was in a section on why Ada was better than C, and reads:

"Representation clauses for device registers (record field selection rather than bit masks)"

But, unless you're using a horribly obsolete version of C, you can use bit fields in structures. And while this is, to some extent, excuse-making, even with horribly obsolete versions of C you can damned well hide all of your bit mask ugliness inside of nice tidy macros.

He also mentions that C lacks exception handling, which is a moderately good point -- but I've never felt comfortable using exception handling in an embedded environment, because in the mostly-headless stuff that I do, there's not much you can do with an exception other than shut down in a manner that mirrors your best guess at being safe.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Simon Clubley <clubley@remove_me.eisner.decus.org-Earth.UFP> wrote:

> The FSF version of gcc doesn't have this problem, but this compiler situation with Ada is a real mess.
Yup. I've built FSF GNAT for various targets: MIPS, ARM, x86, MSP430. No runtime though.

What target are you wanting?

Luke
On 2015-09-23, Tim Wescott <seemywebsite@myfooter.really> wrote:
>
> I believe the original paper is here:
>
> http://static1.1.sqspcdn.com/static/f/702523/9458053/1290008042427/200008-McCormick.pdf
>
> (Thanks Jacob)
>
Yes, I missed Jacob's original posting until I saw it quoted by you after I posted. :-(
> At least one of the points he noted about C makes me wonder if it isn't a biased instructor, rather than a bias between the languages, that was the primary cause of the problem. The point was in a section on why Ada was better than C, and reads:
>
> "Representation clauses for device registers (record field selection rather than bit masks)"
>
> But, unless you're using a horribly obsolete version of C, you can use bit fields in structures. And while this is, to some extent, excuse-making, even with horribly obsolete versions of C you can damned well hide all of your bit mask ugliness inside of nice tidy macros.
Good luck doing that in a reliable way on ARM with gcc when you are accessing device registers. If you access a bitfield in the lower 8 bits of a register, the gcc optimiser can turn that access into 8-bit ldrb/strb opcodes instead of 32-bit ldr/str opcodes, thereby breaking the program if your registers must be accessed in units of more than 8 bits.

Ada's not immune either, if you try accessing the registers directly using bitfields; I helped someone out in comp.lang.ada a few weeks ago with a similar issue.

BTW, I've got a couple of Ada Issues open which talk about this and the related issue of directly updating multiple bitfields as one Read/Modify/Write operation instead of as multiple RMW operations. It would be nice to be able to do that in C as well, instead of having to use an intermediate variable (as you currently have to do in Ada too).

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP
Microsoft: Bringing you 1980s technology to a 21st century world