EmbeddedRelated.com Forums

Modern debuggers cause bad code quality

Started by Oliver Betz December 2, 2014
On Friday, December 5, 2014 3:22:22 PM UTC-5, dp wrote:
> On 05.12.2014 г. 20:40, Les Cargill wrote:
[]
> >> It is the popularity growth, not the birth date. Then C does not
> >> prevent one from writing decent software, it only makes it more
> >> difficult -
> >
> > I don't think that's ... demonstrable in any reasonable fashion.
>
> It is obvious enough for me. The fact is that C tries to be a
> "universal assembler" as some people see it and
> it does it poorly (too abstracted from any machine model). There
> are a lot more details about my vpa which allow me to do things
> people just can't do in C which are way too lengthy for me to explain
> to myself let alone other people from the trade so I won't go
> into it, neither would any sane person want me to :-).
[]
> Exactly. This is the basic flaw of high level languages. Instead of
> dealing with text they deal with hieroglyphs - which is much less
> efficient than just using an alphabet and design your words en route
> to evolve the language to fit the whims of life.
Sounds like FORTH.
> The basic flaw of any (too) high level language is its lack of
> flexibility to adapt to an ever changing world. Sure changes are
> made and the phrasebooks get rewritten - but how is this comparable
> with adding just the new words to the dictionary and twisting the
> language without needing any "official" approval. Before the change
> happens years will have passed and gigatons of poor software will
> have been written (poor simply because the language was not up to date
> with reality).
>
No language will ever be able to keep up as you propose and still be used by many developers. Applications reflect the reality, the language only helps. []
> >
> >> And then there is my VPA (virtual processor assembly) which
> >> makes me more efficient by at least an order of magnitude than
> >> anyone who uses C when it comes to projects which take more
> >> than a month to program (before you ask my code is in the
> >> millions of lines, >50M sources over the past 20 years).
> >>
[]
> Dimiter
>
[]

I guess that is why you keep your language to yourself. What do your
customers do when you finish the project? Or is this the "code so
complicated I can never be fired" approach to programming?

Especially when I program under contract, my goal is to essentially work
myself out of the job. I have even had folks that took over my code
compliment me on its maintainability. It sounds like you take the
opposite approach. Good luck with that.

ed
On 05.12.2014 г. 23:12, Ed Prochak wrote:
> On Friday, December 5, 2014 3:22:22 PM UTC-5, dp wrote:
>> On 05.12.2014 г. 20:40, Les Cargill wrote:
> []
>>>> It is the popularity growth, not the birth date. Then C does not
>>>> prevent one from writing decent software, it only makes it more
>>>> difficult -
>>>
>>> I don't think that's ... demonstrable in any reasonable fashion.
>>
>> It is obvious enough for me. The fact is that C tries to be a
>> "universal assembler" as some people see it and
>> it does it poorly (too abstracted from any machine model). There
>> are a lot more details about my vpa which allow me to do things
>> people just can't do in C which are way too lengthy for me to explain
>> to myself let alone other people from the trade so I won't go
>> into it, neither would any sane person want me to :-).
> []
>> Exactly. This is the basic flaw of high level languages. Instead of
>> dealing with text they deal with hieroglyphs - which is much less
>> efficient than just using an alphabet and design your words en route
>> to evolve the language to fit the whims of life.
>
> Sounds like FORTH.
>
>> The basic flaw of any (too) high level language is its lack of
>> flexibility to adapt to an ever changing world. Sure changes are
>> made and the phrasebooks get rewritten - but how is this comparable
>> with adding just the new words to the dictionary and twisting the
>> language without needing any "official" approval. Before the change
>> happens years will have passed and gigatons of poor software will
>> have been written (poor simply because the language was not up to date
>> with reality).
>>
>
> No language will ever be able to keep up as you propose and still be
> used by many developers. Applications reflect the reality, the language
> only helps.
>
I have never said it is easy to learn to write good code using a lower level language. It is not easy to write good literature in English either, few people can learn the language that well. Yet English and other alphabet based languages are distinctly superior to hieroglyph based ones when it comes to literature.
>
> []
>>>
>>>> And then there is my VPA (virtual processor assembly) which
>>>> makes me more efficient by at least an order of magnitude than
>>>> anyone who uses C when it comes to projects which take more
>>>> than a month to program (before you ask my code is in the
>>>> millions of lines, >50M sources over the past 20 years).
>>>>
> []
>> Dimiter
>>
> []
>
> I guess that is why you keep your language to yourself. What do your
> customers do when you finish the project?
> Or is this the "code so complicated I can never be fired" approach to
> programming?
>
> Especially when I program under contract, my goal is to essentially
> work myself out of the job. I have even had folks that took over my
> code compliment me on its maintainability. It sounds like you take the
> opposite approach. Good luck with that.
>
I am sure you will find my vpa code easier to read than you would read
code written in C or whatever the high level language is that you are
most familiar with. There is no phrasebook you have to learn for years,
remember. Writing is another matter of course but maintaining code does
not take a lot of it.

Like I said earlier I am not trying to convince anyone. And now I must
really turn a not so small thing in my vpa into something I can finish
by tomorrow night so I will have to stop posting here and get back to
work :-).

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI             http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
On 12/5/2014 12:31 PM, Niklas Holsti wrote:
> On 14-12-05 20:45 , Ed Prochak wrote:
>> A bad programmer writes bad code in any language.
>
> A good language can guide "bad" programmers towards better habits.
+1

But, it can't prevent the programmer from abusing the language (in ways
that the language designers may never have imagined) and still cranking
out garbage.

Any tool/policy/process (ideally) wants to make "doing it right"
(whatever that means) EASIER than "doing it wrong". So, experience
trains the user to take the path of least resistance FOR HIS OWN
INTERESTS.
Ed Prochak <edprochak@gmail.com> writes:
> THANKS. I look forward to their updates. > (data past 1994, and the other study on C++)
I think after the mid 1990's, the role in the regular IT industry where Ada might have displaced a lot of C and C++ was instead filled by Java. Ada was relegated to a defense/aerospace realtime niche. But here's an old article comparing C++ to Ada-95: http://adahome.com/Ammo/cpp2ada.html
On 05/12/14 20:31, Niklas Holsti wrote:
> On 14-12-05 20:45 , Ed Prochak wrote:
>> I always viewed C as the universal assembly language.
>
> It may have been that, in the past, before the standardisation and
> before the compilers became ambitious about optimisation and code speed.
> Nowadays, standard C has more "gotchas" and hard-to-remember rules than
> a typical real assembly language (reference: recent discussions on
> comp.arch about gcc "miscompiling" typical C programs, because gcc
> assumes that C code with undefined behaviour, per the standard, can do
> anything.)
>
I haven't seen the thread in comp.arch, but do you have any particular
situations in mind?

And what do you suggest gcc /should/ do about undefined behaviour? Make
wild guesses about what it thinks the user actually intended?

In some cases, the compiler will allow the user to define the behaviour -
such as by compiler flags that make signed integers overflow as two's
complement (even though code will almost never use such a "feature", and
changing it reduces some optimisation opportunities in good code). In
many other cases, undefined behaviour is fairly obvious if the programmer
thinks about it (and programmers /should/ think!) - dividing by zero is
undefined, so the compiler can assume that you don't care what will
happen if you try it.
On 12/5/2014 3:12 AM, Oliver Betz wrote:
> rickman wrote:
>
> [...]
>
>>>>> In the early days of embedded computing, most embedded developers
>>>>> could use a TTY interface at best and instrumented the code with some
>>>>> print statements if something went wrong.
>>>>
>>>> What do you mean "early days"? That still works for me.
>>>
>>> it's often (not always) inefficient compared to on-chip-debugging.
>>
>> Define inefficient. I can do the TTY thing with the absolute minimum of
>> hardware and nearly no supporting software. How exactly is that
>
> SWD / JTAG / BDM / whatever debugging is usually "for free" if you use
> this interface also for production programming. Otherwise it has the
> same hardware cost as TTY. And it gives you extensive access to your
> system without the need of instrumenting your code.
"Not instrumenting your code" is a red herring. It can take me as much
if not more time to configure a debugger as adding some output code to
my program.

I don't get the "for free" part. There is always a *special* debugger
pod that requires a unique connector on the board... a connector which
is virtually never used in the intended app of the unit, while a serial
port can serve double duty in minimal hardware designs or a built in
display can serve double duty.
> Consider also automated testing with original binaries, no > instrumentation.
Not sure what you mean. The debugger interface can be used in production
for programming the Flash, but my experience has been that production
people don't want to deal with design tools in production.

--
Rick
Yes, it is sad that the language "required" for American government contracts is popular mainly in Europe.

Maybe we should start another thread about languages and how they seem to be picked by popularity rather than by applicable features.

Hi David,

On 12/6/2014 8:07 AM, David Brown wrote:
> On 05/12/14 20:31, Niklas Holsti wrote:
>> On 14-12-05 20:45 , Ed Prochak wrote:
>>> I always viewed C as the universal assembly language.
>>
>> It may have been that, in the past, before the standardisation and
>> before the compilers became ambitious about optimisation and code speed.
>> Nowadays, standard C has more "gotchas" and hard-to-remember rules than
>> a typical real assembly language (reference: recent discussions on
>> comp.arch about gcc "miscompiling" typical C programs, because gcc
>> assumes that C code with undefined behaviour, per the standard, can do
>> anything.)
>
> I haven't seen the thread in comp.arch, but do you have any particular
> situations in mind?
>
> And what do you suggest gcc /should/ do about undefined behaviour? Make
> wild guesses about what it thinks the user actually intended?
Exactly. This is the argument I use with clients who don't want to
specify how something should behave in a particular set of circumstances.

"You want me to read your mind? Heck, why don't I just do what's EASIEST
for me? Even if that's not what you would have wanted had you taken the
time to consider the behavior that you WANTED, here!"

Really. Think about it. You're telling the compiler (in your code) to do
"something"... but, how is it supposed to know what that "something" is --
when CONTRACTUALLY you've opted to do something that is not defined?

Gee, what a great catch-all for BUGS! "It's not MY fault the code is
misbehaving! It didn't read my mind properly!!"
> In some cases, the compiler will allow the user to define the behaviour -
> such as by compiler flags that make signed integers overflow as two's
> complement (even though code will almost never use such a "feature", and
> changing it reduces some optimisation opportunities in good code). In
> many other cases, undefined behaviour is fairly obvious if the programmer
> thinks about it (and programmers /should/ think!) - dividing by zero is
> undefined, so the compiler can assume that you don't care what will
> happen if you try it.
Dimiter_Popoff wrote:
> On 05.12.2014 г. 20:40, Les Cargill wrote:
>> Dimiter_Popoff wrote:
>>> On 05.12.2014 г. 15:41, Les Cargill wrote:
>>>> Dimiter_Popoff wrote:
>>>>> On 04.12.2014 г. 12:51, Oliver Betz wrote:
>>>>>> Paul E Bennett wrote:
>>>>>>
>>>>>> [...]
>>>>>>
>>>>>>>> Could it be that today's sophisticated tools lead to more "try and
>>>>>>>> error", less thinking before doing?
>>>>>>>
>>>>>>> Talk about cats amongst pigeons.
>>>>>>
>>>>>> causing the foreseeable defensiveness.
>>>>>>
>>>>>> [...]
>>>>>>
>>>>>>> Errors that creep into projects are quite language and technology
>>>>>>> agnostic.
>>>>>>
>>>>>> Ganssle presented numbers: 50..100 errors/KLOC in C, 5..10 in ADA,
>>>>>> zero with SPARK.
>>>>>
>>>>> It is the language, not the rest of the toolchain.
>>>>> "C" is the major contributor to the decline in software quality (where
>>>>> there was some quality to decline of course).
>>>>
>>>> That's odd, since 'C' has been there since... well, the start. How
>>>> can a thing-that-has-not-changed be the cause of decline? Some
>>>> massive lag? Changes in the populations of practitioners?
>>>
>>> It is the popularity growth, not the birth date. Then C does not
>>> prevent one from writing decent software, it only makes it more
>>> difficult -
>>
>> I don't think that's ... demonstrable in any reasonable fashion.
>
> It is obvious enough for me. The fact is that C tries to be a
> "universal assembler" as some people see it and
> it does it poorly (too abstracted from any machine model). There
> are a lot more details about my vpa which allow me to do things
> people just can't do in C which are way too lengthy for me to explain
> to myself let alone other people from the trade so I won't go
> into it, neither would any sane person want me to :-).
>
That's just not been my experience. Since the mid 80s, I can count the number of times I've felt like going to assembly on one hand. <snip>
>> Then I am not sure what to tell you - the idioms of 'C' are
>> a pretty lengthy thing. I have committed many of the patterns to
>> memory over 25 years but not all of them.
>
> Exactly. This is the basic flaw of high level languages. Instead of
> dealing with text they deal with hieroglyphs - which is much less
> efficient than just using an alphabet and design your words en route
> to evolve the language to fit the whims of life.
But there really is a problem using "English like" words. COBOL went that
way and, while not exactly deprecated, isn't widely used outside of, say,
banks. Seems like punctuation marks are pretty useful.
> The basic flaw of any (too) high level language is its lack of
> flexibility to adapt to an ever changing world. Sure changes are
> made and the phrasebooks get rewritten - but how is this comparable
> with adding just the new words to the dictionary and twisting the
> language without needing any "official" approval. Before the change
> happens years will have passed and gigatons of poor software will
> have been written (poor simply because the language was not up to date
> with reality).
>
I personally do not find this an impractical limitation.
>>>>
>>>>> The thing is, their novels get sold simply because the general
>>>>> public can't even use a phrasebook.
>>>>> And this happened mainly because x86 entered the scene widely,
>>>>> made assembly programming impractical with its messy programming
>>>>> model etc.
>>>>>
>>>> I wrote more assembly language in x86 than in any other architecture.
>>>> You want something to wreck things? Try assembly.
>>>
>>> This explains why you see assembly as something impractical.
>>
>> I don't.
>
> OK, your previous post left me with the impression you did, I must
> have misunderstood you.
>
I feel like 'C' is a better choice. The set of programmers for it is larger and it's modestly more expressive.
>>
>>> There is no such thing as "assembly" language really, there
>>> are worlds of a difference between this or that "assembly".
>>
>> They're all essentially the same. There is a narcissism of small
>> differences.
>
> Well if this is "the same" the way all human languages are "the same",
> I could agree. Only if so.
>
Ah - well, it takes some digging and you have to be prepared to ignore differences that are smaller :) but all human languages can be arranged in a tree structure. Turns out there might be more in common than in difference. Differences tend to be things added after a population moved to a different place and the language evolved.
>>
>>> And then there is my VPA (virtual processor assembly) which
>>> makes me more efficient by at least an order of magnitude than
>>> anyone who uses C when it comes to projects which take more
>>> than a month to program (before you ask my code is in the
>>> millions of lines, >50M sources over the past 20 years).
>>>
>>
>> Those projects are arguably too large. An old saying is "by the time
>> you get a million lines of FORTRAN to compile, you no longer
>> care what it was supposed to do."
>
> If a project which takes over a month of programming is "too large"
> in your book then OK, I will agree with you that copying this and
> that and putting something together in a week or two is better done
> using a high level language, yes.
You won't get to a million lines in a month. And ten times what you get
in a month won't take ten months; it'll likely take more - complexity is
arguably O(n^2) or O(n log n) in the number of lines - using the term
"complexity" to approximate cost.
> But one month of programming is nowhere near what is a "large project" > in my book. >
Nor mine. I just mean that a project should be small enough to make
verification and validation tractable problems. If you have, say, a Linux
distro which is several hundred thousand or millions of lines, it's
actually multiple smaller projects bundled together.
> Dimiter
>
> ------------------------------------------------------
> Dimiter Popoff, TGI             http://www.tgi-sci.com
> ------------------------------------------------------
> http://www.flickr.com/photos/didi_tgi/
>
--
Les Cargill
rickman wrote:

[...]

>>>> it's often (not always) inefficient compared to on-chip-debugging.
>>>
>>> Define inefficient. I can do the TTY thing with the absolute minimum of
>>> hardware and nearly no supporting software. How exactly is that
>>
>> SWD / JTAG / BDM / whatever debugging is usually "for free" if you use
>> this interface also for production programming. Otherwise it has the
>> same hardware cost as TTY. And it gives you extensive access to your
>> system without the need of instrumenting your code.
>
> "Not instrumenting your code" is a red herring. It can take me as much
> if not more time to configure a debugger as adding some output code to
> my program.
It takes seconds to get a live display of dozens of variables, including structs and arrays. Hardly possible by "adding output code" and likely a runtime execution speed problem. BTW instrumenting code is usually more invasive than background debug.
> I don't get the "for free" part. There is always a *special* debugger
> pod that requires a unique connector on the board... a connector which
cheap these days, e.g. Segger J-Link comes with free eval boards. Better adapters starting below 1000EUR.
> is virtually never used in the intended app of the unit while a serial
The port is usually necessary for in-system-programming.
> port can serve double duty in minimal hardware designs or a built in
> display can serve double duty.
>
>> Consider also automated testing with original binaries, no
>> instrumentation.
>
> Not sure what you mean. The debugger interface can be used in
A scriptable debugger enables automated tests in the target hardware.
> production for programming the Flash, but my experience has been that
> production people don't want to deal with design tools in production.
Production people use production tools but the same "connector" on the
target. Can be just four <1mm diameter test pads. Add two holes beneath
and something like http://www.tag-connect.com/

Oliver
--
Oliver Betz, Munich
http://oliverbetz.de/