Dimiter_Popoff wrote:
> On 05.12.2014 г. 15:41, Les Cargill wrote:
>> Dimiter_Popoff wrote:
>>> On 04.12.2014 г. 12:51, Oliver Betz wrote:
>>>> Paul E Bennett wrote:
>>>>
>>>> [...]
>>>>
>>>>>> Could it be that today's sophisticated tools lead to more "try and
>>>>>> error", less thinking before doing?
>>>>>
>>>>> Talk about cats amongst pigeons.
>>>>
>>>> causing the foreseeable defensiveness.
>>>>
>>>> [...]
>>>>
>>>>> Errors that creep into projects are quite language and technology
>>>>> agnostic.
>>>>
>>>> Ganssle presented numbers: 50..100 errors/KLOC in C, 5..10 in ADA,
>>>> zero with SPARK.
>>>
>>> It is the language, not the rest of the toolchain.
>>> "C" is the major contributor to the decline in software quality (where
>>> there was some quality to decline of course).
>>
>> That's odd, since 'C' has been there since... well, the start. How
>> can a thing-that-has-not-changed be the cause of decline? Some
>> massive lag? Changes in the populations of practitioners?
>
> It is the popularity growth, not the birth date. Then C does not
> prevent one from writing decent software, it only makes it more
> difficult -

I don't think that's ... demonstrable in any reasonable fashion.

> and much easier to write messy such.

Sure. So don't do that :)

> People who have known what their compiler does - i.e. those who wrote
> the compiler - must have been able to write some good code using it.

I think I probably spent a total of six months - call it 1000 hours -
learning how to write good 'C' code.

I know people who use "more modern" toolchains who have ten times that
invested in them and still have problems. By the time you learn all of
C++, it will have morphed into something else.

>>> Nowadays people have no clue where the machine stack is, write
>>> IRQ handlers in C etc. etc. - in a way not dissimilar to writing
>>> novels in a language for which they need a phrasebook.
>>
>> We all need phrasebooks.
>
> Not all of us. I don't, for example.

Then I am not sure what to tell you - the idioms of 'C' are a pretty
lengthy thing. I have committed many of the patterns to memory over 25
years but not all of them.

>>> The thing is, their novels get sold simply because the general
>>> public can't even use a phrasebook.
>>> And this happened mainly because x86 entered the scene widely,
>>> made assembly programming impractical with its messy programming
>>> model etc.
>>
>> I wrote more assembly language in x86 than in any other architecture.
>> You want something to wreck things? Try assembly.
>
> This explains why you see assembly as something impractical.

I don't.

> There is no such thing as "assembly" language really, there
> are worlds of a difference between this or that "assembly".

They're all essentially the same. There is a narcissism of small
differences.

> And then there is my VPA (virtual processor assembly) which
> makes me more efficient by at least an order of magnitude than
> anyone who uses C when it comes to projects which take more
> than a month to program (before you ask my code is in the
> millions of lines, >50M sources over the past 20 years).

Those projects are arguably too large. An old saying is "by the time
you get N=a million lines of FORTRAN to compile, you no longer care
what it was supposed to do." There is an N ( doubtless larger ) for 'C'...

> Dimiter
>
> ------------------------------------------------------
> Dimiter Popoff, TGI http://www.tgi-sci.com
> ------------------------------------------------------
> http://www.flickr.com/photos/didi_tgi/

--
Les Cargill
Modern debuggers cause bad code quality
Started by ●December 2, 2014
Reply by ●December 5, 2014
Reply by ●December 5, 2014
I always viewed C as the universal assembly language. And I disagree that it is more error prone than other languages. A bad programmer writes bad code in any language.
Reply by ●December 5, 2014
On 14-12-05 20:45 , Ed Prochak wrote:
> I always viewed C as the universal assembly language.

It may have been that, in the past, before the standardisation and
before the compilers became ambitious about optimisation and code speed.
Nowadays, standard C has more "gotchas" and hard-to-remember rules than
a typical real assembly language (reference: recent discussions on
comp.arch about gcc "miscompiling" typical C programs, because gcc
assumes that C code with undefined behaviour, per the standard, can do
anything).

> And I disagree that it is more error prone than other languages.

Have you read the Zeigler study "Comparing development costs of C and
Ada"? Strong suggestion that C is more error-prone, and C programs are
harder to repair, than Ada. Documents a gradual switch from C to Ada
within a company, same development procedures, same developers, reduced
bug rate and repair time with Ada.

http://archive.adaic.com/intro/ada-vs-c/cada_art.html

> A bad programmer writes bad code in any language.

A good language can guide "bad" programmers towards better habits.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
. @ .
Reply by ●December 5, 2014
On 12/5/2014 11:45 AM, Ed Prochak wrote:
> I always viewed C as the universal assembly language. And I disagree that it
> is more error prone than other languages. A bad programmer writes bad code
> in any language.

+42

It's all about process.
Reply by ●December 5, 2014
On Friday, December 5, 2014 2:31:21 PM UTC-5, Niklas Holsti wrote:
> On 14-12-05 20:45 , Ed Prochak wrote:
>> I always viewed C as the universal assembly language.
>
> It may have been that, in the past, before the standardisation and
> before the compilers became ambitious about optimisation and code speed.
> Nowadays, standard C has more "gotchas" and hard-to-remember rules than
> a typical real assembly language (reference: recent discussions on
> comp.arch about gcc "miscompiling" typical C programs, because gcc
> assumes that C code with undefined behaviour, per the standard, can do
> anything.)

Well, it is undefined behavior. :)

>> And I disagree that it is more error prone than other languages.
>
> Have you read the Zeigler study "Comparing development costs of C and
> Ada"? Strong suggestion that C is more error-prone, and C programs are
> harder to repair, than Ada. Documents a gradual switch from C to Ada
> within a company, same development procedures, same developers, reduced
> bug rate and repair time with Ada.
>
> http://archive.adaic.com/intro/ada-vs-c/cada_art.html

Yes, I will read it. I like Ada and wish I could use it more often. In
fact I may propose it when we see about changing the host OS on our next
product. It does help.

>> A bad programmer writes bad code in any language.
>
> A good language can guide "bad" programmers towards better habits.

I guess there we will have to agree to disagree.

And to keep this on the debugging topic: I have a general rule which you
could say applies to testing. It is not how well it works when it works;
it is how well it works when it doesn't work that matters. IOW, how you
handle the error conditions can make or break a product. I'm working in
the medical devices field now and I always keep Therac-25 in the back of
my mind when designing solutions.

A minor corollary: race conditions are never solved by sleep().
Reply by ●December 5, 2014
Niklas Holsti <niklas.holsti@tidorum.invalid> writes:

>> I always viewed C as the universal assembly language.
>
> It may have been that, in the past, before the standardisation and
> before the compilers became ambitious about optimisation and code
> speed. Nowadays, standard C has more "gotchas" and hard-to-remember
> rules than a typical real assembly language

But the standard hasn't changed: most behaviours that are undefined now
have always been undefined. In the era of less aggressive optimizing
compilers, people wrote code all the time intending a particular
behaviour that was actually undefined; the compiler happened to do what
the user expected, so the user thought the code wasn't buggy. These
days, with today's compilers, that same code results in nasal demons,
just as the standard specified all along.

If C is an assembly language, it's an extremely treacherous one.
Reply by ●December 5, 2014
Hi Dimiter,

On 12/5/2014 6:33 AM, Dimiter_Popoff wrote:
> On 04.12.2014 г. 12:51, Oliver Betz wrote:
>>>> Could it be that today's sophisticated tools lead to more "try and
>>>> error", less thinking before doing?
>>> Errors that creep into projects are quite language and technology
>>> agnostic.
>>
>> Ganssle presented numbers: 50..100 errors/KLOC in C, 5..10 in ADA,
>> zero with SPARK.
>
> It is the language, not the rest of the toolchain.
> "C" is the major contributor to the decline in software quality (where
> there was some quality to decline of course).

I disagree -- I don't think it is the inherent qualities of C that lead
to poor quality. I think it is the *availability* of the language/tool
that has led to a wider variety (read: "range of capabilities/skill
levels") of folks *using* that language. [The same argument as the
"modern debuggers cause bad code quality" claim.]

C is relatively easy to port to different machines/architectures. It's
"cheap" to implement (at runtime and compile time). It's reasonably
transparent (important for folks who have to deal with the underlying
hardware -- like writing OS's, drivers, etc.). It's reasonably
expressive (I don't have to write a floating point library in ASM for
every machine on which I want to develop apps).

Unfortunately, it allows too many "not of the priesthood" to practice
its faith! (groan) And, unlike more benign languages (e.g., BASIC), they
can actually do serious harm to their code that may or may not be
noticeable.

[I don't know if you recall the sorts of cruft folks would write in
BASIC... clearly, no understanding of good program/application design...
"but, it works"! Counting with a "float"? Gack!]

> Nowadays people have no clue where the machine stack is, write
> IRQ handlers in C etc. etc. - in a way not dissimilar to writing
> novels in a language for which they need a phrasebook.
> The thing is, their novels get sold simply because the general
> public can't even use a phrasebook.
> And this happened mainly because x86 entered the scene widely,
> made assembly programming impractical with its messy programming
> model etc.

One of the unspoken goals of most HLLs is to make programming more
"accessible"... NOT to require "ordained ministers" but, instead, "lay
folk" (practically) to be able to write code (hey, if *they* can do it,
then professionals should be able to design *golden* apps!).

Everyone uses floats. Most are aware of roundoff error. But how many
think about order of evaluation when hacking together a series of
operations? Cancellation? Etc.

On the one hand, more people can do more things with these "improved
tools". OTOH, that doesn't mean they know how to do those things
*right*. Worse yet, they may not know what they *don't* know! It's
relatively easy to get a piece of code to LOOK like it is working -- and
then move on (oblivious to the fact that it really may not be working
PROPERLY).

Try to imagine what it would be like if there was pressure to make
"practicing medicine" as accessible as programming has (tried to)
become.
Reply by ●December 5, 2014
On 05.12.2014 г. 20:40, Les Cargill wrote:
> Dimiter_Popoff wrote:
>> On 05.12.2014 г. 15:41, Les Cargill wrote:
>>> Dimiter_Popoff wrote:
>>>> On 04.12.2014 г. 12:51, Oliver Betz wrote:
>>>>> Paul E Bennett wrote:
>>>>>
>>>>> [...]
>>>>>
>>>>>>> Could it be that today's sophisticated tools lead to more "try and
>>>>>>> error", less thinking before doing?
>>>>>>
>>>>>> Talk about cats amongst pigeons.
>>>>>
>>>>> causing the foreseeable defensiveness.
>>>>>
>>>>> [...]
>>>>>
>>>>>> Errors that creep into projects are quite language and technology
>>>>>> agnostic.
>>>>>
>>>>> Ganssle presented numbers: 50..100 errors/KLOC in C, 5..10 in ADA,
>>>>> zero with SPARK.
>>>>
>>>> It is the language, not the rest of the toolchain.
>>>> "C" is the major contributor to the decline in software quality (where
>>>> there was some quality to decline of course).
>>>
>>> That's odd, since 'C' has been there since... well, the start. How
>>> can a thing-that-has-not-changed be the cause of decline? Some
>>> massive lag? Changes in the populations of practitioners?
>>
>> It is the popularity growth, not the birth date. Then C does not
>> prevent one from writing decent software, it only makes it more
>> difficult -
>
> I don't think that's ... demonstrable in any reasonable fashion.

It is obvious enough for me. The fact is that C tries to be a
"universal assembler", as some people see it, and it does it poorly
(too abstracted from any machine model). There are a lot more details
about my VPA which allow me to do things people just can't do in C,
which are way too lengthy for me to explain to myself let alone other
people from the trade, so I won't go into it; neither would any sane
person want me to :-).

>> and much easier to write messy such.
>
> Sure. So don't do that :)

Telling people "don't eat too much" and thereby eradicating obesity is
much, much easier than what you suggest.

>> People who have known what their compiler does - i.e. those who wrote
>> the compiler - must have been able to write some good code using it.
>
> I think I probably spent a total of six months - call it 1000 hours -
> learning how to write good 'C' code.
>
> I know people who use "more modern" toolchains who have ten times that
> invested in them and still have problems. By the time you learn all of
> C++, it will have morphed into something else.
>
>>>> Nowadays people have no clue where the machine stack is, write
>>>> IRQ handlers in C etc. etc. - in a way not dissimilar to writing
>>>> novels in a language for which they need a phrasebook.
>>>
>>> We all need phrasebooks.
>>
>> Not all of us. I don't, for example.
>
> Then I am not sure what to tell you - the idioms of 'C' are
> a pretty lengthy thing. I have committed many of the patterns to
> memory over 25 years but not all of them.

Exactly. This is the basic flaw of high level languages. Instead of
dealing with text they deal with hieroglyphs - which is much less
efficient than just using an alphabet and designing your words en
route, evolving the language to fit the whims of life. The basic flaw
of any (too) high level language is its lack of flexibility to adapt
to an ever changing world. Sure, changes are made and the phrasebooks
get rewritten - but how is this comparable with adding just the new
words to the dictionary and twisting the language without needing any
"official" approval? Before the change happens, years will have passed
and gigatons of poor software will have been written (poor simply
because the language was not up to date with reality).

>>>> The thing is, their novels get sold simply because the general
>>>> public can't even use a phrasebook.
>>>> And this happened mainly because x86 entered the scene widely,
>>>> made assembly programming impractical with its messy programming
>>>> model etc.
>>>
>>> I wrote more assembly language in x86 than in any other architecture.
>>> You want something to wreck things? Try assembly.
>>
>> This explains why you see assembly as something impractical.
>
> I don't.

OK, your previous post left me with the impression you did; I must have
misunderstood you.

>> There is no such thing as "assembly" language really, there
>> are worlds of a difference between this or that "assembly".
>
> They're all essentially the same. There is a narcissism of small
> differences.

Well, if this is "the same" the way all human languages are "the same",
I could agree. Only if so.

>> And then there is my VPA (virtual processor assembly) which
>> makes me more efficient by at least an order of magnitude than
>> anyone who uses C when it comes to projects which take more
>> than a month to program (before you ask my code is in the
>> millions of lines, >50M sources over the past 20 years).
>
> Those projects are arguably too large. An old saying is "by the time
> you get N=a million lines of FORTRAN to compile, you no longer
> care what it was supposed to do."

If a project which takes over a month of programming is "too large" in
your book then OK, I will agree with you that copying this and that and
putting something together in a week or two is better done using a high
level language, yes. But one month of programming is nowhere near a
"large project" in my book.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
Reply by ●December 5, 2014
Hi Don,

On 05.12.2014 г. 22:07, Don Y wrote:
> Hi Dimiter,
>
> On 12/5/2014 6:33 AM, Dimiter_Popoff wrote:
>> On 04.12.2014 г. 12:51, Oliver Betz wrote:
>>>>> Could it be that today's sophisticated tools lead to more "try and
>>>>> error", less thinking before doing?
>>>> Errors that creep into projects are quite language and technology
>>>> agnostic.
>>>
>>> Ganssle presented numbers: 50..100 errors/KLOC in C, 5..10 in ADA,
>>> zero with SPARK.
>>
>> It is the language, not the rest of the toolchain.
>> "C" is the major contributor to the decline in software quality (where
>> there was some quality to decline of course).
>
> I disagree -- in that the inherent qualities of C lead to poor quality.
> I think it is the *availability* of the language/tool that has led
> to a wider variety (read: "range of capabilities/skill levels") of
> folks *using* that language.

Of course I agree that C has evolved to what it is for a reason. It is
just that the reason is much too distorted - e.g. the messy x86
architecture being one of the important reasons most people (other than
myself :-) ) abandoned further development of lower level languages.
Then good architectures simply came with barely usable assembly - e.g.
power, the best architecture I know of, with mnemonics no sane person
would try to write much code for (which is why I did my VPA for power).

I understand that I am practically alone against the rest of the world,
so I am not trying to convince anyone here. I am just stating my
thoughts; someone some day might find something he is after. For now I
just use my VPA (which owes a lot to 68k assembly - I built on it; in
fact it can still "assemble" 68k sources and produce power code, though
it can do more, especially when it comes to handling variables in the
text, macro flexibility etc.) to the advantage of what I design :-).

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
Reply by ●December 5, 2014
On Friday, December 5, 2014 2:31:21 PM UTC-5, Niklas Holsti wrote:
> On 14-12-05 20:45 , Ed Prochak wrote:

[]

>> And I disagree that it is more error prone than other languages.
>
> Have you read the Zeigler study "Comparing development costs of C and
> Ada"? Strong suggestion that C is more error-prone, and C programs are
> harder to repair, than Ada. Documents a gradual switch from C to Ada
> within a company, same development procedures, same developers, reduced
> bug rate and repair time with Ada.
>
> http://archive.adaic.com/intro/ada-vs-c/cada_art.html
>
>> A bad programmer writes bad code in any language.
>
> A good language can guide "bad" programmers towards better habits.
>
> --
> Niklas Holsti
> Tidorum Ltd
> niklas holsti tidorum fi
> . @ .

That was an impressive study. They shot down pretty much every counter
argument, other than that Ada may not do as well for small projects.
THANKS. I look forward to their updates (data past 1994, and the other
study on C++).







