Reply by dsco...@rcn.com March 11, 2010
On Mar 11, 10:33 am, "Not Really Me"
<sc...@validatedQWERTYsoftware.XYZZY.com> wrote:
> djordj wrote:
> > I've read "The Non-Quality Revolution" by Jack Ganssle @ Embedded.com
> > (http://tinyurl.com/y9tspzl).
> >
> > What about the concept of achieving higher levels of firmware quality?
> > How can we define it?
> >
> > Number of bugs per LOC?
> > Perceived quality by the final user?
> > Or.... ?
>
> I think you mean bugs per KLOC, as in kilo or thousands of lines of code.
> Bugs per LOC indicates a complete lack of quality.
>
> As others here have indicated, quality is a rather subjective matter. We
> work with firmware used in Safety-Critical applications, medical, avionics,
> etc. In these, quality is defined more objectively in terms of software
> failures per hours of operation, or indirectly by the type of verification
> and validation required by a given standard. In both cases the measurement
> is for a specified safety level, which ranges from "not considered a safety
> concern" to "life critical".
>
> --
> Scott Nowell
> Validated Software
> Lafayette, CO
Isn't that an oxymoron? Like military intelligence; sorry, couldn't resist.
Reply by Not Really Me March 11, 2010
djordj wrote:
> I've read "The Non-Quality Revolution" by Jack Ganssle @ Embedde.com > (http://tinyurl.com/y9tspzl). > > What about the concept of achieving higher levels of firmware quality? > How we can define it? > > Number of bugs per LOC? > Perceived quality by the final user? > Or.... ? >
I think you mean bugs per KLOC, as in kilo or thousands of lines of code. Bugs per LOC indicates a complete lack of quality.

As others here have indicated, quality is a rather subjective matter. We work with firmware used in Safety-Critical applications, medical, avionics, etc. In these, quality is defined more objectively in terms of software failures per hours of operation, or indirectly by the type of verification and validation required by a given standard. In both cases the measurement is for a specified safety level, which ranges from "not considered a safety concern" to "life critical".

--
Scott Nowell
Validated Software
Lafayette, CO
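To put a number on the "failures per hours of operation" idea, here is a minimal sketch (all figures, including the per-safety-level ceilings, are invented for the example and are not drawn from any particular standard):

# Sketch: a "failures per hours of operation" figure from field data.
# Every number below, including the threshold table, is invented for
# illustration; real limits come from the applicable standard.

fleet_operating_hours = 120_000   # total hours logged across all fielded units
observed_failures = 3             # failures attributed to the firmware

failures_per_hour = observed_failures / fleet_operating_hours
print(f"Observed failure rate: {failures_per_hour:.2e} failures/hour")

# Hypothetical acceptance ceilings, one per assigned safety level.
max_allowed = {
    "no safety concern": 1e-3,
    "minor":             1e-5,
    "major":             1e-7,
    "life critical":     1e-9,
}

assigned_level = "major"
meets_target = failures_per_hour <= max_allowed[assigned_level]
print(f"Meets the '{assigned_level}' ceiling: {meets_target}")

Whether three failures in 120,000 hours is "good" depends entirely on where the ceiling is set, which is the standard's job, not the arithmetic's.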
Reply by Cesar Rabak March 10, 2010
On 3/10/2010 18:34, D Yuniskis wrote:
> news.tin.it wrote:
>> In his previous message, John Speth wrote:
>>>> If the system provides discrete events, like readings or firings or
>>>> whatever, the number of attempts vs. the number of failures.
>>>
>>> Tim's response is excellent. In other words, you and your customer
>>> will define quality. Your job as the firmware QA engineer is to
>>> define the metrics of quality and then go ahead and measure it.
>>
>> Measuring "something" is the very first step, that's right.
>> Here's the question: are we sure that we're measuring the right thing?
>> If we were speaking about a mechanical component (say an aircraft wing,
>> for example) we could define a set of mechanical test cases because we all
>> know what we're looking for (a wing that can make an airplane fly).
>
> Presumably, those tests are designed to verify that the wing
> conforms to some *specification* regarding its weight, mechanical
> strength, conformance of the airfoil to "ideal" contour, etc.
>
> Those specifications were, in turn, derived from other specifications:
> "We have to be able to carry X passengers and Y cargo over a distance of
> M miles with a fuel efficiency of E...".
>
>> As a matter of fact, these tests don't care about the customer's definition
>> of quality (ok... customers like not to crash while flying to Hawaii -.- )
>>
>> But when we talk about firmware, are we able to define something like this?
>> Or do we have to lean only on customer expectations?
>> Wouldn't it be better if we could define a metric that allows us to compare
>> initial requisites with the produced firmware?
>
> You need specifications ("requisites") against which to measure.
> Quality, in the software sense, is how well you conform to your
> requirements (how pretty your code looks might be nice, too,
> but that doesn't directly indicate how well it does what it
> is *designed* to do)
>
Well in the "software sense" we've advanced a lot more than that. We have now the "SQuaRE series" of international standards (ISO/IEC 25000-ISO/IEC 25051). In fact they address some of the subtleties written earlier like the question of the wing 'quality' versus the passenger's perceived attributes for a plane quality.
>> Are we falling back to bugs counting?
>
> How do you define the "quality" of a homebuilder:
> - Number of nail pops in the drywall after 6 months?
> - Number of floor creaks?
> - Ounces of water per inch penetrating the roof per inch of rain?
> etc.
>
Ditto.
> Many industries have "invented" (seemingly) meaningless
> metrics simply because you have to count *something*...
Or because those 'somethings' are meaningful in the supply chain, becoming antecedents of the attributes ultimately perceived by the final user?
>
> I suspect most (software) products are now evaluated solely
> in terms of "number of units sold". :< And, as long as that
> number is high enough to keep the company profitable, they
> keep doing what they are doing.
Yes... the good-enough SW that made some Redmond company the wealthiest on the globe ;-)

--
Cesar Rabak
GNU/Linux User 52247.
Get counted: http://counter.li.org/
Reply by D Yuniskis March 10, 2010
djordj wrote:
> I've read "The Non-Quality Revolution" by Jack Ganssle @ Embedde.com > (http://tinyurl.com/y9tspzl). > > What about the concept of achieving higher levels of firmware quality? > How we can define it? > > Number of bugs per LOC?
You have to define a line-of-code (some will argue about the criteria to use, there -- and how it can be "manipulated" by folks trying to "improve quality" by manipulating the LoC metric :>). E.g., do you include "dead code" in your LoC metric? You could also use function points, etc. And, how do you define a "bug"? Are all bugs "equal"? (I would argue that they are NOT)
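For illustration, one way to stop treating all bugs as equal is to weight them by severity before normalizing by KLOC; a minimal sketch, with the weights, counts and KLOC figure invented for the example:

# Sketch: severity-weighted defect density instead of a raw bug count.
# The weights, counts and KLOC figure are all made up for illustration.

severity_weight = {"cosmetic": 1, "minor": 3, "major": 10, "critical": 50}

bugs_found = {"cosmetic": 14, "minor": 6, "major": 2, "critical": 0}

kloc = 42.0  # thousands of lines of code, however you chose to count them

raw_density = sum(bugs_found.values()) / kloc
weighted_density = sum(severity_weight[s] * n for s, n in bugs_found.items()) / kloc

print(f"Raw defect density:      {raw_density:.2f} bugs/KLOC")
print(f"Weighted defect density: {weighted_density:.2f} points/KLOC")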
> Perceived quality by the final user?
Perception implicitly assigns some "level of quality" to the perceiver. :> E.g., if the user isn't very adept, does that give you a false sense of having produced a "quality" product? What happens if the user becomes more "advanced" with experience? Suddenly, the same product in the same user's hands has *less* quality than it did when the user was inexperienced/naive?
> Or.... ?
*If* you had good specifications (exhaustive), quality could be measured in terms of "deviations from specification". Few pieces of code are written with such specification detail, though (perhaps the military?). So, you then have to address "specification quality" :-/ And, regardless, what are you going to *do* about that quality (or lack thereof) *if* you can find a way to actually objectively measure it?
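To make "deviations from specification" concrete, a minimal sketch of how such a figure could be tallied from a requirements-verification matrix (the requirement IDs and results below are hypothetical):

# Sketch: "deviations from specification" tallied from a verification matrix.
# The requirement IDs and results are hypothetical.

verification_matrix = {
    "REQ-001": "pass",
    "REQ-002": "pass",
    "REQ-003": "fail",      # a deviation from the specification
    "REQ-004": "untested",  # no claim can be made either way
    "REQ-005": "pass",
}

total = len(verification_matrix)
deviations = sum(1 for r in verification_matrix.values() if r == "fail")
unverified = sum(1 for r in verification_matrix.values() if r == "untested")
conforming = total - deviations - unverified

print(f"Requirements:        {total}")
print(f"Verified conforming: {conforming}")
print(f"Deviations:          {deviations}")
print(f"Unverified:          {unverified}")

# The headline percentage is only as meaningful as the spec is exhaustive,
# which is exactly the caveat raised above.
print(f"Conformance: {conforming / total:.0%}")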
Reply by D Yuniskis March 10, 2010
news.tin.it wrote:
> In his previous message, John Speth wrote:
>>> If the system provides discrete events, like readings or firings or
>>> whatever, the number of attempts vs. the number of failures.
>>
>> Tim's response is excellent. In other words, you and your customer
>> will define quality. Your job as the firmware QA engineer is to
>> define the metrics of quality and then go ahead and measure it.
>
> Measuring "something" is the very first step, that's right.
> Here's the question: are we sure that we're measuring the right thing?
> If we were speaking about a mechanical component (say an aircraft wing,
> for example) we could define a set of mechanical test cases because we all
> know what we're looking for (a wing that can make an airplane fly).
Presumably, those tests are designed to verify that the wing conforms to some *specification* regarding its weight, mechanical strength, conformance of the airfoil to "ideal" contour, etc.

Those specifications were, in turn, derived from other specifications: "We have to be able to carry X passengers and Y cargo over a distance of M miles with a fuel efficiency of E...".
> As a matter of fact, these tests don't care about the customer's definition
> of quality (ok... customers like not to crash while flying to Hawaii -.- )
>
> But when we talk about firmware, are we able to define something like this?
> Or do we have to lean only on customer expectations?
> Wouldn't it be better if we could define a metric that allows us to compare
> initial requisites with the produced firmware?
You need specifications ("requisites") against which to measure. Quality, in the software sense, is how well you conform to your requirements (how pretty your code looks might be nice, too, but that doesn't directly indicate how well it does what it is *designed* to do)
> Are we falling back to bugs counting?
How do you define the "quality" of a homebuilder: - Number of nail-pops_in_the_drywall after 6 months? - Number of floor_creaks? - Ounces_of_water_per_inch_penetrating_the_roof_per_inch_of_rain? etc. Many industries have "invented" (seemingly) meaningless metrics simply because you have to count *something*... I suspect most (software) products are now evaluated solely in terms of "number of units sold". :< And, as long as that number is high enough to keep the company profitable, they keep doing what they are doing.
Reply by Tim Wescott March 10, 2010
news.tin.it wrote:
> In his previous message, John Speth wrote:
>>> If the system provides discrete events, like readings or firings or
>>> whatever, the number of attempts vs. the number of failures.
>>
>> Tim's response is excellent. In other words, you and your customer
>> will define quality. Your job as the firmware QA engineer is to
>> define the metrics of quality and then go ahead and measure it.
>
> Measuring "something" is the very first step, that's right.
> Here's the question: are we sure that we're measuring the right thing?
> If we were speaking about a mechanical component (say an aircraft wing,
> for example) we could define a set of mechanical test cases because we all
> know what we're looking for (a wing that can make an airplane fly).
> As a matter of fact, these tests don't care about the customer's definition
> of quality (ok... customers like not to crash while flying to Hawaii -.- )
>
> But when we talk about firmware, are we able to define something like this?
> Or do we have to lean only on customer expectations?
> Wouldn't it be better if we could define a metric that allows us to compare
> initial requisites with the produced firmware?
> Are we falling back to bugs counting?
I think you're confusing the question "What is quality?" with the question "How do we accurately express, predict and measure quality?". I'd expand on that, but about all I can say at this point is "and that's a hard question to answer!".

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
Reply by news.tin.it March 10, 2010
In his previous message, John Speth wrote:
>> If the system provides discrete events, like readings or firings or
>> whatever, the number of attempts vs. the number of failures.
>
> Tim's response is excellent. In other words, you and your customer will
> define quality. Your job as the firmware QA engineer is to define the
> metrics of quality and then go ahead and measure it.
Measuring "something" is the very first step, that's right. Here's the question: are we sure that we're measuring the right thing? If we were speaking about a mechanical components (say an aircraft wing, for example) we could define a set of mechanical test cases because we all know what we're looking for (a wing that can make an airplan fly). As a matter of fact, these tests don't care about the customer definition of quality (ok... customers like not to crash while flying to Hawaii -.- ) But when we talk about firmware, are we able to define something like this? Or we have to lean only on customer aspectatives? Wouldn't it be better if we can define a metric that allows to compare initial requisites with produced firmware? Are we falling back to bugs counting? Regards! -- http://www.grgmeda.it
Reply by John Speth March 10, 2010
>> What about the concept of achieving higher levels of firmware quality?
>> How can we define it?
>>
>> Number of bugs per LOC?
>> Perceived quality by the final user?
>> Or.... ?
>
> Up time vs. problem time is a common one -- i.e. how many hours you can
> drive your car without the throttle sticking wide open, or how long your
> pacemaker works without skipping a beat.
>
> If the system provides discrete events, like readings or firings or
> whatever, the number of attempts vs. the number of failures.
Tim's response is excellent. In other words, you and your customer will define quality. Your job as the firmware QA engineer is to define the metrics of quality and then go ahead and measure it. "If you can't measure it, you can't manage it."

JJS
Reply by Tim Wescott March 10, 2010
djordj wrote:
> I've read "The Non-Quality Revolution" by Jack Ganssle @ Embedde.com > (http://tinyurl.com/y9tspzl). > > What about the concept of achieving higher levels of firmware quality? > How we can define it? > > Number of bugs per LOC? > Perceived quality by the final user? > Or.... ?
Up time vs. problem time is a common one -- i.e. how many hours you can drive your car without the throttle sticking wide open, or how long your pacemaker works without skipping a beat.

If the system provides discrete events, like readings or firings or whatever, the number of attempts vs. the number of failures.

--
Tim Wescott
Control system and signal processing consulting
www.wescottdesign.com
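As a back-of-the-envelope illustration of these two measures, a minimal sketch with made-up numbers:

# Sketch: the two metrics mentioned above, with made-up numbers.

# 1) Up time vs. problem time (continuous operation).
up_hours = 8_760.0     # roughly one year of operation
problem_hours = 0.5    # time the device spent misbehaving or down

availability = up_hours / (up_hours + problem_hours)
print(f"Availability: {availability:.5%}")

# 2) Attempts vs. failures (discrete events such as readings or firings).
attempts = 1_000_000
failures = 12

failure_ratio = failures / attempts
print(f"Failures per attempt: {failure_ratio:.1e}")
print(f"Successful attempts:  {1 - failure_ratio:.4%}")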
Reply by Vladimir Vassilevsky March 10, 2010

djordj wrote:

> I've read "The Non-Quality Revolution" by Jack Ganssle @ Embedde.com > (http://tinyurl.com/y9tspzl). > > What about the concept of achieving higher levels of firmware quality? > How we can define it? > > Number of bugs per LOC? > Perceived quality by the final user? > Or.... ?
Chinese + Hindu programmers in this project divided by the total number of programmers in this project.

VLV