
Source code static analysis tool recommendations

Started by John Speth February 2, 2018
Hi folks-

Does anybody have any recommendations for static source code analysis 
tools based on practical use?

My customer is asking us to apply a static source code analysis tool to 
an existing medical product.  The product is undergoing changes for 
which the FDA requires another round of testing.  The customer was 
dinged by the FDA for not using an analysis tool so they want to correct 
the process this time around.

My concerns are:
- cost
- time to set up
- effectiveness
- too many false problems identified
- open source vs paid product

The project size is about 50K lines of code.

Thanks - John
On Fri, 2 Feb 2018 13:32:42 -0800, John Speth <johnspeth@yahoo.com>
wrote:

>Does anybody have any recommendations for static source code analysis
>tools based on practical use?
>
>[...]
>
>The project size is about 50K lines of code.
What language(s) and what platform should the tool run on?

This is going back a ways [so take with a grain of salt] but circa
2000 when I was in medical imaging the FDA was quite happy with Lint
for C or C++.

If the code never has been linted previously, you are likely to get
many pages of warnings. Examining them, you are likely to find quite
a large percentage are trivial / ignorable, but checking a large code
base for the first time will not go quickly.

All the tools I am familiar with allow you to disable particular checks
that consistently produce ignorable results. But you have to determine
that by actually looking at the code.

Whatever tool you use, you'll need to verify that the FDA will accept
the result ... I've been away from that arena for too long to know
what tools currently are acceptable.

George
On 2/2/2018 3:24 PM, George Neuner wrote:
> What language(s) and what platform should the tool run on?
>
> This is going back a ways [so take with a grain of salt] but circa
> 2000 when I was in medical imaging the FDA was quite happy with Lint
> for C or C++.
>
> [...]
>
> Whatever tool you use, you'll need to verify that the FDA will accept
> the result ...
Thanks George. The code is in C. We'd like to run the tools on Windows
computers.

It happens that the first FDA submission was subject to only PC-Lint.
The customer told me that the FDA did accept that test but I didn't
believe it. PC-Lint is not a deep testing tool, in my opinion. I'll
need to take a second look at that.

JJS
On Fri, 2 Feb 2018 17:13:40 -0800, John Speth <johnspeth@yahoo.com>
wrote:

>[...]
>
>Thanks George. The code is in C. We'd like to run the tools on Windows
>computers.
>
>It happens that the first FDA submission was subject to only PC-Lint.
>The customer told me that the FDA did accept that test but I didn't
>believe it. PC-Lint is not a deep testing tool, in my opinion. I'll
>need to take a second look at that.
I haven't seen PC-Lint in many years, so I can't comment on that.
Splint and cppcheck are free alternatives that are worth a look.

There's an add-in that will integrate cppcheck into any recent Visual
Studio edition. Also VS Professional (or better) since 2013 has a
pretty decent lint module included.

Splint works fine with VS, but there's no integration available AFAIK.
You can use it as an external tool.

Synopsys's Coverity Scan is quite nice and decently comprehensive, but
note that I only have used the free online version, which uploads your
code to Synopsys's servers for analysis [and license-wise requires
your project to be FOSS]. They do sell a version for proprietary use,
but I have no knowledge of the cost.

At this point, I'm out of suggestions. Maybe someone else has other
ideas?

George
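[Editor's note: for a sense of what "decently comprehensive" means here,
below is a hypothetical C fragment (not from the thread) showing the kind
of error-path defect deeper analyzers such as Coverity or cppcheck try to
report, and which a basic lint pass may miss.]

  #include <stdlib.h>
  #include <string.h>

  /* Hypothetical example: a deeper analyzer will typically warn that
   * 'dst' may be NULL when malloc() fails, making the strcpy() a
   * possible NULL dereference on the error path. */
  char *copy_name(const char *src)
  {
      char *dst = malloc(strlen(src) + 1);
      strcpy(dst, src);   /* possible NULL dereference if malloc failed */
      return dst;
  }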
On 03.02.2018 09:40, George Neuner wrote:
> On Fri, 2 Feb 2018 17:13:40 -0800, John Speth <johnspeth@yahoo.com> >> Thanks George. The code is in C. We'd like to run the tools on Windows >> computers. >> >> It happens that the first FDA submission was subject to only PC-Lint. >> The customer told me that the FDA did accept that test but I didn't >> believe it. PC-Lint is not a deep testing tool, in my opinion. I'll >> need to take a second look at that. > > I haven't seen PC-Lint in many years, so I can't comment on that. > Splint and cppcheck are free alternatives that are worth a look. > > There's an add-in that will integrate cppcheck into any recent Visual > Studio edition. Also VS Professional (or better) since 2013 has a > pretty decent lint module included.
Not sure what lint module you're referring to, but Visual Studio has had
an "/analyze" option for quite a while. I remember there were some tricks
circulating for enabling it on editions that did not officially support
it.

One advantage that tool has over others is that it knows the Windows
APIs, and finds things like "this API call returns a handle that you
should close if you don't need it". The same can be said of cppcheck: it
knows some POSIX APIs and tells you when you're leaking stuff.
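[Editor's note: a hypothetical sketch (not code from the thread) of the
kind of report this produces - a stream that is opened but never closed,
which cppcheck's library knowledge flags, analogous to /analyze flagging
an unclosed Windows HANDLE.]

  #include <stdio.h>

  /* Hypothetical example: the stream opened here is never closed, so an
   * API-aware checker reports a resource leak on the return path. */
  int count_lines(const char *path)
  {
      FILE *f = fopen(path, "r");
      int lines = 0;
      int c;

      if (f == NULL)
          return -1;
      while ((c = fgetc(f)) != EOF) {
          if (c == '\n')
              lines++;
      }
      return lines;   /* missing fclose(f) */
  }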
> [...]
>
> Synopsys's Coverity Scan is quite nice and decently comprehensive, but
> note that I only have used the free online version, which uploads your
> code to Synopsys's servers for analysis [and license-wise requires
> your project to be FOSS]. They do sell a version for proprietary use,
> but I have no knowledge of the cost.
Our company evaluated Coverity vs. Klocwork and ended up with Klocwork,
although I was not involved in the decision.

My main gripe with tools of that kind is that they have tons of checkers
and people turn them all on at once because they can. Klocwork literally
has something to say about every line of code, ending up with tens of
thousands of "issues" in perfectly working software. Our QA people
defined a subset of what to consider critical - about 1% of that - and
even of that remainder, in my team's software only 1% is actual
problems, 50% is style issues ("no 'break' after the last branch of this
'switch'", "no '&' before this function name used as pointer"), and the
rest is false positives ("please remove this check for errors, because I
can prove that this function call does not ever produce an error").

If my goal were getting better software, I'd start with turning compiler
warnings up to the max (warnings count as static analysis, right?), unit
tests with coverage analysis, valgrind, and an occasional cppcheck or
cl /analyze.

If the goal is to get a tool permanently into your workflow in a
documented way, a database-backed tool like Klocwork doesn't sound too
bad, mainly because it has a way to re-identify findings across source
code changes (you need to tell it only once that a finding is a false
positive, rather than adjust a suppressions file after every edit). Just
be careful about which checkers you use and what you do about them.
"Fix all findings" is the wrong approach.

Stefan
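[Editor's note: to make the "style issue" category concrete, here is a
hypothetical snippet (not from the thread) that compiles and runs fine
but triggers exactly the two findings quoted above.]

  /* Hypothetical example of the two style findings quoted above. */
  static int current_mode;

  static void set_mode(int mode)
  {
      current_mode = mode;
  }

  void dispatch(int mode)
  {
      void (*handler)(int) = set_mode;  /* "no '&' before this function
                                           name used as pointer" */
      switch (mode) {
      case 0:
          set_mode(0);
          break;
      default:
          handler(1);
          /* "no 'break' after the last branch of this 'switch'" */
      }
  }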
On 2018-02-03, George Neuner <gneuner2@comcast.net> wrote:

> I haven't seen PC-Lint in many years, so I can't comment on that.
> Splint and cppcheck are free alternatives that are worth a look.
I haven't tried splint in years, but it used to be completely useless
for real code. It would warn you about things like this:

  uint8_t b;
  b = 1;

or

  uint8_t i,j;
  [...]
  i = j+1;

The warning was because the expression on the RHS was of type "int" (16
or 32 bits on the platforms I cared about) and was being truncated by
the assignment to an 8-bit wide LHS.

On small processors with only a few hundred bytes of RAM one tended to
use the smallest variable types possible, and as a result splint
produced thousands of meaningless warnings unless you typecast most of
your integer assignments -- which made the code completely unreadable.

I spent a few days hacking on splint to try to fix issues like that, but
was told by the splint developers that they didn't care to incorporate
fixes like mine because splint wasn't intended for use on real code in
the real world: it was solely an academic project for studying language
concepts.

Have they since pulled their heads out of <whatever>?

--
Grant
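[Editor's note: a hypothetical before/after sketch of the readability
complaint above - the casts silence the truncation warnings, but every
narrowing assignment now carries one.]

  #include <stdint.h>

  /* Hypothetical sketch: explicit casts quiet the "int truncated to
   * 8 bits" warnings at the cost of cluttering every assignment. */
  static uint8_t bump(uint8_t j)
  {
      uint8_t b;
      uint8_t i;

      b = (uint8_t)1;           /* instead of:  b = 1;      */
      i = (uint8_t)(j + b);     /* instead of:  i = j + 1;  */
      return i;
  }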
On 03.02.2018 17:37, Grant Edwards wrote:
> [...]
> I haven't tried splint in years, but it used to be completely
> useless for real code. It would warn you about things like this:
>
>   uint8_t b;
>   b = 1;
>
> or
>
>   uint8_t i,j;
>   [...]
>   i = j+1;
The latter gives a warning with gcc "-Wconversion" as well. Although I
was long opposed to that warning, it's hard to argue your way out of it
once your company's software has already had a few incidents with it.

For control code, it is definitely possible to get "-Wconversion"-free;
for computational code, it's more work but possible (e.g. make a
'sample_add' inline function that does the casting, and use that).

What I find more annoying is that MISRA finds this

  uint32_t MASK = UINT32_C(1) << 17;

objectionable. '1', no matter how spelled, has type 'char' for MISRA,
and cannot be shifted by 17 bits.

Stefan
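[Editor's note: a minimal sketch of the helper mentioned above; the name
'sample_add' comes from the post, but the sample type and signature here
are assumptions.]

  #include <stdint.h>

  /* Sketch of the casting helper: the int promotion and the narrowing
   * back to the sample type happen in exactly one place, so ordinary
   * call sites stay free of -Wconversion warnings. */
  static inline int16_t sample_add(int16_t a, int16_t b)
  {
      return (int16_t)(a + b);   /* 'a + b' has type int; narrow it here, once */
  }

  /* Usage:  out = sample_add(x, y);   -- no cast needed at the call site */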
On 2018-02-04, Stefan Reuther <stefan.news@arcor.de> wrote:

> What I find more annoying is that MISRA finds this
>
>   uint32_t MASK = UINT32_C(1) << 17;
>
> objectionable. '1', no matter how spelled, has type 'char' for MISRA,
> and cannot be shifted by 17 bits.
In MISRA C, the literal 1 is a char, not an int?

--
Grant
On 04.02.2018 16:12, Grant Edwards wrote:
>> [...] '1', no matter how spelled, has type 'char' for MISRA,
>> and cannot be shifted by 17 bits.
>
> In MISRA C, the literal 1 is a char, not an int?
Yes. MISRA C 6.10.4 "Underlying type".

The problem they're trying to solve isn't too far-fetched: if 'int' has
only 16 bits, '1 << 17' is undefined, so you'd better be explicit about
the type you're shifting.

The problem now is that C99's way of being explicit is the 'UINT32_C'
macro - but that one expands to nothing on a 32-bit architecture...

Stefan
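[Editor's note: a hedged illustration of the underlying-type problem and
one common way to be explicit without relying on UINT32_C; hypothetical
code, not from the thread.]

  #include <stdint.h>

  /* On a target with 16-bit int, shifting the int constant 1 by 17 is
   * undefined; casting first makes the shifted operand 32 bits wide. */
  static const uint32_t MASK_RISKY    = 1 << 17;            /* undefined behaviour if 'int' is 16 bits  */
  static const uint32_t MASK_EXPLICIT = (uint32_t)1 << 17;  /* well-defined on any conforming target    */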
On 05.02.2018 18:48, Stefan Reuther wrote:
> The problem now is that C99's way of being explicit is the 'UINT32_C'
> macro
Where on earth did you get that idea? UINT32_C does not even _appear_ in the C99 standard.
