Attached is a sample test cases document.
What is the best program for writing test cases professionally, with very nice fonts, etc.?
Should I use HTML or XML or some other program?
Sounds like you're at the beginning of your embedded career. Welcome.
Some good answers already, but if you're still learning at college or at home then you won't be able to use integrated professional tools because of their cost and the time needed to maintain them. At one place I worked, Jenkins was coupled to the source code repository but took 25% to 50% of the administrator's time to keep up to date. That said, it was also coupled to the Klocwork static analysis tool: any time a source code commit took place, Jenkins ran the integration/unit tests as well as Klocwork to analyse the code base.
Also bear in mind that many embedded systems require test jigs, power supplies and other connecting cables so a PC can talk to the device. Manual intervention is often required to check, for example, that the correct message is on the LCD display, that a mechanical valve is activated or not, and so on.
I'm not sure you need to worry about 'nice fonts'. Times New Roman and Arial cover most things and are universally available on all computer systems and printers. If you use a fancy font it may not print as it looks on the screen, or you may send the doc to someone else who does not have that font on their machine. Note that the fonts available in the various versions of Windows have changed over the years. Mac OS is different again, but I've never found a PC without Times New Roman and Arial.
Two places I've worked used an Excel spreadsheet, one row per test. What is very helpful with this approach is a very wide screen, or even a dual-screen system so you can see the spreadsheet across both of them.
Important points: number the tests systematically, preferably using some sort of grouping (1.0, 1.1 ... 2.0, 2.1, 2.1.1, 2.1.2, etc.); specify the test setup, e.g. what cables to connect where and what software to run on the PC; provide somewhere to record the result; and include a comment box for the tester to record anything else about the test. One thing is certain: the tests will not all go as you expected all the time.
Note there are often two kinds of tests: one very thorough suite to unit/integration test the software, and another for the production line to check that the hardware is working before it leaves the factory and that the correct version of the software has been installed.
As to software testing, this really has to be automated because of the huge number of tests that have to be performed to give good code coverage. Not only should the tests result in all parts of the code being run, they should also run with various data values, especially the corner cases, e.g. ADC readings of 0 and full scale.
I recommend "Test-Driven Development for Embedded C" by James Grenning. Personally I like to write automated tests in Python using the 'unittest' framework that comes in the standard library; there are many other Python unit test frameworks. If you want to write tests in Java there's always the original JUnit, and no doubt many other test frameworks. To put your tests directly in an embedded system you could try 'CUnit' or 'embeddedUnit'. These are relatively simple frameworks, but you will need to be able to connect to the target system through a serial port so the tests can report their results.
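To make that concrete, here is a minimal sketch of a 'unittest' test case. The function `adc_to_millivolts` is hypothetical, standing in for whatever your real code under test is, and the 0/full-scale values echo the corner cases mentioned above:

```python
import unittest

# Hypothetical function under test: convert a 12-bit ADC reading
# (0..4095) into millivolts against a 3300 mV reference.
def adc_to_millivolts(reading, vref_mv=3300, full_scale=4095):
    if not 0 <= reading <= full_scale:
        raise ValueError("ADC reading out of range")
    return reading * vref_mv // full_scale

class AdcConversionTests(unittest.TestCase):
    # Corner cases: zero and full scale.
    def test_zero_reading(self):
        self.assertEqual(adc_to_millivolts(0), 0)

    def test_full_scale_reading(self):
        self.assertEqual(adc_to_millivolts(4095), 3300)

    def test_out_of_range_rejected(self):
        with self.assertRaises(ValueError):
            adc_to_millivolts(4096)

if __name__ == "__main__":
    unittest.main(exit=False)  # run with -v for one line per test
```

The same pattern scales up: for hardware-in-the-loop testing the function body would talk to the target over a serial port instead of computing locally.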
The recommended book seems to be for Software Developers. Will it be useful for an Embedded Software Validation Engineer?
In my original post I attached a sample test report. Will this book show how to correctly write a test report?
What do you mean by "Validation Engineer"? Try https://en.wikipedia.org/wiki/Software_verificatio... to see the difference between validation and verification. My experience is at the 'verification' end of this spectrum and mostly relates to direct testing of the system against the software spec.
You are right in thinking "Test-Driven Development for Embedded C" is about the software approach and frameworks to actually test software. I don't remember it covering test reports beyond the normal output of the unit/integration test framework, which is usually verbose with one line per test and its success or otherwise. Nor does it discuss relating such output to the software spec/requirements.
I'm not entirely sure what is meant by "professionally". MS Word and/or Excel will certainly help a team author and track test cases just fine, but obviously that approach is driven primarily by people authoring and maintaining such documents manually.
That being said, if the goal is creating a "Do Not Repeat Yourself" environment where multiple forms of documentation are created from a single data source, then I do not have specific advice.
If the goal is requirements traceability, perhaps required by a certifying authority, where product requirements can be traced to test cases, then there are various tools to help, such as QAComplete, and many others. I would guess many of those tools can generate professional reporting output.
Best of luck!
From my experience, what I've seen in practice:
- The test reports are usually generated as an XML file; depending on what templates you have defined, you can display them in a nicely formatted way in a browser. You can also generate the HTML format directly from your script; however, further data analysis is easier from the XML format.
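As a sketch of the XML approach: the snippet below emits a JUnit-style report, the de facto schema that Jenkins and many other tools consume. The suite name and test results are made-up placeholders; in practice they would come from your test script.

```python
import xml.etree.ElementTree as ET

# Made-up results: (test name, failure message or None).
results = [
    ("power_on_self_test", None),
    ("lcd_shows_version", "expected 'v1.2', saw 'v1.1'"),
]

suite = ET.Element(
    "testsuite",
    name="smoke_tests",
    tests=str(len(results)),
    failures=str(sum(1 for _, err in results if err)),
)
for name, err in results:
    case = ET.SubElement(suite, "testcase", name=name)
    if err:
        ET.SubElement(case, "failure", message=err)

xml_text = ET.tostring(suite, encoding="unicode")
print(xml_text)  # in practice, write this to report.xml
```

Pointing a browser at such a file with an XSL stylesheet attached, or letting Jenkins render it, gives you the "nicely formatted" view without hand-writing HTML.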
The test requirements can be written in any format, for example Word. More important here is to have a clear "writing standard", similar to a coding standard, so that you can easily understand:
- what is tested
- a brief name of the unit test, scenario, or use-case
- the exact steps; very important to avoid any ambiguous formulations, so use plain and simple language and short sentences, because somebody has to implement them
- clear pass/fail conditions
In the logs, provide a FAIL/SUCCESS indication and the additional information: the executed command, the data passed, the response received and the response expected, if possible with timestamps.
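One way such a log line could be formatted is sketched below; the field names and commands are my own invention, not any standard:

```python
from datetime import datetime, timezone

# Hypothetical log-line format: timestamp, verdict, then the raw
# traffic (command, data sent, response received, response expected).
def log_step(command, sent, received, expected):
    verdict = "SUCCESS" if received == expected else "FAIL"
    stamp = datetime.now(timezone.utc).isoformat(timespec="milliseconds")
    return (f"{stamp} {verdict} cmd={command} "
            f"sent={sent!r} rx={received!r} exp={expected!r}")

print(log_step("GET_VERSION", b"\x01", b"v1.2", b"v1.2"))
print(log_step("SET_RELAY", b"\x02\x01", b"NAK", b"ACK"))
```

Having the expected response right next to the received one makes failures diagnosable from the log alone, without re-running the test.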
!!Do code reviews of the test scripts against the requirements!! I've seen elaborate test scripts with a forgotten "always return PASSED"... everything was green on the screen, but the feature was actually not working.
Don't forget: you can't say whether your product/system is good or bad, you can only say whether it passed your tests or not.
Which means: your system is only as good as your tests.
We'll be running manual tests. We'll explore automated tests in the future.
Does Jenkins automatically create test reports in XML?
The test report I attached to my original post needs to be reviewed. It doesn't look sharp. I need some tips on improving it.
Is it possible to write this test report in XML manually? How will I distribute it to my coworkers and customers so they can view it? Will they need Internet Explorer to view it, or can they view it in PDF format?