
[OT] I got a JOB!!!

Started by Tim Wescott May 16, 2017
On Wed, 17 May 2017 18:22:16 +0100, Tom Gardner wrote:

> On 17/05/17 16:25, Tim Wescott wrote:
>> On Wed, 17 May 2017 10:32:59 +0100, Tom Gardner wrote:
>>
>>> On 17/05/17 09:57, Reinhardt Behm wrote:
>>>> AT Wednesday 17 May 2017 16:42, Tom Gardner wrote:
>>>>
>>>>> On 17/05/17 01:56, Don Y wrote:
>>>>>> On 5/16/2017 3:51 PM, Tom Gardner wrote:
>>>>>>> The other dysfunctional aspect of unit tests is that, while they are very useful when making incremental improvements during design, they can be a real impediment in a few months/years time. The problem is that over time people forget which tests demonstrate required properties, and which are merely ensuring behaviour of implementation artifacts. At that point people are afraid to make changes that break tests, even if the tests are unimportant. At that point the codebase has become ossified.
>>>>>>
>>>>>> But that's true any time you (attempt to) nail down an aspect of a design. Moreso if you don't provide a rationale for WHY the requirement (whether it be from a spec or a test) was imposed.
>>>>>
>>>>> Of course. Nonetheless, it happens - and zealots sometimes refuse to admit it.
>>>>
>>>> Therefore any good development process demands that requirements (high-level or low-level) must be traceable to system requirements or must be marked as design decisions. Then you have the WHY. A test cannot be the source of a requirement. A test should also be traceable to sys-reqs. And hopefully the sys-reqs are good.
>>>
>>> We are in violent agreement.
>>>
>>> But my point is about the XP/Agile/TDD/Unit Test zealots that overstate the benefits of those techniques and ignore their dysfunctional aspects.
>>
>> That's kind of the definition of a zealot.
>>
>> Something that Reinhardt -- and I think maybe you -- are missing is that part of the reason for agile is that requirements don't remain static. As soon as people see something working, the requirements change -- and saying "no, I'm sorry, person-who-signs-checks, we're going to spend the next year making what you asked for initially and THEN we're going to entertain change requests" doesn't go over well.
>>
>> So you can't treat requirements as static things graved in stone. Agile tries to deal with that, and as near as I can tell, only having nibbled at the edges of it, it does a reasonably good job of it for things that aren't terribly safety critical. Agile is also new, so it's naturally going to attract zealots -- the trick is to see the value underneath the hype, and then make rational choices about what to do.
>
> I understand and agree with all of that.
>
> I once pushed for agile practices to be introduced into a company for those reasons. I then saw the newly-trained staff think it was a good idea to delete /all/ comments on the religious principle that comments are bad because they get out of sync with the code. They also changed private functions and data to public so that they could be seen by their unit tests - thereby encouraging future developers to /misuse/ the classes in ways that are easily avoidable.
>
> That doesn't mean agile is bad (it isn't!), only that /unthinkingly/ following a process is bad.
Unthinkingly following any process is bad. Refusing to follow any process at all is bad*. Trying to apply a process to the wrong situation** is bad. I've seen it go down all these ways. It was bad every time.

I need to find out what makes good practice vis-a-vis hiding functions for unit tests. One way or another the answer has to be that the functions STAY hidden for higher-level "production" code -- whether this means that class foo has a "friend class testFoo" in it, or if it means that you have a #define TEST_CLASS_FOO, or if there's something yet again -- I don't know.

* This can be ameliorated in a group environment by identifying the people who are cranky about the process because of honestly perceived faults, and inviting them to the Dark Side -- uh, to the process steering committee meetings. They end up understanding the underlying reasons, or better, making the process better. The people who just want to be cowboys are generally either poor programmers in general, or are good programmers if they're put into an organization small enough to not need written process.

** A friend of mine used to own a company that built card-swipe machines, when they were first coming out. He got bought by a banking company, and had buckets of entertainment when their IT department got wind of Someone Writing Software that We Don't Control and tried to impose the coding standard on him. The punchline was pointing at the 8051 on the circuit board and saying "where do I get a COBOL compiler for that?"

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
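For what it's worth, the "friend class testFoo" flavour of that answer looks roughly like the following in C++. This is only a sketch of the idea, not anyone's production code; the Motor/MotorTest names and the numbers are invented for illustration.

// motor.h -- minimal sketch of the "friend class testFoo" approach.
// The named test fixture is the only outside code that can reach the
// private helpers; all other callers see only the public interface.
#ifndef MOTOR_H
#define MOTOR_H

class MotorTest;                    // forward declaration of the test-only friend

class Motor {
public:
    // Public production interface: everything else stays hidden.
    void setSpeed(int rpm) { commandedRpm_ = clampToLimits(rpm); }

private:
    friend class MotorTest;         // ONLY the unit-test fixture gets access

    // Private helper the tests want to exercise directly.
    int clampToLimits(int rpm) const {
        if (rpm > kMaxRpm)  return kMaxRpm;
        if (rpm < -kMaxRpm) return -kMaxRpm;
        return rpm;
    }

    static constexpr int kMaxRpm = 3000;
    int commandedRpm_ = 0;
};

#endif // MOTOR_H

Production code that tries to call Motor's private helpers still fails to compile; only the named test fixture gets in.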
On 17/05/17 18:21, Tim Wescott wrote:
> On Wed, 17 May 2017 18:11:22 +0100, Tom Gardner wrote:
>
>> On 17/05/17 16:02, Tim Wescott wrote:
>>> On Wed, 17 May 2017 09:12:46 -0400, Randy Yates wrote:
>>>
>>>> Tim Wescott <seemywebsite@myfooter.really> writes:
>>>>
>>>>> Since this is a newsgroup, and this is news...
>>>>>
>>>>> Wescott Design Services is going into remission, while I pursue a day job. Job title is Software Designer 5 at Planar Systems -- so any circuit design or control systems jones will have to be satisfied by hobby work or on the side.
>>>>>
>>>>> In the near term I'll be finishing up current work with current customers; in the longer term I'll probably concentrate on the educational videos and maybe hobby stuff.
>>>>>
>>>>> Lots of embedded Linux work in my near future, and possibly TDD proselytizing.
>>>>
>>>> Congratulations Tim. We'll have to exchange trade secrets someday...
>>>>
>>>> BTW, does PS use a specific unit test framework? Have you used it yet? How do you like it?
>>>
>>> I start next week, so I don't know and no. It's my former coworker from FLIR who's pushing TDD, and the manager who hired me is kind of warily standing on the sidelines going "what's the deal here?"
>>>
>>> The guy pushing it is extremely smart and capable, so whatever it is it's probably good.
>>
>> I expect you'll find it is the codification of many of the development mentality and practices that you have been using for a long time.
>>
>> Be cautious about how TDD applies to bottom-up design (i.e. finding things that work and clagging them together) vs. top-down design. TDD works naturally with top-down design where all the yet-to-be-implemented parts are well understood and feasible.
>>
>> Anybody that /thinks/ will realise that sometimes it is beneficial to do a "spike investigation" to quickly validate key concepts from top-to-bottom, and then to use that experience to do it "properly" using full-blown TDD.
>
> It's not magic. I've been geeking out on the COSMAC 1802 lately, because it was the first microprocessor I ever owned (I had an ELF-II kit). The user's manual has an entire chapter extolling the virtue of SUBROUTINES (ooh, ahh) and how to implement them. It's quite gushy about how using subroutines makes your code better. And yet, I've worked on lots of crappy code that has subroutines.
:)

The 1802's implementation of subroutines was, um, quirky to the point of being obtuse.

I hand built my first computer using a 6800, after having thought long and hard about the 1802 and 8080. It was a mess, but worked, I learned a heck of a lot, and prospective employers were duly impressed.
On Wed, 17 May 2017 18:36:52 +0100, Tom Gardner wrote:

>> It's not magic. I've been geeking out on the COSMAC 1802 lately, because it was the first microprocessor I ever owned (I had an ELF-II kit). The user's manual has an entire chapter extolling the virtue of SUBROUTINES (ooh, ahh) and how to implement them. It's quite gushy about how using subroutines makes your code better. And yet, I've worked on lots of crappy code that has subroutines.
>
> :)
>
> The 1802's implementation of subroutines was, um, quirky to the point of being obtuse.
>
> I hand built my first computer using a 6800, after having thought long and hard about the 1802 and 8080. It was a mess, but worked, I learned a heck of a lot, and prospective employers were duly impressed.
The 1802 is neither a CISC processor nor a RISC processor -- it's a NHISC processor -- "Never Had Instruction Set Computer".

I wish I had the chops to organize a contest -- I think an annual "build the fastest 1802" contest would be fun to be involved in. Imagine what you could do if the only basic rule was that it had to execute 1802 machine code faithfully, with no constraints on how much happened per clock cycle. Ditch TDA and TDB, keep the I/O command lines, Q, and the flags, and go to town. Prefetch, pipelines, caches, parallel execution, predictive branching, everything -- all with that crazy 1802 instruction set.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
On 17/05/17 18:36, Tim Wescott wrote:
> Unthinkingly following any process is bad. Refusing to follow any process at all is bad*. Trying to apply a process to the wrong situation** is bad. I've seen it go down all these ways. It was bad every time.
Yes indeed.
> I need to find out what makes good practice vis-a-vis hiding functions for unit tests. One way or another the answer has to be that the functions STAY hidden for higher-level "production" code -- whether this means that class foo has a "friend class testFoo" in it, or if it means that you have a #define TEST_CLASS_FOO, or if there's something yet again -- I don't know.
I don't know for C++; I hate the language and have managed to avoid using it in anger. I'm sure there are some "design patterns" of accepted practice around for you to find. Probably too many, so you'll have to spend time discovering the disadvantages that their proponents don't mention.

Java (and, I expect, C#) is probably more flexible, since there are many dangerously powerful tools available. Think reflection, and proceed from there.
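For the C++ side of that question, one commonly cited pattern is the conditional-compilation variant Tim mentioned above: the test hooks only exist when the build defines a test macro. A sketch only; the Widget class and the UNIT_TEST macro name are invented for illustration, not taken from any real codebase.

// widget.h -- sketch of the #define/test-build variant: the back door
// exists only when the build defines UNIT_TEST, so production code
// cannot call it at all.
#ifndef WIDGET_H
#define WIDGET_H

class Widget {
public:
    void update(int sample) { filtered_ = filter(sample); }

private:
    int filter(int sample) const { return (filtered_ * 3 + sample) / 4; }
    int filtered_ = 0;

#ifdef UNIT_TEST
public:
    // Test-only accessors, compiled out of production builds entirely.
    int debugFiltered() const { return filtered_; }
    int debugFilter(int sample) const { return filter(sample); }
#endif
};

#endif // WIDGET_H

The trade-off is that the class's shape differs between test and production builds, which is exactly the sort of thing that needs a deliberate team decision rather than unthinking adoption.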
On 17/05/17 18:43, Tim Wescott wrote:
> The 1802 is neither a CISC processor nor a RISC processor -- it's a NHISC processor -- "Never Had Instruction Set Computer".
:)
> I wish I had the chops to organize a contest -- I think an annual "build the fastest 1802" contest would be fun to be involved in. Imagine what you could do if the only basic rule was that it had to execute 1802 machine code faithfully, with no constraints on how much happened per clock cycle. Ditch TDA and TDB, keep the I/O command lines, Q, and the flags, and go to town. Prefetch, pipelines, caches, parallel execution, predictive branching, everything -- all with that crazy 1802 instruction set.
I hate all those with a vengeance, since they prevent hard real-time software.

I'm currently experimenting with a /small/ XMOS device which deliberately avoids all of those techniques so that it can guarantee timing. So far I've been able to get the /software/ to reliably count the edges on two 20Mb/s input pins, process the results and simultaneously shove them up a USB link to a host PC.

Now I've got to understand the algorithms in reciprocal and continuous timestamping frequency counters :)
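For reference, the arithmetic at the heart of a reciprocal counter is small even if real implementations aren't: timestamp the first and last input edges inside the gate against the reference clock, then divide the spanned input periods by the elapsed reference time. A C++ sketch with made-up numbers; the 100 MHz reference is an assumption for illustration, not the XMOS part's actual figure.

#include <cstdint>
#include <cstdio>

// Reciprocal counting: frequency = (input periods spanned) / (reference time
// between the first and last captured edges). Resolution is set by the
// reference clock rather than the gate time, which is why it beats a plain
// "count edges for one second" counter at low input frequencies.
double reciprocal_frequency(uint64_t input_edges,    // edges captured in the gate
                            uint64_t first_edge_ts,  // reference timestamp of first edge
                            uint64_t last_edge_ts,   // reference timestamp of last edge
                            double   ref_hz)         // reference clock frequency
{
    const uint64_t ref_ticks = last_edge_ts - first_edge_ts;
    if (ref_ticks == 0 || input_edges < 2) return 0.0;   // not enough data in the gate
    const double elapsed_s = static_cast<double>(ref_ticks) / ref_hz;
    return static_cast<double>(input_edges - 1) / elapsed_s;   // N edges span N-1 periods
}

int main() {
    // Hypothetical capture: 12,345 edges whose first and last timestamps are
    // 1,000,000 reference ticks apart, against an assumed 100 MHz reference.
    std::printf("%.3f Hz\n", reciprocal_frequency(12345, 500, 1000500, 100e6));
}

The "continuous timestamping" refinement keeps a running stream of such timestamps and fits across them instead of using only the gate endpoints, but the core division is the same.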
On 17/05/17 18:55, Tom Gardner wrote:
> I'm currently experimenting with a /small/ XMOS device which deliberately avoids all of those techniques so that it can guarantee timing. So far I've been able to get the /software/ to reliably count the edges on two 20Mb/s input pins,
Oops, 50Mb/s, i.e. 100Mb/s total and I might be able to get it to 100Mb/s per input.
> process the results and simultaneously shove them up a USB link to a host PC.
>
> Now I've got to understand the algorithms in reciprocal and continuous timestamping frequency counters :)
On Wed, 17 May 2017 18:55:16 +0100, Tom Gardner wrote:

>> Imagine what you could do if the only basic rule was that it had to execute 1802 machine code faithfully, with no constraints on how much happened per clock cycle. Ditch TDA and TDB, keep the I/O command lines, Q, and the flags, and go to town. Prefetch, pipelines, caches, parallel execution, predictive branching, everything -- all with that crazy 1802 instruction set.
>
> I hate all those with a vengeance, since they prevent hard real-time software.
>
> I'm currently experimenting with a /small/ XMOS device which deliberately avoids all of those techniques so that it can guarantee timing. So far I've been able to get the /software/ to reliably count the edges on two 20Mb/s input pins, process the results and simultaneously shove them up a USB link to a host PC.
>
> Now I've got to understand the algorithms in reciprocal and continuous timestamping frequency counters :)
Some guy has Verilog code for an 1802 in which he claims a 60MHz clock, one clock per instruction (or perhaps fetch). That would be deterministic, and fast by some measures.

All the modern pipeline/predict/prefetch whiz-bang doesn't prevent hard real time, if only the processor manufacturers would publish the absolute maximum time it takes to execute any possible instruction, or (better) provide tools for finding the maximum time-from-interrupt for any given chunk of code. Then you could just add up all the critical stuff and make sure it works.

In my experience there isn't THAT much variation -- you just need to know how much variation to allow for to meet hard real time criteria.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
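That "add up all the critical stuff" step is just a budget check. A trivial sketch of the bookkeeping, with invented cycle counts standing in for the worst-case figures a vendor data sheet or WCET tool would actually supply:

#include <cstdint>
#include <cstdio>

// Worst-case latency budget for one interrupt-to-response path.
// Every number here is a placeholder for illustration only.
struct Stage { const char* name; uint32_t worst_case_cycles; };

int main() {
    constexpr double cpu_hz      = 100e6;   // assumed 100 MHz core
    constexpr double deadline_us = 10.0;    // assumed hard real-time deadline

    const Stage path[] = {
        {"interrupt entry (pipeline/cache worst case)", 150},
        {"ISR: read capture register",                   40},
        {"ISR: update edge counter",                     25},
        {"ISR: post result to queue",                    60},
        {"interrupt exit",                               90},
    };

    uint32_t total_cycles = 0;
    for (const Stage& s : path) total_cycles += s.worst_case_cycles;   // sum the worst cases

    const double total_us = total_cycles / cpu_hz * 1e6;
    std::printf("worst case %.2f us against a %.2f us deadline: %s\n",
                total_us, deadline_us, total_us <= deadline_us ? "OK" : "MISSED");
}

The hard part, of course, is getting trustworthy worst-case numbers to put in the table in the first place, which is exactly the tooling Tim is asking the manufacturers for.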
On 5/17/2017 10:25 AM, Niklas Holsti wrote:
>>> For my last product, I wrote a combined user manual and requirement specification, using LibreOffice conditional text: set a document variable one way, it is a user manual; set the variable differently, and every meaningful paragraph gets a requirement number that can be referenced in tests and validation matrices.
>>>
>>> Only drawback is that the paragraphs are rather shorter than in usual prose.
>>
>> The problem, here, is that you have the same "data" represented in two different forms (views/documents). So, there is the implicit concern that the two may be -- or become -- out of sync. Like comments in code not agreeing with what the code is NOW doing (after some revision).
>
> The _text_ of the document is unchanged in the two views, except for some meta-text paragraphs that explain the purpose and structure of the document. The "requirement specification" view only adds the formal machinery (requirement identifiers and requirement index) for traceability with tests. The user manual and the specifications cannot go "out of sync".
Then your "manual" and "spec" read like the same document -- i.e., the same "tone" and manner of presentation. I will explain something differently to two different (types of) audiences. E.g., when I know I'm talking to a mathematician (or, someone who would have a solid grasp of mathematical concepts), I'll employ "equations" and "relations" freely. The same concepts expressed to the /hoi polloi/ would tend to be conveyed in a more verbosely descriptive manner, perhaps accompanied by anecdotes or "real world examples". E.g., the tutorial for my Cubic Bezier implementation includes interactive demos -- so the "reader" can move control points around and *see* the impact on the resulting curves (instead of trying to describe them in terms of their "approaching and departing" tangents, presence of loops/discontinuities, etc. The "mathematician" would see this as a superfluous "toy" (as he already understands how the equations behave). But, the casual user -- or neophyte -- would see it as a means of empirically exploring the problem and learning at a pace that best fits his abilities. The other folks "in the middle" would know the basics and be able to explore some of the boundary conditions that my *application* of the technology exploits. E.g., these tuples all represent what *appears* (after rendering) as a "straight line (segment) from (0,0) to (1,0): {(0,0), (0,0), (0,0), (1,0)} {(0,0), (0,0), (1,0), (1,0)} {(0,0), (1,0), (1,0), (1,0)} {(0,0), (0.5,0), (0.5,0), (1,0)} {(0,0), (1,0), (0,0), (1,0)} !! {(0,0), (1.2,0), (-0.2,0), (1,0)} !!!! etc. But, have different computational costs and transient dynamics. The mathematician say, "Yes, of course!". The programmer says, "Let me check on that..." And the /hoi polloi/ say "But, I don't SEE any difference..." The interative nature of the tutorial lets them *approach* the understanding of the problem/solution.
On 17/05/17 19:28, Tim Wescott wrote:
> Some guy has Verilog code for an 1802 in which he claims a 60MHz clock, one clock per instruction (or perhaps fetch). That would be deterministic, and fast by some measures.
>
> All the modern pipeline/predict/prefetch whiz-bang doesn't prevent hard real time, if only the processor manufacturers would publish the absolute maximum time it takes to execute any possible instruction, or (better) provide tools for finding the maximum time-from-interrupt for any given chunk of code. Then you could just add up all the critical stuff and make sure it works.
>
> In my experience there isn't THAT much variation -- you just need to know how much variation to allow for to meet hard real time criteria.
It complicates things enormously, doubly so when all the caches are involved. ISTR someone measuring a 486 with its tiny caches, and finding the mean:max ISR time was somewhere around 1:5. I expect I've still got a paper copy, /somewhere/.

The XMOS tools claim to indicate the exact loop/function times, assuming input is available and output can be delivered (Occam channel semantics). The event-driven multicore hardware+software co-implementation looks to be rather nice too.

And the I/O is pleasantly high-level: do I/O on a specific clock cycle, wait until there's a change, etc. Makes high speed bit-bashing in software tractable.
On Tue, 16 May 2017 13:49:28 -0500, Tim Wescott
<seemywebsite@myfooter.really> wrote:

>Since this is a newsgroup, and this is news...
>
>Wescott Design Services is going into remission, while I pursue a day job. Job title is Software Designer 5 at Planar Systems -- so any circuit design or control systems jones will have to be satisfied by hobby work or on the side.
>
>In the near term I'll be finishing up current work with current customers; in the longer term I'll probably concentrate on the educational videos and maybe hobby stuff.
>
>Lots of embedded Linux work in my near future, and possibly TDD proselytizing.
Planar Systems is one VERY cool company! I had to look at their web site... They even have displays in the SPAM museum!!

I wonder why they want you to clear your cache and cookies before filling out a job application? Maybe so you have to enter everything from scratch.

boB
