Hi Don, On 20.12.2014 г. 21:48, Don Y wrote:> Hi Dimiter, > > On 12/18/2014 2:31 AM, Dimiter_Popoff wrote: >>> On 12/16/2014 8:56 AM, Dimiter_Popoff wrote: >>>> On 16.12.2014 г. 10:25, upsidedown@downunder.com wrote: >>>>> ... >>>>> These days with compilers running on big platforms, much optimization >>>>> can be done and there is no point in trying to help the compiler. In >>>>> practice the only need for manual assembler is to utilize some special >>>>> target machine instructions that can't be expressed in HLL. >>>> >>>> It is perhaps true compilers can be made that good but I have yet to >>>> see HLL written code which my VPA (which I control how high it >>>> gets while I write, lines with register operations alternate with >>>> actions on objects etc.) written code won't beat by at least a factor >>>> of 10 when it comes to density. Execution speed likely too but this >>>> is harder to compare. >>> >>> While I can't comment on *your* abilities -- or the characteristics >>> of your VPA -- I can only say that *I* have (long ago) been blown away >>> by how *clever* many of the optimizing compilers can be! >> >> Of course they can be clever. My point is that overall humans are >> much cleverer at using a language - check for example how we use >> natural languages and how do the machines cope. It is a matter >> of time until they outsmart us, may be not much time, but for now >> we are incomparably better. >> It is just a matter of how well we choose to learn/use the >> language - and how good a language processor we have in our head >> of course (this varies a lot between individuals). > > The last comment is the kicker -- if you're coding for your eyes only, > you can do things a lot different than if others will have to view, > maintain or enhance your codebase.

The effort it takes to create a new piece of software and that to read and understand what it does are very different.
I am quite sure even someone totally unfamiliar with VPA would find it easier to read and understand what I have written than a poorly commented C source where C might be his native language (and practically all of the C sources I have seen are poorly commented). English has evolved for centuries, it is a good language to express ideas. Has yet to be beaten really. Getting into the subtleties of how to use the tool chain is another task and it takes more or less the same effort anyway, whichever tool. Getting familiar with the programming language itself (i.e. the non-comment part) will be necessary only if someone wants to write some new code in that language; making some changes etc. to something existing does not require getting really good at the language. Then the simpler it is - i.e. the lower the level - the easier it will be to grasp what and how to do.

> ... > The thing that technology is lousy at is "enhancing wetware" -- programmers > don't inherently get "twice as productive" each year or two. They can't > write > twice as much debugged code or comprehend twice as many lines per unit > time. > > So, you want the tools that they use to *express* their ideas to > become more productive.

Yes, which is why I opted for tools under my control. Nothing can match the efficiency you get by that. Nothing comes close really.

> And, to do so in a way that allows *others* > to readily understand what they are trying to say.

Understand - yes, rewrite it - no, why would this be needed. If the code they read is too old or has to be rewritten for some other reason there is no binding to any language, they can write it in whatever they opt for then. I have had to rewrite (or wanted to rewrite) very little of my few tens of megabytes of source I have written over the past 20 years.
But choosing a high level language only because one hopes it will survive the next 2-3 decades so someone would find it easier to make some minor modifications is just silly (and done all the time); why would I restrain myself now and do 1/10th or less of what I can do in my lifetime only trying to save a few days' work for someone a few decades later.

> >>> The machine has the advantage of being able to "instantly" evaluate >>> a variety of different approaches to a *particular* problem -- and >>> settle on the "best" one (where "best" can be defined AT COMPILE TIME) >>> while taking into consideration as much (or as little) of the >>> surrounding >>> "context" that it deems appropriate. >> >> Of course there are such tasks but in my thinking they are what my code >> will have to do, not a job for the compiler. I am the one who creates >> the code, not the compiler. Leaving to it to choose the algorithm would >> simply mean I am not programming, just using the machine. Which I would >> gladly do of course were it good enough to do what I want; so far it >> is not. > > Most people can't come up with the "optimal" way of evaluating an > arbitrary expression -- given the opcodes available in the *particular* > target.

Oh I agree 100% that high level languages are more convenient than machine language for expressions. However expressions take < 1% of the code we write; and there is nothing stopping you from making a call to evaluate an expression from within practically any language. But "most people" obviously would prefer something like Basic or the sort, where one can put together some arithmetic and learn what he needs from the language within a day or so, of course. The advantages of being good at a language begin to show up when you have to use it on a single task/project at least for a few months. This is when the (too) high level only gets in the way. With VPA you control the level at which you write yourself by defining the various levels, calls, objects etc. etc.
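[Editorial aside: the "optimal evaluation" point can be made concrete with a classic example of what optimizing compilers do automatically: strength-reducing a multiply by a constant into shifts and adds, useful on targets where a hardware multiply is slow or absent. A hedged sketch for illustration only, in Python for brevity; `mul10`/`mul100` are hypothetical names, not any particular compiler's output.]

```python
# Strength reduction: multiply-by-constant rewritten as shifts and adds,
# the sort of instruction-selection trick a compiler applies silently.

def mul10(n):
    # n*10 == n*8 + n*2 == (n << 3) + (n << 1)
    return (n << 3) + (n << 1)

def mul100(n):
    # n*100 == (n*10)*10, reusing the shift/add form
    t = mul10(n)
    return (t << 3) + (t << 1)

# Sanity-check the rewrite against ordinary multiplication.
for n in (0, 1, 7, 123, -42):
    assert mul10(n) == n * 10
    assert mul100(n) == n * 100

print(mul10(123), mul100(123))   # -> 1230 12300
```

A human can of course derive such sequences too; the compiler's advantage is doing it instantly, for every constant, on every target.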
Then in my case, having written the entire environment, you could argue I use a much higher level than a HLL of course :-). But the point is I do have any level I want at any line of code I write, which is only achievable if you have the lowest level available - and if you maintain your fitness at being good at using it.

>> .... > > What I want most in sources now is the ability to include better > commentary -- multimedia files, interactive demos, etc. I.e., > things that assist the developer/maintainer, not the "executable"

Well yes, though I am not sure how much value this will add to the plain method of just putting some links/paths as text in the source. Will make things better readable at first glance of course, but if someone will work on these sources the first glance is nothing I would be overly concerned with; what counts is that the information is there to be found.

Dimiter

------------------------------------------------------
Dimiter Popoff, TGI http://www.tgi-sci.com
------------------------------------------------------
http://www.flickr.com/photos/didi_tgi/
Languages, is popularity dominating engineering?
Started by ●December 12, 2014
Reply by ●December 21, 2014
Reply by ●December 21, 2014
>>>> I, in turn, never wanted to be "strapped" with supporting/developing the >>>> same thing over and over and over (when you're seen as "good" at >>>> something, you tend to get STUCK doing it). >>> >>> I find that if you build it right, the support is pretty minimal. >> >> If you have leverage over the folks who "want changes", that can >> be so. But, if (e.g.) Marketing comes in every other week with >> some new idea ("requirement"), all bets are off. > > Never had any trouble with that. You have to frame issues in > terms of risk, cost and capability.I've worked for groups driven by Marketing, Engineering and Manufacturing. Each has its own "perversion" of "Rationality" wrt cost/benefit analysis. If *you* receive the BENEFIT (e.g., increased sales to make your commissions/bonus better) and someone *else* bears the *cost* (or a disproportionate amount of that cost), then you've little to lose by asking for The Moon (unless your engineering staff are not up to the task and deliver a crappy "feature-rich" product). Engineering-driven groups are probably the most "fun" (from a novelty perspective). But, can get caught up in their own cleverness -- doing something to prove it can be *done*, instead of because there is a market/need for it. They also tend to be the least aware of the end consumers' particular needs and capabilities (the pejorative: Designed by an Engineer). Manufacturing-driven firms tend to be the most conservative. They tend to see everything in terms of all the "capital" invested in men and machines: "Why can't we keep doing what we've BEEN doing?" Everyone sees the benefits and the costs from their own perspective. Worst scenario is incurring the costs with no benefit. Best is benefit with no *cost*. Reality lies somewhere between the extremes. Marketing/Sales have a tangible reward in place (commissions, bonuses, etc.) so they want a product that they can sell to *ANY* set of requirements. 
What is the associated benefit to Engineering (and individual engineers) to undertake that (aside from success or failure of the business as a whole)? When was the last time you saw a bonus comparable to the one the Sales Guy picked up... for a product that *you* implemented?? [Of course, individual personalities can bias each of these environs...]>> At one firm, I was charged with coming up with a design for a newer >> version of a product they'd been "nursing" for more than a decade. >> I had to pitch my proposal to damn near everyone: top management, >> ALL of engineering, marketing, etc. (to my knowledge, this had never >> been done there -- before or *since*!) >> >> Almost immediately, the Marketing folks started in with their >> "Oh, you HAVE to have *this* feature!" -- citing something that >> their old device had but that I had elided from the new device's >> specification. They were NOT happy when I replied, "You sold >> exactly ONE system with that capability. I know because prior >> to preparing my proposal, I examined EVERY sales order for the >> past 10+ years!" >> >> The room went quiet until the CEO looked at me and said, "You >> know, I bet I know *who* bought it -- and it's probably sitting >> on a SHELF (not in use)". > > Well, there ya go.But this was the *exception*, not the rule! Typically, a design review just looks for flaws in a proposed *implementation*. Where is the OBJECTIVE analysis of *needs*? I.e., it falls on some individual's shoulders to take on that responsibility (as a consultant, that was *my* shoulders, invariably). [Not complaining. Just pointing out that clients typically hadn't done this sort of analysis *either*. So, it's not just "employers" who are the problem]>>> Nope. But you'd better be able to dive in outside the thing. Or do you >>> ship >>> "DEBUG" projects and call 'em released? >> >> You can't ship DEBUG binaries. All dead code has to be removed prior to >> shipment. 
There are typically *many* aspects of a device that can't be >> examined or tested without the development scaffolding in place. The >> advantage of better tools (languages, debuggers, IDE's, etc) is that it >> allows far more thorough testing/stressing *before* you get to RELEASE. > > The point being that you have a release process.Anyone building any product has a release process. Even "software publishers". You can't get from Engineer's Desk to The Market without *some* mechanism in place. How disciplined and structured and accountable that process is can vary significantly.>>> I've never had a lick of trouble with negotiating what goes into a >>> release. All >>> "manglement" wants is documentary evidence >>> of improvement. If you learn to estimate the cost of not-fixing >>> something, >>> you'll have better luck with this. And it helps to have non-adversarial >>> relationship. >> >> This doesn't matter. See above. You are assuming people are rational. > > They are if you let 'em be rational. This is my point.As above, it boils down to who defines "rational". If *I* win and you lose, is that a rational or irrational choice?>> You don't indulge in feeping creaturism if there's no obvious >> value. OTOH, failing to add a feature that is necessary can cost >> a sale -- or a reputation! > > The point of that is that it is manageable and the way to manage it is > by balancing cost and risk. If a feature simply *HAS* to be there, then it's > gonna need to be there.Again, who defines "*has*"? Marketing tells you they *have to* have this particular feature (my previous story). Do you just take them at their word? Do you *challenge* them (as I did) and piss them off? Setting the stage for "Well, we lost that sale because Don convinced you NOT to include the feature that I *told* you we needed!" -- how do you disprove that allegation? Go *around* the Marketing folks and contact the customer directly? 
Gee, I wonder how *that* is going to be received when Marketing gets wind of it (from the customer!).>>>> And, by the time something is >>>> (sort of) working, the developer is looking for any excuse to "move on" >>>> to something else... >>> >>> Eventually that converges on not being a developer any more, in >>> my experience. Narrow is the path... >> >> Look around at the (older) folks who started off their careers in >> engineering: >> >> Some move into Management (money, unable/unwilling to keep up with >> technological advances, perceived prestige, etc.). >> >> Some move into their own ventures (consultancies, businesses, etc.). >> >> Some keep doing the same thing forever (every place I've worked has >> had at least one "old-timer" who is helplessly out of date with >> current technology -- hopefully, not in a position where he keeps the >> company's feet firmly planted in The Past). > > Most companies have their feet firmly planted in the past - and for > good reason. The "can't keep up" thing is always suspicious; I've > never seen it in thirty years. Generally, new technology means a new > project and those are, frankly, unusual.I've been lucky enough to have worked in several fields/markets doing leading (not bleeding) edge work. I can't speak to the *financial* successes of each of those firms but, from an Engineering perspective, the projects were exciting and "different" from other things happening in the market. But, those tended to be Engineering-driven firms. Where the people making the final decisions knew what *was* feasible or *would be* feasible -- and just had to drag the other folks along for the ride.> If you wish to introduce new tech, you're better off bringing it in > as a fait accompli.Last platter of cookies to get out of here and I can take a breather for half a day... :-/
Reply by ●December 21, 2014
Don Y wrote: [%X]>>> At one firm, I was charged with coming up with a design for a newer >>> version of a product they'd been "nursing" for more than a decade. >>> I had to pitch my proposal to damn near everyone: top management, >>> ALL of engineering, marketing, etc. (to my knowledge, this had never >>> been done there -- before or *since*!) >>> >>> Almost immediately, the Marketing folks started in with their >>> "Oh, you HAVE to have *this* feature!" -- citing something that >>> their old device had but that I had elided from the new device's >>> specification. They were NOT happy when I replied, "You sold >>> exactly ONE system with that capability. I know because prior >>> to preparing my proposal, I examined EVERY sales order for the >>> past 10+ years!" >>> >>> The room went quiet until the CEO looked at me and said, "You >>> know, I bet I know *who* bought it -- and it's probably sitting >>> on a SHELF (not in use)". >> >> Well, there ya go. > > But this was the *exception*, not the rule! Typically, a design review > just looks for flaws in a proposed *implementation*. Where is the > OBJECTIVE analysis of *needs*? I.e., it falls on some individual's > shoulders to take on that responsibility (as a consultant, that was > *my* shoulders, invariably). > > [Not complaining. Just pointing out that clients typically hadn't done > this sort of analysis *either*. So, it's not just "employers" who are > the problem]One of the reasons why I suggest that Requirements Specifications should be the first item in the System Testing regime. Until you have Requirements Specifications that are fully testable and tested you have not got clear and unequivocal Requirements Specifications where each feature has been proven to be needed (and not just a whimsical fancy). -- ******************************************************************** Paul E. 
Bennett IEng MIET.....<email://Paul_E.Bennett@topmail.co.uk> Forth based HIDECS Consultancy.............<http://www.hidecs.co.uk> Mob: +44 (0)7811-639972 Tel: +44 (0)1235-510979 Going Forth Safely ..... EBA. www.electric-boat-association.org.uk.. ********************************************************************
Reply by ●December 22, 2014
Hi Paul, On 12/21/2014 7:12 PM, Paul E Bennett wrote:> Don Y wrote: > >>>> At one firm, I was charged with coming up with a design for a newer >>>> version of a product they'd been "nursing" for more than a decade. >>>> I had to pitch my proposal to damn near everyone: top management, >>>> ALL of engineering, marketing, etc. (to my knowledge, this had never >>>> been done there -- before or *since*!)>> But this was the *exception*, not the rule! Typically, a design review >> just looks for flaws in a proposed *implementation*. Where is the >> OBJECTIVE analysis of *needs*? I.e., it falls on some individual's >> shoulders to take on that responsibility (as a consultant, that was >> *my* shoulders, invariably). >> >> [Not complaining. Just pointing out that clients typically hadn't done >> this sort of analysis *either*. So, it's not just "employers" who are >> the problem] > > One of the reasons why I suggest that Requirements Specifications should be > the first item in the System Testing regime. Until you have Requirements > Specifications that are fully testable and tested you have not got clear and > unequivocal Requirements Specifications where each feature has been proven > to be needed (and not just a whimsical fancy).But it's still a value judgement by "someone". What's "whimsy" and what's "necessary"? E.g., one of my first products was a LORAN-C position plotter. It gave you a printed record of your "trip" (typically, on a boat). One of the "requirements" from Marketing was a provision to "mark" the chart on demand. The rationale being it could be used to note the positions of lobster pots as you toss them overboard -- a long "pushbutton on a cord" so you can be "aft" and heave the pots overboard without having to keep yelling back to someone in the wheelhouse (where the plotter can be sheltered from the elements, fish glop, etc.) 
In its simplest form, this was:

    lift pen
    move relative (-width/2, -height/2)
    drop pen
    move relative ( width,  height )
    lift pen
    move relative ( -width, 0 )
    drop pen
    move relative ( width, -height )
    lift pen
    move relative (-width/2,  height/2)
    drop pen

Owing to the simplistic nature of the implementation, you could actually do this as an ISR, of sorts (even though it took a sizeable fraction of a second -- you just pause the normal motor handling code).

Of course, if the plotter's scale is too high, these fixed size X's will effectively overlap each other. Should they be scaled to reflect the current plotting scale?

A marine research group might want to use the facility to track the progress of a pod of whales, school of dolphins, etc.

How do you (later, examining the hardcopy plot) figure out which order you dropped the pots? (or, anything else you may have used this marking facility for) Should they be labeled with numbers? This suggests there may need to be a limit to the number of such labels (1 digit? 2 digits? 5 digits??) (you could similarly "draw" properly spaced digits by augmenting the "ISR" above, right?)

What happens if the button is pressed more than once in the time it takes to draw a single X?

What if the pen is up against one or more "limits" when this is activated?

[There are lots of other "features" that I could put on a similar list...]

Where do you draw the line? *Who* draws the line -- the Project Manager, Engineering, Marketing, The Engineer writing the code?

What's the cost of the feature?
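[Editorial aside: the relative-move sequence can be simulated to check the one property the "ISR trick" depends on -- the moves sum to zero, so the pen ends exactly where it started and normal plotting resumes with no position error. A hypothetical model for illustration, not the original firmware; Python used for brevity.]

```python
# Simulate the "mark an X" sequence as relative pen moves and verify
# it is self-cancelling (pen returns to its starting position).

class Pen:
    def __init__(self, x=0.0, y=0.0):
        self.x, self.y, self.down = x, y, False
        self.strokes = []            # line segments actually drawn

    def lift(self):
        self.down = False

    def drop(self):
        self.down = True

    def move_rel(self, dx, dy):
        start = (self.x, self.y)
        self.x += dx
        self.y += dy
        if self.down:
            self.strokes.append((start, (self.x, self.y)))

def draw_x(pen, w, h):
    """Draw an X of size w x h centered on the current position."""
    pen.lift(); pen.move_rel(-w/2, -h/2)
    pen.drop(); pen.move_rel( w,    h)     # first diagonal
    pen.lift(); pen.move_rel(-w,    0)
    pen.drop(); pen.move_rel( w,   -h)     # second diagonal
    pen.lift(); pen.move_rel(-w/2,  h/2)
    pen.drop()

pen = Pen(10.0, 20.0)
draw_x(pen, 4.0, 4.0)
print((pen.x, pen.y))       # -> (10.0, 20.0): back where it started
print(len(pen.strokes))     # -> 2: exactly the two diagonal strokes
```

Because the displacements cancel term by term, the interrupted motor-handling code can pick up again with no accumulated error -- which is what made the "pause and draw" hack safe.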
A flag to tell the motor handler to pause and, instead, invoke the "draw X" fixed routine (assuming it is not scaled or labeled in any way), a digital input and appropriate signal conditioning to prevent the button from being an excellent *antenna* effectively coupling the ship-to-shore radio into the CPU, and someplace to poll and debounce the button (often enough to ensure it isn't "missed") -- and, something to enable/disable this behavior (do you want to be able to make X's any time you press the button -- even if the plotter isn't actively plotting position??) [That's the *minimum* cost for a hopefully "free feature"] Keep in mind that we spent considerable time on this project "removing *bytes* (not KB) of ROM" to make things fit in the space available (because each new "free feature" kept nibbling at our meager hardware resources)
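[Editorial aside: the "poll and debounce the button" item above is typically a small counter filter -- accept a press only after N consecutive agreeing samples, which also rejects the RF-coupled glitches Don alludes to. A minimal sketch; the names and N_STABLE value are illustrative, not from the original product. Python used for brevity.]

```python
# Counter-style debounce for a polled pushbutton: report a press only
# after N_STABLE consecutive 'pressed' samples, so contact bounce and
# short glitches never register.

N_STABLE = 4    # consecutive samples required to accept a press

class Debouncer:
    def __init__(self):
        self.count = 0
        self.pressed = False        # debounced state

    def sample(self, raw):
        """Feed one raw sample (1/0); return True on a clean press edge."""
        if raw:
            if self.count < N_STABLE:
                self.count += 1
                if self.count == N_STABLE and not self.pressed:
                    self.pressed = True
                    return True     # debounced press event
        else:
            self.count = 0
            self.pressed = False
        return False

db = Debouncer()
# Contact bounce (1,0,1,0), then a solid press, then release.
samples = [1, 0, 1, 0, 1, 1, 1, 1, 1, 0, 0]
events = sum(db.sample(s) for s in samples)
print(events)   # -> 1  (bounce rejected, exactly one press reported)
```

The poll period times N_STABLE sets the filter window; it must be short enough that a deliberate press is never missed, which is the "often enough" constraint in the post.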
Reply by ●December 22, 2014
Don Y wrote:> Hi Paul, > > On 12/21/2014 7:12 PM, Paul E Bennett wrote: >> Don Y wrote: >> >>>>> At one firm, I was charged with coming up with a design for a newer >>>>> version of a product they'd been "nursing" for more than a decade. >>>>> I had to pitch my proposal to damn near everyone: top management, >>>>> ALL of engineering, marketing, etc. (to my knowledge, this had never >>>>> been done there -- before or *since*!) > >>> But this was the *exception*, not the rule! Typically, a design review >>> just looks for flaws in a proposed *implementation*. Where is the >>> OBJECTIVE analysis of *needs*? I.e., it falls on some individual's >>> shoulders to take on that responsibility (as a consultant, that was >>> *my* shoulders, invariably). >>> >>> [Not complaining. Just pointing out that clients typically hadn't done >>> this sort of analysis *either*. So, it's not just "employers" who are >>> the problem] >> >> One of the reasons why I suggest that Requirements Specifications should >> be the first item in the System Testing regime. Until you have >> Requirements Specifications that are fully testable and tested you have >> not got clear and unequivocal Requirements Specifications where each >> feature has been proven to be needed (and not just a whimsical fancy). > > But it's still a value judgement by "someone". What's "whimsy" and > what's "necessary"? > > E.g., one of my first products was a LORAN-C position plotter. It gave > you a printed record of your "trip" (typically, on a boat). > > One of the "requirements" from Marketing was a provision to "mark" the > chart on demand. The rationale being it could be used to note the > positions of lobster pots as you toss them overboard -- a long "pushbutton > on a cord" so you can be "aft" and heave the pots overboard without > having to keep yelling back to someone in the wheelhouse (where the > plotter can be sheltered from the elements, fish glop, etc.) 
> > In its simplest form, this was: > lift pen > move relative (-width/2, -height/2) > drop pen > move relative ( width, height ) > lift pen > move relative ( -width, 0 ) > drop pen > move relative ( width, -height ) > lift pen > move relative (-width/2, height/2) > drop pen > > Owing to the simplistic nature of the implementation, you could > actually do this as an ISR, of sorts (even though it took a > sizeable fraction of a second -- you just pause the normal > motor handling code) > > Of course, if the plotter's scale is too high, these fixed size > X's will effectively overlap each other. Should they be scaled > to reflect the current plotting scale? > > A marine research group might want to use the facility to > track the progress of a pod of whales, school of dolphins, etc. > > How do you (later, examining the hardcopy plot), figure out > which order you dropped the pots? (or, anything else you may > have used this marking facility for) Should they be labeled > with numbers? This suggests there may need to be a limit to > the number of such labels (1 digit? 2 digits? 5 digits??) > (you could similarly "draw" properly spaced digits by augmenting > the "ISR" above, right?) > > What happens if the button is pressed more than once in the time > it takes to draw a single X? > > What if the pen is up against one or more "limits" when this is > activated? > > [There are lots of other "features" that I could put on a similar > list...] > > Where do you draw the line? *Who* draws the line -- the Project > Manager, Engineering, Marketing, The Engineer writing the code? > > What's the cost of the feature? 
A flag to tell the motor handler > to pause and, instead, invoke the "draw X" fixed routine (assuming > it is not scaled or labeled in any way), a digital input and > appropriate signal conditioning to prevent the button from being > an excellent *antenna* effectively coupling the ship-to-shore > radio into the CPU, and someplace to poll and debounce the button > (often enough to ensure it isn't "missed") -- and, something to > enable/disable this behavior (do you want to be able to make X's > any time you press the button -- even if the plotter isn't actively > plotting position??) > > [That's the *minimum* cost for a hopefully "free feature"] > > Keep in mind that we spent considerable time on this project > "removing *bytes* (not KB) of ROM" to make things fit in the space > available (because each new "free feature" kept nibbling at > our meager hardware resources)Was that a modification of an existing product? Testing the requirements would have involved asking a great many questions that would have resolved all of those issues. It might even have highlighted other possible engineering directions. You covered some questions which I hope you fired back at Marketing and obtained sensible answers. I know you don't deal with the high levels of Mission Criticality that I do, but engineers need to ask all sorts of questions about the requirements they are handed in order to ensure the best outcomes. I sometimes find that a Task Analysis with role-play within a Requirements Review can help to flesh out the whimsical notions from the real requirements. -- ******************************************************************** Paul E. Bennett IEng MIET.....<email://Paul_E.Bennett@topmail.co.uk> Forth based HIDECS Consultancy.............<http://www.hidecs.co.uk> Mob: +44 (0)7811-639972 Tel: +44 (0)1235-510979 Going Forth Safely ..... EBA. www.electric-boat-association.org.uk.. ********************************************************************
Reply by ●December 22, 2014
Hi Paul, On 12/22/2014 7:38 AM, Paul E Bennett wrote:>> [That's the *minimum* cost for a hopefully "free feature"] >> >> Keep in mind that we spent considerable time on this project >> "removing *bytes* (not KB) of ROM" to make things fit in the space >> available (because each new "free feature" kept nibbling at >> our meager hardware resources) > > Was that a modification of an existing product?Worse -- a product *going* into release!> Testing the requirements would have involved asking a great many questions > that would have resolved all of those issues. It might even have highlighted > other possible engineering directions. You covered some questions which I > hope you fired back at Marketing and obtained sensible answers.As "engineers", we were isolated from Marketing. A potential "rationalization" for this sort of thinking was the period: late 70's. Processors were *just* seeing use in commercial/consumer products. Part of the way we (engineers, in general -- not those of us at this firm) pitched the technology was that it was so much more flexible than "hardware" solutions (adding this sort of feature to a hardware-based product would have required adding something equivalent to a flying daughter-card with patches to foils in the "original" board -- or, a new layout/design). I would estimate *weekly* we'd get some change request (never written!) followed by a comment to the effect of: "That should be easy, right? Just change a *bit* somewhere..."> I know you don't deal with the high levels of Mission Criticality that I do, > but engineers need to ask all sorts of questions about the requirements they > are handed in order to ensure the best outcomes.The problem lies with where the power in the relationship lies. What recourse do you have if you're *told* to do something that you know is "wrong" (will add significantly to cost, schedule, unreliability, etc.)? 
The power balance shifts when you move to a consultancy -- you can fall back on the language of your contract to decline a change ("We'll handle that later") *or* simply "no bid" the job.> I sometimes find that a Task Analysis with role-play within a Requirements > Review can help to flesh out the whimsical notions from the real > requirements.I find it difficult to get "customers" (Marketing, clients, etc.) to focus on what they *want*, let alone *why* they want it. They tend to know what they *don't* want -- AFTER they see it! But, are largely incapable of abstract thought: "Imagine this device reified before you; how does it work?" This seemed relatively consistent in my experiences in different application domains/market segments. E.g., medical devices, process control, navigation, consumer goods, etc. Anything that had to interact with people in some manner (people being "variables") caused this "fuzziness" in requirements. I suspect this is one of the reasons why engineers are responsible for so many (clumsy) designs -- "You've got to be an Engineer to use this damn thing!"
Reply by ●December 22, 2014
On Saturday, December 13, 2014 4:43:51 PM UTC-5, upsid...@downunder.com wrote: []> > One should consider the expected lifetime of the software. If the > software expected life time is one or more decades, one must think > about the amount of competent programmers available at the end of that > period. > > Using some exotic languages or something gaining rapidly popularity > recently (and possibly falling off as quickly) would be a risk. Using > some main stream languages (such as C/C++) and there will still be > competent programmers for a few decades.

Some businesses I've seen do not consider that problem. Management can be VERY short sighted. From the Engineering side, I think that, where available, special languages are appropriate. A simple example is SQL.

> > I haven't done COBOL since the Y2K issues, but still encounter Fortran > applications written two decades ago and the users wondering what to > do during the next decade and when to rewrite it,

Well, it depends. Are the requirements and designs well documented? FORTRAN is a functional language and should be fairly easy for a good programmer to learn. I'd rather hire a programmer that still wants to learn new things than one that picked up a set of languages in school and will not look outside that toolbox. Lastly, are the build tools still available (compiler/linker)? Is the OS still available? This actually is a fairly simple Engineering-style decision: weigh the trade-offs given the facts for the specific case.

> thus the existing > code base needs maintenance during the next 0-10 years. If those > applications had been written with some exotic languages or using some > special vendor specific extensions, maintenance becomes harder by each > year.

I think maintenance becomes harder over time for any system. Bug fixes and new features are added until the code collapses if some forward planning is not done. Again it comes to the type of person you hire to maintain the code.
If he/she is willing to learn, it may be easier. Consider a specific application with a version written in C and a version written in an exotic language that is tailored for the problem domain. You have much more code to maintain in C, since it must implement the features of that exotic language and then implement the application. Expressing the application in that exotic language can provide a clearer understanding of the problem being solved. So it only comes down to how hard it is to learn that exotic language. And I see your point, that it may be harder to hire that type of programmer willing to learn, but they are out there. (Actually, this is one of the strengths that the FORTH guys chime about.) There is a hidden management issue here: when management prefers to pay low wages for a merely competent programmer, rather than paying a better wage for a good programmer. There is also the issue of getting past HR. HR likes to filter resumes on simple checklists, and "willing to learn" is seldom one of those items. So yes, there are a lot of hurdles to getting that maintenance programmer. But it is Management that puts those hurdles there, not the language. Ed
Reply by ● December 22, 2014
On 22/12/2014 17:15, Don Y wrote:> Hi Paul, > > On 12/22/2014 7:38 AM, Paul E Bennett wrote: > >>> [That's the *minimum* cost for a hopefully "free feature"] >>> >>> Keep in mind that we spent considerable time on this project >>> "removing *bytes* (not KB) of ROM" to make things fit in the space >>> available (because each new "free feature" kept nibbling at >>> our meager hardware resources) >> >> Was that a modification of an existing product? > > Worse -- a product *going* into release! > >> Testing the requirements would have involved asking a great many >> questions >> that would have resolved all of those issues. It might even have >> highlighted >> other possible engineering directions. You covered some questions which I >> hope you fired back at Marketing and obtained sensible answers. > > As "engineers", we were isolated from Marketing. A potential > "rationalization" for this sort of thinking was the period: late > 70's. Processors were *just* seeing use in commercial/consumer > products. Part of the way we (engineers, in general -- not those > of us at this firm) pitched the technology was that it was so much > more flexible than "hardware" solutions (adding this sort of > feature to a hardware-based product would have required adding > something equivalent to a flying daughter-card with patches to foils > in the "original" board -- or, a new layout/design). > > I would estimate *weekly* we'd get some change request (never written!) > followed by a comment to the effect of: "That should be easy, right? > Just change a *bit* somewhere..." > >> I know you don't deal with the high levels of Mission Criticality that >> I do, >> but engineers need to ask all sorts of questions about the >> requirements they >> are handed in order to ensure the best outcomes. > > The problem lies with where the power in the relationship lies. 
What > recourse do you have if you're *told* to do something that you know is > "wrong" (will add significantly to cost, schedule, unreliability, etc.)? > > The power balance shifts when you move to a consultancy -- you can > fall back on the language of your contract to decline a change > ("We'll handle that later") *or* simply "no bid" the job. > >> I sometimes find that a Task Analysis with role-play within a >> Requirements >> Review can help to flesh out the whimsical notions from the real >> requirements. > > I find it difficult to get "customers" (Marketing, clients, etc.) to focus > on what they *want*, let alone *why* they want it. They tend to know > what they *don't* want -- AFTER they see it! But, are largely incapable > of abstract thought: "Imagine this device reified before you; how does > it work?" > > This seemed relatively consistent in my experiences in different > application > domains/market segments. E.g., medical devices, process control, > navigation, > consumer goods, etc. Anything that had to interact with people in some > manner (people being "variables") caused this "fuzziness" in requirements. > > I suspect this is one of the reasons why engineers are responsible for > so many (clumsy) designs -- "You've got to be an Engineer to use this > damn thing!"

I believe you are right in part.

The first thing engineers get wrong is not to insist on a detailed set of product requirements. The second is for engineers not to cost out how much an addition or a change will cost -- not just in man-hours, but in $ or £, just so marketing and others get to see the true cost of such unplanned and uncoordinated changes.

To be fair, if you're first to market it's not always known what the product features should be.

-- Mike Perkins Video Solutions Ltd www.videosolutions.ltd.uk
Reply by ● December 22, 2014
Hi Dimiter, On 12/21/2014 3:36 PM, Dimiter_Popoff wrote:>>> Of course they can be clever. My point is that overall humans are >>> much cleverer at using a language - check for example how we use >>> natural languages and how do the machines cope. It is a matter >>> of time until they outsmart us, may be not much time, but for now >>> we are incomparably better. >>> It is just a matter of how well we choose to learn/use the >>> language - and how good a language processor we have in our head >>> of course (this varies a lot between individuals). >> >> The last comment is the kicker -- if you're coding for your eyes only, >> you can do things a lot different than if others will have to view, >> maintain or enhance your codebase. > > The effort it takes to create a new piece of software and that to > read and understand what it does are very different.

Sure! Reading *existing* software you have the benefit of knowing "that" it works (or how it doesn't).

> I am quite sure even someone totally unfamiliar with VPA would > find it easier to read and understand what I have written than > a poorly commented C source where C might be his native language > (and practically all of the C sources I have seen are poorly commented).

But how much of your claim is based on *your* level/style of commentary? I've seen "heavily commented" pieces of code where the comments were incorrect (almost WORSE than having none at all) or inadequate (e.g., "add one", "divide by time", etc.). And, cases where there were exactly *zero* comments! OTOH, I've seen pieces of code that are delightfully well documented. When I wrote my 9-track tape driver, I prefaced the first line of code with several *pages* of commentary. Largely to explain the (archaic?) terminology applicable to such subsystems as well as the capabilities present in each of the components (controller, formatter, transport, etc.).
Otherwise, someone reading the code might not understand why I was filling a buffer "backwards" (read reverse) or rewinding one transport *while* writing to another, etc. The language doesn't dictate the quality of the commentary (unless it has no provisions for inserting comments amongst code!) but, rather, the individual creating it.

> English has evolved for centuries, it is a good language to express > ideas. Has yet to be beaten really.

It also gives rise to lots of ambiguity! E.g., Les's comment, up-thread:

- Only use "for" loops for integer-index loops.

I can read that as either:

- Don't use anything other than "for" loops for integer-index loops
- Don't use "for" loops for anything other than integer-index loops

If a newscaster claims "the suspect was shot to death" (a common phrase on the News), how many shots were fired? Did someone stand over him and keep shooting UNTIL DEAD? Or, was a *single* shot fired that killed him ("shot dead")? Perhaps the only ones who *try* to be precise with language are lawyers -- because meaning is the essence of their work (and we still see them argue about details of contracts in courts, etc.)

> Getting into the subtleties of how to use the tool chain is another > task and it takes more or less the same effort anyway, whichever tool. > Getting familiar with the programming language itself (i.e. the > non-comment part) will be necessary only if someone wants to write > some new code in that language; making some changes etc. to something > existing does not require getting really good at the language. Then the > simpler it is - i.e. the lower the level - the easier it will be to > grasp what and how to do.

I don't agree. Often there are subtleties in a language that have a pronounced effect on how code works. E.g., "call by value" vs. "call by reference" semantics need not be explicitly differentiable in the language's syntax. You'd have to *know* how particular parameters are passed.
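That trap isn't unique to any one language. A small Python sketch (mine, not from the thread) shows the same thing: nothing at the call site tells you that the string argument behaves like a value while the list argument behaves like a reference:

```python
def exclaim(s):
    # Strings are immutable: '+=' rebinds the *local* name only,
    # so the caller's string is untouched.
    s += "!"
    return s

def append_bang(items):
    # Lists are mutable and shared: the caller sees this change.
    items.append("!")

msg = "hello"
exclaim(msg)
print(msg)      # still 'hello' -- value-like behavior

words = ["hello"]
append_bang(words)
print(words)    # ['hello', '!'] -- reference-like behavior
```

Both calls look identical where they are made; you simply have to *know* the semantics of each type.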
E.g., in Limbo, strings are passed by value -- changing the string in a function has no effect on the original string! OTOH, *lists* and arrays (! i.e., a string is not a char array) are passed by reference. Hidden constructors and anonymous, temporary objects.

Many of the legacy text-to-phoneme algorithms express patterns using a wildcard style syntax: "one or more vowels", "a front vowel", "a voiced consonant", etc. One of the earliest "well known" rulesets (NRL) was written in SNOBOL. So, "one or more vowels" is actually implemented using a *lazy* matching algorithm ("ARBNO"). I.e., if we use '@' to represent one or more vowels, then:

  @    matches A, AA, AEO, OIUA, etc.
  @bC  matches AbC, AAbC, AEObC, OIUAbC, etc.
  @Ab  matches AAb

Most folks implementing this in *C* use GREEDY matches -- and never "back out" on failure. So, that last example fails -- the second 'A' (in AAb) gets sucked into the '@' causing the literal 'A' in the template (@Ab) to not agree with the input string. I.e., people conversant in one language READ behaviors INTO new languages that aren't necessarily there!

>> ... >> The thing that technology is lousy at is "enhancing wetware" -- programmers >> don't inherently get "twice as productive" each year or two. They can't >> write >> twice as much debugged code or comprehend twice as many lines per unit >> time. >> >> So, you want the tools that they use to *express* their ideas to >> become more productive. > > Yes, which is why I opted for tools under my control. Nothing can match > the efficiency you get by that. Nothing comes close really.

I suspect most people (employers) don't want to take on the burden and cost of having to develop and maintain their own toolchains. Firms that invest in their own tools have to implicitly assume the need to train all new hires in the use of those tools (in addition to their existing product software base -- libraries, etc.).
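Returning to the '@' pattern example a few paragraphs up: the greedy-without-backtracking failure is easy to demonstrate. This is my own Python sketch; the function names and structure are assumptions, not the NRL ruleset's or any C port's actual code:

```python
VOWELS = set("AEIOU")

def greedy_match(pattern, text):
    """'@' grabs as many vowels as it can and never backs out."""
    i = 0
    for ch in pattern:
        if ch == '@':
            j = i
            while j < len(text) and text[j] in VOWELS:
                j += 1
            if j == i:          # '@' requires at least one vowel
                return False
            i = j
        elif i < len(text) and text[i] == ch:
            i += 1
        else:
            return False
    return i == len(text)

def backtrack_match(pattern, text):
    """'@' tries successively longer vowel runs until the rest fits."""
    if not pattern:
        return not text
    if pattern[0] == '@':
        for n in range(1, len(text) + 1):
            if text[n - 1] not in VOWELS:
                break
            if backtrack_match(pattern[1:], text[n:]):
                return True
        return False
    return bool(text) and text[0] == pattern[0] \
        and backtrack_match(pattern[1:], text[1:])

print(greedy_match("@bC", "AEObC"))    # True  -- no literal vowel follows '@'
print(greedy_match("@Ab", "AAb"))      # False -- '@' swallowed both A's
print(backtrack_match("@Ab", "AAb"))   # True  -- gives one 'A' back
```

The greedy version only fails when a literal vowel follows the wildcard, which is exactly why the bug survives casual testing.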
E.g., any time I create an ASL, I do so *only* as an expedient to "lots of (typo-prone) typing". It allows the essence of what I am saying to precipitate out of the text without all the syntactic cluttering that the native language imposes. (e.g., my state machine example, elsewhere)

>> And, to do so in a way that allows *others* >> to readily understand what they are trying to say. > > Understand - yes, rewrite it - no, why would this be needed. If the code > they read is too old or has to be rewritten for some other reason there > is no binding to any language, they can write it in whatever they opt > for then. > I have had to rewrite (or wanted to rewrite) very little of my few tens > of megabytes of source I have written over the past 20 years.

But *you* are the sole developer and have control over the product in its entirety. What happens when you opt to bring someone on-board to assist? Or, when you are no longer capable (interested?) in maintaining existing/new products? Are your customers left with a dead-end product (because your codebase has limited value to someone wanting to expand upon it -- other than a competitor who simply wants to *kill* it off)?

> But choosing a high level language only because one hopes it will > survive the next 2-3 decades so someone would find it easier to > make some minor modifications is just silly (and done all the time), > why would I restrain myself now and do 1/10th or less of what I can > do in my lifetime only trying to save a few days work for someone > a few decades later.

Because other people (organizations) have other interests above and beyond those of an individual! E.g., if your work was *for* some other business (i.e., they *own* your output), they would want to bind your services to them "indefinitely" -- and, if wise, take steps to ensure they had a "hot backup" for you (and a way to bind that individual's services as well!).
Or, they can opt for something more universally used and avail themselves of more potential candidates -- losing *you* would be an inconvenience, but not a death knell for their product or their organization! Most of what I'm currently doing will be released as open source. As such, *I* won't be the one looking at it, later. The less "main stream" the tools I choose, the higher the bar for others to embrace my efforts and build on them. Since I am not "expert" in many of the technologies that I am using, others need to be able to *easily* step in and replace entire subsystems as the appropriate technology advances, is refined, etc. They can benefit from the structure I have imposed on things and focus on a particular aspect instead of having to reinvent the wheel, cart and *horse*!>>>> The machine has the advantage of being able to "instantly" evaluate >>>> a variety of different approaches to a *particular* problem -- and >>>> settle on the "best" one (where "best" can be defined AT COMPILE TIME) >>>> while taking into consideration as much (or as little) of the >>>> surrounding >>>> "context" that it deems appropriate. >>> >>> Of course there are such tasks but in my thinking they are what my code >>> will have to do, not a job for the compiler. I am the one who creates >>> the code, not the compiler. Leaving to it to choose the algorithm would >>> simply mean I am not programming, just using the machine. Which I would >>> gladly do of course were it good enough to do what I want; so far it >>> is not. >> >> Most people can't come up with the "optimal" way of evaluating an >> arbitrary expression -- given the opcodes available in the *particular* >> target. > > Oh I agree 100% that high level languages are more convenient than > machine language for expressions. 
However expressions take < 1% of > the code we write; and there is nothing stopping you from making a call > to evaluate an expression from within practically any language.

There's nothing to stop me from writing my code in HEX, either! We add abstraction to improve productivity, readability, reduce ambiguity, etc. There is *one* way to parse:

  a + b * c % d / e - f

yet we parenthesize to enforce a *particular* way of parsing it. I could express all of these operations as function calls -- but that would be even less clear to a reader; undoubtedly, he would read through the code and "rewrite" the operations being performed in this sort of notation. Because he is more familiar with it!

My gesture recognizer uses fixed binary point (Q) arithmetic. But, as it is written in C (and not C++), I can't overload arithmetic operators to make what I am doing more obvious. The code is littered with calls to "add()", "sub()", "mul()", etc. Even constants are "obfuscated" because they have to be converted into the corresponding values in that representation. As a result, much of my commentary is a rewrite of the code using more conventional notation! This is ripe for error if small changes are made and not reflected in every applicable comment.

> But "most people" obviously would prefer something like Basic or sort > of where one can put together some arithmetic and learn what he needs > from the language within a day or so, of course. > > The advantages of being good at a language begin to show up when > you have to use it on a single task/project at least for a few months. > This is when the (too) high level only gets in the way. With VPA you > control the level at which you write yourself by defining the various > levels, calls, objects etc. etc. Then in my case, having written the > entire environment, you could argue I use a much higher level than a > HLL of course :-).
But the point is I do have any level I want at any > line of code I write, which is only achievable if you have the lowest > level available - and if you maintain your fitness at being good at > using it. > >> What I want most in sources now is the ability to include better >> commentary -- multimedia files, interactive demos, etc. I.e., >> things that assist the developer/maintainer, not the "executable" > > Well yes, though I am not sure how much value this will add to the > plain method of just putting some links/paths as text in the source.

How do you express those links? URNs? What if the resource isn't accessible at the time? Would you be willing to move your commentary into another document? The beauty of having everything integrated into a single document is that it's *with* the sources. Just as you scroll up to re-read a description of the block of code you are examining, you could look up and see an illustration of the data structure that the code is manipulating. Or, examine a graph of the role of a particular parameter in a particular algorithm.

> Will make things better readable at first glance of course but if > someone will work on these sources the first glance is nothing I > would be overly concerned with, what counts is that the information > is there to be found.

People lose manuals for items they purchase, all the time. I've had clients approach me because they'd misplaced the *sources* to the products on which their livelihood depended! (I turned down another such request just a few months ago -- I have no desire to reverse engineer yet another project! There's very little to LEARN with that sort of task :< ) As I *can't* embed everything pertinent to the code *in* the code, I've had to take extra measures to ensure it is in a form that is reasonably portable and *appears* "worth preserving" (e.g., scribbles on a napkin don't qualify!
:> ) Unfortunately, the wide range of media formats that could potentially contain information worth preserving makes damn near every "container" impractical. So far, the best compromise is PDF's as containers (hoping they continue to evolve to support even more objects!) with the code as "attachments". As you're closer to the holiday (geographically), best wishes to you and L! Keep warm (together?? ;-)
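Earlier in this message Don describes C code "littered with calls to add(), sub(), mul()" for fixed binary point (Q) arithmetic. A sketch of those mechanics follows; the Q16.16 format and these helper definitions are my illustration, not his recognizer's actual code:

```python
Q = 16                           # Q16.16: 16 integer bits, 16 fraction bits

def to_q(x):                     # "obfuscated" constants: scale up
    return int(round(x * (1 << Q)))

def from_q(q):                   # back to a readable value
    return q / (1 << Q)

def add(a, b):                   # same scale on both sides: plain add
    return a + b

def sub(a, b):
    return a - b

def mul(a, b):                   # product carries 2*Q fraction bits: rescale
    return (a * b) >> Q

# The readable expression  r = 1.5 * x + 0.25  becomes the "littered" form:
x = to_q(2.0)
r = add(mul(to_q(1.5), x), to_q(0.25))
print(from_q(r))                 # 3.25
```

Note the comment-versus-code hazard he points out: the `r = 1.5 * x + 0.25` comment is exactly the kind of parallel notation that silently drifts out of date if the code changes.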
Reply by ● December 22, 2014
Hi Mike, On 12/22/2014 11:14 AM, Mike Perkins wrote:> On 22/12/2014 17:15, Don Y wrote: >> On 12/22/2014 7:38 AM, Paul E Bennett wrote:>> I find it difficult to get "customers" (Marketing, clients, etc.) to focus >> on what they *want*, let alone *why* they want it. They tend to know >> what they *don't* want -- AFTER they see it! But, are largely incapable >> of abstract thought: "Imagine this device reified before you; how does >> it work?">> I suspect this is one of the reasons why engineers are responsible for >> so many (clumsy) designs -- "You've got to be an Engineer to use this >> damn thing!" > > I believe you are right in part. > > The first thing engineers get wrong is not to insist on a detailed set of product > requirements.

"Insist?" How do you do that -- stomp your feet and threaten to hold your breath until you turn blue? :> Who defines "detailed"? I've seen LENGTHY requirements documents that didn't "define" anything! If you start questioning things, you push those folks WHO DON'T KNOW WHAT THEY WANT into admitting that. Or, worse, fearing that they LOOK INCOMPETENT!

> The second is for engineers not to cost out how much an addition or a change > will cost. Not just in man-hours, but in $ or £, just so marketing and others > get to see the true cost of such unplanned and uncoordinated changes.

IME, that doesn't work either. *You* don't have the decision-making authority. And, those that do will make arbitrary decisions, *appearing* to acknowledge your data -- then complaining later when those costs actually *do* materialize! I prepared a detailed estimate for an employer some years ago. From that, prepared a detailed timeline (week by week). I submitted it at the start of the project. My employer cut it IN HALF when pricing the project. Then, many months later, started hounding me due to my lack of progress. I retrieved my initial schedule from my desk drawer and showed him how I was *exactly* on target: "This is week X, you can see I am working on FOO...
as indicated in the schedule!" (If you don't trust my abilities, then why did you hire me?)

> To be fair, if you're first to market it's not always known what the product > features should be.

You might not be able to "know", with certainty, but you can "pretend" the device exists and *imagine* using it. Not just a "cursory" examination but a full-fledged "let's make a PROJECT out of imagining this product exists".

E.g., there was an early "pocket organizer" that used a non-qwerty, non-Dvorak keyboard layout: instead, the keys were arranged in alphabetical order. Even a tiny amount of "play acting" would have shown that this was a bad choice. Anyone used to a "real" keyboard would be frustrated by it. And, folks who had NEVER experienced a real keyboard would be no better off searching for a particular letter (because it wasn't a single linear arrangement of keys -- 'J' might be right *below* 'A' instead of nine keys to the right).

The LORAN plotter I mentioned (here?) used a membrane keypad (new at that time). *But*, the keypad was INCREDIBLY stiff! I commented about this as it was noticeably difficult for me to hammer away at the buttons as I tested the device. My boss's reply: "*You* aren't our intended user. Rather, we're dealing with fishermen with fish guts on their hands and hammers for fists... we're more worried about the structural strength of the case holding up to this sort of pounding!"

I recall being shown the prototype of an early "electronic tape rule". A small LCD display in the top of the case to indicate the current measurement. And, a little button that *flipped* the digits upside down (think left-handed vs. right-handed use). I.e., someone decided that an electronic version of this tool that had been in common use for DECADES *needed* the ability to read the scale regardless of orientation despite the fact that most (*ALL* that I've seen or owned!) can only be read in *one* orientation!