EmbeddedRelated.com
Forums

pulse counter using LPC1768 proving to be very challenging

Started by navman June 7, 2011
David Brown wrote:
> I fully agree here. Software is not good at doing things with a few
> microsecond timing - any processor fast enough to have plenty of
> instruction cycles in that time will have unpredictable latencies due to
> caches, pipelines, buffers, etc. But this should be fairly simple code
> - with enough care, it could be done.
I have meaningful ISRs taking 400-600ns on a TMS320F2812 running at 150MHz. The entire executor engine of a dynamically reconfigurable waveform generation state sequencer completes in under 4us, with that time depending strongly on the number of possible transitions allowed per state, currently limited to 4. The new TI Delfino at 300MHz could halve these times.

The reason these processors can do this is that they have SRAM running at full processor speed (albeit not particularly generous amounts, although the Delfino again improves this greatly) and a non-cached architecture.

What is critically important is interrupt latency and interrupt latency jitter. In my application, I use assembly language "pre-ISRs" above the non-critical ISRs which simply juggle a few interrupt enables, then very quickly re-enable interrupts.

The main cause of interrupt latency jitter is that the C preamble code for each ISR takes quite a long time to complete. So if the main() code gets interrupted, the interrupt latency is about 100ns. But if another low-priority interrupt is running when the highest priority real-time interrupt is triggered, then if I just leave it to the C compiler I have to wait for it to save full context, and run a bunch of silly little extra instructions, before having a chance to re-enable interrupts and let the high-priority one take over. That might extend the effective latency to 300ns. Hence, latency jitter.

By writing a pre-ISR in .asm, I can get interrupts re-enabled, restricted to just the more important ones, in just 3-4 instructions. Then the high-priority ISR can preempt the preamble code of the low-priority ISRs, greatly reducing latency jitter. I think the last time I measured, I could guarantee less than 200ns latency on the F2812.

Once past getting interrupts re-enabled, the C compiler is good enough at writing the actual meat-and-potatoes ISR code. I can't wait to get my hands on the Delfino.
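[Editor's note: the pre-ISR pattern described above can be sketched in host-testable C. This is a hypothetical model, not actual TMS320F2812 code: `ier`, `global_int_enabled`, and `HIGH_PRIO_MASK` are invented stand-ins for the C28x IER register and INTM bit, and on the real part the first three steps would be the few hand-written assembly instructions mentioned in the post.]

```c
#include <assert.h>
#include <stdint.h>

/* Invented register model standing in for the C28x IER/INTM hardware. */
static volatile uint16_t ier;            /* interrupt enable register   */
static volatile int global_int_enabled;  /* stands in for the INTM bit  */
static int low_prio_body_ran;

#define HIGH_PRIO_MASK 0x0001u  /* keep only the critical source enabled */

/* The compiler-generated "meat and potatoes" ISR body. */
static void low_prio_body(void)
{
    low_prio_body_ran = 1;
}

/* Pre-ISR: the few instructions that run before the long C preamble.
 * 1. save the current enable mask,
 * 2. restrict enables to the critical interrupt only,
 * 3. re-enable interrupts globally, opening the preemption window. */
static void low_prio_pre_isr(void)
{
    uint16_t saved_ier = ier;
    ier = HIGH_PRIO_MASK;
    global_int_enabled = 1;   /* high-priority ISR may now preempt us */

    low_prio_body();          /* slow C ISR runs fully preemptible    */

    global_int_enabled = 0;
    ier = saved_ier;          /* restore the enable mask on the way out */
}
```

The point of the sketch is only the ordering: the enable mask is narrowed and interrupts are re-opened *before* the expensive body runs, so the worst-case latency seen by the critical interrupt is a handful of instructions rather than a full context save.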
Being able to do serious work in a few 100s of ns is very much fun.

--
_____________________
Mr.CRC
crobcBOGUS@REMOVETHISsbcglobal.net
SuSE 10.3 Linux 2.6.22.17
On Fri, 17 Jun 2011 07:27:11 -0700, "Mr.CRC"
<crobcBOGUS@REMOVETHISsbcglobal.net> wrote:

><snip>
>What is critically important is interrupt latency and interrupt latency
>jitter.
><snip>
I enjoyed reading the detailed overview. And it makes the point, again, among many other ways it can also be made.

As a side bar to just you, there are a few processor families where interrupt latency __jitter__ for internally generated interrupts (timers) is zero. Such interrupts are always synchronous with the clocking system (of course) and all instructions have the exact same execution time (1 cycle) and so interrupts are always entered on the same relative phase to the timer event. If you don't disable the system itself, of course.

I've used this for an operating system with a jitter-free guarantee on starting sleeping processes using delta queues (where only one such process is allowed to sleep on the same timed event.)

Anyway, interesting overview. Enjoyed thinking a little about it. Makes me want to buckle down and design, build, and code up a personal arb-function gen for my desk. May do that.

Jon
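[Editor's note: the delta queue mentioned above can be sketched in a few lines of C. Each sleeper stores its wake time *relative* to its predecessor, so the timer interrupt only ever decrements the head node. This is a minimal illustration, not Jon's actual code; all names are invented, and the one-sleeper-per-event rule means ties never occur at the head.]

```c
#include <assert.h>
#include <stddef.h>
#include <stdlib.h>

/* Node in a delta queue: `delta` is ticks remaining AFTER the
 * predecessor expires, not an absolute time. */
struct dq_node {
    int delta;
    int id;
    struct dq_node *next;
};

static struct dq_node *dq_head = NULL;

/* Sleep process `id` for `ticks` total ticks from now.
 * Walk the list subtracting predecessors' deltas, then splice in. */
static void dq_insert(int id, int ticks)
{
    struct dq_node **p = &dq_head;
    while (*p && (*p)->delta <= ticks) {
        ticks -= (*p)->delta;
        p = &(*p)->next;
    }
    struct dq_node *n = malloc(sizeof *n);
    n->delta = ticks;
    n->id = id;
    n->next = *p;
    if (*p)
        (*p)->delta -= ticks;   /* successor's delta stays relative */
    *p = n;
}

/* Timer tick: constant-time in the common case -- decrement the head
 * only. Returns the id of the process to wake, or -1 for none. */
static int dq_tick(void)
{
    if (dq_head && --dq_head->delta <= 0) {
        struct dq_node *n = dq_head;
        int id = n->id;
        dq_head = n->next;
        free(n);
        return id;
    }
    return -1;
}
```

Because the tick handler touches only the head node, the work done in the timer ISR is the same on every tick, which is what makes the jitter-free wakeup guarantee possible.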
Jon Kirwan wrote:
> On Fri, 17 Jun 2011 07:27:11 -0700, "Mr.CRC"
> <crobcBOGUS@REMOVETHISsbcglobal.net> wrote:
>
>> <snip>
>> What is critically important is interrupt latency and interrupt latency
>> jitter.
>> <snip>
>
> I enjoyed reading the detailed overview. And it makes the
> point, again, among many other ways it can also be made.
Thanks Jon. I've mostly lurked here for over 12 years, and usually listen to your writings with great eagerness to learn something and am rarely disappointed.
> As a side bar to just you, there are a few processor families
> where interrupt latency __jitter__ for internally generated
> interrupts (timers) is zero. Such interrupts are always
> synchronous with the clocking system (of course) and all
> instructions have the exact same execution time (1 cycle) and
> so interrupts are always entered on the same relative phase
> to the timer event. If you don't disable the system itself,
> of course.
Is that ARM families that can basically switch context in hardware, or some other device?
> I've used this for an operating system with a
> jitter-free guarantee on starting sleeping processes using
> delta queues (where only one such process is allowed to sleep
> on the same timed event.)
<scratches head, wonders what a "delta queue" is>

Hmm, looking at a few search results I sort of get it.
> Anyway, interesting overview. Enjoyed thinking a little
> about it. Makes me want to buckle down and design, build,
> and code up a personal arb-function gen for my desk. May do
> that.
I've gotten two of the recently produced Agilent 33522A at work. In the past I was using a Tek AFG3022. Unfortunately, the new Agilent is seriously bug ridden. They fixed it somewhat with a recent update, but still there are just embarrassing bugs. I would resign and go become a monk if I put out something like that. I was close to buying one for home, but I can't afford to put out my allowance funds for something screwy.

The nice thing about the 33522A, if they ever get it to work, is that it lets you choose an arbitrary sampling rate for arbs. Older generation ones like the Tek in this price range had fixed sampling rates, which seriously hampered usefulness if you wanted precise or low frequency arbs. The 33522A also has the ability to modulate many things with noise, and coolest of all is that the noise bandwidth is settable!

What I'd be tempted to do if I had the time or was retired (having a little one running amok is slowing me down to where a simple Nixie clock project takes 2 years just to make a PCB) is an audio range arb with very high vertical precision output and very low distortion. That may be doable in a DSP as well, rather than needing an FPGA, which would be my first consideration to implement anything serious. It would be cool to extend it into a full blown audio development instrument with analysis capabilities as well.

Weren't you one of the first people to code up a DDS on an AVR years ago?

Anyway, someone did that, and it inspired me to do something a little different: instead of trying the tightest inner loop for highest sampling rate, I went for low frequencies, but with an 8-digit LED display, with 5 frequency digits, and 3 phase shift digits for 1 degree phase setting. I think it was dual channel too. My interest was in making psychedelic doodles with a laser and closed-loop scanner mirrors, laser show style. I got that to a working stage then threw it in a drawer. Then I did something more sophisticated with the TMS320F2812, but with no UI.
Then that went in the drawer. Ultimately I'll use the F2812 to make a full blown laser show. I really prefer to just goof off with eye candy toys. Only at work do I do serious stuff.

--
_____________________
Mr.CRC
crobcBOGUS@REMOVETHISsbcglobal.net
SuSE 10.3 Linux 2.6.22.17
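[Editor's note: for readers unfamiliar with DDS, the phase-accumulator core behind the generators discussed above can be sketched in a few lines of C. This is a generic illustration, not the AVR or F2812 code in question: the struct, the 32-bit accumulator width, and the 256-entry table size are assumptions. A real generator would index a sine table with the top accumulator bits and send the sample to a DAC.]

```c
#include <assert.h>
#include <stdint.h>

/* Two-channel DDS core: a 32-bit phase accumulator per sample tick,
 * with channel 2 offset from channel 1 by a programmable phase. */
#define TABLE_BITS 8u   /* assume a 256-entry sine lookup table */

struct dds {
    uint32_t acc;    /* phase accumulator */
    uint32_t step;   /* tuning word: f_out = step * f_sample / 2^32 */
};

/* Tuning word for a desired output frequency. */
static uint32_t tuning_word(double f_out, double f_sample)
{
    return (uint32_t)(f_out / f_sample * 4294967296.0);
}

/* Phase offset in degrees, as an accumulator offset. */
static uint32_t phase_offset(double degrees)
{
    return (uint32_t)(degrees / 360.0 * 4294967296.0);
}

/* One sample tick: advance the accumulator and return the sine-table
 * index for each channel (channel 2 leads by `offset`). */
static void dds_tick(struct dds *d, uint32_t offset,
                     unsigned *idx1, unsigned *idx2)
{
    d->acc += d->step;
    *idx1 = d->acc >> (32u - TABLE_BITS);
    *idx2 = (d->acc + offset) >> (32u - TABLE_BITS);
}
```

The accumulator wraps modulo 2^32, so frequency resolution is f_sample/2^32 regardless of table size, which is why the approach suits the precise low-frequency settings described in the post.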
On Fri, 17 Jun 2011 15:14:44 -0700, "Mr.CRC"
<crobcBOGUS@REMOVETHISsbcglobal.net> wrote:

>Jon Kirwan wrote:
>> On Fri, 17 Jun 2011 07:27:11 -0700, "Mr.CRC"
>> <crobcBOGUS@REMOVETHISsbcglobal.net> wrote:
>>
>>> <snip>
>>> What is critically important is interrupt latency and interrupt latency
>>> jitter.
>>> <snip>
>>
>> I enjoyed reading the detailed overview. And it makes the
>> point, again, among many other ways it can also be made.
>
>Thanks Jon. I've mostly lurked here for over 12 years, and usually
>listen to your writings with great eagerness to learn something and am
>rarely disappointed.
Thanks. I don't usually have a lot to say, anymore, though.

With the move towards large memory systems and 32-bit CPUs with FP and memory mgmt systems capable of running Linux on a chip, "embedded" has blurred to the point where you can't tell the difference between a Microsoft MSDN developer, a Linux guru, and an embedded micro coder, anymore.

The Windows CE coder seems to imagine they are doing embedded work. So does the Linux coder. .NET can run embedded, in fact, though anyone familiar with its details must realize just how abstracted that environment is. Technically, yes, I suppose it's true that porting code from a workstation to run on an "embedded device" using .NET, for example, might still meet some people's definitions. A lot of the discussions here seem to be at that level now. Although I do .NET coding and have my paid-up annual MSDN subscription, it's dull stuff to me.

I think of embedded as being about the skills required by us and where they __differ__ from hosted environment development skill sets. When a job requires familiarity with more than just one language, and familiarity with how compilers compile code, with assembly, with linker semantics and how to modify their control files for some desired result, as well as broad acquaintance with physics, numerical methods, signal processing, optics, and certainly electronic design, then we find more of these differences. When it requires less of these differences from workstation development skills, it is less about the "embedded" I know and love.

Times are changing, and the relative mix of skills found amongst the embedded programmer body is shifting with the capabilities being offered in today's tools. Entire operating systems are ported over by people I do consider to be well heeled embedded programmers, but then just used lock, stock, and barrel by those who know little about what took place, don't care to know, and just use the usual workstation tools without knowing much difference at all.
That's a different thing to me. So I write less, today. I haven't changed, but the audience has.
>> As a side bar to just you, there are a few processor families
>> where interrupt latency __jitter__ for internally generated
>> interrupts (timers) is zero. Such interrupts are always
>> synchronous with the clocking system (of course) and all
>> instructions have the exact same execution time (1 cycle) and
>> so interrupts are always entered on the same relative phase
>> to the timer event. If you don't disable the system itself,
>> of course.
>
>Is that ARM families that can basically switch context in hardware, or
>some other device?
Some other. For one example, the now "mature" or "older" ADSP-21xx Family from Analog Devices is the example I had coded that delta queue for.
>> I've used this for an operating system with a
>> jitter-free guarantee on starting sleeping processes using
>> delta queues (where only one such process is allowed to sleep
>> on the same timed event.)
>
><scratches head, wonders what a "delta queue" is>
>
>Hmm, looking at a few search results I sort of get it.
It's a very simple, easy to operate, precision tool. I first read about the idea from Douglas Comer's first book on XINU.
>> Anyway, interesting overview. Enjoyed thinking a little
>> about it. Makes me want to buckle down and design, build,
>> and code up a personal arb-function gen for my desk. May do
>> that.
>
>The recently produced Agilent 33522A I've gotten two of at work. In the
>past I was using Tek AFG3022. Unfortunately, the new Agilent is
>seriously bug ridden. They fixed it somewhat with a recent update, but
>still there are just embarrassing bugs. I would resign and go become a
>monk if I put out something like that. I was close to buying one for
>home, but I can't afford to put out my allowance funds for something
>screwey.
>
>The nice thing about the 33522A if they ever get it to work is that it
>lets you choose an arbitrary sampling rate for arbs. Older generation
>ones like the Tek in this price range had fixed sampling rates, which
>seriously hampered usefulness if you wanted a precise or low frequency
>arbs. 33522A also has the ability to modulate many things with noise,
>and coolest of all is that the noise bandwidth is settable!
>
>What I'd be tempted to do if I had the time or was retired (having a
>little one running amok is slowing me down to where a simple Nixie clock
>project takes 2 years just to make a PCB) is an audio range arb. with
>very high vertical precision output and very low distortion. That may
>be doable in a DSP as well, rather than needing an FPGA which would be
>my first consideration to implement anything serious. It would be cool
>to extend it into a full blown audio development instrument with
>analysis capabilities as well.
>
>Weren't you one of the first people do code up a DDS on an AVR years ago?
>
>Anyway, someone did that, and it inspired me to do something a little
>different, instead of trying the tightest inner loop for highest
>sampling rate, I went for low frequencies, but with an 8-digit LED
>display, with 5 frequency digits, and 3 phase shift digits for 1 degree
>phase setting.
>
>I think it was a dual channel too. My interest was in making
>psychadelic doodles with a laser and closed-loop scanner mirrors, laser
>show style.
>
>I got that to a working stage then threw it in a drawer. Then I did
>something more sophisticated with the TMS320F2812, but with no UI.
>
>Then that went in the drawer. Ultimately I'll use the F2812 to make a
>full blown laser show. I really prefer to just goof off with eye candy
>toys. Only at work do I do serious stuff.
I think I'd focus on an audio range device, as well. But I'm pretty sure I'd just make it a toy and not something professional. There is so much more "work" involved in making something ready for others to use, and although I find some of that enjoyable, I don't find all of it to be so. And I'd be looking more for my own hobbyist pleasure, self-test, and education than anything else.

I looked over some of what you write elsewhere and I wish I had your experiences with lasers, too. Lots of potential fun there, both for me and for students I like to teach at times.

By the way, I've also got a lot of "stuff in drawers." And I definitely get it about just goofing off with toys. Work is work, but on my own I don't want the burden of having to do all the extra stuff needed to productize. I'd rather play.

Jon
Hi Jon,

On 6/18/2011 11:50 AM, Jon Kirwan wrote:

> With the move towards large memory systems and 32-bit cpus
> with FP and memory mgmt systems capable of runing Linux on a
> chip, "embedded" has blurred to the point where you can't
> tell the difference between a Microsoft MSDN developer, a
> Linux guru, and an embedded micro coder, anymore.
>
> The Windows CE coder seems to imagine they are doing embedded
> work. So does the Linux coder. .NET can run embedded, in
> fact, though anyone familiar with its details must realize
> just how abstracted that environment is. Technically, yes, I
> suppose it's true that porting code from a workstation to run
> on an "embedded device" using .NET, for example, might still
> meet some people's definitions. A lot of the discussions
> here seem to be at that level now. Although I do .NET coding
> and have my paid-up annual MSDN subscription, it's dull stuff
> to me.
>
> I think of embedded to be about the skills required by us and
> where they __differ__ from hosted environment development
> skill sets. When a job requires familiarity more than just
> one language and requires familiarity with how compilers
> compile code, with assembly, with linker semantics and how to
> modify their control files for some desired result, as well
> as broad acquaintances with physics, numerical methods,
> signal processing, optics, and certainly electronic design,
> then we find more of these differences. When it requires
> less of these differences from workstation development
> skills, it is less about the "embedded" I know and love.
The "one-liner" description of "embedded systems" that I use to try to give folks (e.g., at a dinner party) an idea of what I do is: "something that is quite obviously a computer (inside) but doesn't look/act like a 'computer'" (assuming that they will relate "desktop PC" to the term "computer").

It's easy to give them examples of all of these systems that they've interacted with in the past 24 hours:
- your ATM
- the gas station experience
- the ECU that runs your automobile
- the *mouse* (!) attached to your PC (yup, there's a computer inside!)
- your sewing machine
- your TV
- the modem that connects your PC to the internet
- your cell phone
- your iPod
etc.

I.e., it is easy to *quickly* overwhelm them with a list of things that they take for granted without even resorting to more esoteric applications (e.g., the photo-radar detectors around town; the traffic light controller; the machine that weighs and labels the meats you purchase at the butcher/grocer).

I summarize by saying "I make *things*".

To folks in the Industry, I describe embedded systems as "software that comes with a warranty" (*think* about it!)
On Sat, 18 Jun 2011 12:22:12 -0700, Don Y <nowhere@here.com>
wrote:

>Hi Jon,
>
>On 6/18/2011 11:50 AM, Jon Kirwan wrote:
>
>> With the move towards large memory systems and 32-bit cpus
>> with FP and memory mgmt systems capable of runing Linux on a
>> chip, "embedded" has blurred to the point where you can't
>> tell the difference between a Microsoft MSDN developer, a
>> Linux guru, and an embedded micro coder, anymore.
>>
>> The Windows CE coder seems to imagine they are doing embedded
>> work. So does the Linux coder. .NET can run embedded, in
>> fact, though anyone familiar with its details must realize
>> just how abstracted that environment is. Technically, yes, I
>> suppose it's true that porting code from a workstation to run
>> on an "embedded device" using .NET, for example, might still
>> meet some people's definitions. A lot of the discussions
>> here seem to be at that level now. Although I do .NET coding
>> and have my paid-up annual MSDN subscription, it's dull stuff
>> to me.
>>
>> I think of embedded to be about the skills required by us and
>> where they __differ__ from hosted environment development
>> skill sets. When a job requires familiarity more than just
>> one language and requires familiarity with how compilers
>> compile code, with assembly, with linker semantics and how to
>> modify their control files for some desired result, as well
>> as broad acquaintances with physics, numerical methods,
>> signal processing, optics, and certainly electronic design,
>> then we find more of these differences. When it requires
>> less of these differences from workstation development
>> skills, it is less about the "embedded" I know and love.
>
>The "one-liner" description of "embedded systems" that I
>use to try to give folks (e.g., at a dinner party) an idea
>of what I do is: "something that is quite obviously a computer
>(inside) but doesn't look/act like a 'computer'" (assuming that
>they will relate "desktop PC" to the term "computer").
>
>It's easy to give them examples of all of these systems that
>they've interacted with in the past 24 hours:
>- your ATM
>- the gas station experience
>- the ECU that runs your automobile
>- the *mouse* (!) attached to your PC (yup, there's a computer inside!)
>- your sewing machine
>- your TV
>- the modem that connects your PC to the internet
>- your cell phone
>- your iPod
>etc.
>
>I.e., it is easy to *quickly* overwhelm them with a list of
>things that they take for granted without even resorting
>to more esoteric applications (e.g., the photo-radar detectors
>around town; the traffic light controller; the machine that
>weighs and labels the meats you purchase at the butcher/grocer)
>
>I summarize by saying "I make *things*".
>
>To folks in the Industry, I describe embedded systems as
>"software that comes with a warranty" (*think* about it!)
I don't look at it from the outside. I look at it from the processes involved in performing the work and the skills and talents those entail. It's not about usage. It's about what is required by the craft.

Making a table from a shipped kit that requires assembly using a provided wrench and screwdriver, with everything already cut, pre-assembled and dismantled before shipping, and nicely varnished as well, is indeed a table in the end, and the end user, in fact, "made it." However, someone who does all the design, taking into account structure, form, use, available tools, fasteners, and skills, and then cuts each piece after careful measurement and strategy beforehand, and then does all the treatments and so on before getting to assembly, also "made it." Yet the shared backgrounds, skills, and talents are completely lacking here.

To me it is about the shared life's experience and knowledge, skills and interests, depth and breadth, tools and so on that are involved in the shaping that count. It's who we are, not what we make, that makes us "embedded programmers."

When the things themselves change -- for example, when the making of furniture goes from hand-selection of grain and quality and orientation and colors, and hand-crafted use of a wide variety of odd styled chisels of every manner and kind, to make a piece that will last 200 years (I have many such pieces, by the way, which are in perfect condition today) through all manner of humidity and temperature... to using 30-yr old doug fir softwood stapled together with plastic and staples and some cheesy metalwork slapped onto the outside, without any real idea of how these things get used over the years in the end (as happens to be the huge difference between old "roll top" desks and the new crap that could only be said to be copied out of some catalog, by comparison)... well, I just cannot call them the same kind of craft anymore.

Things have changed. And they have. It's not for the bad.

In many ways, the changing face of it makes it more accessible to many who otherwise could never have laid hands to the work before. People who couldn't have readily fathomed what it takes to write their own O/S on the fly don't need to do so. They don't even need to understand them very well. In fact, they can downright abuse them and often come up, ignorantly, with something that "works" well enough to get the job done, even though they don't even know they didn't use the tools well at all. The tools are that good and readily available. But it also means that they have no idea what a thunk is, or a coroutine, or how C++ manages to mechanically achieve its inheritance rules, or what a stack unwind actually is and does.

It's a slightly different world coding for embedded Linux. So we don't share that much between us.

Embedded to me is about the skills and shared backgrounds we share as a community. Not products with warranties.

Jon
Hi Jon,

On 6/18/2011 6:25 PM, Jon Kirwan wrote:

>> The "one-liner" description of "embedded systems" that I
>> use to try to give folks (e.g., at a dinner party) an idea
>> of what I do is: "something that is quite obviously a computer
>> (inside) but doesn't look/act like a 'computer'"
>>
>> I summarize by saying "I make *things*".
>>
>> To folks in the Industry, I describe embedded systems as
>> "software that comes with a warranty" (*think* about it!)
>
> I don't look at it from the outside. I look at it from the
> processes involved in performing the work and the skills and
> talents those entail. It's not about usage. It's about what
> is required by the craft.
But the requirements change as the craft evolves! Do you want to dope your own silicon? Do you prefer using a "pocket assembler" in lieu of a HLL compiler?
> Making a table from a shipped kit that requires assembly
> using a provided wrench and screwdriver, with everything
> already cut, pre-assembled and dismantled before shipping,
> and nicely varnished as well is indeed a table in the end and
> the end user, in fact, "made it."
What's wrong with that? If it results in more people having *tables*... Or, if it frees them up to work on something *else* that they couldn't have had time to do, otherwise (because of the countless hours/weeks that they would have spent designing, measuring, selecting wood grains, etc.)?

I.e., how much *didn't* get done, previously, because we were screwing around with 6 character, uppercase symbols in our code instead of writing things more expressively?
> However, someone who does
> all the design, taking into account structure, form, use,
> available tools, fasteners, and skills, and then cuts each
> piece after careful measurement and strategy beforehand, and
> then does all the treatments and so on before getting to
> assembly, also "made it." Yet the shared backgrounds,
> skills, talented are completely lacking here.
They are redirected to other purposes.

One of the first products I was involved with was a LORAN-C position plotter. Feed it time-difference coordinates and it charts your progress, on paper, in real time. Anywhere on the globe.

Hardware:
- 8085 (did that max out at fosc of 6MHz or 4?)
- 2 timer/counters
- 12KB ROM
- 256 bytes RAM
- two unipolar stepper motor drives (low power, nothing fancy)
- 2x6 digit PGD display

I.e., you could almost do this with a PIC, today.

The code -- complex for that time -- would:
- gather incoming TD's
- fit them to the appropriate "chains" based on GRI
- convert hyperbolic coordinates to lat-lon
- compensate for oblateness of Earth
- translate to a scaled Mercator projection
- drive X&Y motors to bring the pen to that position
- lather, rinse, repeat

Today, tackling this project would be a 4-6 *week* effort instead of the many man-years that went into it. Builds would take *seconds* instead of half an hour of listening to an 8" floppy grinding away -- followed by 2 *hours* of EPROM burning. You'd be able to step through your code "at your desk" instead of burning EPROMs and probing for signatures with a 'scope.
> To me it is about the shared life's experience and knowledge,
> skills and interests, depth and breadth, tools and so on that
> are involved in the shaping that count. It's who we are, not
> what we make, that makes us "embedded programmers."
I don't see how evolution in tools or device capabilities changes that.

To me, the differences *are* in what we make. You use an autopilot differently than a "word processor" (that some desktop programmer created). The consequences of the autopilot's failure can be more significant, and immediate. A Therac hiccups and it *cooks* someone.

Users are more *engaged* with "devices" than "computers". They become integrated in their lives. Sitting down at "your PC" is an entirely different experience than cooking your dinner in the microwave; or, making a call on your cell phone; or driving to work; etc.

The "warranty" aspect, to me, speaks to the mindset differences between the two product worlds. People *expect* (yes, EXPECT!) the software on their PC to crash. They *don't* expect -- nor are they tolerant of -- the software *in* their microwave to crash! And, if the latter occurs, they expect to be compensated for it ("I want a new microwave. One that *works*!").
> When the things themselves change -- for example, when the
> making of furniture goes from hand-selection of grain and
> quality and orientation and colors and hand-crafted use of a
> wide variety of odd styled chisels of every manner and kind,
> to make a piece that will last 200 years (I have many such
> pieces, by the way, which are in perfect condition today)
> through all manner of humidty and temperature.... to using
> 30-yr old doug fir softwood stapled together with plastic and
> staples and some cheesy metalwork slapped onto the outside
> without any real idea of how these things get used over the
> years in the end (as happens to be the huge difference
> between old "roll top" desks and the new crap that could only
> be said to be copied out of some catalog, by comparison)...
> well, I just cannot call them the same kind of craft anymore.
So, would you prefer to spend weeks or months building that hand-crafted desk? Or, would you prefer to assemble it from a kit and spend the rest of your time building a unique set of windchimes (that no one else will ever have)?

I am thrilled at the opportunities these changes have given me to move my art into different areas. If I was still relying on that "pocket assembler", I'd never have time to even *imagine* the other devices that I could create, let alone actually create them! Or, the time to spend *commenting* on how "things have changed" :>
> Things have changed. And they have.
>
> It's not for the bad. In many ways, the changing face of it
> makes it more accessible to many who otherwise could never
> have laid hands to the work before. People who couldn't have
> readily fathomed what it takes to write their own O/S on the
> fly, don't need to do so. They don't even need to understand
> them very well. In fact, they can downright abuse them and
> often come up, ignorantly, with something that "works" well
> enough to get the job done even though they don't even know
> they didn't use the tools well, at all. The tools are that
> good and readily available. But it also means that they have
> no idea what a thunk is, or a coroutine, or how c++ manages
> to mechanically achieve its inheritance rules, or what a
> stack unwind actually is and does.
>
> It's a slightly different world coding for embedded Linux. So
> we don't share that much between us.
>
> Embedded to me is about the skills and shared backgrounds we
> share as a community. Not products with warranties.
On Sat, 18 Jun 2011 20:25:34 -0700, Don Y <nowhere@here.com>
wrote:

>Hi Jon,
>
>On 6/18/2011 6:25 PM, Jon Kirwan wrote:
>
>>> The "one-liner" description of "embedded systems" that I
>>> use to try to give folks (e.g., at a dinner party) an idea
>>> of what I do is: "something that is quite obviously a computer
>>> (inside) but doesn't look/act like a 'computer'"
>>>
>>> I summarize by saying "I make *things*".
>>>
>>> To folks in the Industry, I describe embedded systems as
>>> "software that comes with a warranty" (*think* about it!)
>>
>> I don't look at it from the outside. I look at it from the
>> processes involved in performing the work and the skills and
>> talents those entail. It's not about usage. It's about what
>> is required by the craft.
>
>But the requirements change as the craft evolves!
Indeed that is so. But it isn't just that there are new tools in town. It's also that _more_ people can participate at a much wider variety of skill levels. I'm not complaining about it. Just noting it.
>Do you want to dope your own silicon?
I have done that. Were you aware of a Bell Labs kit to do just that, put out in the mid 1960's?? (I've done it since, with my own home-brew oven, as well, made with a nickel plated, water cooled chamber and halogen lamps. Long story there, too.)
>Do you prefer using a
>"pocket assembler" in lieu of a HLL compiler?
><snip>
I said that the group's interests have moved away from my own over the years. That's true. I _also_ believe that as the processors used and tools applied increasingly look more like traditional, hosted programming environments found on workstations, to that degree it also is less and less a differentiating feature. Taken to its limit, there will be no difference between embedded development and any other, and no point in choosing to use the adjective anymore.

The group has had this debate here. Long threads about it. I'm not changing any of the position I took a decade back about any of this. It's the same stand today. What makes embedded development "embedded" to me is how the skills and tools are differentiated from workstation development. That's the main point. It's not about the end product.

If a washing machine uses Windows 7 Ultimate, and Microsoft writes the .NET objects used to do the hardware interfacing at a low level and then provides abstraction objects to the programmer, then this particular washing machine programmer is no more an embedded programmer -- even though it is a washing machine -- than would be any other .NET Windows 7 Ultimate programmer dragging and dropping a few objects onto a form.

Others have instinctively asked the questions you have asked. But I have considered them and don't agree with them once I thought more on it. It's not a useful dividing line. Sorry, but that's the end of it for me. What _is_ useful to know are the types of experiences, talents, and backgrounds required to _do_ development for some sphere. And in that sense, embedded has real meaning the way I apply it. It has almost no useful meaning the way you seem to suggest.

I'll stop here. There's more to this, but I didn't want to go too far. If you are interested, I've posted on this topic before and with more of my views on it exposed. Still available in Google, I'm sure.

Jon
Jon Kirwan wrote:
> On Fri, 17 Jun 2011 15:14:44 -0700, "Mr.CRC"
> <crobcBOGUS@REMOVETHISsbcglobal.net> wrote:
>> Thanks Jon. I've mostly lurked here for over 12 years, and usually
>> listen to your writings with great eagerness to learn something and am
>> rarely disappointed.
>
> Thanks. I don't usually have a lot to say, anymore, though.
You're welcome.
> With the move towards large memory systems and 32-bit cpus
> with FP and memory mgmt systems capable of runing Linux on a
> chip, "embedded" has blurred to the point where you can't
> tell the difference between a Microsoft MSDN developer, a
> Linux guru, and an embedded micro coder, anymore.
I can relate. I prefer bit banging, writing ISRs, that sort of thing. Though drivers can get a little tiresome. I figure if it doesn't need an oscilloscope to debug and verify, it's not my kind of "embedded." Perhaps I just prefer any excuse to use an oscilloscope!
> The Windows CE coder seems to imagine they are doing embedded
> work. So does the Linux coder. .NET can run embedded, in
> fact, though anyone familiar with its details must realize
> just how abstracted that environment is. Technically, yes, I
> suppose it's true that porting code from a workstation to run
> on an "embedded device" using .NET, for example, might still
> meet some people's definitions. A lot of the discussions
> here seem to be at that level now. Although I do .NET coding
> and have my paid-up annual MSDN subscription, it's dull stuff
> to me.
I've cringed at the mere sight of ".NET" since its inception. I also hated Java since I first heard of it. We had a guy at work who thought "embedded" meant installing Linux on a SBC and programming it. It is "embedded" in a sense, but not quite the sense that it seems we would pretty much agree upon.
> I think of embedded to be about the skills required by us and
> where they __differ__ from hosted environment development
> skill sets. When a job requires familiarity more than just
> one language and requires familiarity with how compilers
> compile code, with assembly, with linker semantics and how to
> modify their control files for some desired result, as well
> as broad acquaintances with physics, numerical methods,
> signal processing, optics, and certainly electronic design,
> then we find more of these differences. When it requires
> less of these differences from workstation development
> skills, it is less about the "embedded" I know and love.
>
> Times are changing and the relative mix of skills found
> amongst the embedded programmer body are shifting with the
> capabilities being offered in todays tools. Entire operating
> systems are ported over by people I do consider to be well
> healed embedded programmers, but then just used lock-stock-
> and-barrel by those who know little about what took place and
> don't care to know and who just use the usual workstation
> tools without knowing much difference, at all.
>
> That's a different thing to me. So I write less, today. I
> haven't changed, but the audience has.
Well I don't think the need for the more EE skill side of the trade will go away.

The changes probably amount to an overall improvement, since more people can access more technology and tools. That's still a benefit even if some of them don't become master craftsmen. There's a place for developers with a cursory, high-level understanding. Think of Arduino and kinetic sculptors, for example. If they can get something to just "work" then the world is a better place.

At first I thought Arduino was stupid. "I can work with a bare AVR, what do I need that for?" I thought. Then I realized that if it makes more people play with microcontrollers, it is good. Now I'm even curious to check it out and see if it can spare me some time on my next 8-bit project.
>> Is that ARM families that can basically switch context in hardware, or
>> some other device?
>
> Some other. For one example, the now "mature" or "older"
> ADSP-21xx Family from Analog Devices is the example I had
> coded that delta queue for.
Oh that one. I was close to trying that out once. I actually would have preferred to use ADI processors for what I use the TI C2000 for, but at the time ADI had nothing like a "digital signal controller" with DSP speed and microcontroller peripherals. Blackfin has closed the gap a little, but it's still not what you'd pick to interface quadrature encoders and run MOSFET SMPS front-ends.

But TI assembly language is an ugly thing. It's not that bad if you can figure out the syntax and work with it enough to keep it memorized, which I haven't, because the docs are all language lawyer style when what is needed is more simple examples.

With ADI, at least for SHARC which I looked at a bit, assembly is a breeze.
>>> I've used this for an operating system with a
>>> jitter-free guarantee on starting sleeping processes using
>>> delta queues (where only one such process is allowed to sleep
>>> on the same timed event.)
>> <scratches head, wonders what a "delta queue" is>
>>
>> Hmm, looking at a few search results I sort of get it.
>
> It's a very simple, easy to operate, precision tool. I first
> read about the idea from Douglas Comer's first book on XINU.
Well I've a tidbit from you again. Thanks. [edit]
> I think I'd focus on an audio range device, as well. But I'm
> pretty sure I'd just make it a toy and not something
> professional. There is so much more "work" involved in
> making something ready for others to use and although I find
> some of that enjoyable, I don't find all of it to be so. And
> I'd be looking more for my own hobbyist pleasure, self-test,
> and education than anything else.
Once it blurs into legalities, regulations, and injection molded die making, I start to run for cover. Probably better that I have a 9-5 job then.
> I looked over some of what you write elsewhere and I wish I
> had your experiences with lasers, too. Lots of potential fun
> there, both for me and for students I like to teach at times.
Yeah, well the lasers and my silly Chemistry degree cost me a lot of time that I sometimes wish I had spent on getting a proper EE degree.
> By the way, I've also got a lot of "stuff in drawers." And I
> definitely get it about just goofing off with toys. Work is
> work, but on my own I don't want the burden of having to do
> all the extra stuff needed to productize. I'd rather play.
>
> Jon
Have a good Father's Day, whether or not you're a father!

--
_____________________
Mr.CRC
crobcBOGUS@REMOVETHISsbcglobal.net
SuSE 10.3 Linux 2.6.22.17
On Sun, 19 Jun 2011 08:23:02 -0700, "Mr.CRC"
<crobcBOGUS@REMOVETHISsbcglobal.net> wrote:

>Jon Kirwan wrote:
><snip>
>
>> With the move towards large memory systems and 32-bit cpus
>> with FP and memory mgmt systems capable of runing Linux on a
>> chip, "embedded" has blurred to the point where you can't
>> tell the difference between a Microsoft MSDN developer, a
>> Linux guru, and an embedded micro coder, anymore.
>
>I can relate. I prefer bit banging, writing ISRs, that sort of thing.
>Though drivers can get a little tiresome. I figure if it doesn't need
>an oscilloscope to debug and verify, it's not my kind of "embedded."
>Perhaps I just prefer any excuse to use an oscilloscope!
I don't always require an oscilloscope or an MSO, but just being threatened that I might need one is what makes it all the more fun for me. Without at least the threat present, it's certain to be boring.
>> The Windows CE coder seems to imagine they are doing embedded
>> work. So does the Linux coder. .NET can run embedded, in
>> fact, though anyone familiar with its details must realize
>> just how abstracted that environment is. Technically, yes, I
>> suppose it's true that porting code from a workstation to run
>> on an "embedded device" using .NET, for example, might still
>> meet some people's definitions. A lot of the discussions
>> here seem to be at that level now. Although I do .NET coding
>> and have my paid-up annual MSDN subscription, it's dull stuff
>> to me.
>
>I've cringed at the mere sight of ".NET" since its inception. I also
>hated Java since I first heard of it.
>
>We had a guy at work who thought "embedded" meant installing Linux on a
>SBC and programming it. It is "embedded" in a sense, but not quite the
>sense that it seems we would pretty much agree upon.
If you don't need to read datasheets, study peripheral operation, read schematics, consider sensor/transducer physics, do some laplace and partial fractions, look over voltage thresholds and current limits, scan over compiler output in assembly or machine code, set up that HP 54645D with both 8-lead probes in hand just in case, and figure out how to modify a linker control file, and all in the same project, then it isn't embedded work... much.
>> I think of embedded to be about the skills required by us and
>> where they __differ__ from hosted environment development
>> skill sets. When a job requires familiarity more than just
>> one language and requires familiarity with how compilers
>> compile code, with assembly, with linker semantics and how to
>> modify their control files for some desired result, as well
>> as broad acquaintances with physics, numerical methods,
>> signal processing, optics, and certainly electronic design,
>> then we find more of these differences. When it requires
>> less of these differences from workstation development
>> skills, it is less about the "embedded" I know and love.
>>
>> Times are changing and the relative mix of skills found
>> amongst the embedded programmer body are shifting with the
>> capabilities being offered in todays tools. Entire operating
>> systems are ported over by people I do consider to be well
>> healed embedded programmers, but then just used lock-stock-
>> and-barrel by those who know little about what took place and
>> don't care to know and who just use the usual workstation
>> tools without knowing much difference, at all.
>>
>> That's a different thing to me. So I write less, today. I
>> haven't changed, but the audience has.
>
>Well I don't think the need for the more EE skill side of the trade will
>go away.
No, it grows. But the size of the pyramid of programmers grows exponentially larger still. So it remains a dwindling proportion of the conversation here despite the truth of what you say.
>The changes probably amount to an overall improvement, since
>more people can access more technology and tools. That's still a benefit
>even if some of them don't become master craftsmen. There's a place for
>developers with a cursory, high-level understanding. Think of Arduino
>and kinetic sculptors, for example. If they can get something to just
>"work" then the world is a better place.
Agreed. I think this is very good, that computers have moved on from the days when I first worked on building my own. What I did caused me to get written up in a large spread, with pictures, in the local newspaper. It was _that_ unusual, I guess. I don't know who ratted me out at the time. But the news people showed up, one day, all the same.

To have the case where one can get a TI Launchpad sent to you for $4.30, with cables and a crystal and two cpus, and connectors and the rest... no shipping charges... well, what can one say? It's a great time, indeed!!

I am glad for all this. And I'm glad others might be interested in them for any reason of their own, at all.
>At first I thought Arduino was stupid. "I can work with a bare AVR,
>what do I need that for?" I thought. Then I realized that if it makes
>more people play with microcontrollers, it is good. Now I'm even
>curious to check it out and see if it can spare me some time on my next
>8-bit project.
I just used a Launchpad to create a parallel port to USB "printer device" that can be used as a parallel port printer and it saves files automatically on the PC, instead. Had to add a DB25, some wire and a few resistors and one cap, is all. Oh, and a tiny piece of vector board. So yes, I get your point here.
>>> Is that ARM families that can basically switch context in hardware, or
>>> some other device?
>>
>> Some other. For one example, the now "mature" or "older"
>> ADSP-21xx Family from Analog Devices is the example I had
>> coded that delta queue for.
>
>Oh that one. I was close to trying that out once. I actually would
>have preferred to use ADI processors for what I use the TI C2000 for,
>but at the time ADI had nothing like a "digital signal controller" with
>DSP speed and microcontroller peripherals. Blackfin has closed the gap
>a little, but it's still not what you'd pick to interface quadrature
>encoders and run MOSFET SMPS front-ends.
I used a TI 'C40 quite a while back, when I was also actively using the ADSP-21xx. I have to say it was night and day between the two. I had TI support on the phone because the hardware timing I was getting was 11 clocks for a cached bit of code that according to their docs should have taken 7 clocks. They NEVER were able to explain the timing of the bit of source code I sent them, even after 3 weeks of working on it and comparing it to their docs about register clashes and so on. The issue was never resolved to my satisfaction.

By comparison, the ADSP-21xx worked _exactly_ as the docs said. Always. Exactly. Never a question about them. The assembly (up to 3 instructions per cycle) was nice, too.
>But TI assembly language is an ugly thing. It's not that bad if you can
>figure out the syntax and work with it enough to keep it memorized,
>which I haven't, because the docs are all language lawyer style when
>what is needed is more simple examples.
>
>With ADI, at least for SHARC which I looked at a bit, assembly is a breeze.
I know.
>>>> I've used this for an operating system with a
>>>> jitter-free guarantee on starting sleeping processes using
>>>> delta queues (where only one such process is allowed to sleep
>>>> on the same timed event.)
>>> <scratches head, wonders what a "delta queue" is>
>>>
>>> Hmm, looking at a few search results I sort of get it.
>>
>> It's a very simple, easy to operate, precision tool. I first
>> read about the idea from Douglas Comer's first book on XINU.
>
>Well I've a tidbit from you again. Thanks.
It's a book worth reading through. Very clear, very easy, and it stimulates the imagination well.
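For anyone else still wondering what a delta queue is: the trick is that each sleeping entry stores its delay *relative to the entry ahead of it*, so the periodic timer tick only ever has to touch the head node. A minimal C sketch of the idea follows; the names, the `printf` stand-in for actually waking a process, and the tie handling are my own illustration, not XINU's actual code.

```c
#include <stdio.h>
#include <stdlib.h>

/* One sleeping "process" per node.  'delta' is the number of ticks
   remaining AFTER the predecessor node fires, not an absolute time. */
typedef struct dq_node {
    int delta;
    int pid;                 /* identifier of the sleeping process */
    struct dq_node *next;
} dq_node;

static dq_node *head = NULL;

/* Put a process to sleep for 'ticks' ticks from now. */
void dq_insert(int pid, int ticks)
{
    dq_node **pp = &head;
    while (*pp && ticks > (*pp)->delta) {
        ticks -= (*pp)->delta;      /* consume the deltas ahead of us */
        pp = &(*pp)->next;
    }
    dq_node *n = malloc(sizeof *n);
    n->pid = pid;
    n->delta = ticks;
    n->next = *pp;
    if (n->next)
        n->next->delta -= ticks;    /* successor is now relative to us */
    *pp = n;
}

/* Called once per timer tick (e.g. from the timer ISR): decrement only
   the head, then pop every node whose count has reached zero. */
void dq_tick(void)
{
    if (head && head->delta > 0)
        head->delta--;
    while (head && head->delta == 0) {
        dq_node *n = head;
        head = n->next;
        printf("wake pid %d\n", n->pid);  /* stand-in for a real wakeup */
        free(n);
    }
}
```

The payoff is that the tick handler does a single decrement on all but the ticks where something actually wakes, which is what makes a tight jitter guarantee practical inside an ISR regardless of how many processes are asleep.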
>[edit]
>> I think I'd focus on an audio range device, as well. But I'm
>> pretty sure I'd just make it a toy and not something
>> professional. There is so much more "work" involved in
>> making something ready for others to use and although I find
>> some of that enjoyable, I don't find all of it to be so. And
>> I'd be looking more for my own hobbyist pleasure, self-test,
>> and education than anything else.
>
>Once it blurs into legalities, regulations, and injection molded die
>making, I start to run for cover. Probably better that I have a 9-5 job
>then.
Hehe.
>> I looked over some of what you write elsewhere and I wish I
>> had your experiences with lasers, too. Lots of potential fun
>> there, both for me and for students I like to teach at times.
>
>Yeah, well the lasers and my silly Chemistry degree cost me a lot of
>time that I sometimes wish I had spent on getting a proper EE degree.
I did as much chemistry as I wanted to do -- mostly explosives as a kid. Mercury fulminate was my absolute fave -- the reaction before the crystals settle out is a mad scientist's exothermic, boiling, vaporous dream. And what you get after, or better still after filtering and precipitation with glacial acetic acid, was also a lot of fun. I did rocket fuels, explosives, fireworks, smoke bombs, and pretty much anything "thermodynamic." Luckily I also learned enough extra to stay alive while doing that at home. Still have picric acid, chlorates and perchlorates, and a few other goodies laying about here.

They used to ship that to 16 yr old kids, though the picric acid had to go by train. I know -- I was one, and Boulevard Labs in Chicago shipped to me regularly!

Organics I got into a little. Enough to get some of the basic terms down so that I could read and draw things when asked, but nothing much more than that. I know what a hydroxy ketone is defined as and I can draw out a diagram for 1-Chloro-3,5-dinitro-4-hydroxybenzene if asked, for example. But that's about it. Although there is logic to organic naming, there is enough memorization of various specialized words to bother me. Inorganics is easier in that sense.
>> By the way, I've also got a lot of "stuff in drawers." And I
>> definitely get it about just goofing off with toys. Work is
>> work, but on my own I don't want the burden of having to do
>> all the extra stuff needed to productize. I'd rather play.
>>
>> Jon
>
>
>Have a good Father's Day, whether or not you're a father!
Thanks. You too. And yes, I've 3. All in their mid 20's now. Jon