
Engineering degree for embedded systems

Started by hogwarts July 27, 2017

<upsidedown@downunder.com> wrote in message 
news:23ddoch8v472cpj59u4f53dta9nau5ksuf@4ax.com...
> On Sat, 5 Aug 2017 15:20:40 -0500, Les Cargill
> <lcargill99@comcast.com> wrote:
>
>>Hans-Bernhard Bröker wrote:
>>> Am 27.07.2017 um 14:35 schrieb hogwarts:
>><snip>
>>> And so the pork cycle rolls on.
>>>
>>
>>That's a great way to put it.
>>
>>> And they don't even start to think about how anybody is supposed to
>>> make an informed decision between such ultra-specialized programmes.
>>> I'm convinced it's impossible.
>>
>>IMO, a reputable EE programme is still probably the best way. CS
>>programs still vary too much; CS may or may not be a second-class
>>setup in many universities.
>>
>>I get the feeling that *analog* engineers still have a stable job
>>base because it's much harder to fake that. It's somewhat harder.
>>
>>And I'd warn the OP against specifically targeting IoT. It's a big
>>bubble. People win in bubbles but it's not likely you will be among
>>them.
>
> I have often wondered what this IoT hype is all about. It seems to be
> very similar to the PLC (Programmable Logic Controller) used for
> decades.
don't think so

the IoT hype is all about marketing benefits - selling consumers extra features (that they never knew they ever wanted and probably don't need)

using PLC's is an engineering benefit (or not)

tim
On 03/08/17 16:03, Phil Hobbs wrote:
> On 08/01/2017 09:23 AM, Tom Gardner wrote:
>> On 01/08/17 13:55, Phil Hobbs wrote:
>>> On 07/30/2017 02:05 PM, Tom Gardner wrote:
>>>> On 30/07/17 17:05, Phil Hobbs wrote:
>>>>> Another thing is to concentrate the course work on stuff that's hard
>>>>> to pick up on your own, i.e. math and the more mathematical parts of
>>>>> engineering (especially signals & systems and electrodynamics).
>>>>
>>>> Agreed.
>>>>
>>>>> Programming you can learn out of books without much difficulty,
>>>>
>>>> The evidence is that /isn't/ the case :( Read comp.risks,
>>>> (which has an impressively high signal-to-noise ratio), or
>>>> watch the news (which doesn't).
>>>
>>> Dunno. Nobody taught me how to program, and I've been doing it since
>>> I was a teenager. I picked up good habits from reading books and
>>> other people's code.
>>
>> Yes, but it was easier back then: the tools, problems
>> and solutions were, by and large, much simpler and more
>> self-contained.
>
> I'm not so sure. Debuggers have improved out of all recognition, with two
> exceptions (gdb and Arduino, I'm looking at you). Plus there are a whole lot of
> libraries available (for Python especially) so a determined beginner can get
> something cool working (after a fashion) fairly fast.
Yes, that's all true. The speed of getting something going is important for a beginner. But if the foundation is "sandy" then it can be necessary and difficult to get beginners (and managers) to appreciate the need to progress to tools with sounder foundations. The old time "sandy" tool was Basic. While Python is much better than Basic, it is still "sandy" when it comes to embedded real time applications.
> Seems as though youngsters mostly start with Python and then start in on either
> webdev or small SBCs using Arduino / AVR Studio / Raspbian or (for the more
> ambitious) something like BeagleBone or (a fave) LPCxpresso. Most of my
> embedded work is pretty light-duty, so an M3 or M4 is good medicine. I'm much
> better at electro-optics and analog/RF circuitry than at MCUs or HDL, so I do
> only enough embedded things to get the whole instrument working. Fancy embedded
> stuff I either leave to the experts, do in hardware, or hive off to an outboard
> computer via USB serial, depending on the project.
I wish more people took that attitude!
> It's certainly true that things get complicated fast, but they did in the old
> days too. Of course the reasons are different: nowadays it's the sheer
> complexity of the silicon and the tools, whereas back then it was burn-and-crash
> development, flaky in-system emulators, and debuggers which (if they even
> existed) were almost as bad as Arduino.
Agreed. The key difference is that with simple-but-unreliable tools it is possible to conceive that mortals can /understand/ the tools' limitations, and know when/where the tool is failing.

That simply doesn't happen with modern tools; even the world experts don't understand their complexity! Seriously.

Consider C++. The *design committee* refused to believe C++ templates formed a Turing-complete language inside C++. They were forced to recant when shown a correct, valid C++ program that never completed compilation - because, during compilation, the compiler was (slowly) emitting the sequence of prime numbers! What chance have mere mortal developers got in the face of that complexity?

Another example is that C/C++ is routinely used to develop multi-threaded code, e.g. using PThreads. That's despite C/C++ specifically being unable to guarantee correct operation on modern machines! Most developers are blissfully unaware of (my *emphasis*):

Threads Cannot be Implemented as a Library
Hans-J. Boehm
HP Laboratories Palo Alto
November 12, 2004

In many environments, multi-threaded code is written in a language that was originally designed without thread support (e.g. C), to which a library of threading primitives was subsequently added. There appears to be a general understanding that this is not the right approach. We provide specific arguments that a pure library approach, in which the compiler is designed independently of threading issues, cannot guarantee correctness of the resulting code. We first review why the approach *almost* works, and then examine some of the *surprising behavior* it may entail. We further illustrate that there are very simple cases in which a pure library-based approach seems *incapable of expressing* an efficient parallel algorithm. Our discussion takes place in the context of C with Pthreads, since it is commonly used, reasonably well specified, and does not attempt to ensure type-safety, which would entail even stronger constraints. The issues we raise are not specific to that context.

http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
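A minimal sketch of that kind of compile-time computation (not the committee's counterexample, just an illustration): the compiler itself decides whether a number is prime while instantiating these templates, so nothing is computed at run time and a wrong answer shows up as a compile error.

#include <cstdio>

// Compile-time primality test via template instantiation (valid for N >= 2).
template <unsigned N, unsigned D>
struct has_divisor {
    static const bool value = (N % D == 0) || has_divisor<N, D - 1>::value;
};

template <unsigned N>
struct has_divisor<N, 1> {      // base case: divisor 1 doesn't count
    static const bool value = false;
};

template <unsigned N>
struct is_prime {
    static const bool value = !has_divisor<N, N - 1>::value;
};

// The compiler proves these while compiling.
static_assert(is_prime<13>::value,  "13 is prime");
static_assert(!is_prime<15>::value, "15 is not prime");

int main(void) {
    const bool p = is_prime<17>::value;   // computed during compilation
    std::printf("17 prime? %d\n", p);
    return 0;
}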
> I still have nightmares about the horribly buggy PIC C17 compiler for the
> PIC17C452A, circa 1999. I was using it in an interesting very low cost infrared
> imager <http://electrooptical.net#footprints>. I had an ICE, which was a help,
> but I spent more time finding bug workarounds than coding.
There are always crap instantiations of tools, but they can be avoided. I'm more concerned about tools where the specification prevents good and safe tools.
> Eventually when the schedule permitted I ported the code to HiTech C, which was
> a vast improvement. Microchip bought HiTech soon thereafter, and PIC C died a
> well deserved but belated death.
>
> My son and I are doing a consulting project together--it's an M4-based
> concentrator unit for up to 6 UV/visible/near IR/thermal IR sensors for a fire
> prevention company. He just got the SPI interrupt code working down on the
> metal a couple of minutes ago. It's fun when your family understands what you
> do. :)
Lucky you -- I think! I've never been convinced of the wisdom of mixing work and home life, and family businesses seem to be the source material for reality television :)
On Sun, 6 Aug 2017 10:12:36 +0100, "tim..." <tims_new_home@yahoo.com>
wrote:

> <upsidedown@downunder.com> wrote in message
> news:23ddoch8v472cpj59u4f53dta9nau5ksuf@4ax.com...
>> On Sat, 5 Aug 2017 15:20:40 -0500, Les Cargill
>> <lcargill99@comcast.com> wrote:
>> <snip>
>>
>> I have often wondered what this IoT hype is all about. It seems to be
>> very similar to the PLC (Programmable Logic Controller) used for
>> decades.
>
>don't think so
>
>the IoT hype is all about marketing benefits - selling consumers extra
>features (that they never knew they ever wanted and probably don't need)
Yes, this seems to be the main motivation.
>using PLC's is an engineering benefit (or not)
The greatly reduced hardware cost (both processing power and Ethernet/WLAN communication) has made it possible to handle just a single signal (or a small set of related I/O signals) in dedicated hardware for each signal. Thus the controlling "IoT" device could read a measurement and control an actuator in a closed loop, and receive a setpoint from the network. This means that the controlling device can be moved much closer to the actuator, simplifying interfacing (not too much worrying about interference).

Taking this even further, this allows integrating the controller into the actual device itself, such as a hydraulic valve (mechatronics). Just provide power and an Ethernet connection and off you go. Of course, the environmental requirements for such integrated products can be quite harsh.

Anyway, with most of the intelligence moved down to the actual device, the need for PLC systems is reduced, so some PC-based control room programs can directly control those intelligent mechatronics units.

If the "IoT" device is moved inside the actual actuator etc., similar skills are needed to interface to the input sensor signals and to control the actuators as in the case of external IoT controllers. With everything integrated into the same case, some knowledge of thermal design will also help.

While some courses in computer science are useful, IMHO, spending too much time on CS might not be that productive.
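A minimal sketch of that architecture, with hypothetical stand-in functions for the sensor, actuator and network setpoint: the loop is closed locally at a fixed rate and the network only supplies the setpoint, so network latency never sits inside the control loop itself.

#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in I/O (hypothetical): on real hardware these would talk to an ADC,
// a driver stage and a network stack. Here they are trivial stubs so the
// sketch compiles and runs as a toy simulation.
static double g_process = 0.0;                      // toy first-order "plant"
static double read_sensor() { return g_process; }
static void   drive_actuator(double out) { g_process += 0.05 * out; }
static bool   poll_network_setpoint(double &sp) { sp = 1.0; return true; }

int main() {
    double setpoint = 0.0, integral = 0.0;
    const double kp = 0.8, ki = 0.2;                // illustrative PI gains
    const double dt = 0.010;                        // 10 ms sample time

    for (int i = 0; i < 200; ++i) {                 // a couple of seconds of loop
        poll_network_setpoint(setpoint);            // slow path: setpoint from the network
        const double error = setpoint - read_sensor();
        integral += error * dt;
        drive_actuator(kp * error + ki * integral); // fast path: loop closed locally
        if (i % 50 == 0)
            std::printf("t=%4.1fs  pv=%.3f\n", i * dt, read_sensor());
        std::this_thread::sleep_for(std::chrono::milliseconds(10));
    }
    return 0;
}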
Tom Gardner <spamjunk@blueyonder.co.uk> writes:

> On 03/08/17 16:03, Phil Hobbs wrote:
>> On 08/01/2017 09:23 AM, Tom Gardner wrote:
>>> On 01/08/17 13:55, Phil Hobbs wrote:
> <snip>
>
> Agreed. The key difference is that with simple-but-unreliable
> tools it is possible to conceive that mortals can /understand/
> the tools limitations, and know when/where the tool is failing.
>
> That simply doesn't happen with modern tools; even the world
> experts don't understand their complexity! Seriously.
>
> Consider C++. The *design committee* refused to believe C++
> templates formed a Turing-complete language inside C++.
> They were forced to recant when shown a correct valid C++
> program that never completed compilation - because, during
> compilation the compiler was (slowly) emitting the sequence
> of prime numbers! What chance have mere mortal developers
> got in the face of that complexity.
I don't think that particular criticism is really fair - it seems the (rather simple) C preprocessor is also "Turing complete", or at least close to it, e.g.:

https://stackoverflow.com/questions/3136686/is-the-c99-preprocessor-turing-complete

Or a C prime number generator that mostly uses the preprocessor:

https://www.cise.ufl.edu/~manuel/obfuscate/zsmall.hint

At any rate, "compile-time processing" is a big thing now in modern C++, see e.g.

Compile Time Maze Generator (and Solver)
https://www.youtube.com/watch?v=3SXML1-Ty5U

Or, more topically for embedded systems, there are things like Kvasir, which does a lot of compile-time work to ~perfectly optimise register accesses and hardware initialisation:

https://github.com/kvasir-io/Kvasir

[...]

-- John Devereux
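For comparison with the template machinery above, a minimal sketch of the "compile-time processing" style mentioned here, using a C++14 constexpr function: the same code is evaluated by the compiler in a constant context (static_assert) and is also callable at run time with a value the compiler cannot know in advance.

#include <cstdio>

// C++14 constexpr: an ordinary-looking function the compiler can run.
constexpr bool is_prime(unsigned n) {
    if (n < 2) return false;
    for (unsigned d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
    return true;
}

// Evaluated entirely at compile time; no run-time cost at all.
static_assert(is_prime(101), "101 is prime");
static_assert(!is_prime(100), "100 is not prime");

int main(int argc, char **argv) {
    (void)argv;
    // The very same function used at run time.
    unsigned n = 100u + static_cast<unsigned>(argc);
    std::printf("is_prime(%u) = %d\n", n, is_prime(n));
    return 0;
}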
On Sun, 6 Aug 2017 10:35:03 +0100, Tom Gardner
<spamjunk@blueyonder.co.uk> wrote:

>On 03/08/17 16:03, Phil Hobbs wrote:
>> On 08/01/2017 09:23 AM, Tom Gardner wrote:
>>> On 01/08/17 13:55, Phil Hobbs wrote:
<snip>
>Another example is that C/C++ is routinely used to develop
>multi threaded code, e.g. using PThreads. That's despite
>C/C++ specifically being unable to guarantee correct
>operation on modern machines! Most developers are
>blissfully unaware of (my *emphasis*):
What is multithreaded code? I can think of two definitions:

* The operating system runs independently scheduled tasks which happen to share an address space (e.g. Windows NT and later).

* A single task does the switching between threads in software. This typically requires that the software library at least handles the timer (RTC) clock interrupts, as in time-sharing systems. Early examples are Ada running on VAX/VMS, MS-DOS based extenders and, later on, early Linux PThreads.

If I understand correctly, more modern Linux kernels (past 2.6) actually implement the PThread functionality in kernel mode.
>Threads Cannot be Implemented as a Library
>Hans-J. Boehm
>HP Laboratories Palo Alto
>November 12, 2004
><snip>
>http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
Now that there are a lot of multicore processors, this is a really serious issue. But again, whether multitasking/multithreading should be implemented in a multitasking OS or in a programming language is a very important question.

To the OP: what you are going to need in the next 3 to 10 years is hard to predict.
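A minimal sketch of the kind of problem the Boehm paper is about, using the threading and atomics that were added to the language itself in C++11: two threads increment a shared counter; the plain int is a data race (undefined behaviour, lost updates in practice), while std::atomic makes the increments well defined.

#include <atomic>
#include <cstdio>
#include <thread>

int              plain_counter = 0;    // shared, unsynchronised: a data race
std::atomic<int> atomic_counter{0};    // shared, well defined since C++11

void worker() {
    for (int i = 0; i < 100000; ++i) {
        ++plain_counter;                                    // undefined behaviour
        atomic_counter.fetch_add(1, std::memory_order_relaxed);
    }
}

int main() {
    std::thread a(worker), b(worker);
    a.join();
    b.join();
    // plain_counter typically comes out short of 200000; atomic_counter never does.
    std::printf("plain=%d atomic=%d\n", plain_counter, atomic_counter.load());
    return 0;
}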
upsidedown@downunder.com wrote:
> On Sat, 5 Aug 2017 15:20:40 -0500, Les Cargill
> <lcargill99@comcast.com> wrote:
>
>> Hans-Bernhard Bröker wrote:
>>> Am 27.07.2017 um 14:35 schrieb hogwarts:
>> <snip>
>>
>> And I'd warn the OP against specifically targeting IoT. It's a big
>> bubble. People win in bubbles but it's not likely you will be among
>> them.
>
> I have often wondered what this IoT hype is all about. It seems to be
> very similar to the PLC (Programmable Logic Controller) used for
> decades.
Similar. But PLCs are pointed more at ladder logic for use in industrial settings. You generally cannot, for example, write a socket server that just does stuff on a PLC; you have to stay inside a dev framework that cushions it for you. There is a great deal of vendor lock-in, and the tool suites are rather creaky. And it's all very costly.
> You need to do some programming but, equally important,
> interface to the external world (sensors, relay controls and
> communication to other devices).
>
Yep.
> These days, the programmable devices are just smaller, _much_ cheaper
> and have much better performance than a PLC one or two decades ago.
>
Very much so. While doing paper-engineering - as in PE work - for power distro has some learning curve, the basics of power distro aren't rocket surgery.
> Take a look at universities having industrial automation courses and
> check what topics are included relevant to PLCs. Select these subjects
> at your local university. You might not need process control theory
> for simple IoT :-)
>
You might end up building a flaky hunk of garbage if you don't...
> Analog electronics is important e.g. for interfacing exotic sensors or
> controlling equally odd devices as well as protecting I/O against
> overvoltage and ground potential issues. Understanding about line
> voltage issues and line wiring can be a question of life and death.
>
Absolutely.
>> Just be aware that people are uniformly terrible at hiring in tech,
>> so networking is key.
>
> These days many jobs are outsourced to cheaper countries, so you might
> concentrate on skills that are harder to outsource.
>
-- Les Cargill
tim... wrote:
> <upsidedown@downunder.com> wrote in message
> news:23ddoch8v472cpj59u4f53dta9nau5ksuf@4ax.com...
>> On Sat, 5 Aug 2017 15:20:40 -0500, Les Cargill
>> <lcargill99@comcast.com> wrote:
>> <snip>
>>
>> I have often wondered what this IoT hype is all about. It seems to
>> be very similar to the PLC (Programmable Logic Controller) used
>> for decades.
>
> don't think so
>
> the IoT hype is all about marketing benefits - selling consumers
> extra features (that they never knew they ever wanted and probably
> don't need)
>
The IoT hype that relates to people trying to get funding for things like Internet enabled juicers might be more frothy than the potential for replacing PLCs with hardware and software that comes from the IoT/Maker space.
> using PLC's is an engineering benefit (or not) >
It's not difficult to get beyond the capability of many PLCs. The highly capable ones (like NI) tend to be "hangar queens" - they're not mechanically rugged.
> tim
-- Les Cargill
Tom Gardner wrote on 8/6/2017 5:35 AM:
> On 03/08/17 16:03, Phil Hobbs wrote:
>> On 08/01/2017 09:23 AM, Tom Gardner wrote:
> <snip>
>
> Yes, that's all true. The speed of getting something going
> is important for a beginner. But if the foundation is "sandy"
> then it can be necessary and difficult to get beginners
> (and managers) to appreciate the need to progress to tools
> with sounder foundations.
>
> The old time "sandy" tool was Basic. While Python is much
> better than Basic, it is still "sandy" when it comes to
> embedded real time applications.
Not sure what you mean by "sandy". Like walking on sand where every step is extra work, like sand getting into everything, or like seashore sand washing away in a storm?

That is one of the things Hugh did right: he came up with a novice package that allowed a beginner to become more productive than if they had to write all that code themselves. He just has trouble understanding that his way isn't the only way.
> Another example is that C/C++ is routinely used to develop
> multi threaded code, e.g. using PThreads. That's despite
> C/C++ specifically being unable to guarantee correct
> operation on modern machines! Most developers are
> blissfully unaware of (my *emphasis*):
>
> Threads Cannot be Implemented as a Library
> Hans-J. Boehm
> HP Laboratories Palo Alto
> November 12, 2004
> <snip>
> http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
Sounds like Forth, where it is up to the programmer to make sure the code is written correctly.

-- Rick C
John Devereux wrote on 8/6/2017 9:40 AM:
> Tom Gardner <spamjunk@blueyonder.co.uk> writes:
>
>> On 03/08/17 16:03, Phil Hobbs wrote:
> <snip>
>
>> Consider C++. The *design committee* refused to believe C++
>> templates formed a Turing-complete language inside C++.
>> They were forced to recant when shown a correct valid C++
>> program that never completed compilation - because, during
>> compilation the compiler was (slowly) emitting the sequence
>> of prime numbers! What chance have mere mortal developers
>> got in the face of that complexity.
>
> I don't think that particular criticism is really fair - it seems the
> (rather simple) C preprocessor is also "turing complete" or at least
> close to it e.g,.
>
> https://stackoverflow.com/questions/3136686/is-the-c99-preprocessor-turing-complete
>
> Or a C prime number generator that mostly uses the preprocessor
>
> https://www.cise.ufl.edu/~manuel/obfuscate/zsmall.hint
>
> At any rate "Compile-time processing" is a big thing now in modern c++,
> see e.g.
>
> Compile Time Maze Generator (and Solver)
> https://www.youtube.com/watch?v=3SXML1-Ty5U
Funny, compile-time program execution is something Forth has done for decades. Why is this important in other languages now?

-- Rick C

"Les Cargill" <lcargill99@comcast.com> wrote in message 
news:om7afv$siu$1@dont-email.me...
> tim... wrote:
>> <snip>
>>
>> the IoT hype is all about marketing benefits - selling consumers
>> extra features (that they never knew they ever wanted and probably
>> don't need)
>>
>
> The IoT hype that relates to people trying to get funding for things
> like Internet enabled juicers might be more frothy
I have just received a questionnaire from the manufacturer of my PVR asking about what upgraded features I would like it to include.

Whilst they didn't ask it openly, reading between the lines they were asking: "would you like to control your home heating (and several other things) via your Smart TV (box)?"

To which I answered: of course I bloody well don't.

Even if I did see a benefit in having an internet-connected heating controller, why would I want to control it from my sofa using anything other than the remote control that comes with it, in the box?

tim