
Engineering degree for embedded systems

Started by hogwarts July 27, 2017
On Thursday, July 27, 2017 at 12:02:36 PM UTC-5, cassiope wrote:
> On Thu, 27 Jul 2017 07:35:14 -0500, hogwarts wrote:
>
>> I am applying for university right now and I am wondering which
>> engineering degree is better for working on embedded systems and IoT:
>> "Computer engineering" vs "electronics and communication engineering";
>> also, a specific university offers "computer and communication
>> engineering". I know that having any of those I can get into IoT, but
>> which would be better for the field?
>>
>> ---------------------------------------
>> Posted through http://www.EmbeddedRelated.com
>
> I bet that these programs have much overlap. You should look at the
> details of what courses are standard and what are electives, and see
> what appeals to you.
>
> This may be antithetical to some, but I think time at a University
> should mostly be on the "theoretical" side. Primarily it's because
> picking up that stuff on your own, later, is relatively hard to do.
> It's also more likely to have lasting value, at least in comparison
> to learning the language or platform du jour.
>
> By all means plan on doing more "practical" work on your own, during
> your educational time. These days there are many avenues for that.
>
> Worst case - you make a choice that later seems wrong - you should
> be able to transfer at fairly low time/expense cost.
>
> Best wishes!
Once you get an EE job, the second part of your education starts: in my case, learning all the chips and parts for circuit design (steered in the direction of what you anticipate you will need for your employer's work). The manufacturers provide application notes that are very good at reinforcing and extending your college knowledge base.
On 07/30/2017 02:05 PM, Tom Gardner wrote:
> On 30/07/17 17:05, Phil Hobbs wrote:
>> Another thing is to concentrate the course work on stuff that's hard
>> to pick up on your own, i.e. math and the more mathematical parts of
>> engineering (especially signals & systems and electrodynamics).
>
> Agreed.
>
>> Programming you can learn out of books without much difficulty,
>
> The evidence is that /isn't/ the case :(  Read comp.risks,
> (which has an impressively high signal-to-noise ratio), or
> watch the news (which doesn't).
Dunno. Nobody taught me how to program, and I've been doing it since I was a teenager. I picked up good habits from reading books and other people's code. Security is another issue. I don't do IoT things myself (and try not to buy them either), but since that's the OP's interest, I agree that one should add security/cryptography to the list of subjects to learn about at school.
>> and with a good math background you can
>> teach yourself anything you need to know about.
>
> Agreed.
>
>> Just learning MCUs and FPGAs is a recipe for becoming obsolete.
>
> There's always a decision to be made as to whether to
> be a generalist or a specialist. Both options are
> valid, and they have complementary advantages and
> disadvantages.
Being a specialist is one thing, but getting wedded to one set of tools and techniques is a problem.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics
160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
On 01/08/17 13:55, Phil Hobbs wrote:
> On 07/30/2017 02:05 PM, Tom Gardner wrote:
>> On 30/07/17 17:05, Phil Hobbs wrote:
>>> Another thing is to concentrate the course work on stuff that's hard
>>> to pick up on your own, i.e. math and the more mathematical parts of
>>> engineering (especially signals & systems and electrodynamics).
>>
>> Agreed.
>>
>>> Programming you can learn out of books without much difficulty,
>>
>> The evidence is that /isn't/ the case :(  Read comp.risks,
>> (which has an impressively high signal-to-noise ratio), or
>> watch the news (which doesn't).
>
> Dunno. Nobody taught me how to program, and I've been doing it since
> I was a teenager. I picked up good habits from reading books and
> other people's code.
Yes, but it was easier back then: the tools, problems and solutions
were, by and large, much simpler and more self-contained.

Nowadays it is normal to find youngsters[1] that don't have an inkling
beyond the particular language they've been taught, plus one or two
"abstract" problems. Typical statements:
"FSMs? Oh, yes, they are something to do with compilers."
"Caches? Oh yes, they are part of the library."
"L1/2/3 caches? <silence>"
"GCs? They reference count and have long pauses."
"Distributed computing failures? The software framework deals with those."

[1] i.e. the ones that HR-droids like to hire because they are cheap
and not ornery
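To make the first of those concrete: a finite state machine is bread-and-butter embedded technique, nothing to do with compilers. A minimal sketch in C, assuming a push-button debouncer as the application (the state and function names here are illustrative, not from the thread):

/* A four-state button-debounce FSM: call btn_step() once per timer
 * tick with the raw pin level; the MAYBE_* states absorb contact
 * bounce so one physical press yields one logical press. */
typedef enum { RELEASED, MAYBE_PRESSED, PRESSED, MAYBE_RELEASED } btn_state;

btn_state btn_step(btn_state s, int pin_high)
{
    switch (s) {
    case RELEASED:       return pin_high ? MAYBE_PRESSED : RELEASED;
    case MAYBE_PRESSED:  return pin_high ? PRESSED       : RELEASED;
    case PRESSED:        return pin_high ? PRESSED       : MAYBE_RELEASED;
    case MAYBE_RELEASED: return pin_high ? PRESSED       : RELEASED;
    }
    return RELEASED;    /* defensive default */
}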
> Security is another issue. I don't do IoT things myself (and try not
> to buy them either), but since that's the OP's interest, I agree that
> one should add security/cryptography to the list of subjects to learn
> about at school.
I like the cryptographers' aphorism "if you think cryptography will
solve your problem, you don't understand cryptography and you don't
understand your problem."

A quick sanity check is always to investigate how certificates are
revoked when (not if) they are compromised. That's an Achilles' heel
of /all/ biometric systems.
>>> and with a good math background you can
>>> teach yourself anything you need to know about.
>>
>> Agreed.
>>
>>> Just learning MCUs and FPGAs is a recipe for becoming obsolete.
>>
>> There's always a decision to be made as to whether to
>> be a generalist or a specialist. Both options are
>> valid, and they have complementary advantages and
>> disadvantages.
>
> Being a specialist is one thing, but getting wedded to one set of
> tools and techniques is a problem.
Very true. Unfortunately that is encouraged in the s/w world because
the recruiters and HR-droids can't extrapolate skills from one
technology into a (slightly) different technology.

Sometimes it manifests itself as self-inflicted cargo-cult
engineering. As I taught my daughter...

"Mummy, why do you cut off the end of the leg of lamb when you roast it?"
"Your granny always did it, and her roasts were delicious. Ask her."

"Granny, why did you cut off the end of the leg of lamb when you
roasted it?"
"Why did I what? ... Oh yes, it was so the joint would fit in the
small oven."
Phil Hobbs wrote:
> On 07/30/2017 02:05 PM, Tom Gardner wrote:
>> On 30/07/17 17:05, Phil Hobbs wrote:
>>> Another thing is to concentrate the course work on stuff that's
>>> hard to pick up on your own, i.e. math and the more mathematical
>>> parts of engineering (especially signals & systems and
>>> electrodynamics).
>>
>> Agreed.
>>
>>> Programming you can learn out of books without much difficulty,
>>
>> The evidence is that /isn't/ the case :(  Read comp.risks, (which
>> has an impressively high signal-to-noise ratio), or watch the news
>> (which doesn't).
>
> Dunno. Nobody taught me how to program, and I've been doing it since
> I was a teenager. I picked up good habits from reading books and
> other people's code.
From reading fora and such, I don't think people like to learn how to program that much any more.
> Security is another issue. I don't do IoT things myself (and try not
> to buy them either), but since that's the OP's interest, I agree that
> one should add security/cryptography to the list of subjects to learn
> about at school.
WRT programming, generally "safety" or "security" means "don't expose
UB in C programs". This becomes political, fast.

I dunno whether crypto knowledge is of any use or not, beyond the
"might need it" level.
>>> and with a good math background you can teach yourself anything
>>> you need to know about.
>>
>> Agreed.
>>
>>> Just learning MCUs and FPGAs is a recipe for becoming obsolete.
>>
>> There's always a decision to be made as to whether to be a
>> generalist or a specialist. Both options are valid, and they have
>> complementary advantages and disadvantages.
>
> Being a specialist is one thing, but getting wedded to one set of
> tools and techniques is a problem.
>
> Cheers
>
> Phil Hobbs
-- Les Cargill
On 02/08/17 03:46, Les Cargill wrote:
> Phil Hobbs wrote:
>> On 07/30/2017 02:05 PM, Tom Gardner wrote:
>>> On 30/07/17 17:05, Phil Hobbs wrote:
>>>> Another thing is to concentrate the course work on stuff that's
>>>> hard to pick up on your own, i.e. math and the more mathematical
>>>> parts of engineering (especially signals & systems and
>>>> electrodynamics).
>>>
>>> Agreed.
>>>
>>>> Programming you can learn out of books without much difficulty,
>>>
>>> The evidence is that /isn't/ the case :(  Read comp.risks, (which
>>> has an impressively high signal-to-noise ratio), or watch the news
>>> (which doesn't).
>>
>> Dunno. Nobody taught me how to program, and I've been doing it since
>> I was a teenager. I picked up good habits from reading books and
>> other people's code.
You can certainly learn things that way - if the books and the code
are good enough. You also need an expert or two that you can talk to
(or at least, a good newsgroup!), and be able to research the details.
Otherwise you learn from one example that a loop of 10 in C is written
"for (i = 0; i < 10; i++)", and then a loop of 100 is
"for (i = 0; i < 100; i++)". Then you see a web page with
"for (i = 0; i < 100000; i++)" - but when you try that on your AVR,
suddenly it does not work.

Most of the details of particular languages can be picked up from
books (or websites), but I think that some training is needed to be a
good programmer - you need to understand how to /think/ programming.
Mathematics and electronics engineering help too.
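To spell out why that last loop fails: on an 8-bit AVR, int is only 16 bits wide. A minimal sketch of the failure and the fix (the function names are made up for illustration):

#include <stdint.h>

/* On AVR, int is 16 bits, so the literal 100000 has type long.  "i"
 * is promoted to long in the comparison, which can never be false
 * for a 16-bit int - and incrementing i past 32767 is signed
 * overflow, i.e. undefined behaviour (in practice, an endless loop). */
void blink_broken(void)
{
    for (int i = 0; i < 100000; i++) {
        /* never finishes on a 16-bit-int target */
    }
}

void blink_fixed(void)
{
    for (uint32_t i = 0; i < 100000; i++) {
        /* uint32_t makes the counter width explicit and portable */
    }
}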
> From reading fora and such, I don't think people like to learn how to
> program that much any more.
Well, it is not uncommon in forums and newsgroups to get the people who have ended up with a project that is well beyond their abilities, and/or time frame, and they want to get things done without "wasting" time learning. And of course there are the people who believe they know it all already, and have great difficulty learning.
>> Security is another issue. I don't do IoT things myself (and try not
>> to buy them either), but since that's the OP's interest, I agree that
>> one should add security/cryptography to the list of subjects to learn
>> about at school.
>
> WRT programming, generally "safety" or "security" means "don't
> expose UB in C programs". This becomes political, fast.
What do you mean by that? Undefined behaviour is just bugs in the
code. The concept of undefined behaviour in C is a good thing, and
helps you get more efficient code - but if your code relies on the
results of undefined behaviour, it is wrong. In some cases it might
happen to work - but it is still wrong.

To be safe and secure, a program should not have bugs (at least not
ones that affect safety or security!). That applies to all bugs - be
it UB, overflows, misunderstandings about the specifications, mistakes
in the specifications, incorrect algorithms, incorrect functions -
whatever. UB is not special in that way.

And what do you mean by "this becomes political"?
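A minimal sketch of the kind of UB bug in question - an after-the-fact signed-overflow check, which the compiler is entitled to delete precisely because signed overflow is undefined (function names are illustrative):

#include <limits.h>

/* Wrong: tests for overflow after it has already happened.  Signed
 * overflow is UB, so the compiler may assume a + 1 > a always holds
 * and remove the test - code like this often "works" unoptimised
 * and fails when optimised. */
int next_id_wrong(int a)
{
    if (a + 1 < a)
        return 0;
    return a + 1;
}

/* Right: test before overflowing; everything here is well defined. */
int next_id_right(int a)
{
    if (a == INT_MAX)
        return 0;
    return a + 1;
}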
> I dunno whether crypto knowledge is of any use or not, beyond the
> "might need it" level.
A little crypto knowledge is good, as is lots - but a medium amount of
crypto knowledge can be a dangerous thing.

Most programmers know that they don't understand it, and will use
third-party software or hardware devices for cryptography. They need
to know a little about it, to know when and how to use it - but they
don't need to know how it works. At the other end, the industry
clearly needs a certain number of people who /do/ know how it all
works, to implement it.

The big danger is the muppets in the middle who think "that 3DES
routine is so /slow/. I can write a better encryption function that is
more efficient".
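In code, "use it, don't write it" looks something like this - a sketch using libsodium's secretbox API, assuming libsodium is installed (link with -lsodium); the wrapper function and its name are illustrative:

#include <sodium.h>

/* Authenticated encryption via a vetted library, not a hand-rolled
 * "faster 3DES".  Caller supplies:
 *   out   - buffer of crypto_secretbox_MACBYTES + mlen bytes
 *   nonce - buffer of crypto_secretbox_NONCEBYTES bytes (filled here)
 *   key   - crypto_secretbox_KEYBYTES bytes of key material */
int encrypt_reading(unsigned char *out, unsigned char *nonce,
                    const unsigned char *msg, unsigned long long mlen,
                    const unsigned char *key)
{
    if (sodium_init() < 0)
        return -1;                  /* library failed to initialise */
    randombytes_buf(nonce, crypto_secretbox_NONCEBYTES);  /* fresh nonce */
    return crypto_secretbox_easy(out, msg, mlen, nonce, key);
}

The point is the shape of it: nonce generation, the MAC and the cipher all come from the library; nothing is reimplemented.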
>>>> and with a good math background you can teach yourself anything
>>>> you need to know about.
>>>
>>> Agreed.
>>>
>>>> Just learning MCUs and FPGAs is a recipe for becoming obsolete.
>>>
>>> There's always a decision to be made as to whether to be a
>>> generalist or a specialist. Both options are valid, and they have
>>> complementary advantages and disadvantages.
>>
>> Being a specialist is one thing, but getting wedded to one set of
>> tools and techniques is a problem.
>>
>> Cheers
>>
>> Phil Hobbs
On 08/01/2017 09:23 AM, Tom Gardner wrote:
> On 01/08/17 13:55, Phil Hobbs wrote:
>> On 07/30/2017 02:05 PM, Tom Gardner wrote:
>>> On 30/07/17 17:05, Phil Hobbs wrote:
>>>> Another thing is to concentrate the course work on stuff that's hard
>>>> to pick up on your own, i.e. math and the more mathematical parts of
>>>> engineering (especially signals & systems and electrodynamics).
>>>
>>> Agreed.
>>>
>>>> Programming you can learn out of books without much difficulty,
>>>
>>> The evidence is that /isn't/ the case :(  Read comp.risks,
>>> (which has an impressively high signal-to-noise ratio), or
>>> watch the news (which doesn't).
>>
>> Dunno. Nobody taught me how to program, and I've been doing it since
>> I was a teenager. I picked up good habits from reading books and
>> other people's code.
>
> Yes, but it was easier back then: the tools, problems
> and solutions were, by and large, much simpler and more
> self-contained.
I'm not so sure. Debuggers have improved out of all recognition, with
two exceptions (gdb and Arduino, I'm looking at you). Plus there are a
whole lot of libraries available (for Python especially), so a
determined beginner can get something cool working (after a fashion)
fairly fast.

BITD I did a lot of coding with MS C 6.0 for DOS and OS/2, and before
that, MS QuickBasic and (an old fave) HP Rocky Mountain Basic, which
made graphics and instrument control a breeze. Before that, as an
undergraduate I taught myself FORTRAN-77 while debugging some Danish
astronomer's Monte Carlo simulation code. I never did understand how
it worked in any great depth, but I got through giving a talk on it
OK. It was my first and last Fortran project. Before that, I did a
lot of HP calculator programming (HP25C and HP41C). I still use a
couple of those 41C programs from almost 40 years ago. There was a
hacking club called PPC that produced a hacking ROM for the 41C that I
still have, though it doesn't always work anymore.

Seems as though youngsters mostly start with Python and then start in
on either webdev or small SBCs using Arduino / AVR Studio / Raspbian
or (for the more ambitious) something like BeagleBone or (a fave)
LPCXpresso. Most of my embedded work is pretty light-duty, so an M3
or M4 is good medicine. I'm much better at electro-optics and
analog/RF circuitry than at MCUs or HDL, so I do only enough embedded
things to get the whole instrument working. Fancy embedded stuff I
either leave to the experts, do in hardware, or hive off to an
outboard computer via USB serial, depending on the project.

It's certainly true that things get complicated fast, but they did in
the old days too. Of course the reasons are different: nowadays it's
the sheer complexity of the silicon and the tools, whereas back then
it was burn-and-crash development, flaky in-system emulators, and
debuggers which (if they even existed) were almost as bad as Arduino.

I still have nightmares about the horribly buggy PIC C17 compiler for
the PIC17C452A, circa 1999. I was using it in an interesting very low
cost infrared imager <http://electrooptical.net#footprints>. I had an
ICE, which was a help, but I spent more time finding bug workarounds
than coding. Eventually when the schedule permitted I ported the code
to HiTech C, which was a vast improvement. Microchip bought HiTech
soon thereafter, and PIC C died a well-deserved but belated death.

My son and I are doing a consulting project together--it's an M4-based
concentrator unit for up to 6 UV/visible/near-IR/thermal-IR sensors
for a fire prevention company. He just got the SPI interrupt code
working down on the metal a couple of minutes ago. It's fun when your
family understands what you do. :)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics
160 North State Road #203
Briarcliff Manor NY 10510

hobbs at electrooptical dot net
http://electrooptical.net
On 27.07.2017 at 14:35, hogwarts wrote:
> I am applying for university right now and I am wondering which
> engineering degree is better for working on embedded systems and IoT:
> "Computer engineering" vs "electronics and communication engineering";
> also, a specific university offers "computer and communication
> engineering". I know that having any of those I can get into IoT, but
> which would be better for the field?
Odds are this "field" will either have vanished completely (and maybe deservedly), or have changed beyond recognition in the time from now to when you finish your degree. Betting several years of your life (and depending on your country's style of doing things, up to tens of thousands of dollars on top) on that kind of hunch is rarely advisable. This is an easy mistake to make, and there are gazillions of freshmen who make it every year. It causes the same "pork cycles" of bubbles and crashes in the education and job markets as are observed in the general economy, and for much the same reason, too. One of the worst examples in recent history was in 2001, when the very public "dot-com" bubble drove millions of youngsters worldwide to the belief that they absolutely needed to study computer science _now_, to get on the ball early. So for a year or two there were upward of 4 times as many freshmen in CS courses as usual, the vast majority of which were clearly in entirely the wrong place. And it showed. Failures and drop-out rates shot through the roof, and those relatively few "extra" graduates who actually made it onto the job market did so years _after_ the bubble had burst, explosively. Overall, the whole episode was just a colossal waste of hopes, life-time, money and other things. So my advice is: do your best to forget about any and all current trends and hypes in the economy when you make decisions about your university studies. At best, they're a pointless distraction; at worst they'll mislead you into a field of work you hate for the rest of your life, where you'll be pitted against naturals who like doing it, and are generally better at it, too. The silly number of supposedly different degrees offered in many countries these days don't help, either. Nowadays, wherever there's a particular combination of testable skills that some university believes will be useful to more than 40 people in the world, total, they'll feel obliged to invent a name for that precise combination of skills and set up a course programme to crank out bachelors of it. Of course, the Universities' predictions about future needs of the job market aren't really that much more reliable than anyone's. And so the pork cycle rolls on. And they don't even start to think about how anybody is supposed to make an informed decision between such ultra-specialized programmes. I'm convinced it's impossible.
David Brown wrote:
> On 02/08/17 03:46, Les Cargill wrote:
>> Phil Hobbs wrote:
<snip>
>> From reading fora and such, I don't think people like to learn how
>> to program that much any more.
>
> Well, it is not uncommon in forums and newsgroups to get the people
> who have ended up with a project that is well beyond their abilities,
> and/or time frame, and they want to get things done without "wasting"
> time learning. And of course there are the people who believe they
> know it all already, and have great difficulty learning.
I see a lot of people who really lean on higher-order constructs. IMO,
C++ vectors and arrays look remarkably similar, primarily differing in
lifespan; but to some people, they're wildly different. NULL pointers
and NUL-terminated strings seem to be a problem for many people - and
perhaps just pointers of any sort.
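The NULL/NUL confusion in a nutshell - a minimal sketch; the helper function is illustrative:

#include <stddef.h>
#include <string.h>

/* NULL is a pointer value ("no object here"); NUL ('\0') is the zero
 * byte that ends a C string.  Conflating the two is a classic source
 * of crashes. */
size_t safe_strlen(const char *s)
{
    if (s == NULL)      /* no string at all - nothing to measure */
        return 0;
    return strlen(s);   /* walks the bytes up to the NUL terminator */
}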
>>> Security is another issue. I don't do IoT things myself (and try
>>> not to buy them either), but since that's the OP's interest, I
>>> agree that one should add security/cryptography to the list of
>>> subjects to learn about at school.
>>
>> WRT programming, generally "safety" or "security" means "don't
>> expose UB in C programs". This becomes political, fast.
>
> What do you mean by that? Undefined behaviour is just bugs in the
> code. The concept of undefined behaviour in C is a good thing, and
> helps you get more efficient code - but if your code relies on the
> results of undefined behaviour it is wrong. In some cases, it might
> happen to work - but it is still wrong.
That's how I see it as well; others seem to see the very existence of
UB as one click short of criminal. Then again, perhaps what I am
seeing is propaganda trying to create buzz for the Rust language.
> To be safe and secure, a program should not have bugs (at least not
> ones that affect safety or security!). That applies to all bugs - be
> it UB, overflows, misunderstandings about the specifications,
> mistakes in the specifications, incorrect algorithms, incorrect
> functions - whatever. UB is not special in that way.
>
> And what do you mean by "this becomes political"?
By that I mean the tone of communication on the subject becomes shrill
and, in cases, somewhat hysterical. If this is mainly propaganda, then
that would also explain it. Let's just say that my confidence that
anyone can learn C has been shaken this year.
>> I dunno whether crypto knowledge is of any use or not, beyond the
>> "might need it" level.
>
> A little crypto knowledge is good, as is lots - but a medium amount
> of crypto knowledge can be a dangerous thing. Most programmers know
> that they don't understand it, and will use third-party software or
> hardware devices for cryptography. They need to know a little about
> it, to know when and how to use it - but they don't need to know how
> it works.
Right. It's like anything complex - we have specialists for that.
> At the other end, the industry clearly needs a certain number of
> people who /do/ know how it all works, to implement it.
>
> The big danger is the muppets in the middle who think "that 3DES
> routine is so /slow/. I can write a better encryption function that
> is more efficient".
Oh good grief. :)

<snip>

--
Les Cargill
Hans-Bernhard Bröker wrote:
> On 27.07.2017 at 14:35, hogwarts wrote:
<snip>
> And so the pork cycle rolls on.
That's a great way to put it.
> And they don't even start to think about how anybody is supposed to
> make an informed decision between such ultra-specialized programmes.
> I'm convinced it's impossible.
IMO, a reputable EE programme is still probably the best way. CS
programs still vary too much; CS may or may not be a second-class
setup in many universities.

I get the feeling that *analog* engineers still have a stable job base
because it's much harder to fake that. It's somewhat harder.

And I'd warn the OP against specifically targeting IoT. It's a big
bubble. People win in bubbles, but it's not likely you will be among
them.

Just be aware that people are uniformly terrible at hiring in tech, so
networking is key.

--
Les Cargill
On Sat, 5 Aug 2017 15:20:40 -0500, Les Cargill
<lcargill99@comcast.com> wrote:

> Hans-Bernhard Bröker wrote:
>> On 27.07.2017 at 14:35, hogwarts wrote:
> <snip>
>> And so the pork cycle rolls on.
>
> That's a great way to put it.
>
>> And they don't even start to think about how anybody is supposed to
>> make an informed decision between such ultra-specialized programmes.
>> I'm convinced it's impossible.
>
> IMO, a reputable EE programme is still probably the best way. CS
> programs still vary too much; CS may or may not be a second-class
> setup in many universities.
>
> I get the feeling that *analog* engineers still have a stable job
> base because it's much harder to fake that. It's somewhat harder.
>
> And I'd warn the OP against specifically targeting IoT. It's a big
> bubble. People win in bubbles, but it's not likely you will be among
> them.
I have often wondered what this IoT hype is all about. It seems to be
very similar to the PLC (Programmable Logic Controller) used for
decades. You need to do some programming but, equally importantly,
interfacing to the external world (sensors, relay controls and
communication to other devices). These days the programmable devices
are just smaller, _much_ cheaper and have much better performance than
a PLC one or two decades ago.

Take a look at universities that have industrial automation courses
and check what topics are included relevant to PLCs. Select these
subjects at your local university. You might not need process control
theory for simple IoT :-)

Analog electronics is important, e.g. for interfacing exotic sensors
or controlling equally odd devices, as well as protecting I/O against
overvoltage and ground potential issues. Understanding line voltage
issues and line wiring can be a question of life and death.
> Just be aware that people are uniformly terrible at hiring in tech,
> so networking is key.
These days many jobs are outsourced to cheaper countries, so you might
concentrate on skills that are harder to outsource.