Engineering degree for embedded systems

I am applying for university right now and I am wondering which engineering degree is better for working on embedded systems and IoT: "Computer Engineering" vs "Electronics and Communication Engineering". A specific university also offers "Computer and Communication Engineering". I know that with any of those I can get into IoT, but which would be better for the field?


Reply to
hogwarts

On 2017-07-27 hogwarts wrote in comp.arch.embedded:

As always: that depends.

I don't know the particular programs, so I'm just going by the titles. :-\

A lot depends on what you want to do in "embedded systems and IoT". Do you want to work on the hardware, low-level embedded software, higher-level embedded software, server backend, front end, ...?

Consider a hypothetical internet-connected thermometer: do you want to measure the NTC voltage and convert it to degrees K/F/C? Or do you want to write the app on the phone to display the value? Or something in between? If you want to do it all, I think it's best to start close to one of the extremes and work your way to the other extreme by experience or additional education. But YMMV.

If you want to work on the hardware, or on software more or less closely related to the hardware, my bet would be on the "electronics ..." degree. But I'm biased, of course. I have seen, on multiple occasions, that software engineers with no electronics background have difficulty reading processor datasheets and electronics schematics, and sometimes fail to really understand what they mean.

Example: On a product we had a microcontroller with an internal reference voltage that was factory calibrated to 2% accuracy. The datasheet also explained how to use a measurement of this reference to correct ADC data on other channels. This was implemented in the software. Now, this seems fine, but: the ADC actually uses an external reference, and the correction only helps if that external reference is less accurate than the internal one. In our case the external reference was 0.5%, meaning a measurement with an accuracy of 0.5% was 'corrected' against a reference with 2% accuracy.
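
To make that concrete, here is a minimal sketch of that kind of datasheet correction; all the names and numbers are made up for illustration, not taken from the actual part:

#include <stdint.h>

/* Hypothetical factory calibration value for the internal reference,
 * specified to +/-2% (real parts and numbers differ). */
#define VREFINT_CAL_COUNTS  1489u

extern uint16_t adc_read_channel(uint8_t ch); /* conversion against the external 0.5% reference */
extern uint16_t adc_read_vrefint(void);       /* conversion of the internal reference channel   */

uint32_t corrected_counts(uint8_t ch)
{
    uint32_t raw  = adc_read_channel(ch);
    uint32_t vref = adc_read_vrefint();

    /* Rescale the reading so full scale is defined by the internal
     * reference instead of the external one. This only improves
     * accuracy if the internal reference is the better of the two;
     * here it swaps a 0.5% error for a 2% one. */
    return (raw * VREFINT_CAL_COUNTS) / vref;
}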

So, enough of that. ;-)

So what do you see yourself doing after your education, and what holds your personal interest? Check that first, and then compare it to the education on offer. Also pay attention to the balance of theory and practice. Learning Maxwell's laws does not make you solder better. ;-)

--
Stef    (remove caps, dashes and .invalid from e-mail address to reply by mail) 

Learning French is trivial: the word for horse is cheval, and everything else 
follows in the same way. 
		-- Alan J. Perlis
Reply to
Stef

I don't think that you can see IoT as a branch of the industry that requires anything special at entry level

A junior engineering role on an embedded project is probably not going to be expected to deal with any of the security issues (hell, there are a lot of companies investigating adding IoT functionality to their products that don't have principal engineers working on that), so it just looks like any other embedded project at that level of experience

you just have to target your job hunt to the relevant companies at graduation time

so IME any EE or engineering biased CS degree will do

tim

Reply to
tim...

AFAICT, nobody at any level in IoT is expected to deal with any of the security issues. Or deal with making products do something useful, for that matter.

--
Grant Edwards               grant.b.edwards        Yow! Of course, you 
                                  at               UNDERSTAND about the PLAIDS 
                              gmail.com            in the SPIN CYCLE --
Reply to
Grant Edwards

I bet that these programs have much overlap. You should look at the details of what courses are standard and what are electives, and see what appeals to you.

This may be antithetical to some, but I think time at a University should mostly be on the "theoretical" side. Primarily it's because picking up that stuff on your own, later, is relatively hard to do. It's also more likely to have lasting value, at least in comparison to learning the language or platform du jour.

By all means plan on doing more "practical" work on your own, during your educational time. These days there are many avenues for that.

Worst case - you make a choice that later seems wrong - you should be able to transfer at fairly low time/expense cost.

Best wishes!

Reply to
Frank Miles

I thought that I said that

harsh

the early IoT proposals based upon mesh systems seem to have created some useful products, street light management for example

tim

Reply to
tim...

It's probably worth finding out what the routes are: if you decide to do one programme, are you stuck with that or can you take courses that lead in a different direction? Many people find their strengths are in different places than they expected.

I'd agree with that - something like 'IoT' is likely to be very different in 4-5 years' time when you finish, in terms of the tools and popular platforms. So it's better to have a grounding and then keep up with the platform du jour as the icing on top.

The other aspect is good engineering practices: writing clean code, good documentation, using tools like version control appropriately, etc. I'd suggest that's a skill that isn't well taught in big groups (one instructor, 500 students). It's better to learn it either on the job (eg internships) or in other environments where you might receive mentoring, eg open source projects. Similarly for practical skills like soldering, assembly, etc - to some degree you can pick those up from YouTube, or else you need someone sitting next to you telling you what you did wrong.

Also don't be afraid to look over the wall at other disciplines - occasionally having a CS/biology or EE/psychology or whatever crossover can come in very handy. Or closer to home EE/CS, EE/mechE, EE/power, EE/physics or similar combinations.

Theo

Reply to
Theo Markettos

I would choose the electronics degree first, as it is more likely to keep you in work than computer science; for embedded work, the comp sci side is a subset that depends on the electronics. It will also stretch you more in math terms than comp sci alone.

Buy the books on comp sci as well, particularly OS theory, algorithms and data structures. Learn that in your spare time; you can find the books second-hand on AbeBooks or Amazon.

Good luck, a worthwhile career and plenty of scope for innovative design and creativity...

Chris

Reply to
Chris

Another thing is to concentrate the course work on stuff that's hard to pick up on your own, i.e. math and the more mathematical parts of engineering (especially signals & systems and electrodynamics). Programming you can learn out of books without much difficulty, and with a good math background you can teach yourself anything you need to know about.

Just learning MCUs and FPGAs is a recipe for becoming obsolete.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Agreed.

The evidence is that that /isn't/ the case :( Read comp.risks (which has an impressively high signal-to-noise ratio), or watch the news (which doesn't).

Agreed.

There's always a decision to be made as to whether to be a generalist or a specialist. Both options are valid, and they have complementary advantages and disadvantages.

Reply to
Tom Gardner

Once you get an EE job, the second part of your education starts: in my case, learning all the chips and parts for circuit design (steered in the direction of what you anticipate you will need for your employer's work). The manufacturers provide application notes that are very good at reinforcing and extending your college knowledge base.

Reply to
jim.brakefield

Dunno. Nobody taught me how to program, and I've been doing it since I was a teenager. I picked up good habits from reading books and other people's code.

Security is another issue. I don't do IoT things myself (and try not to buy them either), but since that's the OP's interest, I agree that one should add security/cryptography to the list of subjects to learn about at school.

Being a specialist is one thing, but getting wedded to one set of tools and techniques is a problem.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Yes, but it was easier back then: the tools, problems and solutions were, by and large, much simpler and more self-contained.

Nowadays it is normal to find youngsters[1] that have no inkling beyond the particular language they've been taught, plus one or two "abstract" problems. Typical statements: "FSMs? Oh, yes, they are something to do with compilers." "Caches? Oh yes, they are part of the library." "L1/2/3 caches?" "GCs? They reference count and have long pauses." "Distributed computing failures? The software framework deals with those."

[1] i.e. the ones that HR-droids like to hire because they are cheap and not ornery

I like the cryptographers' aphorism "if you think cryptography will solve your problem, you don't understand cryptography and you don't understand your problem."

A quick sanity check is always to investigate how certificates are revoked when (not if) they are compromised. That's an Achilles Heel of /all/ biometric systems.

Very true. Unfortunately that is encouraged in the s/w world because the recruiters and HR-droids can't extrapolate skills from one technology into a (slightly) different technology.

Sometimes it manifests itself as self-inflicted cargo-cult engineering. As I taught my daughter...

"Mummy, why do you cut off the end of the leg of lamb when you roast it?"

"Your granny always did it, and her roasts were delicious. Ask her"

"Granny, why did you cut off the end of the leg of lamb when you roasted it?"

"Why did I what? ... Oh yes, it was so the joint would fit in the small oven".

Reply to
Tom Gardner

From reading fora and such, I don't think people like to learn how to program that much any more.

WRT programming, generally "safety" or "security" means "don't expose UB in C programs". This becomes political, fast.

I dunno whether crypto knowledge is of any use or not, beyond the "might need it" level.

--
Les Cargill
Reply to
Les Cargill

You can certainly learn things that way - if the books and the code are good enough. You also need an expert or two that you can talk to (or at least, a good newsgroup!), and be able to research the details. Otherwise you learn from one example that a loop of 10 in C is written "for (i = 0; i < 10; i++)", and then a loop of 100 is "for (i = 0; i < 100; i++)". Then you see a web page with "for (i = 0; i < 100000; i++)" but when you try that on your AVR, suddenly it does not work.
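
(For anyone wondering why the AVR chokes on that: with avr-gcc an int is 16 bits, so the counter can never reach 100000. A quick illustration, with a made-up do_something():)

#include <stdint.h>

extern void do_something(void);

void loops(void)
{
    int i;
    uint32_t j;

    /* On avr-gcc, int is 16 bits: i can never reach 100000, so in
     * practice this loops "forever", and pushing i past 32767 is
     * signed overflow, i.e. undefined behaviour, on top of that. */
    for (i = 0; i < 100000; i++)
        do_something();

    /* A counter type wide enough for the bound behaves the same on
     * an AVR as on a PC. */
    for (j = 0; j < 100000UL; j++)
        do_something();
}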

Most of the details of particular languages can be picked up from books (or websites), but I think that some training is needed to be a good programmer - you need to understand how to /think/ programming. Mathematics and electronics engineering help too.

Well, it is not uncommon in forums and newsgroups to get the people who have ended up with a project that is well beyond their abilities, and/or time frame, and they want to get things done without "wasting" time learning. And of course there are the people who believe they know it all already, and have great difficulty learning.

What do you mean by that? Undefined behaviour is just bugs in the code. The concept of undefined behaviour in C is a good thing, and helps you get more efficient code - but if your code relies on the results of undefined behaviour it is wrong. In some cases, it might happen to work - but it is still wrong.
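
A tiny example of the "might happen to work, but is still wrong" kind (my own illustration):

#include <limits.h>

/* Looks plausible and often "works" with a given compiler and
 * optimisation level, but signed overflow is undefined behaviour:
 * the compiler is allowed to assume x + 1 never overflows and
 * optimise the check away completely. */
int overflows_if_incremented_bad(int x)
{
    return x + 1 < x;       /* undefined when x == INT_MAX */
}

/* A well-defined way to ask the same question. */
int overflows_if_incremented(int x)
{
    return x == INT_MAX;
}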

To be safe and secure, a program should not have bugs (at least not ones that affect safety or security!). That applies to all bugs - be it UB, overflows, misunderstandings about the specifications, mistakes in the specifications, incorrect algorithms, incorrect functions - whatever. UB is not special in that way.

And what do you mean by "this becomes political" ?

A little crypto knowledge is good, as is lots - but a medium amount of crypto knowledge can be a dangerous thing. Most programmers know that they don't understand it, and will use third-party software or hardware devices for cryptography. They need to know a little about it, to know when and how to use it - but they don't need to know how it works.

At the other end, the industry clearly needs a certain number of people who /do/ know how it all works, to implement it.

The big danger is the muppets in the middle who think "that 3DES routine is so /slow/. I can write a better encryption function that is more efficient".

Reply to
David Brown

I'm not so sure. Debuggers have improved out of all recognition, with two exceptions (gdb and Arduino, I'm looking at you). Plus there are a whole lot of libraries available (for Python especially) so a determined beginner can get something cool working (after a fashion) fairly fast.

BITD I did a lot of coding with MS C 6.0 for DOS and OS/2, and before that, MS QuickBasic and (an old fave) HP Rocky Mountain Basic, which made graphics and instrument control a breeze. Before that, as an undergraduate I taught myself FORTRAN-77 while debugging some Danish astronomer's Monte Carlo simulation code. I never did understand how it worked in any great depth, but I got through giving a talk on it OK. It was my first and last Fortran project.

Before that, I did a lot of HP calculator programming (HP25C and HP41C). I still use a couple of those 41C programs from almost 40 years ago. There was a hacking club called PPC that produced a hacking ROM for the 41C that I still have, though it doesn't always work anymore.

Seems as though youngsters mostly start with Python and then start in on either webdev or small SBCs using Arduino / AVR Studio / Raspbian or (for the more ambitious) something like BeagleBone or (a fave) LPCxpresso. Most of my embedded work is pretty light-duty, so an M3 or M4 is good medicine. I'm much better at electro-optics and analog/RF circuitry than at MCUs or HDL, so I do only enough embedded things to get the whole instrument working. Fancy embedded stuff I either leave to the experts, do in hardware, or hive off to an outboard computer via USB serial, depending on the project.

It's certainly true that things get complicated fast, but they did in the old days too. Of course the reasons are different: nowadays it's the sheer complexity of the silicon and the tools, whereas back then it was burn-and-crash development, flaky in-system emulators, and debuggers which (if they even existed) were almost as bad as Arduino.

I still have nightmares about the horribly buggy PIC C17 compiler for the PIC17C452A, circa 1999. I was using it in an interesting very low cost infrared imager. I had an ICE, which was a help, but I spent more time finding bug workarounds than coding.

Eventually when the schedule permitted I ported the code to HiTech C, which was a vast improvement. Microchip bought HiTech soon thereafter, and PIC C died a well deserved but belated death.

My son and I are doing a consulting project together--it's an M4-based concentrator unit for up to 6 UV/visible/near IR/thermal IR sensors for a fire prevention company. He just got the SPI interrupt code working down on the metal a couple of minutes ago. It's fun when your family understands what you do. :)

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

On 2017-07-27 14:35, hogwarts wrote:

Odds are this "field" will either have vanished completely (and maybe deservedly), or have changed beyond recognition in the time from now to when you finish your degree. Betting several years of your life (and depending on your country's style of doing things, up to tens of thousands of dollars on top) on that kind of hunch is rarely advisable.

This is an easy mistake to make, and there are gazillions of freshmen who make it every year. It causes the same "pork cycles" of bubbles and crashes in the education and job markets as are observed in the general economy, and for much the same reason, too.

One of the worst examples in recent history was in 2001, when the very public "dot-com" bubble drove millions of youngsters worldwide to the belief that they absolutely needed to study computer science _now_, to get on the ball early. So for a year or two there were upward of 4 times as many freshmen in CS courses as usual, the vast majority of which were clearly in entirely the wrong place. And it showed. Failures and drop-out rates shot through the roof, and those relatively few "extra" graduates who actually made it onto the job market did so years _after_ the bubble had burst, explosively. Overall, the whole episode was just a colossal waste of hopes, life-time, money and other things.

So my advice is: do your best to forget about any and all current trends and hypes in the economy when you make decisions about your university studies. At best, they're a pointless distraction; at worst they'll mislead you into a field of work you hate for the rest of your life, where you'll be pitted against naturals who like doing it, and are generally better at it, too.

The silly number of supposedly different degrees offered in many countries these days doesn't help, either. Nowadays, wherever there's a particular combination of testable skills that some university believes will be useful to more than 40 people in the world, total, they'll feel obliged to invent a name for that precise combination of skills and set up a course programme to crank out bachelors of it. Of course, the universities' predictions about future needs of the job market aren't really that much more reliable than anyone else's. And so the pork cycle rolls on.

And they don't even start to think about how anybody is supposed to make an informed decision between such ultra-specialized programmes. I'm convinced it's impossible.

Reply to
Hans-Bernhard Bröker

I see a lot of people who really lean on higher-order constructs. IMO, C++ vectors and arrays look remarkably similar, primarily differing in lifespan. But to some people, they're wildly different.

NULL pointers and NUL-terminated strings seem to be a problem for many people, and perhaps just pointers of any sort.
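
(For readers following along, the distinction is roughly this; the names and output are just my illustration.)

#include <stddef.h>
#include <stdio.h>

int main(void)
{
    const char *s = "hi";   /* three chars in memory: 'h', 'i', '\0' */
    const char *p = NULL;   /* a pointer that points at nothing      */

    /* NUL ('\0') is a character *inside* a string that marks its end;
     * NULL is a pointer value meaning "no object here at all". */
    printf("terminator present: %d\n", s[2] == '\0');

    if (p != NULL)          /* must check before dereferencing */
        printf("%c\n", *p);

    return 0;
}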

That's how I see it as well; others seem to see the very existence of UB as one click short of criminal.

Then again, perhaps what I am seeing is propaganda trying to create buzz for the Rust language.

By that I mean the tone of communication on the subject becomes shrill and, in some cases, somewhat hysterical. If this is mainly propaganda, then that would also explain it.

Let's just say that my confidence that anyone can learn C has been shaken this year.

Right. It's like anything complex - we have specialists for that.

Oh good grief. :)

--
Les Cargill
Reply to
Les Cargill

That's a great way to put it.

IMO, a reputable EE programme is still probably the best way. CS programs still vary too much; CS may or may not be a second-class setup in many universities.

I get the feeling that *analog* engineers still have a stable job base because it's much harder to fake that. It's somewhat harder.

And I'd warn the OP against specifically targeting IoT. It's a big bubble. People win in bubbles but it's not likely you will be among them.

Just be aware that people are uniformly terrible at hiring in tech, so networking is key.

--
Les Cargill
Reply to
Les Cargill

I have often wondered what this IoT hype is all about. It seems to be very similar to the PLC (Programmable Logic Controller) used for decades. You need to do some programming, but equally important is the interface to the external world (sensors, relay controls and communication to other devices).

These days, the programmable devices are just smaller, _much_ cheaper and have much better performance than a PLC one or two decades ago.

Take a look at universities that have industrial automation courses and check what PLC-relevant topics are included. Select those subjects at your local university. You might not need process control theory for simple IoT :-)

Analog electronics is important, e.g. for interfacing exotic sensors or controlling equally odd devices, as well as for protecting I/O against overvoltage and ground potential issues. Understanding line voltage issues and line wiring can be a question of life and death.

These days many jobs are outsourced to cheaper countries, so you might concentrate on skills that are harder to outsource.

Reply to
upsidedown
