Engineering degree for embedded systems

I am applying to university right now and I am wondering which
engineering degree is better for working on embedded systems and IoT:
"computer engineering" or "electronics and communication engineering"?
One specific university also offers "computer and communication
engineering". I know that with any of those I can get into IoT, but
which would be better for the field?



---------------------------------------
Posted through http://www.EmbeddedRelated.com


Re: Engineering degree for embedded systems
On 2017-07-27 hogwarts wrote in comp.arch.embedded:
[snip]

As always: that depends.

I don't know the particular programs, so I'm just going by the titles. :-\

A lot depends on what you want to do in "embedded systems and IoT". Do you
want to work on the hardware, low-level embedded software, higher-level
embedded software, the server backend, the front end, ...?

Consider a hypothetical internet-connected thermometer:
Do you want to measure the NTC voltage and convert it to degrees K/F/C? Or
do you want to write the app on the phone to display the value? Or
something in between? If you want to do it all, I think it's best to start
close to one of the extremes, and work your way to the other through
experience or additional education. But YMMV.
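
For the "close to the hardware" extreme, the core of that thermometer
is only a few lines. A minimal sketch (divider topology, component
values and the beta constant are all invented for illustration):

    #include <math.h>
    #include <stdio.h>

    /* Hypothetical divider: Vcc -- R_FIXED -- ADC node -- NTC -- GND.
       Ratiometric, so the actual Vcc value cancels out. */
    #define ADC_MAX  4095.0     /* 12-bit ADC full scale */
    #define R_FIXED  10000.0    /* series resistor, ohms */
    #define R0       10000.0    /* NTC resistance at 25 degC, ohms */
    #define BETA     3950.0     /* NTC beta constant, kelvin */
    #define T0       298.15     /* 25 degC in kelvin */

    static double ntc_celsius(unsigned raw)   /* no clamping; a sketch */
    {
        double v = raw / ADC_MAX;                /* V_adc / Vcc */
        double r_ntc = R_FIXED * v / (1.0 - v);  /* divider solved for NTC */
        /* Beta equation: 1/T = 1/T0 + ln(R/R0)/BETA */
        return 1.0 / (1.0 / T0 + log(r_ntc / R0) / BETA) - 273.15;
    }

    int main(void)
    {
        printf("%.1f degC\n", ntc_celsius(2048));   /* mid-scale: ~25 degC */
        return 0;
    }

Everything past that point (connectivity, app, backend) is a different
trade altogether, which is rather the point.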

If you want to work on the hardware, or on software more or less closely
related to the hardware, my bet would be on the "electronics ..." degree.
But I'm biased, of course. I have seen in multiple instances that software
engineers with no electronics background have difficulty reading processor
datasheets and electronics schematics, and sometimes fail to really
understand what those mean.

Example:
On a product we had a microcontroller with an internal reference voltage
that was factory calibrated to 2% accuracy. The datasheet also explained
how to use a measurement of this reference to correct ADC data on other
channels. This was implemented in the software. Now, this seems fine, but:
the ADC actually uses an external reference, and the correction only helps
if that external reference is *less* accurate than the internal one. In our
case the external reference was good to 0.5%, meaning a measurement with an
accuracy of 0.5% was 'corrected' against a reference with 2% accuracy.
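
In code, the datasheet's recipe boils down to one scale factor, and that
factor drags the 2% error into every sample. A sketch (the HAL call,
channel numbers and calibration value are all made up):

    #include <stdint.h>
    #include <stdio.h>

    #define CH_VREFINT  17      /* illustrative channel numbers */
    #define CH_SENSOR    3
    #define VREFINT_CAL 1520    /* factory reading of internal ref, 2% */

    static uint32_t adc_read(int channel)   /* stub standing in for the HAL */
    {
        return channel == CH_VREFINT ? 1498 : 2048;   /* made-up readings */
    }

    static uint32_t corrected_sample(void)
    {
        uint32_t vref_now = adc_read(CH_VREFINT);  /* internal ref, now */
        uint32_t raw      = adc_read(CH_SENSOR);   /* the actual signal */

        /* Rescale by (factory / current) internal-reference reading.
           With a 0.5% external reference this multiplies an already-good
           measurement by a factor that is itself only good to 2%. */
        return (uint32_t)((uint64_t)raw * VREFINT_CAL / vref_now);
    }

    int main(void)
    {
        printf("%u\n", (unsigned)corrected_sample());
        return 0;
    }

The code faithfully implemented the datasheet, and quietly made the
product worse.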

So, enough of that. ;-)

So, what do you see yourself doing after your education, and where does
your personal interest lie? Check that first, and then compare it to the
education on offer. Also pay attention to the balance of theory and practice.
Learning Maxwell's equations does not make you solder better. ;-)


--  
Stef    (remove caps, dashes and .invalid from e-mail address to reply by mail)

Learning French is trivial: the word for horse is cheval, and everything else
Re: Engineering degree for embedded systems
On 07/27/2017 09:25 AM, Stef wrote:
[snip]

Another thing is to concentrate the course work on stuff that's hard to  
pick up on your own, i.e. math and the more mathematical parts of  
engineering (especially signals & systems and electrodynamics).  
Programming you can learn out of books without much difficulty, and with  
a good math background you can teach yourself anything you need to know  
about.

Just learning MCUs and FPGAs is a recipe for becoming obsolete.

Cheers

Phil Hobbs

--  
Dr Philip C D Hobbs
Principal Consultant
We've slightly trimmed the long signature. Click to see the full one.
Re: Engineering degree for embedded systems
On 30/07/17 17:05, Phil Hobbs wrote:
[snip]

Agreed.


The evidence is that it /isn't/ the case :( Read comp.risks
(which has an impressively high signal-to-noise ratio), or
watch the news (which doesn't).

[snip]

Agreed.


There's always a decision to be made as to whether to
be a generalist or a specialist. Both options are
valid, and they have complementary advantages and
disadvantages.


Re: Engineering degree for embedded systems
On 07/30/2017 02:05 PM, Tom Gardner wrote:
[snip]

Dunno.  Nobody taught me how to program, and I've been doing it since I  
was a teenager.  I picked up good habits from reading books and other  
people's code.

Security is another issue.  I don't do IoT things myself (and try not to  
buy them either), but since that's the OP's interest, I agree that one  
should add security/cryptography to the list of subjects to learn about  
at school.

[snip]

Being a specialist is one thing, but getting wedded to one set of tools  
and techniques is a problem.

Cheers

Phil Hobbs


--  
Dr Philip C D Hobbs
Principal Consultant
Re: Engineering degree for embedded systems
On 01/08/17 13:55, Phil Hobbs wrote:
[snip]

Yes, but it was easier back then: the tools, problems
and solutions were, by and large, much simpler and more
self-contained.

Nowadays it is normal to find youngsters[1] that have no
inkling beyond the particular language they've been taught,
plus one or two "abstract" problems. Typical statements:
"FSMs? Oh, yes, they are something to do with compilers."
"Caches? Oh yes, they are part of the library"
"L1/2/3 caches? <silence>"
"GCs? They reference count and have long pauses"
"Distributed computing failures? The software framework
deals with those"

[1] i.e. the ones that HR-droids like to hire because
they are cheap and not ornery


[snip]

I like the cryptographers' aphorism "if you think
cryptography will solve your problem, you don't
understand cryptography and you don't understand
your problem."

A quick sanity check is always to investigate how
certificates are revoked when (not if) they are
compromised. That's an Achilles' heel of /all/
biometric systems.


[snip]

Very true. Unfortunately that is encouraged in the s/w
world because the recruiters and HR-droids can't extrapolate
skills from one technology into a (slightly) different
technology.

Sometimes it manifests itself as self-inflicted
cargo-cult engineering. As I taught my daughter...

"Mummy, why do you cut off the end of the leg of lamb
when you roast it?"

"Your granny always did it, and her roasts were delicious.
Ask her."

"Granny, why did you cut off the end of the leg of lamb
when you roasted it?"

"Why did I what? ... Oh yes, it was so the joint would
fit in the small oven".



Re: Engineering degree for embedded systems
On 08/01/2017 09:23 AM, Tom Gardner wrote:
[snip]

I'm not so sure.  Debuggers have improved out of all recognition, with  
two exceptions (gdb and Arduino, I'm looking at you).  Plus there are a  
whole lot of libraries available (for Python especially) so a determined  
beginner can get something cool working (after a fashion) fairly fast.

BITD I did a lot of coding with MS C 6.0 for DOS and OS/2, and before  
that, MS Quickbasic and (an old fave) HP Rocky Mountain Basic, which  
made graphics and instrument control a breeze.  Before that, as an  
undergraduate I taught myself FORTRAN-77 while debugging some Danish  
astronomer's Monte Carlo simulation code.  I never did understand how it
worked in any great depth, but I got through giving a talk on it OK.  It  
was my first and last Fortran project.

Before that, I did a lot of HP calculator programming (HP25C and HP41C).
I still use a couple of those 41C programs from almost 40 years ago.
There was a hacking club called PPC that produced a hacking ROM for the  
41C that I still have, though it doesn't always work anymore.

Seems as though youngsters mostly start with Python and then start in on  
either webdev or small SBCs using Arduino / AVR Studio / Raspbian or  
(for the more ambitious) something like BeagleBone or (a fave)  
LPCxpresso.  Most of my embedded work is pretty light-duty, so an M3 or  
M4 is good medicine.  I'm much better at electro-optics and analog/RF  
circuitry than at MCUs or HDL, so I do only enough embedded things to  
get the whole instrument working.  Fancy embedded stuff I either leave  
to the experts, do in hardware, or hive off to an outboard computer via  
USB serial, depending on the project.

It's certainly true that things get complicated fast, but they did in  
the old days too.  Of course the reasons are different: nowadays it's  
the sheer complexity of the silicon and the tools, whereas back then it  
was burn-and-crash development, flaky in-system emulators, and debuggers  
which (if they even existed) were almost as bad as Arduino.

I still have nightmares about the horribly buggy PIC C17 compiler for  
the PIC17C452A, circa 1999.  I was using it in an interesting very low  
cost infrared imager <http://electrooptical.net#footprints>.  I had an
ICE, which was a help, but I spent more time finding bug workarounds  
than coding.

Eventually when the schedule permitted I ported the code to HiTech C,  
which was a vast improvement.  Microchip bought HiTech soon thereafter,  
and PIC C died a well deserved but belated death.

My son and I are doing a consulting project together--it's an M4-based  
concentrator unit for up to 6 UV/visible/near IR/thermal IR sensors for  
a fire prevention company.  He just got the SPI interrupt code working  
down on the metal a couple of minutes ago.  It's fun when your family  
understands what you do. :)

Cheers

Phil Hobbs


--  
Dr Philip C D Hobbs
Principal Consultant
Re: Engineering degree for embedded systems
On 03/08/17 16:03, Phil Hobbs wrote:
[snip]

Yes, that's all true. The speed of getting something going
is important for a beginner. But if the foundation is "sandy"
then it can be necessary and difficult to get beginners
(and managers) to appreciate the need to progress to tools
with sounder foundations.

The old-time "sandy" tool was Basic. While Python is much
better than Basic, it is still "sandy" when it comes to
embedded real-time applications.


[snip]

I wish more people took that attitude!


[snip]

Agreed. The key difference is that with simple-but-unreliable
tools it is plausible that mortals can /understand/ the tools'
limitations, and know when/where the tool is failing.

That simply doesn't happen with modern tools; even the world
experts don't understand their complexity! Seriously.

Consider C++. The *design committee* refused to believe that C++
templates formed a Turing-complete language inside C++.
They were forced to recant when shown a correct, valid C++
program that never finished compiling - because, during
compilation, the compiler was (slowly) emitting the sequence
of prime numbers! What chance have mere mortal developers
got in the face of that complexity?
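
For flavour, here is a compile-time prime test in the spirit of Erwin
Unruh's infamous program (his emitted the primes as compiler *error
messages*; this toy just forces the compiler to do the arithmetic,
C++11 or later):

    // The compiler, not the program, evaluates all of this.
    template<int N, int D>
    struct HasDivisor {         // does any d in [2..D] divide N?
        static const bool value =
            (N % D == 0) || HasDivisor<N, D - 1>::value;
    };
    template<int N>
    struct HasDivisor<N, 1> {   // base case: stop above 1
        static const bool value = false;
    };
    template<int N>
    struct IsPrime {
        static const bool value = !HasDivisor<N, N - 1>::value;
    };

    static_assert( IsPrime<13>::value, "13 is prime");
    static_assert(!IsPrime<15>::value, "15 = 3 * 5");

    int main() { return 0; }

Each instantiation recurses into another, so the compiler is running an
interpreter whether its authors intended one or not.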

Another example: C/C++ is routinely used to develop
multi-threaded code, e.g. using Pthreads. That's despite
C/C++ specifically being unable to guarantee correct
operation on modern machines! Most developers are
blissfully unaware of (my *emphasis*):

Threads Cannot be Implemented as a Library
Hans-J. Boehm
HP Laboratories Palo Alto
November 12, 2004 *
In many environments, multi-threaded code is written in a language that
was originally designed without thread support (e.g. C), to which a
library of threading primitives was subsequently added. There appears to
be a general understanding that this is not the right approach. We provide
specific arguments that a pure library approach, in which the compiler is
designed independently of threading issues, cannot guarantee correctness
of the resulting code.
We first review why the approach *almost* works, and then examine some
of the *surprising behavior* it may entail. We further illustrate that there
are very simple cases in which a pure library-based approach seems
*incapable of expressing* an efficient parallel algorithm.
Our discussion takes place in the context of C with Pthreads, since it is
commonly used, reasonably well specified, and does not attempt to
ensure type-safety, which would entail even stronger constraints. The
issues we raise are not specific to that context.
http://www.hpl.hp.com/techreports/2004/HPL-2004-209.pdf
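
One of Boehm's own examples, paraphrased as a sketch: with x == y == 0,
the Pthreads rules say neither branch below can fire, so there is no
race. But a compiler that knows nothing about threads may legally
speculate the store (rewriting t1's body as "++y; if (x != 1) --y;"),
after which t2 really can observe y == 1:

    #include <pthread.h>

    int x = 0, y = 0;

    static void *t1(void *arg) { (void)arg; if (x == 1) ++y; return 0; }
    static void *t2(void *arg) { (void)arg; if (y == 1) ++x; return 0; }

    int main(void)
    {
        pthread_t a, b;
        pthread_create(&a, 0, t1, 0);
        pthread_create(&b, 0, t2, 0);
        pthread_join(a, 0);
        pthread_join(b, 0);
        return x + y;   /* "obviously" always 0 -- but not guaranteed */
    }

No library call can forbid that transformation; only a language-level
memory model can, which is what C11/C++11 eventually added.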


[snip]

There are always crap instantiations of tools, but they
can be avoided. I'm more concerned about tools whose
specification precludes good and safe implementations.


[snip]

Lucky you -- I think! I've never been convinced of the
wisdom of mixing work and home life, and family businesses
seem to be the source material for reality television :)


Re: Engineering degree for embedded systems

[snip]

I don't think that particular criticism is really fair - it seems the
(rather simple) C preprocessor is also "Turing complete", or at least
close to it, e.g.:

https://stackoverflow.com/questions/3136686/is-the-c99-preprocessor-turing-complete


Or a C prime number generator that mostly uses the preprocessor

https://www.cise.ufl.edu/~manuel/obfuscate/zsmall.hint
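
The core trick behind those links is that the preprocessor can "loop"
by re-including a file. A toy version of the technique (save the first
part as count.h; nothing like as far as the links push it):

    /* count.h -- no include guard, on purpose: each inclusion
       re-runs these directives, i.e. one loop iteration. */
    #if !defined(COUNT)
    #  define COUNT 1
    #elif COUNT == 1
    #  undef COUNT
    #  define COUNT 2
    #elif COUNT == 2
    #  undef COUNT
    #  define COUNT 3
    #endif

    #if COUNT < 3
    #  include "count.h"    /* recurse until COUNT reaches 3 */
    #endif

    /* main.c */
    #include <stdio.h>
    #include "count.h"
    int main(void) { printf("%d\n", COUNT); return 0; }   /* prints 3 */

Bounded by the implementation's include-depth limit, hence "at least
close to it".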

At any rate, "compile-time processing" is a big thing now in modern C++;
see e.g.

Compile Time Maze Generator (and Solver)

https://www.youtube.com/watch?v=3SXML1-Ty5U


Or, more topically for embedded systems, there are things like Kvasir,
which does a lot of compile-time work to ~perfectly optimise register
accesses and hardware initialisation:

https://github.com/kvasir-io/Kvasir
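
The underlying idea, as a sketch (this is not Kvasir's actual API):
describe each field write as a compile-time mask/value pair and let
constant folding merge them, so several logical writes become one bus
access (C++17):

    #include <cstdint>

    struct FieldWrite { std::uint32_t mask, value; };

    template<typename... Ws>
    constexpr FieldWrite merge(Ws... ws)
    {
        FieldWrite out{0, 0};
        // Fold over the pack: OR all masks and values together.
        ((out.mask |= ws.mask, out.value |= ws.value), ...);
        return out;
    }

    inline void apply(volatile std::uint32_t *reg, FieldWrite w)
    {
        *reg = (*reg & ~w.mask) | w.value;   // one read-modify-write
    }

    // Three logical pin configurations, one actual register access.
    // The register address is illustrative, not a real part's map.
    void init_pins()
    {
        constexpr auto w = merge(
            FieldWrite{0x3u << 10, 0x1u << 10},   // pin 5: output
            FieldWrite{0x3u << 12, 0x2u << 12},   // pin 6: alternate
            FieldWrite{0x3u << 14, 0x0u << 14});  // pin 7: input
        apply(reinterpret_cast<volatile std::uint32_t *>(0x48000000u), w);
    }

Kvasir goes much further than this, but merging register work at
compile time is the heart of it.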

[...]


--  

John Devereux

Re: Engineering degree for embedded systems
John Devereux wrote on 8/6/2017 9:40 AM:
[snip]

Funny, compile-time program execution is something Forth has done for
decades.  Why is this important in other languages now?


--  

Rick C

Re: Engineering degree for embedded systems
On 06/08/17 17:51, rickman wrote:
[snip]

It isn't important.

What is important is that the (world-expert) design committee
didn't understand (and then refused to believe) the
implications of their proposal.

That indicates the tool is so complex and baroque as to
be incomprehensible - and that is a very bad starting point.


Re: Engineering degree for embedded systems
Tom Gardner wrote on 8/6/2017 3:13 PM:
[snip]

That's the point.  Forth is one of the simplest development tools you will
ever find.  It also imposes some of the fewest constraints.  The only people
who think it is a bad idea are those who think RPN is a problem and object
to other trivial issues.

--  

Rick C

Re: Engineering degree for embedded systems
On 08/06/2017 07:21 PM, rickman wrote:
[snip]

I used to program in RPN routinely, still use RPN calculators
exclusively, and don't like Forth.  Worrying about the state of the
stack is something I much prefer to let the compiler deal with.  It's
like C functions with ten positional parameters.
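
The failure mode I mean, plus the usual C-side cure (hypothetical API,
C99 designated initializers):

    #include <stdio.h>

    /* Ten positional parameters: quick, which argument is 'bold'?
       void draw_text(int x, int y, int r, int g, int b, int bold,
                      int italic, int layer, int size, int wrap); */

    /* Naming the fields makes the "stack state" explicit: */
    struct TextStyle {
        int x, y;
        int r, g, b;
        int bold, italic;
        int layer, size, wrap;
    };

    static void draw_text(struct TextStyle s, const char *msg)
    {
        printf("(%d,%d) size %d: %s\n", s.x, s.y, s.size, msg);
    }

    int main(void)
    {
        struct TextStyle s = { .x = 12, .y = 40,
                               .r = 255, .g = 255, .b = 255,
                               .size = 14, .wrap = 1 };  /* rest are 0 */
        draw_text(s, "hello");
        return 0;
    }

Let names, not positions, carry the meaning; that's all I'm asking of a
stack language too.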

Cheers

Phil "existence proof" Hobbs

--  
Dr Philip C D Hobbs
Principal Consultant
Re: Engineering degree for embedded systems
Phil Hobbs wrote on 8/7/2017 12:40 PM:
[snip]

If you are writing Forth code and passing 10 items into a definition, you
have missed a *lot* about how to write Forth code.  I can see why you are
frustrated.

--  

Rick C

Re: Engineering degree for embedded systems
On 08/07/2017 03:18 PM, rickman wrote:
[snip]
I'm not frustrated, partly because I haven't written anything in Forth
for over 30 years. ;)

And I didn't say I was passing 10 parameters to a Forth word, either.
It's just that having to worry about the state of the stack is so 1975.
I wrote my last HP calculator program in the early '80s, and have no
burning desire to do that again either.

Cheers

Phil Hobbs

--  
Dr Philip C D Hobbs
Principal Consultant
Re: Engineering degree for embedded systems
Phil Hobbs wrote on 8/7/2017 3:27 PM:
[snip]

You clearly mentioned 10 parameters, no?

I get that you don't fully understand Forth.  When I said "the only people
who think it is a bad idea are those who think RPN is a problem and object
to other trivial issues", by "other trivial issues" I was referring to the
use of the stack.

--  

Rick C

Re: Engineering degree for embedded systems
On 08/07/2017 04:47 PM, rickman wrote:
[snip]

Yes, I was making the point that having to keep the state of the stack
in mind is error-prone in the same way as passing that many parameters
in C.  It's also annoying to document.  In C, I don't have to say what
the values of the local variables are--it's clear from the code.

[snip]

Well, the fact that you think of Forth's main wart as a trivial issue is
probably why you like it.  ;)

Cheers

Phil Hobbs

--  
Dr Philip C D Hobbs
Principal Consultant
Re: Engineering degree for embedded systems
Phil Hobbs wrote on 8/7/2017 5:30 PM:
[snip]

Yes, it is error-prone in the same way adding numbers is for a fourth
grader.  So use a calculator... but that's actually slower, and can't be
done if you don't have a calculator!  That's the analogy I would use.
Dealing with the stack is trivial if you make a small effort.

Once I was in a discussion about debugging stack errors, which are usually
a mismatch between the number of parameters passed in or out and the number
the definition actually uses.  This is exactly the sort of problem a
compiler can check, but it typically is not done in Forth.  Jeff Fox simply
said something like: this proves the programmer can't count.  I realized
how simple the truth is.  Considered in the context of how Forth programs
are debugged, this is simply not a problem worth having the compiler deal
with.  If you learn more about Forth you will see that.

The stack is not the problem.


[snip]

Yes, I expect you would call this a wart too...

https://k30.kn3.net/AB653626F.jpg

I think Forth's biggest problem is people who can't see the beauty for the  
mark.

--  

Rick C

Re: Engineering degree for embedded systems

[snip]

Could both of you learn to trim your posts? Then I might read enough
of them to be interested.

Stephen

--  
Stephen Pelc, snipped-for-privacy@mpeforth.com
MicroProcessor Engineering Ltd - More Real, Less Time
Re: Engineering degree for embedded systems
On 08/07/2017 07:49 PM, Stephen Pelc wrote:
[snip]

Hit "end" when you load the post.  Works in Thunderbird at least.

Cheers

Phil Hobbs

--  
Dr Philip C D Hobbs
Principal Consultant
