The value of a CS education

No, you won't learn anything more from the ASM than from the HLL.

It "doesn't matter" if performance isn't an issue. Just like choice of a linear vs switching regulator might not matter if efficiency and thermal issues aren't an issue.

Note that most code does NOT run on/in a desktop. *And* that portable devices are becoming increasingly more complex and capable (e.g., you wouldn't think of cache -- beyond the tiny prefetch buffer -- in a handheld device 10 years ago; but, I suspect a good many devices today incorporate a fair bit of cache and *rely* on it for performance).

The same is true (to a lesser degree) with VM, XIP, etc.

And power-management issues further argue for this level of familiarity with these sorts of techniques.

No, if the object won't fit in a cache-line (or VM page), you will see a difference in performance. An object 10 times larger will see a proportionately larger difference. Etc.
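To put a number on it, here is a minimal sketch (assuming a 64-byte cache line; the structure sizes are invented purely for illustration): walking an array of 8-byte objects uses every byte of each line fetched, while walking objects roughly ten times a cache line in size drags in about ten lines per element touched -- with the loop itself unchanged.

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

#define N 100000

struct small {                     /* 8 bytes: ~8 objects per 64-byte line  */
    uint32_t key;
    uint32_t value;
};

struct large {                     /* ~640 bytes: ~10 cache lines per object */
    uint32_t key;
    uint32_t value;
    uint8_t  payload[632];
};

static struct small s[N];
static struct large l[N];

int main(void)
{
    uint64_t sum_s = 0, sum_l = 0;

    for (size_t i = 0; i < N; i++)
        sum_s += s[i].value;       /* every fetched line is fully used       */

    for (size_t i = 0; i < N; i++)
        sum_l += l[i].value;       /* ~10x the memory traffic, same "work"   */

    printf("%llu %llu\n",
           (unsigned long long)sum_s, (unsigned long long)sum_l);
    return 0;
}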

And the *chemist* has that sort of intimate knowledge of the differences? Counted strings vs. ASCII-Z strings? The differences between Call by Name vs. Call by Value semantics, etc.?

See above. I use it, for example, to run code out of serial flash devices. The speedup (with resource constraints) is *huge*.

And exactly where do the physics/chemistry majors have those classes in *their* curricula?

The portion of my original comment that preceded this example (which you neglected to include) stated: 'A (good) "software" curriculum (avoiding the use of "CS") exposes people to issues that are extremely DIFFICULT TO STUMBLE ONTO IN A CASUAL (i.e., taking whichever jobs come your way) CAREER.' [emphasis mine] The "chemist" has to rely on someone more knowledgeable of the technology to bring this to his/her attention. Perhaps a physicist?

How many successfully master multithreaded programming techniques?

How many are aware of the different techniques (and shortcomings) for synchronizing competing actors?

How many can exploit a multiprocessor environment? How efficiently do they learn these techniques (i.e., what is it costing their employer for this OJT -- in terms of increased development times and/or reduced reliability)?
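As one concrete illustration (a sketch only; thread and iteration counts are chosen arbitrarily), here is the classic lost-update race that a developer without that exposure walks straight into -- the unguarded counter usually comes up short, the mutex-guarded one doesn't:

#include <pthread.h>
#include <stdio.h>

#define THREADS 4
#define ITERS   1000000

static long counter;                       /* shared, unprotected          */
static long safe_counter;                  /* shared, mutex-guarded        */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *racy(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERS; i++)
        counter++;                         /* read-modify-write race       */
    return NULL;
}

static void *guarded(void *arg)
{
    (void)arg;
    for (int i = 0; i < ITERS; i++) {
        pthread_mutex_lock(&lock);
        safe_counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t t[THREADS];

    for (int i = 0; i < THREADS; i++)
        pthread_create(&t[i], NULL, racy, NULL);
    for (int i = 0; i < THREADS; i++)
        pthread_join(t[i], NULL);

    for (int i = 0; i < THREADS; i++)
        pthread_create(&t[i], NULL, guarded, NULL);
    for (int i = 0; i < THREADS; i++)
        pthread_join(t[i], NULL);

    /* counter is usually short of the expected total; safe_counter isn't */
    printf("racy: %ld  guarded: %ld  expected: %ld\n",
           counter, safe_counter, (long)THREADS * ITERS);
    return 0;
}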

[I chuckle thinking of all the shows Weir did "trying" to learn how to play slide guitar. Who bore the cost of that learning experience?]
Reply to
Don Y

My point is how these people tend to be *hired*. Many employers (or clients) think it's "programming" that they are buying. A parallel would be hiring a carpenter to frame your house -- without having blueprints for him to work off of!

I chuckle when I hear prospective clients tell me they "just need someone to program/code"... as if they were hiring a plumber to install a toilet in their new bathroom. They fail to understand the issues that they can't *see*. Would they try to hire someone to "just fabricate a couple of circuit boards" -- without first having the boards laid out?

That's not necessarily true. E.g., the object paradigm was an entirely new way of approaching a problem vs. the "procedural" approach. Virtual machines (now practical) represent yet another paradigm that offers different capabilities -- and liabilities.

It's not just languages. E.g., I am doing a fair bit of SQL "programming" lately. The scale of the operations is very different than writing "traditional" code. And, the costs of bad design choices can quickly make a solution unworkable. A "chemist" would find it a painful learning experience (on someone else's dime!).

I, for example, am completely clueless when it comes to writing efficient shell scripts which can consume large portions of a server's resources if poorly crafted. (I have friends who intuitively know which shells are most efficient for particular types of actions, as well as which "commands" carry surplus costs, etc.) I attribute the difference to the fact that their actions are less like formal programming than "automation".

I know excellent plumbers! I wouldn't want any of them to decide *where* the fixtures should be located in my house, though... They're good at sweating pipes, not planning (human) usage patterns and traffic flow.

(they'd probably arrange ALL of the bathrooms and kitchen in a tight cluster at the *front* of the house close to the water main's entry to the premises -- cuts down on pipe and joints! :< )

Reply to
Don Y

What a load of Tosh! We need professionals, not people who pick things up in their back yards and off YouTube. A CS degree will teach you much more than just coding. Of course a 12-year-old can do much of it if they work hard. A complete system could require a knowledge of advanced mathematics, not just simple coding. It depends greatly on the application. If I were to hire a programmer for National Instruments, I wouldn't hire a self-taught person.

Hardy

Reply to
HardySpicer

On a sunny day (Fri, 30 Sep 2011 22:48:01 -0500) it happened Les Cargill wrote in :

No wonder, you should have asked for Dijkstra :-)

Reply to
Jan Panteltje

On a sunny day (Sat, 01 Oct 2011 08:00:49 GMT) it happened snipped-for-privacy@puntnl.niks (Nico Coesel) wrote in :

A 'programmer' is a bit of an elusive word. As 'a programmer' you always need to know about at least two things: 1) programming, and 2) the subject you write the program for. For example, when I started writing code for the financial markets I really had no clue, and I had to buy study material, and even take a course, buy books, etc., to even be ABLE to write any code.

In electronics if you want to program for embedded, you need to know about hardware first. Only in very silly cases is subject knowledge not needed, like printing 'hello world' 1000 times.

So the statement that programmers do not 'learn' is silly. Programmers usually know in depth about at least two subjects, probably more.

So to translate that into the real world: if you have a degree in information technology, that does not mean you can program a guidance system for a space shuttle. You would not even understand the jargon. From the reverse point of view, maybe having a degree in aeronautics, you could more easily learn programming on top of that. But programming requires a clear mind and logical thinking. So if you are a dope-smoking, 90% to 100% spaced-out expert on something, that may not be enough to allow you to program stuff for that something. Not everybody can be a programmer; some cannot stay on subject for 2 minutes. No wonder, with all those teevee series...

So this translates for me: if I needed to hire a programmer, I would first look for an expert in the field he needs to program for, then check whether he has ever done some programming, and also whether he has a persistent enough set of mind. In the past my choice was not always agreed to by the big boss, maybe for political reasons or whatever, and then they would select a candidate other than the one I recommended after interviewing. So, what can one do? The world still turns, MS Windows still sells, bloat or not. Now they fire another 2,500 people at Nokia; look up what I predicted a few years ago when they went for Qt.

Top down, failure.

There is a big health-care software project that has now been scrapped (after billions spent) in the UK. Top down, failure. An architect needs to know about the stones and concrete. Dream castles in the air collapse in the real world.

I did see a documentary of the last voyage of a UK submarine; there they were, under water, in front of the blue screens... [1] It is easy to launch a cruise missile from there at Ghadaffi, if MS Windows works... But I was wondering how fast their system would respond under attack... But... MS (USA-UK alliance) sold some more software. [Do we] NEED war for the crap to be scrapped and sanity and performance to return? I personally think so. But I count in the thousands if not millions of years. Quite possibly Earth will be ruled by insects by that time. I got bitten by a mosquito last night, origin traced to a pool of water in my garden; I emptied it today. War of the species...

The mosquito had no PhD, but got me anyway; better equipped, stealth, attacked me in my sleep. Probably IR guidance too.

Copyright (c) Jan Panteltje 2011-always. All Rights Reserved. Nothing of this may be copied without permission of the Author. Violators agree to pay Jan Panteltje 1000 Euro.

[1] little juvenile cowards, yes we do this for the greatness of the UK empire. Rubbish.
Reply to
Jan Panteltje

From the computational point of view, the performance is not an issue.

For battery powered devices, the total power consumption is a real big issue (no, I do not work for Nokia or any other cellular phone manufacturer :-).

Did you actually perform any power measurements?

On the macro level, if there are real performance issues, I would communicate these issues to my team if required.

As a project manager, I have no problem using a person with some specific knowledge to do various hard things.

In my experience, it takes about 1-3 months to train a single-thread specialist in multithreading.

The first thing to teach is how to avoid the need for synchronization primitives.
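One illustration of that first lesson (a sketch only; the names and counts here are invented): give each thread data that only it owns, and combine the results after the join -- the mutex disappears because the sharing did.

#include <pthread.h>
#include <stdio.h>

#define THREADS 4
#define ITERS   1000000

struct worker {
    pthread_t tid;
    long      partial;       /* owned by exactly one thread: no lock needed */
};

static void *work(void *arg)
{
    struct worker *w = arg;
    for (int i = 0; i < ITERS; i++)
        w->partial++;        /* no other thread ever touches this field     */
    return NULL;
}

int main(void)
{
    struct worker w[THREADS] = {0};
    long total = 0;

    for (int i = 0; i < THREADS; i++)
        pthread_create(&w[i].tid, NULL, work, &w[i]);
    for (int i = 0; i < THREADS; i++) {
        pthread_join(w[i].tid, NULL);
        total += w[i].partial;   /* safe: that thread has already exited    */
    }
    printf("total = %ld\n", total);
    return 0;
}

(In a real system one would also pad each worker structure out to a cache line so adjacent counters don't falsely share a line -- which circles right back to knowing what the hardware is doing underneath.)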

Using multiple processors is really a high-level decision, despite 60 years of theoretical studies :-).

Reply to
upsidedown

But CS is so fashion-driven. I remember breathless CS types telling me about how wonderfully modern and cool Java was, because it compiled source down into code for an _abstract_virtual_machine_ (shiver). They were sort of hurt when I said, "Oh, just like UCSD Pascal, circa 1978."

(Or, of course, Don Knuth's MIX language, circa 1962.)

Re Knuth and CS education: Here's a quote from his web page about why he came out with a new version of MIX:

"Moreover, if I did use a high-level language, what language should it be? In the 1960s I would probably have chosen Algol W; in the 1970s, I would then have had to rewrite my books using Pascal; in the 1980s, I would surely have changed everything to C; in the 1990s, I would have had to switch to C++ and then probably to Java. In the 2000s, yet another language will no doubt be de rigueur. I cannot afford the time to rewrite my books as languages go in and out of fashion; languages aren't the point of my books, the point is rather what you can do in your favorite language. My books focus on timeless truths. "

That's how CS should be taught, but all too frequently isn't.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510
845-480-2058

hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

That's exactly the attitude that got us Windows Vista.

It's true that a better algorithm is generally more important than more efficient code, and that a good modern compiler will re-order that nested loop for you if you ask it to.

But "in most cases it doesn't matter" is the banner waving in front of all the sloppy coders who ignore, e.g. network latency because their development lab is all GbE and has a big fat pipe to the public network. Recent Firefoxes are horrible that way--a slow DNS server will hang the UI. There's no excuse for that whatsoever--I'm a physicist, but I've been writing multithreaded code since 1992.

Multimegabyte arrays are rare? You're coding elevator controllers, right? And you don't need a huge array for this to matter--getting 1000 elements per cache line versus 1 element will start to matter pretty fast, if that loop gets executed a lot.

And if the whole program is coded like that, it doesn't matter if it spends a lot of time in that loop--no matter where it spends its time, it'll still be unnecessarily slow.
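For readers who don't have the snipped context, the nested-loop fragments under discussion look something like the following (a reconstruction for illustration, not the original code; the array sizes are arbitrary). Both orderings leave value[][] in the same state, but in C the row-major walk uses each fetched cache line completely, while the column-major walk strides across lines:

#define ROWS 2048
#define COLS 2048

static double value[ROWS][COLS];

void fill_row_major(void)          /* cache-friendly order for C arrays     */
{
    for (int r = 0; r < ROWS; r++)
        for (int c = 0; c < COLS; c++)
            value[r][c] = 1.0;
}

void fill_column_major(void)       /* same final state, far more misses     */
{
    for (int c = 0; c < COLS; c++)
        for (int r = 0; r < ROWS; r++)
            value[r][c] = 1.0;
}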

Right. Thought is required, which is the point.

Somebody has to write the compiler back ends, or we'll have to go back to banging rocks together. Expecting general competence from a CS grad is not asking too much.

And the sheer waste of people's time caused by inefficient software is costly enough that a s/w guy should have a sense of responsibility for avoiding it.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

You too?

...or BASIC.

Sure, the language is immaterial. A decent programmer should be able to pick up a new one in days to weeks.

OTOH, I can see CS for compiler writers, and such, but as a gateway for applications programmers (where 99% of the work is)?

Reply to
krw

This stuff matters for coders too. See my reply to upsidedown. (Who can go stand on his head.) ;)

Cheers

Phil Hobbs

Reply to
Phil Hobbs

Keeping up with fashion isn't the same as "staying current". There's more to "software" than just chasing the trendy languages. (And many "programmers" avoid even *that*, preferring, instead, to stick with *a* language that they are comfortable and/or proficient with, in a "one size fits all" approach to *all* problems.)

What I find most interesting is how many technologies can now be "practically" applied -- given advances in hardware, etc. Soon, we'll reinvent MULTICS! :-/

The beauty of something like MIX is that it's NOT something that you can run out and run on a processor (though I suppose someone, somewhere, has built a cross-assembler or interpreter for it).

Instead of teaching skills, its use focuses on defining algorithms. Techniques.

Too much time is wasted nowadays on idioms, coding tricks, *stylistic* issues, etc. -- instead of focusing on "what it is you are trying to do". E.g., using Petri nets to describe dependencies and synchronization operations instead of *an* implementation using TAS instructions, etc.
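For contrast, *an* implementation at that lower level might look like this sketch using C11's test-and-set primitive -- a bare spinlock, with no back-off, fairness, or priority handling:

#include <stdatomic.h>

static atomic_flag lock = ATOMIC_FLAG_INIT;

void spin_lock(void)
{
    /* spin until the flag was previously clear (i.e., we acquired it) */
    while (atomic_flag_test_and_set_explicit(&lock, memory_order_acquire))
        ;
}

void spin_unlock(void)
{
    atomic_flag_clear_explicit(&lock, memory_order_release);
}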

Unfortunately, many employers have a closer horizon driving their decisions: "I need a guy who can write code using *these* libraries in *this* language with *that* toolchain under whatever API."

My first employer pointed this out to me before I realized it: "I hire people for *tomorrow's* problems. I'm willing to wait, 'today', for them to learn a skill that might be taught in some of the 'diploma mills' -- you can learn *that* skill here. But, I can't afford to teach you what you'll need to know to handle the problems that I can't yet foresee!"

Reply to
Don Y

And multimegabyte "Hello, World!"'s :-/

Or, devices/machines that take more than an *instant* to power-up.

Or, in this case, *not* reorder it!

The problem with much software is the "coder" doesn't think about what is really happening "under the hood". This is what gives you sluggish, bloated UI's, systems with races, latent bugs, etc.

"What do you mean, 'malloc can return NULL'? But *then* what do I do??" (hint: that's the question you were *hired* to answer!)

Developers, too often, work in resource-rich environments which don't correspond with the sort of environment their product is likely to encounter (in terms of desktop apps). E.g., they have a reasonably new machine with "adequate" memory, disk, etc. And, they "test" (play with) their code as a single application -- never thinking that a *real* user might have 5 other applications running at the same time, etc. "Gee, why is this so SLOW?"

Or, in my current case, because I don't want to have to put extra RAM in a device and keep that *powered*.

"Buy a faster processor" ;-)

But, it's more than just "thought". Does the non-CS even *know* that there is a difference here? I.e., the state of value[][] is identical after both fragments have finished executing. "So, what's the big deal?"

"Big Endian? Little Endian? What does *that* have to do with anything?"

"What do you mean, '-0.0'??"

The point I was making is that you won't readily "stumble" onto the knowledge needed to even recognize these issues without a (formal) background in the "science". Or, if you do, it will be at the expense of some development effort that you aren't entirely ready to tackle -- which has forced you into a situation where you must learn this technology, YESTERDAY!

... as hardware designers should be thinking about all those watts that are "doing nothing" much of the time.

Reply to
Don Y

It's a pity that BASIC has such a bad reputation. In its latest incarnations, it's a very nice language.

John

Reply to
John Larkin

I don't know--folks who don't actually take an interest in their own discipline are everywhere. As Jamie pointed out, a fire in the belly is what matters. I had one course in numerical analysis as a grad student--it was taught by a supercomputer guru, Prof. Sebastian Doniach, which made it more fun--but otherwise I picked up all that stuff by reading Knuth and folks like that, more or less recreationally, out of interest.

If you're interested in the subject, you'll make a point of doing things different ways, so as to learn new techniques--useful ones, not just trendy ones. It's sort of like motor mechanics, where part of the reason to do the work yourself is to build up a nice set of tools with the money you save. I don't think you can become a master any other way.

In any good project, there's at least one guy who understands _all_ the technologies involved, well enough to do everybody's job (though not necessarily as well or as fast, obviously). I try really hard to be that person, but it's always a relief when there's somebody else in that category as well. Less stuff falls through the cracks that way. (One reason I wrote a lore book was to try to help other folks get to that point too.)

In battery powered applications, sure. I'm also not a big fan of equipment that sits there drawing power when it's turned off, unless it's something like a spectrum analyzer that needs to keep its crystal oscillator warm. But I don't lose a lot of sleep about a watt or two in line-powered equipment.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

I guess that depends on what the definition of "coder" is. IME, a coder is a technician. They're handed a complete spec, with all the I/Os and data structures defined.

Isn't he already?

Reply to
krw
[elided]

So, is this just a "scholastic exercise"? Is the code expected to actually *run* in a real device? Or, are we just looking at marks on a piece of paper?

I don't need to -- that's the value of an education :>

Note that the above comment addressed the *speed* of the code.

But how do you even *know* what those issues are without the education to recognize them? "Gee, if I had *known* you had cancer, I would have referred you to an Oncologist..."

So, you hire *one* CS person, one math person, one physicist, etc. and expect them, in concert (since they all work together harmoniously!), to implement all of your designs?

Would you hire a bunch of "day laborers" to build your house for you? After all, it's just a hammer and some nails... where's the special skill required *there*?

I'll ignore your use of the word "specialist" (suggesting extreme competence). I find it remarkable (read: unbelievable) that you can get such quick results!

Or, do you end up with lots of "bent over nails" in the process? :>

If two actors *need* to compete, there is no way to "avoid" the issue. "OK, *you* put data into this buffer -- and *you* pull data out of some other, totally unrelated, buffer..."
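When the sharing is real, the machinery has to be real too -- e.g., a minimal bounded buffer guarded by a mutex and condition variables (a sketch; the sizes are arbitrary):

#include <pthread.h>

#define SLOTS 16

static int buf[SLOTS];
static int head, tail, count;
static pthread_mutex_t m          = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  not_full   = PTHREAD_COND_INITIALIZER;
static pthread_cond_t  not_empty  = PTHREAD_COND_INITIALIZER;

void put(int item)                 /* called by the producer                 */
{
    pthread_mutex_lock(&m);
    while (count == SLOTS)
        pthread_cond_wait(&not_full, &m);
    buf[tail] = item;
    tail = (tail + 1) % SLOTS;
    count++;
    pthread_cond_signal(&not_empty);
    pthread_mutex_unlock(&m);
}

int get(void)                      /* called by the consumer                 */
{
    pthread_mutex_lock(&m);
    while (count == 0)
        pthread_cond_wait(&not_empty, &m);
    int item = buf[head];
    head = (head + 1) % SLOTS;
    count--;
    pthread_cond_signal(&not_full);
    pthread_mutex_unlock(&m);
    return item;
}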

Multiple processors are present in almost every environment -- even if they aren't recognizable as processors, per se. E.g., the DMA and display controllers each take on responsibilities that are coordinated with the actions of "the" processor. Desktop machines have multiple cores. Phones have DSP "co-processors". Many subsystems are now bought "on a chip" with a processor hiding inside.

Do they understand the different issues that plague tightly coupled systems vs. loosely coupled? or *networked*? NUMA? NoRMA?

What does it cost you (the employer) for them to stumble onto the consequences of these technologies unprepared?

I am always amazed at how imprecisely people understand real-time system design. E.g., how many openly claim HRT to be "harder" (difficulty) than SRT, etc. And, how few actually *know* the difference.

I.e., it takes a fair bit of exposure to these concepts to understand their ramifications. I surely wouldn't want to have someone "picking it up as he goes" -- at least, not if *my* name is on the package that the product ships in! Does he (or, perhaps, *you*?) even know what to be wary of? When he will *need* to have a "resource" available whose brain he can "pick"?

Reply to
Don Y

It gives me the creeps when some of my guys want to code a project using a mix of C, Python, Perl, make, dongled GUIs, VHDL, dongled VHDL tools, and more, all running under some Windows and some Linux. How is anyone going to be able to change one line of this, and recompile it all, five years from now?

I was talking to a guy at a BIG company, with the Oracle/Agile database library for all their documentation. I mentioned to him that his people took our FPGA design and modified it, and then lost the source code. He said "Don't get me started..."

John

Reply to
John Larkin

Too often, that is a consequence of someone trying to glue together things that weren't designed to *work* together. (note that this doesn't preclude their working together -- e.g., mixing bipolar and CMOS technologies).

In the past, I've never used more than three "languages" in a project:

- assembly for low level drivers and OS interfaces

- some HLL (usually C) for the "meat" of the application

- a 4GL or interpreted language for what amounts to "user scripting"

[One of my current projects increases this to *four* languages as I have to drive an RDBMS with SQL.]

Managing more than one language is made easier if the languages are chosen for specific purposes (see above). OTOH, "because this package was *written* in this language" is usually a recipe for disaster. Especially with "fad" languages and languages that are in a state of flux (I've watched Perl evolve continuously. I suspect Python is doing likewise -- will code written today *work* with a newer version 3 years hence?)

This is actually pretty common (though more often at smaller companies that lack good document controls). Often, *the* "coder" is considered The Responsible Party for the keeping of The Sacred Papers.

I always find it amusing as so many companies' entire existence is based on these documents: "What will you do in the event of a FIRE?" "Oh, we have fire insurance!" "Great! And what did you value your documentation at?" "Huh?"

Reply to
Don Y

The difference is that BASIC is batch processed.

You need to understand the difference.

It NEVER had a 'bad reputation'.

Yet another bent Larkin perception.

Reply to
SoothSayer

I rather liked Tektronix' SPS BASIC. What's not to love about FFT, integral, or derivative primitives? ;-)

Reply to
krw
