Being too dependent on one "guru"

Greetings:

The department in which I work has a conundrum. We are a research-lab environment staffed by Ph.D. scientists, mostly without EE or computer engineering backgrounds (they are primarily mechanical engineers, and among the world's best).

Also, we have technologists with mostly 2 year technical degrees, and one software engineer with a great deal of experience in industrial controls, primarily in the engine industry (engine test stand type work).

I am an oddball among the techs: my title is Laser/Optical Technologist, but I hold a 4-year degree in chemistry and spend roughly 75% of my time on electrical engineering tasks.

Five years ago, when I had been employed there for only a year, my boss encouraged me to attend a week-long LabView training course. We agreed it would be potentially useful and I complied, even though my initial impression was that, since I could already program in C and assembly, I wouldn't like LabView very much. That turned out to be true. I haven't used it yet, though I keep it under consideration, since I mainly know how to do "close to the hardware" programming and have not programmed under any OS since MS-DOS and the 8086. I know nothing about GUI programming under Windows or X, so LabView offers someone like me an easy way to build GUI interfaces while focusing the bulk of my efforts on the low-level stuff.

But this is getting off the subject. The point is that I didn't understand at the time why my boss wanted me to learn about LabView.

Now I understand. It is because they are concerned about what would happen to them if the software guru got hit by a bus or otherwise became unavailable. They think he may be so difficult to replace that they would be in real trouble finding someone to help them if he went missing and they had a major failure at the same time.

The same line of thinking makes them squeamish about my proposals to build new laboratory control systems with very far-reaching capabilities. They simply don't understand anything other than PCs with COTS DAQ/DIO boards installed. Custom microcontroller boards and FPGAs strike them as "esoteric" and "highly specialized," so they'd never find anyone who could understand what I've done if I vanished as well.

I am planning to present them a case for why these concerns are overblown, and for why I should be afforded some liberty to get more involved with software development, particularly on the close-to-the-hardware side of things, for an upcoming large project. Even though that might cost them some added time to deploy the system, it's OK in my view, because the system is to replace a legacy PC/DIO engine control system that presently works fine but would be very time-consuming to replace if it broke (the boards inside are no longer available, and the PC runs DOS to perform real-time tasks). I also have a prototype replacement system based on a TMS320F2812 that could be deployed within weeks if one of the legacy machines failed.

So the price they would pay in greater labor costs if I build the embedded side of the new system (the software guy would still need to be involved to write a Windows GUI) buys them "cross-over": both the software guy and I would understand the core of the new system.

I think it is better to have me do the embedded side from scratch, because I am one of those who find it much easier to develop a program from scratch than to understand and extend someone else's. That is mainly because I only program "part-time." The software guy, by contrast, could probably decipher my entire work and begin making enhancements in much shorter order than I could decipher his.

No matter what, it will be to their advantage to have two people, instead of only one, able to work under the hood of the new system. Since replacement skills are harder to find for the embedded side than for the GUI side, that is where the initial effort at achieving such "cross-over" should be expended.

There's another reason why it makes sense, and they are also aware of this: if they don't employ me to get more involved in the embedded system design and programming, they may force themselves into exactly the situation they fear. For this particular project, as has happened a few times in the past, I would be relegated to the task of merely wiring the hardware to a bunch of user-accessible connectors. This is not work that allows me to accumulate experience in embedded development, or in FPGA development, which is the other thing I'm venturing into. Thus, I would become increasingly motivated to move my career onward somewhere else. Then they'd have to find a replacement who can function both at my level with the lasers/OPOs and with the electronics and embedded uC systems I've already deployed. We had a disappointing outcome just trying to find a suitable replacement laser tech about a year ago.

I have been pushing them to allow me to build a custom DSP/FPGA board based on the F2812, since I am already familiar with that CPU. But the software guy is not very happy about this; he says it will take too much time and we should just use COTS. At this point I am actually starting to agree with him, but I don't want to admit it, and I keep arguing for custom, because I know custom will lock him out of the embedded software side, at least until the FPGA-implemented peripherals are developed into a fairly stable form from the programmer's perspective.

But now that I see their concerns about becoming too dependent upon gurus, it might be in all of our best interests for me to compromise on the custom/COTS issue and go with a COTS board. I just rediscovered that Innovative Integration offers a DSP-plus-user-programmable-FPGA board, though it is even more overkill than what I was planning to build. Then, if I do the DSP/FPGA programming on the COTS board, it will benefit them by initiating the cross-over of skills that they want.

Does this sound like a sensible approach that provides mutual economic benefit to all parties?

Thanks for input.

Good day!

--
_____________________
Christopher R. Carlen
crobc@bogus-remove-me.sbcglobal.net
SuSE 9.1 Linux 2.6.5

I think you're missing the point, and if your bosses only have the "guru" angle, then they're missing it, too.

Especially in your kind of shop, Labview or something of its kind is by far the best kind of software system to use -- it's very powerful at a high level, easy to program (after a more or less substantial learning curve), quite reliable, and easy for a scientist with little software training to understand, so he can verify that your program does what he intends before it breaks stuff (if he's interested).

Building small research systems in C and assembly is a big loser. I speak from the point of view of one who has done both -- I spent most of my career doing just that, because the Labview-type programs weren't available or advanced enough to do the job.

In your case, where you appear to have performance needs that Labview can't meet, but a lab environment that needs flexibility, the multiple-level approach is ideal. You need the high-level flexibility, which the research scientists themselves could implement if they had the interest, but which a tech or entry-level software person could handle (under expert supervision, of course). This can make possible experiments that the researchers would not otherwise consider, because the lower level design and programming is so costly and time-consuming.

Where you need performance, the microcontroller/DSP approach is much easier to build at the lowest possible level, which will usually be stable enough to be usable with little or no change between experiments. Set up a low-level command protocol, and let Labview do the high-level stuff, then send the low-level commands via the protocol to the low-level controllers. The simpler the low-level stuff, the better and more reliable the system will be.

I do understand custom microcontrollers and such (I've built several myself), and I agree with them that you're dead wrong, even foolish, to want to build research systems at such a low level. In fact, you can build even farther-reaching systems using the mixed high/low approach, simply because it's so much cheaper and less time-consuming. You can't get higher low-level performance with the low-level-only approach than with the mixed approach, but you can fail to build a system at all with the low-level-only approach, simply because it's too complex to do.

The squeamishness about "guru" culture is altogether justified, but is far less important than the vastly improved ability to do research.

I think you would be wrong, and I hope you would lose your case.


This makes me wonder whether there's even a need for the truly low-level stuff. I've never done DOS, but my impression is that it works like a PLC, which would be much easier to program, and can give submillisecond performance for the stuff Labview can't handle. If you truly need microsecond performance, then maybe...

Dead wrong. What they need is a system they can change on short notice. The less low-level stuff they have, the lower their risk is. They could use two software people, yes. You would do better by them (and by yourself) to bite the bullet and learn to use Labview. That way, in the rather unlikely event that you really need the truly low-level stuff, you can do it. They don't have to pay you to do something that's probably mostly unnecessary, and if you should get hit by a bus, less of the system depends upon mysterious stuff that they'll need an expensive consultant to sort out.

Sorry, all my sympathies go to your bosses. They hired you for one job, and you're apparently trying to change it into something they probably don't need and that is inappropriate. If they truly need low-level stuff, they should have (or develop) an embedded expert. But that will be so costly to keep in-house that they must have a truly compelling need.

Low-level embedded systems have their place, but it's not in the small research lab.

It is unwise in the extreme to build a simple, low-complexity custom one-off board if a commercial alternative exists. It is simply foolish to try to build a complex board if a commercial alternative exists, especially if you're not already expert at building such boards. Your bosses are dead right.

And, for a research lab, gross overkill in a commercial $5k DSP/microcontroller/fpga board is still dirt cheap compared to building even a simple custom microcontroller board. Just as $15K for a reusable, flexible Labview system is dirt cheap compared to building even one C/assembly language system.

John Perry


Without a full-time EE? Unfortunate. The lab borders on useless, IMO.

Labview suggests programming is complicated and comes up with an incredibly complicated graphical interface. As an experienced programmer, I once spent a month trying to put these wires through walls, a job I could have solved in an hour or so on a PC with a decent language. But it was Labview on a Mac. Dump it in favour of LabWindows.

Any scientist, including the world's best, should be able to program in a decent language; otherwise they are useless, at least once they leave research.

We once had an Innovative Integration DSP board. In the longer run, the missing library source code made more work for us than building the board ourselves would have.

Rene

--
Ing.Buero R.Tschaggelar - http://www.ibrtses.com
& commercial newsgroups - http://www.talkto.net

My background was similar to yours, just separated by a couple of decades. I designed analytical instrumentation for a research university back when it was all done with logic, not processors. It was incredibly challenging and incredibly fun. Back then, we were just on the threshold of the build-vs-buy argument. Now there is a very rich set of tools and solutions from which to choose. In the time between then and now, I went off and founded a small product-development company that employs people similar to myself. I can probably provide comments on both sides of the question.

Even though the technology and times have changed, the fundamental theme remains the same. Buy as high up the commercial ladder as you can. Only build what you cannot get otherwise. And when you do build, create it using a highly disciplined approach that will allow someone else to re-create the results of your work. It is counter-productive for any organization for a single individual to be too deeply invested in any one product/process/program. The bus/meteor/terrorist argument is a valid one.

And, I would add, it is counter-productive for the individual as well. While your time is locked up twiddling bits for something you could have purchased, a project that actually requires your unique skills may come along and go by the wayside. There is an out-of-print book with a great title that says it well: "The Existential Pleasures of Engineering". This stuff is fun. No doubt about it. But we rarely get to do it purely for our own satisfaction. And we should not let the fun we get out of it interfere with the objectives of the people who are funding us (our customers). The voyage of self-discovery is a wonderful thing, but it is expensive.

My career goal has always been to be as prolific as possible: to create as many things as my time, interests, and abilities allowed. To do that, you can't waste time on re-invention. To do that, you have to learn when and how to delegate. (The more projects you are "guru'd" into, the fewer you will ultimately get to do.) To do that, you have to get it right the first time. Tracy Kidder in "The Soul of a New Machine" summed this up best by equating project development to pinball: if you are successful, the only thing you win is an invitation to play the next game, to do the next project. And if you lose, the game is over; you don't get to do the next one. You are only free to rise to new challenges as long as everything you have done in the past still works. Otherwise you will spend a good chunk of your time fixing your mistakes. And again, the stuff you maybe were really born to do will pass by without you.

Hundreds of thousands of man-hours have gone into the creation of lots of really neat stuff (real-time OSes, TCP/IP stacks, etc.). While it might be fun to build your own, I would offer that unless you are paying for it on your own dime, it is unethical to expect a customer to pay for such a development when cost-effective alternatives exist. This is not an easy determination to make. There are always factors that obscure the choice. But if you base your decision on what is in the best interests of the customer, you will choose correctly most of the time. While all of this might make what you do very rewarding personally, it will not make you wealthy. I have yet to find proof to the contrary.

Blakely


Hello Chris,

No EEs? Maybe you guys don't do much in electronics in which case that would be ok, to some extent. Or you have a really good tech who understands electronics well enough to get by ;-)

Your boss has a very valid concern here. He or she has to think that way.

With all due respect, I would side with your boss here. In a research lab, and also in a production environment, you want to see the least amount of custom or proprietary gear possible. Ideally everything should be COTS, just like LabView is. Try to understand your bosses: they would have a lot of egg on their faces if something went wrong down the line with proprietary gear that could have been done COTS. They would be asked, "How could you let that happen?"

And then one day you'd find out that the chosen DSP has been obsoleted.

I was in your boss's position for many years, and I thought the same way: if you don't have to do something from scratch, then don't. We tried to keep everything COTS and only resorted to custom builds when there either was no other way or the cost was truly outrageous.

If you want to get into uC and DSP design, you'd have to leave the lab career and move toward product design. Just be aware that this is nowadays a cut-throat, cost-cutting environment. The most elegant solution is of no interest to anybody; the cheapest solution is what wins you points there. A very, very different environment from where you are now. It's the daily bread here: you turn a couple of BAT54s or a BSS123 around and around to see if they couldn't be replaced by something that costs one or two cents less. Your opamp du jour becomes the LM324, because everything else is too expensive. If the offset is too high, you'll have to clamp it. Actually, most of the designs will likely be transistor-level anyway, for cost reasons.

I don't want to discourage you but be prepared before you take that plunge. It's cold out there.

Regards, Joerg



Me too, except even more decades. That was a time when there was no buy option, and even after, say, 1975 there was no reliable buy option at a reasonable price. We don't need to mess with NAND gates etc. today, when we can buy a few PAL/GAL chips and synthesize any specialized logic needed. I went from hardware to software by the late '70s, and for some time could depend almost entirely on the hardware I had designed in the early '70s. However, by adhering to standards I could ensure that my software designs would last much longer, even up to the present.

Just as by 1975 I wouldn't think of designing a hardware CPU any more, today I would not consider designing a CPU board in most circumstances. I would consider designing a simple preprocessing i/o peripheral.

--
Chuck F (cbfalconer@yahoo.com) (cbfalconer@worldnet.att.net)
   Available for consulting/temporary embedded and systems.
     USE worldnet address!

> I have been pushing them to allow me to build a custom DSP/FPGA board

You are both right, to a degree.

The test should always be to justify the next level of complexity.

viz.: if Labview and a relay will work, then use them.

If you can prove some 'coal face' HW is needed to augment labview, then that should be used.

Only if you can prove that a standard board (and that _includes_ vendors' silicon eval PCBs) will not be suitable should you embark on greenfield development. In that case, a signal-conditioning PCB is often all that will be needed.

I'd look very closely at the better FPGA development PCBs: ones with USB, Ethernet, and VGA support. You'll need USB/Ethernet to get decent speed into Labview, and VGA allows some local display for setup, trace, and fault-finding. Then explore the soft-CPU support around, and you'll have something more flexible than your custom F2812+FPGA PCB, without the PCB development cycle.

There are also ARM chips just emerging with flash and ADC/Ethernet/USB/CAN, so an eval PCB for one of those would also be useful. Ask ST about an eval PCB for their STW22000 - though that might be a tad new to deploy in a lab...

-jg

