OK. Neither of the groups to which I'm cross-posting this is appropriate.
But, y'all are smart, and I know who to listen to.
Problem: I'm working on a proposal for a customer, for an app that's going to require heavy computation and is more or less real time*. The guy I'll be working with most closely is pretty much a 100% LabVIEW programmer -- he just doesn't _do_ C, or C++, or Fortran.
The customer wants me to deliver them an algorithm, from which they'll write the code. They're pretty firm (for good reason) on wanting to do the coding in-house, or with local talent. I'm trying to decide how hard I need to push, early on, for the computationally intensive bits to be done in C++.
I just Googled, and didn't find any good references on the relative speed of doing things in some compiled language vs. LabVIEW. If the ratio is similar to what you get with Scilab or Matlab vs. compiled code, then they need to go with C++.
So -- anyone know? Any comments?
Thanks.
* It's not 100% hard real time, with an "exceed and you die" sort of deadline, but after the nominal deadline the slope of the user-crankiness vs. delay curve is pretty steep. Moreover, while _occasional_ delays could be tolerated, if the computer just can't keep up then the delays will grow ever longer -- and the user ever crankier -- with time.