Hi-Tech Software bought by Microchip - no more other compilers

scripting.

Who said anything about manual? Any decent debugger/ICE will have ways of trapping on failures.

Reply to
krw

Just to clarify, OS/2 2.0 allowed multiple DOS boxes, 1.x didn't.

Right (the extra costs being full licenses), but that's a little different from what you said above.

Some only shipped OS/2, which got them out of the M$ box, though that market wasn't big enough to sustain either OS/2 or those smaller box vendors.

No, that is not true (on several fronts). The cost was not minimal, since OS/2 still had to be purchased by the PC group from the IBM software group. Also, the PC software group had to buy WinOS/2 licenses from M$, effectively paying for two Win licenses. For Win->OS/2 V3x upgrades, the "red box" Warp got around the additional WinOS/2 license. Requiring 100% installs would have made it impossible to just ignore the DOJ. The results were the same, but the "discounts" for counting boxes shipped instead of installs gave M$ plausible deniability during the DOJ suit.

Reply to
krw

It takes longer than your "day" to get all the bits and options sorted out between architectures.

Reply to
krw

Purchasing tools may require a more complicated approval process, perhaps stretching beyond the engineering portion of the company.

Reply to
cs_posting

Perhaps what this really indicates is that companies are no longer hiring engineers who like the work enough to have already pursued it out of their own interest.

Reply to
cs_posting

I was pursuing the debug-the-logic-on-a-desktop approach until we were temporarily bringing another developer on board. To support that, I spent an extremely long weekend at home writing scripts to get gdb remote debugging working under an IDE. Adding the additional person didn't work out, but I have to admit I'm really using that debugging capability.

I still build the code for Linux often, or occasionally Windows if I don't have a unit handy (there are both win32 and X simulations of the embedded keyboard and display that get built into the desktop version). But it was very handy to put the client in a conference room in front of the product and let them play with it; when they didn't like something, I could run back to my desk, recompile, and almost instantly start the modified code remotely under gdb on the box they were playing with. Technically a lot of that can be done just by ftp'ing the code over and running it without the debugger, but it's nice to be able to figure out what is going on when it behaves unexpectedly.
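
For anyone curious what that kind of setup boils down to, here is a minimal sketch of gdb remote debugging against a Linux target. The target address, port, paths, and the cross-gdb name are made up for illustration; the actual scripts described above do more (IDE hooks, deployment), but the core is just gdbserver on the box and a cross gdb on the desktop:

  # On the target box (say 192.168.1.50), run the freshly copied binary
  # under gdbserver, listening on TCP port 2345:
  gdbserver :2345 /opt/app/myapp

  # On the desktop, point a cross gdb at the unstripped copy of the same
  # binary and attach to the remote stub; -ex runs a command at startup:
  arm-linux-gnueabi-gdb build/myapp \
      -ex "target remote 192.168.1.50:2345" \
      -ex "break main" \
      -ex "continue"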

Reply to
cs_posting

Did you stop doing FPGA work on all but low-end-of-the-range devices, do you buy enough silicon they actually give you tool seats, or do you have some kind of revolutionary solution to share?

Reply to
cs_posting

Not everyone who writes PC software assumes unlimited RAM. Ironically, the issue tends to be more acute on servers, as the number of concurrent instances you can support is often directly dependent upon how much RAM each instance uses.

IOW, PIC is an 8-bit microcontroller, not a general-purpose computer, and can't (sensibly) be programmed like one. But this isn't specific to PIC; the first two computers I owned used the 6502, with 256 bytes of zero-page RAM and a 256-byte stack. It wasn't sensible to program those in C either.

You did mention integrating PICC into the IDE. In any case, I can't see how anyone would say "seen one, seen them all" if they were programming in assembler, which makes the differences quite clear.

It's not difficult to get to the point where you can achieve results. But that isn't the same thing as "getting up to speed". The latter means having memorised the datasheets, knowing the libraries and frameworks, knowing what code the compiler will generate, etc. It means being able to write near-optimal code on the first try, rather than write code, examine assembler, modify code, repeat.
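
As a concrete (if hypothetical) illustration of that loop, assuming a GCC-style cross compiler such as avr-gcc and an arbitrary source file, checking what the compiler actually generated is a quick round trip:

  # Stop after compilation and write the generated assembler, with the
  # compiler's own annotations, to blink.s for inspection:
  avr-gcc -Os -mmcu=atmega328p -S -fverbose-asm blink.c -o blink.s

  # Or disassemble the linked image (built with -g) with source interleaved:
  avr-objdump -d -S blink.elf | less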

I know this quite well, as it's been a while since I've had the chance to get up to speed on much. Nowadays, every project seems to involve a bunch of new APIs, formats, and protocols. The early stages of the project involve 30 minutes reading for every 10 minutes coding. By the time I no longer have to look stuff up, the project's done and it's time for a new learning curve.

Reply to
Nobody

I consider the command line way to be the easy way. I have a LOT more control and I completely avoid dealing with IDE 'issues' when they happen.

It is the difference between using a Rube Goldberg contraption and a simple tool. Compilers are very simple to use. So are linkers. They don't need all the foo-foo -- colored fonts, sizable fonts, dockable toolbars, print previews, and a bevy of other things, none of which has one iota to do with compiling and linking. When you add all those extras, no matter what, there is a whole bag of software in there which can work poorly, incorrectly, well, or anything in between. And one can get caught up in trying to adjust this or that to "make it right," but it is all a waste.
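
As a sketch of what "simple" means here (the toolchain, flags, and file names below are illustrative assumptions, a generic GCC-based ARM cross toolchain rather than any particular vendor's product):

  # Compile each translation unit; nothing but the compiler is involved:
  arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb -Os -g -c main.c -o main.o
  arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb -Os -g -c uart.c -o uart.o

  # Link with the project's linker script and produce the final image:
  arm-none-eabi-gcc -mcpu=cortex-m3 -mthumb --specs=nosys.specs \
      -T app.ld main.o uart.o -o app.elf
  arm-none-eabi-objcopy -O binary app.elf app.bin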

By the way, if you ask any of the developers of these tools (and I have), you will get admissions that they spend something on the order of 80-90% of their VERY VALUABLE time, time that comes with deep compiler knowledge, on all this dross. Which means it is time that does NOT go into the compiler or linker, where I'd MUCH RATHER see it spent.

Give me command-line compilers/linkers, any day.

Jon

Reply to
Jon Kirwan

I don't think of an IDE as a way to avoid compiler and linker command-line options (the IDE merely launches my script that drives the compiler or, for a production build, my makefile). I think of it as an editor backed by an automatically generated database of useful program information, one that also provides a source-level interface to the debugger.

I specifically dislike IDEs that force you to do things graphically. I want to use the IDE for development, but I don't want anyone else to have to use it. Production gets an integrated firmware loader created by the makefiles. Another engineer needing to maintain the code can fire up the IDE, or they can search the source with any tool they prefer to find what needs to change, change it with the editor of their choice, and then run the makefiles.
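
To make the "IDE merely launches my script" arrangement concrete, here is a minimal sketch of such a wrapper (the file names, targets, and make invocations are assumptions for illustration; any real script is project-specific):

  #!/bin/sh
  # build.sh -- thin wrapper the IDE's build button is pointed at.
  # Anyone without the IDE runs exactly the same thing by hand.
  set -e

  # Debug build by default; "./build.sh release" drives the production
  # makefile, which also packages the integrated firmware loader.
  if [ "$1" = "release" ]; then
      make -f Makefile.release all
  else
      make -f Makefile DEBUG=1 all
  fi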

Reply to
cs_posting

It represents a huge time-sink for the developer (team) that I'd rather see spent on the compiler and linker (and debugger, though I use that a lot less than most do.) And although I'm glad there is an IDE for young folks looking to learn and needing a greased pathway that isn't too hard for them, it serves me very little.

Debuggers benefit from a nice display and user interface, I'll admit, because a lot of information is available and needs to be presented in a fashion that gets issues across more quickly. But that's about it. (Still, I'd survive fine with a symdeb equivalent.)

Jon

Reply to
Jon Kirwan

Only IBM could have shot themselves in the foot that well...

As you say, the results were the same. I can't be sure I've remembered everything accurately, and some of your details here certainly ring a bell, so I expect you are right about the licensing. IBM certainly got a lower price than others, and they certainly sold few or no PCs with OS/2 pre-installed, but any particular deals there would have had to be unofficial to avoid the DOJ (not that the DOJ came close to doing its job in this case).

Reply to
David Brown

Not when the IDE is coming from an entirely different vendor than the compiler and debugger.

That is why I spent a weekend writing scripts to integrate the toolchain with an IDE. I could have used a stand-alone gdb front end, but I'm not convinced it would be any simpler. What I got for my trouble is a convenient work environment that can be run on either Linux or Windows/Cygwin.

Reply to
cs_posting

Hehe. Well, there is that, I suppose. But the point remains for those cases where it applies.

Jon...

Reply to
Jon Kirwan

I doubt that. Judging from the IDEs from Keil and IAR, they spend too little time on their IDEs. It's a pile of crap. There is a good reason everybody seems to be moving to Eclipse. I'm not saying Eclipse is perfect (it is a Java application, after all), but it does a lot of wonderful things that make it easy to deal with source code, and it helps to organize different builds as well. I have projects that produce over 10 different pieces of (related) firmware.
--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
                     "If it doesn't fit, use a bigger hammer!"
--------------------------------------------------------------
Reply to
Nico Coesel

It would be inappropriate for me to name names here. But as I said, the fact is I've asked and on occasion received direct answers to this question. I'm not speaking from entire ignorance.

Jon

Reply to
Jon Kirwan

It would seem that perhaps the reason for the move to Eclipse is so they can spend less time working on the IDE itself.

Personally, I decided Eclipse was just too slow on perfectly functional computers, and tied things into Code::Blocks instead.

The idea of using an industry-standard open-source IDE instead of reinventing the wheel is a good one; I'm just not yet convinced that Eclipse is the best choice, though if you are developing in Java it might make more sense.

Reply to
cs_posting

Quite a while back I installed Hi-Tide, which was the HiTech implementation of Eclipse.

During the first day I decided to delete a project and found that the source files were automatically deleted as well. I complained to HiTech and got the response "yeah, it does that". I haven't tried it since.

Reply to
Raveninghorde

I like the idea of a separate effort for IDE development, and your feelings about Eclipse seem close to mine. However, even if it is a perfect solution that works well under a variety of OS environments, is open source and free to all, has active support year after year, and somehow just sings in everyone's mind so that everyone loves it, compiler vendors will still find it removing some of their differentiation. I'm not sure how they'd take to that idea. (Many do so now with MPLAB, so perhaps they have formed some opinions about this by now.)

Jon

Reply to
Jon Kirwan

I was merely pointing out they should stick with building compilers and not try to make IDEs. Especially if it takes most of their time :-)

--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
                     "If it doesn't fit, use a bigger hammer!"
--------------------------------------------------------------
Reply to
Nico Coesel
