Hi-Tech Software bought by Microchip - no more third-party compilers

And also the hearts and minds of professionals who, if they have fun with what they're doing (and if not, perhaps should change careers), may want to "play" with a new chip or architecture. If the entry barrier to get set up is too high for an out of pocket expense, that chip may be bypassed and won't be designed-in.

--
Rich Webb     Norfolk, VA
Reply to
Rich Webb


That's not particularly significant! Microchip has a far larger market share than Atmel, and makes a profit, so they must be doing something right.

Leon

Reply to
Leon

There are a lot of tool interface standards that make IDEs essentially a non-issue. Everyone uses the same error-return conventions and error-reporting file syntax, which makes it possible to work with most compilers and linkers.

We encourage and support our customers to use the IDE they are familiar with (most have company standards on IDEs).

w..

Reply to
Walter Banks

Actually, there was a surprisingly large amount of software for OS/2. In particular, there was a totally disproportional amount of freeware, shareware, and open source software for it when considering the number of users compared to Windows. This was at least partly because it was so much easier for developers than Windows, so people who had a choice (such as freeware developers) chose OS/2.

Of course, I have to agree that porting Delphi (and other Borland tools) would have been a big help too.

But the death of OS/2 had little to do with lack of software (almost all DOS/Windows software at the time ran fine on OS/2, often better than it did on DOS or Windows). It was all down to MS's strategies, and IBM's lack of commitment and understanding. Together they figured out that it was cheaper for IBM to sell PCs with Windows than with its own OS/2, added to the fact that IBM's own PCs were not entirely compatible with its own OS/2. If IBM was not willing to sell PCs with OS/2, it was hard for anyone else to do it.

Reply to
David Brown

It's already the case and it's called Lite

Reply to
OBones

Nope. A microcontroller always has memory (RAM/ROM), a CPU and peripherals. If you've seen one, you've seen them all. Of course there are people who like to toy around and try to get every peripheral working without looking at the examples. Those people need a month to get started... I'd fire them ASAP.

--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
                     "If it doesn't fit, use a bigger hammer!"
--------------------------------------------------------------
Reply to
Nico Coesel

That's absolutely untrue, starting with the difference between double- and single-buffered UARTs, buffered vs. non-buffered PWM peripherals, and the special caveats and magic not often found in application notes. Yes, you can compile an app note's source code in a day. No, you cannot learn to use a new microcontroller _effectively_ and _optimally_ in such a short time. No god of embedded engineering can do it, especially for larger chips (I'd like to see you try this on something like an i.MX21).

Just compare the difference between a TI datasheet and a PIC datasheet. TI has a family datasheet for the chip and a user manual for the core and peripheral set. Extensive cross-referencing is required to understand how any particular peripheral works. Microchip has all the data, even down to the ISA document, in a single datasheet for the part family.

Reply to
larwe

OS/2 was a dead man walking long before it could run DOS/Windows software fine.

It died because of IBM's insistence that it run on brain-dead 286s (to satisfy hardware customers who had purchased lots of expensive PS/2 boxes with brain-dead processors).

At the time Microsoft embraced the 386 which really could run Windows and DOS programs fine.

IBM chose hardware backward compatibility over software backward compatibility - very very dumb.

Reply to
nospam

Agreed! And there are plenty of professionals for whom a particular project is more like a hobby project. It's small, it's easy, and you want to use it to try out a new processor or toolset.

Reply to
MC

Perhaps, but -for instance- all UARTs work the same: provide some settings, create an interrupt routine, and you're ready to go. Timers ditto. I use more or less the same set of routines on 8051 derivatives, H8/3x and several ARM-based controllers. Of course there are microcontrollers that are crappy at some point, but most modern stuff is quite straightforward. It only takes a few reads of specific sections of the user manual to get a peripheral going.

An i.MX21 is an SoC, not a microcontroller!

Reply to
Nico Coesel

Nope. The UARTs can be very different, and proper setup can be quite sophisticated. Look at the TMS 28xx UART, for example, or UART DMA configuration on a Blackfin.

This assumption can be very misleading and dangerous. There are subtle differences here and there, and these can result in erratic operation in some special cases. Having been burned by that, I prefer to read the manuals and the errata sheets rather than making assumptions.

Vladimir Vassilevsky DSP and Mixed Signal Design Consultant


Reply to
Vladimir Vassilevsky

Really? I've yet to work at any company that standardized on a particular IDE; there are just too many bits and pieces that don't typically interface "nicely" with a generic IDE. I mean, can you actually debug PICs or AVRs or MSP430s using Eclipse with a standard interface that gives you source code, disassembly, breakpoints, watch windows, registers, etc.?

The vast majority of microcontroller programmers I've met just use the IDE that comes with the tools. I'd even go so far as to say that no more than 1 user in 50 uses the more advanced features of any given IDE such as scripting.

---Joel

Reply to
Joel Koltner

Hardly! And in any case, few UARTs are pure UARTs, most are USARTs with other functionality bolted on; I2C, SPI, CAN, LIN, ... ...

I wish I lived in your world, it must be a very simple and stress-free place.

Nitpicking. You could call an 8051 with onboard LCD controller and keyboard muxer an SoC too.

Reply to
larwe

Heh. The UART came up pretty fast, but getting the clock generator to produce the right assortment of clocks is not something a Z180 ever needed. Nor a PIC.

And not enabling the /Reset pin too soon because the power-up sequence assumes that the firmware is *using* it to control startup of outside peripherals.

I like this chip, but getting really familiar with it did take some time.

Mel.

Reply to
Mel

Eclipse is still far from the best IDEs, but it has potential... somebody with a lot more time than me could potentially develop an Eclipse-based code browser as useful as Smalltalk's, or a debugger as good as Wind River's.

BTW: GDB sucks and GCC is just a decent compiler - not a really good one.

Apart from these points, I agree that some kind of standard tool chain is desirable. Or at least a plug-and-play tool chain with standardized intermediate formats, so you can choose compiler, linker, debugger, etc., and they all interoperate.

George

Reply to
George Neuner

Well, I use Eclipse for almost everything except 8051. But I never use debugging except for Windows and Linux (which I also use to debug software that goes into embedded platforms). There is not much use in debugging a microcontroller. The non-realtime stuff is easier & faster to debug when run on Windows or Linux. The realtime stuff can't be debugged because the moment you halt there is no more realtime...

Reply to
Nico Coesel

Debugging and simulation are often difficult to integrate with IDEs (the only standard is gdb, and even there you often have to do some work to get things like proxies or hardware interfaces to start up nicely with the IDE's gdb front end). And if you want things like GUI boxes for picking compiler options, rather than using makefiles, then you need extra integration. But for the basic work of editing, searching, project management, and often running makefiles and interpreting the errors and warnings, you can mix and match programmer's editors or IDEs and compiler tools.

Reply to
David Brown

My experience with OS/2 began with Warp (OS/2 3.0), which was a very different OS from OS/2 2.x (which again was very different from version 1). It was only with Warp that OS/2 gained a significant following, and it needed a 386 or better.

At the time of Warp, Windows was at 3.11 (or the very rare NT 3.51), and could run many DOS programs reasonably (and obviously current Windows programs). However, Windows was in reality single-tasking (since it was a co-operative multi-tasking system, and Windows programs were not co-operative), although it could do some limited DOS multi-tasking. OS/2 was vastly better, and could properly multi-task Windows software and DOS software. In particular, it was much better for DOS since it allowed more flexible graphics and significantly better memory management.

The main cause of death was that MS told IBM that if it wanted OEM discounts on Windows, it had to install Windows on *all* its machines. If it shipped PCs with OS/2 pre-installed, it would have to pay full customer prices for every PC it shipped with Windows. IBM said "fair enough", and installed Windows on all its PCs.

Reply to
David Brown

And then, for years, gave us a shared memory process model with cooperative multitasking.

Windows, too, was originally designed for the 286, and it retained that unfortunate legacy for nearly 10 years. Windows 1.0 was introduced in 1985. Windows/386, in 1987, was the first to exploit the 386's protected mode. Windows 3.0 was the first version that was actually useful; it used 32-bit mode internally, but only drivers could use it that way, and all GUI programs shared a single 16-bit VM. It wasn't until 1993, with the introduction of NT and Win32, that the average programmer could write real 32-bit Windows software.

OS/2 was better than Windows until NT4 appeared.

George

Reply to
George Neuner

The XMOS tools use Eclipse (they can also be used from the command line). I've always disliked Eclipse but the XMOS implementation is very good.

Leon

Reply to
Leon
