Hi-Tech Software bought by Microchip - no more other compilers

Hardly! And in any case, few UARTs are pure UARTs; most are USARTs with other functionality bolted on: I2C, SPI, CAN, LIN, ...

I wish I lived in your world, it must be a very simple and stress-free place.

Nitpicking. You could call an 8051 with onboard LCD controller and keyboard muxer an SoC too.

Reply to
larwe

Heh. The UART came up pretty fast, but getting the clock generator to produce the right assortment of clocks is not something a Z180 ever needed. Nor a PIC.

And not enabling the /Reset pin too soon because the power-up sequence assumes that the firmware is *using* it to control startup of outside peripherals.

I like this chip, but getting really familiar with it did take some time.

Mel.

Reply to
Mel

Eclipse is still far from the best IDEs, but it has potential ... somebody with a lot more time than me could potentially develop an Eclipse-based code browser as useful as Smalltalk's, or a debugger as good as Wind River's.

BTW: GDB sucks and GCC is just a decent compiler - not a really good one.

Apart from these points, I agree that some kind of standard tool chain is desirable. Or at least a plug-and-play tool chain with standardized intermediate formats, so you can choose compiler, linker, debugger, etc. and they all interoperate.

George

Reply to
George Neuner

Well, I use Eclipse for almost everything except 8051. But I never use the debugger except on Windows and Linux (which I also use to debug software that goes into embedded platforms). There is not much use in debugging on a microcontroller. The non-realtime stuff is easier and faster to debug when run on Windows or Linux. The realtime stuff can't be debugged, because the moment you halt there is no more realtime...
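The usual workaround is to trace instead of halting: log events into a RAM ring buffer while the system runs at full speed, then read the buffer out afterwards (over the UART, or with the debugger once the interesting moment has passed). A minimal sketch, assuming a bare-metal C target with a free-running timer; all names here are hypothetical:

    /* Trace-buffer sketch: record events without stopping the target.
     * Single writer assumed; TRACE_DEPTH is a power of two so the
     * index wraps with a cheap AND instead of a divide. */
    #include <stdint.h>

    #define TRACE_DEPTH 64u

    typedef struct {
        uint32_t timestamp;   /* e.g. a free-running timer count */
        uint16_t event_id;    /* caller-defined event code */
        uint16_t arg;         /* small payload */
    } trace_entry_t;

    static volatile trace_entry_t trace_buf[TRACE_DEPTH];
    static volatile uint32_t trace_head;

    /* Cheap enough to call from realtime code without wrecking timing. */
    static inline void trace(uint16_t event_id, uint16_t arg, uint32_t now)
    {
        uint32_t i = trace_head++ & (TRACE_DEPTH - 1u);
        trace_buf[i].timestamp = now;
        trace_buf[i].event_id  = event_id;
        trace_buf[i].arg       = arg;
    }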

--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
                     "If it doesn't fit, use a bigger hammer!"
--------------------------------------------------------------
Reply to
Nico Coesel

Debugging and simulation are often difficult to integrate with IDEs (the only standard is gdb, and even there you often have to do some work to get things like proxies or hardware interfaces to start up nicely with the IDE's gdb front end). And if you want things like GUI boxes for picking compiler options, rather than using makefiles, you need extra integration. But for the basic work of editing, searching, project management, and often running makefiles and interpreting the errors and warnings, you can mix and match programmer's editors or IDEs and compiler tools.

Reply to
David Brown

My experience with OS/2 began with Warp (OS/2 3.0), which was a very different OS from OS/2 2.x (which again was very different from version 1). It was only with Warp that OS/2 gained a significant following, and it needed a 386 or better.

At the time of Warp, Windows was at 3.11 (or the very rare NT 3.51), and could run many DOS programs reasonably (and, obviously, current Windows programs). However, Windows was in reality single-tasking, since it used co-operative multi-tasking and Windows programs could not be relied on to co-operate, although it could do some limited DOS multi-tasking. OS/2 was vastly better, and could properly multi-task Windows software and DOS software. In particular, it was much better for DOS, since it allowed more flexible graphics and significantly better memory management.
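To make the "co-operative" point concrete: a Win16 program only yielded control inside its message loop. A from-memory sketch (not a complete program - window creation is omitted); any application that wandered off to compute without pumping messages froze the whole GUI:

    #include <windows.h>

    int PASCAL WinMain(HINSTANCE hInst, HINSTANCE hPrev,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        MSG msg;

        /* ... window class registration and CreateWindow() omitted ... */

        while (GetMessage(&msg, NULL, 0, 0)) {  /* control is yielded here */
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        return msg.wParam;
    }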

The main cause of death was that MS told IBM that if it wanted OEM discounts on its Windows PCs, it had to install Windows on *all* its machines. If it shipped PCs with OS/2 pre-installed, it would have to pay the full customer price for every PC it shipped with Windows. IBM said "fair enough", and installed Windows on all its PCs.

Reply to
David Brown

And then, for years, gave us a shared memory process model with cooperative multitasking.

Windows, too, was originally designed for the 286, and it retained that unfortunate legacy for nearly 10 years. Windows 1.0 was introduced in 1985. Windows/386, in 1987, was the first version to use the 386's protected mode. Windows 3.0 was the first version that was actually useful - it used 32-bit mode internally, but only drivers could use it that way; all GUI programs shared a single 16-bit VM. It wasn't until 1993, with the introduction of NT and Win32, that the average programmer could write real 32-bit software for Windows.

OS/2 was better than Windows until NT4 appeared.

George

Reply to
George Neuner

Until you encounter one with a FIFO. Or need to figure out when data is actually finished sending to disable an RS485 transmitter. Or need to figure out _exactly_ how to kick it to clear a framing error status.
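For the RS-485 case, the detail that bites is the difference between "holding register empty" and "transmitter empty". A sketch for a 16550-style UART, where uart_read_lsr() and rs485_driver_enable() stand in for whatever board-specific access you actually have:

    #include <stdint.h>
    #include <stdbool.h>

    #define LSR_THRE (1u << 5)  /* TX holding register / FIFO empty     */
    #define LSR_TEMT (1u << 6)  /* FIFO *and* shift register both empty */

    extern uint8_t uart_read_lsr(void);        /* board-specific stub */
    extern void rs485_driver_enable(bool on);  /* board-specific stub */

    void rs485_end_of_frame(void)
    {
        /* THRE goes high while the last byte is still shifting out, so
         * waiting on it can truncate the final stop bit. Wait for TEMT. */
        while ((uart_read_lsr() & LSR_TEMT) == 0)
            ;                        /* busy-wait for the last bit */
        rs485_driver_enable(false);  /* now safe to release the bus */
    }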

Reply to
Mike Harrison

or at the very least, be able to juggle the function keys around so the same key always builds, debugs, halts etc.

Reply to
Mike Harrison

The XMOS tools use Eclipse (they can also be used from the command line). I've always disliked Eclipse but the XMOS implementation is very good.

Leon

Reply to
Leon

When Windows had already won.

If OS/2 was any good (having enough 3rd-party application and hardware support to compete), then IBM would not have needed to ship Windows on any PCs, would they? OS/2 was already dead; what you suggest might just have been the last nail in its coffin.

Reply to
nospam

And Betamax was better than VHS. Windows 3 was worth buying just to run DOS programs 'better'; at the time, OS/2 was worse at running DOS programs than DOS itself. That's when the battle was lost.

Reply to
nospam

Yep.

And there the similarity ends. Even that's overstating the case; whether RAM and ROM share the same address space makes quite a difference.

Far from true. The term "microcontroller" covers everything from a 32-bit ARM/MIPS/AVR32 RISC chip down to a PIC10. Programming the former is closer to programming a PC than to programming a PIC10.

It's not even as if PICs are all the same; "PIC" just means "one of Microchip's 5 substantially different architectures".

It's not even as if the 8-bit PICs are all the same; there's a lot of difference between the low-end (banked, 2-level stack, no interrupts, unable to read program memory), mid-range (8-level stack, has interrupts, able to read program memory) and high-end (access bank, flat indirect addressing, 31-level stack, byte-addressed program memory, multiple FSR/INDF sets, high/low priority interrupts, separate PORTx/LATx registers, built-in multiply, ...) parts.

If you think that using C makes the low-level details irrelevant, expect to end up with code which is 3x larger and 3x slower than if it had been written with the details in mind.
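A hypothetical illustration of the kind of detail that matters (exact output is compiler-dependent): on an 8-bit PIC, a 16-bit int loop index forces multi-byte compares and increments on every pass, while a properly sized counter typically compiles down to single-register operations such as DECFSZ:

    #include <stdint.h>

    void clear_slow(uint8_t *buf)
    {
        for (int i = 0; i < 32; i++)       /* 16-bit arithmetic each pass */
            buf[i] = 0;
    }

    void clear_fast(uint8_t *buf)
    {
        for (uint8_t i = 32; i != 0; i--)  /* fits one register; counts
                                              down to use the zero test */
            *buf++ = 0;
    }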

Reply to
Nobody

You're sure showing your (lack of) debugging skills. If the bug can't be found by single-stepping, you simply halt *after* (or on) the failure, then track backwards to it.
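One way to get that "halt on the failure" behaviour on an embedded target is to plant a breakpoint instruction behind a sanity check, so the core stops exactly at the point of failure and you can walk backwards from there. A sketch for ARM Cortex-M with GCC (other cores have an equivalent trap instruction):

    /* When the condition is bad, stop the core right here so the
     * attached debugger halts at the failure, not somewhere after it. */
    #define HALT_IF(bad_condition)                \
        do {                                      \
            if (bad_condition)                    \
                __asm__ volatile ("bkpt #0");     \
        } while (0)

    /* usage: HALT_IF(rx_index >= RX_BUF_SIZE); */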

Reply to
krw

IBM paid a *ton* of software developers to port their wares to OS/2. They took the money and didn't come across.

Reply to
krw

I used 1.0, 1.1, 1.3 (skipped M$'s 1.2), and 2.0. All but 1.0 were a far better DOS than DOS. V1.3 was particularly good, but still only allowed one DOS box.

Yep, until M$ started crippling WinOS2.

That's not 100% true. IBM, and all other box makers, had to buy a Win license for every box built ("simplified accounting") to get the best OEM price. OS/2 was then an additional cost. M$ wasn't *that* stupid.

Reply to
krw

False analogy. Betamax was *not* better than VHS, where it counted.

Absolute nonsense. Even OS/2 V1.1 was a better DOS than DOS.

Reply to
krw

You've *got* to be kidding! I've never seen two interrupt controllers that work the same way. It takes some time to even get GPIO bits sorted out. Timer/counters? You _must_ be a C hack.
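One common response to exactly this problem is to hide the chip-specific interrupt-controller and GPIO details behind a thin board layer, so only one file per chip knows the register map. A sketch (names and register addresses are for illustration only - check your own datasheet):

    #include <stdint.h>
    #include <stdbool.h>

    /* board_gpio.h - the portable face the application sees */
    void board_led_init(void);
    void board_led_set(bool on);

    /* board_gpio_lpc.c - one of several per-chip implementations */
    #define FIO0DIR (*(volatile uint32_t *)0x2009C000u)
    #define FIO0SET (*(volatile uint32_t *)0x2009C018u)
    #define FIO0CLR (*(volatile uint32_t *)0x2009C01Cu)
    #define LED_BIT (1u << 22)

    void board_led_init(void) { FIO0DIR |= LED_BIT; }

    void board_led_set(bool on)
    {
        if (on)
            FIO0SET = LED_BIT;
        else
            FIO0CLR = LED_BIT;
    }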

Reply to
krw

$5K is only a week or two.

Reply to
krw

That may well be, but totally irrelevant to anything I've said.

Reply to
krw
