When will the 8051 and other 8-bits go away?

1) There were availability issues with the 68000. IIRC, it wasn't going to be available in production quantities in time to meet Boca's schedule. 2) Intel was much smaller than Motorola, so IBM could buy a sizable chunk of Intel and have some influence. It would have been more difficult to do that with Motorola.

Just be glad it wasn't some completely proprietary IBM processor with a closed-source bus design.

--
Grant Edwards                   grante             Yow!  Is it FUN to be
                                  at               a MIDGET?
                               visi.com
Reply to
Grant Edwards

... snip ...

At design time I believe there was no 8-bit bus version of the 68000 available, and I think the 6809, especially the 6809 with memory management, did not exist. Intel had the 8088 available complete with the essential peripherals, such as clock generator, interrupt controller, etc. An 8-bit bus is much cheaper than a 16-bit job. Don't forget that the present ISA bus is already an extension kluge to the original 8-bit bus, made by adding a second connector with all its alignment problems.

At any rate the 8088 was a more capable chip than the 6809, if not as pretty, and had the advantage of being more or less assembly-mnemonic compatible with the 8080. This meant a quick path to porting CP/M software, which was taken by WordStar. Developers needn't have got stuck with the 640K barrier if they had just taken IBM's advice and routed things through the BIOS interrupts. So what if it was slow - there was no competition at the time.
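For anyone who never wrote DOS-era code, "routing things through the BIOS interrupts" meant raising a software interrupt and letting the BIOS routine touch the hardware, instead of poking video memory or I/O ports directly. A rough sketch in DOS-era C, using the int86() helper that compilers such as Turbo C and Microsoft C shipped in <dos.h> (details varied by compiler, so treat this as an illustration rather than the exact code of the day):

  #include <dos.h>

  /* Print one character via the BIOS teletype service (INT 10h, AH=0Eh)
     instead of writing straight into the video buffer at B800:0000.
     Programs that stuck to calls like this did not bake hardware
     addresses into themselves - the portability argument above. */
  void bios_putchar(char c)
  {
      union REGS r;

      r.h.ah = 0x0E;        /* teletype output function   */
      r.h.al = c;           /* character to print         */
      r.h.bh = 0;           /* display page 0             */
      int86(0x10, &r, &r);  /* video services interrupt   */
  }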

--
"If you want to post a followup via groups.google.com, don't use
 the broken "Reply" link at the bottom of the article.  Click on 
 "show options" at the top of the article, then click on the 
 "Reply" at the bottom of the article headers." - Keith Thompson
Reply to
CBFalconer

How long ago are we talking here? I used the 6809 in 1981. I seem to remember the 68008 was available around that time too. Didn't the Sinclair QL use it?

Richard [in PE12] Derek tastes of earwax.

Reply to
Jet Morgan

The processor is architecturally OK, and I do not have much to complain about regarding the MMU either; it worked as specified.

I worked on a project in 1982/83 which used a 6809 board with a customer-specific RTOS (every company had their own RTOS in those days :-) and an extension board with up to 64 KiB of RAM behind the MMU. The extension board was used for data storage only, and a window in the main processor address space was used to map a small segment of the data RAM. In addition to increasing the address space, this also helped in protecting the data during crashes and resets.
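For readers who never used such banked schemes, the "window" arrangement looks roughly like this in modern C. The register address, window base and window size below are invented for illustration; the post does not give the real board's layout:

  #include <stdint.h>

  /* Hypothetical banked-RAM window: the CPU sees a fixed 4 KiB window in
     its 64 KiB address space, and a bank-select latch behind the MMU
     decides which 4 KiB slice of the extension RAM appears there. */
  #define WINDOW_BASE ((volatile uint8_t *)0x8000)  /* window in CPU space  */
  #define WINDOW_SIZE 0x1000u                       /* 4 KiB window         */
  #define BANK_SELECT (*(volatile uint8_t *)0xFF80) /* bank latch (made up) */

  /* Read one byte from the extension RAM at a "large" linear address. */
  static uint8_t ext_read(uint32_t addr)
  {
      BANK_SELECT = (uint8_t)(addr / WINDOW_SIZE);  /* map the right slice  */
      return WINDOW_BASE[addr % WINDOW_SIZE];       /* then read through it */
  }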

At that time the only languages available from Motorola were the assembler and a lousy Pascal compiler. For any larger project, using assembler as the only development tool is not very realistic, so part of the job was done with the Pascal compiler. Unfortunately this compiler was full of bugs and needed frequent workarounds. Also, neither the generated code nor the libraries were fully re-entrant, so we were forced to use it in a single task only and write everything else in assembler.
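The re-entrancy problem is easy to show with a generic example (not the actual Motorola Pascal library code). Run-time support written like the first function below cannot safely be entered from two tasks, which is why the compiler's output had to be confined to a single task:

  #include <stdint.h>

  /* Non-reentrant: the result lives in one static buffer, so if task B
     pre-empts task A inside this function, A's half-built string is
     overwritten. Libraries built this way force all callers into one task. */
  static char buf[3];
  char *hex_byte_static(uint8_t v)
  {
      static const char digits[] = "0123456789ABCDEF";
      buf[0] = digits[v >> 4];
      buf[1] = digits[v & 0x0F];
      buf[2] = '\0';
      return buf;
  }

  /* Reentrant: the caller supplies the storage, so any number of tasks
     can be inside the function at the same time. */
  void hex_byte(uint8_t v, char out[3])
  {
      static const char digits[] = "0123456789ABCDEF";
      out[0] = digits[v >> 4];
      out[1] = digits[v & 0x0F];
      out[2] = '\0';
  }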

As far as I know, no RTOS was available from Motorola at that time (compared to RMX-80 from Intel) and in order to fully utilise the Motorola MMU, an OS similar to RSX-11 would have been required.

The program development tool support for the Motorola systems was quite minimal. For instance, if we needed a dump of a memory area before and after an operation, the standard practice was to print the hex dump on thin paper before and after, lay the sheets on top of each other, and hold them up against a light source to see if there were any differences :-).

This might be a device-specific problem, but I include it anyway in case anybody else had similar experiences. Our EXORciser was extremely sensitive to static electricity when the emulator card and cables were installed. Any static discharge in the same room could cause the EXORciser to tilt, e.g. during an editor session. Connecting the user's chair and wrist to the device frame helped somewhat, but if you intended to do a critical editing session, it was a good idea to lock the door to prevent anybody from entering the lab.

I had worked with Intel processors and Intellec development systems both before and after this project, and before that in the PDP-11/RSX-11 environment. While I was fully aware that you cannot expect the same things from an 8-bit environment, the Intel program development environment was about what I expected, but the Motorola environment was well below that.

While the Motorola processors are definitely cleaner than the Intel processors, the problem with the Motorola parts is that they did not have sufficient software support, or Motorola did not realise how complicated the programs to be executed on contemporary processors were becoming.

Paul

Reply to
Paul Keinanen

The 6809 was introduced in the late 1970s.

The Sinclair QL, which used the 68008, came out in 1984.

Paul

Reply to
Paul Keinanen

I recall perusing the datasheet of the then newly announced 6809 in either 1979 or 1980, complete with the MMU chip as well. I probably have those datasheets around here too (now you know how deep my archive is).
--
********************************************************************
Paul E. Bennett ....................
Forth based HIDECS Consultancy .....
Mob: +44 (0)7811-639972
Tel: +44 (0)1235-811095
Going Forth Safely ....EBA. http://www.electric-boat-association.org.uk/
********************************************************************
Reply to
Paul E. Bennett

Someone else mentioned the 8088 giving a shorter development path from CP/M. I recall that well after the PC came out, Byte Magazine speculated that the next-generation IBM PC (which turned out to be the AT) would use a 68000 (surely in good supply by then), but of course it used the next-generation Intel 80286.

-----


Reply to
Ben Bradley

This assumes that a non-8088 design would have found success. There were a bunch that didn't, so maybe the IBM PC without the 8088 would have been just as successful as the IBM System/23 Datamaster or the DEC Rainbow.

Why should they care about that? They were producing a line of computers with old models going obsolete and new computers replacing them:

Model 5100... Normal, lasted about a year, then was replaced.
Model 5110... Normal, lasted about a year, then was replaced.
Model 5120... Normal, lasted about a year, then was replaced.
Model 5150... BLAM! Design decisions still haunting us in 2005!

Who could have predicted such a thing?

Reply to
Guy Macon

The story I heard, both from IBM and Motorola folk, is that the 68K was the first choice, but it was just being rolled out and sampled, and IBM wanted exclusive use of the first several million chips and a volume that would have required Motorola to build a new plant. At the time, Motorola was doing what they always did with a new processor: limiting supply to two per month per company or division and charging $500 each - to make sure the processor got maximum exposure while minimizing the suckage from hobbyists.

So Motorola declined, and IBM went to Intel, who told them they could provide the 8086 in quantity - but in fact couldn't. By then IBM had already started the design, so they settled for the 8088.

As to the memory limitations, you have to realize that it wasn't IBM's computer group that developed the PC - it was their office equipment division. The PC was supposed to supplant/augment the Selectric typewriter, and to replace the keypunch machine by using floppies.

Also, the memory limit was 512K, not 640K. That's 32K of 16-byte 'paragraphs' (not pages). The original BIOS used signed arithmetic for calculating memory size, which caused problems when 640K was put in - the extra memory made the count negative.
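A quick way to see the signed-arithmetic point (a toy C illustration of the overflow, not the actual BIOS code):

  #include <stdio.h>

  /* The 8086 addresses memory in 16-byte paragraphs. A signed 16-bit
     paragraph count is positive only up to 32767 paragraphs, i.e. just
     under 512 KiB. 640 KiB is 40960 paragraphs, which reads back as a
     negative number in 16-bit two's complement - the effect described
     above. */
  int main(void)
  {
      long  paragraphs = 640L * 1024 / 16;   /* 40960                   */
      short as_int16   = (short)paragraphs;  /* what a signed 16-bit    */
                                             /* comparison would see    */
      printf("640 KiB = %ld paragraphs -> signed 16-bit: %d\n",
             paragraphs, as_int16);          /* prints -24576           */
      return 0;
  }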

--
Kevin D. Quitt          snipped-for-privacy@Quitt.net
96.37% of all statistics are made up

Reply to
Kevin D. Quitt

I've never seen any evidence that the 6829 ever made it to real silicon.

Reply to
Everett M. Greene

I've heard this before, but I'm skeptical. IBM wasn't actually expecting the PC to sell very many units, and Motorola was shipping the 68000 in volume by then, so I don't think there actually would have been any issue.

Of course, the reality is that the PC sold much better than expected, so use of the 68000 could very well have run into supply problems.

But it seems much more likely that the decision was based on cost. An 8088 with the various support components (both Intel 8xxx chips and TTL) was much less expensive than a 68000 with the components it would have required.

Reply to
Eric Smith

The way I remember it, the problem was that IBM had decided on an 8-bit bus, and while the 16-bit bus 68000 was shipping, the 68008 wasn't.
--
Grant Edwards                   grante             Yow!  Used staples are good
                                  at               with SOY SAUCE!
                               visi.com
Reply to
Grant Edwards

They could also have chosen the superior (but more expensive, system-wise) 8086 with its 16-bit bus, but chose not to. It came out well before the 8088.

Best regards, Spehro Pefhany

--
"it's the network..."                          "The Journey is the reward"
speff@interlog.com             Info for manufacturers: http://www.trexon.com
Embedded software/hardware/analog  Info for designers:  http://www.speff.com
Reply to
Spehro Pefhany

This reminds me of a series of ads run by AMD, which showed two guys standing on boxes presenting the Z8000 and the 8086.

As the ad campaign went on, more and more people moved from the 8086 guy to the Z8000 guy.

The slogan was: The Z8000 is better!

Not long after this AMD began to second source the 8086. At a seminar I attended a sales rep was asked about this.

His answer was: The Z8000 is still better, but the 8086 sells better.

Tom

Reply to
Tom Twist

Hi,
I'm not sure why I'd want to use a 32-bit processor to run a toaster or a TV remote?

Dwight

Paul Marciano wrote:

Reply to
dkelvey

To handle the Web Server needed for the Web Browser based interface :)

[Rest snipped]

Regards Anton Erasmus

Reply to
Anton Erasmus

Besides, with a 32-bit processor you can get by without the heating element in the toaster ;)

Reply to
Lanarcam

Might as well use those Watts for something as they're being turned into heat...

--
Grant Edwards                   grante             Yow!  By MEER biz doo
                                  at               SCHOIN...
                               visi.com
Reply to
Grant Edwards

From an architectural point of view, integrating the processor and the RAM on the same chip would make a lot of sense (but those working with the actual silicon processing might disagree :-).

Think about a typical RAM structure (at least dynamic RAM), in which the row address is decoded first and all cells in the row are activated, feeding the result from each cell to the corresponding column wire and into the sense amplifiers. When the column address is decoded, it selects one of the columns for output.

All the data from a single row is available on the column lines and has to be squeezed out through the limited number of pins by selecting the corresponding column wire with the column address.

An 8 MiB (64 Mibit) memory might be arranged as an 8192 x 8192 array. After one of the 8192 rows is selected, data from all 8192 columns would be available at once (1024 bytes). It would be very nice if all of this could be loaded into a cache page in a single memory cycle. While 8192 parallel lines would consume a lot of space on the chip, it would still be a lot more efficient than a package with more than 8000 pins and high-power amplifiers to drive each external line :-).
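A sketch of the address split being described, for an 8192 x 8192 cell array (a generic model of DRAM addressing, not any particular device):

  #include <stdint.h>

  /* 64 Mibit array organised as 8192 rows x 8192 columns (1 bit per cell).
     A 26-bit cell address splits into a 13-bit row and a 13-bit column;
     the row decode brings an entire 8192-bit (1024-byte) row to the sense
     amplifiers, and the column decode then picks out the few bits that
     fit through the pins. Putting the cache on the same die would let the
     whole 1024-byte row be captured per row access instead. */
  #define ROW_BITS 13u
  #define COL_BITS 13u

  static inline uint32_t dram_row(uint32_t cell_addr)
  {
      return cell_addr >> COL_BITS;               /* decoded first          */
  }

  static inline uint32_t dram_col(uint32_t cell_addr)
  {
      return cell_addr & ((1u << COL_BITS) - 1u); /* selects within the row */
  }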

This would eliminate the DRAM column decoder and the delays associated with it. The cache address decoding could also be simplified, since only a word select would be required.

With a 50 ns DRAM cycle time, that is 1024 bytes every 50 ns, which corresponds to a transfer rate of about 20 GB/s, or 160 Gbit/s.

Paul

Reply to
Paul Keinanen
