x86 architecture concepts

Hmm, I don't think there is any dominant MCU architecture in the embedded market (at least not in the sense that the ia32 dominates the desktop market). IMO, the fact that the x86 is used in embedded devices at all is a kind of spillover effect from the desktop market: a plethora of cheap tools and so on. Interestingly, I would rate the more successful architectures in the embedded market (e.g. ARM, PPC, MIPS, 68k) as all far more "elegant" than the ia32. But that of course depends on what one considers elegant. For me, it means achieving a goal in a well-thought-through way, without clumsy kludges or work-arounds. Applied to an MCU architecture, that apparently translates to achieving a given functionality with fewer transistors per square millimeter, which is the key reason these architectures are successful in the embedded market.

So, apparently, technical elegance does sometimes pay, *especially* in the embedded market. This may be because the people making decisions in the embedded market tend to be more technically knowledgeable than the average computer user (just look at the people frequenting this newsgroup!) and are not so easily caught by FUD campaigns.

Rob

--
Robert Kaiser                     email: rkaiser AT sysgo DOT com
SYSGO AG                          http://www.elinos.com
Klein-Winternheim / Germany       http://www.sysgo.com
Reply to
Robert Kaiser

Bwahahahaha. PIC. 8051. Z8. Forth. RS-232.

Kelly

Reply to
Kelly Hall

The PIC does seem like a bit of a mess.

When the 8051 was introduced, it was damned elegant.

Everybody I knew loved it.

Since then, people have taken it and tried to use it in some very inappropriate (IMO) situations. For example, the architecture simply wasn't intended to deal with more than 64K of code, with external RAM, or with a stack-intensive language like C. Sure, there are kludges you can use to do those sorts of things, but it's not pretty.
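
For the curious, here is roughly what those kludges look like from C. This is only a sketch, assuming the Keil C51 dialect; the memory-class keywords (data/xdata/code) are compiler extensions rather than ISO C, and the names (rx_buf, process) are made up for illustration:

    extern void process(unsigned char c);

    xdata unsigned char rx_buf[256];   /* external RAM: each access becomes a MOVX through DPTR */
    code const unsigned char crc_tab[16] = {0};  /* constant table placed in code ROM */
    data unsigned char head;           /* one of the 128 bytes of fast internal RAM */

    void drain(void)
    {
        unsigned char i;
        for (i = 0; i < head; i++)
            process(rx_buf[i]);        /* the compiler emits different addressing code per memory class */
    }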

When used in the intended ways, the 8051 was (and is) rather elegant.

Even the most elegant hammer in the world doesn't make a very good screwdriver.

Never used either.

I've no complaints about RS-232. I've seen it misused and poorly implemented over the years, but I don't see anything wrong with the standard.

--
Grant Edwards                   grante             Yow!  You should all JUMP
                                  at               UP AND DOWN for TWO HOURS
                               visi.com            while I decide on a NEW
                                                   CAREER!!
Reply to
Grant Edwards

I wouldn't know, never used any of these.

What with Forth wrong is? Strange RPN you may find, but else?

And what's wrong with this?

Rob

--
Robert Kaiser                     email: rkaiser AT sysgo DOT com
SYSGO AG                          http://www.elinos.com
Klein-Winternheim / Germany       http://www.sysgo.com
Reply to
Robert Kaiser

It's hard to name anything that's right with it. Swinging several volts from a positive rail to a negative rail to transmit at a low symbol rate for a few meters...
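
To put numbers on that swing, a sketch from memory of the standard; the thresholds below are the commonly quoted ones, so treat the exact figures as approximate:

    #include <stdio.h>

    /* RS-232 receiver thresholds. Note the inverted sense: logic 1
     * ("mark") is the NEGATIVE rail.
     */
    int rs232_bit(double volts)
    {
        if (volts <= -3.0) return 1;    /* mark: roughly -3 V to -15 V */
        if (volts >= +3.0) return 0;    /* space: roughly +3 V to +15 V */
        return -1;                      /* -3 V..+3 V: undefined dead band */
    }

    int main(void)
    {
        printf("%d %d %d\n", rs232_bit(-12.0), rs232_bit(+12.0), rs232_bit(0.5));
        return 0;                       /* prints: 1 0 -1 */
    }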

The 'standard' involved a DB-25 connector with about a dozen signals, which everyone implemented differently...

How many man-hours have been wasted dicking around with serial cables?

Reply to
Jim Stewart

...

Having seen the standard being used to transmit down a couple of kilometres, obviously I am wrong.

Unless of course you are referring to the misinterpreted 12-15 m distance quote about noise.

RS-232 did NOT specify the connector until RS-232-C onwards. The DB-25 connector was a PTT standard (CCITT, later ITU-T) specification for modems, hence the DTE/DCE specifications, and likewise the synchronous/asynchronous specifications for everything from 300 baud to leased lines.

RS-232 did not specify things like DTR/DSR or CD/RI; those came from the modem specifications, hence null-modem cables.
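
For reference, the crossover that falls out of those modem-oriented signals, written as a table. This is a sketch of the common full-handshake DE-9 variant, given from memory (there are several variants in the wild), so check before soldering:

    struct cross { const char *a; const char *b; };

    static const struct cross null_modem[] = {
        { "TxD (3)", "RxD (2)" },           /* and RxD back to TxD the other way */
        { "RTS (7)", "CTS (8)" },           /* flow control crossed over likewise */
        { "DTR (4)", "DSR (6) + DCD (1)" }, /* each end's DTR fakes the other's DSR/DCD */
        { "GND (5)", "GND (5)" },           /* signal ground straight through */
    };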

Mainly by those who did not understand the standard and grabbed any old cable.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services
              GNU H8 & mailing list info
             For those web sites you hate
Reply to
Paul Carpenter

It's been a long time since I last designed with and programmed a PIC (in Assembly) but I will comment on it all the same. AFAIK the PIC was derived from an IBM peripheral controller (IIRC PIC stands for peripheral intelligent controller) and it was meant to provide an additional (yet minimal) level of intelligence, flexibility and compactness where usually plain logic would do fine. I think the problem with PICs is Microchip simply extended it far beyond what the original architecture could stand, and so they ended up with a jury-rigged solution. Just as with the 8x51. For that matter, the x86 architecture suffers the same problem in order to support legacy applications.

Just my EU0.02.

Reply to
Elder Costa

I think it was actually Peripheral Interface Controller.

I don't think it was IBM, but the rest is mostly right. General Instruments adapted the PIC from a Signetics design back in the '70s to handle I/O-related tasks for some other long since extinct processor. I think originally it came from Harvard or somewhere.

Well put. The PIC, the 8051, and the IA32 have all suffered from the universal hammer syndrome.

--
Grant Edwards                   grante             Yow!  Yow! Those people
                                  at               look exactly like Donnie
                               visi.com            and Marie Osmond!!
Reply to
Grant Edwards

While the RS-232 "standard" may be adequate for its initial purpose of connecting a serial signal from a computer (DTE) to a modem (DCE) in an adjacent rack, the worst problems with RS-232 as a general-purpose interface are the far too high impedance levels compared to any realistic cable characteristic impedance, and the ground potential differences, since no galvanic isolation is required.
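
To see how bad the mismatch is, a back-of-envelope sketch; the load and cable figures below are ballpark assumptions picked for illustration, not values taken from the standard:

    #include <stdio.h>

    /* Reflection coefficient at the receiver: gamma = (ZL - Z0)/(ZL + Z0). */
    int main(void)
    {
        double z_load  = 5000.0;    /* RS-232 receiver input: a few kilohms */
        double z_cable = 100.0;     /* typical multi-conductor cable Z0 */
        double gamma   = (z_load - z_cable) / (z_load + z_cable);
        printf("reflection coefficient = %.2f\n", gamma);  /* ~0.96: nearly total reflection */
        return 0;
    }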

Paul

Reply to
Paul Keinanen

I have the original introduction by Morse, the designer of the 8086. He is somewhat critical of his own design, and of Intel's assembler too. I think the 8086 is a decent processor, and a feat of upward compatibility. But it really should have been the end of it. Extending the processor's addressing from 16 to 20 bits bought it perhaps four years of computer history, and there it should have stopped.
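
The whole 16-to-20-bit extension fits in one expression; a minimal sketch (the aliasing demonstration and the B8000 example are mine, for illustration):

    #include <stdio.h>

    /* 8086 physical address: segment * 16 + offset, giving 2^20 = 1 MB
     * from two 16-bit registers. It also shows the wart: many
     * segment:offset pairs alias the same physical address.
     */
    unsigned long phys(unsigned int seg, unsigned int off)
    {
        return (((unsigned long)seg << 4) + off) & 0xFFFFFUL;  /* 8086 wraps at 1 MB */
    }

    int main(void)
    {
        printf("%05lX\n", phys(0xB800, 0x0000));    /* B8000 */
        printf("%05lX\n", phys(0xB000, 0x8000));    /* B8000 again: aliasing */
        return 0;
    }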

The real mess didn't start until the addition of protected mode. Even so, some Unix clones (notably Coherent) show that there was a way out.

But Microsoft killed IBM's attempts to get out of the BIOS morass (Micro Channel) and forced Intel to introduce the "virtual real mode".

Even so, Linux proves that if you ignore enough of its features, the Intel 386 is a decent processor. A great deal of the mess in a Linux system has to do with the battle for control of the boot code waged by Microsoft (the BIOS).

The 8051 is another dead-ugly processor ... but I prefer it over the PICs.
--
Albert van der Horst,Oranjestr 8,3511 RA UTRECHT,THE NETHERLANDS
        One man-hour to invent,
                One man-week to implement,
                        One lawyer-year to patent.
Reply to
Albert van der Horst

IIRC, Intel's part was called the Programmable Interrupt Controller. It was basically a dedicated-function 8-bit controller. My 8085 books are all in storage or I would look it up.

Bob McConnell N2SPP

Reply to
Bob McConnell

I've just recently finished a project very much like this: translating from COP8 to NEC 78K0 using a pure algorithmic translation method. It worked amazingly well; I was very surprised.
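
For anyone curious what a pure algorithmic translation looks like in miniature, here is a deliberately toy sketch; the mnemonics and templates are invented for illustration and have nothing to do with the actual COP8-to-78K0 mapping:

    #include <stdio.h>
    #include <string.h>

    /* Toy table-driven instruction translator: each rule maps a source
     * mnemonic prefix to a target template. Everything here is invented.
     */
    struct rule { const char *src; const char *dst; };

    static const struct rule rules[] = {
        { "LD A,", "MOV A," },      /* hypothetical: load accumulator */
        { "JP ",   "BR " },         /* hypothetical: unconditional jump */
    };

    const char *translate(const char *line)
    {
        static char out[80];
        size_t i;
        for (i = 0; i < sizeof rules / sizeof rules[0]; i++) {
            size_t n = strlen(rules[i].src);
            if (strncmp(line, rules[i].src, n) == 0) {
                snprintf(out, sizeof out, "%s%s", rules[i].dst, line + n);
                return out;
            }
        }
        return line;    /* unmatched: a real translator must flag this, not pass it through */
    }

    int main(void)
    {
        puts(translate("LD A,#42"));    /* prints: MOV A,#42 */
        return 0;
    }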

Reply to
larwe

What do you think the standard says?

Reply to
Richard Henry
