x86 architecture concepts

Yes. The registers, instruction set, memory-addressing method, and I/O design make up the programmer's model. That is one level of processor architecture. It has a major influence on the hardware design, but doesn't totally determine it. As you noted, different implementations make different performance and power tradeoffs.

IBM exposed the separation when they defined the System/360 architecture in the 1960s, then implemented several computer models using very different hardware designs. Microprogramming allowed them to build machines with a wide range of performance that all executed the same instruction set.

Today there are many ways to implement a given programmer's model in all but the bottom-end processors. You might call this the hardware architecture of the machine.

Thad Smith

Nope. We'd have a different one. The industry needed a monopolist, and it chose one. Everyone wanted to interchange their files with everyone else, after all. I don't think we fared well with our collective choice, but there was going to be a monopoly one way or another.

Clifford Heath

On Wed, 23 Feb 2005 14:51:54 +0100, Mouarf wrote:

I have better answers - I know a fair amount about the ARM, x86, and PPC architectures (not so much about the Alpha, except in general terms), and I know a fair amount of the history behind them, and the fundamentals of processor architecture design. I just don't think you asked appropriate questions in an appropriate manner in an appropriate place.

Google and Usenet are different things, and exist for different purposes. If you want to get information from the web using Google, you've got to ask the right questions, and you've got to be persistent - Google is a patient listener. A search for "x86" is way too broad - try searching for "history of processor architectures" or something like that. You would probably be better off first searching for "google tutorial", since your original post shows you made precious little effort on your own before firing off questions whose full answer would fill a book.

Your question may not be homework, but it was certainly worded as such. It is one of those "I haven't got a clue what I'm talking about, I haven't even bothered to read some posts in this newsgroup to find out if it is relevant, but maybe I'll be lucky and someone will write my essay for me" questions. People writing these questions fall into three categories - students (the majority), interested amateurs, and professionals in the wrong job. Since you are not a student, and I hope for your sake you are not in the third category, I'll assume you are just curious.

Curiosity is a valid reason, and there are lots of people here (including me) happy to inform the curious - as long as you are happy to do your share. That means you use other resources such as the web, and that you follow the rules and customs of the newsgroup. For example, stop top-posting. If you don't know what that means, or don't know why it is relevant, then I suggest you turn to Google again.

There are two aspects to a processor architecture - the programmer's viewpoint and the implementer's viewpoint. The implementer's viewpoint depends on the vendor, and covers such things as the internal structure of the processor - for example, what is the difference between a Pentium 4 and an Athlon XP? Why is one faster with some code, and the other faster with different code? From the programmer's point of view, they are basically the same (bar a few SSE instructions) - both are implementations of the x86 architecture. That means they have the same instruction set and register set.
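
To make that concrete, here is a little C sketch of my own (an illustration, assuming GCC or Clang on an x86 machine - __get_cpuid is GCC's wrapper around the CPUID instruction): the same compiled binary runs unchanged on either chip, because both implement the same programmer's model; CPUID merely reports whose implementation happens to be underneath.

/* Sketch: the x86 programmer's model is the same across vendors; CPUID
   only reveals whose implementation is underneath.  Assumes GCC or
   Clang on an x86/x86-64 host. */
#include <stdio.h>
#include <string.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 1;

    /* The 12-byte vendor string is packed into EBX, EDX, ECX, in that order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    /* Prints e.g. "GenuineIntel" on a Pentium 4, "AuthenticAMD" on an Athlon XP. */
    printf("%s\n", vendor);
    return 0;
}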

x86 refers to the programming model - the instruction set and register bank. Similarly for PowerPC (IBM has some very complete books on the PowerPC architecture, with plenty available on-line), ARM and Alpha.

Intel made the first x86 processors, and thus are to blame for the mess that is the x86 architecture.

No, x86 is not a core, it's an instruction set and register bank.

Many things, of which the instruction set architecture is only one. It is possible to make low-power implementations of the x86 instruction set - look at the Geode, for example - although it is much harder to make fast low-power implementations of x86 than of, say, the ARM. But mostly it is a matter of emphasis and market: most x86 implementations aim to be as fast as possible regardless of power, while most ARM implementations are designed for low power.

David

One of the largest applications of the time was WordStar. All you have to do is look at its code, and you can immediately detect the application of that conversion tool. The tool didn't necessarily leave a complete, functional source, because some things depended on code size, but it was close.

It was not the existence of CP/M that counted, but the existence of the applications. Early MS-DOS implemented all the system calls of CP/M, although it fouled some of them up in typical MS-DOS fashion.

CBFalconer

Interchange of files is inhibited by the monopoly - it's hard enough to interchange files with previous versions of Microsoft Word, because of sneaky format changes designed to protect the monopoly. "HTML" that only works with IE?

A common platform for applications is more the reason - though if p-code had been fast enough early on, the platform would not have mattered much, as Java demonstrates now.

Paul Burke

I'm not sure what "decision" you guys are talking about. But I don't agree with "That's for sure" regardless.

Remember that, for all his "reputation" as a techie, Gates is a _businessman_. Had things happened differently, he might not have had such an initial boost, but he would have been working all along to gain as much market as he could. When IBM approached Microsoft in the first place, they were primarily a development tool shop (Microsoft BASIC). Gates wants Microsoft to be the company that writes all the software. Microsoft sells the most software, not because it's the best, but because it is the best marketed. And technical superiority is no guarantee of market success.

So if IBM had gone with DR or Intel for the operating system, or with the Moto 68k for the micro and someone else again for the OS, Microsoft might not be as big as it is today, but it would certainly be big. Perhaps even dominant.

Regards,

-=Dave

--
Change is inevitable, progress is not.

The translator did exist. I had the two-disk 8" floppy set. Nobody used it where I worked. By the early '80s we were supporting CP/M, MS-DOS, and CP/M-86, all from one C source file and targeted compiles. So yeah, you're right.

One could make a somewhat strained argument that the architectural and market similarities of the 8080 and the 8086 (and of CP/M and MS-DOS) made them a good pair for "compatible" compilers, but that would be something of a stretch.

Jim Stewart

"CBFalconer" wrote

Intel tried; AMD sued, claiming the agreement covered derivatives and improvements ... for ever and ever and ever. For the most part, the courts agreed.

Then AMD sued for the right to make 8087s, which were selling for $300 at the time. By the time AMD won the suit the 486 was out and the issue was moot. But boy, did the lawyers and consultants rake it in.

AMD used to be the premier second-source house. But it turned out Jerry Sanders was just a crook.

--
Nicholas O. Lindan, Cleveland, Ohio
Consulting Engineer:  Electronics; Informatics; Photonics.
To reply, remove spaces: n o lindan at ix  . netcom . com
psst.. want to buy an f-stop timer? nolindan.com/da/fstop/

"Clifford Heath" wrote

In the book 'A Random Walk Down Wall Street' the author claims that statistically there has to be someone with that sort of monopoly/money - it doesn't matter who, or in what business.

Nicholas O. Lindan

I've done a bit of '85->'86 translation. No, I didn't have a 'tool' to do it. I did 90% of it with Brief key-stroke macros. The result was like German->English using only a dictionary and not changing the grammar, but it did work and was a fast, accurate and easy job -- something that doesn't come along very often.
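
For the curious, here is a toy C sketch of the kind of mechanical renaming such macros were doing (my own illustration, not the actual macros). It uses the usual 8080-family to 8086 register mapping - A->AL, B->CH, C->CL, D->DH, E->DL, H->BH, L->BL, M->[BX] - under which the BC, DE, and HL pairs survive as CX, DX, and BX. A real translation also had to cope with flag and addressing-mode differences, so treat this as an illustration only.

/* Toy illustration of mechanical 8085 -> 8086 operand renaming, the kind
   of job the editor macros above were doing.  Hypothetical and incomplete:
   a real translation also had to handle flags, addressing modes, and
   code-size differences. */
#include <stdio.h>
#include <string.h>

/* Usual 8080-family to 8086 register mapping: the BC, DE, HL pairs
   survive as CX, DX, BX, and M (memory via HL) becomes [BX]. */
static const struct { const char *r8080, *r8086; } reg_map[] = {
    { "A", "AL" }, { "B", "CH" }, { "C", "CL" }, { "D", "DH" },
    { "E", "DL" }, { "H", "BH" }, { "L", "BL" }, { "M", "[BX]" },
};

static const char *rename_reg(const char *r)
{
    for (size_t i = 0; i < sizeof reg_map / sizeof reg_map[0]; i++)
        if (strcmp(r, reg_map[i].r8080) == 0)
            return reg_map[i].r8086;
    return r;  /* pass anything unrecognised through unchanged */
}

int main(void)
{
    /* "MOV A,M" on the 8085 becomes "MOV AL,[BX]" on the 8086. */
    printf("MOV %s,%s\n", rename_reg("A"), rename_reg("M"));
    return 0;
}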

Nicholas O. Lindan

Did Intel ever actually ship a translator or was that all just handwaving and hype?

I'll concede that it would take longer to convert to 68K assembly.

--
Grant Edwards, grante at visi.com
"Yow! Hmmm... A hash-singer and a cross-eyed guy were SLEEPING on a deserted island, when..."

Do you have any reference for this? AFAICT the number of transistors in (Intel) CPUs is ever _increasing_.

Rob Windgassen

I think the question should be: would they have made any less money if they had implemented a better technical solution? Would the systems available today be more advanced if less money had been spent on making the existing non-elegant solution faster while staying compatible? Many corporations seem to manufacture less-than-optimal solutions, then just quadruple the advertising budget and use less-than-savoury tactics to get rid of the better solutions.

That is a significant cultural problem in the western world today. Often a much better product could be provided while making the same amount of money. It seems that in today's corporate culture the idea is to make as much money as possible while providing the worst possible product. The client is "THE ENEMY". One can understand that companies have to make money, but at the end of the day the actual services and products that the companies provide should be the important thing. The actual value of money is very little: one cannot eat it, and it does not by itself provide transport or any of the thousands of other things people need or want.

Regards, Anton Erasmus

I think the answer is yes, given that technical elegance will generally cost more in development or time to market. But the problem isn't what a given company chooses to bring to market, it's what the marketplace decides to buy. "Technical elegance" is usually pretty far down the list of customer requirements.

I wouldn't say "worst", but "cheapest". The client wants the most features at the lowest price. The business wants to maximize profit, the difference between price and costs. As long as clients are cheap, there will be pressure to build things cheaply, and that often precludes technically elegant solutions. Microsoft uber alles. WalMart uber alles. RIP beauty.

You're confusing art and business. Artists often set the commission price up front, and then are free to create beauty subject to the customer's tastes and timeline. Business has to guess what the customer wants and is willing to pay for; if it takes too long to finish the product, someone else ships first and captures the market.

Just out of curiosity, what are the dominant MCU architectures in the embedded field? Are there more elegant alternatives? Why aren't they dominant?

Kelly Hall

"Kelly Hall" wrote

As, I am afraid, it should be. Nowhere in "Technical Elegance" are the needs of the customer considered.

Nicholas O. Lindan

While RMX-80 and later RMX-86 definitely were multitasking operating systems, did RMX-86 originally really have multi-user and network capability? When did they invent the iRMX name?

If IBM had selected the multitasking RMX-86, would there have been enough professionals who understood anything about real time and multitasking? In those days, could you really get any kind of formal training for such environments? Having worked with RSX-11 since the mid-1970s, I saw that much in-house training was required even in the mid-1980s to get most programmers on a team familiar with multitasking issues.

Realistically, if IBM had chosen RMX-86, how many competent programmers would have been available? Getting the universities to teach multitasking would have been very hard, since most of the professors were from the batch (punched-card) era :-).

Paul Keinanen

With technical elegance as in "an idea or plan that is elegant is very intelligent yet simple" (Longman dict.), it may very well be

- economical

- reliable

- easy to use

Interesting for a customer, I guess.

Rob Windgassen

"Paul Keinanen" wrote

Originally? Now that's digging - I am pretty sure those docs got tossed a long time ago. I know Motorola's RMS-68K was multi-user in '81, so chances are iRMX-86 was also. My memory of the ISIS-III -> iNDX and iNDX + iRMX-80 -> iRMX-86 (or was it the other way round) progression is pretty hazy.

iRMX-86 has a history of being used in industrial PC-DOS machines, where it runs Windows 3.1 through Windows NT as a task/user. It is still being designed into systems.

You don't need to know real-time. The interface can look just like DOS. Instead of 'stay resident' one would spawn a separate, killable task. iNDX would have been a better variant, and I confess to lumping the two together in my head as "that Intel thing in the '80s."
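
A rough sketch of that "separate, killable task" pattern, using POSIX threads as a stand-in for a real RTOS task API (this is emphatically not the iRMX interface, whose calls I won't guess at here):

/* Sketch of the 'separate, killable task' pattern using POSIX threads as
   a stand-in for a real RTOS task API -- this is NOT the iRMX interface.
   Build with: cc demo.c -lpthread */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *background_task(void *arg)
{
    (void)arg;
    for (;;) {
        /* Work that a DOS program would have done as a TSR. */
        puts("background task running");
        sleep(1);  /* sleep() is a cancellation point */
    }
    return NULL;  /* not reached */
}

int main(void)
{
    pthread_t tid;

    pthread_create(&tid, NULL, background_task, NULL);
    sleep(3);             /* the foreground does its own work */
    pthread_cancel(tid);  /* 'kill' the task instead of leaving it resident */
    pthread_join(tid, NULL);
    puts("task killed, exiting cleanly");
    return 0;
}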

In the ultimate analysis _all_ operating systems are real-time, just with varying degrees of 'softness' (that's a measure of how seriously the OS treats getting something done on schedule -- for the non-RTOS'ers.)

Way more than you can get now. Motorola and Intel both provided series of courses on OSes, hardware, languages, and development systems. I only attended the 68K one. The course was pretty good: it convinced us to drop Motorola and switch to Intel right quick. Every task on the 68K needed its own copy of the run-time library, some 40K - and 10 tasks x 40K was a _lot_ of EPROMs in '81.

Oh, I remember that. I did some travelling tent shows getting clients to use an RTOS. It was like convincing a C programmer to use Pascal. Engineers are stubborn beasts. Though once you get them to switch, they latch on to the new method as tightly as they latched on to the old. Engineers are 'show me' types.

'Competent' - probably more than now, as the PC would have had an instant entry into factory automation.

Now, incompetent ones we had in spades, what with Basic coming free with the machine. Competent ones took one look at Lifeboat C, the Phoenix linker, and the MS compilers and ran for the hills. Remember Debug and Edlin?

In my experience profs stayed pretty much at the leading edge, often defining it. It may be different at non-research colleges.

Industrial consulting makes up a real big chunk of an engineering prof's income - sticks-in-the-mud need not apply. There was a lot of research into real-time networks and distributed processing in the early '70s. The first 6800s (like the first dozen) went to universities. At Case a grad student promptly pushed on the middle of the 40-pin cerdip to get it into the socket and cracked it - surprisingly, it still worked. When was the ARPANET developed?

It is after they graduate and spend some time in their first job that engineers' minds shut tight to new ideas. If the technology in their first job is the same as they were imprinted with at school, then heaven help you.

Nicholas O. Lindan

"Rob Windgassen" wrote

Nowhere in that definition are the words reliable, cost-effective, functional, correct, useful, safe, etc., etc., etc. to be found.

'Intelligent yet simple' has _never_, TTBOMK, appeared in a product specification.

If a chair is described as 'intelligent yet simple' one can assume it will be hell to sit in.

Or it may not. You can as easily say the same effects 'may' be had by painting it orange.

'Technical elegance' is historically a byword for poorly documented, unreliable and unsafe.

Nicholas O. Lindan

Does anyone watch TV?

IBM went to DR first. DR did not agree to IBM's terms, so Gates bought DOS 1.0 from some guy who wrote it at home. He did this so he could sell BASIC with it. The x86 was chosen for time reasons.

Neil Kurzman
