The *86 CPU.

Hello there,

It looks like I've made contact with sci.electronics.design using MT-Newswatcher. I reckon I still have plenty to learn, but I probably know enough to post a second time and ask your opinions about this idea. What if Intel's *86 CPU were programmable in the sense of being able to add new instructions to the instruction set? Would that represent an upgrade significant enough that my hoped-for severing of the counterproductive embrace Intel and Microsoft are locked in might somehow come about? Am I right in saying that the *86 microprocessor is now well out of date, and that all that has happened in the last 25 years is the relentless increase in the CPU's clock speed, however wonderful that is?

What are the chances, without being specific about how it might be done, of building a CPU and the other parts around it the way DuPont builds in room for manoeuvre, so that when new things inevitably pop up, as they do in any system, there is space for their accommodation? I'm also, of course, thinking about designing an OS with the same built-in lateral expandability. This reminds me very much of why the microprocessor came into being in the first place. I should check out

formatting link
I was listening to Andy Grove on the Wall Street Journal Report with Maria Bartiromo. Intel have a big plant here just outside Dublin and we want them to stay here.

Regards,

Peter Nolan. Ph.D.

33 Templeville Rd. Templeogue Dublin 6W Republic Of Ireland.
Reply to
Peter Nolan

Peter,

Merely FYI:

If you have www access, go to

formatting link
. They now have the searchable archive of (almost) all Usenet message traffic, from about 1981 to the present time (formerly at deja.com and dejanews.com). It's a goldmine! You can also post messages from there.

Regarding your *86 questions, et al.: Are you drunk?! But seriously, if not, maybe you'll want to go to the site I mentioned above and do a search for groups with names like *.comp.*, where your message might be more appropriate to post.

Regarding microprocessors and other digital processing components, and their software, in general: It might be advantageous for you to become more familiar with their development histories, and the current states of the art, before speculating much further, publicly. Searching the Usenet archive, as mentioned above, might reveal that many or all of your questions have been asked and answered, and perhaps beaten to death, already. We've probably all "been there", by the way. (If you've already read the preceding paragraph twice, then skip the next sentence. See preceding paragraph.)

Also, check out LINUX and friends. That might be "the answer", for your current queries.

Good luck!

Regards,

Tom Gootee

formatting link

-------------------------------------------

Reply to
tomg

There are some (myself included) who consider the original x86 to be a particularly revolting invention due to its segmented architecture. I've written code for a lot of the derivatives (including the 80188, marketed to the industrial controls market), and it's horrendous compared to the nice clean lines of 68k, MIPS and SPARC.

Good luck - as Tom notes, a lot of these questions have been beaten to death (10 years ago at least).

Search the archives, or as someone else I know suggests:

formatting link

Cheers

PeteS

Reply to
PeteS

"Peter Nolan" schrieb im Newsbeitrag news: snipped-for-privacy@news.cable.ntlworld.ie...

Hello Peter,

No, it already has too many instructions. It's difficult for a compiler to use them all.

It's nonsense to compare the recent x86 processors used in PCs with an old 8086 processor. Intel's Core Duo is now the fastest dual-core processor on the market. Apple switched from the over-hyped PowerPC to the 80x86 because it became obvious that the PowerPC could no longer compete with Intel's latest processors. This was the best decision Apple has made in years. Now users can run both OSes on their machines, and their computers are back at the leading edge.

Best regards, Helmut

Reply to
Helmut Sennewald

The x86 architecture was out of date that sad day when it was conceived. What a revolting mess!

If IBM had gone with Motorola and Digital Research, and later upgraded to UNIX, what a different world this would be!

John

Reply to
John Larkin

No shit. MS Basic on the Model 16 (68000) was way ahead of its time.

Reply to
Homer J Simpson

Hello,

Many thanks for your replies. I'm a physicist and not up to speed on stuff like this, except to say that I know a microprocessor takes a block of data, say 8 bits wide and so on upwards, and then, using an instruction, performs some operation on that data that is advantageous in some way en route to achieving some objective. I did manage to find and replace a faulty chip in a DEC PDP-8E that performed one such instruction, around 1983, when by then that machine was already out of date.

Is any effort being made to improve things? Apparently not is the probable answer. In the same way that OSes are being, or have been, developed as a collective effort, it's a great pity that engineers cannot get together, say globally, to design a great CPU, with, I might add, the DuPont philosophy of built-in expandability that I think is very important. If I'm not mistaken, I could modify Linux/Unix if I wanted to and knew how, so the blueprints of those OSes are known and come under the scrutiny of many, and as they say, two heads are better than one, and that is true.

The problem with making an actual microprocessor chip is that it costs money. That is not the case for an OS, which can be written using only time and brainpower. However, I guess there is no reason why a great new single-chip CPU could not be simulated, by a global collective effort, using a suite of existing chips. After that, contact Intel and all will be fine.

I now accept what I already knew: things are bad for the PC in use in small business and the home. But just as Steve Wozniak got the whole thing going, there is no reason at all why that process cannot be repeated, by bearing the pain that patience induces and letting time pass, just as it did for the PC, until something new and much better replaces the dominant existing system. Spread the word and hop to it.
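
To make that idea a little more concrete, here is a toy sketch in C of a CPU simulated entirely in software: a made-up 4-instruction, 8-bit machine (nothing to do with any real Intel or DEC part), just a fetch-decode-execute loop running over an array of memory. Something along these lines, scaled up, is roughly how a collectively designed processor could at least be prototyped and exercised before anyone spends money on silicon.

/* Toy illustration only: a hypothetical 4-instruction, 8-bit machine
 * simulated in software, showing what "an instruction performs some
 * operation on the data" means in practice. */
#include <stdio.h>
#include <stdint.h>

enum { OP_LOAD = 0, OP_ADD = 1, OP_STORE = 2, OP_HALT = 3 };

int main(void) {
    uint8_t mem[16] = {
        /* program: load mem[10], add mem[11], store to mem[12], halt */
        OP_LOAD, 10, OP_ADD, 11, OP_STORE, 12, OP_HALT, 0,
        0, 0,
        /* data */
        5, 7, 0, 0, 0, 0
    };
    uint8_t acc = 0;   /* accumulator register */
    uint8_t pc  = 0;   /* program counter */

    for (;;) {
        uint8_t op = mem[pc++];            /* fetch */
        if (op == OP_HALT) break;
        uint8_t arg = mem[pc++];
        switch (op) {                      /* decode and execute */
            case OP_LOAD:  acc = mem[arg];                   break;
            case OP_ADD:   acc = (uint8_t)(acc + mem[arg]);  break;
            case OP_STORE: mem[arg] = acc;                   break;
        }
    }
    printf("result at mem[12] = %u\n", (unsigned)mem[12]);   /* prints 12 */
    return 0;
}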

Peter.

Reply to
Peter Nolan

For the last few years they have been.

Apple joined the party earlier this year.

No. The internal data bus has grown fourfold (to 64 bits), and many features have been added.

Bye. Jasen

Reply to
jasen

"Peter Nolan" wrote in message news: snipped-for-privacy@news.cable.ntlworld.ie...

All of these things have already been tried. The first approach was RISC: you build a processor that is as fast as possible at performing a minimal set of very low-level operations, and then simply(!) code everything in those. If you want an operation that performs some major piece of arithmetic in a particular way on a set of numbers, write the code for it, then generate a macro that outputs this code, and you have your 'new' instruction.

There have also been microcodable chips, where a small area of high-performance memory contains a list of such operations, itself indexed by a larger instruction; the existing Intel chips have a limited ability in this area. Then, at the hardware level, quite a few groups have built chips containing FPGA elements, allowing the hardware to also be reconfigured. People are even experimenting with this type of reprogrammable array and 'learning' algorithms that make the chip improve itself.

The problems, though, are legion. The first is that it becomes basically impossible to diagnose or predict faults on such a customisable array. Then the unpredictability of the operation flow undermines most of the current memory optimisation systems, which make the current operating speeds possible (remember that there is no reasonably priced memory in existence that can even remotely approach the 'core' speed of a current processor; hence processors carry cascades of small cache memories, plus hardware that tries to get the data 'ready' for the forthcoming operations, and when the operations have different basic data flows, all of these areas have to be reconfigured as well...).

A core 'reprogrammable' processor is easy. However, the processor itself is only a small part of the whole computer, and allowing reconfiguration in the processor increases complexity 'outside' the CPU and currently reduces performance for a given price. In some specialist areas (particularly image processing), where the cost of large arrays of high-speed memory can be justified, this approach is used. Perhaps surprisingly, many modern video cards contain core processors that are optimised for parallel processing and can be reprogrammed in this way, to form a high-speed mathematical 'array' engine. You have to remember that the current 'PC' is driven by cost, not 'good design'.
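
To make the 'macro as new instruction' point concrete, here is a minimal C sketch. SADD16 is a made-up name, not a real x86 instruction: only ordinary primitive operations (add, compare, select) are assumed to exist, and the richer 'instruction' is just a macro that expands into a short sequence of them wherever it is used.

/* Rough sketch of the "macro as new instruction" idea, in C rather
 * than assembler. SADD16 is a hypothetical saturating 16-bit add:
 * the sum clamps at 0xFFFF instead of wrapping around. Note that,
 * like most simple macros, it evaluates its arguments more than once. */
#include <stdio.h>
#include <stdint.h>

#define SADD16(a, b) \
    ( (uint32_t)(a) + (uint32_t)(b) > 0xFFFFu \
          ? (uint16_t)0xFFFFu \
          : (uint16_t)((a) + (b)) )

int main(void) {
    uint16_t x = 0xFFF0, y = 0x0020;
    printf("plain add : 0x%04X\n", (uint16_t)(x + y)); /* wraps to 0x0010 */
    printf("SADD16    : 0x%04X\n", SADD16(x, y));      /* clamps to 0xFFFF */
    return 0;
}

The point is only that the 'new instruction' lives entirely in the toolchain, not in the silicon, which is exactly why the hardware can stay minimal.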

Best Wishes

Reply to
Roger Hamlett

Hello Roger,

I don't understand some of what you have written, but you detail the problems in a way that makes them seem intractable. I'm very happy that I'm at least making sense in a general way. I used two programs on an 80286, Harvard Graphics and Axum Graphics, in 1991/1992 for plotting data acquired, incidentally, by the 286 itself, and both ran under MS-DOS. I loved the sound the PC made booting up, and as far as I know the simpler MS-DOS was bug-free in the end. Around that time I heard about Windows 3.1 and was told this was the future. I figured that any OS called a Windows something-or-other couldn't be trusted. :) And when an icon on a thing called "The Desktop" is named "My Computer", I knew for sure things had taken a walk on the wild side. At the risk of getting to you, which I don't want to do, I remain stubbornly unconvinced that change cannot be effected. I wonder if what will do it is a whole new way of thinking about the problem, something as profound as the invention of the computer itself during WW2, while paying respect to Babbage's engine, now running:

formatting link

I used to hear a lot about neural networks, but from my perspective that fuss has died down. My intuition tells me these neural networks are leading us nowhere fast, however intriguing they may be. Imagine switching on a neural network computer one morning only to hear it object, saying it wants a few more hours' sleep.

Many thanks,

Peter.

Reply to
Peter Nolan

What exactly is it you're trying to change? All you have to do is convince everybody in the world to do it your way.

This could be a golden opportunity to get some serious education in the ways of the Real World. ;-)

There have been neural nets for decades, and massively parallel processors - the Connection Machine, for example, had something like 64K small uPs in parallel, sort of.

The problem is, nobody has yet figured out how to effectively program one.

Maybe that would be a good project - buy or build a neural net or massively parallel processor, and figure out how to program the thing! :-)

Good Luck! Rich

Reply to
Rich Grise

In article , Rich Grise wrote:

Hello Rich,

I want to be able to sit at a PC and know it isn't off gallivanting somewhere it shouldn't be, and also know that the entire thingamajig won't freeze without warning. I wrote to comp.lang.basic.visual.misc, and a guy called Mike Williams gave me a code snippet, and later an upgrade to it, that when I implemented it saved a copy of the email I was typing at whatever time interval I specified in the program. Here is the complete thread:

formatting link

Before this I was on edge, not knowing when the so-and-so of a thing that is the Windows-operated PC would just freeze. In the end I bought a Mac Mini and then the Mac iBook G4 14" that I'm using now. It is behaving very well: in the roughly six months I've been using these two machines I have not had any trouble, and I no longer worry at all about losing stuff that, as a hunt-and-peck typist, might have taken me an hour or longer to type. I will get round to learning how to type properly at some point, but I do not feel like doing that right now.

I want a computer running under a solid OS that is a joy to use, rather than the anxiety-filled ordeal that using a PC is. I guess the Mac iBook is close to what I want, but I'm very annoyed about the way things are in the world of the PC. As already pointed out, I'm a physicist, and computers are as much a part of my trade as anything else. The problem as I see it is that years have passed and the current PC now feels embedded in the world of computing. I say it doesn't have to be like this: if some intelligent people with patience and a long-term strategy commit themselves to delivering a system that is a joy to use (a joy by my standards, at least), then change can happen. Most people using PCs do not understand the mess things are in, and for this reason the general public and the owners of businesses will keep right on buying the latest PC. For this reason the price is right, but the hidden costs are criminally high. Welcome to my world. It's a wonderful place.

Peter.

Reply to
Peter Nolan
