Time to ditch CPUs and concentrate on GPUs?



Time to ditch CPUs and concentrate on GPUs?
-------------------------------------------

Taking apart an Asus Revo, I couldn't help
noticing that the CPU chip is half the size of
the graphics controller!

It was always destined to happen.

So now we have a situation where the CPU is
slowed down because it has to drive a huge
number of connections to the graphics chip
to make that work, and the same amount of
interface silicon is repeated on the graphics chip.
In the process, a large amount of electrical power
and silicon is wasted just getting the two devices
to talk.

AMD and ARM are doing the right thing by building
the graphics controller into the CPU chip.

But it still 'feels' all wrong because the emphasis
is on the CPU, with the graphics controller treated
as a peripheral.

The 'correct' way to do this is to treat the graphics
controller as the 'central processing unit', and then
put three or more CPUs around it as 'peripherals' to do the
menial work.

Starting with the lowest-spec CPU: it controls
all the peripherals such as UART, SPI, DMA and the like.
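
As a sketch, that I/O CPU's firmware could be little more than
a polling loop over memory-mapped peripheral registers (all the
addresses and bit layouts below are invented for illustration,
not taken from any real part):

#include <stdint.h>

/* Hypothetical memory-mapped peripheral registers (addresses invented). */
#define UART_STATUS (*(volatile uint32_t *)0x90000000)
#define UART_DATA   (*(volatile uint32_t *)0x90000004)
#define SPI_STATUS  (*(volatile uint32_t *)0x90001000)
#define DMA_STATUS  (*(volatile uint32_t *)0x90002000)

#define RX_READY  (1u << 0)
#define XFER_DONE (1u << 1)

/* Stubs: in a real design these would pass messages to the main CPU. */
static void handle_uart_byte(uint8_t b) { (void)b; }
static void handle_spi_done(void)       { }
static void handle_dma_done(void)       { }

/* The I/O CPU does nothing but service peripherals, leaving the
   bigger cores free for user programs and graphics. */
void io_cpu_main(void)
{
    for (;;) {
        if (UART_STATUS & RX_READY)
            handle_uart_byte((uint8_t)UART_DATA);
        if (SPI_STATUS & XFER_DONE)
            handle_spi_done();
        if (DMA_STATUS & XFER_DONE)
            handle_dma_done();
    }
}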

The next one up in power is the traditional CPU
that runs your user programs.

The next one up in power is the graphics CPU,
dedicated to driving the graphics engine through
a connection with enormous programmable bandwidth.
This is the part that will drink power if power is available
and, critically, scale back its speed and the number of
connections into the graphics engine if power is low.

So if power is available, it could use, say, a 128-bit bus from
the graphics CPU into the graphics engine; but if power is low,
software could power down the big bus (and possibly the graphics
CPU itself) and use a 1-bit serial bus to squirt the data into the
graphics engine. Any of the lesser CPUs could do that as well,
shutting down the higher-spec CPUs and transferring execution of
the main programs between the three processors.
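
A minimal sketch of that power/bandwidth switch, assuming
invented control registers and a made-up power threshold:

#include <stdint.h>

/* Hypothetical control registers for the CPU-to-graphics-engine link. */
#define GFX_BUS_CTRL  (*(volatile uint32_t *)0xA0000000)
#define GFX_SERIAL_TX (*(volatile uint32_t *)0xA0000004)

#define BUS_WIDE_128 0x80u   /* full 128-bit parallel bus enabled */
#define BUS_SERIAL_1 0x01u   /* 1-bit serial fallback */

/* Pick the link width from the available power budget (milliwatts).
   On battery, drop to the serial link and let the big bus power down. */
void gfx_link_select(unsigned power_budget_mw)
{
    if (power_budget_mw > 500)
        GFX_BUS_CTRL = BUS_WIDE_128;
    else
        GFX_BUS_CTRL = BUS_SERIAL_1;
}

/* Squirt a word into the graphics engine one bit at a time when
   only the serial link is powered. */
void gfx_serial_write(uint32_t word)
{
    for (int i = 31; i >= 0; i--)
        GFX_SERIAL_TX = (word >> i) & 1u;
}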

Such an architecture does rely on ditching the idea of the CPU
as the most important thing in a computer, focusing
all efforts on making graphics the number one function
inside a consumer computer chip, and littering CPUs around the GPU
as peripherals that service the GPU's functions.

Currently all the mainstream GPUs are encumbered by proprietary IP.
This creates documentation access problems for anyone trying
to build an ecosystem around a CPU, as Raspberry Pi
have found out the hard way.

So it would be good to start an http://www.opencores.org project
to create such custom GPUs. At least one graphics processing
project is already getting started there.

The development of a graphics supercomputer is made easier
if one notes that some critical advances have already been
made at opencores.

Their OpenRISC CPU, which is similar to an ARM CPU,
is now operational on an FPGA; it has a gcc port and runs Linux,
so it is entirely feasible to add the graphics CPU through
'armchair' design effort in the FPGA and get it operational in next
to no time.

Linux and the gcc compiler take care of compiling existing
graphics libraries, generating executable OpenRISC assembler
code and whatever else is needed to feed a custom graphics engine.
The custom graphics engine can be designed and
modified to one's heart's content until it
works, because it is just FPGA real estate.
So the graphics centric GPU architecture could be developed in
accelerated time and then rolled out to customers.
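
For illustration, feeding such a home-grown engine from C built
with an OpenRISC cross gcc might look like this (the FIFO
addresses and the command encoding are pure assumption, not any
real interface):

#include <stdint.h>

/* Hypothetical command FIFO of the FPGA graphics engine,
   mapped into the OpenRISC address space. */
#define GFX_FIFO      (*(volatile uint32_t *)0xB0000000)
#define GFX_FIFO_FULL (*(volatile uint32_t *)0xB0000004)

/* Invented command encoding: opcode in the top byte,
   12-bit x and y packed into the low 24 bits. */
#define CMD_DRAW_PIXEL 0x01000000u

static void fifo_push(uint32_t cmd)
{
    while (GFX_FIFO_FULL)   /* spin until the engine drains the FIFO */
        ;
    GFX_FIFO = cmd;
}

void draw_pixel(uint16_t x, uint16_t y, uint8_t colour)
{
    fifo_push(CMD_DRAW_PIXEL | ((uint32_t)(x & 0xFFF) << 12) | (y & 0xFFF));
    fifo_push(colour);      /* second word carries the colour */
}

Because both sides are soft, the command set can change as often
as the Verilog does.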

Who benefits? Mobile makers, desktop computer makers,
gadget makers whose products have color graphics displays.
In short, the beneficiaries are the vast majority of consumers
who buy products with a display on them.


Re: Time to ditch CPUs and concentrate on GPUs?


Your cluelessness is legendary. Graphics chips have been used for years
in various simulation SW where their powers can be put to good use where
3D geometry, scaling and colour interpolation, anti-aliasing (Hi Peter!)
etc. are suitable. But they do not replace the traditional CPU in the
general usage case. The graphics chips are designed specifically around
number crunching and large data-volume throughput in multiple texel
rendering pipelines.

In addition, generally nothing is "graphically slowed down",
since the CPU offloads a lot of the loading and instruction sequences to
the GPU via well-defined APIs and can get on with its work. Also using
those incredibly useless things called cores, threads and interprocess
communications that Jed thinks are unnecessary.
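
That fire-and-forget pattern is easy to show in miniature with an
ordinary worker thread standing in for the GPU driver (a toy
sketch, not any vendor's actual API):

#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* One-slot command queue: the main thread submits frames, a feeder
   thread (standing in for the GPU) consumes them asynchronously. */
static pthread_mutex_t lock  = PTHREAD_MUTEX_INITIALIZER;
static pthread_cond_t  fill  = PTHREAD_COND_INITIALIZER;
static pthread_cond_t  drain = PTHREAD_COND_INITIALIZER;
static int pending = -1;           /* -1 means the slot is empty */

static void *gpu_feeder(void *arg)
{
    (void)arg;
    for (;;) {
        pthread_mutex_lock(&lock);
        while (pending < 0)
            pthread_cond_wait(&fill, &lock);
        int frame = pending;
        pending = -1;
        pthread_cond_signal(&drain);
        pthread_mutex_unlock(&lock);
        printf("'GPU': rendering frame %d\n", frame);
        usleep(10000);             /* pretend rendering takes a while */
    }
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, gpu_feeder, NULL);
    for (int frame = 0; frame < 5; frame++) {
        pthread_mutex_lock(&lock);
        while (pending >= 0)       /* wait only if the slot is full */
            pthread_cond_wait(&drain, &lock);
        pending = frame;
        pthread_cond_signal(&fill);
        pthread_mutex_unlock(&lock);
        printf("CPU: frame %d submitted, doing other work\n", frame);
        usleep(5000);              /* the CPU is not blocked on the 'GPU' */
    }
    sleep(1);                      /* let the feeder drain before exit */
    return 0;
}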

Tell you what, little Fraud: why don't you use your nano bot arms to
create a whole new paradigm in processing?



Meet Hadron quack - Burson-Marstelar Employee trolling Linux newsgroups



Meet Hadron quack - Burson-Marstelar Employee trolling Linux newsgroups.


Just because appil and micoshaft are losing business to Linux,
that's not a good reason for Burson-Marstelar employees to be
trolling Linux newsgroups on their behalf.



Re: Meet Hadron quack - Burson-Marstelar Employee trolling Linux newsgroups
Burson-Marsteller seems a good company to me, I like it!
"We have a clear vision: To provide gold standard performance - for our
clients, our people and our *shareowners* - as one seamless, global business
with a single culture."
I like them a lot:
http://finance.yahoo.com/echarts?s=WPPGY+Interactive#symbol=WPPGY;range=1d
Thanks for the hint.


Re: Time to ditch CPUs and concentrate on GPUs?


I forgot to mention, this low-spec CPU could also
be programmed to run as an intelligent caching
unit, similar in function to an MMU but much more intelligent,
because it is programmable with separate software
that targets the specifics of a custom motherboard environment.

In the absence of a real MMU, this unit can take over the functions
of paging and intelligent caching altogether.

Another function of the unit is to take care of the on-disk data
structures. So while the main programs open and close files,
this smaller CPU, which handles DMA, sweeps up behind them,
adjusting the disk data structures and relieving the main CPU
of the housekeeping.
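
A sketch of the page-cache side of that idea, with invented
mailbox addresses and a crude LRU policy (all of it assumption,
just to make the job concrete):

#include <stdint.h>
#include <string.h>

#define PAGE_SIZE   4096
#define CACHE_PAGES 64

/* Hypothetical mailbox through which the main CPU posts page numbers. */
#define MBOX_REQ (*(volatile uint32_t *)0xC0000000)

struct cache_entry {
    uint32_t page_no;
    uint32_t last_used;                     /* for LRU eviction */
    uint8_t  data[PAGE_SIZE];
};

static struct cache_entry cache[CACHE_PAGES];
static uint32_t tick;

/* Stubs: a real board would DMA from disk and reply via the mailbox. */
static void disk_read_page(uint32_t page_no, uint8_t *buf)
{
    (void)page_no; memset(buf, 0, PAGE_SIZE);
}
static void reply_to_main_cpu(const uint8_t *page) { (void)page; }

/* Serve one request: hit the cache if possible, otherwise evict the
   least recently used slot and refill it from disk. */
static void serve(uint32_t page_no)
{
    int lru = 0;
    tick++;
    for (int i = 0; i < CACHE_PAGES; i++) {
        if (cache[i].page_no == page_no) {  /* cache hit */
            cache[i].last_used = tick;
            reply_to_main_cpu(cache[i].data);
            return;
        }
        if (cache[i].last_used < cache[lru].last_used)
            lru = i;
    }
    disk_read_page(page_no, cache[lru].data);   /* cache miss */
    cache[lru].page_no   = page_no;
    cache[lru].last_used = tick;
    reply_to_main_cpu(cache[lru].data);
}

void cache_cpu_main(void)
{
    for (int i = 0; i < CACHE_PAGES; i++)
        cache[i].page_no = UINT32_MAX;      /* mark every slot invalid */
    for (;;)
        serve(MBOX_REQ);    /* a real mailbox would block until a request */
}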





Re: Time to ditch CPUs and concentrate on GPUs?

You are sniffing way too much glue.

