Google set to make its own energy efficient ARM Server chip
------------------------------------------------------------


Typically, Intel chips waste 90% of the power that goes into a data center as heat. Google is fed up and wants an alternative for the millions of servers that it has. What better than to purchase an ARM license and build your own ARM server chips? :) It would then be possible to expand the data centers to tens of millions of servers and still clock in with less wasted energy.

The biggest speed restriction is the DDR cycle time, which limits the CPU to about 10MHz operating speed no matter whose chips are used. So you may as well go for the more energy efficient ARM chips.

Google could use this opportunity to break into the hardware business to power its tablets and netbooks: make the chip with one single header file that names every register and every bit field, and then sell the chip and the header file to the Linux and open source community. Every single open source project is likely to switch to using the Google ARM chip within days. If Google then brought out a second chip, it would take hours to days to change that one single header file and get all the previous work up and running on the new ARM chip again.
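
As a rough sketch of what such a header could look like (every name, address and bit field below is invented for illustration, not taken from any real chip):

/* hypothetical_soc.h -- illustrative only; every name and address is made up.
 * The idea: one header names every register and every bit field, so porting
 * to a new chip revision means editing only this file.
 */
#ifndef HYPOTHETICAL_SOC_H
#define HYPOTHETICAL_SOC_H

#include <stdint.h>

/* Base addresses of the memory-mapped peripherals (invented values). */
#define SOC_UART0_BASE   0x40001000u
#define SOC_GPIO_BASE    0x40002000u

/* UART registers, as memory-mapped locations. */
#define UART_DATA        (*(volatile uint32_t *)(SOC_UART0_BASE + 0x00))
#define UART_STATUS      (*(volatile uint32_t *)(SOC_UART0_BASE + 0x04))

/* Bit fields within UART_STATUS. */
#define UART_STATUS_TX_READY   (1u << 0)   /* transmitter can accept a byte */
#define UART_STATUS_RX_VALID   (1u << 1)   /* a received byte is waiting    */

/* GPIO registers and bits. */
#define GPIO_OUT         (*(volatile uint32_t *)(SOC_GPIO_BASE + 0x00))
#define GPIO_DIR         (*(volatile uint32_t *)(SOC_GPIO_BASE + 0x04))
#define GPIO_PIN_LED     (1u << 5)

#endif /* HYPOTHETICAL_SOC_H */

Porting to a hypothetical second chip would then, at least in principle, mean changing only the addresses and bit positions in that one file.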

If Google played their cards right, they would own the entire ARM server market, tablet market, and Linux ARM chip market. With that power, they could squeeze out the patent trolls like appile and microshaft by denying them any further markets, because it would all be taken up by cheap Linux gadgets and open source projects that compete for attention at every niche at every level of competition.

In fact, any open source project could make its own SoC with a single header file and storm the entire market. It does not have to be Google. It could be Ubuntu, or Mint, or one of the numerous other distro makers. All they would need is about USD 2 million in the bank, with a bit of crowdfunding perhaps, and they would never have to look back after taking their first orders.

Reply to
7

I'm curious: where does the other 10% go? Is Google creating matter with it?

--
Grant
Reply to
Grant Edwards

10% of the chips do the real computing at any one time.
Reply to
7

Shhhh! :)

--
Les Cargill
Reply to
Les Cargill

So Madman Muntz them other 90% offa the board. Problem solved.

--
Les Cargill
Reply to
Les Cargill

So the actual claim is that 90% of the power is wasted. Adding the specifier "as heat" implies (to me) that the other 10% is converted into something other than heat.

100% of the power in a data center turns into heat.

The question ultimately is how much of that energy consumption could be avoided while still accomplishing the desired task.

--
Grant Edwards               grant.b.edwards        Yow! I feel better about 
                                  at               world problems now! 
                              gmail.com
Reply to
Grant Edwards

Actually I suspect it is far higher than that. The theoretical maximum is completely reversible computing, which should in principle create no heat at all; that tends to be very slow. Next is to argue that each erasure costs kT of energy. Again, real chips are far, far less efficient than that. Even 1 kT per bit operation would imply only about 10^-21 J per bit operation. Assuming 64 bit words and 10GHz operation, that would still only imply 10^-9 watts of power consumption. Instead what we get is something like 100W or so, that is

99.99999999% of the power going into heat rather than useful computations.
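
For what it's worth, here is that back-of-the-envelope estimate spelled out as a sketch; the 300 K temperature, 64-bit words, 10GHz rate and 100W actual draw are assumed inputs, not measured figures:

/* Rough Landauer-limit estimate. All inputs (300 K, 64-bit words, 10 GHz,
 * 100 W actual draw) are assumptions for the sake of the estimate. */
#include <stdio.h>

int main(void)
{
    const double k = 1.380649e-23;      /* Boltzmann constant, J/K        */
    const double T = 300.0;             /* assumed room temperature, K    */
    const double bits_per_word = 64.0;
    const double ops_per_sec = 10e9;    /* assumed 10 GHz operation       */
    const double actual_watts = 100.0;  /* assumed real CPU power draw    */

    double joules_per_bit = k * T;      /* ~4e-21 J per bit erasure       */
    double ideal_watts = joules_per_bit * bits_per_word * ops_per_sec;
    double wasted = 1.0 - ideal_watts / actual_watts;

    printf("kT per bit erasure : %.3g J\n", joules_per_bit);
    printf("ideal power        : %.3g W\n", ideal_watts);    /* ~2.6e-9 W */
    printf("fraction wasted    : %.12f\n", wasted);  /* ~0.99999999997    */
    return 0;
}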

Useful work, that is. One could call a computer a kind of refrigerator: you feed in random bits in the memory (maximum entropy), and you get the output out at zero entropy (the result of the calculation).

Actually no, some of it goes into reducing the entropy of the system.

Reply to
unruh

But surely that's only temporary. Does entropy continue to be reduced indefinitely for as long as a computer is being run? Or does it reach some sort of "average" level, and from then on you're just transitioning between different configurations of similar entropy?

--
Grant Edwards               grant.b.edwards        Yow! You can't hurt me!! 
                                  at               I have an ASSUMABLE 
                              gmail.com            MORTGAGE!!
Reply to
Grant Edwards

As I said, in theory a computer can be run reversibly -- i.e., it need spend no energy. Unfortunately this occurs only for a perfectly isolated computer, and infinitely slowly. Computers we have now use irreversible operations like mad to stabilise the memories and the operations, and to allow them to operate quickly. My estimate was for 1 kT per bit operation, which is about the minimum we could imagine for irreversible computation. But real computers are more than a billion times worse than that, if my calculation is right.
Reply to
unruh

I see where the conversation is going. What I meant was relative to an ARM chip.

If rebuilt using a digital equivalent of a quantum computer, which requires about 5% extra silicon, then computing power would increase 10x to 100x for that 5% extra consumption. So the power saving would be astronomical, or equivalently the throughput would be much greater, but since a lot of software would have to be rewritten, the trade-off in benefit is not easy to work out until the software world catches up.

Regardless of Intel or ARM, the throughput is limited by the page cycle times of DDR, which haven't improved since the 80's and are stuck at about 10MHz. The ARM wins because its quiescent drain is much lower, despite both CPUs having the same DDR limit. Digital versions of quantum computers would get over that, but normal software can't work on such computers.

Reply to
7

Of course that's what you meant. :) And that sounds about right to me.

However, the laws of Usenet require that even slightly vague statements be interpreted in the manner that provides the most fuel for a good debate (and the more esoteric the resulting discussion, the better).

--
Grant Edwards               grant.b.edwards        Yow! It's OKAY -- I'm an 
                                  at               INTELLECTUAL, too. 
                              gmail.com
Reply to
Grant Edwards

Four repeats of the same response were probably a bit excessive, despite your annoyance. What I was saying was that, relative to the "theoretical" ideal, there is still HUGE room for improvement in the efficiency of the CPUs. While some of that slack would certainly be lost to practicalities, that still leaves a lot of room for improvement.

Then I am not sure what you mean. Are you saying that a typical ARM chip is 10 times as efficient as an Intel chip? What does your "90%" mean?

No, that would be the job of the compiler, not the software.

Not sure either what you mean by "digital versions of quantum computers".

Reply to
unruh

Blame my service provider / ignore it.

90% of their revenue goes to servicing debts and 10% to customer service. Between 6 and 9pm there is no bandwidth to speak of so it all keeps retrying.

Afraid not old chap. The whole structure of writing programs is different. There might be a way to do it with a compiler that some clever boffin could work out. I used to think something like VHDL could not exist because hardware and software were so different. Shows how wrong one can get. :)

Analog computers are different from digital computers. Digital versions of analogue computers can be made.

Digital computers are different from quantum computers. Digital versions of quantum computers can be made.

Reply to
7

A quantum computer is not an analog computer. It uses bits to represent numbers just like any other digital computer does. That those bits can be in a superposition of states, or that they can be entangled with other bits, does not make it analog. It is certainly true that a quantum computer working on a problem which can be sped up by quantum computation behaves differently from a classical computer, but just as for classical digital computers, a quantum computer has a few operations which are universal. I.e., any quantum computation can be implemented by repeated application of that minimal set of operations.

The key feature as far as energy efficiency is concerned is that quantum operations must in general be reversible, which means that in theory they require no energy to run. That is of course not true in practice, but they need very, very little energy to run.
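
A tiny classical sketch of what reversibility buys you (ordinary bits standing in for the idea; this is not quantum code): a controlled-NOT is its own inverse, so it erases no information and, per the argument above, in principle need dissipate no energy.

/* Sketch: a classical controlled-NOT is its own inverse, so it never
 * erases information -- the property that lets reversible (and quantum)
 * logic, in principle, dissipate arbitrarily little energy. */
#include <assert.h>
#include <stdio.h>

/* Flip the target bit when the control bit is 1. */
static void cnot(int control, int *target)
{
    *target ^= control;
}

int main(void)
{
    for (int c = 0; c <= 1; c++) {
        for (int t0 = 0; t0 <= 1; t0++) {
            int t = t0;
            cnot(c, &t);        /* forward step                  */
            cnot(c, &t);        /* applying it again undoes it   */
            assert(t == t0);    /* every input is recoverable    */
            printf("control=%d target=%d -> recovered\n", c, t0);
        }
    }
    return 0;
}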

Note that attempting to simulate quantum operations on a classical digital computer might be what you refer to, but that would be a completely stupid thing to do, since the memory and time requirements would be far greater than for any classical computer solving the same problem -- it would offer no advantage at all.

Reply to
unruh

Far less so than current CPUs. As you say, quantum computers do not exist. I am discussing theory, not experiment.

It would be nice. On the other hand, the way you make qubits last a long time at room temperature is to make sure that they do not interact with anything. Since reading and writing IS interacting with something, you preferably never write to or read from these bits. On the other hand, as I mentioned, current computers are horrible energy hogs, so even a little bit could give huge savings.

See above.

The problem is that quantum computers really are not better than ordinary computers except on a small subset of tasks. One might be simulation of quantum systems (eg atoms getting together to form molecules), or perhaps some decision processes ( finding the most efficient or lowest cost process amongst an exponentially large number of possibilities), both of which would be useful. On the other hand, for doing your taxes or writing your letters, they are liable to be pretty useless.

It is a HUGE industry. Pharmaceuticals, for example. And they are horribly slow, and can only simulate a very small number of components.

Reply to
unruh

The last paragraph is a collection of sweeping statements :)

Not particularly true, as what you are referring to is a simulation of a quantum computer. I'm talking about direct behavioural implementations of quantum behaviour in the logic gates, and ways of making use of that to solve practical problems. A halfway house between a real quantum computer and a simulation of a quantum computer. The product of that is computing devices with 10x to 100x performance enhancement over a digital computer in exchange for 5% extra silicon.

Reply to
7
