They wrap the gate around 3 sides of a fin-shaped channel...
"While not the first manufacturing technique to produce fully depleted transistors [...] it is the cheapest [...]"
-- Cheers, James Arthur
FinFETs have been around for a while now, a decade at least.
Intel has always had a superb silicon process and crappy CPU architectures, and I bet ARM is terrifying them. Moore's Law is going to hit atomic limits soon, and x86 will be in trouble.
Why they dumped their ARM products, I can't imagine.
John
Yep, a decade, roughly, but they've never been in production. The news releases yesterday said Intel is going exclusively to finFETs, for everything! That's amazing.
I couldn't find any really good technical articles, but the pop-stuff said the FETs are fully depleted, which would really be somethin'.
I dunno either, haven't kept up with that. I do love the bumper crop of tiny, low-power CPUs though.
James
Maybe they were costing them an ARM and a leg...
Depleting the channel will reduce leakage currents greatly. Congratulations on the low power solution. Previously, substrate bias was used to reduce leakages in active circuits and even more extreme bias voltages were applied to circuits in standby. With 3D, the channels are above the substrate!
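To put some textbook math behind the substrate-bias trick (my sketch, not from the Intel announcement): subthreshold leakage falls exponentially with threshold voltage,

    I_{sub} \propto e^{(V_{GS} - V_{th}) / (n V_T)}

and a reverse body bias V_{BS} < 0 raises V_{th} through the body effect,

    V_{th} = V_{th0} + \gamma (\sqrt{2\phi_F - V_{BS}} - \sqrt{2\phi_F})

A fully depleted channel attacks the same leakage by pushing the subthreshold factor n toward its ideal value of 1, with no extra bias rails needed.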
You are an idiot.
This is why you are an idiot. If they have such 'crappy' CPU architectures, why would one they sold be such a threat?
Could it be that you have no clue about what is or is not a good architecture, and that your word salad is nothing more than an insult to one of the greatest chip houses ever to exist?
So, you have a fetish as well, John. By your own criteria.
I told you one day what your posts are, and you claimed I had a fetish as a result of the description I gave them. You take 'pathetic punk' to an all-new low, John.
Have a nice life, you pathetic punk.
It was before the gadgetry wave.
That's OK. They already have low-power offerings that were meant to compete in that market, and the new process will make that series even more attractive.
You think x86 is a decent CPU architecture? It's a lineal descendant of the 8008, which was itself barbaric the day it was born.
It's no accident that most mobile devices use ARM these days. And that they have huge battery runtimes.
John
You're always fighting Laplace's equation when you violate scaling (as you have to nowadays). The problem is getting enough E field in the channel to deplete it. This used to be easy, because you could always make the gate dielectric thin enough, but not anymore, because of tunnelling through the gate oxide.
Using high-k gate dielectric helps a lot, because less of the gate voltage gets dropped across the insulation before it has a chance to do anything useful. That helps the depletion and also helps reduce the tunnel current.
FinFETs are another approach, where you wrap the gate around three sides of the channel. Laplace's equation is a lot more friendly in that geometry. Processing them has been a nightmare until recently, though. Bravo to Intel for getting it figured out.
Cheers
Phil Hobbs
-- Dr Philip C D Hobbs Principal
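To hang a number on Phil's high-k point (my arithmetic, not his): gate capacitance per unit area is

    C_{ox} = \kappa \epsilon_0 / t_{ox}

so a high-k film acts like SiO2 of equivalent oxide thickness

    EOT = t_{hi\text{-}k} (\kappa_{SiO_2} / \kappa_{hi\text{-}k}) \approx t_{hi\text{-}k} (3.9 / 25)

for hafnia. Roughly 6 nm of physical film gives the channel control of about 1 nm of SiO2, while being thick enough to choke off the tunnelling current.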
Intel--simple, crude and fast. Not a bad way to get in first, but it's a lot of baggage to carry later.
(Am I the only one who can't help noticing "Sumatra" anagrams into "traumas"?)
James
It was simple once, but it's not any more. It's a register-poor, register-quirky CISC instruction set. In order to get speed, they have to pipeline it heavily, scoreboard registers, and execute instructions out of order. So a modern x86 CPU is actually doing a very complex emulation of the barbaric x86 instruction set. That takes a lot of silicon and a lot of power.
ARM has lots of general-use registers and can execute an instruction per clock, without melting the silicon. Since the GHz race is pretty much over, the future is low power and multicore. ARM will win that game.
You can buy ARM chips for under a dollar. $8 or so gets you a full 32-bit chip with Ethernet, seven UARTs, SPI, timers, ADC, SRAM, DRAM controller, flash controller, 250 MHz core, and SIMD vector floating point.

John
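A toy illustration of the register-pressure point (a hedged sketch; dot8 is a made-up example and actual compiler behavior varies):

    /* Many independent intermediates alive at once.  32-bit x86
       exposes only 8 general registers, several with fixed roles,
       so with enough values in flight the compiler must spill to
       the stack; ARM's 16 general registers give it far more room
       before that happens. */
    int dot8(const int *a, const int *b)
    {
        int p0 = a[0]*b[0], p1 = a[1]*b[1], p2 = a[2]*b[2], p3 = a[3]*b[3];
        int p4 = a[4]*b[4], p5 = a[5]*b[5], p6 = a[6]*b[6], p7 = a[7]*b[7];
        return ((p0 + p1) + (p2 + p3)) + ((p4 + p5) + (p6 + p7));
    }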
No. Complex, advanced, and fast.
A turkey like you wouldn't even know how, why, or even when they 'got in', much less whether it was 'first' or not. What? You think they did not have any mil roots?
Like the presumptuous character idiots like you form in your old age?
My point as stated above, exactly. It is an observation that extends beyond stupidity, though it gets beaten by your telling of it here.
If their stuff is so bad, why are folks shelling out well north of $1000 each for their 'crappy' (as Johnny 'fetish boy' Larkin called it) work?
Fewer and fewer people are dumb enough to pay $1000 for a power-hog x86 chip. Apple is already using custom ARMs in most of their products, and rumor is that their desktop computers are next. ARM and Linux own the cell phone and tablet business. Server farms will migrate to ARM to save power.
Linux is open source and free, and ARM is almost the same: there is a license fee, but you get to put the architecture on your own chips, with your own graphics and peripherals, as Apple and everybody else is doing.
x86 architecture is almost 40 years old now. Windows is pushing 30. People are losing interest in paying kilobucks for obsolete captive technologies.
It's time for change!
John
Yes, that's the "lot of baggage to carry later."
Imagine all that done in finFETs...
I wonder why Intel dropped the ARM thing. Probably marketing. Getting people off the Intel instruction set would've been riscy.
James
Intel really needs that new transistor to compete with e.g. ARM Cortex-A9:
Intel Atom vs ARM:
Is ARM what I want?:
Another technology that has spread through almost all of Apple's products - transparent use of both the processor and the graphics chip:
Grand Central Dispatch:
"Grand Central Dispatch still uses threads at the low level but abstracts them away from the programmer, who will not need to be concerned with as many details. ... Examples ..."
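A minimal sketch of what that looks like from C (the dispatch calls are the real GCD API; the loop and printf are just illustration):

    #include <dispatch/dispatch.h>
    #include <stdio.h>

    int main(void)
    {
        /* A system-managed concurrent queue; GCD decides how many
           threads actually back it. */
        dispatch_queue_t q =
            dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_group_t group = dispatch_group_create();

        for (int i = 0; i < 4; i++) {
            /* Each block captures its own copy of i by value. */
            dispatch_group_async(group, q, ^{
                printf("block %d on a GCD-managed thread\n", i);
            });
        }

        /* Wait for all four blocks; no thread creation, joining,
           or pool sizing in sight. */
        dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
        dispatch_release(group);
        return 0;
    }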
Used technologies:
LLVM:
The LLVM Compiler Infrastructure
LLVM presentation at Google given by Chris Lattner:
August 31, 2009 Mac OS X 10.6 Snow Leopard: the Ars Technica review:
June 20th, 2008 Apple's other open secret: the LLVM Compiler:
LLVM is one of the best pieces of open-source software available; check it out!:
LLVM: A Compilation Framework for Lifelong Program Analysis & Transformation
Clang:
Status history:
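(If you want to poke at LLVM yourself, one quick experiment, assuming clang is installed: "clang -S -emit-llvm hello.c -o hello.ll" dumps the human-readable LLVM IR for hello.c.)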
Hey, what is the CPU and the OS of the computer on which you are typing this?
:)))) Apple is x86 for the most part.
Everything has its price. Nobody gives up good stuff for free.
If anybody could offer significant advantages in cost and/or performance, it would certainly be welcomed by the market. As long as there is not much of a difference, it doesn't matter what is under the hood.
I have yet to see an ARM as capable as a 10-year-old x86. I have yet to see a Linux as usable as Windows 98.
They've been preaching it for the last 20 years, haven't they?
VLV
Do you remember the i860 and i960 RISC processors that Intel made back in the 1990s? Those were good processors, and Intel dropped them. I recall hearing at the time that the reason was that the CISC processor group at Intel didn't want the competition.
I'll bet it's the same thing with ARM.
It'll be interesting to see just how big a bite this'll take out of ARM's arm. Somehow I don't think this is going to be an "ARM killer" -- it'll still be a crappy processor, and the ARM will still be a nice one. All it'll take is for some independent foundry or group thereof to figure out a lower-power process, and Intel processors will be out the door again.
-- Tim Wescott Wescott Design Services
Intel will win because their power consumption will drop so low that they'll be able to beat them on speed and function, hands down.