Intel-Altera, again

What *are* you talking about? SAS is not a CPU.

Bullshit. 100W CPUs are not unknown.

Reply to
krw

By the brazillions. There is nothing really wrong with the 8051.

IoT. ;-)

Reply to
krw

The killer, actually useful app would log pulse rate and blood pressure all day. Too bad there's no decent non-intrusive BP sensor. It would have to be implanted, I guess.

--

John Larkin         Highland Technology, Inc 
picosecond timing   laser drivers and controllers 

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

There is, as part of the ADC system, but we didn't instrument it; I'll have that done in the next release, which won't be for a couple of months.

The ZYNQ has a programmable temperature shutdown, which we didn't program. It apparently defaults to 125C, which I think was shutting down our box at 70C ambient. We were so concerned with getting the waveform stuff to work (to work *reliably*) that we didn't worry about secondary stuff like that.

I should have included a box temperature sensor, an LM71 probably, too. I should always do that.
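For anyone poking at the same thing, here's a minimal sketch of the XADC die-temperature conversion, assuming the raw 12-bit code has already been read out of the Zynq's XADC (the 503.975/4096 transfer function is the one documented in Xilinx UG480; the register-access and threshold-programming code is deliberately not shown, since that depends on the driver in use):

```c
/* Illustrative sketch only: convert a raw 12-bit Zynq XADC temperature
 * code to degrees Celsius using the UG480 transfer function
 * Temp(C) = code * 503.975 / 4096 - 273.15.
 * How the raw code is obtained (and how the over-temperature shutdown
 * threshold is programmed) is driver-specific and not shown here. */
#include <stdio.h>

static double xadc_code_to_celsius(unsigned code)
{
    return ((double)(code & 0x0FFFu) * 503.975 / 4096.0) - 273.15;
}

int main(void)
{
    unsigned raw = 0xA5E;   /* hypothetical readback value, not a real measurement */
    printf("die temperature: %.1f C\n", xadc_code_to_celsius(raw));
    return 0;
}
```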

--

John Larkin         Highland Technology, Inc 
picosecond timing   laser drivers and controllers 

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

That acronym is close to idIoT.

I see people crowdfunding wifi light bulbs, and networked water bottles that light up to remind you to hydrate. For people who don't have light switches, or can't tell when they are thirsty.

--

John Larkin         Highland Technology, Inc 
picosecond timing   laser drivers and controllers 

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

We are talking about ways to cut power usage; otherwise we can't add more boxes without upgrading the electrical circuit. The six SAS hard drives (20 W to 30 W each) and four cooling fans (15 W to 20 W each) are major power users as well.

Unless they already have the 50W L5430 Xeon, which is the last low-power CPU upgrade for Socket J (771 pads/balls).
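As a rough illustration of that arithmetic, here is a throwaway sketch using only the per-unit power ranges quoted above (the figures are the poster's estimates, not measurements):

```c
/* Back-of-the-envelope power budget for the box described above:
 * six SAS drives at 20-30 W each plus four fans at 15-20 W each. */
#include <stdio.h>

int main(void)
{
    int drives = 6, fans = 4;
    int drive_lo = 20, drive_hi = 30;   /* W per SAS drive */
    int fan_lo   = 15, fan_hi   = 20;   /* W per fan       */

    printf("drives: %d-%d W\n", drives * drive_lo, drives * drive_hi);   /* 120-180 W */
    printf("fans:   %d-%d W\n", fans * fan_lo, fans * fan_hi);           /*  60-80 W  */
    printf("total:  %d-%d W\n",
           drives * drive_lo + fans * fan_lo,
           drives * drive_hi + fans * fan_hi);                           /* 180-260 W */
    return 0;
}
```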

Reply to
edward.ming.lee

There is an actual proof that recursion and iteration are equivalent.

--
Les Cargill
Reply to
Les Cargill

But x86 was where the money was, so they cadged it into a heck of a system. My dev PC at work is Intel, 3.0GHz, 8 cores, a lot of RAM and It Just Works.

For their day, they were fine. I dunno from Itanic, but the 3000 was a world beater at transaction processing. What changed was cost - a PC was an order of magnitude or two cheaper.

It made for a lot of software packages that could not have existed in a world of minicomputers.

There was nothing keeping say the Amiga from taking over the world except they did not do the right things to make that happen.

A PC clone around 1990 was only three months' mortgage. An Amiga was six or more.

Eventually. But I view this as a defensive acquisition. My guess is that Intel feels like they and Altera are both facing declining demand and this is designed to preserve the technologies.

Yes, there are articles about "FPGA in the data center" but I remain skeptical - FPGA development doesn't scale because the tools are disgusting.

FPGA for what? You need big graphics cards to do supercomputer stuff; you don't need big CAM any more... signals processing?

I love ARM but it's a race to the bottom.

90% of all M&A activity destroys shareholder value.
--
Les Cargill
Reply to
Les Cargill

In huge quantity.

--
Les Cargill
Reply to
Les Cargill

I think that only applies if the recursion is known to terminate at N levels, for some fixed N.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Irrelevant. If the recursion doesn't terminate at some fixed level N, then you can't implement it on a real machine with stack depth K*N.

--

Rick
Reply to
rickman

Sure, but a minority of the population needs 24 GHz of CPU power in a multi-kilobuck box. Most people want a smartphone or a tablet that accesses Facebook. Intel can't survive selling to EEs. Gamers need a lot of compute power, but they get it from GPUs.

The only obvious large-volume market for ever-bigger iron is in data centers. People are starting to enhance them with FPGAs, hence the Altera acquisition. But Google can afford to spin their own silicon, like maybe a 64-core ARM with a lot of special hardware to help out. Or some clever fabless outfit could make a deal with Samsung or somebody.

Think about how many former semiconductor giants are now faded memories. GE. RCA. Sylvania. Transitron. Raytheon. Signetics.

Exactly. Computing keeps getting cheaper, by a factor of 1e8 since Intel was founded. It's just not worth much any more.

A beautiful color laptop with half a terabyte of hard drive is now a week's mortgage, if that.

For an app like a data center, the developer will get armies of support to make the tools work.

Ever cheaper computing! I like it. Intel probably doesn't.

Yup. Culture clash.

--

John Larkin         Highland Technology, Inc 
picosecond timing   laser drivers and controllers 

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

The rule is: one recursion level == one loop nesting level. The result, IIRC, comes from the 1960s, from the field of the so-called while-programs, and it is taught in the first year of CS undergraduate studies, among many similarly fundamental yet boring facts. So the discussion of whether or not recursion is needed is pointless. It is certainly useful and natural, especially at the algorithm design level, and especially when the problem is a bit harder than checking the state of a matrix keyboard...
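A toy C sketch of the transformation that equivalence rests on: the implicit call stack becomes an explicit stack plus a loop (the function names and the fixed 64-entry stack here are purely illustrative):

```c
/* Illustrative only: the same computation written recursively and
 * iteratively.  The iterative version replaces the call stack with an
 * explicit stack, which is the standard mechanical transformation. */
#include <stdio.h>

/* recursive: sum of 1..n expressed as n + sum(n-1) */
static long sum_recursive(int n)
{
    return (n <= 0) ? 0 : n + sum_recursive(n - 1);
}

/* iterative: the pending "n + ..." work is kept on an explicit stack */
static long sum_iterative(int n)
{
    int stack[64];                     /* enough for the demo values   */
    int top = 0;
    long acc = 0;

    while (n > 0)                      /* "descend": push pending work */
        stack[top++] = n--;
    while (top > 0)                    /* "return": pop and accumulate */
        acc += stack[--top];
    return acc;
}

int main(void)
{
    printf("%ld %ld\n", sum_recursive(10), sum_iterative(10));  /* 55 55 */
    return 0;
}
```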

Best regards, Piotr

Reply to
Piotr Wyderski

That may be, but not in our lifetimes. Intel will continue selling x86 chips as their top money maker for at least a decade. Meanwhile they will be branching out into other areas.

Here is a statistic I found that is a bit surprising actually.

"Intel has maintained the No. 1 market share position for the 23rd consecutive year, capturing 15.0% of the 2014 semiconductor market, down slightly from its peak of 16.5% in 2011, Gartner said"

formatting link

I am amazed that one company has 15% share of the world wide semiconductor market! Samsung is gaining ground with 10%. Everyone else is around 5% or less.

Really? You see FPGAs in a decline? I'm finding forecasts of 8% CAGR. That includes any effect from declining ASR.

You must be using Xilinx tools.

Lol. So everything is crap and no one can make money on it?

And 78.37% of all statistics are made up.

--

Rick
Reply to
rickman

Without 100M of cache, 100G of ECC SDRAM, and RAID SAS drives, 64 cores won't help much. Not many fabs can handle 1 billion transistors. Architecture design can't work without fab capacity.

Reply to
edward.ming.lee

I don't usually recall what the constraints on this are, because those constraints are generally present in an implementation anyway. But yeah - it's not quite as simple as I said.

I think the magic is that it must be "tail recursive"

formatting link
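A minimal C illustration of that tail-call case (purely a sketch): when the recursive call is the last thing the function does, there is no pending work to keep on a stack, and the call collapses into a plain loop - the loop form below is effectively what a tail-call-optimizing compiler produces:

```c
/* Sketch of the tail-call case: factorial with an accumulator,
 * written tail-recursively and as the equivalent loop. */
#include <stdio.h>

/* tail-recursive: the recursive call is in tail position */
static unsigned long fact_tail(unsigned n, unsigned long acc)
{
    return (n <= 1) ? acc : fact_tail(n - 1, acc * n);
}

/* the equivalent loop: the accumulator pattern made explicit */
static unsigned long fact_loop(unsigned n)
{
    unsigned long acc = 1;
    for (; n > 1; n--)
        acc *= n;
    return acc;
}

int main(void)
{
    printf("%lu %lu\n", fact_tail(10, 1), fact_loop(10));  /* 3628800 3628800 */
    return 0;
}
```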

--
Les Cargill
Reply to
Les Cargill

+1. Badda bing.
--
Les Cargill
Reply to
Les Cargill

Hasn't that pretty much already happened? Ah, Intel hit a peak last year, didn't they?

I suspect the death of the desktop is greatly exaggerated.

Intel can't survive selling to EEs. Gamers need a lot of compute power, but they get it from GPUs.

Which, I have to say, makes no sense to me at all :) The Internet knows not why :)

*If* you had some sort of MATLAB*->FPGA technology it might make sense, but I will not hold my breath. I can translate .m files to 'C', but my experience is that this is unusual - never mind trying to emit Verilog from MATLAB. *or... what?

People be writing FPGA for loading on data center computers? There's an entire discipline of just keeping web pages up.

If this is for hedge fund or EFT use, I can't imagine them tolerating the shitty tools. Besides, they use GPUs for all that.

That isn't a Google sort of thing, though. Everybody thinks Google is a tech company. Not so much.

Maybe. What's more likely is a further proliferation of ARM.

The x86 is like a British Enfield .303 rifle; the ARM is an AK47....

The ones left moved on to more rent-seeky things. Intel was just the 400-pound gorilla for going on 40 years now.

The good thing about companies is that they end. I don't for one minute believe x86 will *END* end but it might just go static.

I will believe that when I see it. No FPGA maker has even identified this as a weakness so far.

My read is that they simply concentrated on lowering the run rate of their software efforts. I cannot imagine what it would be that would counterbalance what I perceive to be horrendous support costs.

'Course, there's always the [ name elided ] business model, in which support is a revenue generator. Pimpin' ain't easy, yo....

They've had years to get in front of it. I love the little buggers myself.

Probably all that IT technical debt exploding in their face.

%^&#ing programmers ( sez the programmer).

--
Les Cargill
Reply to
Les Cargill

Unfortunately, they don't always put GPUs in server boxes, not even Dell.

Google would just go next door (Intel) rather than to the next country (Samsung).

Reply to
edward.ming.lee

Wintel brought home computers to big-box stores.

Seems optimistic. It's either premised on this "FPGA in the datacenter" thing or on smartphones. But yeah, somebody sure said that.

I just figure the smartphone boom is mostly over.

I haven't done any of them for ten years and did precious little even then. Compared to a 'C' compiler plus IDE, all the tools are horrendous.

Heh. To an extent, yes.

We're kind of waiting on the Next Big Thing.

formatting link

--
Les Cargill
Reply to
Les Cargill
