Re: Overview Of New Intel Core i7 (Nehalem) Processor

>Time for ARM in desktop PCs maybe?

> > ARM and ADA.

Ada was a useful advance, but a lot's happened in the last 30 years. Someone needs to drag Haskell out of the lab and cross-breed it with Eiffel and Erlang.

Reply to
Nobody

formatting link

Terrifying.

John

Reply to
John Larkin

Would you care to elaborate?

Hmm; anyone here familiar with BlueSpec?

Reply to
Nobody

It's another language that's an intellectual puzzle for people who enjoy puzzles. As a toy for people who like this sort of thing, it's fine. As a way for real programmers to write safe, clear, maintainable code, it's a nightmare.

This...

factorial n = (foldl (.) id [\x -> x*k | k <- [1..n]]) 1

Reply to
John Larkin

Not by a long shot.

Which is why no-one would actually write it that way (well, unless they're making a mathematical argument which relies upon formulating it that way).

Of the various implementations given on that page, the most likely to be used are either:

factorial 0 = 1
factorial n | n > 0 = n * factorial (n-1)

or:

factorial = product . enumFromTo 1

If you can't grok partial application or the composition operator, you might rewrite the latter as:

factorial n = product (enumFromTo 1 n)

or:

factorial n = product [1..n]

This is about as close to a "natural" definition of the factorial function as you can get. It's also noticeably more concise than the version you started with (which you would only use for a mathematical argument, or as a contrived example to make the language look bad).
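For what it's worth, a quick sanity check in GHCi (assuming one of the definitions above has been loaded):

Prelude> factorial 5
120
Prelude> product [1..5]
120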

Not only is the imperative version far more verbose than is remotely reasonable, but computing a mathematical function as a sequence of state transformations is the kind of thing that only makes sense to someone who cannot get beyond the imperative mindset.

The only reason to prefer the imperative version is if that is all that you understand and all that you are willing to understand.

Reply to
Nobody

Well, there we have it...Nobody thinks Haskell is superior. ;)

Cheers

Phil Hobbs

PS: Some of us like the imperative model because we think computers should bloody well do as they're told.

--
Dr Philip C D Hobbs
Principal
ElectroOptical Innovations
55 Orchard Rd
Briarcliff Manor NY 10510
845-480-2058
hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

One of the great formative and traumatic events of my youth was the moment I realized that most programmers aren't interested in producing usable solutions to my problems; they mostly want to play mind games with computers, and all they want from me is to pay them while they do it.

It only cost me six wasted man-months to learn that, so I guess I got off easy.

What sort of electronics do you design?

John

Reply to
John Larkin

This is probably the fundamental difference between the imperative and declarative models. The declarative model tells the computer what to do; the imperative model tells the computer how to do it, often in far too much detail.

That detail can actually get in the way, e.g. inhibiting parallelisation. If you specify a sequence of actions to be performed in a specific order, the computer cannot always determine whether they actually need to be performed in that order or whether the order is just an arbitrary choice.
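To make that concrete, here is a rough Haskell sketch (the function name is just illustrative; it needs the parallel package, a GHC built with -threaded, and a runtime started with +RTS -N). A map carries no prescribed evaluation order, so switching from map to parMap lets the runtime spread the elements across cores without changing the specification of the result:

import Control.Parallel.Strategies (parMap, rdeepseq)

-- Each element is computed independently of the others, so nothing in the
-- source pins down an evaluation order; the runtime may use several cores.
sumOfSquares :: [Double] -> Double
sumOfSquares xs = sum (parMap rdeepseq (\x -> x * x) xs)

The sequential version is identical apart from using map instead of parMap; only the evaluation strategy changes, not what is being computed.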

At least EEs have an excuse for using imperative languages. They actually make sense when you're interfacing with hardware, as most things really do need to occur in a specific sequence.

Reply to
Nobody

For you, maybe. Computers are stupid.

I write highly efficient clusterized, optimizing electromagnetic simulators. Almost a factor of 2 faster than Berkeley Tempest on the same hardware, plus cluster support and N-parameter optimization on arbitrary user-specified criteria. I've been writing multithreaded and parallel code since OS/2 2.0 came out in 1992.

I can't tell you how relieved I am that you think so.

So Nobody thinks that declarative languages are better. Who are you really, and what do you do? What large scale programs or electronic designs or anything have you shipped?

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal
ElectroOptical Innovations
55 Orchard Rd
Briarcliff Manor NY 10510
845-480-2058
hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

Of course not. I'm smarter than the computer. Computers should bloody well do as they're told.

Excuse? We need an excuse for building stuff that works and is safe?

What do you build?

John

Reply to
John Larkin

Seems reasonable enough ;)

By which I mean that it seems reasonable enough that a programmer might *want* such a deal; actually expecting to get it isn't so reasonable, though.

Unfortunately, there's a bit of a balancing act with getting useful work out of programmers. Programmers who really enjoy working with computers can be extremely productive and knowledgeable, but are proportionally harder to keep on task and may be problematic in other areas.

I'm a computer programmer, although I've recently started getting into PICs, which includes building things to connect to them.

When I went to university, I initially studied Electronic Engineering, but quickly switched to computers. I think that was the right choice; the design side of electronics is interesting, but I find the construction part to be more work than fun.

Reply to
Nobody

"Clusterized" isn't a word. Maybe you meant "clustered"? Verbing weirds language, and I get enough of that made-up language at work.

Reply to
AZ Nomad

"Verbing weirds language." (I read Calvin and Hobbes too.)

The program isn't clustered; it runs on a cluster. '-ize' is a common suffix added to make a verb from a noun, as in "anodized aluminum", "specialized job", and "normalized coordinates".

Verbing, on the other hand, means to use a noun unmodified as a verb. The term itself is an interesting example of the breed--it's a self-referential grammatical atrocity. In the above examples, the verbed versions would be "anoded aluminum", "specialed job", and "normaled coordinates".

"Clusterized" isn't an example of verbing. Nice try though.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal
ElectroOptical Innovations
55 Orchard Rd
Briarcliff Manor NY 10510
845-480-2058
hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

I was involved with - actually helped start - a company that made tomographic atom probes, megabuck instruments that rip apart samples and plot the 3d location and isotopic composition of every atom. The software challenges are serious. So on one visit, when I heard their programmers raving about Java and stuff, and not discussing the physics, alarms went off. $27 million later, it's almost over.

So the ideal programming language is Cobol, where the language is so uninteresting and so unchanging that the coders are forced to pay attention to the actual problem, because it's the only interesting thing around.

I like the hardware and find programming to be mostly tedious. Which is why I want to get it over as efficiently as possible, and not have to revisit it, so I can get back to the fun stuff.

And since this is sci.electronics.design, my attitude is reasonable.

John

Reply to
John Larkin

On Sat, 13 Jun 2009 13:54:05 -0700, John Larkin wrote:

I think you highlight something quite vital in embedded programming for any kind of instrumentation. The programmers need to be informed in a variety of areas, because they will be putting the pieces together and will be at the center of that effort. They should be skilled in mathematics and sensor physics (not expert, but good enough to read what they are given with understanding, sufficient to criticize it); skilled in numerical methods (physicists won't be, as a rule, and tend to assume things like perfect constants and perfect math -- whereas the reality of computing impinges and may require special care, for example when two floating-point accumulations of similar magnitude are differenced at the end, as in the one-pass standard deviation where N*sum(x^2) and (sum x)^2 are subtracted); and skilled in at least reading schematics well enough to suggest alternatives worth considering.

If the embedded programmer stands on some artificial "ceremony" about being skilled only in programming, and doesn't take it as a matter of personal responsibility to learn about and study the other aspects, then things fall through the cracks. The physicist may claim they had no idea that the PI in the code isn't exact, or that the calculations aren't infinitely precise, or that the mathematics they laid out for the instrument wasn't well suited to an algorithm given the precision required; and so on.

Part of the embedded programmer's job is to understand and bring all the pieces together, correctly and with intelligence and forethought. In instruments, this requires skill in mathematics, numerical methods, sensor physics, and electronics, plus an active interest in the work products of others, sufficient to raise questions early and to know enough about algorithm design to work independently, not ignorantly, when others hand over equations and methods.
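To make the cancellation example concrete, here is a rough Haskell sketch (the names are illustrative, and Float is used so the loss of precision shows up quickly). The one-pass form differences two large, nearly equal accumulations at the end; the two-pass form subtracts the mean first, so the accumulated terms stay small:

-- One-pass form: N*sum(x^2) - (sum x)^2 cancels catastrophically when the
-- readings are large compared with their spread.
naiveVariance :: [Float] -> Float
naiveVariance xs = (n * sumSq - s * s) / (n * (n - 1))
  where
    n     = fromIntegral (length xs)
    s     = sum xs
    sumSq = sum (map (\x -> x * x) xs)

-- Two-pass form: subtract the mean first, then square and accumulate.
twoPassVariance :: [Float] -> Float
twoPassVariance xs = sum (map (\x -> (x - m) * (x - m)) xs) / (n - 1)
  where
    n = fromIntegral (length xs)
    m = sum xs / n

Feed both a list of readings like [100000.0, 100000.1 .. 100001.0] and the one-pass result is dominated by rounding error, while the two-pass result stays close to the true value.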

An instrumentation embedded programmer should be way past bickering over programming details and all about digging into the business of everyone else, learning, asking questions, thinking, and producing their own informed inputs early on.

Jon

Reply to
Jon Kirwan

But you shouldn't need to write parallel code. You should be able to specify the problem and let the compiler take care of the details.

This is starting to matter. Multiple cores are the norm even on desktop systems now, and the trend is towards even more cores. Forcing the programmer to partition the task manually is both fragile and costly.

Reply to
Nobody

Agreed, but look at Matlab: It's had "parallelizing" operators for years (decades?), and how often do you see people actually using them? Certainly not ubiquitously, in my experience. (And some of this is because, with CPU speeds today, even using nested FOR loops to fill out arrays is often "fast enough.") Whoever said that to truly exploit parallel architectures, the average man needs a different programming paradigm than the (generally) procedural languages we've had for ages is right.

Reply to
Joel Koltner

Is that you, DimBulb?

Reply to
krw

If it's too uninteresting, the coders may be paying more attention to the "situations vacant" pages. Well, other than the ones who take 5000 lines and a week to do what anyone else would do in 500 lines and a day.

If you're using Basic, you probably aren't getting it over as efficiently as possible.

If someone only ever does one digital electronics project, a dozen 74-series chips will cost less than a PIC programmer, and the design will take less time than learning asm. Doing a hundred designs that way suggests a lack of forethought.

Admittedly, the situation with programming is more complex, as the network effects are huge. A language's popularity translates into the quality and choice of tools, the ease of hiring skilled staff, the ability to obtain external advice, etc.

This can make a popular yet technically inferior language a better choice than a technically superior niche language. C isn't used for application programming because it's inherently good for that (it's a systems language), but because it's had a hundred times the investment of everything else.

A language which is both popular and technically superior would be better still, but that isn't an option so long as 99% of the industry revolves around continually tweaking 50-year-old technology.

Reply to
Nobody

Riiiighhhhttttt. Sure. Never mind now, Nobody, drink this and wait for the nice men in the white coats.

Cheers

Phil Hobbs

Reply to
Phil Hobbs
