Genetic FPGA

...

"Dr. Adrian Thompson is a researcher operating from the Department of Informatics at the University of Sussex, and his experimentation in the mid-1990s represented some of science?s first practical attempts to penetrate the virgin domain of hardware evolution. The concept is roughly analogous to Charles Darwin?s elegant principle of natural selection, which describes how individuals with the most advantageous traits are more likely to survive and reproduce. This process tends to preserve favorable characteristics by passing them to the survivors? descendants, while simultaneously suppressing the spread of less-useful traits.

Dr. Thompson dabbled with computer circuits in order to determine whether survival-of-the-fittest principles might provide hints for improved microchip designs. As a test bed, he procured a special type of chip called a Field-Programmable Gate Array (FPGA) whose internal logic can be completely rewritten, as opposed to the fixed design of normal chips. This flexibility results in a circuit whose operation is hot and slow compared to conventional counterparts, but it allows a single chip to become a modem, a voice-recognition unit, an audio processor, or just about any other computer component. All one must do is load the appropriate configuration." ... Dr. Thompson peered inside his perfect offspring to gain insight into its methods, but what he found inside was baffling. The plucky chip was utilizing only thirty-seven of its one hundred logic gates, and most of them were arranged in a curious collection of feedback loops. Five individual logic cells were functionally disconnected from the rest, with no pathways that would allow them to influence the output, yet when the researcher disabled any one of them the chip lost its ability to discriminate the tones. Furthermore, the final program did not work reliably when it was loaded onto other FPGAs of the same type.
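As a rough sketch of the evolutionary loop described above (not Thompson's actual setup): a population of random configuration bitstrings is scored, the fittest are kept, and mutated copies fill the next generation. The `score_on_hardware` function is a placeholder standing in for "load the configuration onto the FPGA and measure how well it separates the two tones"; here it just rewards matching an arbitrary bit pattern so the example runs.

```python
import random

CONFIG_BITS = 1800   # genome length; Thompson's experiments reportedly used ~1800 bits for a 10x10 cell region
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 1.0 / CONFIG_BITS

# Placeholder fitness: stands in for "program the FPGA with this bitstream and
# measure tone discrimination". Here it simply counts bits matching a fixed target.
TARGET = [random.randint(0, 1) for _ in range(CONFIG_BITS)]
def score_on_hardware(bitstream):
    return sum(b == t for b, t in zip(bitstream, TARGET))

def mutate(bitstream):
    return [b ^ (random.random() < MUTATION_RATE) for b in bitstream]

population = [[random.randint(0, 1) for _ in range(CONFIG_BITS)] for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    ranked = sorted(population, key=score_on_hardware, reverse=True)
    survivors = ranked[:POP_SIZE // 5]                     # keep the fittest fifth
    population = survivors + [mutate(random.choice(survivors))
                              for _ in range(POP_SIZE - len(survivors))]

best = max(population, key=score_on_hardware)
print("best score:", score_on_hardware(best), "of", CONFIG_BITS)
```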

--
Grizzly H.
Reply to
mixed nuts

Ahh, AFAICT natural selection is fairly chaotic; breeding... choosing what you want... is much faster. (I didn't read the whole post...)

George H.

Reply to
George Herold

I don't think you understand the idea here. The purpose is to remove the programmer and let the software develop on its own. Rules are set up to select variants that are closer to the desired behavior in each generation and eventually the process generates something that works, mostly.

What is interesting here is that each generation appears to have been tested on a real chip rather than in simulation. So the circuit depended on chip-specific properties, delays and, it sounds like, noise coupling, for its operation, which is why it didn't work properly on other chips. I think some guidance, such as providing a clock and running in simulation (with useful boundary conditions on what features are supported), might produce a design that was closer to being useful.
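A sketch of what that guided, simulation-based evaluation might look like. The genome format (a feed-forward list of 2-input LUT cells) and the XOR target are invented for illustration; because the model is a zero-delay functional simulation, it cannot reward designs that exploit the analog quirks of one particular die. The selection loop from the sketch further up would slot in around `fitness`; plain random search is used here only to keep it short.

```python
import random

N_INPUTS = 2
N_CELLS = 8

# Genome: one entry per cell, (lut_bits, src_a, src_b), where the sources index
# primary inputs or the outputs of earlier cells (feed-forward only, so the
# evolved netlist is guaranteed to be a plain combinational function).
def random_cell(position):
    n_sources = N_INPUTS + position
    return (random.randrange(16), random.randrange(n_sources), random.randrange(n_sources))

def random_genome():
    return [random_cell(i) for i in range(N_CELLS)]

def simulate(genome, inputs):
    """Idealized zero-delay evaluation: no timing, no noise, no chip-specific behavior."""
    signals = list(inputs)
    for lut, a, b in genome:
        idx = (signals[a] << 1) | signals[b]
        signals.append((lut >> idx) & 1)
    return signals[-1]            # the last cell drives the output

def fitness(genome):
    # Toy target: XOR of the two inputs, standing in for the real task.
    return sum(simulate(genome, (a, b)) == (a ^ b) for a in (0, 1) for b in (0, 1))

best = max((random_genome() for _ in range(5000)), key=fitness)
print("best simulated fitness:", fitness(best), "of 4")
```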

--

Rick
Reply to
rickman

That was in a Xilinx 6200 family part. Xilinx bought and then discontinued that line of parts in the 1990s. They were unique (at the time) in that the bitstream format was fully specified, allowing researchers to write tools that manipulated the configuration directly.

Regards, Allan

Reply to
Allan Herriman

Trouble is genetic algorithms tend to get 'trapped' in local maxima, so you end up with a chip which works ok but has a bad back.

Cheers

--
Syd
Reply to
Syd Rumpo

Or, of course, a "...terrible pain in all the diodes down my left side".

Cheers

--
Syd
Reply to
Syd Rumpo

Furthermore, the final program did not work reliably when it was loaded onto other FPGAs of the same type.

Yeah, wasn't the final problem with making any practical use of this that there was too much variation between the hardware of individual devices nominally of the same device type? So you could never know exactly what it would be doing with any specific device, or if it would work when ported to another.

The only way gates that aren't connected to anything could break functionality when they're disabled is if the algorithm is exploiting some nonideality in the silicon.

Genetic programming/algorithms are a practical technique, but they seem to only work well in software, where the physical layer is abstracted away. They could work for FPGAs too, but it would make more sense to use an abstract model of the device when "evolving" the hardware.
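One way to sketch that "abstract model" idea while still penalizing device dependence: score each candidate on several simulated devices whose per-cell delays are randomly perturbed, and keep only the worst-case score, so a design that leans on one chip's particular timing can't win. The genome format matches the earlier sketches; the first-order timing model and the numbers are invented for illustration.

```python
import random

N_INPUTS = 2

def evaluate_on_device(genome, inputs, cell_delays, deadline):
    """First-order timing model: a cell's output settles at max(arrival of its
    inputs) + that cell's delay. The result only counts if it settles before
    the sampling deadline (a crude stand-in for a clock period)."""
    values, arrivals = list(inputs), [0.0] * N_INPUTS
    for (lut, a, b), delay in zip(genome, cell_delays):
        values.append((lut >> ((values[a] << 1) | values[b])) & 1)
        arrivals.append(max(arrivals[a], arrivals[b]) + delay)
    return values[-1], arrivals[-1] <= deadline

def robust_fitness(genome, target, n_devices=10, deadline=6.0):
    """Worst-case score over several simulated 'devices', each with its own
    randomly perturbed per-cell delays; exploiting one device's quirks
    therefore doesn't pay."""
    worst = None
    for _ in range(n_devices):
        delays = [random.uniform(0.5, 1.5) for _ in genome]   # +/-50% cell-to-cell spread
        score = 0
        for a in (0, 1):
            for b in (0, 1):
                out, on_time = evaluate_on_device(genome, (a, b), delays, deadline)
                if on_time and out == target(a, b):
                    score += 1
        worst = score if worst is None else min(worst, score)
    return worst

# Example use: robust_fitness(some_genome, target=lambda a, b: a ^ b)
```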

Reply to
bitrex

Yikes, hairball async logic! Of course it wasn't repeatable.

Random mutation and selection is a terrible way to design software or logic. The numbers are not promising.

I suspect that nature doesn't use mutation+selection either; it's too inefficient.

There are periodic bursts of enthusiasm for automatic design. So far, none seem to work. Give it another few hundred years and we'll see.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

Wow, when I read your post I thought you were talking about natural selection. That runs in bursts too. lol

So what *does* nature use? Why does nature need to be "efficient" with natural selection? Lots of things in nature are not efficient. Plants store only about 2% of the energy in the sunlight that hits them.

--

Rick
Reply to
rickman

Mutation and selection aren't great ways to design full pieces of software or hardware, no.

What nature-inspired algorithms _are_ good for is developing solutions for particular problems, ones which by their nature (NP-hard/NP-complete) are intractable to direct computation, and where heuristic aids to solution are hard to come by.
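For instance, a bare-bones genetic algorithm for the 0/1 knapsack problem, a classic NP-hard problem where this kind of heuristic search earns its keep. The instance data and GA parameters below are made up; it is a sketch of the technique, not a tuned solver.

```python
import random

# A made-up 0/1 knapsack instance: (value, weight) pairs and a capacity.
ITEMS = [(random.randint(1, 30), random.randint(1, 20)) for _ in range(40)]
CAPACITY = 150
POP, GENS = 80, 200

def fitness(chrom):
    value = sum(v for (v, w), take in zip(ITEMS, chrom) if take)
    weight = sum(w for (v, w), take in zip(ITEMS, chrom) if take)
    return value if weight <= CAPACITY else 0      # overweight solutions score nothing

def tournament(pop):
    return max(random.sample(pop, 3), key=fitness)

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(chrom, rate=0.02):
    return [bit ^ (random.random() < rate) for bit in chrom]

population = [[random.randint(0, 1) for _ in ITEMS] for _ in range(POP)]
for _ in range(GENS):
    population = [mutate(crossover(tournament(population), tournament(population)))
                  for _ in range(POP)]

print("best value found:", fitness(max(population, key=fitness)))
```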

Reply to
bitrex

Yeah, saying that "nature wouldn't do anything that's inefficient" misunderstands the "nature" of natural selection.

Nature is full of inefficiencies. A trait doesn't have to be actively beneficial to be selected for in a particular evolutionary niche, simply not harmful.

Reply to
bitrex

Oh no don't start him on that again....

--

John Devereux
Reply to
John Devereux

People are defined by the things they refuse to think about.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

You mean like most of electronics? Yes, if a problem is "intractable" to direct computation it is a hard problem. Why worry about the easy problems?

--

Rick
Reply to
rickman

Asynchronous logic behavior depends on prop delays, which vary from chip to chip, and with time/temperature/Vcc. Synchronous logic will work exactly the same on any number of chips.

The numbers are discouraging. Randomly changing one bit in an FPGA config file, or a program binary, will almost always break it. Changing several bits is worse. The number of ways to break even a small design vastly exceeds the number of electrons in the universe. And every "evolutionary" experiment has to be tested, both for functionality and for bugs.
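The back-of-the-envelope arithmetic behind that claim, taking a deliberately small configuration size and the commonly quoted ~10^80 figure for electrons in the observable universe as rough assumptions:

```python
import math

config_bits = 10_000                         # tiny by the standards of a real FPGA bitstream
log10_configs = config_bits * math.log10(2)
print(f"2^{config_bits} possible configurations ~= 10^{log10_configs:.0f}")   # about 10^3010
print("versus roughly 10^80 electrons in the observable universe")
```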

"Intelligent design" works better than mutation and selection.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin

That isn't true. A properly designed asynchronous state machine will have *one* well-defined action associated with each combination of state and input change, just like a synchronous SM. Admittedly, problems may arise if several inputs are allowed to change 'simultaneously'.

SSMs can have their quirks too. The key point is that either has to be properly designed.
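As a toy illustration of "one well-defined action per combination of state and input", here is a flow-table model of a Muller C-element, a classic asynchronous element, together with a completeness check. The caveat about "simultaneous" input changes concerns real hardware, where near-coincident edges can race; the table itself is complete and deterministic.

```python
# Flow table for a Muller C-element: the output goes high only when both inputs
# are high, low only when both are low, and otherwise holds its previous value.
# The state *is* the current output value.
FLOW_TABLE = {
    (0, (0, 0)): 0, (0, (0, 1)): 0, (0, (1, 0)): 0, (0, (1, 1)): 1,
    (1, (0, 0)): 0, (1, (0, 1)): 1, (1, (1, 0)): 1, (1, (1, 1)): 1,
}

def is_well_defined(table, states=(0, 1), input_space=((0, 0), (0, 1), (1, 0), (1, 1))):
    """Exactly one action for every combination of state and inputs."""
    return all((s, i) in table for s in states for i in input_space)

def step(state, inputs):
    return FLOW_TABLE[(state, inputs)]

assert is_well_defined(FLOW_TABLE)
print(step(0, (1, 1)), step(1, (0, 1)), step(1, (0, 0)))   # prints: 1 1 0
```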

Jeroen Belleman

Reply to
Jeroen Belleman

Clearly a man out of his element. Asynchronous logic is designed to give repeatable results independent of the various logic delays. That is why it is harder to design and synchronous logic is used almost exclusively. There are companies working on async logic FPGAs which may provide large benefits once they tame the design issues by a combination of hardware and software targeted to the various problems.

Twiddling bits in a config file is not the only way to implement evolving hardware. Your argument is exactly the one used by people trying to refute natural selection.

Some would disagree.

--

Rick
Reply to
rickman

I don't think anything less than a true AI will really be any good at designing analog electronics, except where, as in the links you show, the algorithms are used for optimizing, i.e. finding an optimal solution in a large but in some sense still fairly restricted domain.

I'm not an expert at engineering, but a lot of it seems to be understanding the human factor. And just because a problem is intractable to direct algorithmic computation doesn't mean that there won't be "obvious" solutions apparent to a human immediately which would still take an algorithm a long time to grind out.

Machines aren't creative. They don't "understand" anything about humans. They don't "understand" anything about anything. They're not "really" good at playing chess.

All current search algorithms, including genetic algorithms, are essentially a variation on a theme: you can traverse a tree or you can roll dice, or some combination, and that's about it.

"Find the optimal antenna shape for an antenna of this size in this application"? Sure, it can do that.

"Design me an FM transmitter that will suit this client's fickle and changing requirements?" seems like asking an "AI" system "Hey, could you compose a piece of music that I'll like" or "Hey, could you tell me if this is a 'good' novel?"

It's probably far easier to have a human do it.

Reply to
bitrex

What I'm saying is that you could certainly write a program that spits out VHDL or Verilog files by way of genetic algorithms, and if the shit works in Quartus II or whatever, I see no reason it won't work on the hardware every time.
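A sketch of the "spit out HDL" half of that idea: render a feed-forward netlist genome (the same invented format as in the earlier sketches) into a small Verilog module as plain text. Scoring would then be whatever your toolchain reports; the `score` function is left as a stub rather than guessing at any particular tool's command line.

```python
def genome_to_verilog(genome, n_inputs=2, name="evolved"):
    """Render a list of (lut_bits, src_a, src_b) cells as a purely combinational
    Verilog module. Sources 0..n_inputs-1 are the primary inputs."""
    lines = [f"module {name}(input wire [{n_inputs - 1}:0] in, output wire out);"]
    for i, (lut, a, b) in enumerate(genome):
        sig = lambda s: f"in[{s}]" if s < n_inputs else f"c{s - n_inputs}"
        # Shift the 4-bit LUT constant by the concatenated select bits; the 1-bit
        # wire on the left keeps only the LSB, i.e. the selected LUT entry.
        lines.append(f"  wire c{i} = 4'b{lut:04b} >> {{{sig(a)}, {sig(b)}}};")
    lines.append(f"  assign out = c{len(genome) - 1};")
    lines.append("endmodule")
    return "\n".join(lines)

def score(verilog_text):
    """Stub: write verilog_text to a file and run it through your simulator or
    synthesis tool, returning a figure of merit. Deliberately not implemented."""
    raise NotImplementedError

if __name__ == "__main__":
    demo = [(0b0110, 0, 1), (0b1000, 1, 2)]   # cell 0 = XOR(in[0], in[1]), cell 1 = AND(in[1], c0)
    print(genome_to_verilog(demo))
```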

Reply to
bitrex

The "genetic fpga" noted here failed when unconnected gates were deleted, and couldn't be reproduced on another chip. Clearly the designers (technically, I suppose, an evolutionary design has no designers) didn't do it right.

The vast majority of async logic designs aren't done right. Even clock domain crossings in synchronous FPGA designs tend to be hazards.

--

John Larkin         Highland Technology, Inc 
picosecond timing   precision measurement  

jlarkin att highlandtechnology dott com 
http://www.highlandtechnology.com
Reply to
John Larkin
