
How much of computer programming now is, um, 'evolutionary'?

calebprime

I was reading The Third Culture, copyrighted 1995, the section by W. Daniel Hillis.

He describes a method of computer programming (pgs. 383-385) in which the computer evolves better and better routines by a kind of natural selection. The routines are random and ineffective at the beginning, but the best program is selected (by the computer) and, over many generations, an effective routine emerges. The code, he says, is often incomprehensible to a human programmer, but it's very effective. Millions of generations can be run in a very short time, so in a matter of hours, a usable program emerges, all without explicit human design. The example he uses is alphabetical sorting, but the code could do anything.
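
For anyone who hasn't seen one of these loops, here is a minimal sketch of the kind of thing Hillis describes: a population of random compare-and-swap routines is scored against test inputs, the fittest are kept, and mutated/recombined copies fill the next generation. Everything here (list length, population size, operators) is invented for illustration; it is not Hillis's actual setup.

[code]
import random

N = 6              # length of the lists the evolved routine must sort (illustrative)
GENOME_LEN = 20    # number of compare-and-swap steps per candidate routine
POP_SIZE = 100

def random_genome():
    # a candidate "routine" is just a fixed list of (i, j) compare-and-swap positions
    return [tuple(sorted(random.sample(range(N), 2))) for _ in range(GENOME_LEN)]

def run(genome, data):
    data = list(data)
    for i, j in genome:
        if data[i] > data[j]:
            data[i], data[j] = data[j], data[i]
    return data

def fitness(genome, tests):
    # fraction of the test inputs the candidate sorts correctly
    return sum(run(genome, t) == sorted(t) for t in tests) / len(tests)

def mutate(genome):
    g = list(genome)
    g[random.randrange(len(g))] = tuple(sorted(random.sample(range(N), 2)))
    return g

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

tests = [[random.randrange(100) for _ in range(N)] for _ in range(50)]
pop = [random_genome() for _ in range(POP_SIZE)]

for generation in range(200):
    pop.sort(key=lambda g: fitness(g, tests), reverse=True)
    survivors = pop[:POP_SIZE // 5]          # selection: keep the best fifth
    pop = survivors + [mutate(crossover(random.choice(survivors),
                                        random.choice(survivors)))
                       for _ in range(POP_SIZE - len(survivors))]

best = max(pop, key=lambda g: fitness(g, tests))
print("best fitness:", fitness(best, tests))
[/code]

Given enough comparators and generations it will usually stumble onto a routine that sorts every test case, without anyone having designed it.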

My question for computer programmers here: How much of computer programming today uses methods like this? What kinds of programs?
Do any common programs, like web browsers, etc. use routines developed this way?

I would simply Google the appropriate Wiki page, as in:

http://en.wikipedia.org/wiki/Evolutionary_computation

but it might be interesting for the experts here to hold forth.

I don't have an opinion or expertise, just wanted to know a little more.
 
I'm not a programmer by any stretch. As a systems administrator, my "programming" consists entirely of writing tiny shell scripts to automate mundane tasks.

None of my code is "evolved"; every line is lovingly handcrafted, and the entire project aimed single-mindedly at a specific goal like an arrow from a bow.
 
I believe the use of genetic (evolutionary) algorithms is increasing, and they are mostly used in engineering, design, and scheduling applications. This is not quite self-evolving code, although it is possible to evolve 'better' code via such algorithms. I don't know of any practical (commercial) examples of this.

There are increasing numbers of common programs that are able to learn (e.g. a user's preferences), but these don't use genetic algorithms.

For many years there have been programs (such as Tierra - see also Core Wars) that simulate the evolution of simple code 'creatures' in a virtual environment, and these can come up with surprisingly familiar results, such as the evolution of parasites, symbiotes, and other 'cheats', as well as unexpected code optimizations, such as loop unrolling, etc.
 
Programmers tend to concern themselves much more with minimizing the time it takes to write code and minimizing the number of bugs than with maximizing speed. The reason for this is very sound: a very, very small amount of code accounts for the vast majority of run-time, and that's the only code it makes sense to optimize for performance.

When the programmer really tries hard to optimize for performance, I would say it's not entirely "intelligent design". The programmer undoubtedly makes a lot of intelligent guesses from his knowledge of instruction latencies, cache sizes and so on, but that only goes so far, because modern processor architectures are hideously complex compared to something like a 6510 (Commodore 64). I've dabbled a bit with SSE (more assembly than intrinsics) and it is very difficult to guess how much you should unroll a loop, whether it is faster to do it all in floating point or employ bit-twiddling hacks, how often a branch needs to be taken before it is faster to eliminate the conditionals, execute both branches at once and use masking to select, or what size blocks you should use for loop blocking (counter-intuitively, sometimes square blocks are faster, sometimes rectangular, sometimes smaller, sometimes larger), and so on.

As a hobbyist, I try to pick the best algorithm I can and then try many slight variations, gradually iterating my way towards some local maximum where I can't find further improvements; it's a sort of pseudo-evolutionary process. It is not too difficult to beat the compiler (intrinsics) with hand-tweaked assembly in this manner, but you have to be careful: the fastest code in isolation does not necessarily make for the fastest program. If you fill the instruction cache with highly optimized yet humongous functions, the result can actually be slower than if you had used lightweight, somewhat optimized functions.
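
Just to make the "pseudo-evolutionary" bit concrete: the loop is basically benchmark a variant, keep the change if it measures faster, perturb a parameter, repeat. A toy sketch (plain Python with an invented stand-in kernel and an assumed tunable block size, nothing like real SSE work) looks like this:

[code]
import time
import random

def blocked_sum_of_squares(data, block):
    # stand-in kernel: process the data in chunks of the given block size
    total = 0.0
    for start in range(0, len(data), block):
        for x in data[start:start + block]:
            total += x * x
    return total

def time_variant(block, data, repeats=5):
    # take the best of several runs to reduce timing noise
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        blocked_sum_of_squares(data, block)
        best = min(best, time.perf_counter() - t0)
    return best

data = [random.random() for _ in range(200_000)]

# simple hill-climb over the block size: keep a change only if it measures faster
block, best = 64, time_variant(64, data)
for _ in range(20):
    candidate = max(1, block + random.choice([-32, -16, 16, 32]))
    t = time_variant(candidate, data)
    if t < best:
        block, best = candidate, t

print(f"chosen block size: {block}, best time: {best:.4f}s")
[/code]

The hand-tuning described above is the same idea, just with the programmer doing the mutation and the stopwatch doing the selection.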
 
My question for computer programmers here: How much of computer programming today uses methods like this? What kinds of programs?
Do any common programs, like web browsers, etc. use routines developed this way?

No. Spreadsheets, web browsers, emailers, word processors would *not* be written by evolutionary algorithms.
 
One of the big issues is in testing requirements. Many domains and applications require rigorous standards for testing every little thing and decision path, input and output, of a program. With evolutionary code (also with neural nets), that becomes problematic (and may not currently even be possible), since you don't really know what you're going to get.

So it's not just a matter of implementing that kind of code: you have to figure out a way to make sure it will meet the QA of the environment. I think this is an area that is currently lagging behind the rest.
 
One of the big issues is in testing requirements. Many domains and applications require rigorous standards for testing every little thing and decision path, input and output, of a program. With evolutionary code (also with neural nets), that becomes problematic (and may not currently even be possible), since you don't really know what you're going to get.

Compounding this is the problem of maintenance. The initial development & testing of code is often a small part of its life-cycle cost; a successful app may be in use for decades and go through dozens of iterations and upgrades. Maintaining code is painful enough when it's written by a human; adding features to incomprehensible code would be . . . nearly impossible.
 
Does anybody do that for commercial applications? His web page claims 'many potential applications'.

Not that I'm aware of, although I'm sure someone is probably using these techniques commercially.

There are some problems with using these techniques on hardware, especially for things like FPGAs. I was looking for a particular paper but I couldn't find it. It's about how they evolved an FPGA configuration as some sort of filter. What they found was that they couldn't figure out how it worked! If they applied the same configuration to another FPGA it didn't work. It turns out that the evolutionary algorithm had found signalling pathways below the normal noise thresholds that were specific to that very board.

I'll poke around a bit more and see if I can find it.
 
Evolutionary algorithms can be thought of as an automated trial-and-error system. The computer can find ways to optimize certain solutions faster, better, and cheaper than having humans do all of that work manually. And, as you can imagine, the computer sometimes hits upon "brilliant" ideas the humans never would have thought of on their own.

So, its usage is increasing steadily in various engineering niches, such as circuit board design, computer and telephone networks, etc. GE uses them to optimize the design of certain engine parts. Some soap companies use them to optimize certain properties of laundry detergent molecules. Etc.
 
They are a very specialized niche of computer algorithm. First of all, you need a fitness function, which is an automatic way of determining how well the evolved population is doing relative to the goal. It is not always possible to have such a fitness function available for the task at hand.

I know there are neural networks that are "evolved" by genetic algorithm to improve their performance. In that case the fitness function is how closely the current population of neural networks comes to the ideal solution.

A genetic algorithm is not a good candidate for problems that can be solved with a regular algorithm or a direct calculation.
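
To make that concrete, here's a minimal sketch of that kind of neuroevolution: a tiny fixed-topology network whose weights are evolved against an explicit fitness function, with XOR standing in as the "ideal solution". The network size, population size, and mutation rate are all invented for illustration.

[code]
import math
import random

# XOR truth table acts as the "ideal solution" the fitness function measures against
CASES = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    # fixed 2-2-1 network; w is a flat list of 9 weights (biases included)
    h0 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h0 + w[7] * h1 + w[8])

def fitness(w):
    # automatic score: how close the network's outputs are to the ideal (negative squared error)
    return -sum((forward(w, x) - y) ** 2 for x, y in CASES)

def mutate(w, sigma=0.5):
    return [wi + random.gauss(0, sigma) for wi in w]

pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(60)]
for generation in range(300):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:15]                     # selection: keep the best quarter
    pop = parents + [mutate(random.choice(parents)) for _ in range(45)]

best = max(pop, key=fitness)
print("fitness:", fitness(best))
print([round(forward(best, x), 2) for x, _ in CASES])
[/code]

Without that scoring function there would be nothing to select on, which is the point made above.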
 
I program embedded systems. I've used a genetic algorithm to find an optimal solution, which I then put in the shipped code, but I've never shipped code that used a genetic algorithm while it was running.
 
One of the big issues is in testing requirements. Many domains and applications require rigorous standards for testing every little thing and decision path, input and output, of a program. With evolutionary code (also with neural nets), that becomes problematic (and may not currently even be possible), since you don't really know what you're going to get.

So it's not just a matter of implementing that kind of code: you have to figure out a way to make sure it will meet the QA of the environment. I think this is an area that is currently lagging behind the rest.

That doesn't seem right to me. For the "selection" aspect of evolutionary programming to work, the unit tests have to exist. The QA has to be well defined up front for this method to have any chance of working.
 
I believe the use of genetic (evolutionary) algorithms is increasing, and they are mostly used in engineering, design, and scheduling applications. This is not quite self-evolving code, although it is possible to evolve 'better' code via such algorithms. I don't know of any practical (commercial) examples of this.

One practical use of genetic algorithms is scheduling. You can give the computer a set of rules that specify schedule conflicts, and you can assign points to certain configurations. The auto-generated schedule that has the fewest conflicts and scores the most points survives and reproduces with another schedule.

One such rule could be, for example, no appointments after 5, or setting aside at least 15 minutes between appointments that are in separate buildings. This is especially useful if you have a lot of constraints and a lot of people to schedule concurrently.
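
Here's a rough sketch of how rules like those become a fitness function (all the appointment names, durations, and penalty weights below are made up for illustration):

[code]
import random

# invented example data: each appointment has a building; a candidate schedule
# assigns each one a start time (minutes after 9:00)
APPOINTMENTS = {"dentist": "A", "review": "B", "standup": "A", "handover": "B"}
DURATION = 30          # minutes, same for every appointment in this sketch
DAY_END = 8 * 60       # 17:00, expressed as minutes after 9:00

def random_schedule():
    return {name: random.randrange(0, DAY_END, 15) for name in APPOINTMENTS}

def fitness(schedule):
    score = 0
    items = sorted(schedule.items(), key=lambda kv: kv[1])
    for i, (name, start) in enumerate(items):
        if start + DURATION > DAY_END:
            score -= 10                      # rule: no appointments after 5pm
        if i > 0:
            prev_name, prev_start = items[i - 1]
            gap = start - (prev_start + DURATION)
            if gap < 0:
                score -= 20                  # overlapping appointments
            elif gap < 15 and APPOINTMENTS[name] != APPOINTMENTS[prev_name]:
                score -= 5                   # rule: 15-minute buffer between buildings
    return score

def mutate(schedule):
    s = dict(schedule)
    s[random.choice(list(s))] = random.randrange(0, DAY_END, 15)
    return s

pop = [random_schedule() for _ in range(50)]
for generation in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    pop = parents + [mutate(random.choice(parents)) for _ in range(40)]

best = max(pop, key=fitness)
print(best, fitness(best))
[/code]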
 
What's a genetic algorithm?

Genetic algorithm on Wikipedia.

The very short of it: a genetic algorithm is a way of solving a certain class of computing problems (those tackled with global search heuristics - the travelling salesman problem, for instance, amongst others) by applying evolutionary principles.
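
For a concrete taste, here is a bare-bones sketch applied to the travelling salesman problem: candidate solutions are permutations of some invented random cities, fitness is just the tour length, and mutation swaps two cities so every offspring is still a valid tour.

[code]
import math
import random

# invented city coordinates; a candidate tour is a permutation of the city indices
CITIES = [(random.random(), random.random()) for _ in range(12)]

def tour_length(tour):
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def mutate(tour):
    # swap two cities, so the result is still a valid permutation
    t = list(tour)
    i, j = random.sample(range(len(t)), 2)
    t[i], t[j] = t[j], t[i]
    return t

pop = [random.sample(range(len(CITIES)), len(CITIES)) for _ in range(80)]
for generation in range(400):
    pop.sort(key=tour_length)             # shorter tour = fitter
    parents = pop[:20]
    pop = parents + [mutate(random.choice(parents)) for _ in range(60)]

best = min(pop, key=tour_length)
print("best tour length:", round(tour_length(best), 3))
[/code]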
 
