Learning and Evolution in Neural Networks

The evolution of modularity

Jeff Clune, a computer scientist interested in artificial life, has a paper out about how tuning evolutionary pressure in a certain way evolves networks with high modularity. Let’s unpack this.

  • tuning evolutionary pressure: emphasize the cost of connections. This makes intuitive sense for networks in which there is a physical distance to cover (like a railway network); there are arguments for applying it to more abstract networks like genetic and metabolic pathways. In online social networks, however, there is no physical distance: creating a new link is very cheap. Maintaining a live social connection, however, is typically costly – and even, if we are to take the Dunbar number argument seriously, constrained by a hard ceiling.
  • evolves: Clune’s model starts from random neural networks that need to perform a certain task. They receive stimuli from eight inputs (think an 8-pixel retina) and evolve to answer whether a pattern of interest is present, based on the stimuli. Patterns are perceived by two blocks of 4 pixels each; those considered (exogenously) of interest are slightly different for the two blocks (dubbed “left” and “right”). Evolution happens by simulating networks that reproduce according to a fitness measure (more fit networks have more offspring) and with random mutation. Two alternative measures of fitness are considered: maximizing performance only (PA) and maximizing performance while minimizing connection costs (P&CC). Performance is measured against two tasks: determine whether a pattern is present in both the right and the left pixel clusters (L-AND-R); or whether it is present in either the right or the left pixel clusters (L-OR-R). Which patterns count as an object differs in the left and right halves of the retina. Notice that the task assigned to the networks is itself modular, as a partition between “right” and “left” pixels is postulated. The results of the paper carry through for nonmodular problems, albeit in attenuated form.
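The selection scheme described above can be sketched in a few lines. This is a deliberately minimal toy, not Clune’s actual model: it evolves a single threshold unit instead of a multi-layer network, and the object sets, mutation operator, and cost weight are all my own assumptions. What it does illustrate is the P&CC idea: fitness rewards performance on the L-AND-R task while penalizing each nonzero connection, so mutation can prune links.

```python
# Minimal sketch of performance-and-connection-cost (P&CC) selection
# on an 8-pixel retina task. All names and parameters are hypothetical.
import random

random.seed(0)

# Assumed "objects of interest"; note they differ between the two halves.
LEFT_OBJECTS = {(1, 1, 1, 1), (1, 0, 0, 0), (0, 1, 0, 0)}
RIGHT_OBJECTS = {(1, 1, 1, 1), (0, 0, 0, 1), (0, 0, 1, 0)}

def all_inputs():
    # Every possible 8-pixel retina state.
    return [tuple((i >> b) & 1 for b in range(8)) for i in range(256)]

def target_l_and_r(x):
    # 1 iff an object is present in BOTH halves (the L-AND-R task).
    return int(x[:4] in LEFT_OBJECTS and x[4:] in RIGHT_OBJECTS)

def predict(weights, threshold, x):
    # A single threshold unit stands in for the evolved network.
    return int(sum(w * xi for w, xi in zip(weights, x)) > threshold)

def fitness(genome, cost_weight):
    # P&CC when cost_weight > 0; plain PA when cost_weight == 0.
    weights, threshold = genome
    xs = all_inputs()
    perf = sum(predict(weights, threshold, x) == target_l_and_r(x)
               for x in xs) / len(xs)
    n_conn = sum(1 for w in weights if w != 0)  # connection cost
    return perf - cost_weight * n_conn

def mutate(genome):
    weights, threshold = genome
    new = list(weights)
    new[random.randrange(8)] = random.choice([-1, 0, 1])  # may prune a link
    return (tuple(new), threshold + random.choice([-1, 0, 1]))

def evolve(cost_weight, generations=200, pop_size=30):
    pop = [((0,) * 8, 0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, cost_weight), reverse=True)
        elite = pop[:pop_size // 3]  # fitter genomes leave more offspring
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=lambda g: fitness(g, cost_weight))

best = evolve(cost_weight=0.01)
print(fitness(best, 0.0))  # raw task performance of the best P&CC genome
```

A single unit cannot solve L-AND-R exactly, which is precisely why the full model needs hidden layers; the point of the sketch is only the shape of the fitness function, where sweeping `cost_weight` from 0 (PA) to a positive value (P&CC) is the knob the paper turns.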

Source: Contrordine compagni
