TY - GEN
T1 - The effects of training set size and keeping rules on the emergent selection pressure of Learnable Evolution Model
AU - Coletti, Mark
PY - 2012
Y1 - 2012
N2 - Evolutionary algorithms with computationally expensive fitness evaluations typically have smaller evaluation budgets and population sizes. However, smaller populations and fewer evaluations mean that the problem space may not be effectively explored. An evolutionary algorithm may be combined with a machine learner to compensate for these smaller populations and evaluation budgets and to increase the likelihood of finding viable solutions. Learnable Evolution Model (LEM) is such an evolutionary algorithm (EA) and machine learner (ML) hybrid that infers rules from best- and least-fit individuals and then exploits these rules when creating offspring. This paper shows that LEM introduces a unique form of emergent selection pressure that is separate from any selection pressure induced by parent or survivor selection. Additionally, this work shows that this selection pressure can be attenuated by how the best- and least-fit subsets are chosen, and by how long learned rules are kept. Practitioners need to be aware of this novel form of selection pressure and these means of adjusting it to ensure their LEM implementations are adequately tuned. That is, too much selection pressure may mean premature convergence to inferior solutions, while insufficient selection pressure may mean no satisfactory solutions are found.
KW - Evolutionary computation
KW - Function optimization
KW - Learnable Evolution Model
KW - Machine learning
UR - http://www.scopus.com/inward/record.url?scp=84865046800&partnerID=8YFLogxK
DO - 10.1145/2330784.2331016
M3 - Conference contribution
AN - SCOPUS:84865046800
SN - 9781450311786
T3 - GECCO'12 - Proceedings of the 14th International Conference on Genetic and Evolutionary Computation Companion
SP - 1505
EP - 1506
BT - GECCO'12 - Proceedings of the 14th International Conference on Genetic and Evolutionary Computation Companion
PB - Association for Computing Machinery
T2 - 14th International Conference on Genetic and Evolutionary Computation Companion, GECCO'12 Companion
Y2 - 7 July 2012 through 11 July 2012
ER -