Scribe: Tom
Paper to read for next week: Geometric Semantic GP. Feel free to skim the sections that get very mathy.
Fall Membership
- Members now include: Lee, Tom, Emma, Zeke, and Micah, with Kwaku and Omri being maybes. Someone should probably change the main lab page to reflect this.
- Lee should pester Kwaku and Omri about their membership.
Tag Space Machines
- So far, we aren’t seeing much evolutionary progress.
- Most initial programs are probably too small. We’ll focus for now on generating a viable generation 0, so that evolution has something to work with.
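To make "a viable generation 0" concrete, here is a minimal sketch of size-controlled random program generation. It assumes a TSM program can be treated as a flat list of points, and the instruction set and size bounds are made up for illustration; the real representation comes from our TSM code.

```python
import random

# Hypothetical instruction set, purely for illustration; the real TSM
# instruction set is whatever our implementation defines.
INSTRUCTIONS = ["tag", "tagged", "add", "sub", "dup", "swap", "exec_if"]

def random_program(min_points, max_points, rng=random):
    """Generate one generation-0 program as a flat list of points.
    Enforcing a minimum size is the point: if initial programs are too
    small, evolution has nothing to work with."""
    size = rng.randint(min_points, max_points)
    return [rng.choice(INSTRUCTIONS) for _ in range(size)]

def generation_zero(pop_size, min_points=20, max_points=100):
    return [random_program(min_points, max_points) for _ in range(pop_size)]

print([len(p) for p in generation_zero(10)])  # sanity-check the size spread
```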
Uniform Mutation and Crossover
We got sidetracked by a long discussion about uniform mutation and crossover, both in TSMs and in PushGP. This seems well worth pursuing further in Push.
- Option 1: Set a mutation percentage, and then each point independently has that probability of being mutated.
- Option 2: For a program of size N, mutate (mutation percent * N) points within the program.
- Either of these options may also work with crossover, where the inserted code is taken from the “father” instead of created from scratch (see the code sketch after this list).
- These are also potentially doable in tree GP, which would be of greater interest to the broader GP community.
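A rough sketch of the two options, plus the crossover variant, over a flat program representation. It assumes programs are lists of points and that `random_point` is a stand-in for whatever point generator the system provides:

```python
import random

def uniform_mutate_per_point(program, rate, random_point):
    """Option 1: each point independently has probability `rate`
    of being replaced by a freshly generated point."""
    return [random_point() if random.random() < rate else point
            for point in program]

def uniform_mutate_fixed_count(program, rate, random_point):
    """Option 2: for a program of size N, mutate exactly round(rate * N)
    distinct points."""
    mutated = list(program)
    for i in random.sample(range(len(program)), round(rate * len(program))):
        mutated[i] = random_point()
    return mutated

def uniform_crossover(mother, father, rate):
    """Crossover variant: replaced points come from the "father" at the same
    positions instead of being generated from scratch.  (Aligning parents of
    different lengths is glossed over; zip just truncates to the shorter.)"""
    return [f if random.random() < rate else m
            for m, f in zip(mother, father)]
```

Either mutation flavor changes rate * N points in expectation; the difference is only the variance, which may matter for small programs.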
Kata Bowling
- Tom has been working on some plots for analyzing runs:
- Heatmaps of best individuals of a run over all test cases.
- Piano roll plots of fitnesses and sizes within a run.
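As a rough illustration of the heatmap idea (not Tom's actual code), assuming a matrix of errors with one row per generation's best individual and one column per test case:

```python
import numpy as np
import matplotlib.pyplot as plt

# errors[g][c] = error of generation g's best individual on test case c.
# Random placeholder data, just to show the plot's shape.
errors = np.random.rand(100, 20)

fig, ax = plt.subplots()
im = ax.imshow(errors, aspect="auto", cmap="viridis")
ax.set_xlabel("test case")
ax.set_ylabel("generation")
fig.colorbar(im, label="error of best individual")
plt.show()
```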
- What can we do to solve this problem?
- Some sort of combination of historically assessed hardness and lexicase selection may be useful. This could add bias to have harder test cases near the beginning of the sorted list of test cases for lexicase selection.
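One way that combination could look, sketched below: ordinary lexicase selection, but with the random shuffle of test cases replaced by a weighted shuffle that tends to put historically harder cases first. The hardness measure and weighting scheme are placeholders, not a settled design.

```python
import random

def biased_case_order(case_hardness):
    """Weighted shuffle: cases with higher historical hardness (e.g., mean
    population error so far) tend to land earlier in the ordering."""
    remaining = list(range(len(case_hardness)))
    weights = [case_hardness[i] + 1e-9 for i in remaining]
    order = []
    while remaining:
        i = random.choices(range(len(remaining)), weights=weights)[0]
        order.append(remaining.pop(i))
        weights.pop(i)
    return order

def lexicase_select(population, errors, case_hardness):
    """Lexicase selection using the biased case ordering above.
    errors[ind][case] is individual ind's error on that test case."""
    candidates = list(range(len(population)))
    for case in biased_case_order(case_hardness):
        best = min(errors[i][case] for i in candidates)
        candidates = [i for i in candidates if errors[i][case] == best]
        if len(candidates) == 1:
            break
    return population[random.choice(candidates)]
```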
Geometric Semantic GP
- Singles out good features of the parents and combines them: the offspring’s outputs are a weighted blend of the parents’ outputs, so its behavior lies between theirs.
- Problem: This creates a lot of bloat, since each offspring syntactically contains complete copies of both parents, so sizes grow rapidly over generations.
- Solution? Maybe use Push with simplification.
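For reference, a toy sketch of why the bloat happens, based on the geometric semantic crossover construction as I understand it from the paper: the child's output is a pointwise blend of the parents' outputs, but its syntax contains full copies of both parents plus a random mask. The size bookkeeping below is invented purely for illustration.

```python
import random

def gs_crossover(p1, p2, mask):
    """child(x) = mask(x) * p1(x) + (1 - mask(x)) * p2(x), with mask(x) in [0, 1].
    Behavior lies between the parents'; syntax contains both parents whole."""
    child = lambda x: mask(x) * p1(x) + (1 - mask(x)) * p2(x)
    child.size = p1.size + p2.size + mask.size + 4  # crude syntactic size
    return child

def leaf(f):
    f.size = 1
    return f

# Toy demo: program sizes roughly double each generation.
pop = [leaf(lambda x, c=c: c * x) for c in range(4)]
mask = leaf(lambda x: 0.5)
for gen in range(5):
    pop = [gs_crossover(random.choice(pop), random.choice(pop), mask)
           for _ in range(len(pop))]
    print("gen", gen, "max size", max(p.size for p in pop))
```

This is exactly the kind of growth that Push's simplification might be able to claw back, if it can shrink the programs without changing their behavior.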
Leslie Valiant
If we can solve the 500-500 boolean parity problem with GP, that would be very good!