SwarmFest is the annual international conference for agent-based modeling and simulation, IMO a necessary skill set for understanding complexity in systems. I just received notification that I have been accepted to present at this year's conference, and I am excited.
Why is this important to me?
My presentation details recent findings regarding a form of Swarm in which the schedule and rules for the agents are not hard-coded in the experiment. In our case, agents "discover" the rules and patterns based upon whatever pre-existing structure and connections they find. That portion of the solution space is what we call a context. The discovery mechanism that searches the context for solutions is a topology- and weight-evolving artificial neural network, originally developed by Ken Stanley and Risto Miikkulainen at the University of Texas, called NEAT. (NEAT is actually the great-grandfather; WattsNEAT is our Nth-generation implementation of HyperNEAT.)
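WattsNEAT itself isn't something I can publish yet, so to make the idea concrete, here is a minimal, hypothetical sketch of the classic NEAT-style representation it descends from: node genes plus connection genes tagged with innovation numbers, and a structural mutation that grows the topology. The names (ConnGene, Genome, add_node_mutation) are illustrative only, not our API.

# Hypothetical NEAT-style genome sketch: topology and weights evolve together.
import random
from dataclasses import dataclass, field

@dataclass
class ConnGene:
    src: int            # source node id
    dst: int            # destination node id
    weight: float       # connection weight, evolved alongside topology
    innovation: int     # historical marker used to align genomes during crossover
    enabled: bool = True

@dataclass
class Genome:
    nodes: list = field(default_factory=list)   # node ids (inputs, hidden, outputs)
    conns: list = field(default_factory=list)   # ConnGene instances

def add_node_mutation(genome: Genome, next_innovation: int) -> int:
    """Structural mutation: split a random enabled connection with a new hidden node."""
    enabled = [c for c in genome.conns if c.enabled]
    if not enabled:
        return next_innovation
    old = random.choice(enabled)
    old.enabled = False
    new_node = max(genome.nodes) + 1
    genome.nodes.append(new_node)
    # Two new connections replace the old one, following the usual NEAT convention.
    genome.conns.append(ConnGene(old.src, new_node, 1.0, next_innovation))
    genome.conns.append(ConnGene(new_node, old.dst, old.weight, next_innovation + 1))
    return next_innovation + 2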
Now for the cool part. In order to configure the compute fabric for the many experiment contexts in which we might evolve solutions, we use WattsNEAT to evolve configurations of the compute fabric itself. This is our solution to the problem of partitioning data and structures for massively parallel computation.
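As a rough illustration of that partitioning idea, and nothing more, here is a toy sketch that evolves an assignment of work units to compute nodes, scoring candidates on load balance and cross-node traffic. The real system evolves configurations with WattsNEAT; this stand-in uses a plain genetic algorithm, and the communication graph, unit counts, and cost model are invented for the example.

# Toy illustration: evolve a mapping of work units onto compute nodes.
import random

N_UNITS, N_NODES = 32, 4
# Hypothetical communication graph: pairs of work units that exchange data.
comm_edges = [(i, (i + 1) % N_UNITS) for i in range(N_UNITS)]

def fitness(assign):
    loads = [assign.count(n) for n in range(N_NODES)]
    imbalance = max(loads) - min(loads)
    cross_traffic = sum(1 for a, b in comm_edges if assign[a] != assign[b])
    return -(imbalance + cross_traffic)          # higher is better

def evolve(pop_size=50, generations=200):
    pop = [[random.randrange(N_NODES) for _ in range(N_UNITS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(N_UNITS)] = random.randrange(N_NODES)  # point mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best_partition = evolve()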
If you have followed along so far: we have just used WattsNEAT to configure a compute fabric in which WattsNEAT itself can run effectively and efficiently at massively parallel scale. For those of you who have used the Torque or SGE rolls in a Rocks cluster, the resulting advantage is an intelligent distribution and scheduling agent that configures the compute fabric according to the context (the schedules, priorities, and component configurations it discovers at the time).
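To show what "according to the context" means in the simplest possible terms, here is a hypothetical sketch of context-aware placement: jobs are assigned based on node load discovered at scheduling time rather than a fixed queue order. The Job and Node fields and the placement rule are invented for illustration and are not the Torque/SGE or WattsNEAT interfaces.

# Sketch of context-aware placement: decisions use state discovered at run time.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    load: float          # discovered at scheduling time, not fixed up front

@dataclass
class Job:
    name: str
    priority: int        # higher runs sooner

def schedule(jobs, nodes):
    placements = []
    for job in sorted(jobs, key=lambda j: j.priority, reverse=True):
        target = min(nodes, key=lambda n: n.load)   # context-aware choice
        target.load += 1.0                          # crude load model
        placements.append((job.name, target.name))
    return placements

print(schedule([Job("sim-a", 2), Job("sim-b", 5)],
               [Node("n0", 0.3), Node("n1", 0.1)]))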
This is not as static as it may initially appear.
I'm now working to get the specifics (of which there are many) written down and filed, at least as a provisional patent. After that, we will decide the bifurcation strategy between proprietary and open-source paths.