It soon became apparent to me that my rudimentary understanding of biology and neuroscience was holding me back, so I decided to fix that. I took a new job in Atlanta to support myself while I went back to school for a degree in biology at Georgia State University (GSU). I chose GSU because it has one of the stronger neuroscience programs in the country. As I was nearing graduation I was introduced to Donald H. Edwards, who was beginning a new project in his lab that was very similar to work I had already done: he wanted to build a user-friendly neuromechanical simulator. My software engineering experience and keen interest in this type of system made me a good fit for the project. We hit it off, and soon we were working on the AnimatLab neuromechanical simulator. For my dissertation I developed the first version of AnimatLab and used it to perform a neuromechanical analysis of locust jumping. My long-term goal, however, was always to use this application, and what I learned while building it, to build intelligent, adaptive robots.
While working on my dissertation I also built my first biomimetic robot. I started with a hexapod robot kit and used a Windows CE PDA to run the neural network that controlled it; I was able to get it to walk, turn, and climb over obstacles. The main lesson from that project was that the only realistic way to run large neural networks on a robot is parallel processing. So I started working on my first neuromorphic board, based on a Xilinx FPGA. I designed and built the board myself and programmed it in VHDL. I was able to run some simple integrate-and-fire neuron and synapse models, but it would take a lot more work before the board could easily be used.
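To give a flavor of what those models compute, here is a minimal leaky integrate-and-fire neuron update, sketched in Python rather than the fixed-point VHDL that actually ran on the FPGA. The constants and the function name are illustrative placeholders, not values from the original design.

```python
def lif_step(v, i_in, dt=1.0, tau=10.0, v_rest=-60.0,
             v_thresh=-50.0, v_reset=-65.0):
    """Advance the membrane voltage v by one time step.

    Returns (new_v, spiked). The voltage leaks toward v_rest, is pushed
    up by the input current i_in, and resets after crossing threshold.
    All parameters here are arbitrary example values.
    """
    v = v + (-(v - v_rest) + i_in) * (dt / tau)
    if v >= v_thresh:
        return v_reset, True   # fire a spike and reset
    return v, False

# Drive one neuron with a constant current and count its spikes.
v, spikes = -60.0, 0
for _ in range(200):
    v, fired = lif_step(v, i_in=15.0)
    spikes += fired
```

The appeal of this model for hardware is that each update is just a multiply, an add, and a compare, which maps naturally onto FPGA logic.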
After I obtained my doctorate I continued working for Don as a postdoc. We published a set of papers based on my dissertation and worked on another project to interface the nervous system of a crayfish to a neuromechanical model running in AnimatLab. This allowed us to "close the feedback loop" between the neurons and the sensory/motor systems, making it possible to study reflexes and the control of movement in ways that were previously difficult or impossible. I was responsible for designing and building the electronic interface boards, as well as writing the software that let the neuromechanical simulation talk to the interface hardware. When funding at GSU ran out I moved on to other IT positions to support my growing family, but I continued to work on AnimatLab in my own time and served as a consultant on the neural interface project. We are close to publishing the first set of papers on this work.
I found a job at AEgis Technologies doing modelling and simulation of unmanned air and ground vehicles for military applications. This was closer to the robotics work I wanted to do, and a place where I could apply my extensive modelling and simulation experience while learning from others in the field. Soon after starting at AEgis I founded my own company, NeuroRobotic Technologies. Since then I have been working to make AnimatLab completely open-source and able to run on Ubuntu, the operating system used by almost all of the embedded micro-computers in robotics. I then spent some time putting the first version of the AnimatLab robotics framework in place. This toolkit makes it easy for anyone with a basic understanding of neuroscience to build biomimetic robots. It uses simple, off-the-shelf controllers like the Arduino, so people with little or no experience in robotics or electronics can use it to build complex neurorobotic systems.
I also started putting together some test robots. I had planned to return to my neuromorphic board, but in the meantime several new options had come to market, including the NVIDIA Jetson TK1 and the Parallella micro-computer boards. The Jetson is a relatively small micro-computer with 192 CUDA cores, allowing it to perform some serious number crunching. The Parallella is an even smaller system that is also designed for parallel programming. I am currently focusing on using these two systems to build large-scale neural networks to control autonomous robots. As a first step, I am building a set of commercially available products that make it easier to build robots with these two boards.