Agents on the Neurorobotics Platform
- I was hoping to get my CABot3 agents running using the virtual
environment from the Neurorobotics Platform.
- So, I contacted the NRP folks (who have been very helpful).
- After a virtual conference, we agreed that the best approach would be
to download their software and run a local server.
- After a few weeks waiting for approval (Andrew, can that process
be streamlined for people who already have approval to use one HBP
platform?), I got access to their software.
- As ever, these things are a pain to download, but considering how
much there was, it went pretty well.
- I've got a server running on my laptop.
- They provide some tools for editing their Gazebo virtual environments,
but actually using Gazebo for editing is awkward.
- The agent, brain, and environment communicate through Python transfer
functions.
- These are a bit awkward, but I managed to get them working.
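To show the idea without the NRP framework itself, here is a minimal, framework-free sketch of what a transfer function does: it reads recent activity from "brain" populations and maps it to a robot command. The function name, the spike-count inputs, and the gain are all illustrative, not the platform's API.

```python
# Illustrative stand-in for an NRP-style transfer function: map spike
# counts from left/right motor populations to (linear, angular) velocity
# commands for a differential-drive robot like the Husky.

def spikes_to_velocity(left_spikes, right_spikes, gain=0.1):
    """Return (linear, angular) velocities from two spike counts.

    More activity overall -> faster forward motion; an imbalance
    between the populations -> a turn toward the quieter side.
    """
    linear = gain * (left_spikes + right_spikes) / 2.0
    angular = gain * (right_spikes - left_spikes)
    return linear, angular

# Example: the right population is more active, so the robot turns.
print(spikes_to_velocity(2, 8))
```

In the real platform, a transfer function like this would be wired to spike-sink and robot-topic devices by the framework; the mapping logic in the middle is the part sketched here.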
- I went through several agents, and have several up on the net
here.
- They all use a simple modification of their environment, their
Husky robot, and NEST. (Andrew Rowley mentioned they use neural
voltage instead of spikes, but I think that's an easy fix.)
- There's an agent that turns left, then right, then moves forward;
an agent that parses a command and then performs those actions
(preset in the transfer function);
an agent that uses neural vision to turn toward the blue box; and
one that lets you continually enter commands by changing a text file.
- That last one is a CABot2. (It doesn't have any real memory.)
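The text-file command channel in that last agent can be sketched as a small poller that re-reads the file and returns only the lines appended since the last check. The class name, file handling, and demo commands are my own illustration, not the agent's actual code.

```python
# Illustrative sketch of a text-file command channel: the agent polls
# the file and picks up any command lines appended since the last poll.
import os
import tempfile

class FileCommandSource:
    def __init__(self, path):
        self.path = path
        self.offset = 0  # byte position we have already consumed

    def poll(self):
        """Return complete command lines added since the last poll."""
        with open(self.path) as f:
            f.seek(self.offset)
            new = f.read()
            self.offset = f.tell()
        return [line.strip() for line in new.splitlines() if line.strip()]

# Demo: append commands between polls, as a user editing the file would.
fd, path = tempfile.mkstemp()
os.close(fd)
src = FileCommandSource(path)
with open(path, "a") as f:
    f.write("turn left\n")
print(src.poll())   # -> ['turn left']
with open(path, "a") as f:
    f.write("move forward\n")
print(src.poll())   # -> ['move forward']
os.remove(path)
```

Inside the platform, a transfer function would call something like `poll()` each cycle and feed any new commands to the parsing network.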
- They don't currently have support for SpiNNaker. They said it
will be slow when it comes back.