Grasping an object – model describes complete movement planning in the brain


Neuroscientists at the German Primate Center (DPZ) – Leibniz Institute for Primate Research in Göttingen have for the first time developed a model that seamlessly represents the entire planning of a movement, from seeing an object to grasping it. Comprehensive neural and motor data from grasping experiments with two rhesus monkeys were decisive for the model's development. The model is an artificial neural network that, when fed images of particular objects, simulates the processes and interactions in the brain that underlie the processing of this visual information.

The activity of the artificial network was able to explain the complex biological data from the animal experiments, thereby validating the functional model. In the long term, the model could support the development of better neuroprostheses, for example to bridge a damaged nerve connection between brain and limbs in paraplegia and thus restore the transmission of movement commands from the brain to the arms and legs. The study was published in PNAS.
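The idea of an image-driven network that unfolds into a movement plan over time can be illustrated with a minimal sketch. This is not the authors' model: the dimensions, the random untrained weights, and the simple tanh recurrence are all illustrative assumptions, standing in for a trained network whose hidden dynamics would be fit to the recorded neural data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions -- chosen for illustration, not taken from the study.
N_VISUAL = 64    # visual feature units (stand-in for features of the object image)
N_HIDDEN = 50    # recurrent "planning" units
N_MUSCLE = 8     # simulated muscle/kinematic outputs
T_STEPS = 30     # time steps from object presentation to grasp

# Fixed random weights for illustration; a real model would be trained so that
# the hidden dynamics reproduce the recorded neural activity.
W_in = rng.normal(0, 1 / np.sqrt(N_VISUAL), (N_HIDDEN, N_VISUAL))
W_rec = rng.normal(0, 1 / np.sqrt(N_HIDDEN), (N_HIDDEN, N_HIDDEN))
W_out = rng.normal(0, 1 / np.sqrt(N_HIDDEN), (N_MUSCLE, N_HIDDEN))

def plan_grasp(visual_features):
    """Unroll recurrent dynamics from a static object representation
    into a time series of muscle commands."""
    h = np.zeros(N_HIDDEN)
    muscle_trace = []
    for _ in range(T_STEPS):
        h = np.tanh(W_rec @ h + W_in @ visual_features)  # recurrent update
        muscle_trace.append(W_out @ h)                   # readout to "muscles"
    return np.stack(muscle_trace)                        # shape (T_STEPS, N_MUSCLE)

object_image_features = rng.normal(size=N_VISUAL)  # stand-in for image features
trace = plan_grasp(object_image_features)
print(trace.shape)  # (30, 8): one muscle command vector per time step
```

The key point the sketch captures is the paper's pipeline logic: a static visual input is transformed by internal recurrent dynamics into an extended motor output, which is where the model's hidden activity can be compared against recorded neural data.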