I recently completed a master's degree at the Graduate School of Neural Information Processing in Tübingen, and I am now a research intern in the Brain-Computer Interfaces group. Previously, I completed my B.Sc. in Mechanical Engineering at UPB in Medellín, Colombia.
My research focuses on robotic assistive devices for patients with devastating motor impairments. Several conditions impair upper limb function, such as spinal cord injuries, high-level amputations, and amyotrophic lateral sclerosis. I believe lightweight robotic arms can help these patients perform activities of daily living and regain independence. Of the many research problems this depends on, I am most interested in human control of the devices. I am pursuing two lines of work:
I want to identify the best possible ways of controlling a robotic arm using signals elicited by a human, which depend on a patient's condition and capabilities. Part of my focus is Brain-Computer Interfaces. Here, control signals are usually extracted noninvasively via electroencephalography (EEG), which is very noisy and allows only slow, low-dimensional communication between brain and robot. To improve on this, I aim to combine artificial intelligence in robotics with paradigms that enhance human interaction with the devices.
I want to develop tools for our research using novel experimental setups. In particular, I am developing ArmSym, a virtual reality system intended for experiments on human-robot interaction and human control of a 7-degree-of-freedom robotic arm.