
Optimising remote handling telepresence systems by studying how users adapt to altered sensorimotor contingencies.
Transformative societal technologies can be used to study how the human brain processes sensory information, but the converse is also true: understanding how the brain processes sensory information is critical to the design and use of transformative societal technologies. This is demonstrated clearly in technological systems with a “human in the loop”, i.e. systems in which a human user is part of the system. A prototypical example is “telepresence” systems, which are designed to allow humans to work in remote or hazardous environments. A key problem to be solved in this area is that even simple tasks, such as reaching to pick up an object, can be difficult and time-consuming to accomplish with such systems, even though the same action performed directly is near effortless.
There are many differences between natural movements and their telepresence equivalents, both in how the movement is controlled and executed and in the feedback the user receives while making it. The aim of this project is to map out how human users adapt over time when using telepresence systems and how this adaptation shapes their experience of the system. In doing so, we will apply core psychological principles of how we perceive and learn about the world to optimise an important transformative societal technology in an industrial setting.