April 12, 2018

Systems capable of moving and (physically) interacting with their environment can induce both affine and non-affine transformations in it. The core of my work focuses on exploiting these abilities of a robotic system to reason about and learn representations of novel, unstructured environments with limited or no prior data.
In order to achieve robust autonomy, physical systems should be able to leverage their ability to move and interact with their environments in order to improve their models of those environments. Such causal reasoning is widely believed to be at work in biological systems, which use it to build better models of their surroundings (as seen in children). Building on this intuition, interactive perception can be viewed as using the ability to physically manipulate the environment to serve perception, and vice versa; robots capable of manipulating their environment can thus be endowed with interactive perception [Reference 1]. Similarly, active perception can be viewed as the ability to move with intention through the environment [Reference 2].
Both active and interactive perception problems can be formulated using tools from information acquisition. Such a formulation poses state estimation as a problem of action selection: choose the action that most reduces uncertainty in the estimate [Reference 3]. This allows us to use well-studied tools from optimal control to solve these action-selection problems.
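As a minimal sketch of this action-selection idea, consider a toy discrete Bayesian estimator that greedily picks the action minimizing the expected entropy of the posterior (equivalently, maximizing expected information gain). The states, actions, and observation models here are entirely hypothetical, chosen only to illustrate the formulation; they are not from the cited works.

```python
import math

def entropy(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def posterior(prior, obs_lik):
    """Bayes update of `prior` given one observation's likelihood per state."""
    unnorm = [p * l for p, l in zip(prior, obs_lik)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def expected_posterior_entropy(prior, likelihood):
    """Expected entropy after acting, where likelihood[o][s] = P(obs o | state s)."""
    h = 0.0
    for obs_lik in likelihood:
        p_obs = sum(p * l for p, l in zip(prior, obs_lik))
        if p_obs > 0:
            h += p_obs * entropy(posterior(prior, obs_lik))
    return h

def select_action(prior, action_models):
    """Greedy information-acquisition step: pick the uncertainty-minimizing action."""
    return min(action_models,
               key=lambda a: expected_posterior_entropy(prior, action_models[a]))

# Hypothetical scenario: three possible hidden states, two candidate actions.
prior = [1 / 3, 1 / 3, 1 / 3]
action_models = {
    # A physical interaction whose observation fully disambiguates the state.
    "poke": [[1, 0, 0], [0, 1, 0], [0, 0, 1]],
    # A passive glance whose observation is uninformative.
    "look": [[0.5, 0.5, 0.5], [0.5, 0.5, 0.5]],
}
print(select_action(prior, action_models))  # → poke
```

The greedy one-step rule shown here is the simplest instance; the optimal-control view mentioned above generalizes it to multi-step action sequences.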