SUNY Research Foundation

Undergraduate Research Fellow

Autonomous Intelligent Robotics Lab - Binghamton, NY

~~~ Quick Description ~~~

  • Research Fellow with SUNY RF for Binghamton University's Autonomous Intelligent Robotics (AIR) Lab.

  • Combined multimodal sensing with a dialogue system so that a reinforcement learning agent can autonomously learn to compare the properties of a physical object with those of a target object a human describes conversationally.

~~~ In-Depth Description ~~~

        Imagine an environment with a robot, a human, and an object. The robot has a camera and an arm, and it can use them to interact with the object: lift it, look at it, poke it, grasp it, and so on. Each interaction teaches the robot about the object through a different sensor (the camera alone reveals an object's color, but not its weight or whether it is full or empty, for example). We call this multimodal sensing.
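        As a concrete illustration, here is a minimal sketch (in Python, not the lab's actual code) of the idea that each exploratory behavior exposes different sensor modalities. The behavior names, modality names, and stand-in sensor model are all hypothetical.

    import random

    # Each exploratory behavior exposes different modalities of the same object:
    # the camera alone reveals color, but only lifting hints at weight, etc.
    BEHAVIOR_MODALITIES = {
        "look":  ["color", "shape"],    # camera
        "lift":  ["effort"],            # arm torque correlates with weight
        "poke":  ["effort", "audio"],   # stiffness and impact sound
        "shake": ["audio"],             # e.g., full vs. empty container
    }

    def explore(obj, behavior, read_sensor):
        """Execute one behavior; return a feature vector per modality it exposes."""
        return {m: read_sensor(obj, behavior, m) for m in BEHAVIOR_MODALITIES[behavior]}

    # Stand-in sensor model: random 8-dim features per (object, behavior, modality).
    features = explore("red_bottle", "shake",
                       lambda o, b, m: [random.random() for _ in range(8)])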

        While the robot does this, the human has a particular object in mind that they want. The human cannot see the object the robot has, which may or may not be the one the human wants. The robot can ask the human questions conversationally to try to figure out which object the human wants.
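        The dialogue side can be pictured as the robot maintaining a belief over which candidate object the human wants and refining it with every answer. The sketch below is an illustrative assumption, not the lab's implementation: a simple Bayesian update over a hypothetical candidate set with a noisy-answer model.

    # Hypothetical candidate objects and their properties.
    CANDIDATES = {
        "red_bottle": {"color": "red",  "heavy": False},
        "blue_mug":   {"color": "blue", "heavy": True},
        "red_box":    {"color": "red",  "heavy": True},
    }

    def update_belief(belief, prop, value, answer, noise=0.1):
        """Bayes update after asking 'Is the object you want <prop> = <value>?'
        The noise term models the chance of a mistaken or misheard answer."""
        new = {}
        for obj, props in CANDIDATES.items():
            match = (props[prop] == value)
            likelihood = (1 - noise) if match == (answer == "yes") else noise
            new[obj] = belief[obj] * likelihood
        total = sum(new.values())
        return {obj: p / total for obj, p in new.items()}

    # Start uniform; a "yes" to "Is it red?" shifts mass onto the red objects.
    belief = {obj: 1 / len(CANDIDATES) for obj in CANDIDATES}
    belief = update_belief(belief, "color", "red", "yes")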

        The robot's goal is to determine, at the lowest possible cost in time and effort, whether the object the human has in mind is the same as the physical object it is investigating. The robot learns this decision-making strategy through reinforcement learning.
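        One way to make that objective concrete: every exploratory behavior and every question carries a cost, and the episode ends when the robot declares the two objects the same or different. The sketch below uses illustrative costs and rewards; the actual values and learning algorithm in this work may differ.

    # Illustrative per-action costs: physical exploration takes robot time,
    # and each question spends the human's patience.
    ACTION_COST = {"look": 1.0, "poke": 2.0, "lift": 3.0, "ask": 1.5}
    REWARD_CORRECT, REWARD_WRONG = 100.0, -100.0

    def episode_return(actions, guessed_same, truly_same):
        """Terminal reward minus the summed cost of every action taken.
        The agent learns a policy that trades exploration cost against the
        risk of declaring 'same' or 'different' too early."""
        terminal = REWARD_CORRECT if guessed_same == truly_same else REWARD_WRONG
        return terminal - sum(ACTION_COST[a] for a in actions)

    # e.g., three cheap actions and a correct final decision:
    print(episode_return(["look", "ask", "lift"], guessed_same=True, truly_same=True))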

        There has been prior research on human-robot dialogue, and prior research on multimodal robot-object exploration. My research is novel because the robot does both at the same time: it physically investigates an object while also talking to a human about it.

        This work, and the publications arising from it, are ongoing. You can learn more about the AIR Lab here.