curious companion: the eavesdropping robot arm

The Robot Camera Eavesdropper project explores the dynamic relationship between humans and robots by personifying a robot arm and embedding it within a live interaction scenario. Rather than presenting the robot as a purely mechanical entity, the project endows it with a "personality" to create the illusion of meaningful interaction. Through this playful engagement, the robot responds to human behaviour in a distinctive and captivating manner. This project was submitted in partial fulfilment of the Human-Robot Collaboration course at OCAD University’s Digital Futures graduate program by Shipra Balasubramani, Nicky Guo, Gavin, and Wentian Zhu. The description and writing are a collaborative effort among all team members, adapted from the primary project documentation.

A Playful Exploration of Human-Robot Interaction

The primary goal of the project was to investigate how robots can be personified to create more engaging and relatable human-robot interactions. It aimed to challenge the perception of robots as purely mechanical and instead explore how robots could exhibit human-like behaviours in everyday scenarios. By designing a robot that reacts to its environment in an inquisitive and humorous manner, the team sought to inspire further research in the field of robot personas and their integration into human spaces.

Concept & Development Process

The project was inspired by the observation that robots often fall into the "uncanny valley" when replicating human behaviour. While science fiction suggests the potential for conscious robots, real-world robots still lack truly human-like characteristics. This project aimed to bridge that gap by creating an interactive scene where a robot responds to human presence without direct conversation or physical contact.

The team conceptualized a scenario where an actor engages in a phone conversation, unknowingly waking the robot arm with the noise. As the actor gestures for the robot to go away, the robot playfully pretends not to eavesdrop until it hears something surprising. This narrative allowed the team to simulate how a robot might behave in a shared environment while incorporating humour and personality.

The development process began with scripting and storyboarding the scene. The team broke the narrative into eight keyframes to guide the shoot and visualized the sequence using MIMIC for Maya. However, manual input was ultimately chosen for the robot's movements to allow greater control over timing and expression. The robot's actions, such as glancing at documents and reacting with surprise, were designed to convey curiosity and mimic human gestures. To achieve this, the team programmed waypoints for the robot's movements, adjusting the speed to match the emotional tone of each action. For example, the robot's "surprised" reaction required rapid movement, while actions like looking at its documents involved slower, more deliberate gestures. The "wait" keyframe function was used to create pauses, enhancing the impression of thoughtful behaviour.
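The project documentation doesn't include the team's actual program, but a minimal URScript sketch of this waypoint-and-pause approach might look like the following. The joint angles, speed values, and the eavesdrop_sequence name are illustrative assumptions; movej(), sleep(), and d2r() are standard URScript calls.

    def eavesdrop_sequence():
      # Joint angles below are hypothetical placeholders, converted to radians with d2r().
      rest_pose = [d2r(0), d2r(-90), d2r(90), d2r(-90), d2r(-90), d2r(0)]
      look_at_docs = [d2r(10), d2r(-70), d2r(80), d2r(-100), d2r(-90), d2r(0)]
      surprised = [d2r(-35), d2r(-100), d2r(60), d2r(-60), d2r(-90), d2r(0)]

      movej(rest_pose, a=0.5, v=0.3)     # settle into the "asleep" pose
      sleep(2.0)                         # "wait" keyframe: a thoughtful pause
      movej(look_at_docs, a=0.4, v=0.2)  # slow speed reads as deliberate
      sleep(1.5)                         # linger on the page, feigning disinterest
      movej(surprised, a=3.0, v=2.0)     # rapid joint speed sells the startle
    end

    eavesdrop_sequence()

Tuning the v (velocity) and a (acceleration) arguments per waypoint is what carries the emotional tone: the same pose can read as idle curiosity at 0.2 rad/s and as alarm at 2.0 rad/s.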

Final Video Shoot

The video shoot was staged to resemble a design studio or office environment, complete with props such as books, a water bottle, a laptop, and a desk lamp. The actor was positioned opposite the robot arm, with a hidden microphone capturing the dialogue. Natural lighting from a nearby window was supplemented by a desk lamp and an additional hidden off-screen lamp to ensure consistent illumination.

The team conducted test runs to synchronize the actor's performance with the robot's programmed movements. This iterative process required adjusting the script and refining the robot's actions to create a seamless interaction. Modifying the actor's delivery proved easier than reprogramming the robot, allowing for greater flexibility during the shoot.

In post-production, additional effects were added to enhance the narrative. These included a phone ringtone to signal the robot's awakening, exaggerated robot arm sounds to emphasize its movements, and a blinking effect to simulate the robot "waking up." Through these techniques, the final video effectively conveyed a playful and engaging human-robot interaction, successfully bringing the robot's persona to life.

year

2023

tools

Universal Robots UR10e + Gripper + iPhone + URScript
