Other Natures

Alternative Intelligence in Robotics

 

Other Natures presents industrial robots that behave more like a flock of swans than automation infrastructure. In this fourth iteration of my Robot Taming series, four ceiling-mounted industrial robot arms use depth sensors integrated into their environment to reason about how a person moves through their territory. Primal, low-level signals like proximity, hand movements, and head orientation provide all the elemental information the robots need to decide how to respond to the person in their presence. They might come in close for an affectionate touch, watch cautiously from afar, or get too feisty and forceful and need to be shooed away to a safer distance.

Video of woman surrounded by industrial robot arms, interacting with hand gestures
Woman booping the nose of an industrial robot arm.

At its core, Other Natures explores primal connections between humans and machines. We think of human nature as an ineffable essence, yet we connect to the outside world through many tangible frequencies. The lowest-level frequency is how the body moves through space: it is the most basic form of communication connecting all living things. By building on top of existing technology, these most universal “human” behaviors can be captured, codified, and rendered into unexpected vessels, like robots.

The four gantry-mounted robots use sensors embedded in the environment to detect the skeletons and gestures of people on the center platform. Photo by Michael Lyrenmann.

Almost Nature

Robots are mirrors. They reflect our own human nature back at us. Our desire for attention, for connection, for acceptance can all be stimulated just by modulating how they move. The movements of the four robots in Other Natures were deeply inspired by the swans begging for bread on the Zürichsee. The way they eagerly approach an outstretched hand, wander away slightly annoyed when you have nothing left, or swarm at the faintest rustle of a paper bag served as guidance for modeling influence, autonomy, and control when interacting with the robots.

These large industrial robots’ behaviors are designed to strike a careful balance between precision and chaos. Without precision, their motions can be dangerous and damaging to themselves, the environment, and the people around them. However, move too precisely and they become rote and illegible. Humans need noise, a dash of chaos, in order to see true animism. With Other Natures I hand-crafted algorithms that blended how we expected the robots to behave with how they wanted to behave. I knew the balance was right when the robots surprised us every once in a while.

Woman feeding four robots suspended upside down from the ceiling

Early tests with all four robots.

Swans looking for food on Lake Zurich

Swans prowling for bread on Lake Zurich.

Implementation Details

Other Natures features four standard ABB IRB4600-40/2.55 industrial robot arms suspended upside down from a 50-tonne Güdel gantry robot system. This entire 34 degree-of-freedom system is housed in the Robotic Fabrication Laboratory at ETH Zurich. Normally, the robots are used to fabricate complex structures and to conduct advanced research in robotic construction. Here, however, we implemented dynamic sensing, control, and communication software to breathe life into these machines, giving them a brief holiday from their daily grind.

My body tracking solution is based on an Azure Kinect depth camera, which provides 3D skeletal data at 30 FPS and can be reliably localized to the robots’ coordinate system. We fixed the sensor near the ground between two of the robots. The sensor’s narrow field-of-view (NFOV) depth mode provided a reliable tracking area of 5m x 2m at 4m away. Additionally, the precise placement and orientation meant the robot arms would not occlude the desired tracking area as they moved around.
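To make this concrete, here is a minimal sketch of what a tracking loop like this looks like with the Azure Kinect Sensor SDK (k4a) and Body Tracking SDK (k4abt). It is illustrative rather than the project’s production code: error handling and the calibrated sensor-to-robot transform are omitted.

```cpp
// Minimal Azure Kinect body-tracking loop (sketch; error handling trimmed).
#include <k4a/k4a.h>
#include <k4abt.h>

int main() {
    k4a_device_t device = nullptr;
    k4a_device_open(K4A_DEVICE_DEFAULT, &device);

    // NFOV depth at 30 FPS, matching the tracking volume described above.
    k4a_device_configuration_t config = K4A_DEVICE_CONFIG_INIT_DISABLE_ALL;
    config.depth_mode = K4A_DEPTH_MODE_NFOV_UNBINNED;
    config.camera_fps = K4A_FRAMES_PER_SECOND_30;
    k4a_device_start_cameras(device, &config);

    k4a_calibration_t calibration;
    k4a_device_get_calibration(device, config.depth_mode,
                               config.color_resolution, &calibration);

    k4abt_tracker_t tracker = nullptr;
    k4abt_tracker_create(&calibration, K4ABT_TRACKER_CONFIG_DEFAULT, &tracker);

    while (true) {
        k4a_capture_t capture = nullptr;
        if (k4a_device_get_capture(device, &capture, K4A_WAIT_INFINITE)
            != K4A_WAIT_RESULT_SUCCEEDED)
            continue;
        k4abt_tracker_enqueue_capture(tracker, capture, K4A_WAIT_INFINITE);
        k4a_capture_release(capture);

        k4abt_frame_t frame = nullptr;
        if (k4abt_tracker_pop_result(tracker, &frame, 0)
            == K4A_WAIT_RESULT_SUCCEEDED) {
            for (uint32_t i = 0; i < k4abt_frame_get_num_bodies(frame); ++i) {
                k4abt_skeleton_t skeleton;
                k4abt_frame_get_body_skeleton(frame, i, &skeleton);
                // Joints arrive in the sensor frame, in millimeters. A
                // calibrated sensor-to-robot transform (not shown) maps
                // them into the robots' shared world frame.
                k4a_float3_t head = skeleton.joints[K4ABT_JOINT_HEAD].position;
                (void)head; // feeds the look-at / follow targets below
            }
            k4abt_frame_release(frame);
        }
    }
}
```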

I use follow and look-at targets to directly control where the robots move.

Connecting these targets to Body Tracking enables the robots to dynamically see a person in their territory. They can then decide how to move in response to what they see.
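As a simplified sketch of that connection, with names that are purely illustrative: the tracked head can drive where a robot looks, and an outstretched hand can drive what it follows, with a resting pose as the fallback when no one is on the platform.

```cpp
// Wiring body tracking into the targets (illustrative names only).
struct Vec3 { float x, y, z; };
struct Targets { Vec3 lookAt; Vec3 follow; };

// headPos/handPos are skeleton joints already transformed into the robots'
// world frame; restingPose is used when no person is tracked.
Targets updateTargets(bool personTracked, Vec3 headPos, Vec3 handPos,
                      Vec3 restingPose) {
    if (!personTracked) return {restingPose, restingPose};
    return {headPos, handPos};
}

int main() {
    Targets t = updateTargets(true, {0.4f, 1.6f, 1.2f}, {0.7f, 1.1f, 1.4f},
                              {0.0f, 0.0f, 2.0f});
    (void)t; // t.lookAt now tracks the head; t.follow tracks the hand
}
```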

Motion Behaviors

The robots in Other Natures do not use traditional path planning to decide how to move — their motions are not explicitly planned and simulated beforehand. Instead, their motions are generated using core principles of parametric design. Here is how I break down the complex control system into easy-to-manipulate parameters:

To begin, each robot moves and orients itself in relation to the vector from the base of the robot to the look-at target (the magenta line connecting to the blue sphere). A lerp parameter lets the robot interpolate its target position (the orange sphere) to approach or retreat from the look-at target (the blue sphere). A base offset parameter raises or lowers the origin of the vector from the base of the robot to the look-at target; this lets the robot approach the look-at target from above or below.
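A rough sketch of those two parameters, assuming a z-up convention; the names and the small Vec3 type are illustrative, not the project’s code:

```cpp
// Parametric target: interpolate along the vector from the (offset) robot
// base to the look-at point.
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
    Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
};

// lerp = 0 keeps the target at the robot's (offset) base; lerp = 1 pushes it
// all the way to the look-at point. baseOffset raises or lowers the origin
// of the vector, so the arm approaches from above or below.
Vec3 targetPosition(Vec3 robotBase, Vec3 lookAt, float lerp, float baseOffset) {
    Vec3 origin = robotBase + Vec3{0.0f, 0.0f, baseOffset}; // z-up assumed
    return origin + (lookAt - origin) * lerp;
}

int main() {
    Vec3 t = targetPosition({0, 0, 3.0f}, {1.0f, 0.5f, 1.5f}, 0.6f, -0.5f);
    std::printf("target: %.2f %.2f %.2f\n", t.x, t.y, t.z);
}
```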

I can then add a second order of motion behaviors on top of this foundation of parametric movements. Here I have the robots seek their target position (the orange sphere) using a PD controller. Past projects have used other agent-based motion algorithms, such as boid simulations, to create reactive, life-like motion. However, one nice aspect of using a PD controller is that you can render a dynamic range of motion qualities, like “springiness”, “tiredness”, “excitability”, and “reaction speed”, just by tuning its two gains.
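Here is a minimal, single-axis sketch of that idea; kp and kd are the two gains, and all names and values are illustrative rather than the project’s tuning.

```cpp
// Single-axis PD seek toward a target (semi-implicit Euler integration).
#include <cstdio>

struct PD {
    float kp; // stiffness: higher feels more eager / springy
    float kd; // damping:   higher feels more tired / sluggish
};

// One control step: pos and vel are updated in place; dt is in seconds.
void seek(float target, float& pos, float& vel, const PD& g, float dt) {
    float error = target - pos;
    float accel = g.kp * error - g.kd * vel; // PD control law
    vel += accel * dt;
    pos += vel * dt;
}

int main() {
    float pos = 0.0f, vel = 0.0f;
    PD excitable{40.0f, 4.0f}; // underdamped: overshoots, feels lively
    for (int i = 0; i < 120; ++i)
        seek(1.0f, pos, vel, excitable, 1.0f / 60.0f);
    std::printf("pos after 2s: %.3f\n", pos);
}
```

Underdamped gains overshoot and feel eager; heavily damped gains lag behind the target and feel tired or reluctant.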

Simulation Pipeline

With only one week of exclusive access to the robots, accurate simulation was a key, safety-critical component of the development pipeline. NVIDIA’s Omniverse Isaac Sim was the main development environment, providing a physically accurate simulation environment, obstacle avoidance, and IK solutions. ABB’s RobotStudio was used for hardware emulation to verify our motion and communication systems in advance of working with the physical robots. openFrameworks, a C++-based open source arts-engineering coding toolkit, was used to stitch the various software components together within the limited development and testing window.
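The exact transport between these tools is not detailed here, but as one illustration of the kind of glue code involved, an openFrameworks app with the ofxOsc addon can stream per-robot targets to another process. The endpoint, address pattern, and message layout below are my own assumptions for illustration.

```cpp
// Illustrative openFrameworks glue: streaming a look-at target over OSC.
// (Assumes a standard openFrameworks project with the ofxOsc addon;
// main()/ofRunApp scaffolding is omitted.)
#include "ofMain.h"
#include "ofxOsc.h"
#include <cmath>

class ofApp : public ofBaseApp {
public:
    ofxOscSender sender;

    void setup() override {
        // Hypothetical endpoint for the robot-control process.
        sender.setup("192.168.0.10", 9000);
    }

    void update() override {
        // In the real system this point comes from body tracking;
        // here we just sweep it for illustration.
        glm::vec3 lookAt(std::sin(ofGetElapsedTimef()), 0.5f, 1.5f);

        ofxOscMessage m;
        m.setAddress("/robot/0/lookat"); // hypothetical address pattern
        m.addFloatArg(lookAt.x);
        m.addFloatArg(lookAt.y);
        m.addFloatArg(lookAt.z);
        sender.sendMessage(m, false);
    }
};
```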

Clapping, waving, and hand gestures influence the robots’ behaviors.

Screenshot of Isaac Sim with simulation environment.

The simulation environment handles body tracking, motion behaviors and generation, and communication to the physical robots.

Physical Design

Other Natures was performed in a highly controlled, highly calibrated environment to give a sense of heightened risk while providing layers of safety. A 2m x 8m elevated platform provides the interaction zone: a “safe space” that the robots cannot physically reach. When a person is standing on the platform, the robots can see them but not touch them. You can reach out your hand, and a robot will come to try to touch you; however, it quickly retracts once it has come close to physical contact.
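One simple way to implement that approach-then-retract rule, offered as a sketch rather than the project’s actual method, is to clamp the commanded target so it never enters a keep-out radius around the person:

```cpp
// Keep the commanded target outside a keep-out radius around the person,
// so a robot can come close but never make contact. Names are illustrative.
#include <cmath>

struct Vec3 { float x, y, z; };

// Returns a target no closer than minDistance to the person.
Vec3 clampApproach(Vec3 target, Vec3 person, float minDistance) {
    Vec3 d{target.x - person.x, target.y - person.y, target.z - person.z};
    float dist = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (dist >= minDistance || dist == 0.0f) return target;
    float s = minDistance / dist; // push target back to the keep-out boundary
    return {person.x + d.x * s, person.y + d.y * s, person.z + d.z * s};
}
```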

Other Natures was performed in front of a 360° audience. The robots were placed symmetrically around a minimalist stage to provide interesting views for the surrounding audience.


Project Credits

Other Natures was generously supported by the NCCR Digital Fabrication Researcher in Residence program and Gramazio Kohler Research.

Development Team: Madeline Gannon

Special Thanks to the team at ETH RFL who helped bring the robots to life on such a short timeline:

  • Russell Loveridge, Fabio Gramazio, Philippe Fleischmann, Michael Lyrenmann, Lea Keller, Alessandra Gabaglio, Erika Marthins

And to the team at NVIDIA who helped push these robots beyond their limits:

  • Mike Skolones, Hammad Mazhar, Bryan Peele, Avi Rudich, Buck Babish

Sponsors: NVIDIA and NCCR Digital Fabrication

