Analysis: Understanding how new domestic machines perceive the world can teach us a lot about how to design and interact with them
In recent years, more sophisticated machines have entered our everyday lives to carry out a variety of roles in industrial, domestic and social settings. As a result, machines have become agents within an increasingly complex and entangled network. From Alexa passively listening for 'wake words' in our living rooms, to the many Roombas under constant surveillance from (rightfully) suspicious cats, to the obedient and sometimes helpless lawnmowers that endlessly traverse our gardens, we all share the same environment.
But do they experience the world in the same way we do? Does a robotic lawnmower care about grass? Can a Roomba identify or 'feel' when something is dirty? The answer, of course, is no - but understanding how machines perceive the world can teach us a lot about how to design and interact with them.
From Waggle TV, all the cats on Roombas videos you'd ever want to see
Robots and humans have very different ways of sensing and interacting with the world. Robots are limited by their sensors and programming, which means their "experience" of the environment is nothing like ours. As a result, we cannot expect them to share our perspective on our common surroundings. The tight coupling between a machine's material body and its ability to experience the world means that the way it senses the world is vastly different from the way we do as humans.
So, how does the iconic autonomous hoover "experience" the world? And more importantly, in the case of the Roomba, what is "dirt" to the machine? Is it the certain hue of decaying food particles under the dining table? Or is it the suffocating smell of dust as it rolls into a new room?
For the Roomba, it is none of the above, as it doesn't see a chair, a door or dirt in the way we do. Instead, the Roomba relies on a combination of infrared sensors, acoustic signals and basic cameras to navigate and avoid obstacles while cleaning up the mess you have left behind. When it encounters something along its path, it doesn't "know" what it is - it just knows it can't move forward.
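That "blocked or not blocked" view of the world can be sketched in a few lines of code. This is a purely hypothetical simplification, not iRobot's actual firmware: the point is that the robot never classifies what it hits, it only reacts to the fact that it cannot move forward.

```python
import random

def navigate_step(bumper_pressed: bool, heading: float) -> float:
    """Return a new heading in degrees. The obstacle's identity is
    irrelevant; only the blocked/clear signal matters to the machine."""
    if bumper_pressed:
        # Turn away by a random amount, a simple bump-and-turn strategy:
        # the robot "knows" only that the path ahead is blocked.
        return (heading + random.uniform(90, 270)) % 360
    return heading  # path is clear: keep going
```

Whether the bumper was pressed by a chair leg, a sleeping cat or a wall makes no difference to this loop - and that indifference is exactly what distinguishes the machine's world from ours.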
From RTÉ Radio 1's The Business, how likely are we to see robots like the Tesla Bot in our daily lives - and will they pick up the dirty socks?
Similarly, a Roomba doesn't 'see' dirt the way we do. Some models detect dirt by listening to the sound of particles on the floor, using high-frequency sound waves to decide when to clean. This is a far cry from how we perceive messiness or dirt.
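Reduced to a sketch, the machine's entire concept of "dirt" might look something like the following. The names and threshold here are illustrative assumptions, not a real product's code: "dirt" is simply any acoustic reading above a tuned cut-off, nothing like a human judgement of messiness.

```python
DIRT_THRESHOLD = 0.6  # assumed value, tuned per machine

def is_dirty(impact_energy: float) -> bool:
    """The machine's whole notion of 'dirt': one number vs a threshold."""
    return impact_energy > DIRT_THRESHOLD

def clean_pass(readings: list[float]) -> list[bool]:
    # Flag the patches of floor to re-clean wherever the energy spikes.
    return [is_dirty(e) for e in readings]
```

A reading of 0.9 is "dirt" and a reading of 0.5 is not; the decaying food under the dining table only matters insofar as it makes noise against the sensor.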
This raises important questions about how machines "experience" the world. To explore this, we can borrow ideas from biology, specifically the concept of the sensory world, or Umwelt. This idea, originally developed by the biologist Jakob von Uexküll, suggests that every living being experiences the world differently based on its unique senses and abilities.
For example, a sea urchin doesn't see or hear in the same way we do. It is highly sensitive to light and shadows, which it uses to detect threats. To a sea urchin, a shadow cast by a boat is no different from a predator looming above. Its sensory world is shaped entirely by its need to survive, not by the complex categories and meanings humans assign to things.
From RTÉ Radio 1's Today with Claire Byrne, the rise of robots: how automation is changing the way we live and work
If we were to apply this idea to a Roomba, the autonomous hoover has its own sensory world which is shaped by its sensors and programming. It doesn't care about the objects in your home, but only cares about navigating around them, while listening for small changes in sound to scoop up any dirt along its path.
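As a toy illustration of the sensory-world idea (again, entirely hypothetical), the same environment yields different "readings" depending on which senses an agent brings to it:

```python
environment = {"object": "chair", "particle_noise": 0.8, "colour": "red"}

def human_percept(env: dict) -> dict:
    # Humans read category and meaning from the scene.
    return {"what": env["object"], "colour": env["colour"]}

def roomba_percept(env: dict) -> dict:
    # The Roomba reads only what its sensors expose: blocked or not,
    # and how much impact noise the floor produces.
    return {"blocked": True, "impact_energy": env["particle_noise"]}
```

To the human, the scene contains a red chair; to the Roomba, it contains an obstruction and a noisy patch of floor. Neither description is wrong - they are simply different sensory worlds.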
This is a radically different way of experiencing the world compared to humans. From here, we can begin to rethink how we design and interact with machines. Instead of trying to make robots see and think like humans, we should adopt a more-than-human approach, focusing on what makes them unique: their ability to sense and act in ways that we as humans can't.
Instead of trying to make robots see and think like humans, we should focus on their ability to sense and act in ways that we as humans can't.
This shift in perspective is crucial as we design more autonomous machines that share our physical spaces. If we wish to harmoniously co-exist with robots that have a higher degree of autonomy, we need to develop a deeper understanding of their needs and capabilities. Just as ethnography pushes us to understand the needs of people, we could possibly create a framework - a type of "machine ethnography" - that helps us design machines in a way that respects their unique ways of being in the world.
By adopting this more-than-human perspective, designers can develop empathy for the things they design and for how those things exist in the world. Instead of forcing human-like characteristics onto robots, we can design them in ways that take full advantage of their unique abilities. This approach would emphasise stewardship over ownership, helping us create machines that work well for us while honouring their distinct ways of being.
Follow RTÉ Brainstorm on WhatsApp and Instagram for more stories and updates
The views expressed here are those of the author and do not represent or reflect the views of RTÉ