The Open University’s Animal-Computer Interaction (ACI) lab organized a workshop on the 12th and 13th of April 2013 on Co-Designing with Dogs. This workshop is part of the More-Than-Human Participatory research project, led by the University of Edinburgh. The project aims to “explore how a broader account of community – one that recognises the active participation of non-humans – might challenge understandings of how research can be co-designed and co-produced”.
The objective of ACI research is described on the ACI Blog by Clara Mancini, Research Fellow and head of the Animal-Computer Interaction Lab at The Open University:
“One of the aims – perhaps the most important aim – of Animal-Computer Interaction as a research discipline is to develop a user-centred approach to the design of technology intended for animals. Not only does this mean developing technology which is informed by the best available knowledge of animals’ needs and preferences. Crucially it also means involving animal users in the development process as legitimate stakeholders, design contributors and research participants”.
This workshop focused on assistance dog training and specifically aimed to explore how animals can contribute to ACI research and interaction design processes. The ACI lab is planning to design “a series of plug-on, dog-friendly computing interfaces for various domestic appliances to support assistance dogs in their tasks, thus improving their welfare and professional life”. This includes, for example, dog-friendly interfaces for washing machines, light switches, and door handles: devices that are currently not informed by the perspective of the dog.
The video that summarizes this workshop can be watched below. The ACI lab follows a meaningful approach that, apart from the play element, is very much in line with the research I am pursuing. I am convinced that it will yield a valuable understanding of the needs and preferences of assistance dogs. Furthermore, the other workshops of the More-Than-Human Participatory research project could provide new examples of participatory design with animals as the intended users.
Unlike many other dogs, my dogs are not very interested in watching TV. However, with the development of new technologies this might change in the future.
According to Ernst Otto Ropstad, an associate professor at the Norwegian School of Veterinary Science, with the development of newer TVs with a higher resolution and more frames per second, dogs might be able to actually perceive TV as film instead of a set of flickering images.
Where we as humans need about 16 to 20 frames per second to perceive images as a moving film, according to this article, dogs need about 70 frames per second. Still, dogs perceive the content in a different way than humans, because dogs see different colours. They perceive colours with only two cones (retinal receptors) where humans have three. In his book, The Truth About Dogs, Stephen Budiansky shows the image below, visualising how dogs perceive colours based on research by Neitz, Geist, and Jacobs. The left side represents the image as humans would observe it and the right side shows the perception of dogs. Due to this difference in visual capabilities, dogs also generally see less detail than humans.
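The two-cone (dichromatic) vision described above is often compared to red-green colour blindness in humans: red and green are hard to tell apart, while blue remains distinct. A minimal Python sketch can illustrate the idea; note that this is my own rough approximation (simply merging the red and green channels of an RGB pixel), not the actual model used by Neitz, Geist, and Jacobs.

```python
# Rough sketch of dichromatic (dog-like) colour perception.
# Assumption: collapsing the red and green channels into one shared
# "yellowish" value approximates the loss of red-green discrimination.
# Pixels are (R, G, B) tuples with values 0-255.

def simulate_dog_vision(pixel):
    """Map an (R, G, B) pixel to a rough dichromatic equivalent."""
    r, g, b = pixel
    # Red and green collapse into a single channel; blue is kept as-is.
    yellowish = (r + g) // 2
    return (yellowish, yellowish, b)

# A saturated red and a saturated green map to the same muted yellow,
# which is one way to picture why a red toy on green grass offers a dog
# very little contrast.
print(simulate_dog_vision((255, 0, 0)))  # (127, 127, 0)
print(simulate_dog_vision((0, 255, 0)))  # (127, 127, 0)
print(simulate_dog_vision((0, 0, 255)))  # (0, 0, 255)
```

Applied to every pixel of an image, a transformation like this produces the kind of side-by-side comparison shown in the figure.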
According to Ropstad, not all dogs can see equally well or show as much interest in watching TV as others. There might be differences between breeds and/or individual differences that have not been researched yet.
TV-watching dogs have already been recognized by dog TV channel startups that create content specifically for dogs. Besides some ear movements, my dogs were not too enthusiastic about the videos shown, for example, on the website of DOGTV, but perhaps a better frame rate, bigger screens, and less lazy dogs might give more interesting results.
Over the last few months several mobile/tablet applications intended for animal use have caught not only my attention but that of many others, as indicated by the number of YouTube views and other buzz created by the videos mentioned in this post:
This iPad game for cats created by Hiccup is one of the first applications intended for non-human use that got the attention of a large audience. In this very simple game the cat can chase either a digital representation of a mouse or a laser light. By tapping the object with its paws (or other body parts) the cat scores points. So far all the cats and kittens interacting with this application in my presence showed at least an interest in the moving object on the screen, most of them also started tapping the screen, and the kittens especially got quite hooked on the chase after a while.
While this game, called Ant Smasher, is actually designed for human beings, the over 6.2 million viewers of this video could see how this bearded dragon interacted with the game. Other videos confirm that certain reptiles have an interest in screen interaction through this application or similar ones.
Besides these examples, there are apps functioning as tools for humans, such as dog whistles, GPS trackers, or training apps, that focus solely on human needs and preferences, without the animal being aware of the (digital) interaction. Although these tools might come in handy for pet owners, they do not provide (human-)animal interaction or stimulate interaction through play and are therefore not part of the research I am focusing on.
The mobile applications shown in this post are commercially successful examples of how apps could facilitate ‘something’ for non-human species. However, they do not provide us with a better understanding of the animal or its physical and mental needs. A lot of questions remain: is the animal actually playing? If yes, how is this form of play stimulated? How does the animal recognize represented digital objects? What does this interaction mean to the animal? Is the animal enjoying the interaction? What is actual enjoyment for an animal? Why does the animal play? How could human beings take part in the interaction?
I am convinced that further research into digitally mediated (human-)animal interaction can help us find answers.