April 21, 2015, by Emma Thorne

Tomorrow’s technologies, today’s research

Whether it’s the Hello Barbie, which can answer kids’ questions using voice recognition software, the Apple Watch offering wrist-mounted app access or the Amazon Dash Button that can re-order a host of household products with a single click, it appears there’s no end to our appetite for the latest computer gadgetry.

But with devices that sound like science fiction becoming a daily reality, what will the technologies of the future look like, and how will they continue to transform the way we work, rest and play?

Computer scientists from The University of Nottingham are among thousands of delegates attending the world’s leading conference on human factors in computing systems this week, which presents a showcase of innovations related to how people interact with digital technologies.

The researchers from the Mixed Reality Lab (MRL) in the School of Computer Science are at the Association for Computing Machinery’s CHI conference in Seoul, Korea, which started on Saturday April 18 and runs until Thursday April 23.

The conference is featuring a dozen academic papers from Nottingham on projects representing the breadth and diversity of research undertaken at the MRL, which explores the potential of ubiquitous, mobile and interactive technologies to shape everyday life.

Among the researchers attending the conference is Dr Stuart Reeves, EPSRC Senior Research Fellow, who is presenting the results of a study into a mixed-reality game devised by performance and interactive art group Blast Theory. I’d Hide You is an online game of stealth, cunning and adventure played out by a team of illuminated runners live from the streets as they roam the city trying to film one another.

Dr Reeves’ study looked at how the runners trained for and produced compelling live video streams from the street while engaging with online players. Runners needed to learn how to act as good camera operators while also performing on-screen for the online players watching their streams, juggling the logistics of interacting with online audiences as well as with people on the streets. The findings from the work will help designers develop technologies that better support live video streaming.

Dr Reeves said: “With the recent surge in popularity of live video streaming apps like Periscope, I’d Hide You provided us with a unique opportunity to explore the possibilities opened up for users of live streaming apps, along with the challenges they may face in producing broadcasts. We think our findings can inform designers of live streaming services to help build better systems that support the user experience in new ways.”

From street performance to road running — delegates will also hear about RunSpotRun: Capturing and Tagging Footage of a Race by Crowds of Spectators. The project, led by Dr Martin Flintham, saw a team of researchers deploy a mobile phone app at the Robin Hood Marathon that allowed spectators lining the route to crowd-source and tag video footage of the runners. Just 17 spectators recorded 412 videos, capturing 11 hours 29 minutes of footage of the marathon and tagging 1,800 of the runners in the process. The initiative allowed the researchers to tell the personal stories of individual runners, and in future they plan to extend the application to motivate spectators to capture more of the action using social media or to support runners’ charitable giving.

Other Nottingham research projects being showcased include:

  • Building a Bird’s Eye View: Collaborative Work in Disaster Response led by Dr Joel Fischer: how disaster response teams deploy a diverse range of technologies in their work and the design of new technologies to help manage uncertainty in emergency scenarios
  • The Data Driven Lives of Wargaming Miniatures led by Dimitrios P Darzentas: how technology could affect the way in which hobbyists create, use and record the provenance of Wargaming miniatures — and the implications for how those miniatures are treated and valued, both sentimentally and financially
  • ArtMaps: Interpreting the Spatial Footprints of Artworks led by Dr Tim Coughlan: a crowd-sourcing platform developed by researchers at the University’s Horizon Digital Economy Research institute in collaboration with Tate and the University of Exeter. The platform allows the public to interact with art by geotagging famous works and visiting the locations associated with them.