February 7, 2020, by Hazel Sayers

Smart Products Beacon update

Demonstrator Projects

The RoboClean project held its first meeting of 2020 to review activity. Carolina Fuentes has been continuing development work on the unified platform, which allows easy integration of the modules developed for the project (multi-agent coordination, VUIs/Alexa Skills, and visualisation) within the allergen-aware human-robot collaboration architecture. The team recruited Janey Slinger, an intern who worked over the summer months, to build a custom Alexa skill to control Neato vacuum cleaners by voice. Oliver Fisher worked with Nicholas Watson in Engineering and the industrial partner Intelligent Plant to create an app for visualising the robot cleaning data. They are now working towards a demonstrator providing an overview of the RoboClean project, to be exhibited later this year. In addition, the team published their first paper, ‘The Effect of Light Intensity, Sensor Height, and Spectral Pre-Processing Methods When Using NIR Spectroscopy to Identify Different Allergen-Containing Powdered Foods’, in Sensors 2020, 20(1), 230, with a second in preparation, and also produced a feature article in the Journal of the Institute of Food Science and Technology, ‘Sensors support machine learning’, Dec 2019, (33)4, 20–24.

The FD2 project held a Smart Food Collective workshop towards the end of 2019, with the aim of building a community of University of Nottingham academics in the broad area of ‘smart foods’ and identifying key external partners. Work to map relevant capability in Nottingham is ongoing, along with the development of research challenges and ideas to respond to funding opportunities. The core Nottingham team has been instrumental in building a consortium bidding for a UKRI Circular Economy Centre focusing on food as a resource flow in the circular economy. The project team is presently exploring opportunities for knowledge transfer activities with industrial partners around digital and mixed-reality demonstrators. Collaborating chefs Blanch and Shock have systematically prototyped and developed 12 non-soy miso recipes. The next phase of this work is to prototype a digital interface to enable customised meal preparation. At the other end of the process, the project team is looking at how IoT devices can lift the lid on how biomass waste (food waste) is generated in the home. A food-themed day was held with the newest cohort of Horizon CDT students, along with a visit to the UoN dairy and brewery at Sutton Bonington, giving an opportunity to test the food ideation cards. The cards are now being printed, and plans to facilitate further food ideation workshops with industrial partners are underway.

The I-CUBE project has been working to develop two demonstrators: the first an online 3D representation of the robot arm, the second a Wizard of Oz version. The I-CUBE and RoboClean teams met in January to discuss their experiments. The categories used by RoboClean to classify people’s instructions to the robot were developed from previous work by Gilliani et al. (2015) and Wobbrock et al. (2009). The categories detail a person’s gestures and poses, as well as referential categories for the objects referred to and a contextual category for parameters used, such as distance, direction, time, room-centric and object-centric. The I-CUBE team is now analysing audio-video recordings from an earlier human-human study. The variables and categories that emerge from this analysis will be compared with those from the RoboClean project’s continued analysis. The two teams will meet again to reappraise the categories developed by each project and consolidate the meanings of their categories and variables.

The SOLARIS project held a workshop in November to frame the problem of ‘workflow-integrated design optimisation’ and to identify an application and a design aspect for integration into the targeted framework. The initial steps in the project thus developed a new reductive design approach and application that is amenable to inclusion in the joined-up computational optimisation tool. The avenue currently explored by the project team is an innovative design for customisable hand prosthetics based on a modularity scheme. As an additional application, the project is considering personalised gifts.


We were delighted to receive feedback from The Broadway about the keynote talk given by Sarah Brin of Meow Wolf back in December. 61% of the attending audience rated the whole experience as ‘very good’, with comments including ‘fantastic’, ‘more things like this – really enjoyable’ and ‘quality talk, very engaging’.

We were delighted to sponsor Half Way to the Future, a symposium which explored the past, present and future of human-computer interaction and design-based research. 180 people from 57 organisations (including IBM and Microsoft Research), spanning 13 countries, attended the conference, which took place at the Albert Hall Conference Centre in November.

Red Earth’s show Soonchild commenced its UK tour at Lakeside Arts in November, supported by funding from the Smart Products Beacon, the AHRC and the Arts Council. To date the show – which develops ways to deliver accessibility right into the heart of a performance – has been delivered at 13 venues across the UK, engaging over 1,885 people.


Nik Watson, Assistant Professor in the Faculty of Engineering, discussed whether online sensors and machine learning could deliver Industry 4.0 to the food and drink manufacturing sector in the Journal of the Institute of Food Science and Technology, vol. 33, issue 4, December 2019. Link to article (page 20)

Following a sandpit workshop in January, attended by colleagues from Engineering, Computer Science, Medicine, Maths, and Humanities, we have now invited three new demonstrator projects to join our Beacon.
