
RESEARCH

OUTCOME MEASURES

We use child-worn cameras, microphones, movement sensors and physiological sensors to track children’s behaviours within their environments in real time. As well as using this technology in our own research, we also offer these tools to industry partners and organisations for use in consumer and intervention research.

Traditional approaches to consumer and intervention research rely on pre-post measurements to test how participants are affected before vs after an intervention. Using child-worn wearables and neuroimaging during the intervention itself captures immediate, moment-by-moment effects on children’s engagement, learning, social interaction, group dynamics and emotion regulation, making it a much more sensitive method for detecting fine-grained effects.


MOOD & REGULATION

  • Emotion regulation: using machine learning classifiers applied to microphone and/or heart rate data we can automatically identify all of the naturally occurring moments of child distress during the day. Then, we can examine how quickly children recover following each episode. In this way we can measure how a child’s emotion regulation capacity is affected by specific features of their environment.

  • Positive mood: using machine learning analyses applied to video or microphone data we can measure the emotional valence of child speech (i.e. how happy or sad they sound).

  • Quality of emotional expression: by applying automatic transcription and semantic topic modelling to the microphone data we can track the frequency of emotional content in children’s naturally occurring speech. 
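As a minimal sketch of the first measure above, the code below assumes a classifier has already produced a per-sample distress probability from microphone and/or heart-rate features (the classifier itself, the sampling rate and the 0.5 threshold are illustrative assumptions, not our actual pipeline). It finds each contiguous distress episode and reports how long the child took to recover from it:

```python
import numpy as np

def recovery_times(distress: np.ndarray, fs: float, threshold: float = 0.5) -> list[float]:
    """Given per-sample distress probabilities (hypothetical classifier
    output), return the duration in seconds of each distress episode:
    time from episode onset until the signal first falls back below
    threshold."""
    above = distress >= threshold
    times = []
    start = None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i                       # episode onset
        elif not flag and start is not None:
            times.append((i - start) / fs)  # child has recovered
            start = None
    if start is not None:                   # episode still ongoing at end
        times.append((len(above) - start) / fs)
    return times

# Toy example: 1 Hz samples containing two distress episodes
probs = np.array([0.1, 0.7, 0.8, 0.6, 0.2, 0.1, 0.9, 0.7, 0.3])
print(recovery_times(probs, fs=1.0))  # → [3.0, 2.0]
```

Comparing these recovery durations across settings is then a straightforward way to ask how a particular feature of the environment relates to emotion regulation capacity.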


SOCIAL DEVELOPMENT

Figure: a social network graph with named nodes connected by lines, showing clustered individuals alongside isolated ones.

LANGUAGE

  • Language production: using machine learning analyses and speech transcription on the child-worn microphone data we can track how the frequency and complexity of children’s spontaneous vocalisations varies between different settings.
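A minimal sketch of this kind of measure, assuming automatic transcripts are already available as plain strings (the transcripts, and the choice of summary statistics, are illustrative): it counts utterances, computes mean length of utterance (MLU, in words) and vocabulary size, so the same summary can be compared across settings:

```python
from collections import Counter

def utterance_stats(transcripts: list[str]) -> dict:
    """Summarise spontaneous vocalisations from (hypothetical)
    automatic transcripts: utterance count, mean length of
    utterance in words, and vocabulary size."""
    words_per_utt = [len(u.split()) for u in transcripts if u.strip()]
    vocab = Counter(w.lower() for u in transcripts for w in u.split())
    return {
        "n_utterances": len(words_per_utt),
        "mlu_words": sum(words_per_utt) / max(len(words_per_utt), 1),
        "vocab_size": len(vocab),
    }

# Toy transcripts from one setting
home = ["more juice", "doggy", "where ball go"]
print(utterance_stats(home))
# → {'n_utterances': 3, 'mlu_words': 2.0, 'vocab_size': 6}
```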

Figure: time-series of child arousal, parent arousal, child attention, proximity to parent, parent vocalisations and child affect, with example images at specific timestamps.
Figure: infant and adult arousal (z-scored) over time alongside vocal affect and GPS distance from home, annotated with events such as ‘Home’ and ‘Asleep’ and photos of the environment at specific times.

ATTENTION & ENGAGEMENT

  • Visual objects and scenes: using machine learning applied to the head camera data we can track how, when and how often particular objects appear in the child’s visual field.

  • Attention: we can calculate the durations of attention episodes - i.e. how many seconds children remain engaged with an object or task once they have started it.

  • Types of interactions/actions: what types of interaction does the child have with objects in their environment - is it physical play (and if so what type), imaginative play, social play or solo play?

  • Attention - brain measures: using neuroimaging we can track the quality of children’s attention (i.e. are they ‘just looking’ at someone while day-dreaming, or really engaging) via neural markers of attentional engagement.
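The first two bullets can be combined in a small sketch. It assumes an object detector has already labelled each head-camera frame with the object the child is looking at (the labels, frame rate and the ‘none’ convention are illustrative assumptions); from that stream it recovers the duration of each contiguous attention episode per object:

```python
from itertools import groupby

def attention_episodes(labels: list[str], fps: float) -> dict[str, list[float]]:
    """From per-frame target labels (hypothetical detector output on
    head-camera video; 'none' = not engaged), return the duration in
    seconds of each contiguous attention episode, grouped by target."""
    episodes: dict[str, list[float]] = {}
    for target, run in groupby(labels):
        n_frames = sum(1 for _ in run)
        if target != "none":
            episodes.setdefault(target, []).append(n_frames / fps)
    return episodes

# Toy example at 2 frames per second
frames = ["ball", "ball", "ball", "none", "book", "book", "ball", "ball"]
print(attention_episodes(frames, fps=2.0))
# → {'ball': [1.5, 1.0], 'book': [1.0]}
```

Episode lists like these give both how often an object appears in the child’s visual field and how many seconds engagement lasts once it has started.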

Figure: a parent and baby playing with coloured blocks at a desk, with accompanying statistical and neuroimaging plots below.

PHYSICAL DEVELOPMENT

Figure: a researcher kneels beside a baby lying on a play mat with toys nearby; an accompanying chart shows movement and posture data labelled by body part and activity (prone, crawl, sitting, standing).
