
RESEARCH


MEASURING ENVIRONMENTS

At ISEY, we use cutting-edge wearable technology and AI tools to capture real-time details of children's surroundings — where they are, what they see, and what they hear. This includes both their physical spaces (home, school, outdoors) and their social interactions (parents, caregivers, strangers).

By combining body-worn cameras, microphones, movement sensors, and other environmental trackers (GPS and proximity sensors), we aim to capture a moment-by-moment picture of a child's world.

We use this technology in our own research and also collaborate with educators, industries, and intervention experts to help design better learning environments for children.

Click here to see how we track children’s behaviours within their environment.

Learn more about what we measure:

+ Physical Environment

+ Social Environment

+ Digital Exposure


PHYSICAL ENVIRONMENT

We capture the physical environments children spend time in — whether it’s home, school, or outdoors — using fixed cameras (for indoor settings), wearable head cameras, microphones, GPS trackers, and proximity sensors. AI and machine learning models can analyse the visual data from cameras to understand the environment, while GPS and proximity sensors can track the child’s location and movement within those spaces.

From cameras we can track:

  • The type of environment: rural or urban, indoor or outdoor, classroom or playground.

  • Visual objects: we can identify which types of objects are present in the child’s visual field, and search for moments when specific objects appear.

  • Visual complexity: how much movement is present, and how many edges and colours appear, within the child’s field of view.

  • Lighting and visibility: how clearly visible and identifiable specific features of the environment are.
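As a rough illustration of the "visual complexity" idea above, one simple proxy is edge density: the fraction of pixels where the image intensity changes sharply. This is a minimal, self-contained sketch — real pipelines would use a computer-vision library and richer measures, and the function name and threshold here are our own assumptions, not part of any ISEY toolchain.

```python
import math

def edge_density(frame, threshold=0.25):
    """Fraction of pixels whose intensity gradient exceeds `threshold`.

    `frame` is a 2D list of grayscale values in [0, 1]; higher edge
    density is one crude proxy for visual complexity.
    (Illustrative sketch only — a real pipeline would use a CV library.)
    """
    rows, cols = len(frame), len(frame[0])
    edges, total = 0, 0
    for y in range(rows - 1):
        for x in range(cols - 1):
            gx = frame[y][x + 1] - frame[y][x]   # horizontal gradient
            gy = frame[y + 1][x] - frame[y][x]   # vertical gradient
            if math.hypot(gx, gy) > threshold:
                edges += 1
            total += 1
    return edges / total

# A flat frame has no edges; a checkerboard is maximally "complex".
flat = [[0.5] * 4 for _ in range(4)]
checker = [[(x + y) % 2 for x in range(4)] for y in range(4)]
print(edge_density(flat))     # 0.0
print(edge_density(checker))  # 1.0
```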

From microphones we can track:

  • Noise levels: the amount of background speech and noise from other sources in the setting.

  • Clarity of speech: how easily speech directed towards the child can be distinguished from the background noise.
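The two microphone measures above can be sketched in a few lines: noise level as the RMS energy of an audio window in decibels, and speech clarity as the level difference between speech and background. This is a toy sketch under our own assumptions (unit reference level, pre-separated speech and noise windows), not the lab's actual audio pipeline.

```python
import math

def rms_db(samples, ref=1.0):
    """Root-mean-square level of an audio window, in decibels."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / ref) if rms > 0 else float("-inf")

def speech_to_noise_db(speech, noise):
    """Level difference between speech and background windows —
    a simple clarity measure: higher means speech stands out more."""
    return rms_db(speech) - rms_db(noise)

# Speech 10x louder in amplitude than the background -> ~20 dB margin.
speech = [0.5, -0.5] * 10
noise = [0.05, -0.05] * 10
print(speech_to_noise_db(speech, noise))
```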

Diagram depicting a study on infant and adult interactions using wearable technology. It includes photos of a child's and an adult's clothing with attached sensors like cameras and microphones. A graph shows data on GPS distance, arousal levels, vocalizations, and ambient noise over time. An image of an adult and child interacting is shown alongside the visual data timeline.
Diagram showing exocentric child play event detection with timeline and images illustrating start and end points of play events.

SOCIAL ENVIRONMENT

Through cameras and microphones, we capture images, videos, and audio of the people a child interacts with — such as parents, caregivers, peers, or strangers — which can be processed using AI to extract various factors related to social interactions, as given below:

  • Social density: the number of people in the child’s surroundings and how physically close they are — i.e., whether the child is in a crowded, busy setting or a small, intimate group interaction.

  • Interaction frequency and duration: the number of interactions between the child and different individuals (e.g., parents, caregivers, peers, or strangers) throughout the day, and how long they last.

  • Social networks: which individuals the child interacts with most frequently.

  • Engagement zone: identifying locations within the environment where interactions happen most frequently, such as a play area, reading corner, bedroom, or classroom space.
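To make "interaction frequency and duration" concrete, suppose an upstream (hypothetical) identification step has already labelled each interaction with a partner and a start/end time; summarising per partner is then a simple aggregation. The event format and function name below are our own illustrative assumptions.

```python
from collections import defaultdict

def summarise_interactions(events):
    """Count and total duration of interactions per partner.

    `events` is a list of (partner, start_s, end_s) tuples, e.g. as
    produced by a hypothetical face/voice identification step.
    Returns {partner: (count, total_seconds)}.
    """
    summary = defaultdict(lambda: [0, 0.0])
    for partner, start, end in events:
        summary[partner][0] += 1
        summary[partner][1] += end - start
    return {p: (count, dur) for p, (count, dur) in summary.items()}

events = [("parent", 0, 120), ("peer", 200, 260), ("parent", 300, 330)]
print(summarise_interactions(events))
# {'parent': (2, 150.0), 'peer': (1, 60.0)}
```

The same per-partner totals also feed the "social networks" measure: the partners with the highest counts are the child's most frequent interaction partners.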

Children playing indoors with toys and furniture, including sofas and chairs, visible in a living room setting with scattered pillows and baskets of toys.

DIGITAL EXPOSURE

We capture children's digital exposure, including screen time and screen content, using wearable cameras and eye trackers. AI and computer vision models can process the visual data to detect the type of content viewed. Eye trackers and cameras can measure how children engage with digital devices, tracking a number of factors related to digital exposure, as given below:

  • Screen time and engagement: how long does the child spend looking at screens? Do they stay engaged with the content, or do they constantly look away? How well are their brains tracking the auditory and visual information being presented?

  • Interaction with digital media: Is the child actively engaging with the screen (e.g., touching, responding, following along), or are they passively watching?

  • Visual focus and gaze patterns: Where does the child look most often on the screen? Do they anticipate objects appearing in advance? How quickly do they respond to new content when it appears?
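As a sketch of how screen time and engagement might be derived from eye-tracker output, assume a per-frame boolean sequence indicating whether gaze fell on the screen. Total looking time and the number of look-aways then fall out directly. The frame rate and data format here are illustrative assumptions, not a description of the lab's actual processing.

```python
def screen_engagement(on_screen, frame_s=1 / 30):
    """Total screen-looking time (seconds) and number of look-aways.

    `on_screen` is a per-frame list of booleans (gaze on screen or not),
    e.g. from a wearable eye tracker sampled at 30 fps (assumed here).
    """
    total_s = sum(on_screen) * frame_s
    # A look-away is a transition from on-screen to off-screen.
    look_aways = sum(1 for prev, cur in zip(on_screen, on_screen[1:])
                     if prev and not cur)
    return total_s, look_aways

# 2 s on screen, 0.5 s away, 1 s back: 3 s total, one look-away.
frames = [True] * 60 + [False] * 15 + [True] * 30
total, aways = screen_engagement(frames)
print(total, aways)
```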

Split image showing a close-up of a human eye with tracking markers, and a landscape with Moai statues on Easter Island with crosshair overlay.

Learn more about our…