InMySkin

A VR experience that lets users perceive the world through altered visual conditions, transforming empathy into an embodied perceptual experience.

InMySkin - Key Visual

Overview

InMySkin is a virtual reality experience that simulates altered visual conditions, allowing users to perceive the world through the lens of visual impairments. By immersing users in different perceptual states, the project transforms vision into an embodied experience of limitation, distortion, and adaptation.

Role

Concept & Technical Direction

Team

Guido Sijabat, Ana Paola, Petr Mogutov

Institution / Year

MIT, Department of Mechanical Engineering, 2025

Tools

Unity | C# | VR headset | Post-processing shaders

InMySkin - Picture1
InMySkin - Picture2
InMySkin - Picture3
InMySkin - Picture4
InMySkin - Picture5

Background

Vision is often assumed to be stable and universal, yet millions of people experience the world through conditions such as glaucoma, cataracts, or color blindness. These realities are difficult to communicate through static images or descriptions alone. The project emerges from the need to make invisible perceptual conditions experientially accessible, using immersive technology to bridge the gap between description and lived experience.

Concept

The project reframes empathy as an experiential condition rather than a cognitive understanding. By simulating altered visual perception in real time, InMySkin allows users to inhabit alternative perceptual realities. Each mode modifies clarity, color, and field of view, revealing how perception actively shapes spatial awareness and interaction with the environment.

InMySkin - cataracts
InMySkin - heroimage

The Project

Users navigate a virtual environment while switching between simulated visual conditions such as glaucoma (tunnel vision), cataracts (blur and opacity), and color blindness (shifted color perception). These perceptual transformations create an embodied understanding of how visual conditions influence orientation, movement, and interpretation of space.
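The mode-switching described above can be sketched as a small state machine: each simulated condition bundles the post-processing parameters the renderer applies per frame, and the user cycles through them. This is a minimal illustrative sketch, not the project's actual implementation; all mode names, fields, and values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class VisionMode:
    """Illustrative bundle of per-frame post-processing parameters (assumed names)."""
    name: str
    blur_radius: float    # cataracts: Gaussian blur strength
    fov_fraction: float   # glaucoma: fraction of the view left unmasked
    color_matrix: str     # color blindness: which 3x3 transform to apply

# Hypothetical mode table; values are placeholders, not calibrated figures.
MODES = {
    "normal":     VisionMode("normal",     0.0, 1.00, "identity"),
    "glaucoma":   VisionMode("glaucoma",   0.0, 0.35, "identity"),
    "cataracts":  VisionMode("cataracts",  6.0, 1.00, "identity"),
    "protanopia": VisionMode("protanopia", 0.0, 1.00, "protanopia"),
}

ORDER = ("normal", "glaucoma", "cataracts", "protanopia")

def switch_mode(current: str) -> str:
    """Cycle to the next simulated condition, as a user might via controller input."""
    return ORDER[(ORDER.index(current) + 1) % len(ORDER)]
```

In an engine like Unity, the equivalent logic would live in a component that swaps post-processing profiles rather than a plain dictionary.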

Live Demo

InMySkin WebGL Demo

The live demo loads automatically on this page.


Process

The system was developed in Unity using real-time post-processing filters and shader-based visual transformations. Each visual condition was designed to balance perceptual accuracy with navigability, ensuring that the experience communicates limitation without preventing interaction. Iterations focused on calibration of blur intensity, field-of-view restriction, and color transformation matrices to produce perceptually convincing simulations.
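The color-transformation step mentioned above can be illustrated with plain matrix math. The sketch below applies a widely used approximation of a protanopia (red-blind) simulation matrix in linear RGB; the coefficients are a common published approximation, not the matrices actually used in InMySkin, and the function name is an assumption.

```python
import numpy as np

# Illustrative protanopia simulation matrix (Vienot-style approximation).
# Not taken from the project; each row sums to 1 so neutral grays are preserved.
PROTANOPIA = np.array([
    [0.56667, 0.43333, 0.0],
    [0.55833, 0.44167, 0.0],
    [0.0,     0.24167, 0.75833],
])

def simulate_colorblind(rgb, matrix=PROTANOPIA):
    """Apply a 3x3 color transformation to an RGB color (or HxWx3 image)."""
    rgb = np.asarray(rgb, dtype=float)
    return np.clip(rgb @ matrix.T, 0.0, 1.0)

# Pure red loses its distinct hue: the red and green channels converge,
# which is why red/green distinctions collapse under this simulation.
red_seen = simulate_colorblind([1.0, 0.0, 0.0])
```

In the actual pipeline this multiplication would run per-pixel inside a post-processing shader rather than on the CPU.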

Reflection / Impact

InMySkin demonstrates how immersive simulation can function as an empathy interface. By transforming invisible perceptual differences into embodied experience, the project highlights how technological mediation can expand awareness of diverse sensory realities.

Documentation

Media Carousel
