AlteReal

Software for Real-Time Editing of Physical Spaces Using Image Processing and Projection Mapping.

AlteReal - Key Visual

Overview

AlteReal is a real-time system that transforms physical space into an editable visual interface. By combining camera input with projection mapping, the project allows users to manipulate reality as if it were a live Photoshop canvas—applying filters, distortions, and visual operations directly onto the world. It creates a continuous feedback loop where reality is captured, processed, and reprojected in real time.
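The capture → process → reproject loop described above can be sketched as a minimal p5.js program. This is an illustrative reconstruction, not the project's actual source: the variable name `cap` and the choice of a blur filter are assumptions, and the sketch assumes p5.js is loaded in a browser with a webcam aimed at the projected surface.

```javascript
// Minimal sketch of the capture -> process -> reproject loop.
// Assumes p5.js in the browser; `cap` and the blur effect are
// illustrative choices, not from the original project.
let cap; // live camera feed

function setup() {
  createCanvas(windowWidth, windowHeight);
  cap = createCapture(VIDEO); // webcam pointed at the physical space
  cap.hide();                 // hide the raw feed; we draw it ourselves
}

function draw() {
  // 1. Capture: draw the current camera frame onto the canvas.
  image(cap, 0, 0, width, height);
  // 2. Process: apply a visual operation (here, a simple blur).
  filter(BLUR, 3);
  // 3. Reproject: the browser window is sent to the projector,
  //    casting the altered frame back onto the same space, which
  //    the camera then re-captures -- closing the feedback loop.
}
```

Because the projector's output re-enters the camera, even this tiny sketch produces the layered, self-referential imagery the project is built on.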

Role

Concept & Technical Lead

Team

Yetong Xin

Institution / Year

Harvard University - Graduate School of Design, 2024

Tools

HTML | CSS | JavaScript (p5.js)

Background

The project draws from Jean Baudrillard’s concept of hyperreality, where distinctions between the real and the simulated collapse. Today, digital tools allow us to endlessly manipulate images, yet these transformations remain confined to screens. AlteReal emerges from the question: what happens when these tools are no longer applied to representations of reality, but to reality itself? It challenges the boundary between physical and digital by bringing image manipulation into lived space.

Media Carousel

[Six project images: AlteReal Picture 1–6]

Concept

The core idea is to treat reality as an editable medium. By merging camera and projector into a single system, the project creates a spatial interface where visual effects—such as blurring, edge detection, painting, and motion-based distortions—are applied directly onto the environment. This produces a layered condition where the physical and the augmented coexist, constantly influencing one another. Reality is no longer fixed, but continuously rewritten through computational perception.
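Of the effects named above, edge detection is a good example of how a camera frame becomes editable material. A plain-JavaScript sketch of the standard Sobel operator is shown below, operating on a grayscale 2-D array; in the real p5.js sketch the frame would first be read from the flat RGBA `pixels[]` array, and the function names here are illustrative, not from the project's code.

```javascript
// Sobel gradient magnitude at one pixel of a grayscale 2-D array.
// (Illustrative sketch; the actual project's implementation may differ.)
function sobelMagnitude(gray, x, y) {
  // Horizontal (gx) and vertical (gy) 3x3 Sobel kernels.
  const gx =
    -gray[y - 1][x - 1] + gray[y - 1][x + 1]
    - 2 * gray[y][x - 1] + 2 * gray[y][x + 1]
    - gray[y + 1][x - 1] + gray[y + 1][x + 1];
  const gy =
    -gray[y - 1][x - 1] - 2 * gray[y - 1][x] - gray[y - 1][x + 1]
    + gray[y + 1][x - 1] + 2 * gray[y + 1][x] + gray[y + 1][x + 1];
  return Math.sqrt(gx * gx + gy * gy);
}

// Apply Sobel over the interior of the frame; borders stay 0.
function sobelEdges(gray) {
  const h = gray.length, w = gray[0].length;
  const out = Array.from({ length: h }, () => new Array(w).fill(0));
  for (let y = 1; y < h - 1; y++) {
    for (let x = 1; x < w - 1; x++) {
      out[y][x] = sobelMagnitude(gray, x, y);
    }
  }
  return out;
}
```

Projected back onto the scene, the resulting edge map outlines real objects with their own computed contours.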


Process

The system captures live video input of a physical environment, processes it through a series of image manipulation techniques, and projects the altered output back onto the same space using projection mapping. The workflow integrates computer vision methods such as motion detection and edge extraction with generative visual effects, enabling dynamic and responsive transformations. Iterations focused on calibration between camera and projector, latency reduction, and spatial alignment to ensure seamless integration between input and output. The result is a hybrid interface that bridges digital manipulation with physical reality in real time.
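The motion detection mentioned above is commonly done by frame differencing: comparing each incoming frame against the previous one, pixel by pixel. The sketch below, a plain-JavaScript assumption about how the project's step might work, operates on flat grayscale arrays (one brightness value per pixel); the threshold of 30 is an arbitrary value that would be tuned per installation.

```javascript
// Frame differencing: mark pixels whose brightness changed more than
// `threshold` between consecutive frames. (Illustrative sketch; the
// project's actual motion-detection code is not shown here.)
function motionMask(prev, curr, threshold = 30) {
  const mask = new Uint8Array(curr.length);
  for (let i = 0; i < curr.length; i++) {
    // 255 = motion detected at this pixel, 0 = static.
    mask[i] = Math.abs(curr[i] - prev[i]) > threshold ? 255 : 0;
  }
  return mask;
}
```

The resulting binary mask can then gate where distortions are applied, so effects follow people and objects as they move through the space.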

Documentation

[Three documentation images]