PhysiCAD

Interactive 3D Model Making Using Hand Gestures.

PhysiCAD - Key Visual

Overview

PhysiCAD is an interactive interface that enables users to create 3D models through physical gestures. By translating hand movements into digital geometry, the system reimagines 3D modeling as an embodied and intuitive experience rather than a technical skill. It lowers the barrier to entry while maintaining the precision and logic required for computational design.

Role

Interaction & Technical Lead

Team

Ben Kazer, Joe Tu

Institution / Year

Harvard University Graduate School of Design, 2023

Tools

Grasshopper | C#

Background

Traditional 3D modeling tools often come with steep learning curves, requiring familiarity with complex interfaces, commands, and workflows. This creates a gap between creative intent and technical execution, especially for beginners or those outside computational fields. PhysiCAD emerges from the need to make digital modeling more accessible, immediate, and engaging, while still retaining its underlying rigor.

Concept

The project reframes modeling as a performative act, where the body becomes the primary interface. Instead of navigating menus or writing scripts, users shape geometry through gestures—pinching, dragging, and moving in space. This shifts the process from abstract manipulation to spatial intuition. At the same time, the system embeds computational constraints, ensuring that the generated forms remain structured, editable, and compatible with parametric workflows.
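
The gesture vocabulary described above can be sketched in code. The following is an illustrative example only, written in Python for brevity (the project itself was built in C# inside Grasshopper): it detects a "pinch" from two tracked fingertip positions. The function names and the 25 mm threshold are assumptions, not the project's actual values.

```python
# Hypothetical sketch of pinch detection from tracked fingertip positions.
# The 25 mm threshold is an assumed value for illustration.
import math

def distance(a, b):
    """Euclidean distance between two 3D points given as (x, y, z) tuples."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinch(thumb_tip, index_tip, threshold_mm=25.0):
    """A pinch registers when the thumb and index fingertips nearly touch."""
    return distance(thumb_tip, index_tip) < threshold_mm

# A pinch at close range might grab the nearest control point;
# an open hand would release it.
print(is_pinch((0, 0, 0), (10, 5, 0)))   # fingertips ~11 mm apart -> True
print(is_pinch((0, 0, 0), (80, 0, 0)))   # fingertips 80 mm apart -> False
```

In a system like this, discrete gesture events (pinch, release) would trigger modeling operations, while the continuous hand position drives the geometry itself.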

PhysiCAD - image 01

Process

The system was developed using Grasshopper as the computational backend, combined with a hand-tracking camera and custom gesture recognition algorithms. Hand movements are captured in real time and mapped to geometric operations, enabling users to generate and manipulate forms directly in 3D space. Iterations focused on refining gesture vocabulary, responsiveness, and the translation between continuous motion and discrete parametric logic. The result is a hybrid workflow that bridges physical interaction with computational modeling.
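
The translation between continuous motion and discrete parametric logic mentioned above can be illustrated with a minimal sketch. This is not the project's code (which lived in C# inside Grasshopper); it is a hedged Python example, and the 10-unit grid size is an assumed value. The idea is to quantize a continuously tracked hand position onto a grid so that free-hand motion still yields structured, editable geometry.

```python
# Hedged sketch: snapping a continuously tracked (x, y, z) position onto
# a discrete parametric grid. The grid size of 10 units is an assumption.
def snap_to_grid(point, grid=10.0):
    """Quantize a tracked position to the nearest grid node."""
    return tuple(round(c / grid) * grid for c in point)

# A raw stream of tracked hand positions...
stream = [(3.2, 14.8, 7.1), (12.6, 18.9, 2.4)]
# ...becomes a sequence of clean parametric control points.
snapped = [snap_to_grid(p) for p in stream]
print(snapped)  # [(0.0, 10.0, 10.0), (10.0, 20.0, 0.0)]
```

Snapping like this is one common way to keep gestural input compatible with parametric workflows: downstream components receive well-defined points rather than noisy sensor coordinates.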

PhysiCAD - image 01
PhysiCAD - image 02
PhysiCAD - image 03

Documentation

PhysiCAD - image 01
PhysiCAD - image 02
PhysiCAD - image 03
PhysiCAD - image 04
PhysiCAD - image 05
PhysiCAD - image 06