Vision-based tactile sensing

Visualizing robotic touch: concept illustrations for tactile sensing technology.

This project explores how robots can learn to “feel” by using a camera to sense pressure on soft materials. Instead of relying on complex electronics, the system uses a simple setup: a camera under a soft layer filled with tiny particles. When something presses on the surface, the particles shift. The camera captures these movements, and machine learning turns them into detailed force maps.
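To make that pipeline concrete, here is a minimal sketch of the idea in Python: track how the particles shift between a reference image and a pressed image using dense optical flow, then map the displacement field to a force estimate. The function names, the synthetic frames, and the simple scaling used as the displacement-to-force step are illustrative assumptions; the actual sensor uses a trained machine learning model for that mapping.

```python
# Illustrative sketch only, not the research implementation.
import cv2
import numpy as np

def particle_displacement(reference: np.ndarray, pressed: np.ndarray) -> np.ndarray:
    """Dense optical flow between the undeformed reference frame and a frame
    captured while something presses on the soft surface.
    Returns an (H, W, 2) array of per-pixel particle displacements."""
    return cv2.calcOpticalFlowFarneback(
        reference, pressed, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0,
    )

def force_map(displacement: np.ndarray) -> np.ndarray:
    """Toy stand-in for the learned mapping: scale the displacement magnitude.
    The real system uses machine learning to turn the displacement field
    into a detailed force map."""
    magnitude = np.linalg.norm(displacement, axis=-1)
    return 0.1 * magnitude  # hypothetical calibration constant

# Example usage with synthetic frames (real input comes from the camera
# looking up at the particle-filled layer):
reference = np.zeros((240, 320), dtype=np.uint8)
pressed = np.zeros_like(reference)
cv2.circle(reference, (160, 120), 40, 255, -1)
cv2.circle(pressed, (165, 122), 40, 255, -1)   # particles shifted by the press
forces = force_map(particle_displacement(reference, pressed))
print("peak estimated pressure:", forces.max())
```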

I collaborated closely with researcher Carlo Sferrazza to help tell the story behind this innovative sensor. My role in the project was to translate complex technical ideas into clear, engaging visuals. I designed cutaway illustrations that revealed the inner structure of the sensor, showing how it’s built and how it functions. I also created visualizations of how the tiny particles inside the soft material move and deform under pressure. These graphics were used in scientific publications, presentations, and outreach materials, helping to make the research more accessible and impactful for both technical and general audiences.