arXiv:2601.16639

HapticMatch: An Exploration for Generative Material Haptic Simulation and Interaction

Published on Jan 23
Authors:

AI-generated summary

HapticMatch enables rapid prototyping of tactile sensations from visual input using conditional generative models trained on aligned optical, height, and vibration data.

Abstract

High-fidelity haptic feedback is essential for immersive virtual environments, yet authoring realistic tactile textures remains a significant bottleneck for designers. We introduce HapticMatch, a visual-to-tactile generation framework designed to democratize haptic content creation. We present a novel dataset containing precisely aligned pairs of micro-scale optical images, surface height maps, and friction-induced vibrations for 100 diverse materials. Leveraging this data, we explore and demonstrate that conditional generative models like diffusion and flow-matching can synthesize high-fidelity, renderable surface geometries directly from standard RGB photos. By enabling a "Scan-to-Touch" workflow, HapticMatch allows interaction designers to rapidly prototype multimodal surface sensations without specialized recording equipment, bridging the gap between visual and tactile immersion in VR/AR interfaces.
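The paper page carries no code, but as a rough illustration of the flow-matching variant the abstract mentions, the sketch below trains a toy conditional velocity network to transport Gaussian noise to a surface height map conditioned on an RGB photo, then samples by Euler integration. Everything here (the `VelocityNet` architecture, function names, data shapes) is a hypothetical stand-in, not HapticMatch's actual model or dataset.

```python
# Hypothetical sketch of conditional flow matching for RGB -> height-map
# generation. Nothing below comes from the HapticMatch paper; the
# architecture, names, and shapes are invented for illustration.
import torch
import torch.nn as nn

class VelocityNet(nn.Module):
    """Tiny conv net predicting the flow velocity field for a height map,
    conditioned on an RGB photo concatenated along the channel axis."""
    def __init__(self):
        super().__init__()
        # Inputs: 1-channel noisy height map + 3-channel RGB + 1 time channel.
        self.net = nn.Sequential(
            nn.Conv2d(1 + 3 + 1, 64, 3, padding=1), nn.SiLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.SiLU(),
            nn.Conv2d(64, 1, 3, padding=1),
        )

    def forward(self, x_t, rgb, t):
        # Broadcast the scalar time t to a full spatial channel.
        t_map = t.view(-1, 1, 1, 1).expand(-1, 1, *x_t.shape[-2:])
        return self.net(torch.cat([x_t, rgb, t_map], dim=1))

def flow_matching_loss(model, height, rgb):
    """One conditional flow-matching training step: regress the constant
    velocity of the straight-line path from noise to the data sample."""
    noise = torch.randn_like(height)
    t = torch.rand(height.shape[0], device=height.device)
    t_ = t.view(-1, 1, 1, 1)
    x_t = (1 - t_) * noise + t_ * height   # linear interpolant
    target_v = height - noise              # its (constant) velocity
    pred_v = model(x_t, rgb, t)
    return ((pred_v - target_v) ** 2).mean()

@torch.no_grad()
def sample(model, rgb, steps=50):
    """Generate a height map from an RGB photo by Euler-integrating
    the learned flow from t=0 (noise) to t=1 (data)."""
    x = torch.randn(rgb.shape[0], 1, *rgb.shape[-2:])
    dt = 1.0 / steps
    for i in range(steps):
        t = torch.full((rgb.shape[0],), i * dt)
        x = x + dt * model(x, rgb, t)
    return x

# Toy usage on random tensors standing in for aligned RGB / height-map pairs.
model = VelocityNet()
rgb = torch.randn(4, 3, 64, 64)
height = torch.randn(4, 1, 64, 64)
loss = flow_matching_loss(model, height, rgb)
loss.backward()
height_map = sample(model, rgb)
```

A diffusion variant would swap the straight-line interpolant and velocity target for a noise-prediction objective; likewise, channel concatenation is the simplest possible conditioning mechanism, standing in for whatever image encoder the paper actually uses.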
