HapticGen Editing Elements (Work in Progress)
Demo#
Overview#
HapticGen is a generative AI system, developed by the Touch Experience and Accessibility Lab (TEAL) at ASU, that creates haptic feedback patterns for Meta Quest controllers. My research in TEAL focuses on integrating new UI/UX components into the HapticGen interface for visualizing and editing generated haptic waveforms. These components give designers direct-manipulation tools to fine-tune a waveform's characteristics, with real-time feedback as adjustments are made. Users can select and modify specific portions of the waveform, zoom and navigate to a desired section, and save their modifications.
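As a rough illustration of one such direct-manipulation edit, the sketch below scales the amplitude of a user-selected region of a waveform's sample buffer. The names (`applyGainToSelection`, `Selection`) and the buffer representation are assumptions for illustration, not HapticGen's actual API.

```typescript
// A selected span of the waveform, in sample indices.
interface Selection {
  start: number; // first sample index (inclusive)
  end: number;   // last sample index (exclusive)
}

// Returns a new buffer with `gain` applied inside the selection only,
// clamped to [-1, 1] so the edited waveform stays in a valid range.
function applyGainToSelection(
  samples: Float32Array,
  sel: Selection,
  gain: number
): Float32Array {
  const out = new Float32Array(samples);
  const start = Math.max(0, sel.start);
  const end = Math.min(samples.length, sel.end);
  for (let i = start; i < end; i++) {
    out[i] = Math.max(-1, Math.min(1, samples[i] * gain));
  }
  return out;
}

// Example: boost the middle three samples of a short waveform by 50%.
const wave = new Float32Array([0.2, 0.4, 0.6, 0.4, 0.2]);
const edited = applyGainToSelection(wave, { start: 1, end: 4 }, 1.5);
```

Returning a new buffer rather than mutating in place makes it easy to preview an edit in real time and to discard or undo it before saving.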
Why#
Haptic feedback systems offer novel ways to interact with digital environments, particularly in augmented and virtual reality. Traditionally, designing haptic feedback patterns has been a complex, specialized process. Generative AI systems such as HapticGen streamline it by producing haptic feedback patterns from user prompts. Adding UI/UX elements that let designers visualize, edit, and adapt the generated waveforms in real time improves workflow efficiency and creative control, enabling designers to iterate quickly on haptic feedback patterns and produce more immersive experiences.