Multimodal Interface for Games: A Case Study with TinyML

Haoxuan Xie, Lam Chi Hou, Lap Tou Chau, Lei Ka Weng, Xichen Wang, Yuxuan Guo, Giovanni Delnevo, Chiara Ceccarini, Chan Tong Lam, Su Kit Tang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Multimodal interfaces go beyond the traditional interaction through keyboard and mouse by incorporating multiple modes of interaction, such as touch, voice, gesture, and even gaze, to create more intuitive and immersive user experiences. This paper investigates how TinyML can be employed for multimodal interfaces in the context of games. An endless game in which the character has to avoid obstacles and fight monsters to advance has been developed. An Arduino Nano 33 BLE Sense is then used as the input device for the game by recognizing the hand gestures of the players.
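The abstract describes an on-device pipeline: the Arduino Nano 33 BLE Sense reads its inertial sensors, a TinyML model classifies the motion as a gesture, and the gesture drives the game character. The sketch below is an illustrative approximation of that capture-window-classify loop in Python, not the authors' implementation: the window size, threshold-based classifier, and gesture labels are all assumptions standing in for the paper's trained model.

```python
# Illustrative sketch of an IMU gesture-recognition loop of the kind the
# paper describes on the Arduino Nano 33 BLE Sense. This is NOT the
# authors' code: window size, features, and labels are assumptions.

from collections import deque
from statistics import mean

WINDOW = 20  # accelerometer samples per gesture window (assumed)

def classify_window(samples):
    """Toy classifier standing in for the on-device TinyML model.

    `samples` is a sequence of (ax, ay, az) accelerometer readings in g.
    A real deployment would run a quantized neural network (e.g. via
    TensorFlow Lite for Microcontrollers) instead of these thresholds.
    """
    ax = mean(s[0] for s in samples)
    ay = mean(s[1] for s in samples)
    if ax > 0.5:
        return "swipe_right"   # assumed label: dodge right
    if ax < -0.5:
        return "swipe_left"    # assumed label: dodge left
    if ay > 0.5:
        return "flick_up"      # assumed label: jump over an obstacle
    return "idle"

def stream_gestures(readings):
    """Slide a fixed-size window over a stream of IMU readings and emit
    one gesture label per full window, mimicking the game-input loop."""
    buf = deque(maxlen=WINDOW)
    for r in readings:
        buf.append(r)
        if len(buf) == WINDOW:
            yield classify_window(buf)
            buf.clear()

# Example: a window of strong +x acceleration reads as a right swipe.
right_swipe = [(0.9, 0.0, 1.0)] * WINDOW
print(next(stream_gestures(right_swipe)))  # → swipe_right
```

On the actual board, the same loop would poll the IMU and forward the predicted label to the game over Bluetooth Low Energy or serial; the thresholds here merely stand in for model inference.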

Original language: English
Title of host publication: 2024 IEEE 21st Consumer Communications and Networking Conference, CCNC 2024
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 823-826
Number of pages: 4
ISBN (Electronic): 9798350304572
DOIs
Publication status: Published - 2024
Event: 21st IEEE Consumer Communications and Networking Conference, CCNC 2024 - Las Vegas, United States
Duration: 6 Jan 2024 – 9 Jan 2024

Publication series

Name: Proceedings - IEEE Consumer Communications and Networking Conference, CCNC
ISSN (Print): 2331-9860

Conference

Conference: 21st IEEE Consumer Communications and Networking Conference, CCNC 2024
Country/Territory: United States
City: Las Vegas
Period: 6/01/24 – 9/01/24

Keywords

  • games
  • internet of things
  • multimodal interface
  • tinyml
