Mixed Reality Lab Wiki


Conference Report:

ACE 2010

International Conference on Advances in Computer Entertainment Technology

17-19 November, 2010

Taipei, Taiwan

Hooman A. Samani





I attended the ACE conference with a few other members of the CUTE Center. The main reason for my attendance was to present my paper, which was accepted as a full paper at this conference.

ACE was the International Conference on Advances in Computer Entertainment Technology, held in Taipei, Taiwan, on 17-19 November 2010. ACE 2010 was organized in cooperation with ACM and ACM SIGCHI, and was held in conjunction with the 9th Annual Workshop on Network and Systems Support for Games (NetGames 2010).

One of the interesting papers was RoboJockey. RoboJockey (Robot Jockey) is an interface for coordinating robot actions, such as dancing, similar to a "disc jockey" or "video jockey." The system enables a user to choreograph a dance for a robot to perform using a simple visual language. Users can coordinate humanoid robot actions with combinations of arm and leg movements. Every action is automatically performed in time with the background music and its beat, as sketched below. RoboJockey aims to give end users a new entertainment experience with robots.
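The snippet below is only my own minimal sketch of that beat-alignment idea, not the RoboJockey implementation; the action names and the scheduling function are illustrative assumptions.

```python
# Sketch (assumption, not the RoboJockey code): representing a robot
# choreography as a beat-aligned sequence of simple actions, in the spirit
# of the visual-language description above.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Action:
    name: str        # e.g. "raise_left_arm", "step_right" (hypothetical names)
    beats: int       # how many beats of the background music it occupies

def schedule(actions: List[Action], bpm: float) -> List[Tuple[str, float, float]]:
    """Map each action to a (name, start_seconds, duration_seconds) slot on the beat grid."""
    beat = 60.0 / bpm
    timeline, t = [], 0.0
    for a in actions:
        timeline.append((a.name, t, a.beats * beat))
        t += a.beats * beat
    return timeline

dance = [Action("raise_left_arm", 2), Action("step_right", 1), Action("spin", 4)]
for name, start, dur in schedule(dance, bpm=120):
    print(f"{name}: starts at {start:.2f}s, lasts {dur:.2f}s")
```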

Another interesting work was the ImpAct project, which enables direct touch and manipulation for surface computing. This work explores direct touch and manipulation techniques for surface computing platforms using a special force feedback stylus named ImpAct (Immersive Haptic Augmentation for Direct Touch). The proposed haptic stylus can change its length when it is pushed against a display surface. Correspondingly, a virtual stem is rendered inside the display area so that the user perceives the stylus as immersed in the digital space below the screen. ImpAct is proposed as a tool to probe and manipulate digital objects in the shallow region beneath the display surface. It creates a direct touch interface by providing kinesthetic haptic sensations along with continuous visual contact with digital objects below the screen surface.
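As a rough illustration of that idea (my own sketch, not the authors' code), the virtual stem can be thought of as extending below the surface by however much the physical stylus has shortened; all names and values here are assumptions.

```python
# Illustrative sketch: mapping the compression of an ImpAct-style stylus
# to the depth of a virtual stem rendered "below" the display surface.

from dataclasses import dataclass

@dataclass
class StylusState:
    full_length_mm: float      # stylus length when fully extended
    current_length_mm: float   # length reported by the stylus sensor
    tip_x: float               # 2D contact point on the display
    tip_y: float

def virtual_stem_depth(state: StylusState) -> float:
    """Depth of the virtual stem below the surface equals how much the
    physical stylus has shortened against the display."""
    return max(0.0, state.full_length_mm - state.current_length_mm)

def render_stem(state: StylusState) -> None:
    depth = virtual_stem_depth(state)
    # A real system would draw a 3D stem under the contact point so the
    # stylus appears to pierce the screen; here we only report the pose.
    print(f"stem at ({state.tip_x:.1f}, {state.tip_y:.1f}), depth {depth:.1f} mm")

render_stem(StylusState(full_length_mm=120.0, current_length_mm=95.0,
                        tip_x=210.0, tip_y=140.0))
```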

I presented the audio section of the Lovotics project at the ACE conference. My presentation outlined the part of the project focused on developing an affective audio system for Lovotics that acts as an active participant in a bidirectional nonverbal communication process with humans. This interactive audio system is capable of synthesizing real-time audio output based on eight parameters: pitch, number of harmonics, amplitude, tempo, sound envelope, chronemics, proximity and synchrony. In addition to the first five common parameters, we focused on comprehensive research and user testing on chronemics, proximity and synchrony (the C.P.S. effect) and aimed to find out how these three factors enhance positive feelings in human-robot interaction. These factors were determined through our study and were found to have a positive effect on the emotional interaction between humans and robots.

Thus, an interactive feedback audio system was implemented which allows sentimental interaction between humans and robots. The aim of such a system is to offer new possibilities for exploring the concept and possibilities of human-robot love.
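The following is a minimal sketch of how such parameter-driven synthesis could look; the eight parameter names come from the description above, but the way they are combined here is my own assumption, not the Lovotics implementation.

```python
# Minimal sketch: synthesizing one beat of audio from the eight parameters
# listed above. The C.P.S. mapping (proximity -> loudness, synchrony ->
# harmonic richness, chronemics -> envelope release) is assumed for illustration.

import math

SAMPLE_RATE = 44100

def synthesize(pitch_hz=440.0, num_harmonics=3, amplitude=0.5, tempo_bpm=90,
               envelope=(0.1, 0.2),          # (attack, release) as fractions of a beat
               chronemics=0.5, proximity=0.5, synchrony=0.5):
    """Return one beat of audio samples as a list of floats in [-1, 1]."""
    beat_seconds = 60.0 / tempo_bpm
    n = int(SAMPLE_RATE * beat_seconds)
    attack, release = envelope
    gain = amplitude * (0.5 + 0.5 * proximity)
    harmonics = max(1, int(round(num_harmonics * (0.5 + synchrony))))
    release = min(0.9, release * (1.0 + chronemics))
    samples = []
    for i in range(n):
        t = i / SAMPLE_RATE
        pos = i / n
        env = min(1.0, pos / attack) if pos < attack else \
              min(1.0, (1.0 - pos) / release)
        s = sum(math.sin(2 * math.pi * pitch_hz * k * t) / k
                for k in range(1, harmonics + 1))
        samples.append(gain * env * s / harmonics)
    return samples

buffer = synthesize(pitch_hz=330.0, proximity=0.8, synchrony=0.6)
```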



Some of the main questions and my answers after my presentation were as follows:

-Q: How do you express Love?

A: Love is expressed within other emotions and behaviors.



-Q: How do you evaluate the developed theories?

A: By mapping love styles onto the Lovotics application, we developed a questionnaire to evaluate love between humans and robots.



-Q: Why don't you consider a humanoid robot for Lovotics?

A: Minimal design is the design strategy for Lovotics. Our focus is artificial intelligence, and we want a simple robot with the required functionality to test our developed theories.



-Q: Do you consider speed?

A: Yes, we designed second- and third-order navigation parameters. We are also developing different speeds for several movements of the robot, such as tilting and changing height.
