RESEARCH

Smart Vision & Sensing

Professor, Robotics Laboratory
Department of System Cybernetics, Graduate School of Advanced Science and Engineering
Hiroshima University
Idaku ISHII
>> Research Contents
To establish high-speed robot senses that far exceed human senses, we are researching and developing information systems and devices that achieve real-time image processing at 1000 frames/s or greater. In addition to integrated algorithms that accelerate sensor information processing, we are studying new sensing methodologies based on vibration and flow dynamics, which are too fast for humans to sense.

Virtual Musical Instrument Using High-speed Vision

In this study, we propose the concept of Visual Auditorization (VA), which converts visual sensor information into audio presentation information for real-time interaction. In VA, the position, velocity, and acceleration of a target in a real environment are collected through a visual sensor, converted into virtual audio information derived from the target, and output to an audio presentation device. Realizing this concept requires a high-speed vision system capable of processing audio information in a frequency band of approximately several kilohertz to tens of kilohertz.
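The VA signal chain described above can be sketched in a few lines: position samples arriving at the vision system's frame rate are differentiated into velocity and acceleration, which an audio stage can then map to sound parameters. This is a minimal illustration, assuming a 1000 frames/s sensor and simple finite differences; the function and constant names are not from the study.

```python
# Sketch of the VA motion-state computation (assumed, illustrative names).
# Position samples from a high-speed vision sensor are differentiated into
# velocity and acceleration by finite differences over consecutive frames.

FRAME_RATE = 1000            # frames/s of the vision system (from the text)
DT = 1.0 / FRAME_RATE        # 1 ms between frames

def motion_state(positions):
    """Velocity and acceleration from the last three position samples."""
    p0, p1, p2 = positions[-3:]
    v = (p2 - p1) / DT                    # first difference: velocity
    a = (p2 - 2.0 * p1 + p0) / DT ** 2    # second difference: acceleration
    return v, a
```

At 1000 frames/s the sample spacing is 1 ms, so even simple differences like these track motion fast enough to drive audio parameters in real time.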

To demonstrate real-time VA, a virtual musical instrument system was developed. The system proposed in this research consists of a high-speed megapixel vision system, a computer running a virtual musical instrument model, and an audio presentation device such as a MIDI sound source. By replacing the model, various virtual musical instruments can be realized. Some virtual musical instrument models are introduced below.

  • Virtual guitar

The virtual guitar model generates guitar sounds according to the player's hand movements. The model attends only to the right-hand movement that strums the virtual guitar strings and determines the scale automatically in place of the left-hand fingering. A marker is attached to the tip of the player's right thumb; a sound is generated at the moment the marker's coordinate passes the position of the virtual guitar string, and its volume is determined from the passing velocity.
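The string-crossing trigger can be sketched as follows. This is a hypothetical illustration of the logic described above, assuming per-frame marker coordinates from a 1000 frames/s system; the class and attribute names are our own, not from the study.

```python
# Illustrative sketch of the virtual-guitar trigger (assumed names/values).
# A marker y-coordinate is sampled every frame; a note fires when the marker
# crosses the virtual string line, with volume taken from the passing speed.

class VirtualGuitar:
    def __init__(self, string_y=0.0, frame_rate=1000):
        self.string_y = string_y     # y-coordinate of the virtual string
        self.dt = 1.0 / frame_rate   # inter-frame time (1 ms at 1000 fps)
        self.prev_y = None           # marker position in the previous frame

    def update(self, marker_y):
        """Return (triggered, passing_velocity) for a new marker sample."""
        triggered, velocity = False, 0.0
        if self.prev_y is not None:
            # Sign change means the marker passed the string between frames.
            crossed = (self.prev_y - self.string_y) * (marker_y - self.string_y) < 0
            if crossed:
                velocity = abs(marker_y - self.prev_y) / self.dt
                triggered = True
        self.prev_y = marker_y
        return triggered, velocity
```

The returned velocity would then be scaled into a MIDI note-on velocity (0–127) before being sent to the sound source.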


  • Virtual xylophone

The virtual xylophone model generates a sound depending on the spatial relationship between the two mallets held by the player and the virtual xylophone. In this model, markers are set at the tips of the two mallets. When the height coordinate of a marker enters a specified area, a key strike is assumed. The model determines from the coordinates which key was struck and generates the corresponding sound. Since the two mallets are processed independently, different key sounds can be generated simultaneously.
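The key-strike decision above can be sketched as a small function applied to each mallet marker independently. The key layout, strike threshold, and MIDI note numbers below are assumptions for illustration only, not values from the study.

```python
# Illustrative sketch of the xylophone key-strike logic (assumed layout).
# Image y grows downward, so a strike occurs when a mallet marker descends
# past the strike threshold; the x-coordinate selects which key was hit.

KEY_NOTES = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave of MIDI notes
KEY_WIDTH = 50        # horizontal extent of each virtual key (pixels)
STRIKE_Y = 100        # marker below this y-coordinate is in the strike zone

def detect_strike(prev_y, y, x):
    """Return the struck MIDI note, or None, for one mallet marker."""
    if prev_y < STRIKE_Y <= y:         # marker just entered the strike zone
        key = int(x // KEY_WIDTH)      # which key the x-coordinate falls on
        if 0 <= key < len(KEY_NOTES):
            return KEY_NOTES[key]
    return None                        # no strike this frame
```

Because each mallet marker is fed through this check separately, two keys can sound in the same frame, matching the simultaneous-sound behavior described above.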



Virtual Guitar MP4 movie (1.1M)
Virtual Guitar
Virtual Xylophone MP4 movie (1.7M)
Virtual Xylophone