SIGGRAPH 2010 Emerging Technologies Range from Robotics to Human Taste Simulations

CHICAGO — (BUSINESS WIRE) — May 20, 2010 — SIGGRAPH 2010’s Emerging Technologies presents innovations across a broad range of applications, including displays, robotics, input interfaces, vision technologies, and interactive techniques.

Comprising a mix of technologies chosen by the organizers and works selected by a jury of experts, the 22 selections were drawn from more than 107 international submissions and will be on display for attendees to interact with in Los Angeles this summer.

"With every passing year, the technologies presented at SIGGRAPH become more and more astonishing," said Preston J. Smith, SIGGRAPH 2010 Emerging Technologies Chair from Laureate Institute for Brain Research. "This year is no different as conference attendees will experience first-hand the latest achievements across science, commercial, and research fields. In some instances, these technologies are making their first public appearance and are coming to SIGGRAPH directly from research labs.”

Listed below are just a few highlights from the SIGGRAPH 2010 Emerging Technologies program.

Acroban the Humanoid

Olivier Ly, INRIA/LaBRI; Pierre-Yves Oudeyer, INRIA

Acroban is the first humanoid robot able to demonstrate playful, compliant, and intuitive physical interaction with children while moving and walking dynamically. It can also keep its equilibrium while moving, even when humans initiate unpredicted physical interactions.

Potential Future Use:

The system is presented in an entertainment human-robot interaction context specifically meant to engage children. In this demonstration, the robot has a range of behaviors that it combines in order to react intuitively, naturally, and creatively to uncontrolled external intervention.

A Fluid-Suspension, Electromagnetically Driven Eye with Video Capability for Animatronic Applications

Lanny Smoot, Disney Research; Katie Bassett, Yale University; Marcus Hammond, Stanford University

This compact, fluid-suspension, electromagnetically gimbaled animatronic eye requires minimal operating power and offers a range of motion and saccade speeds that can exceed those of the human eye, all without the traditional frictional wear points.

Potential Future Use:

In a special application, the eye can be separated into a hermetically sealable portion that might be used as a human eye prosthesis, along with an extra-cranially-mounted magnetic drive.

Gesture World Technology

Kiyoshi Hoshino, Motomasa Tomida, Takanobu Tanimoto, University of Tsukuba

This technology allows people to control devices such as computers, household appliances, and robots with everyday gestures, without requiring worn sensors or hand-held controllers. It employs high-speed, high-accuracy computer vision to estimate hand and arm poses from images captured by a compact high-speed camera.
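
As a rough illustration of the kind of camera-based hand-pose estimation this demonstration relies on, the sketch below uses the open-source OpenCV and MediaPipe libraries, an assumed stand-in for illustration only and not the Tsukuba team's high-speed vision system, to read webcam frames and report an index-fingertip position:

```python
# Minimal sketch, assuming the open-source MediaPipe Hands model and a
# standard webcam; NOT the University of Tsukuba high-speed vision system.
import cv2                  # OpenCV for camera capture
import mediapipe as mp      # MediaPipe for hand-landmark estimation

hands = mp.solutions.hands.Hands(max_num_hands=1,
                                 min_detection_confidence=0.5)
cap = cv2.VideoCapture(0)   # default webcam stands in for the high-speed camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB images; OpenCV captures in BGR
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalized to [0, 1]
        tip = result.multi_hand_landmarks[0].landmark[8]
        print(f"index fingertip at x={tip.x:.2f}, y={tip.y:.2f}")

cap.release()
hands.close()
```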

Potential Future Use:

This technology could be applied in a wide range of areas, such as gesture-based computer operation, virtual games, remote control without a remote controller, digital archiving of artisan skills, and remote robot control.

360-degree Autostereoscopic Display

Hiroki Kikuchi, Katsuhisa Itou, Hisao Sakurai, Izushi Kobayashi, Hiroaki Yasunaga, Kazutatsu Tokuyama, Hirotaka Ishikawa, Hidenori Mori, Kengo Hayasaka, and Hiroyuki Yanagisawa, Sony Corporation

This compact, cylindrical autostereoscopic display shows full-color, high-quality, volumetric 3D images, videos, and interactive animations, viewable without glasses from any angle (360 degrees).

Potential Future Use:

This display has many potential applications, such as amusement, professional visualization, digital signage, museum display, video games, and futuristic 3D telecommunication.

Meta Cookie

Takuji Narumi, Takashi Kajinami, Tomohiro Tanikawa, Michitaka Hirose, The University of Tokyo

“Meta Cookie” is a novel pseudo-gustatory system that changes the perceived taste of a cookie as people eat it: using augmented reality (AR) and olfactory display technology, it overlays visual and olfactory information onto a real cookie tagged with an AR marker.
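
As a rough illustration of the AR-marker tracking step such a system depends on, the sketch below uses OpenCV's ArUco module, an assumed stand-in rather than the actual Meta Cookie implementation, to detect a printed marker in camera frames; a complete system would then render a cookie texture over the marker and release a matching scent:

```python
# Minimal sketch, assuming OpenCV >= 4.7 and its ArUco module; NOT the
# Meta Cookie implementation. It only detects a printed AR marker, the
# first step before any visual/olfactory overlay.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # A complete system would render a cookie texture over the marker
        # and trigger a matching scent; here we only report what was seen.
        print(f"detected marker id(s): {ids.flatten().tolist()}")
cap.release()
```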

Potential Future Use:

"Meta Cookie" combines augmented reality technology and olfactory display technology. Merging these two technologies creates a revolutionary interactive gustatory display that reveals a new horizon for computer-human interaction.

In-air Typing Interface for Mobile Devices with Vibration Feedback

Takehiro Niikura, Yuki Hirobe, Alvaro Cassinelli, Yoshihiro Watanabe, Takashi Komuro, Masatoshi Ishikawa, and Atsushi Matsutani, The University of Tokyo

This vision-based 3D input interface for mobile devices requires no space on the surface of the device, no additional physical devices, and no specific environment. Because it is based on a camera with a wide-angle lens, it can operate across a wide 3D space.
