Google Tech Talk
August 2, 2010
Presented by David Sachs.
Gyroscopes, accelerometers, and compasses are increasingly prevalent in mainstream consumer electronics. Applications of these sensors include user interfaces, augmented reality, gaming, image stabilization, and navigation. This talk will demonstrate how all three sensor types work separately and in conjunction on a modified Android handset running a modified sensor API, then explain how algorithms are used to enable a multitude of applications.
Application developers who wish to make sense of rotational motion must master Euler angles, rotation matrices, and quaternions. Under the hood, sensor fusion algorithms must be used to create responsive, accurate, and low-noise descriptions of motion. Reducing sensing errors involves compensating for temperature changes, magnetic disturbances, and sharp accelerations. Some of these algorithms must run at a very high rate and with very precise timing, which makes them difficult to implement within low-power real-time operating systems. Within Android specifically, this involves modifying the sensor manager, introducing new APIs, and partitioning motion processing tasks.
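To make the fusion idea concrete, here is a minimal sketch of one common approach, a complementary filter that blends a drifting-but-smooth gyroscope integration with a noisy-but-drift-free accelerometer tilt estimate. The function names and the blend factor `alpha` are illustrative assumptions, not details from the talk, which covers more sophisticated techniques.

```python
import math

def accel_tilt(ax, az):
    """Tilt angle (radians) about one axis, derived from the
    gravity components measured by the accelerometer."""
    return math.atan2(ax, az)

def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse one gyro sample with one accelerometer-derived angle.

    The gyroscope term (angle_prev + gyro_rate * dt) tracks fast motion
    but accumulates drift; the accelerometer term is drift-free but
    noisy and corrupted by linear acceleration. Weighting them with
    alpha (close to 1) high-passes the gyro and low-passes the
    accelerometer, keeping the strengths of both.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

In practice this loop would run at a high, fixed rate on each new sensor sample, which is exactly the kind of tight timing requirement the talk describes as difficult to meet in a low-power operating system.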
David Sachs began developing motion processing systems as a graduate student at the MIT Media Lab. His research there led him to InvenSense, where he continues this work with MEMS inertial sensors used in products such as the Nintendo Wii Motion Plus. David’s designs incorporate gyroscopes, accelerometers, and compasses in various combinations and contexts including handset user interfaces, image stabilizers, navigation systems, game controllers, novel Braille displays, and musical instruments.