Background and interests

I am known as a pioneer in the field of Augmented Reality and am generally credited with defining the field and guiding its early development. For my dissertation, I built the first working Augmented Reality system, in that it was the first to demonstrate accurate real-virtual registration, even during rapid head motion. Other pioneering results include the first head-tracking system scalable to any room size, the first outdoor motion-stabilized AR system, one of the earliest user studies identifying perception problems in AR visualization, and a more recent investigation of Indirect AR, which offers an alternative approach for generating a compelling, AR-like mobile experience.

I have also held prominent leadership roles in the AR research community. I was a member of the Steering Committee for IEEE ISMAR, the premier research forum on Mixed and Augmented Reality, from 2002 to 2020, and served as its Chair from 2008 to 2012. I also served as Program Chair and Area Chair for that conference multiple times. At the request of the National Academy of Engineering, I organized a session on AR for the prestigious 2010 EU-US Frontiers of Engineering symposium, and in 2013 the Academy invited me to give a Gilbreth Lecture. In 2016, I became an IEEE Fellow for my contributions to Augmented Reality. In 2022, I became an inaugural member of the IEEE VGTC Virtual Reality Academy for career and lifetime achievements in VR/AR/MR.

I have extensive experience with head-worn display systems, including leading three research projects that used such displays. I have worn a wide variety of displays, from the CAE FOHMD, the MicroVision Nomad, and the Canon COSTAR to more recent devices such as the Oculus Rift and Microsoft HoloLens. I have particular expertise in the details of making optical see-through head-worn display systems work.

I have an established track record of leading and implementing industrial research projects that result in working prototype demonstrations, which have been shown at CES, SIGGRAPH, ATCA, and DARPA meetings. In particular, I have shown good intuition and analytical skills for understanding complex systems that combine non-COTS hardware with new algorithms and applications. I have a reputation for being meticulous and highly organized, characteristics that have helped me make such systems and demonstrations work successfully.

Besides AR, I have also worked on projects in visualization for air traffic control and battlefield awareness, in virtual environments, and in the exploitation of LIDAR data collected from urban areas. As an example of my flexibility, I managed to initiate and get funding for a project on pre-launch detection of rocket-propelled grenades, despite that being completely outside my areas of expertise. I am a T-shaped person who has sought broadening experiences and projects that have enabled me to work more effectively with people in different fields, such as specialists in entertainment and media, military personnel, experimental scientists, air traffic controllers, artists, designers, and people in the mobile industry.

My history of giving invited presentations, teaching courses, and successfully raising millions of dollars from DARPA and other funding agencies testifies to my communication skills. I have recently given invited talks on Augmented Reality. My experience in project and line management has also developed my interpersonal and team leadership skills. I am also the author of “So Long, and Thanks for the Ph.D.,” a free guide to surviving graduate school that has become a popular resource on the web.

Currently, I lead a team at Intel Labs that focuses on key enabling technologies for new forms of media, including AR and VR, along with prototyping novel experiences. These technologies include computational imaging, head-worn displays, and computational displays. Before joining Intel, I helped build a new laboratory, the Nokia Research Center Hollywood, focused on novel mobile media and entertainment applications, and I recruited and led a small team to explore these areas. Two showcase projects that I worked on recently are The Westwood Experience, a novel location-based experience that uses Mixed Reality to connect a narrative to evocative locations, and Leviathan, a series of Augmented Reality demonstrations that Intel showed at CES 2014 to inspire new forms of Augmented Reality storytelling. My team also demonstrated a 3D aerial display at SIGGRAPH 2017 and a compact, wide field-of-view VR display approach, along with light field displays, at SIGGRAPH 2020.

Contact

Intel Corporation
RNB-6-61
2200 Mission College Blvd.
Santa Clara, CA 95054
Email: