Education

University of North Carolina, Chapel Hill, NC

Ph.D. in Computer Science, May 1995

M.S. in Computer Science, May 1990

Pogue Fellowship (3 years)

University of California, Berkeley, CA

B.S. in Electrical Engineering / Computer Science, May 1988

With highest honors (GPA 3.95/4.0). Chancellor's and National Merit scholarships.

Experience

Principal Engineer and Research Manager, Intel Labs

Santa Clara, CA

4/16 - present

Augmented Reality Leader, Intel Labs

Santa Clara, CA

6/12 - 3/16

Line Management, Technical Leadership and Prototyping: I lead a team pursuing key enabling technologies for new forms of media, including AR and VR, along with prototyping novel experiences. These enabling technologies include computational displays, head-worn displays and computational imaging. I also advise Intel on Augmented Reality and related technologies and experiences.

Computational Displays: Led research efforts to make light field displays more practical. Light field displays reduce eyestrain in 3D displays by enabling the viewer to focus at different depths.

ThinVR: A computational display approach that simultaneously achieves a 180-degree horizontal field of view and a compact form factor in a VR display. Published at IEEE Virtual Reality 2020 as an IEEE TVCG journal paper and demonstrated at SIGGRAPH 2020. I was the research manager on this project and provided technical direction throughout its execution.

Mid-Air Interaction with a 3D Aerial Display: Managed the team that built and demonstrated a 3D display that looks like an interactive, touchable hologram. Demonstrated at SIGGRAPH 2017.

Leviathan: I personally implemented part of the Leviathan Augmented Reality demonstrations that Intel showed at CES 2014 to inspire new forms of Augmented Reality storytelling. I also directed my team to build the AR framework that enabled the demonstration in the Intel CEO keynote presentation. I served as the main Intel technical expert on this project.

Research Leader, Nokia Research Center Hollywood

Santa Monica, CA

10/08 - 1/12

Line Management: Helped build a new research laboratory focused on novel media and entertainment applications by establishing a new environment and culture and by recruiting and leading a team to develop new forms of mobile pervasive media.

Project Management: Supervised two research focus areas: 1) Developing a new approach to mobile AR that provided a more compelling experience than the AR browsers of the time, and 2) Exploring the use of pervasive computing technologies combined with mobile devices to enable novel forms of mobile media and experiences.

The Westwood Experience: My team built this novel location-based experience, an experiment in connecting a narrative to evocative locations via Mixed Reality.

Sr. Research Staff Computer Scientist, HRL Laboratories

Malibu, CA

11/99 - 10/08

Research Staff Member, HRL / Hughes Research Laboratories

Malibu, CA

3/95 - 11/99

Project Leadership: Principal Investigator of two DARPA projects and numerous Raytheon, Boeing and GM internal projects in visualization, tracking, and Augmented Reality.

Augmented Reality: Demonstrated the first motion-stabilized outdoor Augmented Reality display. Developed a new algorithm for automatically positioning AR labels over real-world objects to avoid occlusions. Demonstrated a basic perception problem with “x-ray vision” in AR visualizations. Active participant in running the IEEE ISMAR conference (serving as program and area chair multiple times).

Visualization: Developed interactive visualization techniques for future “Free Flight” Air Traffic Control applications. Demonstrated them at the ATCA ’95, ’96 and ’98 conventions. Worked on visualization displays for battlefield awareness and time-critical decision making. Examined the application of autostereoscopic displays. Developed non-geographic visualization techniques to generate insight from National Airspace System simulation data. Collaborated with computer vision experts by building visualization tools and designing the software infrastructure to support research in the semantic recognition of objects (buildings, cars, trees, etc.) from dense urban LIDAR data.

Virtual Environments: Built part of a car simulator to investigate the effectiveness of multimodal warnings from Crash Avoidance sensors. Built large head-tracked stereo displays.

Research Assistant, UNC Chapel Hill

5/89 - 2/95

Augmented Reality: Demonstrated the first compelling example of virtual-real registration in an optical see-through display. Developed inertial-based predictive trackers and calibration techniques. Built a see-through Head-Mounted Display system to demonstrate and evaluate these techniques. Reduced registration errors by a factor of 5-10. Published work in two SIGGRAPH papers.

Human Tracking: Built a novel outward-looking optoelectronic tracking system that was the first demonstrated scalable tracker for Head-Mounted Displays (shown at SIGGRAPH 1991). This was a team effort; my contributions included designing and simulating the overall software architecture, developing and coding the mathematics that compute head locations given sensor inputs, and calibrating the optical sensors. The HiBall tracker sold by 3rdTech is a descendant of this system.

Instructor, UNC Chapel Hill

Summer 1992

Completely redesigned and taught undergraduate “Computers and Society” course.

Software Engineer, Apple Computer (summer internships)

Cupertino, CA

Summers of 1986, 1987 and 1988

Investigated image compression, wrote AppleTalk file transfer and test programs.

Skills

  • Augmented Reality: Extensive knowledge and background in Augmented Reality systems.

  • Head-worn displays: 25 years of experience with a wide variety of head-worn display systems, particularly in optical see-through head-worn displays.

  • Line and project management, team leadership

  • Public speaking and communication.

  • Development platforms: Windows and Linux. In the past I have also used Macintosh, UNIX and other platforms.

  • Languages: C/C++ and JavaScript are the languages I have used most recently. In the past I have also programmed in a variety of other languages.

  • Qt, OpenSceneGraph

  • Government contract fundraising

Invited Talks

Other Talks

  • Guest lecturer at Stanford CS377M: HCI Issues in Mixed and Augmented Reality (May 2, 2016)

  • Leviathan: Inspiring new forms of storytelling via Augmented Reality (Augmented World Expo, May 2014, Santa Clara, CA).

Panels

  • Panel at the Augmented, Virtual and Mixed Reality Conference 2019 (Photonics West, 4 February 2019, San Francisco, CA).

  • Panel at VR, AR, MR One-Day Industry Conference (Photonics West, 29 January 2018, San Francisco, CA).

  • AR-VR: There Yet? (Intel Capital Global Summit 2015, 2-4 Nov. 2015, San Diego, CA).

  • The Renaissance of VR: Are We Going to Do it Right This Time? (SIGGRAPH 2015, 9-13 August 2015, Los Angeles).

  • The Next Ten Years of AR (IEEE ISMAR 2008, 15-18 Sept. 2008, Cambridge, UK).

Patents

  • US 11,009,766. Foveated virtual reality near eye displays. Ginni Grover, Ronald Azuma, Oscar Nestares. Issued May 18, 2021.

  • US 10,939,085. Three dimensional glasses free light field display using eye location. Tuotuo Li, Joshua J Ratcliff, Qiong Huang, Alexey M Supikov, Ronald T Azuma. Issued Mar. 2, 2021.

  • US 10,928,639. Thin, multi-focal plane, augmented reality eyewear. Sabine Roessel, Ronald Azuma, Mario Palumbo. Issued Feb. 23, 2021.

  • US 10,284,816. Facilitating true three-dimensional virtual representation of real objects using dynamic three-dimensional shapes. Ronald T. Azuma. Issued May 7, 2019.

  • US 9,317,133. Method and apparatus for generating augmented reality content. Thommen Korah, Ronald Azuma. Issued April 16, 2016.

  • US 9,262,696. Image capture feedback. Joshua Ratcliff, Ronald Azuma, Yan Xu, Gheric Speiginer. Issued February 16, 2016.

  • US 9,122,707. Method and apparatus for providing a localized virtual reality environment. Jason Wither, Ronald Azuma. Issued Sept. 1, 2015.

  • US 8,838,381. Automatic video generation for navigation and object finding. Michael Daily, Ronald Azuma. Issued Sept. 14, 2014.

  • US 8,515,126. Multi-stage method for object detection using cognitive swarms and system for automated response to detected objects. Swarup Medasani, Yuri Owechko, Michael Daily, Ronald Azuma. Issued Aug. 20, 2013.

  • US 8,488,877. System for object recognition in colorized point clouds. Yuri Owechko, Swarup Medasani, Ronald Azuma, Jim Nelson. Issued July 16, 2013.

  • US 8,335,751. System for intelligent goal-directed search in large volume imagery and video using a cognitive-neural subsystem. Michael Daily, Deepak Khosla, Ronald Azuma. Issued Dec. 18, 2012.

  • US 8,081,088. Method and apparatus for apportioning attention to status indicators. Timothy C. Clausner, Ronald T. Azuma. Issued Dec. 20, 2011.

  • US 7,796,155. Method and apparatus for real-time group interactive augmented-reality area monitoring, suitable for enhancing the enjoyment of entertainment events. Howard Neely III, Ronald T. Azuma, Jerry Isdale, Mike Daily. Issued Sept. 14, 2010.

  • US 7,599,789. Beacon-augmented pose estimation. Jon Leonard, Howard Neely III, Ron Azuma, Mike Daily. Issued Oct. 6, 2009.

  • US 7,315,241. Enhanced Perception Lighting. Mike Daily, Ron Azuma, Chris Furmanski. Issued Jan. 1, 2008.

  • US 7,131,060. System and Method for Automatic Placement of Labels for Interactive Graphics Applications. Ronald Azuma. Issued Oct. 31, 2006.

  • US 7,120,875. Augmented Reality Hybrid Tracking System with Fiducial-Based Heading Correction. Michael Daily, Ronald Azuma, Howard Neely III, Gerald Isdale. Issued Oct. 10, 2006.

  • US 7,002,551. An Optical See-Through Augmented Reality Modified-Scale Display. Ronald Azuma, Ron Sarfaty. Issued Feb. 21, 2006.

  • US 6,577,976. Real-Time Sensor Autocalibration for a Multi-Sensor Inertial Tracking System. Bruce Hoff, Ronald Azuma. Issued June 10, 2003.

  • US 6,408,251. Calibrating a Magnetic Compass With an Angular Rate Gyroscope and a Global Positioning Receiver. Ronald Azuma. Issued June 18, 2002.

Demonstrations

Professional Activities

  • Inaugural member of the IEEE VGTC [Visualization and Graphics Technical Community] Virtual Reality Academy, for career and lifetime achievements in VR/AR/MR (March 2022).

  • Served as ISMAR 2021 panels co-chair.

  • Served on committee that determined the "10 year impact paper" at ISMAR 2019.

  • Served on poster award committee for the ISMAR 2017 conference.

  • Participated in the IVRI Summit (May 2017). The International Virtual Reality Institute is a new VR/AR research center headed by Tom Furness.

  • Served as judge for the Auggie awards at the Augmented World Expo 2016

  • International Advisory Board member of the International Journal of Virtual and Augmented Reality

  • IEEE Fellow (January 2016)

  • Papers Award Chair for IEEE ISMAR 2015.

  • Member of the team that won an Intel Labs Academy Award for the “Best Promising New Idea” (April 2014).

  • Served on the Advisory Board for NITLE (National Institute for Technology in Liberal Education), starting in Fall 2013.

  • I was a member of the Steering Committee for the IEEE International Symposium on Mixed and Augmented Reality from 2002 - 2020, and I was the Chair of the Steering Committee from 2008 - 2012.

  • My original AR survey paper was selected by MIT Press as one of 50 influential journal articles, chosen from over 80 MIT Press journals published from 1969 to 2011 and covering all academic fields.

  • Co-guest editor of Computers & Graphics special issue on “Mobile Augmented Reality” (August 2011)

  • Program Chair for IEEE/ACM International Symposium on Mixed and Augmented Reality 2002 and 2005

  • Program Chair for International Symposium on Augmented Reality 2001

  • Area Chair for IEEE/ACM International Symposium on Mixed and Augmented Reality 2004 [There were ten area chairs who decided which papers to accept] and ISMAR 2006 and 2007 [One of 12 area chairs].

  • Invited attendee of the 9th Annual National Academy of Engineering Symposium on Frontiers of Engineering (Sept. 2003). Attendance was limited to 100 young engineers (50% from industry, 50% from academia) chosen through a competitive selection process. Also invited attendee of the 2005 Japan-America Frontiers of Engineering Symposium (Nov. 2005).

  • Co-Organizer for session on Augmented Reality at the National Academy of Engineering’s 2010 EU-US Symposium on Frontiers of Engineering (Sept. 2010). Held in Cambridge, UK. Responsible for choosing session topic, recruiting speakers and attendees, and organizing session.

  • Awards Chair for IEEE/ACM International Symposium on Mixed and Augmented Reality 2008

  • Instructor in SIGGRAPH 1995, 1997, 2001 and 2004 courses

  • IEEE VRAIS (95-98) program committee

  • IEEE Virtual Reality conference (1999-2002) program committee

  • VRST conference (2004-2005) program committee

  • First International Workshop on Mobile Geospatial Augmented Reality scientific committee (2006)

  • International Workshop on Augmented Reality (1998-2000) program committee

  • Reviewer for IEEE ISMAR, ACM SIGGRAPH, IEEE TVCG, IEEE VR, Presence, EGVE, and others

  • ACM Senior Member

  • Member ACM SIGGRAPH and IEEE Computer Society

  • Former Science Advisory Board member of the USC Integrated Media Systems Center

  • Former advisory board member for UC Irvine's Center for Virtual Reality

Contract Funding

  • Pre-launch weapon detection. One DARPA STO seedling. (2008)

  • CT2WS [Cognitive Technology Threat Warning System]. DARPA STO. (2007)

  • COVER ME seedling. DARPA IXO. (2007)

  • UltraVis seedling and Immersive Operations panel. DARPA IXO. (2006)

  • Pre-launch detection of RPGs. One DARPA STO seedling. Two DARPA TTO seedlings. (2005-6)

  • Visualization for Insight into the Overall NAS (VisION). AFRL Rome (2004-5)

  • AR Vision System for Ground Controller. SBIR (NASA/Seagull), in two phases (1999-2001)

  • Direct Visualization of the Electronic Battlefield. DARPA ATO (1999-2000)

  • Geospatial Registration of Information for Dismounted Soldiers. DARPA ETO (1997-1999)

  • Human-Computer Symbiotes. DARPA ITO (1997-1999)

Citizenship

I am a US citizen.