To further strengthen efforts on augmented and virtual reality, Apple has acquired Germany-based computer vision company SensoMotoric Instruments (SMI).
Founded in 1991, SMI provides eye tracking systems that can be integrated into virtual reality CAVEs, head-mounted displays (such as Google Glass or the Oculus Rift), simulators, cars, or computers as a measurement or interaction modality.
"Apple buys smaller technology companies from time to time, and we generally do not discuss our purpose or plans," US-based news website Axios quoted an Apple representative as saying.
The company plans to build augmented reality tools for developers into iOS 11.
Earlier this month, at the company’s annual Worldwide Developers Conference (WWDC), Apple introduced a new “ARKit” platform that lets developers build apps that use the iPhone’s camera and sensors to place virtual objects in real-world environments.
ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around the device. VIO fuses camera sensor data with CoreMotion data, and these two inputs allow the device to sense how it moves within a room with a high degree of accuracy and without any additional calibration.
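The fusion idea behind VIO can be illustrated with a toy complementary filter: a high-rate inertial estimate that slowly drifts is blended with a noisier but drift-free visual estimate. This is a conceptual sketch only, not Apple's implementation; the function name, weights, and sample values are invented for illustration.

```python
# Conceptual sketch of visual-inertial fusion (NOT Apple's algorithm).
# A complementary filter blends a drift-prone but smooth inertial
# position estimate with a noisy but landmark-anchored visual estimate.

def fuse(inertial_pos, visual_pos, alpha=0.9):
    """Blend two 1-D position estimates; alpha weights the inertial term."""
    return alpha * inertial_pos + (1 - alpha) * visual_pos

# The IMU integrates acceleration and drifts upward over time;
# the camera-based estimate is noisy frame-to-frame but unbiased.
inertial_track = [0.00, 0.11, 0.22, 0.35]
visual_track   = [0.00, 0.09, 0.21, 0.30]

estimate = 0.0
for imu, cam in zip(inertial_track, visual_track):
    estimate = fuse(imu, cam)

print(round(estimate, 3))  # → 0.345
```

In a real VIO pipeline the fusion runs at high frequency inside a Kalman-style filter over full 6-DoF poses, but the principle is the same: each sensor compensates for the other's weakness.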
With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room.
ARKit can detect horizontal planes like tables and floors, and can track and place objects on smaller feature points as well. ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.
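A simplified way to picture horizontal-plane detection is grouping 3-D feature points that share roughly the same height. The sketch below is a stand-in for illustration; ARKit's actual algorithm is not public, and the function name, tolerance, and sample points are assumptions.

```python
# Conceptual sketch of horizontal-plane detection from 3-D feature points
# (a simplified stand-in; ARKit's real method is more sophisticated).
# Points whose heights (y) cluster within a tolerance form one plane.

def find_horizontal_planes(points, tolerance=0.02, min_points=3):
    """Group (x, y, z) points by similar height y; return plane heights."""
    planes = []
    cluster = []
    for p in sorted(points, key=lambda p: p[1]):
        if cluster and p[1] - cluster[0][1] > tolerance:
            if len(cluster) >= min_points:
                planes.append(sum(q[1] for q in cluster) / len(cluster))
            cluster = []
        cluster.append(p)
    if len(cluster) >= min_points:
        planes.append(sum(q[1] for q in cluster) / len(cluster))
    return planes

# Feature points from a floor (y ≈ 0.0) and a table top (y ≈ 0.75)
pts = [(0.1, 0.00, 0.2), (0.5, 0.01, 0.8), (0.9, 0.00, 0.4),
       (0.2, 0.75, 0.3), (0.6, 0.76, 0.7), (0.4, 0.75, 0.5)]
print([round(h, 2) for h in find_horizontal_planes(pts)])  # → [0.0, 0.75]
```

Once a plane's height is known, a virtual object can be anchored at that height so it appears to rest on the real table or floor.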
ARKit runs on the Apple A9 and A10 processors. Developers can take advantage of the optimisations for ARKit in Metal, SceneKit, and third-party tools like Unity and Unreal Engine.