
In addition to providing previews of its upcoming iOS 13, macOS Catalina, watchOS 6, and other platforms, Apple announced the new ARKit 3 and RealityKit software frameworks for developers. These frameworks are designed to help developers build great AR experiences. The company also unveiled a new app called Reality Composer for iOS, iPadOS, and Mac, which will allow developers to produce and prototype AR experiences right on the devices where they will ultimately run. Additionally, Apple is bringing HomeKit to security cameras and routers.

According to Apple[1], ARKit 3[2] puts people at the centre of AR with its new Motion Capture and People Occlusion features. With Motion Capture, developers will be able to bring people's movements directly into their apps; it will be able to track a person's limbs, head, and torso in real time. People Occlusion will make sure AR content shows up naturally in front of or behind people.
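As a rough illustration of how an app might opt into these features on supported devices, the sketch below uses the ARKit 3 configuration types; the surrounding session setup (the `session` variable) is assumed rather than taken from Apple's announcement.

```swift
import ARKit

// A minimal sketch, assuming an existing ARSession named `session`.

// People Occlusion: request person segmentation with depth so rendered
// AR content can appear in front of or behind people in the camera feed.
let worldConfig = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    worldConfig.frameSemantics.insert(.personSegmentationWithDepth)
}
// session.run(worldConfig)

// Motion Capture: a separate body-tracking configuration that reports a
// tracked person's skeleton (as ARBodyAnchor updates) in real time.
if ARBodyTrackingConfiguration.isSupported {
    let bodyConfig = ARBodyTrackingConfiguration()
    // session.run(bodyConfig)
}
```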

Additionally, ARKit 3 adds support for tracking up to three people with the front camera, as well as simultaneous access to both the front and rear cameras. Live collaborative sessions are also now available, allowing people to use a shared AR world map.
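As a sketch of the collaborative-session feature, the snippet below simply enables collaboration on a world-tracking configuration; how the resulting collaboration data is exchanged between devices (for example, over MultipeerConnectivity) is left to the app and is not shown here.

```swift
import ARKit

// A minimal sketch: opt a world-tracking session into collaboration so that
// nearby devices can contribute to a shared AR world map.
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
// session.run(config)

// Once enabled, the session delegate periodically receives
// ARSession.CollaborationData via session(_:didOutputCollaborationData:),
// which the app is expected to forward to its peers.
```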

RealityKit is the company's new framework for creating photorealistic AR experiences. It will let developers easily blend virtual objects with real-world environments.

“RealityKit was built from the ground up for AR,” Apple said in a statement. “It features a photorealistic rendering, as well as incredible environment mapping and support for camera effects like noise and motion blur, making virtual content nearly indistinguishable from reality.”

Developers can make use of RealityKit capabilities with the new RealityKit Swift API.
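As an illustrative sketch of that Swift API (not code from Apple's announcement), the example below places a simple virtual box on a real-world horizontal surface; the RealityKit type and method names are real, while the view setup around them is assumed.

```swift
import UIKit
import RealityKit

// A minimal sketch: an ARView hosting a single anchored virtual object.
let arView = ARView(frame: .zero)

// Build a small box with a metallic material.
let boxMesh = MeshResource.generateBox(size: 0.1)
let material = SimpleMaterial(color: .gray, isMetallic: true)
let boxEntity = ModelEntity(mesh: boxMesh, materials: [material])

// Anchor the box to a detected horizontal plane and add it to the scene,
// letting RealityKit handle rendering, lighting, and environment mapping.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(boxEntity)
arView.scene.addAnchor(anchor)
```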

Reality Composer, on the other hand, is an app that lets developers build...
