Apple’s ARKit vs. Google’s ARCore – Two Major Augmented Reality Frameworks Have Emerged


Sizzle™, the first Global TransMedia Network, in which everything and everywhere is instantly transactional, informational and entertaining, allows full integration of both ARKit and ARCore applications, media, experiences, and games. Learn more at http://sizzle.network

To better understand how these augmented reality frameworks will integrate into the Sizzle™ iPhone and Android apps, we’ve consolidated the necessary information, websites, and resources to make your investigation fast and efficient.

Apple took the augmented reality leadership position in June 2017 by announcing its upcoming ARKit, a readily accessible framework that lets developers create all sorts of new augmented reality experiences with greater efficiency and ease. Suddenly, surfaces were instantly identifiable, along with other aspects of the 3D space at a location, so that an augmented reality experience can have relevance in the world in which it is set.

ARCore provides SDKs for many of the most popular development environments. These SDKs provide native APIs for all of the essential AR features like motion tracking, environmental understanding, and light estimation. With these features you can build entirely new AR experiences or enhance existing apps with AR features.


Content from both the ARCore developer site and the ARKit developer site is assembled below for a fast and easy comparison for our clients:

Let’s start with the new kid on the block, ARCore…


ARCore is a platform for building augmented reality apps on Android. ARCore uses three key technologies to integrate virtual content with the real world as seen through your phone’s camera:

  • Motion tracking allows the phone to understand and track its position relative to the world.
  • Environmental understanding allows the phone to detect the size and location of flat horizontal surfaces like the ground or a coffee table.
  • Light estimation allows the phone to estimate the environment’s current lighting conditions.

Before diving into ARCore, it’s helpful to understand a few fundamental concepts. Together, these concepts illustrate how ARCore enables experiences that can make virtual content appear to rest on real surfaces or be attached to real world locations.

Motion tracking

As your phone moves through the world, ARCore uses a process called concurrent odometry and mapping, or COM, to understand where the phone is relative to the world around it. ARCore detects visually distinct features in the captured camera image called feature points and uses these points to compute its change in location. The visual information is combined with inertial measurements from the device’s IMU to estimate the pose (position and orientation) of the camera relative to the world over time.

By aligning the pose of the virtual camera that renders your 3D content with the pose of the device’s camera provided by ARCore, developers are able to render virtual content from the correct perspective. The rendered virtual image can be overlaid on top of the image obtained from the device’s camera, making it appear as if the virtual content is part of the real world.
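
To make this concrete, here is a minimal sketch of per-frame pose tracking with ARCore’s Java API. It assumes an already-configured Session and a GL render loop; method names follow current ARCore releases (the 2017 preview SDK differed in places), and the commented-out renderer call is a hypothetical stand-in for your own drawing code.

```java
import com.google.ar.core.Camera;
import com.google.ar.core.Frame;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class PoseTracker {
    private final float[] viewMatrix = new float[16];
    private final float[] projectionMatrix = new float[16];

    /** Call once per rendered frame, on the GL thread. */
    public void onDrawFrame(Session session) throws CameraNotAvailableException {
        Frame frame = session.update();          // advances COM-based tracking
        Camera camera = frame.getCamera();
        if (camera.getTrackingState() != TrackingState.TRACKING) {
            return;                              // pose is not reliable yet
        }
        // View and projection matrices for the virtual camera, so rendered
        // 3D content lines up with the physical camera image.
        camera.getViewMatrix(viewMatrix, 0);
        camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f);
        // renderer.draw(viewMatrix, projectionMatrix);  // your drawing code
    }
}
```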


Environmental understanding

ARCore is constantly improving its understanding of the real world environment by detecting feature points and planes.

ARCore looks for clusters of feature points that appear to lie on common horizontal surfaces, like tables and desks, and makes these surfaces available to your app as planes. ARCore can also determine each plane’s boundary and make that information available to your app. You can use this information to place virtual objects resting on flat surfaces.

Because ARCore uses feature points to detect planes, flat surfaces without texture, such as a white desk, may not be detected properly.
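
As an illustration, the following Java sketch enumerates the planes ARCore is currently tracking. API names are from current ARCore releases (the preview SDK differed slightly); the logging is just for demonstration.

```java
import com.google.ar.core.Plane;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.TrackingState;

public final class PlaneScan {
    /** Prints every plane ARCore is currently tracking. */
    static void listPlanes(Session session) {
        for (Plane plane : session.getAllTrackables(Plane.class)) {
            if (plane.getTrackingState() != TrackingState.TRACKING) continue;
            Pose center = plane.getCenterPose();   // plane center, world space
            float extentX = plane.getExtentX();    // approximate size in meters
            float extentZ = plane.getExtentZ();
            // plane.getPolygon() returns the detected boundary as a
            // FloatBuffer of (x, z) vertices in the plane's local frame.
            System.out.printf("Plane at %s, %.2f m x %.2f m%n",
                    center, extentX, extentZ);
        }
    }
}
```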


Light estimation

ARCore can detect information about the lighting of its environment and provide you with the average intensity of a given camera image. This information lets you light your virtual objects under the same conditions as the environment around them, increasing the sense of realism.
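
A minimal sketch of reading that estimate, using the same ARCore Java API as above; how you apply the returned scale (a shader uniform, a light node’s intensity) is up to your renderer.

```java
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;

public final class Lighting {
    /** Returns a brightness scale for virtual lights, or 1.0 if unknown. */
    static float ambientScale(Frame frame) {
        LightEstimate estimate = frame.getLightEstimate();
        if (estimate.getState() != LightEstimate.State.VALID) {
            return 1.0f;                 // no usable estimate this frame
        }
        // Roughly 0.0 (dark) to 1.0 (bright): feed this to your shaders or
        // light nodes so virtual objects dim and brighten with the room.
        return estimate.getPixelIntensity();
    }
}
```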


User interaction

ARCore uses hit testing to take an (x, y) coordinate on the phone’s screen (provided by a tap or whatever other interaction you want your app to support) and project a ray into the camera’s view of the world, returning any planes or feature points that the ray intersects, along with the pose of that intersection in world space. This allows users to select or otherwise interact with objects in the environment.
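
For example, a Java sketch of a tap handler that keeps only hits landing inside a detected plane’s boundary (tapX/tapY would come from a MotionEvent; API names from current ARCore releases):

```java
import com.google.ar.core.Frame;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.core.Trackable;

public final class TapHelper {
    /** Returns the first hit inside a detected plane, or null if none. */
    static HitResult firstPlaneHit(Frame frame, float tapX, float tapY) {
        for (HitResult hit : frame.hitTest(tapX, tapY)) {
            Trackable trackable = hit.getTrackable();
            if (trackable instanceof Plane
                    && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
                return hit;  // getHitPose() is the intersection in world space
            }
        }
        return null;
    }
}
```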

Anchoring objects

Poses can change as ARCore improves its understanding of its own position and its environment. When you want to place a virtual object, you need to define an anchor so that ARCore tracks the object’s position over time. Often you create an anchor based on the pose returned by a hit test, as described under user interaction. This allows your virtual content to remain stable relative to the real-world environment even as the device moves around.
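
Continuing the hit-test sketch above, anchoring might look like the following in the Java API; the key point is reading the pose from the anchor each frame instead of caching it.

```java
import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.TrackingState;

public final class Anchoring {
    /** Pins content to the real-world spot a hit test returned. */
    static Anchor placeObjectAt(HitResult hit) {
        return hit.createAnchor();
    }

    /** Per frame: get the model matrix from the (possibly refined) anchor. */
    static void updateModelMatrix(Anchor anchor, float[] modelMatrix) {
        if (anchor.getTrackingState() == TrackingState.TRACKING) {
            // ARCore refines anchor poses as its world map improves, so
            // always read the current pose rather than a cached one.
            anchor.getPose().toMatrix(modelMatrix, 0);
        }
        // Call anchor.detach() when the virtual object is removed.
    }
}
```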

Supported Devices

ARCore is designed to work on a wide variety of qualified Android phones running Android 7.0 (Nougat) and later. During the SDK preview, ARCore supports the following devices:

  • Google Pixel and Pixel XL
  • Samsung Galaxy S8 (SM-G950U, SM-G950N, SM-G950FD, SM-G950W, SM-G950U1)

How does ARCore work?

Fundamentally, ARCore is doing two things: tracking the position of the mobile device as it moves, and building its own understanding of the real world.

ARCore’s motion tracking technology uses the phone’s camera to identify interesting points, called features, and tracks how those points move over time. With a combination of the movement of these points and readings from the phone’s inertial sensors, ARCore determines both the position and orientation of the phone as it moves through space.

In addition to identifying key points, ARCore can detect flat surfaces, like a table or the floor, and can also estimate the average lighting in the area around it. These capabilities combine to enable ARCore to build its own understanding of the world around it.

ARCore’s understanding of the real world lets you place objects, annotations, or other information in a way that integrates seamlessly with the real world. You can place a napping kitten on the corner of your coffee table, or annotate a painting with biographical information about the artist. Motion tracking means that you can move around and view these objects from any angle, and even if you turn around and leave the room, when you come back, the kitten or annotation will be right where you left it.

REFERENCE:

https://developers.google.com/ar/reference/


Now Let’s Cover Apple’s ARKit

https://developer.apple.com/arkit/


iOS 11 introduces ARKit, a new framework that allows you to easily create unparalleled augmented reality experiences for iPhone and iPad. By blending digital objects and information with the environment around you, ARKit takes apps beyond the screen, freeing them to interact with the real world in entirely new ways.

ARKIT VIDEO OVERVIEW

Watch Introducing ARKit: Augmented Reality for iOS


Visual Inertial Odometry

ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. VIO fuses camera sensor data with CoreMotion data. These two inputs allow the device to sense how it moves within a room with a high degree of accuracy, and without any additional calibration.
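
Since ARKit’s API is Swift/Objective-C, the sketches for this half are in Swift. Here is a minimal view controller that starts world tracking and reads the fused VIO pose each frame; sceneView is an assumed ARSCNView outlet, and the names follow the standard iOS 11 ARKit API.

```swift
import UIKit
import ARKit

class ARViewController: UIViewController, ARSessionDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // used in the next section
        sceneView.session.delegate = self
        sceneView.session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // 4x4 camera transform in world space, fused by VIO from the camera
        // image and CoreMotion data, with no additional calibration.
        let cameraTransform = frame.camera.transform
        _ = cameraTransform  // feed this to your renderer's virtual camera
    }
}
```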


Scene Understanding and Lighting Estimation

With ARKit, iPhone and iPad can analyze the scene presented by the camera view and find horizontal planes in the room. ARKit can detect horizontal planes like tables and floors, and can track and place objects on smaller feature points as well. ARKit also makes use of the camera sensor to estimate the total amount of light available in a scene and applies the correct amount of lighting to virtual objects.
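
A short Swift sketch of both features with SceneKit, continuing the hypothetical view controller above (it also requires sceneView.delegate = self during setup):

```swift
import ARKit
import SceneKit

extension ARViewController: ARSCNViewDelegate {
    // Called when ARKit detects a new plane and adds an anchor for it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode,
                  for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // center and extent describe the detected surface; attach geometry
        // to `node` to place virtual content resting on the plane.
        print("Plane detected, extent: \(planeAnchor.extent)")
    }

    // Called every frame: apply ARKit's light estimate to the scene.
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        guard let estimate = sceneView.session.currentFrame?.lightEstimate else { return }
        // An ambientIntensity around 1000 corresponds to neutral lighting.
        sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000
    }
}
```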


High Performance Hardware and Rendering Optimizations

ARKit runs on the Apple A9 and A10 processors. These processors deliver breakthrough performance that enables fast scene understanding and lets you build detailed and compelling virtual content on top of real-world scenes. You can take advantage of the optimizations for ARKit in Metal, SceneKit, and third-party tools like Unity and Unreal Engine.

Sizzle™ allows full integration of both ARKit and ARCore applications, media, experiences, and games. Learn more at http://sizzle.network

 
