Google’s ARCore platform can create virtual objects, blending them with the real world through your device’s camera. Follow along as we explore some key ARCore concepts and delve into creating an augmented reality app from scratch.
With the release of ARCore, Google has provided us with a brand new platform for building augmented reality apps for Android, but how does it work and where do we get started?
ARCore makes use of three main technologies to achieve this: motion tracking, environmental understanding (detecting surfaces such as floors and tables) and light estimation.
There are five main concepts to understand before diving into the details.
When you move around with your device, ARCore uses the camera to detect “visually distinct features” in each captured image. Those are called feature points. ARCore uses these points in combination with the device sensors to figure out your location in the space and estimate your pose.
ARCore uses pose to refer to the position and orientation of the camera. It needs to align the pose of the virtual camera with the pose of your device’s camera so that virtual objects are rendered from the correct perspective. This allows you to place a virtual object on top of a plane, circle around it and watch it from behind.
When processing the camera input stream, apart from feature points ARCore also looks for horizontal surfaces, like tables, desks or the floor. Those detected surfaces are called planes. We’ll see how you can use these planes to anchor virtual objects to the scene.
For a virtual object to be placed it needs to be attached to an anchor. An anchor describes a fixed location and orientation in the real world. By attaching the virtual object to an anchor, we ensure ARCore tracks the object’s position and orientation correctly over time. An anchor is created as a result of a hit test when tapping on the screen.
When the user taps on the device’s screen, ARCore runs a hit test from that (x,y) coordinate. Imagine a ray of light starting at the point your finger touched and travelling straight into the scene through the camera. ARCore will return any planes or feature points intersected by this ray, plus the pose of each intersection. The result of a hit test is a collection of plane-and-pose pairs that we can use to create anchors and attach virtual objects to the world.
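As a rough sketch, a tap handler built on ARCore’s hit test API might look like the following (the session and frame are assumed to come from your rendering loop, and attachObjectTo is a hypothetical helper standing in for your own rendering code):

```java
// Classes come from com.google.ar.core (ARCore 1.x API).
void onTap(Frame frame, MotionEvent tap) {
    for (HitResult hit : frame.hitTest(tap)) {
        Trackable trackable = hit.getTrackable();
        // Only anchor to hits that landed inside a detected plane.
        if (trackable instanceof Plane
                && ((Plane) trackable).isPoseInPolygon(hit.getHitPose())) {
            Anchor anchor = hit.createAnchor();
            attachObjectTo(anchor); // hypothetical: render your model at this anchor
            break;
        }
    }
}
```

ARCore tracks the anchor from then on, so the virtual object stays fixed to the real-world spot even as the device moves.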
You can develop ARCore Android apps the same way you develop normal Android apps; there’s no need for any extra tools. What changes is the way you render your app’s views, because we are going to be rendering virtual objects on top of the camera image stream. For that we need to use a SurfaceView and OpenGL.
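A minimal sketch of that surface setup, following the pattern in Google’s sample app (the R.id.surfaceview id and the ArRenderer class are assumptions, not part of any API):

```java
// Inside the Activity's onCreate: wire up a GLSurfaceView for AR rendering.
GLSurfaceView surfaceView = findViewById(R.id.surfaceview);
surfaceView.setPreserveEGLContextOnPause(true);
surfaceView.setEGLContextClientVersion(2);            // OpenGL ES 2.0
surfaceView.setRenderer(new ArRenderer(session));     // hypothetical renderer class
surfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
```

The continuous render mode is what gives the app its game-like draw-every-frame behaviour described below.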
In this example we are going to showcase a simple app that renders the camera feed, draws the detected feature points and planes, and places 3D objects on planes when you tap them.
This flowchart provides a high level overview of what the app does:
When you develop an app for ARCore it behaves pretty much like a game: there is a surface on which you draw updates every frame. That is what happens inside the green box in the above diagram.
The ARCore main entry point is a class called Session. A new Session is created when the Activity starts, and it’s responsible for handling the AR system state and the session lifecycle. This means every time a frame is drawn you need to ask this Session object for the AR state: are there any feature points? Any planes?
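A sketch of tying the Session to the Activity lifecycle (API as in ARCore 1.x; the checked exceptions thrown during Session construction and resume are elided here for brevity):

```java
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    session = new Session(/* context= */ this);
}

@Override
protected void onResume() {
    super.onResume();
    try {
        session.resume();   // starts/continues camera capture and tracking
    } catch (CameraNotAvailableException e) {
        finish();           // simplistic handling for this sketch
    }
}

@Override
protected void onPause() {
    super.onPause();
    session.pause();        // stops the camera while in the background
}
```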
The sample app draws a background, feature points, planes and 3D objects. This is the sequence of operations executed for each draw frame call:
1. Call session.update(). This provides you with a Frame, which you can use for the following steps.
2. Call frame.getCamera(). This provides you with a Camera, which you can use to obtain the view and projection matrices.
3. Take the Frame from step 1 and use it to draw the background of your app. This is basically the image we are receiving from the camera. If we removed everything else and only kept these steps, we would end up with a surface view displaying the camera feed.
4. Take the Frame and ask for the “current set of estimated 3d points attached to real-world geometry”. Then draw this point cloud using the camera projection and view matrices as reference.
5. Ask the Session for all the trackables with type equal to Plane and draw them. (For now planes and points are the only types of Trackable, but this makes us think more types will be added in the future.)
6. Frame has a method hitTest that receives an Android MotionEvent and returns the list of successful hits for that event. Then it’s just a matter of iterating through that list and creating an anchor when a hit landed on a plane.
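The sequence above can be sketched as one draw-frame pass (ARCore 1.x API; the draw* helpers are hypothetical renderer calls, and exception handling is reduced to an early return):

```java
@Override
public void onDrawFrame(GL10 gl) {
    Frame frame;
    try {
        frame = session.update();            // 1. latest AR state
    } catch (CameraNotAvailableException e) {
        return;                              // skip this frame
    }
    Camera camera = frame.getCamera();       // 2. device camera pose

    drawBackground(frame);                   // 3. camera image as background

    float[] viewMatrix = new float[16];
    float[] projectionMatrix = new float[16];
    camera.getViewMatrix(viewMatrix, 0);
    camera.getProjectionMatrix(projectionMatrix, 0, 0.1f, 100.0f);

    PointCloud pointCloud = frame.acquirePointCloud();
    drawPointCloud(pointCloud, viewMatrix, projectionMatrix);      // 4. feature points
    pointCloud.release();

    for (Plane plane : session.getAllTrackables(Plane.class)) {    // 5. planes
        drawPlane(plane, viewMatrix, projectionMatrix);
    }

    drawAnchoredObjects(viewMatrix, projectionMatrix);             // 6. 3D models
}
```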
ARCore apps can also be run on the Android emulator: it will generate a virtual house interior scene where you can move around and simulate plane and feature point detection.
As ARCore was released by Google very recently, documentation on automated test support is currently non-existent, and material on the topic across the web is just as scarce. However, this doesn't mean we can't test our ARCore applications.
Unit testing forms the foundation of the testing pyramid and is traditionally where the majority of our automated test coverage/effort originates. When it comes to testing our ARCore based applications this should be no different. The code we're writing is still Java after all.
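For instance, any placement logic that doesn’t touch the ARCore API can be covered with plain JUnit-style checks. The PlacementRules class below and its distance threshold are hypothetical, invented purely for illustration:

```java
public class PlacementRules {
    static final float MAX_PLACEMENT_DISTANCE_METRES = 3.0f;

    // Returns true when a hit at the given distance should create an anchor.
    static boolean isPlaceable(float distanceMetres) {
        return distanceMetres > 0f && distanceMetres <= MAX_PLACEMENT_DISTANCE_METRES;
    }

    public static void main(String[] args) {
        // Plain checks stand in for a JUnit runner in this self-contained sketch.
        if (!isPlaceable(1.5f)) throw new AssertionError("1.5m should be placeable");
        if (isPlaceable(5.0f)) throw new AssertionError("5.0m should be rejected");
        if (isPlaceable(-1.0f)) throw new AssertionError("negative distance rejected");
        System.out.println("all placement checks passed");
    }
}
```

Keeping rules like this out of the rendering code is what makes them testable without a device or camera.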
Moving up the testing pyramid we come to integration and user interface testing. Unfortunately, there isn’t currently a clear path towards driving user interaction with our ARCore application programmatically. Purpose-built strategies and tools to accomplish this should become available as the technology is adopted on a wider scale; in the meantime, we can focus a larger proportion of our testing effort on manual exploratory testing.
The ARCore technology used to detect planes and feature points, and to anchor 3D models within a space, is impressive. Using all of these features is easy and intuitive once you learn the vocabulary and related terminology.
The main pain point we found for Android developers was rendering 3D models. The basic sample project uses OpenGL, which is not the nicest thing to work with: it operates at a very low level, and you’re required to have a good understanding of matrix transformations and complex 3D-space maths to work with it.
There are third-party 3D rendering libraries available for Android that you can use instead of raw OpenGL. None of them is officially recommended by Google, so it’s up to you to choose. On top of that, there’s no Android Studio integration for any of them, which doesn’t help in making things easier.
Something else to worry about is the fact that we didn’t find a strategy to automatically test ARCore apps.
We’d love to see how ARCore evolves in the future. We hold high expectations for this technology and hope to see easier ways to render models in the future as well as better Android Studio support or integrations.
Finally, if you’re curious about how to do AR on iOS, check out https://blog.novoda.com/getting-started-with-arkit/ by our colleague Berta.