Augmented reality has come a long way, from science-fiction concept to practical, everyday technology. Ever since AR became readily available on mobile phones, the technology has captured the interest of a much larger audience all over the world that previously had no access to it. In this article we will explore ARCore (by Google) and walk through its sample project step by step.



Augmented Reality Architecture: Exploring ARCore by Google


What is ARCore?

To explain it simply, ARCore is a software development kit, developed by Google, that allows augmented reality applications to be built.

ARCore uses three key technologies to integrate virtual content with the real environment:

  1. Motion tracking: allows the phone to understand and track its position relative to the world.
  2. Environmental understanding: allows the phone to detect the size and location of all types of surfaces: horizontal, vertical, and angled.
  3. Light estimation: allows the phone to estimate the environment's current lighting conditions.


Getting Started

To get started with ARCore Android app development, you need to have some basic knowledge of Android app development. Following are a few prerequisites:


  • Install Android Studio version 3.1 or higher with Android SDK Platform version 7.0 (API level 24) or higher
  • You will need a basic understanding of Android development. If you are new to Android, see Building your first Android app for beginners.
  • There is a limited number of devices capable of running ARCore. The complete list is available here.

In this tutorial, I will walk you through the sample (Hello World) app provided by Google. We will try to understand each line of its code. After that, I will provide some more useful links so that you can learn more.

Let us explore our first ARCore Android App


Opening the sample project:

Download and extract the Sceneform samples, or clone the repository with the following command:

git clone https://github.com/google-ar/sceneform-android-sdk.git

In Android Studio, open the Hello Sceneform sample project, located in the sceneform-android-sdk/ directory.


Running the sample:

Make sure your Android device is connected to the development machine and click Run in Android Studio. Then, choose your device as the deployment target and click OK.

Android Studio installs the APK, and then runs the app on your device. You may be prompted to install or update the ARCore app if it is missing or out of date. Select CONTINUE to install it from Google Play Store.

The Hello Sceneform app lets you place and manipulate Android figurines on flat surfaces.


Walkthrough of Code:


1. AndroidManifest.xml

We need to add two entries: the Camera permission and the AR camera hardware feature.

<uses-permission android:name="android.permission.CAMERA" />

<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

The other extra line you will see is the following, which makes your app visible in the Google Play Store only on devices that support ARCore:

<meta-data android:name="com.google.ar.core" android:value="required" />


2. Build.gradle (app)

dependencies {

    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.15.0"

}

This dependency provides ArFragment and other Sceneform UX resources; we will be exploring ArFragment in detail later on. If you don't want to use these UX resources, you can add the following dependency instead:

implementation "com.google.ar.sceneform:core:1.15.0"


3. Build.gradle (hellosceneform)

You will also see the following classpath added in this file:

dependencies {

    classpath 'com.google.ar.sceneform:plugin:1.15.0'

}


These add the Sceneform SDK to the project and the Sceneform plugin to Android Studio. The plugin lets you view .sfb files, which are the 3D models that get rendered in your camera view, and it also helps you import, view, and build 3D assets.


4. Adding the assets

You will need to add the 3D models that will be rendered on your screen. You can build these models yourself if you are familiar with 3D model creation, or you can visit Poly. The models there are free to download; just credit the creator and you are good to go.



In Android Studio, expand the app folder in the left-hand project pane. You'll notice a “sampledata” folder, which will hold all of your 3D model assets. Create a folder for your model inside the sampledata folder.

When you download the zip file from Poly, you will most probably find three files:

  1. .mtl file
  2. .obj file
  3. .png file

The most important of the three is the .obj file: it is your actual model. Place all three files inside sampledata -> “your model’s folder”.
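To give a feel for what that .obj file contains, it is plain text in which lines starting with "v" define vertex positions and lines starting with "f" define faces. The following is a tiny illustrative sketch; the ObjPeek and countVertices names are made up for this example, and Sceneform's import plugin does the real parsing:

```java
// Minimal sketch of what an .obj file contains: plain text lines where
// "v x y z" defines a vertex position and "f a b c" defines a face.
// Illustrative only; Sceneform's import plugin does the real parsing.
public class ObjPeek {
    public static int countVertices(String objText) {
        int count = 0;
        for (String line : objText.split("\n")) {
            // vertex position lines start with "v " (not "vt"/"vn")
            if (line.startsWith("v ")) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        String sample = "v 0 0 0\nv 1 0 0\nv 0 1 0\nf 1 2 3\n";
        System.out.println(countVertices(sample)); // prints 3
    }
}
```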


Now right-click on the .obj file. The first option will be Import Sceneform Asset. Click on it, leave the default settings unchanged, and just click Finish in the next window. Gradle will then sync to include the asset in the assets folder. Once the Gradle build finishes, you are good to go.

So, after the Gradle sync finishes, if you open build.gradle (app) you will find a sceneform.asset entry near the end of the file. Every model should be named uniquely, and its reference is added to the Gradle file automatically.
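As a rough sketch, assuming a model named andy imported from a sampledata/models/ folder, the generated entry looks something like this (the exact paths depend on your model's name and location):

```groovy
// Generated by the Import Sceneform Asset action (paths are illustrative)
sceneform.asset('sampledata/models/andy.obj',  // source .obj asset
        'default',                             // material
        'sampledata/models/andy.sfa',          // editable .sfa description
        'src/main/res/raw/andy')               // compiled .sfb output
```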


5. activity_ux.xml

<fragment android:name="com.google.ar.sceneform.ux.ArFragment" .../>

You will see a fragment enclosed in a frame layout. The fragment class is ArFragment, provided by the Google Sceneform SDK.


At the bottom of the activity you will find checkIsSupportedDeviceOrFinish. This method checks whether your device can support the Sceneform SDK. The SDK requires Android API level 27 or newer and OpenGL ES version 3.0 or newer. If a device does not support both, the scene will not be rendered and your application will show a blank screen.
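The core of that check can be sketched in plain Java. On a real device the inputs come from Build.VERSION.SDK_INT and from ActivityManager's getDeviceConfigurationInfo().getGlEsVersion(); they are plain parameters here so the logic can be shown in isolation, and the SupportCheck/isSupported names are illustrative, not part of the Sceneform API:

```java
// Sketch of the logic behind checkIsSupportedDeviceOrFinish.
// Class and method names are illustrative, not the Sceneform API.
public class SupportCheck {
    static final int MIN_API_LEVEL = 27;          // Sceneform needs API 27+
    static final double MIN_OPENGL_VERSION = 3.0; // and OpenGL ES 3.0+

    public static boolean isSupported(int sdkInt, String glEsVersion) {
        if (sdkInt < MIN_API_LEVEL) {
            return false; // Android version too old for Sceneform
        }
        // glEsVersion is a string like "3.2" on Android
        return Double.parseDouble(glEsVersion) >= MIN_OPENGL_VERSION;
    }

    public static void main(String[] args) {
        System.out.println(isSupported(27, "3.2")); // prints true
        System.out.println(isSupported(24, "3.0")); // prints false
    }
}
```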

Let's go through the rest of the code line by line. First, we grab the arFragment that we included in the layout file. You can think of it as the container of our scene.

arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

Next, we use the ModelRenderable class to build our model. With the setSource method, we load the model from the .sfb file that was generated when we imported the asset. The thenAccept method receives the model once it is built, and we assign the loaded model to andyRenderable. For error handling we have the exceptionally method, which is called if an exception is thrown. All of this happens asynchronously, so you don't need to worry about multi-threading or dealing with handlers.


ModelRenderable.builder()
    .setSource(this, R.raw.andy)
    .build()
    .thenAccept(renderable -> andyRenderable = renderable)
    .exceptionally(
        throwable -> {
            Toast toast =
                Toast.makeText(this, "Unable to load andy renderable", Toast.LENGTH_LONG);
            toast.setGravity(Gravity.CENTER, 0, 0);
            toast.show();
            return null;
        });


With the model loaded and stored in the andyRenderable variable, we’ll now add it to our scene.

The arFragment hosts our scene and will receive the tap events. So we need to set the onTap listener to our fragment to register the tap and place an object accordingly.


arFragment.setOnTapArPlaneListener(
    (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
        // place the model on the tapped plane here
    });


In onTap we first check whether our model is ready to use by null-checking andyRenderable. We create our anchor from the HitResult using hitResult.createAnchor() and store it in an Anchor object. Then we create a node out of this anchor, an AnchorNode, and attach it to the scene by calling setParent on it and passing in the scene from the fragment. Next we create a TransformableNode, which will be our andy, and parent it to the anchor node. The node still doesn't have any information about the object it has to render, so we pass that object using the andy.setRenderable method, which takes a renderable as its parameter, in our case andyRenderable. Finally, we call andy.select().

// Create the Anchor.
Anchor anchor = hitResult.createAnchor();
AnchorNode anchorNode = new AnchorNode(anchor);
anchorNode.setParent(arFragment.getArSceneView().getScene());

// Create the transformable andy and add it to the anchor.
TransformableNode andy = new TransformableNode(arFragment.getTransformationSystem());
andy.setParent(anchorNode);
andy.setRenderable(andyRenderable);
andy.select();




This was your first look at how to create a simple ARCore app from scratch with Android Studio. In the next tutorial, I will explain more renderable and node types, and we will look into more of the customization available in the Sceneform SDK.



Topics: Augmented reality, AR, Augmented reality apps, android studio, Hello sceneform, augmented reality android

Abdur Rahman Sorohy

Written by Abdur Rahman Sorohy

Software Engineer at Tintash