Sceneform SDK : A boon for Android developers

Overview

With AR trending, is it just me, or has using a flat, stationary 2D image become too mainstream nowadays? 😀

ARCore brings the Sceneform SDK, which can recognize images and load corresponding 3D models that users can interact with using gestures. Google calls this feature Augmented Images. Beyond that, a lot of other AR work can be done with the Sceneform SDK, and far more easily than ever imagined.

Sceneform Overview

Google announced the Sceneform SDK at Google I/O 2018. It handles all the complex 3D graphics and OpenGL work by itself, allowing an Android developer to build AR apps with far fewer lines of code. It requires a plugin to be installed and Android Studio 3.1 or above.

Sceneform Capabilities

Even in beta, along with Augmented Images, it provides basic AR functionality like moving, rotating and scaling a 3D model.

It also has a feature called Cloud Anchors, wherein two or more users each place a 3D model using their respective devices in the same environment, and every model can be viewed from every device. The models can also interact with each other. That’s cool, right?

Now, here comes my favourite part… (drum rolls)
Being a native Android developer, the functionality I find most interesting is that it can convert a layout/app screen into a renderable and load it as a 3D model in the physical environment. I can’t stop thinking about the endless variety of applications that could be built on this concept!

[Image: sceneform-image1]

Without spending another minute dreaming about Sceneform-enabled app possibilities :D, let’s dive into a practical scenario and have a look at its performance.

Basic information to get started

To start with, we need to install its plugin. Go to Preferences, search for Sceneform as shown below, install it and restart Android Studio.
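Besides the IDE plugin, a Sceneform project also needs the Gradle plugin and the UX library. A minimal sketch of the two build.gradle additions follows; the version number 1.15.0 is just an example and should be replaced with the current release.

```groovy
// Project-level build.gradle — Sceneform Gradle plugin (version is an example)
buildscript {
    dependencies {
        classpath 'com.google.ar.sceneform:plugin:1.15.0'
    }
}

// App-level build.gradle
apply plugin: 'com.google.ar.sceneform.plugin'

dependencies {
    // Sceneform UX package: provides ArFragment and the standard gesture controls
    implementation 'com.google.ar.sceneform.ux:sceneform-ux:1.15.0'
}
```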

[Image: sceneform-image2 – installing the Sceneform plugin from Preferences]

3D models can be downloaded from Google’s own website – https://poly.google.com. Sceneform supports 3D models with .obj, .gltf and .fbx extensions, and converts them into its own .sfa and .sfb formats.

.sfb (Sceneform Binary asset) is the actual model that is loaded into the app, and .sfa (Sceneform Asset Definition) is a human-readable description of the .sfb file.

Below is an example for .sfa file.

[Image: sceneform-image3 – an example .sfa file]

It stores information about the model, such as its scale, the textures to be loaded and other material properties. More information regarding .sfa attributes can be found at https://developers.google.com/ar/develop/java/sceneform/sfa

A 3D model can be converted into these formats simply by right-clicking on it and selecting Import Sceneform Asset. This opens a dialog wherein we can specify the output locations.
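Behind the scenes, the import dialog writes a `sceneform.asset` entry into the app-level build.gradle so that the conversion reruns on every build. A sketch of what that entry looks like, with example paths and model name:

```groovy
// Added by the Import Sceneform Asset dialog to app/build.gradle.
// Paths and model name here are examples.
sceneform.asset('sampledata/models/andy.obj', // source 3D model
        'default',                            // material to apply
        'sampledata/models/andy.sfa',         // .sfa output location
        'src/main/assets/andy')               // .sfb output (extension added automatically)
```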

[Image: sceneform-image4 – the Import Sceneform Asset dialog]

The plugin also provides a viewer to inspect the model in Android Studio without running the app – something I craved while using ARCore’s older SDKs. 😀

[Image: sceneform-image5 – the Sceneform model viewer in Android Studio]

Practical

In our demo, we’ll concentrate on converting a layout/screen into a 3D model. We’ll develop an app that scans an image (the Yudiz team’s picture) and pops up three tappable 3D buttons that redirect the user to the respective screens when clicked.

Below is the image that I’ll use.

[Image: sceneform-image6 – the reference image to be scanned]

Remember: the image should be distinctive enough to be identified by the SDK.
Store it in the assets folder.

Let’s have a look at the other required resource.

[Image: sceneform-image7 – the layout that pops up on detection]

This is the layout that will pop up when the image gets detected by the SDK. You can design any layout based on your requirements.

Now, skipping the boilerplate code needed to detect supported devices and initialize the ARCore fragment, let’s look at the core functionality.

We need to create an AugmentedImageDatabase to store the images under unique names.
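The original code screenshot is not reproduced here, so below is a minimal sketch of this step. The asset filename `team_photo.jpg`, the image name `team_photo` and the method name are assumptions; the ARCore calls (`AugmentedImageDatabase`, `addImage`, `setAugmentedImageDatabase`) are the real API.

```java
// Build an AugmentedImageDatabase and register our reference image under a
// unique name, then attach it to the session config.
private void setupAugmentedImageDatabase(Session session, Config config) {
    try (InputStream is = getAssets().open("team_photo.jpg")) {
        Bitmap bitmap = BitmapFactory.decodeStream(is);
        AugmentedImageDatabase database = new AugmentedImageDatabase(session);
        database.addImage("team_photo", bitmap); // unique name, used for matching later
        config.setAugmentedImageDatabase(database);
        session.configure(config);
    } catch (IOException e) {
        Log.e("AugmentedImages", "Could not load reference image from assets", e);
    }
}
```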

The scene’s update listener fires whenever the frame is updated. Inside it, we fetch all the augmented images that ARCore has tracked against our database.
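A sketch of that per-frame hook, assuming `arFragment` is the ArFragment hosting the scene and `checkForMatches` is a hypothetical helper holding the matching loop:

```java
// Runs on every frame; fetches the augmented images ARCore updated this frame.
arFragment.getArSceneView().getScene().addOnUpdateListener(frameTime -> {
    Frame frame = arFragment.getArSceneView().getArFrame();
    if (frame == null) {
        return; // the session may not have produced a frame yet
    }
    Collection<AugmentedImage> updatedImages =
            frame.getUpdatedTrackables(AugmentedImage.class);
    checkForMatches(updatedImages); // hypothetical helper for the matching loop
});
```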

Here, a for loop checks whether any of the fetched images matches the one we stored in the database.
When this condition is satisfied, the layout is converted into a renderable and added to the AR scene.
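A sketch of that matching loop. The name `team_photo` and the field `menuRenderable` (a ViewRenderable assumed to have been built from the layout beforehand) are assumptions; anchoring at the image’s center pose is the standard Augmented Images approach.

```java
// For each tracked image, compare its name with the one registered in the
// database; on a match, anchor a node at the image's center and attach the
// layout renderable to it.
for (AugmentedImage augmentedImage : updatedImages) {
    if (augmentedImage.getTrackingState() == TrackingState.TRACKING
            && "team_photo".equals(augmentedImage.getName())) {
        AnchorNode anchorNode = new AnchorNode(
                augmentedImage.createAnchor(augmentedImage.getCenterPose()));
        anchorNode.setRenderable(menuRenderable); // ViewRenderable built from our layout
        arFragment.getArSceneView().getScene().addChild(anchorNode);
    }
}
```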

Here, a CompletableFuture is created from the layout, which ultimately yields a renderable.
I obtained the view from the renderable to find the IDs of its elements and set click listeners on them.
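A sketch of that renderable-building step. `R.layout.ar_menu`, the button id `btn_about`, `AboutActivity` and the `menuRenderable` field are assumptions; `ViewRenderable.builder().setView(...).build()` really does return a CompletableFuture.

```java
// Build a ViewRenderable from the layout asynchronously, then pull the inflated
// view out of it to attach click listeners to its buttons.
ViewRenderable.builder()
        .setView(this, R.layout.ar_menu)
        .build()
        .thenAccept(renderable -> {
            menuRenderable = renderable; // used when the image is detected
            View view = renderable.getView();
            view.findViewById(R.id.btn_about).setOnClickListener(v ->
                    startActivity(new Intent(this, AboutActivity.class)));
        })
        .exceptionally(throwable -> {
            Log.e("Sceneform", "Unable to create layout renderable", throwable);
            return null;
        });
```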

That’s it. We have successfully added interactions to the image. Yay ! 😀

Application ideas

An ID-card application could be developed using this feature: a card carrying human-unreadable content, such as a QR code, can be scanned and the actual information fetched and displayed in 3D using ARCore.

Conclusion

The Sceneform SDK is nothing less than a boon for Android developers who are eager to learn AR. Given how powerful it is even in beta, I’m eager to see what its future releases will bring.


Sumeet Rukeja Android Developer

I am an Android Developer at Yudiz Solutions Pvt. Ltd. – a leading mobile app development company. I am passionate about developing Android apps and about diving ever deeper into the deep sea of Android.

