How To Add AR Effects To An Android App Using VrFace Library?



Table of Contents

1. History lesson
2. Using VrFace library
3. Including the library as the dependency
4. Initializing and configuring
5. Screen layout
6. Adding personalized effects
7. The working process of the library
8. Handling the camera
9. Using native code on Android
Final words


Introduction

Are you familiar with VrFace? It's an open-source library that simplifies the steps developers need to take to build AR applications for Android. It is built on OpenCV, an exceptionally popular computer vision library that provides the face detection, and on dlib, another well-known ML library that supplies methods for detecting the feature points of an individual's face. This article is about creating Augmented Reality apps for Android devices using VrFace. To get the most out of this write-up, you need a basic understanding of how providers of Android app development services build an Android solution; if you aren't familiar with the intricacies of such projects, a quick web search will fill in the background. With that foundation, you should be able to guide your team in quickly building an app that lets users apply masks to their faces in real time and even tracks their facial expressions. The article also covers adding new effects to the VrFace library through shaders; if you're a first-timer, you'll get a short introduction to how effects work. In the end, it explains a bit about how the library works internally.


1. History lesson

Before delving into the details, here's why this library came into existence. About 5-6 years ago, an app arrived on iOS with the ability to apply facial effects in real time. Developers and communities from all four corners of the world found this capability impressive. Unfortunately, there weren't any Android apps with similar qualities at the time. Today such apps do exist on Android devices, but there's more than enough room for improvement, because the process remains difficult, especially for people with no professional experience in building something on Android.

2. Using VrFace library

So, how do you start this project? First of all, you'll need to hire experienced developers offering Android app development services. They must include the VrFace library as the project's dependency. The library ships in the AAR format, which is structurally quite similar to an APK: it contains all the necessary resources, a native library built specifically for the "arm" architecture, and DEX packages. Together, these three parts let developers build apps for almost every kind of Android device currently in existence. In short, your team has to include the dependency, provide the configuration, initialize the library, and set up the screen layout, which means touching four specific files. The example repository already contains everything; your employees or third-party service providers can fork and clone it to simplify the process.

3. Including the library as the dependency

To start with, your developers should declare a Maven dependency in "build.gradle." Then they must add credentials in "gradle.properties," as sketched below. To use a token from GitHub, ask your developers to go to their account settings and create a personal access token with the "read:packages" permission; GitHub's documentation on authenticating to GitHub Packages has more details about credentials. Once this configuration is in place, Gradle can download the library from the Maven repository when the developers start working on your project.
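Here is a minimal sketch of what those two files might look like. The Maven coordinates (com.example:vrface:1.0.0), the OWNER/REPO path, and the property names gpr.user and gpr.key are placeholders rather than the library's actual values; substitute the coordinates published in the VrFace repository.

// build.gradle (app module) - placeholder coordinates, adjust to the real ones
repositories {
    maven {
        url 'https://maven.pkg.github.com/OWNER/REPO' // GitHub Packages repo of VrFace
        credentials {
            username = project.findProperty('gpr.user')
            password = project.findProperty('gpr.key')
        }
    }
}

dependencies {
    implementation 'com.example:vrface:1.0.0' // placeholder group/artifact/version
}

# gradle.properties - keep this file out of version control
gpr.user=YOUR_GITHUB_USERNAME
gpr.key=YOUR_PERSONAL_ACCESS_TOKEN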


4. Initializing and configuring

The VrFace library needs an external model to pinpoint and track sixty-eight facial feature points. Because the model is about 60 MB in size, the library doesn't bundle it. Your developers must download it separately and unzip it by running this command:

bzip2 -d shape_predictor_68_face_landmarks.dat.bz2

This produces the "shape_predictor_68_face_landmarks.dat" file. They must rename it to "sp68.dat" and move it to the "app/src/main/assets/" directory. Now, it's time to look at the way your AR/VR app development company will initialize the library. This happens in the "MainActivity" class, which initializes all the layouts, provides the necessary configurations, and loads the library. The library is initialized from the OpenCV callback, which is invoked only after OpenCV has loaded its own native library, making it the best place to load the native library for VrFace. That library bears the name "detection_based_tracker" for historical reasons.
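As a minimal sketch of that initialization, the standard OpenCV-for-Android pattern looks roughly like this. Only the callback structure and the System.loadLibrary line follow directly from the text above; where exactly VrFace picks up "sp68.dat" and any further setup calls are assumptions, so check the example repository's MainActivity for the real sequence.

// MainActivity.java - a minimal sketch of OpenCV-driven initialization.
import org.opencv.android.BaseLoaderCallback;
import org.opencv.android.LoaderCallbackInterface;
import org.opencv.android.OpenCVLoader;
import android.app.Activity;

public class MainActivity extends Activity {

    private final BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
        @Override
        public void onManagerConnected(int status) {
            if (status == LoaderCallbackInterface.SUCCESS) {
                // OpenCV's own native code is now ready, so it is safe
                // to load the VrFace native library mentioned above.
                System.loadLibrary("detection_based_tracker");
            } else {
                super.onManagerConnected(status);
            }
        }
    };

    @Override
    protected void onResume() {
        super.onResume();
        // Initialize OpenCV asynchronously; the callback above fires
        // once its native library has been loaded.
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_4_0, this, mLoaderCallback);
    }
}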

5. Screen layout

If you expect your users to be able to use their device's camera, your developers have to add the "FastCameraPreview" element to "layout.xml." It ensures that the frames arrive from the camera in the format the library needs. Apart from that, your developers should also specify the view element that displays the result of applying the desired effect to the camera preview.
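A rough sketch of such a layout follows. The package prefix com.example.vrface is a placeholder (the real package of FastCameraPreview comes from the library), and the GLSurfaceView used for the rendered output is an assumption about how the result view might be declared.

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <!-- Camera input in the format the library expects;
         the package path is a placeholder. -->
    <com.example.vrface.FastCameraPreview
        android:id="@+id/camera_preview"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <!-- View element showing the camera preview with the effect applied. -->
    <android.opengl.GLSurfaceView
        android:id="@+id/effect_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</FrameLayout>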


6. Adding personalized effects

To add effects to the application, the developers of an AR/VR app development company like Moon Technolabs have to extend the "ShaderEffect" class, exactly the way it's done in "ShaderEffectMask." That particular class implements just one effect, which applies a mask to a three-dimensional face. If you hope to add more effects, you have to gather more information on shaders first. Shaders are scripts processed by the device's GPU, along with input data coming from textures, such as the image from the camera or other pictures you want to apply to a 3D figure. Once your developers take all the steps elucidated here, they can build the app and launch it on an Android phone. During the initial launch, the app will take about thirty seconds to initialize the library and the model; after that, it'll apply the effects. A prebuilt APK binary is also available in the example repository.
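The sketch below illustrates the general idea under stated assumptions: the ShaderEffect base class is real according to the text above, but the method name getFragmentShader() is invented for illustration, and the GLSL snippet is just a generic fragment shader that remixes the camera texture into a sepia tone. Check the real ShaderEffectMask source for the actual extension points.

// A hypothetical subclass of the library's ShaderEffect class.
public class ShaderEffectSepia extends ShaderEffect {

    // A generic GLSL fragment shader: samples the camera texture
    // and remixes the color channels into a sepia tone.
    private static final String FRAGMENT_SHADER =
        "precision mediump float;\n" +
        "uniform sampler2D uCameraTexture;\n" +
        "varying vec2 vTexCoord;\n" +
        "void main() {\n" +
        "    vec4 c = texture2D(uCameraTexture, vTexCoord);\n" +
        "    float r = dot(c.rgb, vec3(0.393, 0.769, 0.189));\n" +
        "    float g = dot(c.rgb, vec3(0.349, 0.686, 0.168));\n" +
        "    float b = dot(c.rgb, vec3(0.272, 0.534, 0.131));\n" +
        "    gl_FragColor = vec4(r, g, b, c.a);\n" +
        "}\n";

    public String getFragmentShader() { // hypothetical override point
        return FRAGMENT_SHADER;
    }
}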

7. The working process of the library

The builders of the VrFace library combined a Java component that works with the camera, the shaders used to apply the effects, and a native library. The native library has four vital parts: the camera positioning, the OpenCV library, the C++ dlib library, and a few extra methods for estimating facial expressions using 3D models. Writing such a library from scratch would be a challenge even for the best developers, and apart from being difficult, it's unnecessarily time-consuming. That's why it's better for you to stick to the basic steps only.


8. Handling the camera

Your team of developers has to use the android.hardware.Camera package. First, they determine the number of cameras available on the device by calling "Camera.getNumberOfCameras()." After that, they obtain a handle to the camera required for the purpose. Your developers should also configure the preview parameters: from the list of available preview sizes, they need to select the most appropriate one based on the screen size. The next, trickier task is starting the preview. The team has to allocate a buffer and set up a preview callback, which requires implementing a single method: "void onPreviewFrame(byte[] data, Camera camera)." Here, "data" is the preview frame from the device's camera. The preview frame arrives in the NV21 format, which means it can be split into two parts: a full-resolution grayscale (luma) image followed by the interleaved color (chroma) data. This matters because the next two steps work with the grayscale image.
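Below is a minimal sketch of this camera setup, assuming the first camera and a naive "first supported size" choice; a real app would match the preview size to the screen, handle rotation, and release the camera when done.

// A minimal camera-handling sketch using the (deprecated but still
// functional) android.hardware.Camera API described above.
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import java.io.IOException;
import java.util.List;

public class PreviewHandler implements Camera.PreviewCallback {

    private Camera camera;
    private int width, height;

    public void start() throws IOException {
        if (Camera.getNumberOfCameras() == 0) return;  // no camera on this device
        camera = Camera.open(0);                       // handle to the first camera

        Camera.Parameters params = camera.getParameters();
        List<Camera.Size> sizes = params.getSupportedPreviewSizes();
        Camera.Size size = sizes.get(0);               // pick the best fit for your screen
        width = size.width;
        height = size.height;
        params.setPreviewSize(width, height);
        camera.setParameters(params);

        // NV21 frames occupy width * height * 3 / 2 bytes.
        camera.addCallbackBuffer(new byte[width * height * 3 / 2]);
        camera.setPreviewCallbackWithBuffer(this);
        // A dummy texture so the camera actually starts delivering frames.
        camera.setPreviewTexture(new SurfaceTexture(0));
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        // In NV21, the first width * height bytes are the grayscale (luma)
        // plane; the interleaved chroma bytes follow. The grey image is
        // what the face-detection steps consume.
        byte[] grey = new byte[width * height];
        System.arraycopy(data, 0, grey, 0, grey.length);
        camera.addCallbackBuffer(data);                // recycle the buffer
    }
}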


9. Using native code on Android

When you hire Android app developers, they can use C/C++ native code. To do that, they have to declare a native interface in a Java class, for example:

private static native void nativeDetect(long thiz, long inputImage, long faces);

This works as the bridge connecting native code with Java: it declares a function implemented in C/C++ and exported with "JNIEXPORT." The final step is configuring the build script for the native code with the Android NDK build tools.
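On the native side, the matching C++ function could look like the sketch below. The Java package and class embedded in the exported name (com.example.vrface.DetectionBasedTracker) are assumptions, and casting the long handles to OpenCV Mat pointers is a common JNI convention rather than VrFace's confirmed implementation.

// jni/detection_based_tracker.cpp - hypothetical native counterpart of
// the Java declaration above. The function name must encode the real
// Java package and class, assumed here to be com.example.vrface.
#include <jni.h>
#include <opencv2/core/core.hpp>

extern "C"
JNIEXPORT void JNICALL
Java_com_example_vrface_DetectionBasedTracker_nativeDetect(
        JNIEnv* env, jclass clazz,
        jlong thiz, jlong inputImage, jlong faces) {
    // By common OpenCV JNI convention, the long handles carry pointers:
    cv::Mat& image  = *reinterpret_cast<cv::Mat*>(inputImage);
    cv::Mat& result = *reinterpret_cast<cv::Mat*>(faces);
    // ... run the detector referenced by `thiz` on `image`
    //     and write the detected face rectangles into `result` ...
    (void) env; (void) clazz; (void) thiz;
    (void) image; (void) result;
}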

Final words

There's a lot more to learn about using the VrFace library to build an AR application for Android devices. Then again, if you hire Android app developers from a recognized agency like Moon Technolabs, the people working there are well aware of VrFace and how to use it to build AR apps for Android. This article attempted to explain the process of building an Android app that creates facial effects with the VrFace library: developers add the library as a dependency to the project and then carry out all the necessary initializations and configurations.

