Object Detection & Tracking with HMS ML Kit (Video Mode) - Huawei Developers

For more articles like this, you can visit the HUAWEI Developer Forum.
In this article I will first tell you about object detection with HMS ML Kit, and then we are going to build an Android application that uses HMS ML Kit to detect and track objects in a camera stream. If you haven’t read my previous article on detecting objects in static images yet, here it is. You can also find introductory information about artificial intelligence, machine learning, and Huawei ML Kit’s capabilities in that article.
The object detection and tracking service can detect and track multiple objects in an image, locating and classifying them in real time. It is also an ideal choice for filtering out unwanted objects in an image. Huawei ML Kit’s object detection runs on-device, so it works offline and is completely free.
Let’s not waste our precious time and start building our sample project step by step!
1. If you haven’t registered as a Huawei developer yet, here is the link.
2. Create a project on AppGallery Connect. You can follow the steps shown here.
3. In HUAWEI Developer AppGallery Connect, go to Develop > Manage APIs. Make sure ML Kit is activated.
4. Integrate the ML Kit SDK into your project. Your app-level build.gradle will look like this:
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'com.huawei.agconnect' // processes agconnect-services.json (matches the agcp classpath in the project-level file below)

android {
    compileSdkVersion 29
    buildToolsVersion "29.0.3"

    defaultConfig {
        applicationId "com.demo.objectdetection"
        minSdkVersion 21
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    kotlinOptions { jvmTarget = "1.8" }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version"
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.core:core-ktx:1.3.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    // HMS ML Kit
    implementation 'com.huawei.hms:ml-computer-vision:1.0.2.300'
}
and your project-level build.gradle is like this:
Code:
buildscript {
    ext.kotlin_version = '1.3.72'
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.6.3'
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
5. Create the layout first. There will be two SurfaceViews: the first displays our camera stream, and the second shows our overlay canvas. We will draw rectangles around detected objects, write their respective types on the canvas, and show this canvas on the second SurfaceView. Here is the sample:
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <SurfaceView
        android:id="@+id/surface_view_camera"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <SurfaceView
        android:id="@+id/surface_view_overlay"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</androidx.constraintlayout.widget.ConstraintLayout>
6. Make sure you set your activity style to “Theme.AppCompat.Light.NoActionBar” or similar in res/values/styles.xml to hide the action bar.
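A minimal sketch of such a theme entry, assuming your manifest references a style named "AppTheme" (adjust the name to whatever your project uses):
Code:
<!-- res/values/styles.xml — illustrative example, not from the original article -->
<resources>
    <style name="AppTheme" parent="Theme.AppCompat.Light.NoActionBar">
        <!-- Customize colors here if needed -->
    </style>
</resources>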
7.1. Two important classes help us detect objects with HMS ML Kit: MLObjectAnalyzer and LensEngine. MLObjectAnalyzer detects object information (MLObject) in an image, and we can customize it using MLObjectAnalyzerSetting. Here is our createAnalyzer method:
Code:
private fun createAnalyzer(): MLObjectAnalyzer {
    val analyzerSetting = MLObjectAnalyzerSetting.Factory()
        .setAnalyzerType(MLObjectAnalyzerSetting.TYPE_VIDEO)
        .allowMultiResults()
        .allowClassification()
        .create()
    return MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(analyzerSetting)
}
7.2. The other important class we are using today is LensEngine. LensEngine is responsible for camera initialization, frame obtaining, and logic control. Here is our createLensEngine method (the getDisplayMetrics() helper it uses is defined in MainActivity in step 11):
Code:
private fun createLensEngine(orientation: Int): LensEngine {
    val lensEngineCreator = LensEngine.Creator(applicationContext, mAnalyzer)
        .setLensType(LensEngine.BACK_LENS)
        .applyFps(10F)
        .enableAutomaticFocus(true)
    return when (orientation) {
        Configuration.ORIENTATION_PORTRAIT ->
            lensEngineCreator.applyDisplayDimension(getDisplayMetrics().heightPixels, getDisplayMetrics().widthPixels).create()
        else ->
            lensEngineCreator.applyDisplayDimension(getDisplayMetrics().widthPixels, getDisplayMetrics().heightPixels).create()
    }
}
8. So, LensEngine handles camera frames and MLObjectAnalyzer detects MLObjects in those frames. Now we need to create our ObjectAnalyzerTransactor class, which implements the MLAnalyzer.MLTransactor interface. The detected MLObjects are delivered to the transactResult method of this class. Here is our ObjectAnalyzerTransactor class, with an additional draw method that draws rectangles and type labels around detected objects.
Code:
package com.demo.objectdetection

import android.graphics.Color
import android.graphics.Paint
import android.graphics.PorterDuff
import android.util.Log
import android.util.SparseArray
import android.view.SurfaceHolder
import androidx.core.util.forEach
import androidx.core.util.isNotEmpty
import androidx.core.util.valueIterator
import com.huawei.hms.mlsdk.common.MLAnalyzer
import com.huawei.hms.mlsdk.objects.MLObject

class ObjectAnalyzerTransactor : MLAnalyzer.MLTransactor<MLObject> {

    companion object {
        private const val TAG = "ML_ObAnalyzerTransactor"
    }

    private var mSurfaceHolderOverlay: SurfaceHolder? = null

    fun setSurfaceHolderOverlay(surfaceHolder: SurfaceHolder) {
        mSurfaceHolderOverlay = surfaceHolder
    }

    override fun transactResult(results: MLAnalyzer.Result<MLObject>?) {
        val items = results?.analyseList
        items?.forEach { _, value ->
            Log.d(TAG, "transactResult -> " +
                    "Border: ${value.border} " + // Rectangle around this object
                    "Type Possibility: ${value.typePossibility} " + // Possibility between 0-1
                    "Tracing Identity: ${value.tracingIdentity} " + // Tracing number of this object
                    "Type Identity: ${value.typeIdentity}") // Furniture, Plant, Food etc.
        }
        items?.also {
            draw(it)
        }
    }

    private fun draw(items: SparseArray<MLObject>) {
        val canvas = mSurfaceHolderOverlay?.lockCanvas()
        if (canvas != null) {
            // Clear the canvas first.
            canvas.drawColor(0, PorterDuff.Mode.CLEAR)
            for (item in items.valueIterator()) {
                val type = getItemType(item)

                // Draw a rectangle around the detected object.
                val rectangle = item.border
                Paint().also {
                    it.color = Color.YELLOW
                    it.style = Paint.Style.STROKE
                    it.strokeWidth = 8F
                    canvas.drawRect(rectangle, it)
                }

                // Draw the object's type on the upper-left corner of its rectangle.
                Paint().also {
                    it.color = Color.BLACK
                    it.style = Paint.Style.FILL
                    it.textSize = 24F
                    canvas.drawText(type, rectangle.left.toFloat(), rectangle.top.toFloat(), it)
                }
            }
            // Only post the canvas if it was successfully locked.
            mSurfaceHolderOverlay?.unlockCanvasAndPost(canvas)
        }
    }

    private fun getItemType(item: MLObject) = when (item.typeIdentity) {
        MLObject.TYPE_OTHER -> "Other"
        MLObject.TYPE_FACE -> "Face"
        MLObject.TYPE_FOOD -> "Food"
        MLObject.TYPE_FURNITURE -> "Furniture"
        MLObject.TYPE_PLACE -> "Place"
        MLObject.TYPE_PLANT -> "Plant"
        MLObject.TYPE_GOODS -> "Goods"
        else -> "No match"
    }

    override fun destroy() {
        Log.d(TAG, "destroy")
    }
}
9. Our LensEngine needs a surfaceHolder to run on, so we will start it when our surfaceHolder is ready. Here is our callback:
Code:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {
    override fun surfaceChanged(holder: SurfaceHolder?, format: Int, width: Int, height: Int) {
        mLensEngine.close()
        init()
        mLensEngine.run(holder)
    }

    override fun surfaceDestroyed(holder: SurfaceHolder?) {
        mLensEngine.release()
    }

    override fun surfaceCreated(holder: SurfaceHolder?) {
        mLensEngine.run(holder)
    }
}
10. We require the CAMERA and WRITE_EXTERNAL_STORAGE permissions. Make sure you add them to your AndroidManifest.xml file (see the snippet after the code below) and request them from the user at runtime. For the sake of simplicity, we do it as shown below:
Code:
class MainActivity : AppCompatActivity() {

    companion object {
        private const val PERMISSION_REQUEST_CODE = 8
        private val requiredPermissions = arrayOf(Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE)
    }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        if (hasPermissions(requiredPermissions))
            init()
        else
            ActivityCompat.requestPermissions(this, requiredPermissions, PERMISSION_REQUEST_CODE)
    }

    private fun hasPermissions(permissions: Array<String>) = permissions.all {
        ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED
    }

    override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<out String>, grantResults: IntArray) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        if (requestCode == PERMISSION_REQUEST_CODE && hasPermissions(requiredPermissions))
            init()
    }
}
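The article mentions declaring these permissions in AndroidManifest.xml but does not show the snippet; here is a minimal sketch of the two declarations (assumed placement is directly inside the <manifest> element):
Code:
<!-- Illustrative example, not from the original article -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />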
11. Let’s bring all the pieces together. Here is our MainActivity.
Code:
package com.demo.objectdetection

import android.Manifest
import android.content.Context
import android.content.pm.PackageManager
import android.content.res.Configuration
import android.graphics.PixelFormat
import android.os.Bundle
import android.util.DisplayMetrics
import android.view.SurfaceHolder
import android.view.WindowManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat
import com.huawei.hms.mlsdk.MLAnalyzerFactory
import com.huawei.hms.mlsdk.common.LensEngine
import com.huawei.hms.mlsdk.objects.MLObjectAnalyzer
import com.huawei.hms.mlsdk.objects.MLObjectAnalyzerSetting
import kotlinx.android.synthetic.main.activity_main.*

class MainActivity : AppCompatActivity() {

    companion object {
        private const val TAG = "ML_MainActivity"
        private const val PERMISSION_REQUEST_CODE = 8
        private val requiredPermissions = arrayOf(Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE)
    }

    private lateinit var mAnalyzer: MLObjectAnalyzer
    private lateinit var mLensEngine: LensEngine
    private lateinit var mSurfaceHolderCamera: SurfaceHolder
    private lateinit var mSurfaceHolderOverlay: SurfaceHolder
    private lateinit var mObjectAnalyzerTransactor: ObjectAnalyzerTransactor

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        if (hasPermissions(requiredPermissions))
            init()
        else
            ActivityCompat.requestPermissions(this, requiredPermissions, PERMISSION_REQUEST_CODE)
    }

    private fun init() {
        mAnalyzer = createAnalyzer()
        mLensEngine = createLensEngine(resources.configuration.orientation)
        mSurfaceHolderCamera = surface_view_camera.holder
        mSurfaceHolderOverlay = surface_view_overlay.holder
        mSurfaceHolderOverlay.setFormat(PixelFormat.TRANSPARENT)
        mSurfaceHolderCamera.addCallback(surfaceHolderCallback)
        mObjectAnalyzerTransactor = ObjectAnalyzerTransactor()
        mObjectAnalyzerTransactor.setSurfaceHolderOverlay(mSurfaceHolderOverlay)
        mAnalyzer.setTransactor(mObjectAnalyzerTransactor)
    }

    private fun createAnalyzer(): MLObjectAnalyzer {
        val analyzerSetting = MLObjectAnalyzerSetting.Factory()
            .setAnalyzerType(MLObjectAnalyzerSetting.TYPE_VIDEO)
            .allowMultiResults()
            .allowClassification()
            .create()
        return MLAnalyzerFactory.getInstance().getLocalObjectAnalyzer(analyzerSetting)
    }

    private fun createLensEngine(orientation: Int): LensEngine {
        val lensEngineCreator = LensEngine.Creator(applicationContext, mAnalyzer)
            .setLensType(LensEngine.BACK_LENS)
            .applyFps(10F)
            .enableAutomaticFocus(true)
        return when (orientation) {
            Configuration.ORIENTATION_PORTRAIT ->
                lensEngineCreator.applyDisplayDimension(getDisplayMetrics().heightPixels, getDisplayMetrics().widthPixels).create()
            else ->
                lensEngineCreator.applyDisplayDimension(getDisplayMetrics().widthPixels, getDisplayMetrics().heightPixels).create()
        }
    }

    private val surfaceHolderCallback = object : SurfaceHolder.Callback {
        override fun surfaceChanged(holder: SurfaceHolder?, format: Int, width: Int, height: Int) {
            mLensEngine.close()
            init()
            mLensEngine.run(holder)
        }

        override fun surfaceDestroyed(holder: SurfaceHolder?) {
            mLensEngine.release()
        }

        override fun surfaceCreated(holder: SurfaceHolder?) {
            mLensEngine.run(holder)
        }
    }

    override fun onDestroy() {
        super.onDestroy()
        // Release resources.
        mAnalyzer.stop()
        mLensEngine.release()
    }

    private fun getDisplayMetrics() = DisplayMetrics().let {
        (getSystemService(Context.WINDOW_SERVICE) as WindowManager).defaultDisplay.getMetrics(it)
        it
    }

    private fun hasPermissions(permissions: Array<String>) = permissions.all {
        ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED
    }

    override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<out String>, grantResults: IntArray) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults)
        if (requestCode == PERMISSION_REQUEST_CODE && hasPermissions(requiredPermissions))
            init()
    }
}
12. In summary, we used LensEngine to handle camera frames for us and displayed those frames on our first SurfaceView. MLObjectAnalyzer then analyzed the frames, and the detected objects were delivered to the transactResult method of our ObjectAnalyzerTransactor class. In that method we iterated through all detected objects and drew them on our second SurfaceView, which we used as an overlay. Here is the output:


Build a Face Detection App with Huawei ML Kit

Hi all,
In the era of powerful mobile devices, we store thousands of photos, have video calls, shop, manage our bank accounts, and perform many other tasks in the palm of our hands. With the cameras integrated into our phones, we also take photos and videos and make video calls. But it would be inefficient if the cameras we carry with us all day could only take raw photos and videos; they can do more, much more.
Face detection services are used by many applications across different industries, mainly for security and entertainment purposes. For example, a taxi app can use it to identify its customers, a smart home app can identify guests’ faces and announce to the host who is ringing the doorbell, an entertainment app can draw a moustache on faces, or a driver-monitoring app can detect whether a driver’s eyes are open and warn the driver if their eyes are closed.
Despite how many different areas face detection can be used in and how important the tasks it performs are, it is really easy to implement a face detection app with the help of Huawei ML Kit. As it is an on-device capability that works on all Android devices with ARM architecture, it is completely free, faster, and more secure than cloud-based services. The face detection service can detect the shapes and features of your user’s face, including their facial expression, age, gender, and what they are wearing.
With the face detection service you can detect up to 855 face contour points to locate face coordinates, including the face contour, eyebrows, eyes, nose, mouth, and ears, and identify the pitch, yaw, and roll angles of a face. You can detect seven facial features: the possibility of opening the left eye, the possibility of opening the right eye, the possibility of wearing glasses, the gender possibility, the possibility of wearing a hat, the possibility of having a beard, and age. In addition to these, you can also detect facial expressions, namely smiling, neutral, anger, disgust, fear, sadness, and surprise.
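To give a feel for what the analyzer returns, here is a small illustrative sketch that logs a few of these attributes of a detected MLFace. The allPoints and emotions.smilingProbability accessors appear in this article’s own code later on; the features.age and features.moustacheProbability accessors are assumptions based on the face SDK reference, so verify them against your SDK version:
Code:
// Illustrative sketch, not from the original article.
private fun logFaceInfo(face: MLFace) {
    Log.d("FaceInfo", "Contour points: ${face.allPoints.size}") // up to 855 points
    Log.d("FaceInfo", "Smiling probability: ${face.emotions.smilingProbability}")
    Log.d("FaceInfo", "Estimated age: ${face.features.age}") // assumed accessor
    Log.d("FaceInfo", "Moustache probability: ${face.features.moustacheProbability}") // assumed accessor
}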
Let’s start to build our demo application step by step from scratch!
1. First, let’s create our project in Android Studio. Select the Empty Activity option, then follow the steps described in this post to create and sign our project in AppGallery Connect. You can follow this guide.
2. Second, in HUAWEI Developer AppGallery Connect, go to Develop > Manage APIs. Make sure ML Kit is activated.
3. Now that we have integrated Huawei Mobile Services (HMS) into our project, let’s follow the documentation on developer.huawei.com to find the packages to add. On the website, click Developer > HMS Core > AI > ML Kit. There you will find introductory information about the services, references, SDKs to download, and more. Under the ML Kit tab, follow Android > Getting Started > Integrating HMS Core SDK > Adding Build Dependencies > Integrating the Face Detection SDK. We can follow that guide to add the face detection capability to our project. To explore this service further later on, I added all of the model packages shown there; you can choose only the base SDK or select packages according to your needs. After the integration, your app-level build.gradle file will look like this.
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'com.huawei.agconnect'

android {
    compileSdkVersion 29
    buildToolsVersion "30.0.1"

    defaultConfig {
        applicationId "com.demo.faceapp"
        minSdkVersion 21
        targetSdkVersion 29
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: "libs", include: ["*.jar"])
    implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
    implementation 'androidx.core:core-ktx:1.3.1'
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'
    // Import the contour and key point detection model package.
    implementation 'com.huawei.hms:ml-computer-vision-face-shape-point-model:2.0.1.300'
    // Import the facial expression detection model package.
    implementation 'com.huawei.hms:ml-computer-vision-face-emotion-model:2.0.1.300'
    // Import the facial feature detection model package.
    implementation 'com.huawei.hms:ml-computer-vision-face-feature-model:2.0.1.300'
}
And your project-level build.gradle file will look like this.
Code:
buildscript {
    ext.kotlin_version = "1.3.72"
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath "com.android.tools.build:gradle:4.0.1"
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
Don’t forget to add the following meta-data tag inside the application element of your AndroidManifest.xml. It enables automatic updates of the machine learning model.
Code:
<?xml version="1.0" encoding="utf-8"?>
<manifest ...
    <application ...
        <meta-data
            android:name="com.huawei.hms.ml.DEPENDENCY"
            android:value="face" />
    </application>
</manifest>
4. Now we can detect faces either in a static image or in a camera stream. Let’s choose detecting faces in a camera stream for this example. First, let’s create our analyzer. Its type is MLFaceAnalyzer, and it is responsible for analyzing the detected faces. Here is a sample implementation; we could also use MLFaceAnalyzer with default settings to keep things simple.
Code:
private lateinit var mAnalyzer: MLFaceAnalyzer

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    mAnalyzer = createAnalyzer()
}

private fun createAnalyzer(): MLFaceAnalyzer {
    val settings = MLFaceAnalyzerSetting.Factory()
        .allowTracing()
        .setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES)
        .setShapeType(MLFaceAnalyzerSetting.TYPE_SHAPES)
        .setMinFaceProportion(.5F)
        .setKeyPointType(MLFaceAnalyzerSetting.TYPE_KEYPOINTS)
        .create()
    return MLAnalyzerFactory.getInstance().getFaceAnalyzer(settings)
}
5. Create a simple layout with two SurfaceViews, one above the other: one for camera frames and one for our overlay, on which we will later draw some shapes. Here is a sample layout.
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
    xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".MainActivity">

    <SurfaceView
        android:id="@+id/surfaceViewCamera"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <SurfaceView
        android:id="@+id/surfaceViewOverlay"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</androidx.constraintlayout.widget.ConstraintLayout>
6. Next, we prepare our views. We need two surfaceHolders in our application. We will make surfaceHolderOverlay transparent because we want to see the camera frames beneath it. Then we will add a callback to surfaceHolderCamera to know when it is created, changed, and destroyed. Let’s create them.
Code:
private lateinit var mAnalyzer: MLFaceAnalyzer
private var surfaceHolderCamera: SurfaceHolder? = null
private var surfaceHolderOverlay: SurfaceHolder? = null

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    mAnalyzer = createAnalyzer()
    prepareViews()
}

private fun prepareViews() {
    surfaceHolderCamera = surfaceViewCamera.holder
    surfaceHolderOverlay = surfaceViewOverlay.holder
    surfaceHolderOverlay?.setFormat(PixelFormat.TRANSPARENT)
    surfaceHolderCamera?.addCallback(surfaceHolderCallback)
}

private val surfaceHolderCallback = object : SurfaceHolder.Callback {
    override fun surfaceChanged(holder: SurfaceHolder?, format: Int, width: Int, height: Int) {
    }

    override fun surfaceDestroyed(holder: SurfaceHolder?) {
    }

    override fun surfaceCreated(holder: SurfaceHolder?) {
    }
}
7. Now we can create our LensEngine, a magic class that handles camera frames for us. You can apply different settings to your LensEngine; here is how to create one simply. As you can see in the example, the order of the width and height passed to LensEngine changes with the orientation. We create our LensEngine inside the surfaceChanged method of surfaceHolderCallback and release it inside surfaceDestroyed. LensEngine needs a surfaceHolder or a surfaceTexture to run on. Here is an example of creating and running it.
Code:
private lateinit var mAnalyzer: MLFaceAnalyzer
private lateinit var mLensEngine: LensEngine
private var surfaceHolderCamera: SurfaceHolder? = null
private var surfaceHolderOverlay: SurfaceHolder? = null

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    mAnalyzer = createAnalyzer()
    prepareViews()
}

private fun prepareViews() {
    surfaceHolderCamera = surfaceViewCamera.holder
    surfaceHolderOverlay = surfaceViewOverlay.holder
    surfaceHolderOverlay?.setFormat(PixelFormat.TRANSPARENT)
    surfaceHolderCamera?.addCallback(surfaceHolderCallback)
}

private val surfaceHolderCallback = object : SurfaceHolder.Callback {
    override fun surfaceChanged(holder: SurfaceHolder?, format: Int, width: Int, height: Int) {
        mLensEngine = createLensEngine(width, height)
        mLensEngine.run(holder)
    }

    override fun surfaceDestroyed(holder: SurfaceHolder?) {
        mLensEngine.release()
    }

    override fun surfaceCreated(holder: SurfaceHolder?) {
    }
}

private fun createLensEngine(width: Int, height: Int): LensEngine {
    val lensEngineCreator = LensEngine.Creator(this, mAnalyzer)
        .applyFps(20F)
        .setLensType(LensEngine.FRONT_LENS)
        .enableAutomaticFocus(true)
    return if (resources.configuration.orientation == Configuration.ORIENTATION_PORTRAIT) {
        lensEngineCreator.let {
            it.applyDisplayDimension(height, width)
            it.create()
        }
    } else {
        lensEngineCreator.let {
            it.applyDisplayDimension(width, height)
            it.create()
        }
    }
}
8. We also need somewhere to receive the detected results and interact with them. For this purpose we create our FaceAnalyzerTransactor class (you can name it as you wish). It must implement the MLAnalyzer.MLTransactor<MLFace> interface. We will set an overlay of type SurfaceHolder, obtain a canvas from this overlay, and draw some shapes on the canvas. The required data about each detected face arrives in the transactResult method. Here is a sample implementation of the whole FaceAnalyzerTransactor class.
Code:
import android.graphics.Color
import android.graphics.Paint
import android.graphics.PorterDuff
import android.util.SparseArray
import android.view.SurfaceHolder
import androidx.core.util.valueIterator
import com.huawei.hms.mlsdk.common.MLAnalyzer
import com.huawei.hms.mlsdk.face.MLFace

class FaceAnalyzerTransactor : MLAnalyzer.MLTransactor<MLFace> {

    private var mOverlay: SurfaceHolder? = null

    fun setOverlay(surfaceHolder: SurfaceHolder) {
        mOverlay = surfaceHolder
    }

    override fun transactResult(result: MLAnalyzer.Result<MLFace>?) {
        draw(result?.analyseList)
    }

    private fun draw(faces: SparseArray<MLFace>?) {
        val canvas = mOverlay?.lockCanvas()
        if (canvas != null && faces != null) {
            // Clear the canvas.
            canvas.drawColor(0, PorterDuff.Mode.CLEAR)
            for (face in faces.valueIterator()) {
                // Draw all 855 points of the face. Since the front lens is selected, mirror the x coordinates.
                for (point in face.allPoints) {
                    val x = mOverlay?.surfaceFrame?.right?.minus(point.x)
                    if (x != null) {
                        Paint().also {
                            it.color = Color.YELLOW
                            it.style = Paint.Style.FILL
                            it.strokeWidth = 16F
                            canvas.drawPoint(x, point.y, it)
                        }
                    }
                }
                // Prepare a string showing whether the user smiles or not and draw it on the canvas.
                val smilingString = if (face.emotions.smilingProbability > 0.5) "SMILING" else "NOT SMILING"
                Paint().also {
                    it.color = Color.RED
                    it.textSize = 60F
                    it.textAlign = Paint.Align.CENTER
                    canvas.drawText(smilingString, face.border.exactCenterX(), face.border.exactCenterY(), it)
                }
            }
            mOverlay?.unlockCanvasAndPost(canvas)
        }
    }

    override fun destroy() {
    }
}
9. Create a FaceAnalyzerTransactor instance in MainActivity and use it as shown below.
Code:
private lateinit var mAnalyzer: MLFaceAnalyzer
private lateinit var mLensEngine: LensEngine
private lateinit var mFaceAnalyzerTransactor: FaceAnalyzerTransactor
private var surfaceHolderCamera: SurfaceHolder? = null
private var surfaceHolderOverlay: SurfaceHolder? = null

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    init()
}

private fun init() {
    mAnalyzer = createAnalyzer()
    mFaceAnalyzerTransactor = FaceAnalyzerTransactor()
    mAnalyzer.setTransactor(mFaceAnalyzerTransactor)
    prepareViews()
}
Also, don’t forget to set the overlay of our transactor. We can do this inside the surfaceChanged method like this.
Code:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {
    override fun surfaceChanged(holder: SurfaceHolder?, format: Int, width: Int, height: Int) {
        mLensEngine = createLensEngine(width, height)
        surfaceHolderOverlay?.let { mFaceAnalyzerTransactor.setOverlay(it) }
        mLensEngine.run(holder)
    }

    override fun surfaceDestroyed(holder: SurfaceHolder?) {
        mLensEngine.release()
    }

    override fun surfaceCreated(holder: SurfaceHolder?) {
    }
}
10. We are almost done! Don’t forget to ask our users for permissions. We need the CAMERA and WRITE_EXTERNAL_STORAGE permissions; WRITE_EXTERNAL_STORAGE is used for automatically updating the machine learning model. Add these permissions to your AndroidManifest.xml (see the snippet after the code below) and request them from the user at runtime. Here is a simple example.
Code:
private val requiredPermissions = arrayOf(Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE)

override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_main)
    if (hasPermissions(requiredPermissions))
        init()
    else
        ActivityCompat.requestPermissions(this, requiredPermissions, 0)
}

private fun hasPermissions(permissions: Array<String>) = permissions.all {
    ContextCompat.checkSelfPermission(this, it) == PackageManager.PERMISSION_GRANTED
}

override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<out String>, grantResults: IntArray) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults)
    if (requestCode == 0 && grantResults.isNotEmpty() && hasPermissions(requiredPermissions))
        init()
}
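The article says to declare these permissions in AndroidManifest.xml but does not show the lines; here is a minimal sketch (assumed placement is directly inside the <manifest> element, alongside the meta-data tag shown earlier):
Code:
<!-- Illustrative example, not from the original article -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />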
11. Well done! We have finished all the steps and created our project. Now we can test it. Here are some examples.
12. We have created a simple FaceApp that detects faces, facial features, and emotions. You can produce countless types of face detection apps; it is up to your imagination. ML Kit empowers your apps with the power of AI. If you have any questions, please ask through the link below. You can also find this project on GitHub.
Related Links
Thanks to Oğuzhan Demirci for this article.
Original post: https://medium.com/huawei-developers/build-a-face-detection-app-with-huawei-ml-kit-32caec06484

Intermediate: Integrating Huawei Remote Configuration in Flutter QuizApp (Cross platform)

Introduction
In this article, we will be integrating the Huawei Remote Configuration service in a Flutter QuizApp. Here we will fetch the remote data, the questions-and-answers JSON, from the AG Connect console. Huawei provides the Remote Configuration service to manage parameters online; with this service you can control or change the behaviour and appearance of your app online without requiring user interaction or an app update. By implementing the SDK, you can fetch the online parameter values delivered on the AG console to change the app's behaviour and appearance.
Functional features
1. Parameter management: This function enables the user to add new parameters, delete and update existing parameters, and set conditional values.
2. Condition management: This function enables the user to add, delete, and modify conditions, and to copy and modify existing conditions. Currently, you can set the following conditions: app version, country/region, audience, user attribute, user percentage, time, and language. You can expect more conditions in the future.
3. Version management: This function lets the user manage and roll back up to 300 historical versions of parameters and conditions for up to 90 days.
4. Permission management: By default, this function allows the account holder, app administrators, R&D personnel, and operations personnel to access Remote Configuration.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project
Step 2. Add the app-level gradle dependencies.
Inside the project, open Android > app > build.gradle.
Java:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add root level gradle dependencies
Java:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Add app level gradle dependencies
Java:
implementation 'com.huawei.agconnect:agconnect-remoteconfig:1.4.2.301'
Step 3: Add the below permissions in Android Manifest file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
Step 4: Add the below dependency (agconnect_remote_config, shown in the pubspec.yaml that follows) in the pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect. See "Preparations for Integrating HUAWEI HMS Core (Android)" on developer.huawei.com.
pubspec.yaml
YAML:
name: flutter_app
description: A new Flutter application.
# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev
# The following defines the version and build number for your application.
# A version number is three numbers separated by dots, like 1.2.43
# followed by an optional build number separated by a +.
# Both the version and the builder number may be overridden in flutter
# build by specifying --build-name and --build-number, respectively.
# In Android, build-name is used as versionName while build-number used as versionCode.
# Read more about Android versioning at https://developer.android.com/studio/publish/versioning
# In iOS, build-name is used as CFBundleShortVersionString while build-number used as CFBundleVersion.
# Read more about iOS versioning at
# https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/CoreFoundationKeys.html
version: 1.0.0+1
environment:
sdk: ">=2.7.0 <3.0.0"
dependencies:
flutter:
sdk: flutter
huawei_account:
path: ../huawei_account/
huawei_analytics:
path: ../huawei_analytics/
huawei_location:
path: ../huawei_location/
huawei_ads:
path: ../huawei_ads/
huawei_push:
path: ../huawei_push
huawei_map:
path: ../huawei_map
huawei_scan:
path: ../huawei_scan
agconnect_crash: ^1.0.0
http: ^0.12.2
fluttertoast: ^7.1.6
agconnect_remote_config: ^1.0.0
# The following adds the Cupertino Icons font to your application.
# Use with the CupertinoIcons class for iOS style icons.
cupertino_icons: ^1.0.2
dev_dependencies:
flutter_test:
sdk: flutter
main.dart
Code:
import 'dart:convert';
import 'dart:developer';
import 'package:agconnect_remote_config/agconnect_remote_config.dart';
import 'package:flutter/material.dart';
import 'package:flutter_app/login.dart';
import 'package:flutter_app/menuscreen.dart';
import 'package:flutter_app/myquestion.dart';
import 'package:flutter_app/result.dart';
import 'package:huawei_account/hmsauthservice/hms_auth_service.dart';
import 'package:huawei_ads/adslite/ad_param.dart';
import 'package:huawei_ads/adslite/banner/banner_ad.dart';
import 'package:huawei_ads/adslite/banner/banner_ad_size.dart';
import 'package:huawei_ads/hms_ads.dart';
import 'package:huawei_analytics/huawei_analytics.dart';
import './quiz.dart';
import './result.dart';

void main() {
  runApp(
    MaterialApp(
      title: 'TechQuizApp',
      // Start the app with the "/" named route. In this case, the app starts
      // on the FirstScreen widget.
      initialRoute: '/',
      routes: {
        // When navigating to the "/" route, build the FirstScreen widget.
        '/': (context) => MenuScreen(),
        // When navigating to the "/second" route, build the SecondScreen widget.
        '/second': (context) => MyApp('', null),
      },
    ),
  );
}

class MyApp extends StatefulWidget {
  final String userName;
  final List<MyQuestion> _questions;

  MyApp(this.userName, this._questions);

  @override
  State<StatefulWidget> createState() {
    return _MyAppState(_questions);
  }
}

class _MyAppState extends State<MyApp> {
  var _questionIndex = 0;
  int _totalScore = 0;
  String name;
  List<MyQuestion> _questions;
  final HMSAnalytics _hmsAnalytics = new HMSAnalytics();

  _MyAppState(this._questions);

  @override
  void initState() {
    _enableLog();
    _predefinedEvent();
    super.initState();
  }

  Future<void> _enableLog() async {
    _hmsAnalytics.setUserId(widget.userName);
    await _hmsAnalytics.enableLog();
  }

  void _restartQuiz() {
    setState(() {
      _questionIndex = 0;
      _totalScore = 0;
    });
  }

  void _logoutQuiz() async {
    final signOutResult = await HmsAuthService.signOut();
    if (signOutResult) {
      Navigator.of(context)
          .push(MaterialPageRoute(builder: (context) => LoginDemo()));
      print('You are logged out');
    } else {
      print('signOut failed');
    }
  }

  // Predefined event.
  void _predefinedEvent() async {
    String name = HAEventType.SIGNIN;
    dynamic value = {HAParamType.ENTRY: 06534797};
    await _hmsAnalytics.onEvent(name, value);
    print("Event posted");
  }

  void _customEvent(int index, int score) async {
    String name = "Question$index";
    dynamic value = {'Score': score};
    await _hmsAnalytics.onEvent(name, value);
    print("_customEvent posted");
  }

  Future<void> _answerQuestion(int score) async {
    _totalScore += score;
    if (_questionIndex < _questions.length) {
      print('Inside if $_questionIndex');
      setState(() {
        _questionIndex = _questionIndex + 1;
      });
      print('Current questionIndex $_questionIndex');
    } else {
      print('Inside else $_questionIndex');
    }
    _customEvent(_questionIndex, score);
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
        home: Scaffold(
            appBar: AppBar(
              title: Text('Welcome ' + widget.userName),
            ),
            body: callme2()));
  }
}
myquestion.dart
Code:
class MyQuestion {
  String questionText;
  List<Answers> answers;

  MyQuestion({this.questionText, this.answers});

  MyQuestion.fromJson(Map<String, dynamic> json) {
    questionText = json['questionText'];
    if (json['answers'] != null) {
      answers = new List<Answers>();
      json['answers'].forEach((v) {
        answers.add(new Answers.fromJson(v));
      });
    }
  }

  Map<String, dynamic> toJson() {
    final Map<String, dynamic> data = new Map<String, dynamic>();
    data['questionText'] = this.questionText;
    if (this.answers != null) {
      data['answers'] = this.answers.map((v) => v.toJson()).toList();
    }
    return data;
  }
}

class Answers {
  String text;
  int score;

  Answers({this.text, this.score});

  Answers.fromJson(Map<String, dynamic> json) {
    text = json['text'];
    score = json['Score'];
  }

  Map<String, dynamic> toJson() {
    final Map<String, dynamic> data = new Map<String, dynamic>();
    data['text'] = this.text;
    data['Score'] = this.score;
    return data;
  }
}
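For reference, the 'questions' parameter on the AG Connect console holds a JSON string matching this model. Here is a hypothetical sample value (the field names follow MyQuestion/Answers above; the actual questions and scores are up to you):
Code:
[
  {
    "questionText": "What does HMS stand for?",
    "answers": [
      { "text": "Huawei Mobile Services", "Score": 10 },
      { "text": "Huawei Media Services", "Score": 0 }
    ]
  }
]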
login.dart
Code:
import 'dart:async';
import 'dart:convert';
import 'dart:developer';
import 'package:agconnect_remote_config/agconnect_remote_config.dart';
import 'package:flutter/material.dart';
import 'package:flutter_app/main.dart';
import 'package:flutter_app/myquestion.dart';
import 'package:huawei_account/helpers/hms_auth_param_helper.dart';
import 'package:huawei_account/helpers/hms_scope.dart';
import 'package:huawei_account/hmsauthservice/hms_auth_service.dart';
import 'package:huawei_account/model/hms_auth_huawei_id.dart';

class LoginDemo extends StatefulWidget {
  @override
  _LoginDemoState createState() => _LoginDemoState();
}

class _LoginDemoState extends State<LoginDemo> {
  TextEditingController emailController = new TextEditingController();
  TextEditingController passwordController = new TextEditingController();
  String email, password, user;
  List<MyQuestion> _questions;

  @override
  void initState() {
    fetchAndActivateImmediately();
    super.initState();
  }

  @override
  void dispose() {
    // Clean up the controllers when the widget is disposed.
    emailController.dispose();
    passwordController.dispose();
    super.dispose();
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
          appBar: AppBar(
            title: Text('Account Login'),
          ),
          body: Center(
            child: InkWell(
              onTap: signInWithHuaweiAccount,
              child: Ink.image(
                image: AssetImage('assets/images/icon.jpg'),
                // fit: BoxFit.cover,
                width: 110,
                height: 110,
              ),
            ),
          )),
    );
  }

  void signInWithHuaweiAccount() async {
    HmsAuthParamHelper authParamHelper = new HmsAuthParamHelper();
    authParamHelper
      ..setIdToken()
      ..setAuthorizationCode()
      ..setAccessToken()
      ..setProfile()
      ..setEmail()
      ..setScopeList([HmsScope.openId, HmsScope.email, HmsScope.profile])
      ..setRequestCode(8888);
    try {
      final HmsAuthHuaweiId accountInfo =
          await HmsAuthService.signIn(authParamHelper: authParamHelper);
      print('accountInfo ==>' + accountInfo.email);
      setState(() {
        String accountDetails = accountInfo.displayName;
        print("account name: " + accountInfo.displayName);
        print("accountDetails: " + accountDetails);
        user = accountInfo.displayName;
        if (_questions != null) {
          Navigator.of(context).push(
              MaterialPageRoute(builder: (context) => MyApp(user, _questions)));
        }
      });
    } on Exception catch (exception) {
      print(exception.toString());
      print("error: " + exception.toString());
    }
  }

  Future signOut() async {
    final signOutResult = await HmsAuthService.signOut();
    if (signOutResult) {
      //Route route = MaterialPageRoute(builder: (context) => SignInPage());
      // Navigator.pushReplacement(context, route);
      print('You are logged out');
    } else {
      print('Login_provider:signOut failed');
    }
  }

  fetchAndActivateImmediately() async {
    await AGCRemoteConfig.instance.fetch().catchError((error) => log(error.toString()));
    await AGCRemoteConfig.instance.applyLastFetched();
    Map value = await AGCRemoteConfig.instance.getMergedAll();
    for (String key in value.keys) {
      if (key == 'questions') {
        var st = value[key].toString().replaceAll('\\', '');
        var myquestionJson = jsonDecode(st) as List;
        _questions =
            myquestionJson.map((val) => MyQuestion.fromJson(val)).toList();
      }
    }
    print('=================*********************======================');
    print(jsonEncode(_questions));
  }
}
quiz.dart
Code:
import 'package:flutter/material.dart';
import 'package:flutter_app/myquestion.dart';
import './answer.dart';
import './question.dart';

class Quiz extends StatelessWidget {
  final List<MyQuestion> questions;
  final int questionIndex;
  final Function answerQuestion;

  Quiz({
    @required this.answerQuestion,
    @required this.questions,
    @required this.questionIndex,
  });

  @override
  Widget build(BuildContext context) {
    return Column(
      children: [
        Question(
          questions[questionIndex].questionText,
        ),
        ...(questions[questionIndex].answers).map<Widget>((answer) {
          return Answer(() => answerQuestion(answer.score), answer.text);
        }).toList()
      ],
    );
  }
}
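quiz.dart imports Question and Answer widgets that the article does not show. Below is a minimal, illustrative sketch of what they could look like, inferred only from how they are called above (a Question takes the question text; an Answer takes a tap callback and a label):
Code:
// question.dart and answer.dart — illustrative sketches, not from the original article.
import 'package:flutter/material.dart';

class Question extends StatelessWidget {
  final String questionText;
  Question(this.questionText);

  @override
  Widget build(BuildContext context) {
    // Show the question text centered, with some margin around it.
    return Container(
      width: double.infinity,
      margin: EdgeInsets.all(10),
      child: Text(questionText,
          style: TextStyle(fontSize: 28), textAlign: TextAlign.center),
    );
  }
}

class Answer extends StatelessWidget {
  final Function selectHandler;
  final String answerText;
  Answer(this.selectHandler, this.answerText);

  @override
  Widget build(BuildContext context) {
    // A full-width button that reports the selected answer via the callback.
    return Container(
      width: double.infinity,
      child: RaisedButton(
        color: Colors.blue,
        textColor: Colors.white,
        child: Text(answerText),
        onPressed: selectHandler,
      ),
    );
  }
}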
menuscreen.dart
Code:
import 'dart:convert';
import 'dart:developer';
import 'package:agconnect_crash/agconnect_crash.dart';
import 'package:agconnect_remote_config/agconnect_remote_config.dart';
import 'package:flutter/material.dart';
import 'package:flutter_app/AdsDemo.dart';
import 'package:flutter_app/CrashService.dart';
import 'package:flutter_app/locationdata.dart';
import 'package:flutter_app/login.dart';
import 'package:flutter_app/pushdata.dart';
import 'package:flutter_app/remotedata.dart';

class MenuScreen extends StatefulWidget {
  @override
  _MenuScreenState createState() => _MenuScreenState();
}

class _MenuScreenState extends State<MenuScreen> {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(
          title: Text('Menu'),
        ),
        body: Center(
          child: Column(
            children: [
              SizedBox(
                width: 320,
                child: RaisedButton(
                  color: Colors.red, // background
                  textColor: Colors.white, // foreground
                  child: Text('Enter Quiz'),
                  onPressed: () {
                    Navigator.of(context).push(
                        MaterialPageRoute(builder: (context) => LoginDemo()));
                  },
                ),
              )
            ],
          ),
        ),
      ),
    );
  }
}
Result
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the dependencies are added to the build file.
Run flutter pub get after adding dependencies.
Generate a SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learnt how to integrate the Huawei Remote Configuration service in a Flutter QuizApp, where the JSON data of questions and answers is fetched from remote configuration on the AG console. Likewise, you can configure other parameters such as app theme, language, style, and country to change the app's behaviour and appearance.
Thank you so much for reading. I hope this article helps you understand the Huawei Remote Configuration service in Flutter.
Reference
Remote Configuration service: Document | Huawei Developers (developer.huawei.com)
Original Source

Expert: Doctor Consult using RxAndroid and MVVM with Huawei Kits (Account, Map, Identity) in Android App

Overview
In this article, I will create a Doctor Consult demo app along with the integration of Huawei Account, Map, and Identity Kit, which together provide an easy interface for consulting a doctor. Users can choose a specific doctor and provide their details using Huawei User Address.
By reading this article, you'll get an overview of HMS Core Identity, Map, and Account Kit, including their functions, open capabilities, and business value.
HMS Core Map Service Introduction
HMS Core Map SDK is a set of APIs for map development in Android. The map data covers most countries outside China and supports multiple languages. The Map SDK uses the WGS 84 GPS coordinate system, which can meet most requirements of map development outside China. You can easily add map-related functions in your Android app, including:
Map display: Displays buildings, roads, water systems, and Points of Interest (POIs).
Map interaction: Controls the interaction gestures and buttons on the map.
Map drawing: Adds location markers, map layers, overlays, and various shapes.
Prerequisite
Huawei Phone EMUI 3.0 or later.
Non-Huawei phones running Android 4.4 or later (API level 19 or higher).
Android Studio.
AppGallery Account.
App Gallery Integration process
Sign In and Create or Choose a project on AppGallery Connect portal.
Navigate to Project settings and download the configuration file.
Navigate to General Information, and then provide Data Storage location.
App Development
Create A New Project.
Configure Project Gradle.
Configure App Gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'

android {
    compileSdkVersion 30
    buildToolsVersion "29.0.3"
    defaultConfig {
        applicationId "com.hms.doctorconsultdemo"
        minSdkVersion 27
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }
}

dependencies {
    implementation fileTree(dir: "libs", include: ["*.jar"])
    implementation 'androidx.appcompat:appcompat:1.3.0'
    implementation 'androidx.constraintlayout:constraintlayout:2.0.4'
    implementation 'androidx.cardview:cardview:1.0.0'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.2'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
    //noinspection GradleCompatible
    implementation 'com.android.support:recyclerview-v7:27.0.2'
    implementation 'androidx.navigation:navigation-ui:2.1.0'
    implementation 'androidx.lifecycle:lifecycle-extensions:2.2.0'
    implementation 'com.huawei.hms:identity:5.3.0.300'
    // Dagger
    implementation 'com.google.dagger:dagger:2.13'
    annotationProcessor 'com.google.dagger:dagger-compiler:2.13'
    //noinspection GradleCompatible
    implementation 'com.android.support:cardview-v7:28.0.0'
    implementation 'com.android.support:support-v4:28.0.0'
    implementation 'com.google.android.material:material:1.2.0'
    implementation "com.google.code.gson:gson:2.8.5"
    implementation 'com.huawei.hms:hwid:4.0.4.300'
    implementation 'com.squareup.okhttp3:okhttp:3.14.2'
    implementation 'com.squareup.okio:okio:1.14.1'
    implementation 'com.github.bumptech.glide:glide:4.9.0'
    implementation 'com.huawei.hms:ads-lite:13.4.30.307'
    implementation 'com.huawei.hms:hianalytics:5.0.3.300'
    // Map
    // implementation 'com.huawei.hms:maps:4.0.0.301'
    implementation 'com.huawei.hms:maps:5.0.1.300'
    // Site
    implementation 'com.huawei.hms:site:4.0.0.300'
    // Location
    implementation 'com.huawei.hms:location:4.0.3.301'
    implementation 'com.squareup.retrofit2:retrofit:2.5.0'
    implementation 'com.squareup.retrofit2:converter-gson:2.5.0'
    implementation "com.squareup.retrofit2:adapter-rxjava2:2.6.2"
    implementation 'io.reactivex:rxjava:1.3.0'
    implementation 'io.reactivex:rxandroid:1.2.1'
    implementation 'com.android.support:multidex:1.0.3'
    implementation 'com.squareup.okhttp3:logging-interceptor:4.2.2'
    // RxAndroid
    implementation 'io.reactivex.rxjava2:rxjava:2.2.8'
    implementation 'io.reactivex.rxjava2:rxandroid:2.1.1'
    implementation 'com.huawei.hms:awareness:1.0.4.301'
    implementation 'com.huawei.hms:dtm-api:5.0.2.300'
    implementation 'com.huawei.agconnect:agconnect-remoteconfig:1.3.1.300'
    implementation 'com.huawei.agconnect:agconnect-crash:1.4.1.300'
    implementation "com.huawei.agconnect:agconnect-appmessaging:1.4.1.300"
    implementation 'com.huawei.agconnect:agconnect-auth:1.4.1.300'
}
Configure AndroidManifest.xml.
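The article does not show the manifest contents; below is a minimal, illustrative sketch of the permissions this demo plausibly needs (DirectionActivity requests the location permissions at runtime, and the Map and Site kits need network access). Adjust it to your project:
Code:
<!-- Illustrative example, not from the original article -->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />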
Create Activity class with XML UI.
DirectionActivity:
Java:
package com.hms.doctorconsultdemo.map;
import android.Manifest;
import android.content.Intent;
import android.content.IntentSender;
import android.content.pm.PackageManager;
import android.graphics.Color;
import android.graphics.drawable.ColorDrawable;
import android.location.Location;
import android.os.Build;
import android.os.Bundle;
import android.os.Looper;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import androidx.appcompat.app.ActionBar;
import androidx.appcompat.app.AppCompatActivity;
import androidx.appcompat.widget.Toolbar;
import androidx.cardview.widget.CardView;
import androidx.core.app.ActivityCompat;
import androidx.lifecycle.ViewModelProviders;
import com.google.android.material.card.MaterialCardView;
import com.hms.doctorconsultdemo.BookAppointmentActivity;
import com.hms.doctorconsultdemo.R;
import com.hms.doctorconsultdemo.map.apiconnector.polylineBody.Destination;
import com.hms.doctorconsultdemo.map.apiconnector.polylineBody.Origin;
import com.hms.doctorconsultdemo.map.apiconnector.polylineBody.PolylineBody;
import com.hms.doctorconsultdemo.map.apiconnector.polylineResponse.Paths;
import com.hms.doctorconsultdemo.map.apiconnector.polylineResponse.Polyline;
import com.hms.doctorconsultdemo.map.apiconnector.polylineResponse.PolylineResponse;
import com.hms.doctorconsultdemo.map.apiconnector.polylineResponse.Routes;
import com.hms.doctorconsultdemo.map.apiconnector.polylineResponse.Steps;
import com.huawei.agconnect.remoteconfig.AGConnectConfig;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hms.common.ApiException;
import com.huawei.hms.common.ResolvableApiException;
import com.huawei.hms.location.FusedLocationProviderClient;
import com.huawei.hms.location.LocationAvailability;
import com.huawei.hms.location.LocationCallback;
import com.huawei.hms.location.LocationRequest;
import com.huawei.hms.location.LocationResult;
import com.huawei.hms.location.LocationServices;
import com.huawei.hms.location.LocationSettingsRequest;
import com.huawei.hms.location.LocationSettingsStatusCodes;
import com.huawei.hms.location.SettingsClient;
import com.huawei.hms.maps.CameraUpdateFactory;
import com.huawei.hms.maps.HuaweiMap;
import com.huawei.hms.maps.MapView;
import com.huawei.hms.maps.OnMapReadyCallback;
import com.huawei.hms.maps.SupportMapFragment;
import com.huawei.hms.maps.model.LatLng;
import com.huawei.hms.maps.model.MapStyleOptions;
import com.huawei.hms.maps.model.Marker;
import com.huawei.hms.maps.model.MarkerOptions;
import com.huawei.hms.maps.model.PolylineOptions;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
public class DirectionActivity extends AppCompatActivity implements OnMapReadyCallback {
public static final String TAG = "DirectionActivity";
private static final String MAPVIEW_BUNDLE_KEY = "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX";
private HuaweiMap hmap;
private MapView mMapView;
private Marker mMarker;
private List<LatLng> latLngList;
private MapApiViewModel mapApiViewModel;
private CardView cardView;
private LocationCallback mLocationCallback;
private LocationRequest mLocationRequest;
private FusedLocationProviderClient fusedLocationProviderClient;
private SettingsClient settingsClient;
private PolylineBody polylineBody;
private Button btnBooking;
private Map<String, Object> remoteConfigMap;
private AGConnectConfig config;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
init();
remoteConfigMap = new HashMap<>();
config = AGConnectConfig.getInstance();
remoteConfigMap.put("mapstyle", "light");
config.applyDefault(remoteConfigMap);
SupportMapFragment mapFragment = (SupportMapFragment) getSupportFragmentManager().findFragmentById(R.id.mapView);
mMapView = findViewById(R.id.mapView);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle(MAPVIEW_BUNDLE_KEY);
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(DirectionActivity.this);
}
private void init() {
setContentView(R.layout.activity_direction);
cardView = findViewById(R.id.card_map);
btnBooking = findViewById(R.id.btn_book_trip);
btnBooking.setOnClickListener(view -> {
Intent intent = new Intent(this, BookAppointmentActivity.class);
startActivity(intent);
});
Toolbar toolbar = findViewById(R.id.toolbar);
setSupportActionBar(toolbar);
getSupportActionBar().setDisplayHomeAsUpEnabled(true);
getSupportActionBar().setDisplayShowHomeEnabled(true);
Bundle extras = getIntent().getExtras();
if (extras != null) {
String name = extras.getString("name");
String orgLat = extras.getString("orgLat");
String orgLong = extras.getString("orgLong");
String desLat = extras.getString("desLat");
String desLong = extras.getString("desLong");
boolean tripDisplay = extras.getBoolean("isTrip");
if (!tripDisplay) {
cardView.setVisibility(View.GONE);
} else {
cardView.setVisibility(View.VISIBLE);
}
setTitle(name);
setLatLong(orgLat, orgLong, desLat, desLong);
}
mapApiViewModel = ViewModelProviders.of(this).get(MapApiViewModel.class);
fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this);
settingsClient = LocationServices.getSettingsClient(this);
mLocationRequest = new LocationRequest();
mLocationRequest.setInterval(10000);
mLocationRequest.setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY);
if (null == mLocationCallback) {
mLocationCallback = new LocationCallback() {
@Override
public void onLocationResult(LocationResult locationResult) {
if (locationResult != null) {
List<Location> locations = locationResult.getLocations();
if (!locations.isEmpty()) {
for (Location location : locations) {
Log.i(TAG,
"onLocationResult location[Longitude,Latitude,Accuracy]:" + location.getLongitude()
+ "," + location.getLatitude() + "," + location.getAccuracy());
}
}
}
}
@Override
public void onLocationAvailability(LocationAvailability locationAvailability) {
if (locationAvailability != null) {
boolean flag = locationAvailability.isLocationAvailable();
Log.i(TAG, TAG + flag);
}
}
};
}
// Check location permission.
if (Build.VERSION.SDK_INT <= Build.VERSION_CODES.P) {
Log.i(TAG, "sdk < 28 Q");
if (ActivityCompat.checkSelfPermission(this,
Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(this,
Manifest.permission.ACCESS_COARSE_LOCATION) != PackageManager.PERMISSION_GRANTED) {
String[] strings =
{Manifest.permission.ACCESS_FINE_LOCATION, Manifest.permission.ACCESS_COARSE_LOCATION};
ActivityCompat.requestPermissions(this, strings, 1);
}
} else {
if (ActivityCompat.checkSelfPermission(this,
Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(this,
Manifest.permission.ACCESS_COARSE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(this,
"android.permission.ACCESS_BACKGROUND_LOCATION") != PackageManager.PERMISSION_GRANTED) {
String[] strings = {Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.ACCESS_COARSE_LOCATION,
"android.permission.ACCESS_BACKGROUND_LOCATION"};
ActivityCompat.requestPermissions(this, strings, 2);
}
}
}
@Override
protected void onStart() {
super.onStart();
mMapView.onStart();
}
@Override
protected void onResume() {
super.onResume();
mMapView.onResume();
}
@Override
protected void onPause() {
super.onPause();
mMapView.onPause();
}
@Override
protected void onStop() {
super.onStop();
mMapView.onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
mMapView.onDestroy();
}
@Override
public void onMapReady(HuaweiMap map) {
hmap = map;
hmap.setMyLocationEnabled(true);
hmap.setTrafficEnabled(true);
hmap.getUiSettings().setRotateGesturesEnabled(true);
hmap.getUiSettings().setCompassEnabled(false);
mapApiViewModel.getPolylineLiveData(getPolylineBody()).observe(this, result -> {
Log.d(TAG, result.toString());
getPolylineData(result);
});
addHMSRemoteConfigListener();
}
private PolylineBody getPolylineBody() {
return polylineBody;
}
private void setLatLong(String orgLat, String orgLong, String desLat, String desLong) {
polylineBody = new PolylineBody();
Origin origin = new Origin();
origin.setLat(orgLat);
origin.setLng(orgLong);
Destination destination = new Destination();
destination.setLat(desLat);
destination.setLng(desLong);
polylineBody.setDestination(destination);
polylineBody.setOrigin(origin);
}
public void getPolylineData(PolylineResponse polylineResponse) {
List<Routes> routesList = polylineResponse.getRoutes();
List<Polyline> polylines = new ArrayList<>();
latLngList = new ArrayList<>();
// Flatten routes -> paths -> steps into a single list of polyline points.
for (Routes route : routesList) {
for (Paths path : route.getPaths()) {
for (Steps step : path.getSteps()) {
polylines.addAll(step.getPolyline());
}
}
}
for (Polyline polyline : polylines) {
latLngList.add(new LatLng(Double.valueOf(polyline.getLat()), Double.valueOf(polyline.getLng())));
}
if (latLngList.isEmpty()) {
Log.w(TAG, "No polyline points returned");
return;
}
hmap.animateCamera(CameraUpdateFactory.newLatLngZoom(latLngList.get(0), 12.0f));
hmap.addMarker(new MarkerOptions().position(latLngList.get(0)));
hmap.addPolyline(new PolylineOptions()
.addAll(latLngList)
.color(Color.BLUE)
.width(3));
}
private void requestLocationUpdatesWithCallback() {
try {
LocationSettingsRequest.Builder builder = new LocationSettingsRequest.Builder();
builder.addLocationRequest(mLocationRequest);
LocationSettingsRequest locationSettingsRequest = builder.build();
settingsClient.checkLocationSettings(locationSettingsRequest)
.addOnSuccessListener(locationSettingsResponse -> {
Log.i(TAG, "check location settings success");
fusedLocationProviderClient
.requestLocationUpdates(mLocationRequest, mLocationCallback, Looper.getMainLooper())
.addOnSuccessListener(aVoid -> Log.i(TAG, "requestLocationUpdatesWithCallback onSuccess"))
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.e(TAG,
"requestLocationUpdatesWithCallback onFailure:" + e.getMessage());
}
});
})
.addOnFailureListener(e -> {
Log.e(TAG, "checkLocationSetting onFailure:" + e.getMessage());
int statusCode = ((ApiException) e).getStatusCode();
switch (statusCode) {
case LocationSettingsStatusCodes.RESOLUTION_REQUIRED:
try {
ResolvableApiException rae = (ResolvableApiException) e;
rae.startResolutionForResult(DirectionActivity.this, 0);
} catch (IntentSender.SendIntentException sie) {
Log.e(TAG, "PendingIntent unable to execute request.");
}
break;
}
});
} catch (Exception e) {
Log.e(TAG, "requestLocationUpdatesWithCallback exception:" + e.getMessage());
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (requestCode == 1) {
if (grantResults.length > 1 && grantResults[0] == PackageManager.PERMISSION_GRANTED
&& grantResults[1] == PackageManager.PERMISSION_GRANTED) {
Log.i(TAG, "onRequestPermissionsResult: apply LOCATION PERMISSION successful");
} else {
Log.i(TAG, "onRequestPermissionsResult: apply LOCATION PERMISSSION failed");
}
}
if (requestCode == 2) {
if (grantResults.length > 2 && grantResults[2] == PackageManager.PERMISSION_GRANTED
&& grantResults[0] == PackageManager.PERMISSION_GRANTED
&& grantResults[1] == PackageManager.PERMISSION_GRANTED) {
Log.i(TAG, "onRequestPermissionsResult: apply ACCESS_BACKGROUND_LOCATION successful");
} else {
Log.i(TAG, "onRequestPermissionsResult: apply ACCESS_BACKGROUND_LOCATION failed");
}
}
}
private void addHMSRemoteConfigListener() {
config.fetch(5).addOnSuccessListener(configValues -> {
config.apply(configValues);
MapStyleOptions mapStyleOptions;
String style = config.getValueAsString("mapstyle");
String colorPrimary = config.getValueAsString("primarycolor");
Log.d(TAG, "HMS color : " + colorPrimary);
ActionBar actionBar = getSupportActionBar();
actionBar.setBackgroundDrawable(new ColorDrawable(Color.parseColor(colorPrimary)));
if (style.equalsIgnoreCase("dark")) {
mapStyleOptions = MapStyleOptions.loadRawResourceStyle(DirectionActivity.this, R.raw.mapstyle_night);
hmap.setMapStyle(mapStyleOptions);
} else if (style.equalsIgnoreCase("light")) {
mapStyleOptions = MapStyleOptions.loadRawResourceStyle(DirectionActivity.this, R.raw.mapstyle_day);
hmap.setMapStyle(mapStyleOptions);
}
}).addOnFailureListener(e -> Log.d(TAG, e.getMessage()));
}
}
App Build Result
Tips and Tricks
Map data of Map Kit does not cover the Chinese mainland. Therefore, the Map SDK for Android, Map SDK (Java) for HarmonyOS, JavaScript API, Static Map API, and Directions API of Map Kit are unavailable in the Chinese mainland. For details about the locations where these services are available, see the official Map Kit documentation.
The map zoom icons flash on the map on devices running Android 8 or earlier before the map is loaded. (This issue occurs at a low probability in Android 8, but does not occur in versions later than Android 8.)
Layout file (XML file): Set uiZoomControls to false.
Code file: Set the parameter of the HuaweiMapOptions.zoomControlsEnabled(boolean isZoomControlsEnabled) method to false.
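To apply the code-file option above, a minimal Kotlin sketch might look like the following. It assumes the map is created programmatically via SupportMapFragment.newInstance(HuaweiMapOptions); R.id.map_container is a placeholder container ID, not something from the sample project.
Code:
import com.huawei.hms.maps.HuaweiMapOptions
import com.huawei.hms.maps.SupportMapFragment

// Minimal sketch, inside an AppCompatActivity: create the map fragment
// with zoom controls disabled. R.id.map_container is a placeholder ID.
val options = HuaweiMapOptions().zoomControlsEnabled(false)
val mapFragment = SupportMapFragment.newInstance(options)
supportFragmentManager.beginTransaction()
    .replace(R.id.map_container, mapFragment)
    .commit()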
Conclusion
In this article, we have learned how to integrate HMS Core Identity and Map Kit in an Android application. After reading this article, you can easily implement the Huawei User Address and Map APIs through HMS Core Identity, so that a user can consult a doctor using their Huawei user address and be redirected to the doctor's location.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Identity Docs: https://developer.huawei.com/consumer/en/hms/huawei-identitykit/
Map Kit Android SDK introduction: https://developer.huawei.com/consum...des/android-sdk-introduction-0000001061991291
HMS Training Videos: https://developer.huawei.com/consumer/en/training/

Expert: Directory App MVVM Jetpack (Firebase and Webrtc) in Android using Kotlin- Part-3

Overview
In this article, I will create a Directory Android application using Kotlin, in which I will integrate HMS Core kits such as HMS Account, AuthService and Identity Kit, together with Firebase Auth and WebRTC.
The app will use an Android MVVM clean architecture built on Jetpack components such as DataBinding, AndroidViewModel, Observer, LiveData and much more.
In this article we are going to implement DataBinding using the Observable pattern; a short sketch of that pattern follows.
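Since LoginViewModel later implements the Observable interface with empty callbacks, nothing notifies the UI when a property changes; when you need that, extending BaseObservable is the usual pattern. The ProfileViewModel class below is an illustrative sketch, not part of the project (BR is the class generated by the data binding compiler).
Code:
import androidx.databinding.BaseObservable
import androidx.databinding.Bindable

// Illustrative sketch: BaseObservable supplies the property-changed
// callback registry, so bound views refresh when userName changes.
class ProfileViewModel : BaseObservable() {
    @get:Bindable
    var userName: String = ""
        set(value) {
            field = value
            notifyPropertyChanged(BR.userName) // BR is generated by data binding
        }
}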
FirebaseAuth Service Introduction
Firebase security applies Google’s internal expertise to easily build app sign-ins. Develop simple, multi-platform sign-in with Firebase Authentication. Build Fast For Any Device.
Firebase Authentication provides backend services, easy-to-use SDKs, and ready-made UI libraries to authenticate users to your app.
WebRTC Service Introduction
WebRTC is a free and open-source project providing web browsers and mobile applications with real-time communication via application programming interfaces.
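The WebRTC wiring itself lands in Part 4, but for a flavor of the API, here is a rough Kotlin sketch of bootstrapping a peer connection factory. It assumes the org.webrtc (google-webrtc) dependency is on the classpath; the STUN server URL is a common public default, not something mandated by this project.
Code:
import org.webrtc.PeerConnection
import org.webrtc.PeerConnectionFactory

// Rough sketch, inside an Activity or Application class: initialize the
// native WebRTC library once, then build a factory and the ICE server
// list used when creating peer connections.
PeerConnectionFactory.initialize(
    PeerConnectionFactory.InitializationOptions.builder(applicationContext)
        .createInitializationOptions()
)
val peerConnectionFactory = PeerConnectionFactory.builder()
    .createPeerConnectionFactory()
val iceServers = listOf(
    PeerConnection.IceServer.builder("stun:stun.l.google.com:19302")
        .createIceServer()
)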
Prerequisite
Huawei Phone EMUI 3.0 or later.
Non-Huawei phones Android 4.4 or later (API level 19 or higher).
HMS Core APK 4.0.0.300 or later
Android Studio
AppGallery Account
App Gallery Integration process
1. Sign In and Create or Choose a project on AppGallery Connect portal.
2. Navigate to Project settings and download the configuration file.
3. Navigate to General Information, and then provide Data Storage location.
App Development
Add Required Dependencies:
Launch Android Studio and create a new project. Once the project is ready, add the following dependencies for the HMS kits, Firebase and the other libraries used by the app to the app-level build.gradle:
Code:
//HMS Kits
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
implementation 'com.huawei.hms:hwid:5.3.0.302'
implementation 'com.huawei.hms:identity:5.3.0.300'

//Google Firebase
implementation platform('com.google.firebase:firebase-bom:28.4.1')
implementation 'com.google.firebase:firebase-analytics'
implementation 'com.google.firebase:firebase-auth'
implementation 'com.google.firebase:firebase-database'
implementation 'com.google.android.gms:play-services-auth:19.2.0'

//UI and media libraries
implementation 'com.airbnb.android:lottie:4.1.0'
implementation 'com.mikhaellopez:circularimageview:4.3.0'
implementation 'com.kaopiz:kprogresshud:1.2.0'
implementation 'com.google.android.gms:play-services-ads:20.4.0'
implementation 'com.github.bumptech.glide:glide:4.12.0'
annotationProcessor 'com.github.bumptech.glide:compiler:4.12.0'
Navigate to the Gradle scripts folder and open the project-level build.gradle:
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath "com.android.tools.build:gradle:4.0.1"
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}
Code Implementation
Create the following packages: model, event and viewmodel.
ViewModel: The ViewModel makes it easy to propagate data changes to the UI. Create a package named viewmodel in your main folder, then create a new file named LoginViewModel.kt along with its LoginViewModelFactory (a minimal factory sketch follows).
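For reference, here is a minimal sketch of what LoginViewModelFactory can look like; the project's actual factory may differ, but any implementation only needs to pass the Application into the AndroidViewModel constructor (on older lifecycle library versions the create signature uses ViewModel? instead).
Code:
import android.app.Application
import androidx.lifecycle.ViewModel
import androidx.lifecycle.ViewModelProvider

// Minimal sketch of the factory used with ViewModelProviders below;
// it forwards the Application into LoginViewModel's constructor.
class LoginViewModelFactory(private val application: Application) : ViewModelProvider.Factory {
    @Suppress("UNCHECKED_CAST")
    override fun <T : ViewModel> create(modelClass: Class<T>): T {
        if (modelClass.isAssignableFrom(LoginViewModel::class.java)) {
            return LoginViewModel(application) as T
        }
        throw IllegalArgumentException("Unknown ViewModel class: ${modelClass.name}")
    }
}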
MainActivity.kt:
Code:
package com.hms.directory

class MainActivity : AppCompatActivity(), ActivityNavigation {

    private lateinit var viewModel: LoginViewModel
    private lateinit var dataBinding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        dataBinding = DataBindingUtil.setContentView(this, R.layout.activity_main)
        // Assign the class-level viewModel so onActivityResult can use it.
        viewModel = ViewModelProviders.of(this, LoginViewModelFactory(application))
            .get(LoginViewModel::class.java)
        dataBinding.loginViewModel = viewModel
        dataBinding.lifecycleOwner = this
        viewModel.startActivityForResultEvent.setEventReceiver(this, this)
    }

    public override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        viewModel.onResultFromActivity(requestCode, data)
        super.onActivityResult(requestCode, resultCode, data)
    }
}
LoginViewModel.kt:
Code:
package com.hms.directory.viewmodel

@SuppressLint("StaticFieldLeak")
class LoginViewModel(application: Application) : AndroidViewModel(application), Observable {

    private val context = getApplication<Application>().applicationContext
    private var mAuthManager: AccountAuthService? = null
    private var mAuthParam: AccountAuthParams? = null

    val startActivityForResultEvent = LiveMessageEvent<ActivityNavigation>()

    fun login() {
        val intent = Intent(context, OrderActivity::class.java)
        intent.flags = Intent.FLAG_ACTIVITY_NEW_TASK
        context.startActivity(intent)
        /* mAuthParam = AccountAuthParamsHelper(AccountAuthParams.DEFAULT_AUTH_REQUEST_PARAM)
            .setIdToken()
            .setAccessToken()
            .createParams()
        mAuthManager = AccountAuthManager.getService(Activity(), mAuthParam)
        startActivityForResultEvent.sendEvent {
            startActivityForResult(
                mAuthManager?.signInIntent,
                HMS_SIGN_IN
            )
        } */
    }

    fun onResultFromActivity(requestCode: Int, data: Intent?) {
        when (requestCode) {
            HMS_SIGN_IN -> {
                val authAccountTask = AccountAuthManager.parseAuthResultFromIntent(data)
                onCompleteLogin(authAccountTask)
            }
        }
    }

    private fun onCompleteLogin(doneTask: Task<AuthAccount>) {
        if (doneTask.isSuccessful) {
            val authAccount = doneTask.result
            Log.d("LoginViewModel", "SignIn Success")
            context.startActivity(Intent(context, ContactListActivity::class.java))
        } else {
            Log.d("LoginViewModel", "SignIn Error")
        }
    }

    override fun addOnPropertyChangedCallback(callback: Observable.OnPropertyChangedCallback?) {}

    override fun removeOnPropertyChangedCallback(callback: Observable.OnPropertyChangedCallback?) {}
}
ContactListActivity.java:
Code:
public class ContactListActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_contact_list);
        // Load contacts from file
        Contacts.loadData(this);
        // Set up the recycler view and fill it with all the contacts
        RecyclerView recyclerView = (RecyclerView) findViewById(R.id.contact_list);
        recyclerView.setAdapter(new ContactListAdapter(this, Contacts.LIST));
    }
}
LoginFireBaseActivity.java
Code:
package com.hms.directory.app.call;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;

import android.content.Intent;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Toast;

import com.google.android.gms.auth.api.signin.GoogleSignIn;
import com.google.android.gms.auth.api.signin.GoogleSignInAccount;
import com.google.android.gms.auth.api.signin.GoogleSignInClient;
import com.google.android.gms.auth.api.signin.GoogleSignInOptions;
import com.google.android.gms.tasks.OnCompleteListener;
import com.google.android.gms.tasks.Task;
import com.google.firebase.auth.AuthCredential;
import com.google.firebase.auth.AuthResult;
import com.google.firebase.auth.FirebaseAuth;
import com.google.firebase.auth.FirebaseUser;
import com.google.firebase.auth.GoogleAuthProvider;
import com.google.firebase.database.FirebaseDatabase;
import com.hms.corrierapp.R;
import com.hms.directory.app.call.models.User;

public class LoginActivity extends AppCompatActivity {

    GoogleSignInClient mGoogleSignInClient;
    int RC_SIGN_IN = 11;
    FirebaseAuth mAuth;
    FirebaseDatabase database;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_login_goole);

        mAuth = FirebaseAuth.getInstance();
        if (mAuth.getCurrentUser() != null) {
            goToNextActivity();
        }
        database = FirebaseDatabase.getInstance();

        GoogleSignInOptions gso = new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
                .requestIdToken("1016048264402-439a9aamtpiajbgqeqg24qkum2bb7fmh.apps.googleusercontent.com")
                .requestEmail()
                .build();
        mGoogleSignInClient = GoogleSignIn.getClient(this, gso);

        findViewById(R.id.loginBtn).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                Intent intent = mGoogleSignInClient.getSignInIntent();
                startActivityForResult(intent, RC_SIGN_IN);
            }
        });
    }

    void goToNextActivity() {
        startActivity(new Intent(LoginActivity.this, MainActivity.class));
        finish();
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == RC_SIGN_IN) {
            Task<GoogleSignInAccount> task = GoogleSignIn.getSignedInAccountFromIntent(data);
            GoogleSignInAccount account = task.getResult();
            authWithGoogle(account.getIdToken());
        }
    }

    void authWithGoogle(String idToken) {
        AuthCredential credential = GoogleAuthProvider.getCredential(idToken, null);
        mAuth.signInWithCredential(credential)
                .addOnCompleteListener(new OnCompleteListener<AuthResult>() {
                    @Override
                    public void onComplete(@NonNull Task<AuthResult> task) {
                        if (task.isSuccessful()) {
                            FirebaseUser user = mAuth.getCurrentUser();
                            User firebaseUser = new User(user.getUid(), user.getDisplayName(), user.getPhotoUrl().toString(), "Unknown", 500);
                            // Store the signed-in user's profile in Realtime Database.
                            database.getReference()
                                    .child("profiles")
                                    .child(user.getUid())
                                    .setValue(firebaseUser).addOnCompleteListener(new OnCompleteListener<Void>() {
                                        @Override
                                        public void onComplete(@NonNull Task<Void> task) {
                                            if (task.isSuccessful()) {
                                                startActivity(new Intent(LoginActivity.this, MainActivity.class));
                                                finishAffinity();
                                            } else {
                                                Toast.makeText(LoginActivity.this, task.getException().getLocalizedMessage(), Toast.LENGTH_SHORT).show();
                                            }
                                        }
                                    });
                        } else {
                            Log.e("err", task.getException().getLocalizedMessage());
                        }
                    }
                });
    }
}
XML layout DataBinding
To include data binding in the UI, enclose all the layout content in a <layout></layout> root tag.
The ViewModel is introduced to the layout in the <data></data> section, as shown below. Ensure that the type value points to the package that contains the required ViewModel.
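For example, a minimal data-binding layout for the login screen might look like the sketch below. The root view and its attributes are illustrative, but the variable name matches the dataBinding.loginViewModel used in MainActivity.kt.
Code:
<?xml version="1.0" encoding="utf-8"?>
<!-- A minimal sketch of a data-binding layout; the button and container
     are illustrative, and the type points at the article's viewmodel package. -->
<layout xmlns:android="http://schemas.android.com/apk/res/android">
    <data>
        <variable
            name="loginViewModel"
            type="com.hms.directory.viewmodel.LoginViewModel" />
    </data>
    <androidx.constraintlayout.widget.ConstraintLayout
        android:layout_width="match_parent"
        android:layout_height="match_parent">
        <Button
            android:id="@+id/loginBtn"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:onClick="@{() -> loginViewModel.login()}"
            android:text="Login" />
    </androidx.constraintlayout.widget.ConstraintLayout>
</layout>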
App Build Result
Tips and Tricks
Identity Kit displays the HUAWEI ID registration or sign-in page first. The user can use the functions provided by Identity Kit only after signing in using a registered HUAWEI ID.
Conclusion
In this article, we have learned how to integrate Huawei Identity Kit in an Android application. After reading this article, you can easily implement Huawei ID sign-in in the Directory Android application using Kotlin.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Docs:
https://developer.huawei.com/consum.../HMSCore-Guides/introduction-0000001050048870

Expert: Directory App MVVM Jetpack (Video Call with Webrtc & Firebase Realtime DB) in Android using Kotlin- Part-4

Overview
In this article, I will extend the Directory Android application with WebRTC video calling, in which I will integrate HMS Core kits such as HMS Account, AuthService and Identity Kit, together with Firebase Auth and Firebase Realtime DB.
The app will use an Android MVVM clean architecture built on Jetpack components such as DataBinding, AndroidViewModel, Observer, LiveData and much more.
In this article we are going to implement DataBinding using the Observable pattern.
FirebaseAuth Service Introduction
Firebase security applies Google’s internal expertise to easily build app sign-ins. Develop simple, multi-platform sign-in with Firebase Authentication. Build Fast For Any Device.
Firebase Authentication provides backend services, easy-to-use SDKs, and ready-made UI libraries to authenticate users to your app.
Firebase Realtime Database Service Introduction
Firebase Realtime Database lets you build rich, collaborative applications by allowing secure access to the database directly from client-side code.
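As a small, self-contained sketch of that client-side access pattern (the users/status paths mirror the room structure used later in ConnectingActivity; the demoUser key is illustrative only):
Code:
import android.util.Log
import com.google.firebase.database.DataSnapshot
import com.google.firebase.database.DatabaseError
import com.google.firebase.database.FirebaseDatabase
import com.google.firebase.database.ValueEventListener

// Minimal sketch: write a room status and listen for changes.
fun observeRoomStatus() {
    val statusRef = FirebaseDatabase.getInstance()
        .getReference("users")
        .child("demoUser")
        .child("status")
    statusRef.setValue(0) // 0 = room available, matching the article's convention
    statusRef.addValueEventListener(object : ValueEventListener {
        override fun onDataChange(snapshot: DataSnapshot) {
            Log.d("RealtimeDB", "status = ${snapshot.getValue(Int::class.java)}")
        }
        override fun onCancelled(error: DatabaseError) {
            Log.e("RealtimeDB", error.message)
        }
    })
}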
WebRTC Service Introduction
WebRTC is a free and open-source project providing web browsers and mobile applications with real-time communication via application programming interfaces.
Prerequisite
Huawei Phone EMUI 3.0 or later.
Non-Huawei phones Android 4.4 or later (API level 19 or higher).
HMS Core APK 4.0.0.300 or later
Android Studio
AppGallery Account
App Gallery Integration process
1. Sign In and Create or Choose a project on AppGallery Connect portal.
2. Navigate to Project settings and download the configuration file.
3. Navigate to General Information, and then provide Data Storage location.
App Development
Add Required Dependencies:
Launch Android Studio and create a new project. Once the project is ready, add the following dependencies for the HMS kits, Firebase and the other libraries used by the app to the app-level build.gradle:
Code:
//HMS Kits
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
implementation 'com.huawei.hms:hwid:5.3.0.302'
implementation 'com.huawei.hms:identity:5.3.0.300'

//Google Firebase
implementation platform('com.google.firebase:firebase-bom:28.4.1')
implementation 'com.google.firebase:firebase-analytics'
implementation 'com.google.firebase:firebase-auth'
implementation 'com.google.firebase:firebase-database'
implementation 'com.google.android.gms:play-services-auth:19.2.0'

//UI and media libraries
implementation 'com.airbnb.android:lottie:4.1.0'
implementation 'com.mikhaellopez:circularimageview:4.3.0'
implementation 'com.kaopiz:kprogresshud:1.2.0'
implementation 'com.google.android.gms:play-services-ads:20.4.0'
implementation 'com.github.bumptech.glide:glide:4.12.0'
annotationProcessor 'com.github.bumptech.glide:compiler:4.12.0'
Navigate to the Gradle scripts folder and open the project-level build.gradle:
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath "com.android.tools.build:gradle:4.0.1"
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}
Code Implementation
Create the following packages: model, event and viewmodel.
ViewModel: The ViewModel makes it easy to propagate data changes to the UI. Create a package named viewmodel in your main folder, then create a new file named LoginViewModel.kt along with its LoginViewModelFactory (see the factory sketch in Part 3).
MainActivity.kt:
Code:
package com.hms.directory

class MainActivity : AppCompatActivity(), ActivityNavigation {

    private lateinit var viewModel: LoginViewModel
    private lateinit var dataBinding: ActivityMainBinding

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        dataBinding = DataBindingUtil.setContentView(this, R.layout.activity_main)
        // Assign the class-level viewModel so onActivityResult can use it.
        viewModel = ViewModelProviders.of(this, LoginViewModelFactory(application))
            .get(LoginViewModel::class.java)
        dataBinding.loginViewModel = viewModel
        dataBinding.lifecycleOwner = this
        viewModel.startActivityForResultEvent.setEventReceiver(this, this)
    }

    public override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        viewModel.onResultFromActivity(requestCode, data)
        super.onActivityResult(requestCode, resultCode, data)
    }
}
LoginViewModel.kt:
Code:
package com.hms.directory.viewmodel

@SuppressLint("StaticFieldLeak")
class LoginViewModel(application: Application) : AndroidViewModel(application), Observable {

    private val context = getApplication<Application>().applicationContext
    private var mAuthManager: AccountAuthService? = null
    private var mAuthParam: AccountAuthParams? = null

    val startActivityForResultEvent = LiveMessageEvent<ActivityNavigation>()

    fun login() {
        val intent = Intent(context, OrderActivity::class.java)
        intent.flags = Intent.FLAG_ACTIVITY_NEW_TASK
        context.startActivity(intent)
        /* mAuthParam = AccountAuthParamsHelper(AccountAuthParams.DEFAULT_AUTH_REQUEST_PARAM)
            .setIdToken()
            .setAccessToken()
            .createParams()
        mAuthManager = AccountAuthManager.getService(Activity(), mAuthParam)
        startActivityForResultEvent.sendEvent {
            startActivityForResult(
                mAuthManager?.signInIntent,
                HMS_SIGN_IN
            )
        } */
    }

    fun onResultFromActivity(requestCode: Int, data: Intent?) {
        when (requestCode) {
            HMS_SIGN_IN -> {
                val authAccountTask = AccountAuthManager.parseAuthResultFromIntent(data)
                onCompleteLogin(authAccountTask)
            }
        }
    }

    private fun onCompleteLogin(doneTask: Task<AuthAccount>) {
        if (doneTask.isSuccessful) {
            val authAccount = doneTask.result
            Log.d("LoginViewModel", "SignIn Success")
            context.startActivity(Intent(context, ContactListActivity::class.java))
        } else {
            Log.d("LoginViewModel", "SignIn Error")
        }
    }

    override fun addOnPropertyChangedCallback(callback: Observable.OnPropertyChangedCallback?) {}

    override fun removeOnPropertyChangedCallback(callback: Observable.OnPropertyChangedCallback?) {}
}
ContactListActivity.java:
Code:
public class ContactListActivity extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_contact_list);
        // Load contacts from file
        Contacts.loadData(this);
        // Set up the recycler view and fill it with all the contacts
        RecyclerView recyclerView = (RecyclerView) findViewById(R.id.contact_list);
        recyclerView.setAdapter(new ContactListAdapter(this, Contacts.LIST));
    }
}
LoginFireBaseActivity.java
Code:
package com.hms.directory.app.call;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;

import android.content.Intent;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Toast;

import com.google.android.gms.auth.api.signin.GoogleSignIn;
import com.google.android.gms.auth.api.signin.GoogleSignInAccount;
import com.google.android.gms.auth.api.signin.GoogleSignInClient;
import com.google.android.gms.auth.api.signin.GoogleSignInOptions;
import com.google.android.gms.tasks.OnCompleteListener;
import com.google.android.gms.tasks.Task;
import com.google.firebase.auth.AuthCredential;
import com.google.firebase.auth.AuthResult;
import com.google.firebase.auth.FirebaseAuth;
import com.google.firebase.auth.FirebaseUser;
import com.google.firebase.auth.GoogleAuthProvider;
import com.google.firebase.database.FirebaseDatabase;
import com.hms.corrierapp.R;
import com.hms.directory.app.call.models.User;

public class LoginActivity extends AppCompatActivity {

    GoogleSignInClient mGoogleSignInClient;
    int RC_SIGN_IN = 11;
    FirebaseAuth mAuth;
    FirebaseDatabase database;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_login_goole);

        mAuth = FirebaseAuth.getInstance();
        if (mAuth.getCurrentUser() != null) {
            goToNextActivity();
        }
        database = FirebaseDatabase.getInstance();

        GoogleSignInOptions gso = new GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
                .requestIdToken("1016048264402-439a9aamtpiajbgqeqg24qkum2bb7fmh.apps.googleusercontent.com")
                .requestEmail()
                .build();
        mGoogleSignInClient = GoogleSignIn.getClient(this, gso);

        findViewById(R.id.loginBtn).setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                Intent intent = mGoogleSignInClient.getSignInIntent();
                startActivityForResult(intent, RC_SIGN_IN);
            }
        });
    }

    void goToNextActivity() {
        startActivity(new Intent(LoginActivity.this, MainActivity.class));
        finish();
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == RC_SIGN_IN) {
            Task<GoogleSignInAccount> task = GoogleSignIn.getSignedInAccountFromIntent(data);
            GoogleSignInAccount account = task.getResult();
            authWithGoogle(account.getIdToken());
        }
    }

    void authWithGoogle(String idToken) {
        AuthCredential credential = GoogleAuthProvider.getCredential(idToken, null);
        mAuth.signInWithCredential(credential)
                .addOnCompleteListener(new OnCompleteListener<AuthResult>() {
                    @Override
                    public void onComplete(@NonNull Task<AuthResult> task) {
                        if (task.isSuccessful()) {
                            FirebaseUser user = mAuth.getCurrentUser();
                            User firebaseUser = new User(user.getUid(), user.getDisplayName(), user.getPhotoUrl().toString(), "Unknown", 500);
                            // Store the signed-in user's profile in Realtime Database.
                            database.getReference()
                                    .child("profiles")
                                    .child(user.getUid())
                                    .setValue(firebaseUser).addOnCompleteListener(new OnCompleteListener<Void>() {
                                        @Override
                                        public void onComplete(@NonNull Task<Void> task) {
                                            if (task.isSuccessful()) {
                                                startActivity(new Intent(LoginActivity.this, MainActivity.class));
                                                finishAffinity();
                                            } else {
                                                Toast.makeText(LoginActivity.this, task.getException().getLocalizedMessage(), Toast.LENGTH_SHORT).show();
                                            }
                                        }
                                    });
                        } else {
                            Log.e("err", task.getException().getLocalizedMessage());
                        }
                    }
                });
    }
}
CallConnectingActivity.java
Code:
public class ConnectingActivity extends AppCompatActivity {

    ActivityConnectingBinding binding;
    FirebaseAuth auth;
    FirebaseDatabase database;
    boolean isOkay = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        binding = ActivityConnectingBinding.inflate(getLayoutInflater());
        setContentView(binding.getRoot());

        auth = FirebaseAuth.getInstance();
        database = FirebaseDatabase.getInstance();

        String profile = getIntent().getStringExtra("profile");
        Glide.with(this)
                .load(profile)
                .into(binding.profile);

        String username = auth.getUid();

        // Look for a free room (status == 0); join it if one exists,
        // otherwise create a room and wait for a caller.
        database.getReference().child("users")
                .orderByChild("status")
                .equalTo(0).limitToFirst(1)
                .addListenerForSingleValueEvent(new ValueEventListener() {
                    @Override
                    public void onDataChange(@NonNull DataSnapshot snapshot) {
                        if (snapshot.getChildrenCount() > 0) {
                            isOkay = true;
                            // Room available: claim it and open the call screen.
                            for (DataSnapshot childSnap : snapshot.getChildren()) {
                                database.getReference()
                                        .child("users")
                                        .child(childSnap.getKey())
                                        .child("incoming")
                                        .setValue(username);
                                database.getReference()
                                        .child("users")
                                        .child(childSnap.getKey())
                                        .child("status")
                                        .setValue(1);
                                Intent intent = new Intent(ConnectingActivity.this, CallActivity.class);
                                String incoming = childSnap.child("incoming").getValue(String.class);
                                String createdBy = childSnap.child("createdBy").getValue(String.class);
                                boolean isAvailable = childSnap.child("isAvailable").getValue(Boolean.class);
                                intent.putExtra("username", username);
                                intent.putExtra("incoming", incoming);
                                intent.putExtra("createdBy", createdBy);
                                intent.putExtra("isAvailable", isAvailable);
                                startActivity(intent);
                                finish();
                            }
                        } else {
                            // No room available: create one and watch its status.
                            HashMap<String, Object> room = new HashMap<>();
                            room.put("incoming", username);
                            room.put("createdBy", username);
                            room.put("isAvailable", true);
                            room.put("status", 0);
                            database.getReference()
                                    .child("users")
                                    .child(username)
                                    .setValue(room).addOnSuccessListener(new OnSuccessListener<Void>() {
                                        @Override
                                        public void onSuccess(Void unused) {
                                            database.getReference()
                                                    .child("users")
                                                    .child(username).addValueEventListener(new ValueEventListener() {
                                                        @Override
                                                        public void onDataChange(@NonNull DataSnapshot snapshot) {
                                                            if (snapshot.child("status").exists()) {
                                                                if (snapshot.child("status").getValue(Integer.class) == 1) {
                                                                    if (isOkay)
                                                                        return;
                                                                    isOkay = true;
                                                                    Intent intent = new Intent(ConnectingActivity.this, CallActivity.class);
                                                                    String incoming = snapshot.child("incoming").getValue(String.class);
                                                                    String createdBy = snapshot.child("createdBy").getValue(String.class);
                                                                    boolean isAvailable = snapshot.child("isAvailable").getValue(Boolean.class);
                                                                    intent.putExtra("username", username);
                                                                    intent.putExtra("incoming", incoming);
                                                                    intent.putExtra("createdBy", createdBy);
                                                                    intent.putExtra("isAvailable", isAvailable);
                                                                    startActivity(intent);
                                                                    finish();
                                                                }
                                                            }
                                                        }

                                                        @Override
                                                        public void onCancelled(@NonNull DatabaseError error) { }
                                                    });
                                        }
                                    });
                        }
                    }

                    @Override
                    public void onCancelled(@NonNull DatabaseError error) { }
                });
    }
}
XML layout DataBinding
To include data binding in the UI, enclose all the layout content in a <layout></layout> root tag.
The ViewModel is introduced to the layout in the <data></data> section, as shown in the Part 3 snippet. Ensure that the type value points to the package that contains the required ViewModel.
App Build Result
RealTimeDB Result
Room Created For Video Call
RealTime DB Usage
Tips and Tricks
Identity Kit displays the HUAWEI ID registration or sign-in page first. The user can use the functions provided by Identity Kit only after signing in using a registered HUAWEI ID.
Conclusion
In this article, we have learned how to integrate Huawei Identity Kit and Firebase Realtime DB with WebRTC video calling in an Android application. After reading this article, you can easily implement Huawei ID sign-in in the Directory Android application using Kotlin.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Docs:
https://developer.huawei.com/consum.../HMSCore-Guides/introduction-0000001050048870
