Monetize your app with Ads Kit - Huawei Developers

For more information like this, you can visit the HUAWEI Developer Forum.
What if I told you it is possible to generate money even if your app can be downloaded for free? Would you like that? Well, if the answer is yes, let me introduce you to Huawei Ads Kit. In this article I'm going to show you how to place ads in your app.
Note: The Ads service is only available for enterprise accounts; support for individual accounts will be added by the end of August 2020.
Supported Ad Types
Banner ads: Small rectangular ads that occupy a spot within your app's layout.
Native ads: Ads whose data you fetch and render in your own layout. Native ads are designed to fit seamlessly into the surrounding content and match your app design.
Rewarded ads: Full-screen video ads that reward users for watching them (a loading sketch follows this list).
Interstitial ads: Full-screen ads displayed when a user starts, pauses, or exits an app, without disrupting the user's experience (also sketched below).
Splash ads: Ads displayed immediately after an app is launched, even before the home screen of the app is displayed.
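Banner and splash ads are walked through step by step below. For the two full-screen formats, loading follows the same load-then-show pattern; here is a minimal, hedged Kotlin sketch based on the HMS Ads SDK's InterstitialAd and RewardAd classes (the ad_id3 string resource is the per-build-type slot ID set up in the Testing section of this article; during development point it at the dedicated test slot IDs from the Huawei documentation):
Code:
import android.app.Activity
import com.huawei.hms.ads.AdListener
import com.huawei.hms.ads.AdParam
import com.huawei.hms.ads.InterstitialAd
import com.huawei.hms.ads.reward.Reward
import com.huawei.hms.ads.reward.RewardAd
import com.huawei.hms.ads.reward.RewardAdLoadListener
import com.huawei.hms.ads.reward.RewardAdStatusListener

// Minimal sketch: load an interstitial ad and show it once it is ready.
fun loadInterstitialAd(activity: Activity) {
    val interstitialAd = InterstitialAd(activity)
    interstitialAd.adId = activity.getString(R.string.ad_id3) // Test slot ID during development.
    interstitialAd.adListener = object : AdListener() {
        override fun onAdLoaded() {
            if (interstitialAd.isLoaded) {
                interstitialAd.show()
            }
        }
    }
    interstitialAd.loadAd(AdParam.Builder().build())
}

// Minimal sketch: load a rewarded ad and grant the reward when it is earned.
fun loadRewardAd(activity: Activity) {
    val rewardAd = RewardAd(activity, activity.getString(R.string.ad_id3))
    rewardAd.loadAd(AdParam.Builder().build(), object : RewardAdLoadListener() {
        override fun onRewardedLoaded() {
            if (rewardAd.isLoaded) {
                rewardAd.show(activity, object : RewardAdStatusListener() {
                    override fun onRewarded(reward: Reward) {
                        // Grant the in-app reward here.
                    }
                })
            }
        }

        override fun onRewardAdFailedToLoad(errorCode: Int) {
            // Fall back to the normal app flow.
        }
    })
}
In both cases the ad is shown only after the loaded callback fires, so it is never displayed half-initialized.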
Prerequisites
An app project in AppGallery Connect
An Android project with the HMS Core SDK integrated
Integrating the SDK
Add the following lines to your app-level build.gradle file:
Code:
implementation 'com.huawei.hms:ads-lite:13.4.30.307'
implementation 'com.huawei.hms:ads-consent:3.4.30.307'
Initializing the SDK
You can initialize the SDK at app startup or when an activity is created. Add the following to your Application or Activity class:
Code:
override fun onCreate() {
    super.onCreate()
    HwAds.init(this) // Add this line.
}
Adding a Banner Ad
Go to the layout of the fragment/activity where you want to place your ad and add a BannerView. (Declare the hwads namespace on the root element as xmlns:hwads="http://schemas.android.com/apk/res-auto".)
Code:
<com.huawei.hms.ads.banner.BannerView
    android:id="@+id/banner"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    hwads:adId="testw6vs28auh3"
    hwads:bannerSize="BANNER_SIZE_360_57"
    android:layout_gravity="bottom" />
Now go to your Fragment/Activity and load the ad
Code:
fun loadBannerAd() {
    val bannerView = view?.banner
    bannerView?.adId = getString(R.string.ad_id)
    val adParam = AdParam.Builder().build()
    bannerView?.adListener = MyAdListener(requireContext(), "Banner Ad")
    bannerView?.loadAd(adParam)
}
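For example, you might trigger the load once the fragment's view is ready (a minimal sketch):
Code:
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
    super.onViewCreated(view, savedInstanceState)
    loadBannerAd()
}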
Creating an event listener
The event listener lets you know when the ad has been loaded, clicked, or closed, or when loading fails.
Code:
package com.hms.example.dummyapplication.utils

import android.content.Context
import android.widget.Toast
import com.huawei.hms.ads.AdListener

class MyAdListener(private val context: Context, private val adTag: String) : AdListener() {
    override fun onAdClicked() {
    }

    override fun onAdClosed() {
        Toast.makeText(context, "$adTag allows you to enjoy the app for free", Toast.LENGTH_SHORT).show()
    }

    override fun onAdFailed(p0: Int) {
        Toast.makeText(context, "Failed to load $adTag, code: $p0", Toast.LENGTH_SHORT).show()
    }

    override fun onAdImpression() {
    }

    override fun onAdLeave() {
    }

    override fun onAdLoaded() {
        Toast.makeText(context, "$adTag loaded", Toast.LENGTH_SHORT).show()
    }

    override fun onAdOpened() {
    }
}
Adding a splash ad
Create a new empty activity and add the Splash Ad view to the layout
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".activity.AdsActivity">

    <com.huawei.hms.ads.splash.SplashView
        android:id="@+id/splash"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</androidx.constraintlayout.widget.ConstraintLayout>
Go to your activity and load your ad
Code:
fun loadAd() {
    // Create an intent to jump to the next activity.
    val intent = Intent(this, NavDrawer::class.java)
    // Add a listener to jump when the ad finishes or fails to load.
    val splashAdLoadListener: SplashAdLoadListener = object : SplashAdLoadListener() {
        override fun onAdLoaded() {
            // Called when an ad is loaded successfully.
        }

        override fun onAdFailedToLoad(errorCode: Int) {
            // Called when an ad fails to be loaded. The app home screen is then displayed.
            startActivity(intent)
        }

        override fun onAdDismissed() {
            // Called when the display of an ad is complete. The app home screen is then displayed.
            startActivity(intent)
        }
    }
    val splashView = splash // findViewById is not needed with Kotlin synthetics.
    val id = getString(R.string.ad_id2)
    val orientation = ActivityInfo.SCREEN_ORIENTATION_PORTRAIT
    val adParam = AdParam.Builder().build()
    splashView.load(id, orientation, adParam, splashAdLoadListener)
}
Finally, go to your Android manifest and change the launcher activity:
Code:
<activity android:name=".activity.AdsActivity">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>
Testing
You must use the test ad slot ID during development; this avoids invalid ad clicks during testing.
Test ID: "testw6vs28auh3"
An easy way is to create resources for the debug and release build types so you don't have to change the IDs manually for release. You can do this by creating different resource files (strings.xml) for each build type or by modifying your build.gradle file as follows:
Code:
buildTypes {
    release {
        signingConfig signingConfigs.release
        minifyEnabled false
        proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        debuggable false
        resValue "string", "ad_id", "formal ad slot ID"
        resValue "string", "ad_id2", "formal ad slot ID"
        resValue "string", "ad_id3", "formal ad slot ID"
    }
    debug {
        signingConfig signingConfigs.release
        debuggable true
        resValue "string", "ad_id", "testw6vs28auh3"
        resValue "string", "ad_id2", "testw6vs28auh3"
        resValue "string", "ad_id3", "testw6vs28auh3"
    }
}
Conclusion
Ads Kit allows you to monetize the space in your app by displaying ads. You can use different ad types according to your needs and app behavior; rewarded ads, for example, are very useful for games.
Reference
Development guide
Service Description
Check this and other demos: https://github.com/danms07/dummyapp

Related

Scene Kit Features

In this article I will talk about HUAWEI Scene Kit. HUAWEI Scene Kit is a lightweight rendering engine that features high performance and low consumption. It provides advanced descriptive APIs for us to edit, operate, and render 3D materials. Scene Kit adopts physically based rendering (PBR) pipelines to achieve realistic rendering effects. With this Kit, we only need to call some APIs to easily load and display complicated 3D objects on Android phones.
Scene Kit was initially announced with just the SceneView feature, but in Scene Kit SDK version 5.0.2.300 Huawei announced two new features: FaceView and ARView. With these, Scene Kit has made the integration of plane detection and face tracking much easier.
At this stage, the following question may come to mind: "Since there are ML Kit and AR Engine, why should we use Scene Kit?" Let's answer this question with an example.
Differences Between Scene Kit and AR Engine or ML Kit:
For example, suppose we have a shopping application, and assume it has a feature in the glasses purchasing section that lets the user test glasses using AR to see how they look in real life. Here, we do not need to track facial gestures using the facial expression tracking feature provided by AR Engine. All we have to do is render a 3D object over the user's eyes; face tracking is enough for this. So if we used AR Engine directly, we would have to deal with graphics libraries like OpenGL. But by using the Scene Kit FaceView, we can easily add this feature to our application without dealing with any graphics library, because this is a basic feature and Scene Kit provides it to us.
What distinguishes AR Engine and ML Kit from Scene Kit is that they provide more detailed controls, whereas Scene Kit only provides the basic features (I'll talk about these features later). For this reason, its integration is much simpler.
Let's examine what these features provide us.
SceneView:
With SceneView, we are able to load and render 3D materials in common scenes.
It allows us to:
Load and render 3D materials.
Load the cubemap texture of a skybox to make the scene look larger and more impressive than it actually is.
Load lighting maps to mimic real-world lighting conditions through PBR pipelines.
Swipe on the screen to view rendered materials from different angles.
ARView:
ARView uses the plane detection capability of AR Engine, together with the graphics rendering capability of Scene Kit, to provide us with the capability of loading and rendering 3D materials in common AR scenes.
With ARView, we can:
Load and render 3D materials in AR scenes.
Set whether to display the lattice plane (consisting of white lattice points) to help select a plane in a real-world view.
Tap an object placed onto the lattice plane to select it. Once selected, the object will change to red. Then we can move, resize, or rotate it.
FaceView:
FaceView can use the face detection capability provided by ML Kit or AR Engine to dynamically detect faces. Along with the graphics rendering capability of Scene Kit, FaceView provides us with superb AR scene rendering dedicated for faces.
With FaceView we can:
Dynamically detect faces and apply 3D materials to the detected faces.
As I mentioned above ARView uses the plane detection capability of AR Engine and the FaceView uses the face detection capability provided by either ML Kit or AR Engine. When using the FaceView feature, we can use the SDK we want by specifying which SDK to use in the layout.
Here, we should consider the devices to be supported when choosing the SDK. You can see the supported devices on the linked page; for more detailed information, you can visit this page. (In addition to the table on that page, the Scene Kit SceneView feature also supports P40 Lite devices.)
Also, I think it is useful to mention some important working principles of Scene Kit:
Scene Kit
Provides a Full-SDK, which we can integrate into our app to access 3D graphics rendering capabilities, even though our app runs on phones without HMS Core.
Uses the Entity Component System (ECS) to reduce coupling and implement multi-threaded parallel rendering.
Adopts real-time PBR pipelines to make rendered images look like the real world.
Supports the general-purpose GPU Turbo to significantly reduce power consumption.
Demo App
Let’s learn in more detail by integrating these 3 features of the Scene Kit with a demo application that we will develop in this section.
To configure the Maven repository address for the HMS Core SDK, add the line below to the project-level build.gradle file under both:
project-level build.gradle > buildscript > repositories
project-level build.gradle > allprojects > repositories
Code:
maven { url 'https://developer.huawei.com/repo/' }
After that, go to
module-level build.gradle > dependencies
and add the build dependency for the Full-SDK of Scene Kit in the dependencies block.
Code:
implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
Note: When adding build dependencies, replace the version here “full-sdk: 5.0.2.302” with the latest Full-SDK version. You can find all the SDK and Full-SDK version numbers in Version Change History.
Then click Sync Now.
After the build completes successfully, add the following line to the AndroidManifest.xml file for the camera permission.
Code:
<uses-permission android:name="android.permission.CAMERA" />
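Note that on Android 6.0 and later the camera permission must also be granted at runtime; the AR Engine try-on article further down shows the same pattern in Java. A minimal Kotlin sketch (the request code is an arbitrary value of your choosing):
Code:
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val CAMERA_REQUEST_CODE = 1 // Arbitrary request code (assumption).

fun ensureCameraPermission(activity: Activity) {
    // Ask for the camera permission at runtime if it has not been granted yet.
    if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(activity, arrayOf(Manifest.permission.CAMERA), CAMERA_REQUEST_CODE)
    }
}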
Now our project is ready for development, and we can use all the functionalities of Scene Kit.
Let's say this demo app is a shopping app and I want to use Scene Kit features in it. We'll use the Scene Kit ARView feature in the "office" section of our application to test how a plant and an aquarium look on our desk.
And in the sunglasses section, we'll use the FaceView feature to test how sunglasses look on our face.
Finally, we will use the SceneView feature in the shoes section of our application to test how a shoe looks.
We will need materials to test these features, so let's get them first. I will use 3D models that you can download from the links below. You can use the same or different materials if you want.
Capability: ARView, Used Models: Plant , Aquarium
Capability: FaceView, Used Model: Sunglasses
Capability: SceneView, Used Model: Shoe
Note: I used 3D models in ".glb" format as assets for the ARView and FaceView features. However, the links above contain 3D models in ".gltf" format, which I converted to ".glb". You can obtain a ".glb" model by uploading all the files of a downloaded model (textures, scene.bin, and scene.gltf) to any online converter website.
All materials must be stored in the assets directory, so we place them under app > src > main > assets in our project. After placing them, our file structure will be as follows.
After adding the materials, we will start by adding the ARView feature first. Since we assume that there are office supplies in the activity where we will use the ARView feature, let’s create an activity named OfficeActivity and first develop its layout.
Note: Activities must extend the Activity class, not AppCompatActivity. Update any activity that extends AppCompatActivity so that it extends Activity instead.
Example: it should be "OfficeActivity extends Activity".
ARView
In order to use the ARView feature of the Scene Kit, we add the following ARView code to the layout (activity_office.xml file).
Code:
<com.huawei.hms.scene.sdk.ARView
    android:id="@+id/ar_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent">
</com.huawei.hms.scene.sdk.ARView>
Overview of the activity_office.xml file:
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:gravity="bottom"
    tools:context=".OfficeActivity">

    <com.huawei.hms.scene.sdk.ARView
        android:id="@+id/ar_view"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

    <LinearLayout
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_alignParentBottom="true"
        android:layout_centerInParent="true"
        android:layout_centerHorizontal="true"
        android:layout_centerVertical="true"
        android:gravity="bottom"
        android:layout_marginBottom="30dp"
        android:orientation="horizontal">

        <Button
            android:id="@+id/button_flower"
            android:layout_width="110dp"
            android:layout_height="wrap_content"
            android:onClick="onButtonFlowerToggleClicked"
            android:text="Load Flower" />

        <Button
            android:id="@+id/button_aquarium"
            android:layout_width="110dp"
            android:layout_height="wrap_content"
            android:onClick="onButtonAquariumToggleClicked"
            android:text="Load Aquarium" />
    </LinearLayout>
</RelativeLayout>
We specified two buttons, one for loading a plant and the other for loading the aquarium. Now let's do the initialization in OfficeActivity and activate the ARView feature in our application. First, let's override the onCreate() function to obtain the ARView and the buttons that trigger object loading.
Code:
private ARView mARView;
private Button mButtonFlower;
private boolean isLoadFlowerResource = false;
private boolean isLoadAquariumResource = false;
private Button mButtonAquarium;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_office);
    mARView = findViewById(R.id.ar_view);
    mButtonFlower = findViewById(R.id.button_flower);
    mButtonAquarium = findViewById(R.id.button_aquarium);
    Toast.makeText(this, "Please move the mobile phone slowly to find the plane", Toast.LENGTH_LONG).show();
}
Then add the methods that will be triggered when the buttons are clicked. Here we check the loading state of the object and clear or load it accordingly.
For plant button:
Code:
public void onButtonFlowerToggleClicked(View view) {
    mARView.enablePlaneDisplay(true);
    if (!isLoadFlowerResource) {
        // Load 3D model.
        mARView.loadAsset("ARView/flower.glb");
        float[] scale = new float[] { 0.15f, 0.15f, 0.15f };
        float[] rotation = new float[] { 0.707f, 0.0f, -0.500f, 0.0f };
        // (Optional) Set the initial status.
        mARView.setInitialPose(scale, rotation);
        isLoadFlowerResource = true;
        mButtonFlower.setText("Clear Flower");
    } else {
        // Clear the resources loaded in the ARView.
        mARView.clearResource();
        mARView.loadAsset("");
        isLoadFlowerResource = false;
        mButtonFlower.setText("Load Flower");
    }
}
For the aquarium button:
Code:
public void onButtonAquariumToggleClicked(View view) {
    mARView.enablePlaneDisplay(true);
    if (!isLoadAquariumResource) {
        // Load 3D model.
        mARView.loadAsset("ARView/aquarium.glb");
        float[] scale = new float[] { 0.015f, 0.015f, 0.015f };
        float[] rotation = new float[] { 0.0f, 0.0f, 0.0f, 0.0f };
        // (Optional) Set the initial status.
        mARView.setInitialPose(scale, rotation);
        isLoadAquariumResource = true;
        mButtonAquarium.setText("Clear Aquarium");
    } else {
        // Clear the resources loaded in the ARView.
        mARView.clearResource();
        mARView.loadAsset("");
        isLoadAquariumResource = false;
        mButtonAquarium.setText("Load Aquarium");
    }
}
Now let's go through the code line by line. First, we call ARView.enablePlaneDisplay(true); when a plane is detected in the real world, the program will display a lattice plane there.
Code:
mARView.enablePlaneDisplay(true);
Then we check whether the object has been loaded or not. If it is not loaded, we specify the path to the 3D model we selected with the mARView.loadAsset() function and load it. (assets> ARView> flower.glb)
Code:
mARView.loadAsset("ARView/flower.glb");
Then we create and initialize the scale and rotation arrays for the starting position. For now, we are entering hardcoded values; in future versions we could let the user set a starting position, for example by holding the screen.
Note: The Scene Kit ARView feature already allows us to move, resize, and change the direction of the object we have created on the screen. For this, we select the created object and move our finger on the screen to change its position, size, or direction.
Here we can adjust the direction or size of the object by adjusting the rotation and scale values. (These values are used as parameters of the setInitialPose() function.)
Note: These values can change according to the model used. To find appropriate values, you should experiment yourself. For details, see the documentation of the ARView setInitialPose() function.
Code:
float[] scale = new float[] { 0.15f, 0.15f, 0.15f };
float[] rotation = new float[] { 0.707f, 0.0f, -0.500f, 0.0f };
Then we set the scale and rotation values we created as the starting position.
Code:
mARView.setInitialPose(scale, rotation);
After this process, we set the boolean value to indicate that the object has been created and we update the text of the button.
Code:
isLoadFlowerResource = true;
mButtonFlower.setText("Clear Flower");
If the object is already loaded, we clear the resource and load the empty object so that we remove the object from the screen.
Code:
mARView.clearResource();
mARView.loadAsset("");
Then we reset the boolean value and finish by updating the button text.
Code:
isLoadFlowerResource = false;
mButtonFlower.setText("Load Flower");
Finally, we should not forget to override the following methods as in the code to ensure synchronization.
Code:
@Override
protected void onPause() {
    super.onPause();
    mARView.onPause();
}

@Override
protected void onResume() {
    super.onResume();
    mARView.onResume();
}

@Override
protected void onDestroy() {
    super.onDestroy();
    mARView.destroy();
}
The overview of OfficeActivity.java should be as follows.
Code:
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;

import com.huawei.hms.scene.sdk.ARView;

public class OfficeActivity extends Activity {
    private ARView mARView;
    private Button mButtonFlower;
    private boolean isLoadFlowerResource = false;
    private boolean isLoadAquariumResource = false;
    private Button mButtonAquarium;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_office);
        mARView = findViewById(R.id.ar_view);
        mButtonFlower = findViewById(R.id.button_flower);
        mButtonAquarium = findViewById(R.id.button_aquarium);
        Toast.makeText(this, "Please move the mobile phone slowly to find the plane", Toast.LENGTH_LONG).show();
    }

    /**
     * Synchronously call the onPause() method of the ARView.
     */
    @Override
    protected void onPause() {
        super.onPause();
        mARView.onPause();
    }

    /**
     * Synchronously call the onResume() method of the ARView.
     */
    @Override
    protected void onResume() {
        super.onResume();
        mARView.onResume();
    }

    /**
     * If quick rebuilding is allowed for the current activity, destroy() of ARView must be invoked synchronously.
     */
    @Override
    protected void onDestroy() {
        super.onDestroy();
        mARView.destroy();
    }

    public void onButtonFlowerToggleClicked(View view) {
        mARView.enablePlaneDisplay(true);
        if (!isLoadFlowerResource) {
            // Load 3D model.
            mARView.loadAsset("ARView/flower.glb");
            float[] scale = new float[] { 0.15f, 0.15f, 0.15f };
            float[] rotation = new float[] { 0.707f, 0.0f, -0.500f, 0.0f };
            // (Optional) Set the initial status.
            mARView.setInitialPose(scale, rotation);
            isLoadFlowerResource = true;
            mButtonFlower.setText("Clear Flower");
        } else {
            // Clear the resources loaded in the ARView.
            mARView.clearResource();
            mARView.loadAsset("");
            isLoadFlowerResource = false;
            mButtonFlower.setText("Load Flower");
        }
    }

    public void onButtonAquariumToggleClicked(View view) {
        mARView.enablePlaneDisplay(true);
        if (!isLoadAquariumResource) {
            // Load 3D model.
            mARView.loadAsset("ARView/aquarium.glb");
            float[] scale = new float[] { 0.015f, 0.015f, 0.015f };
            float[] rotation = new float[] { 0.0f, 0.0f, 0.0f, 0.0f };
            // (Optional) Set the initial status.
            mARView.setInitialPose(scale, rotation);
            isLoadAquariumResource = true;
            mButtonAquarium.setText("Clear Aquarium");
        } else {
            // Clear the resources loaded in the ARView.
            mARView.clearResource();
            mARView.loadAsset("");
            isLoadAquariumResource = false;
            mButtonAquarium.setText("Load Aquarium");
        }
    }
}
In this way, we added the ARView feature of Scene Kit to our application. We can now use the ARView feature. Now let’s test the ARView part on a device that supports the Scene Kit ARView feature.
Let’s place plants and aquariums on our table as below and see how it looks.
In order for ARView to recognize the ground, you first need to move the camera slowly until the plane points appear on the screen. Once the plane points appear on the ground, we indicate that we want to add a plant by clicking the Load Flower button, and then add it by tapping the point on the screen where we want it. Doing the same with the Load Aquarium button adds an aquarium.
I placed an aquarium and plants on my table. You can test how it looks by placing plants or aquariums on your table or anywhere. You can see how it looks in the photo below.
Note: “Clear Flower” and “Clear Aquarium” buttons will remove the objects we have placed on the screen.
After creating the objects, we select the object we want to move, change its size or direction as you can see in the picture below. Under normal conditions, the color of the selected object will turn into red. (The color of some models doesn’t change. For example, when the aquarium model is selected, the color of the model doesn’t change to red.)
If we want to change the size of the object after selecting it, we can zoom in and out using two fingers. In the picture above you can see that I changed the plants' sizes. We can also move the selected object by dragging it, and to change its direction, we can move two fingers in a circular motion.
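The FaceView and SceneView parts of the demo are covered in the post linked below. As a taste of the SceneView (shoes) section, loading a model only takes a few calls in a SceneView subclass; here is a minimal Kotlin sketch (the shoe asset path is a placeholder; the same calls appear in Java in the virtual try-on article further down):
Code:
import android.content.Context
import android.util.AttributeSet
import android.view.SurfaceHolder
import com.huawei.hms.scene.sdk.SceneView

class ShoeSceneView(context: Context, attributeSet: AttributeSet) : SceneView(context, attributeSet) {
    override fun surfaceCreated(holder: SurfaceHolder) {
        super.surfaceCreated(holder)
        // Load the 3D model to render (placeholder path under assets/SceneView/).
        loadScene("SceneView/shoe.glb")
        // Optionally load the skybox and lighting maps (DDS cubemap files).
        loadSkyBox("SceneView/skyboxTexture.dds")
        loadSpecularEnvTexture("SceneView/specularEnvTexture.dds")
        loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds")
    }
}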
For more information, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201380649772650451&fid=0101187876626530001
Is Scene Kit free of charge?

Intermediate: How to integrate the Huawei Awareness Kit Barrier API with Local Notification into a fitness app (Flutter)

Introduction
In this article, we will learn how to implement Huawei Awareness Kit features with the Local Notification service to send a notification when a certain condition is met. We can create our own conditions to be met, observe them, and notify the user even when the application is not running.
The Awareness Kit is quite a comprehensive detection and event SDK. If you're developing a context-aware app for Huawei devices, this is definitely the library for you.
        
What can we do using Awareness kit?
With HUAWEI Awareness Kit, you can obtain a lot of contextual information about the user's location, behavior, weather, current time, device status, ambient light, and audio device status, making it easier to provide a more refined user experience.
Basic Usage
There are quite a few awareness "modules" in this SDK: Time Awareness, Location Awareness, Behavior Awareness, Beacon Awareness, Audio Device Status Awareness, Ambient Light Awareness, and Weather Awareness. Read on to find out how and when to use them.
Each of these modules has two modes: capture, which is an on-demand information retrieval; and barrier, which triggers an action when a specified condition is met.
Use case
The Barrier API allows us to set barriers for specific conditions in our app; when these conditions are satisfied, our app is notified, so we can take action. In this sample, when the user starts an activity, the app shows the notification "please connect the headset to listen to music".
Table of content
1. Project setup
2. Headset capture API
3. Headset Barrier API
4. Flutter Local notification plugin
Advantages
1. Converged: Multi-dimensional and evolvable awareness capabilities can be called in a unified manner.
2. Accurate: The synergy of hardware and software makes data acquisition more accurate and efficient.
3. Fast: On-chip processing of local service requests and nearby access of cloud services promise a faster service response.
4. Economical: Sharing awareness capabilities avoids separate interactions between apps and the device, reducing system resource consumption. Collaborating with the EMUI (or Magic UI) and Kirin chip, Awareness Kit can even achieve optimal performance with the lowest power consumption.
Requirements
1. Any operating system (e.g., macOS, Linux, or Windows).
2. Any IDE with the Flutter SDK installed (e.g., IntelliJ, Android Studio, or VS Code).
3. A little knowledge of Dart and Flutter.
4. A brain to think with.
5. Minimum API level 24 is required.
6. Devices running EMUI 5.0 or later are required.
Setting up the Awareness kit
1. First, create a developer account in AppGallery Connect. After creating your developer account, you can create a new project and a new app. For more information, check this link.
2. Generate a signing certificate fingerprint by running the command below.
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks -storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
3. The above command creates the keystore file in appdir/android/app.
4. Now we need to obtain the SHA-256 key by running the following command.
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
5. Enable Awareness Kit in the Manage APIs section and add the plugin.
6. After configuring the project, download the agconnect-services.json file and add it to your project (under the android/app directory).
7. After that, follow the URL for cross-platform plugins and download the required plugins.
  
8. The following dependencies for HMS usage need to be added to the build.gradle file under the android directory.
Code:
buildscript {
    ext.kotlin_version = '1.3.50'
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.5.0'
        classpath 'com.huawei.agconnect:agcp:1.4.1.300'
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
9. Then add the following line of code to the build.gradle file under the android/app directory.
Code:
apply plugin: 'com.huawei.agconnect'
10. Add the required permissions to the AndroidManifest.xml file under app>src>main folder.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.INTERNET" />
11. After completing all the above steps, you need to add the required kits’ Flutter plugins as dependencies to pubspec.yaml file. You can find all the plugins in pub.dev with the latest versions.
Code:
huawei_awareness:
  path: ../huawei_awareness/
12. To display local notification, we need to add flutter local notification plugin.
Code:
flutter_local_notifications: ^3.0.1+6
After adding them, run the flutter pub get command. Now all the plugins are ready to use.
Note: Set multiDexEnabled to true in the android/app build.gradle file so that the app will not crash, for example as shown below.
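A sketch of the android/app/build.gradle change:
Code:
android {
    defaultConfig {
        // Avoids the 64K method limit crash when several plugins are included.
        multiDexEnabled true
    }
}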
Code Implementation
Use Capture and Barrier API to get Headset status
This service helps your application remind the user, before starting an activity, to connect a headset to play music.
We need to request the runtime permissions before calling any service.
Code:
@override
void initState() {
  checkPermissions();
  requestPermissions();
  super.initState();
}

void checkPermissions() async {
  locationPermission = await AwarenessUtilsClient.hasLocationPermission();
  backgroundLocationPermission =
      await AwarenessUtilsClient.hasBackgroundLocationPermission();
  activityRecognitionPermission =
      await AwarenessUtilsClient.hasActivityRecognitionPermission();
  if (locationPermission &&
      backgroundLocationPermission &&
      activityRecognitionPermission) {
    setState(() {
      permissions = true;
      notifyAwareness();
    });
  }
}

void requestPermissions() async {
  if (locationPermission == false) {
    bool status = await AwarenessUtilsClient.requestLocationPermission();
    setState(() {
      locationPermission = status;
    });
  }
  if (backgroundLocationPermission == false) {
    bool status =
        await AwarenessUtilsClient.requestBackgroundLocationPermission();
    setState(() {
      backgroundLocationPermission = status;
    });
  }
  if (activityRecognitionPermission == false) {
    bool status =
        await AwarenessUtilsClient.requestActivityRecognitionPermission();
    setState(() {
      activityRecognitionPermission = status;
    });
    checkPermissions();
  }
}
Capture API: Once all the permissions are granted, if we want to check the headset status at app launch we call the Capture API using getHeadsetStatus(); this service is called only once, on demand.
Code:
checkHeadsetStatus() async {
  HeadsetResponse response = await AwarenessCaptureClient.getHeadsetStatus();
  setState(() {
    switch (response.headsetStatus) {
      case HeadsetStatus.Disconnected:
        _showNotification(
            "Music", "Please connect a headset before starting the activity");
        break;
    }
  });
  log(response.toJson(), name: "captureHeadset");
}
Barrier API: If you want to set conditions in your app, use the Barrier API. This service keeps listening for events; once the conditions are satisfied, it automatically notifies the user. For example, we set a condition for playing music once the activity starts: when the user connects the headset, the app automatically notifies the user that the headset is connected and music is playing.
First we need to set the barrier condition; the barrier is triggered when the condition is satisfied.
Code:
String headSetBarrier = "HeadSet";
AwarenessBarrier headsetBarrier = HeadsetBarrier.keeping(
    barrierLabel: headSetBarrier, headsetStatus: HeadsetStatus.Connected);
Add the barrier using updateBarriers(); this method returns whether the barrier was added or not.
Code:
bool status = await AwarenessBarrierClient.updateBarriers(barrier: headsetBarrier);
If status is true, the barrier was added successfully. Now we declare a StreamSubscription to listen for events; it keeps receiving updates and triggers whenever the condition is satisfied.
Code:
if (status) {
  log("Headset Barrier added.");
  StreamSubscription<dynamic> subscription;
  subscription = AwarenessBarrierClient.onBarrierStatusStream.listen((event) {
    if (mounted) {
      setState(() {
        switch (event.presentStatus) {
          case HeadsetStatus.Connected:
            _showNotification("Cool HeadSet", "Headset connected, want to listen to some music?");
            isPlaying = true;
            print("Headset Status: Connected");
            break;
          case HeadsetStatus.Disconnected:
            _showNotification("HeadSet", "Headset disconnected, your music stopped");
            print("Headset Status: Disconnected");
            isPlaying = false;
            break;
          case HeadsetStatus.Unknown:
            _showNotification("HeadSet", "Headset status unknown");
            print("Headset Status: Unknown");
            isPlaying = false;
            break;
        }
      });
    }
  }, onError: (error) {
    log(error.toString());
  });
} else {
  log("Headset Barrier not added.");
}
Why do we need local push notifications?
1. We can schedule notifications.
2. No web request is required to display a local notification.
3. There is no limit of notifications per user.
4. They originate from the same device and are displayed on the same device.
5. They alert the user or remind the user to perform some task.
The flutter_local_notifications package provides the required functionality. Using this package, we can integrate local notifications in both Android and iOS apps.
Add the following permissions to enable scheduled notifications.
Code:
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
<uses-permission android:name="android.permission.VIBRATE" />
Add the following inside the <application> tag.
Code:
<receiver android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationBootReceiver">
    <intent-filter>
        <action android:name="android.intent.action.BOOT_COMPLETED" />
    </intent-filter>
</receiver>
<receiver android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationReceiver" />
Let's create the Flutter local notification object.
Code:
FlutterLocalNotificationsPlugin localNotification = FlutterLocalNotificationsPlugin();

@override
void initState() {
  super.initState();
  var initializationSettingsAndroid =
      AndroidInitializationSettings('ic_launcher');
  var initializationSettingsIOs = IOSInitializationSettings();
  var initSettings = InitializationSettings(
      android: initializationSettingsAndroid, iOS: initializationSettingsIOs);
  localNotification.initialize(initSettings);
}
Now create the logic to display a simple notification.
Code:
Future _showNotification(String title, String description) async {
  var androidDetails = AndroidNotificationDetails(
      "channelId", "channelName", "content",
      importance: Importance.high);
  var iosDetails = IOSNotificationDetails();
  var generateNotification =
      NotificationDetails(android: androidDetails, iOS: iosDetails);
  await localNotification.show(0, title, description, generateNotification);
}
Demo
            
Tips and Tricks
1. Download latest HMS Flutter plugin.
2. Set minSDK version to 24 or later.
3. HMS Core APK 4.0.2.300 is required.
4. Currently this plugin does not support background tasks.
5. Do not forget to click Pub get after adding dependencies.
Conclusion
In this article, we have learned how to use Barrier API of Huawei Awareness Kit with a Local Notification to observe the changes in environmental factors even when the application is not running.
As you may notice, the permanent notification indicating that the application is running in the background is not dismissible by the user which can be annoying.
Based on requirement we can utilize different APIs, Huawei Awareness Kit has many other great features that we can use with foreground services in our applications.
Thanks for reading! If you enjoyed this story, please click the Like button and Follow. Feel free to leave a Comment below.
Reference
Awareness Kit URL.
Awareness Capture API Article URL.
Original Source

AR Engine and Scene Kit Deliver Virtual Try-On of Glasses

Background
The ubiquity of the Internet and smart devices has made e-commerce the preferred choice for countless consumers. However, many longtime users have grown wary of the stagnant shopping model, and thus enhancing user experience is critical to stimulating further growth in e-commerce and attracting a broader user base. HMS Core offers intelligent graphics processing capabilities to identify a user's facial and physical features, which when combined with a new display paradigm, enables users to try on products virtually through their mobile phones, for a groundbreaking digital shopping experience.
Scenarios
AR Engine and Scene Kit allow users to virtually try on products found on shopping apps and shopping list sharing apps, which in turn will lead to greater customer satisfaction and fewer returns and replacements.
Effects
A user opens a shopping app, then taps a product's picture to view the 3D model of the product, which they can rotate, enlarge, and shrink for interactive viewing.
Getting Started
Configuring the Maven Repository Address for the HMS Core SDK
Open the project-level build.gradle file in your Android Studio project. Go to buildscript > repositories and allprojects > repositories to configure the Maven repository address for the HMS Core SDK.
Code:
buildscript {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        ...
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
Adding Build Dependencies for the HMS Core SDK
Open the app-level build.gradle file of your project and add the build dependencies in the dependencies block: the Full-SDK of Scene Kit and the AR Engine SDK.
Code:
dependencies {
    ....
    implementation 'com.huawei.scenekit:full-sdk:5.0.2.302'
    implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
}
For details about the preceding steps, please refer to the development guide for Scene Kit on HUAWEI Developers.
Adding Permissions in the AndroidManifest.xml File
Open the AndroidManifest.xml file in the main directory and add the camera permission above the <application> line.
Code:
<!--Camera permission-->
<uses-permission android:name="android.permission.CAMERA" />
Development Procedure
Configuring MainActivity
Add two buttons to the layout configuration file of MainActivity. Set the background of the onBtnShowProduct button to the preview image of the product and add the text "Try it on!" to the onBtnTryProductOn button to guide the user to the feature.
Code:
<Button
    android:layout_width="260dp"
    android:layout_height="160dp"
    android:background="@drawable/sunglasses"
    android:onClick="onBtnShowProduct" />

<Button
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:text="Try it on!"
    android:textAllCaps="false"
    android:textSize="24sp"
    android:onClick="onBtnTryProductOn" />
If the user taps the onBtnShowProduct button, the 3D model of the product will be loaded. After tapping the onBtnTryProductOn button, the user will enter the AR fitting screen.
Configuring the 3D Model Display for a Product
1. Create a SceneSampleView inherited from SceneView.
Code:
public class SceneSampleView extends SceneView {
    public SceneSampleView(Context context) {
        super(context);
    }

    public SceneSampleView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}
Override the surfaceCreated method to create and initialize SceneView. Then call loadScene to load the materials, which should be in the glTF or GLB format, to have them rendered and displayed. Call loadSkyBox to load skybox materials, loadSpecularEnvTexture to load specular maps, and loadDiffuseEnvTexture to load diffuse maps. These files should be in the DDS (cubemap) format.
All loaded materials are stored in the src > main > assets > SceneView folder.
Code:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    // Load the materials to be rendered.
    loadScene("SceneView/sunglasses.glb");
    // Call loadSkyBox to load skybox texture materials.
    loadSkyBox("SceneView/skyboxTexture.dds");
    // Call loadSpecularEnvTexture to load specular texture materials.
    loadSpecularEnvTexture("SceneView/specularEnvTexture.dds");
    // Call loadDiffuseEnvTexture to load diffuse texture materials.
    loadDiffuseEnvTexture("SceneView/diffuseEnvTexture.dds");
}
2. Create a SceneViewActivity inherited from Activity.
In the onCreate method, call setContentView, passing it the layout file in which the SceneSampleView you created is declared via its XML tag.
Code:
public class SceneViewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sample);
    }
}
Create SceneSampleView in the layout file as follows:
Code:
<com.huawei.scene.demo.sceneview.SceneSampleView
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
3. Create an onBtnShowProduct in MainActivity.
When the user taps the onBtnShowProduct button, SceneViewActivity is called to load, render, and finally display the 3D model of the product.
Code:
public void onBtnShowProduct(View view) {
    startActivity(new Intent(this, SceneViewActivity.class));
}
Configuring AR Fitting for a Product
Product virtual try-on is easily accessible, thanks to the facial recognition, graphics rendering, and AR display capabilities offered by HMS Core.
1. Create a FaceViewActivity inherited from Activity, and create the corresponding layout file.
Create face_view in the layout file to display the try-on effect.
Code:
<com.huawei.hms.scene.sdk.FaceView
    android:id="@+id/face_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    app:sdk_type="AR_ENGINE" />
Create a switch. When the user taps it, they can check the difference between the appearances with and without the virtual glasses.
Code:
<Switch
    android:id="@+id/switch_view"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_alignParentTop="true"
    android:layout_marginTop="15dp"
    android:layout_alignParentEnd="true"
    android:layout_marginEnd="15dp"
    android:text="Try it on"
    android:theme="@style/AppTheme"
    tools:ignore="RelativeOverlap" />
2. Override the onCreate method in FaceViewActivity to obtain FaceView.
Code:
public class FaceViewActivity extends Activity {
    private FaceView mFaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_face_view);
        mFaceView = findViewById(R.id.face_view);
    }
}
3. Create a listener method for the switch. When the switch is enabled, the loadAsset method is called to load the 3D model of the product. Set the position for facial recognition in LandmarkType.
Code:
mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
    @Override
    public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
        mFaceView.clearResource();
        if (isChecked) {
            // Load materials.
            int index = mFaceView.loadAsset("FaceView/sunglasses.glb", LandmarkType.TIP_OF_NOSE);
        }
    }
});
Use setInitialPose to adjust the size and position of the model. Create the position, rotation, and scale arrays and pass values to them.
Code:
final float[] position = { 0.0f, 0.0f, -0.15f };
final float[] rotation = { 0.0f, 0.0f, 0.0f, 0.0f };
final float[] scale = { 2.0f, 2.0f, 0.3f };
Put the following code below the loadAsset line:
Code:
mFaceView.setInitialPose(index, position, scale, rotation);
4. Create an onBtnTryProductOn in MainActivity. When the user taps the onBtnTryProductOn button, the FaceViewActivity is called, enabling the user to view the try-on effect.
Code:
public void onBtnTryProductOn(View view) {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                this, new String[]{ Manifest.permission.CAMERA }, FACE_VIEW_REQUEST_CODE);
    } else {
        startActivity(new Intent(this, FaceViewActivity.class));
    }
}
References
For more details, you can go to:
AR Engine official website; Scene Kit official website
Reddit to join our developer discussion
GitHub to download sample codes
Stack Overflow to solve any integration problems
Does it support Quick App?
Hello sir,
from where i can get the asset folder for above given code(virtual try-on glasses)
Mamta23 said:
Hello sir,
from where i can get the asset folder for above given code(virtual try-on glasses)
Hi,
You can get the demo here: https://github.com/HMS-Core/hms-AREngine-demo
Hope this helps!

Pygmy Collection Application Part 2 (Ads Kit)

Introduction
In my last article, I explained how to integrate Account Kit into a finance application. Have a look at Pygmy Collection Application Part 1 (Account Kit).
In this article, we will learn how to integrate splash ads from Ads Kit into the Pygmy collection finance application.
First we will understand why we need Ads Kit.
Every company makes or builds some kind of product. Building or developing it is not the hard part; marketing it well is, and that is what earns the money.
Traditional marketing
1. Putting Banners in City.
2. Advertisement in Radio or TV or newspaper.
3. Painting on the wall.
4. Meeting distributors with sample of product.
Now let's understand the drawbacks of each traditional marketing method.
1. Putting Banners in City
In one city there are many localities, sub-localities, streets, areas, main roads, service roads, and so on. In how many places will you put banners? For one city alone you need a lot of banners, and when it comes to multiple cities, or the whole globe, you would need marketing people across all of them. Also think about cost: as an organization you want profit with less investment, but with banner advertising you have to spend a lot of money on marketing alone.
2. Advertisement in Radio or TV or newspaper
Now consider radio, TV, and newspaper advertisements. Say one home has 5 people. How many TVs will there be?
At most 1 or 2.
What about phones?
It's a mobile world: 5 members means 5 phones, sometimes more, because some people have 2 or 3 phones.
There are thousands of channels. If you want to advertise, on how many channels will you do it: 1, 2, at most 10? Do you think everyone watches only the channels where you placed your advertisement?
The same goes for radio and newspapers. Nowadays, who listens to the radio? Everyone is moving towards social media, and people read the news in mobile applications rather than buying a newspaper, partly because they consider paper wasteful and care about the environment.
3. Painting on the wall.
How many walls will you paint? Just think about the money and time, and as said earlier, multiply that across multiple cities and streets.
4. Meeting distributors with sample of product.
Do you think meeting distributors with a sample product will work out? Probably not, because not every marketing person has the same marketing knowledge. On top of that, you have to train them about the product, and even after training they may miss key points while explaining it to distributors. If distributors are not convinced by the explanation, they will say no to your product straight away.
Nowadays, traditional marketing has given way to digital marketing. Advertisers prefer to place their ads via mobile media rather than printed publications or large billboards. In this way, they can reach their target audience more easily and measure their efficiency by analysing parameters such as ad impressions and the number of clicks. In addition to in-app purchases, the most common method used by mobile developers to generate revenue from their application is to create advertising spaces for advertisers.
In this sense, Huawei Ads meets the needs of both advertisers and mobile developers. So what is this HMS Ads Kit? Let's take a closer look.
Now let us understand Huawei Ads.
Ads Kit leverages the vast user base of Huawei devices and Huawei's extensive data capabilities to provide you with the Publisher Service, helping you to monetize traffic. Meanwhile, it provides the advertising service for advertisers to deliver personalized campaigns or commercial ads to Huawei device users.
The video on this page introduces traffic monetization through Ads Kit, advantages of HUAWEI Ads Publisher Service, and the process for advertisers to display ads.
You can click here to watch the MOOC video about Ads Kit.
Types of Huawei Ads
Banner Ads.
Native Ads.
Rewarded Ads.
Interstitial Ads.
Splash Ads.
Roll Ads.
Banner Ads.
Banner ads are rectangular images that occupy a spot within an app's layout, either at the top, middle, or bottom of the device screen. Banner ads refresh automatically at regular intervals. When a user clicks a banner ad, the user is redirected to the advertiser's page.
Native Ads.
Native ads can be images, text, or videos, which are less disruptive and fit seamlessly into the surrounding content to match your app design. You can customize native ads as needed.
Rewarded Ads.
Rewarded ads are full-screen video ads that allow users to view in exchange for in-app rewards.
Interstitial Ads.
Interstitial ads are full-screen ads that cover the interface of an app. Such an ad is displayed when a user starts, pauses, or exits an app, without disrupting the user's experience.
Splash Ads.
Splash ads are displayed immediately after an app is launched, even before the home screen of the app is displayed. You need to design a default slogan image for the app in advance, and ensure that the default slogan image is displayed before a splash ad is loaded, enhancing user experience.
Roll Ads.
Roll ads are displayed as short videos or images, before, during, or after the video content is played.
How to integrate Ads Kit
1. Configure the application on the AGC.
2. Client application development process.
3. Testing a Splash Ad.
Configure application on the AGC
Follow the steps
Step 1: Register as a developer in AppGallery Connect. If you already have a developer account, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project
Step 3: Set the data storage location based on the current location.
Step 4: Generating a Signing Certificate Fingerprint.
Step 5: Configuring the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file, paste it into the app root directory.
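For Steps 4 and 5, the certificate fingerprint can be printed with keytool, just as in the Awareness Kit article above (the keystore path and file name are placeholders):
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<keystore_filename>.jks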
Client application development process
Follow the steps.
Step 1: Create an Android application in the Android studio (Any IDE which is your favorite).
Step 2: Add the App level Gradle dependencies. Choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
dependencies {
implementation 'com.huawei.hms:ads:3.4.47.302'
}
Root-level build.gradle dependencies:
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: To allow HTTP and HTTPS network requests on devices with targetSdkVersion 28 or later, configure the following information in the AndroidManifest.xml file:
XML:
<application
    ...
    android:usesCleartextTraffic="true">
    ...
</application>
Step 4: Initialize the Ads Kit in your activity or application class.
Step 5: Build Application.
SplashScreen.java
Java:
import androidx.annotation.NonNull;
import androidx.appcompat.app.AppCompatActivity;

import android.content.Intent;
import android.content.pm.ActivityInfo;
import android.os.Bundle;
import android.os.Handler;
import android.os.Message;
import android.util.Log;
import android.widget.FrameLayout;

import com.huawei.agconnect.crash.AGConnectCrash;
import com.huawei.hms.ads.AdListener;
import com.huawei.hms.ads.AdParam;
import com.huawei.hms.ads.AudioFocusType;
import com.huawei.hms.ads.BannerAdSize;
import com.huawei.hms.ads.HwAds;
import com.huawei.hms.ads.banner.BannerView;
import com.huawei.hms.ads.splash.SplashAdDisplayListener;
import com.huawei.hms.ads.splash.SplashView;
import com.shea.pygmycollection.R;
import com.shea.pygmycollection.huaweianalytics.AnalyticUtils;
import com.shea.pygmycollection.huaweianalytics.HuaweiAnalyticsClient;
import com.shea.pygmycollection.huaweianalytics.HuaweiEventParams;
import com.shea.pygmycollection.huaweianalytics.HuaweiLog;
import com.shea.pygmycollection.utils.PygmyNavigator;
import com.shea.pygmycollection.utils.UserDataUtils;

public class SplashScreen extends AppCompatActivity {
    private static final int TIME_OUT = 3000;
    // A dedicated test ad unit ID. Before releasing your app, replace the test ad unit ID with the formal one.
    private static final String AD_ID = "testd7c5cewoj6";
    private static final int AD_TIMEOUT = 10000;
    private static final int MSG_AD_TIMEOUT = 1001;
    private static final String TAG = SplashScreen.class.getSimpleName();
    SplashView splashView;

    /**
     * Pause flag.
     * On the splash ad screen:
     * Set this parameter to true when exiting the app to ensure that the app home screen is not displayed.
     * Set this parameter to false when returning to the splash ad screen from another screen to ensure that the app home screen can be displayed properly.
     */
    private boolean hasPaused = false;

    // Callback processing when an ad display timeout message is received.
    private Handler timeoutHandler = new Handler(new Handler.Callback() {
        @Override
        public boolean handleMessage(@NonNull Message msg) {
            if (SplashScreen.this.hasWindowFocus()) {
                jump();
            }
            return false;
        }
    });

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_splash_screen);
        // Initialize the HUAWEI Ads SDK.
        HwAds.init(this);
        splashView = findViewById(R.id.splash_ad_view);
        // AGConnectCrash.getInstance().testIt(this);
        AGConnectCrash.getInstance().enableCrashCollection(true);
        AnalyticUtils.logHuaweiAnalyticEvent(new HuaweiLog()
                .setScreenName(HuaweiEventParams.ScreenName.MAIN_SPLASH)
                .setDescription(HuaweiEventParams.Description.OPEN_SHARE_SCREEN)
                .setEventName(HuaweiEventParams.EventName.OPEN)
                .setUiElementName(HuaweiEventParams.UiElementName.GALLARY_BUTTON)
        );
        loadAd();
    }

    SplashAdDisplayListener adDisplayListener = new SplashAdDisplayListener() {
        @Override
        public void onAdShowed() {
            // Called when an ad is displayed.
            Log.d(TAG, "onAdShowed");
            AGConnectCrash.getInstance().log("onAdShowed");
        }

        @Override
        public void onAdClick() {
            // Called when an ad is clicked.
            Log.d(TAG, "onAdClick");
            AGConnectCrash.getInstance().log(Log.INFO, "onAdClick");
        }
    };

    /**
     * When the ad display is complete, the app home screen is displayed.
     */
    private void jump() {
        if (!hasPaused) {
            hasPaused = true;
            if (UserDataUtils.isUserLoggedIn(SplashScreen.this)) {
                PygmyNavigator.navigateToHomeScreen(SplashScreen.this);
            } else {
                PygmyNavigator.navigateToLoginScreen(SplashScreen.this);
            }
            finish();
        }
    }

    /**
     * Set this parameter to true when exiting the app to ensure that the app home screen is not displayed.
     */
    @Override
    protected void onStop() {
        // Remove the timeout message from the message queue.
        timeoutHandler.removeMessages(MSG_AD_TIMEOUT);
        hasPaused = true;
        super.onStop();
    }

    /**
     * Set this parameter to false when returning to the splash ad screen from another screen to ensure that the app home screen can be displayed properly.
     */
    @Override
    protected void onRestart() {
        super.onRestart();
        hasPaused = false;
        jump();
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
    }

    private void loadAd() {
        int orientation = ActivityInfo.SCREEN_ORIENTATION_PORTRAIT;
        AdParam adParam = new AdParam.Builder().build();
        SplashView.SplashAdLoadListener splashAdLoadListener = new SplashView.SplashAdLoadListener() {
            @Override
            public void onAdLoaded() {
                // Called when an ad is loaded successfully.
            }

            @Override
            public void onAdFailedToLoad(int errorCode) {
                // Called when an ad fails to be loaded. The app home screen is then displayed.
                jump();
            }

            @Override
            public void onAdDismissed() {
                // Called when the display of an ad is complete. The app home screen is then displayed.
                jump();
            }
        };
        // Obtain SplashView.
        splashView.setAdDisplayListener(adDisplayListener);
        // Set the default slogan.
        splashView.setSloganResId(R.drawable.ic_baseline_account_balance_wallet_24);
        // Set the audio focus type for a video splash ad.
        splashView.setAudioFocusType(AudioFocusType.GAIN_AUDIO_FOCUS_ALL);
        // Load the ad. AD_ID indicates the ad unit ID.
        splashView.load(AD_ID, orientation, adParam, splashAdLoadListener);
        // Send a delay message to ensure that the app home screen can be properly displayed after the ad display times out.
        timeoutHandler.removeMessages(MSG_AD_TIMEOUT);
        timeoutHandler.sendEmptyMessageDelayed(MSG_AD_TIMEOUT, AD_TIMEOUT);
    }
}
activity_splash_screen.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:hwads="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".ui.activities.SplashScreen"
android:orientation="vertical"
android:layout_gravity="center"
android:gravity="center"
android:background="@color/colorAccent">
<RelativeLayout
android:id="@+id/logo_area"
android:layout_width="match_parent"
android:layout_height="100dp"
android:layout_alignParentBottom="true"
android:visibility="visible">
<LinearLayout
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="40dp"
android:orientation="horizontal">
<ImageView
android:id="@+id/iv"
android:layout_width="50dp"
android:layout_height="50dp"
android:layout_centerInParent="true"
android:layout_gravity="center"
android:src="@drawable/splash_image"
android:visibility="visible" />
<TextView
android:id="@+id/appName"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_gravity="center"
android:layout_marginLeft="20dp"
android:gravity="center"
android:text="eCollection"
android:textColor="@android:color/white"
android:textSize="40sp"
android:textStyle="bold"
android:visibility="visible" />
</LinearLayout>
</RelativeLayout>
<!-- Splash ad view. -->
<com.huawei.hms.ads.splash.SplashView
android:id="@+id/splash_ad_view"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@id/logo_area" />
<com.huawei.hms.ads.banner.BannerView
android:id="@+id/hw_banner_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
hwads:adId="testw6vs28auh3"
android:visibility="gone"
hwads:bannerSize="BANNER_SIZE_360_57"/>
</RelativeLayout>
3. Testing the Splash Ad
Result
Tips and Tricks
Make sure you have added the agconnect-services.json file.
If your app can run both on Huawei mobile phones and non-Huawei Android phones (outside the Chinese mainland), integrate the HUAWEI Ads SDK (com.huawei.hms:ads) into your app.
If your app can run only on Huawei mobile phones, integrate the HUAWEI Ads Lite SDK (com.huawei.hms:ads-lite).
Add the INTERNET permission in AndroidManifest.xml, as shown below.
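For reference, the standard Android permission declaration is:
XML:
<uses-permission android:name="android.permission.INTERNET" />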
Conclusion
In this article, we have learnt what traditional marketing is and its drawbacks, the types of Huawei Ads, and how to integrate Huawei Splash Ads.
Reference
Huawei Splash Ads
Official Huawei Ads kit
Does this work offline?
Thanks for sharing!!!

Beginner: Integration of Huawei Push Notification with android Work Manager in Navigation Glove IoT application - Part 8

Introduction
If you are new to this series, refer to the articles below.
Beginner: Integration of Huawei Account kit in Navigation Glove IoT application Using Kotlin - Part 1
Beginner: Integration of Huawei Map kit in Navigation Glove IoT application Using Kotlin - Part 2
Beginner: Integration of Huawei Site kit in Navigation Glove IoT application Using Kotlin - Part 3
Beginner: Integration of Huawei Direction API in Navigation Glove IoT application Using Kotlin - Part 4
Beginner: Connecting to Smart Gloves Using Bluetooth in Navigation Glove IoT application Using Kotlin - Part 5
Beginner: Integration of Huawei Analytics kit and Crash service in Navigation Glove IoT application Using Kotlin - Part 6
Beginner: Integration of Huawei Location kit in Navigation Glove IoT application - Part 7
In this article, we will learn about the Smart Glove application and the integration of Huawei Push Kit in the Navigation Glove IoT application. We will cover how exactly push notifications work and their benefits, and then integrate Push Kit into the Smart Gloves application.
Content
What is a push notification?
Advantages of push notifications
Types of notifications
Huawei push notifications
Message types
Integration of Push Kit
Testing Push Kit from the Huawei AppGallery console
In today's mobile world, the vast majority of applications integrate push notifications. Applications use them to engage users and to support marketing: e-commerce apps, for example, share offers, discounts, price drops for particular products, and delivery status through push notifications.
First, let us understand what a push notification is.
Push notifications are alerts sent to users from a mobile application and received in real time. Developers can send notifications at any time, on any day.
Advantages of Push Notification
User retention: In the world of mobile apps, user retention is a key indicator of how well your app is performing in the market. This metric lets you see how many users who have downloaded your app and used it once come back to use it again.
Increased engagement: Push notifications let you keep users engaged by sending them relevant, timely messages. Notifications drive interaction with the application, so users spend more time in it.
Push Notification types
1. Reminder notification: Reminds the user of something, for example a recharge reminder, a meeting time reminder, or an appointment reminder.
2. Alert notification: Alerts the user when something relevant to them happens in the application, for example when someone sends a message or comments on a picture.
3. Promotional notification: Promotes an offer to the user, for example a discount, a sale date, or a weekend sale.
4. Purchase notification: Relates to purchases users make within your app. It can contain information like order confirmation, order status, order updates, tracking, and receipts.
5. Survey notification: Used when the application wants to collect feedback or run a survey.
Huawei Push Kit is a messaging service provided for you to establish a cloud-to-device messaging channel. By integrating Push Kit, you can send messages to your apps on user devices in real time. This helps you to maintain closer ties with users and increases user awareness of and engagement with your apps.
Push Kit consists of two parts:
Message push from the cloud to the device: enables you to send data and messages to your apps on user devices.
Message display on devices: provides various display styles, such as the notification panel, home screen banner, and lock screen on user devices.
Huawei Push Kit supports two types of messages:
1. Notification Message
2. Data Message
Notification Message: A notification message is directly sent by Push Kit and displayed in the notification panel on the user device, not requiring your app to be running in the background. The user can tap the notification message to trigger the corresponding action such as opening your app or opening a web page. You can tailor the display styles and reminder modes to fit user needs, which will greatly improve the daily active users (DAU) of your app. The common application scenarios of the notification message include subscription, travel reminder, and account status.
Batch message: a message sent by an app in batches to users who will obtain the same content, which can improve user experience and stimulate user interaction with the app.
Personalized message: a message generated based on a unified message template and sent by an app to an audience. The unified message template contains placeholders, which will be replaced based on the settings and preferences of specific users.
Point-to-point message: a message sent by an app to a user when the user takes a specific action.
Instant message: An instant message is a point-to-point or group chatting message (or private message) between users.
Data Message: Data messages are processed by your app on user devices. After a device receives a message containing data or instructions from the cloud, the device passes the message to the app instead of directly displaying it. The app then parses the message and triggers the required action (for example, going to a web page or an app page). For such a message, you can also customize display styles for higher efficiency.
Push Kit cannot guarantee a high data message delivery rate, because it may be affected by Android system restrictions and whether the app is running in the background. The common application scenarios of the data message include the VoIP call, voice broadcast, and interaction with friends.
Prerequisites
AppGallery Account
Android Studio 3.X
SDK Platform 19 or later
Gradle 4.6 or later
HMS Core (APK) 4.0.0.300 or later
Huawei Phone EMUI 5.0 or later
Non-Huawei Phone Android 5.1 or later
Service integration on AppGallery
1. We need to register a developer account in AppGallery Connect.
2. Create an app by referring to Creating a Project and Creating an App in the Project.
3. Set the data storage location based on the current location.
4. Enable the Push Kit service on AppGallery.
5. Generate a signing certificate fingerprint.
6. Configure the signing certificate fingerprint.
7. Download your agconnect-services.json file and paste it into the app directory.
Client development
1. Create an Android project in the Android Studio IDE.
2. Add the maven URL inside the repositories of buildscript and allprojects respectively (project level build.gradle file).
Code:
maven { url 'https://developer.huawei.com/repo/' }
3. Add the classpath inside the dependency section of the project level build.gradle file.
Code:
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
4. Add the plugin in the app-level build.gradle file.
Code:
apply plugin: 'com.huawei.agconnect'
5. Add the below library in the app-level build.gradle file dependencies section.
Code:
implementation 'com.huawei.hms:push:6.3.0.302'
6. Add all the below permissions in the AndroidManifest.xml.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
7. Register the PushNotificationHmsMessageService in the AndroidManifest.xml.
XML:
<service android:name=".push.PushNotificationHmsMessageService" android:exported="false">
<intent-filter>
<action android:name="com.huawei.push.action.MESSAGING_EVENT"/>
</intent-filter>
</service>
8. Sync the project.
Now let's move on to the coding part.
Step 1: Create the notification utility.
NotificationUtil.kt
Kotlin:
package com.huawei.navigationglove.push
import android.app.NotificationChannel
import android.app.NotificationManager
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.media.RingtoneManager
import android.os.Build
import androidx.core.app.NotificationCompat
import androidx.core.content.ContextCompat
import com.huawei.navigationglove.R
import kotlin.random.Random
import android.provider.Settings
import com.huawei.navigationglove.ui.SplashScreenActivity
class NotificationUtil(private val context: Context) {
fun showNotification(title: String, message: String) {
val intent = Intent(context, SplashScreenActivity::class.java)
intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP)
val pendingIntent = PendingIntent.getActivity(
context, 0, intent,
PendingIntent.FLAG_ONE_SHOT
)
val channelId = context.getString(R.string.default_notification_channel_id)
val defaultSoundUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION)
val notificationBuilder = NotificationCompat.Builder(context, channelId)
.setColor(ContextCompat.getColor(context, android.R.color.holo_red_dark))
.setSmallIcon(R.drawable.gloves_ic)
.setContentTitle(title)
.setContentText(message)
.setAutoCancel(true)
.setStyle(
NotificationCompat.BigTextStyle()
.bigText(message)
)
.setPriority(NotificationCompat.PRIORITY_HIGH)
.setSound(defaultSoundUri)
.setContentIntent(pendingIntent)
val notificationManager =
context.getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
// Since android Oreo notification channel is needed.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
val channel = NotificationChannel(
channelId,
"Default Channel",
NotificationManager.IMPORTANCE_HIGH
)
notificationManager.createNotificationChannel(channel)
}
notificationManager.notify(Random.nextInt(), notificationBuilder.build())
}
fun isTimeAutomatic(context: Context): Boolean {
return Settings.Global.getInt(
context.contentResolver,
Settings.Global.AUTO_TIME,
0
) == 1
}
}
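For reference, a hypothetical call site, for example from an Activity or Service (title and message are illustrative values):
Kotlin:
// Show a simple notification using the utility class above
NotificationUtil(this).showNotification("Navigation Glove", "You have reached your destination.")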
Step 2: Create a Worker to process data and show the notification in the background. If the received data needs processing that may take more than 10 seconds, hand it off to WorkManager; otherwise, show the notification directly.
ScheduledWorker.kt
Kotlin:
package com.huawei.navigationglove.push
import android.content.Context
import android.util.Log
import androidx.work.Worker
import androidx.work.WorkerParameters
class ScheduledWorker(appContext: Context, workerParams: WorkerParameters) :
Worker(appContext, workerParams) {
override fun doWork(): Result {
Log.d(TAG, "Work START")
// Get Notification Data
val title = inputData.getString(NOTIFICATION_TITLE)
val message = inputData.getString(NOTIFICATION_MESSAGE)
// Show Notification
NotificationUtil(applicationContext).showNotification(title!!, message!!)
// TODO Do your other Background Processing
Log.d(TAG, "Work DONE")
// Return result
return Result.success()
}
companion object {
private const val TAG = "ScheduledWorker"
const val NOTIFICATION_TITLE = "notification_title"
const val NOTIFICATION_MESSAGE = "notification_message"
}
}
Step 3: Create a broadcast receiver.
NotificationBroadcastReceiver.kt
Kotlin:
package com.huawei.navigationglove.push
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.util.Log
import androidx.work.Data
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_MESSAGE
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_TITLE
class NotificationBroadcastReceiver : BroadcastReceiver() {
override fun onReceive(context: Context?, intent: Intent?) {
intent?.let {
val title = it.getStringExtra(NOTIFICATION_TITLE)
val message = it.getStringExtra(NOTIFICATION_MESSAGE)
// Create Notification Data
val notificationData = Data.Builder()
.putString(NOTIFICATION_TITLE, title)
.putString(NOTIFICATION_MESSAGE, message)
.build()
// Init Worker
val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
.setInputData(notificationData)
.build()
// Start Worker (the no-argument getInstance() is deprecated, so pass the receiver's context)
WorkManager.getInstance(context!!).beginWith(work).enqueue()
Log.d(javaClass.name, "WorkManager is Enqueued.")
}
}
}
Step 4: Add NotificationBroadcastReceiver in the AndroidManifest.xml
XML:
<receiver android:name=".push.NotificationBroadcastReceiver" />
Step 5: Create the Huawei message service, PushNotificationHmsMessageService.kt.
Kotlin:
package com.huawei.navigationglove.push
import android.app.AlarmManager
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.util.Log
import com.huawei.hms.push.HmsMessageService
import com.huawei.hms.push.RemoteMessage
import android.text.TextUtils
import androidx.work.Data
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import com.google.gson.Gson
import com.huawei.hms.common.ApiException
import com.huawei.hms.aaid.HmsInstanceId
import com.huawei.agconnect.config.AGConnectServicesConfig
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_MESSAGE
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_TITLE
import com.huawei.navigationglove.push.model.PushModel
import org.json.JSONException
import java.text.SimpleDateFormat
import java.util.*
class PushNotificationHmsMessageService : HmsMessageService() {
override fun onMessageReceived(message: RemoteMessage?) {
Log.i(TAG, "onMessageReceived is called")
if (message == null) {
Log.e(TAG, "Received message entity is null!")
return
}
Log.i(
TAG, """get Data: ${message.data} getFrom: ${message.from}
getTo: ${message.to}
getMessageId: ${message.messageId}
getSendTime: ${message.sentTime}
getDataMap: ${message.dataOfMap}
getMessageType: ${message.messageType}
getTtl: ${message.ttl}
getToken: ${message.token}"""
)
message.data.isNotEmpty().let { it ->
if (it) {
Log.d(TAG, "Message data payload: ${message.data}")
try {
val pushModel: PushModel = Gson().fromJson(message.data, PushModel::class.java)
//val jsonData = JSONObject(message.data)
// Get Message details
val title = pushModel.title
val content = pushModel.message
// Check whether notification is scheduled or not
val isScheduled = pushModel.isScheduled
isScheduled.let {
if (it) {
// Check that 'Automatic Date and Time' settings are turned ON.
// If it's not turned on, Return
val notificationUtil = NotificationUtil(this)
if (!notificationUtil.isTimeAutomatic(applicationContext)) {
Log.d(TAG, "`Automatic Date and Time` is not enabled")
return
}
// This is Scheduled Notification, Schedule it
val scheduledTime = pushModel.scheduledTime
scheduleAlarm(scheduledTime, title, content)
} else {
// This is not scheduled notification, show it now
// Create Notification Data
var body =
"You have reached from " + pushModel.data.fromLocation + " to " + pushModel.data.toLocation
val notificationData = Data.Builder()
.putString(NOTIFICATION_TITLE, title)
.putString(NOTIFICATION_MESSAGE, body)
.build()
// Init Worker
val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
.setInputData(notificationData)
.build()
// Start Worker
WorkManager.getInstance(this).beginWith(work).enqueue()
Log.d(javaClass.name, "WorkManager is Enqueued.")
}
}
} catch (e: JSONException) {
e.printStackTrace()
}
} else {
val notificationData = Data.Builder()
.putString(NOTIFICATION_TITLE, message.notification.title)
.putString(NOTIFICATION_MESSAGE, message.notification.body)
.build()
// Init Worker
val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
.setInputData(notificationData)
.build()
// Start Worker
WorkManager.getInstance(this).beginWith(work).enqueue()
}
}
}
private fun scheduleAlarm(
scheduledTimeString: String?,
title: String?,
message: String?
) {
val alarmMgr = applicationContext.getSystemService(Context.ALARM_SERVICE) as AlarmManager
val alarmIntent =
Intent(applicationContext, NotificationBroadcastReceiver::class.java).let { intent ->
intent.putExtra(NOTIFICATION_TITLE, title)
intent.putExtra(NOTIFICATION_MESSAGE, message)
PendingIntent.getBroadcast(applicationContext, 0, intent, 0)
}
// Parse Schedule time
val scheduledTime = SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.getDefault())
.parse(scheduledTimeString!!)
scheduledTime?.let {
// With set(), it'll set non repeating one time alarm.
alarmMgr.set(
AlarmManager.RTC_WAKEUP,
it.time,
alarmIntent
)
}
}
override fun onNewToken(token: String) {
Log.i(TAG, "received refresh token:$token")
if (!TextUtils.isEmpty(token)) {
refreshedTokenToServer(token)
}
}
// If the version of the Push SDK you integrated is 5.0.4.302 or later, you also need to override the method.
override fun onNewToken(token: String, bundle: Bundle?) {
Log.i(TAG, "have received refresh token $token")
if (!TextUtils.isEmpty(token)) {
refreshedTokenToServer(token)
}
}
private fun refreshedTokenToServer(token: String) {
Log.i(TAG, "sending token to server. token:$token")
}
private fun deleteToken() {
// Create a thread.
object : Thread() {
override fun run() {
try {
// Obtain the app ID from the agconnect-service.json file.
val appId =
AGConnectServicesConfig.fromContext(this@PushNotificationHmsMessageService)
.getString("client/app_id")
// Set tokenScope to HCM.
val tokenScope = "HCM"
// Delete the token.
HmsInstanceId.getInstance(this@PushNotificationHmsMessageService)
.deleteToken(appId, tokenScope)
Log.i(TAG, "token deleted successfully")
} catch (e: ApiException) {
Log.e(TAG, "deleteToken failed.$e")
}
}
}.start()
}
/**
* Persist token to third-party servers.
*
* Modify this method to associate the user's push token with any server-side account
* maintained by your application.
*
* @param token The new token.
*/
private fun sendRegistrationToServer(token: String?) {
// TODO: Implement this method to send token to your app server.
Log.d(TAG, "sendRegistrationTokenToServer($token)")
}
companion object {
private const val TAG = "PushNotificationHmsMessageService"
}
}
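The console testing steps below require a device push token. The service above only handles token refreshes in onNewToken; to request a token explicitly, you can call HmsInstanceId.getToken, for example from your launcher activity. A minimal sketch, assuming it is placed in an Activity and uses the same AGConnectServicesConfig, HmsInstanceId, ApiException, and Log imports as the service above:
Kotlin:
private fun getToken() {
    // getToken must not be called on the main thread.
    object : Thread() {
        override fun run() {
            try {
                // Read the app ID from agconnect-services.json.
                val appId = AGConnectServicesConfig.fromContext(applicationContext)
                    .getString("client/app_id")
                // Apply for a token with the scope set to HCM.
                val token = HmsInstanceId.getInstance(applicationContext).getToken(appId, "HCM")
                Log.i("PushToken", "token: $token")
            } catch (e: ApiException) {
                Log.e("PushToken", "getToken failed. $e")
            }
        }
    }.start()
}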
Step 6 (optional): Convert the data message body to Kotlin data classes; alternatively, you can parse the JSON data directly.
PushModel.kt
Kotlin:
package com.huawei.navigationglove.push.model
import com.google.gson.annotations.SerializedName
data class PushModel(@SerializedName("scheduledTime")
val scheduledTime: String = "",
@SerializedName("data")
val data: Data,
@SerializedName("isScheduled")
val isScheduled: Boolean = false,
@SerializedName("title")
val title: String = "",
@SerializedName("message")
val message: String = "")
Data.kt
Kotlin:
package com.huawei.navigationglove.push.model
import com.google.gson.annotations.SerializedName
data class Data(@SerializedName("fromLocation")
val fromLocation: String = "",
@SerializedName("toLocation")
val toLocation: String = "",
@SerializedName("status")
val status: String = "")
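For reference, a data message body that these classes can deserialize might look like the following (all field values are hypothetical; the scheduledTime format matches the "yyyy-MM-dd HH:mm:ss" pattern parsed by the service):
Code:
{
  "title": "Navigation Glove",
  "message": "Trip update",
  "isScheduled": false,
  "scheduledTime": "2022-05-01 10:30:00",
  "data": {
    "fromLocation": "Home",
    "toLocation": "Office",
    "status": "completed"
  }
}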
Testing push notifications on the AppGallery console
Notification Message
Step 1: Open AppGallery Connect and select your project.
Step 2: From the left menu, choose Grow > Push and enable the service.
Step 3: Once enabled, click Add Notification in the top right corner.
Step 4: Enter the details: name, select Notification message, summary, title, and body.
Step 5: Enter the push scope and token, then click the Submit button in the top right corner.
Step 6: Click OK on the confirmation window.
Data Message
Repeat Steps 1 to 3 from the Notification Message section.
Step 4: Enter the details: name, select Data message, summary, title, and body, then choose the push scope and device token. Click the Submit button in the top right corner.
Step 5: Click the OK button in the confirmation dialog.
Result
Tips and Tricks
1. Make sure you are already registered as a Huawei developer.
2. Set the min SDK version to 19 or later; otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to the app folder.
4. Make sure you have added the SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learnt how to integrate Huawei Push Kit in the Smart Gloves mobile application using Android Studio and Kotlin. We also understood what Push Kit is, its message and notification types, and how to test push notifications from the AppGallery console.
Reference
Push Kit - Official document
Push Kit - Code lab
Push Kit - Training Video
