Sending Push Notifications on Flutter with Huawei Push Kit Plugin - Huawei Developers

For more information like this, you can visit the HUAWEI Developer Forum.
Push notifications offer a great way to increase your application's user engagement and boost your retention rates by sending meaningful messages or by informing your users about your application. These messages can be sent at any time, even when your app is not running.
Huawei Push Kit
Huawei Push Kit is a messaging service developed by Huawei that lets developers send messages to apps on users' devices in real time. Push Kit supports two types of messages: notification messages and data messages, both of which we will cover in this tutorial. You can send notifications and data messages to your users from your server through the Push Kit APIs or directly from the AppGallery Push Kit Console.
Recently we released the Huawei Push Kit Plugin for Flutter to make integrating Push Kit into your Flutter apps easy.
You can find the plugin at the link below.
Github: https://github.com/HMS-Core/hms-flutter-plugin/tree/master/flutter-hms-push
We will use this plugin throughout the tutorial, so let’s get started.
Configure your project on AppGallery Connect
First of all, you should have a Huawei developer account to use Huawei Mobile Services and thus Huawei Push Kit. I will not go into the details of creating the developer account, since it is very straightforward and out of the scope of this tutorial, but you can find all the details here.
Let's start by creating a new project on the Huawei Developer Console. If you have already created a project, you can skip this step. Just make sure you have set the Data Storage Location and entered the SHA-256 fingerprint.
Create an app and enable Push Kit
1. Go to the Huawei Developer Console and select Huawei Push under the Development section. Press the "New App" button in the top right corner to create a new app. You can refer to this article if you come across any problems.
2. You'll be forwarded to the Push Kit page after your app is created; you can also navigate here from the side menu. Enable the Push Kit service by clicking "Enable Now".
3. After enabling Push Kit, the console will prompt you to enter the package name and redirect you to Project Settings. You can enter the package name manually or upload your package to set it automatically. I chose to enter it manually, which in my case is "com.example.flutter_push_kit_tutorial". You can find it in the first lines of your AndroidManifest.xml file under <your_flutter_project>/android/app/src/main/AndroidManifest.xml.
4. On the Project Settings page, set the Data Storage Location as Germany. This is needed to store and send out our push messages.
Generate and Configure Signing Certificate Fingerprint
A signing certificate fingerprint is needed to verify the authenticity of our app when it attempts to access HMS Push Kit through the HMS Core SDK. So before we are able to use the push service, we must generate a signing certificate and configure it in AppGallery Connect. We will also use this certificate later in the Flutter project configuration.
Before generating the signing certificate we must have the JDK installed; it comes bundled with Android Studio by default. Go to the Java/JDK installation's bin directory on your PC and open cmd or PowerShell to run the following command:
keytool -genkey -keystore <keystore-file> -storepass <keystore-pass> -alias <key-alias> -keypass <key-pass> -keysize 2048 -keyalg RSA -validity <validity-period>
The fields that should be filled in the command are as follows:
· <keystore-file> is the path to the app's signature file. The file extension must be .jks or .keystore. For example: C:\key.jks
· <keystore-pass> is the password of your keystore.
· <key-alias> is the alias name of the key that will be stored in your keystore.
· <key-pass> is the password of your key.
· <validity-period> is the number of days the key will remain valid.
Example command:
Code:
keytool -genkey -keystore C:\Users\Username\key.jks -storepass 123456 -alias pushkitkey -keypass 123456 -keysize 2048 -keyalg RSA -validity 36500
Note that you can run this command anywhere if you have configured your system environment variables to include the JDK folders.
Now that we have generated our certificate, we need to obtain the SHA-256 fingerprint and add it to the project settings on AppGallery Connect. To obtain the fingerprint, run the command below in the same directory where you generated the keystore (the Java/JDK bin folder where keytool is located):
Code:
keytool -list -v -keystore C:\Users\Username\key.jks
Obtaining SHA-256 Fingerprint from powershell
Copy the SHA-256 fingerprint you obtained and paste it into the SHA-256 field on the Project Settings page. Make sure this value has been set before you move on. After everything is done, your project settings should look like below.
Completed AppGallery Connect Configuration
Integrate HMS to your Flutter project
Now that we are done with the configuration on AppGallery Connect, let's move to the Flutter side to finish the setup.
Add the Huawei Push Kit Plugin dependency to the project’s pubspec.yaml file and run flutter pub get to load the plugin.
Code:
dependencies:
  flutter:
    sdk: flutter
  huawei_push: 4.0.4+300
To integrate Huawei Push Kit, as with other Huawei Mobile Services, we must download the agconnect-services.json file from Project Settings > App Information (where we entered the fingerprint before) and add it to the folder <your_flutter_project>/android/app.
Your project should look like this after adding the services file
Add the lines below to the project level build.gradle file
(your_flutter_project/android/build.gradle)
Code:
buildscript {
    ext.kotlin_version = '1.3.50'
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' } // Add this line
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.5.0'
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        classpath 'com.huawei.agconnect:agcp:1.3.1.300' // Add this line
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' } // Add this line
    }
}

/* other configurations */
Create a key.properties file inside the android folder so that Gradle can read the keystore values we generated before.
Code:
storePassword=<your_keystore_pass>
keyPassword=<your_key_pass>
keyAlias=<alias_you_entered_before>
storeFile=<path_to_keystore.jks>
At the top of the app-level build.gradle file, located at
your_flutter_project/android/app/build.gradle,
add these lines to read the values from the key.properties file.
Code:
def keystoreProperties = new Properties()
def keystorePropertiesFile = rootProject.file('key.properties')
if (keystorePropertiesFile.exists()) {
    keystoreProperties.load(new FileInputStream(keystorePropertiesFile))
}
Increase your minSdkVersion to 17 and make the following changes in the same app-level build.gradle file
Code:
android {
    /*
    Other configurations
    …
    */
    defaultConfig {
        // The package name below and on AppGallery Connect should be the same
        applicationId "com.example.flutter_push_kit_tutorial"
        minSdkVersion 17 // Increase this to 17
        targetSdkVersion 28
        versionCode flutterVersionCode.toInteger()
        versionName flutterVersionName
    }
    // Add this part
    signingConfigs {
        release {
            keyAlias keystoreProperties['keyAlias']
            keyPassword keystoreProperties['keyPassword']
            storeFile keystoreProperties['storeFile'] ? file(keystoreProperties['storeFile']) : null
            storePassword keystoreProperties['storePassword']
        }
    }
    // Edit here
    buildTypes {
        debug {
            signingConfig signingConfigs.release
        }
        release {
            signingConfig signingConfigs.release
        }
    }
}
Add this line to the end of the same file
Code:
apply plugin: 'com.huawei.agconnect' // Add this line (this needs to be at the bottom of your build.gradle)
Important Note
If you are planning to build a release APK, you also need to configure ProGuard rules to prevent the HMS Core SDK from being obfuscated. As stated in this documentation, Google's R8 compiler automatically shrinks and obfuscates release builds to reduce the size of your app.
Add the following lines to the proguard-rules.pro file; create the file inside the android/app folder (next to the app-level build.gradle) if you don't have it already. You may need further configuration in build.gradle; please refer to the documentation above or this Stack Overflow issue.
Code:
## Flutter wrapper
-keep class io.flutter.app.** { *; }
-keep class io.flutter.plugin.** { *; }
-keep class io.flutter.util.** { *; }
-keep class io.flutter.view.** { *; }
-keep class io.flutter.** { *; }
-keep class io.flutter.plugins.** { *; }
-dontwarn io.flutter.embedding.**
## HMS Core SDK
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
Testing Push Notifications
Now that we are ready to use Push Kit in our Flutter project, let's get a token for testing push notifications, and then we can move on to a slightly more complex example.
To receive the token, we must initialize an EventChannel and listen for changes from the stream. I've initialized the channel and requested a token in the initState of the HomeScreen widget.
Code:
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:huawei_push/push.dart';
import 'package:huawei_push/constants/channel.dart' as Channel;

class HomeScreen extends StatefulWidget {
  @override
  _HomeScreenState createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  String _token = '';
  static const EventChannel TokenEventChannel =
      EventChannel(Channel.TOKEN_CHANNEL);

  @override
  void initState() {
    super.initState();
    initPlatformState();
    getToken();
  }

  Future<void> initPlatformState() async {
    if (!mounted) return;
    TokenEventChannel.receiveBroadcastStream()
        .listen(_onTokenEvent, onError: _onTokenError);
  }

  void _onTokenEvent(Object event) {
    // This function gets called when we receive the token successfully
    setState(() {
      _token = event;
    });
    print('Push Token: ' + _token);
    Push.showToast(event);
  }

  void _onTokenError(Object error) {
    setState(() {
      _token = error;
    });
    Push.showToast(error);
  }

  void getToken() async {
    await Push.getToken();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      // Rest of the widget...
    );
  }
}
We got the push token after running our app, and now we can test push notifications by sending one from the Push Kit Console. Navigate to Push Kit > Add Notification and complete the required fields; you should enter the token we got earlier into the specified-device field in the push scope part. You can test the notification immediately by pressing the "Test Effect" button, or you can submit your notification.
Sending Push Notification from Huawei Push Kit Console
We have received our first notification
Subscribing to a topic and receiving data messages
Topics are like separate messaging channels that we can send notifications and data messages to. Devices subscribe to these topics to receive messages about that subject. For example, users of a weather forecast app can subscribe to a topic that sends notifications about the best weather for exterminating pests. You can check here for more use cases.
Data messages are customized messages whose content is defined by you and parsed by your application. After receiving these messages, the system transfers them to the app instead of displaying them directly; the app can then parse the message and trigger some action.
In my example I will define a 'coupon' topic that users can subscribe to in order to receive coupons, and then I will send a data message that includes the coupon from the Push Kit Console. (Note that this can also be done by using the Push Kit API, as sketched below.)
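For reference, here is a rough sketch of what sending that data message from your own server could look like, using Dart's http package. The endpoint, body shape and OAuth bearer token flow are assumptions based on the Push Kit server API documentation, and the coupon values are made up for this example, so verify everything against the current docs before relying on it.
Code:
// Sketch only: sending the 'coupon' data message via the Push Kit REST API.
// The access token is assumed to come from Huawei's OAuth service.
import 'dart:convert';
import 'package:http/http.dart' as http;

Future<void> sendCouponDataMessage(String appId, String accessToken) async {
  final response = await http.post(
    'https://push-api.cloud.huawei.com/v1/$appId/messages:send',
    headers: {
      'Content-Type': 'application/json',
      'Authorization': 'Bearer $accessToken',
    },
    body: jsonEncode({
      'validate_only': false,
      'message': {
        // 'data' is a plain string; our app json-decodes it on arrival.
        'data': jsonEncode({
          'type': 'coupon',
          'title': 'Free Coupon', // example values
          'body': 'A coupon for our subscribers',
          'couponCode': 'COUPON2020',
        }),
        'topic': 'coupon', // deliver to every device subscribed to 'coupon'
      },
    }),
  );
  print('Push API response: ${response.statusCode} ${response.body}');
}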
Let’s define the Coupon class to convert our data message to a coupon object
Code:
class Coupon {
  String title;
  String body;
  String couponCode;

  Coupon({this.title, this.body, this.couponCode});

  Coupon.fromJson(Map<String, dynamic> json) {
    title = json['title'];
    body = json['body'];
    couponCode = json['couponCode'];
  }

  Map<String, dynamic> toJson() {
    final Map<String, dynamic> data = new Map<String, dynamic>();
    data['title'] = this.title;
    data['body'] = this.body;
    data['couponCode'] = this.couponCode;
    return data;
  }
}
In home_screen.dart I’ve added a data message event channel just like we did in the token part to receive the messages. And then I’ve added a subscription function and a method to show a dialog if we receive a coupon.
Code:
/*
  Imports...
*/

class HomeScreen extends StatefulWidget {
  @override
  _HomeScreenState createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  bool _subscribed = false;
  static const EventChannel DataMessageEventChannel =
      EventChannel(Channel.DATA_MESSAGE_CHANNEL);

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  Future<void> initPlatformState() async {
    if (!mounted) return;
    DataMessageEventChannel.receiveBroadcastStream()
        .listen(_onDataMessageEvent, onError: _onDataMessageError);
  }

  void _onDataMessageEvent(Object event) {
    Map<String, dynamic> dataObj = json.decode(event);
    if (dataObj['type'] == 'coupon') {
      Coupon coupon = Coupon.fromJson(dataObj);
      showCouponDialog(coupon);
    } else {
      print('Unsupported Data Message Type');
    }
    Push.showToast(event);
  }

  void _onDataMessageError(Object error) {
    Push.showToast(error);
  }

  void subscribeToCoupons() async {
    setState(() {
      _subscribed = true;
    });
    String topic = 'coupon';
    dynamic result = await Push.subscribe(topic);
    Push.showToast(result);
  }

  showCouponDialog(Coupon coupon) {
    showDialog(
      context: context,
      builder: (context) => AlertDialog(
        title: Container(
            child: Text(
          coupon.title.toUpperCase(),
          textAlign: TextAlign.center,
          style: TextStyle(
            color: Colors.green,
            fontSize: 25,
          ),
        )),
        shape:
            RoundedRectangleBorder(borderRadius: BorderRadius.circular(10.0)),
        content: Container(
          height: 200,
          child: Column(
            mainAxisAlignment: MainAxisAlignment.spaceBetween,
            children: <Widget>[
              Text(coupon.body),
              SizedBox(
                height: 10,
              ),
              Text(
                coupon.couponCode,
                style: TextStyle(fontSize: 30, fontWeight: FontWeight.bold),
              ),
              MaterialButton(
                color: Colors.green,
                child: Text(
                  'Claim Now',
                  style: TextStyle(color: Colors.white),
                ),
                onPressed: () => Navigator.pop(context),
              )
            ],
          ),
        ),
      ),
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('HMS Push Kit Example'),
        centerTitle: true,
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          crossAxisAlignment: CrossAxisAlignment.center,
          children: <Widget>[
            Text('Subscribe to coupon topic to get free coupons'),
            SizedBox(
              height: 20,
            ),
            OutlineButton(
              onPressed: _subscribed ? null : () => subscribeToCoupons(),
              child: Text('Subscribe Now'),
              borderSide: BorderSide(color: Colors.green),
              textColor: Colors.green,
            ),
          ],
        ),
      ),
    );
  }
}
Our app is shown below; let's subscribe to the coupon topic by pressing the button.
Now, on the Huawei Push Kit Console, create a data message like the one in the image below and send it to the coupon topic subscribers.
Configuring the data message from the Push Kit Console
After sending the data message to our coupon subscribers, we should see the coupon dialog with the message we sent.
It must be your lucky day.
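One small addition: if a user later wants to stop receiving coupons, the plugin also provides an unsubscribe call. A minimal sketch, assuming Push.unsubscribe mirrors the Push.subscribe method used above:
Code:
// Sketch: leaving the 'coupon' topic again (verify Push.unsubscribe in the plugin API).
void unsubscribeFromCoupons() async {
  dynamic result = await Push.unsubscribe('coupon');
  Push.showToast(result);
  setState(() {
    _subscribed = false; // re-enable the subscribe button
  });
}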
I am leaving the project’s github link in case you want to check it from there or try this example by yourself:
https://github.com/atavci/FlutterPushKitTutorial
Conclusion
Now you know how to use Huawei Push Kit in your Flutter projects. You can use this knowledge to connect with your existing users or add extra functionality that will attract even more users. Like every awesome feature, notifications should be handled with extra care, since nobody likes to be bombarded with them.
I am leaving some references for further reading. You can also ask questions in the comments section; I would happily answer them.
Have a great day and successful builds!
From:
https://medium.com/huawei-developers/sending-push-notifications-on-flutter-with-huawei-push-kit-plugin-534787862b4d

Related

Integrating Huawei Analytics Kit to Flutter Projects and Sending Events

For more information like this, you can visit the HUAWEI Developer Forum.
Hello everyone,
In this article, I am going to create a Flutter project (actually a tiny game) and explain how to implement Analytics Kit. But first, let me tell you a little about Huawei Analytics Kit.
Huawei Analytics Kit
Huawei Analytics Kit offers you a range of analytics models that help you not only analyze users' behavior with preset and custom events, but also gain insight into your products and content, so that you can improve your app marketing and optimize your products.
HUAWEI Analytics Kit identifies users and collects statistics on users by anonymous application identifier (AAID). The AAID is reset in the following scenarios:
1) The app is uninstalled and reinstalled.
2) The user clears the app data.
After the AAID is reset, the user will be counted as a new user.
HUAWEI Analytics Kit supports event management. A maximum of 25 parameters can be defined for each event, and a maximum of 100 parameters for each app.
There are 3 types of events: Automatically collected, predefined and custom.
Automatically collected events are collected from the moment you enable the kit in your code. Their event IDs are already reserved by HUAWEI Analytics Kit and cannot be reused.
Predefined events come with their own event IDs, which are predefined by the HMS Core Analytics SDK based on common application scenarios. The ID of a custom event cannot be the same as a predefined event's ID; otherwise, you would create a predefined event instead of a custom event.
Custom events are the events that you can create for your own requirements.
More info about the kit and events.
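To make the distinction concrete, here is a small sketch of sending a predefined event with the Flutter plugin. The HAEventType and HAParamType constant classes are assumptions based on the plugin's documentation, and hmsAnalytics is the HMSAnalytics instance we create later in the code, so check the exact names in the version you install.
Code:
// Sketch: a predefined event uses reserved name/parameter constants,
// while a custom event (shown later in this article) uses any name you choose.
Future<void> _sendPredefinedEvent(int score) async {
  await hmsAnalytics.onEvent(HAEventType.SUBMITSCORE, {
    HAParamType.SCORE: score.toString(),
  });
}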
Configuration in AppGallery Connect
Firstly, you will need a Huawei developer account. If you don't have one, click here and register; it will be activated within 1-2 days.
After signing in to AppGallery Connect, you can add a new project or select an existing one. In the project you choose, add an app. While adding the app, make sure you enter the package name correctly: it should be the same as your Flutter project's package name.
Also, make sure you set the data storage location, enable Analytics Kit, and add the SHA-256 fingerprint to AppGallery Connect.
How to generate SHA-256 Fingerprint?
In Android Studio, right click the android folder under your project and select Flutter > Open Android module in Android Studio.
In the right panel, select Gradle and follow the steps shown in the picture below. Open signingReport and there is your SHA-256 fingerprint.
Copy the fingerprint and paste it into the project settings in AppGallery Connect.
Integrate HMS to your project
Download agconnect-services.json file and place it under project_name > android > app.
Add Signing Configuration
Create a file named key.properties under the android folder and add your signing configs here. Note that this is a plain properties file, so the entries are key=value pairs:
storeFile=<keystore_file>.jks
storePassword=<keystore_password>
keyAlias=<key_alias>
keyPassword=<key_password>
Load your key.properties file by adding the code below before the android block in your app-level build.gradle file.
Code:
def keystoreProperties = new Properties()
def keystorePropertiesFile = rootProject.file('key.properties')
if (keystorePropertiesFile.exists()) {
    keystoreProperties.load(new FileInputStream(keystorePropertiesFile))
}
TO-DOs in project-level build.gradle
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' } // add this line
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.5.0'
        classpath 'com.huawei.agconnect:agcp:1.1.1.300' // add this line
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' } // add this line
    }
}
TO-DOs in app-level build.gradle
Code:
defaultConfig {
    ...
    minSdkVersion 19 // Increase this to 19
}

// Add these lines
signingConfigs {
    release {
        keyAlias keystoreProperties['keyAlias']
        keyPassword keystoreProperties['keyPassword']
        storeFile keystoreProperties['storeFile'] ? file(keystoreProperties['storeFile']) : null
        storePassword keystoreProperties['storePassword']
    }
}

// Edit buildTypes
buildTypes {
    debug {
        signingConfig signingConfigs.release
    }
    release {
        signingConfig signingConfigs.release
    }
}

// Add dependencies
dependencies {
    implementation 'com.huawei.agconnect:agconnect-core:1.0.0.300'
    implementation 'com.huawei.hms:hianalytics:5.0.1.300'
}

apply plugin: 'com.huawei.agconnect' // Add this line to the bottom of the page
Add Analytics Kit to your project
There are 2 ways to do this step.
1) Go to the developer website and download the plugin. In Android Studio create a new folder in your root directory and name it "hms". Unzip the plugin and paste it into the "hms" folder.
Then, go to pubspec.yaml and add the plugin under dependencies.
2) This way is much easier and also more familiar to Flutter developers: on pub.dev, copy the plugin dependency and add it under dependencies as usual.
Either way, after running the pub get command, the plugin is ready to use!
For more information about HMS Core integration, click here.
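For reference, the dependency entry in pubspec.yaml ends up looking something like this (the version is a placeholder; use the latest one from pub.dev, or a path reference for the downloaded plugin):
Code:
dependencies:
  huawei_analytics: <latest_version>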
We are all done. Let’s begin coding.
I will make a tiny and very easy game whose concept I believe most of you know: Guess the Number!
As you play the game and try to guess the number, Huawei Analytics Kit will collect statistics on how many times you guessed.
Make a simple game with Flutter
First, let’s write the method to create a random number. You should import ‘dart:math’ for this.
Code:
_setRandomNumber() {
  Random random = Random();
  int number = random.nextInt(100); // from 0 to 99 included
  return number;
}
And call it in initState
Code:
@override
void initState() {
  randomNumber = _setRandomNumber();
  super.initState();
}
We will need a TextField and a button to check the user's guess.
Code:
Column(
  mainAxisAlignment: MainAxisAlignment.center,
  crossAxisAlignment: CrossAxisAlignment.stretch,
  children: <Widget>[
    TextField(
      controller: _controller,
      decoration: InputDecoration(
        hintText: "Enter Your Guess [0-99]",
        border: new OutlineInputBorder(borderSide: BorderSide()),
      ),
      keyboardType: TextInputType.number,
      inputFormatters: <TextInputFormatter>[
        WhitelistingTextInputFormatter.digitsOnly
      ],
      onChanged: (value) {
        guess = int.parse(value);
      },
      enabled: _isFound ? false : true, // If the user guesses the number right, the TextField will be disabled
    ),
    RaisedButton(
      child: Text("OK!"),
      onPressed: () {
        if (!_isFound) {
          _controller.text = "";
          _count++;
          _compareValues();
        }
      },
    ),
  ],
)
We need a method to check whether the user guessed the number right or not.
Code:
_compareValues() {
  if (guess == randomNumber) {
    setState(() {
      _isFound = true;
      _message =
          "Correct! The number was $randomNumber.\nYou guessed it in $_count tries.";
    });
  } else if (guess > randomNumber) {
    setState(() {
      _message = "Lower!";
    });
  } else {
    setState(() {
      _message = "Higher!";
    });
  }
}
Let’s add a message Text in Column widget to give hints to user, also a replay button.
Code:
Column(
  ...
  Text(
    _message,
    textAlign: TextAlign.center,
    style: TextStyle(fontSize: 24),
  ),
  _isFound // If the user guesses the number right, the IconButton will appear; otherwise it won't
      ? IconButton(
          icon: Icon(
            Icons.refresh,
            size: 40,
          ),
          onPressed: () {
            setState(() {
              // Reset all variables and set a new random number.
              randomNumber = _setRandomNumber();
              _isFound = false;
              _count = 0;
              _message = "";
            });
          },
        )
      : Text("")
  ],
),
We have made a simple but fun game. Let's play it!
Define HMS Analytics Kit and send events
As we’re done with the widgets, we will define the kit and enable logs.
Code:
class _MyHomePageState extends State<MyHomePage> {
  final HMSAnalytics hmsAnalytics = new HMSAnalytics();

  Future<void> _enableLog() async {
    await hmsAnalytics.enableLog();
  }
  ...
  @override
  void initState() {
    _enableLog();
    randomNumber = _setRandomNumber();
    super.initState();
  }
}
Once we call _enableLog(), we are ready to see auto-collected events on AppGallery Connect.
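Besides enableLog(), the plugin exposes a few other useful calls. The sketch below shows two of them; the method names are based on the huawei_analytics plugin docs and the values are made-up examples, so verify them against the plugin version you use.
Code:
// Optional extras, sketched for illustration:
Future<void> _setUpAnalytics() async {
  await hmsAnalytics.setUserId('demo_user_1'); // hypothetical id for cross-session analysis
  await hmsAnalytics.setUserProfile('favorite_game', 'guess_the_number'); // custom user attribute
}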
What about our custom events? How can we send custom events and see them?
We have a _count variable, and every time the user clicks the OK! button it increases. Now we will map it and send it as a custom event. We need a name for the custom event, and a map value.
Code:
Future<void> _sendEvent(int count) async {
  String name = "USERS_RESULTS";
  Map<String, String> value = {
    'number_of_guesses': count.toString()
  };
  await hmsAnalytics.onEvent(name, value);
}
And we call it when we are sure that the user guessed the number right, in the _compareValues method.
Code:
_compareValues() {
  if (guess == randomNumber) {
    ...
    _sendEvent(_count); // We know that the user guessed the number right.
  } else if (guess > randomNumber) {
    ...
  } else {
    ...
  }
}
Let’s go back to AppGallery Connect. In the left panel, under Management section click Events.
After _sendEvent builds for the first time, you can see your custom event with the name you have entered in your code. Click Edit.
Add your attribute and click Save.
On the left panel, click Real Time Monitoring under Overview.
Now you can see the attribute and its value in your custom event. You can also see how many times you received this value and its proportion of all values.
Let’s play our game a few times more.
Although I am the only user, you see 2 users in AppGallery Connect. That's because I uninstalled the app and installed it again; now I have a different AAID, as I mentioned in the first part.
Under the graphs there is the event analysis. Here you can see all events, all attributes you've added, and statistics for both events and attributes. 11 of them are custom events that I sent by playing the game, and the rest were collected automatically.
You can find the full code on my GitHub page. Here is the link for you:
ozkulbeng/FlutterHMSAnalyticsKitTutorial
Conclusion
In this article you have learned how to integrate HMS Analytics into your Flutter projects, send custom events, and monitor them in AppGallery Connect. You can use custom events in your apps to observe user behavior, so that you can improve your app based on it.
Thank you for reading this article, I hope you enjoyed it.
References
Analytics Kit Document
HMS-Core/hms-analytics-demo-android
sujith.e said:
Huawei Analytics will track fragment reports
Quite right. It really helps a lot

React Native HMS ML Kit | Installation and Example

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
This article covers how to integrate the React Native HMS ML Kit into a React Native application.
React Native HMS ML Kit supports the services listed below:
· Text Related Services
· Language Related Services
· Image Related Services
· Face/Body Related Services
There are a number of use cases for these services; you can combine them or use them on their own to create different functionalities in your app. For a basic understanding, please read the use cases here.
Github: https://github.com/HMS-Core/hms-react-native-plugin/tree/master/react-native-hms-ml
Prerequisites
Step 1
Prepare your development environment using this guide.
After reading this guide you should have a React Native development environment set up, HMS Core (APK) installed and the Android SDK installed.
Step 2
Configure your app information in App Gallery by following this guide.
After reading this guide you should have a Huawei developer account, an AppGallery app, a keystore file and the ML Kit service enabled in AppGallery.
Integrating React Native Hms ML Kit
Warning: please make sure that the prerequisites part is successfully completed.
Step 1
Code:
npm i @hmscore/react-native-hms-ml
Step 2
Open build.gradle file in project-dir > android folder.
Go to buildscript > repositories and allprojects > repositories, and configure the Maven repository address.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
Go to buildscript > dependencies and add dependency configurations.
Code:
buildscript {
    dependencies {
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
    }
}
Step 3
Open build.gradle file which is located under project.dir > android > app directory.
Add the AppGallery Connect plug-in dependency to the file header.
Code:
apply plugin: 'com.huawei.agconnect'
The apply plugin: ‘com.huawei.agconnect’ configuration must be added after the apply plugin: ‘com.android.application’ configuration.
The minimum Android API level (minSdkVersion) required for ML Kit is 19.
Configure build dependencies of your project.
Code:
dependencies {
    ...
    implementation 'com.huawei.agconnect:agconnect-core:1.0.0.301'
}
Now you can use React Native HMS ML and import modules as in the code below.
Code:
import {<module_name>} from '@hmscore/react-native-hms-ml';
Lets Create An Application
We have already created an application in prerequisites section.
Our app will be about recognizing text in images and converting it to speech, so we will use the HmsTextRecognitionLocal, HmsFrame and HmsTextToSpeech modules. We will select images using react-native-image-picker, so don't forget to install it.
Note that before running this code snippet you should check your app permissions.
Step 1
We need to add some settings to AndroidManifest.xml before using react-native-image-picker.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />

<application
    ...
    android:requestLegacyExternalStorage="true">
Step 2
Code:
import React from 'react';
import { Text, View, ScrollView, TextInput, TouchableOpacity, StyleSheet } from 'react-native';
import { HmsTextRecognitionLocal, HmsFrame, HmsTextToSpeech, NativeEventEmitter } from '@hmscore/react-native-hms-ml';
import ImagePicker from 'react-native-image-picker';

const options = {
  title: 'Choose Method',
  storageOptions: {
    skipBackup: true,
    path: 'images',
  },
};

const styles = StyleSheet.create({
  bg: { backgroundColor: '#eee' },
  customEditBox: {
    height: 450,
    borderColor: 'gray',
    borderWidth: 2,
    width: "95%",
    alignSelf: "center",
    marginTop: 10,
    backgroundColor: "#fff",
    color: "#000"
  },
  buttonTts: {
    width: '95%',
    height: 70,
    alignSelf: "center",
    marginTop: 35,
  },
  startButton: {
    paddingTop: 10,
    paddingBottom: 10,
    backgroundColor: 'white',
    borderRadius: 10,
    borderWidth: 1,
    borderColor: '#888',
    backgroundColor: '#42aaf5',
  },
  startButtonLabel: {
    fontWeight: 'bold',
    color: '#fff',
    textAlign: 'center',
    paddingLeft: 10,
    paddingRight: 10,
  },
});

export default class App extends React.Component {
  // create your states
  // for keeping imageUri and recognition result
  constructor(props) {
    super(props);
    this.state = {
      imageUri: '',
      result: '',
    };
  }

  // this is a key function in ML Kit
  // It sets the frame for you and keeps it until you set a new one
  async setMLFrame() {
    try {
      var result = await HmsFrame.fromBitmap(this.state.imageUri);
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // this creates text recognition settings with the default options given below
  // languageCode : default is "rm"
  // OCRMode : default is OCR_DETECT_MODE
  async createTextSettings() {
    try {
      var result = await HmsTextRecognitionLocal.create({});
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // this function calls the analyze function and sets the results to state
  // The parameter false means we don't want a block result
  // If you want to see results as blocks you can set it to true
  async analyze() {
    try {
      var result = await HmsTextRecognitionLocal.analyze(false);
      this.setState({ result: result });
    } catch (e) {
      console.error(e);
    }
  }

  // this function calls the close function to stop the recognizer
  async close() {
    try {
      var result = await HmsTextRecognitionLocal.close();
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // standard image picker operation
  // sets imageUri to state
  // calls startAnalyze function
  showImagePicker() {
    ImagePicker.showImagePicker(options, (response) => {
      if (response.didCancel) {
        console.log('User cancelled image picker');
      } else if (response.error) {
        console.log('ImagePicker Error: ', response.error);
      } else {
        this.setState({
          imageUri: response.uri,
        });
        this.startAnalyze();
      }
    });
  }

  // configure the TTS engine with custom parameters
  async configuration() {
    try {
      var result = await HmsTextToSpeech.configure({
        "volume": 1.0,
        "speed": 1.0,
        "language": HmsTextToSpeech.TTS_EN_US,
        "person": HmsTextToSpeech.TTS_SPEAKER_FEMALE_EN
      });
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // create the TTS engine
  async engineCreation() {
    try {
      var result = await HmsTextToSpeech.createEngine();
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // set the TTS callback
  async callback() {
    try {
      var result = await HmsTextToSpeech.setTtsCallback();
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // start speech
  async speak(word) {
    try {
      var result = await HmsTextToSpeech.speak(word, HmsTextToSpeech.QUEUE_FLUSH);
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // stop engine
  async stop() {
    try {
      var result = await HmsTextToSpeech.stop();
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }

  // manage the functions in order
  startAnalyze() {
    this.setState({
      result: 'processing...',
    }, () => {
      this.createTextSettings()
        .then(() => this.setMLFrame())
        .then(() => this.analyze())
        .then(() => this.close())
        .then(() => this.configuration())
        .then(() => this.engineCreation())
        .then(() => this.callback())
        .then(() => this.speak(this.state.result));
    });
  }

  render() {
    return (
      <ScrollView style={styles.bg}>
        <TextInput
          style={styles.customEditBox}
          value={this.state.result}
          placeholder="Text Recognition Result"
          multiline={true}
          editable={false}
        />
        <View style={styles.buttonTts}>
          <TouchableOpacity
            style={styles.startButton}
            onPress={this.showImagePicker.bind(this)}
            underlayColor="#fff">
            <Text style={styles.startButtonLabel}> Start Analyze </Text>
          </TouchableOpacity>
        </View>
      </ScrollView>
    );
  }
}
Test the App
· First, write "Hello World" on a blank paper.
· Then run the application.
· Press the "Start Analyze" button and take a photo of your paper.
· Wait for the result.
· Here it comes: you will see "Hello World" on the screen and you will hear it from your phone.
Conclusion
In this article, we integrated and used the React Native HMS ML Kit in an application.
From: https://medium.com/huawei-developers/react-native-hms-ml-kit-installation-and-example-242dc83e0941
Is there any advantage to Huawei ML Kit compared to others?

Deep Linking on Flutter using Huawei Push Kit’s Custom Intents

In this article I will show the basics of deep linking in Flutter, using the Huawei Push Kit plugin along with the uni_links package.
Here are the links for those packages:
https://pub.dev/packages/huawei_push
https://pub.dev/packages/uni_links
Deep Linking
The most basic definition of a deep link is: "a link that sends users to related content in an application".
Okay, but why is this important?
For improving the user experience (UX), of course. By utilizing custom uniform resource identifiers (URIs), developers can create funnels in their apps that land users on specific content and make the user experience better.
We can validate this with an example: an e-commerce application on your phone has sent you a notification that there will be a discount on stickers. Would you prefer going to the stickers page by tapping the notification, or rather navigating by yourself through enormous menus like Home > Products > Handmade Products > Stationery & Party Supplies > Stationery Stickers? I am assuming you would choose the first approach.
{
"lightbox_close": "Close",
"lightbox_next": "Next",
"lightbox_previous": "Previous",
"lightbox_error": "The requested content cannot be loaded. Please try again later.",
"lightbox_start_slideshow": "Start slideshow",
"lightbox_stop_slideshow": "Stop slideshow",
"lightbox_full_screen": "Full screen",
"lightbox_thumbnails": "Thumbnails",
"lightbox_download": "Download",
"lightbox_share": "Share",
"lightbox_zoom": "Zoom",
"lightbox_new_window": "New window",
"lightbox_toggle_sidebar": "Toggle sidebar"
}
Huawei Push Kit allows developers to send push notifications that can include custom intents or actions. This is well suited to our case, so let's get started for the sake of our users' experience.
Before we begin, there are prerequisites that need to be completed.
Huawei developer account: you must have this account to use the Push Kit service. Click here to sign up if you don't have an account already.
HMS Core SDK setup: to use Push Kit we have to make some configurations on the Huawei Developer Console and in our application. Refer to this Medium post for installation, and if you have any trouble doing so you can also check this post for a more in-depth setup.
The project
The project will be a very simple app that displays information about Huawei Mobile Services. Here is the project's GitHub link if you want to follow along from there.
Project Setup
As I've mentioned before, we will use the uni_links and Huawei Push Kit plugins in our project. We will also add flutter_webview_plugin into the mix for displaying the service's website content. So let's start by adding these to our pubspec.yaml file:
Code:
dependencies:
  flutter:
    sdk: flutter
  huawei_push: 4.0.4+300
  uni_links: 0.4.0
  flutter_webview_plugin: 0.3.11
To listen for intents, the uni_links package needs a configuration in the AndroidManifest.xml file. Add an intent filter inside the <activity> tag as below.
Code:
<application>
    <!-- . . .
    Other Configurations
    . . . -->
    <activity>
        <!-- . . .
        Other Configurations
        . . . -->
        <!-- Add the intent filter below (inside the application and activity tags) -->
        <intent-filter>
            <action android:name="android.intent.action.VIEW" />
            <category android:name="android.intent.category.DEFAULT" />
            <category android:name="android.intent.category.BROWSABLE" />
            <data android:scheme="app" />
        </intent-filter>
    </activity>
</application>
Here we have used a scheme called "app". You can also use your own custom scheme; for more information on this subject, refer to this document.
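For instance, a hypothetical filter for links like myapp://open/... would look like this (the scheme and host below are made up for illustration):
Code:
<intent-filter>
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
    <data android:scheme="myapp" android:host="open" />
</intent-filter>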
Now that we are done with the installation, let's get started with coding. The project is very simple; you can check the file hierarchy below.
We have two app pages. The home page will show various Huawei Mobile Services, and the content page will display the service information inside a webview. We will navigate to this page when a notification with a custom intent is tapped.
Main.dart
Code:
import 'package:deep_linking_demo/router.dart';
import 'package:flutter/material.dart';

void main() {
  runApp(MyApp());
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Deep Linking Demo',
      theme: ThemeData(
        primarySwatch: Colors.red,
        visualDensity: VisualDensity.adaptivePlatformDensity,
      ),
      onGenerateRoute: Router.generateRoute,
      initialRoute: '/',
    );
  }
}
This main.dart file is almost identical to what you get by default, except for the named route generation. Here, I have defined the onGenerateRoute and initialRoute properties of the MaterialApp and passed the Router's generateRoute method. Let's look at the router.dart file to see what's going on.
Router.dart
I used named routes because they contain less boilerplate and are easier to use with custom intents. To use named routes in Flutter, we set names for our routes and return the corresponding MaterialPageRoute based on the name. Here, I am also getting the arguments needed for the content page and passing them to its widget. (We supply these arguments when we call the navigator.)
Code:
import 'package:deep_linking_demo/screens/content_page.dart';
import 'package:deep_linking_demo/screens/home_page.dart';
import 'package:flutter/material.dart';

class Router {
  static const String HomePageRoute = '/';
  static const String ContentPageRoute = '/ContentPage';

  static Route<dynamic> generateRoute(RouteSettings settings) {
    switch (settings.name) {
      case HomePageRoute:
        return MaterialPageRoute(builder: (context) => HomePage());
      case ContentPageRoute:
        final ContentPageArguments args = settings.arguments;
        return MaterialPageRoute(
          builder: (context) => ContentPage(
            serviceName: args.serviceName,
            serviceUrl: args.serviceUrl,
          ),
        );
      default:
        // Error: the named route doesn't exist
        return MaterialPageRoute(builder: (context) => HomePage());
    }
  }
}
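With the router in place, navigating to the content page from anywhere in the app is a single call. This is the same call our home page's serviceButton makes later, shown here for reference:
Code:
Navigator.of(context).pushNamed(
  Router.ContentPageRoute,
  arguments: ContentPageArguments(
      'Push Kit', 'https://developer.huawei.com/consumer/en/hms/huawei-pushkit'),
);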
Hms.dart
This class's sole purpose is to hold the names and URLs of the HMS services that will be displayed on our home page.
Code:
class HMS {
  final String name;
  final String url;
  final HMSGroup hmsGroup;

  const HMS(this.name, this.url, this.hmsGroup);

  static const Map<HMSGroup, List<HMS>> HMSMap = {
    HMSGroup.AI: _aiServicesList,
    HMSGroup.SECURITY: _securityServicesList,
    // ...Rest of the mappings
  };

  static const List<HMS> _aiServicesList = [
    HMS('ML Kit', 'https://developer.huawei.com/consumer/en/hms/huawei-mlkit', HMSGroup.AI),
    HMS('HUAWEI HiAI Foundation', 'https://developer.huawei.com/consumer/en/hiai#Foundation', HMSGroup.AI),
    HMS('HUAWEI HiAI Engine', 'https://developer.huawei.com/consumer/en/hiai#Engine', HMSGroup.AI),
    HMS('HUAWEI HiAI Service', 'https://developer.huawei.com/consumer/en/hiai#Service', HMSGroup.AI)
  ];

  static const List<HMS> _securityServicesList = [
    HMS('FIDO', 'https://developer.huawei.com/consumer/en/hms/huawei-fido', HMSGroup.SECURITY),
    HMS('Safety Detect', 'https://developer.huawei.com/consumer/en/hms/huawei-safetydetectkit/', HMSGroup.SECURITY)
  ];

  // ...Rest of the list definitions
}

enum HMSGroup {
  APP_SERVICES,
  MEDIA,
  GRAPHICS,
  AI,
  SMART_DEVICE,
  SECURITY,
  SYSTEM
}

extension HMSGroupExtension on HMSGroup {
  String get text {
    switch (this) {
      case HMSGroup.APP_SERVICES:
        return "App Services";
      case HMSGroup.MEDIA:
        return "Media";
      case HMSGroup.GRAPHICS:
        return "Graphics";
      case HMSGroup.AI:
        return "AI";
      case HMSGroup.SMART_DEVICE:
        return "Smart Device";
      case HMSGroup.SECURITY:
        return "Security";
      case HMSGroup.SYSTEM:
        return "System";
      default:
        return "Other";
    }
  }
}
I've omitted some of the definitions to not bother you with details; check the GitHub repo for the full code.
Home_page.dart
This is the widget where the most important functions occur, so I will split it into parts and go into some detail for a better explanation.
You can refer to this part for creating your own custom intent navigation for the purpose of deep linking.
Obtaining a push token
For sending push notifications we need to obtain a push token for our device.
Under the state of the widget, define an EventChannel that will listen for the push token and a string variable to hold the token.
Code:
class _HomePageState extends State<HomePage> {
  String _token = '';
  static const EventChannel TokenEventChannel =
      EventChannel(Channel.TOKEN_CHANNEL);
  . . .
}
Initialize functions below for obtaining the push token.
Code:
Future<void> initPlatformState() async {
  if (!mounted) return;
  TokenEventChannel.receiveBroadcastStream()
      .listen(_onTokenEvent, onError: _onTokenError);
  await Push.getToken();
}

void _onTokenEvent(Object event) {
  // This function gets called when we receive the token successfully
  setState(() {
    _token = event;
  });
  print('Push Token: ' + _token);
}

void _onTokenError(Object error) {
  setState(() {
    _token = error;
  });
  print(_token);
}
Call the initPlatformState function in the widget's initState method. You should now see your token printed on the debug console.
Code:
@override
void initState() {
  super.initState();
  initPlatformState();
}
Obtaining a push token is the most crucial part of using Push Kit. If you run into any errors while obtaining the token, here is a checklist that could help:
1. Check your SHA-256 signature and package name on the Huawei Developer Console
2. Make sure the Push Kit is enabled on the console (Manage APIs tab)
3. Whenever you change something on the console, download the agconnect-services.json file again.
Deep linking
We will need two functions to listen for the custom intents: one for when our app is in the foreground (active), and the other for when the app is not active and is opened by an intent.
Code:
Future<Null> initLinkStream() async {
  if (!mounted) return;
  _sub = getLinksStream().listen((String link) {
    print(link);
    var uri = Uri.dataFromString(link);
    String page = uri.path.split('://')[1];
    String serviceName = uri.queryParameters['name'];
    String serviceUrl = uri.queryParameters['url'];
    Navigator.of(context).pushNamed(
      page,
      arguments: ContentPageArguments(serviceName, serviceUrl),
    ); // Navigate to the page from the intent
  }, onError: (err) {
    print("Error while listening for the link stream: " + err.toString());
  });
}

Future<void> initInitialLinks() async {
  // Platform messages may fail, so we use a try/catch PlatformException.
  try {
    String initialLink = await getInitialLink();
    print(initialLink ?? 'NO LINK');
    if (initialLink != null) {
      print(initialLink);
      var uri = Uri.dataFromString(initialLink);
      String page = uri.path.split('://')[1];
      String serviceName = uri.queryParameters['name'];
      String serviceUrl = uri.queryParameters['url'];
      try {
        WidgetsBinding.instance.addPostFrameCallback((timeStamp) {
          Navigator.of(context).pushNamed(
            page,
            arguments: ContentPageArguments(serviceName, serviceUrl),
          ); // Navigate to the page from the intent
        });
      } catch (e) {
        Push.showToast(e);
      }
    }
  } on PlatformException {
    print('Error: Platform Exception');
  }
}
Call these functions in the widget's initState:
Code:
@override
void initState() {
  super.initState();
  initPlatformState();
  initInitialLinks();
  initLinkStream();
}
Our custom intent inside the push notification looks like this:
Code:
app:///ContentPage?name=Push Kit&url=https://developer.huawei.com/consumer/en/hms/huawei-pushkit
By adding query parameters, we can utilize the Uri class's queryParameters getter to easily obtain the values we need, without worrying about string parsing.
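As a quick illustration of that parsing (Uri.parse is used here for clarity; the app code above uses Uri.dataFromString), the intent URI breaks down like this:
Code:
// Spaces are percent-encoded in a real link ('Push%20Kit').
final uri = Uri.parse(
    'app:///ContentPage?name=Push%20Kit&url=https://developer.huawei.com/consumer/en/hms/huawei-pushkit');
print(uri.path); // /ContentPage
print(uri.queryParameters['name']); // Push Kit
print(uri.queryParameters['url']); // https://developer.huawei.com/consumer/en/hms/huawei-pushkit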
Now for the final part of home_page.dart here is the UI code below.
Code:
Widget serviceButton(HMS service) {
  return ListTile(
    title: Text(service.name),
    onTap: () => Navigator.of(context).pushNamed(
      '/ContentPage',
      arguments: ContentPageArguments(service.name, service.url),
    ),
  );
}

@override
Widget build(BuildContext context) {
  return Scaffold(
    appBar: AppBar(
      title: Text("Huawei Mobile Services"),
      centerTitle: true,
    ),
    body: Column(
      mainAxisAlignment: MainAxisAlignment.start,
      children: <Widget>[
        Container(
          height: MediaQuery.of(context).size.height * 0.8,
          child: ListView.builder(
            itemCount: HMS.HMSMap.length,
            itemBuilder: (context, idx) => ExpansionTile(
              title: Text(
                '${HMS.HMSMap.keys.elementAt(idx).text}',
                style: TextStyle(fontWeight: FontWeight.w500, fontSize: 25),
              ),
              children: HMS.HMSMap[HMS.HMSMap.keys.elementAt(idx)]
                  .map((e) => serviceButton(e))
                  .toList(),
            ),
          ),
        )
      ],
    ),
  );
}
Content_page.dart
The last file is content_page.dart. This widget is very simple, since its only purpose is to display the related service content inside a webview.
Code:
import 'package:flutter/material.dart';
import 'package:flutter_webview_plugin/flutter_webview_plugin.dart';

class ContentPageArguments {
  final String serviceName;
  final String serviceUrl;

  ContentPageArguments(this.serviceName, this.serviceUrl);
}

class ContentPage extends StatelessWidget {
  final String serviceName;
  final String serviceUrl;

  ContentPage({Key key, @required this.serviceName, @required this.serviceUrl})
      : super(key: key) {
    assert(serviceName != null);
    assert(serviceUrl != null);
  }

  @override
  Widget build(BuildContext context) {
    return WebviewScaffold(
      url: serviceUrl,
      appBar: AppBar(
        title: Text(serviceName),
      ),
      withZoom: true,
      withLocalStorage: true,
      hidden: true,
      initialChild: Container(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            const Center(
              child: Text('Loading.....'),
            ),
            SizedBox(
              height: 10,
            ),
            Center(child: CircularProgressIndicator())
          ],
        ),
      ),
    );
  }
}
Sending Push Notifications with Custom Intent
Now, for the last part, let's head over to the Huawei Developer Console and create a push notification that includes a custom intent. Enter the details as in the image below and press the "Test Effect" button, or submit your push notification from the top right corner.
You can find the custom intent uri entered here on the deep linking section of this article
If you press the "Test Effect" button, the console will prompt you to enter the token you obtained earlier.
Enter the push token obtained earlier in the app
Deep linking while the app is in the foreground.
Deep linking is working as expected. Kudos to you if you got this far!
Conclusion
Push notifications and deep linking are a must for a mobile application nowadays, since when used properly they can boost user retention and experience. Huawei Push Kit's notifications include custom intent and custom action features for deep linking, but they aren't limited to these alone. If you want to check all the features, click here.
I hope this tutorial was helpful for you. If you have any questions regarding this article, feel free to ask them in the comments section.
You can also check our other articles about using Huawei Mobile Services on Flutter below.
https://medium.com/huawei-developers/integrating-huawei-analytics-kit-to-flutter-projects-and-sending-events-3dcc4c4f03f
https://medium.com/huawei-developers/using-huawei-map-kit-on-flutter-applications-f83b2a5668bc
https://medium.com/huawei-developers/sending-push-notifications-on-flutter-with-huawei-push-kit-plugin-534787862b4d
Reference
Github demo project: https://github.com/HMS-Core/hms-flutter-plugin/tree/master/flutter-hms-push

Hand Keypoint Detection with HMS ML Kit Explained (with a Demo Project)

Hand keypoint detection is the process of finding fingertips, knuckles and wrists in an image. Hand keypoint detection and hand gesture recognition are still challenging problems in the computer vision domain. It is really tough work to build your own model for hand keypoint detection, as it is hard to collect a large enough hand dataset and it requires expertise in this domain.
Hand keypoint detection can be used in a variety of scenarios. For example, it can be used during artistic creation: users can convert the detected hand keypoints into a 2D model and synchronize it to a character's model to produce a vivid 2D animation. You can create a puppet animation game using the above idea. Another example may be creating a rock-paper-scissors game. Or, if you take it further, you can create a sign-language-to-text conversion application. As you can see, the possible usage scenarios are abundant and there is no limit to ideas.
Hand keypoint detection is a brand-new feature in the Huawei Machine Learning Kit family. It has recently been released and it is making developers and computer vision geeks really excited! It detects 21 points of a hand and can detect up to ten hands in an image, working on both static images and camera streams. Currently, it does not support scenarios where your hand is blocked by more than 50% or where you wear gloves. You don't need an internet connection, as this is a device-side capability, and what is more: it is completely free!
It wouldn't be a nice practice to only read the related documents and forget about them after a few days. So I created a simple demo application that counts fingers and tells us the number we show by hand. I strongly advise you to develop your own hand keypoint detection application alongside me. I developed the application in Android Studio, in Kotlin. Now I am going to explain how to build this application step by step. Don't hesitate to ask questions in the comments if you face any issues.
1. Firstly, let's create our project in Android Studio. I named my project HandKeyPointDetectionDemo; I am sure you can find better names for your application. We can create our project by selecting the Empty Activity option and then follow the steps described in this post to create and sign our project in AppGallery Connect.
2. In HUAWEI Developer AppGallery Connect, go to Develop > Manage APIs and make sure ML Kit is activated.
3. Now we have integrated Huawei Mobile Services (HMS) into our project. Let's follow the documentation on developer.huawei.com and find the packages to add to our project. On the website click Developer / HMS Core / AI / ML Kit. Here you will find introductory information about the services, references, SDKs to download and more. Under the ML Kit tab follow Android / Getting Started / Integrating HMS Core SDK / Adding Build Dependencies / Integrating the Hand Keypoint Detection SDK. We can follow the guide there to add the hand detection capability to our project. We also have one meta-data tag to add to our AndroidManifest.xml file. After the integration, your app-level build.gradle file will look like this:
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'com.huawei.agconnect'

android {
    compileSdkVersion 30
    buildToolsVersion "30.0.2"

    defaultConfig {
        applicationId "com.demo.handkeypointdetection"
        minSdkVersion 21
        targetSdkVersion 30
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }
    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: "libs", include: ["*.jar"])
    implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
    implementation 'androidx.core:core-ktx:1.3.1'
    implementation 'androidx.appcompat:appcompat:1.2.0'
    implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.2'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
    // AppGallery Connect Core
    implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.2.300'
    // Import the hand keypoint detection model package.
    implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.2.300'
}
Our project-level build.gradle file:
Code:
buildscript {
    ext.kotlin_version = "1.4.0"
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath "com.android.tools.build:gradle:4.0.1"
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}
And don't forget to add the related meta-data tag to your AndroidManifest.xml:
Code:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.demo.handkeypointdetection">

    <uses-permission android:name="android.permission.CAMERA" />

    <application
        ...
        <meta-data
            android:name="com.huawei.hms.ml.DEPENDENCY"
            android:value="handkeypoint" />
    </application>
</manifest>
4. I created a class named HandKeyPointDetector. This class will be called from our activity or fragment. Its init method takes two parameters, a Context and a ViewGroup; we will add our views to this rootLayout. (A reconstructed skeleton of the class's fields follows the code below.)
Code:
fun init(context: Context, rootLayout: ViewGroup) {
    mContext = context
    mRootLayout = rootLayout
    addSurfaceViews()
}
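The article never shows the member declarations of HandKeyPointDetector, so here is how they might look, reconstructed from the way they are used in the following steps. This is a sketch, not the original source; the import paths follow the HMS ML Kit SDK reference, so verify them against your SDK version.
Code:
import android.content.Context
import android.view.SurfaceHolder
import android.view.ViewGroup
import com.huawei.hms.mlsdk.common.LensEngine
import com.huawei.hms.mlsdk.handkeypoint.MLHandKeypointAnalyzer

// Reconstructed skeleton: every field below is inferred from its usage in
// the snippets of this article, not copied from the original demo.
class HandKeyPointDetector {
    private lateinit var mContext: Context
    private lateinit var mRootLayout: ViewGroup
    private lateinit var mSurfaceHolderCamera: SurfaceHolder
    private lateinit var mSurfaceHolderOverlay: SurfaceHolder
    private lateinit var mAnalyzer: MLHandKeypointAnalyzer
    private lateinit var mLensEngine: LensEngine
    private val mHandKeyPointTransactor = HandKeyPointTransactor()

    // init() is shown above; addSurfaceViews(), createAnalyzer(),
    // prepareLensEngine() and stopAnalyzer() follow in the next steps.
}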
5. We are going to detect hand key points in a camera stream, so we create one surfaceView for the camera preview and another surfaceView to draw on. The surfaceView that will be used as the overlay should be transparent. Then we add both views to the rootLayout passed in as a parameter from our activity. Lastly, we add a SurfaceHolder.Callback to the camera surfaceHolder to know when it is ready.
Code:
private fun addSurfaceViews() {
    val surfaceViewCamera = SurfaceView(mContext).also {
        it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
        mSurfaceHolderCamera = it.holder
    }
    val surfaceViewOverlay = SurfaceView(mContext).also {
        it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
        mSurfaceHolderOverlay = it.holder
        mSurfaceHolderOverlay.setFormat(PixelFormat.TRANSPARENT)
        mHandKeyPointTransactor.setOverlay(mSurfaceHolderOverlay)
    }
    mRootLayout.addView(surfaceViewCamera)
    mRootLayout.addView(surfaceViewOverlay)
    mSurfaceHolderCamera.addCallback(surfaceHolderCallback)
}
6. Inside our surfaceHolderCallback we override three methods: surfaceCreated, surfaceChanged and surfaceDestroyed.
Code:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {
    override fun surfaceCreated(holder: SurfaceHolder) {
        createAnalyzer()
    }

    override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
        prepareLensEngine(width, height)
        mLensEngine.run(holder)
    }

    override fun surfaceDestroyed(holder: SurfaceHolder) {
        mLensEngine.release()
    }
}
7. The createAnalyzer method creates an MLHandKeypointAnalyzer with custom settings; you can also use the default settings. The scene type can be the key points, the rectangle around the hand, or TYPE_ALL for both. The maximum number of hand results can be up to MLHandKeypointAnalyzerSetting.MAX_HANDS_NUM, which is currently 10. Since we will count the fingers of 2 hands, I set it to 2.
Code:
private fun createAnalyzer() {
    val settings = MLHandKeypointAnalyzerSetting.Factory()
        .setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
        .setMaxHandResults(2)
        .create()
    mAnalyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(settings)
    mAnalyzer.setTransactor(mHandKeyPointTransactor)
}
8. The LensEngine is responsible for handling the camera frames for us. All we need to do is prepare it with the right dimensions according to the orientation, choose the camera we want to work with, set the fps, and so on.
Code:
private fun prepareLensEngine(width: Int, height: Int) {
    val dimen1: Int
    val dimen2: Int
    if (mContext.resources.configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
        dimen1 = width
        dimen2 = height
    } else {
        dimen1 = height
        dimen2 = width
    }

    mLensEngine = LensEngine.Creator(mContext, mAnalyzer)
        .setLensType(LensEngine.BACK_LENS)
        .applyDisplayDimension(dimen1, dimen2)
        .applyFps(5F)
        .enableAutomaticFocus(true)
        .create()
}
9. When you no longer need the analyzer, stop it and release its resources. A sketch of where to call this from an activity follows the snippet below.
Code:
fun stopAnalyzer() {
    mAnalyzer.stop()
}
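A natural place to drive this lifecycle is an activity: call init once the CAMERA runtime permission is granted, and stopAnalyzer when the activity goes away. The following is a minimal sketch under my own assumptions; the layout id rootLayout, the activity name and the request code are not from the original demo.
Code:
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class MainActivity : AppCompatActivity() {
    private val handKeyPointDetector = HandKeyPointDetector()

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            == PackageManager.PERMISSION_GRANTED) {
            // Permission already granted: start the camera preview and analysis.
            handKeyPointDetector.init(this, findViewById(R.id.rootLayout))
        } else {
            // Otherwise request it and call init() from onRequestPermissionsResult.
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.CAMERA), 1)
        }
    }

    override fun onDestroy() {
        // Stop the analyzer; the LensEngine itself is released in surfaceDestroyed.
        handKeyPointDetector.stopAnalyzer()
        super.onDestroy()
    }
}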
10. As you can see in step 7, we used mHandKeyPointTransactor. It is an instance of a custom class we created named HandKeyPointTransactor, which implements MLAnalyzer.MLTransactor<MLHandKeypoints> and overrides two methods: transactResult and destroy. Detection results arrive in the transactResult method, where we will try to find the number. A reconstructed skeleton of the whole class follows the snippet below.
Code:
override fun transactResult(result: MLAnalyzer.Result<MLHandKeypoints>?) {
    if (result == null)
        return

    val canvas = mOverlay?.lockCanvas() ?: return
    // Clear the canvas.
    canvas.drawColor(0, PorterDuff.Mode.CLEAR)
    // Find the number shown by our hands.
    val numberString = analyzeHandsAndGetNumber(result)
    // Find the middle of the canvas.
    val centerX = canvas.width / 2F
    val centerY = canvas.height / 2F
    // Draw a text that writes the number we found.
    canvas.drawText(numberString, centerX, centerY, Paint().also {
        it.style = Paint.Style.FILL
        it.textSize = 100F
        it.color = Color.GREEN
    })
    mOverlay?.unlockCanvasAndPost(canvas)
}
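For completeness, here is a skeleton of the whole HandKeyPointTransactor class as I reconstruct it from the snippets in this article; the field name mOverlay is an assumption matched to its usage in transactResult above.
Code:
import android.view.SurfaceHolder
import com.huawei.hms.mlsdk.common.MLAnalyzer
import com.huawei.hms.mlsdk.handkeypoint.MLHandKeypoints

// Reconstructed skeleton of the custom transactor described in this step.
class HandKeyPointTransactor : MLAnalyzer.MLTransactor<MLHandKeypoints> {
    private var mOverlay: SurfaceHolder? = null

    // Called from addSurfaceViews() so we can draw on the overlay surface.
    fun setOverlay(surfaceHolder: SurfaceHolder) {
        mOverlay = surfaceHolder
    }

    override fun transactResult(result: MLAnalyzer.Result<MLHandKeypoints>?) {
        // Shown above: draw the detected number on the overlay.
    }

    override fun destroy() {
        // Drop the overlay reference when the analyzer is destroyed.
        mOverlay = null
    }
}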
11. We will go hand by hand, and then finger by finger, counting the fingers that are up to get the number. A sketch of the custom Hand class follows the snippet below.
Code:
private fun analyzeHandsAndGetNumber(result: MLAnalyzer.Result<MLHandKeypoints>): String {
    var number = 0
    // Each value in analyseList holds the keypoints of one detected hand,
    // so summing over the values counts the fingers of every hand exactly once.
    for (value in result.analyseList.valueIterator()) {
        number += Hand().createHand(value.handKeypoints).getNumber()
    }
    return number.toString()
}
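The custom Hand class is not shown in the article either. Below is one possible, deliberately simplified sketch: it marks a finger as raised when its fingertip keypoint lies above the wrist in image coordinates. The keypoint type constants are taken from the SDK reference (TYPE_WRIST and the *_FOURTH fingertip types); double-check them against your SDK version, and note that robust finger counting would also need to handle thumb direction and hand orientation.
Code:
import com.huawei.hms.mlsdk.handkeypoint.MLHandKeypoint

// A simplified, illustrative Hand implementation (an assumption, not the
// original demo code): a finger counts as "up" when its fingertip keypoint
// is above the wrist keypoint in image coordinates (smaller y = higher).
class Hand {
    private var keypoints: List<MLHandKeypoint> = emptyList()

    fun createHand(handKeypoints: List<MLHandKeypoint>): Hand {
        keypoints = handKeypoints
        return this
    }

    fun getNumber(): Int {
        val wrist = keypoints.find { it.type == MLHandKeypoint.TYPE_WRIST } ?: return 0
        // Fingertip keypoint types of the five fingers.
        val fingertips = setOf(
            MLHandKeypoint.TYPE_THUMB_FOURTH,
            MLHandKeypoint.TYPE_FOREFINGER_FOURTH,
            MLHandKeypoint.TYPE_MIDDLE_FINGER_FOURTH,
            MLHandKeypoint.TYPE_RING_FINGER_FOURTH,
            MLHandKeypoint.TYPE_LITTLE_FINGER_FOURTH
        )
        return keypoints.count { it.type in fingertips && it.pointY < wrist.pointY }
    }
}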
For more information, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202369245767250343&fid=0101187876626530001

Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-1

Introduction
In this article, we will learn how to use the Huawei Cloud Functions service as a chatbot service in a Flutter ChatBotApp. Cloud Functions enables serverless computing. It provides Function as a Service (FaaS) capabilities that simplify app development and O&M by splitting service logic into functions, and it offers the Cloud Functions SDK, which works with Cloud DB and Cloud Storage so that your app functions can be implemented more easily. Cloud Functions automatically scales functions in or out based on actual traffic, freeing you from server resource management and helping you reduce costs.
How the Service Works
To use Cloud Functions, you develop cloud functions that implement certain service functions in AppGallery Connect and add triggers for them, for example, HTTP triggers for HTTP requests, or Cloud DB triggers for data deletion or insertion requests once Cloud DB is integrated. When your app, which integrates the Cloud Functions SDK, meets the conditions of a specific function trigger, it can call the cloud functions, which greatly facilitates service function building.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
An Android phone (with a USB cable), used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level Gradle dependencies in Android > app > build.gradle inside the project.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level Gradle dependencies:
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
Step 3: Add the following permission to the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add the downloaded plugin files to the parent directory of the project, and declare the plugin paths in the pubspec.yaml file under dependencies. Also add the path location for the asset images, as in the sketch below.
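The exact plugin names and paths depend on the plugins you downloaded. Assuming the huawei_account, huawei_analytics and fluttertoast packages used in main.dart below, the pubspec.yaml entries could look like this sketch:
Code:
dependencies:
  flutter:
    sdk: flutter
  # Paths are assumptions; point them to wherever you placed the plugins.
  huawei_account:
    path: ../huawei_account/
  huawei_analytics:
    path: ../huawei_analytics/
  fluttertoast: ^8.0.8

flutter:
  assets:
    - assets/images/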
Let's start coding
main.dart
Code:
import 'package:flutter/material.dart';
import 'package:fluttertoast/fluttertoast.dart';
import 'package:huawei_account/huawei_account.dart';
import 'package:huawei_analytics/huawei_analytics.dart';
void main() {
  runApp(const MyApp());
}

class MyApp extends StatelessWidget {
  const MyApp({Key? key}) : super(key: key);

  // This widget is the root of your application.
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'ChatBotService',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: const MyHomePage(title: 'ChatBotService'),
    );
  }
}

class MyHomePage extends StatefulWidget {
  const MyHomePage({Key? key, required this.title}) : super(key: key);

  final String title;

  @override
  State<MyHomePage> createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  bool isLoggedIn = false;
  String str = 'Login required';
  final HMSAnalytics _hmsAnalytics = HMSAnalytics();
  List<String> gridItems = ['Email Service', 'Call Center', 'FAQ', 'Chat Now'];

  @override
  void initState() {
    _enableLog();
    super.initState();
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: <Widget>[
            Visibility(
              visible: true,
              child: Card(
                child: Padding(
                  padding: const EdgeInsets.all(20),
                  child: Text(
                    str,
                    style: const TextStyle(color: Colors.teal, fontSize: 22),
                  ),
                ),
              ),
            ),
          ],
        ),
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: () {
          if (!isLoggedIn) {
            setState(() {
              isLoggedIn = true;
              signInWithHuaweiID();
            });
            print('$isLoggedIn');
          } else {
            setState(() {
              isLoggedIn = false;
              signOutWithID();
            });
            print('$isLoggedIn');
          }
        },
        tooltip: 'Login/Logout',
        child: isLoggedIn ? const Icon(Icons.logout) : const Icon(Icons.login),
      ), // This trailing comma makes auto-formatting nicer for build methods.
    );
  }

  void signInWithHuaweiID() async {
    try {
      // The sign-in is successful, and the user's ID information and
      // authorization code are obtained.
      Future<AuthAccount> account = AccountAuthService.signIn();
      account.then(
        (value) => setLoginSuccess(value),
      );
    } on Exception catch (e) {
      print(e.toString());
    }
  }

  Future<void> _enableLog() async {
    _hmsAnalytics.setUserId("ChatBotServiceApp");
    await _hmsAnalytics.enableLog();
  }

  void setLoginSuccess(AuthAccount value) {
    setState(() {
      str = 'Welcome ' + value.displayName.toString();
    });
    showToast(value.displayName.toString());
    print('Login Success');
  }

  Future<void> signOutWithID() async {
    try {
      final bool result = await AccountAuthService.signOut();
      if (result) {
        setState(() {
          str = 'Login required';
          showToast('You are logged out.');
        });
      }
    } on Exception catch (e) {
      print(e.toString());
    }
  }

  Future<void> showToast(String name) async {
    Fluttertoast.showToast(
        msg: name,
        toastLength: Toast.LENGTH_SHORT,
        gravity: ToastGravity.CENTER,
        timeInSecForIosWeb: 1,
        backgroundColor: Colors.lightBlue,
        textColor: Colors.white,
        fontSize: 16.0);
  }
}
Result
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the dependencies are added in the pubspec.yaml file.
Run flutter pub get after adding dependencies.
Make sure that the required services are enabled in AppGallery Connect.
Make sure the images are defined in the pubspec.yaml file.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit and Analytics Kit in a Flutter ChatBotApp. Once Account Kit is integrated, users can quickly and conveniently sign in to apps with their Huawei IDs after granting initial access permission. In part 2 we will build the actual Cloud Functions chatbot service.
Thank you so much for reading. I hope this article helps you understand the integration of Huawei Account Kit and Analytics Kit in a Flutter ChatBotApp.
Reference
Cloud Functions