There are so many online games these days that are addictive, easy to play, and suitable for gamers of all ages. So, have you ever dreamed of creating a hit game of your own? If this is something you are interested in, HUAWEI ML Kit's face detection and hand keypoint detection capabilities can help you.
Crazy Rockets is a game that integrates both of these capabilities. It offers two play modes: players can control their rockets with either hand movements or facial movements, and both modes work by detecting those motions in real time.
Crazy Shopping Cart integrates ML Kit's hand keypoint detection capability. Players control a shopping cart by moving their hands, and try to collect as many items as they can. The cart speeds up every 15 seconds.
Crazy Rockets - Development Practice
1. Face Detection
1.1 Configure the Maven repository.
Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > dependencies and add the AppGallery Connect plug-in configurations.
Code:
buildscript {
dependencies {
...
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
1.2 Integrate the SDK.
Code:
implementation 'com.huawei.hms:ml-computer-vision-face:2.0.1.300'
1.3 Create a face analyzer.
Code:
MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer();
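If you need finer control over detection, the analyzer can also be created from a setting object instead of the default one. A minimal sketch; the option setters are the commonly documented ones and the values shown are illustrative, not required by the games:
Code:
MLFaceAnalyzerSetting setting = new MLFaceAnalyzerSetting.Factory()
        .setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES)   // Also detect facial features.
        .setKeyPointType(MLFaceAnalyzerSetting.TYPE_KEYPOINTS)  // Return face key points.
        .create();
MLFaceAnalyzer analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting);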
1.4 Create a processing class.
Code:
public class FaceAnalyzerTransactor implements MLAnalyzer.MLTransactor<MLFace> {
@Override
public void transactResult(MLAnalyzer.Result<MLFace> results) {
SparseArray<MLFace> items = results.getAnalyseList();
// Process detection results as required. Note that only the detection results are processed.
// Other detection-related APIs provided by ML Kit cannot be called.
}
@Override
public void destroy() {
// Callback method used to release resources when the detection ends.
}
}
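Note that the face sample above does not show the binding step. Mirroring the hand keypoint flow described later in this article, the processing class typically has to be set on the analyzer before the camera stream is started:
Code:
analyzer.setTransactor(new FaceAnalyzerTransactor());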
1.5 Create an instance of LensEngine to capture dynamic camera streams, and pass them to the analyzer.
Code:
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(1440, 1080)
.applyFps(30.0f)
.enableAutomaticFocus(true)
.create();
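Crazy Rockets controls the rocket with the player's face, so a game like this would typically capture the front camera instead of the back one. A small variation of the snippet above; the only change is the lens type:
Code:
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
        .setLensType(LensEngine.FRONT_LENS)
        .applyDisplayDimension(1440, 1080)
        .applyFps(30.0f)
        .enableAutomaticFocus(true)
        .create();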
1.6 Call the run method to start the camera, and read camera streams for detection.
Code:
// Implement other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
// Exception handling logic.
}
1.7 Release detection resources.
Code:
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
// Exception handling.
}
}
if (lensEngine != null) {
lensEngine.release();
}
2. Hand Keypoint Detection
2.1 Configure the Maven repository.
Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > dependencies and add the AppGallery Connect plug-in configurations.
Code:
buildscript {
dependencies {
...
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
2.2 Integrate the SDK.
Code:
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'
2.3 Create a default hand keypoint analyzer.
Code:
MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();
2.4 Create a processing class.
Code:
public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
@Override
public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
// Process detection results as required. Note that only the detection results are processed.
// Other detection-related APIs provided by ML Kit cannot be called.
}
@Override
public void destroy() {
// Callback method used to release resources when the detection ends.
}
}
2.5 Set the processing class.
Code:
analyzer.setTransactor(new HandKeypointTransactor());
2.6 Create a LensEngine.
Code:
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(1280, 720)
.applyFps(20.0f)
.enableAutomaticFocus(true)
.create();
2.7 Call the run method to start the camera, and read camera streams for detection.
Code:
// Implement other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
// Exception handling logic.
}
2.8 Release detection resources.
Code:
if (analyzer != null) {
analyzer.stop();
}
if (lensEngine != null) {
lensEngine.release();
}
Crazy Shopping Cart - Development Practice
1. Configure the Maven repository address.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
...
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
2. Perform integration in full SDK mode.
Code:
dependencies {
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.4.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.4.300'
}
Once you've integrated the SDK in either mode, add the following configuration to the file header of the app-level build.gradle file: add apply plugin: 'com.huawei.agconnect' after apply plugin: 'com.android.application'.
3. Create a hand keypoint analyzer.
Code:
MLHandKeypointAnalyzer analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer();
4. Create the detection result processing class HandKeypointTransactor.
Code:
public class HandKeypointTransactor implements MLAnalyzer.MLTransactor<List<MLHandKeypoints>> {
@Override
public void transactResult(MLAnalyzer.Result<List<MLHandKeypoints>> results) {
SparseArray<List<MLHandKeypoints>> analyseList = results.getAnalyseList();
// Process detection results as required. Note that only the detection results are processed.
// Other detection-related APIs provided by ML Kit cannot be called.
}
@Override
public void destroy() {
// Callback method used to release resources when the detection ends.
}
}
5. Set the detection result processor, and bind it to the analyzer.
Code:
analyzer.setTransactor(new HandKeypointTransactor());
6. Create a LensEngine.
Code:
LensEngine lensEngine = new LensEngine.Creator(getApplicationContext(), analyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(1280, 720)
.applyFps(20.0f)
.enableAutomaticFocus(true)
.create();
7. Call the run method to start the camera and read camera streams for detection.
Code:
// Implement other logic of the SurfaceView control by yourself.
SurfaceView mSurfaceView = findViewById(R.id.surface_view);
try {
lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
// Exception handling logic.
}
8. After the detection is complete, stop the analyzer, and release detection resources.
Code:
if (analyzer != null) {
analyzer.stop();
}
if (lensEngine != null) {
lensEngine.release();
}
As you can see, the development process is really quick and simple. As well as these games, ML Kit's face detection and hand keypoint detection capabilities have lots of other useful applications. For example, users can add cute or funny special effects to their videos when using short-video apps with these capabilities integrated. Also, for smart home apps, users can customize the hand gestures they use to remotely control appliances. Try these capabilities out for yourself by integrating HUAWEI ML Kit into your own apps.
Learn More
For more information, please visit HUAWEI Developers.
For detailed instructions, please visit Development Guides.
To join the developer discussion, please go to Reddit.
To download the demo and sample code, please go to GitHub.
To solve integration problems, please go to Stack Overflow.
Related
For more information like this, you can visit the HUAWEI Developer Forum.
Hello everyone,
In this article, I am going to create a Flutter project (actually a tiny game) and explain how to implement Analytics Kit. But first, let me tell you a little about Huawei Analytics Kit.
Huawei Analytics Kit
Huawei Analytics Kit offers you a range of analytics models that help you analyze users' behavior with preset and custom events, and gain insight into your products and content, so that you can improve your marketing and optimize your products.
HUAWEI Analytics Kit identifies users and collects statistics on users by anonymous application identifier (AAID). The AAID is reset in the following scenarios:
1) The user uninstalls and reinstalls the app.
2) The user clears the app data.
After the AAID is reset, the user will be counted as a new user.
HUAWEI Analytics Kit supports event management. A maximum of 25 parameters can be defined for each event, and a maximum of 100 parameters can be defined for each app.
There are 3 types of events: Automatically collected, predefined and custom.
Automatically collected events are collected from the moment you enable the kit in your code. Event IDs are already reserved by HUAWEI Analytics Kit and cannot be reused.
Predefined events include their own Event IDs which are predefined by the HMS Core Analytics SDK based on common application scenarios. The ID of a custom event cannot be the same as a predefined event’s ID. If so, you will create a predefined event instead of a custom event.
Custom events are the events that you can create for your own requirements.
More info about the kit and events.
Configuration in AppGallery Connect
Firstly, you will need a Huawei developer account. If you don’t have one, click here and register. It will be activated in 1–2 days.
After signing in to AppGallery Connect, you can add a new project or select an existing project. In the project you choose, add an app. While adding the app, make sure you enter the package name correctly. It should be the same as your Flutter project's package name.
Also, make sure you set data storage location, enable Analytics kit and add SHA-256 fingerprint to AppGallery Connect.
How to generate SHA-256 Fingerprint?
In Android Studio, right click on android folder under your project and select Flutter > Open Android module in Android Studio.
On the right panel, select Gradle and follow the steps that are shown in the picture below. Open signingReport and there is your SHA-256 fingerprint.
Copy the code and paste it on the project settings in the AppGallery Connect.
Integrate HMS to your project
Download agconnect-services.json file and place it under project_name > android > app.
Add Signing Configuration
Create a file named key.properties under the android folder and add your signing configs there.
Code:
storeFile file('<keystore_file>.jks')
storePassword '<keystore_password>'
keyAlias '<key_alias>'
keyPassword '<key_password>'
Reference your key.properties file by adding the code below, before the android block in your app-level build.gradle file.
Code:
def keystoreProperties = new Properties()
def keystorePropertiesFile = rootProject.file('key.properties')
if (keystorePropertiesFile.exists()) {
keystoreProperties.load(new FileInputStream(keystorePropertiesFile))
}
TO-DOs in project-level build.gradle
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'http://developer.huawei.com/repo/'} //add this line
}
dependencies {
classpath 'com.android.tools.build:gradle:3.5.0'
classpath 'com.huawei.agconnect:agcp:1.1.1.300' //add this line
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'http://developer.huawei.com/repo/'} //add this line
}
}
TO-DOs in app-level build.gradle
Code:
defaultConfig {
...
minSdkVersion 19 //Increase this to 19
}
//Add these lines
signingConfigs {
release {
keyAlias keystoreProperties['keyAlias']
keyPassword keystoreProperties['keyPassword']
storeFile keystoreProperties['storeFile'] ? file(keystoreProperties['storeFile']) : null
storePassword keystoreProperties['storePassword']
}
}
//Edit buildTypes
buildTypes {
debug {
signingConfig signingConfigs.release
}
release {
signingConfig signingConfigs.release
}
}
//Add dependencies
dependencies {
implementation 'com.huawei.agconnect:agconnect-core:1.0.0.300'
implementation 'com.huawei.hms:hianalytics:5.0.1.300'
}
apply plugin: 'com.huawei.agconnect' //Add this line to the bottom of the page
Add Analytics Kit to your project
There are 2 ways to do this step.
1) Go to the developer website and download the plugin. In Android Studio, create a new folder in your root directory and name it "hms". Unzip the plugin and paste it into the "hms" folder.
Then, go to pubspec.yaml and add the plugin under dependencies.
2) This way is much easier and more familiar to Flutter developers. Find the plugin on pub.dev and add it under dependencies as usual.
For both ways, after running pub get command, the plugin is ready to use!
For more information about HMS Core integration, click here.
We are all done. Let’s begin coding.
I will make a tiny and very easy game whose concept I believe most of you know: Guess the number!
As you play the game and try to guess the number, Huawei Analytics Kit will collect statistics on how many times you guessed.
Make a simple game with Flutter
First, let’s write the method to create a random number. You should import ‘dart:math’ for this.
Code:
_setRandomNumber() {
Random random = Random();
int number = random.nextInt(100); // from 0 to 99 included
return number;
}
And call it in initState
Code:
@override
void initState() {
randomNumber = _setRandomNumber();
super.initState();
}
We will need a TextField and a button to check user’s guess.
Code:
Column(
mainAxisAlignment: MainAxisAlignment.center,
crossAxisAlignment: CrossAxisAlignment.stretch,
children: <Widget>[
TextField(
controller: _controller,
decoration: InputDecoration(
hintText: "Enter Your Guess [0-99]",
border: new OutlineInputBorder(borderSide: BorderSide()),
),
keyboardType: TextInputType.number,
inputFormatters: <TextInputFormatter>[
WhitelistingTextInputFormatter.digitsOnly
],
onChanged: (value) {
guess = int.parse(value);
},
enabled: _isFound ? false : true, //If user guesses the number right, textfield will be disabled
),
RaisedButton(
child: Text("OK!"),
onPressed: () {
if (!_isFound) {
_controller.text = "";
_count++;
_compareValues();
}
},
),
],
)
We need a method to check whether the user guessed the number right or not.
Code:
_compareValues() {
if (guess == randomNumber) {
setState(() {
_isFound = true;
_message =
"Correct! The number was $randomNumber.\nYou guessed it in $_count times.";
});
} else if (guess > randomNumber) {
setState(() {
_message = "Lower!";
});
} else {
setState(() {
_message = "Higher!";
});
}
}
Let’s add a message Text in Column widget to give hints to user, also a replay button.
Code:
Column(
...
Text(
_message,
textAlign: TextAlign.center,
style: TextStyle(fontSize: 24),
),
_isFound //If user guesses the number right, iconButton will appear, otherwise it won't
? IconButton(
icon: Icon(
Icons.refresh,
size: 40,
),
onPressed: () {
setState(() {
//reset all variables and set a new random number.
randomNumber = _setRandomNumber();
_isFound = false;
_count = 0;
_message = "";
});
},
)
: Text("")
],
),
We have made a simple but fun game. Let's play it!
Define HMS Analytics Kit and send events
As we’re done with the widgets, we will define the kit and enable logs.
Code:
class _MyHomePageState extends State<MyHomePage> {
final HMSAnalytics hmsAnalytics = new HMSAnalytics();
Future<void> _enableLog() async {
await hmsAnalytics.enableLog();
}
...
@override
void initState() {
_enableLog();
randomNumber = _setRandomNumber();
super.initState();
}
}
Once we call _enableLog(), we are ready to see auto collected events on AppGallery Connect.
What about our custom events? How can we send custom events and see them?
We have a _count variable, and every time the user clicks the OK! button, it increases. Now we will map it and send it as a custom event. We need a name for the custom event, and a map value.
Code:
Future<void> _sendEvent(int count) async {
String name = "USERS_RESULTS";
Map<String, String> value = {
'number_of_guesses': count.toString()
};
await hmsAnalytics.onEvent(name, value);
}
And we call it in the _compareValues method, when we are sure that the user guessed the number right.
Code:
_compareValues() {
if (guess == randomNumber) {
...
_sendEvent(_count); //We know that user guessed the number right.
} else if (guess > randomNumber) {
...
} else {
...
}
}
Let’s go back to AppGallery Connect. In the left panel, under Management section click Events.
After _sendEvent runs for the first time, you can see your custom event with the name you entered in your code. Click Edit.
Add your attribute and click Save.
On the left panel, click Real Time Monitoring under Overview.
Now you can see the attribute and its value in your custom event. Also you can see how many times you get this value and its proportion of all values.
Let’s play our game a few times more.
Although I am the only user, you see 2 users in AppGallery Connect. That's because I uninstalled the app and installed it again. Now I have a different AAID, as I mentioned in the first part.
Below the charts, there is event analysis. Here you can see all events, all attributes you've added, and statistics for both events and attributes. 11 of them are custom events that I sent by playing the game. The rest are collected automatically.
You can find the full code on my GitHub page. Here is the link:
ozkulbeng/FlutterHMSAnalyticsKitTutorial
Conclusion
In this article, you have learned how to integrate HMS Analytics Kit into your Flutter projects, send custom events, and monitor them in AppGallery Connect. You can use custom events in your apps to observe user behavior, so that you can improve your app based on it.
Thank you for reading this article, I hope you enjoyed it.
References
Analytics Kit Document
HMS-Core/hms-analytics-demo-android
sujith.e said:
Huawei Analytics will track fragment reports
Quite right. It really helps a lot
Hand keypoint detection is the process of finding fingertips, knuckles and wrists in an image. Hand keypoint detection and hand gesture recognition are still challenging problems in the computer vision domain. It is really tough work to build your own model for hand keypoint detection, as it is hard to collect a large enough hand dataset and it requires expertise in this domain.
Hand keypoint detection can be used in a variety of scenarios. For example, it can be used during artistic creation: users can convert the detected hand keypoints into a 2D model, and synchronize the model to a character's model to produce a vivid 2D animation. You could create a puppet animation game using this idea. Another example may be creating a rock paper scissors game. Or, if you take it further, you could create a sign language to text conversion application. As you can see, the possible usage scenarios are abundant and there is no limit to ideas.
The hand keypoint detection service is a brand-new feature added to the Huawei Machine Learning Kit family. It has recently been released and it is making developers and computer vision geeks really excited! It detects 21 points of a hand and can detect up to ten hands in an image. It can detect hands in a static image or in a camera stream. Currently, it does not support scenarios where your hand is blocked by more than 50% or where you wear gloves. You don't need an internet connection, as this is an on-device capability, and what is more: it is completely free!
It wouldn’t be a nice practice only to read related documents and forget about it after a few days. So I created a simple demo application that counts fingers and tells us the number we show by hand. I strongly advise you to develop your hand keypoint detection application beside me. I developed the application in Android Studio in Kotlin. Now, I am going to explain you how to build this application step by step. Don’t hesitate to ask questions in the comments if you face any issues.
1. Firstly, let's create our project in Android Studio. I named my project HandKeyPointDetectionDemo. I am sure you can find better names for your application. We can create our project by selecting the Empty Activity option and then follow the steps described in this post to create and sign our project in AppGallery Connect.
2. In HUAWEI Developer AppGallery Connect, go to Develop > Manage APIs. Make sure ML Kit is activated.
3. Now we have integrated Huawei Mobile Services (HMS) into our project. Let's follow the documentation on developer.huawei.com and find the packages to add to our project. On the website, click Developer / HMS Core / AI / ML Kit. Here you will find introductory information about the services, references, SDKs to download, and more. Under the ML Kit tab, follow Android / Getting Started / Integrating HMS Core SDK / Adding Build Dependencies / Integrating the Hand Keypoint Detection SDK. We can follow the guide there to add the hand detection capability to our project. We also have one meta-data tag to add to our AndroidManifest.xml file. After the integration, your app-level build.gradle file will look like this.
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'com.huawei.agconnect'
android {
compileSdkVersion 30
buildToolsVersion "30.0.2"
defaultConfig {
applicationId "com.demo.handkeypointdetection"
minSdkVersion 21
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
implementation 'androidx.core:core-ktx:1.3.1'
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
//AppGalleryConnect Core
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.2.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.2.300'
}
Our project-level build.gradle file:
Code:
buildscript {
ext.kotlin_version = "1.4.0"
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
And don’t forget to add related meta-data tags into your AndroidManifest.xml.
Code:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.demo.handkeypointdetection">
<uses-permission android:name="android.permission.CAMERA" />
<application
...
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value= "handkeypoint"/>
</application>
</manifest>
4. I created a class named HandKeyPointDetector. This class will be called from our activity or fragment. Its init method has two parameters: a context and a ViewGroup. We will add our views to rootLayout.
Code:
fun init(context: Context, rootLayout: ViewGroup) {
mContext = context
mRootLayout = rootLayout
addSurfaceViews()
}
5. We are going to detect hand keypoints in a camera stream, so we create a SurfaceView for the camera preview and another SurfaceView to draw on. The SurfaceView that is going to be used as an overlay should be transparent. Then, we add our views to the rootLayout passed as a parameter from our activity. Lastly, we add a SurfaceHolder.Callback to our surfaceHolder to know when it is ready.
Code:
private fun addSurfaceViews() {
val surfaceViewCamera = SurfaceView(mContext).also {
it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
mSurfaceHolderCamera = it.holder
}
val surfaceViewOverlay = SurfaceView(mContext).also {
it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
mSurfaceHolderOverlay = it.holder
mSurfaceHolderOverlay.setFormat(PixelFormat.TRANSPARENT)
mHandKeyPointTransactor.setOverlay(mSurfaceHolderOverlay)
}
mRootLayout.addView(surfaceViewCamera)
mRootLayout.addView(surfaceViewOverlay)
mSurfaceHolderCamera.addCallback(surfaceHolderCallback)
}
6. Inside our surfaceHolderCallback we override three methods: surfaceCreated, surfaceChanged and surfaceDestroyed.
Code:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {
override fun surfaceCreated(holder: SurfaceHolder) {
createAnalyzer()
}
override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
prepareLensEngine(width, height)
mLensEngine.run(holder)
}
override fun surfaceDestroyed(holder: SurfaceHolder) {
mLensEngine.release()
}
}
7. The createAnalyzer method creates an MLHandKeypointAnalyzer with settings. If you want, you can also use the default settings. The scene type can be keypoint, rectangle around hands, or TYPE_ALL for both. The maximum number of hand results can be up to MLHandKeypointAnalyzerSetting.MAX_HANDS_NUM, which is currently 10. As we will count the fingers of 2 hands, I set it to 2.
Code:
private fun createAnalyzer() {
val settings = MLHandKeypointAnalyzerSetting.Factory()
.setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
.setMaxHandResults(2)
.create()
mAnalyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(settings)
mAnalyzer.setTransactor(mHandKeyPointTransactor)
}
8. LensEngine is responsible for handling camera frames for us. All we need to do is prepare it with the right dimensions according to the orientation, choose the camera we want to work with, apply the fps and so on.
Code:
private fun prepareLensEngine(width: Int, height: Int) {
val dimen1: Int
val dimen2: Int
if (mContext.resources.configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
dimen1 = width
dimen2 = height
} else {
dimen1 = height
dimen2 = width
}
mLensEngine = LensEngine.Creator(mContext, mAnalyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(dimen1, dimen2)
.applyFps(5F)
.enableAutomaticFocus(true)
.create()
}
9. When you no longer need the analyzer, stop it and release the resources.
Code:
fun stopAnalyzer() {
mAnalyzer.stop()
}
10. As you can see in step 7 we used mHandKeyPointTransactor. It is a custom class that we created named HandKeyPointTransactor, which implements MLAnalyzer.MLTransactor<MLHandKeypoints>. It has two overridden methods inside: transactResult and destroy. Detected results will arrive in the transactResult method, and then we will try to find the number.
Code:
override fun transactResult(result: MLAnalyzer.Result<MLHandKeypoints>?) {
if (result == null)
return
val canvas = mOverlay?.lockCanvas() ?: return
//Clear canvas.
canvas.drawColor(0, PorterDuff.Mode.CLEAR)
//Find the number shown by our hands.
val numberString = analyzeHandsAndGetNumber(result)
//Find the middle of the canvas
val centerX = canvas.width / 2F
val centerY = canvas.height / 2F
//Draw a text that writes the number we found.
canvas.drawText(numberString, centerX, centerY, Paint().also {
it.style = Paint.Style.FILL
it.textSize = 100F
it.color = Color.GREEN
})
mOverlay?.unlockCanvasAndPost(canvas)
}
11. We will check hand by hand, and then finger by finger, to find the fingers that are up and work out the number.
Code:
private fun analyzeHandsAndGetNumber(result: MLAnalyzer.Result<MLHandKeypoints>): String {
val hands = ArrayList<Hand>()
var number = 0
for (key in result.analyseList.keyIterator()) {
hands.add(Hand())
for (value in result.analyseList.valueIterator()) {
number += hands.last().createHand(value.handKeypoints).getNumber()
}
}
return number.toString()
}
For more information, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202369245767250343&fid=0101187876626530001
HMS Video Kit — 1
In this article, I will write about the features of Huawei’s Video Kit and we will develop a sample application that allows you to play streaming media from a third-party video address.
Why should we use it?
Nowadays, video apps are so popular. Due to the popularity of streaming media, many developers have introduced HD movie streaming apps for people who use devices such as tablets and smartphones for everyday purposes. With Video Kit WisePlayer SDK you can bring stable HD video experiences to your users.
Service Features
It provides an HD video experience without delays
It responds instantly to playback requests
It has intuitive controls and offers content on demand
It selects the most suitable bitrate for your app
URL anti-leeching, playback authentication, and other security mechanisms keep your videos completely secure
It supports streaming media in 3GP, MP4, or TS format and complies with HTTP/HTTPS, HLS, or DASH
Integration Preparations
First of all, in order to start developing an app with most of the Huawei mobile services and the Video Kit as well, you need to integrate the HUAWEI HMS Core into your application.
Software Requirements
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
The integration flow will be like this:
For a detailed HMS Core integration process, you can refer to Preparations for Integrating HUAWEI HMS Core.
After creating the application on AppGallery Connect and completing the other required steps, please make sure that you copy the agconnect-services.json file to the app directory of your Android Studio project.
Adding SDK dependencies
Add the AppGallery Connect plug-in and the Maven repository in the project-level build.gradle file.
Code:
buildscript {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
......
classpath 'com.huawei.agconnect:agcp:1.3.1.300' // HUAWEI agcp plugin
}
}
allprojects {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
}
2. Open the build.gradle file in the app directory and add the AppGallery connect plug-in.
Code:
apply plugin: 'com.android.application'
// Add the following line
apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin
android {
......
}
3. Configure the Maven dependency in the app-level build.gradle file.
Code:
dependencies {
......
implementation "com.huawei.hms:videokit-player:1.0.1.300"
}
You can find all the version numbers of this kit in its Version Change History.
Click to expand...
Click to collapse
4. Configure the NDK in the app-level build.gradle file.
Code:
android {
defaultConfig {
......
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
......
}
Here, we have used the abiFilters in order to reduce the .apk size by selecting the desired CPU architectures.
5. Add permissions in the AndroidManifest.xml file.
Code:
<uses-permission
android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission
android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Note: For Android 6.0 and later, Video Kit dynamically applies for the write permission on external storage.
6. Lastly, add configurations to exclude the HMS Core SDK from obfuscation.
The obfuscation configuration file is proguard-rules.pro for Android Studio
Open the obfuscation configuration file of your Android Studio project and add the configurations.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
With these steps, we have completed the integration part. Now, let's get our hands dirty with some code.
Initializing WisePlayer
In order to initialize the player, we need to create a class that inherits from Application. The Application class is the base class of an Android app, containing components like activities and services. Application, or one of its subclasses, is instantiated before any other application objects are created in the Android app.
We can add additional behavior to the Application class by extending it. We call the initialization API WisePlayerFactory.initFactory() of the WisePlayer SDK in the onCreate() method.
Java:
public class VideoKitPlayApplication extends Application {
private static final String TAG = "VideoKitPlayApplication";
private static WisePlayerFactory wisePlayerFactory = null;
@Override
public void onCreate() {
super.onCreate();
initPlayer();
}
private void initPlayer() {
// DeviceId test is used in the demo, specific access to incoming deviceId after encryption
Log.d(TAG, "initPlayer: VideoKitPlayApplication");
WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}
/**
* Player initialization callback
*/
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
@Override
public void onSuccess(WisePlayerFactory wisePlayerFactory) {
Log.d(TAG, "init player factory success");
setWisePlayerFactory(wisePlayerFactory);
}
@Override
public void onFailure(int errorCode, String reason) {
Log.d(TAG, "init player factory fail reason :" + reason + ", errorCode is " + errorCode);
}
};
/**
* Get WisePlayer Factory
*
* @return WisePlayer Factory
*/
public static WisePlayerFactory getWisePlayerFactory() {
return wisePlayerFactory;
}
private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
}
}
Playing a Video
We need to create a PlayActivity that inherits from AppCompatActivity and implements the Callback and SurfaceTextureListener APIs. Currently, WisePlayer supports SurfaceView and TextureView. Make sure that your app has a valid view for video display; otherwise, the playback will fail. So, in the layout file, we need to add a SurfaceView or TextureView to be displayed in WisePlayer. PlayActivity also implements OnPlayWindowListener and OnWisePlayerListener in order to get callbacks from WisePlayer.
Java:
import android.view.SurfaceHolder.Callback;
import android.view.TextureView.SurfaceTextureListener;
import com.videokitnative.huawei.contract.OnPlayWindowListener;
import com.videokitnative.huawei.contract.OnWisePlayerListener;
public class PlayActivity extends AppCompatActivity implements Callback,SurfaceTextureListener,OnWisePlayerListener,OnPlayWindowListener{
...
}
WisePlayerFactory instance is returned when the initialization is complete in Application. We need to call createWisePlayer() to create WisePlayer.
Java:
WisePlayer player = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
In order to make the code modular and understandable, I have created the PlayControl.java class, as in the official demo, and created the WisePlayer in that class. Since we create that object in our PlayActivity class through its constructor, the WisePlayer will be created in the onCreate() method of our PlayActivity.
Note: Before calling createWisePlayer() to create WisePlayer, make sure that Application has successfully initialized the WisePlayer SDK.
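To make that flow concrete, here is a minimal sketch of the activity side. The PlayControl constructor signature is an assumption based on the description above, not the exact demo code:
Java:
public class PlayActivity extends AppCompatActivity implements Callback, SurfaceTextureListener, OnWisePlayerListener, OnPlayWindowListener {
    private PlayControl playControl;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // PlayControl creates the WisePlayer internally via createWisePlayer(),
        // after the SDK has been initialized in VideoKitPlayApplication.
        playControl = new PlayControl(this, this);
        initView();
    }
}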
Now, we need to initialize the WisePlayer layout and add layout listeners. I have created PlayView.java for creating the views and updating them, so we can create the PlayView instance in the onCreate() method of our PlayActivity.
Java:
/**
* init the layout
*/
private void initView() {
playView = new PlayView(this, this, this);
setContentView(playView.getContentView());
}
In the PlayView.java class I have created SurfaceView for displaying the video.
Java:
surfaceView = (SurfaceView) findViewById(R.id.surface_view);
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
I will share the demo code that I have created. You can find the activity_play.xml layout and the PlayView.java files over there.
Registering WisePlayer listeners is another important step, because the app will react based on listener callbacks. I have done this in the PlayControl.java class with the method below.
Java:
/**
* Set the play listener
*/
private void setPlayListener() {
if (wisePlayer != null) {
wisePlayer.setErrorListener(onWisePlayerListener);
wisePlayer.setEventListener(onWisePlayerListener);
wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
wisePlayer.setReadyListener(onWisePlayerListener);
wisePlayer.setLoadingListener(onWisePlayerListener);
wisePlayer.setPlayEndListener(onWisePlayerListener);
wisePlayer.setSeekEndListener(onWisePlayerListener);
}
}
Here, onWisePlayerListener is an interface that extends required Wiseplayer interfaces.
Java:
public interface OnWisePlayerListener extends WisePlayer.ErrorListener, WisePlayer.ReadyListener,
WisePlayer.EventListener, WisePlayer.PlayEndListener, WisePlayer.ResolutionUpdatedListener,
WisePlayer.SeekEndListener, WisePlayer.LoadingListener, SeekBar.OnSeekBarChangeListener {
}
Now, we need to set URLs for our videos on our PlayControl.java class with the method below.
Java:
wisePlayer.setPlayUrl("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
Since I have used CardViews in my MainActivity.java class, I have passed the URLs and movie names on the click action through an intent from MainActivity to PlayControl. You can check that out in my source code as well.
We’ve set a view to display the video with the code below. In my demo application I have used SurfaceView to display the video.
Java:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
wisePlayer.setView(surfaceView);
}
In order to prepare for playback and start requesting data, we need to call the wisePlayer.ready() method.
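For reference, the typical call sequence on the setup path, based on the steps above, looks roughly like this (a sketch, not verbatim demo code):
Java:
wisePlayer.setPlayUrl("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
wisePlayer.setView(surfaceView);
// Start requesting data; onReady() is called back when playback can begin.
wisePlayer.ready();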
Lastly, we need to call the wisePlayer.start() method to start playback when the onReady callback of this API is invoked.
Java:
@Override public void onReady(WisePlayer wisePlayer)
{
wisePlayer.start();
}
We have finished the development. Let's pick a movie and enjoy it!
Movie List
You can find the source code of the demo app here.
In this article, we developed a sample application using HUAWEI Video Kit. HMS Video Kit offers a lot of features; for the sake of simplicity, we implemented only a few of them. I will share another post showing more features of Video Kit in the near future.
RESOURCES
Documentation
Video Kit Codelab
what is the minimum resolution video we can play ??
What should I do if the signature fails to be verified on the server side?
shikkerimath said:
what is the minimum resolution video we can play ??
The minimum resolution is 270p, and the maximum is 4K.
Very interesting.
This document describes how to integrate Ads Kit using the official Unity asset. After the integration, your app can use the services of this Kit on HMS mobile phones.
For details about Ads Kit, please visit HUAWEI Developers.
1.1 Restrictions
1.1.1 Ads Supported by the Official Unity Asset
The official asset of version 1.3.4 in Unity supports interstitial ads and rewarded ads of Ads Kit.
Note: To support other types of ads, you can use the Android native integration mode. This document will take the banner ad as an example to describe such integration.
1.1.2 Supported Devices and Unity Versions
Note: If the version is earlier than 2018.4.25, you can manually import assets. If there are any unknown build errors that are not caused by the AfterBuildToDo file, upgrade your Unity.
1.2 Preparations
1.2.1 Prerequisites
HMS Core (APK) 4.0.0.300 or later has been installed on the device. Otherwise, the APIs of the HUAWEI Ads SDK, which depend on HMS Core (APK) 4.0.0.300 or later, cannot be used.
You have registered as a Huawei developer and completed identity verification on HUAWEI Developers. For details, please refer to Registration and Verification.
You have created a project and added an app to the project in AppGallery Connect by referring to Creating an AppGallery Connect Project and Adding an App to the Project.
For details about applying for a formal ad slot, please visit HUAWEI Developers.
1.2.2 Importing Unity Assets
1. Open Asset Store in Unity.
Go to Window > Asset Store in Unity.
2. Search for the Huawei HMS AGC Services asset. Download and then import it.
3. Import the asset to My Assets, with all services selected.
4. Change the package name.
Go to Edit > Project Settings> Player > Android > Other Settings in Unity, and then set Package Name.
The default package name is com.${Company Name}.${Product Name}. You need to change the package name, and the app will be released to AppGallery with the new name.
1.2.3 Generating .gradle Files
1. Enable project gradle.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Build.
Enable Custom Main Manifest.
Enable Custom Main Gradle Template.
Enable Custom Launcher Gradle Template.
Enable Custom Base Gradle Template.
2. Generate a signature.
You can use an existing keystore file or create a new one to sign your app.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Keystore Manager.
Then, go to Keystore... > Create New.
Enter the password when you open Unity. Otherwise, you cannot build the APK.
1.2.4 Configuring .gradle Files
1. Configure the BaseProjectTemplate.gradle file.
Configure the Maven repository address.
Code:
buildscript {
repositories {**ARTIFACTORYREPOSITORY**
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
repositories {**ARTIFACTORYREPOSITORY**
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
flatDir {
dirs "${project(':unityLibrary').projectDir}/libs"
}
}
}
2. Configure the launcherTemplate.gradle file.
Add the dependencies.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
dependencies {
implementation project(':unityLibrary')
implementation 'com.huawei.hms:ads-lite:13.4.29.303'
}
1.3 App Development with the Official Asset
1.3.1 Rewarded Ads
Rewarded ads are full-screen video ads that users can choose to view in exchange for in-app rewards.
1.3.1.1 Sample Code
Code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HuaweiHms;
namespace Demo
{
public class RewardDemo : MonoBehaviour
{
public void LoadRewardAds()
{
// Create a RewardAd object.
// "testx9dtjwj8hp" is the test ad slot. You can use it to perform a test. Replace the test slot with a formal one for official release.
RewardAd rewardAd = new RewardAd(new Context(), "testx9dtjwj8hp");
AdParam.Builder builder = new AdParam.Builder();
AdParam adParam = builder.build();
// Load the ad.
rewardAd.loadAd(adParam, new MRewardLoadListener(rewardAd));
}
}
// Listen for ad events.
public class MRewardLoadListener:RewardAdLoadListener
{
private RewardAd ad;
public MRewardLoadListener(RewardAd _ad)
{
ad = _ad;
}
public override void onRewardAdFailedToLoad(int arg0)
{
}
public override void onRewardedLoaded()
{
ad.show(new Context(),new RewardAdStatusListener());
}
}
}
1.3.1.2 Testing the APK
Go to File > Build Settings > Android, click Switch Platform and then Build And Run.
1.3.2 Interstitial Ads
1.3.2.1 Sample Code
Code:
using UnityEngine;
using HuaweiService;
using HuaweiService.ads;
namespace Demo
{
public class interstitialDemo : MonoBehaviour
{
public void LoadImageAds()
{
// Create an ad object.
// "testb4znbuh3n2" and "teste9ih9j0rc3" are test ad slots. You can use them to perform a test. Replace the test slots with formal ones for official release.
InterstitialAd ad = new InterstitialAd(new Context());
ad.setAdId("teste9ih9j0rc3");
ad.setAdListener(new MAdListener(ad));
AdParam.Builder builder = new AdParam.Builder();
AdParam adParam = builder.build();
// Load the ad.
ad.loadAd(adParam);
}
public void LoadVideoAds()
{
InterstitialAd ad = new InterstitialAd(new Context());
ad.setAdId("testb4znbuh3n2");
ad.setAdListener(new MAdListener(ad));
AdParam.Builder builder = new AdParam.Builder();
ad.loadAd(builder.build());
}
public class MAdListener : AdListener
{
private InterstitialAd ad;
public MAdListener(InterstitialAd _ad) : base()
{
ad = _ad;
}
public override void onAdLoaded()
{
// Display the ad if it is successfully loaded.
ad.show();
}
public override void onAdFailed(int arg0)
{
}
public override void onAdOpened()
{
}
public override void onAdClicked()
{
}
public override void onAdLeave()
{
}
public override void onAdClosed()
{
}
}
}
}
1.4 App Development with Android Studio
1.4.1 Banner Ads
1.4.1.1 Loading Banner Ads on a Page as Required
Code:
AndroidJavaClass javaClass = new AndroidJavaClass("com.unity3d.player.UnityPlayerActivity");
javaClass.CallStatic("loadBannerAds");
1.4.1.2 Exporting Your Project from Unity
Go to File > Build Settings > Android and click Switch Platform. Then, click Export Project, select your project, and click Export.
1.4.1.3 Integrating the Banner Ad Function in Android Studio
Open the exported project in Android Studio.
Add implementation 'com.huawei.hms:ads-lite:13.4.29.303' to build.gradle in the src directory.
1. Add code related to banner ads to UnityPlayerActivity.
a. Define the static variable bannerView.
Code:
private static BannerView bannerView;
b. Add the initialization of bannerView to the onCreate method.
Code:
bannerView = new BannerView(this);
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_360_57);
mUnityPlayer.addView(bannerView);
c. Add the following static method for loading ads in the Android Studio project, and then build and run the project.
Code:
public static void loadBannerAds() {
// "testw6vs28auh3" is a dedicated test ad slot ID. Before releasing your app, replace the test ad slot ID with the formal one.
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
}
d. If banner ads need to be placed at the bottom of the page, refer to the following code:
Code:
// Set up activity layout.
@Override protected void onCreate(Bundle savedInstanceState)
{
requestWindowFeature(Window.FEATURE_NO_TITLE);
super.onCreate(savedInstanceState);
String cmdLine = updateUnityCommandLineArguments(getIntent().getStringExtra("unity"));
getIntent().putExtra("unity", cmdLine);
RelativeLayout relativeLayout = new RelativeLayout(this);
mUnityPlayer = new UnityPlayer(this, this);
setContentView(relativeLayout);
relativeLayout.addView(mUnityPlayer);
mUnityPlayer.requestFocus();
bannerView = new BannerView(this);
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_360_57);
RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT);
layoutParams.addRule(RelativeLayout.ALIGN_PARENT_BOTTOM);
relativeLayout.addView(bannerView, layoutParams);
}
public static void LoadBannerAd() {
// "testw6vs28auh3" is a dedicated test ad slot ID. Before releasing your app, replace the test ad slot ID with the formal one.
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
bannerView.setAdListener(new AdListener()
{
@Override
public void onAdFailed(int errorCode)
{
Log.d("BannerAds" ,"error" + errorCode);
}
});
}
1.5 FAQs
1. If an error indicating an invalid path is reported when you export a project from Unity, change the export path to Downloads or Desktop.
2. Unity of a version earlier than 2018.4.25 does not support asset download from Asset Store. You can download the asset using Unity of 2018.4.25 or a later version, export it, and then import it to Unity of an earlier version.
More Information
To join in on developer discussion forums
To download the demo app and sample code
For solutions to integration-related issues
Check it out in the forum
Background
Lots of apps these days include an in-app map feature and the ability to mark places of interest on the map. HMS Core Map Kit enables you to implement such capabilities for your apps. With Map Kit, you can first draw a map and then add markers to the map, as well as configure the map to cluster markers depending on the level of zoom. This article will show you how to implement searching for nearby places using the keyword search capability of Site Kit and display the results on the map.
Application scenarios:
1. A travel app that allows users to search for scenic spots and shows the results on a map.
2. A bicycle sharing app that can show users nearby bicycles on a map.
Key functions used in the project:
Location service: Use Location Kit to obtain the current longitude-latitude coordinates of a device.
Keyword search: Use Site Kit to search for places such as scenic spots, businesses, and schools in the specified geographical area based on the specified keyword.
Map display: Use Map Kit to draw a map.
Marker clustering: Use Map Kit to add markers to the map and configure the map to cluster markers depending on the level of zoom.
Integration Preparations
1. Register as a developer and create a project in AppGallery Connect.
(1) Register as a developer at the registration URL.
(2) Create an app, add the SHA-256 signing certificate fingerprint, enable Map Kit and Site Kit, and download the agconnect-services.json file of the app.
2. Integrate the Map SDK and Site SDK.
(1) Copy the agconnect-services.json file to the app's directory of your project.
Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.
Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.
If the agconnect-services.json file has been added to the app, go to buildscript > dependencies and add the AppGallery Connect plugin configuration.
Code:
buildscript {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.3.2'
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
google()
jcenter()
}
}
(2)Add build dependencies in the dependencies block.
Code:
dependencies {
implementation 'com.huawei.hms:maps:{version}'
implementation 'com.huawei.hms:site:{version}'
implementation 'com.huawei.hms:location:{version}'
}
(3)Add the following configuration to the file header:
Code:
apply plugin: 'com.huawei.agconnect'
(4)Copy the signing certificate generated in Generating a Signing Certificate to the app directory of your project, and configure the signing certificate in android in the build.gradle file.
Code:
signingConfigs {
release {
// Signing certificate.
storeFile file("**.**")
// KeyStore password.
storePassword "******"
// Key alias.
keyAlias "******"
// Key password.
keyPassword "******"
v1SigningEnabled true
v2SigningEnabled true
}
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
debuggable true
}
debug {
debuggable true
}
}
Main Code and Used Functions
1. Use Location Kit to obtain the device location.
Code:
private void getMyLoction() {
fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this);
SettingsClient settingsClient = LocationServices.getSettingsClient(this);
LocationSettingsRequest.Builder builder = new LocationSettingsRequest.Builder();
mLocationRequest = new LocationRequest();
builder.addLocationRequest(mLocationRequest);
LocationSettingsRequest locationSettingsRequest = builder.build();
//Check the device location settings.
settingsClient.checkLocationSettings(locationSettingsRequest)
.addOnSuccessListener(new OnSuccessListener<LocationSettingsResponse>() {
@Override
public void onSuccess(LocationSettingsResponse locationSettingsResponse) {
//Initiate location requests when the location settings meet the requirements.
fusedLocationProviderClient
.requestLocationUpdates(mLocationRequest, mLocationCallback, Looper.getMainLooper())
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
// Processing when the API call is successful.
Log.d(TAG, "onSuccess: " + aVoid);
}
});
}
});
}
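The snippet above references mLocationCallback but does not define it. A minimal sketch of what it might look like; the Constants.sMylatLng field is the one used later when the map is drawn, and the details are illustrative:
Code:
private LocationCallback mLocationCallback = new LocationCallback() {
    @Override
    public void onLocationResult(LocationResult locationResult) {
        if (locationResult != null && !locationResult.getLocations().isEmpty()) {
            Location location = locationResult.getLocations().get(0);
            // Cache the current coordinates for the map and the keyword search.
            Constants.sMylatLng = new LatLng(location.getLatitude(), location.getLongitude());
        }
    }
};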
2. Implement the text search function using Site Kit to search for nearby places.
Code:
SearchResultListener<TextSearchResponse> resultListener = new SearchResultListener<TextSearchResponse>() {
// Return search results upon a successful search.
@Override
public void onSearchResult(TextSearchResponse results) {
List<Site> siteList;
if (results == null || results.getTotalCount() <= 0 || (siteList = results.getSites()) == null
|| siteList.size() <= 0) {
resultTextView.setText("Result is Empty!");
return;
}
updateClusterData(siteList);// Mark places on the map.
}
// Return the result code and description upon a search exception.
@Override
public void onSearchError(SearchStatus status) {
resultTextView.setText("Error : " + status.getErrorCode() + " " + status.getErrorMessage());
}
};
// Call the place search API.
searchService.textSearch(request, resultListener);
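The listener above is passed to searchService.textSearch(), but the snippet does not show how the service and request are created. A sketch of that setup; the keyword, radius, and API key handling are illustrative assumptions:
Code:
// "apiKey" is the URL-encoded API key of your app from AppGallery Connect.
SearchService searchService = SearchServiceFactory.create(this, apiKey);
TextSearchRequest request = new TextSearchRequest();
request.setQuery("food");                                  // Keyword to search for.
request.setLocation(new Coordinate(latitude, longitude));  // Current device location from Location Kit.
request.setRadius(5000);                                   // Search radius in meters.
searchService.textSearch(request, resultListener);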
3. Draw a map.
Code:
@Override
public void onMapReady(HuaweiMap huaweiMap) {
hMap = huaweiMap;
hMap.moveCamera(CameraUpdateFactory.newLatLngZoom(Constants.sMylatLng, 1));
hMap.setMyLocationEnabled(true);
hMap.getUiSettings().setMyLocationButtonEnabled(true);
initCluster(huaweiMap);
}
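For completeness, onMapReady() above is only invoked after the map has been requested asynchronously. A typical setup in onCreate() might look like this; the fragment ID is an assumption about the layout:
Code:
SupportMapFragment mapFragment = (SupportMapFragment) getSupportFragmentManager()
        .findFragmentById(R.id.mapfragment);
mapFragment.getMapAsync(this);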
4. Cluster markers on the map.
Code:
private ClusterManager<MyItem> mClusterManager;
List<MyItem> items = new ArrayList<>();
private void initCluster(HuaweiMap hMap) {
mClusterManager = new ClusterManager<>(this, hMap);
hMap.setOnCameraIdleListener(mClusterManager);
// Add a custom InfoWindowAdapter by setting it to the MarkerManager.Collection object from
// ClusterManager rather than from GoogleMap.setInfoWindowAdapter
//refer: https://github.com/billtom20/3rd-maps-utils
mClusterManager.getMarkerCollection().setInfoWindowAdapter(new HuaweiMap.InfoWindowAdapter() {
@Override
public View getInfoWindow(Marker marker) {
final LayoutInflater inflater = LayoutInflater.from(SearchClusterActivity.this);
final View view = inflater.inflate(R.layout.custom_marker_window, null);
final TextView textView = view.findViewById(R.id.textViewTitle);
String text = (marker.getTitle() != null) ? marker.getTitle() : "Cluster Item";
textView.setText(text);
return view;
}
@Override
public View getInfoContents(Marker marker) {
return null;
}
});
}
// Update clustered markers.
private void updateClusterData(List<Site> siteList) {
items = new ArrayList<>();
mClusterManager.clearItems();
for (Site s:
siteList) {
Coordinate location = s.getLocation();
MyItem myItem = new MyItem(location.lat,location.lng,s.name,s.formatAddress);
items.add(myItem);
}
mClusterManager.addItems(items);
Coordinate coordinate = siteList.get(0).getLocation();
LatLng latLng = new LatLng(coordinate.lat,coordinate.lng);
mClusterManager.cluster();
hMap.animateCamera(CameraUpdateFactory.newLatLngZoom (latLng,14 ));
}
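The clustering code above uses a MyItem class that is not shown. A minimal sketch of it, based on the ClusterItem interface from the maps-utils library referenced in the comments (the exact interface in that library may differ slightly):
Code:
public class MyItem implements ClusterItem {
    private final LatLng position;
    private final String title;
    private final String snippet;

    public MyItem(double lat, double lng, String title, String snippet) {
        this.position = new LatLng(lat, lng);
        this.title = title;
        this.snippet = snippet;
    }

    @Override
    public LatLng getPosition() { return position; }

    @Override
    public String getTitle() { return title; }

    @Override
    public String getSnippet() { return snippet; }
}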
Results
Enter a place or service in the Query box and tap Search. The figures below show how the search results are displayed as markers on the map.
The preceding four figures show the effect of searching for food and school, as well as the marker clustering effect at different zoom levels. Congratulations, you have now successfully integrated place search and marker clustering into your in-app map.
References
For more details, you can go to:
official website
Development Documentation page, to find the documents you need
Reddit to join our developer discussion
GitHub to download sample codes
Stack Overflow to solve any integration problems
Thanks for sharing..