MindSpore is an open-source framework for AI-based application development announced by Huawei. It is a robust alternative to widely used AI frameworks such as TensorFlow and PyTorch.
Let's start by highlighting the features and advantages of the MindSpore framework:
MindSpore implements AI algorithms for easier model development and, together with Huawei AI processors, provides cutting-edge technology to improve runtime efficiency and computing performance.
One of its advantages is that it can be used in several environments: on devices, in the cloud, and at the edge. It supports operating systems such as iOS and Android, and AI applications on various devices such as mobile phones, tablets, and IoT devices.
MindSpore supports parallel training across hardware to reduce training time. It maximizes hardware computing power while minimizing inference latency and power consumption.
It provides dynamic debugging capability, which enables developers to find bugs in their apps easily.
According to Huawei, MindSpore does not process data itself; it ingests only the gradient and model information that has already been processed. This helps protect the integrity of sensitive data.
MindSpore Lite is an inference framework for custom models, provided through HMS ML Kit to simplify integration and development. Developers can define their own models and implement model inference thanks to MindSpore Lite.
MindSpore Lite is compatible with commonly used AI platforms such as TensorFlow Lite, Caffe, and ONNX. Models from these platforms can be converted into the .ms (MindSpore) format and then run seamlessly.
Custom models can be deployed and executed easily since they are compressed and occupy little storage space.
It provides complete APIs for integrating the inference framework of an on-device custom model.
HMS ML Kit enables you to train and generate custom models with deep machine learning. It also offers a pre-trained image classification model. You can develop your own custom model by using the Transfer Learning feature of ML Kit with a specific dataset.
I will briefly explain how to train your own model using an example with three plant categories. We will use a small data set for reference and train an image classification model to identify cactus, pine, and succulent plants. The model will be created using the HMS Toolkit plug-in and AI Create.
HMS Toolkit: As a lightweight IDE plug-in, HMS Toolkit supports app creation, coding, conversion, debugging, testing, and release. It helps you integrate HMS Core APIs at lower cost and with higher efficiency.
AI Create: Provides transfer learning capabilities for image classification and text classification. Images and texts can be identified thanks to AI Create. It uses MindSpore as the training framework and MindSpore Lite as the inference framework.
Note: Use the Android Studio marketplace to install the HMS Toolkit plug-in. Go to File > Settings > Plugins > Marketplace, enter HMS Toolkit in the search box, and click Install. After the installation completes, restart Android Studio.
First, we should prepare the environment to train our model. AI Create currently supports only the Windows operating system. Open Coding Assistant from the new HMS menu added by the HMS Toolkit plug-in, go to AI > AI Create, select Image, and click Confirm for image classification.
After this step, HMS Toolkit automatically downloads the required resources. If the Python environment is not configured, a dialog box is displayed as shown below.
Note: You should download and install Python 3.7.5 from the provided link to use AI Create. After the installation completes, do not forget to add the Python installation path to the Path variable in Environment Variables and restart Android Studio.
After the environment is ready, selecting Image and clicking Confirm in AI Create automatically starts installing MindSpore. Make sure the framework has been installed successfully by checking the event logs.
A new model section then opens, where you select an image folder to train your own model. You should prepare your data set in accordance with the requirements. We will train the model in our demo to identify cactus, succulent, and pine plants with a small data set.
The folder structure should look like the example below:
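For reference, a minimal layout might look like this (folder and file names are placeholders I chose for the demo; only the one-sub-folder-per-category structure matters):
train_images/
    cactus/
        cactus_01.jpg
        cactus_02.jpg
        ... (at least 10 images per category)
    pine/
        pine_01.jpg
        ...
    succulent/
        succulent_01.jpg
        ...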
The following conditions should be met for image resources:
The minimum number of pictures for each category of training data is 10.
The training data set must contain at least 2 and at most 1,000 categories.
Supported image formats: .bmp, .jpg, .jpeg, .png or .gif.
After the training image folder is selected, set the output model file path and the training parameters. If you check HMS Custom Model, a complete model will be generated. The training parameters affect the accuracy of the image recognition model; you can modify them if you have experience with deep learning. When you click Create Model, MindSpore starts training your model on the data set.
The training process takes time depending on your data set. Since we used a small data set that just meets the minimum requirements, it completed quickly. You can also track the training logs; your model will be created at the specified path at the end of the process.
The training results are shared after model training is completed. AI Create lets you test your model by adding test images before using it in any project. You can also generate a demo project that implements your new model with the Generate Demo option.
You should create a new test image folder with the same structure as the provided data set.
As shown above, our test average accuracy is 57.1%. This accuracy can be improved with a more comprehensive data set and more training.
You can also try out your new model in a demo project created by HMS Toolkit. After the demo is created, you can directly build and run the project and check the results on a real device.
In this article, I wanted to share basic information about MindSpore and how we can use the Transfer Learning function of HMS ML Kit for custom models.
You can also develop your own classification model by using this post as a reference. I hope it will be useful for you!
Please follow our next articles for more details about ML Kit Custom Model and MindSpore.
References
https://developer.huawei.com/consum...ore-Guides/ml-mindspore-lite-0000001055328885
https://www.mindspore.cn/lite/en
For more information like this, you can visit the HUAWEI Developer Forum.
Original link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201272771294910088&fid=0101187876626530001
1 Foreword
Two previous articles have introduced the bank card recognition function of Huawei HMS MLKit:
Ultra-simple integration of Huawei HMS MLKit Bank card recognition SDK, one-click bank card binding:
https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201258006396920241&fid=0101187876626530001
One article to understand Huawei HMS ML Kit text recognition, bank card recognition, general card identification
https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201253487604250240&fid=0101187876626530001
Through the above two articles, you must already know the usage scenarios of bank card recognition and how to use Huawei's bank card recognition SDK. But how good is the SDK provided by Huawei, and how competitive is it? I will give you an in-depth evaluation to see how effective it is.
2 Picking a competing product: Card.io
In order to better reflect the evaluation results, we need to choose a competitor. We chose Card.io, a popular open-source project on GitHub, for the comparison to see which one is better. I downloaded the card.io demo code from GitHub, compiled it into an APK, installed it on a phone, and then started the comparison.
3 Comparison dimensions
As a developer choosing an easy-to-use SDK, you will mainly consider whether the SDK is free, whether its accuracy is high, whether the recognition speed is fast, and so on.
4 Is it free? - Use cost and expense comparison
4.1 Comparison conclusion: both are free
5 Which devices are supported? - Device type coverage comparison
5.1 Comparison conclusion: both cover all phone types
6 How much storage space is required for the integrated SDK? - SDK package size comparison
card.io provides a variety of algorithm libraries, such as x86_64, arm64-v8a, x86, armeabi-v7a, armeabi, and mips, mainly to adapt to different CPUs. Because current Android phones basically use the Arm architecture and armeabi is already obsolete, to be fair we only count the arm64-v8a and armeabi-v7a libraries.
6.1 Card.io SDK package size:
This is the size of the entire sample APK; the size of the SDK part after removing the sample code is about 6.1 MB.
6.2 Huawei MLKit bank card recognition SDK package size:
By analyzing the size of the sample APK, you can see that the SDK contains two parts: algorithm and assets. The total size of the SDK parts is about 3.1 MB.
6.3 Comparison conclusion: Obviously, Huawei HMS MLKit is better
The comparison is summarized as follows: the Card.io SDK is about 6.1 MB, while the Huawei HMS MLKit bank card recognition SDK is about 3.1 MB.
7 Which countries and types of cards can be identified? - Comparison of supported card types
7.1 Analysis of common card types
Common bank cards around the world include Visa, JCB, MasterCard, American Express, and China UnionPay. We will select some examples of these cards and see how the two SDKs identify them.
7.2 Visa card identification comparison
I searched for several Visa card images for identification to confirm whether the card type and card number can be correctly recognized.
7.2.1 Card.io: Visa card can be correctly recognized
7.2.2 Huawei HMS MLKit: Visa card can be recognized correctly
It seems that for Visa cards, both SDKs correctly identify the card type and card number.
7.3 MasterCard identification comparison
7.3.1 Card.io: can be correctly identified
7.3.2 Huawei HMS MLKit: can be correctly identified
7.4 JCB card identification comparison
7.4.1 Card.io: The recognition test was not successful
Since I do not have a real JCB card at hand, I searched for some card images on the Internet and tested many of them, but none were recognized. If you have a real card, you can try it yourself and share the test results.
7.4.2 Huawei HMS MLKit: can be correctly identified
However, since there was no real card, the card organization was not identified. If you have a real card, you can try it and share the test results.
7.5 American Express Card Comparison
7.5.1 Card.io: The recognition test was not successful
Since I do not have a real American Express card at hand, I searched for some card images on the Internet and tested many of them, but none were recognized. If you have a real card, you can try it yourself and share the test results.
7.5.2 Huawei HMS MLKit: can be correctly identified
Since there was no real card, the card organization was not identified. If you have a real card, you can try it yourself and share the test results.
This is not the end. For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201272771294910088&fid=0101187876626530001
1. Get to know HarmonyOS
1) What is HarmonyOS: https://device.harmonyos.com/en/docs/start/learn/oem_des_define-0000001055232642
What is HarmonyOS? What capabilities does it offer? What does its technical architecture look like?
HarmonyOS is a future-proof distributed operating system open to you as part of the initiatives for the all-scenario strategy, adaptable to a mobile office, fitness and health, social communication, and media entertainment, to name a few. Unlike a legacy operating system that runs on a standalone device, HarmonyOS is built on a distributed architecture designed based on a set of system capabilities. It is able to run on a wide range of device forms.
For application developers, HarmonyOS adopts distributed technologies to make application development possible on different device forms. With HarmonyOS, you have the choice to focus on upper-layer service logic and develop applications in a much easier and more efficient way.
For device developers, HarmonyOS uses a component-based software design to tailor itself to particular device forms based on their respective resource capabilities and service characteristics.
2) Security guide: What are the security mechanisms and recommended practices of HarmonyOS in terms of hardware, system, data, device interconnection, and application security?
Link: https://device.harmonyos.com/en/docs/security/sec-guides/oem_security_guide-0000001050032745
3) Obtain the source code. There are several methods for obtaining the source code. Select one based on the site requirements.
Link: https://device.harmonyos.com/en/doc...EN_TOPIC_0000001050769927__section61172538310
4) Obtain the compilation toolchain, device development tool (HUAWEI DevEco Device Tool), application development tool (HUAWEI DevEco Studio), and tool usage guide.
Link: https://device.harmonyos.com/en/docs/start/get-tools/oem_tool-0000001055705774
5) API reference: https://device.harmonyos.com/en/docs/develop/apiref/abilitykit-0000001054598111
2. Develop WLAN connection products
a) Get to know the Hi3861 development board: a WLAN module that provides connection capabilities for various IoT devices.
b) Set up the Hi3861 environment, including preparing the software and hardware, and installing the compilation and development environment.
c) Develop the first example program for the Hi3861: compile and burn HarmonyOS to complete the first Hello World program. https://device.harmonyos.com/en/docs/start/introduce/oem_wifi_start_helloword-0000001051930719
d) Development example of LED peripheral control: Call the NDK interface of HarmonyOS to control the GPIO and implement LED blinking.
e) One-Hop scenario development guide: Develop WLAN connection products that provide the FA experience.
f) Third-party SDK integration: Integrate vendor SDKs into HarmonyOS.
3. Develop camera products with screens.
[Device Software Development]
a) Understand the Hi3516 development board: it has a screen and camera module, and HarmonyOS applications can be developed based on this board.
b) Set up the Hi3516 environment, including preparing the software and hardware, and installing the compilation and development environments.
c) Develop the first application of the Hi3516: Compile and burn the HarmonyOS to complete the first application Hello World.
d) Example of developing the first Hi3516 driver: Develop a new driver using HarmonyOS and complete the first driver Hello World.
e) Screen and camera control development example: Use HarmonyOS to control the screen and camera.
[Application Software Development]
a) Set up the development environment: Install HUAWEI DevEco Studio.
b) JavaScript application development interface: describes the framework, components, and interfaces of JavaScript application development.
c) Visual application development example: Use HarmonyOS to develop vision applications.
4. Develop screenless camera products.
a) Have general knowledge of the Hi3518 development board: The camera module without a screen provides camera capabilities for various IoT devices.
b) Set up the Hi3518 environment, including preparing the software and hardware, and installing the compilation and development environments.
c) Develop the first example program of the Hi3518: Compile and burn the HarmonyOS to complete the first program Hello World.
d) Camera control development example: Use HarmonyOS to control the camera.
5. Chip Adaptation Reference
a) Kernel development guide: describes the basic functions, file system, standard library, and commissioning functions of the HarmonyOS light kernel and provides development guidance.
6. Contribute components
a) Component development specifications: basic concepts of components and how to define components according to the specifications.
b) Component development guide: Develop HarmonyOS components and distributions.
7. Code repository
Docs repository: OpenHarmony is the open-source version of HarmonyOS. It provides developer documents such as quick start guides, development guides, and API references. You are welcome to join the documentation open-source project and help improve the developer documents.
For details about Huawei developers and HMS, visit the HUAWEI Developer Forum: forums.developer.huawei.com
Introduction
Are you new to machine learning?
If yes, then let's start from scratch.
What is machine learning?
Definition: “Field of study that gives computer capability to learn without being explicitly programmed.”
In general, machine learning is an application of artificial intelligence (AI) that gives devices the ability to learn from their experience and improve themselves without any explicit coding. For example, if you search for something, related ads will be shown on the screen.
Machine Learning is a subset of Artificial Intelligence. Machine Learning is the study of making machines more human-like in their behavior and decisions by giving them the ability to learn and develop their own programs. This is done with minimum human intervention, that is, no explicit programming. The learning process is automated and improved based on the experiences of the machines throughout the process. Good quality data is fed to the machines, and different algorithms are used to build ML models to train the machines on this data. The choice of algorithm depends on the type of data at hand, and the type of activity that needs to be automated.
Do you have a question like: what is the difference between machine learning and traditional programming?
Traditional programming
We feed the input data and well-written, well-tested code into the machine to generate an output.
Machine Learning
We feed the input data along with the expected output into the machine during the learning phase, and it works out a program for itself.
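To make the contrast concrete, here is a small, hedged C# sketch (the temperature example and the threshold rule are mine, purely for illustration): the traditional function hard-codes the rule, while the "learning" function derives a comparable rule from example inputs and outputs.
using System;
using System.Linq;

class TraditionalVsLearning
{
    // Traditional programming: the rule that maps input to output is written by hand.
    static bool IsHotTraditional(double temperature) => temperature > 30.0;

    // Machine learning (toy version): derive the rule, here a threshold, from labeled examples.
    static double LearnThreshold(double[] inputs, bool[] outputs)
    {
        double highestNegative = inputs.Where((x, i) => !outputs[i]).Max();
        double lowestPositive = inputs.Where((x, i) => outputs[i]).Min();
        return (highestNegative + lowestPositive) / 2.0; // boundary between the two classes
    }

    static void Main()
    {
        double[] temps = { 10, 18, 25, 31, 35, 40 };              // input data
        bool[] isHot = { false, false, false, true, true, true }; // known outputs
        Console.WriteLine($"Learned rule: temperature > {LearnThreshold(temps, isHot)} means hot");
    }
}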
Steps of machine learning
1. Gathering Data
2. Preparing that data
3. Choosing a model
4. Training
5. Evaluation
6. Hyperparameter Tuning
7. Prediction
How does Machine Learning work?
The three major building blocks of a Machine Learning system are the model, the parameters, and the learner.
Model is the system which makes predictions.
The parameters are the factors which are considered by the model to make predictions.
The learner makes the adjustments in the parameters and the model to align the predictions with the actual results.
Now let's work through a water example to learn how machine learning works. The machine learning model here has to predict whether water is safe to drink or not. The selected parameters are as follows:
Dissolved oxygen
pH
Temperature
Decayed organic materials
Pesticides
Toxic and hazardous substances
Oils, grease, and other chemicals
Detergents
Learning from the training set.
This involves taking a sample data set of water from several places for which the parameters are specified. Now we have to define a description of each classification (safe or not safe to drink) in terms of the parameter values for each type. The model can use this description to decide whether a new water sample is safe to drink.
You can represent the values of the parameters, such as pH, temperature, and dissolved oxygen, as x, y, z, and so on. Then (x, y, z) defines the parameters of each sample in the training data. This set of data is called a training set. These values, when plotted on a graph, suggest a hypothesis in the form of a line, a rectangle, or a polynomial that best fits the desired results.
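As a hedged illustration of the training set and a very simple "learner", here is a C# sketch; the samples, labels, and the single-parameter pH rule are made up and far simpler than a real hypothesis, but they show how (x, y, z) samples, a model, and a learner fit together.
using System;
using System.Collections.Generic;
using System.Linq;

// One labeled training sample: x = pH, y = temperature, z = dissolved oxygen, plus the known answer.
class WaterSample
{
    public double Ph, Temperature, DissolvedOxygen;
    public bool SafeToDrink;
    public WaterSample(double ph, double temp, double oxygen, bool safe)
    { Ph = ph; Temperature = temp; DissolvedOxygen = oxygen; SafeToDrink = safe; }
}

class DrinkabilityModel
{
    // The "model" here is just a learned pH range; the "learner" (Train) adjusts it to fit the training set.
    double minPh, maxPh;

    public void Train(IEnumerable<WaterSample> trainingSet)
    {
        var safe = trainingSet.Where(s => s.SafeToDrink).ToList();
        minPh = safe.Min(s => s.Ph);
        maxPh = safe.Max(s => s.Ph);
    }

    public bool Predict(WaterSample s) => s.Ph >= minPh && s.Ph <= maxPh;

    static void Main()
    {
        var model = new DrinkabilityModel();
        model.Train(new[]
        {
            new WaterSample(7.0, 15, 9.0, true),
            new WaterSample(7.4, 18, 8.5, true),
            new WaterSample(4.5, 22, 3.0, false),
            new WaterSample(9.8, 25, 2.0, false),
        });
        Console.WriteLine(model.Predict(new WaterSample(7.2, 16, 9.1, true))); // prints True
    }
}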
Now that we have learnt what machine learning is and how it works, let's look at the Huawei ML Kit.
Huawei ML kit
HUAWEI ML Kit allows your apps to easily leverage Huawei's long-term proven expertise in machine learning to support diverse artificial intelligence (AI) applications throughout a wide range of industries.
Huawei already provides some built-in features in the SDK, which are as follows.
Text related service.
Text recognition
Document recognition
ID card recognition
Bank card recognition
General card recognition
Form Recognition
Language/Voice related services.
Translation
Language detection
Text to speech
Image related services.
Image classification
Object detection and Tracking
Landmark recognition
Product visual search
Image super resolution
Document skew correction
Text image super resolution
Scene detection
Face/Body related services.
Face detection
Skeleton detection
Liveness detection
Hand gesture recognition
Face verification
Natural language processing services.
Text embedding
Custom model.
AI create
Model deployment and Inference
Pre-trained model
In this series of articles, we learn about the Huawei custom model. MindSpore Lite, the on-device inference framework for custom models provided by ML Kit, facilitates integration and development and can run on devices. By introducing this inference framework, you can define your own model and implement model inference at minimal cost.
Advantages of MindSpore Lite
It provides simple and complete APIs for you to integrate the inference framework of an on-device custom model.
Models can be customized simply and quickly, providing an excellent machine learning experience.
It is compatible with mainstream model inference platforms and frameworks on the market, such as MindSpore Lite, TensorFlow Lite, Caffe, and ONNX. Different models can be converted into the .ms format without any loss and then run perfectly through the on-device inference framework.
Custom models occupy small storage space and can be quantized and compressed. Models can be quickly deployed and executed. In addition, models can be hosted on the cloud and downloaded as required, reducing the APK size.
Steps to be followed to Implement Custom model
Step 1: Install HMS Toolkit from Android Studio Marketplace.
Step 2: Transfer learning by using AI Create.
Step 3: Model training
Step 4: Model verification
Step 5: Upload model to AGC
Step 6: Load the remote model
Step 7: Perform inference using model inference engine
Let us start one by one.
Step 1: Install HMS Toolkit from Android Studio Marketplace. After the installation, restart Android Studio.
· Choose File > Settings > Plugins
Result
Coming soon in the upcoming article.
Tips and Tricks
Make sure you are already registered as a Huawei developer.
Learn the basics of machine learning.
Install the HMS Toolkit plug-in in Android Studio.
Conclusion
In this article, we have learnt what exactly machine learning is and how it works, the difference between traditional programming and machine learning, the steps required to build a custom model, and how to install the HMS Toolkit in Android Studio. In the upcoming article I'll continue with the remaining steps of the custom model.
Reference
ML Kit Official document
Check it out in the forum
HUAWEI Prediction utilizes machine learning, based on user behavior and attributes reported by HUAWEI Analytics Kit, to predict target audiences with next-level precision. The service can help you carry out and optimize operations. For example, it can work with A/B Testing to evaluate how effective your promotions have been, and it can also join hands with Remote Configuration to configure dedicated plans for specific audiences. This is likely to result in dramatically improved user retention and conversion.
Integrating the Analytics SDK into your app enables the Prediction service to run preset tasks for predicting lost, paying, and returning users. On the details page of a specific prediction task, you'll find audiences with high, medium, and low probabilities of triggering a specific event, with meticulous profiling. For example, an audience with a high churn probability will include users who are very likely to quit using the app over the next 7 days. The characteristics of these users are displayed on cards, which makes it easy for you to pursue targeted operations.
The following figures give you a sense of how the prediction task list and details page look in practice.
* Data in these figures is for reference only.
How we built these prediction models
First of all, we clarified what our prediction goal was, so that the type of data we collect reflects it. We then cleansed and sampled the collected data based on user characteristics to obtain a data set. This data set was divided into a 20% validation set and an 80% training set; multiple rounds of offline experiments were then conducted to determine the features and the most suitable parameters for the models. The generated models were later trained online to perform prediction tasks.
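A minimal C# sketch of the 80%/20% split described above (the shuffling, the fixed seed, and the generic sample type are my assumptions, not details from the team):
using System;
using System.Linq;

static class DataSetSplitter
{
    // Shuffle the samples, then cut them into an 80% training set and a 20% validation set.
    public static (T[] Training, T[] Validation) Split<T>(T[] samples, double trainingRatio = 0.8, int seed = 42)
    {
        var rng = new Random(seed);
        T[] shuffled = samples.OrderBy(_ => rng.Next()).ToArray();
        int cut = (int)(shuffled.Length * trainingRatio);
        return (shuffled.Take(cut).ToArray(), shuffled.Skip(cut).ToArray());
    }
}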
This process is outlined in detail below:
Feature and model selection and optimization
Feature exploration
At the early stage of the project, we made sure to analyze user attributes, behavior, and requirements, in order to determine the business-relevant variables, such as user active days over the last 7 days and app use durations, through which we built a feature table.
After the features were identified, we chose a method that best suited our service and optimized its parameters through multiple rounds of experiments. Common tree boosting methods across the industry include XGBoost, random forests, and Gradient Boosted Decision Trees (GBDT). We trained our data set using these methods and found that random forests performed best. The bagging method was then adopted to improve the models' fitting and generalization capabilities.
In addition to parameter optimization, the sampling ratio was also considered, especially for the payment prediction scenario, in which the ratio of positive to negative samples was about 1:100. In such cases, both the accuracy and recall indicators should be ensured, so we adjusted the ratio of positive to negative samples to 1.5:1 during model training for payment prediction, in order to boost the recall of the model.
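The article does not say how the 1.5:1 ratio was reached, so the sketch below simply oversamples the positive class by duplication until the target ratio is met; treat it as an illustration of the idea, not the team's actual method.
using System.Collections.Generic;

static class SampleBalancer
{
    // Duplicate positive samples until positives : negatives is roughly targetRatio (e.g. 1.5 : 1).
    public static List<T> Rebalance<T>(IReadOnlyList<T> positives, IReadOnlyList<T> negatives, double targetRatio = 1.5)
    {
        int wantedPositives = (int)(negatives.Count * targetRatio);
        var balanced = new List<T>(negatives);
        for (int i = 0; i < wantedPositives; i++)
            balanced.Add(positives[i % positives.Count]); // reuse positives as many times as needed
        return balanced;
    }
}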
Hyperparameter and feature determination
Unnecessary features in a model can undermine the efficacy of its predictions or slow down model training. During experiments at this early stage, features were sorted by weight and the top features were selected. These features and the relevant hyperparameters were then configured in the model that would actually be used.
Even after a model is applied for prediction, the data still needs to be observed and analyzed to supplement necessary features. In later iterations, we added a range of features, including the event and trend features, bringing the feature count over 400.
Automatic hyperparameter search
Model training involving full features can be quite time-consuming, and fail to produce the optimal output. In addition, the optimal hyperparameters and features may vary depending on the app. Therefore, the training should be performed by app.
To address this issue, we applied the automatic hyperparameter search function to search for optimal parameters in the configured parameter space. Matched parameters are stored in a Hive table.
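As a rough illustration of searching a configured parameter space, here is a plain grid search in C#; the parameter names (tree count, max depth) and the scoring callback are assumptions for the sketch, not the actual implementation behind the Prediction service.
using System;
using System.Collections.Generic;

static class HyperparameterSearch
{
    // Try every combination in the configured space and keep the best-scoring one.
    public static (int Trees, int Depth, double Score) GridSearch(
        IEnumerable<int> treeCounts, IEnumerable<int> maxDepths, Func<int, int, double> evaluate)
    {
        var best = (Trees: 0, Depth: 0, Score: double.MinValue);
        foreach (int trees in treeCounts)
            foreach (int depth in maxDepths)
            {
                double score = evaluate(trees, depth); // e.g. accuracy on the validation set
                if (score > best.Score)
                    best = (trees, depth, score);
            }
        return best;
    }
}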
The following figures show the modeling procedure and relevant external support.
Research emphasis
We will continue optimizing our models by researching the following:
Neural networks
As the number of features continues to grow (400+ currently) and user behavior becomes too complex to mine common rules from, our prediction models will need to be enhanced to keep predictions accurate. This will require introducing neural networks with strong expressive power, in addition to the decision trees, to train models based on behavioral features.
Federated learning
Currently, data is isolated between apps and tenants. Horizontal federated learning can be used to train models across apps and tenants on a collaborative basis.
Time series features
A typical app user's device reports hundreds of events (among 1,000+ event types) and accesses nearly 100 pages within the app on a weekly basis. These time series can be used to build both short- and long-term user behavioral features, with the goal of improving prediction accuracy across a wide range of scenarios. Page-access behavioral data can be valuable for research, as such data bears the characteristics of time series data.
Feature mining and processing
The feature set is still being expanded. We will explore additional relevant features, such as the average app use interval, device attributes, download sources, and locations. In addition, we will undertake measures such as discretization, normalization, square and square root operations, and Cartesian product calculation (including across multiple data sets) to build new features based on existing ones.
For more details on HUAWEI Prediction, you can go to:
Our official website
Our Development Documentation page, to find the documents you need
Reddit, to join our developer discussion
GitHub, to download demos and sample code
Stack Overflow, to solve any integration problems
Introduction
Hello everybody! In this article, I will be introducing the brand new HMS Unity Plugin, version 2.0! Yes, it is out, and it will make your life much easier, even compared to the previous version of the plugin, 1.2.0!
The HMS Unity Plugin is a tool that helps developers quickly integrate Huawei Mobile Services (HMS) into their Unity games without worrying about the boilerplate code that has to be written in the background. All the necessary backend code is handled for you; all you have to do is focus on your own game and the HMS features you want.
In this article, I will be using my own game as a scenario to quickly explain the steps required to integrate many HMS kits with a few clicks and/or lines of code. To be specific, I am using version 2.0.1 for this article.
Before diving into the details, you can check out the plugin's GitHub page here if you think you are experienced enough with the plugin and do not need a specific scenario to start using it. The readme already has general details about how to start using the kits.
Please note that I developed this game and used the plugin for this article series in Unity editor version 2019.4.18f1. For the 2020 version of Unity, you should be able to follow the same steps with the same version of the plugin. For 2018 or other versions, please check out the releases page here for the corresponding plugin release (if any) and for the latest updates.
I further suggest that you download the latest version of the plugin for the corresponding Unity version, although I have used version 2.0.1 here. Details on how to use/import the plugin will follow, but I wanted to let you know beforehand.
About My Game
Let me quickly introduce my game to you, so that during the integration phases it makes sense why I am implementing the plugin features this way.
It is a hypercasual game called Raining Apocalypse, where you simply escape from the rain! You are a cool fire character that runs horizontally. The more rain you run away from, the more points you score. So, as you can see, it is very simple and affordable to implement.
I developed this game using a Udemy course, so the source code is not completely written by me. However, we will not focus on the game development parts anyway; we will integrate the HMS kits into the game to increase our chances of survival in the game industry.
It consists of two scenes: the main menu scene and the game scene. It has several scripts attached, some of which we will use in the integration.
I will integrate Account Kit, Ads Kit, Push Kit, Game Service, IAP, Analytics Kit, and Crash Kit into my game. In this part 1 of the article, I will talk about Account Kit, Ads Kit, Analytics Kit, Crash Kit, and Push Kit. Game Service and IAP will be covered in part 2 of the series.
It looks like a lot and seems to cost developers a lot of time, but not with Plugin 2.0, and you will see how if you read on.
Development Process
To start using the plugin, you must go to the readme of the plugin's official GitHub page (here) and complete the phases from "1 — Register your app at Huawei Developer" to "4 — Connect your game with any HMS Kit", so that the AppGallery Connect side of your application is done. HMS requires the AppGallery Connect configuration to work correctly with the in-game features you want to implement.
I will, from now on, assume that you have completed the first 3 phases and are now ready to implement the 4th phase and onwards. You can continue using the following phases for the kits as well; but here, I will merge them with a real-life scenario, i.e. my game, so that you can better understand how those features would work in your own game. This way, you can easily decide where to put those methods in your own game.
Now, assuming that you are done with the AGC side, let's get to the coding. I will also talk about the AGC side a bit for some of the kits.
Coding Phase
Let's start with Account Kit and Ads Kit. The main menu scene of my game has a logo and a play button to start the game. What I want to add is a banner ad at the bottom, plus sign-in functionality. This way, I will show 320*50 (or any size you choose) ads to users at the beginning of my game. Also, since I do not want to proceed to the game without the user signed in, I will implement the sign-in functionality so that when I use the other kits in-game, I do not have to deal with the sign-in process again.
To start using any kit, you must import the plugin (downloaded from the releases page) and tick the kits that you want to use. Mine currently looks like this:
Ticking them will add a manager to your game scene. The managers come with DontDestroyOnLoad() by default, so you do not have to worry about carrying them over to other scenes. However, it is strongly suggested that you do this in the outermost scene, so that the carry-over process is smooth and bug-free. I do it in my main menu scene because it comes first, and the game scene opens second.
I have my own manager script called EnesGameManager.cs to control the behaviour of kits and game-specific functionalities.
Before every use of the managers, you must call them with .Instance, because the plugin implements the managers with the singleton pattern. That is also helpful when changing scenes as much as you like, because the plugin will delete the unnecessary copies of the manager and make sure that you only work with the one and only instance. This will save you from coding overhead and wrong instance usage.
Account Kit
After ticking the box, HMSAccountManager is added to the scene. All you have to do is add the line below anywhere you choose, as per your own game logic. You can place it inside a new method attached to the play button's click, for example.
HMSAccountManager.Instance.SignIn();
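For example, you could attach it to the play button's click handler; the method name below is mine, not part of the plugin:
// In the main menu script (EnesGameManager.cs in my case), wired to the Play button's OnClick event.
public void OnPlayButtonClicked()
{
    HMSAccountManager.Instance.SignIn();
    // Continue to the game scene according to your own flow once the user is signed in.
}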
Important Note: If you plan to use GameService, it has automatic sign-in functionality. Instead of using Account Kit, you can just enable GameService, tick the box ("Initialize on Start") under the Game Service tab, and your app will sign in automatically every time the user opens it. If this is not what you want, using Account Kit is required and the GameService functionality should be initialized manually. All the details about this can be found in part 2 of the article, which is here.
GameService depends on Account Kit anyway, so even if you do not use Account Kit to log in (which is perfectly fine), the GameService login system will use it for you in the background.
Ads Kit — Part 1
Ads Kit supports three types of advertisements as of the publish date of this article. If you are reading this later, new ad types might have been added to the plugin, so I suggest you check those out as well. Ads Kit currently supports Banner Ads, Interstitial Ads, and Rewarded Ads. I will be using banner ads and interstitial ads in my game. Since the interstitial ads will be implemented in the game scene, in this part 1 I will only show how to implement banner ads.
To enable the ads you want to use, just tick the corresponding boxes. It is that easy. Also, if you want to use test ads like me, check that box too. It will replace any ID you enter above with the test ID, so you can, for example, test the layout of the ads in your game before getting a real ID. If you already have the ID, you can uncheck the test ID box, enter the ID, and click Save. Replacing does not mean you will lose the already-entered ID, but you will see the test ads from now on, until you uncheck the box.
The default size is 320*50 and the default position is POSITION_BOTTOM. If you require different sizes and positions, you can only change them inside the manager script in this version of the plugin. You can check the Huawei -> Scripts -> Ads folder to reach the manager script and configure it as per your needs.
If you build now, you should get your ads as shown below. You can work on the alignment yourself to match your UI elements.
There is one more thing left. This banner ad will show on every screen you open from now on. In my case, it would be present on the game screen when I press the play button, which is not what I want. Thus, in the game scene, I will call the one-line code below to hide the ad, because that is the behaviour I want. You can call this code piece in any Start() function of an active object's script.
HMSAdsKitManager.Instance.HideBannerAd();
As you can see, no initialization etc. is required. HMSAdsKitManager remains available in your other scenes and can be called directly through its instance.
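Putting it together, a game-scene script could look like the sketch below (the class name is mine; the HideBannerAd() call and the using directive come from the plugin, as covered in Tips & Tricks further down):
using HmsPlugin;
using UnityEngine;

public class GameSceneController : MonoBehaviour
{
    void Start()
    {
        // The banner was shown in the main menu; hide it while the player is in the game scene.
        HMSAdsKitManager.Instance.HideBannerAd();
    }
}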
If you do not want the banner ad on your main screen but want it in other scenes, you can always use the hide code above to hide it at the beginning and call the show method below to show it anywhere you like.
HMSAdsKitManager.Instance.ShowBannerAd();
That’s it for banner ads. Very short, very simple.
Ads Kit — Part 2
I also want to use interstitial ads in my app. The scenario is this: when the user dies, before showing the lose screen with a retry option, I want to show an interstitial ad, unless the user has purchased the remove_ads product (which I will talk about in the IAP section).
To implement this, I need to put the code in the proper place in my own game code. There is no single way to do it. If you have a similar scenario, you can follow my code, find the corresponding place in yours, and implement your own interstitial ad logic.
In my player script, I have the TakeDamage function, where I handle the damage inflicted on the player. It also handles death when the health drops to zero or below. So here, when the player dies, I call the interstitial ad show code to display the ad to the user. The if check can be ignored for now; it will be explained in the IAP section.
public void TakeDamage(int damageAmount)
{
    source.Play();
    health -= damageAmount;
    if (health <= 0)
    {
        healthDisplay.text = "0";
        // Analytics Kit: report the final score when the run ends.
        HMSAnalyticsManager.Instance.SendEventWithBundle("$GameCompleted", "Score", score.ToString());
        Destroy(gameObject);
        // Show an interstitial ad unless the player has purchased remove_ads (see the IAP section).
        if (!GameObject.Find("EnesGameManager").GetComponent<EnesGameManager>().isAdsRemoved)
            HMSAdsKitManager.Instance.ShowInterstitialAd();
        losePanel.SetActive(true);
        HMSAchievementsManager.Instance.GetAchievementsList();
    }
    else
    {
        updateHealthDisplay();
    }
}
Basically, just calling the ShowInterstitialAd() function is enough to show the ad. Once the player dies, it will be shown immediately, and after the user closes it, the lose panel will be shown.
Like most kits, Ads Kit has callbacks that you may want to implement. For example, interstitial ads have an "OnInterstitialAdClosed" callback. If you want to control what happens right after the user closes the interstitial ad, you can implement it by using the code below in the Start function.
void Start()
{
    //...
    HMSAdsKitManager.Instance.OnInterstitialAdClosed = OnInterstitialAdClosed;
}
The second "OnInterstitialAdClosed" is the name of the function that you will be defining, so it can be changed to whatever you like. You can create the function manually using the name you wrote on the right-hand side of the equals sign. However, since some callbacks come with parameters from the plugin, it is recommended that you let the IDE suggest a fix and generate the method from there automatically. This way, you will also see which parameters, if any, are returned by the callback. You can discover the callbacks supported by the other kits by typing "HMS…Manager.Instance.On…" and checking what appears.
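A matching handler in my case could be as simple as the snippet below, placed in the same script as the Start() function above; the body is just an example, so put whatever should happen after the ad closes:
private void OnInterstitialAdClosed()
{
    // Runs right after the player closes the interstitial ad.
    Debug.Log("Interstitial ad closed, resuming the game-over flow.");
}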
Analytics Kit
In the code above, you can see that an HMSAnalyticsManager instance is used. That is the whole implementation of Analytics Kit I have in my game. After I enabled it from the Kit Settings menu in the Unity editor, I can call HMSAnalyticsManager however I like, as with the other managers. Here, my scenario for Analytics Kit is sending the player's score to the server to see how many points people are achieving. You can use the SendEventWithBundle function, just as I did, to send whatever your game logic requires. The function takes an event ID, a key, and a value that can be of type string or int. That's it, and now you can check whether the parameters are arriving in the AGC console under the "Real-time Overview" tab.
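As another hedged example, you could log a custom event when a level starts; the event ID and key below are names I made up, so define them to match your own AGC event configuration:
// Send a custom event with a single key/value pair (string values in this case).
HMSAnalyticsManager.Instance.SendEventWithBundle("LevelStarted", "LevelName", "MainLevel");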
Some events are predefined. As long as you enable Analytics Kit, they will be sent to the console anyway, even if you do not code anything in your game. If you cannot see your own custom event there, I would suggest adding that event manually from the Events tab near the bottom of the AGC console. You can also look for support on the Huawei Developer website and/or forum.
Crash Kit
Crash Kit, when ticked, is automatically implemented thanks to the plugin. You should see crash reports in the AGC console if any crash occurs. You can also deliberately make your app crash to test this, but how to do so is outside the scope of this article.
Push Kit
Push Kit, like Crash Kit, is automatically enabled when you tick the box beside it. It is complete and ready to use. Simple as that: go to the AGC console, create a notification, and you should receive it at the time you set.
Tips & Tricks
Do not forget to get the agconnect-services.json file from AGC and paste it into the StreamingAssets folder. This folder must be placed inside the Assets folder of your Unity project.
In case any method or code piece that I shared does not work and you face compiler complaints, make sure that you have imported the right libraries with the "using" keyword. For any usage of the plugin, the first line below is required, and the other two are needed for most functions related to the kits. Make sure you import them to use the kits smoothly.
using HmsPlugin;
using HuaweiMobileServices.Game;
using HuaweiMobileServices.Utils;
It is perfectly normal that your game is different and that the scenario you want these kits for may differ from mine. That is why I always described my need/scenario for each kit first and then implemented it. If you are having trouble converting this to your own game logic, try to understand what I did and where, so you can implement similar functionality wherever you want it in your own game.
Please make sure that your package name ends with either .huawei or .HUAWEI if you want to use IAP in your game. If you used another package name, please change it for AppGallery; otherwise, IAP will not work. IAP will be talked about in part 2, but I wanted to warn you here, so that if you do not have an existing game yet, you can start off right.
Conclusion
In this article, we integrated many kits into our game. This was not just a "to integrate, do this" kind of article; rather, I tried to show you a real game scenario, so that while applying it to your game you will hopefully have a better understanding. With Plugin 2.0, the speed of integration has increased dramatically, reduced from days to perhaps hours depending on the complexity of your app.
I hope this article has been helpful for you. You can always ask questions below if anything is still unanswered in your mind.
The remaining two kits will be covered in part 2, here.
See you there!
References
HMS Unity Plugin 2.0 Branch (Github Page)
Documentation of every single kit in Huawei Docs (Links are present in the GitHub readme)
Check it out in the forum