Implementing Real-Time Transcription in an Easy Way - Huawei Developers

Background
The real-time onscreen subtitle is a must-have function in an ordinary video app. However, developing such a function can prove costly for small- and medium-sized developers. And even when implemented, speech recognition is often prone to inaccuracy. Fortunately, there's a better way — HUAWEI ML Kit, which is remarkably easy to integrate, and makes real-time transcription an absolute breeze!
Introduction to ML Kit
ML Kit allows your app to leverage Huawei's longstanding machine learning prowess to apply cutting-edge artificial intelligence (AI) across a wide range of contexts. With Huawei's expertise built in, ML Kit is able to provide a broad array of easy-to-use machine learning capabilities, which serve as the building blocks for tomorrow's cutting-edge AI apps. ML Kit capabilities include those related to:
Text (including text recognition, document recognition, and ID card recognition)
Language/Voice (such as real-time/on-device translation, automatic speech recognition, and real-time transcription)
Image (such as image classification, object detection and tracking, and landmark recognition)
Face/Body (such as face detection, skeleton detection, liveness detection, and face verification)
Natural language processing (text embedding)
Custom model (including the on-device inference framework and model development tool)
Real-time transcription is the service required to implement the function mentioned above.
Now let's move on to how to integrate this service.
Integrating Real-Time Transcription
Steps
1. Registering as a Huawei developer on HUAWEI Developers
2. Creating an app
Create an app in AppGallery Connect. For details, see Getting Started with Android.
3. Enabling ML Kit
4. Integrating the HMS Core SDK.
Add the AppGallery Connect configuration file by completing the steps below:
Download and copy the agconnect-services.json file to the app directory of your Android Studio project.
Call setApiKey during app initialization.
To learn more, go to Adding the AppGallery Connect Configuration File.
5. Configuring the Maven repository address
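The Maven repository address is configured in the project-level build.gradle file. Below is a minimal sketch; the repository URL matches the one used elsewhere in this guide, while the agcp plugin version shown is only illustrative:
Code:
buildscript {
    repositories {
        google()
        jcenter()
        // Huawei Maven repository hosting the HMS Core SDKs.
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        // AppGallery Connect plugin (needed for 'com.huawei.agconnect'); version is illustrative.
        classpath 'com.huawei.agconnect:agcp:1.6.0.300'
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}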
Add build dependencies.
Import the real-time transcription SDK.
Code:
implementation 'com.huawei.hms:ml-computer-voice-realtimetranscription:2.2.0.300'
Add the AppGallery Connect plugin configuration.
Method 1: Add the following information under the declaration in the file header:
Code:
apply plugin: 'com.huawei.agconnect'
Method 2: Add the plugin configuration in the plugins block.
Code:
plugins {
id 'com.android.application'
// Add the following configuration:
id 'com.huawei.agconnect'
}
Please refer to Integrating the Real-Time Transcription SDK to learn more.
Setting the cloud authentication information
When using on-cloud services of ML Kit, you can set the API key or access token (recommended) in either of the following ways:
Access token
You can use the following API to initialize the access token when the app is started. The access token does not need to be set again once initialized.
MLApplication.getInstance().setAccessToken("your access token");
API key
You can use the following API to initialize the API key when the app is started. The API key does not need to be set again once initialized.
MLApplication.getInstance().setApiKey("your ApiKey");
For details, see Notes on Using Cloud Authentication Information.
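For example, here is a minimal sketch of a custom Application subclass that initializes the API key at startup. The class name is a placeholder, and the import path assumes the ML Kit common package; remember to register the class via android:name in AndroidManifest.xml.
Code:
import android.app.Application;
import com.huawei.hms.mlsdk.common.MLApplication;

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Set once at app startup; it does not need to be set again.
        MLApplication.getInstance().setApiKey("your ApiKey");
    }
}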
Code Development​
Create and configure a speech recognizer.
Code:
MLSpeechRealTimeTranscriptionConfig config = new MLSpeechRealTimeTranscriptionConfig.Factory()
// Set the language. Currently, this service supports Mandarin Chinese, English, and French.
.setLanguage(MLSpeechRealTimeTranscriptionConstants.LAN_ZH_CN)
// Punctuate the text recognized from the speech.
.enablePunctuation(true)
// Set the sentence offset.
.enableSentenceTimeOffset(true)
// Set the word offset.
.enableWordTimeOffset(true)
// Set the application scenario. MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING indicates shopping, which is supported only for Chinese. Under this scenario, recognition for the name of Huawei products has been optimized.
.setScenes(MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING)
.create();
MLSpeechRealTimeTranscription mSpeechRecognizer = MLSpeechRealTimeTranscription.getInstance();
Create a speech recognition result listener callback.
Code:
// Use the callback to implement the MLSpeechRealTimeTranscriptionListener API and methods in the API.
protected class SpeechRecognitionListener implements MLSpeechRealTimeTranscriptionListener {
@Override
public void onStartListening() {
// The recorder starts to receive speech.
}
@Override
public void onStartingOfSpeech() {
// The user starts to speak, that is, the speech recognizer detects that the user starts to speak.
}
@Override
public void onVoiceDataReceived(byte[] data, float energy, Bundle bundle) {
// Return the original PCM stream and audio power to the user. This API is not running in the main thread, and the return result is processed in a sub-thread.
}
@Override
public void onRecognizingResults(Bundle partialResults) {
// Receive the recognized text from MLSpeechRealTimeTranscription.
}
@Override
public void onError(int error, String errorMessage) {
// Called when an error occurs in recognition.
}
@Override
public void onState(int state, Bundle params) {
// Notify the app of the status change.
}
}
The recognition result can be obtained from the listener callbacks, including onRecognizingResults. Design the UI content according to the obtained results. For example, display the text transcribed from the input speech.
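As a rough sketch, the callback can forward the recognized text to the UI thread. The Bundle key and the TextView below are placeholders (look up the actual key in MLSpeechRealTimeTranscriptionConstants), and note that the callback may arrive off the main thread:
Code:
@Override
public void onRecognizingResults(Bundle partialResults) {
    // Placeholder key: check MLSpeechRealTimeTranscriptionConstants for the actual result key.
    final String text = partialResults.getString("results_recognizing");
    // subtitleTextView is a hypothetical TextView that shows the live subtitle.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            subtitleTextView.setText(text);
        }
    });
}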
Bind the speech recognizer.
Code:
mSpeechRecognizer.setRealTimeTranscriptionListener(new SpeechRecognitionListener());
Call startRecognizing to start speech recognition.
Code:
mSpeechRecognizer.startRecognizing(config);
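Real-time transcription captures audio from the microphone, so the app must hold the RECORD_AUDIO permission before recognition starts. A minimal runtime check using standard Android APIs (the request code 1 is arbitrary):
Code:
// AndroidManifest.xml also needs: <uses-permission android:name="android.permission.RECORD_AUDIO" />
if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
        != PackageManager.PERMISSION_GRANTED) {
    // Request microphone access before starting recognition.
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.RECORD_AUDIO}, 1);
} else {
    mSpeechRecognizer.startRecognizing(config);
}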
Release resources after recognition is complete.
Code:
if (mSpeechRecognizer != null) {
mSpeechRecognizer.destroy();
}
(Optional) Obtain the list of supported languages.
Code:
MLSpeechRealTimeTranscription.getInstance()
.getLanguages(new MLSpeechRealTimeTranscription.LanguageCallback() {
@Override
public void onResult(List<String> result) {
Log.i(TAG, "support languages==" + result.toString());
}
@Override
public void onError(int errorCode, String errorMsg) {
Log.e(TAG, "errorCode:" + errorCode + ", errorMsg:" + errorMsg);
}
});
We have finished integration here, so let's test it out on a simple screen.
Tap START RECORDING. The text recognized from the input speech will be displayed in the lower portion of the screen.
We've now built a simple audio transcription function.
Eager to build a fancier UI, with stunning animations, and other effects? By all means, take your shot!
For reference:
Real-Time Transcription
Sample Code for ML Kit
To learn more, please visit:
HUAWEI Developers official website
Development Guide
Reddit to join developer discussions
GitHub or Gitee to download the demo and sample code
Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.
Original Source

Related

Huawei Flight Booking Application (App messaging and analytics integration) – Part 2

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
The flight booking app allows users to search for and book flights. In this article, we will integrate App Messaging and Analytics into the demo flight booking application.
Use Case
1. We will show an app messaging image on the app screen that lists the searched flights. When the user taps the app messaging image, it will redirect to a web page showing guidelines for international arrivals and departures. Add the following image URL in Huawei AppGallery Connect.
2. We will use Huawei Analytics to report a few events, such as successful sign-in and the number of flights available between the source and destination.
For prerequisite and set up, refer to part 1.
Huawei App messaging
App Messaging in AppGallery Connect is used to send relevant messages to target users actively using your app, to encourage them to use key app functions. For example, you can send in-app messages to encourage users to subscribe to certain products, provide tips on passing a game level, or recommend activities of a restaurant.
Implementation
1. Add dependency in the app build.gradle file.
Code:
dependencies {
implementation 'com.huawei.agconnect:agconnect-appmessaging:1.4.0.300'
}
2. We need the AAID for later use in sending in-app messages. To obtain the AAID, we will use the getAAID() method.
3. Add the following code in your project to obtain the AAID.
Code:
HmsInstanceId inst = HmsInstanceId.getInstance(this);
Task<AAIDResult> idResult = inst.getAAID();
idResult.addOnSuccessListener(new OnSuccessListener<AAIDResult>() {
@Override
public void onSuccess(AAIDResult aaidResult) {
String aaid = aaidResult.getId();
Log.d(TAG, "getAAID success:" + aaid );
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.d(TAG, "getAAID failure:" + e);
}
});
4. To initialize the AGConnectAppMessaging instance, we will use the following code.
Code:
AGConnectAppMessaging appMessaging = AGConnectAppMessaging.getInstance();
5. To allow data synchronization from the AppGallery Connect server, we will use the following code.
Code:
appMessaging.setFetchMessageEnable(true);
6. To enable message display, we will use the following code.
Code:
appMessaging.setDisplayEnable(true);
7. To specify that the in-app message data must be fetched from the AppGallery Connect server by force, we will use the following code.
Code:
appMessaging.setForceFetch();
Since we are using a test device to demonstrate in-app messaging, we use the setForceFetch API. The setForceFetch API can be used only for message testing. Also, in-app messages can only be displayed to users who have installed our officially released app version.
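Putting steps 4 to 7 together, the initialization can be wrapped in a single helper method using only the calls shown above (remember that setForceFetch is for testing only):
Code:
private void initAppMessaging() {
    // Obtain the AGConnectAppMessaging instance.
    AGConnectAppMessaging appMessaging = AGConnectAppMessaging.getInstance();
    // Allow data synchronization from the AppGallery Connect server.
    appMessaging.setFetchMessageEnable(true);
    // Enable in-app message display.
    appMessaging.setDisplayEnable(true);
    // Force-fetch message data; for message testing only.
    appMessaging.setForceFetch();
}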
Creating an Image In-App Message in AppGallery Connect
1. Sign in to AppGallery Connect and select My projects.
2. Select your project from the project list.
3. Navigate to Growing > App Messaging and click New.
4. Set Name and Description for your in-app message.
5. Set the in-app message Type. For an image in-app message, select Image.
6. Provide image URL.
7. Provide the action for image tapping.
8. Click Next to move on to the Select Sending Target section. Here we can define a condition for matching target users. In this article, we did not use any condition.
9. The next section is the Set Sending Time section.
We can schedule a date and time to send the in-app message.
We can also provide an end date and time to stop sending the message.
We can also display the message on an event by using the trigger event functionality.
We can also flexibly set the frequency for displaying the in-app message.
10. Click Next and find Set conversion events. This associates the display or tap of the app message with a conversion event, and the conversion relationship will be displayed in statistics. For now, we will keep it as none.
11. Click Save in the upper-right corner to complete message creation.
12. In-app messages can only be displayed to users who have installed our officially released app version. App Messaging allows us to test an in-app message while our app is still under testing. Find the message that you need to test, and click Test in the Operation column.
13. Provide the AAID of the test device in the text box. Click Save.
14. Click Publish.
Huawei Analytics
In this section, we will show you how to use the main APIs of HUAWEI Analytics Kit to report some custom events. The app will report events such as successful login and the number of searched flights.
1. Add the dependencies in app build.gradle file.
Code:
implementation 'com.huawei.hms:hianalytics:5.0.1.301'
2. MainActivity.java is launched once the user has successfully signed in. In onCreate(), we will call initializeAnalytics() to report the successful sign-in event.
Code:
private void initializeAnalytics() {
//Enable Analytics Kit Log
HiAnalyticsTools.enableLog();
//Generate the Analytics Instance
instance = HiAnalytics.getInstance(this);
//Parameter definition
Bundle bundle = new Bundle();
bundle.putBoolean("IsLoggedIn", true);
//Reporting event
instance.onEvent("LoginStatus", bundle);
}
3. To report the number of flights available between the origin and destination, use the following code.
Code:
httpHandler.setFlightSearchListner(new OnFightSearchListner() {
@Override
public void onSuccess(ArrayList<Quote> quoteList) {
if (quoteList != null && quoteList.size() > 0) {
Bundle bundle = new Bundle();
bundle.putBoolean("FlightSearchStatus", true);
bundle.putInt("FlightSearchCount", quoteList.size());
//Reporting event
instance.onEvent("FlightSearchStatus", bundle);
}
adapter = new OneWayFlightAdapter(quoteList);
recyclerView.setAdapter(adapter);
}
@Override
public void OnFailure(Exception e) {
// not implemented yet.. will do later
Log.e(TAG, e.getMessage());
}
});
To Check Analysis on App Gallery Connect
1. On the left side panel, click on Huawei Analyze > Behaviour Analysis > Event Analysis.
Conclusion:
In this article, we learnt how to integrate Huawei App Messaging and Analytics to report events in an application.
Can we integrate push messages along with in-app messaging?
well explained, can we load gifs also?

Integrate Nearby Service for an Enhanced Gaming Experience

HUAWEI Nearby Service is an important feature of HMS Core, designed to help users quickly find players in the vicinity, and automatically establish a low-latency, high-reliability, and zero-traffic data transmission channel, backed by underlying technologies such as Wi-Fi and Bluetooth. Integrating Nearby Service equips your game to provide users with a premium experience across the board.
1. Premium Attributes
1.1 One-Click Cross-Device Connections for Local Multiplayer Games
Current LAN games require users to be connected to the same router in order for a connection to be established. If there is no router available, users have to manually enable a hotspot, complicating the process. Nearby Service resolves this persistent issue.
1.2 Face-to-Face Grouping and Friend Adding
Nearby Service provides you with functions for creating face-to-face groups and adding friends, without having to rely on social software or GPS, facilitating free and easy user interactions.
1.3 Face-to-Face Prop Sharing
Nearby Service allows users to quickly share game props with friends, helping you acquire new users and improve user stickiness.
2. Introduction to Plugins
Two encapsulated plugins are provided here. You can directly use the two plugins in the app, and view the source code of the plugins, for a thorough understanding of how to integrate the Nearby Service.
2.1 Preparations
Development environment of Unity.
Download the plugins on GitHub.
2.2 Plugin Import
Choose Assets > Import Package > Custom Package, and click Nearby Player or Discovery Plugin on the toolbar of Unity.
Wait for the package to be processed. After that, the resource list of the plugins will be displayed. Then click Import.
2.3 Key Code
2.3.1 Nearby Player Plugin
This plugin is applicable to such scenarios as creating face-to-face groups, adding friends, and sharing props. Declare the NearbyManager class in the plugin. The class provides APIs startDiscovery() and SendMessage() for users to discover nearby players and send messages.
Call startDiscovery to discover nearby players and make the current user discoverable by nearby players when the game starts.
The code of the called API is as follows:
Code:
void Start() {
AndroidMyCallback cb = new AndroidMyCallback(this);
nm = new NearbyManager(cb);
nm.startDiscovery(randomName());
}
The callback function AndroidMyCallback is used to perform operations after a successful discovery.
Code:
// Perform the subsequent operations when a player is discovered. In this demo, the player is being added to the player list.
public override void onFoundPlayer(string endpointName, string endpointId) {
mListController.AddPlayerToList(endpointName, endpointId);
}
// Perform the subsequent operations when a player is lost. In this demo, the player is being removed from the player list.
public override void onLostPlayer(string endpointId) {
mListController.RemovePlayerFromList(endpointId);
}
// Perform the subsequent operations when a player's message is received. In this demo, only the content of the message is displayed.
public override void onReceiveMsg(string endpointName, string Msg) {
mListController.ReceiveMsg(endpointName, Msg);
}
After discovering nearby players, users can send messages to the players for grouping, adding friends, or sharing props.
Code:
// In this demo, click a player's profile picture in the player list to send a grouping invitation.
private void OnClick(string endpointId) {
nm.log("OnClick. SendMessage to " + endpointId);
nm.SendMessage(endpointId, "invites you to join a game.");
}
2.3.2 Nearby Discovery Plugin
This is a plugin developed based on Unity UNET. Users are able to connect their devices to each other, even if they are in different Wi-Fi environments. Declare the NearbyManager class, which offers two APIs: startBroadcast() and startDiscovery(). Two game devices can be connected by calling the above APIs.
The code of the called APIs is as follows:
Code:
private void OnClick() {
Button btn = this.GetComponent<Button>();
btn.enabled = false;
AndroidMyCallback androidMyCallback = new AndroidMyCallback(mNetworkManager);
NearbyManager nearbyManager = new NearbyManager(androidMyCallback);
nearbyManager.startBroadcast();
}
The callback function AndroidMyCallback is used to perform operations after a successful connection. Here, the networkManager API of UNET is called to start the game after a successful connection.
Code:
public class AndroidMyCallback : AndroidCallback {
private NetworkManager mNetworkManager;
public AndroidMyCallback(NetworkManager networkManager) : base() {
mNetworkManager = networkManager;
}
public override void onClientReady(string ipaddr) {
mNetworkManager.networkAddress = ipaddr;
mNetworkManager.StartClient();
}
public override void onServerReady(string ipaddr) {
mNetworkManager.StartHost();
}
}
2.4 Demo
Below we have provided two demos that have integrated the plugins detailed above, to provide you with a better understanding of the process.
Nearby-Player-Demo
UNET-NEARBY-DEMO
3. Game Apps That Have Integrated Nearby Service
Tic Tac Toe
Tic Tac Toe is a local battle game that was developed based on native Android APIs from Nearby Service, and released on the HUAWEI AppGallery platform.
NearbyGameSnake
NearbyGameSnake is a local multiplayer game that integrates Nearby Service. It is easy to play, and enables users to play directly in groups, without requiring an Internet connection.
4. Learn More
For more details, please visit HUAWEI Developers.
For more instructions, please visit Development Guide.
You can join the HMS Core developer discussion by going to Reddit.
You can download the demo and sample code from GitHub.
To solve integration problems, please go to Stack Overflow.

How to Integrate Unity IAP for a Game to Be Released on HUAWEI AppGallery

These posts have shown the first few steps of developing a Unity game:
Unity Editor Installation and APK Packaging
How Can I Use the HUAWEI Game Service Demo Provided by Unity
Initializing a Huawei Game and Implementing HUAWEI ID Sign-in Using Unity
I tried to upload the packaged APK file to AppGallery Connect. Let’s see how it went.
Based on Unity documents, I first completed my game information in UDP, and uploaded its demo package. But the upload failed and UdpSdkNotDetectedError was displayed.
I asked Unity technical support for help, and they told me to also integrate the UDP SDK.
The UDP SDK is used for in-app purchases. You can learn more about it from this document:
https://docs.unity3d.com/Packages/[email protected]/manual/index.html
Integration of the UDP SDK
According to the Unity documents, there are three ways to integrate the UDP SDK:
Using Unity IAP (Unity IAP adds another layer to the UDP APIs.)
UDP is included in Unity IAP from version 1.22.0 to 1.23.5. If you use the Unity IAP package (1.22.0–1.23.5), do not install the UDP package separately.
Using the UDP package
Using the UDP package and Unity IAP
Unity IAP 2.0.0 and later does not contain the UDP package. You need to download both.
Latest versions on the Unity Asset Store (as of 2020-11-24):
UDP version: 2.0.0
Unity IAP version: 2.2.2
Unity official documentation:
https://docs.unity3d.com/Packages/[email protected]/manual/index.html
https://distribute.dashboard.unity.com/udp/guideDoc/HUAWEI
Importing a UDP Package
If it's your first time integrating the UDP SDK, just download the latest version of the UDP package from the Unity Asset Store.
Here’s the address:
https://assetstore.unity.com/packages/add-ons/services/billing/unity-distribution-portal-138507
Import the resource to Unity Editor.
If you see Unity Distribution Portal here, the resource has been imported.
Associating a Unity Project with a UDP Client ID
Document: https://docs.unity3d.com/Packages/[email protected]/manual/games-with-iap.html
Before you integrate Unity IAP, you need to associate your Unity project with a UDP client ID. Follow the instructions below.
Integrating the Unity IAP API.
Reference:
https://docs.unity3d.com/Packages/[email protected]/manual/games-with-iap.html
Before You Start
If you're still confused by now, don't worry: check the sample code provided by Unity. You can view it here after you have downloaded the UDP package.
Integrating and Testing the StoreService.Initialize API
Integration
According to Unity, this API must be called before your game is released to other app stores using UDP.
I will continue with my earlier integrations, and here are a few tips for you:
1. Add a button. Right-click HeaderPanel and choose UI > Button. Rename it Initialize.
2. Write code for the button.
Key points:
1) Add a namespace for UDP.
2) Declare the Initialize button, the callback listener of the API, and the boolean value.
C#:
private static InitListener m_InitListener = new InitListener();
// IAP initialized flag
private static bool m_Initialized;
private static Button initialize_button;
3) Capture the Initialize button in initUI().
C#:
initialize_button = GameObject.Find("Initialize").GetComponent<Button>();
4) Add the following code to private void initListeners():
C#:
initialize_button.onClick.AddListener(() =>
{
Show("starting initialize");
StoreService.Initialize(m_InitListener);
Show("initialize finished");
});
5) If you want the StoreService.Initialize API to be automatically called at app launch, add the following code:
C#:
void Start()
{
initUI();
initListeners();
// Call the Appinit method in your game code
appInit();
// Call the Initialize method in your game code
initialize();
}
private void initialize()
{
m_Initialized = false;
Debug.Log("Init button is clicked.");
Show("Initializing");
StoreService.Initialize(m_InitListener);
}
6) Edit the callback processing logic.
C#:
public class InitListener : IInitListener
{
public void OnInitialized(UserInfo userInfo)
{
Debug.Log("[Game]On Initialized suceeded");
Show("Initialize succeeded");
m_Initialized = true;
}
public void OnInitializeFailed(string message)
{
Debug.Log("[Game]OnInitializeFailed: " + message);
Show("Initialize Failed: " + message);
}
}
Packaging and Testing
The following information indicates that the API is called successfully.
Adding a Sandbox Test Account
If the following message is displayed, you need to add a sandbox account to check whether the IAP API is integrated successfully before app release.
Check this document:
https://docs.unity3d.com/Packages/[email protected]/manual/creating-a-game-on-udp.html
Make sure to click SAVE after adding the account.
The following information indicates that the API is called successfully.
Integrating and Testing the StoreService.QueryInventory API
API Functions
This API is used to query HUAWEI IAP for any missing transactions.
For consumables, if a product is purchased but not delivered, this API returns the product ID. In other words, it returns the consumable with a payment exception.
For non-consumables, this API returns all non-consumables that have been purchased by the player.
Integration
Similar to the integration of StoreService.Initialize.
Packaging and Testing
When calling this API, you can pass the product ID you want to query. If you pass one, you’ll see the corresponding information here.
After launching the app, you’ll see the following screen:
Creating a Product
To better test the IAP API, you can create a few products. Check this document:
https://docs.unity3d.com/Packages/c...#managing-in-app-purchases-on-the-udp-console
Here, I created two products.
Note: Currently, Unity IAP does not support subscriptions for games released on HUAWEI AppGallery.
Integrating and Testing the StoreService.Purchase API
Integration
Similar to the integration of StoreService.Initialize.
Packaging and Testing
Non-consumables
StoreService.Purchase("non_consumable_product01", "payload", m_PurchaseListener);
Here, non_consumable_product01 is a non-consumable I created in UDP.
After a successful purchase, call the API again. The following message is displayed.
Call StoreService.QueryInventory(productIds, m_PurchaseListener);
Define productIds as follows:
List<string> productIds = new List<string> { “consumable_product01”, “non_consumable_product01” };
Now, you can see the information of the two products I created and the non-consumable I just purchased.
Consumables
After a consumable is purchased, it will be delivered and consumed.
If a consumable fails to be delivered after being purchased, you can call the StoreService.QueryInventory API to query its information, which is similar to HUAWEI IAP.
1) The consumable is not consumed after purchase.
Call this API:
StoreService.Purchase("non_consumable_product01", "payload", m_PurchaseListener);
A message is displayed, indicating that the product needs to be consumed before another purchase.
2) StoreService.QueryInventory(productIds, m_PurchaseListener);
Define productIds as follows:
C#:
List<string> productIds = new List<string> { "consumable_product01", "non_consumable_product01" };
The consumable is included in the purchased products that are returned.
3) Call this API to consume the product:
C#:
StoreService.ConsumePurchase(PurchaseInfo,IPurchaseListener)
According to the official document, PurchaseInfo is returned by the onPurchase method. You can obtain it from the callback of StoreService.QueryInventory or StoreService.Purchase.
Here, I called the StoreService.QueryInventory API, obtained the consumable, and consumed it:
C#:
StoreService.ConsumePurchase(inventory.GetPurchaseInfo("consumable_product01"), this);
The following information indicates that the product is consumed.
Note:
After the consumption, if you call the StoreService.QueryInventory API again, you’ll see that there’s no consumable left. The query and redeliver process is complete.
Server APIs involved in Unity IAP
According to Unity, if the server APIs fail to receive the payment callback from an app store, UDP will return a payment failure no matter what the actual payment result is. Unity also provides some server APIs for you to integrate.
I’m not going to show you the demo test. You can try it on your own.
Payment Callback API
If you have your own game server, you can receive successful payment notifications sent by UDP.
For details, check this document:
https://docs.unity3d.com/Packages/c...ames-with-iap.html#game-client-implementation
Configure the callback URL here.
Order Query API
You can also send an HTTP GET request for order details and verify its information.
For details, check this document:
https://docs.unity3d.com/Packages/c...ames-with-iap.html#server-side-implementation
Very useful if you use unity.
Rebis said:
Very useful if you use unity.
Thank you for your appreciation.

Implement Virtual Try-on With Hand Skeleton Tracking

You have likely seen user reviews complaining about the online shopping experience, in particular the inability to try on clothing items before purchase. Augmented reality (AR) enabled virtual try-on has resolved this longstanding issue, making it possible for users to try on items before buying them.
Virtual try-on allows the user to try on clothing, or accessories like watches, glasses, and makeup, virtually on their phone. Apps that offer AR try-on features empower their users to make informed purchases, based on which items look best and fit best, and therefore considerably improve the online shopping experience for users. For merchants, AR try-on can both boost conversion rates and reduce return rates, as customers are more likely to be satisfied with what they have purchased after the try-on. That is why so many online stores and apps are now providing virtual try-on features of their own.
When developing an online shopping app, AR is truly a technology that you can't miss. For example, if you are building an app or platform for watch sellers, you will want to provide a virtual watch try-on feature, which is dependent on real-time hand recognition and tracking. This can be done with remarkable ease in HMS Core AR Engine, which provides a wide range of basic AR capabilities, including hand skeleton tracking, human body tracking, and face tracking. Once you have integrated this tool kit, your users will be able to try on different watches virtually within your app before purchases. Better yet, the development process is highly streamlined. During the virtual try-on, the user's hand skeleton is recognized in real time by the engine, with a high degree of precision, and virtual objects are superimposed on the hand. The user can even choose to place an item on their fingertip! Next I will show you how you can implement this marvelous capability.
Demo​
Implementation
AR Engine provides a hand skeleton tracking capability, which identifies and tracks the positions and postures of up to 21 hand skeleton points, forming a hand skeleton model.
Thanks to the gesture recognition capability, the engine is able to provide AR apps with fun, interactive features. For example, your app will allow users to place virtual objects in specific positions, such as on the fingertips or in the palm, and enable the virtual hand to perform intricate movements.
Now I will show you how to develop an app that implements AR watch virtual try-on based on this engine.
Integration Procedure
Requirements on the Development Environment
JDK: 1.8.211 or later
Android Studio: 3.0 or later
minSdkVersion: 26 or later
targetSdkVersion: 29 (recommended)
compileSdkVersion: 29 (recommended)
Gradle version: 6.1.1 or later (recommended)
Make sure that you have downloaded the AR Engine APK from AppGallery and installed it on the device.
If you need to use multiple HMS Core kits, use the latest versions required for these kits.
Preparations
1. Before getting started, you will need to register as a Huawei developer and complete identity verification on the HUAWEI Developers website. You can click here to find out the detailed registration and identity verification procedure.
2. Before getting started, integrate the AR Engine SDK via the Maven repository into your development environment.
3. The procedure for configuring the Maven repository address in Android Studio varies for Gradle plugin earlier than 7.0, Gradle plugin 7.0, and Gradle plugin 7.1 or later. You need to configure it according to the specific Gradle plugin version.
4. Take Gradle plugin 7.0 as an example:
Open the project-level build.gradle file in your Android Studio project and configure the Maven repository address.
Go to buildscript > repositories and configure the Maven repository address for the SDK.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}
Open the project-level settings.gradle file and configure the Maven repository address for the HMS Core SDK.
Code:
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        google()
        jcenter()
        maven { url "https://developer.huawei.com/repo/" }
    }
}
5. Add the following build dependency in the dependencies block.
Code:
dependencies {
implementation 'com.huawei.hms:arenginesdk:{version}'
}
App Development
1. Check whether AR Engine has been installed on the current device. If so, your app will be able to run properly on the device. If not, you need to prompt the user to install AR Engine, for example, by redirecting the user to AppGallery and prompting installation. The sample code is as follows:
Code:
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
// ConnectAppMarketActivity.class is the activity for redirecting users to AppGallery.
startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
isRemindInstall = true;
}
2. Initialize an AR scene. AR Engine supports five scenes, including motion tracking (ARWorldTrackingConfig) scene, face tracking (ARFaceTrackingConfig) scene, hand recognition (ARHandTrackingConfig) scene, human body tracking (ARBodyTrackingConfig) scene, and image recognition (ARImageTrackingConfig) scene.
Call ARHandTrackingConfig to initialize the hand recognition scene.
Code:
mArSession = new ARSession(context);
ARHandTrackingConfig config = new ARHandTrackingConfig(mArSession);
3. After obtaining an ARHandTrackingConfig object, you can set the front or rear camera. The sample code is as follows:
Code:
config.setCameraLensFacing(ARConfigBase.CameraLensFacing.FRONT);
4. After obtaining config, configure it in the ARSession, and start hand recognition.
Code:
mArSession.configure(config);
mArSession.resume();
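The session should also follow the activity lifecycle. The sketch below assumes ARSession exposes pause() and stop() as counterparts to resume(); verify the method names against the AR Engine API reference:
Code:
@Override
protected void onPause() {
    super.onPause();
    if (mArSession != null) {
        // Assumed API: suspend tracking while the activity is in the background.
        mArSession.pause();
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (mArSession != null) {
        // Assumed API: release the session's resources.
        mArSession.stop();
    }
}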
5. Initialize the HandSkeletonLineDisplay class, which draws the hand skeleton based on the coordinates of the hand skeleton points.
Code:
class HandSkeletonLineDisplay implements HandRelatedDisplay {
    // Methods used in this class are as follows:
    // Initialization method.
    public void init() {
    }
    // Method for drawing the hand skeleton. When calling this method, you need to pass the ARHand object to obtain data.
    public void onDrawFrame(Collection<ARHand> hands) {
        for (ARHand hand : hands) {
            // Call the getHandskeletonArray() method to obtain the coordinates of hand skeleton points.
            float[] handSkeletons = hand.getHandskeletonArray();
            // Pass handSkeletons to the method for updating data in real time.
            updateHandSkeletonsData(handSkeletons);
        }
    }
    // Method for updating the hand skeleton point connection data. Call this method when any frame is updated.
    public void updateHandSkeletonLinesData() {
        // Create and initialize the data stored in the buffer object.
        GLES20.glBufferData(..., mVboSize, ...);
        // Update the data in the buffer object.
        GLES20.glBufferSubData(..., mPointsNum, ...);
    }
}
6. Initialize the HandRenderManager class, which is used to render the data obtained from AR Engine.
Code:
public class HandRenderManager implements GLSurfaceView.Renderer {
    // Set the ARSession object to obtain the latest data in the onDrawFrame method.
    public void setArSession() {
    }
}
7. Initialize the onDrawFrame() method in the HandRenderManager class.
Code:
public void onDrawFrame() {
    // In this method, call methods such as setCameraTextureName() and update() to update the calculation result of AR Engine.
    // Call this API when the latest data is obtained.
    mSession.setCameraTextureName();
    ARFrame arFrame = mSession.update();
    ARCamera arCamera = arFrame.getCamera();
    // Obtain the tracking result returned during hand tracking.
    Collection<ARHand> hands = mSession.getAllTrackables(ARHand.class);
    // Pass each obtained hand object to the method that updates the gesture recognition information for processing.
    for (ARHand hand : hands) {
        updateMessageData(hand);
    }
}
8. On the HandActivity page, set a render for SurfaceView.
Code:
mSurfaceView.setRenderer(mHandRenderManager);
// Set the rendering mode.
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
Conclusion
Augmented reality creates immersive, digital experiences that bridge the digital and real worlds, making human-machine interactions more seamless than ever. Fields like gaming, online shopping, tourism, medical training, and interior decoration have seen surging demand for AR apps and devices. In particular, AR is expected to dominate the future of online shopping, as it offers immersive experiences based on real-time interactions with virtual products, which is what younger generations are seeking. This considerably improves the user's shopping experience, and as a result greatly helps merchants improve the conversion rate and reduce the return rate. If you are developing an online shopping app, virtual try-on is a must-have feature, and AR Engine can give you everything you need. Try the engine to experience the smart, interactive features it can bring to users, and see how it can streamline your development.

Obtain User Consent When Requesting Personalized Ads

Conventional pop-up ads and roll ads in apps not only frustrate users, but are a headache for advertisers. This is because on the one hand, advertising is expensive, but on the other hand, these ads do not necessarily reach their target audience. The emergence of personalized ads has proved a game changer.
To ensure ads are actually sent to their intended audience, publishers usually need to collect the personal data of users to determine their characteristics, hobbies, recent requirements, and more, and then push targeted ads in apps. Some users are unwilling to share privacy data to receive personalized ads. Therefore, if an app needs to collect, use, and share users' personal data for the purpose of personalized ads, valid consent from users must be obtained first.
HUAWEI Ads provides the capability of obtaining user consent. In countries/regions with strict privacy requirements, it is recommended that publishers access the personalized ad service through the HUAWEI Ads SDK and share personal data that has been collected and processed with HUAWEI Ads. HUAWEI Ads reserves the right to monitor the privacy and data compliance of publishers. By default, personalized ads are returned for ad requests to HUAWEI Ads, and the ads are filtered based on the user's previously collected data. HUAWEI Ads also supports ad request settings for non-personalized ads. For details, please refer to "Personalized Ads and Non-personalized Ads" in the HUAWEI Ads Privacy and Data Security Policies.
To obtain user consent, you can use the Consent SDK provided by HUAWEI Ads or the CMP that complies with IAB TCF v2.0. For details, see Integration with IAB TCF v2.0.
Let's see how the Consent SDK can be used to request user consent and how to request ads accordingly.
Development Procedure
To begin with, you will need to integrate the HMS Core SDK and HUAWEI Ads SDK. For details, see the development guide.
Using the Consent SDK​
1. Integrate the Consent SDK.
a. Configure the Maven repository address.
The repository configuration in Android Studio differs for Gradle plugin versions earlier than 7.0, Gradle plugin 7.0, and Gradle plugin 7.1 or later. Select the corresponding configuration procedure based on your Gradle plugin version.
b. Add build dependencies to the app-level build.gradle file.
Replace {version} with the actual version number. For details about the version number, please refer to the version updates. The sample code is as follows:
Code:
dependencies {
implementation 'com.huawei.hms:ads-consent:3.4.54.300'
}
After completing all the preceding configurations, click the synchronization icon on the toolbar to synchronize the build.gradle file and download the dependencies.
2. Update the user consent status.
When using the Consent SDK, ensure that the Consent SDK obtains the latest information about the ad technology providers of HUAWEI Ads. If the list of ad technology providers changes after the user consent is obtained, the Consent SDK will automatically set the user consent status to UNKNOWN. This means that every time the app is launched, you should call the requestConsentUpdate() method to determine the user consent status. The sample code is as follows:
Code:
...
import com.huawei.hms.ads.consent.*;
...
public class ConsentActivity extends BaseActivity {
...
@Override
protected void onCreate(Bundle savedInstanceState) {
...
// Check the user consent status.
checkConsentStatus();
...
}
...
private void checkConsentStatus() {
...
Consent consentInfo = Consent.getInstance(this);
...
consentInfo.requestConsentUpdate(new ConsentUpdateListener() {
@Override
public void onSuccess(ConsentStatus consentStatus, boolean isNeedConsent, List<AdProvider> adProviders) {
// User consent status successfully updated.
...
}
@Override
public void onFail(String errorDescription) {
// Failed to update user consent status.
...
}
});
...
}
...
}
If the user consent status is successfully updated, the onSuccess() method of ConsentUpdateListener provides the updated ConsentStatus (specifies the consent status), isNeedConsent (specifies whether consent is required), and adProviders (specifies the list of ad technology providers).
3. Obtain user consent.
You need to obtain the consent (for example, in a dialog box) of a user and display a complete list of ad technology providers. The following example shows how to obtain user consent in a dialog box:
a. Collect consent in a dialog box.
The sample code is as follows:
Code:
...
import com.huawei.hms.ads.consent.*;
...
public class ConsentActivity extends BaseActivity {
...
@Override
protected void onCreate(Bundle savedInstanceState) {
...
// Check the user consent status.
checkConsentStatus();
...
}
...
private void checkConsentStatus() {
...
Consent consentInfo = Consent.getInstance(this);
...
consentInfo.requestConsentUpdate(new ConsentUpdateListener() {
@Override
public void onSuccess(ConsentStatus consentStatus, boolean isNeedConsent, List<AdProvider> adProviders) {
...
// The parameter indicating whether the consent is required is returned.
if (isNeedConsent) {
// If ConsentStatus is set to UNKNOWN, ask for user consent again.
if (consentStatus == ConsentStatus.UNKNOWN) {
...
showConsentDialog();
}
// If ConsentStatus is set to PERSONALIZED or NON_PERSONALIZED, no dialog box is displayed to ask for user consent.
else {
...
}
} else {
...
}
}
@Override
public void onFail(String errorDescription) {
...
}
});
...
}
...
private void showConsentDialog() {
// Start to process the consent dialog box.
ConsentDialog dialog = new ConsentDialog(this, mAdProviders);
dialog.setCallback(this);
dialog.setCanceledOnTouchOutside(false);
dialog.show();
}
}
Sample dialog box
Note: The images in this section are for reference only. Design the UI based on the privacy page.
More information will be displayed if users tap here.
b. Display the list of ad technology providers.
Display the names of ad technology providers to the user and allow the user to access the privacy policies of the ad technology providers.
After a user taps here on the information screen, the list of ad technology providers should appear in a dialog box, as shown in the following figure.
Note: This image is for reference only. Design the UI based on the privacy page.
c. Set consent status.
After obtaining the user's consent, use the setConsentStatus() method to set their consent status. The sample code is as follows:
Code:
Consent.getInstance(getApplicationContext()).setConsentStatus(ConsentStatus.PERSONALIZED);
d. Set the tag indicating whether a user is under the age of consent.
If you want to request ads for users under the age of consent, call setUnderAgeOfPromise to set the tag for such users before calling requestConsentUpdate().
Code:
// Set the tag indicating whether a user is under the age of consent.
Consent.getInstance(getApplicationContext()).setUnderAgeOfPromise(true);
If setUnderAgeOfPromise is set to true, the onFail (String errorDescription) method is called back each time requestConsentUpdate() is called, and the errorDescription parameter is provided. In this case, do not display the dialog box for obtaining consent. The value false indicates that a user has reached the age of consent.
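Below is a minimal sketch of how an app might branch on this behavior, reusing the requestConsentUpdate() listener shape from the samples above (the log tag is a placeholder):
Code:
Consent consentInfo = Consent.getInstance(getApplicationContext());
// Tag the user as under the age of consent before requesting the update.
consentInfo.setUnderAgeOfPromise(true);
consentInfo.requestConsentUpdate(new ConsentUpdateListener() {
    @Override
    public void onSuccess(ConsentStatus consentStatus, boolean isNeedConsent, List<AdProvider> adProviders) {
        // Not reached while setUnderAgeOfPromise(true) is in effect.
    }
    @Override
    public void onFail(String errorDescription) {
        // Expected callback for under-age users: skip the consent dialog
        // and request only non-personalized ads.
        Log.i("Consent", "Under age of consent: " + errorDescription);
    }
});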
4. Load ads according to user consent.
By default, the setNonPersonalizedAd method is not called when ads are requested, in which case both personalized and non-personalized ads are requested. If a user has not selected a consent option, only non-personalized ads should be requested.
The parameter of the setNonPersonalizedAd method can be set to the following values:
ALLOW_ALL: personalized and non-personalized ads.
ALLOW_NON_PERSONALIZED: non-personalized ads.
The sample code is as follows:
Code:
// Set the parameter in setNonPersonalizedAd to ALLOW_NON_PERSONALIZED to request only non-personalized ads.
RequestOptions requestOptions = HwAds.getRequestOptions();
requestOptions = requestOptions.toBuilder().setNonPersonalizedAd(ALLOW_NON_PERSONALIZED).build();
HwAds.setRequestOptions(requestOptions);
AdParam adParam = new AdParam.Builder().build();
adView.loadAd(adParam);
Testing the Consent SDK​
To simplify app testing, the Consent SDK provides debug options that you can set.
1. Call getTestDeviceId() to obtain the ID of your device.
The sample code is as follows:
Code:
String testDeviceId = Consent.getInstance(getApplicationContext()).getTestDeviceId();
2. Use the obtained device ID to add your device as a test device to the trustlist.
The sample code is as follows:
Code:
Consent.getInstance(getApplicationContext()).addTestDeviceId(testDeviceId);
3. Call setDebugNeedConsent to set whether consent is required.
The sample code is as follows:
Code:
// Require consent for debugging. In this case, the value of isNeedConsent returned by the ConsentUpdateListener method is true.
Consent.getInstance(getApplicationContext()).setDebugNeedConsent(DebugNeedConsent.DEBUG_NEED_CONSENT);
// Not to require consent for debugging. In this case, the value of isNeedConsent returned by the ConsentUpdateListener method is false.
Consent.getInstance(getApplicationContext()).setDebugNeedConsent(DebugNeedConsent.DEBUG_NOT_NEED_CONSENT);
After these steps are complete, the value of isNeedConsent will be returned based on your debug status when calls are made to update the consent status.
For more information about the Consent SDK, please refer to the sample code.
References​
Ads Kit
Development Guide of Ads Kit
