Building High-Precision Location Services with Location Kit - Huawei Developers

HUAWEI Location Kit provides you with the tools to build ultra-precise location services into your apps, by utilizing GNSS, Wi-Fi, base stations, and a range of cutting-edge hybrid positioning technologies. Location Kit-supported solutions give your apps a leg up in a ruthlessly competitive marketplace, making it easier than ever for you to serve a vast, global user base.
Location Kit currently offers three main functions: fused location, geofence, and activity identification. When used in conjunction with the Map SDK, which is supported in 200+ countries and regions and 100+ languages, you'll be able to bolster your apps with premium mapping services that enjoy a truly global reach.
Fused location provides easy-to-use APIs that are capable of obtaining the user's location with meticulous accuracy, while consuming a minimal amount of power. HW NLP, Huawei's exclusive network location service, makes use of crowdsourced data to achieve heightened accuracy. Such high-precision, cost-effective positioning has enormous implications for a broad array of mobile services, including ride hailing, navigation, food delivery, travel, and lifestyle services, providing customers and service providers alike with the high-value, real-time information that they need.
To avoid boring you with the technical details, we've provided some specific examples of how positioning systems, geofence, activity identification, map display and route planning services can be applied in the real world.
For instance, you can use Location Kit to obtain the user's current location and create a geofence with a 500-meter radius around it. When the geofence is triggered, the app can determine the user's activity status, automatically plan a route based on that status (for example, a walking route when the activity is identified as walking), and show it on the map.
This article addresses the following functions:
Fused location: Incorporates GNSS, Wi-Fi, and base station data via easy-to-use APIs, making it easy for your app to obtain device location information.
Activity identification: Identifies the user's motion status, using the acceleration sensor, network information, and magnetometer, so that you can tailor your app to account for the user's behavior.
Geofence: Allows you to set virtual geographic boundaries via APIs, to send out timely notifications when users enter, exit, or remain within the boundaries.
Map display: Includes the map display, interactive features, map drawing, custom map styles, and a range of other features.
Route planning: Provides HTTP/HTTPS APIs for you to initiate requests using HTTP/HTTPS, and obtain the returned data in JSON format.
Usage scenarios:
Using high-precision positioning technology to obtain real-time location and tracking data for delivery or logistics personnel, for optimally efficient services. In the event of accidents or emergencies, the location of personnel can also be obtained with ease, ensuring quick rescue.
Creating a geofence in the system, which can be used to monitor an important or dangerous area at all times. If someone enters such an area without authorization, the system could send out a proactive alert. This solution can also be linked with onsite video surveillance equipment. When an alert is triggered, the video surveillance camera could pop up to provide continual monitoring, free of any blind spots.
Tracking patients with special needs in hospitals and elderly residents in nursing homes, in order to provide them with the best possible care. Positioning services could be linked with wearable devices, for attentive 24/7 care in real time.
Using the map to directly find destinations, and perform automatic route planning.
I. Advantages of Location Kit and Map Kit
Low-power consumption (Location Kit): Implements geofence using the chipset, for optimized power efficiency
High precision (Location Kit): Optimizes positioning accuracy in urban canyons, correctly identifying which side of the road the user is on, and delivers sub-meter positioning accuracy in open areas with RTK (real-time kinematic) support. Personal information, activity identification results, and other data are not uploaded to the server while location services are performed. As the data processor, Location Kit only uses data, and does not store it.
Personalized map displays (Map Kit): Offers enriching map elements and a wide range of interactive methods for building your map.
Broad-ranging place searches (Map Kit): Covers 130+ million POIs and 150+ million addresses, and supports place input prompts.
Global coverage: Supports 200+ countries/regions, and 40+ languages.
For more information and development guides, please visit: https://developer.huawei.com/consumer/en/hms/huawei-MapKit
II. Demo App Introduction
In order to illustrate how to integrate Location Kit and Map Kit both easily and efficiently, we've provided a case study here, which shows the simplest coding method for running the demo.
This app creates a geofence on the map based on the user's location when the app is opened. The user can drag the red marker to set a destination. After the destination is confirmed, when the user triggers the geofence condition, the app automatically detects their activity status and plans a route accordingly, such as a walking route if the activity status is walking, or a cycling route if it is cycling. You can also implement real-time voice navigation for the planned route.
III. Development Practice
You need to set the priority (which is 100 by default) before requesting locations. To request the precise GPS location, set the priority to 100. To request the network location, set the priority to 102 or 104. If you only need to passively receive locations, set the priority to 105.
Parameters related to activity identification include VEHICLE (100), BIKE (101), FOOT (102), and STILL (103).
Geofence-related parameters include ENTER_GEOFENCE_CONVERSION (1), EXIT_GEOFENCE_CONVERSION (2), and DWELL_GEOFENCE_CONVERSION (4).
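These integer codes can be combined and checked with plain bitwise operations. The following self-contained sketch (class and helper names are illustrative, not part of the Kit) shows how the three geofence conversion values combine into the 7 passed to setInitConversions later in this article, and how an activity code might map to a route-planning mode:

```java
// Plain-Java sketch of the integer codes above; KitCodes and its helpers are
// hypothetical names for illustration, not HMS Core APIs.
public class KitCodes {
    static final int ENTER_GEOFENCE_CONVERSION = 1;
    static final int EXIT_GEOFENCE_CONVERSION = 2;
    static final int DWELL_GEOFENCE_CONVERSION = 4;

    // The conversions parameter is a bitmask: ENTER | EXIT | DWELL == 7.
    static boolean listensFor(int initConversions, int conversion) {
        return (initConversions & conversion) != 0;
    }

    // Hypothetical helper: pick a route-planning mode from an activity code.
    static String routeModeFor(int activityType) {
        switch (activityType) {
            case 100: return "driving";   // VEHICLE
            case 101: return "bicycling"; // BIKE
            case 102: return "walking";   // FOOT
            default:  return "none";      // STILL (103) or unknown
        }
    }

    public static void main(String[] args) {
        int all = ENTER_GEOFENCE_CONVERSION | EXIT_GEOFENCE_CONVERSION | DWELL_GEOFENCE_CONVERSION;
        System.out.println(all);                                       // 7
        System.out.println(listensFor(all, EXIT_GEOFENCE_CONVERSION)); // true
        System.out.println(routeModeFor(102));                         // walking
    }
}
```

This is why the demo code later calls setInitConversions(7): it registers for all three geofence events at once.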
The following describes how to run the demo using source code, helping you understand the implementation details.
Preparations
1. Preparing Tools
Huawei phones (testing on multiple devices is recommended)
Android Studio
2. Registering as a Developer
Register as a Huawei developer.
Create an app in AppGallery Connect.
Create an app in AppGallery Connect by referring to Location Kit development preparations or Map Kit development preparations.
Enable Location Kit and Map Kit for the app on the Manage APIs page.
Add the SHA-256 certificate fingerprint.
Download the agconnect-services.json file and add it to the app directory of the project.
Create an Android demo project.
Learn about the function restrictions.
To use the route planning function of Map Kit, refer to Supported Countries/Regions (Route Planning).
To use other services of Map Kit, refer to Supported Countries/Regions.
Device Type | Feature | OS Version | HMS Core (APK) Version
--- | --- | --- | ---
Huawei phones | Fused location | EMUI 5.0 or later | 4.0.0 or later
Huawei phones | Geofence | EMUI 8.0 or later | 4.0.4 or later
Huawei phones | Activity identification | EMUI 9.1.1 or later | 3.0.2 or later
Non-Huawei Android phones | Fused location | Android 8.0 or later (API level 26 or higher) | 4.0.1 or later
Non-Huawei Android phones | Geofence | Android 5.0 or later (API level 21 or higher) | 4.0.4 or later
Non-Huawei Android phones | Activity identification | Not supported | Not supported
Running the Demo App
After successfully debugging the project in Android Studio, install the app on the test device.
Replace the project package name and JSON file with those of your own.
Tap the related button in the demo app to create a geofence with a radius of 200 meters, centered on the current location automatically pinpointed by the demo app.
Drag the mark point on the map to select a destination.
View the route that is automatically planned based on the current activity status when the geofence is triggered.
The following figure shows the demo effect:
Key Steps
1. Add the Huawei Maven repository address to the project-level build.gradle file of your Android Studio project:
Code:
buildscript {
    repositories {
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        ...
        // Add the AppGallery Connect plugin configuration.
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}

allprojects {
    repositories {
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
2. Add dependencies on the SDKs in the app-level build.gradle file.
Code:
dependencies {
    implementation 'com.huawei.hms:location:5.1.0.300'
    implementation 'com.huawei.hms:maps:5.2.0.302'
}
3. Add the following configuration to the next line under apply plugin: 'com.android.application' in the file header:
apply plugin: 'com.huawei.agconnect'
Note:
You must configure apply plugin: 'com.huawei.agconnect' under apply plugin: 'com.android.application'.
The minimum Android API level (minSdkVersion) required for the HMS Core Map SDK is 19.
4. Declare system permissions in the AndroidManifest.xml file.
Location Kit uses GNSS, Wi-Fi, and base station data for fused location, enabling your app to quickly and accurately obtain users' location information. Therefore, Location Kit requires permissions to access Internet, obtain the fine location, and obtain the coarse location. If your app needs to continuously obtain the location information when it runs in the background, you also need to declare the ACCESS_BACKGROUND_LOCATION permission in the AndroidManifest.xml file:
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
Note: Because ACCESS_FINE_LOCATION, WRITE_EXTERNAL_STORAGE, READ_EXTERNAL_STORAGE, and ACTIVITY_RECOGNITION are dangerous system permissions, you need to request them dynamically at runtime. If your app does not have these permissions, Location Kit will refuse to provide services for it.
Key Code
I. Map Display
Currently, the Map SDK supports two map containers: SupportMapFragment and MapView. This document uses the SupportMapFragment container.
Add a Fragment object in the layout file (for example: activity_main.xml), and set map attributes in the file.
Code:
<fragment
android:id="@+id/mapfragment_routeplanningdemo"
android:name="com.huawei.hms.maps.SupportMapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent" />
To use a map in your app, implement the OnMapReadyCallback interface.
public class RoutePlanningActivity extends AppCompatActivity implements OnMapReadyCallback
Load the SupportMapFragment in the onCreate method and call getMapAsync to register the callback.
Fragment fragment = getSupportFragmentManager().findFragmentById(R.id.mapfragment_routeplanningdemo);
if (fragment instanceof SupportMapFragment) {
    SupportMapFragment mSupportMapFragment = (SupportMapFragment) fragment;
    mSupportMapFragment.getMapAsync(this);
}
Implement the onMapReady callback to obtain the HuaweiMap object.
Code:
@Override
public void onMapReady(HuaweiMap huaweiMap) {
    hMap = huaweiMap;
    hMap.setMyLocationEnabled(true);
    hMap.getUiSettings().setMyLocationButtonEnabled(true);
}
II. Function Implementation
Check the permissions.
Code:
if (Build.VERSION.SDK_INT <= Build.VERSION_CODES.P) {
    if (ActivityCompat.checkSelfPermission(context,
            "com.huawei.hms.permission.ACTIVITY_RECOGNITION") != PackageManager.PERMISSION_GRANTED) {
        String[] permissions = {"com.huawei.hms.permission.ACTIVITY_RECOGNITION"};
        ActivityCompat.requestPermissions((Activity) context, permissions, 1);
        Log.i(TAG, "requestActivityTransitionButtonHandler: apply permission");
    }
} else {
    if (ActivityCompat.checkSelfPermission(context,
            "android.permission.ACTIVITY_RECOGNITION") != PackageManager.PERMISSION_GRANTED) {
        String[] permissions = {"android.permission.ACTIVITY_RECOGNITION"};
        ActivityCompat.requestPermissions((Activity) context, permissions, 2);
        Log.i(TAG, "requestActivityTransitionButtonHandler: apply permission");
    }
}
Check whether the location permissions have been granted. If not, the location cannot be obtained.
Code:
// Define the location callback before requesting location updates.
if (null == mLocationCallback) {
    mLocationCallback = new LocationCallback() {
        @Override
        public void onLocationResult(LocationResult locationResult) {
            if (locationResult != null) {
                List<HWLocation> locations = locationResult.getHWLocationList();
                if (!locations.isEmpty()) {
                    for (HWLocation location : locations) {
                        hMap.moveCamera(CameraUpdateFactory.newLatLngZoom(
                                new LatLng(location.getLatitude(), location.getLongitude()), 14));
                        latLngOrigin = new LatLng(location.getLatitude(), location.getLongitude());
                        if (null != mMarkerOrigin) {
                            mMarkerOrigin.remove();
                        }
                        MarkerOptions options = new MarkerOptions()
                                .position(latLngOrigin)
                                .title("Hello Huawei Map")
                                .snippet("This is a snippet!");
                        mMarkerOrigin = hMap.addMarker(options);
                        removeLocationUpdatesWith();
                    }
                }
            }
        }

        @Override
        public void onLocationAvailability(LocationAvailability locationAvailability) {
            if (locationAvailability != null) {
                boolean flag = locationAvailability.isLocationAvailable();
                Log.i(TAG, "onLocationAvailability isLocationAvailable:" + flag);
            }
        }
    };
}

settingsClient.checkLocationSettings(locationSettingsRequest)
        .addOnSuccessListener(locationSettingsResponse -> {
            fusedLocationProviderClient
                    .requestLocationUpdates(mLocationRequest, mLocationCallback, Looper.getMainLooper())
                    .addOnSuccessListener(aVoid -> {
                        // Processing when the API call is successful.
                    });
        })
        .addOnFailureListener(e -> {});
III. Geofence and Ground Overlay Creation
Create a geofence based on the current location and add a round ground overlay on the map.
Code:
GeofenceRequest.Builder geofenceRequest = new GeofenceRequest.Builder();
geofenceRequest.createGeofenceList(GeoFenceData.returnList());
geofenceRequest.setInitConversions(7);
try {
    geofenceService.createGeofenceList(geofenceRequest.build(), pendingIntent)
            .addOnCompleteListener(new OnCompleteListener<Void>() {
                @Override
                public void onComplete(Task<Void> task) {
                    if (task.isSuccessful()) {
                        Log.i(TAG, "add geofence success!");
                        if (null == hMap) {
                            return;
                        }
                        if (null != mCircle) {
                            mCircle.remove();
                            mCircle = null;
                        }
                        mCircle = hMap.addCircle(new CircleOptions()
                                .center(latLngOrigin)
                                .radius(500)
                                .strokeWidth(1)
                                .fillColor(Color.TRANSPARENT));
                    } else {
                        Log.w(TAG, "add geofence failed: " + task.getException().getMessage());
                    }
                }
            });
} catch (Exception e) {
    Log.i(TAG, "add geofence error:" + e.getMessage());
}
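Conceptually, a circular geofence trigger reduces to a distance comparison against the fence radius. The following plain-Java sketch (independent of the HMS SDK; the class and helper names are illustrative) uses the haversine formula to decide whether a point falls inside a 500-meter fence:

```java
// Plain-Java sketch of the distance check behind a circular geofence.
// The haversine formula gives the great-circle distance between two
// lat/lng points; the fence is "entered" when that distance is within the radius.
public class GeofenceCheck {
    private static final double EARTH_RADIUS_M = 6_371_000.0;

    static double distanceMeters(double lat1, double lng1, double lat2, double lng2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    static boolean insideFence(double lat, double lng,
                               double centerLat, double centerLng, double radiusM) {
        return distanceMeters(lat, lng, centerLat, centerLng) <= radiusM;
    }

    public static void main(String[] args) {
        // ~500 m fence around a sample point; one degree of latitude is ~111,320 m.
        double centerLat = 48.8566, centerLng = 2.3522;
        double nearbyLat = centerLat + 300.0 / 111_320.0; // a point ~300 m north
        System.out.println(insideFence(nearbyLat, centerLng, centerLat, centerLng, 500)); // true
        System.out.println(insideFence(centerLat + 0.01, centerLng, centerLat, centerLng, 500)); // false
    }
}
```

The actual geofence service performs this kind of check on the chipset for power efficiency, which is why the app only needs to react to the broadcast events.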
Geofence service: register the broadcast receiver in the AndroidManifest.xml file and handle geofence events in its onReceive method.
Code:
<receiver
android:name=".GeoFenceBroadcastReceiver"
android:exported="true">
<intent-filter>
<action android:name=".GeoFenceBroadcastReceiver.ACTION_PROCESS_LOCATION" />
</intent-filter>
</receiver>
@Override
public void onReceive(Context context, Intent intent) {
    if (intent != null) {
        final String action = intent.getAction();
        if (ACTION_PROCESS_LOCATION.equals(action)) {
            GeofenceData geofenceData = GeofenceData.getDataFromIntent(intent);
            if (geofenceData != null && isListenGeofence) {
                int conversion = geofenceData.getConversion();
                MainActivity.setGeofenceData(conversion);
            }
        }
    }
}
Mark the selected point on the map to obtain the destination information, check the current activity status, and plan routes based on the detected activity status.
Code:
hMap.setOnMapClickListener(latLng -> {
    latLngDestination = new LatLng(latLng.latitude, latLng.longitude);
    if (null != mMarkerDestination) {
        mMarkerDestination.remove();
    }
    MarkerOptions options = new MarkerOptions()
            .position(latLngDestination)
            .title("Hello Huawei Map");
    mMarkerDestination = hMap.addMarker(options);
    if (identification.getText().equals("To exit the fence, your activity is about to be detected.")) {
        requestActivityUpdates(5000);
    }
});
// Activity identification API
activityIdentificationService.createActivityIdentificationUpdates(detectionIntervalMillis, pendingIntent)
        .addOnSuccessListener(new OnSuccessListener<Void>() {
            @Override
            public void onSuccess(Void aVoid) {
                Log.i(TAG, "createActivityIdentificationUpdates onSuccess");
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                Log.e(TAG, "createActivityIdentificationUpdates onFailure:" + e.getMessage());
            }
        });
// URL of the route planning API (cycling route is used as an example): https://mapapi.cloud.huawei.com/mapApi/v1/routeService/bicycling?key=API KEY
NetworkRequestManager.getBicyclingRoutePlanningResult(latLngOrigin, latLngDestination,
        new NetworkRequestManager.OnNetworkListener() {
            @Override
            public void requestSuccess(String result) {
                generateRoute(result);
            }

            @Override
            public void requestFail(String errorMsg) {
                Message msg = Message.obtain();
                Bundle bundle = new Bundle();
                bundle.putString("errorMsg", errorMsg);
                msg.what = 1;
                msg.setData(bundle);
                mHandler.sendMessage(msg);
            }
        });
Note:
The route planning function provides a set of HTTPS-based APIs for planning walking, cycling, and driving routes and calculating route distances. The APIs return route data in JSON format.
You can try to plan a route from one point to another point and then draw the route on the map, achieving the navigation effects.
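As a rough illustration of what such a request looks like, the sketch below assembles the bicycling endpoint URL mentioned above and a minimal JSON body. The origin/destination field names follow the general shape of the route planning API but are illustrative here; verify them (and the exact body format) against the official documentation:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Sketch of building a route-planning request. RouteRequestBuilder is an
// illustrative helper, not an HMS class; the JSON field names are assumptions.
public class RouteRequestBuilder {
    static String buildUrl(String apiKey) {
        // The API key must be URL-encoded, since keys often contain '+' and '/'.
        return "https://mapapi.cloud.huawei.com/mapApi/v1/routeService/bicycling?key="
                + URLEncoder.encode(apiKey, StandardCharsets.UTF_8);
    }

    static String buildBody(double oLat, double oLng, double dLat, double dLng) {
        return "{\"origin\":{\"lat\":" + oLat + ",\"lng\":" + oLng + "},"
                + "\"destination\":{\"lat\":" + dLat + ",\"lng\":" + dLng + "}}";
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("my+key/value"));
        System.out.println(buildBody(48.85, 2.35, 48.86, 2.36));
    }
}
```

The body is then sent in an HTTPS POST request, and the JSON response contains the route polyline to draw on the map.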
Related Parameters
In indoor environments, the navigation satellite signals are usually weak. Therefore, HMS Core (APK) will use the network location mode, which is relatively slow compared with the GNSS location. It is recommended that the test be performed outdoors.
In Android 9.0 or later, you are advised to test the geofence outdoors. In versions earlier than Android 9.0, you can test the geofence indoors.
Map Kit is unavailable in the Chinese mainland. Therefore, the Android SDK, JavaScript API, Static Map API, and Directions API are unavailable in the Chinese mainland. For details, please refer to Supported Countries/Regions.
In the Map SDK for Android 5.0.0.300 and later versions, you must set the API key before initializing a map. Otherwise, no map data will be displayed.
Currently, the driving route planning is unavailable in some countries and regions outside China. For details about the supported countries and regions, please refer to the Huawei official website.
Before building the APK, configure the obfuscation configuration file to prevent the HMS Core SDK from being obfuscated.
Open the obfuscation configuration file proguard-rules.pro in the app's root directory of your project and add configurations to exclude the HMS Core SDK from obfuscation.
If you are using AndResGuard, add its trustlist to the obfuscation configuration file.
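For reference, the HMS Core obfuscation configuration typically looks like the following (a sketch based on the HMS Core guidance; confirm the exact rules against the configuration guide linked below):

```
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keep class com.huawei.hianalytics.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
```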
For details, please visit the following link: https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/android-sdk-config-obfuscation-scripts-0000001061882229
To learn more, visit the following links:
Documentation on the HUAWEI Developers website:
https://developer.huawei.com/consumer/en/hms/huawei-locationkit
https://developer.huawei.com/consumer/en/hms/huawei-MapKit
To download the demo and sample code, please visit GitHub.
To solve integration problems, please go to Stack Overflow at the following link:
https://stackoverflow.com/questions/tagged/huawei-mobile-services?tab=Newest
To learn more, please visit:
HUAWEI Developers official website
Development Guide
Reddit to join developer discussions
GitHub or Gitee to download the demo and sample code
Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.
Original Source

Can we get the location offline?

Related

Developing a Function for Precisely Pushing Ads to Nearby People Using HUAWEI Nearby

For more articles like this, visit the HUAWEI Developer Forum and Medium.
When you want to find a restaurant and your phone receives a push message recommending nearby restaurants at just the right moment, will you tap to open the message? When you're still hesitating over whether to buy a pair of sneakers in a store and an app sends you a coupon offering a 50% discount, do you still find the ad bothersome?
Ads pushed at the right moment meet users' requirements and won't make users feel bothered. More precise advertising not only reduces unnecessary disturbance to users but also improves user satisfaction with your app. Do you want to build such a function for precisely pushing ads to nearby people?
Integrate HUAWEI Nearby Service and use the Nearby Beacon Message feature to implement the function. You need to deploy beacons at the place where you want to push messages to nearby people, for example, a mall. Beacons will provide users' locations relative to them so that when a user gets near a restaurant or store, the user will receive a promotion message (such as a coupon or discount information) preconfigured by the merchant in advance. The function demo is as follows.
If you are interested in the implementation details, download the source code from GitHub: https://github.com/HMS-Core/hms-nearby-demo/tree/master/NearbyCanteens
1. Getting Started
If you are already a Huawei developer, skip this step. If you are new to Huawei Mobile Services (HMS), you need to configure app information in AppGallery Connect, enable Nearby Service on the HUAWEI Developers console, and integrate the HMS Core SDK. For details, please refer to the documentation.
1.1 Adding Huawei Maven Repository and AppGallery Connect Plug-in Configurations to the Project-Level build.gradle File
Add the following Huawei Maven repository and AppGallery Connect plug-in configurations directly to your project-level build.gradle file:
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.4.1'
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
1.2 Adding SDK Dependencies to the App-Level build.gradle File
Import the Nearby Service SDK dependencies. The most important ones are those starting with com.huawei.
Code:
dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.1.0'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.1'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.2.0'
    implementation "com.huawei.hmf:tasks:1.3.1.301"
    implementation "com.huawei.hms:network-grs:1.0.9.302"
    implementation 'com.huawei.agconnect:agconnect-core:1.2.1.301'
    implementation 'com.huawei.hms:nearby:4.0.4.300'
    api 'com.google.code.gson:gson:2.8.5'
}
1.3 Applying for the Network, Bluetooth, and Location Permissions in the AndroidManifest.xml File
The following permissions are required. Each permission name indicates its purpose: for example, android.permission.INTERNET is the network permission, and android.permission.BLUETOOTH is the Bluetooth permission.
Code:
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
2. Code Development
2.1 Initialization and Dynamic Permission Application
The onCreate method is called when the current activity is created. In this method, you can perform some preparations, such as applying for necessary permissions and checking whether the Internet connection, Bluetooth, and GPS are enabled.
Code:
@RequiresApi(api = Build.VERSION_CODES.P)
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.i(TAG, "onCreate");
    setContentView(R.layout.activity_canteen);
    boolean isSuccess = requestPermissions(this, this);
    if (!isSuccess) {
        return;
    }
    Log.i(TAG, "requestPermissions success");
    if (!NetCheckUtil.isNetworkAvailable(this)) {
        showWarnDialog(Constant.NETWORK_ERROR);
        return;
    }
    if (!BluetoothCheckUtil.isBlueEnabled()) {
        showWarnDialog(Constant.BLUETOOTH_ERROR);
        return;
    }
    if (!GpsCheckUtil.isGpsEnabled(this)) {
        showWarnDialog(Constant.GPS_ERROR);
        return;
    }
    intView();
    init();
}
Register a listener and display a message if Bluetooth, GPS, or network disconnection is detected.
The following uses the AlertDialog component of Android as an example:
Code:
private void showWarnDialog(String content) {
    DialogInterface.OnClickListener onClickListener =
            new DialogInterface.OnClickListener() {
                @Override
                public void onClick(DialogInterface dialog, int which) {
                    android.os.Process.killProcess(android.os.Process.myPid());
                }
            };
    AlertDialog.Builder builder = new AlertDialog.Builder(this);
    builder.setTitle(R.string.warn);
    builder.setIcon(R.mipmap.warn);
    builder.setMessage(content);
    builder.setNegativeButton(getText(R.string.btn_confirm), onClickListener);
    builder.show();
}
2.2 Beacon Message Reception
The following startScanning method is called in the onStart method to start Bluetooth scanning. The MessageHandler object encapsulates four callback methods: onFound indicates that a beacon message has been discovered; onLost indicates that the message is no longer discoverable; onDistanceChanged indicates that the distance between the beacon and the device has changed; onBleSignalChanged indicates that a beacon signal change has been detected.
Code:
private void startScanning() {
    Log.i(TAG, "startScanning");
    mMessageHandler =
            new MessageHandler() {
                @Override
                public void onFound(Message message) {
                    super.onFound(message);
                    doOnFound(message);
                }

                @Override
                public void onLost(Message message) {
                    super.onLost(message);
                    doOnLost(message);
                }

                @Override
                public void onDistanceChanged(Message message, Distance distance) {
                    super.onDistanceChanged(message, distance);
                }

                @Override
                public void onBleSignalChanged(Message message, BleSignal bleSignal) {
                    super.onBleSignalChanged(message, bleSignal);
                }
            };
    MessagePicker msgPicker = new MessagePicker.Builder().includeAllTypes().build();
    Policy policy = new Policy.Builder().setTtlSeconds(Policy.POLICY_TTL_SECONDS_INFINITE).build();
    GetOption getOption = new GetOption.Builder().setPicker(msgPicker).setPolicy(policy).build();
    Task<Void> task = messageEngine.get(mMessageHandler, getOption);
    task.addOnFailureListener(
            new OnFailureListener() {
                @Override
                public void onFailure(Exception e) {
                    Log.e(TAG, "Failed to start getting beacon messages:", e);
                    if (e instanceof ApiException) {
                        ApiException apiException = (ApiException) e;
                        int errorStatusCode = apiException.getStatusCode();
                        if (errorStatusCode == StatusCode.STATUS_MESSAGE_AUTH_FAILED) {
                            Toast.makeText(mContext, R.string.configuration_error, Toast.LENGTH_SHORT).show();
                        } else if (errorStatusCode == StatusCode.STATUS_MESSAGE_APP_UNREGISTERED) {
                            Toast.makeText(mContext, R.string.permission_error, Toast.LENGTH_SHORT).show();
                        } else {
                            Toast.makeText(mContext, R.string.start_get_beacon_message_failed, Toast.LENGTH_SHORT)
                                    .show();
                        }
                    } else {
                        Toast.makeText(mContext, R.string.start_get_beacon_message_failed, Toast.LENGTH_SHORT)
                                .show();
                    }
                }
            });
}
Personalize the display of a detected beacon message.
The most important method during the process is the doOnFound method, which specifies the personalized display processing mode when the client receives a beacon message.
Code:
private void doOnFound(Message message) {
    if (message == null) {
        return;
    }
    String type = message.getType();
    if (type == null) {
        return;
    }
    String messageContent = new String(message.getContent());
    Log.d(TAG, "New Message:" + messageContent + " type:" + type);
    if (type.equalsIgnoreCase(Constant.CANTEEN)) {
        operateOnFoundCanteen(messageContent);
    } else if (type.equalsIgnoreCase(Constant.NOTICE)) {
        operateOnFoundNotice(messageContent);
    }
}
Display the personalized message.
The following code only demonstrates one of the message processing modes, including implementing the banner notification and text display effect.
Code:
private void operateOnFoundCanteen(String messageContent) {
    CanteenAdapterInfo canteenAdapterInfo =
            (CanteenAdapterInfo) JsonUtils.json2Object(messageContent, CanteenAdapterInfo.class);
    if (canteenAdapterInfo == null) {
        return;
    }
    String canteenName = canteenAdapterInfo.getCanteenName();
    if (canteenName == null) {
        return;
    }
    Log.d(TAG, "canteenName:" + canteenName);
    if (!canteenNameList.contains(canteenName)) {
        return;
    }
    String notice = "";
    if (receivedNoticeMap.containsKey(canteenName)) {
        notice = receivedNoticeMap.get(canteenName);
    }
    int canteenImage = getCanteenImage(canteenName);
    int requestCode = getRequestCode(canteenName);
    canteenAdapterInfo.setNotice(notice);
    canteenAdapterInfo.setCanteenImage(canteenImage);
    canteenAdapterInfo.setShowNotice(true);
    canteenAdapterInfo.setRequestCode(requestCode);
    canteenAdapterInfoMap.put(canteenName, canteenAdapterInfo);
    canteenAdapterInfoList.add(canteenAdapterInfo);
    sendNotification(Constant.NOTIFICATION_TITLE, Constant.NOTIFICATION_SUBTITLE, canteenName, requestCode);
    runOnUiThread(
            new Runnable() {
                @Override
                public void run() {
                    searchTipTv.setText(R.string.found_tip);
                    loadingLayout.setVisibility(View.GONE);
                    canteenAdapter.setDatas(canteenAdapterInfoList);
                }
            });
}
Conclusion
This demo uses the Bluetooth beacon message subscription function of HUAWEI Nearby Service.
Based on the Nearby Beacon Message capability, you can not only develop an ad push function, but also implement the following:
1. A car or lifestyle app can integrate the capability to identify whether a user is near their car to determine whether to enable keyless access and record the driving track of the car.
2. A business app can integrate the capability to accurately record the locations where employees clock in.
3. A travel or exhibition app can integrate the capability to introduce an exhibit to a user when the user gets near the exhibit.
4. A game app can integrate the capability to make your game interact with the real world, for example, unlocking a game level through a physical object and sending rewards to players who are participating in offline events.
If you are interested and want to learn more, check our development guide at the HUAWEI Developers official website.
If you have any questions about the process, you can visit the HUAWEI Developer Forum.
Hi,
Can we use HMS nearby service to create a chat application and can also provide ads about restaurant when user enter the restaurant in the chat application as a message.
Thanks.
sanghati said:
Hi,
Can we use HMS nearby service to create a chat application and can also provide ads about restaurant when user enter the restaurant in the chat application as a message.
Thanks.
Nearby has two capabilities. One is Connection, which you can use to develop chat functions. The other is Message: deploy a beacon in the restaurant, register it, and set the beacon message. When a user walks into the restaurant, they will receive the message you set in advance. You can also set conditions on the cloud service, such as the time period and how long the user has been in the restaurant; the message is delivered only after these conditions are met. The advantages of this approach are: 1. You can modify the rules in real time. 2. Ads are pushed more accurately, avoiding unnecessary interruptions.

Integrating Huawei Crash, Push Kit and Account Kit Flutter Plugins to Forks & Spoons: A Restaurant App — Part 1

Introduction​
Hello all,
As you probably know already, Huawei has released its own mobile services as the HMS (Huawei Mobile Services) platform, and thankfully these services are also supported on third-party frameworks such as Flutter, React Native, Cordova, Xamarin, and more.
In this article, we will dive into some of the HMS Flutter plugins to enhance an imaginary online restaurant app called Forks & Spoons with useful features. In this first part of the series, we will integrate the Huawei Crash, Push, and Account Flutter plugins into our app to meet certain use cases. Before we begin, let me introduce Forks & Spoons to you.
Forks & Spoons​Forks & Spoons is a local restaurant that, like most of us, is having a hard time because of the Covid-19 epidemic. The epidemic precautions have prevented the restaurant from accepting customers, so the owner has started online food delivery in order to reach its customers and save the business.
We will now integrate Huawei Mobile Services into the online food delivery app to enrich the user experience and provide useful features with ease. Before we begin, I would like to mention that I won’t go into details about the UI code and some of the business logic, so as not to make this article too long, but you can find all the source code in the GitHub repository.
If you are ready, let’s begin by integrating the first service into the app.
Account Kit​
> If you are going to implement this service into your app don’t forget to add an app to AppGallery Connect, as described in this document.
> You can find the integration and configuration details of Huawei Flutter Account Kit Plugin here.
Let’s add the Sign In function to authenticate users with their HUAWEI ID. First, we create an AccountAuthParamsHelper object to set the fields we need in the response, and pass this object to the AccountAuthService.signIn method. We can then optionally verify the user’s token after obtaining the account from the signIn method.
Code:
/// Signing In with HUAWEI ID
void _signIn(BuildContext context) async {
  // This parameter is optional. You can run the method with default options.
  final helper = new AccountAuthParamsHelper();
  helper
    ..setIdToken()
    ..setAccessToken()
    ..setAuthorizationCode()
    ..setEmail()
    ..setProfile();
  try {
    AuthAccount id = await AccountAuthService.signIn(helper);
    log("Sign In User: ${id.displayName}");
    // Optionally verify the id token.
    await performServerVerification(id.idToken);
    // ..
    // ...
    // Rest of the business logic.
    // ....
  } on PlatformException catch (e, stacktrace) {
    log("Sign In Failed! Error is: ${e.message}");
    ScaffoldMessenger.of(context).showSnackBar(SnackBar(
      content: Text("Could not log in to your account."),
      backgroundColor: Colors.red,
    ));
  }
}

/// You can optionally verify the user’s token with this method.
Future<void> performServerVerification(String idToken) async {
  // Note: the http package (0.13+) expects a Uri rather than a raw String URL.
  var response = await http.post(
      Uri.parse("https://oauth-login.cloud.huawei.com/oauth2/v3/tokeninfo"),
      body: {'id_token': idToken});
  print(response.body);
}
We connected the _signIn method above to a user icon button on the app bar; once the user signs in successfully, their first and last name will be displayed instead of the icon. Let’s see it in action:
It is quite likely that you will need the account information again in some other part of the app. In Forks&Spoons, we need it in the drawer widget, which displays the user’s shopping cart. To obtain the already signed-in HUAWEI ID information, we call the AccountAuthManager.getAuthResult method.
Code:
void _getAuthResult() async {
  try {
    AuthAccount _id = await AccountAuthManager.getAuthResult();
    log(_id.givenName.toString());
    // ..
    // ...
    // Rest of business logic.
  } catch (e, stacktrace) {
    log("Error while obtaining Auth Result, $e");
    // ..
    // ...
    // Error handling.
  }
}
We let people sign in to our app, but there is no way to sign out yet. Let’s fix this by adding a sign-out function.
Code:
void _signOut() async {
  try {
    final bool result = await AccountAuthService.signOut();
    log("Signed out: $result");
  } on PlatformException catch (e, stacktrace) {
    log("Error while signing out: $e");
  }
}
AGC Crash​The next service that we are going to integrate into our app is the AppGallery Connect Crash Service, which provides a powerful yet lightweight solution to app crash problems. With the service, we can quickly detect, locate, and resolve app crashes, and gain access to highly readable crash reports in real time.
> You can find the integration and configuration of AppGallery Connect Crash here.
Click to expand...
Click to collapse
We can catch and report all non-fatal exceptions of our app to the AGC Crash Service by adding the following code to our main method.
Note that the configuration below is for Flutter version 1.17 and above. If you are using a lower version, please check this document from Huawei Developers.
Code:
void main() {
  // Obtains an instance of AGCCrash.
  AGCCrash _agcCrashInstance = AGCCrash.instance;
  // Sets the Crash Service's onFlutterError handler as Flutter's error handler.
  FlutterError.onError = _agcCrashInstance.onFlutterError;
  // The configuration below records all uncaught exceptions that occur in the app.
  runZonedGuarded<Future<void>>(() async {
    runApp(MyApp(_agcCrashInstance));
  }, (Object error, StackTrace stackTrace) {
    AGCCrash.instance.recordError(error, stackTrace);
  });
}
Manually handling and reporting errors to AGC Crash Service​
Meet Jerry. Forks&Spoons is his favorite restaurant, and he orders from the app occasionally. Unfortunately, his Wi-Fi is not working when he opens the app today and tries to log in. Below is the source code for the login process; notice that we are catching the error and reporting it to the Crash Service with the recordError method. An exception that you catch in a try/catch block will not be caught by the configuration we set up earlier, so we have to report it ourselves.
Code:
void _signIn() async {
  // This parameter is optional. You can run the method with default options.
  final helper = new HmsAuthParamHelper();
  helper
    ..setIdToken()
    ..setAccessToken()
    ..setAuthorizationCode()
    ..setEmail()
    ..setProfile();
  try {
    _id = await HmsAuthService.signIn(authParamHelper: helper);
    log("Sign In User: ${_id.displayName}");
    // ..
    // ...
    // Rest of the business logic.
  } on PlatformException catch (e, stacktrace) {
    // Reports the caught exception to the AGC Crash Service.
    widget.agcCrash.recordError(e, stacktrace);
    log("Sign In Failed! Error is: ${e.message}");
  }
}
Let’s see what happens when Jerry tries to log in to the app.
Oops, something went wrong. Let’s see the crash report on AppGallery Connect. Be sure to add the Exception filter to see the exception reports.
Here is the exception report; we can check the stack trace, device information, and more on this page. Isn’t it cool?
We can also test an exception by simply throwing one and sending its report in the catch clause.
Code:
/// Test method for sending an exception record to the agcrash service.
void sendException() {
try {
// Throws intentional exception for testing.
throw Exception("Error occured.");
} catch (error, stackTrace) {
// Records the occured exception.
AGCCrash.instance.recordError(error, stackTrace);
}
}
Furthermore, we can set a user ID or any custom key-value pair to be sent in the crash/exception report.
Code:
// Gets AAID for user identification.
String userAAID = await hmsAnalytics.getAAID();
// Sets AAID as user ID.
AGCCrash.instance.setUserId(userAAID);
// Sets user name as custom key while sending crash reports.
AGCCrash.instance.setCustomKey("userName", _id.displayName);
Apart from exceptions, Flutter can also have build-phase errors, or what I like to call widget errors. To catch all of these, you can use the ErrorWidget.builder API along with the Crash Service. I have also returned a custom error widget to be rendered on error. Spoiler alert: it contains ducks.
Code:
@override
Widget build(BuildContext context) {
  return MaterialApp(
    debugShowCheckedModeBanner: false,
    builder: (BuildContext context, Widget widget) {
      Widget error = Container(
        height: double.infinity,
        width: double.infinity,
        child: Text(
          '...rendering error...',
          style: TextStyle(color: Colors.white),
        ),
        decoration: BoxDecoration(
            image: DecorationImage(
          image: AssetImage('assets/duck.jpg'),
          repeat: ImageRepeat.repeat,
        )),
      );
      if (widget is Scaffold || widget is Navigator)
        error = Scaffold(body: Center(child: error));
      ErrorWidget.builder = (FlutterErrorDetails errorDetails) {
        print("Widget Error Occurred");
        AGCCrash.instance
            .recordError(errorDetails.exception, errorDetails.stack);
        return error;
      };
      return widget;
    },
    title: 'Forks&Spoons',
    theme: ThemeData(
      primarySwatch: Colors.grey,
    ),
    home: MyHomePage(
      hmsAnalytics: hmsAnalytics,
    ),
  );
}
Push Kit​Let’s move on to one of the most indispensable services for the app: Push Kit. Push Kit is a messaging service provided for you to establish a messaging channel from the cloud to devices. By integrating Push Kit, you can send messages to your apps on user devices in real time. This helps you maintain closer ties with users and increases user awareness of and engagement with your apps.
> You can find the integration and configuration details of Huawei Flutter Push Kit Plugin in this document.
Before we start sending messages, we need to obtain a push token. We can achieve this by setting a callback on the token stream and requesting a token with the Push.getToken method.
Code:
void initPush() async {
  if (!mounted) return;
  // Subscribe to the streams.
  Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
  // Get the push token.
  Push.getToken("");
}
Code:
void _onTokenEvent(String token) {
  log("Obtained push token: $token");
}

void _onTokenError(Object error) {
  PlatformException e = error;
  print("TokenErrorEvent: " + e.message);
}
After we add the initPush function to a stateful widget’s initState method in our app, we should be able to see the push token in the logs.
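The initState wiring itself is not shown above; as a minimal sketch (the _HomePageState class name here is just a placeholder, not part of the original app), it could look like this:

```dart
class _HomePageState extends State<HomePage> {
  @override
  void initState() {
    super.initState();
    // Subscribe to the token stream and request the push token
    // once the widget is inserted into the tree.
    initPush();
  }

  // initPush, _onTokenEvent and _onTokenError from the snippets above.
}
```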
This token will be very useful while sending messages to our app. Let’s start with a push notification message.
Sign in to AppGallery Connect and select your project, then select Grow > Push Kit on the sidebar and click the Add Notification button. The page below will be displayed.
Let’s quickly prepare a push notification for our beloved users. You can see the message preview on the right side.
To send this push message to our users, we can tap the Test Effect button and enter the push token we obtained earlier. We can also send the message to all users by pressing the Submit button above.
We have successfully sent our first push message, but it still seems somewhat flat. We can spice things up by adding a deep link that navigates users directly to the content. This would improve the UX of our app.
Before we jump into the code, we first need to add the intent filter below to the project’s AndroidManifest.xml file, inside the activity tag. Here we are using the scheme “app”, but if you are going to release your app, it’s better to change it to something related to your domain name.
XML:
<activity>
    <!-- Other Configurations -->
    <intent-filter>
        <action android:name="android.intent.action.VIEW" />
        <category android:name="android.intent.category.DEFAULT" />
        <category android:name="android.intent.category.BROWSABLE" />
        <data android:scheme="app"/>
    </intent-filter>
</activity>
For the implementation, let’s return to the initPush function and add a callback to the intent stream of Push Kit. We can use the same callback to handle the startup intent, which is included in the notification when it opens your app from scratch.
Code:
void initPush() async {
  if (!mounted) return;
  Push.setAutoInitEnabled(true);
  // Subscribe to the streams. (Token and Intent)
  Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
  Push.getIntentStream.listen(_onNewIntent, onError: _onIntentError);
  // Handles startup intents.
  _onNewIntent(await Push.getInitialIntent());
  // Get the push token.
  Push.getToken("");
}
Now for the best part: let’s go back to AppGallery Connect and add a button with a custom intent URI string to the notification we prepared earlier.
Let’s press the button on the notification and see deep linking in action.
Offering discounts using data messaging​Some features of your app may require you to send data messages upstream or downstream (app to server, or server to app). With the help of data messaging from Push Kit, this operation becomes very easy to achieve.
The owner of the Forks&Spoons restaurant is very generous, and he wants to occasionally give discounts to loyal customers like Jerry, who orders from the app often. To implement this, we will go back to the initPush method, but this time we will add a callback to the onMessageReceived stream.
Code:
void initPush() async {
  if (!mounted) return;
  Push.setAutoInitEnabled(true);
  // Subscribe to the streams. (Token, Intent and Data Message)
  Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
  Push.getIntentStream.listen(_onNewIntent, onError: _onIntentError);
  _onNewIntent(await Push.getInitialIntent());
  Push.onMessageReceivedStream
      .listen(_onMessageReceived, onError: _onMessageReceiveError);
  // Get the push token.
  Push.getToken("");
}
As for the data message callback, a function that displays a discount dialog and stores the discount value in the state is enough to cover our feature.
Code:
void _onMessageReceived(RemoteMessage remoteMessage) {
  String data = remoteMessage.data;
  if (remoteMessage.dataOfMap.containsKey("discount")) {
    setState(() {
      discount = int.parse(remoteMessage.dataOfMap['discount']);
      MyHomePage.discount = discount;
    });
    showDialog(
      barrierDismissible: false,
      context: context,
      builder: (context) => AlertDialog(
          title: Text(
            "Discount for your pocket, Best food for your stomach!",
            textAlign: TextAlign.center,
            style: TextStyle(fontSize: 20, fontWeight: FontWeight.bold),
          ),
          content: Column(
            mainAxisSize: MainAxisSize.min,
            children: <Widget>[
              Text(
                "Congrats you have received a ${remoteMessage.dataOfMap['discount']}% discount.",
                textAlign: TextAlign.center,
              ),
              SizedBox(
                height: 15,
              ),
              MaterialButton(
                  onPressed: () => Navigator.pop(context), child: Text("OK"))
            ],
          )),
    );
  }
  log("onRemoteMessageReceived Data: " + data);
}

void _onMessageReceiveError(Object error) {
  PlatformException e = error;
  log("onRemoteMessageReceiveError: " + e.message);
}
Now we head over to AppGallery Connect and open the Push Kit console again. This time we will send a data message that includes a discount value.
Oh, a very generous fifty percent discount, nice! Jerry will be walking on air; I can see him smiling with hunger. Let’s check the discounted prices.
Jerry is excited to order at the reduced prices, but he got a text message and forgot to order. Let’s remind him that he left his meal in the cart by using the local notification feature of the Push Kit plugin.
Did you forget your meal in the cart feature​To implement this feature, we need to check the user’s cart before the app goes to the background or is killed. To monitor the app state, we can use the WidgetsBindingObserver class from the Flutter SDK. But before that, we need to add the permissions required for sending a scheduled local notification to our AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
For the implementation, we first mix WidgetsBindingObserver into our widget’s state, then add the observer in the initState function as shown in the code below, and override the didChangeAppLifecycleState callback from the WidgetsBindingObserver mixin.
Inside the callback, we schedule a local notification for 1 minute after the app is paused. If the user opens the app again, we cancel all scheduled notifications.
Code:
class _MyAppState extends State<MyApp> with WidgetsBindingObserver {
  @override
  void initState() {
    super.initState();
    WidgetsBinding.instance.addObserver(this);
  }

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    super.didChangeAppLifecycleState(state);
    switch (state) {
      case AppLifecycleState.inactive:
        log('appLifeCycleState inactive');
        break;
      case AppLifecycleState.resumed:
        log('appLifeCycleState resumed');
        Push.cancelScheduledNotifications();
        break;
      case AppLifecycleState.paused:
        log('appLifeCycleState paused');
        if (MyHomePage.userChoices.length > 0 &&
            state == AppLifecycleState.paused) {
          log('Sending scheduled local notification.');
          Push.localNotificationSchedule({
            HMSLocalNotificationAttr.TITLE: 'You forgot your meal in the cart.',
            HMSLocalNotificationAttr.MESSAGE:
                "It is better for your meal to stay in your stomach rather than the cart.",
            HMSLocalNotificationAttr.FIRE_DATE:
                DateTime.now().add(Duration(minutes: 1)).millisecondsSinceEpoch,
            HMSLocalNotificationAttr.ALLOW_WHILE_IDLE: true,
            HMSLocalNotificationAttr.TAG: "notify_cart"
          });
        }
        break;
      case AppLifecycleState.detached:
        log('appLifeCycleState suspending');
        break;
    }
  }
  // ..
  // ...
  // Rest of the widget.
}
Tips and Tricks​
Some exceptions and crash reports may be sent after the app is opened again. To test an app crash, you can use the testIt() method of the AGC Crash Service.
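As a minimal sketch (assuming the same AGCCrash instance used in the snippets above), triggering a test crash could look like this:

```dart
// Forces a test crash so that you can verify crash reporting on
// AppGallery Connect. The report is typically uploaded on the next launch.
void crashForTesting() {
  AGCCrash.instance.testIt();
}
```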
Do not listen to the streams more than once without disposing of them properly; if you try to subscribe to a stream twice, the app will throw an error.
Conclusion​We have authenticated our users, communicated with them through messages, and given them discounts, all while monitoring the errors that may occur in the app. With the help of the HMS Flutter plugins, all of these features were a lot easier to implement.
We have finished part 1 of our series, but the story will continue, since we should also take care of the delivery flow. Apart from that, the owner of the restaurant thinks we can learn a thing or two from Jerry by analysing his user behavior in the app.
My colleague Serdar Can will discuss all of these topics in the second part of our article. Thank you, and congratulations on finishing the first part. If you have any questions related to the article, feel 100% free to ask them in the comments section.
You can also check the References and Further Reading section for more information about Huawei Mobile Services and the HMS Flutter plugins.
See you next time!
References and Further Reading​Huawei Push Kit Flutter Plugin Development Guide
AG Crash Service Development Guide
Huawei Flutter Plugins Pub Dev Repository
Forks&Spoons App Github Repository
Can we get user Email id and phone number from account kit
useful writeup
lokeshsuryan said:
Can we get user Email id and phone number from account kit
You can get the user email but not the phone number.
Does it records Exceptions also in crash services?
shikkerimath said:
Does it records Exceptions also in crash services?
Yes, you can check the recorded exceptions on AppGallery Connect, as shown in the article.
turkayavci said:
Introduction​
Hello all,
As you’ve probably know already, Huawei has released its own mobile services as the HMS (Huawei Mobile Services) platform and thankfully these services are also supported on third party SDKs such as Flutter, React-Native, Cordova, Xamarin and more.
View attachment 5331737
In this article we will dive in to some of the HMS Flutter Plugins to enhance an imaginary online restaurant app called Forks & Spoons with useful features. We will integrate Huawei Crash, Push and Account Flutter Plugins into our app to meet certain use cases in the first part of these series. Before we begin let me introduce Forks & Spoons to you.
Forks & Spoons​Forks & Spoons is a a local restaurant that is having a hard time because of Covid-19 epidemic like most of us. The precautions for the epidemic has prohibited the restaurant to accept customers so the owner has started online food delivery in order to reach to its customers and save its business.
View attachment 5331755
We will now integrate Huawei’s Mobile Services to the online food delivery app for the purpose of enriching the user experience and providing useful features with ease. Before we begin I would like to mention that I won’t go into details about the UI code and some business logic to not make this article too long but you can find all the source code in the github repository.
If you are ready let’s begin by integrating the first service to the app.
Account Kit​
Let’s add the Sign In function to authenticate users with their HUAWEI ID. First we create an AccountAuthParamsHelper object to set the fields we need in the response and we pass this object to the AccountAuthService.signIn method. Furthermore we can verify the user’s token after we have obtained the id from the sign in method.
Code:
/// Signing In with HUAWEI ID
void _signIn(BuildContext context) async {
// This parameter is optional. You can run the method with default options.
final helper = new AccountAuthParamsHelper();
helper
..setIdToken()
..setAccessToken()
..setAuthorizationCode()
..setEmail()
..setProfile();
try {
AuthAccount id = await AccountAuthService.signIn(helper);
log("Sign In User: ${id.displayName}");
// Optionally verify the id.
await performServerVerification(_id.idToken);
// ..
// ...
// Rest of the business logic.
// ....
} on PlatformException catch (e, stacktrace) {
log("Sign In Failed!, Error is:${e.message}");
ScaffoldMessenger.of(context).showSnackBar(SnackBar(
content: Text("Could not log in to your account."),
backgroundColor: Colors.red,
));
}
}
/// You can optionally verify the user’s token with this method.
Future<void> performServerVerification(String idToken) async {
var response = await http.post(
"https://oauth-login.cloud.huawei.com/oauth2/v3/tokeninfo",
body: {'id_token': idToken});
print(response.body);
}
We connected the _signIn method above to a user icon button on the app bar and once the user signs in successfully their first and last name will be displayed instead of the icon. Let’s see it in action:
View attachment 5331773
View attachment 5331771
It is a high possibility that you would need the id information again on some part of the app. In Forks&Spoons we had needed it in the drawer widget which displays the user’s shopping cart. In order to obtain the already signed in Huawei ID information we call the AccountAuthManager.getAuthResult method.
Code:
void _getAuthResult() async {
try {
AuthAccount _id = await AccountAuthManager.getAuthResult();
log(_id.givenName.toString());
// ..
// ...
// Rest of business logic.
} catch (e, stacktrace) {
log("Error while obtaining Auth Result, $e");
// ..
// ...
// Error handling.
}
}
We let people sign in to our app but there is no way to sign out yet. Let’s fix this by adding a sign out function.
Code:
void _signOut() async {
try {
final bool result = await AccountAuthService.signOut();
log("Signed out: $result");
} on PlatformException catch (e, stacktrace) {
log("Error while signing out: $e");
}
}
AGC Crash​The next service that we are going to integrate to our app is the AppGallery Connect Crash Service which provides a powerful yet lightweight solution to app crash problems. With the service, we can quickly detect, locate, and resolve app crashes, and have access to highly readable crash reports in real time.
We can catch and report all non-fatal exceptions of our app to the AGC Crash Service by adding the following code to our main method.
Code:
void main() {
// Obtains an instance of AGCCrash.
AGCCrash _agcCrashInstance = AGCCrash.instance;
// Defines Crash Service's [onFlutterError] API as Flutter's.
FlutterError.onError = _agcCrashInstance.onFlutterError;
// Below configuration records all exceptions that occurs in your app.
runZonedGuarded<Future<void>>(() async {
runApp(MyApp(_agcCrashInstance));
}, (Object error, StackTrace stackTrace) {
AGCCrash.instance.recordError(error, stackTrace);
});
}
Manually handling and reporting errors to AGC Crash Service​View attachment 5331757
Meet Jerry, The Forks&Spoons is his favorite restaurant and he orders from the app occasionally. Unfortunately his wifi is not working when he opened the app today and he tries to log in to the app. Below is the source code for the login process, notice that we are catching the error and reporting it to the crash service by the recordError method. If you use the try / catch block the thrown exception will not be catched by the configuration we did earlier, so we have to handle it ourselves.
Code:
void _signIn() async {
// This parameter is optional. You can run the method with default options.
final helper = new HmsAuthParamHelper();
helper
..setIdToken()
..setAccessToken()
..setAuthorizationCode()
..setEmail()
..setProfile();
try {
_id = await HmsAuthService.signIn(authParamHelper: helper);
log("Sign In User: ${_id.displayName}");
// ..
// ...
// Rest of the business logic.
} on PlatformException catch (e, stacktrace) {
widget.agcCrash.recordError(e, stacktrace);
log("Sign In Failed!, Error is:${e.message}");
}
}
Let’s see what happens when Jerry tries to log in to the app.
View attachment 5331751
Ooops something went wrong. Let’s see the crash report on the AppGallery Connect. Be sure to add the Exception filter to see the exception reports.
View attachment 5331739
Here is the exception report; we can check the stack trace, device information and more in this page, isn’t it cool ?
View attachment 5331741
We can also test an exception by just throwing one and sending its report on the catch clause.
Code:
/// Test method for sending an exception record to the agcrash service.
void sendException() {
try {
// Throws intentional exception for testing.
throw Exception("Error occured.");
} catch (error, stackTrace) {
// Records the occured exception.
AGCCrash.instance.recordError(error, stackTrace);
}
}
Furthermore we can set a userId or any custom key value pair to be sent in the crash/exception report.
Code:
// Gets AAID for user identification.
String userAAID = await hmsAnalytics.getAAID();
// Sets AAID as user ID.
AGCCrash.instance.setUserId(userAAID);
// Sets user name as custom key while sending crash reports.
AGCCrash.instance.setCustomKey("userName", _id.displayName);
Apart from exceptions Flutter can also have build phase errors or what i like to call widget errors. To catch all of these errors you can use ErrorWidget.builder class along with the crash kit. I have also returned a custom error widget to be rendered on error. Spoiler alert: it contains ducks.
Code:
@override
Widget build(BuildContext context) {
return MaterialApp(
debugShowCheckedModeBanner: false,
builder: (BuildContext context, Widget widget) {
Widget error = Container(
height: double.infinity,
width: double.infinity,
child: Text(
'...rendering error...',
style: TextStyle(color: Colors.white),
),
decoration: BoxDecoration(
image: DecorationImage(
image: AssetImage('assets/duck.jpg'),
repeat: ImageRepeat.repeat,
)),
);
if (widget is Scaffold || widget is Navigator)
error = Scaffold(body: Center(child: error));
ErrorWidget.builder = (FlutterErrorDetails errorDetails) {
print("Widget Error Occurred");
AGCCrash.instance
.recordError(errorDetails.exception, errorDetails.stack);
return error;
};
return widget;
},
title: 'Forks&Spoons',
theme: ThemeData(
primarySwatch: Colors.grey,
),
home: MyHomePage(
hmsAnalytics: hmsAnalytics,
),
);
}
View attachment 5331777
Push Kit​Let’s move on to one of the most indispensable services for the app: Push Kit. Push Kit is a messaging service provided for you to establish a messaging channel from the cloud to devices. By integrating Push Kit, you can send messages to your apps on user devices in real time. This helps you maintain closer ties with users and increases user awareness of and engagement with your apps.
Before we start to send messages we need to obtain a push token. We can achieve this by setting a callback to the token stream and requesting a token by the Push.getToken method.
Code:
void initPush() async {
if (!mounted) return;
// Subscribe to the streams.
Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
// Get the push token.
Push.getToken("");
}
Code:
void _onTokenEvent(String token) {
log("Obtained push token: $token");
}
void _onTokenError(Object error) {
PlatformException e = error;
print("TokenErrorEvent: " + e.message);
}
After we add the initPush function to a stateful widget’s initState method in our app we should be able see the push token on the logs.
View attachment 5331765
This token will be very useful while sending messages to our app. Let’s start with a push notification message.
Sign In to the AppGallery Connect and select your project, then select Grow /Push Kit on the side bar and click the Add Notification button. The page below will be displayed.
View attachment 5331759
Let’s quickly prepare a push notification for our belowed users. You can see the message preview on the right side.
View attachment 5331761
To send this push message to our users we can tap the Test Effect button and enter the push token we obtained earlier. We can also send this message to all users by pressing the Submit button above.
View attachment 5331763
We have successfully sent our first push message to our app but it still seems somewhat flat. We can spice things up by adding a deep link and navigating the users to the content directly. This would empower the UX of our app.
Before we jump into the code first, we need to add the intent filter below to the project’s AndroidManifest.xml file inside activity tag. Here we are using the scheme “app” but if you are going to release your app it’s better to change it to something related to your domain name.
XML:
<activity>
<!-- Other Configurations -->
<intent-filter>
<action android:name="android.intent.action.VIEW" />
<category android:name="android.intent.category.DEFAULT" />
<category android:name="android.intent.category.BROWSABLE" />
<data android:scheme="app"/>
</intent-filter>
</activity>
For the implementation let’s return to the initPush function again and add a callback to the intent stream of Push kit. Here we can also use the same callback for handling the startup intent. This intent is included in the notification that opens your app from scratch.
Code:
void initPush() async {
if (!mounted) return;
Push.setAutoInitEnabled(true);
// Subscribe to the streams. (Token, Intent and Data Message)
Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
Push.getIntentStream.listen(_onNewIntent, onError: _onIntentError);
// Handles startup intents.
_onNewIntent(await Push.getInitialIntent());
// Get the push token.
Push.getToken("");
}
Now for the best part: let’s go back to AppGallery Connect and add a button with a custom intent URI string to the notification we prepared earlier.
Let’s press the button on the notification and see the deep linking in action.
​Offering discounts using data messaging​Some features of your app may require you to send data messages upstream or downstream (app to server, or server to app). With the help of Push Kit's data messaging, this operation becomes very easy to achieve.
The owner of the Forks&Spoons restaurant is very generous, and he wants to occasionally give discounts to loyal customers like Jerry, who orders from the app often. To implement this, we will go back to the initPush method, but this time we will add a callback to the onMessageReceived stream.
Code:
void initPush() async {
if (!mounted) return;
Push.setAutoInitEnabled(true);
// Subscribe to the streams. (Token, Intent and Data Message)
Push.getTokenStream.listen(_onTokenEvent, onError: _onTokenError);
Push.getIntentStream.listen(_onNewIntent, onError: _onIntentError);
_onNewIntent(await Push.getInitialIntent());
Push.onMessageReceivedStream
.listen(_onMessageReceived, onError: _onMessageReceiveError);
// Get the push token.
Push.getToken("");
}
And for the data message callback, a function that displays a discount dialog and saves the discount value in the state is enough to cover our feature.
Code:
void _onMessageReceived(RemoteMessage remoteMessage) {
String data = remoteMessage.data;
if (remoteMessage.dataOfMap.containsKey("discount")) {
setState(() {
discount = int.parse(remoteMessage.dataOfMap['discount']);
MyHomePage.discount = discount;
});
showDialog(
barrierDismissible: false,
context: context,
builder: (context) => AlertDialog(
title: Text(
"Discount for your pocket, Best food for your stomach!",
textAlign: TextAlign.center,
style: TextStyle(fontSize: 20, fontWeight: FontWeight.bold),
),
content: Column(
mainAxisSize: MainAxisSize.min,
children: <Widget>[
Text(
"Congrats you have received a ${remoteMessage.dataOfMap['discount']}% discount.",
textAlign: TextAlign.center,
),
SizedBox(
height: 15,
),
MaterialButton(
onPressed: () => Navigator.pop(context), child: Text("OK"))
],
)),
);
}
log("onRemoteMessageReceived Data: " + data);
}
void _onMessageReceiveError(Object error) {
PlatformException e = error;
log("onRemoteMessageReceiveError: " + e.message);
}
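The arithmetic behind the discounted prices is simple. The plain-Java sketch below (Java chosen so it runs without the Flutter SDK) mirrors what the widget does with the payload: read the `discount` entry of the data map and scale the price. The helper name and payload shape are assumptions for illustration; in the Flutter code the same map arrives via `remoteMessage.dataOfMap`.

```java
import java.util.Map;

public class DiscountCalculator {
    // Applies the "discount" percentage carried in a data message payload.
    static double discountedPrice(double price, Map<String, String> data) {
        if (!data.containsKey("discount")) {
            return price; // no discount message received yet
        }
        int discount = Integer.parseInt(data.get("discount"));
        return price * (100 - discount) / 100.0;
    }

    public static void main(String[] args) {
        // A 50% discount on a 12.0 menu item leaves 6.0.
        System.out.println(discountedPrice(12.0, Map.of("discount", "50")));
    }
}
```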
Now let's head over to AppGallery Connect and open the Push Kit console again. This time we will send a data message that includes a discount value.
Oh, a very generous fifty percent discount. Nice! Jerry will be walking on air; I can see him smiling with hunger. Let's check the discounted prices.
Jerry is excited to order at the reduced prices, but he got a text message and forgot to order. Let's remind him that he left his meal in the cart by using the local notification feature of the Push Kit plugin.
Did you forget your meal in the cart feature​To implement this feature, we need to check the user's cart before the app goes to the background or is killed. To monitor the app state, we can use the WidgetsBindingObserver class from the Flutter SDK. But before that, we need to add the permissions required to send a scheduled local notification to our AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.VIBRATE" />
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED"/>
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.SYSTEM_ALERT_WINDOW"/>
For the implementation, first we mix WidgetsBindingObserver into our widget's state, then add the observer in the initState function as in the code below, and override the didChangeAppLifecycleState callback from the WidgetsBindingObserver mixin.
Inside the callback, we schedule a local notification for 1 minute after the app is paused. If the user opens the app again, we cancel all the scheduled notifications.
Code:
class _MyAppState extends State<MyApp> with WidgetsBindingObserver {
@override
void initState() {
super.initState();
WidgetsBinding.instance.addObserver(this);
}
@override
void didChangeAppLifecycleState(AppLifecycleState state) {
super.didChangeAppLifecycleState(state);
switch (state) {
case AppLifecycleState.inactive:
log('appLifeCycleState inactive');
break;
case AppLifecycleState.resumed:
log('appLifeCycleState resumed');
Push.cancelScheduledNotifications();
break;
case AppLifecycleState.paused:
log('appLifeCycleState paused');
if (MyHomePage.userChoices.length > 0 &&
state == AppLifecycleState.paused) {
log('Sending scheduled local notification.');
Push.localNotificationSchedule({
HMSLocalNotificationAttr.TITLE: 'You forgot your meal in the cart.',
HMSLocalNotificationAttr.MESSAGE:
"It is better for your meal to stay in your stomach rather than the cart.",
HMSLocalNotificationAttr.FIRE_DATE:
DateTime.now().add(Duration(minutes: 1)).millisecondsSinceEpoch,
HMSLocalNotificationAttr.ALLOW_WHILE_IDLE: true,
HMSLocalNotificationAttr.TAG: "notify_cart"
});
}
break;
case AppLifecycleState.detached:
log('appLifeCycleState detached');
break;
}
}
// ..
// ...
// Rest of the widget.
}
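The scheduling decision above boils down to two inputs: whether the cart has items, and the current time. Here is a dependency-free Java sketch of that logic (the function and names are illustrative, not part of the plugin):

```java
import java.util.List;

public class CartReminder {
    // Returns the epoch-millis fire date for the "forgotten cart" notification,
    // or -1 when the cart is empty and nothing should be scheduled.
    // Mirrors the check performed in didChangeAppLifecycleState on pause.
    static long reminderFireDate(List<String> cartItems, long nowMillis) {
        if (cartItems.isEmpty()) {
            return -1;
        }
        return nowMillis + 60_000; // one minute after the app is paused
    }

    public static void main(String[] args) {
        System.out.println(reminderFireDate(List.of("Pizza"), 1_000_000L)); // 1060000
        System.out.println(reminderFireDate(List.of(), 1_000_000L));        // -1
    }
}
```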
Tips and Tricks​
Some exceptions and crash reports may only be sent after the app is opened again. To test an app crash, you can use the testIt() method of AGConnectCrash.
Do not listen to the streams more than once without disposing of them properly; if you try to subscribe to a stream twice, the app will throw an error.
Conclusion​We have authenticated our users, communicated with them through messages, and given them discounts, all while monitoring the errors that may occur in the app. With the help of the HMS Flutter plugins, all of these features were a lot easier to implement.
We have finished part 1 of our series, but the story will continue, since we should also take care of the delivery flow. Apart from that, the owner of the restaurant thinks we can learn a thing or two from Jerry by analysing his user behaviour in the app.
My colleague Serdar Can will discuss all of these topics in the second part of our article. Thank you, and congratulations on finishing the first part. If you have any questions about the article, feel 100% free to ask them in the comments section.
You can also check the References and Further Reading section for more information about Huawei Mobile Services and the HMS Flutter plugins.
See you next time!
References and Further Reading​Huawei Push Kit Flutter Plugin Development Guide
AG Crash Service Development Guide
Huawei Flutter Plugins Pub Dev Repository
Forks&Spoons App Github Repository
​
Can we sort crash by date in the console?
Basavaraj.navi said:
Can we sort crash by date in the console?
For sure.

Implementing Real-Time Transcription in an Easy Way

Background​The real-time onscreen subtitle is a must-have function in an ordinary video app. However, developing such a function can prove costly for small- and medium-sized developers. And even when implemented, speech recognition is often prone to inaccuracy. Fortunately, there's a better way — HUAWEI ML Kit, which is remarkably easy to integrate, and makes real-time transcription an absolute breeze!
Introduction to ML Kit​ML Kit allows your app to leverage Huawei's longstanding machine learning prowess to apply cutting-edge artificial intelligence (AI) across a wide range of contexts. With Huawei's expertise built in, ML Kit is able to provide a broad array of easy-to-use machine learning capabilities, which serve as the building blocks for tomorrow's cutting-edge AI apps. ML Kit capabilities include those related to:
Text (including text recognition, document recognition, and ID card recognition)
Language/Voice (such as real-time/on-device translation, automatic speech recognition, and real-time transcription)
Image (such as image classification, object detection and tracking, and landmark recognition)
Face/Body (such as face detection, skeleton detection, liveness detection, and face verification)
Natural language processing (text embedding)
Custom model (including the on-device inference framework and model development tool)
Real-time transcription is required to implement the function mentioned above. Let's take a look at how this works in practice:
Now let's move on to how to integrate this service.
Integrating Real-Time Transcription​Steps
1. Registering as a Huawei developer on HUAWEI Developers.
2. Creating an app in AppGallery Connect. For details, see Getting Started with Android.
3. Enabling ML Kit.
4. Integrating the HMS Core SDK.
Add the AppGallery Connect configuration file by completing the steps below:
Download and copy the agconnect-services.json file to the app directory of your Android Studio project.
Call setApiKey during app initialization.
To learn more, go to Adding the AppGallery Connect Configuration File.
5. Configuring the Maven repository address
Add build dependencies.
Import the real-time transcription SDK.
Code:
implementation 'com.huawei.hms:ml-computer-voice-realtimetranscription:2.2.0.300'
Add the AppGallery Connect plugin configuration.
Method 1: Add the following information under the declaration in the file header:
Code:
apply plugin: 'com.huawei.agconnect'
Method 2: Add the plugin configuration in the plugins block.
Code:
plugins {
id 'com.android.application'
// Add the following configuration:
id 'com.huawei.agconnect'
}
Please refer to Integrating the Real-Time Transcription SDK to learn more.
Setting the cloud authentication information
When using on-cloud services of ML Kit, you can set the API key or access token (recommended) in either of the following ways:
Access token
You can use the following API to initialize the access token when the app is started. The access token does not need to be set again once initialized.
MLApplication.getInstance().setAccessToken("your access token");
API key
You can use the following API to initialize the API key when the app is started. The API key does not need to be set again once initialized.
MLApplication.getInstance().setApiKey("your ApiKey");
For details, see Notes on Using Cloud Authentication Information.
Code Development​
Create and configure a speech recognizer.
Code:
MLSpeechRealTimeTranscriptionConfig config = new MLSpeechRealTimeTranscriptionConfig.Factory()
// Set the language. Currently, this service supports Mandarin Chinese, English, and French.
.setLanguage(MLSpeechRealTimeTranscriptionConstants.LAN_ZH_CN)
// Punctuate the text recognized from the speech.
.enablePunctuation(true)
// Set the sentence offset.
.enableSentenceTimeOffset(true)
// Set the word offset.
.enableWordTimeOffset(true)
// Set the application scenario. MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING indicates shopping, which is supported only for Chinese. Under this scenario, recognition for the name of Huawei products has been optimized.
.setScenes(MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING)
.create();
MLSpeechRealTimeTranscription mSpeechRecognizer = MLSpeechRealTimeTranscription.getInstance();
Create a speech recognition result listener callback.
Code:
// Use the callback to implement the MLSpeechRealTimeTranscriptionListener API and methods in the API.
protected class SpeechRecognitionListener implements MLSpeechRealTimeTranscriptionListener{
@Override
public void onStartListening() {
// The recorder starts to receive speech.
}
@Override
public void onStartingOfSpeech() {
// The user starts to speak, that is, the speech recognizer detects that the user starts to speak.
}
@Override
public void onVoiceDataReceived(byte[] data, float energy, Bundle bundle) {
// Return the original PCM stream and audio power to the user. This API is not running in the main thread, and the return result is processed in a sub-thread.
}
@Override
public void onRecognizingResults(Bundle partialResults) {
// Receive the recognized text from MLSpeechRealTimeTranscription.
}
@Override
public void onError(int error, String errorMessage) {
// Called when an error occurs in recognition.
}
@Override
public void onState(int state,Bundle params) {
// Notify the app of the status change.
}
}
The recognition result can be obtained from the listener callbacks, including onRecognizingResults. Design the UI content according to the obtained results. For example, display the text transcribed from the input speech.
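To make that UI update concrete, here is a small, dependency-free Java sketch of one common way to fold streaming partial results into a displayed transcript: keep a pending segment that each partial result overwrites, and commit it once the sentence is final. The class and method names are illustrative assumptions, not ML Kit APIs.

```java
import java.util.ArrayList;
import java.util.List;

public class TranscriptBuffer {
    private final List<String> committed = new ArrayList<>();
    private String pending = "";

    // Called for every partial result; isFinal marks the end of a sentence.
    void onPartial(String text, boolean isFinal) {
        if (isFinal) {
            committed.add(text);
            pending = "";
        } else {
            pending = text;
        }
    }

    // Text to render on screen: committed sentences plus the pending segment.
    String render() {
        StringBuilder sb = new StringBuilder(String.join(" ", committed));
        if (!pending.isEmpty()) {
            if (sb.length() > 0) sb.append(' ');
            sb.append(pending);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        TranscriptBuffer t = new TranscriptBuffer();
        t.onPartial("hello", false);
        t.onPartial("hello world", true);
        t.onPartial("good", false);
        System.out.println(t.render()); // hello world good
    }
}
```

In the real listener, onRecognizingResults would feed this buffer and the rendered string would be set on a TextView.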
Bind the speech recognizer.
Code:
mSpeechRecognizer.setRealTimeTranscriptionListener(new SpeechRecognitionListener());
Call startRecognizing to start speech recognition.
Code:
mSpeechRecognizer.startRecognizing(config);
Release resources after recognition is complete.
Code:
if (mSpeechRecognizer != null) {
mSpeechRecognizer.destroy();
}
(Optional) Obtain the list of supported languages.
Code:
MLSpeechRealTimeTranscription.getInstance()
.getLanguages(new MLSpeechRealTimeTranscription.LanguageCallback() {
@Override
public void onResult(List<String> result) {
Log.i(TAG, "support languages==" + result.toString());
}
@Override
public void onError(int errorCode, String errorMsg) {
Log.e(TAG, "errorCode:" + errorCode + "errorMsg:" + errorMsg);
}
});
We have finished integration here, so let's test it out on a simple screen.
Tap START RECORDING. The text recognized from the input speech will display in the lower portion of the screen.
We've now built a simple audio transcription function.
Eager to build a fancier UI, with stunning animations, and other effects? By all means, take your shot!
For reference:​Real-Time Transcription
Sample Code for ML Kit
To learn more, please visit:
HUAWEI Developers official website
Development Guide
Reddit to join developer discussions
GitHub or Gitee to download the demo and sample code
Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.
Original Source

Integration of Huawei Analytics kit and Crash service in Navigation Glove IoT application Using Kotlin — Part 6

Introduction​
If you are new to this series, check out the articles below.
Beginner: Integration of Huawei Account kit in Navigation Glove IoT application Using Kotlin - Part 1
Beginner: Integration of Huawei Map kit in Navigation Glove IoT application Using Kotlin - Part 2
Beginner: Integration of Huawei Site kit in Navigation Glove IoT application Using Kotlin - Part 3
Beginner: Integration of Huawei Direction API in Navigation Glove IoT application Using Kotlin - Part 4
Beginner: Connecting to Smart Gloves Using Bluetooth in Navigation Glove IoT application Using Kotlin - Part 5
In this article, we will learn how to integrate Analytics kit and Crash Service in Smart Glove application.
Adding Events with Huawei Analytics Kit
This guide walks you through the process of building an application that uses Huawei Analytics Kit to trigger events and view the resulting data on the console.
What You Will Build
You will build an application that triggers events, sets user properties, logs custom events, and more.
Prerequisite​
About 10 minutes
A favorite text editor or IDE (for me, Android Studio)
JDK 1.8 or later
Gradle 4+
SDK platform 19
Analytics Kit
What is Mobile analytics?
Mobile analytics captures data from mobile app, website, and web app visitors to identify unique users, track their journeys, record their behaviour, and report on the app’s performance. Similar to traditional web analytics, mobile analytics are used to improve conversions, and are the key to crafting world-class mobile experiences.
How to complete this guide
You only really know a concept in theory when you can answer all the WH questions about it. So, to complete this guide, let's go through them.
1. Who has to use analytics?
2. Which one to use?
3. What is Huawei Analytics kit?
4. When to use HMS Analytics kit?
5. Why to use analytics kit?
6. Where to use analytics Kit?
Once you have the answers to all the questions above, you will have the theoretical knowledge. But to see it in practice, you should also know the answer to the question below.
1. How to integrate Huawei analytics kit?
Who has to use the analytics kit?
The answer is very simple: the analytics kit is used in mobile/web applications, so of course it is the software developer who has to integrate it.
Which one to use?
There are many analytics vendors in the market, but for mobile applications I recommend Huawei Analytics Kit. Now you will definitely ask why. To answer this, I'll give some reasons:
Very easy to integrate.
The documentation is very good.
The community is very active, and responses from it are fast.
Moreover, it is very similar to other vendors, so there is no need to learn new things.
You can see events in real time.
What is Huawei Analytics kit?
Huawei Analytics Kit offers you a range of analytics models that help you analyse users' behaviour with predefined and custom events, so you can gain a deeper insight into your users, products, and content. It helps you understand how users behave on different platforms, based on the user behaviour events and user attributes reported through your apps.
Huawei Analytics Kit, a one-stop analytics platform, provides developers with intelligent, convenient, and powerful analytics capabilities that can be used to optimize app performance and identify marketing channels. With it, you can:
Collect and report custom events.
Set a maximum of 25 user attributes.
Automate event collection and session calculation.
Preset event IDs and parameters.
When to use HMS Analytics kit?
Mobile app analytics are a developer's best friend. They help you understand how users behave and how the app can be optimized to reach your goals. Without mobile app analytics, you would be trying out different things blindly, without any data to back up your experiments.
That's why it's extremely important for developers to understand their mobile app analytics, so they can track their progress while working towards their goals.
Why to use analytics kit?
Mobile app analytics are essential to the development process for many reasons. They give you insights into how users use your app, which parts of the app they interact with, and what actions they take within the app. You can use these insights to come up with an action plan for improving your product, such as adding new features that users seem to need, improving existing ones in a way that makes users' lives easier, or removing features that users don't seem to use.
You'll also gain insight into whether you're achieving your goals for your mobile app, whether that's revenue, awareness, or other KPIs, and can then use that data to adjust your strategy and optimize your app towards further goals.
When it comes to "why", everyone always thinks about the benefits.
Benefits of Analytics
App analytics helps you drive ROI across every aspect of performance.
App analytics helps you gather accurate data to better serve your customers.
App analytics allows you to drive personalized, customer-focused marketing.
App analytics allows you to track individual and group achievement of marketing goals from campaigns.
App analytics offers data-driven insights into issues concerning churn and retention.
Where to use analytics Kit?
This is a very important question, because you already know why to use the analytics kit. Wherever you want to understand user behaviour, such as which parts of the application users use regularly or which functionality they use most, you can use the analytics kit, in either a mobile or a web application.
Crash Service
What is Huawei Crash service?
Huawei Crash is a real-time crash reporting tool that helps you track, prioritize, and fix the stability issues that compromise the quality of your app. It also helps with troubleshooting and saves debugging time.
The AppGallery Connect Crash service provides a powerful lightweight solution to app crash problems. With the service, you can quickly detect, locate and resolve app crashes (unexpected exits of apps), and have access to highly readable crash reports in real time, without the need to write any code.
To ensure the stable running of your app and prevent user experience deterioration caused by crashes, you are advised to monitor the running status of your app on each device. The Crash service provides real-time reports, revealing any crash of your app on any device. In addition, the Crash service can intelligently aggregate crashes and provide context data for each crash, such as environment information and the stack trace, so that you can easily prioritize crashes for rapid resolution.
Why do we need the crash service?
Although apps go through rounds of testing before release, given the large user base, diverse device models, and complex network environments, it is inevitable that apps occasionally crash. Crashes compromise the user experience: users may even uninstall an app because of them, and the app will not get good reviews.
Moreover, you can't get enough crash information from reviews to locate crashes, so you can't resolve them quickly, and this can severely harm your business. That's why we need to use the Crash service in our apps to be more efficient.
Now let's start with the practical part.
So far, you have understood the theoretical concepts of the analytics kit. To understand it in practice, we should answer the question below.
How do we integrate Huawei Analytics Kit into an Android application?
To achieve this, you need to follow these steps:
1. Configure application on the AGC.
2. Client application development process.
Configure application on the AGC
Follow the steps.
Step 1: Register a developer account in AppGallery Connect. If you are already a developer, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project
Step 3: Set the data storage location based on current location.
Step 4: Enabling Analytics Kit. Project setting > Manage API > Enable analytics kit toggle button.
Step 5: Generating a Signing Certificate Fingerprint.
Step 6: Configuring the Signing Certificate Fingerprint.
Step 7: Download your agconnect-services.json file, paste it into the app root directory.
Step 8: Choose Quality > Crash > Enable the crash service.
How to integrate Analytics Kit and Crash Service
1. Configure the application on the AGC.
2. Client application development process.
Client application development process
Follow the steps.
Step 1: Create an Android application in the Android studio (Any IDE which is your favorite).
Step 2: Add the App level Gradle dependencies. Choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
dependencies {
implementation 'com.huawei.hms:hianalytics:5.1.0.300'
implementation "com.huawei.agconnect:agconnect-crash:1.6.0.300"
}
Root level gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: To allow HTTP and HTTPS network requests on devices with targetSdkVersion 28 or later, configure the following information in the AndroidManifest.xml file:
XML:
<application
...
android:usesCleartextTraffic="true">
...
</application>
Analytics kit
HuaweiLog.kt
Kotlin:
package com.huawei.navigationglove.analytics
import android.os.Bundle
import android.util.Log
import java.lang.Exception
import java.util.ArrayList
import java.util.HashMap
class HuaweiLog {
var eventName: String? = null
private val data: HashMap<String, String>? = HashMap()
fun setEventName(eventName: String?): HuaweiLog {
this.eventName = eventName
return this
}
fun setKeyAndValue(key: String, value: String): HuaweiLog {
data!![key] = value
return this
}
fun toBundle(): Bundle {
val bundle = Bundle()
try {
if (data != null && data.size > 0) {
for ((key, value) in data) {
bundle.putString(key, value)
Log.d("Huawei", "$key $value")
}
}
} catch (e: Exception) {
e.printStackTrace()
}
return bundle
} /*
public HuaweiLog setTestDescription(ArrayList list) {
testdata.put(HuaweiEventParams.Key.DESCRIPTION.textValue(), list);
return this;
}*/
}
HuaweiAnalyticsClient.kt
Kotlin:
package com.huawei.navigationglove.analytics
import android.content.Context
import android.util.Log
import com.huawei.hms.analytics.HiAnalytics
import com.huawei.hms.analytics.type.ReportPolicy
import com.huawei.hms.analytics.HiAnalyticsInstance
import java.lang.Exception
import java.lang.RuntimeException
import java.util.HashSet
class HuaweiAnalyticsClient private constructor() {
fun initAnalytics(context: Context?) {
mHuaweiAnalyticsClient = HiAnalytics.getInstance(context)
if (mHuaweiAnalyticsClient == null) {
Log.e(TAG, "Analytics Client could not be initialized.")
return
}
mHuaweiAnalyticsClient!!.setAnalyticsEnabled(true)
mHuaweiAnalyticsClient!!.setUserId("UserId")
mHuaweiAnalyticsClient!!.setAutoCollectionEnabled(true)
// Used to report an event upon app switching to the background.
val moveBackgroundPolicy = ReportPolicy.ON_MOVE_BACKGROUND_POLICY
// Used to report an event at the specified interval.
val scheduledTimePolicy = ReportPolicy.ON_SCHEDULED_TIME_POLICY
// Set the event reporting interval to 600 seconds.
scheduledTimePolicy.threshold = 600
val reportPolicies: MutableSet<ReportPolicy> = HashSet()
// Add the ON_SCHEDULED_TIME_POLICY and ON_MOVE_BACKGROUND_POLICY policies.
reportPolicies.add(scheduledTimePolicy)
reportPolicies.add(moveBackgroundPolicy)
// Set the ON_MOVE_BACKGROUND_POLICY and ON_SCHEDULED_TIME_POLICY policies.
mHuaweiAnalyticsClient!!.setReportPolicies(reportPolicies)
}
fun logEvent(log: HuaweiLog) {
if (mHuaweiAnalyticsClient == null) {
throw RuntimeException("HuaweiAnalyticsClient is not initialized. Please call initAnalytics().")
}
try {
mHuaweiAnalyticsClient!!.onEvent(log.eventName, log.toBundle())
Log.d("Huawei", log.eventName!!)
} catch (e: Exception) {
Log.d(TAG, "Huawei analytics failed" + e.message)
}
}
/* public void putHuaweiUserProperty(HuaweiUserProperty.KeyName propertyKey, String value) {
if (mHuaweiAnalyticsClient == null) {
throw new RuntimeException("HuaweiAnalyticsClient is not initialized. Please call initAnalytics().");
}
try {
mHuaweiAnalyticsClient.setUserProfile(propertyKey.textValue(), value);
} catch (Exception e) {
Log.d(TAG, "Huawei analytics failed", e);
}
}*/
fun setHuaweiUserId(userId: String?) {
if (mHuaweiAnalyticsClient == null) {
throw RuntimeException("HuaweiAnalyticsClient is not initialized. Please call initAnalytics().")
}
if (userId == null) {
mHuaweiAnalyticsClient!!.setUserId(null)
return
}
mHuaweiAnalyticsClient!!.setUserId(userId)
}
companion object {
private val TAG = HuaweiAnalyticsClient::class.java.simpleName
private var ourInstance: HuaweiAnalyticsClient? = null
private var mHuaweiAnalyticsClient: HiAnalyticsInstance? = null
val instance: HuaweiAnalyticsClient?
get() {
if (ourInstance == null) {
ourInstance = HuaweiAnalyticsClient()
}
return ourInstance
}
}
}
AnalyticUtils.kt
Kotlin:
package com.huawei.navigationglove.analytics
import android.util.Log
import java.lang.Exception
object AnalyticUtils {
private val TAG = AnalyticUtils::class.java.simpleName
fun logHuaweiAnalyticEvent(huaweiLog: HuaweiLog) {
try {
HuaweiAnalyticsClient.instance!!.logEvent(huaweiLog)
Log.d(TAG, "Huawei analytics $huaweiLog")
} catch (e: Exception) {
Log.d(TAG, "Huawei analytics failed")
}
}
}
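The classes above are never shown in use, so here is a hedged, dependency-free Java sketch of the same builder pattern, showing how an event accumulates parameters before being reported. The event and parameter names are made up for illustration; in the real app you would build a HuaweiLog the same way and pass it to AnalyticUtils.logHuaweiAnalyticEvent.

```java
import java.util.HashMap;
import java.util.Map;

public class EventLog {
    // Minimal stand-in for HuaweiLog: an event name plus string parameters.
    String eventName;
    final Map<String, String> data = new HashMap<>();

    EventLog setEventName(String name) {
        this.eventName = name;
        return this; // returning this enables the fluent chain
    }

    EventLog setKeyAndValue(String key, String value) {
        data.put(key, value);
        return this;
    }

    public static void main(String[] args) {
        EventLog log = new EventLog()
                .setEventName("glove_connected")
                .setKeyAndValue("device", "smart_glove")
                .setKeyAndValue("battery", "80");
        System.out.println(log.eventName + " " + log.data.size()); // glove_connected 2
    }
}
```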
Crash Report
Java:
//AGConnectCrash.getInstance().testIt(this)
AGConnectCrash.getInstance().enableCrashCollection(true)
AGConnectCrash.getInstance().setUserId("Add user ID");
AGConnectCrash.getInstance().log(Log.DEBUG, "set debug log.");
AGConnectCrash.getInstance().log(Log.INFO, "set info log.");
AGConnectCrash.getInstance().log(Log.WARN, "set warning log.");
AGConnectCrash.getInstance().log(Log.ERROR, "set error log.");
AGConnectCrash.getInstance().setCustomKey("stringKey", "Hello world");
AGConnectCrash.getInstance().setCustomKey("booleanKey", false);
AGConnectCrash.getInstance().setCustomKey("doubleKey", 1.1);
AGConnectCrash.getInstance().setCustomKey("floatKey", 1.1f);
AGConnectCrash.getInstance().setCustomKey("intKey", 0);
AGConnectCrash.getInstance().setCustomKey("longKey", 11L);
Result​
Analytics​
Crash Service​
Tips and Tricks​
Make sure you are already registered as a Huawei developer.
Make sure you have added the agconnect-services.json file to the app folder.
Make sure you have added the SHA-256 fingerprint without fail.
Make sure all the dependencies are added properly.
Add internet permission in AndroidManifest.xml
Conclusion​
In this article, we have learnt how to integrate Huawei Analytics Kit and the Crash service in the Smart Glove application using Android Studio and Kotlin. We have also learnt how to download a crash report and how to track analytics data on the console.
Reference​
Analytics Kit - Official document
Huawei Crash – Official document
Analytics Kit – Training Video
Huawei Crash – Training Video

Build a Seamless Sign-in Experience Across Different Apps and Platforms with Keyring

Mobile apps have significantly changed the way we live, bringing about greater convenience. With our mobiles we can easily book hotels online when we go sightseeing, buy train and flight tickets online for business trips, or just pay for a dinner using scan and pay.
There is rarely a one-app-fits-all approach to offering such services, so users have to switch back and forth between multiple apps. This also requires users to register and sign in to different apps, which is a hassle in itself, because they need to complete complex registration processes and repeatedly enter their account names and passwords.
In addition, as technology develops, a developer usually has multiple Android apps and app versions, such as the quick app and web app, for different platforms. If users have to repeatedly sign in to different apps or versions by the same developer, the churn rate will likely increase. What's more, the developer may need to even pay for sending SMS messages if users choose to sign in to their apps through SMS verification codes.
Is there anything the developer can do to streamline the sign-in process between different apps and platforms so that users do not need to enter their account names and passwords again and again?
Well fortunately, HMS Core Keyring makes this possible. Keyring is a Huawei service that offers credential management APIs for storing user credentials locally on users' Android phones and tablets and sharing the credentials between different apps and different platform versions of an app. Developers can call relevant APIs in their Android apps, web apps, or quick apps to use Keyring services, such as encrypt the sign-in credentials of users for local storage on user devices and share the credentials between different apps and platforms, thus creating a seamless sign-in experience for users across different apps and platforms. Besides, all credentials will be stored in Keyring regardless of which type of APIs developers are calling, to implement unified credential management and sharing.
In this article, I'll share how I used Keyring to manage and share sign-in credentials of users. I hope this will help you.
Advantages​
First, I'd like to explain some advantages of Keyring.
Building a seamless sign-in experience
Your app can call Keyring APIs to obtain sign-in credentials stored on user devices, for easy sign-in.
Ensuring data security and reliability
Keyring encrypts sign-in credentials of users for local storage on user devices and synchronizes the credentials between devices via end-to-end encryption technology. The encrypted credentials cannot be decrypted on the cloud.
Reducing the churn rate during sign-in
Keyring can simplify the sign-in process for your apps, thus reducing the user churn rate.
Reducing the operations cost
With Keyring, you can reduce the operations cost, such as the expense for SMS messages used by users to sign in to your app.
Development Procedure​
Next, let's look at how to integrate Keyring. Before getting started, you will need to make some preparations, such as registering as a Huawei developer, generating and configuring your signing certificate fingerprint in AppGallery Connect, and enabling Keyring. You can click here to learn about the detailed preparation steps, which are not covered in this article.
After making necessary preparations, you can now start integrating the Keyring SDK. I'll detail the implementation steps in two scenarios.
User Sign-in Scenario​
In this scenario, you need to follow the steps below to implement relevant logic.
1. Initialize the CredentialClient object in the onCreate method of your activity. Below is a code snippet example.
Code:
CredentialClient credentialClient = CredentialManager.getCredentialClient(this);
2. Check whether a credential is available. Below is a code snippet example.
Code:
List<AppIdentity> trustedAppList = new ArrayList<>();
trustedAppList.add(new AndroidAppIdentity("yourAppName", "yourAppPackageName", "yourAppCodeSigningCertHash"));
trustedAppList.add(new WebAppIdentity("youWebSiteName", "www.yourdomain.com"));
trustedAppList.add(new WebAppIdentity("youWebSiteName", "login.yourdomain.com"));
SharedCredentialFilter sharedCredentialFilter = SharedCredentialFilter.acceptTrustedApps(trustedAppList);
credentialClient.findCredential(sharedCredentialFilter, new CredentialCallback<List<Credential>>() {
@Override
public void onSuccess(List<Credential> credentials) {
if (credentials.isEmpty()) {
Toast.makeText(MainActivity.this, R.string.no_available_credential, Toast.LENGTH_SHORT).show();
} else {
for (Credential credential : credentials) {
// Process each available credential, for example by obtaining its information and content.
}
}
}
@Override
public void onFailure(long errorCode, CharSequence description) {
Toast.makeText(MainActivity.this, R.string.query_credential_failed, Toast.LENGTH_SHORT).show();
}
});
3. Call the Credential.getContent method to obtain the credential content and obtain the result from CredentialCallback<T>. Below is a code snippet example.
Code:
private Credential mCredential;
// Obtained credential.
mCredential.getContent(new CredentialCallback<byte[]>() {
@Override
public void onSuccess(byte[] bytes) {
String hint = String.format(getResources().getString(R.string.get_password_ok),
new String(bytes));
Toast.makeText(MainActivity.this, hint, Toast.LENGTH_SHORT).show();
mResult.setText(new String(bytes));
}
@Override
public void onFailure(long l, CharSequence charSequence) {
Toast.makeText(MainActivity.this, R.string.get_password_failed,
Toast.LENGTH_SHORT).show();
mResult.setText(R.string.get_password_failed);
}
});
4. Call the credential saving API when a user enters a new credential, to save the credential. Below is a code snippet example.
Code:
AndroidAppIdentity app2 = new AndroidAppIdentity(sharedToAppName,
sharedToAppPackage, sharedToAppCertHash);
List<AppIdentity> sharedAppList = new ArrayList<>();
sharedAppList.add(app2);
Credential credential = new Credential(username, CredentialType.PASSWORD, userAuth,
password.getBytes());
credential.setDisplayName("user_niceday");
credential.setSharedWith(sharedAppList);
credential.setSyncable(true);
credentialClient.saveCredential(credential, new CredentialCallback<Void>() {
@Override
public void onSuccess(Void unused) {
Toast.makeText(MainActivity.this,
R.string.save_credential_ok,
Toast.LENGTH_SHORT).show();
}
@Override
public void onFailure(long errorCode, CharSequence description) {
Toast.makeText(MainActivity.this,
R.string.save_credential_failed + " " + errorCode + ":" + description,
Toast.LENGTH_SHORT).show();
}
});
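The sharedToAppCertHash passed to AndroidAppIdentity above is the SHA-256 fingerprint of the target app's signing certificate, written as colon-separated uppercase hex. As a minimal plain-Java sketch of producing that format from raw certificate bytes (the helper is mine, not part of the Keyring SDK):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class FingerprintFormat {
    // Digests the certificate bytes with SHA-256 and renders the result as
    // colon-separated uppercase hex pairs; 32 digest bytes yield a
    // 95-character string such as "BA:78:16:...".
    static String sha256Fingerprint(byte[] certBytes) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("SHA-256").digest(certBytes);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < digest.length; i++) {
            if (i > 0) sb.append(':');
            sb.append(String.format("%02X", digest[i]));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // SHA-256("abc") is a well-known test vector starting with BA:78:16:BF.
        System.out.println(sha256Fingerprint("abc".getBytes()));
    }
}
```

In practice you would feed this the signing certificate bytes of the app you want to share the credential with, rather than hard-coding the string.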
User Sign-out Scenario​
Similarly, follow the steps below to implement relevant logic.
1. Initialize the CredentialClient object in the onCreate method of your activity. Below is a code snippet example.
Code:
CredentialClient credentialClient = CredentialManager.getCredentialClient(this);
2. Check whether a credential is available. Below is a code snippet example.
Code:
List<AppIdentity> trustedAppList = new ArrayList<>();
trustedAppList.add(new AndroidAppIdentity("yourAppName", "yourAppPackageName", "yourAppCodeSigningCertHash"));
trustedAppList.add(new WebAppIdentity("youWebSiteName", "www.yourdomain.com"));
trustedAppList.add(new WebAppIdentity("youWebSiteName", "login.yourdomain.com"));
SharedCredentialFilter sharedCredentialFilter = SharedCredentialFilter.acceptTrustedApps(trustedAppList);
credentialClient.findCredential(sharedCredentialFilter, new CredentialCallback<List<Credential>>() {
@Override
public void onSuccess(List<Credential> credentials) {
if (credentials.isEmpty()) {
Toast.makeText(MainActivity.this, R.string.no_available_credential, Toast.LENGTH_SHORT).show();
} else {
for (Credential credential : credentials) {
// Further process the available credentials, including obtaining the credential information and content and deleting the credentials.
}
}
}
@Override
public void onFailure(long errorCode, CharSequence description) {
Toast.makeText(MainActivity.this, R.string.query_credential_failed, Toast.LENGTH_SHORT).show();
}
});
3. Call the deleteCredential method to delete the credential and obtain the result from CredentialCallback. Below is a code snippet example.
Code:
credentialClient.deleteCredential(credential, new CredentialCallback<Void>() {
@Override
public void onSuccess(Void unused) {
String hint = String.format(getResources().getString(R.string.delete_ok),
credential.getUsername());
Toast.makeText(MainActivity.this, hint, Toast.LENGTH_SHORT).show();
}
@Override
public void onFailure(long errorCode, CharSequence description) {
String hint = String.format(getResources().getString(R.string.delete_failed),
description);
Toast.makeText(MainActivity.this, hint, Toast.LENGTH_SHORT).show();
}
});
Keyring offers two modes for sharing credentials: sharing credentials using API parameters and sharing credentials using Digital Asset Links. I will detail the two modes below.
Sharing Credentials Using API Parameters​
In this mode, when calling the saveCredential method to save credentials, you can call the setSharedWith method to set parameters of the Credential object, to implement credential sharing. A credential can be shared to a maximum of 128 apps.
The sample code is as follows:
Code:
AndroidAppIdentity app1 = new AndroidAppIdentity("your android app name",
"your android app package name", "3C:99:C3:....");
QuickAppIdentity app2 = new QuickAppIdentity("your quick app name",
"your quick app package name", "DC:99:C4:....");
List<AppIdentity> sharedAppList = new ArrayList<>(); // List of apps with which the credential is shared.
sharedAppList.add(app1);
sharedAppList.add(app2);
Credential credential = new Credential("username", CredentialType.PASSWORD, true,
"password".getBytes());
credential.setSharedWith(sharedAppList); // Set the credential sharing relationship.
credentialClient.saveCredential(credential, new CredentialCallback<Void>() {
@Override
public void onSuccess(Void unused) {
Toast.makeText(MainActivity.this,
R.string.save_credential_ok,
Toast.LENGTH_SHORT).show();
}
@Override
public void onFailure(long errorCode, CharSequence description) {
Toast.makeText(MainActivity.this,
R.string.save_credential_failed + " " + errorCode + ":" + description,
Toast.LENGTH_SHORT).show();
}
});
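Since a credential can be shared with at most 128 apps, it may be worth validating the list size before calling setSharedWith, so that an oversized list fails fast with a clear message. A small plain-Java sketch (the helper and its name are mine, not part of the Keyring SDK):

```java
import java.util.List;

public class ShareLimit {
    // Keyring's documented cap: a credential can be shared with at most 128 apps.
    static final int MAX_SHARED_APPS = 128;

    // Throws early with a clear message instead of letting saveCredential be
    // rejected later; returns the list unchanged when within the cap.
    static <T> List<T> checkShareList(List<T> apps) {
        if (apps.size() > MAX_SHARED_APPS) {
            throw new IllegalArgumentException("Keyring allows sharing with at most "
                    + MAX_SHARED_APPS + " apps, got " + apps.size());
        }
        return apps;
    }
}
```

You could call credential.setSharedWith(ShareLimit.checkShareList(sharedAppList)) so the check runs at the point of use.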
Sharing Credentials Using Digital Asset Links​
In this mode, you can add credential sharing relationships in the AndroidManifest.xml file of your Android app. The procedure is as follows:
1. Add the following content to the <application> element in the AndroidManifest.xml file:
Code:
<application>
<meta-data
android:name="asset_statements"
android:value="@string/asset_statements" />
</application>
2. Add the following content to the res\values\strings.xml file:
Code:
<string name="asset_statements">your digital asset links statements</string>
The Digital Asset Links statements are JSON strings that comply with the Digital Asset Links protocol. The sample code is as follows:
Code:
[{
"relation": ["delegate_permission/common.get_login_creds"],
"target": {
"namespace": "web",
"site": "https://developer.huawei.com" // Set your website domain name.
}
},
{
"relation": ["delegate_permission/common.get_login_creds"],
"target": {
"namespace": "android_app",
"package_name": "your android app package name",
"sha256_cert_fingerprints": [
"F2:52:4D:..."
]
}
},
{
"relation": ["delegate_permission/common.get_login_creds"],
"target": {
"namespace": "quick_app",
"package_name": "your quick app package name",
"sha256_cert_fingerprints": [
"C3:68:9F:..."
]
}
}
]
The relation attribute has a fixed value of ["delegate_permission/common.get_login_creds"], indicating that the credential is shared with apps described in the target attribute.
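A typo in a sha256_cert_fingerprints entry will quietly break credential sharing, so it can be handy to sanity-check the format before shipping. A plain-Java sketch (the helper is mine, not part of Keyring) that verifies a fingerprint is 32 colon-separated uppercase hex pairs:

```java
public class FingerprintCheck {
    // A SHA-256 fingerprint is 32 bytes rendered as two uppercase hex
    // digits per byte, joined by colons: 95 characters in total.
    static boolean isValidSha256Fingerprint(String fp) {
        return fp != null && fp.matches("([0-9A-F]{2}:){31}[0-9A-F]{2}");
    }

    public static void main(String[] args) {
        System.out.println(isValidSha256Fingerprint(
                "BA:78:16:BF:8F:01:CF:EA:41:41:40:DE:5D:AE:22:23:" +
                "B0:03:61:A3:96:17:7A:9C:B4:10:FF:61:F2:00:15:AD")); // prints true
    }
}
```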
And that's all for integrating Keyring. That was pretty straightforward, right? You can click here to find out more about Keyring and try it out.
Conclusion​
More and more developers are prioritizing a seamless sign-in experience to retain users and reduce the churn rate. This is especially true for developers with multiple apps and app versions for different platforms, because it helps them share the user base across their apps. There are many ways to achieve this. As illustrated earlier in this article, my solution is to integrate Keyring, which has turned out to be very effective. If you have similar requirements, give this service a try; you may be pleasantly surprised.
Did I miss anything? Let me know in the comments section below.
