Obtaining Authentication Information
Use Case
A HUAWEI ID is required for a user to access and manage their files in Drive. Before your app can use the Drive SDK, it must prompt the user to sign in with their HUAWEI ID. After the user signs in, your app saves the relevant user information for calling Drive SDK APIs later.
Service Process
The figure below shows the basic service process.
Development Procedure
Step 1 Integrate the HMS SDK.
To use the Drive SDK, your app needs to obtain user information through the HMS SDK first. For details, please refer to Integrating the Drive SDK and HMS SDK.
Step 2 Set the HMS scope.
Set the Drive scope by calling the HuaweiIdAuthParamsHelper.setScopeList API provided by the HMS SDK to obtain the permission to access Drive APIs.
Code:
List<Scope> scopeList = new LinkedList<>();
scopeList.add(HuaweiIdAuthAPIManager.HUAWEIID_BASE_SCOPE);// Basic account permissions
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE));// All permissions, except permissions for the app folder.
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE_FILE));// Permissions to view and manage files.
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE_METADATA));// Permissions to view and manage file metadata, excluding file content.
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE_METADATA_READONLY));// Permissions only for viewing file metadata, excluding file entities.
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE_READONLY));// Permissions to view file metadata and content.
scopeList.add(new Scope(DriveScopes.SCOPE_DRIVE_APPDATA));// Permissions to view and manage app files.
HuaweiIdAuthParams params = new HuaweiIdAuthParamsHelper(HuaweiIdAuthParams.DEFAULT_AUTH_REQUEST_PARAM)
.setAccessToken()
.setIdToken()
.setScopeList(scopeList)
.createParams();
The access permissions vary according to the Drive scope. You may apply for the permissions as needed. For details, please refer to Signing In with HUAWEI ID.
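For reference, the following is a minimal sketch of launching the sign-in flow with the params built above, following the same HuaweiIdAuthManager pattern used in the Health Kit example later in this post. REQUEST_SIGN_IN and the signInHuaweiId field are illustrative names, not part of the Drive SDK.
Code:
// Sketch only: launch the HUAWEI ID sign-in screen with the scopes requested above.
private static final int REQUEST_SIGN_IN = 1002; // arbitrary request code (assumption)
private AuthHuaweiId signInHuaweiId;

private void signIn(HuaweiIdAuthParams params) {
    HuaweiIdAuthService authService = HuaweiIdAuthManager.getService(this, params);
    startActivityForResult(authService.getSignInIntent(), REQUEST_SIGN_IN);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_SIGN_IN) {
        Task<AuthHuaweiId> authTask = HuaweiIdAuthManager.parseAuthResultFromIntent(data);
        if (authTask.isSuccessful()) {
            // Keep the signed-in HUAWEI ID; Step 3 (initDrive) reads unionId and accessToken from it.
            signInHuaweiId = authTask.getResult();
            initDrive();
        }
    }
}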
Step 3 Store authentication information.
After obtaining the user information, save the unionId and AccessToken of the user for calling the Drive SDK whenever needed. The following provides an example implementation method. You may use another implementation method if necessary.
Sample code:
Create the CredentialManager class.
Code:
/**
* Credential management class
* <p>
* since 1.0
*/
public class CredentialManager {
/**
* Drive authentication information.
*/
private DriveCredential mCredential;
private CredentialManager() {
}
private static class InnerHolder {
private static CredentialManager sInstance = new CredentialManager();
}
/**
* Singleton CredentialManager instance.
*
* @return CredentialManager
*/
public static CredentialManager getInstance() {
return InnerHolder.sInstance;
}
/**
* Initialize the Drive credential based on the HUAWEI ID information (unionId and accessToken).
* When the current accessToken expires, the registered AccessMethod is invoked to obtain a new one.
*
* @param unionID unionId of the HUAWEI ID
* @param at access token
* @param refreshAT callback for refreshing the access token
* @return DriveCode.SUCCESS on success; DriveCode.ERROR if unionID or at is empty
*/
public int init(String unionID, String at, DriveCredential.AccessMethod refreshAT) {
if (StringUtils.isNullOrEmpty(unionID) || StringUtils.isNullOrEmpty(at)) {
return DriveCode.ERROR;
}
DriveCredential.Builder builder = new DriveCredential.Builder(unionID, refreshAT);
mCredential = builder.build().setAccessToken(at);
return DriveCode.SUCCESS;
}
/**
* Obtain DriveCredential.
*
* @return DriveCredential
*/
public DriveCredential getCredential() {
return mCredential;
}
/**
* Exit Drive and clear all cache information generated during use.
*/
public void exit(Context context) {
// Clear cache files.
deleteFile(context.getCacheDir());
deleteFile(context.getFilesDir());
}
/**
* Delete cache files.
*
* @param file Designated cache file
*/
private static void deleteFile(java.io.File file) {
if (null == file || !file.exists()) {
return;
}
if (file.isDirectory()) {
java.io.File[] files = file.listFiles();
if (files != null) {
for (java.io.File f : files) {
deleteFile(f);
}
}
}
// Delete the file itself (or the now-empty directory).
file.delete();
}
}
Call initDrive in the onActivityResult callback after the user signs in.
Code:
private static DriveCredential.AccessMethod refreshAT = new DriveCredential.AccessMethod() {
@Override
public String refreshToken() {
// Obtain a new access token through the HMS SDK.
return refreshAccessToken();
}
};
private void initDrive() {
// signInHuaweiId is the HUAWEI ID that the user used to sign in. Based on signInHuaweiId, obtain unionId and AccessToken.
final String unionID = signInHuaweiId.getUnionId();
final String accessToken = signInHuaweiId.getAccessToken();
int returnCode = CredentialManager.getInstance().init(unionID, accessToken, refreshAT);
if (DriveCode.SUCCESS == returnCode) {
//Jump to the app home page after successful initialization.
jumpToMainActivity();
}
}
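The code above calls a refreshAccessToken() helper that is not shown. Below is a minimal sketch, assuming the HuaweiIdAuthParams from Step 2 are kept in a field named authParams and that the user has already authorized the app so that silent sign-in succeeds; adapt the error handling to your app.
Code:
// Sketch only: obtain a new access token by silently signing in again through HMS Account Kit.
// authParams is assumed to be the HuaweiIdAuthParams created in Step 2.
private String refreshAccessToken() {
    final String[] newToken = new String[1];
    final CountDownLatch latch = new CountDownLatch(1);
    HuaweiIdAuthService authService = HuaweiIdAuthManager.getService(getApplicationContext(), authParams);
    authService.silentSignIn()
        .addOnSuccessListener(new OnSuccessListener<AuthHuaweiId>() {
            @Override
            public void onSuccess(AuthHuaweiId huaweiId) {
                newToken[0] = huaweiId.getAccessToken();
                latch.countDown();
            }
        })
        .addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                // Silent sign-in failed; the user may need to sign in interactively again.
                latch.countDown();
            }
        });
    try {
        // DriveCredential.AccessMethod expects a synchronous return, so wait briefly for the result.
        latch.await(10, TimeUnit.SECONDS);
    } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
    }
    return newToken[0];
}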
Related
For more articles like this, visit the HUAWEI Developer Forum.
Want to retrieve users' health or fitness data based on their HUAWEI ID and show it in your app?
Health Kit provides three main features:
Data storage
Provides a data platform for developers to store fitness and health data.
Data openness
Provides a wide range of fitness and health APIs and supports sharing of various fitness and health data, including step count, weight, and heart rate.
Data access authorization management
Provides settings for users and developers so that users can manage developers' access to their health and fitness data, guaranteeing users' data privacy and legal rights.
Development
1. Register a HUAWEI ID and apply for the Account service:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/apply-id-0000001050069756
2. Apply for Health Kit in the developer console:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/apply-kitservice-0000001050071707
3. Integrate HMS Health Kit
In the app-level build.gradle file, add the dependency.
Code:
implementation 'com.huawei.hms:hihealth-base:5.0.0.300'
In the AndroidManifest.xml file, add the app ID generated when creating the app on HUAWEI Developers to the application section.
Code:
<meta-data
android:name="com.huawei.hms.client.appid"
android:value="APP ID"/>
4. Call the related APIs to display the HUAWEI ID sign-in and authorization screens. The app can access data only upon user authorization.
Sign in via the HMS Core SDK and apply for the scopes needed to access the HUAWEI Health Kit APIs.
Code:
private void signIn() {
Log.i(TAG, "begin sign in");
List<Scope> scopeList = new ArrayList<>();
// Add scopes to apply for. The following only shows an example.
// Developers need to add scopes according to their specific needs.
// View and save step count data in HUAWEI Health Kit.
scopeList.add(new Scope(Scopes.HEALTHKIT_STEP_BOTH));
// View and save calorie data in HUAWEI Health Kit.
scopeList.add(new Scope(Scopes.HEALTHKIT_CALORIES_BOTH));
// View and save distance data in HUAWEI Health Kit.
scopeList.add(new Scope(Scopes.HEALTHKIT_DISTANCE_BOTH));
// Configure authorization parameters.
HuaweiIdAuthParamsHelper authParamsHelper =
new HuaweiIdAuthParamsHelper(HuaweiIdAuthParams.DEFAULT_AUTH_REQUEST_PARAM);
HuaweiIdAuthParams authParams =
authParamsHelper.setIdToken().setAccessToken().setScopeList(scopeList).createParams();
// Initialize the HuaweiIdAuthService object.
final HuaweiIdAuthService authService =
HuaweiIdAuthManager.getService(this.getContext(), authParams);
// Silent sign-in. If authorization has been granted by the current account,
// the authorization screen will not display. This is an asynchronous method.
Task<AuthHuaweiId> authHuaweiIdTask = authService.silentSignIn();
final Context context = this.getContext();
// Add the callback for the call result.
authHuaweiIdTask.addOnSuccessListener(new OnSuccessListener<AuthHuaweiId>() {
@Override
public void onSuccess(AuthHuaweiId huaweiId) {
// The silent sign-in is successful.
Log.i(TAG, "silentSignIn success");
try {
readData();
} catch (ParseException e) {
e.printStackTrace();
}
Toast.makeText(context, "silentSignIn success", Toast.LENGTH_LONG).show();
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception exception) {
// The silent sign-in fails.
// This indicates that the authorization has not been granted by the current account.
if (exception instanceof ApiException) {
ApiException apiException = (ApiException) exception;
Log.i(TAG, "sign failed status:" + apiException.getStatusCode());
Log.i(TAG, "begin sign in by intent");
// Call the sign-in API using the getSignInIntent() method.
Intent signInIntent = authService.getSignInIntent();
MeFragment.this.startActivityForResult(signInIntent, REQUEST_SIGN_IN_LOGIN);
}
}
});
}
5. Build the query condition: a DataCollector object.
Code:
DataCollector dataCollector = new DataCollector.Builder().setPackageName(context)
.setDataType(DataType.DT_CONTINUOUS_STEPS_DELTA)
.setDataStreamName("STEPS_DELTA")
.setDataGenerateType(DataCollector.DATA_TYPE_RAW)
.build();
6. Build the time range and the condition-based query object (ReadOptions).
Use this query object to call the data controller and query the sampling dataset.
Code:
public void readData() throws ParseException {
SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss");
Date startDate = dateFormat.parse("2020-03-17 09:00:00");
Date endDate = dateFormat.parse("2020-07-17 09:05:00");
ReadOptions readOptions = new ReadOptions.Builder()
.read(DataType.DT_CONTINUOUS_STEPS_DELTA)
.read(DataType.DT_CONTINUOUS_CALORIES_BURNT)
.read(DataType.DT_CONTINUOUS_DISTANCE_DELTA)
.setTimeRange(startDate.getTime(), endDate.getTime(), TimeUnit.MILLISECONDS)
.build();
Task<ReadReply> readReplyTask = dataController.read(readOptions);
readReplyTask.addOnSuccessListener(new OnSuccessListener<ReadReply>() {
@Override
public void onSuccess(ReadReply readReply) {
for (SampleSet sampleSet : readReply.getSampleSets()) {
showSampleSet(sampleSet);
Log.i(TAG,"*****************"+sampleSet);
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
checkData.setText(e.toString()+"read");
}
});
}
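The snippet above uses a dataController object and a showSampleSet() helper that are not defined in this excerpt. A possible sketch is shown below, assuming the standard HuaweiHiHealth entry point and the signed-in HUAWEI ID obtained in the previous step; the helper simply logs every field of every sample point.
Code:
// Assumption: dataController is created once after a successful sign-in.
private DataController dataController;

private void initDataController(AuthHuaweiId signInHuaweiId) {
    dataController = HuaweiHiHealth.getDataController(getContext(), signInHuaweiId);
}

// Illustrative helper: log each sample point and its field values.
private void showSampleSet(SampleSet sampleSet) {
    for (SamplePoint point : sampleSet.getSamplePoints()) {
        Log.i(TAG, "Data type: " + point.getDataType().getName());
        for (Field field : point.getDataType().getFields()) {
            Log.i(TAG, field.getName() + ": " + point.getFieldValue(field));
        }
    }
}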
7. The calorie, distance, and step data are successfully shown in the app.
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/overview-dev-android-0000001050071671
Overview
QUIC (Quick UDP Internet Connections) is a general-purpose transport layer network protocol.
QUIC is a new transport protocol that reduces latency compared with TCP. On the surface, QUIC is very similar to TCP+TLS+HTTP/2 implemented over UDP. Because TCP is implemented in operating system kernels and middlebox firmware, making significant changes to TCP is next to impossible.
QUIC reduces round trips by building on UDP. With features such as elimination of head-of-line blocking through multiplexed connections, loss recovery over UDP, and congestion control, the protocol has evolved and is now widely used.
Introduction of hQUIC
hQUIC helps apps achieve lower packet loss, high-performance network operations, and reliable, fast connections. It supports the gQUIC protocol and provides intelligent congestion control algorithms to avoid congestion in different network environments.
Advantages
Ease of use: Streamlines APIs and shields low-level network details.
High compatibility: Supports gQUIC and Cronet.
Better experience: Outperforms other protocols in poor network conditions.
What is Cronet?
Cronet is the networking stack of Chromium put into a library for use on mobile. This is the same networking stack that is used in the Chrome browser by over a billion people. It offers an easy-to-use, high performance, standards-compliant, and secure way to perform HTTP requests.
Cronet is a very well implemented and tested network stack that provides almost everything a standard app needs from a network layer library: DNS, cookies, SSL/TLS, HTTP(S), HTTP/2, Proxy/PAC, OCSP, WebSockets, and FTP (soon to be removed). It also supports HTTP over QUIC, soon to be called HTTP/3. This makes it a very attractive library to use as the network layer for your mobile app.
How TCP Works
TCP allows for transmission of information in both directions. This means that computer systems that communicate over TCP can send and receive data at the same time, similar to a telephone conversation. The protocol uses segments (packets) as the basic units of data transmission.
The actual process for establishing a connection with the TCP protocol is as follows:
1. First, the requesting sender sends the server a SYN packet or segment (SYN stands for synchronize) with a unique, random number. This number ensures full transmission in the correct order (without duplicates).
2. If the receiver has received the segment, it agrees to the connection by returning a SYN-ACK packet (ACK stands for acknowledgement) including the client's sequence number plus 1. It also transmits its own sequence number to the client.
3. Finally, the sender acknowledges the receipt of the SYN-ACK segment by sending its own ACK packet, which in this case contains the receiver's sequence number plus 1. At the same time, the client can already begin transferring data to the receiver.
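As a quick illustration, this handshake happens transparently whenever an application opens a TCP connection. The small standalone sketch below (plain java.net; the host name is only an example) blocks in connect() until the three-way handshake completes or times out.
Code:
import java.net.InetSocketAddress;
import java.net.Socket;

public class TcpHandshakeDemo {
    public static void main(String[] args) throws Exception {
        Socket socket = new Socket();
        // connect() returns only after the SYN / SYN-ACK / ACK exchange has completed,
        // or throws an exception if the connection cannot be established within the timeout.
        socket.connect(new InetSocketAddress("developer.huawei.com", 443), 5000);
        System.out.println("TCP connection established: " + socket.isConnected());
        socket.close();
    }
}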
Prerequisite
1. A computer with Android Studio installed and able to access the Internet
2. A Huawei phone with HMS Core (APK) 5.0.0 or later installed for debugging the app
3. Java JDK (1.7 or later)
4. Android API (level 19 or higher)
5. EMUI 3.0 or later
6. HMS Core (APK) 5.0.0 or later
Integration process
1. Open the build.gradle file in the root directory of your Android Studio project.
Code:
buildscript {
    repositories {
        maven { url 'https://developer.huawei.com/repo/' }
        google()
        ...
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.3.2'
    }
}
2. Open the build.gradle file in the app directory and add dependency on the SDK.
Code:
dependencies {
implementation 'com.huawei.hms:hquic-provider:5.0.0.300'
}
3. Initialise the service
Code:
HQUICManager.asyncInit(context, new HQUICManager.HQUICInitCallback() {
@Override
public void onSuccess() {
Log.i(TAG, "HQUICManager asyncInit success");
}
@Override
public void onFail(Exception e) {
Log.w(TAG, "HQUICManager asyncInit fail");
}
});
4. Initialise the Cronet Engine
Code:
CronetEngine.Builder builder = new CronetEngine.Builder(context);
builder.enableQuic(true);
builder.addQuicHint(getHost(url), DEFAULT_PORT, DEFAULT_ALTERNATEPORT);
cronetEngine = builder.build();
App Development
I will demonstrate the hQUIC SDK APIs by developing a Speed Test app step by step.
I have created an HQUICServiceProvider class in which I create the service using hQUIC APIs.
Let us implement the asynchronous initialization API.
Code:
public class HQUICServiceProvider {
private static final String TAG = "HQUICServiceProvider";
private static final int DEFAULT_PORT = 443;
private static final int DEFAULT_ALTERNATEPORT = 443;
private static Executor executor = Executors.newSingleThreadExecutor();
private static CronetEngine cronetEngine;
private Context context;
private UrlRequest.Callback callback;
public HQUICServiceProvider(Context context) {
this.context = context;
init();
}
public void init() {
HQUICManager.asyncInit(
context,
new HQUICManager.HQUICInitCallback() {
@Override
public void onSuccess() {
Log.i(TAG, "HQUICManager asyncInit success");
}
@Override
public void onFail(Exception e) {
Log.w(TAG, "HQUICManager asyncInit fail");
}
});
}
private CronetEngine createCronetEngine(String url) {
if (cronetEngine != null) {
return cronetEngine;
}
CronetEngine.Builder builder = new CronetEngine.Builder(context);
builder.enableQuic(true);
builder.addQuicHint(getHost(url), DEFAULT_PORT, DEFAULT_ALTERNATEPORT);
cronetEngine = builder.build();
return cronetEngine;
}
private UrlRequest buildRequest(String url, String method) {
CronetEngine cronetEngine = createCronetEngine(url);
UrlRequest.Builder requestBuilder =
cronetEngine.newUrlRequestBuilder(url, callback, executor).setHttpMethod(method);
UrlRequest request = requestBuilder.build();
return request;
}
public void sendRequest(String url, String method) {
Log.i(TAG, "callURL: url is " + url + "and method is " + method);
UrlRequest urlRequest = buildRequest(url, method);
if (null != urlRequest) {
urlRequest.start();
}
}
private String getHost(String url) {
String host = null;
try {
java.net.URL url1 = new java.net.URL(url);
host = url1.getHost();
} catch (MalformedURLException e) {
Log.e(TAG, "getHost: ", e);
}
return host;
}
public void setCallback(UrlRequest.Callback mCallback) {
callback = mCallback;
}
}
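The provider above expects a UrlRequest.Callback to be set before sending a request, but no callback implementation is shown. Below is a minimal sketch following the standard Cronet callback contract (the class name and buffer size are illustrative), together with example usage of HQUICServiceProvider.
Code:
public class SimpleUrlRequestCallback extends UrlRequest.Callback {
    private static final String TAG = "SimpleUrlRequestCallback";

    @Override
    public void onRedirectReceived(UrlRequest request, UrlResponseInfo info, String newLocationUrl) {
        request.followRedirect();
    }

    @Override
    public void onResponseStarted(UrlRequest request, UrlResponseInfo info) {
        // Start reading the response body into a direct buffer.
        request.read(ByteBuffer.allocateDirect(16 * 1024));
    }

    @Override
    public void onReadCompleted(UrlRequest request, UrlResponseInfo info, ByteBuffer byteBuffer) {
        // Process the buffer contents if needed, then continue reading.
        byteBuffer.clear();
        request.read(byteBuffer);
    }

    @Override
    public void onSucceeded(UrlRequest request, UrlResponseInfo info) {
        Log.i(TAG, "Request succeeded, HTTP status: " + info.getHttpStatusCode());
    }

    @Override
    public void onFailed(UrlRequest request, UrlResponseInfo info, CronetException error) {
        Log.e(TAG, "Request failed", error);
    }
}

// Example usage:
HQUICServiceProvider provider = new HQUICServiceProvider(context);
provider.setCallback(new SimpleUrlRequestCallback());
provider.sendRequest("https://example.com/", "GET");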
For more details, you can visit https://forums.developer.huawei.com/forumPortal/en/topic/0203400466275620106
Introduction
Huawei provides the Remote Configuration service to manage parameters online. With this service, you can control or change the behavior and appearance of your app without requiring user interaction or an app update. By integrating the SDK, you can fetch the online parameter values delivered on the AppGallery Connect console to change the app's behavior and appearance.
Functional features
1. Parameter management: Enables you to add, delete, and update parameters and to set conditional values.
2. Condition management: Enables you to add, delete, and modify conditions, and to copy and modify existing conditions. Currently, you can set the following conditions: app version, country/region, audience, user attribute, user percentage, time, and language. More conditions can be expected in the future.
3. Version management: Supports managing and rolling back up to 300 historical versions of parameters and conditions within 90 days.
4. Permission management: By default, the account holder, app administrator, R&D personnel, and administrator and operations personnel can access Remote Configuration.
Service use cases
Change app language by Country/Region
Show Different Content to Different Users
Change the App Theme by Time
Development Overview
You need to install the Unity software, and I assume that you have prior knowledge of Unity and C#.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Unity software installed.
Visual Studio/Code installed.
HMS Core (APK) 4.X or later.
Integration Preparations
1. Create a project in AppGallery Connect.
2. Create Unity project.
3. Add Huawei HMS AGC services to the project.
4. Download and save the configuration file.
Add the agconnect-services.json file to the following directory: Assets > Plugins > Android.
5. Add the following plugin and dependencies in LauncherTemplate.
Code:
apply plugin:'com.huawei.agconnect'
Code:
implementation 'com.huawei.agconnect:agconnect-remoteconfig:1.4.1.300'
implementation 'com.huawei.agconnect:agconnect-core:1.4.2.301'
6. Add the following dependencies in MainTemplate.
Code:
apply plugin: 'com.huawei.agconnect'
Code:
implementation 'com.huawei.agconnect:agconnect-remoteconfig:1.4.1.300'
implementation 'com.huawei.agconnect:agconnect-core:1.4.2.301'
7. Add dependencies in build script repositories and all project repositories & class path in BaseProjectTemplate.
Code:
maven { url 'https://developer.huawei.com/repo/' }
8. Configure the project in AppGallery Connect.
9. Create an empty game object and rename it to RemoteConfigManager. Create UI canvas texts and a button, and assign onClick events to the respective text and button as shown below.
RemoteConfigManager.cs
C#:
using UnityEngine;
using HuaweiService.RemoteConfig;
using HuaweiService;
using Exception = HuaweiService.Exception;
using System;
public class RemoteConfigManager : MonoBehaviour
{
public static bool developerMode;
public delegate void SuccessCallBack<T>(T o);
public delegate void SuccessCallBack(AndroidJavaObject o);
public delegate void FailureCallBack(Exception e);
public void SetDeveloperMode()
{
AGConnectConfig config;
config = AGConnectConfig.getInstance();
developerMode = !developerMode;
config.setDeveloperMode(developerMode);
Debug.Log($"set developer mode to {developerMode}");
}
public void showAllValues()
{
AGConnectConfig config = AGConnectConfig.getInstance();
if(config!=null)
{
Map map = config.getMergedAll();
var keySet = map.keySet();
var keyArray = keySet.toArray();
foreach (var key in keyArray)
{
Debug.Log($"{key}: {map.getOrDefault(key, "default")}");
}
}else
{
Debug.Log(" No data ");
}
config.clearAll();
}
void Start()
{
SetDeveloperMode();
SetXmlValue();
}
public void SetXmlValue()
{
var config = AGConnectConfig.getInstance();
// get res id
int configId = AndroidUtil.GetId(new Context(), "xml", "remote_config");
config.applyDefault(configId);
// get variable
Map map = config.getMergedAll();
var keySet = map.keySet();
var keyArray = keySet.toArray();
config.applyDefault(map);
foreach (var key in keyArray)
{
var value = config.getSource(key);
//Use the key and value ...
Debug.Log($"{key}: {config.getSource(key)}");
}
}
public void GetCloudSettings()
{
AGConnectConfig config = AGConnectConfig.getInstance();
config.fetch().addOnSuccessListener(new HmsSuccessListener<ConfigValues>((ConfigValues configValues) =>
{
config.apply(configValues);
Debug.Log("===== ** Success ** ====");
showAllValues();
config.clearAll();
}))
.addOnFailureListener(new HmsFailureListener((Exception e) =>
{
Debug.Log("activity failure " + e.toString());
}));
}
public class HmsFailureListener:OnFailureListener
{
public FailureCallBack CallBack;
public HmsFailureListener(FailureCallBack c)
{
CallBack = c;
}
public override void onFailure(Exception arg0)
{
if(CallBack !=null)
{
CallBack.Invoke(arg0);
}
}
}
public class HmsSuccessListener<T>:OnSuccessListener
{
public SuccessCallBack<T> CallBack;
public HmsSuccessListener(SuccessCallBack<T> c)
{
CallBack = c;
}
public void onSuccess(T arg0)
{
if(CallBack != null)
{
CallBack.Invoke(arg0);
}
}
public override void onSuccess(AndroidJavaObject arg0)
{
if(CallBack !=null)
{
Type type = typeof(T);
IHmsBase ret = (IHmsBase)Activator.CreateInstance(type);
ret.obj = arg0;
CallBack.Invoke((T)ret);
}
}
}
}
10. To build the APK, choose File > Build Settings > Build. To build and run, choose File > Build Settings > Build And Run.
Result
Tips and Tricks
Add the agconnect-services.json file without fail.
Make sure the dependencies are added in the build files.
Make sure that you release the configuration after parameters are added or updated.
Conclusion
We have learnt how to integrate the Huawei Remote Configuration service into Unity game development. The Remote Configuration service lets you fetch configuration data from a local XML file or online from the AppGallery Connect console, and changes take effect as soon as you release them. In short, the service lets you change your app's behavior and appearance without an app update or user interaction.
Thank you so much for reading this article; I hope it helps you.
Reference
Unity Manual
GitHub Sample Android
Huawei Remote Configuration service
Read more on the HUAWEI Developer Forum.
Xamarin (Microsoft) is a multi-system development platform for mobile services that many developers use. Many AppGallery Connect services now support Xamarin, including Remote Configuration.
Remote Config allows you to make changes to your app remotely by making use of variables that you can define in the AppGallery Console. Different values can be set for different audiences, locations or sections of users. This is a great way to personalise your application, carry out testing of new features or just make updates without having to deploy a new app version.
Install the Xamarin environment and set up the project. You'll need to first download and install Visual Studio 2019.
Open Visual Studio and select Mobile development with .NET to install the Xamarin environment.
Next, make sure you have enabled Remote Configuration in AppGallery Connect.
Open Visual Studio, click Create a new project in the start window, select Mobile App (Xamarin.Forms), and set the app name and other required information.
Right-click your project and choose Manage NuGet Packages.
Search for the Huawei.Agconnect.RemoteConfiguration package on the displayed page and install it.
Download the JSON service file from your AppGallery project and add it into the Assets directory in your project.
Create a new class named HmsLazyInputStreams.cs, and implement the following code to read the JSON file.
Code:
using System;
using System.IO;
using Android.Content;
using Android.Util;
using Huawei.Agconnect.Config;
namespace AppLinking1
{
public class HmsLazyInputStream : LazyInputStream
{
public HmsLazyInputStream(Context context)
: base(context)
{
}
public override Stream Get(Context context)
{
try
{
return context.Assets.Open("agconnect-services.json");
}
catch (Exception e)
{
Log.Error("Hms", $"Failed to get input stream" + e.Message);
return null;
}
}
}
}
Create a ContentProvider class and add the following code so that the file is read once your app is launched. Ensure that the package name set in the ContentProvider, the package name of your project, and the one set in AppGallery Connect are all the same as the app package name.
Code:
using System;
using Android.Content;
using Android.Database;
using Huawei.Agconnect.Config;
namespace XamarinHmsRemoteConfig
{
[ContentProvider(new string[] { "com.huawei.cordova.remoteconfig.XamarinCustomProvider" }, InitOrder = 99)]
public class XamarinCustomProvider : ContentProvider
{
public override int Delete(Android.Net.Uri uri, string selection, string[] selectionArgs)
{
throw new NotImplementedException();
}
public override string GetType(Android.Net.Uri uri)
{
throw new NotImplementedException();
}
public override Android.Net.Uri Insert(Android.Net.Uri uri, ContentValues values)
{
throw new NotImplementedException();
}
public override bool OnCreate()
{
AGConnectServicesConfig config = AGConnectServicesConfig.FromContext(Context);
config.OverlayWith(new HmsLazyInputStream(Context));
return false;
}
public override ICursor Query(Android.Net.Uri uri, string[] projection, string selection, string[] selectionArgs, string sortOrder)
{
throw new NotImplementedException();
}
public override int Update(Android.Net.Uri uri, ContentValues values, string selection, string[] selectionArgs)
{
throw new NotImplementedException();
}
}
}
Right-click your project and choose Properties. Click Android Manifest on the displayed page and set a package name.
Set in-app default parameter values
You can configure a default parameter value in either of the following ways:
Using an XML resource file. Add a default parameter value XML file to the res/xml directory of your project. Call the ApplyDefault method to read the file.
Or you can use a Map object. Set a default parameter value dynamically through coding.
Code:
IDictionary<string, Java.Lang.Object> ConfigVariables = new Dictionary<string, Java.Lang.Object>();
ConfigVariables.Add("value1", "Default");
AGConnectConfig.Instance.ApplyDefault(ConfigVariables);
Fetch the updated parameter value
Call the fetch API to fetch the updated parameter value from Remote Configuration. The default update interval used by the fetch API is 12 hours and can be set as required.
Code:
AGConnectConfig.Instance.Fetch(fetchInterval).AddOnSuccessListener(new TaskListener(this)).AddOnFailureListener(new TaskListener(this));
Obtain all parameter values
For Xamarin.Android apps, the MergedAll API is called to obtain all parameter values, including the in-app default parameter values and the on-cloud parameter values. For native Android apps, the getMergedAll API is used.
And that’s all for integrating Remote Configuration into Xamarin.Android apps. As more and more AppGallery Connect services support Xamarin, I will keep you posted about further integration tutorials.
I don't know if it's the same for you, but I always get frustrated when sorting through my phone's album. It seems to take forever before I can find the image that I want to use. As a coder, I can't help but wonder if there's a solution for this. Is there a way to organize an entire album? Well, let's take a look at how to develop an image classifier using a service called image classification.
Development Preparations
1. Configure the Maven repository address for the SDK to be used.
Java:
repositories {
    maven { url 'https://developer.huawei.com/repo/' }
}
2. Integrate the image classification SDK.
Java:
dependencies {
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-classification:3.3.0.300'
    // Import the image classification model package.
    implementation 'com.huawei.hms:ml-computer-vision-image-classification-model:3.3.0.300'
}
Project Configuration
1. Set the authentication information for the app.
This information can be set through an API key or access token.
Use the setAccessToken method to set an access token during app initialization. This needs to be set only once.
Java:
MLApplication.getInstance().setAccessToken("your access token");
Or, use setApiKey to set an API key during app initialization. This needs to be set only once.
Java:
MLApplication.getInstance().setApiKey("your ApiKey");
2. Create an image classification analyzer in on-device static image detection mode.
Java:
// Method 1: Use customized parameter settings for device-based recognition.
MLLocalClassificationAnalyzerSetting setting =
new MLLocalClassificationAnalyzerSetting.Factory()
.setMinAcceptablePossibility(0.8f)
.create();
MLImageClassificationAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer(setting);
// Method 2: Use default parameter settings for on-device recognition.
MLImageClassificationAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer();
3. Create an MLFrame object.
Java:
// Create an MLFrame object using the bitmap which is the image data in bitmap format. JPG, JPEG, PNG, and BMP images are supported. It is recommended that the image dimensions be greater than or equal to 112 x 112 px.
MLFrame frame = MLFrame.fromBitmap(bitmap);
4. Call asyncAnalyseFrame to classify images.
Java:
Task<List<MLImageClassification>> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<List<MLImageClassification>>() {
@Override
public void onSuccess(List<MLImageClassification> classifications) {
// Recognition success.
// Callback when the MLImageClassification list is returned, to obtain information like image categories.
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Recognition failure.
try {
MLException mlException = (MLException)e;
// Obtain the result code. You can process the result code and customize relevant messages displayed to users.
int errorCode = mlException.getErrCode();
// Obtain the error message. You can quickly locate the fault based on the result code.
String errorMessage = mlException.getMessage();
} catch (Exception error) {
// Handle the conversion error.
}
}
});
5. Stop the analyzer after recognition is complete.
Java:
try {
if (analyzer != null) {
analyzer.stop();
}
} catch (IOException e) {
// Exception handling.
}
Demo
Remarks
The image classification capability supports the on-device static image detection mode, the on-cloud static image detection mode, and the camera stream detection mode. The demo here illustrates only the first mode.
I came up with several application scenarios for image classification, for example:
Education apps: image classification enables users to categorize images taken over a period into different albums.
Travel apps: image classification allows such apps to classify images according to where they were taken or by the objects in them.
File sharing apps: image classification allows users to upload and share images by image category.
References
>> Image classification Development Guide
>> Reddit to join developer discussions
>> GitHub to download the sample code
>> Stack Overflow to solve integration problems