Flutter check HMS/GMS Availability check - Huawei Developers

For more information like this, you can visit the HUAWEI Developer Forum.
This guide describes how to write custom platform-specific code. Some platform-specific functionality is available through existing packages.
Flutter uses a flexible system that allows you to call platform-specific APIs whether available in Kotlin or Java code on Android, or in Swift or Objective-C code on iOS.
Flutter’s platform-specific API support does not rely on code generation, but rather on a flexible message passing style:
The Flutter portion of the app sends messages to its host, the iOS or Android portion of the app, over a platform channel.
The host listens on the platform channel, and receives the message. It then calls into any number of platform-specific APIs—using the native programming language—and sends a response back to the client, the Flutter portion of the app.
Architectural overview: platform channels
Messages are passed between the client (UI) and host (platform) using platform channels.
Messages and responses are passed asynchronously, to ensure the user interface remains responsive.
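To make the flow concrete, here is a minimal sketch of the client side of such an exchange. The channel name, method name, and helper function are placeholders for illustration, not part of this project:
Code:
import 'package:flutter/services.dart';

// Hypothetical channel and method names, for illustration only.
const MethodChannel _channel = MethodChannel('com.example.demo/info');

Future<String> fetchPlatformInfo() async {
  try {
    // invokeMethod sends the message to the host and awaits its reply.
    final String? result = await _channel.invokeMethod<String>('getInfo');
    return result ?? 'No result';
  } on PlatformException catch (e) {
    // The host raised an error instead of returning a result.
    return 'Failed: ${e.message}';
  } on MissingPluginException {
    // No handler is registered for this channel on the host side.
    return 'Not implemented on this platform';
  }
}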
Step 1: Create a new app project
Start by creating a new app:
In a terminal run: flutter create flutterhmsgmscheck
Step 2: Create the Flutter platform client
The app's State class holds the current app state. Extend it to hold the HMS/GMS availability state.
First, construct the channel. Use a MethodChannel with platform methods that return whether HMS and GMS are available.
The client and host sides of a channel are connected through a channel name passed in the channel constructor. All channel names used in a single app must be unique; prefix the channel name with a unique ‘domain prefix’, for example: com.salman.flutter.hmsgmscheck/isHmsGmsAvailable.
Code:
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
class HmsGmsCheck extends StatelessWidget {
HmsGmsCheck();
@override
Widget build(BuildContext context) {
return HmsGmsCheckStateful(
title: "HMS/GMS Check",
);
}
}
class HmsGmsCheckStateful extends StatefulWidget {
HmsGmsCheckStateful({Key key, this.title}) : super(key: key);
final String title;
@override
_HmsGmsCheckState createState() => _HmsGmsCheckState();
}
class _HmsGmsCheckState extends State<HmsGmsCheckStateful> {
static const MethodChannel methodChannel =
MethodChannel('com.salman.flutter.hmsgmscheck/isHmsGmsAvailable');
bool _isHmsAvailable;
bool _isGmsAvailable;
@override
void initState() {
super.initState();
checkHmsGms();
}
void checkHmsGms() async {
await _isHMS();
await _isGMS();
}
Future<void> _isHMS() async {
bool status;
try {
bool result = await methodChannel.invokeMethod('isHmsAvailable');
status = result;
print('status : ${status.toString()}');
} on PlatformException {
status = false;
print('Failed to get _isHmsAvailable.');
}
setState(() {
_isHmsAvailable = status;
});
}
Future<void> _isGMS() async {
bool status;
try {
bool result = await methodChannel.invokeMethod('isGmsAvailable');
status = result;
print('status : ${status.toString()}');
} on PlatformException {
status = false;
print('Failed to get _isGmsAvailable.');
}
setState(() {
_isGmsAvailable = status;
});
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Column(
children: <Widget>[
new Container(
padding: EdgeInsets.all(20),
child: new Column(
children: <Widget>[
Text(
"HMS Available: $_isHmsAvailable",
style: Theme.of(context).textTheme.headline6,
),
Text(
"GMS Available: $_isGmsAvailable",
style: Theme.of(context).textTheme.headline6,
)
],
),
)
],
));
}
}
Step 3: Update your Gradle files
Open your Gradle files in Android Studio and add the Huawei Maven repository:
Project-level build.gradle
Code:
buildscript {
ext.kotlin_version = '1.3.50'
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
allprojects {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
App-level build.gradle
Code:
dependencies {
implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version"
implementation "com.huawei.hms:hwid:4.0.0.300"
implementation "com.google.android.gms:play-services-base:17.3.0"
}
Step 4: Add an Android platform-specific implementation
Start by opening the Android host portion of your Flutter app in Android Studio:
Start Android Studio
Select the menu item File > Open…
Navigate to the directory holding your Flutter app, and select the android folder inside it. Click OK.
Open the file MainActivity.kt located in the kotlin folder in the Project view. (Note: If editing with Android Studio 2.3, note that the kotlin folder is shown as if named java.)
Inside the configureFlutterEngine() method, create a MethodChannel and call setMethodCallHandler(). Make sure to use the same channel name as was used on the Flutter client side.
Code:
import android.content.Context
import android.util.Log
import androidx.annotation.NonNull
import com.google.android.gms.common.GoogleApiAvailability
import com.huawei.hms.api.ConnectionResult
import com.huawei.hms.api.HuaweiApiAvailability
import io.flutter.embedding.android.FlutterActivity
import io.flutter.embedding.engine.FlutterEngine
import io.flutter.plugin.common.MethodChannel

class MainActivity: FlutterActivity() {
private val CHANNEL = "com.salman.flutter.hmsgmscheck/isHmsGmsAvailable"
// Resolved lazily so the context is only read after the activity is attached.
private val concurrentContext: Context by lazy { applicationContext }
override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
super.configureFlutterEngine(flutterEngine)
MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL).setMethodCallHandler {
call, result ->
// Note: this method is invoked on the main thread.
if (call.method.equals("isHmsAvailable")) {
result.success(isHmsAvailable());
} else if (call.method.equals("isGmsAvailable")) {
result.success(isGmsAvailable());
} else {
result.notImplemented()
}
}
}
private fun isHmsAvailable(): Boolean {
val result = HuaweiApiAvailability.getInstance().isHuaweiMobileServicesAvailable(concurrentContext)
val isAvailable = ConnectionResult.SUCCESS == result
Log.i("MainActivity", "isHmsAvailable: $isAvailable")
return isAvailable
}
private fun isGmsAvailable(): Boolean {
val result: Int = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(concurrentContext)
val isAvailable = com.google.android.gms.common.ConnectionResult.SUCCESS == result
Log.i("MainActivity", "isGmsAvailable: $isAvailable")
return isAvailable
}
}
After completing all the steps above, compile and run your project; you will get the following output.
Conclusion:
With the help of this article, we are able to access platform-specific native code from our Flutter application. For further details, you can check the official Flutter platform channels guide.

Related

React Native HMS ML Kit | Installation and Example

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
This article covers how to integrate React Native HMS ML Kit into a React Native application.
React Native HMS ML Kit supports the services listed below:
· Text Related Services
· Language Related Services
· Image Related Services
· Face/Body Related Services
There are a number of use cases for these services; you can combine them or use them individually to create different functionalities in your app. For a basic understanding, please read the use cases here.
Github: https://github.com/HMS-Core/hms-react-native-plugin/tree/master/react-native-hms-ml
Prerequisites
Step 1
Prepare your development environment using this guide.
After reading this guide you should have a React Native development environment set up, HMS Core (APK) installed, and the Android SDK installed.
Step 2
Configure your app information in AppGallery by following this guide.
After reading this guide you should have a Huawei developer account, an AppGallery app, a keystore file, and the ML Kit service enabled in AppGallery.
Integrating React Native Hms ML Kit
Warning: Please make sure that the prerequisites part is successfully completed.
Step 1
Code:
npm i @hmscore/react-native-hms-ml
Step 2
Open the build.gradle file in the project-dir > android folder.
Go to buildscript > repositories and allprojects > repositories, and configure the Maven repository address.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > dependencies and add dependency configurations.
Code:
buildscript {
dependencies {
classpath 'com.huawei.agconnect:agcp:1.2.1.301'
}
}
Step 3
Open the build.gradle file which is located under the project-dir > android > app directory.
Add the AppGallery Connect plug-in dependency to the file header.
Code:
apply plugin: 'com.huawei.agconnect'
The apply plugin: ‘com.huawei.agconnect’ configuration must be added after the apply plugin: ‘com.android.application’ configuration.
The minimum Android API level (minSdkVersion) required for ML Kit is 19.
Configure build dependencies of your project.
Code:
dependencies {
...
implementation 'com.huawei.agconnect:agconnect-core:1.0.0.301'
}
Now you can use React Native HMS ML and import modules as in the code below.
Code:
import {<module_name>} from '@hmscore/react-native-hms-ml';
Let's Create an Application
We have already created an application in the prerequisites section.
Our app will be about recognizing text in images and converting it to speech. So, we will use the HmsTextRecognitionLocal, HmsFrame and HmsTextToSpeech modules. We will select images by using react-native-image-picker, so don't forget to install it.
Note: Before running this code snippet, please check your app permissions.
Step 1
Before using react-native-image-picker, we need to add some settings to AndroidManifest.xml.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<application
...
android:requestLegacyExternalStorage="true">
Step 2
Code:
import React from 'react';
import { Text, View, ScrollView, TextInput, TouchableOpacity, StyleSheet } from 'react-native';
import { HmsTextRecognitionLocal, HmsFrame, HmsTextToSpeech, NativeEventEmitter } from '@hmscore/react-native-hms-ml';
import ImagePicker from 'react-native-image-picker';
const options = {
title: 'Choose Method',
storageOptions: {
skipBackup: true,
path: 'images',
},
};
const styles = StyleSheet.create({
bg: { backgroundColor: '#eee' },
customEditBox: {
height: 450,
borderColor: 'gray',
borderWidth: 2,
width: "95%",
alignSelf: "center",
marginTop: 10,
backgroundColor: "#fff",
color: "#000"
},
buttonTts: {
width: '95%',
height: 70,
alignSelf: "center",
marginTop: 35,
},
startButton: {
paddingTop: 10,
paddingBottom: 10,
backgroundColor: 'white',
borderRadius: 10,
borderWidth: 1,
borderColor: '#888',
backgroundColor: '#42aaf5',
},
startButtonLabel: {
fontWeight: 'bold',
color: '#fff',
textAlign: 'center',
paddingLeft: 10,
paddingRight: 10,
},
});
export default class App extends React.Component {
// create your states
// for keeping imageUri and recognition result
constructor(props) {
super(props);
this.state = {
imageUri: '',
result: '',
};
}
// this is a key function in Ml Kit
// It sets the frame for you and keeps it until you set a new one
async setMLFrame() {
try {
var result = await HmsFrame.fromBitmap(this.state.imageUri);
console.log(result);
} catch (e) {
console.error(e);
}
}
// this creates text recognition settings by default options given below
// languageCode : default is "rm"
// OCRMode : default is OCR_DETECT_MODE
async createTextSettings() {
try {
var result = await HmsTextRecognitionLocal.create({});
console.log(result);
} catch (e) {
console.error(e);
}
}
// this function calls analyze function and sets the results to state
// The parameter false means we don't want a block result
// If you want to see results as blocks you can set it to true
async analyze() {
try {
var result = await HmsTextRecognitionLocal.analyze(false);
this.setState({ result: result });
} catch (e) {
console.error(e);
}
}
// this function calls close function to stop recognizer
async close() {
try {
var result = await HmsTextRecognitionLocal.close();
console.log(result);
} catch (e) {
console.error(e);
}
}
// standard image picker operation
// sets imageUri to state
// calls startAnalyze function
showImagePicker() {
ImagePicker.showImagePicker(options, (response) => {
if (response.didCancel) {
console.log('User cancelled image picker');
} else if (response.error) {
console.log('ImagePicker Error: ', response.error);
} else {
this.setState({
imageUri: response.uri,
});
this.startAnalyze();
}
});
}
// configure tts engine by giving custom parameters
async configuration() {
try {
var result = await HmsTextToSpeech.configure({
"volume": 1.0,
"speed": 1.0,
"language": HmsTextToSpeech.TTS_EN_US,
"person": HmsTextToSpeech.TTS_SPEAKER_FEMALE_EN
});
console.log(result);
} catch (e) {
console.error(e);
}
}
// create Tts engine by call
async engineCreation() {
try {
var result = await HmsTextToSpeech.createEngine();
console.log(result);
} catch (e) {
console.error(e);
}
}
// set Tts callback
async callback() {
try {
var result = await HmsTextToSpeech.setTtsCallback();
console.log(result);
} catch (e) {
console.error(e);
}
}
// start speech
async speak(word) {
try {
var result = await HmsTextToSpeech.speak(word, HmsTextToSpeech.QUEUE_FLUSH);
console.log(result);
} catch (e) {
console.error(e);
}
}
// stop engine
async stop() {
try {
var result = await HmsTextToSpeech.stop();
console.log(result);
} catch (e) {
console.error(e);
}
}
// manage functions in order
startAnalyze() {
this.setState({
result: 'processing...',
}, () => {
this.createTextSettings()
.then(() => this.setMLFrame())
.then(() => this.analyze())
.then(() => this.close())
.then(() => this.configuration())
.then(() => this.engineCreation())
.then(() => this.callback())
.then(() => this.speak(this.state.result));
});
}
render() {
return (
<ScrollView style={styles.bg}>
<TextInput
style={styles.customEditBox}
value={this.state.result}
placeholder="Text Recognition Result"
multiline={true}
editable={false}
/>
<View style={styles.buttonTts}>
<TouchableOpacity
style={styles.startButton}
onPress={this.showImagePicker.bind(this)}
underlayColor="#fff">
<Text style={styles.startButtonLabel}> Start Analyze </Text>
</TouchableOpacity>
</View>
</ScrollView>
);
}
}
Test the App
· First, write “Hello World” on a blank sheet of paper.
· Then run the application.
· Press the “Start Analyze” button and take a photo of your paper.
· Wait for the result.
· Here it comes: you will see “Hello World” on the screen and hear it from your phone.
Conclusion
In this article, we integrated React Native HMS ML Kit into our application and used it.
From: https://medium.com/huawei-developers/react-native-hms-ml-kit-installation-and-example-242dc83e0941
Is there any advantage in Huawei ML Kit compared to others?

Intermediate: How to extract the data from Image using Huawei HiAI Text Recognition service in Android

Introduction
In this article, we will learn how to integrate the Huawei HiAI Text Recognition service into an Android application. This service helps us extract text from screenshots and photos.
Nowadays nobody wants to type out content, and there are many reasons to integrate this service into our apps. The user can capture an image or pick one from the gallery to retrieve its text, so that the content can be edited easily.
Use case: Using this HiAI kit, the user can extract content from an otherwise unreadable image and make it useful. Let's start.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Any IDE with Android SDK installed (IntelliJ, Android Studio).
3. HiAI SDK.
4. Minimum API Level 23 is required.
5. Requires devices running EMUI 9.0.0 or later.
6. Requires a Kirin 990/985/980/970/825 Full/820 Full/810 Full/720 Full/710 Full processor.
How to integrate HMS Dependencies
1. First of all, we need to create an app on AppGallery Connect and add the related details about HMS Core to our project. For more information, check this link.
2. Download the agconnect-services.json file from AGC and add it into the app's root directory.
3. Add the required dependencies to the build.gradle file under the root folder.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
4. Add the app-level dependencies to the build.gradle file under the app folder.
Code:
apply plugin: 'com.huawei.agconnect'
5. Add the required permissions to the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.hardware.camera"/>
<uses-permission android:name="android.permission.HARDWARE_TEST.camera.autofocus"/>
6. Now, sync your project.
How to apply for HiAI Engine Library
1. Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
2. Click Apply for HUAWEI HiAI kit.
3. Enter required information like product name and Package name, click Next button.
4. Verify the application details and click Submit button.
5. Click the Download SDK button to open the SDK list.
6. Unzip the downloaded SDK and add it into your Android project under the libs folder.
7. Add the JAR/AAR file dependencies to the app build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
8. After completing all the above setup, sync your Gradle file.
Let’s do code
I have created a project on Android Studio with an empty activity; let's start coding.
In MainActivity.java we can create the business logic.
Java:
public class MainActivity extends AppCompatActivity {
private boolean isConnection = false;
private int REQUEST_CODE = 101;
private int REQUEST_PHOTO = 100;
private Bitmap bitmap;
private Bitmap resultBitmap;
private Button btnImage;
private ImageView originalImage;
private ImageView conversionImage;
private TextView textView;
private TextView contentText;
private final String[] permission = {
Manifest.permission.CAMERA,
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE};
private ImageSuperResolution resolution;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
requestPermissions(permission, REQUEST_CODE);
initHiAI();
originalImage = findViewById(R.id.super_origin);
conversionImage = findViewById(R.id.super_image);
textView = findViewById(R.id.text);
contentText = findViewById(R.id.content_text);
btnImage = findViewById(R.id.btn_album);
btnImage.setOnClickListener(v -> {
selectImage();
});
}
private void initHiAI() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
isConnection = true;
DeviceCompatibility();
}
@Override
public void onServiceDisconnect() {
}
});
}
private void DeviceCompatibility() {
resolution = new ImageSuperResolution(this);
int support = resolution.getAvailability();
if (support == 0) {
Toast.makeText(this, "Device supports HiAI Image super resolution service", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(this, "Device doesn't supports HiAI Image super resolution service", Toast.LENGTH_SHORT).show();
}
}
public void selectImage() {
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQUEST_PHOTO);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK) {
if (data != null && requestCode == REQUEST_PHOTO) {
try {
bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), data.getData());
setBitmap();
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
private void setBitmap() {
int height = bitmap.getHeight();
int width = bitmap.getWidth();
if (width <= 1440 && height <= 15210) {
originalImage.setImageBitmap(bitmap);
setTextHiAI();
} else {
Toast.makeText(this, "Image size should be below 1440*15210 pixels", Toast.LENGTH_SHORT).show();
}
}
private void setTextHiAI() {
textView.setText("Extraction Text");
contentText.setVisibility(View.VISIBLE);
TextDetector detector = new TextDetector(this);
VisionImage image = VisionImage.fromBitmap(bitmap);
TextConfiguration config = new TextConfiguration();
config.setEngineType(TextConfiguration.AUTO);
config.setEngineType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT_EF);
detector.setTextConfiguration(config);
Text result = new Text();
int statusCode = detector.detect(image, result, null);
if (statusCode != 0) {
Log.e("TAG", "Failed to start engine, try restart app,");
}
if (result.getValue() != null) {
contentText.setText(result.getValue());
Log.d("TAG", result.getValue());
} else {
Log.e("TAG", "Result test value is null!");
}
}
}
Demo
Tips and Tricks
1. Download the latest Huawei HiAI SDK.
2. Set the minSDK version to 23 or later.
3. Do not forget to add the JAR files to the Gradle file.
4. Screenshot size should be within 1440 x 15210 pixels.
5. The recommended photo size is 720p.
6. Refer to this URL for the list of supported countries/regions.
Conclusion
In this article, we have learned how to implement the HiAI Text Recognition service in an Android application to extract content from screenshots and photos.
Thanks for reading! If you enjoyed this story, please click the Like button and Follow. Feel free to leave a Comment below.
Reference
Huawei HiAI Kit URL
Original Source

How to achieve Place Search and Marker Clustering Implementation in Map App

Background
Lots of apps these days include an in-app map feature and the ability to mark places of interest on the map. HMS Core Map Kit enables you to implement such capabilities for your apps. With Map Kit, you can first draw a map and then add markers to the map, as well as configure the map to cluster markers depending on the level of zoom. This article will show you how to implement searching for nearby places using the keyword search capability of Site Kit and display the results on the map.
Application scenarios:
1. A travel app that allows users to search for scenic spots and shows the results on a map.
2. A bicycle sharing app that can show users nearby bicycles on a map.
Key functions used in the project:
Location service: Use Location Kit to obtain the current longitude-latitude coordinates of a device.
Keyword search: Use Site Kit to search for places such as scenic spots, businesses, and schools in the specified geographical area based on the specified keyword.
Map display: Use Map Kit to draw a map.
Marker clustering: Use Map Kit to add markers to the map and configure the map to cluster markers depending on the zoom level; see the sketch just after this list for the underlying idea.
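Before the setup steps, it may help to see what zoom-dependent clustering means in isolation. The following is a minimal, self-contained Dart sketch of the underlying idea (grid cells that shrink as the zoom level grows); it is only an illustration of the concept, not the Map Kit API, which the Java code later in this article uses via ClusterManager:
Code:
// Groups points into grid cells whose size shrinks as zoom increases,
// which is the basic idea behind zoom-dependent marker clustering.
Map<String, List<List<double>>> clusterByGrid(
    List<List<double>> points, int zoom) {
  // Cell size halves with each zoom level; the constant 360 is illustrative.
  final double cellSize = 360 / (1 << zoom);
  final clusters = <String, List<List<double>>>{};
  for (final p in points) {
    // Points that floor to the same cell key belong to the same cluster.
    final key = '${(p[0] / cellSize).floor()}:${(p[1] / cellSize).floor()}';
    clusters.putIfAbsent(key, () => []).add(p);
  }
  return clusters;
}

void main() {
  final points = [
    [41.01, 28.97], [41.02, 28.98], [40.50, 29.50],
  ];
  // At low zoom the two nearby points share a cell: 2 clusters.
  print(clusterByGrid(points, 8).length);
  // At high zoom they separate into distinct cells: 3 clusters.
  print(clusterByGrid(points, 14).length);
}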
Integration Preparations
1. Register as a developer and create a project in AppGallery Connect.
(1) Register as a developer at the registration URL.
(2) Create an app, add the SHA-256 signing certificate fingerprint, enable Map Kit and Site Kit, and download the agconnect-services.json file of the app.
2. Integrate the Map SDK and Site SDK.
(1) Copy the agconnect-services.json file to the app's directory of your project.
Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.
Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.
If the agconnect-services.json file has been added to the app, go to buildscript > dependencies and add the AppGallery Connect plugin configuration.
Code:
buildscript {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.3.2'
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
google()
jcenter()
}
}
(2) Add build dependencies in the dependencies block.
Code:
dependencies {
implementation 'com.huawei.hms:maps:{version}'
implementation 'com.huawei.hms:site:{version}'
implementation 'com.huawei.hms:location:{version}'
}
(3) Add the following configuration to the file header:
Code:
apply plugin: 'com.huawei.agconnect'
(4) Copy the signing certificate generated in Generating a Signing Certificate to the app directory of your project, and configure the signing certificate in the android block of the build.gradle file.
Code:
signingConfigs {
release {
// Signing certificate.
storeFile file("**.**")
// KeyStore password.
storePassword "******"
// Key alias.
keyAlias "******"
// Key password.
keyPassword "******"
v2SigningEnabled true
}
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
debuggable true
}
debug {
debuggable true
}
}
Main Code and Used Functions
1. Use Location Kit to obtain the device location.
Code:
private void getMyLocation() {
fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(this);
SettingsClient settingsClient = LocationServices.getSettingsClient(this);
LocationSettingsRequest.Builder builder = new LocationSettingsRequest.Builder();
mLocationRequest = new LocationRequest();
builder.addLocationRequest(mLocationRequest);
LocationSettingsRequest locationSettingsRequest = builder.build();
//Check the device location settings.
settingsClient.checkLocationSettings(locationSettingsRequest)
.addOnSuccessListener(new OnSuccessListener<LocationSettingsResponse>() {
@Override
public void onSuccess(LocationSettingsResponse locationSettingsResponse) {
//Initiate location requests when the location settings meet the requirements.
fusedLocationProviderClient
.requestLocationUpdates(mLocationRequest, mLocationCallback, Looper.getMainLooper())
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
// Processing when the API call is successful.
Log.d(TAG, "onSuccess: " + aVoid);
}
});
}
});
}
2. Implement the text search function using Site Kit to search for nearby places.
Code:
// Create a search service instance (replace "api_key" with your own API key) and a keyword search request.
SearchService searchService = SearchServiceFactory.create(this, "api_key");
TextSearchRequest request = new TextSearchRequest();
request.setQuery("food");
SearchResultListener<TextSearchResponse> resultListener = new SearchResultListener<TextSearchResponse>() {
// Return search results upon a successful search.
@Override
public void onSearchResult(TextSearchResponse results) {
List<Site> siteList;
if (results == null || results.getTotalCount() <= 0 || (siteList = results.getSites()) == null
|| siteList.size() <= 0) {
resultTextView.setText("Result is Empty!");
return;
}
updateClusterData(siteList);// Mark places on the map.
}
// Return the result code and description upon a search exception.
@Override
public void onSearchError(SearchStatus status) {
resultTextView.setText("Error : " + status.getErrorCode() + " " + status.getErrorMessage());
}
};
// Call the place search API.
searchService.textSearch(request, resultListener);
3. Draw a map.
Code:
@Override
public void onMapReady(HuaweiMap huaweiMap) {
hMap = huaweiMap;
hMap.moveCamera(CameraUpdateFactory.newLatLngZoom(Constants.sMylatLng, 1));
hMap.setMyLocationEnabled(true);
hMap.getUiSettings().setMyLocationButtonEnabled(true);
initCluster(huaweiMap);
}
4. Cluster markers on the map.
Code:
private ClusterManager<MyItem> mClusterManager;
List<MyItem> items = new ArrayList<>();
private void initCluster(HuaweiMap hMap) {
mClusterManager = new ClusterManager<>(this, hMap);
hMap.setOnCameraIdleListener(mClusterManager);
// Add a custom InfoWindowAdapter by setting it to the MarkerManager.Collection object from
// ClusterManager rather than from HuaweiMap.setInfoWindowAdapter.
// Refer: https://github.com/billtom20/3rd-maps-utils
mClusterManager.getMarkerCollection().setInfoWindowAdapter(new HuaweiMap.InfoWindowAdapter() {
@Override
public View getInfoWindow(Marker marker) {
final LayoutInflater inflater = LayoutInflater.from(SearchClusterActivity.this);
final View view = inflater.inflate(R.layout.custom_marker_window, null);
final TextView textView = view.findViewById(R.id.textViewTitle);
String text = (marker.getTitle() != null) ? marker.getTitle() : "Cluster Item";
textView.setText(text);
return view;
}
@Override
public View getInfoContents(Marker marker) {
return null;
}
});
}
// Update clustered markers.
private void updateClusterData(List<Site> siteList) {
items = new ArrayList<>();
mClusterManager.clearItems();
for (Site s : siteList) {
Coordinate location = s.getLocation();
MyItem myItem = new MyItem(location.lat, location.lng, s.name, s.formatAddress);
items.add(myItem);
}
mClusterManager.addItems(items);
Coordinate coordinate = siteList.get(0).getLocation();
LatLng latLng = new LatLng(coordinate.lat, coordinate.lng);
mClusterManager.cluster();
hMap.animateCamera(CameraUpdateFactory.newLatLngZoom(latLng, 14));
}
Results
Enter a place or service in the Query box and tap Search. The figures below show how the search results are displayed as markers on the map.
The preceding four figures show the effect of searching for food and school, as well as the marker clustering effect at different zoom levels. Congratulations, you have now successfully integrated place search and marker clustering into your in-app map.
References
For more details, you can go to:
the official website
the Development Documentation page, to find the documents you need
Reddit, to join our developer discussion
GitHub, to download sample code
Stack Overflow, to solve any integration problems
Thanks for sharing.

Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-1

Introduction
In this article, we will learn how to make use of the Huawei Cloud Functions service as a chatbot service in a Flutter ChatBotApp. Cloud Functions enables serverless computing. It provides Function as a Service (FaaS) capabilities to simplify app development and O&M by splitting service logic into functions, and offers the Cloud Functions SDK that works with Cloud DB and Cloud Storage so that your app functions can be implemented more easily. Cloud Functions automatically scales functions in or out based on actual traffic, freeing you from server resource management and helping you reduce costs.
Key Functions
Key Concepts
How the Service Works
To use Cloud Functions, you need to develop cloud functions that can implement certain service functions in AppGallery Connect and add triggers for them, for example, HTTP triggers for HTTP requests, and Cloud DB triggers for data deletion or insertion requests after Cloud DB is integrated. After your app that integrates the Cloud Functions SDK meets conditions of specific function triggers, your app can call the cloud functions, which greatly facilitates service function building.
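As a preview of what Part-2 implements, calling an HTTP-triggered function from Flutter goes through the FunctionCallable class of the agconnect_cloudfunctions plugin, the same API used in the ChatPage code in Part-2. A minimal sketch, where the trigger identifier and the askChatbot helper are placeholders:
Code:
import 'package:agconnect_cloudfunctions/agconnect_cloudfunctions.dart';

// 'my-chatbot-$latest' is a placeholder; use the HTTP trigger identifier
// of your own function from AppGallery Connect.
Future<String> askChatbot(String message) async {
  final FunctionCallable callable = FunctionCallable('my-chatbot-\$latest');
  // The map becomes the event body that the cloud function handler parses.
  final FunctionResult result =
      await callable.call(<String, Object>{'message': message});
  // getValue() returns the raw JSON string produced by the handler.
  return result.getValue();
}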
Platform Support
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level Gradle dependencies. Choose inside project android > app > build.gradle.
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level Gradle dependencies:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
Step 3: Add the below permissions to the Android Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add the downloaded file to the parent directory of the project. Declare the plugin path in the pubspec.yaml file under dependencies.
Add the path location for the asset images.
Let's start coding
main.dart
// Imports assumed from the pubspec dependencies: huawei_account, huawei_analytics, fluttertoast.
import 'package:flutter/material.dart';
import 'package:fluttertoast/fluttertoast.dart';
import 'package:huawei_account/huawei_account.dart';
import 'package:huawei_analytics/huawei_analytics.dart';

void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'ChatBotService',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: const MyHomePage(title: 'ChatBotService'),
);
}
}
class MyHomePage extends StatefulWidget {
const MyHomePage({Key? key, required this.title}) : super(key: key);
final String title;
@override
State<MyHomePage> createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
bool isLoggedIn = false;
String str = 'Login required';
final HMSAnalytics _hmsAnalytics = new HMSAnalytics();
List<String> gridItems = ['Email Service', 'Call Center', 'FAQ', 'Chat Now'];
@override
void initState() {
_enableLog();
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Center(
child:
Column(
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
Visibility(
visible: true,
child: Card(
child: Padding(
padding: EdgeInsets.all(20),
child: Text(
str,
style: const TextStyle(color: Colors.teal, fontSize: 22),
),
),
),
),
],
),
),
floatingActionButton: FloatingActionButton(
onPressed: () {
if (!isLoggedIn) {
setState(() {
isLoggedIn = true;
signInWithHuaweiID();
});
print('$isLoggedIn');
} else {
setState(() {
isLoggedIn = false;
signOutWithID();
});
print('$isLoggedIn');
}
},
tooltip: 'Login/Logout',
child: isLoggedIn ? const Icon(Icons.logout) : const Icon(Icons.login),
), // This trailing comma makes auto-formatting nicer for build methods.
);
}
void signInWithHuaweiID() async {
try {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
Future<AuthAccount> account = AccountAuthService.signIn();
account.then(
(value) => setLoginSuccess(value),
);
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> _enableLog() async {
_hmsAnalytics.setUserId("ChatBotServiceApp");
await _hmsAnalytics.enableLog();
}
void setLoginSuccess(AuthAccount value) {
setState(() {
str = 'Welcome ' + value.displayName.toString();
});
showToast(value.displayName.toString());
print('Login Success');
}
Future<void> signOutWithID() async {
try {
final bool result = await AccountAuthService.signOut();
if (result) {
setState(() {
str = 'Login required';
showToast('You are logged out.');
});
}
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> showToast(String name) async {
Fluttertoast.showToast(
msg: "$name",
toastLength: Toast.LENGTH_SHORT,
gravity: ToastGravity.CENTER,
timeInSecForIosWeb: 1,
backgroundColor: Colors.lightBlue,
textColor: Colors.white,
fontSize: 16.0);
}
}
Result
Tricks and Tips
Make sure the agconnect-services.json file is added.
Make sure the dependencies are added to the yaml file.
Run flutter pub get after adding dependencies.
Make sure the service is enabled in AGC.
Make sure the images are defined in the yaml file.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit and Analytics Kit in the Flutter ChatBotApp. Once Account Kit is integrated, users can quickly and conveniently sign in to apps with their Huawei IDs after granting initial access permission. In Part-2 we will build the actual Cloud Functions chatbot service.
Thank you so much for reading. I hope this article helps you to understand the integration of Huawei Account Kit and Analytics Kit in the Flutter ChatBotApp.
Reference
Cloud Functions
Check it out in the forum.

Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-2

Introduction
In this article, we will learn how to use the Huawei Cloud Functions service as a chatbot service in a Flutter ChatBotApp. Cloud Functions enables serverless computing.
It provides Function as a Service (FaaS) capabilities to simplify app development and O&M by splitting service logic into functions, and offers the Cloud Functions SDK that works with Cloud DB and Cloud Storage so that your app functions can be implemented more easily. Cloud Functions automatically scales functions in or out based on actual traffic, freeing you from server resource management and helping you reduce costs.
Key Functions
Key Concepts
How the Service Works
To use Cloud Functions, you need to develop cloud functions that can implement certain service functions in AppGallery Connect and add triggers for them, for example, HTTP triggers for HTTP requests, and Cloud DB triggers for data deletion or insertion requests after Cloud DB is integrated. After your app that integrates the Cloud Functions SDK meets conditions of specific function triggers, your app can call the cloud functions, which greatly facilitates service function building.
Platform Support
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level Gradle dependencies. Choose inside project android > app > build.gradle.
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level Gradle dependencies:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
Step 3: Add the below permissions to the Android Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add the downloaded file to the parent directory of the project. Declare the plugin path in the pubspec.yaml file under dependencies.
Add the path location for the asset images.
Previous article:
Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-1
Let's start coding
main.dart
// Imports assumed from the pubspec dependencies: huawei_account, huawei_analytics, fluttertoast.
import 'package:flutter/material.dart';
import 'package:fluttertoast/fluttertoast.dart';
import 'package:huawei_account/huawei_account.dart';
import 'package:huawei_analytics/huawei_analytics.dart';

void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'ChatBotService',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: const MyHomePage(title: 'ChatBotService'),
);
}
}
class MyHomePage extends StatefulWidget {
const MyHomePage({Key? key, required this.title}) : super(key: key);
final String title;
@override
State<MyHomePage> createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
bool isLoggedIn = false;
String str = 'Login required';
final HMSAnalytics _hmsAnalytics = new HMSAnalytics();
List<String> gridItems = ['Email Service', 'Call Center', 'FAQ', 'Chat Now'];
@override
void initState() {
_enableLog();
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Center(
child:
Column(
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
Visibility(
visible: true,
child: Card(
child: Padding(
padding: EdgeInsets.all(20),
child: Text(
str,
style: const TextStyle(color: Colors.teal, fontSize: 22),
),
),
),
),
],
),
),
floatingActionButton: FloatingActionButton(
onPressed: () {
if (!isLoggedIn) {
setState(() {
isLoggedIn = true;
signInWithHuaweiID();
});
print('$isLoggedIn');
} else {
setState(() {
isLoggedIn = false;
signOutWithID();
});
print('$isLoggedIn');
}
},
tooltip: 'Login/Logout',
child: isLoggedIn ? const Icon(Icons.logout) : const Icon(Icons.login),
), // This trailing comma makes auto-formatting nicer for build methods.
);
}
void signInWithHuaweiID() async {
try {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
Future<AuthAccount> account = AccountAuthService.signIn();
account.then(
(value) => setLoginSuccess(value),
);
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> _enableLog() async {
_hmsAnalytics.setUserId("ChatBotServiceApp");
await _hmsAnalytics.enableLog();
}
void setLoginSuccess(AuthAccount value) {
setState(() {
str = 'Welcome ' + value.displayName.toString();
});
showToast(value.displayName.toString());
print('Login Success');
}
Future<void> signOutWithID() async {
try {
final bool result = await AccountAuthService.signOut();
if (result) {
setState(() {
str = 'Login required';
showToast('You are logged out.');
});
}
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> showToast(String name) async {
Fluttertoast.showToast(
msg: "$name",
toastLength: Toast.LENGTH_SHORT,
gravity: ToastGravity.CENTER,
timeInSecForIosWeb: 1,
backgroundColor: Colors.lightBlue,
textColor: Colors.white,
fontSize: 16.0);
}
}
ChatPage.dart
// Imports assumed from the pubspec dependencies: agconnect_cloudfunctions,
// flutter_chat_types, flutter_chat_ui, uuid.
import 'dart:convert';

import 'package:agconnect_cloudfunctions/agconnect_cloudfunctions.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_chat_types/flutter_chat_types.dart' as types;
import 'package:flutter_chat_ui/flutter_chat_ui.dart';
import 'package:uuid/uuid.dart';

class ChatPage extends StatefulWidget {
const ChatPage({Key? key}) : super(key: key);
@override
_ChatPageState createState() => _ChatPageState();
}
class _ChatPageState extends State<ChatPage> {
List<types.Message> _messages = [];
final _user = const types.User(id: '06c33e8b-e835-4736-80f4-63f44b66666c');
final _bot = const types.User(id: '06c33e8b-e835-4736-80f4-63f54b66666c');
void _addMessage(types.Message message) {
setState(() {
_messages.insert(0, message);
});
}
void _handleSendPressed(types.PartialText message) {
final textMessage = types.TextMessage(
author: _user,
createdAt: DateTime.now().millisecondsSinceEpoch,
id: const Uuid().v4(),
text: message.text,
);
_addMessage(textMessage);
callCloudFunction2(message.text);
}
void _loadMessages() async {
final response = await rootBundle.loadString('assets/messages.json');
final messages = (jsonDecode(response) as List)
.map((e) => types.Message.fromJson(e as Map<String, dynamic>))
.toList();
setState(() {
_messages = messages;
});
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Chat(
messages: _messages,
onAttachmentPressed: null,
onMessageTap: null,
onPreviewDataFetched: null,
onSendPressed: _handleSendPressed,
user: _user,
),
);
}
Future<void> callCloudFunction2(String msg) async {
try {
// RequestData is a small helper model with a toMap() method; its definition is not shown in this post.
RequestData data = RequestData(msg);
List<Map<String, Object>> params = <Map<String, Object>>[data.toMap()];
var input = data.toMap();
FunctionCallable functionCallable =
FunctionCallable('test-funnel-\$latest');
FunctionResult functionResult = await functionCallable.call(input);
print("Input " + input.toString());
var result = functionResult.getValue();
final textMessage = types.TextMessage(
author: _bot,
createdAt: DateTime.now().millisecondsSinceEpoch,
id: const Uuid().v4(),
text: jsonDecode(result)['response'].toString(),
);
_addMessage(textMessage);
} on PlatformException catch (e) {
print(e.message);
}
}
}
handler.js
let myHandler = function(event, context, callback, logger) {
try {
var _body = JSON.parse(event.body);
var reqData = _body.message;
var test = '';
if(reqData == '1'){
test = "Thank you for choosing, you will get callback in 10 min.";
}else if(reqData == '2'){
test = "Please click on the link https://feedback.com/myfeedback";
}else if(reqData == '3'){
test = "Please click on the link https://huawei.com/faq";
}
else if(reqData == 'Hi'){
test = " Welcome to ChatBot Service.";
}else{
test = "Enter 1. For call back. 2. For send feedback. 3. For FAQ ";
}
let res = new context.HTTPResponse({"response": test}, {
"res-type": "simple example",
"faas-content-type": "json"
}, "application/json", "200");
callback(res);
} catch (error) {
let res = new context.HTTPResponse({"response": error}, {
"res-type": "simple example",
"faas-content-type": "json"
}, "application/json", "300");
callback(res);
}
};
module.exports.myHandler = myHandler;
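For example, when the chat page sends "Hi", this handler returns {"response": " Welcome to ChatBot Service."}; callCloudFunction2() above decodes that JSON and adds the response field to the chat as a bot message.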
Result
Tricks and Tips
Make sure the agconnect-services.json file is added.
Make sure the dependencies are added to the yaml file.
Run flutter pub get after adding dependencies.
Make sure the service is enabled in AGC.
Make sure the images are defined in the yaml file.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit, Analytics Kit, and a chatbot function using Cloud Functions in the Flutter ChatBotApp. Once Account Kit is integrated, users can quickly and conveniently sign in to apps with their Huawei IDs after granting initial access permission.
Thank you so much for reading. I hope this article helps you to understand the integration of Huawei Account Kit, Analytics Kit, and Huawei Cloud Functions in the Flutter ChatBotApp.
Reference
Cloud Functions
Training Videos
Check it out in the forum.
