Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-1

Introduction
In this article, we will learn how to use the Huawei Cloud Functions service as a chatbot service in a Flutter ChatBotApp. Cloud Functions enables serverless computing. It provides Function as a Service (FaaS) capabilities to simplify app development and O&M by splitting service logic into functions, and offers the Cloud Functions SDK that works with Cloud DB and Cloud Storage so that your app functions can be implemented more easily. Cloud Functions automatically scales functions in or out based on actual traffic, freeing you from server resource management and helping you reduce costs.
Key Functions
Key Concepts
How the Service Works
To use Cloud Functions, you need to develop cloud functions that can implement certain service functions in AppGallery Connect and add triggers for them, for example, HTTP triggers for HTTP requests, and Cloud DB triggers for data deletion or insertion requests after Cloud DB is integrated. After your app that integrates the Cloud Functions SDK meets conditions of specific function triggers, your app can call the cloud functions, which greatly facilitates service function building.
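To make this concrete, here is a minimal sketch of how a Flutter app invokes an HTTP-triggered cloud function through the AGC Cloud Functions plugin. The package import path and the trigger name 'my-trigger-$latest' are assumptions for illustration; the actual call used by the ChatBotApp appears in Part-2.
Code:
// Minimal sketch: invoking an HTTP-triggered cloud function from Flutter.
// The trigger identifier comes from the AppGallery Connect console.
import 'package:agconnect_cloudfunctions/agconnect_cloudfunctions.dart';

Future<String> callChatbot(String message) async {
  final FunctionCallable callable = FunctionCallable('my-trigger-\$latest');
  final FunctionResult result = await callable.call({'message': message});
  // getValue() returns the JSON string produced by the cloud function.
  return result.getValue();
}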
Platform Support
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level gradle dependencies. Open android > app > build.gradle inside the project.
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
Step 3: Add the below permissions in Android Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add the downloaded plugin file into the parent directory of the project, then declare the plugin path in the pubspec.yaml file under dependencies.
Also add the path location for the asset images, as sketched below.
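For illustration, the relevant pubspec.yaml entries might look as follows; the plugin folder names and asset path are assumptions based on this project's layout:
YAML:
dependencies:
  flutter:
    sdk: flutter
  huawei_account:
    path: ../huawei_account/
  huawei_analytics:
    path: ../huawei_analytics/

flutter:
  assets:
    - assets/images/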
Let's start coding
main.dart
// Imports inferred from the APIs used in this file (plugin package names assumed):
import 'package:flutter/material.dart';
import 'package:huawei_account/huawei_account.dart';
import 'package:huawei_analytics/huawei_analytics.dart';
import 'package:fluttertoast/fluttertoast.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'ChatBotService',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: const MyHomePage(title: 'ChatBotService'),
);
}
}
class MyHomePage extends StatefulWidget {
const MyHomePage({Key? key, required this.title}) : super(key: key);
final String title;
@override
State<MyHomePage> createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
bool isLoggedIn = false;
String str = 'Login required';
final HMSAnalytics _hmsAnalytics = new HMSAnalytics();
List<String> gridItems = ['Email Service', 'Call Center', 'FAQ', 'Chat Now'];
@override
void initState() {
_enableLog();
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Center(
child:
Column(
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
Visibility(
visible: true,
child: Card(
child: Padding(
padding: EdgeInsets.all(20),
child: Text(
str,
style: const TextStyle(color: Colors.teal, fontSize: 22),
),
),
),
),
],
),
),
floatingActionButton: FloatingActionButton(
onPressed: () {
if (!isLoggedIn) {
setState(() {
isLoggedIn = true;
signInWithHuaweiID();
});
print('$isLoggedIn');
} else {
setState(() {
isLoggedIn = false;
signOutWithID();
});
print('$isLoggedIn');
}
},
tooltip: 'Login/Logout',
child: isLoggedIn ? const Icon(Icons.logout) : const Icon(Icons.login),
), // This trailing comma makes auto-formatting nicer for build methods.
);
}
void signInWithHuaweiID() async {
try {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
Future<AuthAccount> account = AccountAuthService.signIn();
account.then(
(value) => setLoginSuccess(value),
);
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> _enableLog() async {
_hmsAnalytics.setUserId("ChatBotServiceApp");
await _hmsAnalytics.enableLog();
}
void setLoginSuccess(AuthAccount value) {
setState(() {
str = 'Welcome ' + value.displayName.toString();
});
showToast(value.displayName.toString());
print('Login Success');
}
Future<void> signOutWithID() async {
try {
final bool result = await AccountAuthService.signOut();
if (result) {
setState(() {
str = 'Login required';
showToast('You are logged out.');
});
}
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> showToast(String name) async {
Fluttertoast.showToast(
msg: "$name",
toastLength: Toast.LENGTH_SHORT,
gravity: ToastGravity.CENTER,
timeInSecForIosWeb: 1,
backgroundColor: Colors.lightBlue,
textColor: Colors.white,
fontSize: 16.0);
}
}
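Note that gridItems is declared above but not yet rendered. A minimal, hypothetical way to display it (not taken from the original app) could be a simple grid inside the Column:
Code:
// Hypothetical sketch: rendering the gridItems list as a 2x2 grid.
GridView.count(
  crossAxisCount: 2,
  shrinkWrap: true,
  physics: const NeverScrollableScrollPhysics(),
  children: gridItems
      .map((item) => Card(child: Center(child: Text(item))))
      .toList(),
)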
Result
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the dependencies are added in the pubspec.yaml file.
Run flutter pub get after adding dependencies.
Make sure that the service is enabled in AppGallery Connect.
Make sure images are defined in the pubspec.yaml file.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit and Analytics Kit in a Flutter ChatBotApp. Once Account Kit is integrated, users can quickly and conveniently sign in to apps with their Huawei IDs after granting initial access permission. In Part-2 we will build the actual Cloud Functions chatbot service.
Thank you so much for reading. I hope this article helps you understand the integration of Huawei Account Kit and Analytics Kit in a Flutter ChatBotApp.
Reference
Cloud Functions
Check out in forum

Related

Flutter HMS/GMS Availability Check

For more information like this, you can visit the HUAWEI Developer Forum.
This guide describes how to write custom platform-specific code. Some platform-specific functionality is available through existing packages.
Flutter uses a flexible system that allows you to call platform-specific APIs whether available in Kotlin or Java code on Android, or in Swift or Objective-C code on iOS.
Flutter’s platform-specific API support does not rely on code generation, but rather on a flexible message passing style:
The Flutter portion of the app sends messages to its host, the iOS or Android portion of the app, over a platform channel.
The host listens on the platform channel, and receives the message. It then calls into any number of platform-specific APIs—using the native programming language—and sends a response back to the client, the Flutter portion of the app.
Architectural overview: platform channels
Messages are passed between the client (UI) and host (platform) using platform channels (see the architecture diagram in the official Flutter documentation).
Messages and responses are passed asynchronously, to ensure the user interface remains responsive.
Step 1: Create a new app project
Start by creating a new app:
In a terminal run: flutter create flutterhmsgmscheck
Step 2: Create the Flutter platform client
The app's State class holds the current app state. Extend that to hold the current HMS/GMS availability state.
First, construct the channel. Use a MethodChannel with a single platform method that returns the HMS/GMS availability.
The client and host sides of a channel are connected through a channel name passed in the channel constructor. All channel names used in a single app must be unique; prefix the channel name with a unique ‘domain prefix’, for example: com.salman.flutter.hmsgmscheck/isHmsGmsAvailable.
Code:
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
class HmsGmsCheck extends StatelessWidget {
HmsGmsCheck();
@override
Widget build(BuildContext context) {
return HmsGmsCheckStateful(
title: "HMS/GMS Check",
);
}
}
class HmsGmsCheckStateful extends StatefulWidget {
HmsGmsCheckStateful({Key key, this.title}) : super(key: key);
final String title;
@override
_HmsGmsCheckState createState() => _HmsGmsCheckState();
}
class _HmsGmsCheckState extends State<HmsGmsCheckStateful> {
// The channel name must match the one registered in MainActivity.kt.
static const MethodChannel methodChannel =
MethodChannel('com.salman.flutter.hmsgmscheck/isHmsGmsAvailable');
bool _isHmsAvailable;
bool _isGmsAvailable;
@override
void initState() {
super.initState();
checkHmsGms();
}
void checkHmsGms() async {
await _isHMS();
await _isGMS();
}
Future<void> _isHMS() async {
bool status;
try {
bool result = await methodChannel.invokeMethod('isHmsAvailable');
status = result;
print('status : ${status.toString()}');
} on PlatformException {
print('Failed to get _isHmsAvailable.');
}
setState(() {
_isHmsAvailable = status;
});
}
Future<void> _isGMS() async {
bool status;
try {
bool result = await methodChannel.invokeMethod('isGmsAvailable');
status = result;
print('status : ${status.toString()}');
} on PlatformException {
print('Failed to get _isGmsAvailable.');
}
setState(() {
_isGmsAvailable = status;
});
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Column(
children: <Widget>[
new Container(
padding: EdgeInsets.all(20),
child: new Column(
children: <Widget>[
Text(
"HMS Available: $_isHmsAvailable",
style: Theme.of(context).textTheme.headline6,
),
Text(
"GMS Available: $_isGmsAvailable",
style: Theme.of(context).textTheme.headline6,
)
],
),
)
],
));
}
}
Step 3: Update your gradle
Open your gradle files in Android Studio and apply the Huawei repo:
Project-level build.gradle
Code:
buildscript {
ext.kotlin_version = '1.3.50'
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
allprojects {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
App-level build.gradle
Code:
dependencies {
implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version"
implementation "com.huawei.hms:hwid:4.0.0.300"
implementation "com.google.android.gms:play-services-base:17.3.0"
}
Step 4: Add an Android platform-specific implementation
Start by opening the Android host portion of your Flutter app in Android Studio:
Start Android Studio
Select the menu item File > Open…
Navigate to the directory holding your Flutter app, and select the android folder inside it. Click OK.
Open the file MainActivity.kt located in the kotlin folder in the Project view. (Note: If editing with Android Studio 2.3, note that the kotlin folder is shown as if named java.)
Inside the configureFlutterEngine() method, create a MethodChannel and call setMethodCallHandler(). Make sure to use the same channel name as was used on the Flutter client side.
Code:
class MainActivity: FlutterActivity() {
private val CHANNEL = "com.salman.flutter.hmsgmscheck/isHmsGmsAvailable"
var concurrentContext = this@MainActivity.context
override fun configureFlutterEngine(@NonNull flutterEngine: FlutterEngine) {
super.configureFlutterEngine(flutterEngine)
MethodChannel(flutterEngine.dartExecutor.binaryMessenger, CHANNEL).setMethodCallHandler {
call, result ->
// Note: this method is invoked on the main thread.
if (call.method.equals("isHmsAvailable")) {
result.success(isHmsAvailable());
} else if (call.method.equals("isGmsAvailable")) {
result.success(isGmsAvailable());
} else {
result.notImplemented()
}
}
}
private fun isHmsAvailable(): Boolean {
var isAvailable = false
val context: Context = concurrentContext
if (null != context) {
val result = HuaweiApiAvailability.getInstance().isHuaweiMobileServicesAvailable(context)
isAvailable = ConnectionResult.SUCCESS == result
}
Log.i("MainActivity", "isHmsAvailable: $isAvailable")
return isAvailable
}
private fun isGmsAvailable(): Boolean {
var isAvailable = false
val context: Context = concurrentContext
if (null != context) {
val result: Int = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(context)
isAvailable = com.google.android.gms.common.ConnectionResult.SUCCESS == result
}
Log.i("MainActivity", "isGmsAvailable: $isAvailable")
return isAvailable
}
}
After completing all the above steps, compile your project; you will see the HMS/GMS availability status rendered on screen.
Conclusion:
With the help of this article, we are able to access platform-specific native code from our Flutter application. For further details, you can check the official Flutter platform channels guide.

React Native HMS ML Kit | Installation and Example

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
This article covers, how to integrate React Native HMS ML Kit to a React Native application.
React Native HMS ML Kit supports the services listed below:
· Text Related Services
· Language Related Services
· Image Related Services
· Face/Body Related Services
There are several use cases for these services; you can combine them or use them individually to create different functionalities in your app. For a basic understanding, please read the use cases here.
Github: https://github.com/HMS-Core/hms-react-native-plugin/tree/master/react-native-hms-ml
Prerequisites
Step 1
Prepare your development environment using this guide.
After reading this guide you should have the React Native development environment set up, HMS Core (APK) installed, and the Android SDK installed.
Step 2
Configure your app information in App Gallery by following this guide.
After reading this guide you should have a Huawei developer account, an AppGallery app, a keystore file, and the ML Kit service enabled in AppGallery.
Integrating React Native Hms ML Kit
Warning: Please make sure that the prerequisites part is successfully completed.
Step 1
Code:
npm i @hmscore/react-native-hms-ml
Step 2
Open build.gradle file in project-dir > android folder.
Go to buildscript > repositories and allprojects > repositories, and configure the Maven repository address.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
Go to buildscript > dependencies and add dependency configurations.
Code:
buildscript {
dependencies {
classpath 'com.huawei.agconnect:agcp:1.2.1.301'
}
}
Step 3
Open build.gradle file which is located under project.dir > android > app directory.
Add the AppGallery Connect plug-in dependency to the file header.
Code:
apply plugin: 'com.huawei.agconnect'
The apply plugin: ‘com.huawei.agconnect’ configuration must be added after the apply plugin: ‘com.android.application’ configuration.
The minimum Android API level (minSdkVersion) required for ML Kit is 19.
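Putting the two notes above together, the top of the app-level build.gradle would look roughly like this (a sketch; only the relevant lines are shown):
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect' // must come after com.android.application

android {
    defaultConfig {
        minSdkVersion 19 // minimum API level required by ML Kit
        // ...other settings...
    }
}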
Configure build dependencies of your project.
Code:
dependencies {
...
implementation 'com.huawei.agconnect:agconnect-core:1.0.0.301'
}
Now you can use React Native HMS ML and import modules as in the code below.
Code:
import {<module_name>} from '@hmscore/react-native-hms-ml';
Let's Create an Application
We have already created an application in prerequisites section.
Our app will be about recognizing text in images and converting it to speech. So, we will use HmsTextRecognitionLocal, HmsFrame and HmsTextToSpeech modules. We will select images by using react-native-image-picker, so don’t forget to install it.
Note that: before running this code snippet, please check your app permissions.
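react-native-image-picker is installed like any other npm dependency:
Code:
npm i react-native-image-picker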
Step 1
We need to add some settings to AndroidManifest.xml before using react-native-image-picker.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<application
...
android:requestLegacyExternalStorage="true">
Step 2
Code:
import React from 'react';
import { Text, View, ScrollView, TextInput, TouchableOpacity, StyleSheet } from 'react-native';
import { HmsTextRecognitionLocal, HmsFrame, HmsTextToSpeech, NativeEventEmitter } from '@hmscore/react-native-hms-ml';
import ImagePicker from 'react-native-image-picker';
const options = {
title: 'Choose Method',
storageOptions: {
skipBackup: true,
path: 'images',
},
};
const styles = StyleSheet.create({
bg: { backgroundColor: '#eee' },
customEditBox: {
height: 450,
borderColor: 'gray',
borderWidth: 2,
width: "95%",
alignSelf: "center",
marginTop: 10,
backgroundColor: "#fff",
color: "#000"
},
buttonTts: {
width: '95%',
height: 70,
alignSelf: "center",
marginTop: 35,
},
startButton: {
paddingTop: 10,
paddingBottom: 10,
backgroundColor: 'white',
borderRadius: 10,
borderWidth: 1,
borderColor: '#888',
backgroundColor: '#42aaf5',
},
startButtonLabel: {
fontWeight: 'bold',
color: '#fff',
textAlign: 'center',
paddingLeft: 10,
paddingRight: 10,
},
});
export default class App extends React.Component {
// create your states
// for keeping imageUri and recognition result
constructor(props) {
super(props);
this.state = {
imageUri: '',
result: '',
};
}
// this is a key function in Ml Kit
// It sets the frame for you and keeps it until you set a new one
async setMLFrame() {
try {
var result = await HmsFrame.fromBitmap(this.state.imageUri);
console.log(result);
} catch (e) {
console.error(e);
}
}
// this creates text recognition settings by default options given below
// languageCode : default is "rm"
// OCRMode : default is OCR_DETECT_MODE
async createTextSettings() {
try {
var result = await HmsTextRecognitionLocal.create({});
console.log(result);
} catch (e) {
console.error(e);
}
}
// this function calls analyze function and sets the results to state
// The parameter false means we don't want a block result
// If you want to see results as blocks you can set it to true
async analyze() {
try {
var result = await HmsTextRecognitionLocal.analyze(false);
this.setState({ result: result });
} catch (e) {
console.error(e);
}
}
// this function calls close function to stop recognizer
async close() {
try {
var result = await HmsTextRecognitionLocal.close();
console.log(result);
} catch (e) {
console.error(e);
}
}
// standard image picker operation
// sets imageUri to state
// calls startAnalyze function
showImagePicker() {
ImagePicker.showImagePicker(options, (response) => {
if (response.didCancel) {
console.log('User cancelled image picker');
} else if (response.error) {
console.log('ImagePicker Error: ', response.error);
} else {
this.setState({
imageUri: response.uri,
});
this.startAnalyze();
}
});
}
// configure tts engine by giving custom parameters
async configuration() {
try {
var result = await HmsTextToSpeech.configure({
"volume": 1.0,
"speed": 1.0,
"language": HmsTextToSpeech.TTS_EN_US,
"person": HmsTextToSpeech.TTS_SPEAKER_FEMALE_EN
});
console.log(result);
} catch (e) {
console.error(e);
}
}
// create Tts engine by call
async engineCreation() {
try {
var result = await HmsTextToSpeech.createEngine();
console.log(result);
} catch (e) {
console.error(e);
}
}
// set Tts callback
async callback() {
try {
var result = await HmsTextToSpeech.setTtsCallback();
console.log(result);
} catch (e) {
console.error(e);
}
}
// start speech
async speak(word) {
try {
var result = await HmsTextToSpeech.speak(word, HmsTextToSpeech.QUEUE_FLUSH);
console.log(result);
} catch (e) {
console.error(e);
}
}
// stop engine
async stop() {
try {
var result = await HmsTextToSpeech.stop();
console.log(result);
} catch (e) {
console.error(e);
}
}
// manage functions in order
startAnalyze() {
this.setState({
result: 'processing...',
}, () => {
this.createTextSettings()
.then(() => this.setMLFrame())
.then(() => this.analyze())
.then(() => this.close())
.then(() => this.configuration())
.then(() => this.engineCreation())
.then(() => this.callback())
.then(() => this.speak(this.state.result));
});
}
render() {
return (
<ScrollView style={styles.bg}>
<TextInput
style={styles.customEditBox}
value={this.state.result}
placeholder="Text Recognition Result"
multiline={true}
editable={false}
/>
<View style={styles.buttonTts}>
<TouchableOpacity
style={styles.startButton}
onPress={this.showImagePicker.bind(this)}
underlayColor="#fff">
<Text style={styles.startButtonLabel}> Start Analyze </Text>
</TouchableOpacity>
</View>
</ScrollView>
);
}
}
Test the App
· First write “Hello World” on a blank paper.
· Then run the application.
· Press the “Start Analyze” button and take a photo of your paper.
· Wait for the result.
· Here it comes: you will see “Hello World” on the screen and you will hear it from your phone.
Conclusion
In this article, we integrated and used React Native HMS ML Kit in our application.
From: https://medium.com/huawei-developers/react-native-hms-ml-kit-installation-and-example-242dc83e0941
Is there any advantage in Huawei ML Kit compared to others?

Intermediate: How to Integrate Huawei Analytics kit in Flutter (Cross platform)

Introduction
The Flutter Analytics Plugin provides a wide range of predefined analytics models to get more insight into your application's users, products, and content. With this insight, you can take a data-driven approach to marketing your apps and optimizing your products based on the analytics.
With Analytics Kit's on-device data collection SDK, you can:
Collect and report custom events.
Set a maximum of 25 user attributes.
Automate event collection and session calculation.
Pre-set event IDs and parameters.
Restrictions
1. Devices:
a. Analytics Kit depends on HMS Core (APK) to automatically collect the following events: INSTALLAPP (app installation), UNINSTALLAPP (app uninstallation), CLEARNOTIFICATION (data deletion), INAPPPURCHASE (in-app purchase), RequestAd (ad request), DisplayAd (ad display), ClickAd (ad tapping), ObtainAdAward (ad award claiming), SIGNIN (sign-in), and SIGNOUT (sign-out). These events cannot be automatically collected on third-party devices where HMS Core (APK) is not installed (including but not limited to OPPO, vivo, Xiaomi, Samsung, and OnePlus).
b. Analytics Kit does not work on iOS devices.
2. Number of events:
A maximum of 500 events are supported.
3. Number of event parameters:
You can define a maximum of 25 parameters for each event, and a maximum of 100 event parameters for each project.
4. Supported countries/regions
The service is now available only in the countries/regions listed in Supported Countries/Regions.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level gradle dependencies. Open android > app > build.gradle inside the project.
Java:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add root level gradle dependencies
Java:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Add app level gradle dependencies
Java:
implementation 'com.huawei.hms:hianalytics:5.1.0.300'
Step 3: Add the below permissions in Android Manifest file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="com.huawei.appmarket.service.commondata.permission.GET_COMMON_DATA"/>
Step 4: Download the Flutter plugin for Huawei Analytics Kit and unzip it in the parent directory of the project.
Step 5: Declare the plugin path in the pubspec.yaml file under dependencies.
Step 6: Create a project in AppGallery Connect.
pubspec.yaml
YAML:
name: flutter_app
description: A new Flutter application.
# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev
version: 1.0.0+1
environment:
sdk: ">=2.7.0 <3.0.0"
dependencies:
flutter:
sdk: flutter
huawei_analytics:
path: ../huawei_analytics/
# The following adds the Cupertino Icons font to your application.
# Use with the CupertinoIcons class for iOS style icons.
cupertino_icons: ^1.0.2
dev_dependencies:
flutter_test:
sdk: flutter
# The following section is specific to Flutter.
flutter:
main.dart
Code:
import 'package:flutter/material.dart';
import 'package:flutter_app/result.dart';
import 'package:huawei_analytics/huawei_analytics.dart';
import './quiz.dart';
import './result.dart';
void main() => runApp(MyApp());
class MyApp extends StatefulWidget {
@override
State<StatefulWidget> createState() {
return _MyAppState();
}
}
class _MyAppState extends State<MyApp> {
var _questionIndex = 0;
int _totalScore = 0;
final HMSAnalytics _hmsAnalytics = new HMSAnalytics();
@override
void initState() {
_enableLog();
_predefinedEvent();
super.initState();
}
Future<void> _enableLog() async {
_hmsAnalytics.setUserId("TestUser123");
await _hmsAnalytics.enableLog();
}
void _restartQuiz() {
setState(() {
_questionIndex = 0;
_totalScore = 0;
});
}
void _predefinedEvent() async {
String name = HAEventType.SIGNIN;
dynamic value = {HAParamType.ENTRY: 06534797};
await _hmsAnalytics.onEvent(name, value);
print("Event posted");
}
void _customEvent(int index, int score) async {
String name = "Question$index";
dynamic value = {'Score': score};
await _hmsAnalytics.onEvent(name, value);
print("Event posted");
}
static const _questions = [
{
'questionText': 'What\'s your favorite color?',
'answers': [
{'text': 'Black', 'Score': 10},
{'text': 'White', 'Score': 1},
{'text': 'Green', 'Score': 3},
{'text': 'Red', 'Score': 5},
]
},
{
'questionText': 'What\'s your favorite place?',
'answers': [
{'text': 'India', 'Score': 1},
{'text': 'Russia', 'Score': 5},
{'text': 'US', 'Score': 4},
{'text': 'Singapore', 'Score': 7},
]
},
{
'questionText': 'What\'s your childhood nickname?',
'answers': [
{'text': 'Bunty', 'Score': 2},
{'text': 'Binto', 'Score': 1},
{'text': 'Tom', 'Score': 5},
{'text': 'Ruby', 'Score': 3},
]
},
{
'questionText': 'What\'s your favorite subject?',
'answers': [
{'text': 'Math', 'Score': 5},
{'text': 'Physics', 'Score': 1},
{'text': 'Chemistry', 'Score': 3},
{'text': 'Biology', 'Score': 2},
]
}
];
Future<void> _answerQuestion(int score) async {
_totalScore += score;
if (_questionIndex < _questions.length) {
print('Inside if $_questionIndex');
setState(() {
_questionIndex = _questionIndex + 1;
});
print('Current questionIndex $_questionIndex');
} else {
print('Inside else $_questionIndex');
}
_customEvent(_questionIndex, score);
}
@override
Widget build(BuildContext context) {
return MaterialApp(
home: Scaffold(
appBar: AppBar(
title: Text('QuizApp'),
),
body: _questionIndex < _questions.length
? Quiz(
answerQuestion: _answerQuestion,
questionIndex: _questionIndex,
questions: _questions,
)
: Result(_totalScore, _restartQuiz),
));
}
}
question.dart
Code:
import 'package:flutter/material.dart';
class Question extends StatelessWidget {
final String questionText;
Question(this.questionText);
@override
Widget build(BuildContext context) {
return Container(
width: double.infinity,
margin: EdgeInsets.all(30.0),
child: Text(
questionText,
style: TextStyle(
fontSize: 28,
),
textAlign: TextAlign.center,
),
);
}
}
answer.dart
Code:
import 'package:flutter/material.dart';
class Answer extends StatelessWidget {
final Function selectHandler;
final String answerText;
Answer(this.selectHandler, this.answerText);
@override
Widget build(BuildContext context) {
return Container(
width: double.infinity,
margin: EdgeInsets.fromLTRB(20, 10, 20, 10),
child: RaisedButton(
child: Text(answerText),
color: Colors.blue,
textColor: Colors.white,
onPressed: selectHandler,
),
);
}
}
quiz.dart
Code:
import 'package:flutter/material.dart';
import './answer.dart';
import './question.dart';
class Quiz extends StatelessWidget {
final List<Map<String, Object>> questions;
final int questionIndex;
final Function answerQuestion;
Quiz({
@required this.answerQuestion,
@required this.questions,
@required this.questionIndex,
});
@override
Widget build(BuildContext context) {
return Column(
children: [
Question(
questions[questionIndex]['questionText'],
),
...(questions[questionIndex]['answers'] as List<Map<String, Object>>)
.map((answer) {
return Answer(() => answerQuestion(answer['Score']), answer['text']);
}).toList()
],
);
}
}
result.dart
Code:
import 'package:flutter/material.dart';
class Result extends StatelessWidget {
final int resulScore;
final Function restarthandler;
Result(this.resulScore, this.restarthandler);
String get resultPhrase {
String resultText;
if (resulScore <= 8) {
resultText = 'You are awesome and innocent!.';
} else if (resulScore <= 12) {
resultText = 'Pretty likable!.';
} else if (resulScore <= 16) {
resultText = 'You are .. strange!.';
} else {
resultText = 'You are so bad!';
}
return resultText;
}
@override
Widget build(BuildContext context) {
return Center(
child: Column(
children: [
Text(
resultPhrase,
style: TextStyle(fontSize: 36, fontWeight: FontWeight.bold),
textAlign: TextAlign.center,
),
FlatButton(
child: Text('Restart again', style: TextStyle(fontSize: 22)),
textColor: Colors.blue,
onPressed: restarthandler,
),
],
),
);
}
}
Result
Tricks and Tips
Make sure that the downloaded plugin is added in the specified directory.
Make sure that the agconnect-services.json file is added.
Make sure the dependencies are added in the pubspec.yaml file.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Enable debug mode using the following command:
Code:
adb shell setprop debug.huawei.hms.analytics.app package_name
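Once you have verified your events, debug mode can be turned off again; according to the Analytics Kit documentation, the companion command is:
Code:
adb shell setprop debug.huawei.hms.analytics.app .none.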
Conclusion
In this article, we have learnt how to integrate Huawei Analytics Kit into a Flutter QuizApp, which lets you view app analytics such as users, predefined events, and custom events in AppGallery Connect.
Thank you so much for reading. I hope this article helps you understand the Huawei Analytics Kit in Flutter.
Reference
Official plugin guide for Flutter: developer.huawei.com
Flutter plugin: developer.huawei.com
HMS Core: developer.huawei.com
Read In Forum
Does it support real-time analytics?

Intermediate: How to Integrate Image Classification Feature of Huawei ML Kit in Flutter

Introduction
In this article, we will learn how to implement the image classification feature in a Flutter application. Image classification uses the transfer learning algorithm to perform multi-level learning training. Huawei ML Kit provides many useful machine learning related features to developers, and one of them is image classification.
About Image Classification
Image classification is one of the features of HMS ML Kit. Using this service, we can classify the objects in images. The service analyzes an image, classifies it into possible real-world categories, such as people, animals, and objects, and returns the recognized results.
We can detect images in two ways: from a static image or from the camera stream. Image recognition supports both cloud-based and device-based recognition.
Device based recognition
1. More efficient.
2. Supports more than 400 image categories.
3. Supports both static image detection and camera stream detection.
Cloud based recognition
1. More accurate.
2. Supports more than 1,200 image categories.
3. Supports only static image detection.
Requirements
1. Any operating system (MacOS, Linux and Windows etc.)
2. Any IDE with Flutter SDK installed (IntelliJ, Android Studio and VsCode etc.)
3. A little knowledge of Dart and Flutter.
4. Minimum API Level 19 is required.
5. EMUI 5.0 or later devices are required.
Setting up the ML Kit
1. First create a developer account in AppGallery Connect. After creating your developer account, you can create a new project and a new app. For more information, click here.
2. Enable the ML kit in the Manage API section and add the plugin.
3. Add the required dependencies to the build.gradle file under root folder.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
4. Add the required permissions to the AndroidManifest.xml file under app/src/main folder.
Code:
<uses-permission android:name ="android.permission.CAMERA"/>
<uses-permission android:name ="android.permission.READ_EXTERNAL_STORAGE"/>
After completing all the above steps, you need to add the required kits' Flutter plugins as dependencies to the pubspec.yaml file. Refer to this URL for cross-platform plugins to download the latest versions.
Code:
huawei_ml:
path: ../huawei_ml/
After adding them, run the flutter pub get command. Now all the plugins are ready to use.
Note: Set multiDexEnabled to true in the app-level build.gradle (under the android/app directory), so the app will not crash.
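For reference, the flag goes into the defaultConfig block of android/app/build.gradle (a minimal sketch):
Code:
android {
    defaultConfig {
        // Avoids the 64K method limit crash once the ML Kit dependencies are added.
        multiDexEnabled true
    }
}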
Code Integration
In this sample, I used both static and camera detection. First we have to initialize the ML service, then check camera permissions.
Code:
class ImageClassification extends StatefulWidget {
@override
_ImageClassificationState createState() => _ImageClassificationState();
}
class _ImageClassificationState extends State<ImageClassification> {
MLClassificationAnalyzer mlClassificationAnalyzer;
MLClassificationAnalyzerSetting mlClassificationAnalyzerSetting;
String _name = " ";
File _imageFile;
PickedFile _pickedFile;
@override
void initState() {
mlClassificationAnalyzer = new MLClassificationAnalyzer();
mlClassificationAnalyzerSetting = new MLClassificationAnalyzerSetting();
_setApiKey();
_checkPermissions();
super.initState();
}
_setApiKey() async {
await MLApplication().setApiKey(
apiKey:
"CgB6e3x9vOdMNP0juX6Wj65ziX/FR0cs1k37FBOB3iYL+ecElA9k+K9YUQMAlD4pXRuEVvb+hoDQB2KDdXYTpqfH");
}
_checkPermissions() async {
if (await MLPermissionClient().checkCameraPermission()) {
Scaffold.of(context).showSnackBar(SnackBar(
content: Text("Permission Granted"),
));
} else {
await MLPermissionClient().requestCameraPermission();
}
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Column(
children: [
SizedBox(height: 15),
_setImageView(),
SizedBox(height: 15),
_setText(),
SizedBox(height: 15),
_showImagePickingOptions(),
],
));
}
Widget _showImagePickingOptions() {
return Expanded(
child: Align(
child: Column(
mainAxisAlignment: MainAxisAlignment.center,
children: [
Container(
margin: EdgeInsets.only(left: 20.0, right: 20.0),
width: MediaQuery.of(context).size.width,
child: MaterialButton(
color: Colors.amber,
textColor: Colors.white,
child: Text("TAKE PICTURE"),
onPressed: () async {
final String path = await getImage(ImageSource.camera);
setState(() {
_imageFile = File(path);
});
_startRecognition(path);
})),
Container(
width: MediaQuery.of(context).size.width,
margin: EdgeInsets.only(left: 20.0, right: 20.0),
child: MaterialButton(
color: Colors.amber,
textColor: Colors.white,
child: Text("PICK FROM GALLERY"),
onPressed: () async {
final String path = await getImage(ImageSource.gallery);
setState(() {
_imageFile = File(path);
});
_startRecognition(path);
})),
],
),
),
);
}
Widget _setImageView() {
if (_imageFile != null) {
return Image.file(_imageFile, width: 300, height: 300);
} else {
return Text(" ");
}
}
Widget _setText() {
return Text(
_name,
style: (TextStyle(fontWeight: FontWeight.bold)),
);
}
_startRecognition(String path) async {
mlClassificationAnalyzerSetting.path = path;
mlClassificationAnalyzerSetting.isRemote = true;
mlClassificationAnalyzerSetting.largestNumberOfReturns = 6;
mlClassificationAnalyzerSetting.minAcceptablePossibility = 0.5;
try {
List<MLImageClassification> list = await mlClassificationAnalyzer
.asyncAnalyzeFrame(mlClassificationAnalyzerSetting);
if (list.length != 0) {
setState(() {
_name = list.first.name;
});
}
} on Exception catch (er) {
print(er.toString());
}
}
Future<String> getImage(ImageSource imageSource) async {
final picker = ImagePicker();
_pickedFile = await picker.getImage(source: imageSource);
return _pickedFile.path;
}
}
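The sample above uses cloud-based classification (isRemote = true). Assuming the same huawei_ml plugin API, switching to the on-device analyzer is mostly a matter of flipping that flag, roughly as sketched below:
Code:
// Hypothetical variation: on-device classification with the same analyzer.
_startLocalRecognition(String path) async {
  mlClassificationAnalyzerSetting.path = path;
  mlClassificationAnalyzerSetting.isRemote = false; // use the on-device model
  mlClassificationAnalyzerSetting.minAcceptablePossibility = 0.5;
  try {
    List<MLImageClassification> list = await mlClassificationAnalyzer
        .asyncAnalyzeFrame(mlClassificationAnalyzerSetting);
    if (list.isNotEmpty) {
      setState(() => _name = list.first.name);
    }
  } on Exception catch (er) {
    print(er.toString());
  }
}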
Demo
Tips and Tricks
1. Download latest HMS Flutter plugin.
2. Set minSDK version to 19 or later.
3. Do not forget to add Camera permission in Manifest file.
4. Latest HMS Core APK is required.
5. The PNG, JPG, JPEG, and BMP formats are supported.
Conclusion
That’s it!
This article helps you use the image classification feature in your Flutter application. The image classification service of ML Kit gives a real-time experience for AI apps, analyzing the elements available in an image or camera stream.
Thanks for reading! If you enjoyed this story, please click the Like button and Follow. Feel free to leave a Comment below.
Reference
ML kit URL
Original Source

Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-2

Introduction
In this article, we will learn how to use the Huawei Cloud Functions service as a chatbot service in a Flutter ChatBotApp. Cloud Functions enables serverless computing.
It provides the Function as a Service (FaaS) capabilities to simplify app development and O&M by splitting service logic into functions and offers the Cloud Functions SDK that works with Cloud DB and Cloud Storage so that your app functions can be implemented more easily. Cloud Functions automatically scales in or out functions based on actual traffic, freeing you from server resource management and helping you reduce costs.
Key Functions
Key Concepts
How the Service Works
To use Cloud Functions, you need to develop cloud functions that can implement certain service functions in AppGallery Connect and add triggers for them, for example, HTTP triggers for HTTP requests, and Cloud DB triggers for data deletion or insertion requests after Cloud DB is integrated. After your app that integrates the Cloud Functions SDK meets conditions of specific function triggers, your app can call the cloud functions, which greatly facilitates service function building.
Platform Support
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1: Create Flutter project.
Step 2: Add the app-level gradle dependencies. Open android > app > build.gradle inside the project.
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
Step 3: Add the below permissions in Android Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add the downloaded plugin file into the parent directory of the project, then declare the plugin path in the pubspec.yaml file under dependencies.
Also add the path location for the asset images.
Previous article
Using Huawei Cloud Functions as Chatbot Service in Flutter ChatBotApp Part-1
Let's start coding
main.dart
// Imports inferred from the APIs used in this file (plugin package names assumed):
import 'package:flutter/material.dart';
import 'package:huawei_account/huawei_account.dart';
import 'package:huawei_analytics/huawei_analytics.dart';
import 'package:fluttertoast/fluttertoast.dart';
void main() {
runApp(const MyApp());
}
class MyApp extends StatelessWidget {
const MyApp({Key? key}) : super(key: key);
// This widget is the root of your application.
@override
Widget build(BuildContext context) {
return MaterialApp(
title: 'ChatBotService',
theme: ThemeData(
primarySwatch: Colors.blue,
),
home: const MyHomePage(title: 'ChatBotService'),
);
}
}
class MyHomePage extends StatefulWidget {
const MyHomePage({Key? key, required this.title}) : super(key: key);
final String title;
@override
State<MyHomePage> createState() => _MyHomePageState();
}
class _MyHomePageState extends State<MyHomePage> {
bool isLoggedIn = false;
String str = 'Login required';
final HMSAnalytics _hmsAnalytics = new HMSAnalytics();
List<String> gridItems = ['Email Service', 'Call Center', 'FAQ', 'Chat Now'];
@override
void initState() {
_enableLog();
super.initState();
}
@override
Widget build(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: Text(widget.title),
),
body: Center(
child:
Column(
mainAxisAlignment: MainAxisAlignment.center,
children: <Widget>[
Visibility(
visible: true,
child: Card(
child: Padding(
padding: EdgeInsets.all(20),
child: Text(
str,
style: const TextStyle(color: Colors.teal, fontSize: 22),
),
),
),
),
],
),
),
floatingActionButton: FloatingActionButton(
onPressed: () {
if (!isLoggedIn) {
setState(() {
isLoggedIn = true;
signInWithHuaweiID();
});
print('$isLoggedIn');
} else {
setState(() {
isLoggedIn = false;
signOutWithID();
});
print('$isLoggedIn');
}
},
tooltip: 'Login/Logout',
child: isLoggedIn ? const Icon(Icons.logout) : const Icon(Icons.login),
), // This trailing comma makes auto-formatting nicer for build methods.
);
}
void signInWithHuaweiID() async {
try {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
Future<AuthAccount> account = AccountAuthService.signIn();
account.then(
(value) => setLoginSuccess(value),
);
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> _enableLog() async {
_hmsAnalytics.setUserId("ChatBotServiceApp");
await _hmsAnalytics.enableLog();
}
void setLoginSuccess(AuthAccount value) {
setState(() {
str = 'Welcome ' + value.displayName.toString();
});
showToast(value.displayName.toString());
print('Login Success');
}
Future<void> signOutWithID() async {
try {
final bool result = await AccountAuthService.signOut();
if (result) {
setState(() {
str = 'Login required';
showToast('You are logged out.');
});
}
} on Exception catch (e) {
print(e.toString());
}
}
Future<void> showToast(String name) async {
Fluttertoast.showToast(
msg: "$name",
toastLength: Toast.LENGTH_SHORT,
gravity: ToastGravity.CENTER,
timeInSecForIosWeb: 1,
backgroundColor: Colors.lightBlue,
textColor: Colors.white,
fontSize: 16.0);
}
}
ChatPage.dart
// Imports inferred from the APIs used in this file (package names assumed).
// RequestData is a small helper class with a toMap() method, defined elsewhere in the project.
import 'dart:convert';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
import 'package:flutter_chat_types/flutter_chat_types.dart' as types;
import 'package:flutter_chat_ui/flutter_chat_ui.dart';
import 'package:uuid/uuid.dart';
import 'package:agconnect_cloudfunctions/agconnect_cloudfunctions.dart';
class ChatPage extends StatefulWidget {
const ChatPage({Key? key}) : super(key: key);
@override
_ChatPageState createState() => _ChatPageState();
}
class _ChatPageState extends State<ChatPage> {
List<types.Message> _messages = [];
final _user = const types.User(id: '06c33e8b-e835-4736-80f4-63f44b66666c');
final _bot = const types.User(id: '06c33e8b-e835-4736-80f4-63f54b66666c');
void _addMessage(types.Message message) {
setState(() {
_messages.insert(0, message);
});
}
void _handleSendPressed(types.PartialText message) {
final textMessage = types.TextMessage(
author: _user,
createdAt: DateTime.now().millisecondsSinceEpoch,
id: const Uuid().v4(),
text: message.text,
);
_addMessage(textMessage);
callCloudFunction2(message.text);
}
void _loadMessages() async {
final response = await rootBundle.loadString('assets/messages.json');
final messages = (jsonDecode(response) as List)
.map((e) => types.Message.fromJson(e as Map<String, dynamic>))
.toList();
setState(() {
_messages = messages;
});
}
@override
Widget build(BuildContext context) {
return Scaffold(
body: Chat(
messages: _messages,
onAttachmentPressed: null,
onMessageTap: null,
onPreviewDataFetched: null,
onSendPressed: _handleSendPressed,
user: _user,
),
);
}
Future<void> callCloudFunction2(String msg) async {
try {
RequestData data = RequestData(msg);
List<Map<String, Object>> params = <Map<String, Object>>[data.toMap()];
var input = data.toMap();
FunctionCallable functionCallable =
FunctionCallable('test-funnel-\$latest');
FunctionResult functionResult = await functionCallable.call(input);
print("Input " + input.toString());
var result = functionResult.getValue();
final textMessage = types.TextMessage(
author: _bot,
createdAt: DateTime.now().millisecondsSinceEpoch,
id: const Uuid().v4(),
text: jsonDecode(result)['response'].toString(),
);
_addMessage(textMessage);
} on PlatformException catch (e) {
print(e.message);
}
}
}
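ChatPage is not wired into the home screen in the code above; a minimal, hypothetical way to open it, for example when the 'Chat Now' grid item is tapped, is a standard Navigator push:
Code:
// Hypothetical navigation from the home page to the chat screen.
Navigator.push(
  context,
  MaterialPageRoute(builder: (context) => const ChatPage()),
);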
handler.js
let myHandler = function(event, context, callback, logger) {
try {
var _body = JSON.parse(event.body);
var reqData = _body.message;
var test = '';
if(reqData == '1'){
test = "Thank you for choosing, you will get callback in 10 min.";
}else if(reqData == '2'){
test = "Please click on the link https://feedback.com/myfeedback";
}else if(reqData == '3'){
test = "Please click on the link https://huawei.com/faq";
}
else if(reqData == 'Hi'){
test = " Welcome to ChatBot Service.";
}else{
test = "Enter 1. For call back. 2. For send feedback. 3. For FAQ ";
}
let res = new context.HTTPResponse({"response": test}, {
"res-type": "simple example",
"faas-content-type": "json"
}, "application/json", "200");
callback(res);
} catch (error) {
let res = new context.HTTPResponse({"response": error}, {
"res-type": "simple example",
"faas-content-type": "json"
}, "application/json", "300");
callback(res);
}
};
module.exports.myHandler = myHandler;
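For reference, assuming the HTTP trigger passes the POST body through unchanged as event.body, a sample exchange with this handler looks like:
Code:
// Request body sent by the app:
{"message": "1"}

// Response produced by the handler:
{"response": "Thank you for choosing, you will get callback in 10 min."}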
Result
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the dependencies are added in the pubspec.yaml file.
Run flutter pub get after adding dependencies.
Make sure that the service is enabled in AppGallery Connect.
Make sure images are defined in the pubspec.yaml file.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit, Analytics Kit, and a chatbot function using Cloud Functions in a Flutter ChatBotApp. Once Account Kit is integrated, users can quickly and conveniently sign in to apps with their Huawei IDs after granting initial access permission.
Thank you so much for reading. I hope this article helps you understand the integration of Huawei Account Kit, Analytics Kit, and Huawei Cloud Functions in a Flutter ChatBotApp.
Reference
Cloud Functions
Training Videos
Check out in forum
