Huawei Map Kit with automatic dark mode - Huawei Developers

Introduction
Hello reader. Since Android 10 introduced system-wide dark mode, it has become important for developers to support dark mode in their apps to provide a good user experience. That is why, in this article, I am going to explain how to achieve automatic dark mode with HMS Map Kit.
If you have never integrated HMS Map Kit into your app, please refer to this site.
Once we are finished with the steps stated in the link above, we can move on to Android Studio.
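If the Map Kit SDK is not in your project yet, the dependency typically goes into the app-level build.gradle file. The artifact coordinate below follows the standard Map Kit SDK naming; the version placeholder is intentional, so take the exact version from the integration guide above.
Java:
implementation 'com.huawei.hms:maps:{version}'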
Permissions
Map Kit requires an internet connection to function, so let's add the network permissions to the manifest file. You can read more via the link above.
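For reference, here is a minimal sketch of the permission declarations; these are the standard network permissions Map Kit relies on, but confirm them against the official integration guide.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />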
Layout
First, let’s set up our layout XML file. It contains a MapView and a button in the corner to toggle dark mode manually.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<com.huawei.hms.maps.MapView
android:id="@+id/viewMap"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:cameraTargetLat="48.893478"
map:cameraTargetLng="2.334595"
map:cameraZoom="10" />
<com.google.android.material.button.MaterialButton
android:id="@+id/btnDarkSide"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginEnd="16dp"
android:layout_marginBottom="16dp"
android:text="Dark Side"
map:layout_constraintBottom_toBottomOf="parent"
map:layout_constraintEnd_toEndOf="parent">
</com.google.android.material.button.MaterialButton>
</androidx.constraintlayout.widget.ConstraintLayout>
Logic
First, let’s set up our activity without dark mode functionality. It should look something like this. Don’t forget to set your API key.
Kotlin:
class MainActivity : AppCompatActivity() {
private lateinit var binding: ActivityMainBinding
private lateinit var mapView: MapView
private lateinit var huaweiMap: HuaweiMap
private lateinit var darkStyle: MapStyleOptions
private var darkModeOn: Boolean = false
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
//TODO a valid api key has to be set
MapsInitializer.setApiKey(Constants.API_KEY)
binding = ActivityMainBinding.inflate(layoutInflater)
setContentView(binding.root)
mapView = binding.viewMap
var mapViewBundle: Bundle? = null
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey")
}
mapView.onCreate(mapViewBundle)
mapView.getMapAsync {
onMapReady(it)
}
}
private fun onMapReady(map: HuaweiMap) {
huaweiMap = map
}
override fun onSaveInstanceState(outState: Bundle) {
super.onSaveInstanceState(outState)
var mapViewBundle: Bundle? = outState.getBundle("MapViewBundleKey")
if (mapViewBundle == null) {
mapViewBundle = Bundle()
outState.putBundle("MapViewBundleKey", mapViewBundle)
}
mapView.onSaveInstanceState(mapViewBundle)
}
override fun onStart() {
super.onStart()
mapView.onStart()
}
override fun onResume() {
super.onResume()
mapView.onResume()
}
override fun onPause() {
super.onPause()
mapView.onPause()
}
override fun onStop() {
super.onStop()
mapView.onStop()
}
override fun onDestroy() {
super.onDestroy()
mapView.onDestroy()
}
override fun onLowMemory() {
super.onLowMemory()
mapView.onLowMemory()
}
}
Map Kit supports setting a map style by reading a JSON file. You can refer to this site to learn how this JSON file should be written.
I’m using the dark style JSON file from the official demo (linked in the Reference section below). It looks like this and should be located in the …\app\src\main\res\raw directory.
JSON:
[
{
"mapFeature": "all",
"options": "geometry",
"paint": {
"color": "#25292B"
}
},
{
"mapFeature": "all",
"options": "labels.text.stroke",
"paint": {
"color": "#25292B"
}
},
{
"mapFeature": "all",
"options": "labels.icon",
"paint": {
"icon-type": "night"
}
},
{
"mapFeature": "administrative",
"options": "labels.text.fill",
"paint": {
"color": "#E0D5C7"
}
},
{
"mapFeature": "administrative.country",
"options": "geometry",
"paint": {
"color": "#787272"
}
},
{
"mapFeature": "administrative.province",
"options": "geometry",
"paint": {
"color": "#666262"
}
},
{
"mapFeature": "administrative.province",
"options": "labels.text.fill",
"paint": {
"color": "#928C82"
}
},
{
"mapFeature": "administrative.district",
"options": "labels.text.fill",
"paint": {
"color": "#AAA59E"
}
},
{
"mapFeature": "administrative.locality",
"options": "labels.text.fill",
"paint": {
"color": "#928C82"
}
},
{
"mapFeature": "landcover.parkland.natural",
"visibility": false,
"options": "geometry",
"paint": {
"color": "#25292B"
}
},
{
"mapFeature": "landcover.parkland.public-garden",
"options": "geometry",
"paint": {
"color": "#283631"
}
},
{
"mapFeature": "landcover.parkland.human-made",
"visibility": false,
"options": "geometry",
"paint": {
"color": "#25292B"
}
},
{
"mapFeature": "landcover.parkland.public-garden",
"options": "labels.text.fill",
"paint": {
"color": "#8BAA7F"
}
},
{
"mapFeature": "landcover.hospital",
"options": "geometry",
"paint": {
"color": "#382B2B"
}
},
{
"mapFeature": "landcover",
"options": "labels.text.fill",
"paint": {
"color": "#928C82"
}
},
{
"mapFeature": "poi.shopping",
"options": "labels.text.fill",
"paint": {
"color": "#9C8C5F"
}
},
{
"mapFeature": "landcover.human-made.building",
"visibility": false,
"options": "labels.text.fill",
"paint": {
"color": "#000000"
}
},
{
"mapFeature": "poi.tourism",
"options": "labels.text.fill",
"paint": {
"color": "#578C8C"
}
},
{
"mapFeature": "poi.beauty",
"options": "labels.text.fill",
"paint": {
"color": "#9E7885"
}
},
{
"mapFeature": "poi.leisure",
"options": "labels.text.fill",
"paint": {
"color": "#916A91"
}
},
{
"mapFeature": "poi.eating&drinking",
"options": "labels.text.fill",
"paint": {
"color": "#996E50"
}
},
{
"mapFeature": "poi.lodging",
"options": "labels.text.fill",
"paint": {
"color": "#A3678F"
}
},
{
"mapFeature": "poi.health-care",
"options": "labels.text.fill",
"paint": {
"color": "#B07373"
}
},
{
"mapFeature": "poi.public-service",
"options": "labels.text.fill",
"paint": {
"color": "#5F7299"
}
},
{
"mapFeature": "poi.business",
"options": "labels.text.fill",
"paint": {
"color": "#6B6B9D"
}
},
{
"mapFeature": "poi.automotive",
"options": "labels.text.fill",
"paint": {
"color": "#6B6B9D"
}
},
{
"mapFeature": "poi.sports.outdoor",
"options": "labels.text.fill",
"paint": {
"color": "#597A52"
}
},
{
"mapFeature": "poi.sports.other",
"options": "labels.text.fill",
"paint": {
"color": "#3E90AB"
}
},
{
"mapFeature": "poi.natural",
"options": "labels.text.fill",
"paint": {
"color": "#597A52"
}
},
{
"mapFeature": "poi.miscellaneous",
"options": "labels.text.fill",
"paint": {
"color": "#A7ADB0"
}
},
{
"mapFeature": "road.highway",
"options": "labels.text.fill",
"paint": {
"color": "#E3CAA2"
}
},
{
"mapFeature": "road.national",
"options": "labels.text.fill",
"paint": {
"color": "#A7ADB0"
}
},
{
"mapFeature": "road.province",
"options": "labels.text.fill",
"paint": {
"color": "#A7ADB0"
}
},
{
"mapFeature": "road.city-arterial",
"options": "labels.text.fill",
"paint": {
"color": "#808689"
}
},
{
"mapFeature": "road.minor-road",
"options": "labels.text.fill",
"paint": {
"color": "#808689"
}
},
{
"mapFeature": "road.sidewalk",
"options": "labels.text.fill",
"paint": {
"color": "#808689"
}
},
{
"mapFeature": "road.highway.country",
"options": "geometry.fill",
"paint": {
"color": "#8C7248"
}
},
{
"mapFeature": "road.highway.city",
"options": "geometry.fill",
"paint": {
"color": "#706148"
}
},
{
"mapFeature": "road.national",
"options": "geometry.fill",
"paint": {
"color": "#444A4D"
}
},
{
"mapFeature": "road.province",
"options": "geometry.fill",
"paint": {
"color": "#444A4D"
}
},
{
"mapFeature": "road.city-arterial",
"options": "geometry.fill",
"paint": {
"color": "#434B4F"
}
},
{
"mapFeature": "road.minor-road",
"options": "geometry.fill",
"paint": {
"color": "#434B4F"
}
},
{
"mapFeature": "road.sidewalk",
"options": "geometry.fill",
"paint": {
"color": "#434B4F"
}
},
{
"mapFeature": "transit",
"options": "labels.text.fill",
"paint": {
"color": "#4F81B3"
}
},
{
"mapFeature": "transit.railway",
"options": "geometry",
"paint": {
"color": "#5B2E57"
}
},
{
"mapFeature": "transit.ferry-line",
"options": "geometry",
"paint": {
"color": "#364D67"
}
},
{
"mapFeature": "transit.airport",
"options": "geometry",
"paint": {
"color": "#2C3235"
}
},
{
"mapFeature": "water",
"options": "geometry",
"paint": {
"color": "#243850"
}
},
{
"mapFeature": "water",
"options": "labels.text.fill",
"paint": {
"color": "#4C6481"
}
},
{
"mapFeature": "trafficInfo.smooth",
"options": "geometry",
"paint": {
"color": "#348734"
}
},
{
"mapFeature": "trafficInfo.amble",
"options": "geometry",
"paint": {
"color": "#947000"
}
},
{
"mapFeature": "trafficInfo.congestion",
"options": "geometry",
"paint": {
"color": "#A4281E"
}
},
{
"mapFeature": "trafficInfo.extremelycongestion",
"options": "geometry",
"paint": {
"color": "#7A120B"
}
}
]
We’ll read the JSON file like this.
Kotlin:
...
//read dark map options from json file from raw folder
darkStyle = MapStyleOptions.loadRawResourceStyle(this, R.raw.mapstyle_night)
...
Then we have to detect whether system-wide dark mode is on, using the function below. We will set the darkModeOn property with this function when the activity launches.
Kotlin:
...
//detect if system is set to dark mode
private fun isDarkModeOn(): Boolean {
return when (resources.configuration.uiMode.and(Configuration.UI_MODE_NIGHT_MASK)) {
Configuration.UI_MODE_NIGHT_YES -> true
Configuration.UI_MODE_NIGHT_NO -> false
Configuration.UI_MODE_NIGHT_UNDEFINED -> false
else -> false
}
}
...
Now we have to set the map style in the onMapReady callback. Pass null if no custom style is needed, for example when night mode is not active. When the user toggles system dark mode on or off, the activity will be recreated and the respective style will be set automatically.
Kotlin:
...
private fun onMapReady(map: HuaweiMap) {
huaweiMap = map
if (darkModeOn) huaweiMap.setMapStyle(darkStyle) else huaweiMap.setMapStyle(null)
}
...
Remember the button in the layout file? Let’s set it up to switch the style manually.
Kotlin:
...
//set on click listener to the button to toggle map style
binding.btnDarkSide.setOnClickListener {
if (darkModeOn) huaweiMap.setMapStyle(null) else huaweiMap.setMapStyle(darkStyle)
darkModeOn = !darkModeOn
}
...
We’re done; we now have a MapView with an automatic/manual dark mode switch. Refer to the final activity class below. If you want to see how it looks in action, refer to this Medium article.
Kotlin:
class MainActivity : AppCompatActivity() {
private lateinit var binding: ActivityMainBinding
private lateinit var mapView: MapView
private lateinit var huaweiMap: HuaweiMap
private lateinit var darkStyle: MapStyleOptions
private var darkModeOn: Boolean = false
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
//TODO a valid api key has to be set
MapsInitializer.setApiKey(Constants.API_KEY)
binding = ActivityMainBinding.inflate(layoutInflater)
setContentView(binding.root)
//read dark map options from json file from raw folder
darkStyle = MapStyleOptions.loadRawResourceStyle(this, R.raw.mapstyle_night)
//set darkModeOn property using isDarkModeOn function
darkModeOn = isDarkModeOn()
//set on click listener to the button to toggle map style
binding.btnDarkSide.setOnClickListener {
if (darkModeOn) huaweiMap.setMapStyle(null) else huaweiMap.setMapStyle(darkStyle)
darkModeOn = !darkModeOn
}
mapView = binding.viewMap
var mapViewBundle: Bundle? = null
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey")
}
mapView.onCreate(mapViewBundle)
mapView.getMapAsync {
onMapReady(it)
}
}
//detect if system is set to dark mode
private fun isDarkModeOn(): Boolean {
return when (resources.configuration.uiMode.and(Configuration.UI_MODE_NIGHT_MASK)) {
Configuration.UI_MODE_NIGHT_YES -> true
Configuration.UI_MODE_NIGHT_NO -> false
Configuration.UI_MODE_NIGHT_UNDEFINED -> false
else -> false
}
}
private fun onMapReady(map: HuaweiMap) {
huaweiMap = map
if (darkModeOn) huaweiMap.setMapStyle(darkStyle) else huaweiMap.setMapStyle(null)
}
override fun onSaveInstanceState(outState: Bundle) {
super.onSaveInstanceState(outState)
var mapViewBundle: Bundle? = outState.getBundle("MapViewBundleKey")
if (mapViewBundle == null) {
mapViewBundle = Bundle()
outState.putBundle("MapViewBundleKey", mapViewBundle)
}
mapView.onSaveInstanceState(mapViewBundle)
}
override fun onStart() {
super.onStart()
mapView.onStart()
}
override fun onResume() {
super.onResume()
mapView.onResume()
}
override fun onPause() {
super.onPause()
mapView.onPause()
}
override fun onStop() {
super.onStop()
mapView.onStop()
}
override fun onDestroy() {
super.onDestroy()
mapView.onDestroy()
}
override fun onLowMemory() {
super.onLowMemory()
mapView.onLowMemory()
}
}
Conclusions
By modifying the HMS Map Kit style JSON file, we can achieve map views with custom styles. We can even switch styles at runtime as needed. Thank you.
Reference
Map Kit Demo

Related

Help in Android app development

Can someone please help me find the error in this source code for a 2-subject GPA calculator? It is giving me a force close.
Here's the source code:
"
package com.gado.e001;
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.view.View.OnClickListener;
import android.widget.Button;
import android.widget.CompoundButton;
import android.widget.CompoundButton.OnCheckedChangeListener;
import android.widget.EditText;
import android.widget.RadioButton;
import android.widget.Toast;
public class MainActivity extends Activity {
RadioButton gradea1, gradeb1, gradec1, graded1, gradea2, gradeb2, gradec2,
graded2;
EditText hours1, hours2;
Button calc;
float fgrade1, fgrade2, fhours1, fhours2;
float Result = fgrade1 + fgrade2 + fhours1 + fhours2;
public void Identify() {
gradea1 = (RadioButton) findViewById(R.id.radioButton1);
gradeb1 = (RadioButton) findViewById(R.id.radioButton2);
gradec1 = (RadioButton) findViewById(R.id.radioButton3);
graded1 = (RadioButton) findViewById(R.id.radioButton4);
gradea2 = (RadioButton) findViewById(R.id.radioButton5);
gradeb2 = (RadioButton) findViewById(R.id.radioButton6);
gradec2 = (RadioButton) findViewById(R.id.radioButton7);
graded2 = (RadioButton) findViewById(R.id.radioButton8);
hours1 = (EditText) findViewById(R.id.editText1);
hours2 = (EditText) findViewById(R.id.editText2);
calc = (Button) findViewById(R.id.button1);
}
public void getting() {
gradea1.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade1 = 4f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
gradeb1.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade1 = 3f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
gradec1.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade1 = 2f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
graded1.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade1 = 1f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
gradea2.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade2 = 4f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
gradeb2.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade2 = 3f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
gradec2.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade2 = 2f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
graded2.setOnCheckedChangeListener(new OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView,
boolean isChecked) {
if (isChecked) {
fgrade2 = 1f;
}
else{
Toast.makeText(getBaseContext(), "", Toast.LENGTH_SHORT).show();
}
}
});
fhours1 = Float.parseFloat(hours1.getText().toString());
fhours2 = Float.parseFloat(hours2.getText().toString());
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
calc.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
Identify();
getting();
Toast.makeText(getBaseContext(), "" + Result, Toast.LENGTH_LONG)
.show();
}
});
}
}
"
Please Help Me
You might be better off asking this question on Stack Overflow.
Run your app with your phone connected to your PC and Eclipse running; that way Eclipse will show you the error so you can tell us what it is.
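For what it's worth, one likely cause (assuming the layout IDs in activity_main.xml are correct): onCreate() calls calc.setOnClickListener(...) before Identify() has assigned calc, so calc is still null and the app crashes with a NullPointerException. A hedged sketch of a fix, keeping your own fields and logic:
Java:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    Identify(); // bind all views first, otherwise calc is still null below
    getting();  // register the RadioButton listeners; move the two parseFloat lines out of getting(), or empty inputs will crash here
    calc.setOnClickListener(new OnClickListener() {
        @Override
        public void onClick(View v) {
            // parse the hours and compute the result at click time, not at field initialization
            fhours1 = Float.parseFloat(hours1.getText().toString());
            fhours2 = Float.parseFloat(hours2.getText().toString());
            Result = fgrade1 + fgrade2 + fhours1 + fhours2;
            Toast.makeText(getBaseContext(), "" + Result, Toast.LENGTH_LONG).show();
        }
    });
}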

Beginner: Integration of Lite Wearable security app with Huawei Wear Engine

Introduction
In this article, I will explain how to develop a security application for a Lite Wearable device. To achieve this, we will use the Wear Engine library, which provides communication between a HarmonyOS wearable and an Android smartphone.
Requirements
1) DevEco IDE.
2) Lite wearable watch.
3) Android Smartphone.
4) Huawei developer account.
Integration process
The integration process contains two parts. Android smartphone side and Wear app side.
Android side
Step 1: Create the android project on Android studio.
Step 2: Generate Android signature files.
Step 3: Generate the SHA-256 fingerprint from the generated keystore. Please refer to this link: https://developer.huawei.com/consumer/en/codelab/HMSPreparation/index.html#0
Step 4: Navigate to Huawei developer console. Click on Huawei ID (https://developer.huawei.com/consumer/en/console#/productlist/32).
Step 5: Create new product. Add the SHA-256 as the first signed certificate.
Step 6: Click Wear Engine under App services.
Step 7: Click Apply for Wear Engine, agree to the agreement, and the screen for the data permission application is displayed.
Wait for the approval.
Step 8: Open the project-level build.gradle file of your Android project.
Step 9: Navigate to buildscript > repositories and add the Maven repository configuration.
Java:
maven {url 'https://developer.huawei.com/repo/'}
Step 10: Navigate to allprojects > repositories and add the Maven repository address
Java:
maven {url 'https://developer.huawei.com/repo/'}
Step 11: Add the Wear Engine SDK to the app-level build.gradle.
Java:
implementation 'com.huawei.hms:wearengine:{version}'
Step 12: Add the proguard rules in proguard-rules.pro
Java:
-keepattributes *Annotation*
-keepattributes Signature
-keepattributes InnerClasses
-keepattributes EnclosingMethod
-keep class com.huawei.wearengine.**{*;}
Step 13: Add the following code snippet to MainActivity.java to search for available devices.
Java:
private void searchAvailableDevices() {
DeviceClient deviceClient = HiWear.getDeviceClient(this);
deviceClient.hasAvailableDevices().addOnSuccessListener(new OnSuccessListener<Boolean>() {
@Override
public void onSuccess(Boolean result) {
checkPermissionGranted();
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
Step 14: If a device is available, check whether the device permission has been granted.
Java:
private void checkPermissionGranted() {
AuthClient authClient = HiWear.getAuthClient(this);
authClient.checkPermission(Permission.DEVICE_MANAGER).addOnSuccessListener(new OnSuccessListener<Boolean>() {
@Override
public void onSuccess(Boolean aBoolean) {
if (!aBoolean) {
askPermission();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
Step 15: If the permission is not granted, request it.
Java:
private void askPermission() {
AuthClient authClient = HiWear.getAuthClient(this);
AuthCallback authCallback = new AuthCallback() {
@Override
public void onOk(Permission[] permissions) {
if (permissions.length != 0) {
checkCurrentConnectedDevice();
}
}
@Override
public void onCancel() {
}
};
authClient.requestPermission(authCallback, Permission.DEVICE_MANAGER)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void successVoid) {
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
Step 16: Get the connected device object for the communication.
Java:
private void checkCurrentConnectedDevice() {
final List<Device> deviceList = new ArrayList<>();
DeviceClient deviceClient = HiWear.getDeviceClient(this);
deviceClient.getBondedDevices()
.addOnSuccessListener(new OnSuccessListener<List<Device>>() {
@Override
public void onSuccess(List<Device> devices) {
deviceList.addAll(devices);
if (!deviceList.isEmpty()) {
for (Device device : deviceList) {
if (device.isConnected()) {
connectedDevice = device;
}
}
}
if (connectedDevice != null) {
checkAppInstalledInWatch(connectedDevice);
}
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
//Process logic when the device list fails to be obtained
}
});
}
Step 17: Call the ping function to check whether the wearable app is installed on the watch.
Java:
private void checkAppInstalledInWatch(final Device connectedDevice) {
P2pClient p2pClient = HiWear.getP2pClient(this);
String peerPkgName = "com.wearengine.huawei";
p2pClient.setPeerPkgName(peerPkgName);
if (connectedDevice != null && connectedDevice.isConnected()) {
p2pClient.ping(connectedDevice, new PingCallback() {
@Override
public void onPingResult(int errCode) {
}
}).addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void successVoid) {
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
}
Step 18: If the ping is successful, your watch app will launch automatically.
Step 19: Add the password check to the button's click listener.
Java:
@Override
public void onClick(View view) {
if (view.getId() == R.id.btLogin) {
if (etPin.getText().toString().equals("1234")) {
sendMessageToWatch("Success", connectedDevice);
} else {
sendMessageToWatch("Wrong password", connectedDevice);
}
}
}
Step 20: Send message to the watch.
Java:
private void sendMessageToWatch(String message, Device connectedDevice) {
P2pClient p2pClient = HiWear.getP2pClient(this);
String peerPkgName = "com.wearengine.huawei";
p2pClient.setPeerPkgName(peerPkgName);
String peerFingerPrint = "com.wearengine.huawei_BALgPWTbV2CKZ9swMfG1n9ReRlQFqiZrEGWyVQp/6UIgCUsgXn7PojLPA4iatPktya1pLAORwvHgHpv/Z5DfMK8=";
p2pClient.setPeerFingerPrint(peerFingerPrint);
Message.Builder builder = new Message.Builder();
builder.setPayload(message.getBytes(StandardCharsets.UTF_8));
Message sendMessage = builder.build();
SendCallback sendCallback = new SendCallback() {
@Override
public void onSendResult(int resultCode) {
}
@Override
public void onSendProgress(long progress) {
}
};
if (connectedDevice != null && connectedDevice.isConnected() && sendMessage != null && sendCallback != null) {
p2pClient.send(connectedDevice, sendMessage, sendCallback)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
//Related processing logic for your app after the send command runs
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
//Related processing logic for your app after the send command fails to run
}
});
}
}
Step 21: Generate the p2p fingerprint. Please follow this article - https://forums.developer.huawei.com/forumPortal/en/topic/0202466737940270075
The final code for your Android application is given below.
Java:
package com.phone.wearengine;
import android.os.Bundle;
import android.view.View;
import android.widget.EditText;
import androidx.appcompat.app.AppCompatActivity;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.wearengine.HiWear;
import com.huawei.wearengine.auth.AuthCallback;
import com.huawei.wearengine.auth.AuthClient;
import com.huawei.wearengine.auth.Permission;
import com.huawei.wearengine.device.Device;
import com.huawei.wearengine.device.DeviceClient;
import com.huawei.wearengine.p2p.Message;
import com.huawei.wearengine.p2p.P2pClient;
import com.huawei.wearengine.p2p.PingCallback;
import com.huawei.wearengine.p2p.SendCallback;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
public class MainActivity extends AppCompatActivity implements View.OnClickListener {
private Device connectedDevice;
private EditText etPin;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initUi();
searchAvailableDevices();
checkCurrentConnectedDevice();
}
private void initUi() {
etPin = findViewById(R.id.etPin);
findViewById(R.id.btLogin).setOnClickListener(this);
}
private void searchAvailableDevices() {
DeviceClient deviceClient = HiWear.getDeviceClient(this);
deviceClient.hasAvailableDevices().addOnSuccessListener(new OnSuccessListener<Boolean>() {
@Override
public void onSuccess(Boolean result) {
checkPermissionGranted();
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
private void checkPermissionGranted() {
AuthClient authClient = HiWear.getAuthClient(this);
authClient.checkPermission(Permission.DEVICE_MANAGER).addOnSuccessListener(new OnSuccessListener<Boolean>() {
@Override
public void onSuccess(Boolean aBoolean) {
if (!aBoolean) {
askPermission();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
private void askPermission() {
AuthClient authClient = HiWear.getAuthClient(this);
AuthCallback authCallback = new AuthCallback() {
@Override
public void onOk(Permission[] permissions) {
if (permissions.length != 0) {
checkCurrentConnectedDevice();
}
}
@Override
public void onCancel() {
}
};
authClient.requestPermission(authCallback, Permission.DEVICE_MANAGER)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void successVoid) {
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
private void checkCurrentConnectedDevice() {
final List<Device> deviceList = new ArrayList<>();
DeviceClient deviceClient = HiWear.getDeviceClient(this);
deviceClient.getBondedDevices()
.addOnSuccessListener(new OnSuccessListener<List<Device>>() {
@Override
public void onSuccess(List<Device> devices) {
deviceList.addAll(devices);
if (!deviceList.isEmpty()) {
for (Device device : deviceList) {
if (device.isConnected()) {
connectedDevice = device;
}
}
}
if (connectedDevice != null) {
checkAppInstalledInWatch(connectedDevice);
}
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
//Process logic when the device list fails to be obtained
}
});
}
private void checkAppInstalledInWatch(final Device connectedDevice) {
P2pClient p2pClient = HiWear.getP2pClient(this);
String peerPkgName = "com.wearengine.huawei";
p2pClient.setPeerPkgName(peerPkgName);
if (connectedDevice != null && connectedDevice.isConnected()) {
p2pClient.ping(connectedDevice, new PingCallback() {
@Override
public void onPingResult(int errCode) {
}
}).addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void successVoid) {
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
}
});
}
}
private void sendMessageToWatch(String message, Device connectedDevice) {
P2pClient p2pClient = HiWear.getP2pClient(this);
String peerPkgName = "com.wearengine.huawei";
p2pClient.setPeerPkgName(peerPkgName);
String peerFingerPrint = "com.wearengine.huawei_BALgPWTbV2CKZ9swMfG1n9ReRlQFqiZrEG*******";
p2pClient.setPeerFingerPrint(peerFingerPrint);
Message.Builder builder = new Message.Builder();
builder.setPayload(message.getBytes(StandardCharsets.UTF_8));
Message sendMessage = builder.build();
SendCallback sendCallback = new SendCallback() {
@Override
public void onSendResult(int resultCode) {
}
@Override
public void onSendProgress(long progress) {
}
};
if (connectedDevice != null && connectedDevice.isConnected() && sendMessage != null && sendCallback != null) {
p2pClient.send(connectedDevice, sendMessage, sendCallback)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
//Related processing logic for your app after the send command runs
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
//Related processing logic for your app after the send command fails to run
}
});
}
}
@Override
public void onClick(View view) {
if (view.getId() == R.id.btLogin) {
if (etPin.getText().toString().equals("1234")) {
sendMessageToWatch("Success", connectedDevice);
} else {
sendMessageToWatch("Wrong password", connectedDevice);
}
}
}
}
Watch side
Step 1: Create a Lite Wearable project on DevEco studio.
Step 2: Generate the required certificates to run the application. Please refer to this article https://forums.developer.huawei.com/forumPortal/en/topic/0202465210302250053
Step 3: Download the Wear Engine library and add it to the pages folder of the HarmonyOS project. https://developer.huawei.com/consum...ity-Library/litewearable-sdk-0000001053562589
Step 4: Design the UI.
Index.hml
HTML:
<div class="container">
<text class="title">
{{title}}
</text>
</div>
Index.css
CSS:
.container {
display: flex;
justify-content: center;
align-items: center;
left: 0px;
top: 0px;
width: 454px;
height: 454px;
background-color: cadetblue;
}
.title {
font-size: 90px;
text-align: center;
width: 300px;
height: 150px;
}
Step 5: Open the index.js file and import the Wear Engine SDK.
JavaScript:
import {P2pClient, Message, Builder} from '../wearengine';
Step 6: Add the receiver code snippet on index.js.
JavaScript:
onInit() {
var _that = this;
_that.setBrightnessKeepScreenOn();
//Step 1: Obtain the point-to-point communication object
var p2pClient = new P2pClient();
var peerPkgName = "com.phone.wearengine";
var peerFinger = "79C3B257672C32974283E712EF7FEC******";
//Step 2: Set your app package name that needs communications on the phone
p2pClient.setPeerPkgName(peerPkgName);
//Step 3: Set the fingerprint information of the app on the phone. (This API is unavailable currently. In this version, you need to set fingerprint mode in the config.json file in Step 5.)
p2pClient.setPeerFingerPrint(peerFinger);
//Step 4: Receive short messages or files from your app on the phone
//Define the receiver
var flash = this;
var receiver = {
onSuccess: function () {
console.info("Recieved message");
//Process the callback function returned when messages or files fail to be received from the phone during registration.
flash.receiveMessageOK = "Succeeded in receiving the message";
},
onFailure: function () {
console.info("Failed message");
//Registering a listener for the callback method of failing to receive messages or files from phone
flash.receiveMessageOK = "Failed to receive the message";
},
onReceiveMessage: function (data) {
if (data && data.isFileType) {
//Process the file sent by your app on the phone
flash.receiveMessgeOK = "file:" + data.name;
} else {
console.info("Got message - " + data);
//Process the message sent from your app on the phone.
flash.receiveMessageOK = "message:" + data;
_that.title = "" + data;
if (data != "Success") {
vibrator.vibrate({
mode: "long"
})
}
}
},
}
p2pClient.registerReceiver(receiver);
},
The peer fingerprint on the watch side is the SHA-256 fingerprint of the Android application's signing certificate (make sure you have removed the colons).
Step 7: Unregister the receiver in the wearable app's onDestroy.
JavaScript:
onDestroy() {
this.p2pClient.unregisterReceiver();
}
Step 8: Add metadata inside of module object of config.json.
JSON:
"metaData": {
"customizeData": [
{
"name": "supportLists",
"value": "com.phone.wearengine:79C3B257672C32974283E756535C86728BE4DF51E*******",
"extra": ""
}
]
}
The final code for your wearable application is given below.
JavaScript:
import {P2pClient, Message, Builder} from '../wearengine';
import brightness from '@system.brightness';
import vibrator from '@system.vibrator';
export default {
data: {
title: 'Enter pin'
},
onInit() {
var _that = this;
_that.setBrightnessKeepScreenOn();
//Step 1: Obtain the point-to-point communication object
var p2pClient = new P2pClient();
var peerPkgName = "com.phone.wearengine";
var peerFinger = "79C3B257672C32974283E756535C*****************";
//Step 2: Set your app package name that needs communications on the phone
p2pClient.setPeerPkgName(peerPkgName);
//Step 3: Set the fingerprint information of the app on the phone. (This API is unavailable currently. In this version, you need to set fingerprint mode in the config.json file in Step 5.)
p2pClient.setPeerFingerPrint(peerFinger);
//Step 4: Receive short messages or files from your app on the phone
//Define the receiver
var flash = this;
var receiver = {
onSuccess: function () {
console.info("Recieved message");
//Process the callback function returned when messages or files fail to be received from the phone during registration.
flash.receiveMessageOK = "Succeeded in receiving the message";
},
onFailure: function () {
console.info("Failed message");
//Registering a listener for the callback method of failing to receive messages or files from phone
flash.receiveMessageOK = "Failed to receive the message";
},
onReceiveMessage: function (data) {
if (data && data.isFileType) {
//Process the file sent by your app on the phone
flash.receiveMessgeOK = "file:" + data.name;
} else {
console.info("Got message - " + data);
//Process the message sent from your app on the phone.
flash.receiveMessageOK = "message:" + data;
_that.title = "" + data;
if (data != "Success") {
vibrator.vibrate({
mode: "long"
})
}
}
},
}
p2pClient.registerReceiver(receiver);
},
setBrightnessKeepScreenOn: function () {
brightness.setKeepScreenOn({
keepScreenOn: true,
success: function () {
console.log("handling set keep screen on success")
},
fail: function (data, code) {
console.log("handling set keep screen on fail, code:" + code);
}
});
},
onDestroy() {
// FeatureAbility.unsubscribeMsg();
this.p2pClient.unregisterReceiver();
}
}
Results
Tips & Tricks
Make sure you have generated the SHA-256 fingerprint from the proper keystore.
Follow the P2P generation steps properly.
Conclusion
In this article, we learned how to develop a security app using Wear Engine, which enables communication between the Android application and the HarmonyOS wearable application.
Reference
Harmony Official document - https://developer.harmonyos.com/en/docs/documentation/doc-guides/harmonyos-overview-0000000000011903
Wear Engine documentation - https://developer.huawei.com/consumer/en/doc/development/connectivity-Guides/service-introduction-0000001050978399
Certificate generation article - https://forums.developer.huawei.com/forumPortal/en/topic/0202465210302250053
P2P generation article - https://forums.developer.huawei.com/forumPortal/en/topic/0202466737940270075
Interesting app.
Nice article

On Device Camera Stream Text Recognition in KnowMyBoard Android App Using ML Kit, MVVM, Navigation Components

Introduction
In this article, we will learn how to integrate the Huawei ML Kit camera stream, Map Kit, and Location Kit into the Android application KnowMyBoard. Account Kit provides seamless login functionality to the app, backed by a large user base.
The text recognition service can extract text from images of receipts, business cards, and documents. This service is useful for industries such as printing, education, and logistics. You can use it to create apps that handle data entry and check tasks.
The text recognition service is able to recognize text in both static images and dynamic camera streams with a host of APIs, which you can call synchronously or asynchronously to build your text recognition-enabled apps.
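To make the API shape concrete, here is a minimal sketch of recognizing text in a static image; MLAnalyzerFactory, MLFrame, and asyncAnalyseFrame are the standard ML Kit OCR entry points, but treat this as an illustrative snippet to verify against the ML Kit documentation rather than the exact code used in KnowMyBoard.
Java:
// assumes the com.huawei.hms.mlsdk.* imports and a Bitmap named bitmap
MLTextAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer();
MLFrame frame = MLFrame.fromBitmap(bitmap);
analyzer.asyncAnalyseFrame(frame)
        .addOnSuccessListener(mlText -> {
            // full recognized text for the frame
            Log.d("OCR", mlText.getStringValue());
        })
        .addOnFailureListener(e -> Log.e("OCR", "Text recognition failed", e));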
Precautions
Development Overview
You need to install Android Studio IDE and I assume that you have prior knowledge of Android application development.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.8 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later
Integration steps
Step 1. Create a Huawei developer account and complete identity verification on the Huawei developer website; refer to Register a Huawei ID.
Step 2. Create a project in AppGallery Connect.
Step 3. Add the HMS Core SDK.
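For the text recognition part specifically, the ML Kit OCR dependency usually added in this step looks like the line below; the artifact name follows the ML Kit documentation, and the version placeholder is an assumption to be replaced with the current release.
Java:
implementation 'com.huawei.hms:ml-computer-vision-ocr:{version}'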
Let’s start coding
navigation_graph.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<navigation xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/navigation_graph"
app:startDestination="@id/loginFragment">
<fragment
android:id="@+id/loginFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.LoginFragment"
android:label="LoginFragment"/>
<fragment
android:id="@+id/mainFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.MainFragment"
android:label="MainFragment"/>
<fragment
android:id="@+id/searchFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.SearchFragment"
android:label="fragment_search"
tools:layout="@layout/fragment_search" />
</navigation>
TextRecognitionActivity.java
public final class TextRecognitionActivity extends BaseActivity
implements OnRequestPermissionsResultCallback, View.OnClickListener {
private static final String TAG = "TextRecognitionActivity";
private LensEngine lensEngine = null;
private LensEnginePreview preview;
private GraphicOverlay graphicOverlay;
private ImageButton takePicture;
private ImageButton imageSwitch;
private RelativeLayout zoomImageLayout;
private ZoomImageView zoomImageView;
private ImageButton zoomImageClose;
CameraConfiguration cameraConfiguration = null;
private int facing = CameraConfiguration.CAMERA_FACING_BACK;
private Camera mCamera;
private boolean isLandScape;
private Bitmap bitmap;
private Bitmap bitmapCopy;
private LocalTextTransactor localTextTransactor;
private Handler mHandler = new MsgHandler(this);
private Dialog languageDialog;
private AddPictureDialog addPictureDialog;
private TextView textCN;
private TextView textEN;
private TextView textJN;
private TextView textKN;
private TextView textLN;
private TextView tv_language,tv_translated_txt;
private String textType = Constant.POSITION_CN;
private boolean isInitialization = false;
MLTextAnalyzer analyzer;
private static class MsgHandler extends Handler {
WeakReference<TextRecognitionActivity> mMainActivityWeakReference;
public MsgHandler(TextRecognitionActivity mainActivity) {
this.mMainActivityWeakReference = new WeakReference<>(mainActivity);
}
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
TextRecognitionActivity mainActivity = this.mMainActivityWeakReference.get();
if (mainActivity == null) {
return;
}
//Log.d(TextRecognitionActivity.TAG, "msg what :" + msg.what);
//Log.e("TAG", "msg what :" + msg.getTarget().getMessageName(msg));
if (msg.what == Constant.SHOW_TAKE_PHOTO_BUTTON) {
mainActivity.setVisible();
} else if (msg.what == Constant.HIDE_TAKE_PHOTO_BUTTON) {
mainActivity.setGone();
}
}
}
private void setVisible() {
if (this.takePicture.getVisibility() == View.GONE) {
this.takePicture.setVisibility(View.VISIBLE);
}
}
private void setGone() {
if (this.takePicture.getVisibility() == View.VISIBLE) {
this.takePicture.setVisibility(View.GONE);
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
this.setContentView(R.layout.activity_text_recognition);
if (savedInstanceState != null) {
this.facing = savedInstanceState.getInt(Constant.CAMERA_FACING);
}
this.tv_language = this.findViewById(R.id.tv_lang);
this.tv_translated_txt = this.findViewById(R.id.tv_translated_txt);
this.preview = this.findViewById(R.id.live_preview);
this.graphicOverlay = this.findViewById(R.id.live_overlay);
this.cameraConfiguration = new CameraConfiguration();
this.cameraConfiguration.setCameraFacing(this.facing);
this.initViews();
this.isLandScape = (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE);
this.createLensEngine();
this.setStatusBar();
}
private void initViews() {
this.takePicture = this.findViewById(R.id.takePicture);
this.takePicture.setOnClickListener(this);
this.imageSwitch = this.findViewById(R.id.text_imageSwitch);
this.imageSwitch.setOnClickListener(this);
this.zoomImageLayout = this.findViewById(R.id.zoomImageLayout);
this.zoomImageView = this.findViewById(R.id.take_picture_overlay);
this.zoomImageClose = this.findViewById(R.id.zoomImageClose);
this.zoomImageClose.setOnClickListener(this);
this.findViewById(R.id.back).setOnClickListener(this);
this.findViewById(R.id.language_setting).setOnClickListener(this);
this.createLanguageDialog();
this.createAddPictureDialog();
}
@Override
public void onClick(View view) {
if (view.getId() == R.id.takePicture) {
this.takePicture();
} else if (view.getId() == R.id.zoomImageClose) {
this.zoomImageLayout.setVisibility(View.GONE);
this.recycleBitmap();
} else if (view.getId() == R.id.text_imageSwitch) {
this.showAddPictureDialog();
} else if (view.getId() == R.id.language_setting) {
this.showLanguageDialog();
} else if (view.getId() == R.id.simple_cn) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_CN);
this.languageDialog.dismiss();
this.restartLensEngine(Constant.POSITION_CN);
} else if (view.getId() == R.id.english) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_EN);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_EN);
} else if (view.getId() == R.id.japanese) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_JA);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_JA);
} else if (view.getId() == R.id.korean) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_KO);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_KO);
} else if (view.getId() == R.id.latin) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_LA);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_LA);
} else if (view.getId() == R.id.back) {
releaseLensEngine();
this.finish();
}
}
private void restartLensEngine(String type) {
if (this.textType.equals(type)) {
return;
}
this.lensEngine.release();
this.lensEngine = null;
this.createLensEngine();
this.startLensEngine();
if (this.lensEngine == null || this.lensEngine.getCamera() == null) {
return;
}
this.mCamera = this.lensEngine.getCamera();
try {
this.mCamera.setPreviewDisplay(this.preview.getSurfaceHolder());
} catch (IOException e) {
Log.d(TextRecognitionActivity.TAG, "initViews IOException");
}
}
@Override
public void onBackPressed() {
if (this.zoomImageLayout.getVisibility() == View.VISIBLE) {
this.zoomImageLayout.setVisibility(View.GONE);
this.recycleBitmap();
} else {
super.onBackPressed();
releaseLensEngine();
}
}
private void createLanguageDialog() {
this.languageDialog = new Dialog(this, R.style.MyDialogStyle);
View view = View.inflate(this, R.layout.dialog_language_setting, null);
// Set up a custom layout
this.languageDialog.setContentView(view);
this.textCN = view.findViewById(R.id.simple_cn);
this.textCN.setOnClickListener(this);
this.textEN = view.findViewById(R.id.english);
this.textEN.setOnClickListener(this);
this.textJN = view.findViewById(R.id.japanese);
this.textJN.setOnClickListener(this);
this.textKN = view.findViewById(R.id.korean);
this.textKN.setOnClickListener(this);
this.textLN = view.findViewById(R.id.latin);
this.textLN.setOnClickListener(this);
this.languageDialog.setCanceledOnTouchOutside(true);
// Set the size of the dialog
Window dialogWindow = this.languageDialog.getWindow();
WindowManager.LayoutParams layoutParams = dialogWindow.getAttributes();
layoutParams.width = WindowManager.LayoutParams.MATCH_PARENT;
layoutParams.height = WindowManager.LayoutParams.WRAP_CONTENT;
layoutParams.gravity = Gravity.BOTTOM;
dialogWindow.setAttributes(layoutParams);
}
private void showLanguageDialog() {
this.initDialogViews();
this.languageDialog.show();
}
private void createAddPictureDialog() {
this.addPictureDialog = new AddPictureDialog(this, AddPictureDialog.TYPE_NORMAL);
final Intent intent = new Intent(TextRecognitionActivity.this, RemoteDetectionActivity.class);
intent.putExtra(Constant.MODEL_TYPE, Constant.CLOUD_TEXT_DETECTION);
this.addPictureDialog.setClickListener(new AddPictureDialog.ClickListener() {
@Override
public void takePicture() {
lensEngine.release();
isInitialization = false;
intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_TAKE_PHOTO);
TextRecognitionActivity.this.startActivity(intent);
}
@Override
public void selectImage() {
intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_SELECT_IMAGE);
TextRecognitionActivity.this.startActivity(intent);
}
@Override
public void doExtend() {
}
});
}
private void showAddPictureDialog() {
this.addPictureDialog.show();
}
private void initDialogViews() {
String position = SharedPreferencesUtil.getInstance(this).getStringValue(Constant.POSITION_KEY);
this.textType = position;
this.textCN.setSelected(false);
this.textEN.setSelected(false);
this.textJN.setSelected(false);
this.textLN.setSelected(false);
this.textKN.setSelected(false);
switch (position) {
case Constant.POSITION_CN:
this.textCN.setSelected(true);
break;
case Constant.POSITION_EN:
this.textEN.setSelected(true);
break;
case Constant.POSITION_LA:
this.textLN.setSelected(true);
break;
case Constant.POSITION_JA:
this.textJN.setSelected(true);
break;
case Constant.POSITION_KO:
this.textKN.setSelected(true);
break;
default:
}
}
@Override
protected void onSaveInstanceState(Bundle outState) {
outState.putInt(Constant.CAMERA_FACING, this.facing);
super.onSaveInstanceState(outState);
}
private void createLensEngine() {
MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
.setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
// Specify languages that can be recognized.
.setLanguage("ko")
.create();
analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting);
//analyzer = new MLTextAnalyzer.Factory(this).create();
if (this.lensEngine == null) {
this.lensEngine = new LensEngine(this, this.cameraConfiguration, this.graphicOverlay);
}
try {
this.localTextTransactor = new LocalTextTransactor(this.mHandler, this);
this.lensEngine.setMachineLearningFrameTransactor(this.localTextTransactor);
// this.lensEngine.setMachineLearningFrameTransactor((ImageTransactor) new ObjectAnalyzerTransactor());
isInitialization = true;
} catch (Exception e) {
Toast.makeText(
this,
"Can not create image transactor: " + e.getMessage(),
Toast.LENGTH_LONG)
.show();
}
}
private void startLensEngine() {
if (this.lensEngine != null) {
try {
this.preview.start(this.lensEngine, false);
} catch (IOException e) {
Log.e(TextRecognitionActivity.TAG, "Unable to start lensEngine.", e);
this.lensEngine.release();
this.lensEngine = null;
}
}
}
@Override
public void onResume() {
super.onResume();
if (!isInitialization){
createLensEngine();
}
this.startLensEngine();
}
@Override
protected void onStop() {
super.onStop();
this.preview.stop();
}
private void releaseLensEngine() {
if (this.lensEngine != null) {
this.lensEngine.release();
this.lensEngine = null;
}
recycleBitmap();
}
@Override
protected void onDestroy() {
super.onDestroy();
releaseLensEngine();
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
// Exception handling.
Log.e(TAG,"Error while releasing analyzer");
}
}
}
private void recycleBitmap() {
if (this.bitmap != null && !this.bitmap.isRecycled()) {
this.bitmap.recycle();
this.bitmap = null;
}
if (this.bitmapCopy != null && !this.bitmapCopy.isRecycled()) {
this.bitmapCopy.recycle();
this.bitmapCopy = null;
}
}
private void takePicture() {
this.zoomImageLayout.setVisibility(View.VISIBLE);
LocalDataProcessor localDataProcessor = new LocalDataProcessor();
localDataProcessor.setLandScape(this.isLandScape);
this.bitmap = BitmapUtils.getBitmap(this.localTextTransactor.getTransactingImage(), this.localTextTransactor.getTransactingMetaData());
float previewWidth = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
float previewHeight = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
if (this.isLandScape) {
previewWidth = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
previewHeight = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
}
this.bitmapCopy = Bitmap.createBitmap(this.bitmap).copy(Bitmap.Config.ARGB_8888, true);
Canvas canvas = new Canvas(this.bitmapCopy);
float min = Math.min(previewWidth, previewHeight);
float max = Math.max(previewWidth, previewHeight);
if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, min, max);
} else {
localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, max, min);
}
localDataProcessor.drawHmsMLVisionText(canvas, this.localTextTransactor.getLastResults().getBlocks());
this.zoomImageView.setImageBitmap(this.bitmapCopy);
// Create an MLFrame object using the bitmap, which is the image data in bitmap format.
MLFrame frame = MLFrame.fromBitmap(bitmap);
Task<MLText> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<MLText>() {
@Override
public void onSuccess(MLText text) {
String detectText = text.getStringValue();
// Processing for successful recognition.
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for recognition failure.
Log.e("TAG"," Text : Processing logic for recognition failure");
}
});
}
}
Result
Please refer to the forum page to see the result.
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the required dependencies are added.
Make sure that the ML service is enabled in AGC.
Enable data binding in the build.gradle file.
Make sure the bottom navigation IDs are the same as the fragment IDs in the navigation graph.
Make sure that the API key is set before calling the service.
Make sure that you have added module-text from the link below.
In module-text's Gradle file, change the plugin from application to library.
Conclusion
In this article, we have learned how to integrate the Huawei ML Kit camera stream, with which you can extract text from the on-device camera stream in the Android application KnowMyBoard. You can check the desired result in the result section. You can also go through the previous article, part 4, here. I hope Huawei ML Kit's capabilities are helpful to you as well; like this sample, you can make use of them as per your requirements.
Thank you so much for reading. I hope this article helps you to understand the integration of Huawei ML Kit in the Android application KnowMyBoard.
Reference
Huawei ML Kit – Training video
ML Text Recognition
Module-text
Check out in the forum

Intermediate: Integration of Huawei ML Kit in KnowMyBoard App using Kotlin Part 2

Introduction
In this article, we will learn how to integrate Huawei ML Kit and Location Kit into the Android application KnowMyBoard.
If you are new to this series follow the below article.
Intermediate: Integration of Huawei Account Kit and Analytics Kit in Android App KnowMyBoard Part -1
ML Kit enables your app to easily leverage Huawei's long-term proven expertise in machine learning to support diverse artificial intelligence (AI) applications throughout a wide range of industries.
ML Kit provides various services. In this application, we will be integrating its text-related services, such as text recognition, text detection, and text translation, which help achieve the goal of the application.
The Location Kit SDK for Android offers location-related APIs for Android apps. These APIs mainly cover six functions: fused location, activity identification, geofence, high-precision location, indoor location, and geocoding. It is applicable to mobile phones and Huawei tablets. We are using Location Kit to get the user's location, as sketched below.
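As a minimal sketch, assuming the required location permissions have already been granted at runtime, a one-shot fused location request with the HMS FusedLocationProviderClient looks like this; the full RequestLocationData helper shown later adds the settings check and permission handling.
Kotlin:
import android.app.Activity
import android.os.Looper
import android.util.Log
import com.huawei.hms.location.LocationCallback
import com.huawei.hms.location.LocationRequest
import com.huawei.hms.location.LocationResult
import com.huawei.hms.location.LocationServices

// Illustrative helper: request a single high-accuracy location fix.
fun requestSingleLocation(activity: Activity) {
    val client = LocationServices.getFusedLocationProviderClient(activity)
    val request = LocationRequest().apply {
        priority = LocationRequest.PRIORITY_HIGH_ACCURACY
        numUpdates = 1 // one update is enough for a one-shot fix
    }
    val callback = object : LocationCallback() {
        override fun onLocationResult(locationResult: LocationResult) {
            val hwLocation = locationResult.lastHWLocation
            Log.d("TAG", "Lat=${hwLocation.latitude}, Lng=${hwLocation.longitude}")
        }
    }
    // Deliver location callbacks on the main looper.
    client.requestLocationUpdates(request, callback, Looper.getMainLooper())
}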
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.8 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later
Integration steps
Step 1. Create a Huawei developer account and complete identity verification on the Huawei developer website. Refer to Register a Huawei ID.
Step 2. Create a project in AppGallery Connect.
Step 3. Add the HMS Core SDK.
Let's start coding
MainActivity.kt
Kotlin:
package com.huawei.hms.knowmyboard.dtse.activity.view
import com.huawei.hms.knowmyboard.dtse.activity.app.MyApplication.Companion.activity
import androidx.appcompat.app.AppCompatActivity
import com.huawei.hms.knowmyboard.dtse.activity.viewmodel.LoginViewModel
import com.huawei.hms.mlsdk.text.MLTextAnalyzer
import android.graphics.Bitmap
import com.huawei.hms.mlsdk.langdetect.local.MLLocalLangDetector
import com.huawei.hms.mlsdk.translate.local.MLLocalTranslator
import android.app.ProgressDialog
import android.os.Bundle
import androidx.lifecycle.ViewModelProvider
import android.content.Intent
import com.huawei.hms.support.account.AccountAuthManager
import com.google.gson.Gson
import com.huawei.hms.common.ApiException
import android.provider.MediaStore
import com.huawei.hms.mlsdk.common.MLFrame
import com.huawei.hms.mlsdk.text.MLText
import com.huawei.hmf.tasks.OnSuccessListener
import com.huawei.hms.mlsdk.langdetect.MLLangDetectorFactory
import com.huawei.hms.mlsdk.langdetect.local.MLLocalLangDetectorSetting
import com.huawei.hmf.tasks.OnFailureListener
import android.content.DialogInterface
import android.net.Uri
import android.util.Log
import androidx.appcompat.app.AlertDialog
import com.huawei.hms.knowmyboard.dtse.R
import com.huawei.hms.knowmyboard.dtse.activity.model.UserData
import com.huawei.hms.mlsdk.common.MLApplication
import com.huawei.hms.mlsdk.translate.local.MLLocalTranslateSetting
import com.huawei.hms.mlsdk.translate.MLTranslatorFactory
import com.huawei.hms.mlsdk.model.download.MLModelDownloadStrategy
import com.huawei.hms.mlsdk.model.download.MLModelDownloadListener
import com.huawei.hms.mlsdk.text.MLLocalTextSetting
import com.huawei.hms.mlsdk.MLAnalyzerFactory
import java.io.IOException
import java.lang.Exception
import java.util.ArrayList
class MainActivity() : AppCompatActivity() {
var loginViewModel: LoginViewModel? = null
private var mTextAnalyzer: MLTextAnalyzer? = null
var imagePath: Uri? = null
var bitmap: Bitmap? = null
var result = ArrayList<String>()
var myLocalLangDetector: MLLocalLangDetector? = null
var myLocalTranslator: MLLocalTranslator? = null
var textRecognized: String? = null
var progressDialog: ProgressDialog? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
loginViewModel = ViewModelProvider(this).get(LoginViewModel::class.java)
activity = this
progressDialog = ProgressDialog(this)
progressDialog!!.setCancelable(false)
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
// Process the authorization result to obtain the authorization code from AuthAccount.
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == 8888) {
val authAccountTask = AccountAuthManager.parseAuthResultFromIntent(data)
if (authAccountTask.isSuccessful) {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
val authAccount = authAccountTask.result
val userData = UserData()
userData.accessToken = authAccount.accessToken
userData.countryCode = authAccount.countryCode
userData.displayName = authAccount.displayName
userData.email = authAccount.email
userData.familyName = authAccount.familyName
userData.givenName = authAccount.givenName
userData.idToken = authAccount.idToken
userData.openId = authAccount.openId
userData.uid = authAccount.uid
userData.photoUriString = authAccount.avatarUri.toString()
userData.unionId = authAccount.unionId
val gson = Gson()
Log.e("TAG", "sign in success : " + gson.toJson(authAccount))
loginViewModel = ViewModelProvider(this@MainActivity).get(
LoginViewModel::class.java
)
loginViewModel!!.sendData(authAccount.displayName)
progressDialog!!.dismiss()
} else {
// The sign-in failed.
Log.e(
"TAG",
"sign in failed:" + (authAccountTask.exception as ApiException).statusCode
)
progressDialog!!.dismiss()
}
}
if ((requestCode == 2323) && (resultCode == RESULT_OK) && (data != null)) {
progressDialog!!.setMessage("Initializing text detection..")
progressDialog!!.show()
imagePath = data.data
try {
bitmap = MediaStore.Images.Media.getBitmap(this.contentResolver, imagePath)
} catch (e: IOException) {
e.printStackTrace()
Log.e("TAG", " BITMAP ERROR")
}
Log.d("TAG", "Path " + imagePath!!.path)
try {
val selectedBitmap = MediaStore.Images.Media.getBitmap(contentResolver, imagePath)
asyncAnalyzeText(selectedBitmap)
} catch (e: IOException) {
e.printStackTrace()
progressDialog!!.dismiss()
}
}
}
private fun asyncAnalyzeText(bitmap: Bitmap) {
if (mTextAnalyzer == null) {
createMLTextAnalyzer()
}
val frame = MLFrame.fromBitmap(bitmap)
val task = mTextAnalyzer!!.asyncAnalyseFrame(frame)
task.addOnSuccessListener(object : OnSuccessListener<MLText> {
override fun onSuccess(text: MLText) {
progressDialog!!.setMessage("Initializing language detection..")
Log.d("TAG", "#==>" + text.stringValue)
textRecognized = text.stringValue.trim { it <= ' ' }
if (!textRecognized!!.isEmpty()) {
// Create a local language detector.
val factory = MLLangDetectorFactory.getInstance()
val setting =
MLLocalLangDetectorSetting.Factory() // Set the minimum confidence threshold for language detection.
.setTrustedThreshold(0.01f)
.create()
myLocalLangDetector = factory.getLocalLangDetector(setting)
val firstBestDetectTask = myLocalLangDetector?.firstBestDetect(textRecognized)
firstBestDetectTask?.addOnSuccessListener(OnSuccessListener { languageDetected ->
progressDialog!!.setMessage("Initializing text translation..")
// Processing logic for detection success.
Log.d("TAG", "Lang detect :$languageDetected")
textTranslate(languageDetected, textRecognized!!, bitmap)
})?.addOnFailureListener(object : OnFailureListener {
override fun onFailure(e: Exception) {
// Processing logic for detection failure.
Log.e("TAG", "Lang detect error:" + e.message)
}
})
} else {
progressDialog!!.dismiss()
showErrorDialog("Failed to recognize text.")
}
}
}).addOnFailureListener(object : OnFailureListener {
override fun onFailure(e: Exception) {
Log.e("TAG", "#==>" + e.message)
}
})
}
private fun showErrorDialog(msg: String) {
val alertDialog = AlertDialog.Builder(this).create()
alertDialog.setTitle("Error")
alertDialog.setMessage(msg)
alertDialog.setButton(
AlertDialog.BUTTON_POSITIVE,
"OK",
object : DialogInterface.OnClickListener {
override fun onClick(dialog: DialogInterface, which: Int) {
dialog.dismiss()
}
})
alertDialog.show()
}
private fun textTranslate(languageDetected: String, textRecognized: String, uri: Bitmap) {
MLApplication.initialize(application)
MLApplication.getInstance().apiKey =
"DAEDAF48ZIMI4ettQdTfCKlXgaln/E+TO/PrsX+LpP2BubkmED/iC0iVEps5vfx1ol27rHvuwiq64YphpPkGYWbf9La8XjnvC9qhwQ=="
// Create an offline translator.
val setting =
MLLocalTranslateSetting.Factory() // Set the source language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
.setSourceLangCode(languageDetected) // Set the target language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
.setTargetLangCode("en")
.create()
myLocalTranslator = MLTranslatorFactory.getInstance().getLocalTranslator(setting)
// Set the model download policy.
val downloadStrategy = MLModelDownloadStrategy.Factory()
.needWifi() // It is recommended that you download the package in a Wi-Fi environment.
.create()
// Create a download progress listener.
val modelDownloadListener: MLModelDownloadListener = object : MLModelDownloadListener {
override fun onProcess(alreadyDownLength: Long, totalLength: Long) {
runOnUiThread(object : Runnable {
override fun run() {
// Display the download progress or perform other operations.
}
})
}
}
myLocalTranslator?.preparedModel(downloadStrategy, modelDownloadListener)
?.addOnSuccessListener(object : OnSuccessListener<Void?> {
override fun onSuccess(aVoid: Void?) {
// Called when the model package is successfully downloaded.
// input is a string of less than 5000 characters.
val task = myLocalTranslator?.asyncTranslate(textRecognized)
// Before translation, ensure that the models have been successfully downloaded.
task?.addOnSuccessListener(object : OnSuccessListener<String> {
override fun onSuccess(translated: String) {
// Processing logic for detection success.
result.clear()
result.add(languageDetected.trim { it <= ' ' })
result.add(textRecognized.trim { it <= ' ' })
result.add(translated.trim { it <= ' ' })
loginViewModel!!.setImage(uri)
loginViewModel!!.setTextRecognized(result)
progressDialog!!.dismiss()
}
})?.addOnFailureListener(object : OnFailureListener {
override fun onFailure(e: Exception) {
// Processing logic for detection failure.
progressDialog!!.dismiss()
}
})
}
})?.addOnFailureListener(object : OnFailureListener {
override fun onFailure(e: Exception) {
// Called when the model package fails to be downloaded.
progressDialog!!.dismiss()
}
})
}
private fun createMLTextAnalyzer() {
val setting = MLLocalTextSetting.Factory()
.setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
.create()
mTextAnalyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting)
}
override fun onStop() {
if (myLocalLangDetector != null) {
myLocalLangDetector!!.stop()
}
if (myLocalTranslator != null) {
myLocalTranslator!!.stop()
}
if (progressDialog != null) {
progressDialog!!.dismiss()
}
super.onStop()
}
}
LoginViewModel.kt
Kotlin:
package com.huawei.hms.knowmyboard.dtse.activity.viewmodel
import android.app.Application
import com.huawei.hms.knowmyboard.dtse.activity.app.MyApplication.Companion.activity
import androidx.lifecycle.AndroidViewModel
import com.huawei.hms.support.account.service.AccountAuthService
import androidx.lifecycle.MutableLiveData
import android.graphics.Bitmap
import android.util.Log
import android.view.View
import com.huawei.hms.location.LocationResult
import androidx.lifecycle.LiveData
import android.widget.Toast
import com.huawei.hms.support.account.request.AccountAuthParams
import com.huawei.hms.support.account.request.AccountAuthParamsHelper
import com.huawei.hms.support.account.AccountAuthManager
import com.huawei.hms.common.ApiException
import java.util.ArrayList
class LoginViewModel(application: Application) : AndroidViewModel(application) {
var service: AccountAuthService? = null
private val message = MutableLiveData<String>()
val textRecongnized = MutableLiveData<ArrayList<String>>()
val imagePath = MutableLiveData<Bitmap>()
val locationResult = MutableLiveData<LocationResult>()
fun sendData(msg: String) {
message.value = msg
}
fun getMessage(): LiveData<String> {
return message
}
fun setImage(imagePath: Bitmap) {
this.imagePath.value = imagePath
}
fun setLocationResult(locationResult: LocationResult) {
this.locationResult.value = locationResult
}
fun setTextRecognized(textRecognized: ArrayList<String>) {
this.textRecongnized.value = textRecognized
}
fun logoutHuaweiID() {
if (service != null) {
service!!.signOut()
sendData("KnowMyBoard")
Toast.makeText(getApplication(), "You are logged out from Huawei ID", Toast.LENGTH_LONG)
.show()
}
}
fun loginClicked(view: View?) {
val authParams =
AccountAuthParamsHelper(AccountAuthParams.DEFAULT_AUTH_REQUEST_PARAM).setAuthorizationCode()
.createParams()
service = AccountAuthManager.getService(activity, authParams)
activity!!.startActivityForResult(service?.signInIntent, 8888)
}
fun cancelAuthorization() {
if (service != null) {
// service indicates the AccountAuthService instance generated using the getService method during the sign-in authorization.
service!!.cancelAuthorization().addOnCompleteListener { task ->
if (task.isSuccessful) {
// Processing after a successful authorization cancellation.
Log.i("TAG", "onSuccess: ")
sendData("KnowMyBoard")
Toast.makeText(getApplication(), "Cancelled authorization", Toast.LENGTH_LONG)
.show()
} else {
// Handle the exception.
val exception = task.exception
if (exception is ApiException) {
val statusCode = exception.statusCode
Log.i("TAG", "onFailure: $statusCode")
Toast.makeText(
getApplication(),
"Failed to cancel authorization",
Toast.LENGTH_LONG
).show()
}
}
}
} else {
Toast.makeText(getApplication(), "Login required", Toast.LENGTH_LONG).show()
}
}
fun onClickScan() {
Log.d("TAG", "...Scan...")
}
}
RequestLocationData.kt
Kotlin:
package com.huawei.hms.knowmyboard.dtse.activity.util
import android.Manifest
import androidx.fragment.app.FragmentActivity
import com.huawei.hms.knowmyboard.dtse.activity.viewmodel.LoginViewModel
import android.app.Activity
import android.content.Context
import com.huawei.hmf.tasks.OnSuccessListener
import com.huawei.hms.knowmyboard.dtse.activity.util.RequestLocationData
import com.huawei.hmf.tasks.OnFailureListener
import android.os.Build
import androidx.core.app.ActivityCompat
import android.content.pm.PackageManager
import com.google.gson.Gson
import android.os.Looper
import android.location.Geocoder
import android.util.Log
import com.huawei.hms.location.*
import java.io.IOException
import java.lang.StringBuilder
import java.util.*
class RequestLocationData(
context: Context?,
activity: FragmentActivity?,
loginViewModel: LoginViewModel?
) {
var settingsClient: SettingsClient? = null
private var isLocationSettingSuccess = 0
private var myLocationRequest: LocationRequest? = null
// Define a fusedLocationProviderClient object.
private var fusedLocationProviderClient: FusedLocationProviderClient? = null
var myLocationCallback: LocationCallback? = null
var context: Context? = null
var activity: Activity? = null
var locationResult: LocationResult? = null
var loginViewModel: LoginViewModel? = null
fun initFusionLocationProviderClint() {
// Instantiate the fusedLocationProviderClient object.
fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(activity)
settingsClient = LocationServices.getSettingsClient(activity)
}
fun checkDeviceLocationSettings() {
val builder = LocationSettingsRequest.Builder()
myLocationRequest = LocationRequest()
builder.addLocationRequest(myLocationRequest)
val locationSettingsRequest = builder.build()
// Check the device location settings.
settingsClient!!.checkLocationSettings(locationSettingsRequest) // Define the listener for success in calling the API for checking device location settings.
.addOnSuccessListener { locationSettingsResponse: LocationSettingsResponse ->
val locationSettingsStates = locationSettingsResponse.locationSettingsStates
val stringBuilder = StringBuilder()
// Check whether the location function is enabled.
stringBuilder.append(",\nisLocationUsable=")
.append(locationSettingsStates.isLocationUsable)
// Check whether HMS Core (APK) is available.
stringBuilder.append(",\nisHMSLocationUsable=")
.append(locationSettingsStates.isHMSLocationUsable)
Log.i(TAG, "checkLocationSetting onComplete:$stringBuilder")
// Set the location type.
myLocationRequest!!.priority = LocationRequest.PRIORITY_HIGH_ACCURACY
// Set the number of location updates to 1.
myLocationRequest!!.numUpdates = 1
isLocationSettingSuccess = 1
} // Define callback for failure in checking the device location settings.
.addOnFailureListener { e -> Log.i(TAG, "checkLocationSetting onFailure:" + e.message) }
}
fun checkPermission() {
// Dynamically apply for required permissions if the API level is 28 or lower.
if (Build.VERSION.SDK_INT <= Build.VERSION_CODES.P) {
Log.i(TAG, "android sdk <= 28 Q")
if (ActivityCompat.checkSelfPermission(
context!!,
Manifest.permission.ACCESS_FINE_LOCATION
) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(
context!!,
Manifest.permission.ACCESS_COARSE_LOCATION
) != PackageManager.PERMISSION_GRANTED
) {
val strings = arrayOf(
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.MANAGE_MEDIA,
Manifest.permission.MEDIA_CONTENT_CONTROL,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.ACCESS_COARSE_LOCATION
)
ActivityCompat.requestPermissions(activity!!, strings, 1)
}
} else {
// Dynamically apply for the android.permission.ACCESS_BACKGROUND_LOCATION permission in addition to the preceding permissions if the API level is higher than 28.
if (ActivityCompat.checkSelfPermission(
activity!!,
Manifest.permission.ACCESS_FINE_LOCATION
) != PackageManager.PERMISSION_GRANTED && ActivityCompat.checkSelfPermission(
context!!,
Manifest.permission.ACCESS_COARSE_LOCATION
) != PackageManager.PERMISSION_GRANTED && ActivityCompat.checkSelfPermission(
context!!,
"android.permission.ACCESS_BACKGROUND_LOCATION"
) != PackageManager.PERMISSION_GRANTED
) {
val strings = arrayOf(
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.MEDIA_CONTENT_CONTROL,
Manifest.permission.MANAGE_MEDIA,
"android.permission.ACCESS_BACKGROUND_LOCATION"
)
ActivityCompat.requestPermissions(activity!!, strings, 2)
}
}
}
fun refreshLocation(): LocationResult? {
Log.d(TAG, "Refreshing location")
if (isLocationSettingSuccess == 1) {
myLocationCallback = object : LocationCallback() {
override fun onLocationResult(locationResult: LocationResult) {
if (locationResult != null) {
val gson = Gson()
Log.d(
TAG,
" Location data :" + locationResult.lastLocation.latitude + " : " + locationResult.lastLocation.longitude
)
Log.d(TAG, " Location data :" + gson.toJson(locationResult.lastHWLocation))
Log.d(TAG, " Location data :" + locationResult.lastHWLocation.countryName)
Log.d(TAG, " Location data :" + locationResult.lastHWLocation.latitude)
Log.d(TAG, " Location data :" + locationResult.lastHWLocation.longitude)
// binding.textDetected.setText("Latitude " + locationResult.getLastHWLocation().getLatitude() + " Longitude " + locationResult.getLastHWLocation().getLongitude());
getGeoCoderValues(
locationResult.lastHWLocation.latitude,
locationResult.lastHWLocation.longitude
)
//locationResult = locationResult1;
loginViewModel!!.setLocationResult(locationResult)
}
}
}
fusedLocationProviderClient!!.requestLocationUpdates(
myLocationRequest,
myLocationCallback,
Looper.getMainLooper()
)
} else {
Log.d(TAG, "Failed to get location settings")
}
return locationResult
}
fun disableLocationData() {
fusedLocationProviderClient!!.disableBackgroundLocation()
fusedLocationProviderClient!!.removeLocationUpdates(myLocationCallback)
}
private fun getGeoCoderValues(latitude: Double, longitude: Double) {
getAddress(context, latitude, longitude)
/* Geocoder geocoder;
List<Address> addresses;
Locale locale = new Locale("en", "IN");
geocoder = new Geocoder(getContext(), locale);
try {
addresses = geocoder.getFromLocation(latitude, longitude, 1); // Here 1 represent max location result to returned, by documents it recommended 1 to 5
Gson gson=new Gson();
Log.d(TAG,"Geo coder :"+gson.toJson(addresses));
String address = addresses.get(0).getAddressLine(0); // If any additional address line present than only, check with max available address lines by getMaxAddressLineIndex()
String city = addresses.get(0).getLocality();
String state = addresses.get(0).getAdminArea();
String country = addresses.get(0).getCountryName();
String postalCode = addresses.get(0).getPostalCode();
String knownName = addresses.get(0).getFeatureName();
} catch (IOException e) {
e.printStackTrace();
Log.e(TAG,"Error while fetching Geo coder :"+e.getMessage());
}*/
/* Locale locale = new Locale("en", "IN");
GeocoderService geocoderService =
LocationServices.getGeocoderService(getActivity().getBaseContext(), locale);
// Request reverse geocoding.
GetFromLocationRequest getFromLocationRequest = new GetFromLocationRequest(latitude, longitude, 5);
// Initiate reverse geocoding.
geocoderService.getFromLocation(getFromLocationRequest)
.addOnSuccessListener(hwLocation -> {
Gson gson=new Gson();
Log.d(TAG,"Geo coder :"+gson.toJson(hwLocation));
})
.addOnFailureListener(e -> {
Log.e(TAG,"Error while fetching Geo coder :"+e.getMessage());
});*/
}
companion object {
var TAG = "TAG"
fun getAddress(context: Context?, LATITUDE: Double, LONGITUDE: Double) {
//Set Address
try {
val geocoder = Geocoder(context, Locale.getDefault())
val addresses = geocoder.getFromLocation(LATITUDE, LONGITUDE, 1)
if (addresses != null && addresses.size > 0) {
val address =
addresses[0].getAddressLine(0) // If any additional address line present than only, check with max available address lines by getMaxAddressLineIndex()
val city = addresses[0].locality
val state = addresses[0].adminArea
val country = addresses[0].countryName
val postalCode = addresses[0].postalCode
val knownName = addresses[0].featureName // Only if available else return NULL
Log.d(TAG, "getAddress: address$address")
Log.d(TAG, "getAddress: city$city")
Log.d(TAG, "getAddress: state$state")
Log.d(TAG, "getAddress: postalCode$postalCode")
Log.d(TAG, "getAddress: knownName$knownName")
}
} catch (e: IOException) {
e.printStackTrace()
Log.e(TAG, "Error while fetching Geo coder :" + e.message)
}
}
}
init {
this.context = context
this.activity = activity
this.loginViewModel = loginViewModel
}
}
LoginFragment.kt
Kotlin:
package com.huawei.hms.knowmyboard.dtse.activity.fragments
import com.huawei.hms.knowmyboard.dtse.activity.viewmodel.LoginViewModel
import androidx.navigation.Navigation.findNavController
import android.content.SharedPreferences
import androidx.navigation.NavController
import android.os.Bundle
import androidx.databinding.DataBindingUtil
import androidx.lifecycle.ViewModelProvider
import android.annotation.SuppressLint
import android.content.Context
import android.util.Log
import android.view.*
import androidx.fragment.app.Fragment
import com.huawei.hms.knowmyboard.dtse.R
import com.huawei.hms.knowmyboard.dtse.databinding.FragmentLoginBinding
import java.lang.Exception
class LoginFragment : Fragment() {
var loginBinding: FragmentLoginBinding? = null
var loginViewModel: LoginViewModel? = null
var menu: Menu? = null
var prefs: SharedPreferences? = null
var editor: SharedPreferences.Editor? = null
var navController: NavController? = null
private val MY_PREF_NAME = "my_pref_name"
private val TAG = "TAG"
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
super.onViewCreated(view, savedInstanceState)
navController = findNavController(view)
}
override fun onCreateView(
inflater: LayoutInflater, container: ViewGroup?,
savedInstanceState: Bundle?
): View? {
loginBinding = DataBindingUtil.inflate(inflater, R.layout.fragment_login, container, false)
loginViewModel = ViewModelProvider(requireActivity()).get(LoginViewModel::class.java)
loginBinding?.loginViewModel = loginViewModel
Log.d(TAG, " Pref $preferenceValue")
if (preferenceValue == "user_name") {
loginBinding?.btnHuaweiIdAuth?.visibility = View.VISIBLE
} else {
enableMenu(menu)
requireActivity().title = preferenceValue
loginBinding?.btnHuaweiIdAuth?.visibility = View.GONE
}
loginBinding?.imageLogo?.setOnClickListener { v: View? -> navController!!.navigate(R.id.action_loginFragment_to_mainFragment) }
loginViewModel!!.getMessage().observeForever { message ->
updateMessage(message)
if (message != resources.getString(R.string.app_name)) {
preferenceValue = message
enableMenu(menu)
loginBinding?.btnHuaweiIdAuth?.visibility = View.GONE
} else {
disableMenu(menu)
loginBinding?.btnHuaweiIdAuth?.visibility = View.VISIBLE
preferenceValue = "user_name"
}
}
return loginBinding?.root
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setHasOptionsMenu(true)
}
override fun onCreateOptionsMenu(menu: Menu, inflater: MenuInflater) {
menu.clear()
super.onCreateOptionsMenu(menu, inflater)
inflater.inflate(R.menu.main, menu)
this.menu = menu
disableMenu(menu)
}
private fun disableMenu(menu: Menu?) {
try {
if (menu != null) {
if (preferenceValue == "user_name") {
menu.findItem(R.id.menu_login_logout).isVisible = false
menu.findItem(R.id.menu_cancel_auth).isVisible = false
requireActivity().title = resources.getString(R.string.app_name)
} else {
menu.findItem(R.id.menu_login_logout).isVisible = true
menu.findItem(R.id.menu_cancel_auth).isVisible = true
}
}
} catch (e: Exception) {
e.printStackTrace()
}
}
private fun enableMenu(menu: Menu?) {
try {
menu!!.findItem(R.id.menu_login_logout).isVisible = true
menu.findItem(R.id.menu_cancel_auth).isVisible = true
} catch (e: Exception) {
e.printStackTrace()
}
}
@SuppressLint("NonConstantResourceId")
override fun onOptionsItemSelected(item: MenuItem): Boolean {
when (item.itemId) {
R.id.menu_cancel_auth -> {
preferenceValue = "user_name"
loginViewModel!!.cancelAuthorization()
loginBinding!!.btnHuaweiIdAuth.visibility = View.VISIBLE
disableMenu(menu)
return true
}
R.id.menu_login_logout -> {
preferenceValue = "user_name"
loginViewModel!!.logoutHuaweiID()
loginBinding!!.btnHuaweiIdAuth.visibility = View.VISIBLE
disableMenu(menu)
return true
}
else -> {}
}
return super.onOptionsItemSelected(item)
}
private fun updateMessage(msg: String?) {
//loginBinding.txtMessage.setText(msg);
requireActivity().title = msg
}
var preferenceValue: String?
get() {
prefs = requireActivity().getSharedPreferences(MY_PREF_NAME, Context.MODE_PRIVATE)
return prefs?.getString("user_name", "user_name")
}
set(userName) {
editor = requireActivity().getSharedPreferences(MY_PREF_NAME, Context.MODE_PRIVATE).edit()
editor?.putString("user_name", userName)
editor?.apply()
}
}
MainFragment.kt
Kotlin:
package com.huawei.hms.knowmyboard.dtse.activity.fragments
import com.huawei.hms.knowmyboard.dtse.activity.viewmodel.LoginViewModel
import com.huawei.hms.knowmyboard.dtse.activity.util.RequestLocationData
import android.os.Bundle
import androidx.databinding.DataBindingUtil
import androidx.lifecycle.ViewModelProvider
import android.content.Intent
import android.annotation.SuppressLint
import android.util.Log
import android.view.*
import androidx.fragment.app.Fragment
import com.huawei.hms.knowmyboard.dtse.R
import com.huawei.hms.knowmyboard.dtse.databinding.FragmentMainFragmentBinding
import java.lang.Exception
class MainFragment : Fragment() {
var binding: FragmentMainFragmentBinding? = null
var loginViewModel: LoginViewModel? = null
var locationData: RequestLocationData? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setHasOptionsMenu(true)
}
override fun onCreateView(
inflater: LayoutInflater, container: ViewGroup?,
savedInstanceState: Bundle?
): View? {
// Inflate the layout for this fragment
binding =
DataBindingUtil.inflate(inflater, R.layout.fragment_main_fragment, container, false)
// settingsClient = LocationServices.getSettingsClient(getActivity());
loginViewModel = ViewModelProvider(requireActivity()).get(LoginViewModel::class.java)
binding?.loginViewModel = loginViewModel
locationData = RequestLocationData(context, activity, loginViewModel)
locationData!!.initFusionLocationProviderClint()
locationData!!.checkPermission()
locationData!!.checkDeviceLocationSettings()
// checkDeviceLocationSettings();
// Instantiate the fusedLocationProviderClient object.
// fusedLocationProviderClient = LocationServices.getFusedLocationProviderClient(getActivity());
binding?.buttonScan?.setOnClickListener {
Log.d(TAG, "*******")
//loginViewModel.setTextRecongnized("");
scan()
}
loginViewModel!!.imagePath.observeForever { bitmap ->
try {
binding?.imageView?.setImageBitmap(bitmap)
} catch (e: Exception) {
e.printStackTrace()
Log.e("TAG", "Error : " + e.message)
}
}
loginViewModel!!.textRecongnized.observeForever { res ->
Log.i("TAG", "OBSERVER : " + "Language : " + getStringResourceByName(
res[0].trim { it <= ' ' }) +
" Detected text : " + res[1].trim { it <= ' ' } +
" Translated text : " + res[2].trim { it <= ' ' })
binding?.textLanguage?.text = "Language : " + getStringResourceByName(res[0])
binding?.textDetected?.text = "Detected text : " + res[1]
binding?.textTranslated?.text = "Translated text : " + res[2]
}
loginViewModel!!.locationResult.observeForever { locationResult ->
binding?.textDetected?.text =
"Latitude " + locationResult.lastHWLocation.latitude + " Longitude " + locationResult.lastHWLocation.longitude
}
return binding?.root
}
private fun getStringResourceByName(aString: String): String {
val packageName = requireActivity().packageName
val resId = resources
.getIdentifier(aString, "string", packageName)
return if (resId == 0) {
aString
} else {
getString(resId)
}
}
private fun scan() {
/* MLTextAnalyzer analyzer = new MLTextAnalyzer.Factory(getContext()).create();
analyzer.setTransactor(new OcrDetectorProcessor());
LensEngine lensEngine = new LensEngine.Creator(getContext(), analyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(1440, 1080)
.applyFps(30.0f)
.enableAutomaticFocus(true)
.create();
SurfaceView mSurfaceView = new SurfaceView(getContext());
try {
lensEngine.run(mSurfaceView.getHolder());
} catch (IOException e) {
// Exception handling logic.
Log.e(TAG,e.getMessage());
}*/
val intent = Intent(Intent.ACTION_GET_CONTENT)
intent.type = "image/*"
requireActivity().startActivityForResult(intent, 2323)
}
override fun onCreateOptionsMenu(menu: Menu, inflater: MenuInflater) {
menu.clear()
super.onCreateOptionsMenu(menu, inflater)
inflater.inflate(R.menu.main_fragment_menu, menu)
}
@SuppressLint("NonConstantResourceId")
override fun onOptionsItemSelected(item: MenuItem): Boolean {
when (item.itemId) {
R.id.option_refresh_location -> {
//refreshLocation();
locationData!!.refreshLocation()
return true
}
}
return super.onOptionsItemSelected(item)
}
/* private void checkDeviceLocationSettings() {
LocationSettingsRequest.Builder builder = new LocationSettingsRequest.Builder();
myLocationRequest = new LocationRequest();
builder.addLocationRequest(myLocationRequest);
LocationSettingsRequest locationSettingsRequest = builder.build();
// Check the device location settings.
settingsClient.checkLocationSettings(locationSettingsRequest)
// Define the listener for success in calling the API for checking device location settings.
.addOnSuccessListener(locationSettingsResponse -> {
LocationSettingsStates locationSettingsStates =
locationSettingsResponse.getLocationSettingsStates();
StringBuilder stringBuilder = new StringBuilder();
// Check whether the location function is enabled.
stringBuilder.append(",\nisLocationUsable=")
.append(locationSettingsStates.isLocationUsable());
// Check whether HMS Core (APK) is available.
stringBuilder.append(",\nisHMSLocationUsable=")
.append(locationSettingsStates.isHMSLocationUsable());
Log.i(TAG, "checkLocationSetting onComplete:" + stringBuilder.toString());
// Set the location type.
myLocationRequest.setPriority(LocationRequest.PRIORITY_HIGH_ACCURACY);
// Set the number of location updates to 1.
myLocationRequest.setNumUpdates(1);
isLocationSettingSuccess = 1;
})
// Define callback for failure in checking the device location settings.
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
isLocationSettingSuccess = 0;
Log.i(TAG, "checkLocationSetting onFailure:" + e.getMessage());
}
});
}*/
/* private void refreshLocation() {
Log.d(TAG, "Refreshing location");
if (isLocationSettingSuccess == 1) {
myLocationCallback = new LocationCallback() {
@Override
public void onLocationResult(LocationResult locationResult) {
if (locationResult != null) {
Gson gson = new Gson();
Log.d(TAG, " Location data :" + locationResult.getLastLocation().getLatitude() + " : " + locationResult.getLastLocation().getLongitude());
Log.d(TAG, " Location data :" + gson.toJson(locationResult.getLastHWLocation()));
Log.d(TAG, " Location data :" + locationResult.getLastHWLocation().getCountryName());
Log.d(TAG, " Location data :" + locationResult.getLastHWLocation().getLatitude());
Log.d(TAG, " Location data :" + locationResult.getLastHWLocation().getLongitude());
binding.textDetected.setText("Latitude " + locationResult.getLastHWLocation().getLatitude() + " Longitude " + locationResult.getLastHWLocation().getLongitude());
getGeoCoderValues(locationResult.getLastHWLocation().getLatitude(),locationResult.getLastHWLocation().getLongitude());
}
}
};
fusedLocationProviderClient.requestLocationUpdates(myLocationRequest, myLocationCallback, Looper.getMainLooper());
} else {
Log.d(TAG, "Failed to get location settings");
}
}*/
/*private void getGeoCoderValues(double latitude, double longitude) {
Locale locale = new Locale("en", "in");
GeocoderService geocoderService =
LocationServices.getGeocoderService(getActivity().getBaseContext(), locale);
// Request reverse geocoding.
GetFromLocationRequest getFromLocationRequest = new GetFromLocationRequest(latitude, longitude, 5);
// Initiate reverse geocoding.
geocoderService.getFromLocation(getFromLocationRequest)
.addOnSuccessListener(hwLocation -> {
Gson gson=new Gson();
Log.d(TAG,"Geo coder :"+gson.toJson(hwLocation));
})
.addOnFailureListener(e -> {
// TODO: Processing when the API call fails.
Log.e(TAG,"Error while fetching Geo coder :"+e.getMessage());
});
}*/
/* void checkPermission() {
// Dynamically apply for required permissions if the API level is 28 or lower.
if (Build.VERSION.SDK_INT <= Build.VERSION_CODES.P) {
Log.i(TAG, "android sdk <= 28 Q");
if (ActivityCompat.checkSelfPermission(getContext(),
Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(getContext(),
Manifest.permission.ACCESS_COARSE_LOCATION) != PackageManager.PERMISSION_GRANTED) {
String[] strings =
{Manifest.permission.READ_EXTERNAL_STORAGE,Manifest.permission.MANAGE_MEDIA,Manifest.permission.MEDIA_CONTENT_CONTROL,Manifest.permission.ACCESS_FINE_LOCATION, Manifest.permission.ACCESS_COARSE_LOCATION};
ActivityCompat.requestPermissions(getActivity(), strings, 1);
}
} else {
// Dynamically apply for the android.permission.ACCESS_BACKGROUND_LOCATION permission in addition to the preceding permissions if the API level is higher than 28.
if (ActivityCompat.checkSelfPermission(getActivity(),
Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(getContext(),
Manifest.permission.ACCESS_COARSE_LOCATION) != PackageManager.PERMISSION_GRANTED
&& ActivityCompat.checkSelfPermission(getContext(),
"android.permission.ACCESS_BACKGROUND_LOCATION") != PackageManager.PERMISSION_GRANTED) {
String[] strings = {android.Manifest.permission.ACCESS_FINE_LOCATION,
android.Manifest.permission.ACCESS_COARSE_LOCATION,Manifest.permission.MEDIA_CONTENT_CONTROL,Manifest.permission.MANAGE_MEDIA,
"android.permission.ACCESS_BACKGROUND_LOCATION"};
ActivityCompat.requestPermissions(getActivity(), strings, 2);
}
}
}*/
override fun onStop() {
super.onStop()
locationData!!.disableLocationData()
}
companion object {
var TAG = "TAG"
}
}
Result
Tricks and Tips
Make sure that the agconnect-services.json file is added.
Make sure the required dependencies are added.
Make sure that the ML service is enabled in AGC.
Make sure the images have clear visibility of text.
Conclusion
In this article, we have learned how to integrate Huawei Location Kit and ML Kit into the Android application KnowMyBoard. We have also learned how to convert an image to text using the HMS ML service. In my previous article, I wrote about Account Kit; please go through that article, which is linked in the introduction section.
Reference
ML Kit – Training video
Location Kit – Training video

On Device Text Detection and Translation from Camera Stream Using Huawei ML Kit in Android KnowMyBoard App [Navigation Components, MVVM]

Introduction
In this article, we will learn how to integrate the Huawei ML Kit camera stream into the Android application KnowMyBoard. Account Kit provides seamless login functionality to the app, backed by Huawei's large user base.
The text recognition service can extract text from images of receipts, business cards, and documents. This service is useful for industries such as printing, education, and logistics. You can use it to create apps that handle data entry and check tasks.
The text recognition service is able to recognize text in both static images and dynamic camera streams with a host of APIs, which you can call synchronously or asynchronously to build your text recognition-enabled apps.
The on-device language detection service can detect the language of text when the Internet is unavailable. ML Kit detects languages in text and returns the language codes (which comply with the ISO 639-1 standard) and their respective confidences or the language code with the highest confidence. Currently, 56 languages can be detected.
Similar to the real-time translation service, the on-device translation service can be widely used in scenarios where translation between different languages is required. For example, travel apps can integrate this service to translate road signs and menus in other languages into tourists' native languages, providing more considerate services for them. Unlike real-time translation, on-device translation does not require an Internet connection, so you can easily use the translation service even when the Internet is disconnected.
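As a rough sketch, assuming the ML Kit API key has already been set via MLApplication (as in the MainActivity.kt of the previous article), detecting the language of a recognized string and translating it to English on-device can be chained like this; progress reporting and error handling are trimmed for brevity.
Kotlin:
import android.util.Log
import com.huawei.hms.mlsdk.langdetect.MLLangDetectorFactory
import com.huawei.hms.mlsdk.langdetect.local.MLLocalLangDetectorSetting
import com.huawei.hms.mlsdk.model.download.MLModelDownloadListener
import com.huawei.hms.mlsdk.model.download.MLModelDownloadStrategy
import com.huawei.hms.mlsdk.translate.MLTranslatorFactory
import com.huawei.hms.mlsdk.translate.local.MLLocalTranslateSetting

// Illustrative helper: detect the source language, then translate to English.
fun detectAndTranslate(text: String) {
    val detectorSetting = MLLocalLangDetectorSetting.Factory()
        .setTrustedThreshold(0.01f) // minimum confidence threshold for detection
        .create()
    val detector = MLLangDetectorFactory.getInstance().getLocalLangDetector(detectorSetting)
    detector.firstBestDetect(text).addOnSuccessListener { sourceLang ->
        val translateSetting = MLLocalTranslateSetting.Factory()
            .setSourceLangCode(sourceLang) // ISO 639-1 code from the detector
            .setTargetLangCode("en")
            .create()
        val translator = MLTranslatorFactory.getInstance().getLocalTranslator(translateSetting)
        val strategy = MLModelDownloadStrategy.Factory().needWifi().create()
        val listener = object : MLModelDownloadListener {
            override fun onProcess(alreadyDownLength: Long, totalLength: Long) {
                // Optionally report model download progress here.
            }
        }
        // Download the offline model if needed, then translate.
        translator.preparedModel(strategy, listener).addOnSuccessListener {
            translator.asyncTranslate(text).addOnSuccessListener { translated ->
                Log.d("TAG", "Detected $sourceLang -> $translated")
            }
        }
    }
}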
Precautions
Development Overview
You need to install the Android Studio IDE, and I assume that you have prior knowledge of Android application development.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.8 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later
Integration steps
Step 1. Create a Huawei developer account and complete identity verification on the Huawei developer website. Refer to Register a Huawei ID.
Step 2. Create a project in AppGallery Connect.
Step 3. Add the HMS Core SDK.
Let's start coding
navigation_graph.xml
<?xml version="1.0" encoding="utf-8"?>
<navigation xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/navigation_graph"
app:startDestination="@id/loginFragment">
<fragment
android:id="@+id/loginFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.LoginFragment"
android:label="LoginFragment"/>
<fragment
android:id="@+id/mainFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.MainFragment"
android:label="MainFragment"/>
<fragment
android:id="@+id/searchFragment"
android:name="com.huawei.hms.knowmyboard.dtse.activity.fragments.SearchFragment"
android:label="fragment_search"
tools:layout="@layout/fragment_search" />
</navigation>
TextRecognitionActivity.java
public final class TextRecognitionActivity extends BaseActivity
implements OnRequestPermissionsResultCallback, View.OnClickListener {
private static final String TAG = "TextRecognitionActivity";
private LensEngine lensEngine = null;
private LensEnginePreview preview;
private GraphicOverlay graphicOverlay;
private ImageButton takePicture;
private ImageButton imageSwitch;
private RelativeLayout zoomImageLayout;
private ZoomImageView zoomImageView;
private ImageButton zoomImageClose;
CameraConfiguration cameraConfiguration = null;
private int facing = CameraConfiguration.CAMERA_FACING_BACK;
private Camera mCamera;
private boolean isLandScape;
private Bitmap bitmap;
private Bitmap bitmapCopy;
private LocalTextTransactor localTextTransactor;
private Handler mHandler = new MsgHandler(this);
private Dialog languageDialog;
private AddPictureDialog addPictureDialog;
private TextView textCN;
private TextView textEN;
private TextView textJN;
private TextView textKN;
private TextView textLN;
private TextView tv_language,tv_translated_txt;
private String textType = Constant.POSITION_CN;
private boolean isInitialization = false;
MLTextAnalyzer analyzer;
private static class MsgHandler extends Handler {
WeakReference<TextRecognitionActivity> mMainActivityWeakReference;
public MsgHandler(TextRecognitionActivity mainActivity) {
this.mMainActivityWeakReference = new WeakReference<>(mainActivity);
}
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
TextRecognitionActivity mainActivity = this.mMainActivityWeakReference.get();
if (mainActivity == null) {
return;
}
if (msg.what == Constant.SHOW_TAKE_PHOTO_BUTTON) {
mainActivity.setVisible();
} else if (msg.what == Constant.HIDE_TAKE_PHOTO_BUTTON) {
mainActivity.setGone();
}
}
}
private void setVisible() {
if (this.takePicture.getVisibility() == View.GONE) {
this.takePicture.setVisibility(View.VISIBLE);
}
}
private void setGone() {
if (this.takePicture.getVisibility() == View.VISIBLE) {
this.takePicture.setVisibility(View.GONE);
}
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
this.setContentView(R.layout.activity_text_recognition);
if (savedInstanceState != null) {
this.facing = savedInstanceState.getInt(Constant.CAMERA_FACING);
}
this.tv_language = this.findViewById(R.id.tv_lang);
this.tv_translated_txt = this.findViewById(R.id.tv_translated_txt);
this.preview = this.findViewById(R.id.live_preview);
this.graphicOverlay = this.findViewById(R.id.live_overlay);
this.cameraConfiguration = new CameraConfiguration();
this.cameraConfiguration.setCameraFacing(this.facing);
this.initViews();
this.isLandScape = (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_LANDSCAPE);
this.createLensEngine();
this.setStatusBar();
}
private void initViews() {
this.takePicture = this.findViewById(R.id.takePicture);
this.takePicture.setOnClickListener(this);
this.imageSwitch = this.findViewById(R.id.text_imageSwitch);
this.imageSwitch.setOnClickListener(this);
this.zoomImageLayout = this.findViewById(R.id.zoomImageLayout);
this.zoomImageView = this.findViewById(R.id.take_picture_overlay);
this.zoomImageClose = this.findViewById(R.id.zoomImageClose);
this.zoomImageClose.setOnClickListener(this);
this.findViewById(R.id.back).setOnClickListener(this);
this.findViewById(R.id.language_setting).setOnClickListener(this);
this.createLanguageDialog();
this.createAddPictureDialog();
}
@Override
public void onClick(View view) {
if (view.getId() == R.id.takePicture) {
this.takePicture();
} else if (view.getId() == R.id.zoomImageClose) {
this.zoomImageLayout.setVisibility(View.GONE);
this.recycleBitmap();
} else if (view.getId() == R.id.text_imageSwitch) {
this.showAddPictureDialog();
} else if (view.getId() == R.id.language_setting) {
this.showLanguageDialog();
} else if (view.getId() == R.id.simple_cn) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_CN);
this.languageDialog.dismiss();
this.restartLensEngine(Constant.POSITION_CN);
} else if (view.getId() == R.id.english) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_EN);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_EN);
} else if (view.getId() == R.id.japanese) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_JA);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_JA);
} else if (view.getId() == R.id.korean) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_KO);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_KO);
} else if (view.getId() == R.id.latin) {
SharedPreferencesUtil.getInstance(this)
.putStringValue(Constant.POSITION_KEY, Constant.POSITION_LA);
this.languageDialog.dismiss();
this.preview.release();
this.restartLensEngine(Constant.POSITION_LA);
} else if (view.getId() == R.id.back) {
releaseLensEngine();
this.finish();
}
}
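// Recreates the lens engine when the recognition language changes and re-attaches the camera preview surface.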
private void restartLensEngine(String type) {
if (this.textType.equals(type)) {
return;
}
this.lensEngine.release();
this.lensEngine = null;
this.createLensEngine();
this.startLensEngine();
if (this.lensEngine == null || this.lensEngine.getCamera() == null) {
return;
}
this.mCamera = this.lensEngine.getCamera();
try {
this.mCamera.setPreviewDisplay(this.preview.getSurfaceHolder());
} catch (IOException e) {
Log.e(TextRecognitionActivity.TAG, "restartLensEngine: failed to set preview display", e);
}
}
@Override
public void onBackPressed() {
if (this.zoomImageLayout.getVisibility() == View.VISIBLE) {
this.zoomImageLayout.setVisibility(View.GONE);
this.recycleBitmap();
} else {
super.onBackPressed();
releaseLensEngine();
}
}
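// Builds the bottom dialog from which the user picks the recognition language.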
private void createLanguageDialog() {
this.languageDialog = new Dialog(this, R.style.MyDialogStyle);
View view = View.inflate(this, R.layout.dialog_language_setting, null);
// Set up a custom layout
this.languageDialog.setContentView(view);
this.textCN = view.findViewById(R.id.simple_cn);
this.textCN.setOnClickListener(this);
this.textEN = view.findViewById(R.id.english);
this.textEN.setOnClickListener(this);
this.textJN = view.findViewById(R.id.japanese);
this.textJN.setOnClickListener(this);
this.textKN = view.findViewById(R.id.korean);
this.textKN.setOnClickListener(this);
this.textLN = view.findViewById(R.id.latin);
this.textLN.setOnClickListener(this);
this.languageDialog.setCanceledOnTouchOutside(true);
// Set the size of the dialog
Window dialogWindow = this.languageDialog.getWindow();
WindowManager.LayoutParams layoutParams = dialogWindow.getAttributes();
layoutParams.width = WindowManager.LayoutParams.MATCH_PARENT;
layoutParams.height = WindowManager.LayoutParams.WRAP_CONTENT;
layoutParams.gravity = Gravity.BOTTOM;
dialogWindow.setAttributes(layoutParams);
}
private void showLanguageDialog() {
this.initDialogViews();
this.languageDialog.show();
}
private void createAddPictureDialog() {
this.addPictureDialog = new AddPictureDialog(this, AddPictureDialog.TYPE_NORMAL);
final Intent intent = new Intent(TextRecognitionActivity.this, RemoteDetectionActivity.class);
intent.putExtra(Constant.MODEL_TYPE, Constant.CLOUD_TEXT_DETECTION);
this.addPictureDialog.setClickListener(new AddPictureDialog.ClickListener() {
@Override
public void takePicture() {
lensEngine.release();
isInitialization = false;
intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_TAKE_PHOTO);
TextRecognitionActivity.this.startActivity(intent);
}
@Override
public void selectImage() {
intent.putExtra(Constant.ADD_PICTURE_TYPE, Constant.TYPE_SELECT_IMAGE);
TextRecognitionActivity.this.startActivity(intent);
}
@Override
public void doExtend() {
}
});
}
private void showAddPictureDialog() {
this.addPictureDialog.show();
}
private void initDialogViews() {
String position = SharedPreferencesUtil.getInstance(this).getStringValue(Constant.POSITION_KEY);
this.textType = position;
this.textCN.setSelected(false);
this.textEN.setSelected(false);
this.textJN.setSelected(false);
this.textLN.setSelected(false);
this.textKN.setSelected(false);
switch (position) {
case Constant.POSITION_CN:
this.textCN.setSelected(true);
break;
case Constant.POSITION_EN:
this.textEN.setSelected(true);
break;
case Constant.POSITION_LA:
this.textLN.setSelected(true);
break;
case Constant.POSITION_JA:
this.textJN.setSelected(true);
break;
case Constant.POSITION_KO:
this.textKN.setSelected(true);
break;
default:
}
}
@Override
protected void onSaveInstanceState(Bundle outState) {
outState.putInt(Constant.CAMERA_FACING, this.facing);
super.onSaveInstanceState(outState);
}
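// Creates the on-device text analyzer and the LensEngine that streams camera frames to it for live recognition.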
private void createLensEngine() {
MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
.setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
// Specify the language to be recognized. This sample hardcodes Korean ("ko");
// a complete implementation would map the language selected in the dialog (textType) here.
.setLanguage("ko")
.create();
analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting);
//analyzer = new MLTextAnalyzer.Factory(this).create();
if (this.lensEngine == null) {
this.lensEngine = new LensEngine(this, this.cameraConfiguration, this.graphicOverlay);
}
try {
this.localTextTransactor = new LocalTextTransactor(this.mHandler, this);
this.lensEngine.setMachineLearningFrameTransactor(this.localTextTransactor);
// this.lensEngine.setMachineLearningFrameTransactor((ImageTransactor) new ObjectAnalyzerTransactor());
isInitialization = true;
} catch (Exception e) {
Toast.makeText(
this,
"Can not create image transactor: " + e.getMessage(),
Toast.LENGTH_LONG)
.show();
}
}
private void startLensEngine() {
if (this.lensEngine != null) {
try {
this.preview.start(this.lensEngine, false);
} catch (IOException e) {
Log.e(TextRecognitionActivity.TAG, "Unable to start lensEngine.", e);
this.lensEngine.release();
this.lensEngine = null;
}
}
}
@Override
public void onResume() {
super.onResume();
if (!isInitialization){
createLensEngine();
}
this.startLensEngine();
}
@Override
protected void onStop() {
super.onStop();
this.preview.stop();
}
private void releaseLensEngine() {
if (this.lensEngine != null) {
this.lensEngine.release();
this.lensEngine = null;
}
recycleBitmap();
}
@Override
protected void onDestroy() {
super.onDestroy();
releaseLensEngine();
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
// Exception handling.
Log.e(TAG,"Error while releasing analyzer");
}
}
}
private void recycleBitmap() {
if (this.bitmap != null && !this.bitmap.isRecycled()) {
this.bitmap.recycle();
this.bitmap = null;
}
if (this.bitmapCopy != null && !this.bitmapCopy.isRecycled()) {
this.bitmapCopy.recycle();
this.bitmapCopy = null;
}
}
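// Freezes the current preview frame, overlays the recognized text blocks on a copy of it,
// then runs OCR, language detection, and translation on the captured bitmap.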
private void takePicture() {
this.zoomImageLayout.setVisibility(View.VISIBLE);
LocalDataProcessor localDataProcessor = new LocalDataProcessor();
localDataProcessor.setLandScape(this.isLandScape);
this.bitmap = BitmapUtils.getBitmap(this.localTextTransactor.getTransactingImage(), this.localTextTransactor.getTransactingMetaData());
float previewWidth = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
float previewHeight = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
if (this.isLandScape) {
previewWidth = localDataProcessor.getMaxHeightOfImage(this.localTextTransactor.getTransactingMetaData());
previewHeight = localDataProcessor.getMaxWidthOfImage(this.localTextTransactor.getTransactingMetaData());
}
this.bitmapCopy = Bitmap.createBitmap(this.bitmap).copy(Bitmap.Config.ARGB_8888, true);
Canvas canvas = new Canvas(this.bitmapCopy);
float min = Math.min(previewWidth, previewHeight);
float max = Math.max(previewWidth, previewHeight);
if (this.getResources().getConfiguration().orientation == Configuration.ORIENTATION_PORTRAIT) {
localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, min, max);
} else {
localDataProcessor.setCameraInfo(this.graphicOverlay, canvas, max, min);
}
localDataProcessor.drawHmsMLVisionText(canvas, this.localTextTransactor.getLastResults().getBlocks());
this.zoomImageView.setImageBitmap(this.bitmapCopy);
// Create an MLFrame object using the bitmap, which is the image data in bitmap format.
MLFrame frame = MLFrame.fromBitmap(bitmap);
Task<MLText> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<MLText>() {
@Override
public void onSuccess(MLText text) {
String detectText = text.getStringValue();
// Processing for successful recognition.
// Create a local language detector.
MLLangDetectorFactory factory = MLLangDetectorFactory.getInstance();
MLLocalLangDetectorSetting setting = new MLLocalLangDetectorSetting.Factory()
// Set the minimum confidence threshold for language detection.
.setTrustedThreshold(0.01f)
.create();
MLLocalLangDetector myLocalLangDetector = factory.getLocalLangDetector(setting);
Task<String> firstBestDetectTask = myLocalLangDetector.firstBestDetect(detectText);
firstBestDetectTask.addOnSuccessListener(new OnSuccessListener<String>() {
@Override
public void onSuccess(String languageDetected) {
// Processing logic for detection success.
Log.d("TAG", "Lang detect :" + languageDetected);
Log.d("TAG", " detectText :" + detectText);
translate(languageDetected,detectText);
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for detection failure.
Log.e("TAG", "Lang detect error:" + e.getMessage());
}
});
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for recognition failure.
Log.e("TAG"," Text : Processing logic for recognition failure");
}
});
}
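// Translates the recognized text into English with the on-device translator,
// downloading the target language model first if it is not available locally.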
private void translate(String languageDetected, String detectText) {
MLApplication.initialize(getApplication());
// TODO: a valid API key from AppGallery Connect has to be set here.
MLApplication.getInstance().setApiKey("your-api-key");
// Create an offline translator.
MLLocalTranslateSetting setting = new MLLocalTranslateSetting.Factory()
// Set the source language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
.setSourceLangCode(languageDetected)
// Set the target language code. The ISO 639-1 standard is used. This parameter is mandatory. If this parameter is not set, an error may occur.
.setTargetLangCode("en")
.create();
MLLocalTranslator myLocalTranslator = MLTranslatorFactory.getInstance().getLocalTranslator(setting);
// Set the model download policy.
MLModelDownloadStrategy downloadStrategy = new MLModelDownloadStrategy.Factory()
.needWifi()// It is recommended that you download the package in a Wi-Fi environment.
.create();
// Create a download progress listener.
MLModelDownloadListener modelDownloadListener = new MLModelDownloadListener() {
@Override
public void onProcess(long alreadyDownLength, long totalLength) {
runOnUiThread(new Runnable() {
@Override
public void run() {
// Display the download progress or perform other operations.
}
});
}
};
myLocalTranslator.preparedModel(downloadStrategy, modelDownloadListener).
addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
// Called when the model package is successfully downloaded.
// input is a string of less than 5000 characters.
final Task<String> task = myLocalTranslator.asyncTranslate(detectText);
// Before translation, ensure that the models have been successfully downloaded.
task.addOnSuccessListener(new OnSuccessListener<String>() {
@Override
public void onSuccess(String translated) {
// Processing logic for translation success.
Log.d(TAG, "Translated text: " + translated);
tv_translated_txt.setText(translated);
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for translation failure.
Log.e(TAG, "Translation failed", e);
Toast.makeText(TextRecognitionActivity.this, "Please check the Internet connection.", Toast.LENGTH_SHORT).show();
}
});
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Called when the model package fails to be downloaded.
Log.e("TAG"," Translation failed onFailure "+e.getMessage());
}
});
}
}
MainFragment.java
