Book Reader Application Using General Text Recognition by Huawei HiAI in Android

Introduction
In this article, we will learn how to integrate Huawei General Text Recognition using Huawei HiAI by building a Book Reader application.
About the application:
Reading a book manually can get tiring. This application lets users listen to a book instead of reading it themselves: all they need to do is capture a photo of a book page or select an image from the gallery, and then listen to it like music while travelling or in their free time.
Huawei General Text Recognition is based on OCR technology, so let us first understand OCR.
What is optical character recognition (OCR)?
Optical Character Recognition (OCR) technology is a business solution for automating data extraction from printed or written text in a scanned document or image file, converting the text into a machine-readable form that can be used for data processing such as editing or searching.
Now let us understand about General Text Recognition (GTR).
At the core of GTR is Optical Character Recognition (OCR) technology, which extracts text from screenshots and from photos taken with the phone camera. For photos, this API can correct tilts, camera angles, reflections, and messy backgrounds to a certain degree. It can also be used for document and streetscape photography, among a wide range of other scenarios, and features strong anti-interference capability. The API processes images on the device after connecting to the HiAI service.
Features
For photos: Provides text area detection and text recognition for Chinese, English, Japanese, Korean, Russian, Italian, Spanish, Portuguese, German, and French texts in multiple printed fonts. A wide range of scenarios is supported, and high recognition accuracy can be achieved even under complex lighting conditions and backgrounds.
For screenshots: Optimizes text extraction algorithms based on the characteristics of screenshots captured on mobile phones. Currently, this function is available only in the Chinese mainland, supporting Chinese and English texts.
OCR features
Lightweight: This API greatly reduces the computing time and ROM space the algorithm model takes up, making your app more lightweight.
Customized hierarchical result return: You can choose to return the coordinates of text blocks, text lines, and text characters in the screenshot based on app requirements.
How to integrate General Text Recognition
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library
3. Client application development process.
Configure application on the AGC
Follow the steps
Step 1: Register a developer account in AppGallery Connect. If you already have one, skip this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project.
Step 3: Set the data storage location based on your current location.
Step 4: Generate a signing certificate fingerprint.
Step 5: Configure the signing certificate fingerprint.
Step 6: Download your agconnect-services.json file and paste it into the app directory of your project.
Apply for HiAI Engine Library
What is Huawei HiAI?
HiAI is Huawei’s AI computing platform. HUAWEI HiAI is a mobile terminal–oriented artificial intelligence (AI) computing platform that opens up three layers of ecology: service capability, application capability, and chip capability. This three-layer open platform, integrating terminals, chips, and the cloud, brings an extraordinary experience to users and developers.
How to apply for HiAI Engine?
Follow the steps
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter the required information, such as Product name and Package name, and click Next.
Step 4: Verify the application details and click Submit.
Step 5: Click Download SDK to open the SDK list.
Step 6: Unzip the downloaded SDK and add it to your Android project under the libs folder.
Step 7: Add the JAR/AAR dependencies to the app-level build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps
Step 1: Create an Android application in Android Studio (or any IDE you prefer).
Step 2: Add the app-level Gradle dependencies. In the Project view, choose Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level Gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permission in AndroidManifest.xml
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build application.
Initialize all views.
Java:
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
Request the runtime permission
Java:
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
Initialize vision base
Java:
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
Initialize text to speech
Java:
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
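Once the engine is initialized, the recognized text can be read aloud. A minimal usage sketch (textToSpeechString is filled by the handler shown later; QUEUE_FLUSH replaces any ongoing speech):
Java:
// Speak the recognized text; QUEUE_FLUSH drops anything currently being spoken.
if (textToSpeechString != null && !textToSpeechString.isEmpty()) {
    textToSpeech.speak(textToSpeechString, TextToSpeech.QUEUE_FLUSH, null);
}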
Create a TextDetector instance.
Java:
mTextDetector = new TextDetector(this);
Define a VisionImage.
Java:
VisionImage image = VisionImage.fromBitmap(mBitmap);
Create an instance of the Text class.
Java:
final Text result = new Text();
Create and set VisionTextConfiguration
Java:
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
Call the detect method to get the result.
Java:
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
Create Handler
Java:
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectTex();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
}
default:
break;
}
}
};
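Note that the handler above reads only the first text block (getBlocks().get(0)). If the detected image contains several blocks, a hedged sketch that aggregates the lines of every block could look like this (extractAllText is our own helper name):
Java:
private String extractAllText(Text result) {
    if (result == null || result.getBlocks() == null) {
        return "";
    }
    StringBuilder sb = new StringBuilder();
    // Walk every block, not just the first, and join its lines.
    for (int i = 0; i < result.getBlocks().size(); i++) {
        for (TextLine line : result.getBlocks().get(i).getTextLines()) {
            sb.append(line.getValue()).append(' ');
        }
    }
    return sb.toString().trim();
}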
The complete code is as follows.
Java:
import android.Manifest;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.os.Message;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hiai.vision.common.ConnectionCallback;
import com.huawei.hiai.vision.common.VisionBase;
import com.huawei.hiai.vision.common.VisionCallback;
import com.huawei.hiai.vision.common.VisionImage;
import com.huawei.hiai.vision.text.TextDetector;
import com.huawei.hiai.vision.visionkit.text.Text;
import com.huawei.hiai.vision.visionkit.text.TextDetectType;
import com.huawei.hiai.vision.visionkit.text.TextLine;
import com.huawei.hiai.vision.visionkit.text.config.TextConfiguration;
import com.huawei.hiai.vision.visionkit.text.config.VisionTextConfiguration;
import java.util.List;
import java.util.Locale;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int REQUEST_CHOOSE_PHOTO_CODE = 2;
private Bitmap mBitmap;
private ImageView mPlayAudio;
private ImageView mImageView;
private TextView mTxtViewResult;
protected ProgressDialog dialog;
private TextDetector mTextDetector;
Text imageText = null;
TextToSpeech textToSpeech;
String textToSpeechString = "";
private static final int TYPE_CHOOSE_PHOTO = 1;
private static final int TYPE_SHOW_RESULT = 2;
private static final int TYPE_TEXT_ERROR = 3;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initializeView();
requestPermissions();
initVision();
initializeTextToSpeech();
}
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
public void onChildClick(View view) {
switch (view.getId()) {
case R.id.btnSelect: {
Log.d(TAG, "Select an image");
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQUEST_CHOOSE_PHOTO_CODE);
break;
}
case R.id.playAudio: {
if (textToSpeechString != null && !textToSpeechString.isEmpty())
textToSpeech.speak(textToSpeechString, TextToSpeech.QUEUE_FLUSH, null);
break;
}
}
}
private void detectTex() {
/* create a TextDetector instance firstly */
mTextDetector = new TextDetector(this);
/*Define VisionImage and transfer the Bitmap image to be detected*/
VisionImage image = VisionImage.fromBitmap(mBitmap);
/*Define the Text class.*/
final Text result = new Text();
/*Use VisionTextConfiguration to select the type of the image to be called. */
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
/*Call the detect method of TextDetector to obtain the result*/
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
}
private void showDialog() {
if (dialog == null) {
dialog = new ProgressDialog(MainActivity.this);
dialog.setTitle("Detecting text...");
dialog.setMessage("Please wait...");
dialog.setIndeterminate(true);
dialog.setCancelable(false);
}
dialog.show();
}
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectTex();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
}
default:
break;
}
}
};
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CHOOSE_PHOTO_CODE && resultCode == Activity.RESULT_OK) {
if (data == null) {
return;
}
Uri selectedImage = data.getData();
getBitmap(selectedImage);
}
}
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
private void getBitmap(Uri imageUri) {
String[] pathColumn = {MediaStore.Images.Media.DATA};
Cursor cursor = getContentResolver().query(imageUri, pathColumn, null, null, null);
if (cursor == null) return;
cursor.moveToFirst();
int columnIndex = cursor.getColumnIndex(pathColumn[0]);
/* get image path */
String picturePath = cursor.getString(columnIndex);
cursor.close();
mBitmap = BitmapFactory.decodeFile(picturePath);
if (mBitmap == null) {
return;
}
//You can set image here
//mImageView.setImageBitmap(mBitmap);
// You can pass it handler as well
mHandler.sendEmptyMessage(TYPE_CHOOSE_PHOTO);
mTxtViewResult.setText("");
mPlayAudio.setEnabled(true);
}
private void dismissDialog() {
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
/* release ocr instance and free the npu resources*/
if (mTextDetector != null) {
mTextDetector.release();
}
dismissDialog();
if (mBitmap != null) {
mBitmap.recycle();
}
}
}
activity_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:fitsSystemWindows="true"
androidrientation="vertical"
android:background="@android:color/darker_gray">
<android.support.v7.widget.Toolbar
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="#ff0000"
android:elevation="10dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
androidrientation="horizontal">
<TextView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:text="Book Reader"
android:layout_gravity="center"
android:gravity="center|start"
android:layout_weight="1"
android:textColor="@android:color/white"
android:textStyle="bold"
android:textSize="20sp"/>
<ImageView
android:layout_width="40dp"
android:layout_height="40dp"
android:src="@drawable/ic_baseline_play_circle_outline_24"
android:layout_gravity="center|end"
android:layout_marginEnd="10dp"
android:id="@+id/playAudio"
androidadding="5dp"/>
</LinearLayout>
</android.support.v7.widget.Toolbar>
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:fitsSystemWindows="true">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
androidrientation="vertical"
android:background="@android:color/darker_gray"
>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="20dp"
android:layout_gravity="center">
<ImageView
android:id="@+id/imgViewPicture"
android:layout_width="300dp"
android:layout_height="300dp"
android:layout_margin="8dp"
android:layout_gravity="center_horizontal"
android:scaleType="fitXY" />
</android.support.v7.widget.CardView>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="10dp"
android:layout_gravity="center"
android:layout_marginBottom="20dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
androidrientation="vertical"
>
<TextView
android:layout_margin="5dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/black"
android:text="Text on the image"
android:textStyle="normal"
/>
<TextView
android:id="@+id/result"
android:layout_margin="5dp"
android:layout_marginBottom="20dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textSize="18sp"
android:textColor="#ff0000"/>
</LinearLayout>
</android.support.v7.widget.CardView>
<Button
android:id="@+id/btnSelect"
android:layout_width="match_parent"
android:layout_height="wrap_content"
androidnClick="onChildClick"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginBottom="10dp"
android:text="[USER=936943]@string[/USER]/select_picture"
android:background="@drawable/round_button_bg"
android:textColor="@android:color/white"
android:textAllCaps="false"/>
</LinearLayout>
</ScrollView>
</LinearLayout>
Result
Tips and Tricks
Maximum image width and height: 1440 px and 15210 px (if the image is larger, you will receive error code 200; see the sketch after this list).
Recommended photo parameters for optimal recognition accuracy:
Resolution > 720P
Aspect ratio < 2:1
If you capture an image from the camera or pick one from the gallery, make sure your app has the camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
Check that the dependencies are added properly.
The latest HMS Core APK is required.
The minimum SDK version is 21; otherwise, you will get a manifest merge issue.
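Since oversized images trigger error code 200, it can help to downscale the bitmap before calling the detector. A minimal sketch, assuming the 1440 px width limit above (scaleToMaxWidth is our own helper name):
Java:
// Scale the bitmap down so its width does not exceed the documented limit.
private Bitmap scaleToMaxWidth(Bitmap src, int maxWidth) {
    if (src == null || src.getWidth() <= maxWidth) {
        return src;
    }
    // Keep the aspect ratio while reducing the width.
    float ratio = (float) maxWidth / src.getWidth();
    int newHeight = Math.round(src.getHeight() * ratio);
    return Bitmap.createScaledBitmap(src, maxWidth, newHeight, true);
}
For example, call mBitmap = scaleToMaxWidth(mBitmap, 1440); before detectTex().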
Conclusion
In this article, we have learnt the following concepts:
What OCR is
What General Text Recognition is
Features of GTR
Features of OCR
How to integrate General Text Recognition using Huawei HiAI
How to apply for Huawei HiAI
How to build the application
Reference
General Text Recognition
Apply for Huawei HiAI
Happy coding



Surface detection with AR Engine

Introduction
AR Engine supports detecting the real world around the device through a capability called "environment tracking": it records illumination, plane, image, object, surface, and other environmental information to help your apps merge virtual objects into physical-world scenarios.
What is HUAWEI AR Engine?
HUAWEI AR Engine is a platform for building augmented reality (AR) apps on Android smartphones. It is based on the HiSilicon chipset and integrates AR core algorithms to provide basic AR capabilities such as motion tracking, environment tracking, body tracking, and face tracking, allowing your app to bridge the virtual world with the real world for a brand-new, visually interactive user experience.
Currently, HUAWEI AR Engine provides three types of capabilities, including motion tracking, environment tracking, and human body and face tracking.
Example Android Application
In this example we will work on environment tracking so that we can detect surfaces, such as a table or a floor.
Development Process
Creating an App
Create an app following instructions in Creating an AppGallery Connect Project and Adding an App to the Project.
Platform: Android
Device: Mobile phone
App category: App or Game
Integrating HUAWEI AR Engine SDK
Before development, integrate the HUAWEI AR Engine SDK via the Maven repository into your development environment.
Open the build.gradle file in the root directory of your Android Studio project.
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.huawei.agconnect:agcp:1.3.2.301'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
maven {url 'https://developer.huawei.com/repo/'}
google()
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
Open the build.gradle file in the app directory of your project
Code:
apply plugin: 'com.android.application'
android {
compileSdkVersion 30
buildToolsVersion "30.0.1"
defaultConfig {
applicationId "com.vsm.myarapplication"
minSdkVersion 27
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
testImplementation 'junit:junit:4.12'
//
implementation 'com.huawei.agconnect:agconnect-core:1.4.1.300'
//
implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
//
implementation 'de.javagl:obj:0.3.0'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
}
apply plugin: 'com.huawei.agconnect'
We create our main activity layout (activity_main.xml):
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<android.opengl.GLSurfaceView
android:id="@+id/surfaceview"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:layout_gravity="top" />
<TextView
android:id="@+id/wordTextView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="TextView"
android:textColor="@color/red"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/searchingTextView"
android:layout_width="match_parent"
android:layout_height="47dp"
android:layout_alignParentStart="true"
android:layout_alignParentTop="true"
android:layout_marginStart="2dp"
android:layout_marginTop="59dp"
android:layout_marginBottom="403dp"
android:gravity="center"
android:text="Please move the mobile phone slowly to find the plane"
android:textColor="#ffffff"
tools:layout_editor_absoluteX="0dp"
tools:layout_editor_absoluteY="512dp" />
<TextView
android:id="@+id/plane_other"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_other"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_floor"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_floor"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_wall"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_wall"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_table"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_table"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_seat"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_seat"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_ceiling"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_ceiling"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
</RelativeLayout>
AR Engine is not available on all devices, so we first need to check whether the device supports AR Engine and whether the AR Engine service is ready (the supported device list is available in the official documentation):
Code:
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
Code:
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
In MainActivity.java we set up the session and call the surface detection:
Code:
package com.vsm.myarapplication;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Toast;
import com.huawei.hiar.ARConfigBase;
import com.huawei.hiar.AREnginesApk;
import com.huawei.hiar.ARSession;
import com.huawei.hiar.ARWorldTrackingConfig;
import com.huawei.hiar.exceptions.ARCameraNotAvailableException;
import com.huawei.hiar.exceptions.ARUnSupportedConfigurationException;
import com.huawei.hiar.exceptions.ARUnavailableClientSdkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceApkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceNotInstalledException;
import com.vsm.myarapplication.common.ConnectAppMarketActivity;
import com.vsm.myarapplication.common.DisplayRotationManager;
import com.vsm.myarapplication.common.PermissionManager;
import com.vsm.myarapplication.rendering.WorldRenderManager;
import java.util.concurrent.ArrayBlockingQueue;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int MOTIONEVENT_QUEUE_CAPACITY = 2;
private static final int OPENGLES_VERSION = 2;
private ARSession mArSession;
private GLSurfaceView mSurfaceView;
private WorldRenderManager mWorldRenderManager;
private GestureDetector mGestureDetector;
private DisplayRotationManager mDisplayRotationManager;
private ArrayBlockingQueue<GestureEvent> mQueuedSingleTaps = new ArrayBlockingQueue<>(MOTIONEVENT_QUEUE_CAPACITY);
private String message = null;
private boolean isRemindInstall = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// AR Engine requires the camera permission.
PermissionManager.checkPermission(this);
mSurfaceView = findViewById(R.id.surfaceview);
mDisplayRotationManager = new DisplayRotationManager(this);
initGestureDetector();
mSurfaceView.setPreserveEGLContextOnPause(true);
mSurfaceView.setEGLContextClientVersion(OPENGLES_VERSION);
// Set the EGL configuration chooser, including for the number of
// bits of the color buffer and the number of depth bits.
mSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mWorldRenderManager = new WorldRenderManager(this, this);
mWorldRenderManager.setDisplayRotationManage(mDisplayRotationManager);
mWorldRenderManager.setQueuedSingleTaps(mQueuedSingleTaps);
mSurfaceView.setRenderer(mWorldRenderManager);
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
}
private void initGestureDetector() {
mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
@Override
public boolean onDoubleTap(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createDoubleTapEvent(motionEvent));
return true;
}
@Override
public boolean onSingleTapConfirmed(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createSingleTapConfirmEvent(motionEvent));
return true;
}
@Override
public boolean onDown(MotionEvent motionEvent) {
return true;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
onGestureEvent(GestureEvent.createScrollEvent(e1, e2, distanceX, distanceY));
return true;
}
});
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
return mGestureDetector.onTouchEvent(event);
}
});
}
private void onGestureEvent(GestureEvent e) {
boolean offerResult = mQueuedSingleTaps.offer(e);
if (offerResult) {
Log.i(TAG, "Successfully joined the queue.");
} else {
Log.i(TAG, "Failed to join queue.");
}
}
@Override
protected void onResume() {
Log.i(TAG, "onResume");
super.onResume();
Exception exception = null;
message = null;
if (mArSession == null) {
try {
if (!arEngineAbilityCheck()) {
finish();
return;
}
mArSession = new ARSession(getApplicationContext());
ARWorldTrackingConfig config = new ARWorldTrackingConfig(mArSession);
config.setFocusMode(ARConfigBase.FocusMode.AUTO_FOCUS);
config.setSemanticMode(ARWorldTrackingConfig.SEMANTIC_PLANE);
mArSession.configure(config);
mWorldRenderManager.setArSession(mArSession);
} catch (Exception capturedException) {
Log.e(TAG,capturedException.toString());
exception = capturedException;
setMessageWhenError(capturedException);
}
if (message != null) {
stopArSession(exception);
return;
}
}
try {
mArSession.resume();
} catch (ARCameraNotAvailableException e) {
Toast.makeText(this, "Camera open failed, please restart the app", Toast.LENGTH_LONG).show();
mArSession = null;
return;
}
mDisplayRotationManager.registerDisplayListener();
mSurfaceView.onResume();
}
@Override
protected void onPause() {
Log.i(TAG, "onPause start.");
super.onPause();
if (mArSession != null) {
mDisplayRotationManager.unregisterDisplayListener();
mSurfaceView.onPause();
mArSession.pause();
}
Log.i(TAG, "onPause end.");
}
@Override
protected void onDestroy() {
Log.i(TAG, "onDestroy start.");
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
super.onDestroy();
Log.i(TAG, "onDestroy end.");
}
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
private void stopArSession(Exception exception) {
Log.i(TAG, "stopArSession start.");
Toast.makeText(this, message, Toast.LENGTH_LONG).show();
Log.e(TAG, "Creating session error", exception);
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
Log.i(TAG, "stopArSession end.");
}
}
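The actual plane rendering happens inside WorldRenderManager, which is not shown here. For orientation, here is a hedged sketch of how a per-frame plane query could look, assuming the com.huawei.hiar API (ARSession.update(), getAllTrackables(), ARPlane.getLabel()) behaves as in the official world-AR sample; it requires imports for java.util.Collection, com.huawei.hiar.ARPlane, and com.huawei.hiar.ARTrackable:
Code:
private void logTrackedPlanes(ARSession session) {
    session.update(); // advance to the latest AR frame
    Collection<ARPlane> planes = session.getAllTrackables(ARPlane.class);
    for (ARPlane plane : planes) {
        if (plane.getTrackingState() == ARTrackable.TrackingState.TRACKING) {
            // Semantic labels such as PLANE_FLOOR, PLANE_TABLE or PLANE_WALL map to
            // the plane_floor, plane_table, ... TextViews declared in the layout.
            Log.i(TAG, "Plane label: " + plane.getLabel());
        }
    }
}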
Conclusion
We can detect surfaces for multiple purposes in a simple way.
Documentation:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/introduction-0000001050130900
Codelab:
https://developer.huawei.com/consumer/en/codelab/HWAREngine/index.html#0
Code Sample:
https://github.com/spartdark/hms-arengine-myarapplication

Validate your news: Feat. Huawei ML Kit (Text Image Super-Resolution)

Introduction
Quality improvement has become crucial in this era of digitalization, where all our documents are kept in folders, shared over the network, and read on digital devices.
Imagine the struggle of an elderly person who has no way to read and understand an old prescribed medical document that has become blurred and deteriorated.
Can we avoid such issues?
Let’s see what Huawei ML Kit offers to overcome such day-to-day challenges.
Huawei ML Kit provides the Text Image Super-Resolution API to improve the quality and visibility of old and blurred text in an image.
Text Image Super-Resolution can zoom in on an image that contains text and significantly improve the definition of the text.
Limitations
The text image super-resolution service supports images with a maximum resolution of 800 x 800 px; each side length must be greater than or equal to 64 px.
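Given these limits, a small guard before calling the analyzer avoids predictable failures. A minimal sketch (the bounds come from the limitation above; isValidForSuperResolution is our own helper name):
Code:
private boolean isValidForSuperResolution(Bitmap bitmap) {
    if (bitmap == null) {
        return false;
    }
    // Each side must be at least 64 px and at most 800 px.
    int w = bitmap.getWidth();
    int h = bitmap.getHeight();
    return w >= 64 && h >= 64 && w <= 800 && h <= 800;
}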
Development Overview
Prerequisite
Must have a Huawei Developer Account
Must have Android Studio 3.0 or later
Must have a Huawei phone with HMS Core 4.0.2.300 or later
EMUI 3.0 or later
Software Requirements
Java SDK 1.7 or later
Android 5.0 or later
Preparation
Create an app or project in Huawei AppGallery Connect.
Provide the SHA key and app package name of the project in the App Information section, and enable the ML Kit API.
Download the agconnect-services.json file.
Create an Android project.
Integration
Add the following to the project-level build.gradle file, under buildscript/repositories and allprojects/repositories.
Code:
maven { url 'https://developer.huawei.com/repo/' }
Add the following to the app-level build.gradle file, under dependencies.
To use the Base SDK of ML Kit Text Image Super-Resolution, add the following dependency:
Code:
dependencies{
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution:2.0.3.300'
}
Adding permissions
Code:
<uses-permission android:name="android.permission.CAMERA " />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Automatically Updating the Machine Learning Model
Add the following statements to the AndroidManifest.xml file to automatically install the machine learning model on the user’s device.
Code:
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value= "tisr"/>
Development Process
This article focuses on demonstrating the capabilities of Huawei ML Kit’s Text Image Super-Resolution API.
Here is an example that shows how we can integrate this powerful API to improve text-image quality and give readers full access to old and blurred newspapers from an online news directory.
TextImageView Activity: Launcher Activity
This is the main activity of the “The News Express” application.
Code:
package com.mlkitimagetext.example;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import com.mlkitimagetext.example.textimagesuperresolution.TextImageSuperResolutionActivity;
public class TextImageView extends AppCompatActivity {
Button NewsExpress;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_text_image_view);
NewsExpress = findViewById(R.id.bt1);
NewsExpress.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
startActivity(new Intent(TextImageView.this, TextImageSuperResolutionActivity.class));
}
});
}
}
Activity_text_image_view.xml
This is the layout for the above activity class.
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@drawable/im3">
<LinearLayout
android:id="@+id/ll_buttons"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="200dp"
android:orientation="vertical">
<Button
android:id="@+id/bt1"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:background="@android:color/transparent"
android:layout_gravity="center"
android:text="The News Express"
android:textAllCaps="false"
android:textStyle="bold"
android:textSize="34dp"
android:textColor="@color/mlkit_bcr_text_color_white"></Button>
<TextView
android:layout_width="wrap_content"
android:layout_height="match_parent"
android:textStyle="bold"
android:text="Validate Your News"
android:textSize="20sp"
android:layout_gravity="center"
android:textColor="#9fbfdf"/>
</LinearLayout>
</RelativeLayout>
TextImageSuperResolutionActivity
This activity class performs the following actions:
Implements an image picker to pick an image from the gallery
Converts the selected image to a Bitmap
Creates a text image super-resolution analyzer
Creates an MLFrame object from the android.graphics.Bitmap
Performs super-resolution processing on the image containing text
Stops the analyzer to release detection resources
Code:
package com.mlkitimagetext.example;
import android.content.Intent;
import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import android.view.View;
import android.widget.ImageView;
import android.widget.Toast;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.common.MLException;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolution;
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolutionAnalyzer;
import com.huawei.hms.mlsdk.textimagesuperresolution.MLTextImageSuperResolutionAnalyzerFactory;
import com.mlkitimagetext.example.R;
import androidx.appcompat.app.AppCompatActivity;
import java.io.IOException;
public class TextImageSuperResolutionActivity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "TextSuperResolutionActivity";
private MLTextImageSuperResolutionAnalyzer analyzer;
private static final int INDEX_3X = 1;
private static final int INDEX_ORIGINAL = 2;
private ImageView imageView;
private Bitmap srcBitmap;
Uri imageUri;
Boolean ImageSetupFlag = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_text_super_resolution);
imageView = findViewById(R.id.image);
imageView.setOnClickListener(this);
findViewById(R.id.btn_load).setOnClickListener(this);
createAnalyzer();
}
@Override
public void onClick(View view) {
if (view.getId() == R.id.btn_load) {
openGallery();
}else if (view.getId() == R.id.image)
{
if(ImageSetupFlag != true)
{
detectImage(INDEX_3X);
}else {
detectImage(INDEX_ORIGINAL);
ImageSetupFlag = false;
}
}
}
private void openGallery() {
Intent gallery = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(gallery, 1);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data){
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && requestCode == 1){
imageUri = data.getData();
try {
srcBitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
} catch (IOException e) {
e.printStackTrace();
}
//BitmapFactory.decodeResource(getResources(), R.drawable.new1);
imageView.setImageURI(imageUri);
}
}
private void release() {
if (analyzer == null) {
return;
}
analyzer.stop();
}
private void detectImage(int type) {
if (type == INDEX_ORIGINAL) {
setImage(srcBitmap);
return;
}
if (analyzer == null) {
return;
}
// Create an MLFrame by using the bitmap.
MLFrame frame = new MLFrame.Creator().setBitmap(srcBitmap).create();
Task<MLTextImageSuperResolution> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<MLTextImageSuperResolution>() {
public void onSuccess(MLTextImageSuperResolution result) {
// success.
Toast.makeText(getApplicationContext(), "Success", Toast.LENGTH_SHORT).show();
setImage(result.getBitmap());
ImageSetupFlag = true;
}
})
.addOnFailureListener(new OnFailureListener() {
public void onFailure(Exception e) {
// failure.
if (e instanceof MLException) {
MLException mlException = (MLException) e;
// Get the error code, developers can give different page prompts according to the error code.
int errorCode = mlException.getErrCode();
// Get the error message, developers can combine the error code to quickly locate the problem.
String errorMessage = mlException.getMessage();
Toast.makeText(getApplicationContext(), "Error:" + errorCode + " Message:" + errorMessage, Toast.LENGTH_SHORT).show();
} else {
// Other exception.
Toast.makeText(getApplicationContext(), "Failed:" + e.getMessage(), Toast.LENGTH_SHORT).show();
}
}
});
}
private void setImage(final Bitmap bitmap) {
imageView.setImageBitmap(bitmap);
}
private void createAnalyzer() {
analyzer = MLTextImageSuperResolutionAnalyzerFactory.getInstance().getTextImageSuperResolutionAnalyzer();
}
@Override
protected void onDestroy() {
super.onDestroy();
if (srcBitmap != null) {
srcBitmap.recycle();
}
release();
}
}
For more details, you can check https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202388336667910498&fid=0101187876626530001

Demystifying Document Skew Correction feat. HUAWEI ML KIT

Prolusion
This era is revolutionary for science and research, as most innovation is driven by consumer needs.
We all know that document scanning is a routine errand for most of us and a dire need in today’s digital world.
For such needs, we often require a powerful mechanism that can correct the informalities and skew in our documents.
Document skew correction is a technique that corrects tilted images to the right-facing angle, which further improves the visibility of the image.
Huawei ML Kit offers a robust skew correction API that automatically identifies the position of a document in an image and corrects the shooting angle. It also allows users to customize the edge points.
Suggestions
It is recommended that the shooting angle of the image be within 30 degrees.
It is recommended that the image size be within the range of 320 x 320 px to 1920 x 1920 px.
The skew detection API supports the JPG, JPEG, and PNG image formats.
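Based on these suggestions, it can help to verify the picked image's format before running detection. A minimal sketch, assuming the image comes from a content Uri (isSupportedFormat is our own helper name):
Code:
private boolean isSupportedFormat(Uri imageUri) {
    // The content resolver reports both JPG and JPEG as "image/jpeg".
    String mime = getContentResolver().getType(imageUri);
    return "image/jpeg".equals(mime) || "image/png".equals(mime);
}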
Development Overview
Prerequisite
Must have a Huawei Developer Account
Must have Android Studio 3.0 or later
Must have a Huawei phone with HMS Core 4.0.2.300 or later
EMUI 3.0 or later
Software Requirements
Java SDK 1.7 or later
Android 5.0 or later
Preparation
Create an app or project in Huawei AppGallery Connect.
Provide the SHA key and app package name of the project in the App Information section, and enable the ML Kit API.
Download the agconnect-services.json file.
Create an Android project.
Integration
Add the following to the project-level build.gradle file, under buildscript/repositories and allprojects/repositories.
maven { url 'https://developer.huawei.com/repo/' }
Add the following to the app-level build.gradle file, under dependencies.
To use the Base SDK of ML Kit Document Skew Correction, add the following dependency:
dependencies{
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-documentskew:2.0.4.300'
}
To use the Full SDK of ML Kit Document Skew Correction, add the following dependency:
dependencies{
// Import the Model Package.
implementation 'com.huawei.hms:ml-computer-vision-documentskew-model:2.0.4.300'
}
Adding permissions
<uses-permission android:name="android.permission.CAMERA " />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Automatically Updating the Machine Learning Model
Add the following statements to the AndroidManifest.xml file to automatically install the machine learning model on the user’s device.
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value= "dsc"/>
Development Process
This article focuses on demonstrating the capabilities of Huawei ML Kit’s Document Skew Correction API.
Here is the example of the “SUPER DOC” application, which lets users capture images or fetch them from the device’s local storage and correct them. It shows how we can integrate this powerful API to correct a skewed document image to the right angle, which eventually improves the readability of the document.
SkewDetect Activity
This activity is responsible for capturing or fetching images, detecting the skew, aligning the document, and providing the aligned document image as output.
Code:
package com.mlkit.documentSkewCorrection;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Point;
import android.net.Uri;
import android.os.Bundle;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.ImageView;
import android.widget.Toast;
import androidx.appcompat.app.AppCompatActivity;
import com.google.android.material.floatingactionbutton.FloatingActionButton;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionConstant;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionCoordinateInput;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionResult;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzer;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerFactory;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerSetting;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewDetectResult;
import com.mlkit.documentSkewCorrection.R;
public class SkewDetect extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "SkewDetectActivity";
private MLDocumentSkewCorrectionAnalyzer analyzer;
private ImageView mImageView;
private Bitmap bitmap;
Uri imageUri;
private MLDocumentSkewCorrectionCoordinateInput input;
private MLFrame mlFrame;
Boolean FlagCameraClickDone = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
this.setContentView(R.layout.activity_document_skew_correction);
this.findViewById(R.id.image_refine).setOnClickListener(this);
this.mImageView = this.findViewById(R.id.image_refine_result);
if(FlagCameraClickDone)
{
this.findViewById(R.id.image_refine).setVisibility(View.VISIBLE);
}
else
{
this.findViewById(R.id.image_refine).setVisibility(View.GONE);
}
// Create the setting.
MLDocumentSkewCorrectionAnalyzerSetting setting = new MLDocumentSkewCorrectionAnalyzerSetting
.Factory()
.create();
// Get the analyzer.
this.analyzer = MLDocumentSkewCorrectionAnalyzerFactory.getInstance().getDocumentSkewCorrectionAnalyzer(setting);
FloatingActionButton fab = (FloatingActionButton) findViewById(R.id.fab);
fab.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
FlagCameraClickDone = false;
Intent gallery = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(gallery, 1);
}
});
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data){
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && requestCode == 1){
imageUri = data.getData();
try {
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
// Create a MLFrame by using the bitmap.
this.mlFrame = new MLFrame.Creator().setBitmap(this.bitmap).create();
} catch (IOException e) {
e.printStackTrace();
}
//BitmapFactory.decodeResource(getResources(), R.drawable.new1);
FlagCameraClickDone = true;
this.findViewById(R.id.image_refine).setVisibility(View.VISIBLE);
mImageView.setImageURI(imageUri);
}
}
@Override
public void onClick(View v) {
this.analyzer();
}
private void analyzer() {
// Call document skew detect interface to get coordinate data
Task<MLDocumentSkewDetectResult> detectTask = this.analyzer.asyncDocumentSkewDetect(this.mlFrame);
detectTask.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewDetectResult>() {
@Override
public void onSuccess(MLDocumentSkewDetectResult detectResult) {
Log.e(TAG, detectResult.getResultCode() + ":");
if (detectResult != null) {
int resultCode = detectResult.getResultCode();
// Detect success.
if (resultCode == MLDocumentSkewCorrectionConstant.SUCCESS) {
Point leftTop = detectResult.getLeftTopPosition();
Point rightTop = detectResult.getRightTopPosition();
Point leftBottom = detectResult.getLeftBottomPosition();
Point rightBottom = detectResult.getRightBottomPosition();
List<Point> coordinates = new ArrayList<>();
coordinates.add(leftTop);
coordinates.add(rightTop);
coordinates.add(rightBottom);
coordinates.add(leftBottom);
SkewDetect.this.setDetectData(new MLDocumentSkewCorrectionCoordinateInput(coordinates));
SkewDetect.this.refineImg();
} else if (resultCode == MLDocumentSkewCorrectionConstant.IMAGE_DATA_ERROR) {
// Parameters error.
Log.e(TAG, "Parameters error!");
SkewDetect.this.displayFailure();
} else if (resultCode == MLDocumentSkewCorrectionConstant.DETECT_FAILD) {
// Detect failure.
Log.e(TAG, "Detect failed!");
SkewDetect.this.displayFailure();
}
} else {
// Detect exception.
Log.e(TAG, "Detect exception!");
SkewDetect.this.displayFailure();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for detect failure.
Log.e(TAG, e.getMessage() + "");
SkewDetect.this.displayFailure();
}
});
}
// Show result
private void displaySuccess(MLDocumentSkewCorrectionResult refineResult) {
if (this.bitmap == null) {
this.displayFailure();
return;
}
// Draw the portrait with a transparent background.
Bitmap corrected = refineResult.getCorrected();
if (corrected != null) {
this.mImageView.setImageBitmap(corrected);
} else {
this.displayFailure();
}
}
private void displayFailure() {
Toast.makeText(this.getApplicationContext(), "Fail", Toast.LENGTH_SHORT).show();
}
private void setDetectData(MLDocumentSkewCorrectionCoordinateInput input) {
this.input = input;
}
// Refine image
private void refineImg() {
// Call refine image interface
Task<MLDocumentSkewCorrectionResult> correctionTask = this.analyzer.asyncDocumentSkewCorrect(this.mlFrame, this.input);
correctionTask.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewCorrectionResult>() {
@Override
public void onSuccess(MLDocumentSkewCorrectionResult refineResult) {
if (refineResult != null) {
int resultCode = refineResult.getResultCode();
if (resultCode == MLDocumentSkewCorrectionConstant.SUCCESS) {
SkewDetect.this.displaySuccess(refineResult);
} else if (resultCode == MLDocumentSkewCorrectionConstant.IMAGE_DATA_ERROR) {
// Parameters error.
Log.e(TAG, "Parameters error!");
SkewDetect.this.displayFailure();
} else if (resultCode == MLDocumentSkewCorrectionConstant.CORRECTION_FAILD) {
// Correct failure.
Log.e(TAG, "Correct failed!");
SkewDetect.this.displayFailure();
}
} else {
// Correct exception.
Log.e(TAG, "Correct exception!");
SkewDetect.this.displayFailure();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Processing logic for refine failure.
SkewDetect.this.displayFailure();
}
});
}
@Override
protected void onDestroy() {
super.onDestroy();
if (this.analyzer != null) {
try {
this.analyzer.stop();
} catch (IOException e) {
Log.e(SkewDetect.TAG, "Stop failed: " + e.getMessage());
}
}
}
}
SkewDetect Activity Layout
This layout defines the UI of the activity.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@drawable/shape"
tools:context="com.huawei.mlkit.example.face.StillFaceAnalyseActivity">
<!-- Image view that displays the corrected result bitmap -->
<ImageView
android:id="@+id/image_refine_result"
android:layout_width="500dp"
android:layout_height="300dp"
android:layout_marginTop="20dp"></ImageView>
<RelativeLayout
android:id="@+id/relativeLayout1"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:layout_margin="20dp">
<!-- Button that triggers the ML Kit skew correction -->
<Button
android:id="@+id/image_refine"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:layout_alignParentBottom="true"
android:layout_gravity="start|bottom"
android:background="@color/emui_color_gray_1"
android:text=" Skew Correction "
android:textAllCaps="false"
android:textColor="@color/emui_color_gray_7"></Button>
<!-- Floating button that picks an image from the gallery -->
<com.google.android.material.floatingactionbutton.FloatingActionButton
android:id="@+id/fab"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:layout_alignParentBottom="true"
android:layout_gravity="end|bottom"
android:contentDescription="@string/camera"
android:outlineProvider="none"
android:src="@drawable/gall"
app:backgroundTint="@color/emui_color_gray_1"
app:borderWidth="0dp"
app:elevation="2dp" />
<!-- Floating button that captures an image with the camera -->
<com.google.android.material.floatingactionbutton.FloatingActionButton
android:id="@+id/cam"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_gravity="bottom"
android:layout_alignParentLeft="true"
android:contentDescription="@string/camera"
android:outlineProvider="none"
android:src="@drawable/icon_cam"
app:backgroundTint="@color/emui_color_gray_1"
app:borderWidth="0dp"
app:elevation="2dp" />
</RelativeLayout>
</RelativeLayout>
Results
Conclusion
In this article, we took a small step toward creating and demonstrating the integration of the Document Skew Correction API from Huawei ML Kit for better document image readability.
The upcoming article will cover the integration of multiple ML APIs in one powerful application.
Stay tuned!
References
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/documentskewcorrection-0000001051703156

Expert: Xamarin Android Online Book Store App Using In-App Purchase and Login with Huawei Id

Overview
In this article, I will create a demo app for an online book store with in-app purchases, in which users can easily buy a book and pay online. I have integrated HMS Account Kit and IAP Kit using the cross-platform Xamarin framework.
Account Kit Service Introduction
HMS Account Kit allows you to connect to the Huawei ecosystem using your HUAWEI ID from a range of devices, such as mobile phones, tablets, and smart screens.
It provides simple, secure, and quick sign-in and authorization functions, so users do not need to enter accounts and passwords and wait for authentication.
It complies with international standards and protocols such as OAuth 2.0 and OpenID Connect, and supports two-factor authentication (password authentication and mobile number authentication) to ensure high security.
HMS IAP Service Introduction
HMS In-App Purchases (IAP) Kit lets users purchase products from the application through a highly secure payment process. Users can purchase a variety of products or services, including common virtual products and subscriptions, directly within your app. It also provides a product management system (PMS) for managing the prices and languages of in-app products (including games) in multiple locations.
IAP supports the following three types of in-app products (a short query sketch in the native Java SDK follows the list):
1. Consumable: Consumables are used once, are depleted, and can be purchased again.
2. Non-consumable: Non-consumables are purchased once and do not expire.
3. Auto-renewable subscriptions: Users can purchase access to value-added functions or content in a specified period of time. The subscriptions are automatically renewed on a recurring basis until users decide to cancel.
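The product type maps directly onto the priceType field used when querying or purchasing products. Below is a minimal sketch using the native Java HMS IAP SDK (the Xamarin C# bindings used later in this article mirror these calls); the product IDs are placeholders that must match products configured in AppGallery Connect.
Java:
import android.app.Activity;
import android.util.Log;
import com.huawei.hms.iap.Iap;
import com.huawei.hms.iap.entity.ProductInfo;
import com.huawei.hms.iap.entity.ProductInfoReq;
import java.util.Arrays;

public class IapQuerySketch {
    public static void queryConsumables(Activity activity) {
        ProductInfoReq req = new ProductInfoReq();
        // priceType: 0 = consumable, 1 = non-consumable, 2 = auto-renewable subscription
        req.setPriceType(0);
        // Placeholder product IDs for illustration only.
        req.setProductIds(Arrays.asList("Book101", "Book102"));
        Iap.getIapClient(activity).obtainProductInfo(req)
                .addOnSuccessListener(result -> {
                    for (ProductInfo info : result.getProductInfoList()) {
                        Log.i("IAP", info.getProductId() + " -> " + info.getPrice());
                    }
                })
                .addOnFailureListener(e -> Log.e("IAP", "obtainProductInfo failed", e));
    }
}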
Prerequisite
1. Xamarin Framework
2. Huawei phone
3. Visual Studio 2019
App Gallery Integration process
1. Sign In and Create or Choose a project on AppGallery Connect portal.
2. Navigate to Project settings, download the configuration file, and add the SHA-256 key.
3. Navigate to General Information > Data Storage location.
4. Navigate to Manage APIs and enable the APIs required by the application.
5. Navigate to In-App Purchases and Copy Public Key.
6. Navigate to My apps > Operate, and then enter details in Add Product.
7. Click View and Edit for the product, enter the product price details, and then click Save.
8. Click Activate for product activation.
Xamarin Account Kit Setup Process
1. Download all the Xamarin plugin AAR and ZIP files from the URL below:
https://developer.huawei.com/consum...y-V1/xamarin-sdk-download-0000001050768441-V1
2. Open the XHwid-5.03.302.sln solution in Visual Studio.
Xamarin IAP Kit Setup Process
1. Download all the Xamarin plugin AAR and ZIP files from the URL below:
https://developer.huawei.com/consum...y-V1/xamarin-sdk-download-0000001051011015-V1
2. Open the XIAP-5.0.2.300.sln solution in Visual Studio.
3. In Solution Explorer, right-click the Jars folder, choose Add > Existing Item, and select the AAR files downloaded in Step 1.
4. Right-click each added AAR file and choose Properties > Build Action > LibraryProjectZip.
Note: Repeat Steps 3 and 4 for every AAR file.
5. Build the library projects to generate the DLL files.
Xamarin App Development
1. Open Visual Studio 2019 and Create a new project.
2. Navigate to Solution Explorer > Project > Assets and add the JSON file.
3. Navigate to Solution Explorer > Project > Add > Add New Folder.
4. Navigate to the created folder > Add > Add Existing and add all the DLL files.
5. Right-click each DLL file > Properties > Build Action > None.
6. Navigate to Solution Explorer > Project > References > Right-click > Add References, then browse and add all the DLL files from the recently added folder.
7. After the references are added, click OK.
MainActivity.cs
This activity performs all the operation regarding login with Huawei Id.
C#:
using System;
using Android.App;
using Android.Content;
using Android.Content.PM;
using Android.OS;
using Android.Runtime;
using Android.Support.Design.Widget;
using Android.Support.V4.App;
using Android.Support.V4.Content;
using Android.Support.V7.App;
using Android.Util;
using Android.Views;
using Android.Widget;
using Com.Huawei.Agconnect.Config;
using Com.Huawei.Hmf.Tasks;
using Com.Huawei.Hms.Common;
using Com.Huawei.Hms.Iap;
using Com.Huawei.Hms.Iap.Entity;
using Com.Huawei.Hms.Support.Hwid;
using Com.Huawei.Hms.Support.Hwid.Request;
using Com.Huawei.Hms.Support.Hwid.Result;
using Com.Huawei.Hms.Support.Hwid.Service;
namespace BookStore
{
[Activity(Label = "@string/app_name", Theme = "@style/AppTheme.NoActionBar", MainLauncher = true)]
public class MainActivity : AppCompatActivity
{
private Button btnLoginWithHuaweiId;
private HuaweiIdAuthParams mAuthParam;
public static IHuaweiIdAuthService mAuthManager;
private static String TAG = "MainActivity";
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
Xamarin.Essentials.Platform.Init(this, savedInstanceState);
SetContentView(Resource.Layout.activity_main);
Android.Support.V7.Widget.Toolbar toolbar = FindViewById<Android.Support.V7.Widget.Toolbar>(Resource.Id.toolbar);
SetSupportActionBar(toolbar);
btnLoginWithHuaweiId = FindViewById<Button>(Resource.Id.btn_huawei_id);
CheckIfIAPAvailable();
// Write code for Huawei id button click
mAuthParam = new HuaweiIdAuthParamsHelper(HuaweiIdAuthParams.DefaultAuthRequestParam)
.SetIdToken().SetEmail()
.SetAccessToken()
.CreateParams();
mAuthManager = HuaweiIdAuthManager.GetService(this, mAuthParam);
// Click listener for each button
btnLoginWithHuaweiId.Click += delegate
{
StartActivityForResult(mAuthManager.SignInIntent, 1011);
};
/*FloatingActionButton fab = FindViewById<FloatingActionButton>(Resource.Id.fab);
fab.Click += FabOnClick;*/
//check permissions
checkPermission(new string[] { Android.Manifest.Permission.Internet,
Android.Manifest.Permission.AccessNetworkState,
Android.Manifest.Permission.ReadSms,
Android.Manifest.Permission.ReceiveSms,
Android.Manifest.Permission.SendSms,
Android.Manifest.Permission.BroadcastSms}, 100);
}
protected override void OnActivityResult(int requestCode, Result resultCode, Intent data)
{
base.OnActivityResult(requestCode, resultCode, data);
if (requestCode == 1011)
{
//login success
Task authHuaweiIdTask = HuaweiIdAuthManager.ParseAuthResultFromIntent(data);
if (authHuaweiIdTask.IsSuccessful)
{
AuthHuaweiId huaweiAccount = (AuthHuaweiId)authHuaweiIdTask.TaskResult();
Log.Info(TAG, "signIn get code success.");
Log.Info(TAG, "ServerAuthCode: " + huaweiAccount.AuthorizationCode);
Toast.MakeText(Android.App.Application.Context, "SignIn Success", ToastLength.Short).Show();
}
else
{
Log.Info(TAG, "signIn failed: " + ((ApiException)authHuaweiIdTask.Exception).StatusCode);
Toast.MakeText(Android.App.Application.Context, ((ApiException)authHuaweiIdTask.Exception).StatusCode.ToString(), ToastLength.Short).Show();
Toast.MakeText(Android.App.Application.Context, "SignIn Failed", ToastLength.Short).Show();
}
}
}
public void checkPermission(string[] permissions, int requestCode)
{
foreach (string permission in permissions)
{
if (ContextCompat.CheckSelfPermission(this, permission) == Permission.Denied)
{
ActivityCompat.RequestPermissions(this, permissions, requestCode);
}
}
}
/*private void FabOnClick(object sender, EventArgs eventArgs)
{
View view = (View) sender;
Snackbar.Make(view, "Replace with your own action", Snackbar.LengthLong)
.SetAction("Action", (Android.Views.View.IOnClickListener)null).Show();
}*/
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Android.Content.PM.Permission[] grantResults)
{
Xamarin.Essentials.Platform.OnRequestPermissionsResult(requestCode, permissions, grantResults);
base.OnRequestPermissionsResult(requestCode, permissions, grantResults);
}
protected override void AttachBaseContext(Context context)
{
base.AttachBaseContext(context);
AGConnectServicesConfig config = AGConnectServicesConfig.FromContext(context);
config.OverlayWith(new HmsLazyInputStream(context));
}
private void CancelAuthorisation()
{
Task cancelAuthorizationTask = mAuthManager.CancelAuthorization();
Log.Info(TAG, "Cancel Authorisation");
cancelAuthorizationTask.AddOnCompleteListener(
new OnCompleteListener
(
this, "Cancel Authorization Success",
"Cancel Authorization Failed"
)
);
}
public void SignOut()
{
Task signOutTask = mAuthManager.SignOut();
signOutTask.AddOnSuccessListener(new OnSuccessListener(this, "SignOut Success"))
.AddOnFailureListener(new OnFailureListener("SignOut Failed"));
}
public class OnCompleteListener : Java.Lang.Object, IOnCompleteListener
{
//Message when task is successful
private string successMessage;
//Message when task is failed
private string failureMessage;
MainActivity context;
public OnCompleteListener(MainActivity context, string SuccessMessage, string FailureMessage)
{
this.context = context;
this.successMessage = SuccessMessage;
this.failureMessage = FailureMessage;
}
public void OnComplete(Task task)
{
if (task.IsSuccessful)
{
//do some thing while cancel success
Log.Info(TAG, successMessage);
//context.SignOut();
}
else
{
//do some thing while cancel failed
Exception exception = task.Exception;
if (exception is ApiException)
{
int statusCode = ((ApiException)exception).StatusCode;
Log.Info(TAG, failureMessage + ": " + statusCode);
}
//context.ManageHomeScreen(null, true);
}
}
}
public class OnSuccessListener : Java.Lang.Object, Com.Huawei.Hmf.Tasks.IOnSuccessListener
{
//Message when task is successful
private string successMessage;
MainActivity context;
public OnSuccessListener(MainActivity context, string SuccessMessage)
{
this.successMessage = SuccessMessage;
this.context = context;
}
public void OnSuccess(Java.Lang.Object p0)
{
Log.Info(TAG, successMessage);
Toast.MakeText(Android.App.Application.Context, successMessage, ToastLength.Short).Show();
Intent intent = new Intent(context, typeof(BookStoreActivity));
context.StartActivity(intent);
}
}
public class OnFailureListener : Java.Lang.Object, Com.Huawei.Hmf.Tasks.IOnFailureListener
{
//Message when task is failed
private string failureMessage;
public OnFailureListener(string FailureMessage)
{
this.failureMessage = FailureMessage;
}
public void OnFailure(Java.Lang.Exception p0)
{
Log.Info(TAG, failureMessage);
Toast.MakeText(Android.App.Application.Context, failureMessage, ToastLength.Short).Show();
}
}
public void CheckIfIAPAvailable()
{
IIapClient mClient = Iap.GetIapClient(this);
Task isEnvReady = mClient.IsEnvReady();
isEnvReady.AddOnSuccessListener(new ListenerImp(this)).AddOnFailureListener(new ListenerImp(this));
}
class ListenerImp : Java.Lang.Object, IOnSuccessListener, IOnFailureListener
{
private MainActivity mainActivity;
public ListenerImp(MainActivity mainActivity)
{
this.mainActivity = mainActivity;
}
public void OnSuccess(Java.Lang.Object IsEnvReadyResult)
{
// Obtain the execution result.
}
public void OnFailure(Java.Lang.Exception e)
{
Toast.MakeText(Android.App.Application.Context, "Feature Not available for your country", ToastLength.Short).Show();
if (e.GetType() == typeof(IapApiException))
{
IapApiException apiException = (IapApiException)e;
if (apiException.Status.StatusCode == OrderStatusCode.OrderHwidNotLogin)
{
// Not logged in.
//Call StartResolutionForResult to bring up the login page
}
else if (apiException.Status.StatusCode == OrderStatusCode.OrderAccountAreaNotSupported)
{
// The current region does not support HUAWEI IAP.
}
}
}
}
}
}
content_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
xmlns:hwads="http://schemas.android.com/apk/res-auto"
android:layout_height="match_parent"
android:background="#FFA500">
<TextView
android:id="@+id/txt_title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentTop="true"
android:layout_centerHorizontal="true"
android:clickable="true"
android:gravity="bottom"
android:padding="20dp"
android:text="Online Book Store"
android:textColor="#fff"
android:textSize="25dp"
android:textStyle="bold" />
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:orientation="vertical"
android:gravity="center_horizontal"
android:padding="10dp">
<TextView
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:gravity="center_horizontal"
android:text="Sign in"
android:textSize="30dp"
android:textColor="#fff" />
<Button
android:id="@+id/btn_huawei_id"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:background="#dd4b39"
android:text="Huawei Id"
android:textColor="#fff" />
<Button
android:id="@+id/btn_email_phone"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:background="#dd4b39"
android:text="Login with Phone/Email"
android:textColor="#fff" />
</LinearLayout>
<!--<com.huawei.hms.ads.banner.BannerView
android:id="@+id/hw_banner_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
hwads:adId="@string/banner_ad_id"
hwads:bannerSize="BANNER_SIZE_320_50"/>-->
</RelativeLayout>
BookStoreActivity.cs
This activity performs all the operations regarding in-app purchasing and displays the list of books.
C#:
using Android.App;
using Android.Content;
using Android.OS;
using Android.Runtime;
using Android.Support.V7.App;
using Android.Support.V7.Widget;
using Android.Util;
using Android.Views;
using Android.Widget;
using Com.Huawei.Hmf.Tasks;
using Com.Huawei.Hms.Iap;
using Com.Huawei.Hms.Iap.Entity;
using Org.Json;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
namespace BookStore
{
[Activity(Label = "BookStoreActivity", Theme = "@style/AppTheme")]
public class BookStoreActivity : AppCompatActivity, BuyProduct
{
private static String TAG = "BookStoreActivity";
private RecyclerView recyclerView;
private BookStoreAdapter storeAdapter;
IList<ProductInfo> productList;
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
Xamarin.Essentials.Platform.Init(this, savedInstanceState);
SetContentView(Resource.Layout.activity_store);
recyclerView = FindViewById<RecyclerView>(Resource.Id.recyclerview);
recyclerView.SetLayoutManager(new LinearLayoutManager(this));
recyclerView.SetItemAnimator(new DefaultItemAnimator());
//ADAPTER
storeAdapter = new BookStoreAdapter(this);
storeAdapter.SetData(productList);
recyclerView.SetAdapter(storeAdapter);
GetProducts();
}
private void GetProducts()
{
List<String> productIdList = new List<String>();
productIdList.Add("Book101");
productIdList.Add("Book102");
ProductInfoReq req = new ProductInfoReq();
// PriceType: 0: consumable; 1: non-consumable; 2: auto-renewable subscription
req.PriceType = 0;
req.ProductIds = productIdList;
//"this" in the code is a reference to the current activity
Task task = Iap.GetIapClient(this).ObtainProductInfo(req);
task.AddOnSuccessListener(new QueryProductListenerImp(this)).AddOnFailureListener(new QueryProductListenerImp(this));
}
class QueryProductListenerImp : Java.Lang.Object, IOnSuccessListener, IOnFailureListener
{
private BookStoreActivity storeActivity;
public QueryProductListenerImp(BookStoreActivity storeActivity)
{
this.storeActivity = storeActivity;
}
public void OnSuccess(Java.Lang.Object result)
{
// Obtain the result
ProductInfoResult productlistwrapper = (ProductInfoResult)result;
IList<ProductInfo> productList = productlistwrapper.ProductInfoList;
storeActivity.storeAdapter.SetData(productList);
storeActivity.storeAdapter.NotifyDataSetChanged();
}
public void OnFailure(Java.Lang.Exception e)
{
//get the status code and handle the error
}
}
public void OnBuyProduct(ProductInfo pInfo)
{
//Toast.MakeText(Android.App.Application.Context, pInfo.ProductName, ToastLength.Short).Show();
CreatePurchaseRequest(pInfo);
}
private void CreatePurchaseRequest(ProductInfo pInfo)
{
// Constructs a PurchaseIntentReq object.
PurchaseIntentReq req = new PurchaseIntentReq();
// The product ID is the same as that set by a developer when configuring product information in AppGallery Connect.
// PriceType: 0: consumable; 1: non-consumable; 2: auto-renewable subscription
req.PriceType = pInfo.PriceType;
req.ProductId = pInfo.ProductId;
//"this" in the code is a reference to the current activity
Task task = Iap.GetIapClient(this).CreatePurchaseIntent(req);
task.AddOnSuccessListener(new BuyListenerImp(this)).AddOnFailureListener(new BuyListenerImp(this));
}
protected override void OnActivityResult(int requestCode, Android.App.Result resultCode, Intent data)
{
base.OnActivityResult(requestCode, resultCode, data);
if (requestCode == 6666)
{
if (data == null)
{
Log.Error(TAG, "data is null");
return;
}
//"this" in the code is a reference to the current activity
PurchaseResultInfo purchaseIntentResult = Iap.GetIapClient(this).ParsePurchaseResultInfoFromIntent(data);
switch (purchaseIntentResult.ReturnCode)
{
case OrderStatusCode.OrderStateCancel:
// User cancel payment.
Toast.MakeText(Android.App.Application.Context, "Payment Cancelled", ToastLength.Short).Show();
break;
case OrderStatusCode.OrderStateFailed:
Toast.MakeText(Android.App.Application.Context, "Order Failed", ToastLength.Short).Show();
break;
case OrderStatusCode.OrderProductOwned:
// check if there exists undelivered products.
Toast.MakeText(Android.App.Application.Context, "Undelivered Products", ToastLength.Short).Show();
break;
case OrderStatusCode.OrderStateSuccess:
// pay success.
Toast.MakeText(Android.App.Application.Context, "Payment Success", ToastLength.Short).Show();
// use the public key of your app to verify the signature.
// If ok, you can deliver your products.
// If the user purchased a consumable product, call the ConsumeOwnedPurchase API to consume it after successfully delivering the product.
String inAppPurchaseDataStr = purchaseIntentResult.InAppPurchaseData;
MakeProductReconsumable(inAppPurchaseDataStr);
break;
default:
break;
}
return;
}
}
private void MakeProductReconsumable(String InAppPurchaseDataStr)
{
String purchaseToken = null;
try
{
InAppPurchaseData InAppPurchaseDataBean = new InAppPurchaseData(InAppPurchaseDataStr);
if (InAppPurchaseDataBean.PurchaseStatus != InAppPurchaseData.PurchaseState.Purchased)
{
return;
}
purchaseToken = InAppPurchaseDataBean.PurchaseToken;
}
catch (JSONException e) { Log.Error(TAG, "Failed to parse in-app purchase data: " + e.Message); }
ConsumeOwnedPurchaseReq req = new ConsumeOwnedPurchaseReq();
req.PurchaseToken = purchaseToken;
//"this" in the code is a reference to the current activity
Task task = Iap.GetIapClient(this).ConsumeOwnedPurchase(req);
task.AddOnSuccessListener(new ConsumListenerImp()).AddOnFailureListener(new ConsumListenerImp());
}
class ConsumListenerImp : Java.Lang.Object, IOnSuccessListener, IOnFailureListener
{
public void OnSuccess(Java.Lang.Object result)
{
// Obtain the result
Log.Info(TAG, "Product available for purchase");
}
public void OnFailure(Java.Lang.Exception e)
{
//get the status code and handle the error
Log.Info(TAG, "Product available for purchase API Failed");
}
}
class BuyListenerImp : Java.Lang.Object, IOnSuccessListener, IOnFailureListener
{
private BookStoreActivity storeActivity;
public BuyListenerImp(BookStoreActivity storeActivity)
{
this.storeActivity = storeActivity;
}
public void OnSuccess(Java.Lang.Object result)
{
// Obtain the payment result.
PurchaseIntentResult InResult = (PurchaseIntentResult)result;
if (InResult.Status != null)
{
// 6666 is an int constant defined by the developer.
InResult.Status.StartResolutionForResult(storeActivity, 6666);
}
}
public void OnFailure(Java.Lang.Exception e)
{
//get the status code and handle the error
Toast.MakeText(Android.App.Application.Context, "Purchase Request Failed !", ToastLength.Short).Show();
}
}
}
}
activity_store.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:padding="5dp"
android:background="#ADD8E6">
<android.support.v7.widget.RecyclerView
android:id="@+id/recyclerview"
android:layout_width="match_parent"
android:layout_height="wrap_content" />
</LinearLayout>
Xamarin App Build Result
1. Navigate to Solution Explorer > Project > Right-click > Archive/View Archive to generate the SHA-256 for the release build, and click Distribute.
2. Choose Distribution Channel > Ad Hoc to sign the APK.
3. Choose Demo Keystore to release the APK.
4. After the build succeeds, save the APK file.
5. Finally, here is the result.
Tips and Tricks
1. It is recommended that the app obtain the public payment key from your server in real time. Do not hard-code it in the app, to prevent version incompatibility caused by a subsequent key upgrade (see the signature-verification sketch after this list).
2. The sandbox testing function can be used only when the following conditions are met: a sandbox testing account has been added successfully, and the versionCode of the test package is greater than that of the released package. In HMS Core IAP SDK 4.0.2, the isSandboxActivated API was added to check whether the current sandbox testing environment is available; if not, the API returns the reason why the environment is unavailable (see the sandbox-check sketch after this list).
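For illustration, here is a minimal Java sketch of verifying the purchase-data signature that IAP returns alongside the result, using that public key. It assumes the classic RSA with SHA-256 signing scheme; newer SDK versions may use RSA-PSS, so confirm the algorithm for your SDK version. Here content is the InAppPurchaseData JSON string and sign is the accompanying signature.
Java:
import android.util.Base64;
import java.nio.charset.StandardCharsets;
import java.security.KeyFactory;
import java.security.PublicKey;
import java.security.Signature;
import java.security.spec.X509EncodedKeySpec;

public class IapSignatureVerifier {
    // Returns true only if the signature matches the purchase data.
    public static boolean verify(String content, String sign, String publicKeyBase64) {
        try {
            byte[] keyBytes = Base64.decode(publicKeyBase64, Base64.DEFAULT);
            PublicKey publicKey = KeyFactory.getInstance("RSA")
                    .generatePublic(new X509EncodedKeySpec(keyBytes));
            Signature signature = Signature.getInstance("SHA256WithRSA");
            signature.initVerify(publicKey);
            signature.update(content.getBytes(StandardCharsets.UTF_8));
            return signature.verify(Base64.decode(sign, Base64.DEFAULT));
        } catch (Exception e) {
            return false;
        }
    }
}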
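And a minimal sketch of the sandbox check, again in the native Java SDK (the Xamarin bindings expose the same call); it only logs the outcome:
Java:
import android.app.Activity;
import android.util.Log;
import com.huawei.hms.iap.Iap;
import com.huawei.hms.iap.entity.IsSandboxActivatedReq;

public class SandboxCheckSketch {
    public static void checkSandbox(Activity activity) {
        Iap.getIapClient(activity).isSandboxActivated(new IsSandboxActivatedReq())
                .addOnSuccessListener(result ->
                        Log.i("IAP", "Sandbox testing environment is available"))
                .addOnFailureListener(e ->
                        Log.e("IAP", "Sandbox not available: " + e.getMessage()));
    }
}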
Conclusion
In this article, we have learned how to integrate HMS In-App Purchases and Account Kit in a Xamarin-based Android application. Users can easily log in and purchase a book online with hassle-free payment.
Thanks for reading this article.
Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
Account Kit
In-App Purchase

Pygmy Collection Application Part 7 (Document Skew correction Huawei HiAI)

Introduction​
If you are new to this application, please follow my previous articles:
Pygmy collection application Part 1 (Account kit)
Intermediate: Pygmy Collection Application Part 2 (Ads Kit)
Intermediate: Pygmy Collection Application Part 3 (Crash service)
Intermediate: Pygmy Collection Application Part 4 (Analytics Kit Custom Events)
Intermediate: Pygmy Collection Application Part 5 (Safety Detect)
Intermediate: Pygmy Collection Application Part 6 (Room database)
In this article, we will learn how to integrate Huawei document skew correction using Huawei HiAI in the Pygmy collection finance application.
In the pygmy collection application, a KYC update needs to happen for customers, and agents upload the KYC documents. The captured document should be properly aligned, so we will integrate document skew correction to adjust the image angle.
Users commonly struggle while uploading documents or filling in forms because of poor document photos. This application lets them take a picture with the camera or pick one from the gallery, and it automatically detects the document in the image.
Document skew correction improves the document photography process by automatically identifying the document in an image and returning the position of the document in the original image.
It also adjusts the shooting angle of the document based on that position information. This function performs well in scenarios where old photos, paper letters, and drawings are photographed for electronic storage.
Features
Document detection: Recognizes documents in images and returns the location information of the documents in the original images.
Document correction: Corrects the document shooting angle based on the document location information in the original images, where areas to be corrected can be customized.
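These two features map one-to-one onto two analyzer calls. Below is a condensed sketch of the flow, assuming bitmap already holds the document photo; the full activity later in this article adds rotation handling, scaling, and the UI.
Java:
import android.graphics.Bitmap;
import android.graphics.Point;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzer;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerFactory;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerSetting;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionCoordinateInput;
import java.util.Arrays;
import java.util.List;

public class SkewCorrectionSketch {
    public static void detectAndCorrect(Bitmap bitmap) {
        MLDocumentSkewCorrectionAnalyzer analyzer = MLDocumentSkewCorrectionAnalyzerFactory
                .getInstance()
                .getDocumentSkewCorrectionAnalyzer(
                        new MLDocumentSkewCorrectionAnalyzerSetting.Factory().create());
        MLFrame frame = MLFrame.fromBitmap(bitmap);
        // Step 1: document detection returns the four corner positions.
        analyzer.asyncDocumentSkewDetect(frame).addOnSuccessListener(detect -> {
            if (detect.getResultCode() != 0) {
                return; // The image does not meet the detection requirements.
            }
            List<Point> corners = Arrays.asList(
                    detect.getLeftTopPosition(), detect.getRightTopPosition(),
                    detect.getRightBottomPosition(), detect.getLeftBottomPosition());
            // Step 2: document correction fixes the shooting angle using the corners.
            analyzer.asyncDocumentSkewCorrect(frame,
                    new MLDocumentSkewCorrectionCoordinateInput(corners))
                    .addOnSuccessListener(correct -> {
                        Bitmap corrected = correct.getCorrected(); // display or save
                    });
        });
    }
}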
How to integrate Document Skew Correction
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library.
3. Client application development process.
Configure application on the AGC
Follow the steps.
Step 1: We need to register as a developer account in AppGallery Connect. If you are already a developer ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project
Step 3: Set the data storage location based on the current location.
Step 4: Generating a Signing Certificate Fingerprint.
Step 5: Configuring the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file, paste it into the app root directory.
Apply for HiAI Engine Library
What is Huawei HiAI?
HiAI is Huawei’s AI computing platform. HUAWEI HiAI is a mobile terminal–oriented artificial intelligence (AI) computing platform that constructs three layers of ecology: service capability openness, application capability openness, and chip capability openness. The three-layer open platform that integrates terminals, chips, and the cloud brings more extraordinary experience for users and developers.
How to apply for HiAI Engine?
Follow the steps.
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter required information like Product name and Package name, click Next button.
Step 4: Verify the application details and click Submit button.
Step 5: Click the Download SDK button to open the SDK list.
Step 6: Unzip downloaded SDK and add into your android project under libs folder.
Step 7: Add jar files dependences into app build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps.
Step 1: Create an Android application in Android Studio (or any IDE of your choice).
Step 2: Add the app-level Gradle dependencies inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root-level Gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permission in AndroidManifest.xml
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build application.
Select image dialog
Java:
private void selectImage() {
final CharSequence[] items = {"Take Photo", "Choose from Library",
"Cancel"};
AlertDialog.Builder builder = new AlertDialog.Builder(KycUpdateActivity.this);
builder.setTitle("Add Photo!");
builder.setItems(items, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int item) {
boolean result = PygmyUtils.checkPermission(KycUpdateActivity.this);
if (items[item].equals("Take Photo")) {
/*userChoosenTask = "Take Photo";
if (result)
cameraIntent();*/
operate_type = TAKE_PHOTO;
requestPermission(Manifest.permission.CAMERA);
} else if (items[item].equals("Choose from Library")) {
/* userChoosenTask = "Choose from Library";
if (result)
galleryIntent();*/
operate_type = SELECT_ALBUM;
requestPermission(Manifest.permission.READ_EXTERNAL_STORAGE);
} else if (items[item].equals("Cancel")) {
dialog.dismiss();
}
}
});
builder.show();
}
Open Document Skew correction activity
Java:
private void startDocumentSkewCorrectionActivity() {
Intent intent = new Intent(KycUpdateActivity.this, DocumentSkewCorrectionActivity.class);
intent.putExtra("operate_type", operate_type);
startActivityForResult(intent, DOCUMENT_SKEW_CORRECTION_REQUEST);
}
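On the caller's side, the result can be consumed in onActivityResult. Below is a sketch assuming the DOCUMENT_SKEW_CORRECTION_REQUEST constant above; loadCorrectedBitmap() and kycImageView are hypothetical, shown for illustration only, since DocumentSkewCorrectionActivity (next) persists the corrected bitmap via UserDataUtils.saveBitMap() and only returns a status extra.
Java:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == DOCUMENT_SKEW_CORRECTION_REQUEST
            && resultCode == Activity.RESULT_OK
            && data != null
            && "success".equals(data.getStringExtra("status"))) {
        // Reload the corrected document image that the correction activity saved.
        Bitmap corrected = loadCorrectedBitmap(); // hypothetical helper
        kycImageView.setImageBitmap(corrected);   // hypothetical ImageView field
    }
}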
DocumentSkewCorrectionActivity.java
Java:
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.ProgressDialog;
import android.content.ContentValues;
import android.content.DialogInterface;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.graphics.Point;
import android.media.ExifInterface;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzer;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerFactory;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerSetting;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionCoordinateInput;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionResult;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewDetectResult;
import com.shea.pygmycollection.R;
import com.shea.pygmycollection.customview.DocumentCorrectImageView;
import com.shea.pygmycollection.utils.FileUtils;
import com.shea.pygmycollection.utils.UserDataUtils;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
public class DocumentSkewCorrectionActivity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "SuperResolutionActivity";
private static final int REQUEST_SELECT_IMAGE = 1000;
private static final int REQUEST_TAKE_PHOTO = 1;
private ImageView desImageView;
private ImageButton adjustImgButton;
private Bitmap srcBitmap;
private Bitmap getCompressesBitmap;
private Uri imageUri;
private MLDocumentSkewCorrectionAnalyzer analyzer;
private Bitmap corrected;
private ImageView back;
private Task<MLDocumentSkewCorrectionResult> correctionTask;
private DocumentCorrectImageView documetScanView;
private Point[] _points;
private RelativeLayout layout_image;
private MLFrame frame;
TextView selectTv;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_document_skew_correction);
//setStatusBarColor(this, R.color.black);
analyzer = createAnalyzer();
adjustImgButton = findViewById(R.id.adjust);
layout_image = findViewById(R.id.layout_image);
desImageView = findViewById(R.id.des_image);
documetScanView = findViewById(R.id.iv_documetscan);
back = findViewById(R.id.back);
selectTv = findViewById(R.id.selectTv);
adjustImgButton.setOnClickListener(this);
findViewById(R.id.back).setOnClickListener(this);
selectTv.setOnClickListener(this);
findViewById(R.id.rl_chooseImg).setOnClickListener(this);
back.setOnClickListener(this);
int operate_type = getIntent().getIntExtra("operate_type", 101);
if (operate_type == 101) {
takePhoto();
} else if (operate_type == 102) {
selectLocalImage();
}
}
private String[] chooseTitles;
@Override
public void onClick(View v) {
if (v.getId() == R.id.adjust) {
List<Point> points = new ArrayList<>();
Point[] cropPoints = documetScanView.getCropPoints();
if (cropPoints != null) {
points.add(cropPoints[0]);
points.add(cropPoints[1]);
points.add(cropPoints[2]);
points.add(cropPoints[3]);
}
MLDocumentSkewCorrectionCoordinateInput coordinateData = new MLDocumentSkewCorrectionCoordinateInput(points);
getCorrectionResult(coordinateData, frame);
} else if (v.getId() == R.id.rl_chooseImg) {
chooseTitles = new String[]{getResources().getString(R.string.take_photo), getResources().getString(R.string.select_from_album)};
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setItems(chooseTitles, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int position) {
if (position == 0) {
takePhoto();
} else {
selectLocalImage();
}
}
});
builder.create().show();
} else if (v.getId() == R.id.selectTv) {
if (corrected == null) {
Toast.makeText(this, "Document Skew correction is not yet success", Toast.LENGTH_SHORT).show();
return;
} else {
ProgressDialog pd = new ProgressDialog(this);
pd.setMessage("Please wait...");
pd.show();
//UserDataUtils.saveBitMap(this, corrected);
Intent intent = new Intent();
intent.putExtra("status", "success");
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
if (pd != null && pd.isShowing()) {
pd.dismiss();
}
setResult(Activity.RESULT_OK, intent);
finish();
}
}, 3000);
}
} else if (v.getId() == R.id.back) {
finish();
}
}
private MLDocumentSkewCorrectionAnalyzer createAnalyzer() {
MLDocumentSkewCorrectionAnalyzerSetting setting = new MLDocumentSkewCorrectionAnalyzerSetting
.Factory()
.create();
return MLDocumentSkewCorrectionAnalyzerFactory.getInstance().getDocumentSkewCorrectionAnalyzer(setting);
}
private void takePhoto() {
layout_image.setVisibility(View.GONE);
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (takePictureIntent.resolveActivity(this.getPackageManager()) != null) {
ContentValues values = new ContentValues();
values.put(MediaStore.Images.Media.TITLE, "New Picture");
values.put(MediaStore.Images.Media.DESCRIPTION, "From Camera");
this.imageUri = this.getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, this.imageUri);
this.startActivityForResult(takePictureIntent, DocumentSkewCorrectionActivity.this.REQUEST_TAKE_PHOTO);
}
}
private void selectLocalImage() {
layout_image.setVisibility(View.GONE);
Intent intent = new Intent(Intent.ACTION_PICK, null);
intent.setDataAndType(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, "image/*");
startActivityForResult(intent, REQUEST_SELECT_IMAGE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_SELECT_IMAGE && resultCode == Activity.RESULT_OK) {
imageUri = data.getData();
try {
if (imageUri != null) {
srcBitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), imageUri);
String realPathFromURI = getRealPathFromURI(imageUri);
int i = readPictureDegree(realPathFromURI);
Bitmap spBitmap = rotateImageView(i, srcBitmap);
Matrix matrix = new Matrix();
matrix.setScale(0.5f, 0.5f);
getCompressesBitmap = Bitmap.createBitmap(spBitmap, 0, 0, spBitmap.getWidth(),
spBitmap.getHeight(), matrix, true);
reloadAndDetectImage();
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
} else if (requestCode == REQUEST_TAKE_PHOTO && resultCode == Activity.RESULT_OK) {
try {
if (imageUri != null) {
srcBitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), imageUri);
String realPathFromURI = getRealPathFromURI(imageUri);
int i = readPictureDegree(realPathFromURI);
Bitmap spBitmap = rotateImageView(i, srcBitmap);
Matrix matrix = new Matrix();
matrix.setScale(0.5f, 0.5f);
getCompressesBitmap = Bitmap.createBitmap(spBitmap, 0, 0, spBitmap.getWidth(),
spBitmap.getHeight(), matrix, true);
reloadAndDetectImage();
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
} else if (requestCode == REQUEST_SELECT_IMAGE && resultCode == Activity.RESULT_CANCELED) {
finish();
}
}
private void reloadAndDetectImage() {
if (imageUri == null) {
return;
}
frame = MLFrame.fromBitmap(getCompressesBitmap);
Task<MLDocumentSkewDetectResult> task = analyzer.asyncDocumentSkewDetect(frame);
task.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewDetectResult>() {
public void onSuccess(MLDocumentSkewDetectResult result) {
if (result.getResultCode() != 0) {
corrected = null;
Toast.makeText(DocumentSkewCorrectionActivity.this, "The picture does not meet the requirements.", Toast.LENGTH_SHORT).show();
} else {
// Recognition success.
Point leftTop = result.getLeftTopPosition();
Point rightTop = result.getRightTopPosition();
Point leftBottom = result.getLeftBottomPosition();
Point rightBottom = result.getRightBottomPosition();
_points = new Point[4];
_points[0] = leftTop;
_points[1] = rightTop;
_points[2] = rightBottom;
_points[3] = leftBottom;
layout_image.setVisibility(View.GONE);
documetScanView.setImageBitmap(getCompressesBitmap);
documetScanView.setPoints(_points);
}
}
}).addOnFailureListener(new OnFailureListener() {
public void onFailure(Exception e) {
Toast.makeText(getApplicationContext(), e.getMessage(), Toast.LENGTH_SHORT).show();
}
});
}
private void getCorrectionResult(MLDocumentSkewCorrectionCoordinateInput coordinateData, MLFrame frame) {
try {
correctionTask = analyzer.asyncDocumentSkewCorrect(frame, coordinateData);
} catch (Exception e) {
Log.e(TAG, "The image does not meet the detection requirements.");
}
try {
correctionTask.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewCorrectionResult>() {
@Override
public void onSuccess(MLDocumentSkewCorrectionResult refineResult) {
// Correction succeeded.
if (refineResult != null && refineResult.getResultCode() == 0) {
corrected = refineResult.getCorrected();
layout_image.setVisibility(View.VISIBLE);
desImageView.setImageBitmap(corrected);
UserDataUtils.saveBitMap(DocumentSkewCorrectionActivity.this, corrected);
} else {
Toast.makeText(DocumentSkewCorrectionActivity.this, "The check fails.", Toast.LENGTH_SHORT).show();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Toast.makeText(DocumentSkewCorrectionActivity.this, "The check fails.", Toast.LENGTH_SHORT).show();
}
});
} catch (Exception e) {
Log.e(TAG, "Please set an image.");
}
}
@Override
protected void onDestroy() {
super.onDestroy();
if (srcBitmap != null) {
srcBitmap.recycle();
}
if (getCompressesBitmap != null) {
getCompressesBitmap.recycle();
}
if (corrected != null) {
corrected.recycle();
}
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
}
}
public static int readPictureDegree(String path) {
int degree = 0;
try {
ExifInterface exifInterface = new ExifInterface(path);
int orientation = exifInterface.getAttributeInt(
ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
case ExifInterface.ORIENTATION_ROTATE_90:
degree = 90;
break;
case ExifInterface.ORIENTATION_ROTATE_180:
degree = 180;
break;
case ExifInterface.ORIENTATION_ROTATE_270:
degree = 270;
break;
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
return degree;
}
private String getRealPathFromURI(Uri contentURI) {
String result;
result = FileUtils.getFilePathByUri(this, contentURI);
return result;
}
public static Bitmap rotateImageView(int angle, Bitmap bitmap) {
Matrix matrix = new Matrix();
matrix.postRotate(angle);
Bitmap resizedBitmap = Bitmap.createBitmap(bitmap, 0, 0,
bitmap.getWidth(), bitmap.getHeight(), matrix, true);
return resizedBitmap;
}
}
activity_document_skew_correction.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000000"
tools:ignore="MissingDefaultResource">
<LinearLayout
android:id="@+id/linear_views"
android:layout_width="match_parent"
android:layout_height="80dp"
android:layout_alignParentBottom="true"
android:layout_marginBottom="10dp"
android:orientation="horizontal">
<RelativeLayout
android:id="@+id/rl_chooseImg"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageView
android:layout_width="25dp"
android:layout_height="25dp"
android:layout_centerInParent="true"
android:src="@drawable/add_picture"
app:tint="@color/colorPrimary" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl_adjust"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageButton
android:id="@+id/adjust"
android:layout_width="70dp"
android:layout_height="70dp"
android:layout_centerInParent="true"
android:background="@drawable/ic_baseline_adjust_24" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl_help"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageView
android:layout_width="30dp"
android:layout_height="30dp"
android:layout_centerInParent="true"
android:src="@drawable/back"
android:visibility="invisible" />
</RelativeLayout>
</LinearLayout>
<RelativeLayout
android:id="@+id/rl_navigation"
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="@color/colorPrimary">
<ImageButton
android:id="@+id/back"
android:layout_width="30dp"
android:layout_height="30dp"
android:layout_centerVertical="true"
android:layout_marginLeft="20dp"
android:layout_marginTop="@dimen/icon_back_margin"
android:background="@drawable/back" />
<TextView
android:id="@+id/selectTv"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentEnd="true"
android:layout_centerVertical="true"
android:layout_marginEnd="10dp"
android:fontFamily="@font/montserrat_bold"
android:padding="@dimen/hiad_10_dp"
android:text="Select Doc"
android:textColor="@color/white" />
</RelativeLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerHorizontal="true"
android:layout_marginTop="99dp"
android:layout_marginBottom="100dp">
<com.shea.pygmycollection.customview.DocumentCorrectImageView
android:id="@+id/iv_documetscan"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:padding="20dp"
app:LineColor="@color/colorPrimary" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/layout_image"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerHorizontal="true"
android:layout_marginTop="99dp"
android:layout_marginBottom="100dp"
android:background="#000000"
android:visibility="gone">
<ImageView
android:id="@+id/des_image"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:padding="20dp" />
</RelativeLayout>
</RelativeLayout>
Result​
Tips and Tricks​
Recommended image width and height: 1080 px and 2560 px.
Multi-thread invoking is currently not supported.
The document detection and correction API can only be called by 64-bit apps.
If you are capturing an image from the camera or picking one from the gallery, make sure your app has camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
Check that the dependencies are added properly.
Latest HMS Core APK is required.
The minimum SDK version is 21; otherwise you will get a manifest merge issue (see the Gradle sketch after this list).
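A minimal app-level build.gradle sketch covering the last two tips (64-bit-only ABI and minimum SDK 21); the values shown are illustrative and should be adapted to your project.
Code:
android {
    defaultConfig {
        minSdkVersion 21
        ndk {
            // HiAI document detection/correction can only be called by 64-bit apps.
            abiFilters 'arm64-v8a'
        }
    }
}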
Conclusion​
In this article, we have built an application that detects the document in an image, corrects it, and displays the result. We have learned the following concepts:
1. What is document skew correction?
2. Features of document skew correction.
3. How to integrate document skew correction using Huawei HiAI.
4. How to apply for Huawei HiAI.
5. How to build the application.
Reference​
Document skew correction
Apply for Huawei HiAI​
