Learn Form Recognition using ML kit | JAVA - Huawei Developers

Introduction
Huawei ML Kit improves our lives in countless ways, streamlining complex interactions with better solutions such as text detection, face detection, product visual search, voice detection, image-related services, and more.
The form recognition service recognizes the information in a form and returns the table content, such as the table count, rows, columns, cell coordinates, and text info.
Use case
This service helps on a daily basis; for example, after collecting a large amount of data, we can use it to recognize the content and convert it into electronic documents.
Suggestions
1. Forms such as questionnaires can be recognized.
2. Currently, images containing multiple forms cannot be recognized.
3. Shooting angle: the horizontal tilt angle must be less than 5 degrees.
4. Form integrity: no missing corners and no bent or segmented lines.
5. Form content: only printed content can be recognized; images, handwritten content, seals, and watermarks in the form cannot.
6. Image specification: the aspect ratio should be less than or equal to 3:1, and the resolution must be greater than 960 x 960 px (see the validation sketch after this list).
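Before calling the service, it can be worth validating these constraints up front. Below is a minimal sketch of a helper inside the activity; the name isValidFormImage is my own, and the thresholds simply mirror the list above.
Code:
private boolean isValidFormImage(Bitmap bitmap) {
    int w = bitmap.getWidth();
    int h = bitmap.getHeight();
    // Resolution must be greater than 960 x 960 px.
    if (w <= 960 || h <= 960) {
        return false;
    }
    // Aspect ratio should be less than or equal to 3:1.
    float ratio = (float) Math.max(w, h) / Math.min(w, h);
    return ratio <= 3.0f;
}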
ML Kit Configuration.
1. Log in to AppGallery Connect and select MlKitSample in the My Projects list.
2. Enable ML Kit: choose My Projects > Project settings > Manage APIs.
Development Process
Create Application in Android Studio.
App level gradle dependencies.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Gradle dependencies
Code:
implementation 'com.huawei.hms:ml-computer-vision-formrecognition:2.0.4.300'
implementation 'com.huawei.hms:ml-computer-vision-formrecognition-model:2.0.4.300'
Root level gradle dependencies
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Add the below permissions in Android Manifest file
Code:
<manifest xmlns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<application ...
</manifest>
Add the following metadata to the manifest file; it automatically installs the machine learning model.
Code:
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="fr" />
1. Create an instance of MLFormRecognitionAnalyzerSetting in onCreate.
Code:
MLFormRecognitionAnalyzerSetting setting = new MLFormRecognitionAnalyzerSetting.Factory().create();
2. Create an instance of MLFormRecognitionAnalyzer in onCreate.
Code:
MLFormRecognitionAnalyzer analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer(setting);
3. Check Runtime permissions.
4. FormRecogActivity is responsible for loading forms from local storage or capturing live forms with the camera, and for extracting the form cell content using the form recognition service.
Code:
public class FormRecogActivity extends AppCompatActivity {
private MLFormRecognitionAnalyzerSetting setting;
private MLFormRecognitionAnalyzer analyzer;
private ImageView mImageView;
private TextView text, textTotal;
private MLFrame mlFrame;
private Uri imageUri;
private Bitmap bitmap;
private int camRequestCode = 100;
private int storageRequestCode = 200;
private int sum = 0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_form_recog);
mImageView = findViewById(R.id.image);
text = findViewById(R.id.text);
textTotal = findViewById(R.id.text_total);
setting = new MLFormRecognitionAnalyzerSetting.Factory().create();
analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer(setting);
}
public void onLoadImage(View view) {
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, storageRequestCode);
}
public void onClikCam(View view) {
if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(new String[]{Manifest.permission.CAMERA}, camRequestCode);
} else {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, camRequestCode);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (requestCode == camRequestCode) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, camRequestCode);
} else {
Toast.makeText(this, "Camera permission denied", Toast.LENGTH_LONG).show();
}
}
if (requestCode == storageRequestCode) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Intent intent = new Intent(
Intent.ACTION_PICK,
MediaStore.Images.Media.EXTERNAL_CONTENT_URI
);
startActivityForResult(intent, storageRequestCode);
} else {
Toast.makeText(this, "Storage permission denied", Toast.LENGTH_LONG).show();
}
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && requestCode == storageRequestCode) {
imageUri = data.getData();
try {
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
mImageView.setImageBitmap(bitmap);
callFormService();
} catch (IOException e) {
e.printStackTrace();
}
} else if (resultCode == RESULT_OK && requestCode == camRequestCode) {
bitmap = (Bitmap) data.getExtras().get("data");
mImageView.setImageBitmap(bitmap);
callFormService();
}
}
private void callFormService() {
mlFrame = MLFrame.fromBitmap(bitmap);
analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer(setting);
Task<JsonObject> task = analyzer.asyncAnalyseFrame(mlFrame);
task.addOnSuccessListener(new OnSuccessListener<JsonObject>() {
@Override
public void onSuccess(JsonObject jsonObject) {
if (jsonObject != null && jsonObject.get("retCode").getAsInt() == MLFormRecognitionConstant.SUCCESS) {
Gson gson = new Gson();
String result = jsonObject.toString();
MLFormRecognitionTablesAttribute mlObject = gson.fromJson(result, MLFormRecognitionTablesAttribute.class);
ArrayList<MLFormRecognitionTablesAttribute.TablesContent.TableAttribute> tableAttributeArrayList = mlObject.getTablesContent().getTableAttributes();
ArrayList<MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute> tableCellAttributes = tableAttributeArrayList.get(0).getTableCellAttributes();
for (MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute attribute : tableCellAttributes) {
String info = attribute.textInfo;
text.setText(text.getText().toString() + "\n" + info);
}
Toast.makeText(FormRecogActivity.this, "Successfully Form Recognized", Toast.LENGTH_LONG).show();
Log.d("TAG", "result: " + result);
} else if (jsonObject != null && jsonObject.get("retCode").getAsInt() == MLFormRecognitionConstant.FAILED) {
Toast.makeText(FormRecogActivity.this, "Form Recognition Convertion Failed", Toast.LENGTH_LONG).show();
}
textTotal.setText("Total Cart Value : "+ sum +" Rs ");
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Toast.makeText(FormRecogActivity.this, "Form Recognition API Failed", Toast.LENGTH_LONG).show();
}
});
}
}
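Note that the sum shown in textTotal is never updated in the sample above, so it always displays 0. A minimal sketch of how the cell loop could accumulate a total, assuming the price cells contain plain integer values such as "120":
Code:
// Inside the for-loop over tableCellAttributes:
try {
    sum += Integer.parseInt(attribute.textInfo.trim());
} catch (NumberFormatException ignored) {
    // Non-numeric cells (headers, item names) are skipped.
}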
5. This XML layout creates the UI.
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#2196F3"
android:orientation="vertical">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="250dp"
android:layout_marginTop="?actionBarSize" />
<Button
android:id="@+id/btn_storage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_margin="20dp"
android:onClick="onLoadImage"
android:background="#F44336"
android:textColor="@color/upsdk_white"
android:text="Load Image from storage" />
<Button
android:id="@+id/btn_capture"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:layout_marginRight="20dp"
android:onClick="onClikCam"
android:background="#F44336"
android:textColor="@color/upsdk_white"
android:text="Capture Image" />
<TextView
android:id="@+id/text_total"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:textSize="20sp"
android:textColor="@color/upsdk_white"
android:textStyle="bold" />
<ScrollView
android:layout_marginTop="8dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:id="@+id/text"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:textSize="20sp"
android:textColor="@color/upsdk_white"
android:textStyle="bold" />
</ScrollView>
</LinearLayout>
Result
Tips & Tricks
1. Enable ML service API in AppGallery Connect.
2. Capture a single form only; the form service does not support multiple forms.
3. The resolution must be greater than 960 x 960 px.
4. Currently, this service does not support handwritten text.
Conclusion
This article showed how to get text info from a form by extracting the individual cell data with coordinates. This service can help you on a daily basis.
Thank you for reading. If you enjoyed this article, I suggest you implement it yourself and share your experience.
Reference
ML Kit – Form Recognition

Related

Huawei Share Kit Facilitates to Acquire Skill in Enforcement - Part 2

For more articles like this, visit the HUAWEI Developer Forum and Medium.
https://forums.developer.huawei.com/forumPortal/en/home
In this article, we will implement the Huawei Share Kit SDK and complete our demo application.
In the previous article, we learned about the Share Kit introduction and created the project, so let's start our implementation.
I will demonstrate the functionality of Share Kit in a simple way with a working application and give a demo.
Before starting development, we must meet the following requirements.
Hardware Requirements
1. A computer (desktop or laptop) that runs Windows 7 or Windows 10
2. A Huawei phone (with the USB cable), which is used for debugging
3. A third-party Android device, which is used for debugging
Software Requirements
1. JDK 1.8 or later
2. Android API (level 26 or higher)
3. EMUI 10.0 or later
Let’s start the development:
1. Add Share Kit SDK in project:
2. We need to add the code repository to the project's root-level gradle file.
Code:
maven {
url 'https://developer.huawei.com/repo/'
}
3. We need to add the following dependencies in our app gradle.
Code:
dependencies {
implementation files('libs/sharekit-1.0.1.300.aar')
implementation 'com.android.support:support-annotations:28.0.0'
implementation 'com.android.support:localbroadcastmanager:28.0.0'
implementation 'com.android.support:support-compat:28.0.0'
implementation 'com.google.guava:guava:24.1-android'
}
Note: You need to raise a ticket to get the Share Kit SDK "sharekit-1.0.1.300.aar" file.
Click on the below link and raise your ticket.
https://developer.huawei.com/consumer/en/support/feedback/#/
4. I have created the following package and resource files:
5. All activities are declared in the manifest file:
Code:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.hms.myshare">
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity
android:name=".SplashScreen"
android:label="@string/app_name">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<activity
android:name=".SearchingActivity"
android:configChanges="orientation|keyboardHidden|screenSize"/>
<activity
android:name=".ReceiveActivity"
android:configChanges="orientation|keyboardHidden|screenSize"/>
</application>
</manifest>
Let’s create an awesome User Interface:
1. I have created a wave ripple effect that visualizes device discovery in the UI.
For this, I created the SearchingView.java class:
Code:
public class SearchingView extends RelativeLayout {
private static final int DEFAULT_RIPPLE_COUNT=6;
private static final int DEFAULT_DURATION_TIME=3000;
private static final float DEFAULT_SCALE=6.0f;
private static final int DEFAULT_FILL_TYPE=0;
private int rippleColor;
private float rippleStrokeWidth;
private float rippleRadius;
private int rippleDurationTime;
private int rippleAmount;
private int rippleDelay;
private float rippleScale;
private int rippleType;
private Paint paint;
private boolean animationRunning=false;
private AnimatorSet animatorSet;
private ArrayList<Animator> animatorList;
private LayoutParams rippleParams;
private ArrayList<RippleView> rippleViewList=new ArrayList<RippleView>();
public SearchingView(Context context) {
super(context);
}
public SearchingView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context, attrs);
}
public SearchingView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
init(context, attrs);
}
private void init(final Context context, final AttributeSet attrs) {
if (isInEditMode())
return;
if (null == attrs) {
throw new IllegalArgumentException("Attributes should be provided to this view,");
}
final TypedArray typedArray = context.obtainStyledAttributes(attrs, R.styleable.RippleBackground);
rippleColor=typedArray.getColor(R.styleable.RippleBackground_rb_color, getResources().getColor(R.color.rippelColor));
rippleStrokeWidth=typedArray.getDimension(R.styleable.RippleBackground_rb_strokeWidth, getResources().getDimension(R.dimen.rippleStrokeWidth));
rippleRadius=typedArray.getDimension(R.styleable.RippleBackground_rb_radius,getResources().getDimension(R.dimen.rippleRadius));
rippleDurationTime=typedArray.getInt(R.styleable.RippleBackground_rb_duration,DEFAULT_DURATION_TIME);
rippleAmount=typedArray.getInt(R.styleable.RippleBackground_rb_rippleAmount,DEFAULT_RIPPLE_COUNT);
rippleScale=typedArray.getFloat(R.styleable.RippleBackground_rb_scale,DEFAULT_SCALE);
rippleType=typedArray.getInt(R.styleable.RippleBackground_rb_type,DEFAULT_FILL_TYPE);
typedArray.recycle();
rippleDelay=rippleDurationTime/rippleAmount;
paint = new Paint();
paint.setAntiAlias(true);
if(rippleType==DEFAULT_FILL_TYPE){
rippleStrokeWidth=0;
paint.setStyle(Paint.Style.FILL);
}else
paint.setStyle(Paint.Style.STROKE);
paint.setColor(rippleColor);
rippleParams=new LayoutParams((int)(2*(rippleRadius+rippleStrokeWidth)),(int)(2*(rippleRadius+rippleStrokeWidth)));
rippleParams.addRule(CENTER_IN_PARENT, TRUE);
animatorSet = new AnimatorSet();
animatorSet.setInterpolator(new AccelerateDecelerateInterpolator());
animatorList=new ArrayList<Animator>();
for(int i=0;i<rippleAmount;i++){
RippleView rippleView=new RippleView(getContext());
addView(rippleView,rippleParams);
rippleViewList.add(rippleView);
final ObjectAnimator scaleXAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleX", 1.0f, rippleScale);
scaleXAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleXAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleXAnimator.setStartDelay(i * rippleDelay);
scaleXAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleXAnimator);
final ObjectAnimator scaleYAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleY", 1.0f, rippleScale);
scaleYAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleYAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleYAnimator.setStartDelay(i * rippleDelay);
scaleYAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleYAnimator);
final ObjectAnimator alphaAnimator = ObjectAnimator.ofFloat(rippleView, "Alpha", 1.0f, 0f);
alphaAnimator.setRepeatCount(ObjectAnimator.INFINITE);
alphaAnimator.setRepeatMode(ObjectAnimator.RESTART);
alphaAnimator.setStartDelay(i * rippleDelay);
alphaAnimator.setDuration(rippleDurationTime);
animatorList.add(alphaAnimator);
}
animatorSet.playTogether(animatorList);
}
private class RippleView extends View {
public RippleView(Context context) {
super(context);
this.setVisibility(View.INVISIBLE);
}
@Override
protected void onDraw(Canvas canvas) {
int radius=(Math.min(getWidth(),getHeight()))/2;
canvas.drawCircle(radius,radius,radius-rippleStrokeWidth,paint);
}
}
public void startRippleAnimation(){
if(!isRippleAnimationRunning()){
for(RippleView rippleView:rippleViewList){
rippleView.setVisibility(VISIBLE);
}
animatorSet.start();
animationRunning=true;
}
}
public void stopRippleAnimation(){
if(isRippleAnimationRunning()){
animatorSet.end();
animationRunning=false;
}
}
public boolean isRippleAnimationRunning(){
return animationRunning;
}
}
Let’s see the implementation of this custom view inside xml layout:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="vertical"
android:gravity="center_horizontal"
android:layout_width="match_parent"
android:background="@drawable/background"
android:layout_height="match_parent">
<RelativeLayout
android:layout_width="wrap_content"
android:layout_height="0dp"
android:layout_weight="1"
android:gravity="center">
<com.hms.myshare.view.SearchingView
android:id="@+id/searching"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:rb_color="@android:color/white"
app:rb_duration="3000"
app:rb_radius="40dp"
app:rb_rippleAmount="6"
app:rb_scale="5">
<ImageView
android:id="@+id/img_logo"
android:layout_width="200dp"
android:layout_height="200dp"
android:layout_centerInParent="true"
android:src="@drawable/log" />
</com.hms.myshare.view.SearchingView>
</RelativeLayout>
<androidx.appcompat.widget.AppCompatTextView
android:layout_width="wrap_content"
android:layout_gravity="bottom|center_horizontal"
android:textColor="@android:color/white"
android:textSize="28sp"
android:gravity="center"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:text="Huawei Share Kit"
android:id="@+id/appCompatTextView2" />
</LinearLayout>
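For reference, here is a minimal sketch of driving the ripple animation from an activity; the view id searching matches the layout above.
Code:
SearchingView searchingView = findViewById(R.id.searching);
searchingView.startRippleAnimation(); // start pulsing while discovery runs
// ...
searchingView.stopRippleAnimation(); // stop once a device is found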
Let’ see the output of this view:
Let’s implement Search device and Send Data:
· We have implemented this functionality inside the SearchingActivity class.
We need to perform the following operations to send data to a found device.
1. We need to instantiate the SDK manager class, ShareKitManager, with the current activity context inside the onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
binding = DataBindingUtil.setContentView(this, R.layout.searching_activity);
shareKitManager = new ShareKitManager(this);
2. Add the IShareKitInitCallback callback to initialize the ShareKitManager class.
Code:
IShareKitInitCallback initCallback = isSuccess -> {
Log.i(TAG, "share kit init result:" + isSuccess);
if (isSuccess) {
binding.txtError.setText(getString(R.string.sharekit_init_finish));
} else {
binding.txtError.setText(getString(R.string.sharekit_init_failed));
}
};
shareKitManager.init(initCallback);
3. Register the ShareKitManager with IWidgetCallback:
Code:
private IWidgetCallback callback = new IWidgetCallback.Stub() {
@Override
public synchronized void onDeviceFound(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceFound: deviceId is null");
return;
}
Log.i(TAG, "onDeviceFound: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.put(deviceId, nearByDeviceEx);
foundTimeMap.put(deviceId, format.format(new Date()));
updateDeviceList();
}
}
@Override
public void onDeviceDisappeared(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceDisappeared: deviceId is null");
return;
}
Log.i(TAG, "onDeviceDisappeared: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.remove(deviceId);
foundTimeMap.remove(deviceId);
updateDeviceList();
}
}
@Override
public void onTransStateChange(NearByDeviceEx nearByDeviceEx, int state, int stateValue) {
Log.i(TAG, "trans state:" + state + " value:" + stateValue);
String stateDesc = "";
switch (state) {
case STATE_PROGRESS:
stateDesc = getString(R.string.sharekit_send_progress, stateValue);
break;
case STATE_SUCCESS:
stateDesc = getString(R.string.sharekit_send_finish);
break;
case STATE_STATUS:
stateDesc = getString(R.string.sharekit_state_chg, translateStateValue(stateValue));
break;
case STATE_ERROR:
stateDesc = getString(R.string.sharekit_send_error, translateErrorValue(stateValue));
showError(getString(R.string.sharekit_send_error, translateErrorValue(stateValue)));
break;
default:
break;
}
// showToast(stateDesc);
}
@Override
public void onEnableStatusChanged() {
int status = shareKitManager.getShareStatus();
Log.i(TAG, "sharekit ability current status:" + status);
}
};
We need to pass this callback to the register API.
Code:
shareKitManager.registerCallback(callback);
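The callback above references a few helpers that are not shown here (updateDeviceList, deviceMap, foundTimeMap). A minimal sketch of what updateDeviceList might look like, assuming the found device names are rendered through a hypothetical ArrayAdapter<String> named deviceAdapter:
Code:
private void updateDeviceList() {
    // deviceMap holds deviceId -> NearByDeviceEx entries filled in by the callback.
    final List<String> names = new ArrayList<>();
    for (NearByDeviceEx device : deviceMap.values()) {
        names.add(device.getBtName() + " (found " + foundTimeMap.get(device.getCommonDeviceId()) + ")");
    }
    runOnUiThread(() -> {
        deviceAdapter.clear();            // hypothetical adapter backing a ListView
        deviceAdapter.addAll(names);
        deviceAdapter.notifyDataSetChanged();
    });
}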
4. Start searching for devices using the discovery API.
Code:
shareKitManager.startDiscovery();
5. Once a device is found, create a ShareBean to send the data.
Code:
private void doSendText() {
String text = binding.sharetext.getText().toString();
ShareBean shareBean = new ShareBean(text);
doSend(destDevice, shareBean);
}
Followed by the doSend() method:
Code:
private void doSend(String deviceName, ShareBean shareBean) {
List<NearByDeviceEx> processingDevices = shareKitManager.getDeviceList();
for (NearByDeviceEx device : processingDevices) {
if (deviceName.equals(device.getBtName())) {
return;
}
}
synchronized (lock) {
for (NearByDeviceEx device : deviceMap.values()) {
if (deviceName.equals(device.getBtName())) {
shareKitManager.doSend(device, shareBean);
}
}
}
}
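For completeness, a minimal sketch of wiring a send button to doSendText; the btnSend button and deviceName field are hypothetical, since the original layout is not shown, and they illustrate where destDevice comes from.
Code:
// Hypothetical wiring; binding.btnSend and binding.deviceName are not part of the original sample.
binding.btnSend.setOnClickListener(v -> {
    destDevice = binding.deviceName.getText().toString();
    doSendText();
});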
Let’s implement Receive data functionality:
· We have implemented this functionality inside the ReceiveActivity class.
· We need to enable Wi-Fi on the Huawei device, which receives the socket connection request from the sender device.
· So we initialize the ShareKitManager inside this activity's onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
binding = DataBindingUtil.setContentView(this, R.layout.receiver_activity);
binding.searching.startRippleAnimation();
shareKitManager = new ShareKitManager(this);
IShareKitInitCallback initCallback = isSuccess -> {
Log.i(TAG, "share kit init result:" + isSuccess);
};
shareKitManager.init(initCallback);
shareKitManager.enable();
}
Android device (Sender):
Huawei device (Receiver):
If you have any doubts or queries, please leave a comment or post them in the HUAWEI Developer Forum.

HMS Game Service - How to create a Sign In / Get User Info [Java]

For more information like this, visit the HUAWEI Developer Forum.
Introduction
In this article I would like to address the Game Service topic with a practical example in which we implement the kit. The goal is a simple application where the user can log in with their Huawei ID and obtain information about their player on Game Service. In the Huawei repositories we can find projects with the full implementation, but in my opinion it is better to create a new project where we have the opportunity to do our own development.
Steps:
1. Create an App in AGC
2. Add the necessary libraries and repositories
3. Permissions in our Application
4. Building the user interface
5. Create the SignInCenter class
6. Write the code in our MainActivity
7. Test the App
Create an App in AGC
If you already have experience implementing HMS you will have noticed that the creation of an App in the AGC console is regularly required. If you are in this case, I recommend that you skip this step and go to step 3.
In this link you can find a detailed guide on how to create an app in AGC, generate your fingerprint, and download the JSON services file.
https://developer.huawei.com/consumer/en/codelab/HMSPreparation/index.html#0
In case you do not have Android Studio Configured on your device, I also share a guide with the requirements.
https://developer.huawei.com/consumer/en/codelab/HMSAccounts-Kotlin/index.html#1
Once your app is created in AGC, it is important to activate the Game Service, Account Kit, and In-App Purchases services if you require them. We can do this from the APIs Management tab.
What have we accomplished so far?
We have an app in AGC connected to our project, with the services activated.
Add the necessary libraries and repositories
Once our project is created in Android Studio, we need to add the following lines of code.
Let's add the following lines to the project-level gradle file:
Code:
buildscript {
repositories {
google()
jcenter()
// This line
maven { url 'http://developer.huawei.com/repo/' }
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.0"
// Also this line
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
google()
jcenter()
// Maven repository
maven {url 'http://developer.huawei.com/repo/'}
}
}
Now let's add the necessary dependencies in the app gradle.
Code:
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300' // HMS Core
implementation 'com.huawei.hms:hwid:4.0.4.300' //Huawei Id
implementation 'com.huawei.hms:game:4.0.3.301' //Game Service
implementation 'com.huawei.hms:base:4.0.4.301'
implementation 'com.squareup.picasso:picasso:2.71828'
Do not forget to add the plugin.
Code:
apply plugin: 'com.huawei.agconnect'
Permissions in our Application
Now let's add the permissions our application needs to use the Game Service. In the Android manifest file, add the following:
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
Building the user interface
The time has come to build the user interface. We will use the powerful ConstraintLayout tooling that Android Studio offers; basically, we want a screen where we place the elements that will show our user's data. Of course you can create the interface you like the most, but if you want to use this simple arrangement of elements, here is the code:
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<TextView
android:id="@+id/textView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="68dp"
android:text="Welcome"
android:textAppearance="@style/TextAppearance.AppCompat.Large"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<ImageView
android:id="@+id/avatarImg"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="36dp"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="0.498"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/textView"
tools:srcCompat="@tools:sample/avatars" />
<TextView
android:id="@+id/textView9"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="28dp"
android:text="Player Id"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="@+id/idtv" />
<TextView
android:id="@+id/textView10"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="32dp"
android:layout_marginEnd="51dp"
android:text="Player Level"
app:layout_constraintEnd_toStartOf="@+id/leveltv"
app:layout_constraintHorizontal_bias="0.0"
app:layout_constraintStart_toStartOf="@+id/textView9"
app:layout_constraintTop_toBottomOf="@+id/textView9" />
<TextView
android:id="@+id/idtv"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="40dp"
android:layout_marginTop="24dp"
android:text="TextView"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="0.082"
app:layout_constraintStart_toEndOf="@+id/textView9"
app:layout_constraintTop_toBottomOf="@+id/avatarImg" />
<TextView
android:id="@+id/leveltv"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="32dp"
android:text="TextView"
app:layout_constraintStart_toStartOf="@+id/idtv"
app:layout_constraintTop_toBottomOf="@+id/idtv" />
<Button
android:id="@+id/loginButton"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginBottom="16dp"
android:text="Login"
app:layout_constraintBottom_toBottomOf="parent"
tools:layout_editor_absoluteX="0dp" />
<Button
android:id="@+id/btnInfo"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="88dp"
android:text="PlayerInfo"
app:layout_constraintBottom_toTopOf="@+id/loginButton"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="0.475"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toBottomOf="@+id/textView10"
app:layout_constraintVertical_bias="0.178" />
</androidx.constraintlayout.widget.ConstraintLayout>
Create the SignInCenter class
In this class we create a singleton instance that lets us manage the Huawei authentication ID.
Code:
public class SignInCenter {
private static SignInCenter INS = new SignInCenter();
private static AuthHuaweiId currentAuthHuaweiId;
public static SignInCenter get() {
return INS;
}
public void updateAuthHuaweiId(AuthHuaweiId authHuaweiId) {
currentAuthHuaweiId = authHuaweiId;
}
public AuthHuaweiId getAuthHuaweiId() {
return currentAuthHuaweiId;
}
}
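A quick usage example of this singleton: after a successful sign-in we store the ID (as done in signIn() below), and any other component can read it back later.
Code:
// After sign-in succeeds:
SignInCenter.get().updateAuthHuaweiId(authHuaweiId);
// Later, anywhere in the app:
AuthHuaweiId current = SignInCenter.get().getAuthHuaweiId();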
Write the code in our MainActivity
Let's work on our MainActivity; it is time to get our hands dirty. I will include the source code with comments on each method to make it easier to read.
Code:
/**
*Variable Declarations
* Buttons, TextViews
* AuthHuaweiId to store retrieved elements
* ImageView to show avatar
* Handler to use a service
*/
private Button loginButton;
private Button infoButton;
private TextView welcomeTv,idtv,leveltv;
private AuthHuaweiId mAuthid;
private final static int SIGN_IN_INTENT = 3000;
private String playerId;
private String sessionId = null;
private ImageView avatar;
private Handler handler;
private final static int HEARTBEAT_TIME = 15 * 60 * 1000;
private static final String TAG = "TAG1";
private boolean hasInit = false;
//---------------------------------------------------------------------------------------------
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Bind UI elements; we could use ButterKnife or, in Kotlin, the view extensions to get rid of this part
loginButton = findViewById(R.id.loginButton);
loginButton.setOnClickListener(this);
welcomeTv = findViewById(R.id.textView);
avatar = findViewById(R.id.avatarImg);
infoButton = findViewById(R.id.btnInfo);
infoButton.setOnClickListener(this);
idtv = findViewById(R.id.idtv);
leveltv = findViewById(R.id.leveltv);
}
//---------------------------------------------------------------------------------------------
/*
* Override the onClick method and, using the element ids, call the proper methods. Don't forget to
* implement the View.OnClickListener interface
*/
//---------------------------------------------------------------------------------------------
@Override
public void onClick(View view) {
switch (view.getId()){
case R.id.loginButton:
signIn();
break;
case R.id.btnInfo:
init();
getCurrentPlayer();
break;
}
}
//---------------------------------------------------------------------------------------------
/*
* This is the method where we initialize the service by passing the authentication ID
* if the user has logged in correctly
*/
//---------------------------------------------------------------------------------------------
public void init() {
JosAppsClient appsClient = JosApps.getJosAppsClient(this,mAuthid);
appsClient.init();
Log.d(TAG,"init success");
hasInit = true;
}
//---------------------------------------------------------------------------------------------
/*
*Get the current player by using the ID. There are many ways to achieve this, but for this example
* we will keep it simple
**/
//---------------------------------------------------------------------------------------------
private void getCurrentPlayer() {
PlayersClientImpl client = (PlayersClientImpl) Games.getPlayersClient(this,mAuthid);
//Create a Task to get the Player
Task<Player> task = client.getCurrentPlayer();
task.addOnSuccessListener(new OnSuccessListener<Player>() {
@Override
public void onSuccess(Player player) {
String result = "display:" + player.getDisplayName() + "\n" + "playerId:" + player.getPlayerId() + "\n"
+ "playerLevel:" + player.getLevel() + "\n" + "timestamp:" + player.getSignTs() + "\n"
+ "playerSign:" + player.getPlayerSign();
Log.d("TAG1",result);
idtv.setText(player.getPlayerId());
leveltv.setText(player.getLevel() + "");
playerId = player.getPlayerId();
gameBegin();
handler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
//gamePlayExtra();
}
};
new Timer().schedule(new TimerTask() {
@Override
public void run() {
Message message = new Message();
handler.sendMessage(message);
}
}, HEARTBEAT_TIME, HEARTBEAT_TIME);
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
if (e instanceof ApiException) {
String result = "rtnCode:" + ((ApiException) e).getStatusCode();
Log.d("TAG1",result);
}
}
});
}
//---------------------------------------------------------------------------------------------
/*
*On this method the Game will begin with the obtained ID
*/
//---------------------------------------------------------------------------------------------
public void gameBegin() {
if (TextUtils.isEmpty(playerId)) {
Log.d("TAG1","GetCurrentPlayer first.");
return;
}
String uid = UUID.randomUUID().toString();
PlayersClient client = Games.getPlayersClient(this,mAuthid);
Task<String> task = client.submitPlayerEvent(playerId, uid, "GAMEBEGIN");
task.addOnSuccessListener(new OnSuccessListener<String>() {
@Override
public void onSuccess(String jsonRequest) {
if (jsonRequest == null) {
Log.d("TAG1","jsonRequest is null");
return;
}
try {
JSONObject data = new JSONObject(jsonRequest);
sessionId = data.getString("transactionId");
} catch (JSONException e) {
Log.d("TAG1","parse jsonArray meet json exception");
return;
}
Log.d("TAG1","submitPlayerEvent traceId: " + jsonRequest);
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
if (e instanceof ApiException) {
String result = "rtnCode:" + ((ApiException) e).getStatusCode();
Log.d("TAG1",result);
}
}
});
}
//---------------------------------------------------------------------------------------------
/*
* This is the sign-in method; I have commented most of the lines to explain what we are doing
*/
//---------------------------------------------------------------------------------------------
private void signIn() {
//Create a task with the Type of AuthHuaweiId
Task<AuthHuaweiId> authHuaweiIdTask = HuaweiIdAuthManager.getService(this, getHuaweiIdParams()).silentSignIn();
//Add the proper Listener so we can track the response of the Task
authHuaweiIdTask.addOnSuccessListener(new OnSuccessListener<AuthHuaweiId>() {
//Must override the onSuccess method, which returns an AuthHuaweiId object
@Override
public void onSuccess(AuthHuaweiId authHuaweiId) {
//Logs to track the information
Log.d("TAG1","signIn success");
Log.d("TAG1","Id" + authHuaweiId.getDisplayName());
Log.d("TAG1", "Picture" + authHuaweiId.getAvatarUriString());
//Handle the user interface
welcomeTv.setVisibility(View.VISIBLE);
welcomeTv.setText("Welcome back " + authHuaweiId.getDisplayName());
Picasso.get().load(authHuaweiId.getAvatarUriString()).into(avatar);
loginButton.setVisibility(View.INVISIBLE);
Log.d("TAG1","AT:" + authHuaweiId.getAccessToken());
mAuthid = authHuaweiId;
//Sign in Center update
SignInCenter.get().updateAuthHuaweiId(authHuaweiId);
infoButton.setVisibility(View.VISIBLE);
}
}).addOnFailureListener(new OnFailureListener() {
//Something went wrong, use this method to inform your users
@Override
public void onFailure(Exception e) {
if (e instanceof ApiException) {
ApiException apiException = (ApiException) e;
Log.d("TAG1","signIn failed:" + apiException.getStatusCode());
Log.d("TAG1","start getSignInIntent");
signInNewWay();
}
}
});
}
//---------------------------------------------------------------------------------------------
private void signInNewWay() {
Intent intent = HuaweiIdAuthManager.getService(MainActivity.this, getHuaweiIdParams()).getSignInIntent();
startActivityForResult(intent, SIGN_IN_INTENT);
}
//---------------------------------------------------------------------------------------------
/*
*Create the HuaweiIdParams so we can send it to the service
*/
//---------------------------------------------------------------------------------------------
public HuaweiIdAuthParams getHuaweiIdParams() {
return new HuaweiIdAuthParamsHelper(HuaweiIdAuthParams.DEFAULT_AUTH_REQUEST_PARAM_GAME).createParams();
}
//---------------------------------------------------------------------------------------------
/*
*Don't forget to override onActivityResult, otherwise sign-in won't work
* */
//---------------------------------------------------------------------------------------------
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (SIGN_IN_INTENT == requestCode) {
handleSignInResult(data);
} else {
Log.d("TAG1","unknown requestCode in onActivityResult");
}
}
//---------------------------------------------------------------------------------------------
/*
*Handle the sign-in result; we can make decisions here
* */
//---------------------------------------------------------------------------------------------
private void handleSignInResult(Intent data) {
if (null == data) {
Log.d("TAG1","signIn inetnt is null");
return;
}
// HuaweiIdSignIn.getSignedInAccountFromIntent(data);
String jsonSignInResult = data.getStringExtra("HUAWEIID_SIGNIN_RESULT");
if (TextUtils.isEmpty(jsonSignInResult)) {
Log.d("TAG1","SignIn result is empty");
return;
}
try {
HuaweiIdAuthResult signInResult = new HuaweiIdAuthResult().fromJson(jsonSignInResult);
if (0 == signInResult.getStatus().getStatusCode()) {
Log.d("TAG1","Sign in success.");
Log.d("TAG1","Sign in result: " + signInResult.toJson());
SignInCenter.get().updateAuthHuaweiId(signInResult.getHuaweiId());
// getCurrentPlayer();
} else {
Log.d("TAG1","Sign in failed: " + signInResult.getStatus().getStatusCode());
}
} catch (JSONException var7) {
Log.d("TAG1","Failed to convert json from signInResult.");
}
}
Conclusion
Well, we have achieved it: we have a small implementation of a simple Game Service sign-in that we could add to our video games. Game Service has many useful methods; check the documentation if you would like to go deeper into this topic.
https://developer.huawei.com/consumer/en/hms/huawei-game

Surface detection with AR Engine

For more information like this, visit the HUAWEI Developer Forum.
Introduction
AR Engine supports detecting objects in the real world through a capability called "environment tracking". With it, the engine records illumination, plane, image, object, surface, and other environmental information to help your apps merge virtual objects into scenarios in the physical world.
What is HUAWEI AR Engine?
HUAWEI AR Engine is a platform for building augmented reality (AR) apps on Android smartphones. It is based on the HiSilicon chipset and integrates AR core algorithms to provide basic AR capabilities such as motion tracking, environment tracking, body tracking, and face tracking, allowing your app to bridge the virtual world with the real world for a brand-new, visually interactive user experience.
Currently, HUAWEI AR Engine provides three types of capabilities, including motion tracking, environment tracking, and human body and face tracking.
Example Android Application
For this example we will work on environment tracking so we can detect surfaces, such as a table or a floor.
Development Process
Creating an App
Create an app following instructions in Creating an AppGallery Connect Project and Adding an App to the Project.
Platform: Android
Device: Mobile phone
App category: App or Game
Integrating HUAWEI AR Engine SDK
Before development, integrate the HUAWEI AR Engine SDK via the Maven repository into your development environment.
Open the build.gradle file in the root directory of your Android Studio project.
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.huawei.agconnect:agcp:1.3.2.301'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
maven {url 'https://developer.huawei.com/repo/'}
google()
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
Open the build.gradle file in the app directory of your project
Code:
apply plugin: 'com.android.application'
android {
compileSdkVersion 30
buildToolsVersion "30.0.1"
defaultConfig {
applicationId "com.vsm.myarapplication"
minSdkVersion 27
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
testImplementation 'junit:junit:4.12'
//
implementation 'com.huawei.agconnect:agconnect-core:1.4.1.300'
//
implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
//
implementation 'de.javagl:obj:0.3.0'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
}
apply plugin: 'com.huawei.agconnect'
We create our main activity layout:
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<android.opengl.GLSurfaceView
android:id="@+id/surfaceview"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:layout_gravity="top" />
<TextView
android:id="@+id/wordTextView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="TextView"
android:textColor="@color/red"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/searchingTextView"
android:layout_width="match_parent"
android:layout_height="47dp"
android:layout_alignParentStart="true"
android:layout_alignParentTop="true"
android:layout_marginStart="2dp"
android:layout_marginTop="59dp"
android:layout_marginBottom="403dp"
android:gravity="center"
android:text="Please move the mobile phone slowly to find the plane"
android:textColor="#ffffff"
tools:layout_editor_absoluteX="0dp"
tools:layout_editor_absoluteY="512dp" />
<TextView
android:id="@+id/plane_other"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_other"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_floor"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_floor"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_wall"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_wall"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_table"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_table"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_seat"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_seat"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_ceiling"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_ceiling"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
</RelativeLayout>
AR Engine is not available on all devices, so first we need to validate that the device supports AR Engine and that the service is available; the list of supported devices can be found in the official documentation.
Code:
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
Code:
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
In our MainActivity.java we set up the surface detection:
Code:
package com.vsm.myarapplication;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Toast;
import com.huawei.hiar.ARConfigBase;
import com.huawei.hiar.AREnginesApk;
import com.huawei.hiar.ARSession;
import com.huawei.hiar.ARWorldTrackingConfig;
import com.huawei.hiar.exceptions.ARCameraNotAvailableException;
import com.huawei.hiar.exceptions.ARUnSupportedConfigurationException;
import com.huawei.hiar.exceptions.ARUnavailableClientSdkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceApkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceNotInstalledException;
import com.vsm.myarapplication.common.ConnectAppMarketActivity;
import com.vsm.myarapplication.common.DisplayRotationManager;
import com.vsm.myarapplication.common.PermissionManager;
import com.vsm.myarapplication.rendering.WorldRenderManager;
import java.util.concurrent.ArrayBlockingQueue;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int MOTIONEVENT_QUEUE_CAPACITY = 2;
private static final int OPENGLES_VERSION = 2;
private ARSession mArSession;
private GLSurfaceView mSurfaceView;
private WorldRenderManager mWorldRenderManager;
private GestureDetector mGestureDetector;
private DisplayRotationManager mDisplayRotationManager;
private ArrayBlockingQueue<GestureEvent> mQueuedSingleTaps = new ArrayBlockingQueue<>(MOTIONEVENT_QUEUE_CAPACITY);
private String message = null;
private boolean isRemindInstall = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// AR Engine requires the camera permission.
PermissionManager.checkPermission(this);
mSurfaceView = findViewById(R.id.surfaceview);
mDisplayRotationManager = new DisplayRotationManager(this);
initGestureDetector();
mSurfaceView.setPreserveEGLContextOnPause(true);
mSurfaceView.setEGLContextClientVersion(OPENGLES_VERSION);
// Set the EGL configuration chooser, including for the number of
// bits of the color buffer and the number of depth bits.
mSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mWorldRenderManager = new WorldRenderManager(this, this);
mWorldRenderManager.setDisplayRotationManage(mDisplayRotationManager);
mWorldRenderManager.setQueuedSingleTaps(mQueuedSingleTaps);
mSurfaceView.setRenderer(mWorldRenderManager);
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
}
private void initGestureDetector() {
mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
@Override
public boolean onDoubleTap(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createDoubleTapEvent(motionEvent));
return true;
}
@Override
public boolean onSingleTapConfirmed(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createSingleTapConfirmEvent(motionEvent));
return true;
}
@Override
public boolean onDown(MotionEvent motionEvent) {
return true;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
onGestureEvent(GestureEvent.createScrollEvent(e1, e2, distanceX, distanceY));
return true;
}
});
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
return mGestureDetector.onTouchEvent(event);
}
});
}
private void onGestureEvent(GestureEvent e) {
boolean offerResult = mQueuedSingleTaps.offer(e);
if (offerResult) {
Log.i(TAG, "Successfully joined the queue.");
} else {
Log.i(TAG, "Failed to join queue.");
}
}
@Override
protected void onResume() {
Log.i(TAG, "onResume");
super.onResume();
Exception exception = null;
message = null;
if (mArSession == null) {
try {
if (!arEngineAbilityCheck()) {
finish();
return;
}
mArSession = new ARSession(getApplicationContext());
ARWorldTrackingConfig config = new ARWorldTrackingConfig(mArSession);
config.setFocusMode(ARConfigBase.FocusMode.AUTO_FOCUS);
config.setSemanticMode(ARWorldTrackingConfig.SEMANTIC_PLANE);
mArSession.configure(config);
mWorldRenderManager.setArSession(mArSession);
} catch (Exception capturedException) {
Log.e(TAG,capturedException.toString());
exception = capturedException;
setMessageWhenError(capturedException);
}
if (message != null) {
stopArSession(exception);
return;
}
}
try {
mArSession.resume();
} catch (ARCameraNotAvailableException e) {
Toast.makeText(this, "Camera open failed, please restart the app", Toast.LENGTH_LONG).show();
mArSession = null;
return;
}
mDisplayRotationManager.registerDisplayListener();
mSurfaceView.onResume();
}
@Override
protected void onPause() {
Log.i(TAG, "onPause start.");
super.onPause();
if (mArSession != null) {
mDisplayRotationManager.unregisterDisplayListener();
mSurfaceView.onPause();
mArSession.pause();
}
Log.i(TAG, "onPause end.");
}
@Override
protected void onDestroy() {
Log.i(TAG, "onDestroy start.");
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
super.onDestroy();
Log.i(TAG, "onDestroy end.");
}
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
private void stopArSession(Exception exception) {
Log.i(TAG, "stopArSession start.");
Toast.makeText(this, message, Toast.LENGTH_LONG).show();
Log.e(TAG, "Creating session error", exception);
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
Log.i(TAG, "stopArSession end.");
}
}
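The activity above also relies on a small app-local helper, GestureEvent, which belongs to the sample project rather than to the AR Engine SDK. A minimal sketch of what it might look like, assuming we only need to carry the event type and the raw MotionEvents:
Code:
import android.view.MotionEvent;

// Sketch of the app-local GestureEvent helper (not an AR Engine SDK class).
public class GestureEvent {
    public static final int GESTURE_EVENT_TYPE_DOUBLETAP = 1;
    public static final int GESTURE_EVENT_TYPE_SINGLETAPCONFIRMED = 2;
    public static final int GESTURE_EVENT_TYPE_SCROLL = 3;

    private int type;
    private MotionEvent e1;
    private MotionEvent e2;
    private float distanceX;
    private float distanceY;

    public static GestureEvent createDoubleTapEvent(MotionEvent e) {
        GestureEvent event = new GestureEvent();
        event.type = GESTURE_EVENT_TYPE_DOUBLETAP;
        event.e1 = e;
        return event;
    }

    public static GestureEvent createSingleTapConfirmEvent(MotionEvent e) {
        GestureEvent event = new GestureEvent();
        event.type = GESTURE_EVENT_TYPE_SINGLETAPCONFIRMED;
        event.e1 = e;
        return event;
    }

    public static GestureEvent createScrollEvent(MotionEvent e1, MotionEvent e2, float dx, float dy) {
        GestureEvent event = new GestureEvent();
        event.type = GESTURE_EVENT_TYPE_SCROLL;
        event.e1 = e1;
        event.e2 = e2;
        event.distanceX = dx;
        event.distanceY = dy;
        return event;
    }

    public int getType() { return type; }
    public MotionEvent getE1() { return e1; }
    public MotionEvent getE2() { return e2; }
}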
Conclusion
We can detect surfaces for multiple purposes in a simple way.
Documentation:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/introduction-0000001050130900
Codelab:
https://developer.huawei.com/consumer/en/codelab/HWAREngine/index.html#0
Code Sample:
https://github.com/spartdark/hms-arengine-myarapplication

Book Reader Application Using General Text Recognition by Huawei HiAI in Android

Introduction
In this article, we will learn how to integrate Huawei General Text Recognition using Huawei HiAI, and we will build a book reader application.
About application:
Usually, users get bored of reading books. This application helps them listen to a book instead of reading it manually. All they need to do is capture a photo of the book, or select an image from the gallery, and listen to it like music while travelling or in their free time.
Huawei General Text Recognition works on OCR technology.
First, let us understand OCR.
What is optical character recognition (OCR)?
Optical Character Recognition (OCR) technology is a business solution for automating data extraction from printed or written text in a scanned document or image file, converting the text into a machine-readable form that can be used for data processing such as editing or searching.
Now let us understand General Text Recognition (GTR).
At the core of GTR is Optical Character Recognition (OCR) technology, which extracts text from screenshots and photos taken by the phone camera. For photos taken by the camera, this API can correct tilts, camera angles, reflections, and messy backgrounds to a certain degree. It can also be used for document and streetscape photography and a wide range of other scenarios, and it features strong anti-interference capability. This API works on-device and uses a service connection.
Features
For photos: Provides text area detection and text recognition for Chinese, English, Japanese, Korean, Russian, Italian, Spanish, Portuguese, German, and French texts in multiple printing fonts. A wide range of scenarios is supported, and high recognition accuracy can be achieved even under complex lighting conditions, backgrounds, and more.
For screenshots: Optimizes text extraction algorithms based on the characteristics of screenshots captured on mobile phones. Currently, this function is available only in the Chinese mainland, supporting Chinese and English texts.
OCR features
Lightweight: This API greatly reduces the computing time and ROM space the algorithm model takes up, making your app more lightweight.
Customized hierarchical result return: You can choose to return the coordinates of text blocks, text lines, and text characters based on app requirements (a short sketch of walking this hierarchy follows this list).
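The hierarchy can be walked block by block and line by line. Below is a minimal sketch based on the Text and TextLine classes used later in this article; it only collects line values, and it generalizes the single-block handling shown in the complete code to all blocks.
Java:
// A minimal sketch (assumption: the Text/TextLine classes imported later in
// this article): walk the hierarchical result block by block, line by line.
private String collectRecognizedText(Text result) {
    StringBuilder sb = new StringBuilder();
    if (result == null || result.getBlocks() == null) {
        return sb.toString();
    }
    for (int b = 0; b < result.getBlocks().size(); b++) {
        List<TextLine> lines = result.getBlocks().get(b).getTextLines();
        if (lines == null) {
            continue;
        }
        for (TextLine line : lines) {
            sb.append(line.getValue()).append(' ');
        }
    }
    return sb.toString();
}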
How to integrate General Text Recognition
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library
3. Client application development process.
Configure application on the AGC
Follow the steps
Step 1: Register a developer account in AppGallery Connect. If you are already a developer, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project.
Step 3: Set the data storage location based on the current location.
Step 4: Generate a Signing Certificate Fingerprint.
Step 5: Configure the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file and paste it into the app root directory.
Apply for HiAI Engine Library
What is Huawei HiAI?
HUAWEI HiAI is Huawei's mobile terminal–oriented artificial intelligence (AI) computing platform. It constructs three layers of ecology: service capability openness, application capability openness, and chip capability openness. This three-layer open platform, integrating terminals, chips, and the cloud, brings a more extraordinary experience to users and developers.
How to apply for HiAI Engine?
Follow the steps
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter the required information, such as Product name and Package name, then click Next.
Step 4: Verify the application details and click Submit.
Step 5: Click the Download SDK button to open the SDK list.
Step 6: Unzip the downloaded SDK and add it to your Android project under the libs folder.
Step 7: Add the JAR/AAR dependencies to the app build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps
Step 1: Create an Android application in Android Studio (or any IDE of your choice).
Step 2: Add the App level Gradle dependencies. Choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permissions in AndroidManifest.xml.
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build the application.
Initialize all views.
Java:
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
Request the runtime permissions
Java:
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
Initialize vision base
Java:
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
Initialize text to speech
Java:
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
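The complete code below releases the text detector in onDestroy but not the TTS engine. A small sketch (an addition, not from the original flow) of releasing it as well:
Java:
// Sketch: release the TTS engine when the screen is destroyed to free resources.
@Override
protected void onDestroy() {
    if (textToSpeech != null) {
        textToSpeech.stop();
        textToSpeech.shutdown();
    }
    super.onDestroy();
}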
Create TextDetector instance.
Java:
mTextDetector = new TextDetector(this);
Define Vision image.
Java:
VisionImage image = VisionImage.fromBitmap(mBitmap);
Create instance of Text class.
Java:
final Text result = new Text();
Create and set VisionTextConfiguration
Java:
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
Call detect method to get the result
Java:
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
Create Handler
Java:
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectText();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
if (result.getBlocks() == null || result.getBlocks().isEmpty()) {
mTxtViewResult.setText("No text blocks detected.");
break;
}
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines.");
break;
}
default:
break;
}
}
};
The complete code is as follows:
Java:
import android.Manifest;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.os.Message;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hiai.vision.common.ConnectionCallback;
import com.huawei.hiai.vision.common.VisionBase;
import com.huawei.hiai.vision.common.VisionCallback;
import com.huawei.hiai.vision.common.VisionImage;
import com.huawei.hiai.vision.text.TextDetector;
import com.huawei.hiai.vision.visionkit.text.Text;
import com.huawei.hiai.vision.visionkit.text.TextDetectType;
import com.huawei.hiai.vision.visionkit.text.TextLine;
import com.huawei.hiai.vision.visionkit.text.config.TextConfiguration;
import com.huawei.hiai.vision.visionkit.text.config.VisionTextConfiguration;
import java.util.List;
import java.util.Locale;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int REQUEST_CHOOSE_PHOTO_CODE = 2;
private Bitmap mBitmap;
private ImageView mPlayAudio;
private ImageView mImageView;
private TextView mTxtViewResult;
protected ProgressDialog dialog;
private TextDetector mTextDetector;
Text imageText = null;
TextToSpeech textToSpeech;
String textToSpeechString = "";
private static final int TYPE_CHOOSE_PHOTO = 1;
private static final int TYPE_SHOW_RESULT = 2;
private static final int TYPE_TEXT_ERROR = 3;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initializeView();
requestPermissions();
initVision();
initializeTextToSpeech();
}
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
public void onChildClick(View view) {
switch (view.getId()) {
case R.id.btnSelect: {
Log.d(TAG, "Select an image");
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQUEST_CHOOSE_PHOTO_CODE);
break;
}
case R.id.playAudio: {
if (textToSpeechString != null && !textToSpeechString.isEmpty())
textToSpeech.speak(textToSpeechString, TextToSpeech.QUEUE_FLUSH, null);
break;
}
}
}
private void detectText() {
/* create a TextDetector instance firstly */
mTextDetector = new TextDetector(this);
/*Define VisionImage and transfer the Bitmap image to be detected*/
VisionImage image = VisionImage.fromBitmap(mBitmap);
/*Define the Text class.*/
final Text result = new Text();
/*Use VisionTextConfiguration to select the type of the image to be called. */
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
/*Call the detect method of TextDetector to obtain the result*/
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
}
private void showDialog() {
if (dialog == null) {
dialog = new ProgressDialog(MainActivity.this);
dialog.setTitle("Detecting text...");
dialog.setMessage("Please wait...");
dialog.setIndeterminate(true);
dialog.setCancelable(false);
}
dialog.show();
}
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectText();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
if (result.getBlocks() == null || result.getBlocks().isEmpty()) {
mTxtViewResult.setText("No text blocks detected.");
break;
}
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines.");
break;
}
default:
break;
}
}
};
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CHOOSE_PHOTO_CODE && resultCode == Activity.RESULT_OK) {
if (data == null) {
return;
}
Uri selectedImage = data.getData();
getBitmap(selectedImage);
}
}
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
private void getBitmap(Uri imageUri) {
String[] pathColumn = {MediaStore.Images.Media.DATA};
Cursor cursor = getContentResolver().query(imageUri, pathColumn, null, null, null);
if (cursor == null) return;
cursor.moveToFirst();
int columnIndex = cursor.getColumnIndex(pathColumn[0]);
/* get image path */
String picturePath = cursor.getString(columnIndex);
cursor.close();
mBitmap = BitmapFactory.decodeFile(picturePath);
if (mBitmap == null) {
return;
}
//You can set image here
//mImageView.setImageBitmap(mBitmap);
// You can pass it handler as well
mHandler.sendEmptyMessage(TYPE_CHOOSE_PHOTO);
mTxtViewResult.setText("");
mPlayAudio.setEnabled(true);
}
private void dismissDialog() {
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
/* release ocr instance and free the npu resources*/
if (mTextDetector != null) {
mTextDetector.release();
}
dismissDialog();
if (mBitmap != null) {
mBitmap.recycle();
}
}
}
activity_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:fitsSystemWindows="true"
androidrientation="vertical"
android:background="@android:color/darker_gray">
<android.support.v7.widget.Toolbar
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="#ff0000"
android:elevation="10dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
androidrientation="horizontal">
<TextView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:text="Book Reader"
android:layout_gravity="center"
android:gravity="center|start"
android:layout_weight="1"
android:textColor="@android:color/white"
android:textStyle="bold"
android:textSize="20sp"/>
<ImageView
android:layout_width="40dp"
android:layout_height="40dp"
android:src="@drawable/ic_baseline_play_circle_outline_24"
android:layout_gravity="center|end"
android:layout_marginEnd="10dp"
android:id="@+id/playAudio"
androidadding="5dp"/>
</LinearLayout>
</android.support.v7.widget.Toolbar>
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:fitsSystemWindows="true">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
androidrientation="vertical"
android:background="@android:color/darker_gray"
>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="20dp"
android:layout_gravity="center">
<ImageView
android:id="@+id/imgViewPicture"
android:layout_width="300dp"
android:layout_height="300dp"
android:layout_margin="8dp"
android:layout_gravity="center_horizontal"
android:scaleType="fitXY" />
</android.support.v7.widget.CardView>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="10dp"
android:layout_gravity="center"
android:layout_marginBottom="20dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
androidrientation="vertical"
>
<TextView
android:layout_margin="5dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/black"
android:text="Text on the image"
android:textStyle="normal"
/>
<TextView
android:id="@+id/result"
android:layout_margin="5dp"
android:layout_marginBottom="20dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textSize="18sp"
android:textColor="#ff0000"/>
</LinearLayout>
</android.support.v7.widget.CardView>
<Button
android:id="@+id/btnSelect"
android:layout_width="match_parent"
android:layout_height="wrap_content"
androidnClick="onChildClick"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginBottom="10dp"
android:text="[USER=936943]@string[/USER]/select_picture"
android:background="@drawable/round_button_bg"
android:textColor="@android:color/white"
android:textAllCaps="false"/>
</LinearLayout>
</ScrollView>
</LinearLayout>
Result
Tips and Tricks
Maximum width and height: 1440 px and 15210 px (if the image is larger, error code 200 is returned); a defensive downscaling sketch follows this list.
Recommended photo size for optimal recognition accuracy: resolution greater than 720p and aspect ratio lower than 2:1.
If you are capturing images from the camera or gallery, make sure your app has camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
Check that the dependencies are added properly.
The latest HMS Core APK is required.
Min SDK is 21; otherwise you will get a manifest merge issue.
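Because oversized images trigger error code 200, it helps to downscale defensively before detection. A minimal sketch (an addition to this article; the 1440 px width limit is taken from the tip above and should be treated as an assumption):
Java:
// Sketch: scale a bitmap down so its width stays within the limit noted above
// before passing it to TextDetector. 1440 px is an assumption from the tip list.
private Bitmap scaleToDetectorLimit(Bitmap src) {
    final int maxWidth = 1440;
    if (src.getWidth() <= maxWidth) {
        return src;
    }
    float ratio = maxWidth / (float) src.getWidth();
    int newHeight = Math.round(src.getHeight() * ratio);
    return Bitmap.createScaledBitmap(src, maxWidth, newHeight, true);
}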
Conclusion
In this article, we have learnt the following concepts:
What is OCR?
About General Text Recognition.
Features of GTR.
Features of OCR.
How to integrate General Text Recognition using Huawei HiAI.
How to apply for Huawei HiAI.
How to build the application.
Reference
General Text Recognition
Apply for Huawei HiAI
Happy coding

Pygmy Collection Application Part 7 (Document Skew correction Huawei HiAI)

Introduction
If you are new to this application, please follow my previous articles:
Pygmy collection application Part 1 (Account kit)
Intermediate: Pygmy Collection Application Part 2 (Ads Kit)
Intermediate: Pygmy Collection Application Part 3 (Crash service)
Intermediate: Pygmy Collection Application Part 4 (Analytics Kit Custom Events)
Intermediate: Pygmy Collection Application Part 5 (Safety Detect)
Intermediate: Pygmy Collection Application Part 6 (Room database)
In this article, we will learn how to integrate Huawei Document skew correction using Huawei HiAI in Pygmy collection finance application.
In the pygmy collection application, agents update customer KYC details; the uploaded document should be captured properly, so we will integrate document skew correction to adjust the image angle.
Commonly, users struggle while uploading or filling any form due to document image issues. This application helps them take a picture from the camera or pick one from the gallery, and it automatically detects the document in the image.
Document skew correction is used to improve the document photography process by automatically identifying the document in an image. It returns the position of the document in the original image.
Document skew correction also adjusts the shooting angle of the document based on its position in the original image. This function performs well in scenarios where old photos, paper letters, and drawings are photographed for electronic storage.
Features
Document detection: Recognizes documents in images and returns the location information of the documents in the original images.
Document correction: Corrects the document shooting angle based on the document location information in the original images; the areas to be corrected can be customized. (A minimal sketch of this two-step flow follows.)
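Before walking through the full activity, here is a minimal sketch of the two-step flow (detect the document corners, then correct the shooting angle) that the code later in this article implements; error handling and UI updates are omitted for brevity.
Java:
// Sketch of the detect-then-correct flow, using the same ML Kit classes as the
// full DocumentSkewCorrectionActivity below.
MLDocumentSkewCorrectionAnalyzerSetting setting =
        new MLDocumentSkewCorrectionAnalyzerSetting.Factory().create();
MLDocumentSkewCorrectionAnalyzer analyzer = MLDocumentSkewCorrectionAnalyzerFactory
        .getInstance().getDocumentSkewCorrectionAnalyzer(setting);
MLFrame frame = MLFrame.fromBitmap(bitmap);
// Step 1: detect the document and obtain its four corner points.
analyzer.asyncDocumentSkewDetect(frame).addOnSuccessListener(detectResult -> {
    if (detectResult.getResultCode() == 0) {
        List<Point> corners = new ArrayList<>();
        corners.add(detectResult.getLeftTopPosition());
        corners.add(detectResult.getRightTopPosition());
        corners.add(detectResult.getRightBottomPosition());
        corners.add(detectResult.getLeftBottomPosition());
        // Step 2: correct the shooting angle using the detected coordinates.
        MLDocumentSkewCorrectionCoordinateInput input =
                new MLDocumentSkewCorrectionCoordinateInput(corners);
        analyzer.asyncDocumentSkewCorrect(frame, input).addOnSuccessListener(correctResult -> {
            if (correctResult.getResultCode() == 0) {
                Bitmap corrected = correctResult.getCorrected(); // use the corrected image
            }
        });
    }
});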
How to integrate Document Skew Correction
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library.
3. Client application development process.
Configure application on the AGC
Follow the steps.
Step 1: Register a developer account in AppGallery Connect. If you are already a developer, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project.
Step 3: Set the data storage location based on the current location.
Step 4: Generate a Signing Certificate Fingerprint.
Step 5: Configure the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file and paste it into the app root directory.
Apply for HiAI Engine Library
What is Huawei HiAI?
HUAWEI HiAI is Huawei's mobile terminal–oriented artificial intelligence (AI) computing platform. It constructs three layers of ecology: service capability openness, application capability openness, and chip capability openness. This three-layer open platform, integrating terminals, chips, and the cloud, brings a more extraordinary experience to users and developers.
How to apply for HiAI Engine?
Follow the steps.
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter the required information, such as Product name and Package name, then click Next.
Step 4: Verify the application details and click Submit.
Step 5: Click the Download SDK button to open the SDK list.
Step 6: Unzip the downloaded SDK and add it to your Android project under the libs folder.
Step 7: Add the JAR/AAR dependencies to the app build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps.
Step 1: Create an Android application in Android Studio (or any IDE of your choice).
Step 2: Add the App level Gradle dependencies. Choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permissions in AndroidManifest.xml.
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build the application.
Select image dialog
Java:
private void selectImage() {
final CharSequence[] items = {"Take Photo", "Choose from Library",
"Cancel"};
AlertDialog.Builder builder = new AlertDialog.Builder(KycUpdateActivity.this);
builder.setTitle("Add Photo!");
builder.setItems(items, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int item) {
boolean result = PygmyUtils.checkPermission(KycUpdateActivity.this);
if (items[item].equals("Take Photo")) {
/*userChoosenTask = "Take Photo";
if (result)
cameraIntent();*/
operate_type = TAKE_PHOTO;
requestPermission(Manifest.permission.CAMERA);
} else if (items[item].equals("Choose from Library")) {
/* userChoosenTask = "Choose from Library";
if (result)
galleryIntent();*/
operate_type = SELECT_ALBUM;
requestPermission(Manifest.permission.READ_EXTERNAL_STORAGE);
} else if (items[item].equals("Cancel")) {
dialog.dismiss();
}
}
});
builder.show();
}
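The requestPermission(...) helper called above is not shown in this article. A minimal sketch of what it might look like (the helper name, the TAKE_PHOTO/SELECT_ALBUM constants, and the request code are assumptions):
Java:
// Hypothetical helper: ask for the runtime permission, or proceed straight to
// the document skew correction screen if it is already granted.
private void requestPermission(String permission) {
    if (ActivityCompat.checkSelfPermission(this, permission)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this, new String[]{permission}, 0x0011);
    } else {
        startSuperResolutionActivity();
    }
}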
Open the Document Skew Correction activity
Java:
private void startSuperResolutionActivity() {
Intent intent = new Intent(KycUpdateActivity.this, DocumentSkewCorrectionActivity.class);
intent.putExtra("operate_type", operate_type);
startActivityForResult(intent, DOCUMENT_SKEW_CORRECTION_REQUEST);
}
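Back in the calling activity, the result posted by DocumentSkewCorrectionActivity can be consumed as below. This is a sketch, not part of the original code; the "status" extra matches what the activity sets later in this article.
Java:
// Hypothetical handler in KycUpdateActivity: react to the corrected document.
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == DOCUMENT_SKEW_CORRECTION_REQUEST && resultCode == Activity.RESULT_OK) {
        String status = data != null ? data.getStringExtra("status") : null;
        if ("success".equals(status)) {
            // Reload the saved, corrected document image into the KYC form.
        }
    }
}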
DocumentSkewCorrectionActivity.java
Java:
import androidx.annotation.Nullable;
import androidx.appcompat.app.AppCompatActivity;
import android.app.Activity;
import android.app.AlertDialog;
import android.app.ProgressDialog;
import android.content.ContentValues;
import android.content.DialogInterface;
import android.content.Intent;
import android.graphics.Bitmap;
import android.graphics.Matrix;
import android.graphics.Point;
import android.media.ExifInterface;
import android.net.Uri;
import android.os.Bundle;
import android.os.Handler;
import android.provider.MediaStore;
import android.util.Log;
import android.view.View;
import android.widget.ImageButton;
import android.widget.ImageView;
import android.widget.RelativeLayout;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hmf.tasks.OnFailureListener;
import com.huawei.hmf.tasks.OnSuccessListener;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzer;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerFactory;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionAnalyzerSetting;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionCoordinateInput;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewCorrectionResult;
import com.huawei.hms.mlsdk.dsc.MLDocumentSkewDetectResult;
import com.shea.pygmycollection.R;
import com.shea.pygmycollection.customview.DocumentCorrectImageView;
import com.shea.pygmycollection.utils.FileUtils;
import com.shea.pygmycollection.utils.UserDataUtils;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
public class DocumentSkewCorrectionActivity extends AppCompatActivity implements View.OnClickListener {
private static final String TAG = "SuperResolutionActivity";
private static final int REQUEST_SELECT_IMAGE = 1000;
private static final int REQUEST_TAKE_PHOTO = 1;
private ImageView desImageView;
private ImageButton adjustImgButton;
private Bitmap srcBitmap;
private Bitmap getCompressesBitmap;
private Uri imageUri;
private MLDocumentSkewCorrectionAnalyzer analyzer;
private Bitmap corrected;
private ImageView back;
private Task<MLDocumentSkewCorrectionResult> correctionTask;
private DocumentCorrectImageView documetScanView;
private Point[] _points;
private RelativeLayout layout_image;
private MLFrame frame;
TextView selectTv;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_document_skew_correction);
//setStatusBarColor(this, R.color.black);
analyzer = createAnalyzer();
adjustImgButton = findViewById(R.id.adjust);
layout_image = findViewById(R.id.layout_image);
desImageView = findViewById(R.id.des_image);
documetScanView = findViewById(R.id.iv_documetscan);
back = findViewById(R.id.back);
selectTv = findViewById(R.id.selectTv);
adjustImgButton.setOnClickListener(this);
findViewById(R.id.back).setOnClickListener(this);
selectTv.setOnClickListener(this);
findViewById(R.id.rl_chooseImg).setOnClickListener(this);
back.setOnClickListener(this);
int operate_type = getIntent().getIntExtra("operate_type", 101);
if (operate_type == 101) {
takePhoto();
} else if (operate_type == 102) {
selectLocalImage();
}
}
private String[] chooseTitles;
@Override
public void onClick(View v) {
if (v.getId() == R.id.adjust) {
List<Point> points = new ArrayList<>();
Point[] cropPoints = documetScanView.getCropPoints();
if (cropPoints != null) {
points.add(cropPoints[0]);
points.add(cropPoints[1]);
points.add(cropPoints[2]);
points.add(cropPoints[3]);
}
MLDocumentSkewCorrectionCoordinateInput coordinateData = new MLDocumentSkewCorrectionCoordinateInput(points);
getDetectdetectResult(coordinateData, frame);
} else if (v.getId() == R.id.rl_chooseImg) {
chooseTitles = new String[]{getResources().getString(R.string.take_photo), getResources().getString(R.string.select_from_album)};
AlertDialog.Builder builder = new AlertDialog.Builder(this);
builder.setItems(chooseTitles, new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialogInterface, int position) {
if (position == 0) {
takePhoto();
} else {
selectLocalImage();
}
}
});
builder.create().show();
} else if (v.getId() == R.id.selectTv) {
if (corrected == null) {
Toast.makeText(this, "Document Skew correction is not yet success", Toast.LENGTH_SHORT).show();
return;
} else {
ProgressDialog pd = new ProgressDialog(this);
pd.setMessage("Please wait...");
pd.show();
//UserDataUtils.saveBitMap(this, corrected);
Intent intent = new Intent();
intent.putExtra("status", "success");
new Handler().postDelayed(new Runnable() {
@Override
public void run() {
if (pd != null && pd.isShowing()) {
pd.dismiss();
}
setResult(Activity.RESULT_OK, intent);
finish();
}
}, 3000);
}
} else if (v.getId() == R.id.back) {
finish();
}
}
private MLDocumentSkewCorrectionAnalyzer createAnalyzer() {
MLDocumentSkewCorrectionAnalyzerSetting setting = new MLDocumentSkewCorrectionAnalyzerSetting
.Factory()
.create();
return MLDocumentSkewCorrectionAnalyzerFactory.getInstance().getDocumentSkewCorrectionAnalyzer(setting);
}
private void takePhoto() {
layout_image.setVisibility(View.GONE);
Intent takePictureIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
if (takePictureIntent.resolveActivity(this.getPackageManager()) != null) {
ContentValues values = new ContentValues();
values.put(MediaStore.Images.Media.TITLE, "New Picture");
values.put(MediaStore.Images.Media.DESCRIPTION, "From Camera");
this.imageUri = this.getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);
takePictureIntent.putExtra(MediaStore.EXTRA_OUTPUT, this.imageUri);
this.startActivityForResult(takePictureIntent, DocumentSkewCorrectionActivity.this.REQUEST_TAKE_PHOTO);
}
}
private void selectLocalImage() {
layout_image.setVisibility(View.GONE);
Intent intent = new Intent(Intent.ACTION_PICK, null);
intent.setDataAndType(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, "image/*");
startActivityForResult(intent, REQUEST_SELECT_IMAGE);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_SELECT_IMAGE && resultCode == Activity.RESULT_OK && data != null) {
imageUri = data.getData();
try {
if (imageUri != null) {
srcBitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), imageUri);
String realPathFromURI = getRealPathFromURI(imageUri);
int i = readPictureDegree(realPathFromURI);
Bitmap spBitmap = rotateImageView(i, srcBitmap);
Matrix matrix = new Matrix();
matrix.setScale(0.5f, 0.5f);
getCompressesBitmap = Bitmap.createBitmap(spBitmap, 0, 0, spBitmap.getWidth(),
spBitmap.getHeight(), matrix, true);
reloadAndDetectImage();
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
} else if (requestCode == REQUEST_TAKE_PHOTO && resultCode == Activity.RESULT_OK) {
try {
if (imageUri != null) {
srcBitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), imageUri);
String realPathFromURI = getRealPathFromURI(imageUri);
int i = readPictureDegree(realPathFromURI);
Bitmap spBitmap = rotateImageView(i, srcBitmap);
Matrix matrix = new Matrix();
matrix.setScale(0.5f, 0.5f);
getCompressesBitmap = Bitmap.createBitmap(spBitmap, 0, 0, spBitmap.getWidth(),
spBitmap.getHeight(), matrix, true);
reloadAndDetectImage();
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
} else if (requestCode == REQUEST_SELECT_IMAGE && resultCode == Activity.RESULT_CANCELED) {
finish();
}
}
private void reloadAndDetectImage() {
if (imageUri == null) {
return;
}
frame = MLFrame.fromBitmap(getCompressesBitmap);
Task<MLDocumentSkewDetectResult> task = analyzer.asyncDocumentSkewDetect(frame);
task.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewDetectResult>() {
public void onSuccess(MLDocumentSkewDetectResult result) {
if (result.getResultCode() != 0) {
corrected = null;
Toast.makeText(DocumentSkewCorrectionActivity.this, "The picture does not meet the requirements.", Toast.LENGTH_SHORT).show();
} else {
// Recognition success.
Point leftTop = result.getLeftTopPosition();
Point rightTop = result.getRightTopPosition();
Point leftBottom = result.getLeftBottomPosition();
Point rightBottom = result.getRightBottomPosition();
_points = new Point[4];
_points[0] = leftTop;
_points[1] = rightTop;
_points[2] = rightBottom;
_points[3] = leftBottom;
layout_image.setVisibility(View.GONE);
documetScanView.setImageBitmap(getCompressesBitmap);
documetScanView.setPoints(_points);
}
}
}).addOnFailureListener(new OnFailureListener() {
public void onFailure(Exception e) {
Toast.makeText(getApplicationContext(), e.getMessage(), Toast.LENGTH_SHORT).show();
}
});
}
private void getDetectdetectResult(MLDocumentSkewCorrectionCoordinateInput coordinateData, MLFrame frame) {
try {
correctionTask = analyzer.asyncDocumentSkewCorrect(frame, coordinateData);
} catch (Exception e) {
Log.e(TAG, "The image does not meet the detection requirements.");
}
try {
correctionTask.addOnSuccessListener(new OnSuccessListener<MLDocumentSkewCorrectionResult>() {
@Override
public void onSuccess(MLDocumentSkewCorrectionResult refineResult) {
// The check is successful.
if (refineResult != null && refineResult.getResultCode() == 0) {
corrected = refineResult.getCorrected();
layout_image.setVisibility(View.VISIBLE);
desImageView.setImageBitmap(corrected);
UserDataUtils.saveBitMap(DocumentSkewCorrectionActivity.this, corrected);
} else {
Toast.makeText(DocumentSkewCorrectionActivity.this, "The check fails.", Toast.LENGTH_SHORT).show();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Toast.makeText(DocumentSkewCorrectionActivity.this, "The check fails.", Toast.LENGTH_SHORT).show();
}
});
} catch (Exception e) {
Log.e(TAG, "Please set an image.");
}
}
@Override
protected void onDestroy() {
super.onDestroy();
if (srcBitmap != null) {
srcBitmap.recycle();
}
if (getCompressesBitmap != null) {
getCompressesBitmap.recycle();
}
if (corrected != null) {
corrected.recycle();
}
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
}
}
public static int readPictureDegree(String path) {
int degree = 0;
try {
ExifInterface exifInterface = new ExifInterface(path);
int orientation = exifInterface.getAttributeInt(
ExifInterface.TAG_ORIENTATION,
ExifInterface.ORIENTATION_NORMAL);
switch (orientation) {
case ExifInterface.ORIENTATION_ROTATE_90:
degree = 90;
break;
case ExifInterface.ORIENTATION_ROTATE_180:
degree = 180;
break;
case ExifInterface.ORIENTATION_ROTATE_270:
degree = 270;
break;
}
} catch (IOException e) {
Log.e(TAG, e.getMessage());
}
return degree;
}
private String getRealPathFromURI(Uri contentURI) {
String result;
result = FileUtils.getFilePathByUri(this, contentURI);
return result;
}
public static Bitmap rotateImageView(int angle, Bitmap bitmap) {
Matrix matrix = new Matrix();
matrix.postRotate(angle);
Bitmap resizedBitmap = Bitmap.createBitmap(bitmap, 0, 0,
bitmap.getWidth(), bitmap.getHeight(), matrix, true);
return resizedBitmap;
}
}
activity_document_skew_correction.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000000"
tools:ignore="MissingDefaultResource">
<LinearLayout
android:id="@+id/linear_views"
android:layout_width="match_parent"
android:layout_height="80dp"
android:layout_alignParentBottom="true"
android:layout_marginBottom="10dp"
android:orientation="horizontal">
<RelativeLayout
android:id="@+id/rl_chooseImg"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageView
android:layout_width="25dp"
android:layout_height="25dp"
android:layout_centerInParent="true"
android:src="@drawable/add_picture"
app:tint="@color/colorPrimary" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl_adjust"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageButton
android:id="@+id/adjust"
android:layout_width="70dp"
android:layout_height="70dp"
android:layout_centerInParent="true"
android:background="@drawable/ic_baseline_adjust_24" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/rl_help"
android:layout_width="0dp"
android:layout_height="match_parent"
android:layout_weight="1">
<ImageView
android:layout_width="30dp"
android:layout_height="30dp"
android:layout_centerInParent="true"
android:src="@drawable/back"
android:visibility="invisible" />
</RelativeLayout>
</LinearLayout>
<RelativeLayout
android:id="@+id/rl_navigation"
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="@color/colorPrimary">
<ImageButton
android:id="@+id/back"
android:layout_width="30dp"
android:layout_height="30dp"
android:layout_centerVertical="true"
android:layout_marginLeft="20dp"
android:layout_marginTop="@dimen/icon_back_margin"
android:background="@drawable/back" />
<TextView
android:id="@+id/selectTv"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentEnd="true"
android:layout_centerVertical="true"
android:layout_marginEnd="10dp"
android:fontFamily="@font/montserrat_bold"
android:padding="@dimen/hiad_10_dp"
android:text="Select Doc"
android:textColor="@color/white" />
</RelativeLayout>
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerHorizontal="true"
android:layout_marginTop="99dp"
android:layout_marginBottom="100dp">
<com.shea.pygmycollection.customview.DocumentCorrectImageView
android:id="@+id/iv_documetscan"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:padding="20dp"
app:LineColor="@color/colorPrimary" />
</RelativeLayout>
<RelativeLayout
android:id="@+id/layout_image"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerHorizontal="true"
android:layout_marginTop="99dp"
android:layout_marginBottom="100dp"
android:background="#000000"
android:visibility="gone">
<ImageView
android:id="@+id/des_image"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:padding="20dp" />
</RelativeLayout>
</RelativeLayout>
Result
Tips and Tricks
Recommended image width and height: 1080 px and 2560 px.
Multi-thread invoking is currently not supported.
The document detection and correction API can only be called by 64-bit apps (see the guard sketch after this list).
If you are capturing images from the camera or gallery, make sure your app has camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
Check that the dependencies are added properly.
The latest HMS Core APK is required.
Min SDK is 21; otherwise you will get a manifest merge issue.
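Since the API is 64-bit only, you can fail fast on 32-bit processes. A minimal sketch using the standard android.os.Process.is64Bit() API (available from API level 23); this guard is an addition, not part of the original code:
Java:
// Sketch: skip the skew correction call on 32-bit processes.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && !android.os.Process.is64Bit()) {
    Toast.makeText(this, "Document skew correction requires a 64-bit app.", Toast.LENGTH_SHORT).show();
    return;
}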
Conclusion
In this article, we have built an application that detects a document in an image and corrects its skew. We have learnt the following concepts:
1. What is document skew correction?
2. Features of document skew correction.
3. How to integrate document skew correction using Huawei HiAI.
4. How to apply for Huawei HiAI.
5. How to build the application.
Reference
Document skew correction
Apply for Huawei HiAI
