Example of Automatic Speech Recognition without Pickup UI - Huawei Developers

For more information like this, you can visit the HUAWEI Developer Forum.
Original link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202351447289070169&fid=0101187876626530001
Automatic Speech Recognition
Service Introduction
Automatic speech recognition (ASR) can recognize speech not longer than 60 seconds and convert the input speech into text in real time. This service uses industry-leading deep learning technologies to achieve a recognition accuracy of over 95%. Currently, Mandarin Chinese (including Chinese-English bilingual speech), English, French, German, Spanish, and Italian can be recognized.
Precautions
Currently, ASR is available only on Huawei phones.
ASR depends on the on-cloud API for speech recognition. During commissioning and usage, ensure that the device can access the Internet.
Integrating the Automatic Speech Recognition Service
You can integrate the automatic speech recognition (ASR) service in either of the following modes:
Integrate the ASR plug-in
Integrate the ASR SDK
The SDK provides only the basic ASR services, and you need to develop the speech pickup UI by yourself.
The sample code for integration through the SDK is as follows:
Code:
dependencies{
// Import the ASR SDK.
implementation 'com.huawei.hms:ml-computer-voice-asr:2.0.1.300'
}
Development Process
Before API development, you need to create a custom layout based on your requirements. Check the code below:
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#fff">
<androidx.cardview.widget.CardView
android:id="@+id/card_listen"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:cardBackgroundColor="#f1f1f1"
app:cardElevation="5dp">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_margin="@dimen/dimen_10">
<ImageView
android:id="@+id/img_close_listen"
android:layout_width="@dimen/dimen_50"
android:layout_height="@dimen/dimen_50"
android:padding="@dimen/dimen_10"
android:src="@drawable/close_pc" />
<TextView
android:id="@+id/txt_listening"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/img_close_listen"
android:layout_marginTop="@dimen/dimen_20"
android:text="Listening..."
android:textColor="#000"
android:textSize="18sp"
android:textStyle="bold" />
<ImageView
android:id="@+id/img_listening_mic"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:src="@drawable/listening_mic" />
<RelativeLayout
android:id="@+id/rl_try_again"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:visibility="gone"
android:layout_alignParentBottom="true"
android:layout_margin="@dimen/dimen_50">
<com.google.android.material.floatingactionbutton.FloatingActionButton
android:id="@+id/fab_mic"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:backgroundTint="#fff"
android:src="@drawable/mic_pc" />
<TextView
android:id="@+id/txt_tap_micro"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/fab_mic"
android:layout_centerHorizontal="true"
android:layout_marginTop="@dimen/dimen_20"
android:text="Tap microphone to try again"
android:textColor="#000"
android:textSize="12sp"
android:textStyle="bold" />
</RelativeLayout>
</RelativeLayout>
</androidx.cardview.widget.CardView>
</RelativeLayout>
1. Set the API key
Code:
private void setApiKey() {
// AGConnectServicesConfig config = AGConnectServicesConfig.fromContext(getApplication());
// MLApplication.getInstance().setApiKey(config.getString("client/api_key"));
MLApplication.getInstance().setApiKey(getString(R.string.api_key));
}
2. Initialize ASR
Code:
private void initASR() {
speechRecognizer = MLAsrRecognizer.createAsrRecognizer(this);
speechRecognizer.setAsrListener(this);
}
3. Implement a speech recognition result listener callback
Code:
@Override
public void onResults(Bundle bundle) {
Log.e(TAG + "onResults", "onResults");
final String speechResult = bundle.getString("results_recognized");
if (speechResult != null && !speechResult.equals("")) {
txtListeningText.setTextColor(Color.RED);
txtListeningText.setText(speechResult);
Log.e(TAG + " Speech Text", speechResult);
Handler handler = new Handler();
handler.postDelayed(new Runnable() {
@Override
public void run() {
Intent intent = new Intent(SearchActivity.this, MapActivity.class);
intent.putExtra("LISTEN_DATA", speechResult);
setResult(Activity.RESULT_OK, intent);
finish();
stopSpeechListener();
}
}, 2000);
}
}
@Override
public void onRecognizingResults(Bundle bundle) {
Log.e(TAG + "onRecognizingResults", "onRecognizingResults");
}
@Override
public void onError(int i, String s) {
if (s != null)
Toast.makeText(SearchActivity.this, TAG + "onError" + s, Toast.LENGTH_SHORT).show();
}
@Override
public void onStartListening() {
txtListeningText.setText("Listening...");
}
@Override
public void onStartingOfSpeech() {
Log.e(TAG + "onStartOfSpeech", "Speech Started");
}
@Override
public void onVoiceDataReceived(byte[] bytes, float v, Bundle bundle) {
}
@Override
public void onState(int i, Bundle bundle) {
Log.e(TAG + "onState", String.valueOf(i));
}
4. Set recognition parameters and call startRecognizing to start speech recognition
Code:
private void startCustomASR() {
if (checkPermission()) {
countDown();
rlTryAgain.setVisibility(View.GONE);
imgListening.setVisibility(View.VISIBLE);
txtListeningText.setTextColor(Color.BLACK);
Intent mSpeechRecognizerIntent = new Intent(MLAsrConstants.ACTION_HMS_ASR_SPEECH);
mSpeechRecognizerIntent.putExtra(MLAsrConstants.LANGUAGE, MLAsrConstants.LAN_EN_IN);
mSpeechRecognizerIntent.putExtra(MLAsrConstants.FEATURE, MLAsrConstants.FEATURE_WORDFLUX);
speechRecognizer.startRecognizing(mSpeechRecognizerIntent);
} else {
requestPermissions();
}
}
5. Add the audio recording permission to the AndroidManifest.xml file
Code:
<uses-permission android:name="android.permission.RECORD_AUDIO" />
6. Before calling startRecognizing, check the runtime permission. Use the methods below to check and request permissions.
Code:
private boolean checkPermission() {
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) == PackageManager.PERMISSION_GRANTED) {
return true;
} else {
requestPermissions();
return false;
}
}
private void requestPermissions() {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.RECORD_AUDIO}, REQUEST_RECORD_AUDIO);
}
Override the onRequestPermissionsResult method
Code:
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (requestCode == REQUEST_RECORD_AUDIO) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
cardListening.setVisibility(View.VISIBLE);
startCustomASR();
} else {
requestPermissions();
}
}
}
7. Add a countdown timer as follows
Code:
void countDown() {
if (countDownTimer != null) {
countDownTimer.cancel();
}
countDownTimer = new CountDownTimer(7000, 1000) {
@Override
public void onTick(long l) {
}
@Override
public void onFinish() {
rlTryAgain.setVisibility(View.VISIBLE);
imgListening.setVisibility(View.GONE);
txtListeningText.setText("Try Again");
stopSpeechListener();
}
};
countDownTimer.start();
}
8. Release resources after the detection is complete
Code:
private void stopSpeechListener() {
if (speechRecognizer != null) {
speechRecognizer.destroy();
}
}
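To make sure the recognizer is always released, it is also worth calling this method from the activity's onDestroy callback. A minimal sketch, assuming the stopSpeechListener() method above lives in the same activity:
Code:
@Override
protected void onDestroy() {
    super.onDestroy();
    // Release the ASR engine if the activity is destroyed while still listening.
    stopSpeechListener();
}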
The output is shown in the image below.
Conclusion
This article covers how to use ASR without the default pickup UI. You do not always have to use the default UI; sometimes a custom ASR UI is a better fit.
Reference link
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/ml-asr-0000001050066212


Related

Surface detection with AR Engine

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
AR Engine can detect objects in the real world through a capability called "environment tracking". With it, you can record illumination, plane, image, object, surface, and other environmental information to help your apps merge virtual objects into scenarios in the physical world.
What is HUAWEI AR Engine?
HUAWEI AR Engine is a platform for building augmented reality (AR) apps on Android smartphones. It is based on the HiSilicon chipset and integrates AR core algorithms to provide basic AR capabilities such as motion tracking, environment tracking, body tracking, and face tracking, allowing your app to bridge the virtual world with the real world for a brand new, visually interactive user experience.
Currently, HUAWEI AR Engine provides three types of capabilities, including motion tracking, environment tracking, and human body and face tracking.
Example Android Application
For this example, we will work with environment tracking so we can detect surfaces such as a table or a floor.
Development Process
Creating an App
Create an app following instructions in Creating an AppGallery Connect Project and Adding an App to the Project.
Platform: Android
Device: Mobile phone
App category: App or Game
Integrating HUAWEI AR Engine SDK
Before development, integrate the HUAWEI AR Engine SDK via the Maven repository into your development environment.
Open the build.gradle file in the root directory of your Android Studio project.
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.huawei.agconnect:agcp:1.3.2.301'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
maven {url 'https://developer.huawei.com/repo/'}
google()
jcenter()
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
Open the build.gradle file in the app directory of your project
Code:
apply plugin: 'com.android.application'
android {
compileSdkVersion 30
buildToolsVersion "30.0.1"
defaultConfig {
applicationId "com.vsm.myarapplication"
minSdkVersion 27
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
testImplementation 'junit:junit:4.12'
//
implementation 'com.huawei.agconnect:agconnect-core:1.4.1.300'
//
implementation 'com.huawei.hms:arenginesdk:2.13.0.4'
//
implementation 'de.javagl:obj:0.3.0'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
}
apply plugin: 'com.huawei.agconnect'
We create our main activity layout (activity_main.xml):
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<android.opengl.GLSurfaceView
android:id="@+id/surfaceview"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:layout_gravity="top" />
<TextView
android:id="@+id/wordTextView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="TextView"
android:textColor="@color/red"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/searchingTextView"
android:layout_width="match_parent"
android:layout_height="47dp"
android:layout_alignParentStart="true"
android:layout_alignParentTop="true"
android:layout_marginStart="2dp"
android:layout_marginTop="59dp"
android:layout_marginBottom="403dp"
android:gravity="center"
android:text="Please move the mobile phone slowly to find the plane"
android:textColor="#ffffff"
tools:layout_editor_absoluteX="0dp"
tools:layout_editor_absoluteY="512dp" />
<TextView
android:id="@+id/plane_other"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_other"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_floor"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_floor"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_wall"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_wall"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_table"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_table"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_seat"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_seat"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
<TextView
android:id="@+id/plane_ceiling"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="@string/plane_ceiling"
android:visibility="gone"
android:rotation="180"
android:textColor="#ff2211"
tools:layout_editor_absoluteX="315dp"
tools:layout_editor_absoluteY="4dp" />
</RelativeLayout>
AR Engine is not available on all devices, so first we need to check whether the device supports AR Engine and whether the AR Engine service APK is available; the list of supported devices can be found in the official documentation.
Code:
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
Code:
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
In our MainActivity.java, we call the surface detection:
Code:
package com.vsm.myarapplication;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Intent;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import android.util.Log;
import android.view.GestureDetector;
import android.view.MotionEvent;
import android.view.View;
import android.widget.Toast;
import com.huawei.hiar.ARConfigBase;
import com.huawei.hiar.AREnginesApk;
import com.huawei.hiar.ARSession;
import com.huawei.hiar.ARWorldTrackingConfig;
import com.huawei.hiar.exceptions.ARCameraNotAvailableException;
import com.huawei.hiar.exceptions.ARUnSupportedConfigurationException;
import com.huawei.hiar.exceptions.ARUnavailableClientSdkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceApkTooOldException;
import com.huawei.hiar.exceptions.ARUnavailableServiceNotInstalledException;
import com.vsm.myarapplication.common.ConnectAppMarketActivity;
import com.vsm.myarapplication.common.DisplayRotationManager;
import com.vsm.myarapplication.common.PermissionManager;
import com.vsm.myarapplication.rendering.WorldRenderManager;
import java.util.concurrent.ArrayBlockingQueue;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int MOTIONEVENT_QUEUE_CAPACITY = 2;
private static final int OPENGLES_VERSION = 2;
private ARSession mArSession;
private GLSurfaceView mSurfaceView;
private WorldRenderManager mWorldRenderManager;
private GestureDetector mGestureDetector;
private DisplayRotationManager mDisplayRotationManager;
private ArrayBlockingQueue<GestureEvent> mQueuedSingleTaps = new ArrayBlockingQueue<>(MOTIONEVENT_QUEUE_CAPACITY);
private String message = null;
private boolean isRemindInstall = false;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
// AR Engine requires the camera permission.
PermissionManager.checkPermission(this);
mSurfaceView = findViewById(R.id.surfaceview);
mDisplayRotationManager = new DisplayRotationManager(this);
initGestureDetector();
mSurfaceView.setPreserveEGLContextOnPause(true);
mSurfaceView.setEGLContextClientVersion(OPENGLES_VERSION);
// Set the EGL configuration chooser, including for the number of
// bits of the color buffer and the number of depth bits.
mSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
mWorldRenderManager = new WorldRenderManager(this, this);
mWorldRenderManager.setDisplayRotationManage(mDisplayRotationManager);
mWorldRenderManager.setQueuedSingleTaps(mQueuedSingleTaps);
mSurfaceView.setRenderer(mWorldRenderManager);
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
}
private void initGestureDetector() {
mGestureDetector = new GestureDetector(this, new GestureDetector.SimpleOnGestureListener() {
@Override
public boolean onDoubleTap(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createDoubleTapEvent(motionEvent));
return true;
}
@Override
public boolean onSingleTapConfirmed(MotionEvent motionEvent) {
onGestureEvent(GestureEvent.createSingleTapConfirmEvent(motionEvent));
return true;
}
@Override
public boolean onDown(MotionEvent motionEvent) {
return true;
}
@Override
public boolean onScroll(MotionEvent e1, MotionEvent e2, float distanceX, float distanceY) {
onGestureEvent(GestureEvent.createScrollEvent(e1, e2, distanceX, distanceY));
return true;
}
});
mSurfaceView.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
return mGestureDetector.onTouchEvent(event);
}
});
}
private void onGestureEvent(GestureEvent e) {
boolean offerResult = mQueuedSingleTaps.offer(e);
if (offerResult) {
Log.i(TAG, "Successfully joined the queue.");
} else {
Log.i(TAG, "Failed to join queue.");
}
}
@Override
protected void onResume() {
Log.i(TAG, "onResume");
super.onResume();
Exception exception = null;
message = null;
if (mArSession == null) {
try {
if (!arEngineAbilityCheck()) {
finish();
return;
}
mArSession = new ARSession(getApplicationContext());
ARWorldTrackingConfig config = new ARWorldTrackingConfig(mArSession);
config.setFocusMode(ARConfigBase.FocusMode.AUTO_FOCUS);
config.setSemanticMode(ARWorldTrackingConfig.SEMANTIC_PLANE);
mArSession.configure(config);
mWorldRenderManager.setArSession(mArSession);
} catch (Exception capturedException) {
Log.e(TAG,capturedException.toString());
exception = capturedException;
setMessageWhenError(capturedException);
}
if (message != null) {
stopArSession(exception);
return;
}
}
try {
mArSession.resume();
} catch (ARCameraNotAvailableException e) {
Toast.makeText(this, "Camera open failed, please restart the app", Toast.LENGTH_LONG).show();
mArSession = null;
return;
}
mDisplayRotationManager.registerDisplayListener();
mSurfaceView.onResume();
}
@Override
protected void onPause() {
Log.i(TAG, "onPause start.");
super.onPause();
if (mArSession != null) {
mDisplayRotationManager.unregisterDisplayListener();
mSurfaceView.onPause();
mArSession.pause();
}
Log.i(TAG, "onPause end.");
}
@Override
protected void onDestroy() {
Log.i(TAG, "onDestroy start.");
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
super.onDestroy();
Log.i(TAG, "onDestroy end.");
}
private boolean arEngineAbilityCheck() {
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk && isRemindInstall) {
Toast.makeText(this, "Please agree to install.", Toast.LENGTH_LONG).show();
finish();
}
Log.d(TAG, "Is Install AR Engine Apk: " + isInstallArEngineApk);
if (!isInstallArEngineApk) {
startActivity(new Intent(this, ConnectAppMarketActivity.class));
isRemindInstall = true;
}
return AREnginesApk.isAREngineApkReady(this);
}
private void setMessageWhenError(Exception catchException) {
if (catchException instanceof ARUnavailableServiceNotInstalledException) {
startActivity(new Intent(getApplicationContext(), ConnectAppMarketActivity.class));
} else if (catchException instanceof ARUnavailableServiceApkTooOldException) {
message = "Please update HuaweiARService.apk";
} else if (catchException instanceof ARUnavailableClientSdkTooOldException) {
message = "Please update this app";
} else if (catchException instanceof ARUnSupportedConfigurationException) {
message = "The configuration is not supported by the device!";
} else {
message = "exception throw";
}
}
private void stopArSession(Exception exception) {
Log.i(TAG, "stopArSession start.");
Toast.makeText(this, message, Toast.LENGTH_LONG).show();
Log.e(TAG, "Creating session error", exception);
if (mArSession != null) {
mArSession.stop();
mArSession = null;
}
Log.i(TAG, "stopArSession end.");
}
}
Conclusion
We can detect surfaces for multiple purposes in a simple way.
Documentation:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/introduction-0000001050130900
Codelab:
https://developer.huawei.com/consumer/en/codelab/HWAREngine/index.html#0
Code Sample:
https://github.com/spartdark/hms-arengine-myarapplication

HMS Video Kit For Movie Promotion Application

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction:
HUAWEI Video Kit provides an excellent playback experience with video streaming from a third-party cloud platform. It supports streaming media in 3GP, MP4, or TS format and complies with HTTP/HTTPS, HLS, or DASH.
Advantage of Video Kit:
Provides an excellent video experience with no lag, no delay, and high definition.
Provides complete and rich playback control interfaces.
Provides a rich video operation experience.
Prerequisites:
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
Integration:
1. Create a project in Android Studio and in Huawei AppGallery Connect (AGC).
2. Provide the SHA-256 key in the App information section.
3. Download the agconnect-services.json file from AGC and save it into the app directory.
4. In root build.gradle
Navigate to allprojects > repositories and buildscript > repositories and add the given line.
Code:
maven { url 'https://developer.huawei.com/repo/' }
In the dependencies block, add the classpath.
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
5. In app build.gradle
Configure the Maven dependency
Code:
implementation "com.huawei.hms:videokit-player:1.0.1.300"
Configure the NDK
Code:
android {
defaultConfig {
......
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
......
}
Apply plugin
Code:
apply plugin: 'com.huawei.agconnect'
6. Permissions in Manifest
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Code Implementation:
A movie promo application has been created to demonstrate HMS Video Kit. The application uses the RecyclerView, CardView, and Picasso libraries apart from the HMS Video Kit library. Let us go into the details of the HMS Video Kit code integration.
1. Initializing WisePlayer
We have to implement a class that extends Application, and its onCreate() method has to call the initialization API WisePlayerFactory.initFactory().
Code:
public class VideoKitPlayApplication extends Application {
private static final String TAG = VideoKitPlayApplication.class.getSimpleName();
private static WisePlayerFactory wisePlayerFactory = null;
@Override
public void onCreate() {
super.onCreate();
initPlayer();
}
private void initPlayer() {
// DeviceId test is used in the demo.
WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}
/**
* Player initialization callback
*/
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
@Override
public void onSuccess(WisePlayerFactory wisePlayerFactory) {
LogUtil.i(TAG, "init player factory success");
setWisePlayerFactory(wisePlayerFactory);
}
@Override
public void onFailure(int errorCode, String reason) {
LogUtil.w(TAG, "init player factory failed :" + reason + ", errorCode is " + errorCode);
}
};
/**
* Get WisePlayer Factory
*
* @return WisePlayer Factory
*/
public static WisePlayerFactory getWisePlayerFactory() {
return wisePlayerFactory;
}
private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
}
}
2. Creating an instance of WisePlayer
Code:
wisePlayer = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
3. Initialize the WisePlayer layout and add layout listeners
Code:
private void initView(View view) {
if (view != null) {
surfaceView = (SurfaceView) view.findViewById(R.id.surface_view);
textureView = (TextureView) view.findViewById(R.id.texture_view);
if (PlayControlUtil.isSurfaceView()) {
//SurfaceView display interface
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
textureView.setVisibility(View.GONE);
surfaceView.setVisibility(View.VISIBLE);
} else {
//TextureView display interface
textureView.setSurfaceTextureListener(this);
textureView.setVisibility(View.VISIBLE);
surfaceView.setVisibility(View.GONE);
}
}
}
4. Register WisePlayer listeners
Code:
private void setPlayListener() {
if (wisePlayer != null) {
wisePlayer.setErrorListener(onWisePlayerListener);
wisePlayer.setEventListener(onWisePlayerListener);
wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
wisePlayer.setReadyListener(onWisePlayerListener);
wisePlayer.setLoadingListener(onWisePlayerListener);
wisePlayer.setPlayEndListener(onWisePlayerListener);
wisePlayer.setSeekEndListener(onWisePlayerListener);
}
}
5. Set playback parameters
Code:
// Set the playback mode (normal on-demand playback, not live streaming).
player.setVideoType(PlayMode.PLAY_MODE_NORMAL);
// Start playback from the 10,000 ms (10 s) bookmark.
player.setBookmark(10000);
// Loop the video when playback reaches the end.
player.setCycleMode(CycleMode.MODE_CYCLE);
6. Set URL for video
Code:
wisePlayer.setPlayUrl(new String[] {currentPlayData.getUrl()});
7. Set a view to display the video.
Code:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
wisePlayer.setView(surfaceView);
}
// TextureView listener callback
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
wisePlayer.setView(textureView);
// Call the resume API to bring WisePlayer to the foreground.
wisePlayer.resume(ResumeType.KEEP);
}
8. Prepare for the playback and start requesting data.
Code:
wisePlayer.ready();
9. Start the playback upon a success response of the onReady callback method
Code:
@Override
public void onReady(WisePlayer wisePlayer) {
player.start();
}
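When the user leaves the playback screen, the player should be paused and its resources released. Below is a minimal sketch of the lifecycle handling, assuming the playback code lives in an activity that holds the wisePlayer field; suspend() and release() come from the WisePlayer API, so verify the exact calls against the Video Kit documentation for your SDK version.
Code:
@Override
protected void onPause() {
    super.onPause();
    if (wisePlayer != null) {
        // Suspend playback when the screen goes to the background.
        wisePlayer.suspend();
    }
}
@Override
protected void onDestroy() {
    super.onDestroy();
    if (wisePlayer != null) {
        // Release player resources when the screen is destroyed.
        wisePlayer.release();
        wisePlayer = null;
    }
}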
10. select_play_movie.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
android:tag="cards main container">
<androidx.cardview.widget.CardView
android:id="@+id/card_view"
xmlns:card_view="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="wrap_content"
card_view:cardBackgroundColor="@color/colorbg"
card_view:cardCornerRadius="10dp"
card_view:cardElevation="5dp"
card_view:cardUseCompatPadding="true">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center"
>
<ImageView
android:id="@+id/videoIcon"
android:tag="image_tag"
android:layout_width="0dp"
android:layout_height="100dp"
android:layout_margin="5dp"
android:layout_weight="1"
android:src="@drawable/j1"/>
<LinearLayout
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginTop="12dp"
android:layout_weight="2"
android:orientation="vertical"
>
<TextView
android:id="@+id/play_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_marginTop="10dp"
android:text="Jurassic Park"
android:textColor="@color/colorTitle"
android:textAppearance="?android:attr/textAppearanceLarge"/>
<TextView
android:id="@+id/releasedYear"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_marginTop="10dp"
android:textColor="@color/black"
android:textAppearance="?android:attr/textAppearanceMedium"/>
<TextView
android:id="@+id/briefStory"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_margin="10dp"
android:textColor="@color/green"
android:textAppearance="?android:attr/textAppearanceSmall"/>
<TextView
android:id="@+id/play_type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="0"
android:textColor="@color/select_play_text_color"
android:textSize="20sp"
android:visibility="gone"/>
<TextView
android:id="@+id/play_url"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:ellipsize="end"
android:marqueeRepeatLimit="marquee_forever"
android:maxLines="2"
android:paddingTop="5dip"
android:singleLine="false"
android:textColor="@color/select_play_text_color"
android:textSize="14sp"
android:visibility="gone"/>
</LinearLayout>
</LinearLayout>
</androidx.cardview.widget.CardView>
</LinearLayout>
11. SelectMoviePlayAdapter.java
Code:
/**
* Play recyclerView adapter
*/
public class SelectMoviePlayAdapter extends RecyclerView.Adapter<SelectMoviePlayAdapter.PlayViewHolder> {
private static final String TAG = "SelectMoviePlayAdapter ";
// Data sources list
private List<MovieEntity> playList;
// Context
private Context context;
// Click item listener
private OnItemClickListener onItemClickListener;
/**
* Constructor
*
* @param context Context
* @param onItemClickListener Listener
*/
public SelectMoviePlayAdapter(Context context, OnItemClickListener onItemClickListener) {
this.context = context;
playList = new ArrayList<>();
this.onItemClickListener = onItemClickListener;
}
/**
* Set list data
*
* @param playList Play data
*/
public void setSelectPlayList(List<MovieEntity> playList) {
if (this.playList.size() > 0) {
this.playList.clear();
}
this.playList.addAll(playList);
notifyDataSetChanged();
}
@NonNull
@Override
public PlayViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(context).inflate(R.layout.select_play_movie, parent, false);
return new PlayViewHolder(view);
}
@Override
public void onBindViewHolder(PlayViewHolder holder, final int position) {
if (playList.size() > position && holder != null) {
MovieEntity movieEntity = playList.get(position);
if (movieEntity == null) {
LogUtil.i(TAG, "current item data is empty.");
return;
}
holder.playName.setText(movieEntity.getName());
holder.releasedYear.setText(movieEntity.getYear());
holder.briefStory.setText(movieEntity.getStory());
holder.playUrl.setText(movieEntity.getUrl());
holder.playType.setText(String.valueOf(movieEntity.getUrlType()));
holder.itemView.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
onItemClickListener.onItemClick(position);
}
});
Picasso.with(context).load(movieEntity.getIcon()).into(holder.videoIcon);
}
}
@Override
public int getItemCount() {
return playList.size();
}
/**
* Show view holder
*/
static class PlayViewHolder extends RecyclerView.ViewHolder {
private TextView playName;
private TextView releasedYear,briefStory;
private TextView playType;
private TextView playUrl;
private ImageView videoIcon;
/**
* Constructor
*
* @param itemView Item view
*/
public PlayViewHolder(View itemView) {
super(itemView);
if (itemView != null) {
playName = itemView.findViewById(R.id.play_name);
releasedYear = itemView.findViewById(R.id.releasedYear);
briefStory = itemView.findViewById(R.id.briefStory);
playType = itemView.findViewById(R.id.play_type);
playUrl = itemView.findViewById(R.id.play_url);
videoIcon = itemView.findViewById(R.id.videoIcon);
}
}
}
}
Screenshots:
Conclusion:
Video Kit provides an excellent video playback experience. In the future, it will support video editing and video hosting, so that users can easily and quickly enjoy an end-to-end video solution for all scenarios.
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/introduction-0000001050439577-V5

Learn Form Recognition using ML kit | JAVA

Introduction
Huawei ML Kit allows us to improve our lives in countless ways, streamlining complex interactions with better solutions such as text detection, face detection, product visual search, voice detection, image-related services, and more.
The form recognition service can recognize the information in a form and return the table content, such as the table count, rows, columns, cell coordinates, text info, and so on.
Use case
This service can help you on a daily basis; for example, after collecting a large amount of data, we can use this service to recognize the content and convert it into electronic documents.
Suggestions
1. Forms such as questionnaires can be recognized.
2. Currently, images containing multiple forms cannot be recognized.
3. Shooting Angle: The horizontal tilt angle must be less than 5 degrees.
4. Form Integrity: No missing corners and no bent or segmented lines.
5. Form Content: Only printed content can be recognized; images, handwritten content, seals, and watermarks in the form cannot be recognized.
6. Image Specification: The image aspect ratio should be less than or equal to 3:1, and the resolution must be greater than 960 x 960 px.
ML Kit Configuration.
1. Log in to AppGallery Connect and select MlKitSample in the My Projects list.
2. Enable ML Kit: choose My Projects > Project settings > Manage APIs.
Development Process
Create Application in Android Studio.
App level gradle dependencies.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Gradle dependencies
Code:
implementation 'com.huawei.hms:ml-computer-vision-formrecognition:2.0.4.300'
implementation 'com.huawei.hms:ml-computer-vision-formrecognition-model:2.0.4.300'
Root level gradle dependencies
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Add the below permissions in Android Manifest file
Code:
<manifest xlmns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<application ...
</manifest>
Add the following metadata to the manifest file so that the machine learning model is automatically installed.
Code:
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="fr" />
1. Create an instance of MLFormRecognitionAnalyzerSetting in onCreate.
Code:
MLFormRecognitionAnalyzerSetting setting = new MLFormRecognitionAnalyzerSetting.Factory().create();
2. Create an instance of MLFormRecognitionAnalyzer in onCreate.
Code:
MLFormRecognitionAnalyzer analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer(setting);
3. Check Runtime permissions.
4. FormRecogActivity is responsible for loading forms from local storage and capturing live forms using the form recognition service, from which we can extract the content of the form cells.
Code:
public class FormRecogActivity extends AppCompatActivity {
private MLFormRecognitionAnalyzerSetting setting;
private MLFormRecognitionAnalyzer analyzer;
private ImageView mImageView;
private TextView text, textTotal;
private MLFrame mlFrame;
private Uri imageUri;
private Bitmap bitmap;
private int camRequestCode = 100;
private int storageRequestCode = 200;
private int sum = 0;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_form_recog);
mImageView = findViewById(R.id.image);
text = findViewById(R.id.text);
textTotal = findViewById(R.id.text_total);
setting = new MLFormRecognitionAnalyzerSetting.Factory().create();
analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer(setting);
}
public void onLoadImage(View view) {
Intent intent = new Intent(Intent.ACTION_PICK, MediaStore.Images.Media.EXTERNAL_CONTENT_URI);
startActivityForResult(intent, storageRequestCode);
}
public void onClikCam(View view) {
if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
requestPermissions(new String[]{Manifest.permission.CAMERA}, camRequestCode);
} else {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, camRequestCode);
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (requestCode == camRequestCode) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(intent, camRequestCode);
} else {
Toast.makeText(this, "Camera permission denied", Toast.LENGTH_LONG).show();
}
}
if (requestCode == storageRequestCode) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
Intent intent = new Intent(
Intent.ACTION_PICK,
MediaStore.Images.Media.EXTERNAL_CONTENT_URI
);
startActivityForResult(intent, storageRequestCode);
} else {
Toast.makeText(this, "Storage permission denied", Toast.LENGTH_LONG).show();
}
}
}
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK && requestCode == storageRequestCode) {
imageUri = data.getData();
try {
bitmap = MediaStore.Images.Media.getBitmap(this.getContentResolver(), imageUri);
mImageView.setImageBitmap(bitmap);
callFormService();
} catch (IOException e) {
e.printStackTrace();
}
} else if (resultCode == RESULT_OK && requestCode == camRequestCode) {
bitmap = (Bitmap) data.getExtras().get("data");
mImageView.setImageBitmap(bitmap);
callFormService();
}
}
private void callFormService() {
mlFrame = MLFrame.fromBitmap(bitmap);
analyzer = MLFormRecognitionAnalyzerFactory.getInstance().getFormRecognitionAnalyzer();
Task<JsonObject> task = analyzer.asyncAnalyseFrame(mlFrame);
task.addOnSuccessListener(new OnSuccessListener<JsonObject>() {
@Override
public void onSuccess(JsonObject jsonObject) {
if (jsonObject != null && jsonObject.get("retCode").getAsInt() == MLFormRecognitionConstant.SUCCESS) {
Gson gson = new Gson();
String result = jsonObject.toString();
MLFormRecognitionTablesAttribute mlObject = gson.fromJson(result, MLFormRecognitionTablesAttribute.class);
ArrayList<MLFormRecognitionTablesAttribute.TablesContent.TableAttribute> tableAttributeArrayList = mlObject.getTablesContent().getTableAttributes();
ArrayList<MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute> tableCellAttributes = tableAttributeArrayList.get(0).getTableCellAttributes();
for (MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute attribute : tableCellAttributes) {
String info = attribute.textInfo;
text.setText(text.getText().toString() + "\n" + info);
}
Toast.makeText(FormRecogActivity.this, "Successfully Form Recognized", Toast.LENGTH_LONG).show();
Log.d("TAG", "result: " + result);
} else if (jsonObject != null && jsonObject.get("retCode").getAsInt() == MLFormRecognitionConstant.FAILED) {
Toast.makeText(FormRecogActivity.this, "Form Recognition Conversion Failed", Toast.LENGTH_LONG).show();
}
textTotal.setText("Total Cart Value : "+ sum +" Rs ");
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Toast.makeText(FormRecogActivity.this, "Form Recognition API Failed", Toast.LENGTH_LONG).show();
}
});
}
}
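Note that in the activity above, sum is displayed in textTotal but never updated. If the recognized form contains numeric cells (for example, prices), one possible approach is to parse them while iterating over the cells. The helper below is a hypothetical sketch, not part of the original sample; it assumes the relevant cells hold plain integer text:
Code:
// Hypothetical helper: add up the numeric cells of the recognized table.
private int accumulateNumericCells(
        ArrayList<MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute> cells) {
    int total = 0;
    for (MLFormRecognitionTablesAttribute.TablesContent.TableAttribute.TableCellAttribute cell : cells) {
        try {
            total += Integer.parseInt(cell.textInfo.trim());
        } catch (NumberFormatException ignored) {
            // The cell does not contain a plain integer; skip it.
        }
    }
    return total;
}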
5. This XML layout creates the UI.
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#2196F3"
android:orientation="vertical">
<ImageView
android:id="@+id/image"
android:layout_width="match_parent"
android:layout_height="250dp"
android:layout_marginTop="?actionBarSize" />
<Button
android:id="@+id/btn_storage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_margin="20dp"
android:onClick="onLoadImage"
android:background="#F44336"
android:textColor="@color/upsdk_white"
android:text="Load Image from storage" />
<Button
android:id="@+id/btn_capture"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:layout_marginRight="20dp"
android:onClick="onClikCam"
android:background="#F44336"
android:textColor="@color/upsdk_white"
android:text="Capture Image" />
<TextView
android:id="@+id/text_total"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:textSize="20sp"
android:textColor="@color/upsdk_white"
android:textStyle="bold" />
<ScrollView
android:layout_marginTop="8dp"
android:layout_width="match_parent"
android:layout_height="wrap_content">
<TextView
android:id="@+id/text"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginLeft="20dp"
android:textSize="20sp"
android:textColor="@color/upsdk_white"
android:textStyle="bold" />
</ScrollView>
</LinearLayout>
Result
Tips & Tricks
1. Enable ML service API in AppGallery Connect.
2. Capture a single form only; the form service does not support multiple forms.
3. The resolution must be greater than 960 x 960 px.
4. Currently, this service does not support handwritten text information.
Conclusion
This article helps you get the text info from a form; it extracts the data of individual cells along with their coordinates. This service can help you on a daily basis.
Thank you for reading and if you have enjoyed this article I would suggest you implement this and provide your experience.
Reference
ML Kit – Form Recognition
Refer to the URL

Example of Integrating Native Ads in between RecyclerView items - ADS KIT

Introduction
Huawei offers a range of ad formats so you can choose whichever suits your app best. Currently, you can integrate Banner, Native, Rewarded, Interstitial, Splash, and Roll ads, and even more formats will be launched in the future.
Use the HUAWEI Ads SDK to quickly integrate HUAWEI Ads into your app.
Native Ads
Native ads fit seamlessly into the surrounding content to match your app design. Such ads can be customized as needed.
In this article, we will learn how to integrate native image and video ads in between RecyclerView items.
Check the below activity_main.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:id="@+id/rl_root"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/black">
<RelativeLayout
android:id="@+id/rl_tool"
android:layout_width="match_parent"
android:layout_height="?attr/actionBarSize">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:text="Music Player"
android:textColor="#fff"
android:textSize="16sp" />
</RelativeLayout>
<TextView
android:id="@+id/txt_category"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/rl_tool"
android:layout_marginTop="10dp"
android:layout_marginBottom="10dp"
android:text="Playlist"
android:textColor="#fff"
android:textSize="16sp" />
<com.yarolegovich.discretescrollview.DiscreteScrollView
android:id="@+id/discrete_scroll_view"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@id/hw_banner_view"
android:layout_below="@id/txt_category"
app:dsv_orientation="vertical" />
<ProgressBar
android:id="@+id/progress_bar"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:indeterminate="false"
android:indeterminateDrawable="@drawable/circular_progress"
android:visibility="invisible" />
<com.huawei.hms.ads.banner.BannerView
android:id="@+id/hw_banner_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true" />
</RelativeLayout>
After that, check the MainActivity below to see how to get the API response data using the Volley library:
Code:
public class MainActivity extends AppCompatActivity {
private DiscreteScrollView discreteScrollView;
public static String BASE_URL = "https://beatsapi.media.jio.com/v2_1/beats-api/jio/src/response/home/";
private String BASE_IMAGE_URL;
private ArrayList<PlaylistModel> playlist = new ArrayList();
private boolean isConnected = false;
private HomeAdapterNew homeAdapter;
private SharedPreferences preferences;
private SharedPreferences.Editor editor;
private ProgressBar progressBar;
private BannerView bannerView;
private RelativeLayout rlRoot;
@Override
protected void onCreate(@Nullable Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
init();
addBannerAd();
initHomeAdapter();
if (NetworkUtil.isNetworkConnected(this))
callHomeResponseApiVolley();
else {
String homeResponse = preferences.getString("HomeResponse", "");
filterResponse(homeResponse);
}
}
private void init() {
preferences = this.getSharedPreferences("MyPref", Context.MODE_PRIVATE);
discreteScrollView = findViewById(R.id.discrete_scroll_view);
progressBar = findViewById(R.id.progress_bar);
bannerView = findViewById(R.id.hw_banner_view);
rlRoot = findViewById(R.id.rl_root);
}
private void initHomeAdapter() {
homeAdapter = new HomeAdapterNew(this);
discreteScrollView.setAdapter(homeAdapter);
discreteScrollView.setSlideOnFling(true);
discreteScrollView.setItemTransformer(new ScaleTransformer.Builder()
.setMaxScale(1.05f)
.setMinScale(0.8f)
.setPivotX(Pivot.X.CENTER)
.setPivotY(Pivot.Y.CENTER)
.build());
discreteScrollView.addScrollStateChangeListener(new DiscreteScrollView.ScrollStateChangeListener<RecyclerView.ViewHolder>() {
@Override
public void onScrollStart(@NonNull RecyclerView.ViewHolder viewHolder, int i) {
}
@Override
public void onScrollEnd(@NonNull RecyclerView.ViewHolder viewHolder, int adapterPosition) {
}
@Override
public void onScroll(float v, int i, int i1, @Nullable RecyclerView.ViewHolder viewHolder, @Nullable RecyclerView.ViewHolder t1) {
}
});
}
private void addBannerAd() {
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_360_57);
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
}
private void callHomeResponseApiVolley() {
progressBar.setVisibility(View.VISIBLE);
StringRequest request = new StringRequest(Request.Method.GET, BASE_URL + "Telugu", new com.android.volley.Response.Listener<String>() {
@Override
public void onResponse(String response) {
progressBar.setVisibility(View.INVISIBLE);
if (response != null) {
editor = preferences.edit();
editor.putString("HomeResponse", response);
editor.apply();
filterResponse(response);
}
}
}, new com.android.volley.Response.ErrorListener() {
@Override
public void onErrorResponse(VolleyError error) {
progressBar.setVisibility(View.INVISIBLE);
}
});
RequestQueue requestQueue = Volley.newRequestQueue(this);
requestQueue.add(request);
}
void filterResponse(String response) {
try {
JSONObject object = new JSONObject(response);
JSONObject resultObject = object.getJSONObject("result");
JSONArray jsonDataArray = resultObject.getJSONArray("data");
BASE_IMAGE_URL = resultObject.getString("imageurl");
Log.e("BASE_URL_IMAGE", BASE_IMAGE_URL);
for (int i = 0; i < jsonDataArray.length(); i++) {
String type = jsonDataArray.getJSONObject(i).getString("type");
JSONArray songsListArray = jsonDataArray.getJSONObject(i).getJSONArray("list");
if (type.equalsIgnoreCase("playlist")) {
getSongsFromArray(songsListArray);
}
}
} catch (JSONException e) {
e.printStackTrace();
}
}
private void getSongsFromArray(JSONArray songsListArray) throws JSONException {
for (int i = 0; i < songsListArray.length(); i++) {
JSONObject jsonObject = songsListArray.getJSONObject(i);
String title = jsonObject.getString("title");
String imageUrl = jsonObject.getString("image");
Log.e("ImageUrl", imageUrl);
String playlistID = jsonObject.getString("playlistid");
playlist.add(new PlaylistModel(title, playlistID, BASE_IMAGE_URL + imageUrl));
}
ArrayList<AdsListModel> adsList = new ArrayList<>();
adsList.add(new AdsListModel(getString(R.string.ad_id_native_image)));
adsList.add(new AdsListModel(getString(R.string.ad_id_native_video)));
homeAdapter.addList(playlist, adsList);
}
}
Banner Ads
Banner ads are rectangular images that occupy a spot at the top, middle, or bottom within an app layout. Banner ads refresh automatically at regular intervals. When a user clicks a banner ad, the user is usually redirected to the advertiser page.
Here, in the addBannerAd method, bannerView is taken from the XML layout. We can also integrate it programmatically; check the code below:
Code:
BannerView bannerView = new BannerView(this);
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_SMART);
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
RelativeLayout.LayoutParams rLParams = new RelativeLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT);
rLParams.addRule(RelativeLayout.ALIGN_PARENT_BOTTOM, 1);
rlRoot.addView(bannerView, rLParams);
Standard Banner Ad Sizes
The standard banner ad sizes are listed in the HUAWEI Ads Kit documentation.
NOTE:
In the Chinese mainland, only BANNER_SIZE_360_57 and BANNER_SIZE_360_144 are supported.
In the getSongsFromArray method above, we pass both the playlist data and the ads data to the RecyclerView adapter with the help of the addList method in HomeAdapter (see the sketch below).
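The adapter's addList method is not shown in the original post. The snippet below is a hypothetical sketch of how it might merge the two lists so that ad items appear in between playlist items; HomeAdapterNew's internal list, the addList signature, and the interleaving interval are assumptions, not code from the original sample.
Code:
// Inside HomeAdapterNew (hypothetical sketch): keep one mixed list of BaseListModel items.
private final List<BaseListModel> items = new ArrayList<>();
public void addList(List<PlaylistModel> playlist, List<AdsListModel> adsList) {
    items.clear();
    int adIndex = 0;
    for (int i = 0; i < playlist.size(); i++) {
        items.add(playlist.get(i));
        // Insert one ad item after every three playlist items, while ads are available.
        if ((i + 1) % 3 == 0 && adIndex < adsList.size()) {
            items.add(adsList.get(adIndex++));
        }
    }
    notifyDataSetChanged();
}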
Check below for Native Ad slot id
Code:
<string name="ad_id_native_image">testu7m3hc4gvm</string>
<string name="ad_id_native_video">testy63txaom86</string>
Check the below PlaylistModel class for playlist data
Code:
public class PlaylistModel extends BaseListModel implements Serializable {
String playlistTitle;
String playlistID;
String playlistImage;
public PlaylistModel(String title, String playlistID, String playlistImage) {
this.playlistID = playlistID;
this.playlistImage = playlistImage;
this.playlistTitle = title;
}
public String getPlaylistID() {
return playlistID;
}
public String getPlaylistImage() {
return playlistImage;
}
public String getPlaylistTitle() {
return playlistTitle;
}
}
Check the below code for AdsListModel
Code:
public class AdsListModel extends BaseListModel {
private String adID;
AdsListModel(String ID) {
this.adID = ID;
}
public String getAdID() {
return adID;
}
}
Check the below code for BaseListModel
Code:
public class BaseListModel {
}
After that, to show both content items and ad items, we need to create two different layouts: one for the actual content and another for the ad. In the adapter, the two item types can be distinguished by overriding getItemViewType, as in the sketch below.
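A minimal hypothetical sketch of the view-type handling in the adapter; the view-type constants and the items list follow the addList sketch above and are not taken from the original sample:
Code:
private static final int TYPE_CONTENT = 0;
private static final int TYPE_AD = 1;
@Override
public int getItemViewType(int position) {
    // Ad items were inserted into the mixed list as AdsListModel instances.
    return (items.get(position) instanceof AdsListModel) ? TYPE_AD : TYPE_CONTENT;
}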
Check the below inflate_home_item.xml for showing actual content
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.cardview.widget.CardView xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_width="300dp"
android:layout_height="350dp"
android:layout_margin="10dp"
app:cardBackgroundColor="#000"
app:cardCornerRadius="5dp"
app:cardElevation="5dp">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<TextView
android:id="@+id/txt_playlist_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:layout_margin="20dp"
android:singleLine="true"
android:text="Prabhas Playlist"
android:textColor="#fff"
android:textSize="16sp" />
<ImageView
android:id="@+id/img_home"
android:layout_width="250dp"
android:layout_height="wrap_content"
android:layout_below="@id/txt_playlist_name"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:scaleType="fitXY"
android:transitionName="PlaylistImage" />
</RelativeLayout>
</androidx.cardview.widget.CardView>
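The native ad loading code for the ad items is not included in this post (it is in the linked forum topic below). As a rough outline, a native ad for a given slot ID can be loaded with NativeAdLoader and bound to a NativeView inside the ad ViewHolder. Treat the snippet below as a sketch based on the Ads Kit documentation rather than the exact code from the sample; the view IDs ad_title and ad_media are assumed to come from a native ad item layout built around com.huawei.hms.ads.nativead.NativeView.
Code:
// Hypothetical helper used when binding an ad ViewHolder.
private void loadNativeAd(Context context, String adId, final NativeView nativeView) {
    NativeAdLoader.Builder builder = new NativeAdLoader.Builder(context, adId);
    builder.setNativeAdLoadedListener(new NativeAd.NativeAdLoadedListener() {
        @Override
        public void onNativeAdLoaded(NativeAd nativeAd) {
            // Bind the loaded ad to the NativeView declared in the ad item layout.
            nativeView.setTitleView(nativeView.findViewById(R.id.ad_title));
            ((TextView) nativeView.getTitleView()).setText(nativeAd.getTitle());
            nativeView.setMediaView((MediaView) nativeView.findViewById(R.id.ad_media));
            nativeView.getMediaView().setMediaContent(nativeAd.getMediaContent());
            // Registering the ad with the view is required for impression and click tracking.
            nativeView.setNativeAd(nativeAd);
        }
    }).setAdListener(new AdListener() {
        @Override
        public void onAdFailed(int errorCode) {
            // Handle the failure, for example by hiding the ad card.
        }
    });
    builder.build().loadAd(new AdParam.Builder().build());
}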
For more details, you can check https://forums.developer.huawei.com/forumPortal/en/topic/0204417759943370014

Intermediate: How to Integrate Rest APIs using Huawei Network Kit in Android

Introduction
In this article, we will learn how to implement Huawei Network Kit in Android. Network Kit is a basic network service suite that provides scenario-based REST APIs as well as file upload and download capabilities. Network Kit offers easy-to-use device-cloud transmission channels featuring low latency and high security.
About Huawei Network kit
Huawei Network Kit is a service that allows us to perform network operations quickly and safely. It provides a powerful way of interacting with REST APIs and sending synchronous and asynchronous network requests with annotated parameters. It also allows us to quickly and easily upload or download files, with additional features such as multitasking and multithreading. With Huawei Network Kit, we can improve the network connection whenever we access a URL.
Supported Devices
Huawei Network Kit is not available on all devices, so we first need to check whether the device is supported.
Requirements
1. Any operating system (e.g., macOS, Linux, or Windows).
2. Any IDE with the Android SDK installed (e.g., IntelliJ IDEA or Android Studio).
3. Minimum API level 19 is required.
4. EMUI 3.0 or later is required on the device.
Code Integration
Create an application in Android Studio.
Apply the below plugins in the app-level build.gradle file.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add the below dependencies in the app-level build.gradle file.
Code:
implementation "com.huawei.hms:network-embedded:5.0.1.301"
implementation "androidx.multidex:multidex:2.0.1"
implementation 'com.google.code.gson:gson:2.8.6'
implementation 'androidx.recyclerview:recyclerview:1.2.0'
implementation 'androidx.cardview:cardview:1.0.0'
implementation 'com.github.bumptech.glide:glide:4.11.0'
annotationProcessor 'com.github.bumptech.glide:compiler:4.9.0'
Add the below Maven repository and classpath in the project-level build.gradle file.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Add the below permissions in the Android manifest file.
Code:
<manifest xmlns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<application ...
</manifest>
First, we need to initialize the HMS Network Kit in a custom Application class and check the initialization status in its callback. Remember to register this class in the manifest using the android:name attribute of the application tag.
Code:
public class HWApplication extends Application {
@Override
protected void attachBaseContext(Context base) {
super.attachBaseContext(base);
MultiDex.install(this);
}
@Override
public void onCreate() {
super.onCreate();
NetworkKit.init(getApplicationContext(), new NetworkKit.Callback() {
@Override
public void onResult(boolean status) {
if (status) {
Toast.makeText(getApplicationContext(), "Network kit successfully initialized", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(getApplicationContext(), "Network kit initialization failed", Toast.LENGTH_SHORT).show();
}
}
});
}
}
Now we need to create the ApiClient class; here we declare the RestClient object.
Code:
public class ApiClient {
private static final String BASE_URL = "https://newsapi.org/v2/";
private static final int CONNECT_TIMEOUT = 10000;
private static final int WRITE_TIMEOUT = 1000;
private static final int TIMEOUT = 10000;
public static RestClient restClient;
public static RestClient getRestClient() {
if (restClient == null) {
restClient = new RestClient
.Builder()
.baseUrl(BASE_URL)
.httpClient(getHttpClient())
.build();
}
return restClient;
}
public static HttpClient getHttpClient() {
return new HttpClient.Builder()
.connectTimeout(CONNECT_TIMEOUT)
.readTimeout(TIMEOUT)
.writeTimeout(WRITE_TIMEOUT)
.enableQuic(false)
.build();
}
}
Next, create the ApiInterface that declares the REST endpoint with annotated parameters.
Code:
public interface ApiInterface {
@GET("top-headlines")
Submit<String> getMovies(@Query("country") String country, @Query("category") String category, @Query("apiKey") String apiKey);
}
In our MainActivity.java class, we create an instance of ApiInterface and use the RestClient object to send synchronous or asynchronous requests (a sketch of a synchronous call follows the class below).
Java:
public class MainActivity extends AppCompatActivity {
ApiInterface apiInterface;
private RecyclerView recyclerView;
private List<NewsInfo.Article> mArticleList;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
recyclerView = findViewById(R.id.recyclerView);
init();
}
private void init() {
recyclerView.setLayoutManager(new LinearLayoutManager(this));
recyclerView.smoothScrollToPosition(0);
}
@Override
protected void onResume() {
super.onResume();
loadData();
}
private void loadData() {
apiInterface = ApiClient.getRestClient().create(ApiInterface.class);
apiInterface.getMovies("us", "business", "e4d3e43d2c0b4e2bab6500ec6e469a94")
.enqueue(new Callback<String>() {
@Override
public void onResponse(Submit<String> submit, Response<String> response) {
runOnUiThread(() -> {
Gson gson = new Gson();
NewsInfo newsInfo = gson.fromJson(response.getBody(), NewsInfo.class);
mArticleList = newsInfo.articles;
recyclerView.setAdapter(new ContentAdapter(getApplicationContext(), mArticleList));
recyclerView.smoothScrollToPosition(0);
});
}
@Override
public void onFailure(Submit<String> submit, Throwable throwable) {
Log.i("TAG", "Api failure");
}
});
}
}
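The example above sends an asynchronous request with enqueue. If a synchronous call is needed, a rough sketch could look like the following; this assumes Submit exposes an execute() method analogous to enqueue, and it must run off the main thread. The API key placeholder is not a real key.
Java:
// Rough sketch of a synchronous request; must run on a background thread, not the UI thread.
new Thread(() -> {
    try {
        Response<String> response = apiInterface
                .getMovies("us", "business", "YOUR_API_KEY") // replace with your own API key
                .execute();
        Log.i("TAG", "Response body: " + response.getBody());
    } catch (Exception e) {
        Log.e("TAG", "Synchronous request failed", e);
    }
}).start();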
NewsInfo.java
Java:
public class NewsInfo {
@SerializedName("status")
public String status;
@SerializedName("totalResults")
public Integer totalResults;
@SerializedName("articles")
public List<Article> articles = null;
public class Article {
@SerializedName("source")
public Source source;
@SerializedName("author")
public String author;
@SerializedName("title")
public String title;
@SerializedName("description")
public String description;
@SerializedName("url")
public String url;
@SerializedName("urlToImage")
public String urlToImage;
@SerializedName("publishedAt")
public String publishedAt;
@SerializedName("content")
public String content;
public String getAuthor() {
return author;
}
public String getTitle() {
return title;
}
public class Source {
@SerializedName("id")
public Object id;
@SerializedName("name")
public String name;
public String getName() {
return name;
}
}
}
}
activity_main.xml
XML:
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<androidx.recyclerview.widget.RecyclerView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:id="@+id/recyclerView"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:backgroundTint="#f2f2f2"
tools:showIn="@layout/activity_main" />
</androidx.constraintlayout.widget.ConstraintLayout>
ContentAdapter.java
Code:
public class ContentAdapter extends RecyclerView.Adapter<ContentAdapter.ViewHolder> {
private List<NewsInfo.Article> newsInfos;
private Context context;
public ContentAdapter(Context applicationContext, List<NewsInfo.Article> newsInfoArrayList) {
this.context = applicationContext;
this.newsInfos = newsInfoArrayList;
}
@Override
public ContentAdapter.ViewHolder onCreateViewHolder(ViewGroup viewGroup, int i) {
View view = LayoutInflater.from(viewGroup.getContext()).inflate(R.layout.layout_adapter, viewGroup, false);
return new ViewHolder(view);
}
@Override
public void onBindViewHolder(ContentAdapter.ViewHolder viewHolder, int i) {
viewHolder.chanelName.setText(newsInfos.get(i).source.getName());
viewHolder.title.setText(newsInfos.get(i).getTitle());
viewHolder.author.setText(newsInfos.get(i).getAuthor());
Glide.with(context).load(newsInfos.get(i).urlToImage).into(viewHolder.imageView);
}
@Override
public int getItemCount() {
return newsInfos.size();
}
public class ViewHolder extends RecyclerView.ViewHolder {
private TextView chanelName, author, title;
private ImageView imageView;
public ViewHolder(View view) {
super(view);
chanelName = view.findViewById(R.id.chanelName);
author = view.findViewById(R.id.author);
title = view.findViewById(R.id.title);
imageView = view.findViewById(R.id.cover);
itemView.setOnClickListener(v -> {
});
}
}
}
layout_adapter.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.cardview.widget.CardView android:layout_width="match_parent"
android:layout_height="150dp"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:layout_marginTop="10dp"
android:layout_marginLeft="10dp"
android:layout_marginRight="10dp"
android:clickable="true"
android:focusable="true"
android:elevation="60dp"
android:foreground="?android:attr/selectableItemBackground"
xmlns:android="http://schemas.android.com/apk/res/android">
<RelativeLayout
android:layout_width="match_parent"
android:layout_height="match_parent">
<ImageView
android:id="@+id/cover"
android:layout_width="100dp"
android:layout_height="100dp"
android:layout_marginLeft="20dp"
android:layout_marginTop="10dp"
android:scaleType="fitXY" />
<TextView
android:id="@+id/chanelName"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toRightOf="@id/cover"
android:layout_marginLeft="20dp"
android:layout_marginTop="20dp"
android:textStyle="bold" />
<TextView
android:id="@+id/author"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toRightOf="@id/cover"
android:layout_marginLeft="20dp"
android:layout_marginTop="5dp"
android:layout_below="@id/chanelName" />
<TextView
android:id="@+id/title"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_toRightOf="@id/cover"
android:layout_marginLeft="20dp"
android:layout_marginTop="25dp"
android:layout_below="@id/chanelName" />
</RelativeLayout>
</androidx.cardview.widget.CardView>
Demo
Tips and Tricks
1. Add the latest Network Kit dependency.
2. Minimum SDK 19 is required.
3. Do not forget to add the Internet permission in the manifest file.
4. Before sending a request, check the internet connection (a minimal sketch is shown below).
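A minimal sketch of such a connectivity check, using the standard Android ConnectivityManager (this helper is not part of Network Kit):
Java:
// Helper to check for an active connection before sending a request.
private boolean isNetworkAvailable(Context context) {
    ConnectivityManager cm =
            (ConnectivityManager) context.getSystemService(Context.CONNECTIVITY_SERVICE);
    NetworkInfo activeNetwork = cm.getActiveNetworkInfo();
    return activeNetwork != null && activeNetwork.isConnected();
}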
Conclusion
That’s it!
This article has shown how to use Network Kit in your Android application by implementing a REST API. We can fetch the data using either the HttpClient object or the RestClient object.
Thanks for reading! If you enjoyed this story, please click the Like button and Follow. Feel free to leave a Comment below.
Reference
Network kit URL
Original Source
