For more information like this, you can visit the HUAWEI Developer Forum.
Introduction:
HUAWEI Audio Kit provides a set of audio capabilities based on the HMS Core ecosystem, including audio encoding and decoding capabilities at the hardware level and system bottom layer. It provides developers with convenient, efficient, and rich audio services.
Audio Kit provides developers with the ability to parse and play multiple audio formats such as m4a / aac / amr / flac / imy / wav / ogg / rtttl / mp3. It also provides a cache capability and supports customizing the size of the cache space.
Prerequisites:
Java JDK (JDK 1.7 is recommended.)
Android Studio 3.X
SDK Platform 19 or later
Gradle 4.6 or later
HMS Core (APK) 5.0.0.300 or later
Integration:
1. Create a project in Android Studio and Huawei AGC.
2. Provide the SHA-256 key in the App Information section.
3. Download the agconnect-services.json file from AGC and save it into the app directory.
4. In root build.gradle
Navigate to allprojects > repositories and buildscript > repositories and add the following line.
Code:
maven { url 'https://developer.huawei.com/repo/' }
5. In app build.gradle
Configure the Maven dependency
Code:
implementation 'com.huawei.hms:audiokit-player:1.0.0.302'
6. Permissions in Manifest
Code:
<!-- Need to access the network and obtain network status information-->
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Android 4.4: to operate the SD card, you need to apply for the following permissions -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_MEDIA_STORAGE" />
<!-- 9.0 Adaptation -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<!-- Prevent the CPU from sleeping while playing songs. -->
<uses-permission android:name="android.permission.WAKE_LOCK" />
Code Implementation:
Here is an audio application that uses HMS Audio Kit features. The application delivers an excellent audio experience by streaming audio from a third-party cloud platform. It lists song details such as name and duration, and the user can select a track from the list and listen to the music. The application uses the RecyclerView and CardView libraries in addition to the HMS Audio Kit library.
1. Create an audio management instance by calling HwAudioManager; manage audio playback by calling HwAudioPlayerManager; manage audio queues by calling HwAudioQueueManager; manage audio configurations by calling HwAudioConfigManager.
Code:
public void init(final Context context) {
Log.i(TAG, "init start");
new AsyncTask<Void, Void, Void>() {
@Override
protected Void doInBackground(Void... voids) {
HwAudioPlayerConfig hwAudioPlayerConfig = new HwAudioPlayerConfig(context);
HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig, new HwAudioConfigCallBack() {
@Override
public void onSuccess(HwAudioManager hwAudioManager) {
try {
Log.i(TAG, "createHwAudioManager onSuccess");
mHwAudioManager = hwAudioManager;
mHwAudioPlayerManager = hwAudioManager.getPlayerManager();
mHwAudioQueueManager = hwAudioManager.getQueueManager();
mHwAudioConfigManager = hwAudioManager.getConfigManager();
} catch (Exception e) {
Log.e(TAG, "init fail", e);
}
}
@Override
public void onError(int errorCode) {
Log.e(TAG, "init error:" + errorCode);
}
});
return null;
}
}.execute();
}
2. Create a playlist. Here we use an ArrayList.
Code:
public List<HwAudioPlayItem> getAudioList() {
List<HwAudioPlayItem> playItemList = new ArrayList<>();
for (int i=0;i<MyData.nameArray.length;i++) {
HwAudioPlayItem audioPlayItem = new HwAudioPlayItem();
audioPlayItem.setAudioId(String.valueOf(i + 1));
audioPlayItem.setSinger("K.J Yesudas");
audioPlayItem.setOnlinePath(MyData.AudiorlArray[i]);
audioPlayItem.setOnline(1);
audioPlayItem.setAudioTitle(MyData.nameArray[i]);
playItemList.add(audioPlayItem);
}
return playItemList;
}
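Note: before using the playback controls in the following steps, the playlist needs to be loaded into the player. Below is a minimal sketch, assuming the mHwAudioPlayerManager instance from step 1 and the getAudioList() helper above; the integer arguments (start index and start position) follow the pattern used in the Audio Kit samples.
Code:
if (mHwAudioPlayerManager != null) {
    // Load the playlist and start playback from the first track.
    mHwAudioPlayerManager.playList(getAudioList(), 0, 0);
}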
3. Play Audio
Code:
mHwAudioPlayerManager.play();
4. Pause Audio
Code:
mHwAudioPlayerManager.pause();
5. For Next track
Code:
mHwAudioPlayerManager.playNext();
6. For previous track
Code:
mHwAudioPlayerManager.playPre();
7. Select a track from the RecyclerView
Code:
@Override
public void onItemClick(int pos) {
if (mHwAudioPlayerManager != null) {
mHwAudioPlayerManager.play(pos);
}
}
8. Stop Audio
Code:
mHwAudioPlayerManager.stop();
Screenshots:
Conclusion:
Audio Kit provides an excellent audio playback experience. It allows developers to quickly build their own local or online playback applications. It can provide better listening effects based on its multiple audio effect capabilities, and it offers a variety of playback control methods that make it easy for developers to control playback in their applications.
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/introduction-0000001050749665
What are all the formats the audio player supports?
Hi, Sujith.e. HUAWEI Audio Kit supports the native audio formats of Android systems such as M4A, AAC, AMR, IMY, WAV, OGG, RTTTL, and MP3.
You can find more information at https://developer.huawei.com/consumer/en/
Is there any minimum EMUI requirement to support this?
How much time does it take to integrate HMS Audio Kit into an Android project? What is the app size after integration?
Looks simple enough
Is there a repeat-song API in the audio player?
For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
Online food ordering is a process in which food is delivered from local restaurants. Mobile apps make our world better and easier; customers prefer comfort and quality over quantity.
Sign In Module
Users can log in with a mobile number to access the food ordering application. Using Auth Service, we can integrate third-party sign-in options. Huawei Auth Service provides a cloud-based authentication service and SDK.
In this article, following Kits are covered:
1. AGC Auth Service
2. Ads Kit
3. Site kit
Steps
1. Create an app in Android Studio.
2. Configure App in AGC.
3. Integrate the SDK in our new Android project.
4. Integrate the dependencies.
5. Sync project.
Screens
Configuration
1. Log in to AppGallery Connect and select FoodApp in the My Projects list.
2. Enable the required APIs on the Manage APIs tab.
Choose Project Settings > Manage APIs.
3. Enable Auth Service before enabling any authentication modes.
Choose Build > Auth Service and click Enable now in the top right corner.
4. Enable the sign-in modes required for the application.
Integration
Create Application in Android Studio.
App-level Gradle plugins
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Gradle dependencies
Code:
implementation 'com.google.android.material:material:1.3.0-alpha02'
implementation 'com.huawei.agconnect:agconnect-core:1.4.0.300'
implementation 'com.huawei.agconnect:agconnect-auth:1.4.0.300'
implementation 'com.huawei.hms:hwid:4.0.4.300'
implementation 'com.huawei.hms:site:4.0.3.300'
implementation 'com.huawei.hms:ads-lite:13.4.30.307'
Root level gradle dependencies
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Add the below permissions to the AndroidManifest.xml file
Code:
<manifest xmlns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<application ...
</manifest>
Ads Kit: The Huawei Ads SDK lets you quickly integrate ads into your app; ads can be a powerful way to attract users.
Code Snippet
Code:
<com.huawei.hms.ads.banner.BannerView
android:id="@+id/huawei_banner_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
hwads:adId="testw6vs28auh3"
hwads:bannerSize="BANNER_SIZE_360_57" />
Code:
BannerView hwBannerView = findViewById(R.id.huawei_banner_view);
AdParam adParam = new AdParam.Builder()
.build();
hwBannerView.loadAd(adParam);
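Optionally, you can listen for the banner load result. Below is a minimal sketch using the Ads SDK's AdListener callbacks (the log tag is only an example):
Code:
hwBannerView.setAdListener(new AdListener() {
    @Override
    public void onAdLoaded() {
        // The banner ad was loaded successfully.
        Log.i("AdsDemo", "Banner ad loaded");
    }

    @Override
    public void onAdFailed(int errorCode) {
        // The banner ad failed to load; inspect the error code.
        Log.w("AdsDemo", "Banner ad failed, error code: " + errorCode);
    }
});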
Auth Service:
1. Create an object for VerifyCodeSettings to apply for a verification code for mobile-number-based login.
Code:
VerifyCodeSettings mVerifyCodeSettings = VerifyCodeSettings.newBuilder()
.action(VerifyCodeSettings.ACTION_REGISTER_LOGIN)
.sendInterval(30)
.locale(Locale.getDefault())
.build();
2. Send the mobile number, country code, and verification code settings object.
Code:
if (!mMobileNumber.isEmpty() && mMobileNumber.length() == 10) {
Task<VerifyCodeResult> resultTask = PhoneAuthProvider.requestVerifyCode("+91", mMobileNumber, mVerifyCodeSettings);
resultTask.addOnSuccessListener(verifyCodeResult -> {
Toast.makeText(SignIn.this, "verify code has been sent.", Toast.LENGTH_SHORT).show();
if (!isDialogShown) {
verfiyOtp();
}
}).addOnFailureListener(e -> Toast.makeText(SignIn.this, "Send verify code failed.", Toast.LENGTH_SHORT).show());
Toast.makeText(this, mEdtPhone.getText().toString(), Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(this, "Invalid Phone Number!", Toast.LENGTH_SHORT).show();
}
3. Create an object using the PhoneUser builder.
Code:
PhoneUser mPhoneUser = new PhoneUser.Builder()
.setCountryCode("+91")
.setPhoneNumber(mMobileNumber)
.setVerifyCode(otp)
.setPassword(null)
.build();
4. Check the code snippet below to validate the OTP.
Code:
AGConnectAuth.getInstance().createUser(mPhoneUser)
.addOnSuccessListener(signInResult -> {
Toast.makeText(SignIn.this, "Verification success!", Toast.LENGTH_LONG).show();
SharedPrefenceHelper.setPreferencesBoolean(SignIn.this, IS_LOGGEDIN, true);
redirectActivity(MainActivity.class);
}).addOnFailureListener(e -> Toast.makeText(SignIn.this, "Verification failed!", Toast.LENGTH_LONG).show());
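Note that createUser() registers a new account. For a user who is already registered, a sign-in flow with a phone credential can be used instead; the following is a hedged sketch based on the Auth Service SDK and is not part of the original sample:
Code:
// Build a credential from the country code, mobile number and verification code.
// The password is null because this flow signs in with the OTP only.
AGConnectAuthCredential credential =
        PhoneAuthProvider.credentialWithVerifyCode("+91", mMobileNumber, null, otp);
AGConnectAuth.getInstance().signIn(credential)
        .addOnSuccessListener(signInResult -> redirectActivity(MainActivity.class))
        .addOnFailureListener(e -> Toast.makeText(SignIn.this, "Sign in failed!", Toast.LENGTH_LONG).show());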
Site Kit: Using HMS Site Kit, we can provide easy access to hotels and places.
Code:
searchService.textSearch(textSearchRequest, new SearchResultListener<TextSearchResponse>() {
@Override
public void onSearchResult(TextSearchResponse response) {
for (Site site : response.getSites()) {
SearchResult searchResult = new SearchResult(site.getAddress().getLocality(), site.getName());
String result = site.getName() + "," + site.getAddress().getSubAdminArea() + "\n" +
site.getAddress().getAdminArea() + "," +
site.getAddress().getLocality() + "\n" +
site.getAddress().getCountry() + "\n";
list.add(result);
searchList.add(searchResult);
}
mAutoCompleteAdapter.clear();
mAutoCompleteAdapter.addAll(list);
mAutoCompleteAdapter.notifyDataSetChanged();
autoCompleteTextView.setAdapter(mAutoCompleteAdapter);
Toast.makeText(getActivity(), String.valueOf(response.getTotalCount()), Toast.LENGTH_SHORT).show();
}
@Override
public void onSearchError(SearchStatus searchStatus) {
Toast.makeText(getActivity(), searchStatus.getErrorCode(), Toast.LENGTH_SHORT).show();
}
});
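The snippet above assumes that searchService and textSearchRequest have already been created. Below is a minimal sketch of that setup (the API key placeholder and the query value are examples only):
Code:
// Create the SearchService instance with your URL-encoded API key.
SearchService searchService = null;
try {
    searchService = SearchServiceFactory.create(getActivity(),
            URLEncoder.encode("your_api_key", "UTF-8"));
} catch (UnsupportedEncodingException e) {
    Log.e("SiteDemo", "API key encoding failed", e);
}

// Build a text search request for the keyword typed by the user.
TextSearchRequest textSearchRequest = new TextSearchRequest();
textSearchRequest.setQuery("hotel");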
Result:
Conclusion:
In this article, we have learned how to integrate Auth Service, Ads Kit, and Site Kit in a food ordering application.
Reference:
Auth Service:
https://developer.huawei.com/consumer/en/doc/development/Tools-Guides/agc-conversion-auth-0000001050157270
Ads Kit :
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/publisher-service-introduction-0000001050064960
Site Kit :
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/android-sdk-introduction-0000001050158571
For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
In the previous post, we looked at how to use HUAWEI ML Kit’s card recognition function to implement the card binding function. With this function, users only need to provide a photo of their card, and your app will automatically recognize all of the key information. That makes entering card information much easier, but can we do the same thing for bills or discount coupons? Of course we can! In this post, I will show you how to implement automatic input of bill numbers and discount codes using HUAWEI ML Kit’s text recognition function.
Application
Text recognition is useful in a huge range of situations. For example, if you scan the following bill, indicate that the bill service number starts with “NO.DE SERVICIO”, and limit the length to 12 characters, you can quickly get the bill service number “123456789123” using text recognition.
Similarly, if you scan the discount coupon below, start the discount code with “FAVE-” and limit the length to 4 characters, you’ll get the discount code “8329”, and can then complete the payment.
Pretty useful, right? You can also customize the information you want your app to recognize.
Integrating Text Recognition
So, let’s look at how to process bill numbers and discount codes.
1. Preparations
You can find detailed information about the preparations you need to make on the HUAWEI Developer website.
Here, we’ll just look at the most important procedures.
1.1 Configure the Maven Repository Address in the Project-Level build.gradle File
Code:
buildscript {
repositories {
...
maven {url 'https://developer.huawei.com/repo/'}
}
}
dependencies {
...
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
allprojects {
repositories {
...
maven {url 'https://developer.huawei.com/repo/'}
}
}
1.2 Add Configurations to the File Header
Once you’ve integrated the SDK, add the following configuration to the file header:
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
1.3 Configure SDK Dependencies in the App-Level build.gradle File
Code:
dependencies {
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-ocr:2.0.1.300'
// Import the Latin character recognition model package.
implementation 'com.huawei.hms:ml-computer-vision-ocr-latin-model:2.0.1.300'
// Import the Japanese and Korean character recognition model package.
implementation 'com.huawei.hms:ml-computer-vision-ocr-jk-model:2.0.1.300'
// Import the Chinese and English character recognition model package.
implementation 'com.huawei.hms:ml-computer-vision-ocr-cn-model:2.0.1.300'
}
1.4 Add these Statements to the AndroidManifest.xml File so the Machine Learning Model can Automatically Update
Code:
<manifest>
...
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="ocr" />
...
</manifest>
1.5 Apply for the Camera Permission
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
2. Code Development
2.1 Create an Analyzer
Code:
MLTextAnalyzer analyzer = new MLTextAnalyzer.Factory(context).setLanguage(type).create();
2.2 Set the Recognition Result Processor to Bind with the Analyzer
Code:
analyzer.setTransactor(new OcrDetectorProcessor());
2.3 Call the Synchronous API
Use the built-in LensEngine of the SDK to create an object, register the analyzer, and initialize camera parameters.
Code:
lensEngine = new LensEngine.Creator(context, analyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(width, height)
.applyFps(30.0f)
.enableAutomaticFocus(true)
.create();
2.4 Call the run Method to Start the Camera and Read the Camera Streams for the Recognition
Code:
try {
lensEngine.run(holder);
} catch (IOException e) {
// Exception handling logic.
Log.e("TAG", "e=" + e.getMessage());
}
2.5 Process the Recognition Result As Required
Code:
public class OcrDetectorProcessor implements MLAnalyzer.MLTransactor<MLText.Block> {
@Override
public void transactResult(MLAnalyzer.Result<MLText.Block> results) {
SparseArray<MLText.Block> items = results.getAnalyseList();
// Process the recognition result as required. Only the detection results are processed.
// Other detection-related APIs provided by ML Kit cannot be called.
…
}
@Override
public void destroy() {
// Callback method used to release resources when the detection ends.
}
}
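The body of transactResult is left to the app. For the bill scenario described earlier, a hedged helper like the one below could search the recognized blocks for a known prefix and return the fixed-length value that follows it (the prefix and length are passed in, matching the "NO.DE SERVICIO" / 12-character example; block.getStringValue() is assumed to return the recognized text of a block):
Code:
// Search the recognized blocks for the value that follows a known prefix.
private String extractNumber(SparseArray<MLText.Block> blocks, String prefix, int length) {
    for (int i = 0; i < blocks.size(); i++) {
        String text = blocks.valueAt(i).getStringValue();
        int index = text.indexOf(prefix);
        if (index >= 0 && text.length() >= index + prefix.length() + length) {
            // Return the fixed-length number right after the prefix, e.g. "123456789123".
            return text.substring(index + prefix.length(), index + prefix.length() + length);
        }
    }
    return null;
}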
2.6 Stop the Analyzer and Release the Detection Resources When the Detection Ends
Code:
if (analyzer != null) {
try {
analyzer.stop();
} catch (IOException e) {
// Exception handling.
}
}
if (lensEngine != null) {
lensEngine.release();
}
Demo Effect
And that’s it! Remember that you can expand this capability if you need to. Now, let’s look at how to scan travel bills.
And here’s how to scan discount codes to quickly obtain online discounts and complete payments.
Github Source Code
https://github.com/HMS-Core/hms-ml-demo/tree/master/Receipt-Text-Recognition
Awesome work!!
For more information like this, you can visit the HUAWEI Developer Forum.
Introduction:
HMS ML Kit features an image super-resolution service that provides 1x super-resolution capability. This feature removes compression noise from images to obtain clear images.
Precautions:
1. Prior to using the image super-resolution service, it is necessary to convert images into bitmaps in ARGB format. After the service processes an image, the output is also a bitmap in ARGB format.
2. The maximum size of an input image is 1024 x 768 px or 768 x 1024 px. The minimum size is 64 x 64 px.
Integration:
1. Create a project in Android Studio and Huawei AGC.
2. Provide the SHA-256 key in the App Information section.
3. Download the agconnect-services.json file from AGC and save it into the app directory.
4. In root build.gradle
Navigate to allprojects > repositories and buildscript > repositories and add the below line.
Code:
maven { url 'https://developer.huawei.com/repo/' }
5. In app build.gradle
Configure the Maven dependency
Code:
implementation 'com.huawei.hms:ml-computer-vision-imageSuperResolution:2.0.2.300'
implementation 'com.huawei.hms:ml-computer-vision-imageSuperResolution-model:2.0.2.300'
Apply plugin
Code:
apply plugin: 'com.huawei.agconnect'
6. Permissions in Manifest
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
Code Implementation:
Here is an image converter application that uses the HMS image super-resolution service. The application can select an image from the gallery and can also capture an image. Follow the steps below.
1. Create an image super-resolution analyzer.
Code:
private void createAnalyzer() {
MLImageSuperResolutionAnalyzerSetting settings = new MLImageSuperResolutionAnalyzerSetting.Factory()
// Set the scale of image super resolution to 1x.
.setScale(MLImageSuperResolutionAnalyzerSetting.ISR_SCALE_1X)
.create();
analyzer = MLImageSuperResolutionAnalyzerFactory.getInstance().getImageSuperResolutionAnalyzer(settings);
}
2. Create an MLFrame object by using android.graphics.Bitmap.
Code:
MLFrame mlFrame = new MLFrame.Creator().setBitmap(srcBitmap).create();
3. Perform super-resolution processing on the image.
Code:
Task<MLImageSuperResolutionResult> task = analyzer.asyncAnalyseFrame(mlFrame);
task.addOnSuccessListener(new OnSuccessListener<MLImageSuperResolutionResult>() {
public void onSuccess(MLImageSuperResolutionResult result) {
// Recognition success.
Toast.makeText(getApplicationContext(), "Success", Toast.LENGTH_SHORT).show();
setImage(result.getBitmap());
}
}).addOnFailureListener(new OnFailureListener() {
public void onFailure(Exception e) {
// Recognition failure.
Toast.makeText(getApplicationContext(), "Failed:" + e.getMessage(), Toast.LENGTH_SHORT).show();
}
});
4. After the recognition is complete, stop the analyzer to release recognition resources.
Code:
private void release() {
if (analyzer == null) {
return;
}
analyzer.stop();
}
5. Capture a picture from the camera.
Code:
private void capturePictureFromCamera(){
if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED)
{
requestPermissions(new String[]{Manifest.permission.CAMERA}, MY_CAMERA_PERMISSION_CODE);
}
else
{
Intent cameraIntent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE);
startActivityForResult(cameraIntent, CAMERA_REQUEST);
}
}
6. Access an image from the gallery.
Code:
private void getImageFromGallery(){
Intent intent = new Intent();
intent.setAction(Intent.ACTION_GET_CONTENT);
intent.setType("image/*");
startActivityForResult(intent, GALLERY_REQUEST);
}
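To feed the captured or selected image into the analyzer, the activity result has to be converted into a bitmap first. Below is a hedged sketch, assuming the request codes used above and a hypothetical analyzeImage() helper that builds the MLFrame and calls asyncAnalyseFrame as shown in steps 2 and 3:
Code:
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (resultCode != RESULT_OK || data == null) {
        return;
    }
    try {
        if (requestCode == CAMERA_REQUEST) {
            // The camera intent returns a small preview bitmap in its extras.
            Bitmap srcBitmap = (Bitmap) data.getExtras().get("data");
            analyzeImage(srcBitmap); // hypothetical helper from steps 2 and 3
        } else if (requestCode == GALLERY_REQUEST) {
            // Load the selected gallery image as a bitmap.
            Bitmap srcBitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), data.getData());
            analyzeImage(srcBitmap);
        }
    } catch (IOException e) {
        Log.e(TAG, "Failed to load the image", e);
    }
}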
Screenshots:
Conclusion:
The image super-resolution service is widely used in day-to-day life and supports common scenarios such as improving low-quality images on the network, obtaining clear images while reading news, and improving the image clarity of identification cards. The service intelligently reduces image noise and provides a clear image without changing the resolution.
Reference:
HMSCore-Guides
Are there any image size limitations?
How to reset the mobile app?
I have installed an emulator to play the world's best game on my computer, but the emulator is not working smoothly.
Please share some good techniques to run it well so that I can enjoy the game.
Thank you.
cool thing, I'll definitely have to try it, especially with my old photos.
Does this work with every image? I've some old black and white photos
JohnMes said:
cool thing, I'll definitely have to try it, especially with my old photos.
It will be a good choice.:highfive:
Rushikesh787 said:
Does this work with every image? I've some old black and white photos
I think it's worth giving it a try. Maybe we'll be pleasantly surprised.
Will it convert images of all formats?
I would definitely want to try this out
Awesome
I would definitely try this app on my girlfriend's photo
Introduction
In an age of digital living, maintaining interpersonal connections has never been more important. It's a common ritual for professionals to exchange business cards, but this can come with certain hassles – having to input the contact information for each new contact, including names, companies, and addresses. That's where business card recognition comes into the picture.
How It Works
The business card recognition service utilizes the optical character recognition (OCR) technology to digitize text on a card, converting it into an editable format, and classifying the recognized information by category. When you have integrated the service into your app, your users will be able to collect information on a business card simply by snapping a picture, scanning the QR code on the card, or importing the card image.
Preparations
Before API development, there are a few things that you'll need to do in preparation, for example, configuring the Maven repository address of the HMS Core SDK in your project, and integrating the SDK for the business card recognition service.
Install Android Studio
Official download website
Installation guide
Add Huawei Maven Repository to the Project-Level build.gradle File
Open the build.gradle File in the Root Directory of Your Android Studio Project
Add the Maven Repository Address
Go to buildscript > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
buildscript {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
}
}
Go to allprojects > repositories and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
repositories {
maven { url 'https://developer.huawei.com/repo/' }
}
}
Import the SDK
Code:
dependencies {
// Text recognition SDK.
implementation 'com.huawei.hms:ml-computer-vision-ocr:2.0.1.300'
// Text recognition model.
implementation 'com.huawei.hms:ml-computer-vision-ocr-cn-model:2.0.1.300'
implementation 'com.huawei.hms:ml-computer-vision-ocr-jk-model:2.0.1.300'
implementation 'com.huawei.hms:ml-computer-vision-ocr-latin-model:2.0.1.300'
}
Add the Configuration to the AndroidManifest.xml File
Code:
<manifest
...
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="ocr" />
...
</manifest>
Add Permissions
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.hardware.camera.autofocus" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.autofocus" />
Submit a Dynamic Permission Application
Code:
if (!(ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED)) {
requestCameraPermission();
}
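The requestCameraPermission() method itself is not shown above; a minimal sketch of what it might look like (the request code value is arbitrary):
Code:
private static final int CAMERA_PERMISSION_CODE = 100;

private void requestCameraPermission() {
    // Ask the user to grant the CAMERA permission at runtime.
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.CAMERA}, CAMERA_PERMISSION_CODE);
}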
Development
1. Create the text analyzer MLTextAnalyzer to recognize text within images. Use the custom parameter MLLocalTextSetting to configure the on-device text analyzer.
Code:
MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
.setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
.setLanguage("zh")
.create();
MLTextAnalyzer analyzer = MLAnalyzerFactory.getInstance()
.getLocalTextAnalyzer(setting);
2. Use android.graphics.Bitmap to create an MLFrame. Supported image formats include JPG, JPEG, PNG, and BMP. It is recommended that the aspect ratio for landscape business cards be 2:1 and that for portrait business cards be 1:2.
Code:
MLFrame frame = MLFrame.fromBitmap(bitmap);
3. Pass the MLFrame object to the asyncAnalyseFrame method for text recognition.
Code:
Task<MLText> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<MLText>() {
@Override
public void onSuccess(MLText text) {
// Recognition success.
}
}).addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// Recognition failure.
}
});
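Inside onSuccess you receive an MLText object; how its contents are mapped to card fields (name, company, phone, and so on) is up to your app. Below is a simple hedged sketch that just concatenates the recognized blocks so the text can be displayed or parsed further:
Code:
// Collect the recognized text block by block for display or further field parsing.
private String buildRecognizedText(MLText text) {
    StringBuilder builder = new StringBuilder();
    for (MLText.Block block : text.getBlocks()) {
        builder.append(block.getStringValue()).append("\n");
    }
    return builder.toString();
}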
4. After recognition is complete, stop the analyzer to release recognition resources.
Code:
try {
if (analyzer != null) {
analyzer.stop();
}
} catch (IOException e) {
// IOException
} catch (Exception e) {
// Exception
}
More Information
We have developed a demo app that demonstrates the effects of the business card recognition service.
Reference
Official website of Huawei Developers
Development Guide
HMS Core official community on Reddit
Demo and sample code
Discussions on Stack Overflow
HUAWEI Audio Kit
Article Introduction
The following article describes the easiest way to use the playback capability of the HUAWEI Audio Kit and how to read audio online, locally, and from the resources folder as well.
HUAWEI Audio Kit Introduction
HUAWEI Audio Kit is a set of audio capabilities that provides you with audio playback capabilities based on the HMS ecosystem, including audio encoding and decoding capabilities at the hardware level and system bottom layer.
It is supported on Android phones and tablets running Android 6.0 (EMUI 4.0) or later with HMS Core 5.0.0.300 or later.
Integrating the HUAWEI Audio Kit SDK
Before starting to use HUAWEI Audio Kit, we must first integrate the HMS Core Audio SDK by following the steps below:
Step 1:
- Configure the Maven Repository Address for the HMS Core Audio SDK in the Gradle file at the project-level:
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'} // to be added
}
dependencies {
classpath 'com.android.tools.build:gradle:3.6.3'
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'} // to be added
}
}
- Add the build Dependencies in the Gradle file at the app-level:
Code:
implementation 'com.huawei.hms:audiokit-player:1.1.0.300'
Step 2:
- The HMS Core Audio SDK requires permissions to access the network, obtain the network status, operate SD cards, and read data from the Android media library. Declare the permissions in the Manifest file:
Code:
<!-- Permission to access the network. -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Permission to obtain the network status. -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Permission to write data into the SD card. -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Permission to read data from the SD card. -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Permission to read data from the Android media library. -->
<uses-permission android:name="android.permission.READ_MEDIA_STORAGE" />
Coding
The Audio Kit allows us to:
- Create an audio management instance by calling HwAudioManager;
- Manage audio playback by calling HwAudioPlayerManager;
- Manage audio queues by calling HwAudioQueueManager;
- Manage audio configurations by calling HwAudioConfigManager.
- Manage audio data by calling HwAudioPlayItem.
In the following, we'll create an HwAudioManager to manage playback, and later we'll create a playlist to play online audio, local audio, and audio from the resources folder.
Code:
// Create a configuration instance, including various playback-related configurations.
HwAudioPlayerConfig hwAudioPlayerConfig = new HwAudioPlayerConfig(context);
// Create a control instance.
HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig, new HwAudioConfigCallBack() {
// Return the control instance through callback.
@Override
public void onSuccess(HwAudioManager hwAudioManager) {
try {
Log.i(TAG, "createHwAudioManager onSuccess");
mHwAudioManager = hwAudioManager;
// Obtain the playback control instance.
mHwAudioPlayerManager = hwAudioManager.getPlayerManager();
// Get the playlist and play it.
mHwAudioPlayerManager.playList(getOnlinePlaylist(), 0, 0);
} catch (Exception e) {
Log.e(TAG, "player init fail", e);
}
}
@Override
public void onError(int errorCode) {
Log.e(TAG, "init err:" + errorCode);
}
});
In the following, we'll see how to read audio files from different sources, and later we'll implement the previously used method getOnlinePlaylist() that returns the playlist.
The audio data class HwAudioPlayItem includes the album name, artist, whether the audio is online, and so on, and also provides two methods to set the path of the audio file: setFilePath(java.lang.String filePath) to set the local path of the audio, and setOnlinePath(java.lang.String onlinePath) to set the online URL of the audio.
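A possible implementation of the getOnlinePlaylist() method used above, with example metadata and an example URL (replace them with your own hosted audio):
Code:
private List<HwAudioPlayItem> getOnlinePlaylist() {
    List<HwAudioPlayItem> playItemList = new ArrayList<>();
    HwAudioPlayItem item = new HwAudioPlayItem();
    item.setAudioId("1000");
    item.setSinger("Example Artist");
    item.setAudioTitle("Example Online Track");
    // Example URL only; point this at your own audio file.
    item.setOnlinePath("https://example.com/audio/track1.mp3");
    // Mark the item as an online audio source.
    item.setOnline(1);
    playItemList.add(item);
    return playItemList;
}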
In order to have a third method to set the audio path from the resources folder, let's create our own class HwAudioPlayItemBeta that extends HwAudioPlayItem, in which we will add a method that copies the audio file from the resources folder to a local cache folder so that it can be read as a local file using setFilePath.
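A hedged sketch of such a class, assuming the audio file is stored in the app's assets folder and copied once into the cache directory before being used as a local file:
Code:
public class HwAudioPlayItemBeta extends HwAudioPlayItem {

    // Copy the asset to the cache directory (if not already there) and use it as the local file path.
    public void setAssetPath(Context context, String assetFileName) throws IOException {
        File cachedFile = new File(context.getCacheDir(), assetFileName);
        if (!cachedFile.exists()) {
            try (InputStream in = context.getAssets().open(assetFileName);
                 OutputStream out = new FileOutputStream(cachedFile)) {
                byte[] buffer = new byte[8192];
                int length;
                while ((length = in.read(buffer)) > 0) {
                    out.write(buffer, 0, length);
                }
            }
        }
        // Treat the copied file as local audio.
        setOnline(0);
        setFilePath(cachedFile.getAbsolutePath());
    }
}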
For more details, you can visit https://forums.developer.huawei.com/forumPortal/en/topic/0204412320648460216