How to Build a Translation Function - Huawei Developers

Programmers are — or should be — voracious readers. To keep up with the latest updates in the world of software, we need to be constantly scrolling through books, forum articles, news, and more.
This process is certainly mentally enriching, but it can also be a tiring and tedious one due to one major obstacle: language. I used to struggle a lot with reading articles written in another language, because I was looking up every word that I couldn't understand in the dictionary in order to make sense of what I was reading — until I developed this.
No muss, no fuss. Just select the foreign text you don't understand and instantly translate it into a language that you want.
Now let's get to the development part. Not being much of a linguist, I knew that I would struggle to develop a translation feature for my app all on my own.
Luckily, I got a great helper — HMS Core ML Kit's translation service. It supports real-time and on-device translation, making translation possible even in the absence of an Internet connection. With the help of the translation service, language barriers become a thing of the past.
Now, I'll explain how I developed this function, using the source code for the demo above.
Development Process
Preparations
Make necessary preparations as detailed here. This includes the following:
Configure the app information.
Enable the service.
Integrate the SDK of the service.
Configure the obfuscation scripts.
Declare necessary permissions.
Function Building
1. Set the app authentication information via an access token:
Code:
MLApplication.getInstance().setAccessToken("your access token");
Or an API key:
Code:
MLApplication.getInstance().setApiKey("your ApiKey");
2. Create a translator. The demo uses the on-device (local) translator, which is what makes translation possible without an Internet connection.
Code:
MLLocalTranslateSetting setting = new MLLocalTranslateSetting
        .Factory()
        .setSourceLangCode(mSourceLangCode)
        .setTargetLangCode(mTargetLangCode)
        .create();
this.localTranslator =
        MLTranslatorFactory.getInstance().getLocalTranslator(setting);
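mSourceLangCode and mTargetLangCode are ordinary language codes held by the activity. Hypothetical values for the demo might look like this (the codes you actually support come from the language query in the next step):
Code:
// Hypothetical fields in the activity; plain language codes such as "en", "zh", "tr".
private String mSourceLangCode = "en"; // language of the selected text
private String mTargetLangCode = "zh"; // language the reader wants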
3. Query the languages supported by the service.
Code:
MLTranslateLanguage.getCloudAllLanguages().addOnSuccessListener(new OnSuccessListener<Set<String>>() {
    @Override
    public void onSuccess(Set<String> result) {
        // Callback when the supported languages are obtained.
    }
});
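Since the demo translates on-device, you may also want the on-device language list. Assuming your SDK version exposes getLocalAllLanguages (check the API reference), the call mirrors the cloud one:
Code:
MLTranslateLanguage.getLocalAllLanguages().addOnSuccessListener(new OnSuccessListener<Set<String>>() {
    @Override
    public void onSuccess(Set<String> result) {
        // Languages supported for on-device translation.
    }
});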
4. Translate the text.
Code:
localTranslator.preparedModel(downloadStrategy, modelDownloadListener).addOnSuccessListener(new OnSuccessListener<Void>() {
    @Override
    public void onSuccess(Void aVoid) {
        final Task<String> task = localTranslator.asyncTranslate(input);
        task.addOnSuccessListener(new OnSuccessListener<String>() {
            @Override
            public void onSuccess(String text) {
                displaySuccess(text, true);
            }
        }).addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(Exception e) {
                displayFailure(e);
            }
        });
    }
}).addOnFailureListener(new OnFailureListener() {
    @Override
    public void onFailure(Exception e) {
        displayFailure(e);
    }
});
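The preparedModel call above relies on a downloadStrategy and a modelDownloadListener that the snippet does not define. A minimal sketch of what they might look like, based on my reading of the ML Kit model download API (verify the class and method names against the official reference):
Code:
// Assumed helper objects for preparedModel(); names follow the ML Kit model download API.
MLModelDownloadStrategy downloadStrategy = new MLModelDownloadStrategy.Factory()
        .needWifi() // download the on-device translation model over Wi-Fi only
        .create();

MLModelDownloadListener modelDownloadListener = new MLModelDownloadListener() {
    @Override
    public void onProcess(long alreadyDownLength, long totalLength) {
        // Optionally surface download progress to the user.
    }
};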
5. Release resources occupied by the translator when the translation is complete.
Code:
if (localTranslator != null) {
localTranslator.stop();
}
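In an Activity, onDestroy is a natural place for this cleanup; a minimal sketch:
Code:
@Override
protected void onDestroy() {
    super.onDestroy();
    if (localTranslator != null) {
        localTranslator.stop();
        localTranslator = null;
    }
}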
And voila, the translation function is built.
Besides e-book readers, there are lots of other apps that can benefit greatly from having a translation function, such as travel apps, which can use the translation service to translate foreign road signs and menus for visitors. Translation is also useful for educational apps, to help users who are not familiar with the language used in the app.
That concludes my development journey for the demo e-book reader. What other ideas and suggestions do you have for using the translation function? Feel free to provide your thoughts in the comments section.
References
Why Translation is Important In A World Where English is Everywhere
ML Kit


HUAWEI Scene Kit to add 3D Objects on Android app

For more articles like this, you can visit the HUAWEI Developer Forum.
HUAWEI Scene Kit
-Lightweight rendering engine that features high performance and low consumption.
-Provides advanced descriptive APIs for you to edit, operate, and render 3D materials.
-Adopts physically based rendering (PBR) pipelines to achieve realistic rendering effects.
-Lets you easily load and display complicated 3D objects on Android phones.
-Can be applied to AR virtual fitting rooms, 3D virtual art galleries, and VR remote teaching.
Supported Devices:
HMS Core 4.0.2.300 or later
Android 8.0 / EMUI 8.0 or later
Support for Vulkan graphics APIs
Supported Formats:
1. Rendered materials: glTF, GLB
2. glTF textures: PNG, JPEG
3. Skybox material: DDS (cube map)
Implementation Process
1. Add build dependencies in the dependencies section.
Code:
implementation 'com.huawei.scenekit:sdk:1.5.0.300'
2. Create a SampleView that inherits from SceneView.
(SceneView inherits from SurfaceView and overrides methods including surfaceCreated, surfaceChanged, surfaceDestroyed, onTouchEvent, and onDraw.)
Code:
public class SampleView extends SceneView {
    public SampleView(Context context) {
        super(context);
    }

    public SampleView(Context context, AttributeSet attributeSet) {
        super(context, attributeSet);
    }
}
3. Override the surfaceCreated method in SampleView, call the super method, and then load the 3D scene materials, skybox materials, and lighting maps.
Code:
@Override
public void surfaceCreated(SurfaceHolder holder) {
    super.surfaceCreated(holder);
    // Loads the model of a scene by reading files from assets.
    loadScene("scene.gltf");
    // Loads skybox materials by reading files from assets.
    loadSkyBox("skyboxTexture.dds");
    // Loads specular maps by reading files from assets.
    loadSpecularEnvTexture("specularEnvTexture.dds");
    // Loads diffuse maps by reading files from assets.
    loadDiffuseEnvTexture("diffuseEnvTexture.dds");
}
(The .dds and .gltf files are stored in the assets directory, which is inside the app module.)
4. You may override other surface lifecycle methods.
Code:
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    super.surfaceChanged(holder, format, width, height);
}

@Override
public void surfaceDestroyed(SurfaceHolder holder) {
    super.surfaceDestroyed(holder);
}

@Override
public boolean onTouchEvent(MotionEvent motionEvent) {
    return super.onTouchEvent(motionEvent);
}

@Override
public void onDraw(Canvas canvas) {
    super.onDraw(canvas);
}
5. Create an Activity and call setContentView in the onCreate method to load the SampleView.
Code:
public class SampleActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_sample);
    }
}
6. In activity_sample.xml, declare the SampleView.
Code:
<com.huawei.huaweitextocr.huawei.SampleView
    android:layout_width="match_parent"
    android:layout_height="match_parent"/>
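Alternatively, if you prefer not to use an XML layout at all, the view can be created programmatically; a small variation (not from the original sample):
Code:
public class SampleActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Build the SceneView subclass directly instead of inflating it from XML.
        setContentView(new SampleView(this));
    }
}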
Reference Link:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/client-dev-0000001050162137-V5

Flutter HMS Location Plugin

For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
This article shows the steps to integrate the HMS Flutter Location plugin with a news app.
The news app will obtain the country that the user is located in. It will then fetch that country's news headlines and display them onto a list.
For example, if the user is currently located in Hong Kong, the app will show Hong Kong's news headlines on launch. The user may switch to read other countries' news headlines afterwards.
The news headlines are sourced from NewsAPI.
Configuration
Assuming that there is already a running Flutter app,
*Update: for steps 1) and 2), the plugin has since been published on pub.dev, so there is no need to download and configure it manually.
Adding it to your pubspec.yaml is the preferred way:
Code:
dependencies:
  huawei_location: ^4.0.4+300
1) Download the huawei_location flutter plugin and unzip it. For this project, it is placed under the project's root.
2) Configure pubspec.yaml to add the following under dependencies. Replace the path to your own's.
Code:
huawei_location:
  path: 'hms/location/huawei_location'
3) Configure AndroidManifest.xml for location permission.
Code:
<uses-permission android:name="android.permission.ACCESS_COARES_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
Project Structure
The architecture of the Flutter project is shown below; it is adapted from this repo:
https://github.com/FilledStacks/flutter-tutorials/tree/master/010-provider-architecture
The project is layered into three parts, namely UI, Services and Business Logic.
The UI is dumb and displays only what it is given; it does not contain any logic to process the data.
The Services layer provides various services and exposes APIs for the other layers to use.
The business logic layer contains the view models, models, and so on.
Since the UI is pretty simple and does not contain any logic to process data, we will mainly focus on Services and Business Logic.
Services
 1) Permission Service
Accessing the user's location generally requires permission at runtime.
The permission service exposes two methods to achieve this. For convenience, the built-in PermissionHandler from the huawei_location plugin is used.
Code:
class PermissionServicesImpl implements PermissionServices {
  PermissionHandler _permissionHandler = PermissionHandler();

  @override
  Future<bool> hasLocationPermission() async {
    return _permissionHandler.hasLocationPermission();
  }

  @override
  Future<bool> requestLocationPermission() async {
    return _permissionHandler.requestLocationPermission();
  }
}
For this app, the permission to obtain the user's location is requested at app launch. If the user allows it, the country code obtained is saved with SharedPreferences.
Code:
Future<void> _initApp() async {
  final isPermitted = await _permissionServices.requestLocationPermission();
  if (isPermitted == null) {
    // No result returned: fall back to the default country.
    await _sharedPreferencesServices.saveCountryCode(CommonString.defaultCountry);
    return;
  }
  if (isPermitted) {
    final hwLocation = await _locationService.getHWLocation();
    if (hwLocation != null) {
      await _sharedPreferencesServices.saveCountryCode(hwLocation.countryCode);
    }
  } else {
    await _sharedPreferencesServices.saveCountryCode(CommonString.defaultCountry);
  }
}
2) Location Service
The location service provides only one method, which returns an HWLocation using the huawei_location plugin.
It uses the plugin's FusedLocationProviderClient.getLastLocationWithAddress(LocationRequest) to acquire the HWLocation.
Remember to set LocationRequest.needAddress to true if you also need the address information.
Code:
class LocationServicesImpl implements LocationService {
  final PermissionServices _permissionServices =
      serviceLocator<PermissionServices>();

  @override
  Future<HWLocation> getHWLocation() async {
    if (await _permissionServices.hasLocationPermission()) {
      FusedLocationProviderClient locationService =
          FusedLocationProviderClient();
      LocationRequest locationRequest = LocationRequest();
      // needAddress must be true so that the country code is returned.
      locationRequest.needAddress = true;
      final hwLocation = await locationService
          .getLastLocationWithAddress(locationRequest);
      return hwLocation;
    }
    return null;
  }
}
3) SharedPreferences Service
The shared preferences service stores and retrieves the country code. For simplicity, only the abstract class is shown.
Code:
abstract class SharedPreferencesServices {
  Future<void> saveCountryCode(String countryCode);
  Future<String> getCountryCode();
}
4) TopHeadlines Service
The TopHeadlines service fetches a specific country's news headlines and can switch to another country's headlines. For simplicity, only the abstract class is shown.
Code:
abstract class TopHeadlinesService {
  Future<List<Article>> getTopHeadlines(String countryCode);
  Future<List<Article>> changeHeadlinesLanguage(String countryCode);
}
Business Logic
1) HeadlinesScreenViewModel:
This view model is responsible for managing the state of the headlines screen, processing data, and gluing all parts together.
When the state changes, the view model notifies its listeners (the UI) so that they can act on the event.
To complete the picture, the loadData method first retrieves the country code from SharedPreferences and then passes it to getTopHeadlines(String countryCode) to fetch that country's headline news.
The app calls loadData in the initState lifecycle method of the headlines screen.
Code:
void loadData() async {
  _setIsLoading(true);
  final _countryCode = await _sharedPreferencesServices
      .getCountryCode()
      .timeout(Duration(milliseconds: 2000), onTimeout: () => CommonString.defaultCountry);
  _headlines = await _topHeadlinesService
      .getTopHeadlines(_countryCode)
      .timeout(Duration(milliseconds: 2000), onTimeout: () => null);
  _setIsLoading(false);
}
The above covers the essential parts of using HMS Location Kit with Flutter.
If you are interested in the details, feel free to visit the GitHub repo.
https://github.com/lkhe/news_app

Huawei Game Service for libGDX games

For more information like this, you can visit the HUAWEI Developer Forum.
Original link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201320947104400266&fid=0101187876626530001
Hi folks! In this guide I’ll explain how to integrate the Huawei Game Service with gdx-gamesvcs, a libGDX extension library for Game Services.
Medium Link: https://medium.com/huawei-developers/huawei-game-service-for-libgdx-games-3b189d8b4a82
What’s gdx-gamesvcs?
gdx-gamesvcs is a libGDX extension that aims to provide a cross-platform API for Game Services.
It supports the following services:
• Google Play Games;
• Apple Game Center;
• GameJolt;
• Amazon GameCircle;
• Kongregate;
• Huawei Game Service (starting from version 1.1.0-SNAPSHOT)
For further information, please visit: https://github.com/MrStahlfelge/gdx-gamesvcs
Huawei Game Service for libGDX
It supports the following features:
- Account Kit: login, logout;
- Game Service: player info, achievements, leaderboards, game events;
- Drive Kit: save, load and delete data on Cloud
Requirements
• EMUI 3.0+ / Android 4.4+
• HMS Core 4.0.0.300+
• Android Studio 3.0+
Preparation
• Create an app in AppGallery Connect. The app type must be Game.
• Create a libGDX project.
• Generate a signature certificate.
• Generate a signature certificate fingerprint.
• Configure the signature certificate fingerprint.
• Add the app package name and save the configuration file.
• Configure the Maven repository address and AppGallery Connect gradle plug-in.
• Configure the signature file in Android Studio.
For further information, please visit: https://developer.huawei.com/consumer/en/codelab/HMSPreparation/index.html#0
Finally coding!
Gradle:
implementation "de.golfgl.gdxgamesvcs:gdx-gamesvcs-android-huawei:1.1.0-SNAPSHOT"
Now, in your Activity extending the AndroidApplication class (a libGDX class), and optionally implementing IGameServiceListener (to listen for events), you have to instantiate the HuaweiGameServicesClient and then manage the lifecycle events, like this:
Code:
public class GameServiceActivity extends AndroidApplication implements IGameServiceListener {

    private IGameServiceClient gsClient = null;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // true if you want to manage data on Cloud
        this.gsClient = new HuaweiGameServicesClient(this, true);
        this.gsClient.setListener(this);
        initialize(gdxGameSvcsApp, config);
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (this.gsClient != null) {
            this.gsClient.pauseSession();
        }
    }

    @Override
    protected void onResume() {
        super.onResume();
        if (this.gsClient != null) {
            this.gsClient.resumeSession();
        }
    }

    @Override
    public void gsOnSessionActive() {
        Log.d("SESSION", "ACTIVE");
    }

    @Override
    public void gsOnSessionInactive() {
        Log.d("SESSION", "INACTIVE");
    }

    @Override
    public void gsShowErrorToUser(GsErrorType et, String msg, Throwable t) {
        Toast.makeText(this, msg, Toast.LENGTH_LONG).show();
    }
}
So? Unleash the power of the HuaweiGameServiceClient!
Code:
...
// ACCOUNT KIT
gsClient.logIn();
gsClient.logOff();

// GAME SERVICE
// achievements
gsClient.showAchievements();
gsClient.unlockAchievement("id");
gsClient.incrementAchievement("id", increment, completionPercentage);
gsClient.fetchAchievements(new IFetchAchievementsResponseListener() {
    @Override
    public void onFetchAchievementsResponse(Array<IAchievement> achievements) {
    }
});

// leaderboards
gsClient.showLeaderboards("id");
gsClient.submitToLeaderboard("id", score, scoreTips);
gsClient.fetchLeaderboardEntries("id", limit, isRelatedToPlayer, new IFetchLeaderBoardEntriesResponseListener() {
    @Override
    public void onLeaderBoardResponse(Array<ILeaderBoardEntry> leaderBoard) {
    }
});

// game events
gsClient.submitEvent("id", increment);

// DRIVE KIT
gsClient.saveGameState(null, gameState, progressValue, new ISaveGameStateResponseListener() {
    @Override
    public void onGameStateSaved(boolean success, String errorCode) {
    }
});
gsClient.loadGameState(null, new ILoadGameStateResponseListener() {
    @Override
    public void gsGameStateLoaded(byte[] gameState) {
    }
});
gsClient.deleteGameState(null, new ISaveGameStateResponseListener() {
    @Override
    public void onGameStateSaved(boolean success, String errorCode) {
    }
});
...
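Because gdx-gamesvcs is cross-platform, the usual pattern is to hand the IGameServiceClient to your platform-independent game code and keep all Huawei-specific wiring in the Android launcher. A minimal sketch (the class, method, and id names below are mine, not from the library), matching the gdxGameSvcsApp instance passed to initialize() above:
Code:
// Core libGDX game class: depends only on the gdx-gamesvcs interface,
// so the same code works with the Huawei client or any other backend.
public class GdxGameSvcsApp extends com.badlogic.gdx.Game {

    private final IGameServiceClient gsClient;

    public GdxGameSvcsApp(IGameServiceClient gsClient) {
        this.gsClient = gsClient;
    }

    @Override
    public void create() {
        // set up your screens here
    }

    public void onLevelCompleted(int score) {
        // Same calls shown above; the ids are placeholders.
        gsClient.unlockAchievement("first_level_done");
        gsClient.submitToLeaderboard("high_scores", score, null);
    }
}
In GameServiceActivity.onCreate you would construct this class with gsClient and pass it to initialize().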
Do You want moaAAaaRrRr?
Read my article about Huawei IAP for libGDX: Medium
https://medium.com/huawei-developers/huawei-iap-for-libgdx-games-eb5aec5662af
Nice and useful article
I am implementing a game using Huawei Game Service. I am from India, where Drive Kit is not working. Is there an alternative way to save game progress?

Make Users Update Your Application with HMS

The AppGalleryKit App and AppGalleryKit Game services allow you to jointly operate your apps with Huawei and share the revenue generated in the process. Developers gain access to Huawei's diverse services, including HUAWEI AppGallery connection, data reports, activity operations, and user operations, in order to obtain more premium HUAWEI AppGallery resources for promotion purposes.
To enable AppGalleryKit App or AppGalleryKit Game operations, you need to sign the HUAWEI AppGallery Connect Distribution Service Agreement for Paid Apps with Huawei. For details about the joint operations process, please refer to Joint Operations Services at a Glance.
AppGalleryKit App and AppGalleryKit Game are product concepts derived from Account, Analytics, In-App Purchases, and other kits. With AppGalleryKit App or AppGalleryKit Game, initializing the app, updating the app, and implementing Account Kit are optional, but implementing In-App Purchases is mandatory. It is also advisable to use Analytics Kit, Auth Service, Crash Service, A/B Testing, and APM.
AppGalleryKit App or AppGalleryKit Game is not a pure kit integration: developers are required to sign the related agreements, and the offerings are derived from many features.
Initializing the app, updating the app, Account Kit, and In-App Purchases can each be implemented separately. These kits do not depend on AppGalleryKit App or AppGalleryKit Game, but AppGalleryKit App and AppGalleryKit Game depend on these kits.
Even though we are not going to use AppGalleryKit App or AppGalleryKit Game here, we can still check the update status of the application. In this article, we will check whether an update is available for a demo app. Of course, because this app will not be on the AppGallery market, no update will be required.
To use this feature, HMS Core first needs to be integrated into the project.
You can click this link to integrate HMS Core into your project.
Adding Dependency
After HMS Core is integrated, the app service library needs to be added as a dependency.
Code:
implementation 'com.huawei.hms:appservice:{version}' // Currently version is 5.0.4.302
We will create a checkUpdate method and call it in onCreate. An AppUpdateClient instance is obtained via the JosApps.getAppUpdateClient method; this object provides the methods related to app updates. The checkAppUpdate method checks for app updates after the app is launched and initialized.
Java:
private void checkUpdate() {
    AppUpdateClient client = JosApps.getAppUpdateClient(this);
    client.checkAppUpdate(this, new UpdateCallBack(this));
}
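As noted above, checkUpdate is meant to run once the activity is up; a minimal sketch of wiring it into onCreate (the layout name is a placeholder):
Java:
public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main); // hypothetical layout
        // Query AppGallery for an update as soon as the app launches.
        checkUpdate();
    }

    private void checkUpdate() {
        AppUpdateClient client = JosApps.getAppUpdateClient(this);
        client.checkAppUpdate(this, new UpdateCallBack(this));
    }

    // UpdateCallBack is the static inner class shown below.
}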
We need to create a static class, UpdateCallBack, that implements CheckUpdateCallBack. CheckUpdateCallBack returns the result of checking for app updates. It requires the onUpdateInfo, onMarketInstallInfo, onMarketStoreError, and onUpdateStoreError methods to be implemented.
In the onUpdateInfo method, we can get the status code, fail code, fail reason, and other information.
For more information you can click this link.
Code:
private static class UpdateCallBack implements CheckUpdateCallBack {

    private final WeakReference<MainActivity> weakMainActivity;

    private UpdateCallBack(MainActivity mainActivity) {
        this.weakMainActivity = new WeakReference<>(mainActivity);
    }

    @Override
    public void onUpdateInfo(Intent intent) {
        if (intent != null) {
            MainActivity mainActivity = null;
            if (weakMainActivity.get() != null) {
                mainActivity = weakMainActivity.get();
            }
            int status = intent.getIntExtra(UpdateKey.STATUS, 100);
            int rtnCode = intent.getIntExtra(UpdateKey.FAIL_CODE, 200);
            String rtnMessage = intent.getStringExtra(UpdateKey.FAIL_REASON);
            Serializable info = intent.getSerializableExtra(UpdateKey.INFO);
            if (info instanceof ApkUpgradeInfo && mainActivity != null) {
                AppUpdateClient client = JosApps.getAppUpdateClient(mainActivity);
                // Force Update option is selected as false.
                client.showUpdateDialog(mainActivity, (ApkUpgradeInfo) info, false);
                Log.i("AppGalleryKit", "checkUpdatePop success");
            }
            if (mainActivity != null) {
                // status --> 3: constant value NO_UPGRADE_INFO, indicating that no update is available.
                Log.i("AppGalleryKit", "onUpdateInfo status: " + status + ", rtnCode: "
                        + rtnCode + ", rtnMessage: " + rtnMessage);
            }
        }
    }

    @Override
    public void onMarketInstallInfo(Intent intent) {
        // onMarketInstallInfo
    }

    @Override
    public void onMarketStoreError(int i) {
        // onMarketStoreError
    }

    @Override
    public void onUpdateStoreError(int i) {
        // onUpdateStoreError
    }
}
In this example, because the application is not released in the market, we got a status code equal to 3, which indicates that no upgrade is needed for the application.
For the full list of status codes and more details, you can check the AppGalleryKit App and AppGalleryKit Game development guide links in the reference section. You can also download this demo application from the GitHub link.
Reference
AppGalleryKit App
AppGalleryKit Game
Github

Solution to Creating an Image Classifier

I don't know if it's the same for you, but I always get frustrated when sorting through my phone's album. It seems to take forever before I can find the image that I want to use. As a coder, I can't help but wonder if there's a solution for this. Is there a way to organize an entire album? Well, let's take a look at how to develop an image classifier using a service called image classification.
Development Preparations
1. Configure the Maven repository address for the SDK to be used.
Java:
repositories {
    maven {
        // Public HMS Core Maven repository.
        url 'https://developer.huawei.com/repo/'
    }
}
2. Integrate the image classification SDK.
Java:
dependencies {
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-classification:3.3.0.300'
    // Import the image classification model package.
    implementation 'com.huawei.hms:ml-computer-vision-image-classification-model:3.3.0.300'
}
Project Configuration
1. Set the authentication information for the app.
This information can be set through an API key or access token.
Use the setAccessToken method to set an access token during app initialization. This needs to be set only once.
Java:
MLApplication.getInstance().setAccessToken("your access token");
Or, use setApiKey to set an API key during app initialization. This needs to be set only once.
Java:
MLApplication.getInstance().setApiKey("your ApiKey");
2. Create an image classification analyzer in on-device static image detection mode.
Java:
// Method 1: Use customized parameter settings for device-based recognition.
MLLocalClassificationAnalyzerSetting setting =
        new MLLocalClassificationAnalyzerSetting.Factory()
                .setMinAcceptablePossibility(0.8f)
                .create();
MLImageClassificationAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer(setting);

// Method 2: Use default parameter settings for on-device recognition.
MLImageClassificationAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalImageClassificationAnalyzer();
3. Create an MLFrame object.
Java:
// Create an MLFrame object using the bitmap which is the image data in bitmap format. JPG, JPEG, PNG, and BMP images are supported. It is recommended that the image dimensions be greater than or equal to 112 x 112 px.
MLFrame frame = MLFrame.fromBitmap(bitmap);
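The bitmap itself can come from anywhere; for an album-sorting use case you would typically decode it from an image file first (standard Android, nothing HMS-specific; the path below is hypothetical):
Java:
// Decode an image file from storage into the bitmap passed to MLFrame.fromBitmap().
Bitmap bitmap = BitmapFactory.decodeFile("/sdcard/Pictures/sample.jpg"); // hypothetical path
MLFrame frame = MLFrame.fromBitmap(bitmap);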
4. Call asyncAnalyseFrame to classify images.
Java:
Task<List<MLImageClassification>> task = analyzer.asyncAnalyseFrame(frame);
task.addOnSuccessListener(new OnSuccessListener<List<MLImageClassification>>() {
    @Override
    public void onSuccess(List<MLImageClassification> classifications) {
        // Recognition success.
        // Callback when the MLImageClassification list is returned, to obtain information like image categories.
    }
}).addOnFailureListener(new OnFailureListener() {
    @Override
    public void onFailure(Exception e) {
        // Recognition failure.
        try {
            MLException mlException = (MLException) e;
            // Obtain the result code. You can process the result code and customize relevant messages displayed to users.
            int errorCode = mlException.getErrCode();
            // Obtain the error message. You can quickly locate the fault based on the result code.
            String errorMessage = mlException.getMessage();
        } catch (Exception error) {
            // Handle the conversion error.
        }
    }
});
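Inside onSuccess, each MLImageClassification item carries a category label and a confidence value. A small helper of the kind you might call from the callback, assuming the getName and getPossibility accessors as I understand them from the SDK (verify against the API reference):
Java:
// Keep only labels whose confidence clears the same 0.8 threshold used in the analyzer setting.
static List<String> extractLabels(List<MLImageClassification> classifications) {
    List<String> labels = new ArrayList<>();
    for (MLImageClassification classification : classifications) {
        if (classification.getPossibility() >= 0.8f) {
            labels.add(classification.getName());
        }
    }
    return labels; // e.g. use these labels to group photos into albums
}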
5. Stop the analyzer after recognition is complete.
Java:
try {
    if (analyzer != null) {
        analyzer.stop();
    }
} catch (IOException e) {
    // Exception handling.
}
Demo
Remarks
The image classification capability supports the on-device static image detection mode, on-cloud static image detection mode, and camera stream detection mode. The demo here illustrates only the first mode.
I came up with a bunch of application scenarios for image classification. For example, education apps: with the help of image classification, such an app lets its users categorize images taken in a given period into different albums. Travel apps: image classification allows such apps to classify images according to where they were taken or by the objects in them. File sharing apps: image classification lets users upload and share images by image category.
References
>>Image classification Development Guide
>>Reddit to join developer discussions
>>GitHub to download the sample code
>>Stack Overflow to solve integration problems
