What barcodes can I scan with HMS Scan Kit?

Hello, I want to make an app that scans barcodes. Does anyone know what kinds of barcodes I can scan with HMS?

Hello,
Scanning Barcodes
Scan Kit supports 13 major barcode formats (listed below). If your app requires only some of them, specify the desired formats to speed up barcode scanning.
1D barcodes: EAN-8, EAN-13, UPC-A, UPC-E, Codabar, Code 39, Code 93, Code 128, and ITF
2D barcodes: QR code, Data Matrix, PDF417, and Aztec
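If your app only needs a subset of formats, you can restrict the scan types when creating the scanning options. Here is a minimal Java sketch using Scan Kit's default view mode; REQUEST_CODE_SCAN is an arbitrary request code you define in your activity:
Code:
import com.huawei.hms.hmsscankit.ScanUtil;
import com.huawei.hms.ml.scan.HmsScan;
import com.huawei.hms.ml.scan.HmsScanAnalyzerOptions;

// Restrict scanning to QR code and Data Matrix to speed up detection.
HmsScanAnalyzerOptions options = new HmsScanAnalyzerOptions.Creator()
        .setHmsScanTypes(HmsScan.QRCODE_SCAN_TYPE, HmsScan.DATAMATRIX_SCAN_TYPE)
        .create();
// Launch the default scanning UI; the result arrives in onActivityResult().
ScanUtil.startScan(this, REQUEST_CODE_SCAN, options);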
You can find more information at https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/use-cases-0000001050043945

Related

[free app] Maps Tool

Maps Tool - a new free app (with no ads either) for working with GPS coordinates, maps, and grids, including the Public Land Survey System (township, range, section)!
Do you happen to own some real paper maps? If so, my new app is designed exactly to make using them simpler. Or maybe you are just interested in learning something about geodesy, navigation, and cartography? If so, read on.
While most online and other digital maps are now standardized around decimal-degree latitude/longitude pairs on the WGS84 geodetic datum, paper maps are not so simple. They use all kinds of different coordinate systems. Many use UTM, the Universal Transverse Mercator coordinate system developed by the United States Army Corps of Engineers in the 1940s. NATO militaries around the world wanted their own thing and developed another grid coordinate system called MGRS. While these two are the most popular, there are many other grids, especially at the local level. Maps in the UK often use the Ordnance Survey National Grid (OSGB), while Ireland uses yet another system called the Irish Grid.
To make things even more complicated, in the years before the recent standardization around WGS84, maps often used other geodetic datums, based on different models of the Earth ellipsoid and different measurements. In fact, many historic reference ellipsoids are still used in local areas, since they may provide better accuracy in certain territories. In practice this means that reading coordinates while assuming the wrong datum can put you hundreds of meters off.
Maps Tool is a powerful app that converts data between many coordinate system representations and also adjusts for various datums. The app currently supports lat/lng, UTM, MGRS, the Earth-centered Earth-fixed (ECEF) Cartesian coordinate system, Swedish Grid, Ordnance Survey of Great Britain, and Irish National Grid, plus many different geodetic datums, including, but not limited to: WGS 84, NAD 27, ED 50, ETRF 89, OSGB 36, OSI 65, RT 90, SWEREF 99, SIRGAS 2000, SAD 69, Córrego Alegre, SICAD, Astro Chuá and SK 42.
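For a sense of what such a conversion involves under the hood (this is illustrative, not the app's own code), here is a minimal Java sketch using the open-source Proj4J library; the EPSG codes select WGS84 lat/lng as the source and UTM zone 33N as the target, and the sample point is arbitrary:
Code:
import org.locationtech.proj4j.CRSFactory;
import org.locationtech.proj4j.CoordinateReferenceSystem;
import org.locationtech.proj4j.CoordinateTransform;
import org.locationtech.proj4j.CoordinateTransformFactory;
import org.locationtech.proj4j.ProjCoordinate;

public class CoordinateDemo {
    public static void main(String[] args) {
        CRSFactory crsFactory = new CRSFactory();
        // EPSG:4326 = WGS84 lat/lng; EPSG:32633 = UTM zone 33N on WGS84.
        CoordinateReferenceSystem wgs84 = crsFactory.createFromName("EPSG:4326");
        CoordinateReferenceSystem utm33 = crsFactory.createFromName("EPSG:32633");
        CoordinateTransform transform =
                new CoordinateTransformFactory().createTransform(wgs84, utm33);
        // Proj4J expects (longitude, latitude) order for geographic coordinates.
        ProjCoordinate source = new ProjCoordinate(15.0, 52.0);
        ProjCoordinate target = new ProjCoordinate();
        transform.transform(source, target);
        System.out.printf("Easting: %.1f m, Northing: %.1f m%n", target.x, target.y);
    }
}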
By default Maps Tool uses the offline HERE+ maps already stored on your phone, but it can also use Bing Maps, Google Maps, and OpenStreetMap for better coverage. Overlays such as topographic maps and satellite imagery are also available. Other handy functions of Maps Tool include geocoding (finding a location by address), reverse geocoding (finding the closest address for a given location), sharing your location via SMS/email, and a built-in database of around 3,000 major cities and 9,000 world airports for quick access (you can just key in a 3-letter IATA or 4-letter ICAO code).
Using a web service from the BLM (Bureau of Land Management), Maps Tool also gives you access to PLSS (Public Land Survey System) information (township, range, section) as a convenient map overlay (you can search for parcels by TRS or view parcels around you by tapping the map).
Last but not least, the Maps Tool app lets you see rough altitude information from the GPS module in your phone, or (as an optional download) query various GIS systems for accurate topographical altitude data; currently supported sources are USGS (US only), SRTM3, GTOPO30, ASTER, and Google.
Best of all, this app does not require a data connection to work, so you can take it with you into the field.
It is currently translated into English, Russian, and Japanese; if you can help with more languages, please let me know.
Here is the store link:
http://www.windowsphone.com/s?appid=4bf54046-e1a3-4140-9f6f-7516948a5d29

Do you want to deploy an Android application in AppGallery?

This article will help you understand how easily you can add any Huawei Mobile Services kit to your Android application.
===== Article Takeaway =====
After reading this article, you will be able to add any SDK to your Android project without any hassle or blockage. It will increase efficiency and reduce the chance of errors.
Automatic HMS kit search
Automatic permission addition
Code snippets for API usage
Add an API by dragging its code snippet
#android #hms #huawei #androiddev #huaweifacts
bit.ly/HMS-Toolkit
Tshrsri said:
This article will help you understand how easily you can add any Huawei Mobile Services kit to your Android application. ...
Hi, Tshrsri. Where can I see the complete article?

Custom Model Generation with MindSpore Lite | HMS ML Kit

MindSpore is an open-source framework for AI-based application development announced by Huawei. It is a robust alternative to widely used AI frameworks such as TensorFlow and PyTorch.
Let’s start by emphasizing the features and advantages of the MindSpore framework:
MindSpore implements AI algorithms for easier model development and, together with Huawei AI processors, provides cutting-edge technology that improves runtime efficiency and computing performance.
One of its advantages is that it can be used in several environments: on devices, in the cloud, and at the edge. It supports operating systems such as iOS and Android, and AI applications on various devices such as mobile phones, tablets, and IoT devices.
MindSpore supports parallel training across hardware to reduce training time, maximizing hardware computing power while minimizing inference latency and power consumption.
It provides dynamic debugging, which enables developers to find bugs in their apps easily.
According to Huawei, MindSpore does not process data itself but ingests only the gradient and model information that has already been processed. This ensures the integrity of sensitive data.
MindSpore Lite is an inference framework for custom models provided through HMS ML Kit to simplify integration and development. Developers can define their own models and implement model inference using MindSpore Lite's capabilities.
MindSpore Lite is compatible with commonly used AI platforms such as TensorFlow Lite, Caffe, and ONNX. Models from these platforms can be converted into the .ms (MindSpore) format and then run directly, as illustrated below.
Custom models are compressed and occupy little storage, so they can be deployed and executed easily.
It provides complete APIs for integrating the inference framework of an on-device custom model.
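As an illustration of the conversion step mentioned above, the MindSpore Lite converter tool can turn, for example, a TensorFlow Lite model into the .ms format (the file names here are placeholders):
Code:
./converter_lite --fmk=TFLITE --modelFile=plant_model.tflite --outputFile=plant_model
And here is a minimal sketch of on-device inference with the ML Kit custom model API; the model name, asset file name, and tensor shapes are assumptions for a 224 x 224 RGB input with three output classes:
Code:
try {
    // Load the .ms model bundled in the app's assets (names are hypothetical).
    MLCustomLocalModel localModel = new MLCustomLocalModel.Factory("plant_model")
            .setAssetPathFile("plant_model.ms")
            .create();
    MLModelExecutorSettings settings =
            new MLModelExecutorSettings.Factory(localModel).create();
    MLModelExecutor modelExecutor = MLModelExecutor.getInstance(settings);

    // Describe the input/output tensors; shapes depend on how the model was trained.
    MLModelInputOutputSettings ioSettings = new MLModelInputOutputSettings.Factory()
            .setInputFormat(0, MLModelDataType.FLOAT32, new int[]{1, 224, 224, 3})
            .setOutputFormat(0, MLModelDataType.FLOAT32, new int[]{1, 3})
            .create();

    // inputData is a float[1][224][224][3] array built from the bitmap to classify.
    MLModelInputs inputs = new MLModelInputs.Factory().add(inputData).create();
    modelExecutor.exec(inputs, ioSettings)
            .addOnSuccessListener(outputs -> {
                float[][] probabilities = outputs.getOutput(0);
                // probabilities[0] holds the scores for each plant category.
            })
            .addOnFailureListener(Throwable::printStackTrace);
} catch (MLException e) {
    e.printStackTrace();
}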
HMS ML Kit enables you to train and generate custom models with deep machine learning. It also offers a pre-trained image classification model. You can develop your own custom model by using the Transfer Learning feature of ML Kit with a specific dataset.
I will explain how to train your own model using an example with three plant categories. We will use a small data set for reference and train an image classification model to identify cactus, pine, and succulent plants. The model will be created using the HMS Toolkit plug-in and AI Create.
HMS Toolkit: A lightweight IDE plug-in that supports app creation, coding, conversion, debugging, testing, and release. It helps you integrate HMS Core APIs at lower cost and higher efficiency.
AI Create: Provides the transfer learning capabilities for image classification and text classification, so images and texts can be identified. It uses MindSpore as the training framework and MindSpore Lite as the inference framework.
Note: Install the HMS Toolkit plug-in from the Android Studio marketplace. Go to File > Settings > Plugins > Marketplace, enter HMS Toolkit in the search box, and click Install. After the installation completes, restart Android Studio.
First, we should prepare the environment to train our model. AI Create currently supports only the Windows operating system. Open Coding Assistant from the new HMS menu added by the HMS Toolkit plug-in, go to AI > AI Create, select Image, and click Confirm for image classification.
After this step, HMS Toolkit automatically downloads the required resources. If the Python environment is not configured, a dialog box is displayed prompting you to set it up.
Note: You should download and install Python 3.7.5 from the link to use AI Create. After installation, do not forget to add the Python installation path to the Path variable in Environment Variables, then restart Android Studio.
Once the environment is ready, selecting Image and clicking Confirm in AI Create automatically starts installing MindSpore. Make sure the framework has been installed successfully by checking the event logs.
A new model section then opens where you select an image folder to train your own model. You should prepare your data set in accordance with the requirements. For our demo, we will train the model to identify cactus, succulent, and pine plants with a small data set.
The folder structure should look like the example below:
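A hypothetical layout for our three plant categories (each subfolder presumably becomes one class label):
Code:
plant_dataset/
    cactus/
        cactus_001.jpg
        cactus_002.jpg
        ...
    pine/
        pine_001.jpg
        ...
    succulent/
        succulent_001.jpg
        ...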
The following conditions should be met for image resources:
The minimum number of pictures for each category of training data is 10.
The number of categories in the training data set must be at least 2 and at most 1,000.
Supported image formats: .bmp, .jpg, .jpeg, .png or .gif.
After the training image folder is selected, set the output model file path and the training parameters. If you check HMS Custom Model, a complete model will be generated. The training parameters affect the accuracy of the image recognition model; you can modify them if you have experience with deep learning. When you click Create Model, MindSpore starts training your model on the data set.
The training process takes time depending on your data set. Since we used a small data set that just meets the minimum requirements, it completes quickly. You can also track the training logs; your model will be created at the specified path at the end of the process.
The training results are shared after model training is complete. AI Create lets you test your model by adding test images before using it in any project. You can also generate a demo project that implements your new model with the Generate Demo option.
You should create a new test image folder with the same structure as the provided data set.
As shown in the training results, our test average accuracy is calculated as 57.1%. This accuracy can be improved with a more comprehensive data set and more training.
You can also try out your new model in a demo project created by HMS Toolkit. After the demo is created, you can directly build and run the project and check the results on a real device.
In this article, I shared basic information about MindSpore and how we can use the Transfer Learning function of HMS for custom models.
You can also develop your own classification model using this post as a reference. I hope it will be useful for you!
Please follow our next articles for more details about ML Kit Custom Model and MindSpore.
References
https://developer.huawei.com/consum...ore-Guides/ml-mindspore-lite-0000001055328885
https://www.mindspore.cn/lite/en

Free analytics for over 2.3 million developers worldwide provided by GameAnalytics and Huawei HMS

As an industry-leading analytics company for mobile games, GameAnalytics empowers developers, helping them build more engaging user experiences through a strong understanding of their game's core metrics. GameAnalytics provides critical insights to help developers tap into the potential of their mobile games by collecting, analyzing, and presenting game performance data. This includes everything from basic metrics (such as active users, retention, and playtime) to more advanced analytics covering ad revenue, virtual currency, and level progression.
The partnership brings mutual benefits to both parties; Huawei can continue growing its ecosystem with more platform partners and further support the success of its game developers. Simultaneously, GameAnalytics can leverage Huawei's technological capabilities, enhancing its Android SDK to support OAID (Huawei Ads Kit) across all mobile devices. This enables HMS ecosystem game developers to integrate with the platform, to unlock deeper analysis and continue to grow their games.
Used by nearly 100,000 developers and over 63,000 studios worldwide, GameAnalytics supports around 100,000 active games, providing the data mobile developers need to achieve their growth goals. GameAnalytics can be integrated in less than 15 minutes, and its core analytics tool is free and will remain so.
How to get started
If you’re a Huawei game developer, or want to become one and use GameAnalytics, here’s what you need to do:
1. Download and integrate our Android SDK. Head to our documentation to learn how (see the sketch after these steps).
2. Integrate the Huawei Ads SDK by following these steps.
3. And you’re done! The Android SDK can now automatically detect whether an OAID is available to be used.
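As a rough illustration of step 1 (a sketch based on GameAnalytics' public Android SDK, not official documentation), initialization typically looks like this; the game key and secret key are placeholders from your GameAnalytics dashboard:
Code:
import com.gameanalytics.sdk.GameAnalytics;

public class MyGameActivity extends android.app.Activity {
    @Override
    protected void onCreate(android.os.Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Optional: tag events with your build version.
        GameAnalytics.configureBuild("android 1.0.0");
        // Placeholders for the keys from your GameAnalytics dashboard.
        GameAnalytics.initializeWithGameKey(this, "GAME_KEY", "SECRET_KEY");
    }
}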
Please note that if both a GAID and an OAID are available on a player’s device, the GAID will be used as the primary device ID, though both will be collected and made available in features like the Player Warehouse, Event Export, and Raw Export. If you have any questions, just get in touch with our friendly support team.
Find out more here: https://gameanalytics.com/product-updates/gameanalytics-partners-with-huawei
For more information, please visit the Huawei Partner Site and the GameAnalytics SDK integration guide.

Intuitive Controls with AR-based Gesture Recognition

The emergence of AR technology has allowed us to interact with our devices in new and unexpected ways. Smart devices, from PCs to mobile phones and beyond, have become dramatically simpler to operate: interactions have been streamlined to the point where only slides and taps are required, and even children as young as 2 or 3 can use them.
Rather than having to rely on tools like keyboards, mice, and touchscreens, we can now control devices in a refreshingly natural and easy way. Traditional interactions with smart devices have tended to be cumbersome and unintuitive, and there is a hunger for new, engaging methods, particularly among young people. Many developers have taken heed of this, building practical but exhilarating AR features into their apps. For example, during live streams, or when shooting videos or images, AR-based apps allow users to add stickers and special effects with newfound ease, simply by striking a pose. In smart home scenarios, users can turn appliances on and off or switch settings with specific gestures, without any screen operations required. And when dancing using a video game console, the dancer can raise a palm to pause or resume the game at any time, or swipe left or right to switch between settings, without having to touch the console itself.
So what is the technology behind these groundbreaking interactions between humans and devices?
HMS Core AR Engine is a preferred choice among AR app developers. Its SDK provides AR-based capabilities that streamline the development process. This SDK is able to recognize specific gestures with a high level of accuracy, output the recognition result, and provide the screen coordinates of the palm detection box, and both the left and right hands can be recognized. However, it is important to note that when there are multiple hands within an image, only the recognition results and coordinates from the hand that has been most clearly captured, with the highest degree of confidence, will be sent back to your app. You can switch freely between the front and rear cameras during the recognition.
Gesture recognition allows you to place virtual objects in the user's hand and trigger certain states based on changes to the hand gestures, providing a wealth of fun interactions within your AR app.
The hand skeleton tracking capability works by detecting and tracking the positions and postures of up to 21 hand joints in real time, and generating true-to-life hand skeleton models with attributes like fingertip endpoints and palm orientation, as well as the hand skeleton itself.
AR Engine detects the hand skeleton in a precise manner, allowing your app to superimpose virtual objects on the hand with a high degree of accuracy, including on the fingertips or palm. You can also perform a greater number of precise operations on virtual hands and objects, to enrich your AR app with fun new experiences and interactions.
Getting Started
Prepare the development environment as follows:
JDK: 1.8.211 or later
Android Studio: 3.0 or later
minSdkVersion: 26 or later
targetSdkVersion: 29 (recommended)
compileSdkVersion: 29 (recommended)
Gradle version: 6.1.1 or later (recommended)
Before getting started, make sure that the AR Engine APK is installed on the device. You can download it from AppGallery. See the AR Engine documentation to learn on which devices you can test the demo.
Note that you will need to first register as a Huawei developer and verify your identity on HUAWEI Developers. Then, you will be able to integrate the AR Engine SDK via the Maven repository in Android Studio. Check which Gradle plugin version you are using, and configure the Maven repository address according to the specific version.
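A minimal sketch of that configuration, assuming a Gradle plugin version earlier than 7.0 (newer versions declare repositories in settings.gradle instead); replace {version} with the actual AR Engine SDK version:
Code:
// Project-level build.gradle
buildscript {
    repositories {
        google()
        mavenCentral()
        // Huawei Maven repository.
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

// App-level build.gradle
dependencies {
    implementation 'com.huawei.hms:arenginesdk:{version}'
}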
App Development
1. Check whether AR Engine has been installed on the current device. Your app can run properly only on devices with AR Engine installed. If it is not installed, prompt the user to download and install AR Engine, for example by redirecting them to AppGallery. The sample code is as follows:
Code:
boolean isInstallArEngineApk = AREnginesApk.isAREngineApkReady(this);
if (!isInstallArEngineApk) {
    // ConnectAppMarketActivity.class is the activity for redirecting users to AppGallery.
    startActivity(new Intent(this, com.huawei.arengine.demos.common.ConnectAppMarketActivity.class));
    isRemindInstall = true;
}
2. Initialize an AR scene. AR Engine supports the following five scenes: motion tracking (ARWorldTrackingConfig), face tracking (ARFaceTrackingConfig), hand recognition (ARHandTrackingConfig), human body tracking (ARBodyTrackingConfig), and image recognition (ARImageTrackingConfig).
Call ARHandTrackingConfig to initialize the hand recognition scene.
Code:
mArSession = new ARSession(context);
ARHandTrackingConfig config = new ARHandTrackingConfig(mArSession);
3. After obtaining an ARHandTrackingConfig object, you can set the front or rear camera as follows.
Code:
config.setCameraLensFacing(ARConfigBase.CameraLensFacing.FRONT);
4. After obtaining the config, configure it in the ARSession and start hand recognition.
Code:
mArSession.configure(config);
mArSession.resume();
5. Initialize the HandSkeletonLineDisplay class, which draws the hand skeleton based on the coordinates of the hand skeleton points.
Code:
class HandSkeletonLineDisplay implements HandRelatedDisplay {
    // Initialization method.
    public void init() {
    }

    // Method for drawing the hand skeleton. When calling this method, you need to pass the ARHand objects to obtain data.
    public void onDrawFrame(Collection<ARHand> hands) {
        for (ARHand hand : hands) {
            // Call the getHandskeletonArray() method to obtain the coordinates of hand skeleton points.
            float[] handSkeletons = hand.getHandskeletonArray();
            // Pass handSkeletons to the method for updating data in real time.
            updateHandSkeletonLinesData(handSkeletons);
        }
    }

    // Method for updating the hand skeleton point connection data. Call this method when any frame is updated.
    private void updateHandSkeletonLinesData(float[] handSkeletons) {
        // Create and initialize the data stored in the buffer object.
        GLES20.glBufferData(..., mVboSize, ...);
        // Update the data in the buffer object.
        GLES20.glBufferSubData(..., mPointsNum, ...);
    }
}
6. Initialize the HandRenderManager class, which is used to render the data obtained from AR Engine.
Code:
public class HandRenderManager implements GLSurfaceView.Renderer {
    // Set the ARSession object to obtain the latest data in the onDrawFrame method.
    public void setArSession(ARSession arSession) {
        mSession = arSession;
    }
}
7. Implement the onDrawFrame() method in the HandRenderManager class.
Code:
public void onDrawFrame(GL10 gl) {
    // In this method, call methods such as setCameraTextureName() and update() to update the calculation result of AR Engine.
    // Call this API when the latest data is obtained.
    // textureId is the OpenGL external texture ID used for the camera preview.
    mSession.setCameraTextureName(textureId);
    ARFrame arFrame = mSession.update();
    ARCamera arCamera = arFrame.getCamera();
    // Obtain the tracking result returned during hand tracking.
    Collection<ARHand> hands = mSession.getAllTrackables(ARHand.class);
    // Pass each obtained hand object to the method for updating gesture recognition information.
    for (ARHand hand : hands) {
        updateMessageData(hand);
    }
}
8. On the HandActivity page, set a renderer for the SurfaceView.
Code:
mSurfaceView.setRenderer(mHandRenderManager);
// Set the rendering mode.
mSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_CONTINUOUSLY);
Physical controls and gesture-based interactions each come with unique advantages and disadvantages. For example, gestures cannot provide the tactile feedback of physical keys, which is especially crucial for shooting games, in which pulling the trigger is an essential operation; but in simulation games and social networking, gesture-based interactions offer a high level of versatility.
Gestures cannot replace physical controls in situations that require tactile feedback, and physical controls cannot naturally reproduce the effects of hand movements and complex gestures; but there is no doubt that gestures will become indispensable to future smart device interactions.
Many somatosensory games, smart home appliances, and camera-dependent games now use AR to offer a diverse range of smart, convenient features. Common gestures include eye movements, pinches, taps, swipes, and shakes, which users can perform without any additional learning. These gestures are captured and identified by mobile devices and used to implement specific functions. When developing an AR-based mobile app, you will first need to enable your app to identify these gestures. AR Engine dramatically streamlines this development process: integrate the SDK to equip your app with the ability to accurately identify common user gestures and trigger corresponding operations. Try out the toolkit for yourself to explore a treasure trove of powerful, interesting AR features.
References
AR Engine Development Guide
AR Engine Sample Code
