How Can an App Show More POI Details to Users

With the increasing popularity of the mobile Internet, mobile apps are becoming an integral part of our daily lives and provide increasingly diverse functions that bring many benefits to users. One such function is searching for points of interest (POIs), or places such as banks and restaurants, in an app.
When a user searches for a POI in an app, besides general information about the POI, such as the name and location, they also expect to be shown other relevant details. For example, when searching for a POI in a taxi-hailing app, a user usually expects the app to display both the searched POI and other nearby POIs, so that the user can select the most convenient pick-up and drop-off point. When searching for a bank branch in a mobile banking app, a user usually wants the app to show both the searched bank branch and nearby POIs of a similar type and their details such as business hours, telephone numbers, and nearby roads.
However, showing POI details in an app is usually a challenge for developers of non-map-related apps, because it requires a large amount of detailed POI data that is generally hard for most app developers to collect. So, wouldn't it be great if there were a service that an app could use to provide users with POI information (such as business hours and ratings) when they search for different types of POIs (such as hotels, restaurants, and scenic spots) in the app?
Fortunately, HMS Core Site Kit provides a one-stop POI search service, which boasts more than 260 million POIs in over 200 countries and regions around the world. In addition, the service supports more than 70 languages, empowering users to search for places in their own native languages. The place detail search function in the kit allows an app to obtain information about a POI, such as the name, address, and longitude and latitude, based on the unique ID of the POI. For example, a user can search for nearby bank branches in a mobile banking app, and view information about each branch, such as their business hours and telephone numbers, or search for the location of a scenic spot and view information about nearby hotels and weather forecasts in a travel app, thanks to the place detail search function. The place detail search function can even be utilized by location-based games that can use the function to show in-game tasks and rankings of other players at a POI when a player searches for the POI in the game.
The integration process for this kit is straightforward, as I'll demonstrate below.
Demo​
Integration Procedure​
Preparations​
Before getting started, you'll need to make some preparations, such as configuring your app information in AppGallery Connect, integrating the Site SDK, and configuring the obfuscation configuration file.
If you use Android Studio, you can integrate the SDK into your project via the Maven repository. The purpose of the obfuscation configuration file is to prevent the SDK from being obfuscated.
You can follow the instructions here to make the relevant preparations. In this article, I won't be describing the preparation steps.
Developing Place Detail Search​
After making relevant preparations, you will need to implement the place detail search function for obtaining POI details. The process is as follows:
1. Declare a SearchService object and use SearchServiceFactory to instantiate the object.
2. Create a DetailSearchRequest object and set relevant parameters.
The object will be used as the request body for searching for POI details. Relevant parameters are as follows:
siteId: ID of a POI. This parameter is mandatory.
language: language in which search results are displayed. English will be used if no language is specified, and if English is unavailable, the local language will be used.
children: indicates whether to return information about child nodes of the POI. The default value is false, indicating that child node information is not returned. If this parameter is set to true, all information about child nodes of the POI will be returned.
3. Create a SearchResultListener object to listen for the search result.
4. Use the created SearchService object to call the detailSearch() method and pass the created DetailSearchRequest and SearchResultListener objects to the method.
5. Obtain the DetailSearchResponse object using the created SearchResultListener object. You can obtain a Site object from the DetailSearchResponse object and then parse it to obtain the search results.
The sample code is as follows:
Code:
// Declare a SearchService object.
private SearchService searchService;
// Create a SearchService instance.
searchService = SearchServiceFactory.create(this, "API key");
// Create a request body.
DetailSearchRequest request = new DetailSearchRequest();
request.setSiteId("C2B922CC4651907A1C463127836D3957");
request.setLanguage("fr");
request.setChildren(false);
// Create a search result listener.
SearchResultListener<DetailSearchResponse> resultListener = new SearchResultListener<DetailSearchResponse>() {
// Return the search result when the search is successful.
@Override
public void onSearchResult(DetailSearchResponse result) {
Site site;
if (result == null || (site = result.getSite()) == null) {
return;
}
Log.i("TAG", String.format("siteId: '%s', name: %s\r\n", site.getSiteId(), site.getName()));
}
// Return the result code and description when a search exception occurs.
@Override
public void onSearchError(SearchStatus status) {
Log.i("TAG", "Error : " + status.getErrorCode() + " " + status.getErrorMessage());
}
};
// Call the place detail search API.
searchService.detailSearch(request, resultListener);
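If you also want to surface the richer details mentioned earlier, such as the address, coordinates, and other POI attributes, they can be read from the same Site object inside onSearchResult(). The snippet below is a minimal sketch; the Coordinate and Poi accessors used here are assumptions based on the Site Kit data model, so verify them against the current API reference before use.
Code:
// Inside onSearchResult(), after the null checks above.
// Formatted address of the POI.
Log.i("TAG", "address: " + site.getFormatAddress());
// Coordinates of the POI (assumed Coordinate accessors).
Coordinate location = site.getLocation();
if (location != null) {
Log.i("TAG", "lat/lng: " + location.getLat() + ", " + location.getLng());
}
// Additional POI attributes such as rating and phone number (assumed Poi accessors).
Poi poi = site.getPoi();
if (poi != null) {
Log.i("TAG", "rating: " + poi.getRating() + ", phone: " + poi.getPhone());
}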
You have now completed the integration process and your app should be able to show users details about the POIs they search for.
Conclusion​
Mobile apps are now an integral part of our daily lives. To provide users with a more convenient experience, apps are offering more and more functions, such as POI search.
When searching for POIs in an app, besides general information such as the name and location of the POI, users usually expect to be shown other context-relevant information as well, such as business hours and similar POIs nearby. However, showing POI details in an app can be challenging for developers of non-map-related apps, because it requires a large amount of detailed POI data that is usually hard to collect for most app developers.
In this article, I demonstrated how I solved this challenge using the place detail search function, which allows my app to show POI details to users. The whole integration process is straightforward and cost-efficient, and is an effective way to show POI details to users.


All About Maps - Episode 1: Showing Routes from GPX files on Maps

For more articles like this one, visit the HUAWEI Developer Forum and Medium.​
All About Maps
Let's talk about maps. I started an open source project called All About Maps (https://github.com/ulusoyca/AllAboutMaps). In this project, I aim to demonstrate how we can implement the same map-related use cases with different map providers in one codebase. We will use Mapbox Maps, Google Maps, and Huawei HMS Map Kit. This project uses the following libraries and patterns:
MVVM pattern with Android Jetpack Libraries
Kotlin Coroutines for asynchronous operations
Dagger2 Dependency Injection
Android Clean Architecture
Note: The codebase changes over time. You can always find the latest code in the develop branch. The code at the time this article was written can be seen by choosing the tag episode_1-parse-gpx:
https://github.com/ulusoyca/AllAboutMaps/tree/episode_1-parse-gpx/
Motivation
Why do we need maps in our apps? What are the features a developer would expect from a map SDK? Let's try to list some:
Showing a coordinate on a map with camera options (zoom, tilt, latitude, longitude, bearing)
Adding symbols, photos, polylines, and polygons to the map
Handling user gestures (click, pinch, and move events)
Showing maps with different map styles (Outdoor, Hybrid, Satellite, Winter, Dark, etc.)
Data visualization (heatmaps, charts, clusters, time-lapse effects)
Offline map visualization (providing map tiles without network connectivity)
Generating a snapshot image of a bounded region
We can probably add more items, but I believe this is the list of features that all map provider companies would most likely offer. Knowing that we can achieve the same tasks with different map providers, we should not create huge dependencies on any specific provider in our codebase. When a product owner (PO) tells developers to switch from Google Maps to Mapbox Maps or Huawei Maps, developers should never see it as a big deal. It is software development. Business as usual.
One might wonder why a PO would want to switch from one map provider to another. In many cases, the reason is not technical. For example, Google Play Services may not be available on some devices or in regions like China. Another case is when company X, which has a subscription to Mapbox, acquires company Y, which uses Google Maps. In that case, transitioning to one provider is more efficient. Changes in the terms of service and pricing might be other motivations.
We need competition in the market! Let's switch easily when needed. But how do dependencies make things worse? Problematic dependencies in the codebase are usually created by developing software as if there were no tomorrow. It is not always the developers' fault. Tight schedules, anti-refactoring minded teams, and disorganized planning can lead to careless coding and eventually to technical debt. In this project, I aim to show how we can encapsulate the import lines below, belonging to three different map providers, into the minimum number of classes with minimal lines of code:
import com.huawei.hms.maps.*
import com.google.android.gms.maps.*
import com.mapbox.mapboxsdk.maps.*
It should be noted that the approach described in this post is just one proposal. There are always alternative and better implementations. In the end, as software developers, we should deliver our tasks time-efficiently, without over-engineering.
About the project
On the home page of the project you will see the list of tutorials. Since this is the first blog post, there is only one item for now. To make our life easier with RecyclerViews, I use the Epoxy library by Airbnb in this project. Once you click the buttons in the card, it will take you to the detail page. Using a bottom sheet, we can switch between map providers. Note that Huawei Map Kit requires a Huawei mobile phone.
In this first blog post, we will parse the GPX file of the 120 km route of the Cappadocia Ultra Trail race and show the route and checkpoints (food stations) on the map. I finished this race in 23 hours 45 minutes, and you can read about my experience here (https://link.medium.com/uWmrWLAzR6). GPX is an open standard that contains route points, which construct a polyline, and waypoints, which mark locations of interest. In this case, the waypoints represent the food and aid stations in the race. We will show the route with a polyline and the waypoints with markers on the map.
Architecture
Architecture is definitely not an overrated concept. Since the early days of Android, we have been seeking the best architectural patterns that suit Android development. We have heard of MVC, MVP, MVVM, and MVI, and many other patterns will emerge. Change and adaptation to new patterns are inevitable over time. We should keep in mind some basic and commonly accepted concepts like the SOLID principles, separation of concerns, maintainability, readability, testability, etc., so that we can switch between patterns easily when needed.
Nowadays, the widely accepted architecture in the Android community is modularization with Clean Architecture. If you have time to invest, I would strongly suggest Joe Birch's clean architecture tutorials. As Joe suggests in his tutorials, we do not have to apply every rule line by line; instead, we take whatever we feel is needed. Here is my take and how I modularized the All About Maps app:
Note that dependency injection with Dagger2 is the core of this implementation. If you are not familiar with the concept, I strongly suggest reading the best Dagger2 tutorial in the wild Dagger2 world, by Nimrod Dayan.
Domain Module
Many of us are excited to start the implementation with the UI to see results immediately, but we should patiently build our blocks. We shall start with the domain module, since that is where we put our business logic and define the entities and user interactions.
First question: What entities do we need for a Map app?
We don't have to put every entity at once. Since our first tutorial is about drawing polylines and symbols we will need the following data:
LatLng class which holds Latitude and Longitude
Point which represents a geo-coordinate.
RouteInfo that holds points to be used to draw route and waypoints
Let's see the implementations:
Code:
inline class Latitude(val value: Float)
inline class Longitude(val value: Float)
Code:
data class LatLng(
val latitude: Latitude,
val longitude: Longitude
)
Code:
data class Point(
val latitude: Latitude,
val longitude: Longitude,
val altitude: Float? = null,
val name: String? = null
) {
val latLng: LatLng
get() = LatLng(latitude, longitude)
}
Code:
data class RouteInfo(
val routePoints: List<Point> = emptyList(),
val wayPoints: List<Point> = emptyList()
)
I could have used the Float primitive type for the Latitude and Longitude fields. However, I strongly suggest taking advantage of Kotlin inline classes. In my relatively long career of working on maps, I have spent hours on issues caused by mistakenly using longitude values for latitude.
Note that a LatLng class is available in all map SDKs. However, all modules below the domain layer should use only our own LatLng to prevent a dependency on map SDKs in those modules. In the app layer we can map our LatLng class to the corresponding classes:
Code:
import com.ulusoy.allaboutmaps.domain.entities.LatLng
import com.mapbox.mapboxsdk.geometry.LatLng as MapboxLatLng
import com.huawei.hms.maps.model.LatLng as HuaweiLatLng
import com.google.android.gms.maps.model.LatLng as GoogleLatLng
fun LatLng.toMapboxLatLng() = MapboxLatLng(
latitude.value.toDouble(),
longitude.value.toDouble()
)
fun LatLng.toHuaweiLatLng() = HuaweiLatLng(
latitude.value.toDouble(),
longitude.value.toDouble()
)
fun LatLng.toGoogleLatLng() = GoogleLatLng(
latitude.value.toDouble(),
longitude.value.toDouble()
)
Second question: What actions can the user trigger?
The domain module contains the use cases (interactors) that an application can perform to achieve goals based on user interactions. The code in this module is less likely to change compared to other modules. Business is business. For example, this application has one job for now: showing the route info with a polyline and markers. It can get the route info from a web server, a database, or, in this case, from an application resource file, which is a GPX file. Neither the app module nor the domain module cares where the route points and waypoints are retrieved from. It is not their concern. The concerns are separated.
Let's see the use case definition in our domain module:
Code:
class GetRouteInfoUseCase
@Inject constructor(
private val routeInfoRepository: RouteInfoRepository
) {
suspend operator fun invoke(): RouteInfo {
return routeInfoRepository.getRouteInfo()
}
}
Code:
interface RouteInfoRepository {
suspend fun getRouteInfo(): RouteInfo
}
RouteInfoRepository is an interface that lives in the domain module and serves as a contract between the domain and datasource modules. Its concrete implementation lives in the datasource module.
Datasource Module
The datasource module is an abstraction world. Life here is based on interfaces. The domain module communicates with the datasource module through the repository interface; the datasource module then orchestrates the data flow in the repository class and returns the final value.
Here, the domain module asks for the route info. The datasource module decides what to return after retrieving data from different data sources. For the sake of simplicity, in this case we have only one datasource: a GPX parser. The route info is extracted from a GPX file. We don't know where or how. Let's see the code:
Here is the concrete implementation of the RouteInfoRepository interface. The route info datasource is injected into this class as a constructor parameter.
Code:
class RouteInfoDataRepository
@Inject constructor(
@Named("GPX_DATA_SOURCE")
private val gpxFileDatasource: RouteInfoDatasource
) : RouteInfoRepository {
override suspend fun getRouteInfo(): RouteInfo {
return gpxFileDatasource.parseGpxFile()
}
}
Here is our one and only route info datasource: GpxFileDatasource. It still doesn't know how to get the data from the GPX file. However, it knows where to get the data from, thanks to the GpxFileParser contract.
Code:
class GpxFileDatasource
@Inject constructor(
private val gpxFileParser: GpxFileParser
): RouteInfoDatasource {
override suspend fun parseGpxFile(): RouteInfo {
return gpxFileParser.parseGpxFile()
}
}
What is a GPX file? How is it parsed? Where is the file located? The datasource doesn't care about these details. It only knows that the concrete implementation of GpxFileParser will return the RouteInfo. Here is the contract between the datasource and the concrete implementation:
Code:
interface GpxFileParser {
suspend fun parseGpxFile(): RouteInfo
}
Is it already too confusing with too many abstractions around? Is it over-engineering? You might be right and choose to have fewer abstractions when you have only one datasource, as in this case. However, in the real world, we have multiple datasources. Data is all around us. It may come from a web server, from a database, or from connected devices such as wearables. The benefit shows up when things get more complicated with multiple datasources. Let's think through this more complicated scenario:
The app asks for the route info through a use case class.
The domain module forwards the request to the datasource.
The datasource module orchestrates the data in the repository class.
It first asks the web server (remote datasource) for the route info. However, the user is offline, so the remote datasource is not available.
Then it checks what we have locally, starting with whether the route info is available in the database.
It is not available in the database, but we have a GPX file in our resource folder (I know it doesn't quite make sense, but it serves as an example).
The repository class asks the GPX parser to parse the file and return the desired RouteInfo data.
Too complicated? Implementing this scenario in the repository class is actually this easy:
Code:
class RouteInfoDataRepository
@Inject constructor(
@Named("GPX_DATA_SOURCE")
private val gpxFileDatasource: RouteInfoDatasource,
@Named("REMOTE_DATA_SOURCE")
private val remoteDatasource: RouteInfoDatasource,
@Named("DATABASE_SOURCE")
private val localDatasource: RouteInfoDatasource
) : RouteInfoRepository {
override suspend fun getRouteInfo(): RouteInfo? {
var routeInfo = remoteDatasource.parseGpxFile()
if (routeInfo == null) {
Timber.d("Route info is not available in remote source, now trying local database")
routeInfo = localDatasource.parseGpxFile()
if (routeInfo == null) {
Timber.d("Route info is not available in local database. Let's hope we have a gpx file in the app resource folder")
routeInfo = gpxFileDatasource.parseGpxFile()
}
}
return routeInfo
}
}
Thanks to Kotlin coroutines we can write these asynchronous operations sequentially.
For full content, you can visit HUAWEI Developer Forum.​

What Does Huawei Analytics Offer?

1) What is analytics? Why is it used?
How is your app used? Which pages get more traffic? On which page do users leave the app?
If you want to understand this and direct your future development accordingly, analytics is just for you. Understanding users' habits will help you make your app better.
If your app is at a certain level, or you want to move your app and business forward, you need to use analytics.
To date, there are many services that offer analytics solutions. Huawei has made analytics a priority in HMS.
So why would I prefer Huawei Analytics?
Easy to integrate and use
Huawei Analytics is very easy to integrate, and the analytics dashboard is easy to use once the kit is integrated.
Moreover, you can customize the tables on the dashboard as you wish, so you can easily see the data you want to see rather than what is imposed on you.
Reach Huawei users
As you know, Google services are not available on the latest Huawei devices. When you integrate Huawei Analytics, you will reach all Huawei users as well as users of all other devices. So Huawei Analytics is a connector, not a divider.
Power of HMS Core
Analytics gets its power from HMS Core.
It is very easy to find all the documents you need for integration or for using the dashboard. When there is a technical problem, you can find technical support very easily.
Powerful Analysis Capabilities
2) How to Integrate Huawei Analytics?
To integrate the Huawei Analytics Kit into our application, first of all, it is necessary to register in the Huawei developer community.
After adding your app to AppGallery Connect, we need to enable the Analytics service.
After completing the enabling process, go to the Manage APIs tab in the project settings, activate the Analytics service, add the JSON file from the project settings page to your project, and add the HMS SDK dependencies to your project. (Note: The order of these steps is important. You should update your JSON file in case of any service change.)
Add the build dependency to your dependencies block.
Code:
implementation 'com.huawei.hms:hianalytics:4.0.0.303'
Then all you have to do is go to the class in which you want to collect data and start collecting it.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
//Enable SDK log recording
HiAnalyticsTools.enableLog();
HiAnalyticsInstance instance = HiAnalytics.getInstance(this);
//Or, use context initialization
//Context context = this.getApplicationContext();
//HiAnalyticsInstance instance = HiAnalytics.getInstance(context);
//instance.setUserProfile("userKey","value");
}
You can customize the data to be collected.
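For example, you can report a custom event together with its parameters by calling onEvent() on the HiAnalyticsInstance object created above. The event name and parameters below are purely illustrative; define the ones that match your own analysis needs.
Code:
// Report a custom event (the event name and parameters here are examples only).
Bundle bundle = new Bundle();
bundle.putString("product_id", "item_001");
bundle.putInt("quantity", 2);
instance.onEvent("purchase_button_click", bundle);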
As you can see, we can write just a few lines of code to connect our project with Huawei Analytics. So where and how do we evaluate the data we collect? Let's take a look.
3) Viewing Analytics Data
The welcome page has four sections: Overview, Distribution Analysis, Operation Analysis, and Advanced Analysis. By default, the Overview page of an app displays its KPIs in all channels (AppGallery and AppTouch), including app impressions, details page views, new downloads, and in-app purchase amount.
The Distribution Analysis page displays the app usage and sales data. Use Downloads & installs to measure the installation and usage of your app. Use In-App Purchases and Paid application details to measure the sales of your app.
With Operation Analysis, you can view the financial reports of your application and set up your future strategies based on this.
Advanced Analysis is a platform where you can create personalized tables, filter your users and view them according to categories, and instantly track active users and all other data.
Event analysis gives insight into user behavior details. Activity analysis gives insight into user loyalty. Funnel analysis gives insight into the phase in which users are lost.
Huawei Analytics gives us countless opportunities. As you can see, your only limit is your imagination.
Thanks for reading!
Related Links
Thanks to Mehmet Batuhan Ulper

Integrate Nearby Service for an Enhanced Gaming Experience

HUAWEI Nearby Service is an important feature of HMS Core, designed to help users quickly find players in the vicinity, and automatically establish a low-latency, high-reliability, and zero-traffic data transmission channel, backed by underlying technologies such as Wi-Fi and Bluetooth. Integrating Nearby Service equips your game to provide users with a premium experience across the board.
1. Premium Attributes
1.1 One-Click Cross-Device Connections for Local Multiplayer Games
Current LAN games require that users are connected to the same router, in order for a connection to be established. If there is no router available, users have to manually enable a hotspot, complicating the process. Nearby Service resolves this persistent issue.
1.2 Face-to-Face Grouping and Friend Adding
Nearby Service provides you with functions for creating face-to-face groups and adding friends, without having to rely on social software or GPS, facilitating free and easy user interactions.
1.3 Face-to-Face Prop Sharing
Nearby Service allows users to quickly share game props with friends, helping you acquire new users and improve user stickiness.
2. Introduction to Plugins
Two encapsulated plugins are provided here. You can directly use the two plugins in the app, and view the source code of the plugins, for a thorough understanding of how to integrate the Nearby Service.
2.1 Preparations
A Unity development environment.
Download the plugins on GitHub.
2.2 Plugin Import
Choose Assets > Import Package > Custom Package, and click Nearby Player or Discovery Plugin on the toolbar of Unity.
Wait for the package to be processed. After that, the resource list of the plugins will be displayed. Then click Import.
2.3 Key Code
2.3.1 Nearby Player Plugin
This plugin is applicable to such scenarios as creating face-to-face groups, adding friends, and sharing props. Declare the NearbyManager class in the plugin. The class provides APIs startDiscovery() and SendMessage() for users to discover nearby players and send messages.
Call startDiscovery() to discover nearby players and make the current user discoverable by nearby players when the game starts.
The code of the called API is as follows:
Code:
void Start() {
AndroidMyCallback cb = new AndroidMyCallback(this);
nm = new NearbyManager(cb);
nm.startDiscovery(randomName());
}
The callback function AndroidMyCallback is used to perform operations after a successful discovery.
Code:
// Perform the subsequent operations when a player is discovered. In this demo, the player is being added to the player list.
public override void onFoundPlayer(string endpointName, string endpointId) {
mListController.AddPlayerToList(endpointName, endpointId);
}
// Perform the subsequent operations when a player is lost. In this demo, the player is being removed from the player list.
public override void onLostPlayer(string endpointId) {
mListController.RemovePlayerFromList(endpointId);
}
// Perform the subsequent operations when a player's message is received. In this demo, only the content of the message is displayed.
public override void onReceiveMsg(string endpointName, string Msg) {
mListController.ReceiveMsg(endpointName, Msg);
}
After discovering nearby players, users can send messages to the players for grouping, adding friends, or sharing props.
Code:
// In this demo, click a player's profile picture in the player list to send a grouping invitation.
private void OnClick(string endpointId) {
nm.log("OnClick. SendMessage to " + endpointId);
nm.SendMessage(endpointId, "invites you to join a game.");
}
2.3.2 Nearby Discovery Plugin
This is a plugin developed based on Unity UNET. Users are able to connect their devices to each other, even if they are in different Wi-Fi environments. Declare the NearbyManager class, which offers two APIs: startBroadcast() and startDiscovery(). Two game devices can be connected by calling these APIs.
The code of the called APIs is as follows:
Code:
private void OnClick() {
Button btn = this.GetComponent<Button>();
btn.enabled = false;
AndroidMyCallback androidMyCallback = new AndroidMyCallback(mNetworkManager);
NearbyManager nearbyManager = new NearbyManager(androidMyCallback);
nearbyManager.startBroadcast();
}
The callback function AndroidMyCallback is used to perform operations after a successful connection. Here, the networkManager API of UNET is called to start the game after a successful connection.
Code:
public class AndroidMyCallback : AndroidCallback {
private NetworkManager mNetworkManager;
public AndroidMyCallback(NetworkManager networkManager) : base() {
mNetworkManager = networkManager;
}
public override void onClientReady(string ipaddr) {
mNetworkManager.networkAddress = ipaddr;
mNetworkManager.StartClient();
}
public override void onServerReady(string ipaddr) {
mNetworkManager.StartHost();
}
}
2.4 Demo
Below we have provided two demos that have integrated the plugins detailed above, to provide you with a better understanding of the process.
Nearby-Player-Demo
UNET-NEARBY-DEMO
3. Game Apps That Have Integrated Nearby Service
Tic Tac Toe
Tic Tac Toe is a local battle game that was developed based on native Android APIs from Nearby Service, and released on the HUAWEI AppGallery platform.
NearbyGameSnake
NearbyGameSnake is a local multiplayer game that integrates Nearby Service. It is easy to play, and enables users to play directly in groups, without requiring an Internet connection.
4. Learn More
For more details, please visit HUAWEI Developers.
For more instructions, please visit Development Guide.
You can join the HMS Core developer discussion by going to Reddit.
You can download the demo and sample code from GitHub.
To solve integration problems, please go to Stack Overflow.

Implementing Real-Time Transcription in an Easy Way

Background​
The real-time onscreen subtitle is a must-have function in an ordinary video app. However, developing such a function can prove costly for small- and medium-sized developers. And even when implemented, speech recognition is often prone to inaccuracy. Fortunately, there's a better way: HUAWEI ML Kit, which is remarkably easy to integrate, and makes real-time transcription an absolute breeze!
Introduction to ML Kit​
ML Kit allows your app to leverage Huawei's longstanding machine learning prowess to apply cutting-edge artificial intelligence (AI) across a wide range of contexts. With Huawei's expertise built in, ML Kit is able to provide a broad array of easy-to-use machine learning capabilities, which serve as the building blocks for tomorrow's cutting-edge AI apps. ML Kit capabilities include those related to:
Text (including text recognition, document recognition, and ID card recognition)
Language/Voice (such as real-time/on-device translation, automatic speech recognition, and real-time transcription)
Image (such as image classification, object detection and tracking, and landmark recognition)
Face/Body (such as face detection, skeleton detection, liveness detection, and face verification)
Natural language processing (text embedding)
Custom model (including the on-device inference framework and model development tool)
Real-time transcription is required to implement the function mentioned above. Let's take a look at how this works in practice:
Now let's move on to how to integrate this service.
Integrating Real-Time Transcription​
Steps
1. Registering as a Huawei developer on HUAWEI Developers
2. Creating an app
Create an app in AppGallery Connect. For details, see Getting Started with Android.
We have provided some screenshots for your reference:
3. Enabling ML Kit
4. Integrating the HMS Core SDK
Add the AppGallery Connect configuration file by completing the steps below:
Download and copy the agconnect-services.json file to the app directory of your Android Studio project.
Call setApiKey during app initialization.
To learn more, go to Adding the AppGallery Connect Configuration File.
5. Configuring the Maven repository address
Add build dependencies.
Import the real-time transcription SDK.
Code:
implementation 'com.huawei.hms:ml-computer-voice-realtimetranscription:2.2.0.300'
Add the AppGallery Connect plugin configuration.
Method 1: Add the following information under the declaration in the file header:
Code:
apply plugin: 'com.huawei.agconnect'
Method 2: Add the plugin configuration in the plugins block.
Code:
plugins {
id 'com.android.application'
// Add the following configuration:
id 'com.huawei.agconnect'
}
Please refer to Integrating the Real-Time Transcription SDK to learn more.
Setting the cloud authentication information
When using on-cloud services of ML Kit, you can set the API key or access token (recommended) in either of the following ways:
Access token
You can use the following API to initialize the access token when the app is started. The access token does not need to be set again once initialized.
Code:
MLApplication.getInstance().setAccessToken("your access token");
API key
You can use the following API to initialize the API key when the app is started. The API key does not need to be set again once initialized.
Code:
MLApplication.getInstance().setApiKey("your ApiKey");
For details, see Notes on Using Cloud Authentication Information.
Code Development​
Create and configure a speech recognizer.
Code:
MLSpeechRealTimeTranscriptionConfig config = new MLSpeechRealTimeTranscriptionConfig.Factory()
// Set the language. Currently, this service supports Mandarin Chinese, English, and French.
.setLanguage(MLSpeechRealTimeTranscriptionConstants.LAN_ZH_CN)
// Punctuate the text recognized from the speech.
.enablePunctuation(true)
// Set the sentence offset.
.enableSentenceTimeOffset(true)
// Set the word offset.
.enableWordTimeOffset(true)
// Set the application scenario. MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING indicates shopping, which is supported only for Chinese. Under this scenario, recognition for the name of Huawei products has been optimized.
.setScenes(MLSpeechRealTimeTranscriptionConstants.SCENES_SHOPPING)
.create();
MLSpeechRealTimeTranscription mSpeechRecognizer = MLSpeechRealTimeTranscription.getInstance();
Create a speech recognition result listener callback.
Code:
// Use the callback to implement the MLSpeechRealTimeTranscriptionListener API and methods in the API.
protected class SpeechRecognitionListener implements MLSpeechRealTimeTranscriptionListener{
@Override
public void onStartListening() {
// The recorder starts to receive speech.
}
@Override
public void onStartingOfSpeech() {
// The user starts to speak, that is, the speech recognizer detects that the user starts to speak.
}
@Override
public void onVoiceDataReceived(byte[] data, float energy, Bundle bundle) {
// Return the original PCM stream and audio power to the user. This API is not running in the main thread, and the return result is processed in a sub-thread.
}
@Override
public void onRecognizingResults(Bundle partialResults) {
// Receive the recognized text from MLSpeechRealTimeTranscription.
}
@Override
public void onError(int error, String errorMessage) {
// Called when an error occurs in recognition.
}
@Override
public void onState(int state,Bundle params) {
// Notify the app of the status change.
}
}
The recognition result can be obtained from the listener callbacks, including onRecognizingResults. Design the UI content according to the obtained results. For example, display the text transcribed from the input speech.
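As a minimal sketch, reading the transcribed text inside onRecognizingResults() might look like the following. The bundle key MLSpeechRealTimeTranscriptionConstants.RESULTS_RECOGNIZING and the textView field are assumptions here, so check them against the current API reference and your own layout.
Code:
@Override
public void onRecognizingResults(Bundle partialResults) {
// Read the latest transcription text (key name assumed; verify in the API reference).
String text = partialResults.getString(MLSpeechRealTimeTranscriptionConstants.RESULTS_RECOGNIZING);
if (text != null) {
// Post the text to the UI thread and show it, for example, in a TextView.
runOnUiThread(() -> textView.setText(text));
}
}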
Bind the speech recognizer.
Code:
mSpeechRecognizer.setRealTimeTranscriptionListener(new SpeechRecognitionListener());
Call startRecognizing to start speech recognition.
Code:
mSpeechRecognizer.startRecognizing(config);
Release resources after recognition is complete.
Code:
if (mSpeechRecognizer != null) {
mSpeechRecognizer.destroy();
}
(Optional) Obtain the list of supported languages.
Code:
MLSpeechRealTimeTranscription.getInstance()
.getLanguages(new MLSpeechRealTimeTranscription.LanguageCallback() {
@Override
public void onResult(List<String> result) {
Log.i(TAG, "support languages==" + result.toString());
}
@Override
public void onError(int errorCode, String errorMsg) {
Log.e(TAG, "errorCode:" + errorCode + "errorMsg:" + errorMsg);
}
});
We have finished integration here, so let's test it out on a simple screen.
Tap START RECORDING. The text recognized from the input speech will display in the lower portion of the screen.
We've now built a simple audio transcription function.
Eager to build a fancier UI, with stunning animations, and other effects? By all means, take your shot!
For reference:​
Real-Time Transcription
Sample Code for ML Kit
To learn more, please visit:
HUAWEI Developers official website
Development Guide
Reddit to join developer discussions
GitHub or Gitee to download the demo and sample code
Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.
Original Source

Obtain Nearest Address to a Longitude-Latitude Point

In the mobile Internet era, people are increasingly using mobile apps for a variety of different purposes, such as buying products online, hailing taxis, and much more. When using such an app, a user usually needs to manually enter their address for package delivery or search for an appropriate pick-up and drop-off location when they hail a taxi, which can be inconvenient.
To improve user experience, many apps nowadays allow users to select a point on the map and then use the selected point as the location, for example, for package delivery or getting on or off a taxi. Each location has a longitude-latitude coordinate that pinpoints its position precisely on the map. However, longitude-latitude coordinates are simply a string of numbers and provide little information to the average user. It would therefore be useful if there was a tool which an app can use to convert longitude-latitude coordinates into human-readable addresses.
Fortunately, the reverse geocoding function in HMS Core Location Kit can obtain the nearest address to a selected point on the map based on the longitude and latitude of the point. Reverse geocoding is the process of converting a location as described by geographic coordinates (longitude and latitude) to a human-readable address or place name, which is much more useful information for users. It permits the identification of nearby street addresses, places, and subdivisions such as neighborhoods, counties, states, and countries.
Generally, the reverse geocoding function can be used to obtain the nearest address to the current location of a device, show the address or place name when a user taps on the map, find the address of a geographic location, and more. For example, with reverse geocoding, an e-commerce app can show users the detailed address of a selected point on the map in the app; a ride-hailing or takeout delivery app can show the detailed address of a point that a user selects by dragging the map in the app or tapping the point on the map in the app, so that the user can select the address as the pick-up address or takeout delivery address; and an express delivery app can utilize reverse geocoding to show the locations of delivery vehicles based on the passed longitude-latitude coordinates, and intuitively display delivery points and delivery routes to users.
Bolstered by a powerful address parsing capability, the reverse geocoding function in this kit can display addresses of locations in accordance with local address formats with an accuracy as high as 90%. In addition, it supports 79 languages and boasts a parsing latency as low as 200 milliseconds.
Demo​
The file below is a demo of the reverse geocoding function in this kit.
Preparations​
Before getting started with the development, you will need to make the following preparations:
Register as a Huawei developer and complete identity verification on the HUAWEI Developers website. You can click here to find out the detailed registration and identity verification procedure.
Create a project and then create an app in the project in AppGallery Connect. Before doing so, you must have a Huawei developer account and complete identity verification.
Generate a signing certificate fingerprint and configure it in AppGallery Connect. The signing certificate fingerprint is used to verify the authenticity of an app. Before releasing an app, you must generate a signing certificate fingerprint locally based on the signing certificate and configure it in AppGallery Connect.
Integrate the Location SDK into your app. If you are using Android Studio, you can integrate the SDK via the Maven repository.
Here, I won't be describing how to generate and configure a signing certificate fingerprint and integrate the SDK. You can click here to learn about the detailed procedure.
Development Procedure​
After making relevant preparations, you can perform the steps below to use the reverse geocoding service in your app. Before using the service, ensure that you have installed HMS Core (APK) on your device.
1. Create a geocoding service client.
In order to call geocoding APIs, you first need to create a GeocoderService instance in the onClick() method of GeocoderActivity in your project. The sample code is as follows:
Code:
Locale locale = new Locale("zh", "CN");
GeocoderService geocoderService = LocationServices.getGeocoderService(GeocoderActivity.this, locale);
2. Obtain the reverse geocoding information.
To empower your app to obtain the reverse geocoding information, you need to call the getFromLocation() method of the GeocoderService object in your app. This method will return a List<HWLocation> object containing the location information based on the set GetFromLocationRequest object.
a. Set reverse geocoding request parameters.
There are three request parameters in the GetFromLocationRequest object, which indicate the latitude, longitude, and maximum number of returned results respectively. The sample code is as follows:
Code:
// Parameter 1: latitude
// Parameter 2: longitude
// Parameter 3: maximum number of returned results
// Pass valid longitude-latitude coordinates. If the coordinates are invalid, no geographical information will be returned. Outside China, pass longitude-latitude coordinates located outside China and ensure that the coordinates are correct.
GetFromLocationRequest getFromLocationRequest = new GetFromLocationRequest(39.985071, 116.501717, 5);
b. Call the getFromLocation() method to obtain reverse geocoding information.
The obtained reverse geocoding information will be returned in a List<HWLocation> object. You can add listeners using the addOnSuccessListener() and addOnFailureListener() methods, and obtain the task execution result using the onSuccess() and onFailure() methods.
The sample code is as follows:
Code:
private void getReverseGeocoding() {
// Initialize the GeocoderService object.
if (geocoderService == null) {
geocoderService = LocationServices.getGeocoderService(this, new Locale("zh", "CN"));
}
geocoderService.getFromLocation(getFromLocationRequest)
.addOnSuccessListener(new OnSuccessListener<List<HWLocation>>() {
@Override
public void onSuccess(List<HWLocation> hwLocation) {
// TODO: Define callback for API call success.
if (null != hwLocation && hwLocation.size() > 0) {
Log.d(TAG, "hwLocation data set quantity: " + hwLocation.size());
Log.d(TAG, "CountryName: " + hwLocation.get(0).getCountryName());
Log.d(TAG, "City: " + hwLocation.get(0).getCity());
Log.d(TAG, "Street: " + hwLocation.get(0).getStreet());
}
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
// TODO: Define callback for API call failure.
}
});
}
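If your app needs the finer-grained address components mentioned earlier, such as the state, county, or postal code, they can be read from the same HWLocation objects inside onSuccess(). The accessor names below follow the HWLocation data model but are assumptions, so verify them against the current API reference.
Code:
// Read additional address components from the first result (accessor names assumed).
HWLocation first = hwLocation.get(0);
Log.d(TAG, "State: " + first.getState());
Log.d(TAG, "County: " + first.getCounty());
Log.d(TAG, "Postal code: " + first.getPostalCode());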
Congratulations, your app is now able to use the reverse geocoding function to obtain the address of a location based on its longitude and latitude.
Conclusion​
The quick development and popularization of the mobile Internet have brought many changes to our daily lives. One such change is that more and more people are using mobile apps on a daily basis, for example, to buy daily necessities or hail a taxi. These tasks traditionally require users to manually enter the delivery address or pick-up and drop-off location addresses. Manually entering such addresses is inconvenient and prone to mistakes.
To solve this issue, many apps allow users to select a point on the in-app map as the delivery address or the address for getting on or off a taxi. However, the point on the map is usually expressed as a set of longitude-latitude coordinates, which most users will find hard to understand.
As described in this post, my app resolves this issue using the reverse geocoding function, which has proven to be a very effective way of obtaining human-readable addresses from longitude-latitude coordinates. If you are looking for a solution to such issues, give it a try and find out whether this is what your app needs.
