More information like this, you can visit HUAWEI Developer Forum
In our previous article, we discussed how to integrate Banner Ads and Splash Ads.
We also discussed how to integrate the Ads Kit SDK into a Flutter project. If you missed it, I recommend giving it a read. Here is the link.
In this article, we will discuss how to integrate Interstitial Ads, Rewarded Ads, and Native Ads.
Plugin for Ads Kit: https://developer.huawei.com/consumer/en/doc/development/HMS-Plugin-Library/flutter-sdk-download-0000001050196675
Application: In case you want to try the demo yourself, please download the Calculate BMI app from AppGallery. This application helps you calculate your Body Mass Index and shows you Banner and Interstitial Ads at the same time.
Below is the demo
Interstitial Ads:
These are full-screen ads, and the control to close the ad is given to the user.
The ideal places to show these ads are transitions between activities, or after a level-up in a game.
These ads can also be configured to redirect to a particular URL.
Let us see how to add these ads in Flutter.
Code:
import 'package:flutter/material.dart';
import 'package:huawei_ads/adslite/interstitial/interstitial_ad.dart';
class InterstitialAds extends StatelessWidget {
@override
Widget build(BuildContext context) {
return Align(
alignment: Alignment.topCenter,
child: RaisedButton(
child: Text(
'Load Interstitial Ads',
),
onPressed: () {
InterstitialAd _interstitialAd = createInterstitialAd();
_interstitialAd
..loadAd()
..show();
},
),
);
}
}
InterstitialAd createInterstitialAd() {
return InterstitialAd(
adUnitId: "teste9ih9j0rc3",
);
}
Below is the result you will get after running it.
Rewarded Ads:
These are also full-screen ads, but they provide some kind of reward to the user for watching the full ad.
Example – a user can get an extra life in a game by watching a full video.
These ads are explicitly chosen by the user, who also has full control to close them. The user can close these ads at any time, but in that case no reward is granted.
These ads are always singleton ads, hence we need to check whether the ad is loaded before showing it.
Code:
import 'package:flutter/material.dart';
import 'package:huawei_ads/hms_ads_lib.dart';
import 'package:huawei_ads/adslite/ad_param.dart';
import 'package:huawei_ads/adslite/reward/reward_verify_config.dart';
import 'package:flutter/cupertino.dart';
import 'package:huawei_ads/hms_ads.dart';
typedef void RewardAdListener(RewardAdEvent event,
{Reward reward, int errorCode});
class RewardedAds extends StatelessWidget {
@override
Widget build(BuildContext context) {
return Align(
alignment: Alignment.topCenter,
child: RaisedButton(
child: Text(
'Load Rewarded Ads',
),
onPressed: () {
RewardAd.instance
.loadAd(adUnitId: "testx9dtjwj8hp", adParam: AdParam.build());
RewardAd.instance.isLoaded().then((isLoaded) {
if (isLoaded) {
RewardAd.instance.show();
}
});
RewardAd.instance.setRewardAdListener =
(RewardAdEvent event, {Reward reward, int errorCode}) {
print("RewardedVideoAd event $event");
if (event == RewardAdEvent.rewarded) {
print('Received reward : ');
}
};
},
),
);
}
}
As you can see in the code above, we check whether the rewarded ad is loaded by calling the isLoaded() method.
If it is loaded, we show it. We have also attached a listener so we can react to user events.
Below are example of some events:
Opened - RewardAdEvent.opened
Started - RewardAdEvent.started
Completed - RewardAdEvent.completed
Closed - RewardAdEvent.closed
While adding rewarded ads, make sure the validation is done on the client side so there is no delay in providing the reward to the user.
Below is the result you will get after running it.
Native Ads:
These ads fit into the surroundings in which they are placed.
A native ad acts like a widget that you can add as a child; loading and displaying the ad is handled by the plugin itself. Both images and videos are supported. We also need to use a controller in order to set the properties of native ads.
We can also add a listener in order to observe the user's interaction with the ad.
Below is the code to add a native ad.
Code:
import 'package:flutter/material.dart';
import 'package:huawei_ads/adslite/nativead/native_ad.dart';
import 'package:huawei_ads/adslite/nativead/native_ad_configuration.dart';
import 'package:huawei_ads/adslite/nativead/native_ad_controller.dart';
import 'package:huawei_ads/adslite/nativead/native_styles.dart';
import 'package:huawei_ads/utils/constants.dart';
class NativeAds extends StatelessWidget {
@override
Widget build(BuildContext context) {
NativeStyles stylesSmall = NativeStyles();
stylesSmall.setCallToAction(fontSize: 8);
stylesSmall.setFlag(fontSize: 10);
stylesSmall.setSource(fontSize: 11);
NativeAdConfiguration configuration = NativeAdConfiguration.build();
configuration.setChoicesPosition = NativeAdChoicesPosition.bottomRight;
return Align(
alignment: Alignment.topCenter,
child: Column(
children: <Widget>[
Container(
height: 100,
margin: EdgeInsets.only(bottom: 20.0),
child: NativeAd(
adUnitId: "testu7m3hc4gvm",
controller: NativeAdController(
adConfiguration: configuration,
listener: (AdEvent event, {int errorCode}) {
print("NativeAd event $event");
}),
type: NativeAdType.small,
styles: stylesSmall,
),
),
],
),
);
}
}
Below is the result you will get after running it.
Github Link:
Ads Kit: https://github.com/DTSE-India-Community/Flutter/tree/master/add_kit_flutter
BMI Calculator: https://github.com/DTSE-India-Community/Flutter/tree/master/calculate_bmi
Conclusion:
I hope this article makes it easier for you to integrate Ads Kit into a Flutter project. In case you want to see any other kit integration or sample code, make sure to give feedback.
Banner and Splash Ads: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201286813113710087&fid=0101187876626530001
Below is the official documentation: https://developer.huawei.com/consumer/en/doc/development/HMS-Plugin-Guides/publisher-service-0000001050196431
Allow your users the freedom to choose their Android platform providing the same feature
Some time ago I developed a Word Search game solver Android application using the services from Firebase ML Kit.
Solve WordSearch games with Android and ML Kit
A Kotlin ML Kit Data Structure & Algorithm Story
It was an interesting trip discovering the features of a framework that allows the developer to use AI capabilities without knowing all the rocket science behind them.
Specifically, I used the document recognition feature to extract text from a word search game image.
After the text recognition phase, the output was cleaned and arranged into a matrix to be processed by the solver algorithm. The algorithm looked for all the words formed by grouping letters according to the rules of the game: contiguous letters in all the straight directions (vertical, horizontal, diagonal).
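The search rule above — contiguous letters along any of the eight straight directions — can be sketched as a standalone routine. This is a minimal illustration under assumed names (WordSearch, contains), not the app's actual solver code:

```java
public class WordSearch {
    // The eight straight directions: vertical, horizontal, and both diagonals.
    private static final int[][] DIRECTIONS = {
            {-1, 0}, {1, 0}, {0, -1}, {0, 1},
            {-1, -1}, {-1, 1}, {1, -1}, {1, 1}
    };

    /** Returns true if the word occurs in the grid along any straight direction. */
    public static boolean contains(char[][] grid, String word) {
        for (int r = 0; r < grid.length; r++) {
            for (int c = 0; c < grid[r].length; c++) {
                for (int[] d : DIRECTIONS) {
                    if (matchesFrom(grid, word, r, c, d[0], d[1])) {
                        return true;
                    }
                }
            }
        }
        return false;
    }

    // Walks from (r, c) stepping by (dr, dc), comparing letters one by one.
    private static boolean matchesFrom(char[][] grid, String word, int r, int c, int dr, int dc) {
        for (int i = 0; i < word.length(); i++) {
            int rr = r + i * dr;
            int cc = c + i * dc;
            if (rr < 0 || rr >= grid.length || cc < 0 || cc >= grid[rr].length
                    || grid[rr][cc] != word.charAt(i)) {
                return false;
            }
        }
        return true;
    }
}
```

A full solver would run a check like this (or collect all matches) for every dictionary word against the matrix built from the recognized text.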
This app ran well on all Android devices capable of running the Google Firebase SDK and Google Mobile Services (GMS).
Since the second half of last year, all new Huawei devices can no longer run GMS due to government restrictions; you can read more about this here:
[Update 14: Temporary License Extended Again]
Google has revoked Huawei's Android license
www.xda-developers.com
My app could not run on the brand-new Huawei devices.
So I looked for solutions to make this case-study app run on the new Huawei terminals.
Let’s follow my journey…
The Discovery of HMS ML Kit
I went through the Huawei documentation on the HUAWEI Developer site.
There you can find many SDKs, AKA Kits, offering a set of smart features to developers.
I found one offering the features I was looking for: HMS ML Kit. It is quite similar to the one from Firebase, as it allows the developer to use machine learning capabilities like image, text, and face recognition.
Huawei ML Kit
In particular, for my specific use case, I used the text analyzer, which is capable of running locally and taking advantage of neural processing on NPU hardware.
Documentation HMS ML Kit Text recognition
Integrating HMS ML Kit was super easy. If you want to give it a try, it’s just a matter of adding a dependency to your build.gradle file, enabling the service from the AppGallery web dashboard if you want to use the Cloud API, downloading the agconnect-services.json configuration file, and using it in your app.
You can refer to the official guide here for the needed steps:
Documentation HMS ML Kit
Architectural Approach
My first desire was to maintain and deploy only one APK, so I wanted to integrate both the Firebase ML Kit SDK and the HMS ML Kit one.
I thought about the main feature:
decode the image and get back the detected text together with the bounding boxes surrounding each character, to better display the spotted text to the user.
This was defined by this interface:
Code:
package com.laquysoft.wordsearchai.textrecognizer
import android.graphics.Bitmap
interface DocumentTextRecognizer {
fun processImage(bitmap: Bitmap, success: (Document) -> Unit, error: (String?) -> Unit)
}
I’ve also defined my own data classes to have a common output format from both services
Code:
data class Symbol(
val text: String?,
val rect: Rect,
val idx: Int = 0,
val length: Int = 0
)
data class Document(val stringValue: String, val count: Int, val symbols: List<Symbol>)
Here, Document represents the text result returned by the ML Kit services. It contains a list of Symbol objects (the recognized characters), each with its own character, the bounding box surrounding it (Rect), and its index in the detected string, since both ML Kit services group some characters into a string with a single bounding box.
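The per-character expansion can be illustrated in isolation. This is a hypothetical sketch with simplified stand-ins for the Rect and Symbol types (names and fields are illustrative, not the app's Kotlin code): a word recognized with a single bounding box becomes one Symbol per character, all sharing that box while remembering their index and the word length.

```java
import java.util.ArrayList;
import java.util.List;

public class SymbolExpander {
    // Simplified stand-ins for the article's data classes (illustrative only).
    public static class Rect {
        public final int left, top, right, bottom;
        public Rect(int left, int top, int right, int bottom) {
            this.left = left; this.top = top; this.right = right; this.bottom = bottom;
        }
    }

    public static class Symbol {
        public final String text;
        public final Rect rect;
        public final int idx;
        public final int length;
        public Symbol(String text, Rect rect, int idx, int length) {
            this.text = text; this.rect = rect; this.idx = idx; this.length = length;
        }
    }

    /** Expands one recognized word sharing a single bounding box into per-character symbols. */
    public static List<Symbol> expand(String word, Rect box) {
        List<Symbol> symbols = new ArrayList<>();
        for (int i = 0; i < word.length(); i++) {
            symbols.add(new Symbol(String.valueOf(word.charAt(i)), box, i, word.length()));
        }
        return symbols;
    }
}
```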
Then I’ve created an object capable of instantiating the right service depending on which service (HMS or GMS) is running on the device:
Code:
object DocumentTextRecognizerService {
private fun getServiceType(context: Context) = when {
isGooglePlayServicesAvailable(
context
) -> ServiceType.GOOGLE
isHuaweiMobileServicesAvailable(
context
) -> ServiceType.HUAWEI
else -> ServiceType.GOOGLE
}
private fun isGooglePlayServicesAvailable(context: Context): Boolean {
return GoogleApiAvailability.getInstance()
.isGooglePlayServicesAvailable(context) == ConnectionResult.SUCCESS
}
private fun isHuaweiMobileServicesAvailable(context: Context): Boolean {
return HuaweiApiAvailability.getInstance()
.isHuaweiMobileServicesAvailable(context) == com.huawei.hms.api.ConnectionResult.SUCCESS
}
fun create(context: Context): DocumentTextRecognizer {
val type =
getServiceType(
context
)
if (type == ServiceType.HUAWEI)
return HMSDocumentTextRecognizer()
return GMSDocumentTextRecognizer()
}
}
This was pretty much all it took to make it work.
The ViewModel can use the service provided
Code:
class WordSearchAiViewModel(
private val resourceProvider: ResourceProvider,
private val recognizer: DocumentTextRecognizer
) : ViewModel() {
val resultList: MutableLiveData<List<String>> = MutableLiveData()
val resultBoundingBoxes: MutableLiveData<List<Symbol>> = MutableLiveData()
private lateinit var dictionary: List<String>
fun detectDocumentTextIn(bitmap: Bitmap) {
loadDictionary()
recognizer.processImage(bitmap, {
postWordsFound(it)
postBoundingBoxes(it)
},
{
Log.e("WordSearchAIViewModel", it)
})
}
by the right recognizer instantiated when the WordSearchAiViewModel is instantiated as well.
Running the app and choosing a word search game image on a Mate 30 Pro (an HMS device) shows this result
The Recognizer Brothers
You can check the code of the two recognizers below. What they do is use the specific SDK implementation to get the result and adapt it to the interface; you can virtually use any other service capable of doing the same.
Code:
package com.laquysoft.wordsearchai.textrecognizer
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage
class GMSDocumentTextRecognizer : DocumentTextRecognizer {
private val detector = FirebaseVision.getInstance().onDeviceTextRecognizer
override fun processImage(
bitmap: Bitmap,
success: (Document) -> Unit,
error: (String?) -> Unit
) {
val firebaseImage = FirebaseVisionImage.fromBitmap(bitmap)
detector.processImage(firebaseImage)
.addOnSuccessListener { firebaseVisionDocumentText ->
if (firebaseVisionDocumentText != null) {
val words = firebaseVisionDocumentText.textBlocks
.flatMap { it -> it.lines }
.flatMap { it.elements }
val symbols: MutableList<Symbol> = emptyList<Symbol>().toMutableList()
words.forEach {
val rect = it.boundingBox
if (rect != null) {
it.text.forEachIndexed { idx, value ->
symbols.add(
Symbol(
value.toString(),
rect,
idx,
it.text.length
)
)
}
}
}
val document =
Document(
firebaseVisionDocumentText.text,
firebaseVisionDocumentText.textBlocks.size,
symbols
)
success(document)
}
}
.addOnFailureListener { error(it.localizedMessage) }
}
}
Code:
package com.laquysoft.wordsearchai.textrecognizer
import android.graphics.Bitmap
import com.huawei.hms.mlsdk.MLAnalyzerFactory
import com.huawei.hms.mlsdk.common.MLFrame
class HMSDocumentTextRecognizer : DocumentTextRecognizer {
//private val detector = MLAnalyzerFactory.getInstance().remoteDocumentAnalyzer
private val detector = MLAnalyzerFactory.getInstance().localTextAnalyzer
override fun processImage(
bitmap: Bitmap,
success: (Document) -> Unit,
error: (String?) -> Unit
) {
val hmsFrame = MLFrame.fromBitmap(bitmap)
detector.asyncAnalyseFrame(hmsFrame)
.addOnSuccessListener { mlDocument ->
if (mlDocument != null) {
val words = mlDocument.blocks
.flatMap { it.contents }
.flatMap { it.contents }
val symbols: MutableList<Symbol> = emptyList<Symbol>().toMutableList()
words.forEach {
val rect = it.border
it.stringValue.forEachIndexed { idx, value ->
symbols.add(Symbol(
value.toString(),
rect,
idx,
it.stringValue.length
))
}
}
val document =
Document(
mlDocument.stringValue,
mlDocument.blocks.size,
symbols
)
success(document)
}
}
.addOnFailureListener { error(it.localizedMessage) }
}
}
Conclusion
As good Android developers, we should develop and deploy our apps on all the platforms our users can reach, love, and adopt, without excluding anyone.
We should spend some time trying to give users the same experience everywhere. This is a small sample of that, and others will come in the future.
Original link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201257812100840239&fid=0101187876626530001
This is application-level development, so we won’t go through the image segmentation algorithm itself. We use Huawei ML Kit, which provides the image segmentation capability, to develop this app. Developers will learn how to quickly develop an ID photo DIY applet using this SDK.
Background
I don’t know if you have had such an experience: all of a sudden, your school or company needs you to provide a one-inch or two-inch head photo for a passport or student card, with requirements on the background color. However, many people don’t have time to go to a photo studio, or the photos they took before have a background color that doesn’t meet the requirements. I had a similar experience. At that time, the school asked for a passport photo, and the school photo studio was closed. I took photos with my mobile phone in a hurry, using a bedspread as the background, and was scolded by the teacher as a result.
Many years later, ML Kit machine learning offers the image segmentation function. Using this SDK to develop a small ID photo DIY program could perfectly solve the embarrassment of that year.
Here is the demo for the result.
How effective is it? We just need to write a small program to quickly find out!
Core tip: this SDK is free, and all Android models are covered!
ID Photo Development Practice
1. Preparation
1.1 Add the Huawei Maven Repository in the Project-Level Gradle File
Open the Android studio project level build.gradle file.
Add the following Maven addresses:
Code:
buildscript {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
allprojects {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
1.2 Add SDK Dependency in Application Level build.gradle
Introduce the base SDK and the image segmentation body model package:
Code:
dependencies {
    implementation 'com.huawei.hms:ml-computer-vision:1.0.2.300'
    implementation 'com.huawei.hms:ml-computer-vision-image-segmentation-body-model:1.0.2.301'
}
1.3 Add the Model in the AndroidManifest.xml File
To enable the application to automatically update the machine learning model on the user’s device after the user installs your application from Huawei AppGallery, add the following statement to the AndroidManifest.xml file of the application:
Code:
<manifest ...>
    <application ...>
        <meta-data
            android:name="com.huawei.hms.ml.DEPENDENCY"
            android:value="imgseg" />
    </application>
</manifest>
1.4 Apply for Camera and Storage Permissions in the AndroidManifest.xml File
Code:
<!-- Storage permission -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
2. Two Key Steps of Code Development
2.1 Dynamic Permission Application
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    if (!allPermissionsGranted()) {
        getRuntimePermissions();
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
        @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    if (requestCode != PERMISSION_REQUESTS) {
        return;
    }
    boolean isNeedShowDiag = false;
    for (int i = 0; i < permissions.length; i++) {
        if (permissions[i].equals(Manifest.permission.READ_EXTERNAL_STORAGE)
                && grantResults[i] != PackageManager.PERMISSION_GRANTED) {
            isNeedShowDiag = true;
        }
    }
    if (isNeedShowDiag && !ActivityCompat.shouldShowRequestPermissionRationale(this,
            Manifest.permission.CALL_PHONE)) {
        AlertDialog dialog = new AlertDialog.Builder(this)
                .setMessage(getString(R.string.camera_permission_rationale))
                .setPositiveButton(getString(R.string.settings), new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        // Open the app's settings page based on the package name.
                        Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
                        intent.setData(Uri.parse("package:" + getPackageName()));
                        startActivityForResult(intent, 200);
                    }
                })
                .setNegativeButton(getString(R.string.cancel), new DialogInterface.OnClickListener() {
                    @Override
                    public void onClick(DialogInterface dialog, int which) {
                        finish();
                    }
                }).create();
        dialog.show();
    }
}
2.2 Creating an Image Segmentation Analyzer
The image segmentation analyzer can be created through the image segmentation setting configurator, MLImageSegmentationSetting:
Code:
MLImageSegmentationSetting setting = new MLImageSegmentationSetting.Factory()
.setAnalyzerType(MLImageSegmentationSetting.BODY_SEG)
.setExact(true)
.create();
this.analyzer = MLAnalyzerFactory.getInstance().getImageSegmentationAnalyzer(setting);
2.3 Create an MLFrame Object Through android.graphics.Bitmap for the Analyzer to Detect Pictures
Code:
MLFrame mlFrame = new MLFrame.Creator().setBitmap(this.originBitmap).create();
2.4 Call the asyncAnalyseFrame Method for Image Segmentation
Code:
// Create a task to process the result returned by the image segmentation analyzer asynchronously.
Task<MLImageSegmentation> task = this.analyzer.asyncAnalyseFrame(mlFrame);
task.addOnSuccessListener(new OnSuccessListener<MLImageSegmentation>() {
@Override public void onSuccess(MLImageSegmentation mlImageSegmentationResults) {
// Transacting logic for segment success.
if (mlImageSegmentationResults != null) {
StillCutPhotoActivity.this.foreground = mlImageSegmentationResults.getForeground();
StillCutPhotoActivity.this.preview.setImageBitmap(StillCutPhotoActivity.this.foreground);
StillCutPhotoActivity.this.processedImage = ((BitmapDrawable) ((ImageView) StillCutPhotoActivity.this.preview).getDrawable()).getBitmap();
StillCutPhotoActivity.this.changeBackground();
} else {
StillCutPhotoActivity.this.displayFailure();
}
}
}).addOnFailureListener(new OnFailureListener() {
@Override public void onFailure(Exception e) {
// Transacting logic for segment failure.
StillCutPhotoActivity.this.displayFailure();
return;
}
});
2.5 Change the Picture Background
Code:
this.backgroundBitmap = BitmapUtils.loadFromPath(StillCutPhotoActivity.this, id, targetedSize.first, targetedSize.second);
BitmapDrawable drawable = new BitmapDrawable(backgroundBitmap);
this.preview.setDrawingCacheEnabled(true);
this.preview.setBackground(drawable);
this.preview.setImageBitmap(this.foreground);
this.processedImage = Bitmap.createBitmap(this.preview.getDrawingCache());
this.preview.setDrawingCacheEnabled(false);
Conclusion
In this way, a small ID photo DIY program has been made. Let’s see the demo.
If you are good at hands-on work, you can also add or change outfits or perform other operations. The source code has been uploaded to GitHub, where you can also improve this function.
https://github.com/HMS-MLKit/HUAWEI-HMS-MLKit-Sample
The project directory on GitHub is id-photo-diy.
Based on the image segmentation capability, we can not only build an ID photo DIY program but also implement the following related functions:
Portraits in daily-life photos can be cut out, and interesting photos can be made by changing the background; or the background can be blurred to get more beautiful and artistic photos.
Identify the sky, plants, food, cats and dogs, flowers, water, sand, buildings, mountains, and other elements in an image, and apply special beautification to these elements, such as making the sky bluer and the water clearer.
Identify the objects in a video stream, edit special effects for the video stream, and change the background.
For other functions, please brainstorm together!
For a more detailed development guide, please refer to the official website of the Huawei Developer Alliance:
https://developer.huawei.com/consumer/en/doc/development/HMS-Guides/ml-introduction-4
Previous link:
NO. 1:One article to understand Huawei HMS ML Kit text recognition, bank card recognition, general card identification
NO.2: Integrating MLkit and publishing ur app on Huawei AppGallery
NO.3: Comparison Between Zxing and Huawei HMS Scan Kit
NO.4: How to use Huawei HMS MLKit service to quickly develop a photo translation app
Introduction
Flutter is a mobile application development kit for crafting high-quality native experiences on iOS and Android in record time. Even though Flutter alone is enough to build great mobile apps, interactivity, such as map integration, is needed in order to enhance the user experience.
Huawei Map Kit
Huawei Map Kit is a development kit and map service developed by Huawei to easily integrate map-based functions into your apps. The kit currently covers map data of more than 200 countries and regions, supports 40+ languages, provides UI elements such as markers, shapes, and layers to customize your map, and also enables users to interact with the map in your app through gestures and buttons in different scenarios.
With the recently released Huawei Map Kit Flutter Plugin, Huawei developers can now use these features and integrate map-based functions into their Flutter projects. Hence, in this article, to explore the kit and Huawei services, we will build a mobile app featuring a Huawei map using the plugin and the Flutter SDK.
HMS Core Github: https://github.com/HMS-Core/hms-flutter-plugin/tree/master/flutter-hms-map
Required Configurations
Before we get started, to use Huawei Map Kit, and also other Huawei Mobile Services, you should be a Huawei Developer Account holder. For more detailed information on Developer Accounts and how to apply for them, please refer to this link.
Creating an App
· Sign in to AppGallery Connect using your Huawei ID and create a new project to work with by clicking the My projects > Add Project button.
· Click the Add App button to add a new application to your project by filling in the required fields such as name, category, and default language.
· The Map Kit APIs of your app are enabled by default, but just to be sure, you can check them from the Manage APIs tab of your project on AppGallery Connect. You can also refer to the Enabling Services article if you need any help.
· Open Android Studio and create a Flutter application. The package name of your Flutter application should be the same as the package name of the app you created on AppGallery Connect.
· Android requires a signing certificate to verify the authenticity of apps, so you need to generate one for your app. If you don’t know how to generate a signing certificate, please click here for the related article. Copy the generated keystore file to the android/app directory of your project.
A side note: a Flutter project contains folders, such as the ios and android folders, which belong to different platforms; yet Android Studio treats them as Flutter project folders and throws errors on their files. For this reason, before changing anything related to the Android platform, you should right-click the android folder in your project directory and select Flutter > Open Android module in Android Studio. You can easily modify the files and also generate signing certificates from the Android Studio window that opens after selection.
· After generating your signing certificate (keystore), you should extract the SHA-256 fingerprint using keytool, which is provided by the JDK, and add it to AppGallery Connect by navigating to the App Information section of your app. You may also refer to the Generating Fingerprint from a Keystore and Add Fingerprint Certificate to AppGallery Connect articles for further help.
Integrating HMS and Map Plugin to Your Flutter Project
You also need to configure your Flutter application in order to communicate with Huawei to use Map Kit.
· Add the Huawei Map Kit Flutter Plugin as a dependency to your project’s pubspec.yaml file and run the flutter pub get command to integrate the Map Plugin into your project.
Code:
dependencies:
flutter:
sdk: flutter
huawei_map: ^4.0.4+300
· Download the agconnect-services.json file from the App Information section of the AppGallery Connect and copy it to your android/app directory of your project.
Your project’s android directory should look like this after adding both agconnect-services.json and keystore file.
· Add the Maven repository address and AppGallery Connect plugin to the project level build.gradle (android/build.gradle) file.
Code:
buildscript {
repositories {
//other repositories
maven { url 'https://developer.huawei.com/repo/' }
}
dependencies {
//other dependencies
classpath 'com.huawei.agconnect:agcp:1.2.1.301'
}
}
allprojects {
repositories {
//other repositories
maven { url 'https://developer.huawei.com/repo/' }
}
}
· Open your app level build.gradle (android/app/build.gradle) file and add the AppGallery Connect plugin.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect' //Added Line
apply plugin: 'kotlin-android'
· In the same file (android/app/build.gradle), add the signing configurations, and change the minSdkVersion of your project as shown below.
Code:
android {
/*
* Other configurations
*/
defaultConfig {
applicationId "<package_name>" //Your unique package name
minSdkVersion 19 //Change minSdkVersion to 19
targetSdkVersion 28
versionCode flutterVersionCode.toInteger()
versionName flutterVersionName
}
signingConfigs {
config{
storeFile file('<keystore_file>')
storePassword '<keystore_password>'
keyAlias '<key_alias>'
keyPassword '<key_password>'
}
}
buildTypes {
debug {
signingConfig signingConfigs.config
}
release {
signingConfig signingConfigs.config
}
}
}
· Finally, to call the capabilities of Huawei Map Kit, declare the following permissions for your app in your AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
— To obtain the current device location, the following permissions also need to be declared in your AndroidManifest.xml file (on Android 6.0 and later versions, you need to apply for these permissions dynamically). But since we won’t use the current location in our app, this step is optional.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
Using Huawei Map Kit Flutter Plugin
Creating a Map
Now that we are ready to use Huawei’s Map features, let’s implement a simple map.
The Huawei Map Kit Flutter Plugin provides a single widget, called HuaweiMap, for developers to easily create and manage map fragments. By using this widget, developers can enable or disable attributes or gestures of a map, set initial markers, circles, or other shapes, choose the map type, and also set an initial camera position to focus on an area when the map is ready.
Let’s choose a random location from İstanbul as the initial camera position. While declaring the initial camera position, the zoom level, which indicates the magnification, and the target, which indicates the latitude and longitude of the location, are required. You can find my target coordinates and zoom level below, which we will use while creating our map.
Code:
static const LatLng _center = const LatLng(41.027470, 28.999339);
static const double _zoom = 12;
Now that we have an initial position, we should implement the map itself. We will first create a simple Scaffold with a simple AppBar, then create a HuaweiMap object as the Scaffold’s body. The HuaweiMap object, as mentioned before, has different attributes, which you can see below. The following code will create a HuaweiMap object that is full-screen, scrollable, and tiltable, and that also shows buildings and traffic.
Code:
class HomeScreen extends StatefulWidget {
@override
_HomeScreenState createState() => _HomeScreenState();
}
class _HomeScreenState extends State<HomeScreen> {
static const LatLng _center = const LatLng(41.027470, 28.999339);
static const double _zoom = 12;
@override
Widget build(BuildContext context) {
return new Scaffold(
appBar: AppBar(
title: Text("Map Demo"),
centerTitle: true,
backgroundColor: Colors.red,
),
body: HuaweiMap(
initialCameraPosition: CameraPosition(
target: _center,
zoom: _zoom,
),
mapType: MapType.normal,
tiltGesturesEnabled: true,
buildingsEnabled: true,
compassEnabled: true,
zoomControlsEnabled: false,
rotateGesturesEnabled: true,
myLocationButtonEnabled: false,
myLocationEnabled: false,
trafficEnabled: true,
),
);
}
}
To keep the map ‘clean’, I disabled the myLocationEnabled, myLocationButtonEnabled, and zoomControlsEnabled attributes of the map, but do not forget to explore these attributes yourself, since they can greatly improve the user experience of your app.
Huawei Map
Resizing the Map
A full-screen map is not always useful. Since HuaweiMap is a standalone widget, we can resize the map by wrapping it in a Container or ConstrainedBox widget.
For this project, I will create a layout in the Scaffold by using Expanded, Column, and Container widgets. The following code shows a HuaweiMap widget that fills only one-third of the screen.
Code:
class HomeScreen extends StatefulWidget {
@override
_HomeScreenState createState() => _HomeScreenState();
}
class _HomeScreenState extends State<HomeScreen> {
static const LatLng _center = const LatLng(41.027470, 28.999339);
static const double _zoom = 12;
@override
Widget build(BuildContext context) {
return new Scaffold(
appBar: AppBar(
title: Text("Locations"),
centerTitle: true,
backgroundColor: Colors.red,
),
body: Column(
children: [
Expanded(
child: Padding(
padding: EdgeInsets.all(8),
child: Container(
decoration: BoxDecoration(
border: Border.all(color: Colors.green, width: 2)),
child: HuaweiMap(
initialCameraPosition: CameraPosition(
target: _center,
zoom: _zoom,
),
mapType: MapType.normal,
tiltGesturesEnabled: true,
buildingsEnabled: true,
compassEnabled: true,
zoomControlsEnabled: false,
rotateGesturesEnabled: true,
myLocationButtonEnabled: false,
myLocationEnabled: false,
trafficEnabled: true,
),
),
),
),
Expanded(flex: 2, child: Container()),
],
),
);
}
}
Adding Interactivity by Using Markers and CameraUpdate
Let’s assume that we are building an app that shows different restaurant locations as markers on our HuaweiMap object. To do this, we will set some initial markers on our map using the HuaweiMap widget’s markers field.
The markers field of the widget takes a Set of markers, so we should first create a set of Marker objects and then use it as the initial markers of our HuaweiMap widget.
As you know, two-thirds of our screen is empty. To fill the space, we will create some Card widgets that hold each location’s name, motto, and address as Strings. To reduce redundant code, I created a separate widget, called LocationCard, which returns a styled custom Card widget. To keep this article focused, I will not cover the steps of creating a custom card widget, but you can find its code at the project’s GitHub link.
Code:
class HomeScreen extends StatefulWidget {
@override
_HomeScreenState createState() => _HomeScreenState();
}
class _HomeScreenState extends State<HomeScreen> {
static const LatLng _center = const LatLng(41.027470, 28.999339);
static const double _zoom = 12;
//Marker locations
static const LatLng _location1 = const LatLng(41.0329109, 28.9840904);
static const LatLng _location2 = const LatLng(41.0155957, 28.9827176);
static const LatLng _location3 = const LatLng(41.0217315, 29.0111898);
//Set of markers
Set<Marker> _markers = {
Marker(markerId: MarkerId("Location1"), position: _location1),
Marker(markerId: MarkerId("Location2"), position: _location2),
Marker(markerId: MarkerId("Location3"), position: _location3),
};
@override
Widget build(BuildContext context) {
return new Scaffold(
appBar: AppBar(
title: Text("Locations"),
centerTitle: true,
backgroundColor: Colors.red,
),
body: Column(
children: [
Expanded(
child: Padding(
padding: EdgeInsets.all(8),
child: Container(
decoration: BoxDecoration(
border: Border.all(color: Colors.green, width: 2)),
child: HuaweiMap(
initialCameraPosition: CameraPosition(
target: _center,
zoom: _zoom,
),
mapType: MapType.normal,
tiltGesturesEnabled: true,
buildingsEnabled: true,
compassEnabled: true,
zoomControlsEnabled: false,
rotateGesturesEnabled: true,
myLocationButtonEnabled: false,
myLocationEnabled: false,
trafficEnabled: true,
markers: _markers, //Using the set
),
),
),
),
//Styled Card widgets
Expanded(flex: 2, child: Padding(
padding: EdgeInsets.all(8),
child: SingleChildScrollView(
child: Column(
children: [
LocationCard(
title: "Location 1",
motto: "A Fine Dining Restaurant",
address:
"Avrupa Yakası, Cihangir, 34433 Beyoğlu/İstanbul Türkiye",
),
LocationCard(
title: "Location 2",
motto: "A Restaurant with an Extraordinary View",
address:
"Avrupa Yakası, Hoca Paşa, 34110 Fatih/İstanbul Türkiye",
),
LocationCard(
title: "Location 3",
motto: "A Casual Dining Restaurant",
address:
"Anadolu Yakası, Aziz Mahmut Hüdayi, 34672 Üsküdar/İstanbul Türkiye",
)
],
),
),
),
),
],
),
);
}
}
Now we have some custom cards beneath the map object, which also has some initial markers. We will use these custom cards as buttons that zoom in on the desired marker with a smooth camera animation. This way, users can simply tap a card to see the zoomed-in location on the Huawei map and explore the surroundings without leaving the page.
Resized Map with Location Cards
Before turning the cards into buttons, we should first declare a HuaweiMapController object to control the HuaweiMap, then use the HuaweiMap widget’s onMapCreated field to pair the map with its controller. Below, I declared a controller and, with the help of a simple callback, attached it to our HuaweiMap object.
Code:
HuaweiMapController mapController;
void _onMapCreated(HuaweiMapController controller) {
mapController = controller;
}
/*
This section only shows the added line. Remaining code is not changed.
*/
child: HuaweiMap(
initialCameraPosition: CameraPosition(
target: _center,
zoom: _zoom,
),
onMapCreated: _onMapCreated, // Added Line
mapType: MapType.normal,
tiltGesturesEnabled: true,
/*
Rest of the code
*/
We now have a controller for non-user camera moves, so let’s use it. I wrapped the LocationCards with an InkWell widget to provide onTap functionality. There are several useful methods in the plugin’s CameraUpdate class that enable us to zoom in, zoom out, or change the camera position. We will use the newLatLngZoom method to zoom in on the stated location and then, by using the controller’s animateCamera method, animate the camera move to the new camera position. You can find the wrapped LocationCard with the CameraUpdate and the controller below.
Code:
InkWell(
onTap: () {
CameraUpdate cameraUpdate =
CameraUpdate.newLatLngZoom(
_location1, _zoomMarker);
mapController.animateCamera(cameraUpdate);
},
child: LocationCard(
title: "Location 1",
motto: "A Fine Dining Restaurant",
address:
"Avrupa Yakası, Cihangir, 34433 Beyoğlu/İstanbul Türkiye",
)),
The _zoomMarker variable used here is a constant double with a value of 18, and _location1 is the variable we set while creating our markers.
After implementing these steps, tap a card and you will see a smooth camera animation with a change of zoom level in your HuaweiMap widget. Voila!
As I mentioned before, you can also set some circles, polylines, or polygons similar to the markers. Furthermore, you can add some on-click actions both to your map and shapes or markers you set. Do not forget to explore other functionalities that Huawei Map Kit Flutter Plugin offers.
I am leaving the project’s GitHub link in case you want to check or try this example by yourself. You may also find LocationCard’s code and other minor adjustments that I made, from the link.
https://github.com/SerdarCanDev/FlutterHuaweiMapTutorial
Conclusion
Since Huawei created its own services, demand for support in cross-platform frameworks such as Flutter, React Native, Cordova, and Xamarin has increased. To meet this demand, Huawei continuously releases plugins and updates to support its developers. We have already learned how to use Huawei’s Map Kit in our Flutter projects, yet there are several more official plugins for Huawei services to explore.
For further reading, I will provide some links in the “References and Further Reading” section, including an article that showcases another service. You may also ask any questions related to this article in the comments section.
https://developer.huawei.com/consumer/en/hms/huawei-MapKit
https://pub.dev/publishers/developer.huawei.com/packages
Sending Push Notifications on Flutter with Huawei Push Kit Plugin:
https://medium.com/huawei-developers/sending-push-notifications-on-flutter-with-huawei-push-kit-plugin-534787862b4d
AR placement apps have enhanced daily life in a myriad of different ways, from AR furniture placement for home furnishing and interior decorating, and AR fitting in retail, to AR-based 3D models in education, which give students an opportunity to take an in-depth look at the internal workings of objects.
By integrating HUAWEI Scene Kit, you're only eight steps away from launching an AR placement app of your own. ARView, a set of AR-oriented scenario-based APIs in Scene Kit, uses the plane detection capability of AR Engine, along with the graphics rendering capability of Scene Kit, to load and render 3D materials in common AR scenes.
ARView Functions
With ARView, you'll be able to:
1. Load and render 3D materials in AR scenes.
2. Set whether to display the lattice plane (consisting of white lattice points) to help select a plane in a real-world view.
3. Tap an object placed on the lattice plane to select it. Once selected, the object will turn red. You can then move, resize, or rotate it as needed.
Development Procedure
Before using ARView, you'll need to integrate the Scene Kit SDK into your Android Studio project. For details, please refer to Integrating the HMS Core SDK.
ARView inherits from GLSurfaceView, and overrides lifecycle-related methods. It can facilitate the creation of an ARView-based app in just eight steps:
1. Create an ARViewActivity that inherits from Activity. Add a Button to load materials.
Code:
public class ARViewActivity extends Activity {
private ARView mARView;
private Button mButton;
private boolean isLoadResource = false;
}
2. Add an ARView to the layout.
Code:
<com.huawei.hms.scene.sdk.ARView
android:id="@+id/ar_view"
android:layout_width="match_parent"
android:layout_height="match_parent">
</com.huawei.hms.scene.sdk.ARView>
Note: To achieve the desired experience offered by ARView, your app should not support screen orientation changes or split-screen mode; thus, add the following configuration in the AndroidManifest.xml file:
Code:
android:screenOrientation="portrait"
android:resizeableActivity="false"
3. Override the onCreate method of ARViewActivity and obtain the ARView.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_ar_view);
mARView = findViewById(R.id.ar_view);
mButton = findViewById(R.id.button);
}
4. Add a Switch button in the onCreate method to set whether or not to display the lattice plane.
Code:
Switch mSwitch = findViewById(R.id.show_plane_view);
mSwitch.setChecked(true);
mSwitch.setOnCheckedChangeListener(new CompoundButton.OnCheckedChangeListener() {
@Override
public void onCheckedChanged(CompoundButton buttonView, boolean isChecked) {
mARView.enablePlaneDisplay(isChecked);
}
});
Note: Add the Switch button in the layout before using it.
Code:
<Switch
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/show_plane_view"
android:layout_alignParentTop="true"
android:layout_marginTop="15dp"
android:layout_alignParentEnd="true"
android:layout_marginEnd ="15dp"
android:layout_gravity="end"
android:text="@string/show_plane"
android:theme="@style/AppTheme"
tools:ignore="RelativeOverlap" />
5. Add a button callback method. Tapping the button once will load a material, and tapping it again will clear the material.
Code:
public void onBtnClearResourceClicked(View view) {
if (!isLoadResource) {
mARView.loadAsset("ARView/scene.gltf");
isLoadResource = true;
mButton.setText(R.string.btn_text_clear_resource);
} else {
mARView.clearResource();
mARView.loadAsset("");
isLoadResource = false;
mButton.setText(R.string.btn_text_load);
}
}
Note: The onBtnClearResourceClicked method must be registered in the layout attribute onClick of the button, which is tapped to load or clear a material.
6. Override the onPause method of ARViewActivity and call the onPause method of ARView.
Code:
@Override
protected void onPause() {
super.onPause();
mARView.onPause();
}
7. Override the onResume method of ARViewActivity and call the onResume method of ARView.
Code:
@Override
protected void onResume() {
super.onResume();
mARView.onResume();
}
8. Override the onDestroy method for ARViewActivity and call the destroy method of ARView.
Code:
@Override
protected void onDestroy() {
super.onDestroy();
mARView.destroy();
}
9. (Optional) After the material is loaded, use setInitialPose to set its initial status (scale and rotation).
Code:
float[] scale = new float[] { 0.1f, 0.1f, 0.1f };
float[] rotation = new float[] { 0.707f, 0.0f, -0.707f, 0.0f };
mARView.setInitialPose(scale, rotation);
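The rotation array passed to setInitialPose is a quaternion in (x, y, z, w) order. The following is a hedged, stdlib-only Java sketch of how such a quaternion can be derived from an axis-angle rotation; the helper name axisAngleToQuaternion is mine and is not part of the Scene Kit API. A 180° turn around the unit axis (0.707, 0, -0.707) reproduces roughly the values used above:

```java
public class QuaternionDemo {
    // Convert an axis-angle rotation to an (x, y, z, w) quaternion.
    // The axis must be a unit vector; the angle is given in degrees.
    static float[] axisAngleToQuaternion(float ax, float ay, float az, float angleDeg) {
        double half = Math.toRadians(angleDeg) / 2.0;
        float s = (float) Math.sin(half);
        return new float[] { ax * s, ay * s, az * s, (float) Math.cos(half) };
    }

    public static void main(String[] args) {
        // 180° around (0.707, 0, -0.707) yields approximately the
        // rotation used in the setInitialPose example above.
        float[] q = axisAngleToQuaternion(0.707f, 0f, -0.707f, 180f);
        System.out.printf("%.3f %.3f %.3f %.3f%n", q[0], q[1], q[2], q[3]);
    }
}
```

Keeping the axis normalized matters: a non-unit axis would produce a non-unit quaternion and distort the model's pose.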
Effects
You can develop a basic AR placement app simply by calling ARView from Scene Kit, as detailed in the eight steps above. If you are interested in this implementation method, you can view the Scene Kit demo on GitHub.
The ARView capability can be used to do far more than just develop AR placement apps; it can also help you implement a range of engaging functions, such as AR games, virtual exhibitions, and AR navigation features.
GitHub to download demos and sample codes
Huawei provides a map SDK for developers. With this SDK, we can show markers, cluster them, draw routes, shapes, and more. If you don’t have an idea about Huawei Map Kit, you can check this article for more information.
In this article, we will focus on how we can use heat maps with Huawei Map Kit via an external library. Besides heat maps, this library also makes it possible to integrate extra flexible features such as customizable marker clustering, an icon generator, polyline decoding and encoding, spherical geometry, and KML and GeoJSON support.
Huawei Map was released at the end of 2019 and is updated very frequently by Huawei core developers, but it does not have an official helper utility library for now.
So what should we do if we need a feature which Huawei Map Kit does not support?
Power of Open Source
Google shares this utility library on GitHub, so other developers can see the code and contribute to it.
When we look at the repo, we see that the library contains some Google Maps related classes.
Code:
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.Tile;
import com.google.android.gms.maps.model.TileProvider;
Since the Huawei Map core SDK is very similar to the Google Maps SDK, we can replace them with the corresponding Huawei classes.
Code:
import com.huawei.hms.maps.model.LatLng;
import com.huawei.hms.maps.model.Tile;
import com.huawei.hms.maps.model.TileProvider;
The example above is only for the HeatmapTileProvider class. We should replace every GMS related class in the library with its HMS counterpart; otherwise, we will get type mismatch errors.
After replacing all Google dependencies with Huawei dependencies, we can use the library with Huawei Map Kit.
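Doing this swap by hand for every file is tedious, so it can be scripted. Below is a sketch using GNU sed; the map-utils directory name is my assumption for a local copy of the library sources (the sample file here stands in for the real ones), so adjust the path to your checkout:

```shell
# Make a sample copy of one library file (stand-in for the real sources).
mkdir -p map-utils
printf 'import com.google.android.gms.maps.model.LatLng;\n' > map-utils/Sample.java

# Rewrite every GMS map import to its HMS counterpart, in place,
# across all Java sources (GNU sed; on macOS use `sed -i ''`).
find map-utils -name '*.java' -exec \
  sed -i 's/com\.google\.android\.gms\.maps/com.huawei.hms.maps/g' {} +

grep 'import' map-utils/Sample.java
```

After the run, every `com.google.android.gms.maps` prefix in the copied sources is replaced by `com.huawei.hms.maps`.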
Heat Map
Heat maps are a different way of representing data using colors.
The library has components that enable us to easily make heat maps. One of them is the WeightedLatLng class, which can store latitude, longitude, and a weight (intensity). The other one is the simple LatLng class.
In this example we will use the simple LatLng class. Find a dummy JSON file which contains latitude and longitude data, just like below.
Code:
{
"lat": -35.9798,
"lng": 142.917
},
{
"lat": -38.3363,
"lng": 143.783
},
....
If we want to use WeightedLatLng, we also need an intensity value.
Code:
{
"density": 123.4,
"lat": -35.9798,
"lng": 142.917
},
{
"density": 123.4,
"lat": -38.3363,
"lng": 143.783
},
...
Put the JSON file under the res/raw folder.
Code:
private ArrayList<LatLng> readItems(int resource) throws JSONException {
ArrayList<LatLng> list = new ArrayList<>();
InputStream inputStream = getResources().openRawResource(resource);
// The "\\A" delimiter makes next() return the entire stream as one token
String json = new Scanner(inputStream).useDelimiter("\\A").next();
JSONArray array = new JSONArray(json);
for (int i = 0; i < array.length(); i++) {
JSONObject object = array.getJSONObject(i);
double lat = object.getDouble("lat");
double lng = object.getDouble("lng");
list.add(new LatLng(lat, lng));
}
return list;
}
Then we read this raw file and convert it to a list of LatLng objects. Now our data is ready.
After that, we will create a HeatmapTileProvider object using its builder.
Code:
mProvider = new HeatmapTileProvider.Builder().data(mDataList).build();
Use weightedData instead of data if you use WeightedLatLng
Code:
mProvider = new HeatmapTileProvider.Builder()
.weightedData(data) // load our weighted data
.radius(50) // optional, in pixels, can be anything between 20 and 50
.maxIntensity(1000.0) // set the maximum intensity
.build();
Lastly, we have to create a TileOverlay and add this overlay to our map.
Code:
mOverlay = hMap.addTileOverlay(new TileOverlayOptions().tileProvider(mProvider));
Let's run the project and move the camera.
Customizing the Heat Map
We can style the heat map according to our requirements.
For example, we can change the display colors by passing a Gradient to the tile provider's builder.
Code:
private static final int[] ALT_HEATMAP_GRADIENT_COLORS = {
Color.argb(0, 0, 255, 255),// transparent
Color.argb(255 / 3 * 2, 0, 255, 255),
Color.rgb(0, 191, 255),
Color.rgb(0, 0, 127),
Color.rgb(255, 0, 0)
};
public static final float[] ALT_HEATMAP_GRADIENT_START_POINTS = {
0.0f, 0.10f, 0.20f, 0.60f, 1.0f
};
public static final Gradient ALT_HEATMAP_GRADIENT = new Gradient(ALT_HEATMAP_GRADIENT_COLORS,
ALT_HEATMAP_GRADIENT_START_POINTS);
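To see how the gradient colors and start points work together, here is a hedged, stdlib-only Java sketch of the idea behind such a gradient: each start point marks the intensity fraction where the corresponding color applies, and intensities in between are linearly interpolated. This mirrors the concept only, not the library's actual implementation; the argb, colorFor, and lerp helpers are mine:

```java
public class GradientDemo {
    // Colors packed as 0xAARRGGBB, mirroring ALT_HEATMAP_GRADIENT_COLORS above.
    static final int[] COLORS = {
        argb(0, 0, 255, 255),           // transparent
        argb(255 / 3 * 2, 0, 255, 255),
        argb(255, 0, 191, 255),
        argb(255, 0, 0, 127),
        argb(255, 255, 0, 0)
    };
    static final float[] STARTS = { 0.0f, 0.10f, 0.20f, 0.60f, 1.0f };

    // Pack alpha/red/green/blue bytes into a single int (like Android's Color.argb).
    static int argb(int a, int r, int g, int b) {
        return (a << 24) | (r << 16) | (g << 8) | b;
    }

    // Pick the gradient color for an intensity fraction t in [0, 1] by
    // linear interpolation between the two surrounding start points.
    static int colorFor(float t) {
        for (int i = 1; i < STARTS.length; i++) {
            if (t <= STARTS[i]) {
                float f = (t - STARTS[i - 1]) / (STARTS[i] - STARTS[i - 1]);
                return lerp(COLORS[i - 1], COLORS[i], f);
            }
        }
        return COLORS[COLORS.length - 1];
    }

    // Channel-wise linear interpolation between two packed ARGB colors.
    static int lerp(int c1, int c2, float f) {
        int a = (int) (((c1 >>> 24) & 0xFF) + f * (((c2 >>> 24) & 0xFF) - ((c1 >>> 24) & 0xFF)));
        int r = (int) (((c1 >>> 16) & 0xFF) + f * (((c2 >>> 16) & 0xFF) - ((c1 >>> 16) & 0xFF)));
        int g = (int) (((c1 >>> 8) & 0xFF) + f * (((c2 >>> 8) & 0xFF) - ((c1 >>> 8) & 0xFF)));
        int b = (int) ((c1 & 0xFF) + f * ((c2 & 0xFF) - (c1 & 0xFF)));
        return (a << 24) | (r << 16) | (g << 8) | b;
    }

    public static void main(String[] args) {
        // At the exact start points, the configured colors come back unchanged.
        System.out.printf("0x%08X%n", colorFor(1.0f)); // 0xFFFF0000, opaque red
        System.out.printf("0x%08X%n", colorFor(0.0f)); // 0x0000FFFF, fully transparent
    }
}
```

The narrow 0.0 to 0.2 range in STARTS is what makes low-intensity areas fade quickly from transparent to visible color in the real heat map.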
Set the gradient while creating the provider, or after it has been created:
Code:
mProvider.setGradient(ALT_HEATMAP_GRADIENT);
mOverlay.clearTileCache();
// or while creating
mProvider = new HeatmapTileProvider.Builder()
.weightedData(data) // load our weighted data
.radius(50)
.maxIntensity(1000.0)
.gradient(ALT_HEATMAP_GRADIENT)
.build();
We can also change the opacity and the radius of our heat map layer.
Code:
mProvider.setRadius(10); // Default 20
mProvider.setOpacity(0.4); // Default 0.7
mOverlay.clearTileCache();
// or while creating
mProvider = new HeatmapTileProvider.Builder()
.weightedData(data) // load our weighted data
.maxIntensity(1000.0)
.gradient(ALT_HEATMAP_GRADIENT)
.radius(10)
.opacity(0.4)
.build();
Summary
Google provides a utility library for Google Maps; by replacing the GMS related classes with HMS related classes, we can use this utility library with Huawei Map Kit until Huawei develops its own utility library.
You can also check this repo where all GMS related classes are replaced with HMS related classes.
For the original, you can visit https://forums.developer.huawei.com/forumPortal/en/topic/0203417811482880018
Very interesting.
Really helpful
Very useful. Can we show the shortest path using Huawei Map Kit?
Nice. Thank you.