Developing with HMS Awareness Kit - Huawei Developers

Ever wanted your app to know how bright it is in a room, so it can dynamically change its UI? How about whether one of your users is close to a certain location?
If so, Huawei's Awareness Kit SDK is for you. As long as you're targeting Huawei devices, this SDK makes it easy to query and detect various contextual situations.
Let's get started.
Preparation
First up, make sure you have a Huawei Developer Account. This process can take a couple of days, and you'll need one to use this SDK, so be sure to start that as soon as possible. You can sign up at https://developer.huawei.com.
Next, you'll want to obtain the SHA-256 representation of your app's signing key. If you don't have a signing key yet, be sure to create one before continuing. To obtain your signing key's SHA-256, you'll need to use Keytool, which is part of the JDK installation. Keytool is a command-line program: if you're on Windows, open Command Prompt; if you're on Linux, open a terminal.
On Windows, you'll need to "cd" into the directory containing the Keytool executable. For example, if you have JDK 1.8 v231 installed, Keytool will be located at the following path:
Code:
C:\Program Files\Java\jdk1.8.0_231\bin\
Once you find the directory, "cd" into it:
Code:
C: #Make sure you're in the right drive
cd C:\Program Files\Java\jdk1.8.0_231\bin\
Next, you need to find the location of your keystore. Using Android's debug keystore as an example, where the Android SDK is hosted on the "E:" drive in Windows, the path will be as follows:
Code:
E:\AndroidSDK\.android\debug.keystore
(Keytool also supports JKS-format keystores.)
Now you're ready to run the command. On Windows, it'll look something like this:
Code:
keytool -list -v -keystore E:\AndroidSDK\.android\debug.keystore
On Linux, the command should be similar, just using UNIX-style paths instead.
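For example, assuming keytool is on your PATH and you're using the default debug keystore location, it might look like this:
Code:
keytool -list -v -keystore ~/.android/debug.keystore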
Enter the keystore password and the key alias (if applicable), and you'll be presented with output listing your key's certificate fingerprints.
Make note of the SHA-256 field.
SDK Setup
Now we're ready to add the Awareness Kit SDK to your Android Studio project. Go to your Huawei Developer Console and click the HUAWEI AppGallery tile. Agree to the terms of use if prompted.
Click the "My projects" tile here. If you haven't already added your project to the AppGallery, add it now. You'll be asked for a project name. Make it something descriptive so you know what it's for.
Now, you should be on a screen that looks something like the following:
Click the "Add app" button. Here, you'll need to provide some details about your app, like its name and package name.
Once you click OK, some SDK setup instructions will be displayed. Follow them to get everything added to your project. You'll also need to add the following to the "dependencies" section of your app-level build.gradle file:
Code:
implementation 'com.huawei.hms:awareness:1.0.4.301'
If you ever need to come back to these instructions, you can always click the "Add SDK" button after "App information" on the "Project setting" page.
Now you should be back on the "Project setting" page. Find the "SHA-256 certificate fingerprint" field under "App information," click the "+" button, and paste your SHA-256.
Now, go to the Manage APIs tab on the "Project setting" page. Scroll down until you find "Awareness Kit" and make sure it's enabled.
Now, if you're using obfuscation in your app, you'll need to whitelist a few things for HMS to work properly.
For ProGuard:
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
For AndResGuard:
Code:
"R.string.hms*",
"R.string.agc*",
"R.string.connect_server_fail_prompt_toast",
"R.string.getting_message_fail_prompt_toast",
"R.string.no_available_network_prompt_toast",
"R.string.third_app_*",
"R.string.upsdk_*",
"R.layout.hms*",
"R.layout.upsdk_*",
"R.drawable.upsdk*",
"R.color.upsdk*",
"R.dimen.upsdk*",
"R.style.upsdk*
That's it! The Awareness Kit SDK should now be available in your project.
Basic Usage
There are quite a few awareness "modules" in this SDK: Time Awareness, Location Awareness, Behavior Awareness, Beacon Awareness, Audio Device Status Awareness, Ambient Light Awareness, and Weather Awareness. Read on to find out how and when to use them.
Each of these modules has two modes: capture, which is an on-demand information retrieval; and barrier, which triggers an action when a specified condition is met.
Time Awareness
Time awareness is a pretty simple API. You can use it to obtain things like the current time for a specific location, or holidays in the area.
Before you start using it, though, you need to have the user grant permission for your app to retrieve the device's location. Declare one of the following permissions depending on your use-case.
XML:
<!-- Use this permission if you're only going to be using IP-based location -->
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<!-- Use this permission if you need more precise location access, such as GPS. -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
Since both of these permissions are dangerous-level, you'll need to request them at runtime on devices running Marshmallow or later.
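If you need a refresher, here's a minimal sketch of such a runtime request from an Activity, assuming AndroidX and an arbitrary request code you define yourself (PERMISSION_REQUEST_FINE_LOCATION here is just an example constant):
Code:
//PERMISSION_REQUEST_FINE_LOCATION is an example constant you define yourself.
if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION)
    != PackageManager.PERMISSION_GRANTED
) {
    //Ask the user to grant the permission; handle their choice
    //in onRequestPermissionsResult().
    ActivityCompat.requestPermissions(
        this,
        arrayOf(Manifest.permission.ACCESS_FINE_LOCATION),
        PERMISSION_REQUEST_FINE_LOCATION
    )
}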
Here's some code showing you the basics of capturing time values:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Retrieve current time info.
captureClient
//The below run{} block shows possible methods for retrieving
//the current time info. Use the one that best suits your needs.
.run {
//Retrieve the time categories for the current location
//(requires FINE_LOCATION).
timeCategories
//Retrieve the time categories for the current location by IP
//(requires COARSE_LOCATION).
timeCategoriesByIP
//Retrieve the time categories for a specific country.
getTimeCategoriesByCountryCode("US")
//Retrieve the time for the specific lat and long.
//Minimum precision is 2 decimal places.
getTimeCategoriesByUser(12.00, 12.00)
//Retrieve the time categories for the current location
//sometime in the future. The supplied Long should be
//the Unix epoch time in milliseconds. The timestamp
//has to be in the current year.
//Requires COARSE_LOCATION.
getTimeCategoriesForFuture(System.currentTimeMillis() + 1000L)
}
.addOnSuccessListener {
//Retrieval succeeded.
//Get the time categories. A category is either a rough
//indicator of what time it is (morning, afternoon, etc),
//whether or not it's a weekday,
//or whether or not it's currently a holiday.
//The IntArray here should contain three values:
//a time, the weekday status, and the holiday status.
//e.g:
//times[0] == TimeBarrier.TIME_CATEGORY_MORNING
//times[1] == TimeBarrier.TIME_CATEGORY_WEEKDAY
//times[2] == TimeBarrier.TIME_CATEGORY_HOLIDAY
val categories: TimeCategories = it.timeCategories
val times: IntArray = categories.timeCategories
}
.addOnFailureListener {
//Something went wrong!
}
Here's some code showing you how to set up a barrier trigger based on the time:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a period barrier from 11AM to 1PM.
//The barrier will trigger when entering and exiting the specified period.
//You can also listen for when the time is currently in a sunrise/sunset state,
//when a user enters and exits a specific Unix timestamp period, when the user
//is in a certain period of the week, and when the user is in a specific time
//category (defined by TimeBarrier).
val periodOfDayBarrier = TimeBarrier.duringPeriodOfDay(TimeZone.getDefault(), 11 * 60 * 60 * 1000L, 13 * 60 * 60 * 1000L)
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val periodPendingIntent = PendingIntent.getBroadcast(
context,
1,
Intent("PERIOD_OF_DAY_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val periodRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", periodOfDayBarrier, periodPendingIntent)
.build()
barrierClient.updateBarriers(periodRequest)
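For reference, here's a minimal sketch of what such a receiver might look like, assuming it's registered for the "PERIOD_OF_DAY_BARRIER_ACTION" used above (the class name is just an example):
Code:
class PeriodOfDayBarrierReceiver : BroadcastReceiver() {
    override fun onReceive(context: Context, intent: Intent) {
        //Extract the barrier status from the received Intent.
        val barrierStatus = BarrierStatus.extract(intent)
        //The label you passed to addBarrier().
        val label = barrierStatus.barrierLabel
        //Whether the barrier condition is currently met.
        when (barrierStatus.presentStatus) {
            BarrierStatus.TRUE -> { /* Inside the 11AM-1PM period. */ }
            BarrierStatus.FALSE -> { /* Outside the period. */ }
            else -> { /* BarrierStatus.UNKNOWN: the status could not be determined. */ }
        }
    }
}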
Weather Awareness
Weather awareness allows you to query the current weather state of the user's location or a specified location.
Before you start using it, though, you need to have the user grant permission for your app to retrieve the device's location. Declare the following permission:
XML:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
Since this permission is dangerous-level, you'll need to request it at runtime on devices running Marshmallow or later. Location access is optional if you're just getting weather for a pre-defined position.
Here's some code showing the different ways to query the weather:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Retrieve current weather info.
captureClient
//The below run{} block shows various ways to retrieve the
//current weather info.
.run {
//Use the device IP to retrieve weather info.
//Requires COARSE_LOCATION.
weatherByIP
//Use the device's exact location to retrieve
//weather info.
//Requires FINE_LOCATION.
weatherByDevice
//Get the weather for a specified position.
getWeatherByPosition(
WeatherPosition().apply {
//The city parameter is required. The others
//are optional, but can help with accuracy if
//supplied.
this.city = "Paris"
this.country = "FR"
// this.province = "N/A"
// this.county = "N/A"
// this.district = "N/A"
this.locale = "fr_FR"
}
)
}
.addOnSuccessListener {
val status = it.weatherStatus
//Get the weather AQI, like CO, NO2, PM10, PM2.5, SO2, etc.
val aqi = status.aqi
//Get daily weather info for the next 7 days.
val daily: MutableList<DailyWeather> = status.dailyWeather
//Get hourly weather info for the next 24 hours.
val hourly: MutableList<HourlyWeather> = status.hourlyWeather
//Get the living index for the current day and 1-2 days in the future.
val live: MutableList<LiveInfo> = status.liveInfo
//Get the current weather situation.
val situation = status.weatherSituation
val currentAqiValue = aqi.aqiValue
val currentCOLevel = aqi.co
val currentNO2Level = aqi.no2
val currentO3Level = aqi.o3
val currentPM10Level = aqi.pm10
val currentPM25Level = aqi.pm25
val currentSO2Level = aqi.so2
daily.forEach {
//Get the AQI value for this day.
val aqiValue = it.aqiValue
//Get the Unix-epoch timestamp for this date.
val time = it.dateTimeStamp
//Get the max temperature for this day.
val maxTempC = it.maxTempC
val maxTempF = it.maxTempF
//Get the min temperature for this day.
val minTempC = it.minTempC
val minTempF = it.minTempF
}
hourly.forEach {
//Whether this hour is daytime or nighttime.
//true for daytime, false for nighttime.
val isDayNight = it.isDayNight
//Probability it will rain. Percentage.
val rainProbability = it.rainprobability
//Temperature at this hour.
val tempC = it.tempC
val tempF = it.tempF
//Weather type at this hour.
//Possible values can be found in the WeatherId
//interface.
val weatherType = it.weatherId
}
live.forEach {
//Get the type of living index for this LiveInfo.
//1: dressing index
//2: sports index
//3: coldness index
//4: car washing index
//5: tourism index
//6: UV index
val type = it.code
//A list of index levels for the current day and the next 1-2.
it.levelList.forEach {
val time = it.dateTimeStamp
val level = it.level
}
}
//The city this Situation applies to.
val city = situation.city
val cityCode = city.cityCode
val cityName = city.name
val cityProvince = city.provinceName
val cityTimezone = city.timeZone
situation.situation.apply {
//Possible information you can get from the current situation.
this.humidity
this.pressure
this.realFeelC
this.realFeelF
this.temperatureC
this.temperatureF
this.updateTime
this.uvIndex
this.windDir
this.windLevel
this.windSpeed
}
}
Behavior Awareness
Behavior Awareness allows you to detect what the user is currently doing, and trigger events based on behavior changes.
Before you start using it, though, you need to have the user grant permission for your app to recognize the user's activity. Declare the following permissions:
XML:
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />
Since both of these permissions are dangerous-level, you'll need to request them at runtime on devices running Marshmallow or later.
Here's an example of how you can query the user's current behavior:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Check the most likely behavior of the user.
//Requires ACTIVITY_RECOGNITION
captureClient.behavior.addOnSuccessListener {
//Check the list of probable behaviors.
it.behaviorStatus.probableBehavior.forEach {
//Types can be found in the BehaviorBarrier class.
it.type
//How likely this behavior is.
it.confidence
}
//Get the most likely user behavior.
it.behaviorStatus.mostLikelyBehavior.apply {
//Types can be found in the BehaviorBarrier class.
this.type
//How likely this behavior is.
this.confidence
}
//Get the time in milliseconds used for this detection.
it.behaviorStatus.elapsedRealtimeMillis
//Get the detection time in Unix-epoch form.
it.behaviorStatus.time
}
Here's an example of how you can set up an event trigger based on specific user behavior:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a behavior barrier. In this case, detect
//if the user is keeping still.
//You can also set up barriers for "beginning" and "ending"
//any behaviors defined in BehaviorBarrier.
val keepStillBarrier = BehaviorBarrier.keeping(BehaviorBarrier.BEHAVIOR_STILL)
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val keepStillPendingIntent = PendingIntent.getBroadcast(
context,
3,
Intent("KEEP_STILL_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val keepStillRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", keepStillBarrier, keepStillPendingIntent)
.build()
barrierClient.updateBarriers(keepStillRequest)
Location Awareness
Location Awareness allows you to query the user's current or last-known location, and set up triggers based on where the user goes.
Before you start using it, though, you need to have the user grant permission for your app to retrieve the device's location. Declare the following permission:
XML:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
Since this permission is dangerous-level, you'll need to request it at runtime on devices running Marshmallow or later.
Here's how to retrieve the user's location:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Get the user's location.
//Requires FINE_LOCATION.
captureClient
.run {
//Get the cached last-known location of the user.
location
//Get the current location of the user.
currentLocation
}
.addOnSuccessListener {
val location = it.location
//The Location class has a bunch of properties
//for judging the user's location.
}
Here's how to set up a trigger based on the user's location:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a location barrier for a specified radius around
//a certain location. In this case, the barrier will trigger
//when the user enters the specified range.
//You can also define an "exit" barrier.
val enterBarrier = LocationBarrier.enter(12.3456, 12.3456, 200.0)
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val enterPendingIntent = PendingIntent.getBroadcast(
context,
2,
Intent("ENTER_LOCATION_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val enterRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", enterBarrier, enterPendingIntent)
.build()
barrierClient.updateBarriers(enterRequest)
Headset Awareness
Headset Awareness allows you to check if headphones are currently connected, and carry out actions when the connection state changes.
Before you start using it, though, you need permission for your app to manage Bluetooth connections. Declare the following permission:
XML:
<uses-permission android:name="android.permission.BLUETOOTH" />
Here's how you can check the current headset connection state:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Get whether the user has headphones connected.
captureClient.headsetStatus
.addOnSuccessListener {
//The current headset status (connected, disconnected, unknown).
//Constants are in the HeadsetStatus interface.
val status = it.headsetStatus.status
}
Here's how you can set up a trigger based on the connection state:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a headset connection barrier. This will trigger when
//a headset is connecting. You can also detect a "disconnecting"
//event and a "keeping" event.
val headsetConnectingBarrier = HeadsetBarrier.connecting()
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val headsetConnectingPendingIntent = PendingIntent.getBroadcast(
context,
4,
Intent("HEADSET_CONNECTING_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val headsetConnectingRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", headsetConnectingBarrier, headsetConnectingPendingIntent)
.build()
barrierClient.updateBarriers(headsetConnectingRequest)
Bluetooth Car Stereo Awareness
This Awareness module allows you to detect when the user is connected to a Bluetooth car stereo, and carry out actions when that connection state changes.
Before you start using it, though, you need permission for your app to manage Bluetooth connections. Declare the following permission:
XML:
<uses-permission android:name="android.permission.BLUETOOTH" />
Here's how you can check if the user is currently connected to a car stereo:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Get the current Bluetooth status for a specific
//device type.
//e.g: 0 is a car stereo.
captureClient.getBluetoothStatus(/* deviceType */ 0)
.addOnSuccessListener {
//Get the current connection status for the
//device type.
//Values can be found in the BluetoothStatus interface.
it.bluetoothStatus.status
}
Here's how to set up a trigger event:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a Bluetooth connection barrier. This will trigger when
//a matching Bluetooth device is connecting. You can also detect
//a "disconnecting" event.
val btConnectingBarrier = BluetoothBarrier.connecting(/* deviceType */ 0)
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val btConnectingPendingIntent = PendingIntent.getBroadcast(
context,
4,
Intent("BT_CONNECTING_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val btConnectingRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", btConnectingBarrier, btConnectingPendingIntent)
.build()
barrierClient.updateBarriers(btConnectingRequest)
Ambient Light Awareness
Ambient Light Awareness allows you to detect the current ambient light level, and set up events to trigger when that level reaches certain values.
Here's how to query the ambient light level:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Get the current light intensity.
captureClient.lightIntensity
.addOnSuccessListener {
//The current light intensity, as a floating point
//illuminance value.
it.ambientLightStatus.lightIntensity
}
Here's how to set up triggers based on light level changes:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a light level barrier to detect when the
//light level goes above a certain value in lux.
//You can also detect a "below" event and a "range"
//event.
val aboveLightBarrier = AmbientLightBarrier.above(2500f)
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val aboveLightPendingIntent = PendingIntent.getBroadcast(
context,
4,
Intent("ABOVE_LIGHT_CONNECTING_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val aboveLightRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", aboveLightBarrier, aboveLightPendingIntent)
.build()
barrierClient.updateBarriers(aboveLightRequest)
Beacon Awareness
Beacon Awareness allows you to detect whether beacons that match a specified filter are in range, or trigger an event when a beacon appears or disappears.
Before you start using it, though, you need to have the user grant permission for your app to retrieve the device's location, and request permission to use Bluetooth. Declare the following permissions:
XML:
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.BLUETOOTH" />
Since the location permission is dangerous-level, you'll need to request it at runtime on devices running Marshmallow or later.
Here's how you can query beacons:
Code:
val captureClient = Awareness.getCaptureClient(context)
//Query nearby beacon status based on a filter.
//Requires FINE_LOCATION and BLUETOOTH.
captureClient.getBeaconStatus(BeaconStatus.Filter.match("namespace", "type", /* content */ ByteArray(1)))
.addOnSuccessListener {
//Loop through all the beacons found.
it.beaconStatus.beaconData.forEach {
it.content
it.namespace
it.type
}
}
Here's how you can set up events based on beacon availability:
Code:
val barrierClient = Awareness.getBarrierClient(context)
//Create a beacon barrier. This will trigger when
//a matching beacon is discovered.
//You can also trigger barriers based on whether a matching
//beacon is kept or missed.
val beaconBarrier = BeaconBarrier.discover(BeaconStatus.Filter.match("namespace", "type", /* content */ ByteArray(1)))
//Create a receiver Intent for when this barrier is triggered.
//Your PendingIntent should have an explicit BroadcastReceiver class.
//Your receiver should use the BarrierStatus.extract(Intent) method
//in onReceive() to obtain the status.
val beaconPendingIntent = PendingIntent.getBroadcast(
context,
4,
Intent("BEACON_BARRIER_ACTION"),
PendingIntent.FLAG_UPDATE_CURRENT
)
//Create an update request to add this barrier.
val beaconRequest = BarrierUpdateRequest.Builder()
.addBarrier("barrier label", beaconBarrier, beaconPendingIntent)
.build()
barrierClient.updateBarriers(beaconRequest)
Conclusion
And that's it!
The Awareness Kit is quite a comprehensive detection and event SDK. If you're developing a context-aware app for Huawei devices, this is definitely the library for you.
Make sure to check out Huawei's full documentation for updates and more details.

Related

All About Maps - Episode 2: Moving Map Camera to Bounded Regions

For more articles like this, you can visit the HUAWEI Developer Forum and Medium.
Previously on All About Maps: Episode 1:
The principles of clean architecture
The importance of eliminating map provider dependencies with abstraction
Drawing polylines and markers on Mapbox Maps, Google Maps (GMS), and Huawei Maps (HMS)
Episode 2: Bounded Regions
Welcome to the second episode of AllAboutMaps. To understand this blog post better, I would suggest reading Episode 1 first; otherwise, it will be difficult to follow the context.
In this episode we will talk about bounded regions:
The GPX parser datasource will parse the file to get the list of attraction points (waypoints in this case).
The datasource module will emit the bounded region information every 3 seconds.
A rectangular bounded region will be calculated around each attraction point with a given radius, using a utility method (no dependency on any map provider!).
We will move the map camera to the bounded region each time a new bounded region is emitted.
ChangeLog since Episode 1
As we all know, software development is a continuous process. It helps a lot when you have reviewers who can comment on your code and point out issues or come up with suggestions. Since this project is a one-person task, it is not always easy to spot the flaws in the code during implementation. The software gets better and hopefully evolves in a good way when we add new features. Once again, I would like to add the disclaimer that my suggestions here are not silver bullets. There are always better approaches. I am more than happy to hear your suggestions in the comments!
You can see the full code change between episode 1 and 2 here:
https://github.com/ulusoyca/AllAboutMaps/compare/episode_1-parse-gpx...episode_2-bounded-region
Here are the main changes I would like to mention:
1- Added MapLifecycleHandlerFragment.kt base class
In episode 1, I had one feature: show the polyline and markers on the map. The base class of all 3 fragments (RouteInfoMapboxFragment, RouteInfoGoogleFragment and RouteInfoHuaweiFragment) called these lifecycle methods. When I added another feature (showing bounded regions), I realized that the new base class for this feature again implemented the same lifecycle methods. This is against the DRY rule (Don't Repeat Yourself)! Here is the base class I introduced so that each feature's base class can extend it:
Code:
/**
* The base fragment handles map lifecycle. To use it, the mapview classes should implement
* [AllAboutMapView] interface.
*/
abstract class MapLifecycleHandlerFragment : DaggerFragment() {
protected lateinit var mapView: AllAboutMapView
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
mapView.onMapViewCreate(savedInstanceState)
}
override fun onResume() {
super.onResume()
mapView.onMapViewResume()
}
override fun onPause() {
super.onPause()
mapView.onMapViewPause()
}
override fun onStart() {
super.onStart()
mapView.onMapViewStart()
}
override fun onStop() {
super.onStop()
mapView.onMapViewStop()
}
override fun onDestroyView() {
super.onDestroyView()
mapView.onMapViewDestroy()
}
override fun onSaveInstanceState(outState: Bundle) {
super.onSaveInstanceState(outState)
mapView.onMapViewSaveInstanceState(outState)
}
}
Let's see the big picture now:
2- Refactored the abstraction for styles, marker options, and line options.
In the first episode, we encapsulated a dark map style inside each custom MapView. When I intended to use an outdoor map style for the second episode, I realized that my first approach was a mistake. A specific style should not be encapsulated inside the MapView; each feature should be able to select a different style. I moved the responsibility of loading the style from the MapViews to the fragments. Once the style is loaded, the style object is passed to the MapView.
Code:
override fun onViewCreated(view: View, savedInstanceState: Bundle?) {
mapView = binding.mapView
super.onViewCreated(view, savedInstanceState)
binding.mapView.getMapAsync { mapboxMap ->
binding.mapView.onMapReady(mapboxMap)
mapboxMap.setStyle(Style.OUTDOORS) {
binding.mapView.onStyleLoaded(it)
onMapStyleLoaded()
}
}
}
I also realized the need for MarkerOptions and LineOptions entities in our domain module:
Code:
data class MarkerOptions(
var latLng: LatLng,
var text: String? = null,
@DrawableRes var iconResId: Int,
var iconMapStyleId: String,
@ColorRes var iconColor: Int,
@ColorRes var textColor: Int
)
Code:
data class LineOptions(
var latLngs: List<LatLng>,
@DimenRes var lineWidth: Int,
@ColorRes var lineColor: Int
)
The above entities have properties based on the needs of my project. I only care about the color, text, location, and icon properties of the marker. For the polyline, I will customize the width, color, and text properties. If your project needs to customize the marker offset, opacity, line join type, or other properties, feel free to add them.
These entities are mapped to corresponding map provider classes:
LineOptions:
Code:
private fun LineOptions.toGoogleLineOptions(context: Context) = PolylineOptions()
.color(ContextCompat.getColor(context, lineColor))
.width(resources.getDimension(lineWidth))
.addAll(latLngs.map { it.toGoogleLatLng() })
Code:
private fun LineOptions.toHuaweiLineOptions(context: Context) = PolylineOptions()
.color(ContextCompat.getColor(context, lineColor))
.width(resources.getDimension(lineWidth))
.addAll(latLngs.map { it.toHuaweiLatLng() })
Code:
private fun LineOptions.toMapboxLineOptions(context: Context): MapboxLineOptions {
val color = ColorUtils.colorToRgbaString(ContextCompat.getColor(context, lineColor))
return MapboxLineOptions()
.withLineColor(color)
.withLineWidth(resources.getDimension(lineWidth))
.withLatLngs(latLngs.map { it.toMapboxLatLng() })
}
MarkerOptions
Code:
private fun DomainMarkerOptions.toGoogleMarkerOptions(): GoogleMarkerOptions {
var markerOptions = GoogleMarkerOptions()
.icon(BitmapDescriptorFactory.fromResource(iconResId))
.position(latLng.toGoogleLatLng())
markerOptions = text?.let { markerOptions.title(it) } ?: markerOptions
return markerOptions
}
Code:
private fun DomainMarkerOptions.toHuaweiMarkerOptions(context: Context): HuaweiMarkerOptions {
BitmapDescriptorFactory.setContext(context)
var markerOptions = HuaweiMarkerOptions()
.icon(BitmapDescriptorFactory.fromResource(iconResId))
.position(latLng.toHuaweiLatLng())
markerOptions = text?.let { markerOptions.title(it) } ?: markerOptions
return markerOptions
}
Code:
private fun DomainMarkerOptions.toMapboxSymbolOptions(context: Context, style: Style): SymbolOptions {
val drawable = ContextCompat.getDrawable(context, iconResId)
val bitmap = BitmapUtils.getBitmapFromDrawable(drawable)!!
style.addImage(iconMapStyleId, bitmap)
val iconColor = ColorUtils.colorToRgbaString(ContextCompat.getColor(context, iconColor))
val textColor = ColorUtils.colorToRgbaString(ContextCompat.getColor(context, textColor))
var symbolOptions = SymbolOptions()
.withIconImage(iconMapStyleId)
.withLatLng(latLng.toMapboxLatLng())
.withIconColor(iconColor)
.withTextColor(textColor)
symbolOptions = text?.let { symbolOptions.withTextField(it) } ?: symbolOptions
return symbolOptions
}
There are minor technical details for handling the differences between the map provider APIs, but they are outside the scope of this blog post.
Earlier our methods for drawing polyline and marker looked like this:
Code:
fun drawPolyline(latLngs: List<LatLng>, @ColorRes mapLineColor: Int)
fun drawMarker(latLng: LatLng, icon: Bitmap, name: String?)
After this refactor they look like this:
Code:
fun drawPolyline(lineOptions: LineOptions)
fun drawMarker(markerOptions: MarkerOptions)
It is a code smell when the number of arguments in a method increases every time you add a new feature. That's why we created data holders to pass around.
3- A secondary constructor method for LatLng
While working on this feature, I realized that a secondary constructor that builds the LatLng entity from double values would also be useful when mapping the entities of the different map providers. I mentioned the reason why I use inline classes for Latitude and Longitude in the first episode.
Code:
inline class Latitude(val value: Float)
inline class Longitude(val value: Float)
data class LatLng(
val latitude: Latitude,
val longitude: Longitude
) {
constructor(latitude: Double, longitude: Double) : this(
Latitude(latitude.toFloat()),
Longitude(longitude.toFloat())
)
val latDoubleValue: Double
get() = latitude.value.toDouble()
val lngDoubleValue: Double
get() = longitude.value.toDouble()
}
Bounded Region
A bounded region is used to describe a particular area (in many cases rectangular) on a map. We usually need two coordinate pairs to describe a region: southwest and northeast. It is described well in this Stack Overflow answer (https://stackoverflow.com/a/31029389):
As expected, Mapbox, GMS, and HMS maps all provide LatLngBounds classes. However, they require a pair of coordinates to construct the bounds, and in our case we only have one location for each attraction point. We want to show a region with a given radius around that center on the map, so we need to do a little extra work to calculate the coordinate pair (a rough sketch of one way to do this follows the entity below). But first, let's add the LatLngBounds entity to our domain module:
Code:
data class LatLngBounds(
val southwestCorner: LatLng,
val northeastCorner: LatLng
)
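For illustration only (this is not necessarily the project's actual utility method), a rough way to build such a bounds around a center point is to offset the latitude and longitude by the radius in meters, using roughly 111,320 meters per degree of latitude:
Code:
import kotlin.math.cos

//Illustrative sketch: approximate a rectangular LatLngBounds around a center
//point, with radiusInMeters as the distance from the center to each edge.
fun boundsFromCenter(center: LatLng, radiusInMeters: Double): LatLngBounds {
    val metersPerDegreeLat = 111_320.0
    val latOffset = radiusInMeters / metersPerDegreeLat
    val lngOffset = radiusInMeters /
        (metersPerDegreeLat * cos(Math.toRadians(center.latDoubleValue)))
    return LatLngBounds(
        southwestCorner = LatLng(center.latDoubleValue - latOffset, center.lngDoubleValue - lngOffset),
        northeastCorner = LatLng(center.latDoubleValue + latOffset, center.lngDoubleValue + lngOffset)
    )
}
The resulting domain-level LatLngBounds can then be mapped to each provider's own bounds class, just like the other entities.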
Implementation
First, let's see the big (literally!) picture:
Thanks to our clean architecture, it is very easy to add a new feature with a new use case. Let's start with the domain module as always:
Code:
/**
* Emits the list of waypoints with a given update interval
*/
class StartWaypointPlaybackUseCase
@Inject constructor(
private val routeInfoRepository: RouteInfoRepository
) {
suspend operator fun invoke(
points: List<Point>,
updateInterval: Long
): Flow<Point> {
return routeInfoRepository.startWaypointPlayback(points, updateInterval)
}
}
The user interacts with the app to start the playback of waypoints. I call this playback because playback is "the reproduction of previously recorded sounds or moving images." We have a list of points to be played back over a given time, and we will move the map camera periodically from one bounded region to another. The waypoints are emitted from the datasource with a given update interval. The domain module doesn't know the implementation details; it sends the request to our datasource module.
Let's see our datasource module. We added a new method in RouteInfoDataRepository:
Code:
override suspend fun startWaypointPlayback(
points: List<Point>,
updateInterval: Long
): Flow<Point> = flow {
val routeInfo = gpxFileDatasource.parseGpxFile()
routeInfo.wayPoints.forEachIndexed { index, waypoint ->
if (index != 0) {
delay(updateInterval)
}
emit(waypoint)
}
}.flowOn(Dispatchers.Default)
Thanks to Kotlin coroutines, it is very simple to emit the points with a delay. Roman Elizarov describes the Flow API in a very neat diagram below. If you are interested in learning more about it, his talks are the best place to start.
Long story short, our app module invokes the use case from the domain module, the domain module forwards the request to the datasource module, the corresponding repository class inside the datasource module gets the data from the GPX datasource, and the datasource module orchestrates the data flow.
For the full content, you can visit the HUAWEI Developer Forum.

Building Contextual Apps with Huawei Awareness Kit : Capture API

For more articles like this, you can visit the HUAWEI Developer Forum.
Hello everyone! In this article, we're going to take a look at the Awareness Kit features so we can easily add them to our apps.
Providing dynamic and effective user experiences in your app is important, and Huawei Awareness Kit allows this to be done quickly and economically. It has a collection of both contextual and location-based features, such as the user's current time, location, behavior, audio device status, ambient light, weather, and nearby beacons.
Huawei Awareness Kit also strongly emphasizes power and memory consumption when accessing these features, helping to keep the battery life and memory usage of your app in check.
To use these features, Awareness Kit has two different sections:
Capture API
The Capture API allows your app to request the current user status, such as time, location, behavior, and whether a headset is connected.
Barrier API
The Barrier API allows your app to set a combination of contextual conditions. When the preset contextual conditions are met, your app will receive a notification.
I created a sample app which I’ve open-sourced on GitHub for enthusiasts who want to use this kit.
Setting up the Awareness Kit
You’ll need to do setup before you can use the Awareness Kit.You can follow the official documentation on how to prepare app on App Gallery Connect.
Add the required dependencies to the build.gradle file under app folder.We need awareness and location kit dependencies.
Code:
implementation 'com.huawei.hms:awareness:1.0.3.300'
implementation 'com.huawei.hms:location:4.0.4.300'
Add the required permissions to the AndroidManifest.xml file under the app/src/main folder.
Code:
<!-- Location permission, which is sensitive. After the permission is declared, you need to dynamically apply for it in the code. -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" /> <!-- Location permission. If only the IP address-based time capture function is used, you can declare only this permission. -->
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" /> <!-- If your app uses the barrier capability and runs on Android 10 or later, you need to add the background location access permission, which is also a sensitive permission. -->
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<uses-permission android:name="android.permission.BLUETOOTH" /> <!-- Behavior detection permission (Android 10 or later). This permission is sensitive and needs to be dynamically applied for in the code after being declared. -->
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" /> <!-- Behavior detection permission (Android 9). This permission is sensitive and needs to be dynamically applied for in the code after being declared. -->
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />
Don’t forget checking permissions in the app.
Checking location permission:
Code:
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.ACCESS_FINE_LOCATION) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, arrayOf(Manifest.permission.ACCESS_FINE_LOCATION), PERMISSION_REQUEST_ACCESS_FINE_LOCATION)
return
}
Capture API
Firstly, We’re going to examine the Capture API in depth.This API allows us to request the current user status from current environment.Using this API we can retrieve data such as:
The current local time or the time of a specified location, including whether it is a workday, a weekend, etc.
The user's current location.
The current activity behavior, such as walking, running, cycling, driving, or staying still.
Whether the device has approached, connected to, or disconnected from a registered beacon.
Whether the user currently has their headphones plugged in or unplugged.
The status of an audio device (connected or disconnected).
The illuminance of the environment where the device is located, in lux.
The weather status based on where the user is currently located.
Let’s take a deep look at these different awarenesses that we can retrieve from the Capture API.
Time Awareness
Using the Capture API, we can obtain the holiday information of most countries/regions and the sunrise and sunset times of cities around the world. Some simple use cases could be:
A meditation app that shows a notification reminding users to take their morning rest.
A gift card app that sends virtual holiday wish cards to users.
To obtain the time categories from the Capture API, we simply need to call the timeCategories method. This will return an instance of the TimeCategoriesResponse class that, if successful, will contain the time categories as an integer array.
Code:
private fun getTimeCategories() {
try {
// Use getTimeCategories() to get the information about the current time of the user location.
// Time information includes whether the current day is a workday or a holiday, and whether it is currently morning, afternoon, evening, or night.
val task = Awareness.getCaptureClient(this).timeCategories
task.addOnSuccessListener { timeCategoriesResponse ->
val timeCategories = timeCategoriesResponse.timeCategories
// do anything with time categories array
}.addOnFailureListener { e ->
Toast.makeText(
applicationContext, "get Time Categories failed",
Toast.LENGTH_SHORT
).show()
Log.e(TAG, "get Time Categories failed", e)
}
} catch (e: Exception) {
logViewTv.text = "get Time Categories failed.Exception:" + e.message
}
}
Location Awareness
Using the Capture API, we can get information about where the user currently is. Some simple use cases could be:
A shopping app can show nearby stores based on the user's current location.
An event app can show upcoming events based on the user's location.
To get the location from the Capture API, we simply need to call the location method. This will return an instance of the LocationResponse class that, if successful, will contain information about the user's current location, including latitude, longitude, accuracy, and altitude.
Code:
private fun getLocation(){
Awareness.getCaptureClient(this).location
.addOnSuccessListener { locationResponse ->
val location: Location = locationResponse.location
val latitude = location.latitude
val longitude = location.longitude
val accuracy = location.accuracy
val altitude = location.altitude
}
.addOnFailureListener {
logViewTv.text = "get location failed:" + it.message
}
}
Behavior Awareness
We can use the Capture API to detect the current activity behavior. Some simple use cases could be:
A fitness app can log user activity in the background and build a graph from this data.
An ebook listening app that recommends listening to a book when the user is walking.
To get the current user behavior from the Capture API, we simply need to call the behavior method. This will return an instance of the BehaviorResponse class that, if successful, will contain information about the user's current context.
Code:
private fun getUserBehavior(){
Awareness.getCaptureClient(this).behavior
.addOnSuccessListener { behaviorResponse ->
val behaviorStatus = behaviorResponse.behaviorStatus
val mostLikelyBehavior = behaviorStatus.mostLikelyBehavior
val mostLikelyBehaviors = behaviorStatus.probableBehavior
val mostProbableActivity = mostLikelyBehavior.type
val detectedBehavior: Long = behaviorStatus.time
val elapsedTime: Long = behaviorStatus.elapsedRealtimeMillis
}
.addOnFailureListener {
logViewTv.text = "get behavior failed: " + it.message
}
}
Beacon Awareness
We can use the Capture API to get registered beacons nearby. You need to register beacon devices with your project; for details, please refer to Beacon Management. Some simple use cases could be:
When the user enters a store, the application can show that person a special discount.
A bus app that shows push notifications about buses going in that direction.
To get any nearby registered beacons from the Capture API, we simply need to call the getBeaconStatus method. This will return an instance of the BeaconStatusResponse class that, if successful, will contain information about nearby registered beacons.
Code:
private fun getBeacons(){
val namespace = "sample namespace"
val type = "sample type"
val content = byteArrayOf(
's'.toByte(),
'a'.toByte(),
'm'.toByte(),
'p'.toByte(),
'l'.toByte(),
'e'.toByte()
)
val filter = BeaconStatus.Filter.match(namespace, type, content)
Awareness.getCaptureClient(this).getBeaconStatus(filter)
.addOnSuccessListener { beaconStatusResponse ->
val beaconDataList = beaconStatusResponse.beaconStatus.beaconData
if (beaconDataList != null && beaconDataList.size != 0) {
var i = 1
val builder = StringBuilder()
for (beaconData in beaconDataList) {
builder.append("Beacon Data ").append(i)
builder.append(" namespace:").append(beaconData.namespace)
builder.append(",type:").append(beaconData.type)
builder.append(",content:").append(Arrays.toString(beaconData.content))
builder.append("; ")
i++
}
} else {
logViewTv.text = "no beacons match filter nearby"
}
}
.addOnFailureListener { e ->
logViewTv.text = "get beacon status failed: " + e.message
}
}
Headset Awareness
Using the Capture API, we can find out whether headphones are connected. Some simple use cases could be:
A music app can stop playing a song when the user disconnects their headphones.
An ebook listening app can show a notification allowing quick access to the app when the user connects their headphones.
To obtain the headphone status from the Capture API, we simply need to call the headsetStatus method. This will return an instance of the HeadsetStatusResponse class that, if successful, will contain information about the device's current headphone status.
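A minimal sketch, following the same pattern as the other capture calls in this sample (logViewTv is assumed to be a TextView in the layout):
Code:
private fun getHeadsetStatus() {
    Awareness.getCaptureClient(this).headsetStatus
        .addOnSuccessListener { headsetStatusResponse ->
            val headsetStatus = headsetStatusResponse.headsetStatus
            val status = headsetStatus.status
            //Constants are defined in the HeadsetStatus interface.
            logViewTv.text = "The headset is " +
                if (status == HeadsetStatus.CONNECTED) "connected" else "disconnected"
        }
        .addOnFailureListener { e ->
            logViewTv.text = "get headset status failed: " + e.message
        }
}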
Bluetooth Car Stereo Awareness
Using the Capture API, we can find out whether a Bluetooth car stereo is connected. Some simple use cases could be:
A music app that shows a notification offering to play a song when the car stereo connects.
A radio app that offers to play a station when the car stereo connects.
To get the car stereo status from the Capture API, we simply need to call the getBluetoothStatus method. This will return an instance of the BluetoothStatusResponse class that, if successful, will contain information about the car stereo connection status.
Code:
private fun getCarStereoStatus() {
val deviceType = 0 // Value 0 indicates a Bluetooth car stereo.
Awareness.getCaptureClient(this).getBluetoothStatus(deviceType)
.addOnSuccessListener { bluetoothStatusResponse ->
val bluetoothStatus = bluetoothStatusResponse.bluetoothStatus
val status = bluetoothStatus.status
val stateStr = "The Bluetooth car stereo is " + if (status == BluetoothStatus.CONNECTED) "connected" else "disconnected"
}
.addOnFailureListener { e ->
logViewTv.text = "get bluetooth status failed: " + e.message
}
}
Ambient Light Awareness
Using the Capture API, we can obtain information about the ambient light. Some simple use cases could be:
A book reading application can adjust the brightness of the screen according to the ambient light to protect the user's eyesight.
A video watching application can adjust the brightness of the screen according to the ambient light for the same reason.
To get the ambient light level from the Capture API, we simply need to call the lightIntensity method. This will return an instance of the AmbientLightResponse class that, if successful, will contain the ambient light level in lux.
Code:
private fun getAmbientLight() {
Awareness.getCaptureClient(this).lightIntensity
.addOnSuccessListener { ambientLightResponse ->
val ambientLightStatus = ambientLightResponse.ambientLightStatus
logViewTv.text = "Light intensity is " + ambientLightStatus.lightIntensity + " lux"
}
.addOnFailureListener { e ->
logViewTv.text = "get light intensity failed" + e.message
}
}
Weather Awareness
We can also use the Capture API to retrieve the weather status for where the user is currently located. Some simple use cases could be:
A music app can recommend a playlist to listen to, depending on the weather.
A fitness app can give a weather alert to users before running outside.
To get the weather status from the Capture API, we simply need to call the weatherByDevice method. This will return an instance of the WeatherStatusResponse class if successful.
Code:
private fun getWeather(){
Awareness.getCaptureClient(this).weatherByDevice
.addOnSuccessListener { weatherStatusResponse ->
val weatherStatus = weatherStatusResponse.weatherStatus
val weatherSituation = weatherStatus.weatherSituation
val situation = weatherSituation.situation
cityNameTv.text = "City: " + weatherSituation.city.name
weatherIdTv.text = "Weather id is: " + situation.weatherId
cnWeatherIdTv.text = "CN Weather id is: " + situation.cnWeatherId
temperatureCTv.text = "Temperature is: " + situation.temperatureC + "℃"
temperatureFTv.text = "Temperature is: " + situation.temperatureF + "℉"
windSpeedTv.text = "Wind speed is: " + situation.windSpeed
windDirectionTv.text = "Wind direction is: " + situation.windDir
humidityTv.text = "Humidity is : " + situation.humidity
}
.addOnFailureListener {
logViewTv.text = "get weather failed: " + it.message
}
}
Awareness Kit can also be useful for restaurants to promote offers to their customers with a message.

Using HMS Site Kit with Clean Architecture + MVVM

Introduction
Hello again my fellow HMS enthusiasts, long time no see…or talk…or write / read… you know what I mean. My new article is about integrating one of Huawei's kits, namely Site Kit, in a project using Clean Architecture and MVVM to bring the user a great experience whilst making it easy for the developers to test and maintain the application.
Before starting with the project, we have to delve into the architecture of the project so we don't get confused later on when checking the separation of the files.
Clean Architecture
The software design behind Clean Architecture aims to separate the design elements so that the organization of the layers is clean and easy to develop, maintain, and test, and the business logic is completely encapsulated.
The design elements are split into circular layers, and the most important rule is the dependency rule, stating that the inner layers' functionality has no dependency on the outer ones. The Clean Architecture adaptation I have chosen to illustrate uses the simple app, data, and domain layering, from the outside in.
The domain layer is the innermost layer of the architecture, where all the business logic, i.e. the core functionality of the code, is maintained. It is completely encapsulated from the rest of the layers, since it tends not to change throughout the development of the code. This layer contains the Entities, Use Cases, and Repository Interfaces.
The middle circle, or layer, is the data layer, containing Repository Implementations as well as Data Sources; it depends on the domain layer.
The outer layer is the app layer, or presentation layer of the application, containing Activities and Fragments modeled by ViewModels which execute the use cases of the domain layer. It depends on both the data and domain layers.
The workflow of Clean Architecture using MVVM (Model-View-ViewModel) is as follows:
The fragments call certain methods from the ViewModels.
The ViewModels execute the Use Cases attached to them.
The Use Case makes use of the data coming from the repositories.
The Repositories return the data from either a local or remote Data Source.
From there, the data returns to the user interface through MutableLiveData observation so we can display it to the user. Hence, the request goes from the app ring to the data ring, and the data flows all the way back.
Now that we have clarified Clean Architecture, we will move briefly to MVVM to make everything clearer for the reader.
MVVM Architecture
This is another architecture used with the aim of facilitating the developer's work and separating the development of the graphical interface. It consists of the Model-View-ViewModel pattern, which was briefly mentioned in the previous sections.
This software pattern consists of Views, ViewModels, and Models (duhhh, how did I come up with that?!). The View is basically the user interface, made up of Activities and Fragments supporting a set of use cases, and it is connected through DataBinding to the ViewModel, which serves as an intermediary between the View and the Model, or else between the UI and the backing logic for all the use cases and methods called in the UI.
Why did I choose MVVM with Clean Architecture? Because when projects grow from small to medium or expand into bigger ones, the separation of responsibilities becomes harder as the codebase grows, making the project more error-prone and increasing the difficulty of development, testing, and maintenance.
With that being said, we can now move on to developing with Site Kit using Clean Architecture + MVVM.
Site Kit
Before you are able to integrate Site Kit, you should create an application and perform the necessary configurations by following this post. Afterwards, we can start.
Site Kit is a site search service offered by Huawei that helps users find places and points of interest, including but not limited to the name of the place, its location, and its address. It can also make suggestions using the autocomplete function, or use coordinates to give users the written address and time zone. In this scenario, we will search for restaurants based on the type of food they offer, and we include 6 main types: burger, pizza, taco, kebab, coffee, and dessert.
Now, since there is no function in Site Kit that lets us search for a point of interest (POI) based on such types, we will instead conduct a text search where the query is the type of restaurant we have picked. In the UI, or View, we call this function with the type of food passed as an argument.
Code:
type = args.type.toString()
type?.let { viewModel.getSitesWithKeyword(type,41.0082,28.9784) }
Since we are using MVVM, we need the ViewModel to call the use case for us. Hence, in the ViewModel we add the following function to invoke the use case and then collect the live data that comes back as a response when we move back up the data flow.
Code:
class SearchInputViewModel @ViewModelInject constructor(
private val getSitesWithKeywordUseCase: GetSitesWithKeywordUseCase
) : BaseViewModel() {
private val _restaurantList = MutableLiveData<ResultData<List<Restaurant>>>()
val restaurantList: LiveData<ResultData<List<Restaurant>>>
get() = _restaurantList
@InternalCoroutinesApi
fun getSitesWithKeyword(keyword: String, latitude: Double, longitude: Double) {
viewModelScope.launch(Dispatchers.IO) {
getSitesWithKeywordUseCase.invoke(keyword, latitude, longitude).collect { it ->
handleTask(it) {
_restaurantList.postValue(it)
}
}
}
}
companion object {
private const val TAG = "SearchInputViewModel"
}
}
After passing through the app layer of the onion, we now call the UseCase implemented on the domain side, where we inject the SitesRepository interface so that the UseCase can make use of the data flowing in from the Repository.
Code:
class GetSitesWithKeywordUseCase @Inject constructor(private val repository: SitesRepository) {
suspend operator fun invoke(keyword:String, lat: Double, lng: Double): Flow<ResultData<List<Restaurant>>> {
return repository.getSitesWithKeyword(keyword,lat,lng)
}
}
Code:
interface SitesRepository {
suspend fun getSitesWithKeyword(keyword:String, lat:Double, lng: Double): Flow<ResultData<List<Restaurant>>>
}
The SitesRepository interface in the domain layer is actually implemented by SitesRepositoryImpl in the data layer, which returns data from the remote Sites DataSource through an interface and uses a mapper to map the Site results to a data class of type Restaurant (since we are getting data about restaurants).
Code:
@InternalCoroutinesApi
class SitesRepositoryImpl @Inject constructor(
    private val sitesRemoteDataSource: SitesRemoteDataSource,
    private val restaurantMapper: Mapper<Restaurant, Site>
) : SitesRepository {

    override suspend fun getSitesWithKeyword(keyword: String, lat: Double, lng: Double): Flow<ResultData<List<Restaurant>>> =
        flow {
            emit(ResultData.Loading())
            val response = sitesRemoteDataSource.getSitesWithKeyword(keyword, lat, lng)
            when (response) {
                is SitesResponse.Success -> {
                    val sites = response.data.sites.orEmpty()
                    val restaurants = restaurantMapper.mapToEntityList(sites)
                    emit(ResultData.Success(restaurants))
                    Log.d(TAG, "ResultData.Success emitted ${restaurants.size}")
                }
                is SitesResponse.Error -> {
                    emit(ResultData.Failed(response.errorMessage))
                    Log.d(TAG, "ResultData.Error emitted ${response.errorMessage}")
                }
            }
        }

    companion object {
        private const val TAG = "SitesRepositoryImpl"
    }
}
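Neither the Mapper abstraction nor the Restaurant model is shown in the article, so here is a minimal sketch of what they could look like; the Restaurant fields and the Mapper contract are assumptions chosen only to match the mapToEntityList() call above.
Code:
import com.huawei.hms.site.api.model.Site
import javax.inject.Inject

// Hypothetical domain model; the article's Restaurant class is not shown,
// so these fields are assumptions.
data class Restaurant(
    val name: String?,
    val address: String?,
    val latitude: Double?,
    val longitude: Double?
)

// Assumed shape of the Mapper abstraction injected into SitesRepositoryImpl.
interface Mapper<E, T> {
    fun mapToEntity(type: T): E
    fun mapToEntityList(types: List<T>): List<E> = types.map { mapToEntity(it) }
}

class RestaurantMapper @Inject constructor() : Mapper<Restaurant, Site> {
    override fun mapToEntity(type: Site): Restaurant = Restaurant(
        name = type.name,
        address = type.formatAddress,
        latitude = type.location?.lat,
        longitude = type.location?.lng
    )
}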
The SitesRemoteDataSource interface, in turn, only serves as an interface for the implementation of the real data source (SitesRemoteDataSourceImpl) and receives the SitesResponse coming from it.
Code:
interface SitesRemoteDataSource {
    suspend fun getSitesWithKeyword(keyword: String, lat: Double, lng: Double): SitesResponse<TextSearchResponse>
}
Code:
@ExperimentalCoroutinesApi
class SitesRemoteDataSourceImpl @Inject constructor(private val sitesService: SitesService) :
    SitesRemoteDataSource {

    override suspend fun getSitesWithKeyword(keyword: String, lat: Double, lng: Double): SitesResponse<TextSearchResponse> {
        return sitesService.getSitesByKeyword(keyword, lat, lng)
    }
}
We roll back up to the Repository whatever the response is; in the success case specifically, meaning we got a list of sites back in the SitesResponse, we map those results to the Restaurant data class and keep rolling the data flow back up until we can observe it in the ViewModel through the MutableLiveData and display it in the UI Fragment.
Code:
sealed class SitesResponse<T> {
    data class Success<T>(val data: T) : SitesResponse<T>()
    data class Error<T>(val errorMessage: String, val errorCode: String) : SitesResponse<T>()
}
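The ResultData wrapper used throughout the flow is also not shown in the article; here is a minimal sketch consistent with the Loading, Success and Failed states used in SitesRepositoryImpl (its exact shape in the original project may differ).
Code:
// Assumed shape of the ResultData wrapper, derived from how it is used above.
sealed class ResultData<T> {
    class Loading<T> : ResultData<T>()
    data class Success<T>(val data: T? = null) : ResultData<T>()
    data class Failed<T>(val message: String? = null) : ResultData<T>()
}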
However, before we can roll anything back, we first need a SitesResponse at all. For that we implement the framework-level SitesService, where we make the actual API request, in our case a TextSearchRequest, by injecting Site Kit's SearchService and setting the type of food the user chose as the query and RESTAURANT as the POI type.
Code:
@ExperimentalCoroutinesApi
class SitesService @Inject constructor(private val searchService: SearchService) {

    suspend fun getSitesByKeyword(keyword: String, lat: Double, lng: Double) =
        suspendCoroutine<SitesResponse<TextSearchResponse>> { continuation ->
            val callback = object : SearchResultListener<TextSearchResponse> {
                override fun onSearchResult(p0: TextSearchResponse) {
                    continuation.resume(SitesResponse.Success(data = p0))
                    Log.d(TAG, "SitesResponse.Success ${p0.totalCount} emitted to flow controller")
                }

                override fun onSearchError(p0: SearchStatus) {
                    continuation.resume(
                        SitesResponse.Error(
                            errorCode = p0.errorCode,
                            errorMessage = p0.errorMessage
                        )
                    )
                    Log.d(TAG, "SitesResponse.Error emitted to flow controller")
                }
            }

            val request = TextSearchRequest()
            val locationIstanbul = Coordinate(lat, lng)
            request.apply {
                query = keyword
                location = locationIstanbul
                hwPoiType = HwLocationType.RESTAURANT
                radius = 1000
                pageSize = 20
                pageIndex = 1
            }
            searchService.textSearch(request, callback)
        }

    companion object {
        const val TAG = "SitesService"
    }
}
After making the text search request, we get the result from the callback as a SitesResponse and start the data flow back up by passing it to the DataSource, from there to the Repository, then to the UseCase, and finally we observe the data in the ViewModel and display it in the Fragment / UI.
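The Fragment-side observation is not shown in the article; a minimal sketch (assumption) of how the Fragment might observe the LiveData exposed by SearchInputViewModel and react to each ResultData state, where showLoading(), showError() and restaurantAdapter are hypothetical helpers used only for illustration.
Code:
// Inside the Fragment, e.g. in onViewCreated().
viewModel.restaurantList.observe(viewLifecycleOwner) { result ->
    when (result) {
        is ResultData.Loading -> showLoading(true)
        is ResultData.Success -> {
            showLoading(false)
            restaurantAdapter.submitList(result.data)
        }
        is ResultData.Failed -> {
            showLoading(false)
            showError(result.message)
        }
    }
}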
For a better understanding of how the whole project is put together I have prepared a small demo showing the flow of the process.
Site Kit with Clean Architecture and MVVM Demo
And that was it. It looks complicated, but it is really pretty easy once you get the hang of it. Give it a shot!
Tips and Tricks
Tips are important here, as the whole process might look confusing at first glance, so here is what I would suggest:
Follow the Clean Architecture structure of the project by splitting your files into separate folders according to their function.
Use coroutines instead of threads, since they are lighter and cheaper to run.
Use dependency injection (Hilt, Dagger) to avoid the tedious job of wiring every class's dependencies by hand; a small example module is sketched below.
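On the dependency injection tip: the SearchService injected into SitesService has to be provided somewhere. The article does not show this, so here is a minimal Hilt module sketch; depending on your Hilt version the component may be named ApplicationComponent instead of SingletonComponent, and the API key handling is only a placeholder.
Code:
import android.content.Context
import com.huawei.hms.site.api.SearchService
import com.huawei.hms.site.api.SearchServiceFactory
import dagger.Module
import dagger.Provides
import dagger.hilt.InstallIn
import dagger.hilt.android.qualifiers.ApplicationContext
import dagger.hilt.components.SingletonComponent
import java.net.URLEncoder
import javax.inject.Singleton

@Module
@InstallIn(SingletonComponent::class)
object SiteKitModule {

    @Provides
    @Singleton
    fun provideSearchService(@ApplicationContext context: Context): SearchService =
        // The API key would normally come from a secure source; this constant is a placeholder.
        SearchServiceFactory.create(context, URLEncoder.encode("YOUR_API_KEY", "UTF-8"))
}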
Conclusion
In this article, we covered the structure of Clean Architecture and MVVM and why they matter when implemented together in medium and large projects. We then walked through the implementation of the Site Kit service using these architectures, explaining the process step by step until we retrieved the final search result. I hope you try it and like it. As always, stay healthy my friends and see you in other articles.
Reference
HMS Site Kit
Clean Architecture
MVVM with Clean Architecture

Intermediate: How to integrate Huawei Awareness kit Barrier API with Local Notification into fitness app (Flutter)

Introduction
In this article, we will learn how to implement Huawei Awareness Kit features together with a local notification service to send a notification when a certain condition is met. We can define our own conditions, observe them, and notify the user even when the application is not running.
The Awareness Kit is quite a comprehensive detection and event SDK. If you're developing a context-aware app for Huawei devices, this is definitely the library for you.
        
What can we do using Awareness kit?
With HUAWEI Awareness Kit, you can obtain a lot of contextual information about the user, such as location, behavior, weather, current time, device status, ambient light and audio device status, which makes it easier to provide a more refined user experience.
Basic Usage
There are quite a few awareness "modules" in this SDK: Time Awareness, Location Awareness, Behavior Awareness, Beacon Awareness, Audio Device Status Awareness, Ambient Light Awareness, and Weather Awareness. Read on to find out how and when to use them.
Each of these modules has two modes: capture, which is an on-demand information retrieval; and barrier, which triggers an action when a specified condition is met.
Use case
The Barrier API allows us to set barriers for specific conditions in our app, and when these conditions are satisfied our app is notified, so we can take action based on them. In this sample, when the user starts an activity, the app notifies the user to connect a headset to listen to music while exercising.
Table of content
1. Project setup
2. Headset capture API
3. Headset Barrier API
4. Flutter Local notification plugin
Advantages
1. Converged: Multi-dimensional and evolvable awareness capabilities can be called in a unified manner.
2. Accurate: The synergy of hardware and software makes data acquisition more accurate and efficient.
3. Fast: On-chip processing of local service requests and nearby access of cloud services promise a faster service response.
4. Economical: Sharing awareness capabilities avoids separate interactions between apps and the device, reducing system resource consumption. Collaborating with the EMUI (or Magic UI) and Kirin chip, Awareness Kit can even achieve optimal performance with the lowest power consumption.
Requirements
1. Any operating system (MacOS, Linux or Windows).
2. Any IDE with the Flutter SDK installed (IntelliJ, Android Studio, VS Code etc.).
3. A little knowledge of Dart and Flutter.
4. A brain to think with.
5. Minimum API level 24.
6. EMUI 5.0 or later devices.
Setting up the Awareness kit
1. First, create a developer account in AppGallery Connect. After creating your developer account, you can create a new project and a new app. For more information, check this link.
2. Generate a signing certificate fingerprint with the command below.
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
-storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
3. The above command creates the keystore file in appdir/android/app.
4. Now we need to obtain the SHA256 key, follow the command.
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
5. Enable Awareness Kit in the Manage APIs section and add the plugin.
6. After configuring the project, download the agconnect-services.json file and add it to your project.
7. After that, follow the URL for the cross-platform plugins and download the required plugins.
  
8. The following dependencies for HMS usage need to be added to build.gradle file under the android directory.
Code:
buildscript {
    ext.kotlin_version = '1.3.50'
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.5.0'
        classpath 'com.huawei.agconnect:agcp:1.4.1.300'
        classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
9. Then add the following line of code to the build.gradle file under the android/app directory.
Code:
apply plugin: 'com.huawei.agconnect'
10. Add the required permissions to the AndroidManifest.xml file under app>src>main folder.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.INTERNET" />
11. After completing all the above steps, you need to add the required kits’ Flutter plugins as dependencies to pubspec.yaml file. You can find all the plugins in pub.dev with the latest versions.
Code:
huawei_awareness:
  path: ../huawei_awareness/
12. To display local notification, we need to add flutter local notification plugin.
Code:
flutter_local_notifications: ^3.0.1+6
After adding them, run the flutter pub get command. Now all the plugins are ready to use.
Note: Set multiDexEnabled to true in the app-level build.gradle file (under the android/app directory), so that the app will not crash.
Code Implementation
Use Capture and Barrier API to get Headset status
This service helps your application remind the user, before starting an activity, to connect a headset to play music.
We need to request the runtime permissions before calling any service.
Code:
@override
void initState() {
  checkPermissions();
  requestPermissions();
  super.initState();
}

void checkPermissions() async {
  locationPermission = await AwarenessUtilsClient.hasLocationPermission();
  backgroundLocationPermission =
      await AwarenessUtilsClient.hasBackgroundLocationPermission();
  activityRecognitionPermission =
      await AwarenessUtilsClient.hasActivityRecognitionPermission();
  if (locationPermission &&
      backgroundLocationPermission &&
      activityRecognitionPermission) {
    setState(() {
      permissions = true;
      notifyAwareness();
    });
  }
}

void requestPermissions() async {
  if (locationPermission == false) {
    bool status = await AwarenessUtilsClient.requestLocationPermission();
    setState(() {
      locationPermission = status;
    });
  }
  if (backgroundLocationPermission == false) {
    bool status =
        await AwarenessUtilsClient.requestBackgroundLocationPermission();
    setState(() {
      // Store the result in the matching flag.
      backgroundLocationPermission = status;
    });
  }
  if (activityRecognitionPermission == false) {
    bool status =
        await AwarenessUtilsClient.requestActivityRecognitionPermission();
    setState(() {
      // Store the result in the matching flag.
      activityRecognitionPermission = status;
    });
    checkPermissions();
  }
}
Capture API: Now that all the permissions are granted, if we want to check the headset status once the app is launched, we call the Capture API using getHeadsetStatus(). This is a one-time, on-demand call.
Code:
checkHeadsetStatus() async {
  HeadsetResponse response = await AwarenessCaptureClient.getHeadsetStatus();
  setState(() {
    switch (response.headsetStatus) {
      case HeadsetStatus.Disconnected:
        _showNotification(
            "Music", "Please connect headset before start activity");
        break;
    }
  });
  log(response.toJson(), name: "captureHeadset");
}
Barrier API: If you want to set conditions in your app, use the Barrier API. This service keeps listening for events and automatically notifies the user once the conditions are satisfied. For example, we set a condition to play music once the activity starts; when the user connects the headset, the app automatically notifies the user that the headset is connected and music is playing.
First we need to set the barrier condition; the barrier will be triggered when the condition is satisfied.
Code:
String headSetBarrier = "HeadSet";
AwarenessBarrier headsetBarrier = HeadsetBarrier.keeping(
    barrierLabel: headSetBarrier, headsetStatus: HeadsetStatus.Connected);
Add the barrier using updateBarriers(); this method returns whether the barrier was added or not.
Code:
bool status = await AwarenessBarrierClient.updateBarriers(barrier: headsetBarrier);
If the status returns true, the barrier was successfully added. Now we declare a StreamSubscription to listen for events; it keeps receiving updates and triggers once the condition is satisfied.
Code:
if (status) {
  log("Headset Barrier added.");
  StreamSubscription<dynamic> subscription;
  subscription = AwarenessBarrierClient.onBarrierStatusStream.listen((event) {
    if (mounted) {
      setState(() {
        switch (event.presentStatus) {
          case HeadsetStatus.Connected:
            _showNotification("Cool HeadSet",
                "Headset Connected, want to listen some music?");
            isPlaying = true;
            print("Headset Status: Connected");
            break;
          case HeadsetStatus.Disconnected:
            _showNotification(
                "HeadSet", "Headset Disconnected, your music stopped");
            print("Headset Status: Disconnected");
            isPlaying = false;
            break;
          case HeadsetStatus.Unknown:
            _showNotification("HeadSet", "Your headset Unknown");
            print("Headset Status: Unknown");
            isPlaying = false;
            break;
        }
      });
    }
  }, onError: (error) {
    log(error.toString());
  });
} else {
  log("Headset Barrier not added.");
}
Why do we need local push notifications?
1. We can schedule notifications.
2. No web request is required to display a local notification.
3. There is no limit on notifications per user.
4. They originate from the device and are displayed on the same device.
5. They alert the user or remind the user to perform some task.
The flutter_local_notifications package provides the required functionality. Using this package we can integrate local notifications into both the Android and iOS versions of the app.
Add the following permissions to enable scheduled notifications.
Code:
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
<uses-permission android:name="android.permission.VIBRATE" />
Add the following receivers inside the <application> tag.
Code:
<receiver android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationBootReceiver">
<intent-filter>
<action android:name="android.intent.action.BOOT_COMPLETED"/>
</intent-filter>
</receiver>
<receiver android:name="com.dexterous.flutterlocalnotifications.ScheduledNotificationReceiver" />
Let’s create the Flutter local notifications plugin object.
Code:
FlutterLocalNotificationsPlugin localNotification = FlutterLocalNotificationsPlugin();

@override
void initState() {
  super.initState();
  var initializationSettingsAndroid =
      AndroidInitializationSettings('ic_launcher');
  var initializationSettingsIOs = IOSInitializationSettings();
  var initSettings = InitializationSettings(
      android: initializationSettingsAndroid, iOS: initializationSettingsIOs);
  localNotification.initialize(initSettings);
}
Now create the logic to display a simple notification.
Code:
Future _showNotification(String title, String description) async {
  var androidDetails = AndroidNotificationDetails(
      "channelId", "channelName", "content",
      importance: Importance.high);
  var iosDetails = IOSNotificationDetails();
  var generateNotification =
      NotificationDetails(android: androidDetails, iOS: iosDetails);
  await localNotification.show(0, title, description, generateNotification);
}
Demo
            
Tips and Tricks
1. Download the latest HMS Flutter plugin.
2. Set the minSdkVersion to 24 or later.
3. HMS Core (APK) 4.0.2.300 or later is required.
4. Currently this plugin does not support background tasks.
5. Do not forget to run flutter pub get after adding the dependencies.
Conclusion
In this article, we have learned how to use Barrier API of Huawei Awareness Kit with a Local Notification to observe the changes in environmental factors even when the application is not running.
As you may notice, the permanent notification indicating that the application is running in the background cannot be dismissed by the user, which can be annoying.
Based on requirement we can utilize different APIs, Huawei Awareness Kit has many other great features that we can use with foreground services in our applications.
Thanks for reading! If you enjoyed this story, please click the Like button and Follow. Feel free to leave a Comment below.
Reference
Awareness Kit URL.
Awareness Capture API Article URL.
Original Source

Beginner: Integration of Huawei Push Notification with android Work Manager in Navigation Glove IoT application - Part 8

Introduction​
If you are new to this series, follow the articles below.
Beginner: Integration of Huawei Account kit in Navigation Glove IoT application Using Kotlin - Part 1
Beginner: Integration of Huawei Map kit in Navigation Glove IoT application Using Kotlin - Part 2
Beginner: Integration of Huawei Site kit in Navigation Glove IoT application Using Kotlin - Part 3
Beginner: Integration of Huawei Direction API in Navigation Glove IoT application Using Kotlin - Part 4
Beginner: Connecting to Smart Gloves Using Bluetooth in Navigation Glove IoT application Using Kotlin - Part 5
Beginner: Integration of Huawei Analytics kit and Crash service in Navigation Glove IoT application Using Kotlin - Part 6
Beginner: Integration of Huawei Location kit in Navigation Glove IoT application - Part 7
In this article, we will learn about the Smart Glove application and the integration of Huawei Push Kit into the Navigation Glove IoT application. We will cover how push notifications work, their benefits, and how to integrate Push Kit into the smart gloves application.
Content​
What is Push Notification?
Advantages of Push notification
Type of notification
Huawei Push notification
Message types.
Integration of push kit
Testing Push Kit from the Huawei AppGallery console
In the current mobile world, 90% of applications integrate push notifications. Applications use push notifications to engage users and to support marketing. For example, e-commerce applications share details about offers, discounts, price drops for a particular product, and delivery status through push notifications.
First, let us understand what a push notification is.
Push notifications are basically alerts sent to users from a mobile application, received in real time. Developers can send notifications at any time, on any day.
Advantages of Push Notification
User retention: In the world of mobile apps, user retention is a key indicator of how well your app is performing in the market. This metric lets you see how many users who downloaded your app and used it once come back to use it again.
Increased engagement: The power of push notifications is that you can keep users engaged by sending them relevant notifications. Notifications drive interaction with the application, so users spend more time in the app.
Push Notification types
1. Reminder notification: reminds the user about something, for example a recharge reminder, a meeting reminder or an appointment reminder.
2. Alert notification: alerts the user when something relevant happens in the application, for example when someone sends a message or comments on a picture.
3. Promotional notification: promotes something the application offers to the user, for example a discount, a sale date or a weekend sale.
4. Purchase notification: relates to purchases users make within your app. It can contain information like order confirmation, order status, order updates, tracking, and receipts.
5. Survey notification: used when the application wants to collect feedback or run a survey.
Huawei Push Kit is a messaging service provided for you to establish a cloud-to-device messaging channel. By integrating Push Kit, you can send messages to your apps on user devices in real time. This helps you to maintain closer ties with users and increases user awareness of and engagement with your apps.
Push Kit consists of two-part
Message push from the cloud to the device: enables you to send data and messages to your apps on user devices.
Message display on devices: provides various display styles, such as the notification panel, home screen banner, and lock screen on user devices.
Huawei Push Kit supports two types of messages:
1. Notification Message
2. Data Message
Notification Message: A notification message is directly sent by Push Kit and displayed in the notification panel on the user device, not requiring your app to be running in the background. The user can tap the notification message to trigger the corresponding action such as opening your app or opening a web page. You can tailor the display styles and reminder modes to fit user needs, which will greatly improve the daily active users (DAU) of your app. The common application scenarios of the notification message include subscription, travel reminder, and account status.
Batch message: a message sent by an app in batches to users who will obtain the same content, which can improve user experience and stimulate user interaction with the app.
Personalized message: a message generated based on a unified message template and sent by an app to an audience. The unified message template contains placeholders, which will be replaced based on the settings and preferences of specific users.
Point-to-point message: a message sent by an app to a user when the user takes a specific action.
Instant message: An instant message is a point-to-point or group chatting message (or private message) between users.
Data Message: Data messages are processed by your app on user devices. After a device receives a message containing data or instructions from the cloud, the device passes the message to the app instead of directly displaying it. The app then parses the message and triggers the required action (for example, going to a web page or an app page). For such a message, you can also customize display styles for higher efficiency.
Push Kit cannot guarantee a high data message delivery rate, because it may be affected by Android system restrictions and whether the app is running in the background. The common application scenarios of the data message include the VoIP call, voice broadcast, and interaction with friends.
Prerequisite​
AppGallery Account
Android Studio 3.X
SDK Platform 19 or later
Gradle 4.6 or later
HMS Core (APK) 4.0.0.300 or later
Huawei Phone EMUI 5.0 or later
Non-Huawei Phone Android 5.1 or later
Service integration on AppGallery
1. We need to register as a developer account in AppGallery Connect.
2. Create an app by referring to Creating a Project and Creating an App in the Project.
3. Set the data storage location based on the current location.
4. Enabling Push Kit service on AppGallery.
5. Generating a Signing Certificate Fingerprint.
6. Configuring the Signing Certificate Fingerprint.
7. Downloading your agconnect-services.json file and adding it to the app directory of your project.
Client development
1. Create android project in android studio IDE.
2. Add the maven URL inside the repositories of buildscript and allprojects respectively (project level build.gradle file).
Code:
maven { url 'https://developer.huawei.com/repo/' }
3. Add the classpath inside the dependency section of the project level build.gradle file.
Code:
classpath 'com.huawei.agconnect:agcp:1.5.2.300'
4. Add the plugin in the app-level build.gradle file.
Code:
apply plugin: 'com.huawei.agconnect'
5. Add the below library in the app-level build.gradle file dependencies section.
Code:
implementation 'com.huawei.hms:push:6.3.0.302'
6. Add all the below permission in the AndroidManifest.xml.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CHANGE_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
7. Register PushNotificationHmsMessageService in the AndroidManifest.xml.
XML:
<service android:name=".push.PushNotificationHmsMessageService" android:exported="false">
<intent-filter>
<action android:name="com.huawei.push.action.MESSAGING_EVENT"/>
</intent-filter>
</service>
8. Sync the project.
Now let’s learn the coding part.
Step 1: Create Notification
NotificationUtils.kt
Kotlin:
package com.huawei.navigationglove.push

import android.app.NotificationChannel
import android.app.NotificationManager
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.media.RingtoneManager
import android.os.Build
import android.provider.Settings
import androidx.core.app.NotificationCompat
import androidx.core.content.ContextCompat
import com.huawei.navigationglove.R
import com.huawei.navigationglove.ui.SplashScreenActivity
import kotlin.random.Random

class NotificationUtil(private val context: Context) {

    fun showNotification(title: String, message: String) {
        val intent = Intent(context, SplashScreenActivity::class.java)
        intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP)
        val pendingIntent = PendingIntent.getActivity(
            context, 0, intent,
            PendingIntent.FLAG_ONE_SHOT
        )
        val channelId = context.getString(R.string.default_notification_channel_id)
        val defaultSoundUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION)
        val notificationBuilder = NotificationCompat.Builder(context, channelId)
            .setColor(ContextCompat.getColor(context, android.R.color.holo_red_dark))
            .setSmallIcon(R.drawable.gloves_ic)
            .setContentTitle(title)
            .setContentText(message)
            .setAutoCancel(true)
            .setStyle(
                NotificationCompat.BigTextStyle()
                    .bigText(message)
            )
            .setPriority(NotificationCompat.PRIORITY_HIGH)
            .setSound(defaultSoundUri)
            .setContentIntent(pendingIntent)
        val notificationManager =
            context.getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
        // Since Android Oreo, a notification channel is needed.
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            val channel = NotificationChannel(
                channelId,
                "Default Channel",
                NotificationManager.IMPORTANCE_HIGH
            )
            notificationManager.createNotificationChannel(channel)
        }
        notificationManager.notify(Random.nextInt(), notificationBuilder.build())
    }

    fun isTimeAutomatic(context: Context): Boolean {
        return Settings.Global.getInt(
            context.contentResolver,
            Settings.Global.AUTO_TIME,
            0
        ) == 1
    }
}
Step 2: Create a WorkManager worker to process the data and show the notification in the background. If the received data needs processing that takes more than 10 seconds, enqueue a worker; otherwise show the notification directly.
ScheduledWorker.kt
Kotlin:
package com.huawei.navigationglove.push

import android.content.Context
import android.util.Log
import androidx.work.Worker
import androidx.work.WorkerParameters

class ScheduledWorker(appContext: Context, workerParams: WorkerParameters) :
    Worker(appContext, workerParams) {

    override fun doWork(): Result {
        Log.d(TAG, "Work START")
        // Get notification data
        val title = inputData.getString(NOTIFICATION_TITLE)
        val message = inputData.getString(NOTIFICATION_MESSAGE)
        // Show notification
        NotificationUtil(applicationContext).showNotification(title!!, message!!)
        // TODO Do your other background processing
        Log.d(TAG, "Work DONE")
        // Return result
        return Result.success()
    }

    companion object {
        private const val TAG = "ScheduledWorker"
        const val NOTIFICATION_TITLE = "notification_title"
        const val NOTIFICATION_MESSAGE = "notification_message"
    }
}
Step 3: Create broadcast receiver
NotificationBroadcastReceiver.kt
Kotlin:
package com.huawei.navigationglove.push

import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.util.Log
import androidx.work.Data
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_MESSAGE
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_TITLE

class NotificationBroadcastReceiver : BroadcastReceiver() {

    override fun onReceive(context: Context?, intent: Intent?) {
        intent?.let {
            val title = it.getStringExtra(NOTIFICATION_TITLE)
            val message = it.getStringExtra(NOTIFICATION_MESSAGE)
            // Create notification data
            val notificationData = Data.Builder()
                .putString(NOTIFICATION_TITLE, title)
                .putString(NOTIFICATION_MESSAGE, message)
                .build()
            // Init worker
            val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
                .setInputData(notificationData)
                .build()
            // Start worker
            WorkManager.getInstance().beginWith(work).enqueue()
            Log.d(javaClass.name, "WorkManager is Enqueued.")
        }
    }
}
Step 4: Add NotificationBroadcastReceiver in the AndroidManifest.xml
XML:
<receiver android:name=".push.NotificationBroadcastReceiver" />
Step 5: Create Huawei PushNotificationHmsMessageService.kt
Kotlin:
package com.huawei.navigationglove.push
import android.app.AlarmManager
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.util.Log
import com.huawei.hms.push.HmsMessageService
import com.huawei.hms.push.RemoteMessage
import android.text.TextUtils
import androidx.work.Data
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import com.google.gson.Gson
import com.huawei.hms.common.ApiException
import com.huawei.hms.aaid.HmsInstanceId
import com.huawei.agconnect.config.AGConnectServicesConfig
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_MESSAGE
import com.huawei.navigationglove.push.ScheduledWorker.Companion.NOTIFICATION_TITLE
import com.huawei.navigationglove.push.model.PushModel
import org.json.JSONException
import java.text.SimpleDateFormat
import java.util.*
class PushNotificationHmsMessageService : HmsMessageService() {
override fun onMessageReceived(message: RemoteMessage?) {
Log.i(TAG, "onMessageReceived is called")
if (message == null) {
Log.e(TAG, "Received message entity is null!")
return
}
Log.i(
TAG, """get Data: ${message.data} getFrom: ${message.from}
getTo: ${message.to}
getMessageId: ${message.messageId}
getSendTime: ${message.sentTime}
getDataMap: ${message.dataOfMap}
getMessageType: ${message.messageType}
getTtl: ${message.ttl}
getToken: ${message.token}"""
)
message.data.isNotEmpty().let { it ->
if (it) {
Log.d(TAG, "Message data payload: ${message.data}")
try {
val pushModel: PushModel = Gson().fromJson(message.data, PushModel::class.java)
//val jsonData = JSONObject(message.data)
// Get Message details
val title = pushModel.title
val content = pushModel.message
// Check whether notification is scheduled or not
val isScheduled = pushModel.isScheduled
isScheduled.let {
if (it) {
// Check that 'Automatic Date and Time' settings are turned ON.
// If it's not turned on, Return
val notificationUtil = NotificationUtil(this)
if (!notificationUtil.isTimeAutomatic(applicationContext)) {
Log.d(TAG, "`Automatic Date and Time` is not enabled")
return
}
// This is Scheduled Notification, Schedule it
val scheduledTime = pushModel.scheduledTime
scheduleAlarm(scheduledTime, title, content)
} else {
// This is not scheduled notification, show it now
// Create Notification Data
val body =
"You have reached from ${pushModel.data.fromLocation} to ${pushModel.data.toLocation}"
val notificationData = Data.Builder()
.putString(NOTIFICATION_TITLE, title)
.putString(NOTIFICATION_MESSAGE, body)
.build()
// Init Worker
val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
.setInputData(notificationData)
.build()
// Start Worker
WorkManager.getInstance(this).beginWith(work).enqueue()
Log.d(javaClass.name, "WorkManager is Enqueued.")
}
}
} catch (e: Exception) { // Gson throws JsonSyntaxException (not JSONException) for a malformed payload, so catch broadly here
e.printStackTrace()
}
} else {
val notificationData = Data.Builder()
.putString(NOTIFICATION_TITLE, message.notification?.title)
.putString(NOTIFICATION_MESSAGE, message.notification?.body)
.build()
// Init Worker
val work = OneTimeWorkRequest.Builder(ScheduledWorker::class.java)
.setInputData(notificationData)
.build()
// Start Worker
WorkManager.getInstance(this).beginWith(work).enqueue()
}
}
}
private fun scheduleAlarm(
scheduledTimeString: String?,
title: String?,
message: String?
) {
val alarmMgr = applicationContext.getSystemService(Context.ALARM_SERVICE) as AlarmManager
val alarmIntent =
Intent(applicationContext, NotificationBroadcastReceiver::class.java).let { intent ->
intent.putExtra(NOTIFICATION_TITLE, title)
intent.putExtra(NOTIFICATION_MESSAGE, message)
// On Android 12 (API 31) and later a mutability flag is required
PendingIntent.getBroadcast(applicationContext, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_IMMUTABLE)
}
// Parse the scheduled time; bail out if it's missing
if (scheduledTimeString.isNullOrEmpty()) return
val scheduledTime = SimpleDateFormat("yyyy-MM-dd HH:mm:ss", Locale.getDefault())
.parse(scheduledTimeString)
scheduledTime?.let {
// set() schedules a single, non-repeating alarm
alarmMgr.set(
AlarmManager.RTC_WAKEUP,
it.time,
alarmIntent
)
}
}
override fun onNewToken(token: String) {
Log.i(TAG, "received refresh token:$token")
if (!TextUtils.isEmpty(token)) {
refreshedTokenToServer(token)
}
}
// If the Push SDK version you integrated is 5.0.4.302 or later, you also need to override this overload.
override fun onNewToken(token: String, bundle: Bundle?) {
Log.i(TAG, "received refresh token: $token")
if (!TextUtils.isEmpty(token)) {
refreshedTokenToServer(token)
}
}
private fun refreshedTokenToServer(token: String) {
Log.i(TAG, "sending token to server. token:$token")
}
private fun deleteToken() {
// Create a thread.
object : Thread() {
override fun run() {
try {
// Obtain the app ID from the agconnect-service.json file.
val appId =
AGConnectServicesConfig.fromContext(this@PushNotificationHmsMessageService)
.getString("client/app_id")
// Set tokenScope to HCM.
val tokenScope = "HCM"
// Delete the token.
HmsInstanceId.getInstance(this@PushNotificationHmsMessageService)
.deleteToken(appId, tokenScope)
Log.i(TAG, "token deleted successfully")
} catch (e: ApiException) {
Log.e(TAG, "deleteToken failed.$e")
}
}
}.start()
}
/**
* Persist token to third-party servers.
*
* Modify this method to associate the user's HMS push token with any server-side account
* maintained by your application.
*
* @param token The new token.
*/
private fun sendRegistrationToServer(token: String?) {
// TODO: Implement this method to send token to your app server.
Log.d(TAG, "sendRegistrationTokenToServer($token)")
}
companion object {
private const val TAG = "PushNotificationHmsMessageService"
}
}
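One point worth double-checking before testing: an HmsMessageService subclass only receives messages if it is declared in AndroidManifest.xml with the MESSAGING_EVENT intent filter. If that entry wasn't already added during the earlier Push Kit setup, it typically looks like the following (the class path below assumes the package used in the code above):
XML:
<service
    android:name=".push.PushNotificationHmsMessageService"
    android:exported="false">
    <intent-filter>
        <action android:name="com.huawei.push.action.MESSAGING_EVENT" />
    </intent-filter>
</service>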
Step 6: (Optional) Convert the data message body to Kotlin data classes, as shown below. You can also parse the JSON payload directly.
PushModel.kt
Java:
package com.huawei.navigationglove.push.model
import com.google.gson.annotations.SerializedName
data class PushModel(@SerializedName("scheduledTime")
val scheduledTime: String = "",
@SerializedName("data")
val data: Data,
@SerializedName("isScheduled")
val isScheduled: Boolean = false,
@SerializedName("title")
val title: String = "",
@SerializedName("message")
val message: String = "")
Data.kt
Java:
package com.huawei.navigationglove.push.model
import com.google.gson.annotations.SerializedName
data class Data(@SerializedName("fromLocation")
val fromLocation: String = "",
@SerializedName("toLocation")
val toLocation: String = "",
@SerializedName("status")
val status: String = "")
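To make the mapping concrete, here is a rough, self-contained sketch of the kind of data-message payload these classes represent and how Gson parses it. The payload values are made up purely for illustration; your server defines the real ones.
Code:
import com.google.gson.Gson
import com.huawei.navigationglove.push.model.PushModel

fun main() {
    // Illustrative payload only - it mirrors the fields read in onMessageReceived()
    val json = """
        {
          "title": "Trip update",
          "message": "Navigation finished",
          "isScheduled": false,
          "scheduledTime": "2021-12-01 18:30:00",
          "data": { "fromLocation": "Home", "toLocation": "Office", "status": "done" }
        }
    """.trimIndent()

    // Same Gson call used in the service above
    val pushModel = Gson().fromJson(json, PushModel::class.java)
    println("${pushModel.title}: ${pushModel.data.fromLocation} -> ${pushModel.data.toLocation}")
}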
Testing Push notification on AppGallery console
Notification Message
Step 1: Open AppGallery Connect and select your project.
Step 2: From the left menu, choose Grow > Push Kit and click Enable.
Step 3: Once Push Kit is enabled, click Add Notification in the top-right corner.
Step 4: Enter a name, select Notification message as the type, and fill in the Summary, Title, and Body.
Step 5: Enter the push scope and device token, then click Submit in the top-right corner.
Step 6: Click OK in the confirmation dialog.
Data Message
Repeat Steps 1 to 3 from the Notification Message section.
Step 4: Enter a name, select Data message as the type, fill in the Summary, Title, and Body, then choose the push scope and device token (see the token-retrieval sketch after these steps). Click Submit in the top-right corner.
Step 5: Click OK in the confirmation dialog.
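Both flows ask for the device's push token when the push scope is a specified device. If you are not already logging the token somewhere in the app, a minimal sketch for fetching it, modelled on the deleteToken() helper shown earlier (same imports, same worker-thread pattern, since getToken() must not be called on the main thread), might look like this:
Code:
// Minimal sketch: fetch the HMS push token off the main thread and log it.
// Call this from an Activity or Service; "PushToken" is just an illustrative log tag.
private fun fetchPushToken() {
    object : Thread() {
        override fun run() {
            try {
                // Obtain the app ID from the agconnect-services.json file, as in deleteToken()
                val appId = AGConnectServicesConfig.fromContext(applicationContext)
                    .getString("client/app_id")
                // "HCM" is the token scope used throughout this article
                val token = HmsInstanceId.getInstance(applicationContext).getToken(appId, "HCM")
                if (!TextUtils.isEmpty(token)) {
                    Log.i("PushToken", "push token: $token")
                }
            } catch (e: ApiException) {
                Log.e("PushToken", "getToken failed. $e")
            }
        }
    }.start()
}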
Result
Tips and Tricks
1. Make sure you are already registered as a Huawei developer.
2. Set minSdkVersion to 19 or higher; otherwise you will get an AndroidManifest merge error.
3. Make sure you have added the agconnect-services.json file to the app folder.
4. Make sure you have added the SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly (see the dependency sketch after this list).
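For item 5, the code in this article relies on the Push Kit, AGConnect core, WorkManager, and Gson artifacts. If any of them are missing from the app-level build.gradle, the dependency block usually looks something like the sketch below; the version numbers are illustrative only, so use the ones that match your own SDK setup.
Code:
implementation 'com.huawei.hms:push:5.0.4.302'
implementation 'com.huawei.agconnect:agconnect-core:1.5.2.300'
implementation 'androidx.work:work-runtime-ktx:2.5.0'
implementation 'com.google.code.gson:gson:2.8.6'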
Conclusion
In this article, we have learned how to integrate Huawei Push Kit into the Smart Gloves mobile application using Android Studio and Kotlin. We covered what Push Kit offers, the types of messages and notifications it supports, and how to test push notifications from the AppGallery Connect console.
Reference
Push Kit - Official document
Push Kit - Code lab
Push Kit - Training Video
