For more information like this, you can visit the HUAWEI Developer Forum.
A geofence is a virtual perimeter set around a real geographic area. By combining the user's position with the geofence perimeter, it is possible to know whether the user is inside the geofence, or whether they are entering or exiting the area.
In this article, we will discuss how to use geofences to notify the user when the device enters or exits an area, using the HMS Location Kit in a Xamarin.Android application. We will also add and customize a HuaweiMap, which includes drawing circles, adding markers, and using Nearby Search to find places. We are going to learn how to use the features below together:
Geofence
Reverse Geocode
HuaweiMap
Nearby Search
Project Setup
First of all, you need to be a registered Huawei developer and create an application in the AppGallery Connect console in order to use the HMS Map, Location, and Site kits. You can follow these steps to complete the configuration required for development:
Configuring App Information in AppGallery Connect
Creating Xamarin Android Binding Libraries
Integrating the HMS Map Kit Libraries for Xamarin
Integrating the HMS Location Kit Libraries for Xamarin
Integrating the HMS Site Kit Libraries for Xamarin
Integrating the HMS Core SDK
Setting Package in Xamarin
When we create our Xamarin.Android application in the above steps, we need to make sure that the package name is the same as the one we entered in the console. Also, don't forget to enable the required services in the console.
Manifest & Permissions
We have to update the application’s manifest file by declaring permissions that we need as shown below.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
Also, add a meta-data element inside the application tag to embed your app ID; it is required for the app to authenticate on Huawei's cloud servers. You can find this ID in the agconnect-services.json file.
Code:
<meta-data android:name="com.huawei.hms.client.appid" android:value="appid=YOUR_APP_ID" />
Request location permission
We need to request runtime permissions in our app in order to use the Location and Map services. The following code in MainActivity checks whether the user has granted the required permissions.
Code:
private void RequestPermissions()
{
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.AccessCoarseLocation) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.AccessFineLocation) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.WriteExternalStorage) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.ReadExternalStorage) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.Internet) != (int)Permission.Granted)
{
ActivityCompat.RequestPermissions(this,
new System.String[]
{
Manifest.Permission.AccessCoarseLocation,
Manifest.Permission.AccessFineLocation,
Manifest.Permission.WriteExternalStorage,
Manifest.Permission.ReadExternalStorage,
Manifest.Permission.Internet
},
100);
}
else
GetCurrentPosition();
}
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Android.Content.PM.Permission[] grantResults)
{
if (requestCode == 100)
{
foreach (var item in permissions)
{
if (ContextCompat.CheckSelfPermission(this, item) == (int)Permission.Denied)
{
if (ActivityCompat.ShouldShowRequestPermissionRationale(this, permissions[0]) || ActivityCompat.ShouldShowRequestPermissionRationale(this, permissions[1]))
Snackbar.Make(FindViewById<RelativeLayout>(Resource.Id.mainLayout), "You need to grant permission to use location services.", Snackbar.LengthLong).SetAction("Ask again", v => RequestPermissions()).Show();
else
Toast.MakeText(this, "You need to grant location permissions in settings.", ToastLength.Long).Show();
}
else
GetCurrentPosition();
}
}
else
{
base.OnRequestPermissionsResult(requestCode, permissions, grantResults);
}
}
Add a Map
Within our UI, a map will be represented by either a MapFragment or MapView object. We will use the MapFragment object in this sample.
Add a <fragment> element to your activity’s layout file, activity_main.xml. This element defines a MapFragment to act as a container for the map and to provide access to the HuaweiMap object.
Also, let's add the other controls we will use throughout this sample: two Buttons and a SeekBar. One button is for clearing the map and the other for searching nearby locations, and the SeekBar helps us set the radius of the geofence.
Code:
<RelativeLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:map="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/mainLayout"
android:layout_width="match_parent"
android:layout_height="match_parent">
<fragment
android:id="@+id/mapfragment"
class="com.huawei.hms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
<LinearLayout
android:orientation="vertical"
android:layout_width="wrap_content"
android:layout_height="match_parent">
<Button
android:text="Get Geofence List"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="5dp"
android:padding="5dp"
android:background="@drawable/abc_btn_colored_material"
android:textColor="@android:color/white"
android:id="@+id/btnGetGeofenceList" />
<Button
android:text="Clear Map"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="5dp"
android:background="@drawable/abc_btn_colored_material"
android:textColor="@android:color/white"
android:id="@+id/btnClearMap" />
</LinearLayout>
<SeekBar
android:visibility="invisible"
android:min="30"
android:layout_alignParentBottom="true"
android:layout_marginBottom="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:id="@+id/radiusBar" />
</RelativeLayout>
In our activity's OnCreate method, we set the layout file as the content view, load the AGConnectServicesConfig, set the buttons' click events, and initialize the FusedLocationProviderClient. We get a handle to the map fragment by calling FragmentManager.FindFragmentById and then use GetMapAsync to register for the map callback.
Also, implement the IOnMapReadyCallback interface in MainActivity and override the OnMapReady method, which is triggered when the map is ready to use.
Code:
public class MainActivity : AppCompatActivity, IOnMapReadyCallback
{
MapFragment mapFragment;
HuaweiMap hMap;
Marker marker;
Circle circle;
SeekBar radiusBar;
FusedLocationProviderClient fusedLocationProviderClient;
GeofenceModel selectedCoordinates;
List<Marker> searchMarkers;
private View search_view;
private AlertDialog alert;
public static LatLng CurrentPosition { get; set; }
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
Xamarin.Essentials.Platform.Init(this, savedInstanceState);
SetContentView(Resource.Layout.activity_main);
AGConnectServicesConfig config = AGConnectServicesConfig.FromContext(ApplicationContext);
fusedLocationProviderClient = LocationServices.GetFusedLocationProviderClient(this);
mapFragment = (MapFragment)FragmentManager.FindFragmentById(Resource.Id.mapfragment);
mapFragment.GetMapAsync(this);
FindViewById<Button>(Resource.Id.btnGetGeofenceList).Click += btnGetGeofenceList_Click;
FindViewById<Button>(Resource.Id.btnClearMap).Click += btnClearMap_Click;
radiusBar = FindViewById<SeekBar>(Resource.Id.radiusBar);
radiusBar.ProgressChanged += radiusBar_ProgressChanged;
RequestPermissions();
}
public void OnMapReady(HuaweiMap map)
{
hMap = map;
hMap.UiSettings.MyLocationButtonEnabled = true;
hMap.UiSettings.CompassEnabled = true;
hMap.UiSettings.ZoomControlsEnabled = true;
hMap.UiSettings.ZoomGesturesEnabled = true;
hMap.MyLocationEnabled = true;
hMap.MapClick += HMap_MapClick;
if (selectedCoordinates == null)
selectedCoordinates = new GeofenceModel { LatLng = CurrentPosition, Radius = 30 };
}
}
As you can see above, with the UiSettings property of the HuaweiMap object we enable the my-location button, the compass, and so on. The other available properties are listed below:
Code:
public bool CompassEnabled { get; set; }
public bool IndoorLevelPickerEnabled { get; set; }
public bool MapToolbarEnabled { get; set; }
public bool MyLocationButtonEnabled { get; set; }
public bool RotateGesturesEnabled { get; set; }
public bool ScrollGesturesEnabled { get; set; }
public bool ScrollGesturesEnabledDuringRotateOrZoom { get; set; }
public bool TiltGesturesEnabled { get; set; }
public bool ZoomControlsEnabled { get; set; }
public bool ZoomGesturesEnabled { get; set; }
Now, when the app launches, we directly get the current location and move the camera to it. To do that, we use the FusedLocationProviderClient that we instantiated and call the LastLocation API.
The LastLocation API returns a Task object whose result we can check by implementing the relevant listeners for success and failure. In the success listener we move the map's camera position to the last known position.
Code:
private void GetCurrentPosition()
{
var locationTask = fusedLocationProviderClient.LastLocation;
locationTask.AddOnSuccessListener(new LastLocationSuccess(this));
locationTask.AddOnFailureListener(new LastLocationFail(this));
}
...
public class LastLocationSuccess : Java.Lang.Object, IOnSuccessListener
{
private MainActivity mainActivity;
public LastLocationSuccess(MainActivity mainActivity)
{
this.mainActivity = mainActivity;
}
public void OnSuccess(Java.Lang.Object location)
{
Toast.MakeText(mainActivity, "LastLocation request successful", ToastLength.Long).Show();
if (location != null)
{
MainActivity.CurrentPosition = new LatLng((location as Location).Latitude, (location as Location).Longitude);
mainActivity.RepositionMapCamera((location as Location).Latitude, (location as Location).Longitude);
}
}
}
To change the position of the camera, we must specify where we want to move the camera, using a CameraUpdate. The Map Kit allows us to create many different types of CameraUpdate using CameraUpdateFactory.
Code:
public static CameraUpdate NewCameraPosition(CameraPosition p0);
public static CameraUpdate NewLatLng(LatLng p0);
public static CameraUpdate NewLatLngBounds(LatLngBounds p0, int p1);
public static CameraUpdate NewLatLngBounds(LatLngBounds p0, int p1, int p2, int p3);
public static CameraUpdate NewLatLngZoom(LatLng p0, float p1);
public static CameraUpdate ScrollBy(float p0, float p1);
public static CameraUpdate ZoomBy(float p0);
public static CameraUpdate ZoomBy(float p0, Point p1);
public static CameraUpdate ZoomIn();
public static CameraUpdate ZoomOut();
public static CameraUpdate ZoomTo(float p0);
There are several methods for changing the camera position, as we see above. In short, these are:
1. NewLatLng: Changes the camera's latitude and longitude, while keeping other properties
2. NewLatLngZoom: Changes the camera's latitude, longitude, and zoom, while keeping other properties
3. NewCameraPosition: Gives full flexibility in changing the camera position
We are going to use NewCameraPosition. A CameraPosition can be obtained with a CameraPosition.Builder, which lets us set the target, bearing, tilt, and zoom properties.
Code:
public void RepositionMapCamera(double lat, double lng)
{
var cameraPosition = new CameraPosition.Builder();
cameraPosition.Target(new LatLng(lat, lng));
cameraPosition.Zoom(1000);
cameraPosition.Bearing(45);
cameraPosition.Tilt(20);
CameraUpdate cameraUpdate = CameraUpdateFactory.NewCameraPosition(cameraPosition.Build());
hMap.MoveCamera(cameraUpdate);
}
Creating Geofence
Now that we've created the map, we can start to create geofences on it. In this article, we will choose the location where we want to set a geofence in two different ways. The first is to select the location by clicking on the map, and the second is to search for nearby places by keyword and select one after placing them on the map as markers.
Set the geofence location by clicking on the map
It is always easier to select a location by seeing it. After this section, we will be able to set a geofence around the clicked point whenever the map is clicked. We attached the Click event to our map in the OnMapReady method. In this Click event, we will add a marker to the clicked point and draw a circle around it.
After clicking the map, we will add a circle, a marker, and a custom info window to that point like this:
Also, we will use the Seekbar at the bottom of the page to adjust the circle radius.
We set the selectedCoordinates variable when adding the marker. Let's create the following methods to handle the map click and add the marker:
Code:
private void HMap_MapClick(object sender, HuaweiMap.MapClickEventArgs e)
{
selectedCoordinates.LatLng = e.P0;
if (circle != null)
{
circle.Remove();
circle = null;
}
AddMarkerOnMap();
}
void AddMarkerOnMap()
{
if (marker != null) marker.Remove();
var markerOption = new MarkerOptions()
.InvokeTitle("You are here now")
.InvokePosition(selectedCoordinates.LatLng);
hMap.SetInfoWindowAdapter(new MapInfoWindowAdapter(this));
marker = hMap.AddMarker(markerOption);
bool isInfoWindowShown = marker.IsInfoWindowShown;
if (isInfoWindowShown)
marker.HideInfoWindow();
else
marker.ShowInfoWindow();
}
With MarkerOptions we can set the title and position properties. For creating a custom info window, there is the SetInfoWindowAdapter method. Let's add a MapInfoWindowAdapter class to our project for rendering the custom info window and implement the HuaweiMap.IInfoWindowAdapter interface in it.
This interface provides a custom information window view of a marker and contains two methods:
Code:
View GetInfoContents(Marker marker);
View GetInfoWindow(Marker marker);
Whenever an information window needs to be displayed for a marker, the methods provided by this adapter are called.
Now let's create a custom info window layout and name it map_info_view.xml:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent">
<Button
android:text="Add geofence"
android:width="100dp"
style="@style/Widget.AppCompat.Button.Colored"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/btnInfoWindow" />
</LinearLayout>
We customize and return it in the GetInfoWindow() method. The full code of the adapter is below:
Code:
internal class MapInfoWindowAdapter : Java.Lang.Object, HuaweiMap.IInfoWindowAdapter
{
private MainActivity activity;
private GeofenceModel selectedCoordinates;
private View addressLayout;
public MapInfoWindowAdapter(MainActivity currentActivity)
{
activity = currentActivity;
}
public View GetInfoContents(Marker marker)
{
return null;
}
public View GetInfoWindow(Marker marker)
{
if (marker == null)
return null;
//update every time; DrawCircleOnMap needs the latest coordinates
selectedCoordinates = new GeofenceModel { LatLng = new LatLng(marker.Position.Latitude, marker.Position.Longitude) };
View mapInfoView = activity.LayoutInflater.Inflate(Resource.Layout.map_info_view, null);
var radiusBar = activity.FindViewById<SeekBar>(Resource.Id.radiusBar);
if (radiusBar.Visibility == Android.Views.ViewStates.Invisible)
{
radiusBar.Visibility = Android.Views.ViewStates.Visible;
radiusBar.SetProgress(30, true);
}
activity.FindViewById<SeekBar>(Resource.Id.radiusBar)?.SetProgress(30, true);
activity.DrawCircleOnMap(selectedCoordinates);
Button button = mapInfoView.FindViewById<Button>(Resource.Id.btnInfoWindow);
button.Click += btnInfoWindow_ClickAsync;
return mapInfoView;
}
}
This is not the end. For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201357111605920240&fid=0101187876626530001
This article is originally from the HUAWEI Developer Forum.
Forum link: https://forums.developer.huawei.com/forumPortal/en/home
Before we start learning about today's topic, I strongly recommend that you go through my previous article, HMS Site Map (Part 1). It will help you get a clear picture.
Let’s Begin
In the previous article, we successfully got the details of the place we searched for using Site Kit. Today, in this article, we are going to see how to show a map using Map Kit after fetching the latitude and longitude from the details we selected. In Part 3, we will see how to use the Site and Map APIs using Postman.
One Step at a time
First, we need to add the Map Kit dependency to the app-level build.gradle file and sync the project.
implementation 'com.huawei.hms:maps:4.0.1.300'
After adding the dependency, we need to declare permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="com.huawei.appmarket.service.commondata.permission.GET_COMMON_DATA"/>
Let’s Code
Main Activity class
Code:
private void showDetails(String item) {
String pattern = Pattern.quote("\\" + "n");
String[] lines = item.split("\\n+");
autoCompleteTextView.setText(lines[0]);
mLat = lines[2]; // This is latitude
mLon = lines[3]; // This is longitude
title = lines[0]; // This is title or place name
String details = "<font color='red'>PLACE NAME : </font>" + lines[0] + "<br>"
+ "<font color='#CD5C5C'>COUNTRY : </font>" + lines[1] + "<br>"
+ "<font color='#8E44AD'>ADDRESS : </font>" + lines[4] + "<br>"
+ "<font color='#008000'>PHONE : </font>" + lines[5];
txtDetails.setText(Html.fromHtml(details, Html.FROM_HTML_MODE_COMPACT));
}
private void showMap(){
Intent intent = new Intent(MainActivity.this, MapActivity.class);
intent.putExtra("lat",mLat); // Here we are passing Latitude and Longitude
intent.putExtra("lon",mLon); // and titile from MainActivity class to
intent.putExtra("title",title);// MapActivity class…
startActivity(intent);
}
Main Code
1) First, we need to decide whether we are showing the map in a view or a fragment, because there are two ways to show the map.
a) Fragment way
In the fragment approach, we add a MapFragment to the layout file of an activity.
Code:
<fragment xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
android:id="@+id/mapfragment_mapfragmentdemo"
class="com.huawei.hms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:cameraTargetLat="48.893478"
map:cameraTargetLng="2.334595"
map:cameraZoom="10" />
b) MapView way
Here we add a MapView to the layout file of an activity.
Code:
<com.huawei.hms.maps.MapView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
android:id="@+id/mapView"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:mapType="normal"
map:uiCompass="true"
map:uiZoomControls="true"
map:cameraTargetLat="51"
map:cameraTargetLng="10"
map:cameraZoom="8.5"/>
2) Here we are going with MapView.
3) For both the fragment and the view, we need to implement the OnMapReadyCallback interface in our MapActivity to use the map. After implementing this interface, we must override the onMapReady method.
Code:
public void onMapReady(HuaweiMap map) {
Log.d(TAG, "onMapReady: ");
hMap = map;
}
4) The only difference between MapFragment and MapView is how the map is instantiated.
a) MapFragment
Code:
private MapFragment mMapFragment;
mMapFragment = (MapFragment) getFragmentManager()
.findFragmentById(R.id.mapfragment_mapfragmentdemo);
mMapFragment.getMapAsync(this);
b) MapView
Code:
private MapView mMapView;
mMapView = findViewById(R.id.mapView);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey");
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
5) Permissions we need to check
Code:
// Declare these permission constants at the top of the class …
private static final String[] RUNTIME_PERMISSIONS = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.INTERNET
};
// This will placed in the onCreate() method …
if (!hasPermissions(this, RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(this, RUNTIME_PERMISSIONS, REQUEST_CODE);
}
// Use this method to check Permission …
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission)
!= PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
MapActivity Class
Code:
public class MapActivity extends AppCompatActivity implements OnMapReadyCallback {
private static final String TAG = "MapActivity";
private MapView mMapView;
private HuaweiMap hmap;
private Marker mMarker;
private static final String[] RUNTIME_PERMISSIONS = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.INTERNET
};
private static final String MAPVIEW_BUNDLE_KEY = "MapViewBundleKey";
private static final int REQUEST_CODE = 100;
private String mLatitude, mLongitude,mTitle;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_map);
mLatitude = getIntent().getExtras().getString("lat");
mLongitude = getIntent().getExtras().getString("lon");
mTitle = getIntent().getExtras().getString("title");
if (!hasPermissions(this, RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(this, RUNTIME_PERMISSIONS, REQUEST_CODE);
}
mMapView = findViewById(R.id.mapView);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle(MAPVIEW_BUNDLE_KEY);
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
}
@Override
protected void onStart() {
super.onStart();
mMapView.onStart();
}
@Override
protected void onStop() {
super.onStop();
mMapView.onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
mMapView.onDestroy();
}
@Override
protected void onPause() {
mMapView.onPause();
super.onPause();
}
@Override
protected void onResume() {
super.onResume();
mMapView.onResume();
}
@Override
public void onLowMemory() {
super.onLowMemory();
mMapView.onLowMemory();
}
@Override
public void onMapReady(HuaweiMap huaweiMap) {
Log.d(TAG, "onMapReady: ");
hmap = huaweiMap;
hmap.setMyLocationEnabled(true);
hmap.setMapType(HuaweiMap.MAP_TYPE_NORMAL);
hmap.setMaxZoomPreference(15);
hmap.setMinZoomPreference(5);
CameraPosition build = new CameraPosition.Builder()
.target(new LatLng(Double.parseDouble(mLatitude), Double.parseDouble(mLongitude)))
.build();
CameraUpdate cameraUpdate = CameraUpdateFactory
.newCameraPosition(build);
hmap.animateCamera(cameraUpdate);
MarkerOptions options = new MarkerOptions()
.position(new LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude)))
.title(mTitle);
mMarker = hmap.addMarker(options);
mMarker.showInfoWindow();
hmap.setOnMarkerClickListener(new HuaweiMap.OnMarkerClickListener() {
@Override
public boolean onMarkerClick(Marker marker) {
Toast.makeText(getApplicationContext(), "onMarkerClick:" +
marker.getTitle(), Toast.LENGTH_SHORT).show();
return false;
}
});
}
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission)
!= PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
}
Core Functionality of Map
1) Types of Map
There are five map types:
· HuaweiMap.MAP_TYPE_NORMAL
· HuaweiMap.MAP_TYPE_NONE
· HuaweiMap.MAP_TYPE_SATELLITE
· HuaweiMap.MAP_TYPE_HYBRID
· HuaweiMap.MAP_TYPE_TERRAIN
But currently we can only use MAP_TYPE_NORMAL and MAP_TYPE_NONE. The normal type is a standard map, which shows roads, artificial structures, and natural features such as rivers. The none type is an empty map without any data.
The remaining map types are still in the development phase.
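For reference, switching between the two usable types is a single call on the HuaweiMap object once it is ready; here is a minimal sketch reusing the hmap instance from onMapReady above:
Code:
// Standard map showing roads, structures, and natural features
hmap.setMapType(HuaweiMap.MAP_TYPE_NORMAL);
// Or render an empty map without any data
// hmap.setMapType(HuaweiMap.MAP_TYPE_NONE);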
2) Camera Movement
Huawei maps are moved by simulating camera movement. You can control the visible region of a map by changing the camera's position. To change the camera's position, create different types of CameraUpdate objects using the CameraUpdateFactory class, and use these objects to move the camera.
Code:
CameraPosition build = new CameraPosition.Builder().target(new
LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude))).build();
CameraUpdate cameraUpdate = CameraUpdateFactory
.newCameraPosition(build);
hmap.animateCamera(cameraUpdate);
In the above code we are moving the map camera in animation mode. When moving the camera in animation mode, you can set the animation duration and a callback to be invoked when the animation stops. By default, the animation duration is 250 ms.
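As a sketch of how a custom duration and completion callback can be supplied (this assumes the animateCamera overload that takes a duration in milliseconds and a HuaweiMap.CancelableCallback, mirroring the standard map camera API; cameraUpdate and TAG come from the snippets above):
Code:
// Assumed overload: animateCamera(CameraUpdate, durationMs, CancelableCallback)
hmap.animateCamera(cameraUpdate, 1000, new HuaweiMap.CancelableCallback() {
    @Override
    public void onFinish() {
        Log.d(TAG, "Camera animation finished");
    }
    @Override
    public void onCancel() {
        Log.d(TAG, "Camera animation cancelled");
    }
});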
3) My Location in Map
We can show our location on the map by simply enabling the my-location layer. We can also display the my-location button on the map.
Code:
hmap.setMyLocationEnabled(true);
hmap.getUiSettings().setMyLocationButtonEnabled(true);
4) Show Marker in Map
We can add markers to a map to identify locations such as stores and buildings, and provide additional information with information windows.
Code:
MarkerOptions options = new MarkerOptions()
.position(new LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude)))
.title(mTitle); // Adding the title here …
mMarker = hmap.addMarker(options);
mMarker.showInfoWindow();
We can customize our marker according to our needs using a BitmapDescriptor object.
Code:
Bitmap bitmap = ResourceBitmapDescriptor.drawableToBitmap(this,
ContextCompat.getDrawable(this, R.drawable.badge_ph));
BitmapDescriptor bitmapDescriptor = BitmapDescriptorFactory.fromBitmap(bitmap);
mMarker.setIcon(bitmapDescriptor);
We can add a title to the marker as shown in the above code. We can also make the marker clickable as shown below.
Code:
hmap.setOnMarkerClickListener(new HuaweiMap.OnMarkerClickListener() {
@Override
public boolean onMarkerClick(Marker marker) {
Toast.makeText(getApplicationContext(), "onMarkerClick:" +
marker.getTitle(), Toast.LENGTH_SHORT).show();
return false;
}
});
5) Shapes on the map
The map supports three types of shapes:
a) Polyline
b) Polygon
c) Circle
We can use a Polyline when we need to show a route from one place to another. We can combine the Directions API with a Polyline to show walking, cycling, and driving routes, and also to calculate route distances.
If we need to show a radius, for example everything within 500 meters of a location, we use the Circle shape on the map, as sketched below.
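Below is a minimal sketch of drawing a polyline and a 500-meter circle. The coordinates reuse mLatitude and mLongitude from MapActivity, the second point is a placeholder, and PolylineOptions/CircleOptions (from com.huawei.hms.maps.model) follow the same builder pattern as the MarkerOptions shown earlier:
Code:
LatLng start = new LatLng(Double.parseDouble(mLatitude), Double.parseDouble(mLongitude));
LatLng end = new LatLng(start.latitude + 0.01, start.longitude + 0.01); // placeholder end point
// Route segment between the two points
Polyline polyline = hmap.addPolyline(new PolylineOptions()
        .add(start, end)
        .color(Color.BLUE)
        .width(6f));
// 500-meter radius around the selected location
Circle circle = hmap.addCircle(new CircleOptions()
        .center(start)
        .radius(500)
        .strokeWidth(3f)
        .strokeColor(Color.RED)
        .fillColor(Color.argb(40, 255, 0, 0)));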
The Result
If you have any questions about this process, you can look for answers on the HUAWEI Developer Forum.
This article is originally from HUAWEI Developer Forum
Forum link: https://forums.developer.huawei.com/forumPortal/en/home
HiAi Image Super Resolution
Upscales an image or reduces image noise and improves image details without changing the resolution.
Based on AI deep learning for CV (computer vision)
Utilizes the Huawei NPU (Neural Processing Unit), up to 50X faster than the CPU
1X and 3X super-resolution produce clearer images and reduce JPEG compression noise
You can check the official documentation about HiAI Image Super Resolution.
Huawei continues to invest in NPU technology.
Huawei phones that support HiAI:
Software: Huawei EMUI 9.0 and above
Hardware: Kirin 970, 810, 820, 985, and 990 chipsets
Codelab
https://developer.huawei.com/consumer/en/codelab/HiAIImageSuperresolution/index.html#0
You can also follow the codelab to implement HiAI image super resolution with the help of the DevEco IDE plugin in Android Studio.
Project: (HiAi Image Super Resolution)
In this article, we are going to build a project in which we implement HiAI Image Super Resolution to improve the quality of low-resolution images, which are used in most applications as thumbnails.
1. Implementation:
Download the vision-oversea-release.aar package in the Huawei AI Engine SDKs from the Huawei developer community.
Copy the downloaded vision-oversea-release.aar package to the app/libs directory of the project.
Add the following code to build.gradle in the app directory of the project to add vision-oversea-release.aar to the project. A dependency on the Gson library must also be added, because the conversion of parameters and results between JSON and Java classes inside the AAR depends on the Gson library.
Code:
repositories {
flatDir {
dirs 'libs'
}
}
dependencies {
implementation fileTree(dir: 'libs', include: ['*.jar'])
implementation(name: 'vision-oversea-release', ext: 'aar')
implementation 'com.google.code.gson:gson:2.8.6'
}
2. Assets:
In this section we add some low-resolution images to the "assets/material/image_super_resolution" directory so that we can later fetch them from the local assets directory for optimization.
3. Design ListView:
In this section we design a ListView in our layouts to show the original and optimized images.
activity_main.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
tools:context=".MainActivity">
<LinearLayout
android:id="@+id/linearLayout"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="1pt"
android:layout_marginBottom="5pt"
android:orientation="horizontal"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_weight="1"
android:gravity="center"
android:text="Original Image"
android:textSize="24sp" />
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_weight="1"
android:gravity="center"
android:text="Improved Image"
android:textSize="24sp" />
</LinearLayout>
<ListView
android:id="@+id/item_listView"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:dividerHeight="3pt"
/>
</LinearLayout>
items.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="horizontal"
android:layout_width="match_parent"
android:layout_height="match_parent"
>
<ImageView
android:id="@+id/imgOriginal"
android:layout_width="100dp"
android:layout_height="100dp"
app:srcCompat="@drawable/noimage"
android:layout_gravity="start"
android:layout_weight="1"
/>
<TextView
android:id="@+id/imgTitle"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text=" "
android:layout_gravity="center_horizontal"
android:layout_weight="0"
android:textAlignment="center"
/>
<ImageView
android:id="@+id/imgConverted"
android:layout_width="100dp"
android:layout_height="100dp"
app:srcCompat="@drawable/noimage"
android:layout_gravity="end"
android:layout_weight="1"
app:layout_constraintDimensionRatio="h,4:3"
/>
</LinearLayout>
4. Coding: (Adapter, HiAi Image Super Resolution )
Util Class:
We create a utility class, AssetsFileUtil, to get all the images from the local assets directory and to get a single Bitmap image.
Code:
public class AssetsFileUtil {
public static Bitmap getBitmapByFilePath(Context context, String filePath){
try{
AssetManager assetManager = context.getAssets();
InputStream is = assetManager.open(filePath);
Bitmap bitmap = BitmapFactory.decodeStream(is);
return bitmap;
}catch (Exception e){
e.printStackTrace();
return null;
}
}
public static List<Bitmap> getBitmapListByDirPath(Context context, String dirPath){
List<Bitmap> list = new ArrayList<Bitmap>();
try{
AssetManager assetManager = context.getResources().getAssets();
String[] photos = assetManager.list(dirPath);
for(String photo : photos){
if(isFile(photo)){
Bitmap bitmap = getBitmapByFilePath(context,dirPath + "/" + photo);
list.add(bitmap);
}else {
List<Bitmap> childList = getBitmapListByDirPath(context,dirPath + "/" + photo);
list.addAll(childList);
}
}
}catch (Exception e){
e.printStackTrace();
}
return list;
}
public static List<String> getFileNameListByDirPath(Context context, String dirPath){
List<String> list = new ArrayList<String>();
try{
AssetManager assetManager = context.getResources().getAssets();
String[] photos = assetManager.list(dirPath);
for(String photo : photos){
if(isFile(photo)){
list.add(dirPath + "/" + photo);
}else {
List<String> childList = getFileNameListByDirPath(context,dirPath + "/" + photo);
list.addAll(childList);
}
}
}catch (Exception e){
e.printStackTrace();
}
return list;
}
public static boolean isFile(String fileName){
if(fileName.contains(".")){
return true;
}else {
return false;
}
}
}
MainActivity Class:
In this class we get the local images and attach the image list to our adapter.
Code:
public class MainActivity extends AppCompatActivity {
private String mDirPath;
private ArrayList<Item> itemList;
private List<String> imageList;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
itemList = new ArrayList<Item>();
getLocalImages();
// Setting Adapter and listview
ItemAdapter itemAdapter = new ItemAdapter(getApplicationContext(), R.layout.items, itemList);
ListView listView = findViewById(R.id.item_listView);
listView.setAdapter(itemAdapter);
}
public void getLocalImages(){
mDirPath ="material/image_super_resolution";
imageList = AssetsFileUtil.getFileNameListByDirPath(this,mDirPath);
for(int i=0; i<imageList.size();i++){
itemList.add(new Item(imageList.get(i), " ", imageList.get(i)));
}
}
}
Item Class:
Prepare item data class.
Code:
public class Item {
private String imgOriginal;
private String imgTitle;
private String imgConverted;
public Item(String imgOriginal, String imgTitle, String imgConverted) {
this.imgOriginal = imgOriginal;
this.imgTitle = imgTitle;
this.imgConverted = imgConverted;
}
public String getImgOriginal() {
return imgOriginal;
}
public void setImgOriginal(String imgOriginal) {
this.imgOriginal = imgOriginal;
}
public String getImgTitle() {
return imgTitle;
}
public void setImgTitle(String imgTitle) {
this.imgTitle = imgTitle;
}
public String getImgConverted() {
return imgConverted;
}
public void setImgConverted(String imgConverted) {
this.imgConverted = imgConverted;
}
}
ItemAdapter Class:
In this class we bind our image list to the layout's ImageViews and implement HiAI Image Super Resolution to optimize the image resolution.
The ItemAdapter class extends ArrayAdapter with the Item class as its type parameter.
Code:
public class ItemAdapter extends ArrayAdapter<Item>
Define some fields and constants
Code:
private ArrayList<Item> itemList;
private final static int SUPERRESOLUTION_RESULT = 110;
private Bitmap bitmapOriginal;
private Bitmap bitmapConverted;
ImageView imgOriginal;
ImageView imgConverted;
private String TAG = "ItemAdapter";
Prepare the constructor for the adapter class
Code:
public ItemAdapter(@NonNull Context context, int resource, @NonNull ArrayList<Item> itemList) {
super(context, resource, itemList);
this.itemList = itemList;
}
Define the initHiAI function to check whether the service is connected or disconnected
Code:
/**
* init HiAI interface
*/
private void initHiAI() {
/** Initialize with the VisionBase static class and asynchronously get the connection of the service */
VisionBase.init(getContext(), new ConnectionCallback() {
@Override
public void onServiceConnect() {
/** This callback method is invoked when the service connection is successful; you can do the initialization of the detector class, mark the service connection status, and so on */
}
@Override
public void onServiceDisconnect() {
/** When the service is disconnected, this callback method is called; you can choose to reconnect the service here, or to handle the exception*/
}
});
}
Define the setHiAi function to perform the HiAI operation on the original image and generate the optimized bitmap.
Code:
/**
* Capability Interfaces
*
* @return
*/
private void setHiAi() {
/** Define class detector, the context of this project is the input parameter */
ImageSuperResolution superResolution = new ImageSuperResolution(getContext());
/** Define the frame, put the bitmap that needs to detect the image into the frame*/
Frame frame = new Frame();
/** BitmapFactory.decodeFile input resource file path*/
// Bitmap bitmap = BitmapFactory.decodeFile(null);
frame.setBitmap(bitmapOriginal);
/** Define and set super-resolution parameters*/
SuperResolutionConfiguration paras = new SuperResolutionConfiguration(
SuperResolutionConfiguration.SISR_SCALE_3X,
SuperResolutionConfiguration.SISR_QUALITY_HIGH);
superResolution.setSuperResolutionConfiguration(paras);
/** Run super-resolution and get result of processing */
ImageResult result = superResolution.doSuperResolution(frame, null);
/** After the results are processed to get bitmap*/
Bitmap bmp = result.getBitmap();
/** Note: Check that the result and the bitmap in the result are not null, and also check whether the returned error code is 0 (0 means no error) */
this.bitmapConverted = bmp;
handler.sendEmptyMessage(SUPERRESOLUTION_RESULT);
}
Define a Handler; when the optimization completes, it attaches the optimized bitmap to the ImageView.
Code:
private Handler handler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
switch (msg.what) {
case SUPERRESOLUTION_RESULT:
if (bitmapConverted != null) {
imgConverted.setImageBitmap(bitmapConverted);
} else { // Set the original image
imgConverted.setImageBitmap(bitmapOriginal);
// toast("High Resolution image");
}
break;
}
}
};
Override the getView method to attach the original image to the ImageView and process the original image so that the optimized image can be attached.
Code:
@NonNull
@Override
public View getView(int position, @Nullable View convertView, @NonNull ViewGroup parent) {
initHiAI();
int itemIndex = position;
if(convertView == null){
convertView = LayoutInflater.from(getContext()).inflate(R.layout.items,parent, false);
}
imgOriginal = convertView.findViewById(R.id.imgOriginal);
TextView imgTitle = convertView.findViewById(R.id.imgTitle);
imgConverted = convertView.findViewById(R.id.imgConverted);
bitmapOriginal = AssetsFileUtil.getBitmapByFilePath(imgOriginal.getContext(), itemList.get(itemIndex).getImgConverted());
imgOriginal.setImageBitmap(bitmapOriginal);
bitmapConverted =AssetsFileUtil.getBitmapByFilePath(imgConverted.getContext(), itemList.get(itemIndex).getImgOriginal());
imgConverted.setImageBitmap(bitmapConverted);
int height = bitmapOriginal.getHeight();
int width = bitmapOriginal.getWidth();
Log.e(TAG, "width:" + width + ";height:" + height);
if (width <= 800 && height <= 600) {
new Thread() {
@Override
public void run() {
setHiAi();
}
}.start();
} else {
toast("Width and height of the image cannot exceed 800*600");
}
imgTitle.setText(itemList.get(itemIndex).getImgTitle());
return convertView;
}
public void toast(String text) {
Toast.makeText(getContext(), text, Toast.LENGTH_SHORT).show();
}
The coding section is complete here. Now run your project and check the output of the image optimization using HiAI Image Super Resolution.
5. Result
For more articles like this, you can visit the HUAWEI Developer Forum and Medium.
https://forums.developer.huawei.com/forumPortal/en/home
In this article, we will implement the Huawei Share Kit SDK and complete our demo application.
In the previous article, we learned about the Share Kit introduction and created the project, so let's start our implementation.
I will demonstrate the functionality of Share Kit in a simple way with a working application and give a demo.
Before starting to develop the application, we must meet the following requirements.
Hardware Requirements
1. A computer (desktop or laptop) that runs Windows 7 or Windows 10
2. A Huawei phone (with the USB cable), which is used for debugging
3. A third-party Android device, which is used for debugging
Software Requirements
1. JDK 1.8 or later
2. Android API (level 26 or higher)
3. EMUI 10.0 or later
Let’s start the development:
1. Add the Share Kit SDK to the project:
2. We need to add the code repository to the project-level build.gradle in the root directory.
Code:
maven {
url 'http://developer.huawei.com/repo/'
}
3. We need to add the following dependencies to our app-level build.gradle.
Code:
dependencies {
implementation files('libs/sharekit-1.0.1.300.aar')
implementation 'com.android.support:support-annotations:28.0.0'
implementation 'com.android.support:localbroadcastmanager:28.0.0'
implementation 'com.android.support:support-compat:28.0.0'
implementation 'com.google.guava:guava:24.1-android'
}
Note: You need to raise a ticket to get the Share Kit SDK "sharekit-1.0.1.300.aar" file.
Click on the below link and raise your ticket.
https://developer.huawei.com/consumer/en/support/feedback/#/
4. I have created the following packages and resource files:
5. I have declared all activities in the manifest file:
Code:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.hms.myshare">
<application
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/AppTheme">
<activity
android:name=".SplashScreen"
android:label="@string/app_name">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
<activity
android:name=".SearchingActivity"
android:configChanges="orientation|keyboardHidden|screenSize"/>
<activity
android:name=".ReceiveActivity"
android:configChanges="orientation|keyboardHidden|screenSize"/>
</application>
</manifest>
Let’s create an awesome User Interface:
1. I have created a wave ripple effect, which helps visualize the device search in the UI.
I have created a SearchingView.java class:
Code:
public class SearchingView extends RelativeLayout {
private static final int DEFAULT_RIPPLE_COUNT=6;
private static final int DEFAULT_DURATION_TIME=3000;
private static final float DEFAULT_SCALE=6.0f;
private static final int DEFAULT_FILL_TYPE=0;
private int rippleColor;
private float rippleStrokeWidth;
private float rippleRadius;
private int rippleDurationTime;
private int rippleAmount;
private int rippleDelay;
private float rippleScale;
private int rippleType;
private Paint paint;
private boolean animationRunning=false;
private AnimatorSet animatorSet;
private ArrayList<Animator> animatorList;
private LayoutParams rippleParams;
private ArrayList<RippleView> rippleViewList=new ArrayList<RippleView>();
public SearchingView(Context context) {
super(context);
}
public SearchingView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context, attrs);
}
public SearchingView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
init(context, attrs);
}
private void init(final Context context, final AttributeSet attrs) {
if (isInEditMode())
return;
if (null == attrs) {
throw new IllegalArgumentException("Attributes should be provided to this view,");
}
final TypedArray typedArray = context.obtainStyledAttributes(attrs, R.styleable.RippleBackground);
rippleColor=typedArray.getColor(R.styleable.RippleBackground_rb_color, getResources().getColor(R.color.rippelColor));
rippleStrokeWidth=typedArray.getDimension(R.styleable.RippleBackground_rb_strokeWidth, getResources().getDimension(R.dimen.rippleStrokeWidth));
rippleRadius=typedArray.getDimension(R.styleable.RippleBackground_rb_radius,getResources().getDimension(R.dimen.rippleRadius));
rippleDurationTime=typedArray.getInt(R.styleable.RippleBackground_rb_duration,DEFAULT_DURATION_TIME);
rippleAmount=typedArray.getInt(R.styleable.RippleBackground_rb_rippleAmount,DEFAULT_RIPPLE_COUNT);
rippleScale=typedArray.getFloat(R.styleable.RippleBackground_rb_scale,DEFAULT_SCALE);
rippleType=typedArray.getInt(R.styleable.RippleBackground_rb_type,DEFAULT_FILL_TYPE);
typedArray.recycle();
rippleDelay=rippleDurationTime/rippleAmount;
paint = new Paint();
paint.setAntiAlias(true);
if(rippleType==DEFAULT_FILL_TYPE){
rippleStrokeWidth=0;
paint.setStyle(Paint.Style.FILL);
}else
paint.setStyle(Paint.Style.STROKE);
paint.setColor(rippleColor);
rippleParams=new LayoutParams((int)(2*(rippleRadius+rippleStrokeWidth)),(int)(2*(rippleRadius+rippleStrokeWidth)));
rippleParams.addRule(CENTER_IN_PARENT, TRUE);
animatorSet = new AnimatorSet();
animatorSet.setInterpolator(new AccelerateDecelerateInterpolator());
animatorList=new ArrayList<Animator>();
for(int i=0;i<rippleAmount;i++){
RippleView rippleView=new RippleView(getContext());
addView(rippleView,rippleParams);
rippleViewList.add(rippleView);
final ObjectAnimator scaleXAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleX", 1.0f, rippleScale);
scaleXAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleXAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleXAnimator.setStartDelay(i * rippleDelay);
scaleXAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleXAnimator);
final ObjectAnimator scaleYAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleY", 1.0f, rippleScale);
scaleYAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleYAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleYAnimator.setStartDelay(i * rippleDelay);
scaleYAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleYAnimator);
final ObjectAnimator alphaAnimator = ObjectAnimator.ofFloat(rippleView, "Alpha", 1.0f, 0f);
alphaAnimator.setRepeatCount(ObjectAnimator.INFINITE);
alphaAnimator.setRepeatMode(ObjectAnimator.RESTART);
alphaAnimator.setStartDelay(i * rippleDelay);
alphaAnimator.setDuration(rippleDurationTime);
animatorList.add(alphaAnimator);
}
animatorSet.playTogether(animatorList);
}
private class RippleView extends View {
public RippleView(Context context) {
super(context);
this.setVisibility(View.INVISIBLE);
}
@Override
protected void onDraw(Canvas canvas) {
int radius=(Math.min(getWidth(),getHeight()))/2;
canvas.drawCircle(radius,radius,radius-rippleStrokeWidth,paint);
}
}
public void startRippleAnimation(){
if(!isRippleAnimationRunning()){
for(RippleView rippleView:rippleViewList){
rippleView.setVisibility(VISIBLE);
}
animatorSet.start();
animationRunning=true;
}
}
public void stopRippleAnimation(){
if(isRippleAnimationRunning()){
animatorSet.end();
animationRunning=false;
}
}
public boolean isRippleAnimationRunning(){
return animationRunning;
}
}
Let's see how this custom view is used inside an XML layout:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="vertical"
android:gravity="center_horizontal"
android:layout_width="match_parent"
android:background="@drawable/background"
android:layout_height="match_parent">
<RelativeLayout
android:layout_width="wrap_content"
android:layout_height="0dp"
android:layout_weight="1"
android:gravity="center">
<com.hms.myshare.view.SearchingView
android:id="@+id/searching"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:rb_color="@android:color/white"
app:rb_duration="3000"
app:rb_radius="40dp"
app:rb_rippleAmount="6"
app:rb_scale="5">
<ImageView
android:id="@+id/img_logo"
android:layout_width="200dp"
android:layout_height="200dp"
android:layout_centerInParent="true"
android:src="@drawable/log" />
</com.hms.myshare.view.SearchingView>
</RelativeLayout>
<androidx.appcompat.widget.AppCompatTextView
android:layout_width="wrap_content"
android:layout_gravity="bottom|center_horizontal"
android:textColor="@android:color/white"
android:textSize="28sp"
android:gravity="center"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:text="Huawei Share Kit"
android:id="@+id/appCompatTextView2" />
</LinearLayout>
Let's see the output of this view:
Let's implement searching for devices and sending data:
· We have implemented this functionality inside the SearchingActivity class.
We need to perform the following operations in order to send data to a found device.
1. We need to instantiate the SDK manager class, ShareKitManager, with the current activity context inside the onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
binding = DataBindingUtil.setContentView(this, R.layout.searching_activity);
shareKitManager = new ShareKitManager(this);
2. Add an IShareKitInitCallback callback to initialize the ShareKitManager class.
Code:
IShareKitInitCallback initCallback = isSuccess -> {
Log.i(TAG, "share kit init result:" + isSuccess);
if (isSuccess) {
binding.txtError.setText(getString(R.string.sharekit_init_finish));
} else {
binding.txtError.setText(getString(R.string.sharekit_init_failed));
}
};
shareKitManager.init(initCallback);
3. Register the ShareKitManager with IWidgetCallback:
Code:
private IWidgetCallback callback = new IWidgetCallback.Stub() {
@Override
public synchronized void onDeviceFound(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceFound: deviceId is null");
return;
}
Log.i(TAG, "onDeviceFound: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.put(deviceId, nearByDeviceEx);
foundTimeMap.put(deviceId, format.format(new Date()));
updateDeviceList();
}
}
@Override
public void onDeviceDisappeared(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceDisappeared: deviceId is null");
return;
}
Log.i(TAG, "onDeviceDisappeared: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.remove(deviceId);
foundTimeMap.remove(deviceId);
updateDeviceList();
}
}
@Override
public void onTransStateChange(NearByDeviceEx nearByDeviceEx, int state, int stateValue) {
Log.i(TAG, "trans state:" + state + " value:" + stateValue);
String stateDesc = "";
switch (state) {
case STATE_PROGRESS:
stateDesc = getString(R.string.sharekit_send_progress, stateValue);
break;
case STATE_SUCCESS:
stateDesc = getString(R.string.sharekit_send_finish);
break;
case STATE_STATUS:
stateDesc = getString(R.string.sharekit_state_chg, translateStateValue(stateValue));
break;
case STATE_ERROR:
stateDesc = getString(R.string.sharekit_send_error, translateErrorValue(stateValue));
showError(getString(R.string.sharekit_send_error, translateErrorValue(stateValue)));
break;
default:
break;
}
// showToast(stateDesc);
}
@Override
public void onEnableStatusChanged() {
int status = shareKitManager.getShareStatus();
Log.i(TAG, "sharekit ability current status:" + status);
}
};
We need to pass this callback to the register API.
Code:
shareKitManager.registerCallback(callback);
4. Start searching for devices using the discovery API.
Code:
shareKitManager.startDiscovery();
5. Once a device is found successfully, we need to create a ShareBean to send the data.
Code:
private void doSendText() {
String text = binding.sharetext.getText().toString();
ShareBean shareBean = new ShareBean(text);
doSend(destDevice, shareBean);
}
Followed by doSend() method:
Code:
private void doSend(String deviceName, ShareBean shareBean) {
List<NearByDeviceEx> processingDevices = shareKitManager.getDeviceList();
for (NearByDeviceEx device : processingDevices) {
if (deviceName.equals(device.getBtName())) {
return;
}
}
synchronized (lock) {
for (NearByDeviceEx device : deviceMap.values()) {
if (deviceName.equals(device.getBtName())) {
shareKitManager.doSend(device, shareBean);
}
}
}
}
Let's implement the receive data functionality:
· We have implemented this functionality inside ReceiveActivity.
· We need to enable Wi-Fi on the Huawei device, which receives the socket connection request from the sender device.
· So we need to initialize the ShareKitManager inside this activity's onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
binding = DataBindingUtil.setContentView(this, R.layout.receiver_activity);
binding.searching.startRippleAnimation();
shareKitManager = new ShareKitManager(this);
IShareKitInitCallback initCallback = isSuccess -> {
Log.i(TAG, "share kit init result:" + isSuccess);
};
shareKitManager.init(initCallback);
shareKitManager.enable();
}
Android device (Sender):
Huawei device (Receiver):
If you have any doubts or queries, please leave a comment or post your questions in the HUAWEI Developer Forum.
Most Android applications download their content from the cloud (commonly via a REST API) and then parse and display that information with lists and menus in order to show dynamic content or provide a personalized experience. There are third-party libraries designed to consume a REST API (like Retrofit) or to download media content (like Glide and Picasso). This time, let me introduce you to the new Huawei Network Kit.
What is Network Kit?
Network Kit is Huawei's new system SDK designed to simplify communication with web services by providing two main connection modes:
Rest Client
HTTP Client
Network Kit supports QUIC connections automatically; that means if the web service supports QUIC or migrates to QUIC, your app will keep working without requiring any change. In addition, this kit is pretty similar to the well-known Retrofit, so if you have previous experience with Retrofit, you will be able to integrate Network Kit without complications.
Previously, we made a news client by using the HQUIC kit. In this article we are going to develop a news client application by using the new Huawei Network Kit.
Previous requirements
A developer account in newsapi.org
Android Studio 4.0 or later and the kotlin plugin
Setting up the project
Network Kit doesn't require you to set up a project in AGC, but you still need to add the Huawei Maven repositories to your project-level build.gradle:
Code:
buildscript {
ext.kotlin_version = "1.4.31"
repositories {
...
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.1.2"
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
}
}
allprojects {
repositories {
...
maven {url 'https://developer.huawei.com/repo/'}
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
Go to the official documentation and look for the latest Network Kit version under the version change history. Once you have found the latest version available, add it to your app-level build.gradle as follows:
Java:
implementation 'com.huawei.hms:network-embedded:5.0.1.301'
We will use Moshi to parse the response from the web service, so let's add the related dependencies and the kapt plugin to process the annotations.
Java:
plugins {
id 'com.android.application'
id 'kotlin-android'
id 'kotlin-kapt'
}
android{
...
}
dependencies {
...
implementation 'com.huawei.hms:network-embedded:5.0.1.301'
implementation 'com.squareup.moshi:moshi:1.11.0'
implementation "com.squareup.moshi:moshi-kotlin:1.11.0"
kapt 'com.squareup.moshi:moshi-kotlin-codegen:1.11.0'
...
}
To display the news in a list, we must add RecyclerView and CardView to our project and must enable the DataBinding library to make our job easier.
Java:
android {
...
//Enabling DataBinding and ViewBinding
buildFeatures{
viewBinding true
dataBinding true
}
...
}
dependencies {
...
//MVVM dependencies
implementation 'androidx.lifecycle:lifecycle-extensions:2.2.0'
implementation "androidx.lifecycle:lifecycle-viewmodel-ktx:2.3.0"
//DataBinding dependency
kapt "com.android.databinding:compiler:3.1.4"
//Layout dependencies
implementation "androidx.recyclerview:recyclerview:1.1.0"
implementation "androidx.cardview:cardview:1.0.0"
...
}
We are ready to start the project.
Building the request
First of all, Network Kit must be initialized. Let's create an Application class to do this job:
NetworkApplication.kt
Java:
class NetworkApplication: Application() {
companion object {
const val TAG="Network Application"
}
override fun onCreate() {
super.onCreate()
initNetworkKit()
}
private fun initNetworkKit() {
// Initialize the object only once, upon the first call.
NetworkKit.init(this ,object : NetworkKit.Callback() {
override fun onResult(result: Boolean) {
if (result) {
Log.i(TAG, "Networkkit init success")
} else {
Log.i(TAG, "Networkkit init failed")
}
}
})
}
}
To make sure this code is executed upon each startup, we must specify this class inside the application element in our AndroidManifest.xml. Let's add the required permissions too.
XML:
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
<uses-permission android:name="android.permission.INTERNET"/>
<application
android:name=".NetworkApplication"
android:requestLegacyExternalStorage="true"
android:allowBackup="true"
android:icon="@mipmap/ic_launcher"
android:label="@string/app_name"
android:roundIcon="@mipmap/ic_launcher_round"
android:supportsRtl="true"
android:theme="@style/Theme.NetworkKitDemo"
android:usesCleartextTraffic="true">
<activity android:name=".MainActivity">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
</activity>
</application>
Now, we must create the data models which Moshi will use to parse the response.
NewsResponse.kt
Java:
@JsonClass(generateAdapter = true)
data class NewsResponse(
@Json(name = "status") val status: String?,
@Json(name = "totalResults") val totalResults: Int?,
@Json(name = "articles") val articles: List<Article>
)
@JsonClass(generateAdapter = true)
data class Article(
@Json(name = "source") val source: Source?,
@Json(name = "author") val author: String?,
@Json(name = "title") val title: String?,
@Json(name = "description") val description: String?,
@Json(name = "url") val url: String?,
@Json(name = "urlToImage") val urlToImage: String?,
@Json(name = "publishedAt") val publishedAt: String?,
@Json(name = "content") val content: String?
)
@JsonClass(generateAdapter = true)
data class Source(
@Json(name = "id") val id: String?,
@Json(name = "name") val name: String?
)
Network Kit provides two operation modes. We will use the REST Client to get the top headlines in the user's country and the HTTP Client mode to download the picture of each article. We will create a singleton class called NetworkKitHelper.
Let's take a look at the REST Client mode:
NetworkKitHelper.kt
Java:
object NetworkKitHelper {
const val TAG: String = "HTTPClient"
//Your API key from newsapi.org
val apiKey = Keys.readApiKey()
fun createNewsClient(): NewsService {
val restClient = RestClient.Builder()
.httpClient(HttpClient.Builder().build())
.baseUrl("https://newsapi.org/v2/")//Specify the API base URL, this is useful if you will consume multiple paths of the same API
.build()
return restClient.create(NewsService::class.java)
}
//Declare a Request API
interface NewsService {
//Use the GET annotation to specify the path
@GET("top-headlines/")
fun getTopHeadlines(/* use the Query annotation to specify a query parameter in the request*/
@Query("apiKey") apiKey: String? = "",
@Query("country") country: String
): Submit<String?>?
}
fun loadTopHeadlines(sampleService: NewsService, listener: NewsClientListener?,country:String=Locale.getDefault().country) {
sampleService.getTopHeadlines(apiKey,country)?.enqueue(object : Callback<String?>() {
@Throws(IOException::class)
override fun onResponse(submit: Submit<String?>?, response: Response<String?>) {
// Obtain the response. This method will be called if the request is successful.
val body = response.body
body?.let {
try {
val moshi = Moshi.Builder().build()
val adapter = moshi.adapter(NewsResponse::class.java)
val news = adapter.fromJson(it)
news?.let { myNews ->
listener?.onNewsDownloaded(myNews.articles)
}
} catch (e: JSONException) {
Log.e("excepion", e.toString())
}
}
}
override fun onFailure(submit: Submit<String?>?, exception: Throwable?) {
// Obtain the response. This method will be called if the request fails.
Log.e("LoadTopHeadlines", "response onFailure = " + exception?.message)
}
})
}
interface NewsClientListener {
fun onNewsDownloaded(news: List<Article>)
}
}
Pay special attention to the loadTopHeadlines function. As you can see, there are no coroutines or threads defined; we are using the enqueue API instead. This way, Network Kit handles the request asynchronously for us.
If the API call is successful, we use Moshi to parse the response into data objects. Otherwise, we are notified about the error in the onFailure callback. Once the response has been parsed, NetworkKitHelper reports the news to the specified NewsClientListener.
Let's add the code to download the preview pics:
NetworkKitHelper.kt (Adding)
Java:
object NetworkKitHelper {
private val httpClient: HttpClient = createClient()
private fun createClient(): HttpClient {
return HttpClient.Builder()
.callTimeout(1000)
.connectTimeout(10000)
.build()
}
fun createRequest(url: String): Request {
return httpClient.newRequest()
.url(url)
.method("GET")
.build()
}
fun httpClientEnqueue(request: Request, listener: HttpClientListener? = null) {
httpClient.newSubmit(request).enqueue(object : Callback<ResponseBody?>() {
@Throws(IOException::class)
override fun onResponse(
submit: Submit<ResponseBody?>?,
response: Response<ResponseBody?>
) {
// Process the response if the request is successful.
Log.i(TAG, "response code:" + response.code)
response.body?.let {
listener?.onSuccess(it.bytes())
}
}
override fun onFailure(submit: Submit<ResponseBody?>?, throwable: Throwable?) {
// Process the exception if the request fails.
Log.w(TAG, "response onFailure = ${throwable?.message}")
}
})
}
interface HttpClientListener {
fun onSuccess(body: ByteArray)
}
}
As with the REST Client mode, we are able to enqueue HTTP requests and define a callback for each one. In this case, we receive a byte array which will be used to create and display a bitmap.
Here we face a complication: if we try to store the bitmap in the same data class as the Article, Moshi will cause a reflection error at compile time. To solve this, we define a new class that stores the article and is responsible for loading the bitmap. By doing so, we can load the news as soon as we get them, and then, using the observer pattern, the bitmap is added to the view as soon as it's ready.
ArticleModel.kt
Java:
class ArticleModel(val article: Article) : NetworkKitHelper.HttpClientListener {
private val _bitmap= MutableLiveData<Bitmap?>().apply{postValue(null)}
val bitmap: LiveData<Bitmap?> =_bitmap
init {
loadBitmap()
}
fun loadBitmap() {
article.urlToImage?.let{
val request=NetworkKitHelper.createRequest(it)
NetworkKitHelper.httpClientEnqueue(request, this)
}
}
override fun onSuccess(body: ByteArray) {
val bitmap= BitmapFactory.decodeByteArray(body, 0, body.size)
val resizedBitmap = Bitmap.createScaledBitmap(bitmap, 1000, 600, true)
_bitmap.postValue(resizedBitmap)
}
}
As soon as an instance of ArticleModel is created, it enqueues an asynchronous HTTP request for the preview pic. If the call is successful, we receive a ByteArray in the onSuccess callback, create our bitmap from it, and let the observer know the bitmap is ready to be displayed.
Sending the request
Let's create a ViewModel which will be responsible for invoking the API and storing the data. Here we use the observer pattern to let the observer know the articles are ready to be displayed.
MainViewModel.kt
Java:
class MainViewModel : ViewModel(), NetworkKitHelper.NewsClientListener {
private val _articles = MutableLiveData<ArrayList<ArticleModel>>().apply { value = ArrayList() }
val articles: LiveData<ArrayList<ArticleModel>> = _articles
fun loadTopHeadlines(){
articles.value?.let{
if(it.isEmpty()) getTopHeadlines()
else return
}
}
private fun getTopHeadlines() {
NetworkKitHelper.loadTopHeadlines(NetworkKitHelper.createNewsClient(),this)
}
override fun onNewsDownloaded(news: List<Article>) {
val list=ArrayList<ArticleModel>()
for (article: Article in news) {
list.add(ArticleModel(article))
}
_articles.postValue(list)
}
}
To avoid downloading the news again when the user rotates the screen, we are defining the loadTopHeadlines function. It will only make the request if the list of articles is empty.
Displaying the Articles
We will use DataBinding to quickly display our news in a RecyclerView on the MainActivity. Let's take a look at the main layout.
activity_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<layout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
>
<data class="MainBinding"/>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<androidx.recyclerview.widget.RecyclerView
android:id="@+id/recycler"
android:layout_height="match_parent"
android:layout_width="match_parent"
app:layoutManager="androidx.recyclerview.widget.LinearLayoutManager"
/>
</androidx.constraintlayout.widget.ConstraintLayout>
</layout>
Now we must define the card which will be rendered for each article.
article_card.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<layout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:card_view="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools">
<data class="ArticleBinding">
<variable
name="item"
type="com.hms.demo.networkkitdemo.ArticleModel" />
</data>
<androidx.cardview.widget.CardView
android:id="@+id/card_view"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_gravity="center"
android:layout_marginHorizontal="5dp"
android:layout_marginVertical="5dp"
card_view:cardCornerRadius="15dp"
card_view:cardElevation="20dp"
android:foreground="?android:attr/selectableItemBackground"
android:clickable="true"
android:focusable="true">
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:padding="10dp">
<com.google.android.material.imageview.ShapeableImageView
android:id="@+id/pic"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="5dp"
android:contentDescription="@string/desc"
card_view:layout_constraintEnd_toEndOf="parent"
card_view:layout_constraintHorizontal_bias="1.0"
card_view:layout_constraintStart_toStartOf="parent"
card_view:layout_constraintTop_toBottomOf="@+id/articleTitle"
card_view:shapeAppearanceOverlay="@style/roundedImageView"
tools:srcCompat="@tools:sample/avatars" />
<TextView
android:id="@+id/articleTitle"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:text="@{item.article.title}"
android:textAlignment="viewStart"
android:textSize="24sp"
android:textStyle="bold"
card_view:layout_constraintEnd_toEndOf="parent"
card_view:layout_constraintHorizontal_bias="0.498"
card_view:layout_constraintStart_toStartOf="parent"
card_view:layout_constraintTop_toTopOf="parent" />
<TextView
android:id="@+id/desc"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginTop="5dp"
android:text="@{item.article.description}"
android:textSize="20sp"
card_view:layout_constraintEnd_toEndOf="parent"
card_view:layout_constraintStart_toStartOf="parent"
card_view:layout_constraintTop_toBottomOf="@+id/pic" />
<TextView
android:id="@+id/source"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginTop="15dp"
android:text="@{item.article.source.name}"
card_view:layout_constraintBottom_toBottomOf="parent"
card_view:layout_constraintStart_toStartOf="parent"
card_view:layout_constraintTop_toBottomOf="@+id/desc"
card_view:layout_constraintVertical_bias="0.09" />
</androidx.constraintlayout.widget.ConstraintLayout>
</androidx.cardview.widget.CardView>
</layout>
The ArticleBinding class will be responsible for filling the view with the values in its ArticleModel instance for us. That's the magic of DataBinding.
As you may know, to display elements in a RecyclerView we need an Adapter, so let's define it:
NewsAdapter.kt
Java:
class NewsAdapter: RecyclerView.Adapter<NewsAdapter.NewsViewHolder>() {
var articles:List<ArticleModel> =ArrayList()
class NewsViewHolder(private val binding:ArticleBinding): RecyclerView.ViewHolder(binding.root) {
fun bind(item:ArticleModel){
binding.item=item
item.bitmap.observe(binding.root.context as LifecycleOwner){
it?.let{
binding.pic.setImageBitmap(it)
}
}
}
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): NewsViewHolder {
val inflater=LayoutInflater.from(parent.context)
val binding=ArticleBinding.inflate(inflater,parent,false)
return NewsViewHolder(binding)
}
override fun onBindViewHolder(holder: NewsViewHolder, position: Int) {
holder.bind(articles[position])
}
override fun getItemCount(): Int {
return articles.size
}
}
Pay special attention to the bind function of the NewsViewHolder class; from here we tell the ArticleBinding instance which information we want to display in the view. Also, we use the observer pattern to update the ImageView once the article's preview pic has been downloaded.
Finally, it's time to join everything together in the MainActivity.
MainActivity.kt
Java:
class MainActivity : AppCompatActivity() {
companion object {
const val TAG="Main"
}
private lateinit var binding:MainBinding
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
binding= MainBinding.inflate(layoutInflater)
binding.lifecycleOwner = this
setContentView(binding.root)
val viewModel:MainViewModel=ViewModelProvider(this).get(MainViewModel::class.java)
val adapter=NewsAdapter()
viewModel.articles.observe(this){
adapter.articles=it
adapter.notifyDataSetChanged()
}
binding.recycler.adapter=adapter
viewModel.loadTopHeadlines()
}
}
Final result
Tips & Tricks
If your app will consume a REST API with Kotlin, it is better to use Moshi instead of Gson because Moshi understands Kotlin's non-nullable types.
If you will use API keys to authenticate your client with the server, it is better to use the NDK to hide your key and prevent it from being obtained through reverse engineering. Let's use Rahul Sharma's hiding method. (Make sure to download the Android NDK from the SDK Manager.)
1. Switch to the Project view and create a jni directory under the main directory.
2. Under the jni directory, add the following three files:
Android.mk
Code:
LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := keys
LOCAL_SRC_FILES := keys.c
include $(BUILD_SHARED_LIBRARY)
Application.mk
Code:
APP_ABI := all
Keys.c (Put here your API key)
Code:
#include <jni.h>
JNIEXPORT jstring JNICALL
Java_com_hms_demo_networkkitdemo_Keys_getApiKey(JNIEnv *env, jclass instance) {
return (*env)->NewStringUTF(env, "PUT_HERE_YOUR_API_KEY");
}
3. Switch back to the Android view and create a Keys Kotlin object.
Keys.kt
Java:
object Keys {
init {
System.loadLibrary("keys")
}
private external fun getApiKey(): String?
public fun readApiKey(): String? { //use this method for String
return getApiKey()
}
}
4. Tell Gradle you will use the NDK by adding the following code inside the android block.
build.gradle (app-level)
Java:
plugins {
...
}
android {
...
externalNativeBuild {
ndkBuild {
path 'src/main/jni/Android.mk'
}
}
}
dependencies {
...
}
Finally, modify the NetworkKitHelper object to read the API key from the native library.
NetworkKitHelper.kt (Modifying)
Code:
object NetworkKitHelper {
val apiKey = Keys.readApiKey()
}
Conclusion
By using Network Kit, your app will be ready to perform requests over QUIC or HTTP/2 without writing extra code. The REST Client mode and its annotations are helpful to quickly consume a REST API without worrying about threads or coroutines. And finally, the HTTP Client mode is useful for downloading preview images or anything else that is not JSON.
References
Read In Forum
Network Kit official Docs
Hiding Secret/Api key from reverse engineering in Android using NDK
Moshi
Hi, I have one question: if we use Network Kit, do we no longer need third-party libraries like Volley or Retrofit?
Is it faster and easier to use than the Retrofit library?
Introduction
Nowadays, everybody is using smartphones to do daily tasks like taking photos, looking up movie times, making calls, and so on. The best part of Android apps on mobile phones is that they are trying more and more to get to know their users. Many applications today take users' locations to provide them with location-based feeds. One common example is a typical news app, which takes your current location and shows news for that location.
If you're a developer, you need to understand users better to give them a better experience of the application. You should know at any time what your users are doing: the more you know about your users, the better the application you can build for them. For example, a distance calculator app launches by itself when you start driving your car or bike and stops when you stop driving. Health and fitness apps also use this kind of service to determine how many meters or kilometers you have covered on a particular day.
What is Activity Identification Service?
The Activity Identification Service does the heavy lifting, using the acceleration sensor, cellular network information, and magnetometer of the device to identify the user's current activity. Your app receives a list of detected activities, each of which includes possibility and identity properties.
The Activity Identification Service can detect following activities:
STILL: When the mobile device is still, that is, the user is sitting at some place or the mobile device is not moving, the Activity Identification Service detects the STILL activity.
FOOT: When the mobile device is moving at a normal speed, that is, the user carrying the mobile device is either walking or running, the Activity Identification Service detects the FOOT activity.
WALKING: This is a sub-activity of the FOOT activity, detected by the Activity Identification Service when the user carrying the mobile device is walking.
RUNNING: This is also a sub-activity of the FOOT activity, detected by the Activity Identification Service when the user carrying the mobile device is running.
VEHICLE: This activity is detected when the mobile device is on a bus, in a car, or in some other kind of vehicle, or the user holding the mobile device is in the vehicle.
OTHERS: The Activity Identification Service returns this result when it is unable to detect any activity on the mobile device.
In this article, we will create a sample application to show the user's activity. When the user clicks the start button, we identify the user's activity status along with its possibility level and display the status in a TextView and an ImageView. When the user clicks the stop button, we stop requesting activity identification updates.
Development Overview
Prerequisite
1. Must have a Huawei Developer Account.
2. Must have Android Studio 3.0 or later.
3. Must have a Huawei phone running EMUI 5.0 or later.
Software Requirements
1. Java SDK 1.7 or later.
2. Android 5.0 or later.
Preparation
1. Create an app or project in the Huawei App Gallery Connect.
2. Provide the SHA Key and App Package name of the project in App Information Section and enable the Location Kit API.
3. Download the agconnect-services.json file.
4. Create an Android project.
Integration
1. Add the following to the project-level build.gradle file, under buildscript/repositories and allprojects/repositories.
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.huawei.agconnect:agcp:1.4.2.300'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
2. Add the following to the app-level build.gradle file, under dependencies, to use the Location Kit SDK.
Code:
apply plugin: 'com.huawei.agconnect'
dependencies {
implementation 'com.huawei.hms:location:5.0.5.300'
}
Tip: The minimum Android API level supported by this kit is 19.
3. Add below permissions to manifest file.
For versions earlier than Android Q:
Code:
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION"/>
For Android Q and later:
Code:
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
Note: The above permissions are dangerous permissions and need to be requested dynamically. Requesting permissions at runtime is not covered in detail in this article, but a minimal sketch is shown below.
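For reference only, a minimal runtime-permission check for Android Q and later might look like the following sketch; the request code is arbitrary and the method name is illustrative, not part of the original sample:
Java:
// Requires androidx.core (ContextCompat/ActivityCompat). Call this from an Activity
// before starting activity identification on Android Q (API 29) and later.
private static final int ACTIVITY_RECOGNITION_REQUEST_CODE = 100;

private void requestActivityRecognitionPermission() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACTIVITY_RECOGNITION)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.ACTIVITY_RECOGNITION},
                ACTIVITY_RECOGNITION_REQUEST_CODE);
    }
}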
Development
We need to register a static broadcast receiver in AndroidManifest.xml to listen to the activity status updates identified by the Activity Identification Service.
Code:
<receiver
android:name=".LocationReceiver"
android:exported="true">
<intent-filter>
<action android:name="com.huawei.hmssample.location.LocationBroadcastReceiver.ACTION_PROCESS_LOCATION" />
</intent-filter>
</receiver>
Now the next step is to add the UI for our MainActivity. In our application, we have one TextView to display the name of the current activity, an ImageView to show a corresponding image, and another TextView to display the possibility of the activity. We also have two Buttons to start and stop activity identification tracking. So, the activity_main.xml file looks something like this:
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#FAF0E6"
tools:context=".MainActivity">
<ImageView
android:id="@+id/ivDisplay"
android:layout_width="250dp"
android:layout_height="250dp"
android:layout_centerInParent="true"
android:scaleType="centerInside"
android:src="@drawable/ic_still" />
<TextView
android:id="@+id/tvidentity"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/ivDisplay"
android:layout_marginTop="5dp"
android:textStyle="bold"
android:textColor="#192841"
android:textSize="25sp"
android:layout_centerHorizontal="true"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<TextView
android:id="@+id/tvpossiblity"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/tvidentity"
android:textSize="20sp"
android:textColor="#192841"
android:layout_centerHorizontal="true"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:orientation="horizontal">
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/bStart"
android:layout_weight="1"
android:layout_margin="5dp"
android:text="Start Tracking"
android:textColor="@color/upsdk_white"
android:background="#192841"/>
<Button
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/bStop"
android:layout_margin="5dp"
android:layout_weight="1"
android:text="Stop Tracking"
android:textColor="@color/upsdk_white"
android:background="#192841"/>
</LinearLayout>
</RelativeLayout>
Now let’s create an instance of ActivityIdentificationService in the onCreate() method of MainActivity.java.
Java:
private PendingIntent mPendingIntent;
private ActivityIdentificationService identificationService;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
intializeTracker();
}
private void intializeTracker() {
identificationService = ActivityIdentification.getService(this);
mPendingIntent = obtainPendingIntent();
}
To obtain the PendingIntent object:
Java:
private PendingIntent obtainPendingIntent() {
Intent intent = new Intent(this, LocationReceiver.class);
intent.setAction(LocationReceiver.ACTION_NAME);
return PendingIntent.getBroadcast(this, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);
}
When the user clicks the Start Tracking button, we request activity identification updates by calling the createActivityIdentificationUpdates() method.
Java:
identificationService.createActivityIdentificationUpdates(5000, mPendingIntent)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
Log.i(TAG, "createActivityIdentificationUpdates onSuccess");
}
})
// Define callback for request failure.
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.e(TAG, "createActivityIdentificationUpdates onFailure:" + e.getMessage());
}
});
This method has two parameters: detectionIntervalMillis and pendingIntent, which indicate the detection interval (in milliseconds) and action to perform, respectively.
When the user clicks the Stop Tracking button, we stop activity identification updates.
Java:
identificationService.deleteActivityIdentificationUpdates(mPendingIntent)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
Log.i(TAG, "deleteActivityIdentificationUpdates onSuccess");
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.e(TAG, "deleteActivityIdentificationUpdates onFailure:" + e.getMessage());
}
});
Finally, we can get the activity identification result (containing the identity and possibility) from the intent received by the broadcast receiver.
Java:
public class LocationReceiver extends BroadcastReceiver {
public static final String ACTION_NAME = "com.huawei.hms.location.ACTION_PROCESS_LOCATION";
@Override
public void onReceive(Context context, Intent intent) {
if (intent != null) {
final String action = intent.getAction();
if (ACTION_NAME.equals(action)) {
// Obtains ActivityIdentificationResponse from extras of the intent sent by the activity identification service.
ActivityIdentificationResponse activityIdentificationResponse = ActivityIdentificationResponse.getDataFromIntent(intent);
if(activityIdentificationResponse!= null) {
List<ActivityIdentificationData> list = activityIdentificationResponse.getActivityIdentificationDatas();
ActivityIdentificationData identificationData = list.get(list.size() -1);
int identificationIdentity = identificationData.getIdentificationActivity();
int possibility = identificationData.getPossibility();
Intent i = new Intent("activityIdentificationReceiver");
i.putExtra("identity", identificationIdentity);
i.putExtra("possibility", possibility);
context.sendBroadcast(i);
}
}
}
}
}
The getActivityIdentificationDatas() API is used to obtain the list of activity identification results, sorted by most probable activity first.
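If you prefer not to rely on the ordering of that list, a small helper like the one below selects the entry with the highest possibility explicitly. This is an illustrative sketch, not part of the sample above; it only uses the getPossibility() accessor already shown in LocationReceiver:
Java:
// Illustrative helper: pick the identification entry with the highest possibility,
// independent of how the list happens to be ordered.
private ActivityIdentificationData getMostProbable(List<ActivityIdentificationData> list) {
    ActivityIdentificationData best = null;
    for (ActivityIdentificationData data : list) {
        if (best == null || data.getPossibility() > best.getPossibility()) {
            best = data;
        }
    }
    return best;
}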
We have created a Utils.java class to obtain the activity status from the identity code received in LocationReceiver.
Java:
public class Utils {
public static String getActivityIdentityName(int code) {
switch(code) {
case ActivityIdentificationData.VEHICLE:
return "VEHICLE";
case ActivityIdentificationData.BIKE:
return "BIKE";
case ActivityIdentificationData.FOOT:
return "FOOT";
case ActivityIdentificationData.STILL:
return "STILL";
case ActivityIdentificationData.OTHERS:
return "OTHERS";
case ActivityIdentificationData.WALKING:
return "WALKING";
case ActivityIdentificationData.RUNNING:
return "RUNNING";
default:
return "No Data Available";
}
}
public static int getActivityIdentityDrawableID(int code) {
switch(code) {
case ActivityIdentificationData.VEHICLE:
return R.drawable.ic_driving;
case ActivityIdentificationData.BIKE:
return R.drawable.ic_on_bicycle;
case ActivityIdentificationData.FOOT:
return R.drawable.ic_still;
case ActivityIdentificationData.STILL:
return R.drawable.ic_still;
case ActivityIdentificationData.OTHERS:
return R.drawable.ic_unknown;
case ActivityIdentificationData.WALKING:
return R.drawable.ic_walking;
case ActivityIdentificationData.RUNNING:
return R.drawable.ic_running;
default:
return R.drawable.ic_unknown;
}
}
}
Code snippet of MainActivity.java
Java:
public class MainActivity extends AppCompatActivity {
private static final String TAG = "MainActivity";
private ActivityConversionRequest request;
private Button bStart, bStop;
private TextView tvPossiblity, tvIdentity;
private ImageView ivDisplay;
private PendingIntent mPendingIntent;
private ActivityIdentificationService identificationService;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
intializeTracker();
bStart = findViewById(R.id.bStart);
bStop = findViewById(R.id.bStop);
tvIdentity = findViewById(R.id.tvidentity);
tvPossiblity = findViewById(R.id.tvpossiblity);
ivDisplay = findViewById(R.id.ivDisplay);
bStart.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
identificationService.createActivityIdentificationUpdates(5000, mPendingIntent)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
Log.i(TAG, "createActivityIdentificationUpdates onSuccess");
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.e(TAG, "createActivityIdentificationUpdates onFailure:" + e.getMessage());
}
});
}
});
bStop.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
identificationService.deleteActivityIdentificationUpdates(mPendingIntent)
.addOnSuccessListener(new OnSuccessListener<Void>() {
@Override
public void onSuccess(Void aVoid) {
Log.i(TAG, "deleteActivityIdentificationUpdates onSuccess");
}
})
.addOnFailureListener(new OnFailureListener() {
@Override
public void onFailure(Exception e) {
Log.e(TAG, "deleteActivityIdentificationUpdates onFailure:" + e.getMessage());
}
});
}
});
}
private void intializeTracker() {
identificationService = ActivityIdentification.getService(this);
mPendingIntent = obtainPendingIntent();
}
// Get PendingIntent associated with the custom static broadcast class LocationBroadcastReceiver.
private PendingIntent obtainPendingIntent() {
Intent intent = new Intent(this, LocationReceiver.class);
intent.setAction(LocationReceiver.ACTION_NAME);
return PendingIntent.getBroadcast(this, 0, intent, PendingIntent.FLAG_UPDATE_CURRENT);
}
@Override
protected void onResume() {
super.onResume();
IntentFilter filter = new IntentFilter();
filter.addAction("activityIdentificationReceiver");
registerReceiver(mIdentificationReceiver , filter);
}
@Override
protected void onPause() {
super.onPause();
try {
if(mIdentificationReceiver != null){
unregisterReceiver(mIdentificationReceiver);
}
} catch (Exception e) {
e.printStackTrace();
}
}
private BroadcastReceiver mIdentificationReceiver = new BroadcastReceiver(){
@Override
public void onReceive(Context context, Intent intent) {
int possibility = intent.getIntExtra("possibility", 0);
int identity = intent.getIntExtra("identity", 103);
tvIdentity.setText(Utils.getActivityIdentityName(identity));
tvPossiblity.setText("Possibility : " + String.valueOf(possibility));
ivDisplay.setImageResource(Utils.getActivityIdentityDrawableID(identity));
}
};
}
Tips and Tricks
1. At the time of writing this article, the activity identification service cannot identify cycling and riding activities on devices outside the Chinese mainland.
2. ACTIVITY_RECOGNITION is a dangerous permission and must be requested dynamically.
Conclusion
In this article, we have learnt how to use the Activity Identification Service in our application to determine the activities that users are doing at any given time. The Activity Identification Service determines the ongoing activities based on a possibility value that tells you which activity is currently taking place.
Hope you found this story useful and interesting.
Happy coding!
References
https://developer.huawei.com/consum...troduction-0000001050706106-V5?ha_source=hms1