Image classification uses transfer learning to perform minute-level training on hundreds of images in a specific field (such as vehicles or animals), starting from a base classification model with good generalization capability, and automatically generates a model for image classification. The generated model can then automatically identify the category an image belongs to. That is an auto-generated model. But what if we want to create our own image classification model?
With Huawei ML Kit this is possible. The AI Create function in HiAI Foundation provides the transfer learning capability for image classification. Using in-depth machine learning and model training, AI Create helps users accurately identify images. In this article we will create our own image classification model and develop an Android application using it. Let's start.
First of all, we need to meet some requirements to create our model:
You need a Huawei account to create a custom model. For more detail, click here.
You will need HMS Toolkit. In the Android Studio plugins marketplace, find HMS Toolkit and add the plugin to your Android Studio.
You will need Python on your computer. Install Python 3.7.5; MindSpore does not work with other versions.
The last requirement is the dataset for the model. You can use any dataset you want. I will use a flower dataset. You can find my dataset here.
Model Creation
Create a new project in Android Studio. Then click HMS at the top of the Android Studio screen and open Coding Assistant.
1- In the Coding Assistant screen, go to AI and then click AI Create. Set the following parameters, then click Confirm.
Operation type : Select New Model
Model Deployment Location : Select Deployment Cloud.
After you click Confirm, a browser window opens so you can log in to your Huawei account. After logging in, a window opens as below.
2- Drag or add the image classification folders to the Please select train image folder area, then set the output model file path and the training parameters. If you have extensive experience in deep learning development, you can modify the parameter settings to improve the accuracy of the image recognition model. After preparation, click Create Model to start training and generate an image classification model.
3- Training will then start. You can follow the process on the log screen:
4- After training completes successfully, you will see a screen like the one below:
On this screen you can see your model's training results, training parameters, and training dataset information. You can also provide some test data to check your model's accuracy. Here are the sample test results:
5- After confirming that the training model is available, you can choose to generate a demo project.
Generate Demo: HMS Toolkit automatically generates a demo project that integrates the trained model. You can directly build and run the demo project to generate an APK file, and run it on an emulator or a real device to check the image classification performance.
Using Model Without Generated Demo Project
If you want to use the model in your project you can follow the steps:
1- In your project, create an assets folder:
2- Then navigate to the folder path you chose in step 1 of Model Creation and find your model; its file extension will be ".ms". Copy your model into the assets folder. Next, we need one more file: create a txt file containing your model labels, and copy that file into the assets folder as well.
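The label file is a plain text file with one class name per line, in the same order as the classes the model was trained on. For the flower dataset used here it might look like the sketch below; the exact names are illustrative, so use the class folder names from your own training set:

```
daisy
dandelion
rose
sunflower
tulip
```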
3- Download and add the CustomModelHelper.kt file into your project. You can find the repository here:
https://github.com/iebayirli/AICreateCustomModel
Don't forget to change the package name of the CustomModelHelper class. After the ML Kit SDK is added, the errors in this class will be resolved.
4- After completing the steps above, we need to add the Huawei Maven repository to the project-level build.gradle file to get the ML Kit SDKs. Your Gradle file should look like this:
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
ext.kotlin_version = "1.3.72"
repositories {
google()
jcenter()
maven { url "https://developer.huawei.com/repo/" }
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
maven { url "https://developer.huawei.com/repo/" }
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
5- Next, we add the ML Kit SDKs to our app-level build.gradle. Don't forget to add the aaptOptions block; it keeps the .ms model file from being compressed. Your app-level build.gradle file should look like this:
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
android {
compileSdkVersion 30
buildToolsVersion "30.0.2"
defaultConfig {
applicationId "com.iebayirli.aicreatecustommodel"
minSdkVersion 26
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
kotlinOptions{
jvmTarget= "1.8"
}
aaptOptions {
noCompress "ms"
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
implementation 'androidx.core:core-ktx:1.3.2'
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.2'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
implementation 'com.huawei.hms:ml-computer-model-executor:2.0.3.301'
implementation 'mindspore:mindspore-lite:0.0.7.000'
def activity_version = "1.2.0-alpha04"
// Kotlin
implementation "androidx.activity:activity-ktx:$activity_version"
}
6- Let’s create the layout first:
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<androidx.constraintlayout.widget.Guideline
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/guideline1"
android:orientation="horizontal"
app:layout_constraintGuide_percent=".65"/>
<ImageView
android:id="@+id/ivImage"
android:layout_width="0dp"
android:layout_height="0dp"
app:layout_constraintTop_toTopOf="parent"
app:layout_constraintDimensionRatio="3:4"
android:layout_margin="16dp"
android:scaleType="fitXY"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintBottom_toBottomOf="@+id/guideline1"/>
<TextView
android:id="@+id/tvResult"
android:layout_width="0dp"
android:layout_height="0dp"
android:layout_margin="16dp"
android:autoSizeTextType="uniform"
android:background="@android:color/white"
android:autoSizeMinTextSize="12sp"
android:autoSizeMaxTextSize="36sp"
android:autoSizeStepGranularity="2sp"
android:gravity="center_horizontal|center_vertical"
app:layout_constraintTop_toTopOf="@+id/guideline1"
app:layout_constraintBottom_toTopOf="@+id/btnRunModel"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"/>
<Button
android:id="@+id/btnRunModel"
android:layout_width="0dp"
android:layout_height="wrap_content"
android:text="Pick Image and Run"
android:textAllCaps="false"
android:background="#ffd9b3"
android:layout_marginBottom="8dp"
app:layout_constraintWidth_percent=".75"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"/>
</androidx.constraintlayout.widget.ConstraintLayout>
7- Then let's define the constant values in our activity. We create four values: the first is for the storage permission, and the others describe our model. Your code should look like this:
Code:
companion object {
const val readExternalPermission = android.Manifest.permission.READ_EXTERNAL_STORAGE
const val modelName = "flowers"
const val modelFullName = "flowers" + ".ms"
const val labelName = "labels.txt"
}
8- Then we create the CustomModelHelper instance, indicating the model's information and where to load the model from:
Code:
private val customModelHelper by lazy {
CustomModelHelper(
this,
modelName,
modelFullName,
labelName,
LoadModelFrom.ASSETS_PATH
)
}
9- Next, we create two ActivityResultLauncher instances, one for the gallery permission and one for image picking, using the Activity Result API:
Code:
private val galleryPermission =
registerForActivityResult(ActivityResultContracts.RequestPermission()) {
if (!it)
finish()
}
private val getContent =
registerForActivityResult(ActivityResultContracts.GetContent()) {
val inputBitmap = MediaStore.Images.Media.getBitmap(
contentResolver,
it
)
ivImage.setImageBitmap(inputBitmap)
customModelHelper.exec(inputBitmap, onSuccess = { str ->
tvResult.text = str
})
}
In the getContent instance, we convert the selected Uri to a bitmap and call the CustomModelHelper exec() method. If the process finishes successfully, we update the TextView.
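One caveat: MediaStore.Images.Media.getBitmap is deprecated since API 29. A minimal sketch of an alternative, assuming the same content Uri from the picker, is to decode through ImageDecoder on newer devices (the helper name decodeUri is hypothetical):

```kotlin
import android.graphics.Bitmap
import android.graphics.ImageDecoder
import android.net.Uri
import android.os.Build
import android.provider.MediaStore

// Decodes the picked content Uri into a Bitmap.
// Uses ImageDecoder on API 28+ and falls back to the deprecated call otherwise.
private fun decodeUri(uri: Uri): Bitmap =
    if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.P) {
        ImageDecoder.decodeBitmap(ImageDecoder.createSource(contentResolver, uri))
    } else {
        @Suppress("DEPRECATION")
        MediaStore.Images.Media.getBitmap(contentResolver, uri)
    }
```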
10- After creating the instances, the only thing left to do is launch the ActivityResultLauncher instances in onCreate():
Code:
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
galleryPermission.launch(readExternalPermission)
btnRunModel.setOnClickListener {
getContent.launch(
"image/*"
)
}
}
11- Let's bring all the pieces together. Here is our MainActivity:
Code:
package com.iebayirli.aicreatecustommodel
import android.os.Bundle
import android.provider.MediaStore
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import kotlinx.android.synthetic.main.activity_main.*
class MainActivity : AppCompatActivity() {
private val customModelHelper by lazy {
CustomModelHelper(
this,
modelName,
modelFullName,
labelName,
LoadModelFrom.ASSETS_PATH
)
}
private val galleryPermission =
registerForActivityResult(ActivityResultContracts.RequestPermission()) {
if (!it)
finish()
}
private val getContent =
registerForActivityResult(ActivityResultContracts.GetContent()) {
val inputBitmap = MediaStore.Images.Media.getBitmap(
contentResolver,
it
)
ivImage.setImageBitmap(inputBitmap)
customModelHelper.exec(inputBitmap, onSuccess = { str ->
tvResult.text = str
})
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
galleryPermission.launch(readExternalPermission)
btnRunModel.setOnClickListener {
getContent.launch(
"image/*"
)
}
}
companion object {
const val readExternalPermission = android.Manifest.permission.READ_EXTERNAL_STORAGE
const val modelName = "flowers"
const val modelFullName = "flowers" + ".ms"
const val labelName = "labels.txt"
}
}
Summary
In summary, we learned how to create a custom image classification model, using HMS Toolkit for model training. After training and creating the model, we learned how to use it in our application. If you want more information about Huawei ML Kit, you can find it here.
Here is the output:
https://github.com/iebayirli/AICreateCustomModel
Hello everyone.
This article is about Huawei Map Kit and Site Kit. I will explain how to use Site Kit together with the map. First, I would like to give some details about Map Kit and Site Kit.
Map Kit provides an SDK for map development. It covers map data for more than 200 countries and regions, and supports dozens of languages. With this SDK, you can easily integrate map-based functions into your apps. Map Kit is supported only on Huawei devices. With Map Kit, you can add markers and shapes to your custom map; it also provides camera movements and two different map types.
Site Kit provides the core capabilities you need to quickly build apps that let your users explore the world around them. With Site Kit, you can search for places and get nearby places. Site Kit not only lists places but also provides place details.
Development Preparation
Step 1 : Register as Developer
Firstly, you have to register as developer on AppGallery Connect and create an app. You can find the guide of registering as a developer here :
Step 2 : Generating a Signing Certificate Fingerprint
Firstly, create a new project in Android Studio. Secondly, click the Gradle tab on the right of the screen. Finally, click Tasks > android > signingReport, and you will see your project's SHA-256 key in the console.
Copy this fingerprint, go to AppGallery Console > My Apps > select your app > Project Settings, and paste it into the "SHA-256 certificate fingerprint" field. Don't forget to click the tick on the right.
Step 3 : Enabling Required Services
In HUAWEI Developer AppGallery Connect, go to Develop > Overview > Manage APIs.
Enable Huawei Map Kit and Site Kit on this page.
Step 4 : Download agconnect-services.json
Go to Develop > Overview > App information. Click agconnect-services.json to download the configuration file. Copy the agconnect-services.json file to the app root directory.
Step 5: Adding Dependencies
Open the build.gradle file in the root directory of your Android Studio project. Go to buildscript > repositories and allprojects > repositories, and configure the Maven repository address for the HMS SDK.
Code:
buildscript {
repositories {
maven { url 'http://developer.huawei.com/repo/' }
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:3.6.2'
classpath 'com.huawei.agconnect:agcp:1.2.1.301'
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'http://developer.huawei.com/repo/'}
}
}
Add dependencies in build.gradle file in app directory.
Code:
dependencies {
    implementation 'com.huawei.hms:maps:4.0.1.302' //For Map Kit
    implementation 'com.huawei.hms:site:4.0.2.301' //For Site Kit
    implementation 'com.jakewharton:butterknife:10.1.0' //The Butterknife library is optional.
    annotationProcessor 'com.jakewharton:butterknife-compiler:10.1.0'
}
Add the AppGallery Connect plug-in dependency to the file header.
Code:
apply plugin: 'com.huawei.agconnect'
Configure the signature in android. Copy the signature file generated in Generating a Signing Certificate Fingerprint to the app directory of your project and configure the signature in the build.gradle file.
Code:
android {
signingConfigs {
release {
storeFile file("**.**") //Signing certificate.
storePassword "******" //Keystore password.
keyAlias "******" //Alias.
keyPassword "******" //Key password.
v2SigningEnabled true
}
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
signingConfig signingConfigs.release
}
debug {
signingConfig signingConfigs.release}
}
}
}
Open the modified build.gradle file again. You will find a Sync Now link in the upper right corner of the page. Click Sync Now and wait until synchronization is complete.
Step 6: Adding Permissions
To call capabilities of HUAWEI Map Kit, you must apply for the following permissions for your app:
Code:
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="com.huawei.appmarket.service.commondata.permission.GET_COMMON_DATA"/>
To obtain the current device location, you need to add the following permissions in the AndroidManifest file. In Android 6.0 and later, you need to apply for these permissions dynamically.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
Development Process
Step 1 : Create Fragment XML File
I used a fragment for this app, but you can also do this in an activity. First, you have to create an XML file for the page design. My page looks like this screenshot.
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical">
<ScrollView
android:layout_width="match_parent"
android:layout_height="wrap_content">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
android:layout_marginBottom="10dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:layout_marginTop="10dp">
<EditText
android:id="@+id/editText_search"
android:layout_width="0dp"
android:layout_height="35dp"
android:layout_weight="1"
android:textSize="15dp"
android:hint="Search..."
android:background="@drawable/input_model_selected"
android:fontFamily="@font/muli_regular"
android:inputType="textEmailAddress"
android:layout_marginTop="10dp"
android:layout_marginRight="25dp"
android:layout_marginLeft="25dp"/>
<Button
android:id="@+id/btn_search"
android:layout_width="0dp"
android:layout_height="35dp"
android:layout_marginTop="10dp"
android:background="@drawable/orange_background"
android:layout_weight="1"
android:layout_marginRight="5dp"
android:onClick="search"
android:layout_marginLeft="5dp"
android:text="Search"
android:textAllCaps="false" />
</LinearLayout>
</LinearLayout>
</ScrollView>
<com.huawei.hms.maps.MapView
android:id="@+id/mapview_mapviewdemo"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:cameraTargetLat="48.893478"
map:cameraTargetLng="2.334595"
map:cameraZoom="10" />
</LinearLayout>
Step 2 : Create Java File and Implement CallBacks
Now, create a Java class called MapFragment and have it implement OnMapReadyCallback. After this, override the onMapReady method in your class.
Secondly, you have to bind the MapView, EditText, and search Button. The map object, permissions, and search service should also be defined.
Code:
View rootView;
@BindView(R.id.mapview_mapviewdemo)
MapView mMapView;
@BindView(R.id.editText_search)
EditText editText_search;
@BindView(R.id.btn_search)
Button btn_search;
private HuaweiMap hMap;
private static final String[] RUNTIME_PERMISSIONS = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.INTERNET
};
private static final int REQUEST_CODE = 100;
//Site Kit
private SearchService searchService;
Step 3 : onCreateView and onMapReady Methods
First of all, the XML should be bound. onCreateView should start with view binding and end with returning the view; all code should be written between the view-binding and return lines.
Secondly, permissions should be checked. For this, add a hasPermissions method like this:
Code:
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission) != PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
Thirdly, an onClick event should be defined for the search button like this:
Code:
btn_search.setOnClickListener(this);
Next, the search service should be created. When creating the search service, the API key should be given as a parameter. You can access your API key from the console.
Code:
searchService = SearchServiceFactory.create(getContext(), "API KEY HERE");
Finally, mapView should be created like this :
Code:
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
    mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey");
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
The onMapReady method receives a map object. With this object you can control the camera zoom and the current location. If you want to zoom in on a specific coordinate when the page opens, call moveCamera. And if you want to show the current location on the map, add hMap.setMyLocationEnabled(true);
All of onCreateView should be like this :
Code:
@Nullable
@Override
public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) {
rootView = inflater.inflate(R.layout.fragment_map, container, false);
ButterKnife.bind(this,rootView);
if (!hasPermissions(getContext(), RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(getActivity(), RUNTIME_PERMISSIONS, REQUEST_CODE);
}
btn_search.setOnClickListener(this);
searchService = SearchServiceFactory.create(getContext(), "API KEY HERE");
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey");
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
return rootView;
}
@Override
public void onMapReady(HuaweiMap huaweiMap) {
    hMap = huaweiMap;
    // latLng2 is the coordinate to focus on initially, e.g. new LatLng(48.893478, 2.334595)
    hMap.moveCamera(CameraUpdateFactory.newLatLngZoom(latLng2, 13));
    hMap.setMyLocationEnabled(true);
}
Finally, the map is ready. Now we will search with Site Kit and show the results as markers on the map.
Step 4 : Make Search Request And Show On Map As Marker
Now, a new method should be created for searching. TextSearchRequest object and a specific coordinate must be created within the search method. These coordinates will be centered while searching.
Coordinates and keywords should be set on the TextSearchRequest object.
In the onSearchResult method, you have to clear the map first, so that markers from previous searches don't remain alongside the new results. Then create new StringBuilder and AddressDetail objects.
Iterate over all results with a for loop; inside the loop the markers are created. Some parameters should be set while creating each marker. A sample marker can be defined as follows.
Code:
hMap.addMarker(new MarkerOptions()
    .position(new LatLng(site.getLocation().getLat(), site.getLocation().getLng())) // Location
    .title(site.getName()) // Marker title
    .snippet(site.getFormatAddress())); // Shown when the marker is clicked.
All of the search method should be like this :
Code:
public void search(){
    TextSearchRequest textSearchRequest = new TextSearchRequest();
    // currentLat and currentLng hold the coordinates to center the search on.
    Coordinate location = new Coordinate(currentLat, currentLng);
    textSearchRequest.setQuery(editText_search.getText().toString());
    textSearchRequest.setLocation(location);
    searchService.textSearch(textSearchRequest, new SearchResultListener<TextSearchResponse>() {
        @Override
        public void onSearchResult(TextSearchResponse textSearchResponse) {
            hMap.clear();
            StringBuilder response = new StringBuilder("\n");
            response.append("success\n");
            int count = 1;
            AddressDetail addressDetail;
            for (Site site : textSearchResponse.getSites()){
                addressDetail = site.getAddress();
                response.append(String.format(
                    "[%s] name: %s, formatAddress: %s, country: %s, countryCode: %s \r\n",
                    "" + (count++), site.getName(), site.getFormatAddress(),
                    (addressDetail == null ? "" : addressDetail.getCountry()),
                    (addressDetail == null ? "" : addressDetail.getCountryCode())));
                hMap.addMarker(new MarkerOptions()
                    .position(new LatLng(site.getLocation().getLat(), site.getLocation().getLng()))
                    .title(site.getName())
                    .snippet(site.getFormatAddress()));
            }
            Log.d("SEARCH RESULTS", "search result is : " + response);
        }
        @Override
        public void onSearchError(SearchStatus searchStatus) {
            Log.e("SEARCH RESULTS", "onSearchError is: " + searchStatus.getErrorCode());
        }
    });
}
Now your app that searches the map using Site Kit is ready. Using the other features of Map Kit, you can create a more advanced, more specific application: you can get directions on the map, or draw lines on it. Map Kit has many more features; you can examine all of them at the link below and add them to your project.
Good job
Can we customize search list.
sujith.e said:
Can we customize search list.
Sure, we can customize the search parameters. You can add
textSearchRequest.setPoiType(LocationType.HOSPITAL);
in the search() method. The LocationType enum includes many different types, for example museums, hospitals, banks, art galleries, airports, ATMs, bus stations, gyms, etc.
Can I share my live location to other peoples?
Can we show customized view on the marker click?
ProManojKumar said:
Can I share my live location to other peoples?
Unfortunately, Site Kit and Map Kit don't let you share your live location with other people. If you want to share your live location, you have to create a background location service and build a server-client system.
ask011 said:
Can we show customized view on the marker click?
Sure, you can create a customized view for the marker click action. Create a view and set it in the OnMarkerClickListener callback.
Hi everyone!
Today I will walk through how to implement a 3D scene to display objects and play sounds in your Android Kotlin projects.
All we need is Android Studio version 3.5 or higher and a smartphone running Android 4.4 or later. The kits we need require the following specifications at minimum:
For Scene Kit alone:
JDK version: 1.7 or later
minSdkVersion: 19 or later
targetSdkVersion: 19 or later
compileSdkVersion: 19 or later
Gradle version: 3.5 or later
For Audio Kit alone:
JDK version: 1.8.211 or later
minSdkVersion: 21
targetSdkVersion: 29
compileSdkVersion: 29
Gradle version: 4.6 or later
That means we should build against Audio Kit's minimum requirements, since Scene Kit's are lower. Keep those values in mind while configuring the project. Let's begin by implementing Scene Kit.
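The combined minimums above can be sketched in the app-level build.gradle like this; the applicationId shown is the package name used later in this guide, and yours will differ:

```gradle
android {
    compileSdkVersion 29
    defaultConfig {
        applicationId "com.example.sceneaudiodemo"
        minSdkVersion 21    // Audio Kit needs 21; Scene Kit only needs 19
        targetSdkVersion 29 // Audio Kit requirement
    }
}
```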
First of all, our aim with this Scene Kit implementation is to achieve a view of a 3D object that we can interact with, like this:
We will also add multiple objects and be able to cycle through them. To use Scene Kit in your project, start by adding these implementations to your build.gradle files.
Code:
buildscript {
repositories {
...
maven { url 'https://developer.huawei.com/repo/' }
}
...
}
allprojects {
repositories {
...
maven { url 'https://developer.huawei.com/repo/' }
}
}
Code:
dependencies {
...
implementation 'com.huawei.scenekit:full-sdk:5.1.0.300'
}
Note that in this project I have used Android's view binding feature to skip boilerplate view-initialization code. If you want to use view binding, add this small block to your app-level build.gradle.
Code:
android {
...
buildFeatures {
viewBinding true
}
...
}
After syncing the Gradle files, we are ready to use Scene Kit in our project. Keep in mind that our purpose here is solely to display 3D objects the user can interact with, but Scene Kit provides much deeper capabilities. If you are looking for something different or want to discover all of its abilities, follow the link. Otherwise, let's continue with a custom scene view.
Scene Kit - HMS Core - HUAWEI Developer
The simple purpose of this custom view is to load our first object automatically once the view finishes initializing. Of course, you can skip this part if you don't need that behavior; in that case, use the default SceneView and load the objects manually instead. You can still find the code for loading objects in this snippet.
Code:
import android.content.Context
import android.util.AttributeSet
import android.view.SurfaceHolder
import com.huawei.hms.scene.sdk.SceneView
class CustomSceneView : SceneView {
constructor(context: Context?) : super(context)
constructor(
context: Context?,
attributeSet: AttributeSet?
) : super(context, attributeSet)
override fun surfaceCreated(holder: SurfaceHolder) {
super.surfaceCreated(holder)
loadScene("car1/scene.gltf")
loadSpecularEnvTexture("car1/specularEnvTexture.dds")
loadDiffuseEnvTexture("car1/diffuseEnvTexture.dds")
}
}
We cannot display anything before adding the object files to our project. You will need to obtain object files elsewhere, as the object models I use are not my creation. You can find public objects with 'gltf object' queries in search engines. Once you have your object, head to your project files, create an 'assets' folder under '../src/main/', and place your object files there. In my case:
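Based on the file paths used in this guide's code, the assets folder would be laid out roughly like this (car2 and car3 follow the same pattern):

```
app/src/main/assets/
└── car1/
    ├── scene.gltf
    ├── specularEnvTexture.dds
    └── diffuseEnvTexture.dds
```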
In the surfaceCreated method, the loadScene(), loadSpecularEnvTexture(), and loadDiffuseEnvTexture() methods load our object. Once the view surface is created, our first object is loaded into it. Now head to the XML where your 3D objects will be displayed, in this case activity_main.xml, and add the custom view we just created. I have also added simple arrows to navigate between models.
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<com.example.sceneaudiodemo.CustomSceneView
android:id="@+id/csv_main"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
<ImageView
android:id="@+id/iv_rightArrow"
android:layout_width="32dp"
android:layout_height="32dp"
android:layout_margin="12dp"
android:src="@drawable/ic_arrow"
android:tint="@color/white"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<ImageView
android:id="@+id/iv_leftArrow"
android:layout_width="32dp"
android:layout_height="32dp"
android:layout_margin="12dp"
android:rotation="180"
android:src="@drawable/ic_arrow"
android:tint="@color/white"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
Now we are all set for our object to be displayed once the app is launched. Let's add a few other objects and navigate between them. In MainActivity:
Code:
private lateinit var binding: ActivityMainBinding
private var selectedId = 0
private val modelSceneList = arrayListOf(
"car1/scene.gltf",
"car2/scene.gltf",
"car3/scene.gltf"
)
private val modelSpecularList = arrayListOf(
"car1/specularEnvTexture.dds",
"car2/specularEnvTexture.dds",
"car3/specularEnvTexture.dds"
)
private val modelDiffList = arrayListOf(
"car1/diffuseEnvTexture.dds",
"car2/diffuseEnvTexture.dds",
"car3/diffuseEnvTexture.dds"
)
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
binding = ActivityMainBinding.inflate(layoutInflater)
val view = binding.root
setContentView(view)
binding.ivRightArrow.setOnClickListener {
if (modelSceneList.size == 0) return@setOnClickListener
selectedId = (selectedId + 1) % modelSceneList.size // Keep the id within the range of our model list
loadImage()
}
binding.ivLeftArrow.setOnClickListener {
if (modelSceneList.size == 0) return@setOnClickListener
if (selectedId == 0) selectedId = modelSceneList.size - 1 // Keep the id within the range of our model list
else selectedId -= 1
loadImage()
}
}
private fun loadImage() {
binding.csvMain.loadScene(modelSceneList[selectedId])
binding.csvMain.loadSpecularEnvTexture(modelSpecularList[selectedId])
binding.csvMain.loadDiffuseEnvTexture(modelDiffList[selectedId])
}
In onCreate(), we implement simple next/previous logic to switch between objects, and we store the objects' file paths as strings in separate hardcoded lists. You may want to make this dynamic, but I kept it simple for the guide. 'selectedId' keeps track of the object currently being displayed.
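The wrap-around index logic from the arrow click listeners above can be pulled out into pure functions and checked on its own. This is just a sketch for clarity; the function names are mine, not part of the guide's code:

```kotlin
// Next index wraps from the last model back to the first.
fun nextIndex(current: Int, size: Int): Int = (current + 1) % size

// Previous index wraps from the first model back to the last.
fun previousIndex(current: Int, size: Int): Int =
    if (current == 0) size - 1 else current - 1

fun main() {
    println(nextIndex(2, 3))     // wraps 2 -> 0
    println(previousIndex(0, 3)) // wraps 0 -> 2
}
```

With a 3-item model list, clicking "next" on the last model returns to index 0, and clicking "previous" on the first model jumps to index 2, matching the listeners above.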
And that’s all for SceneView implementation for 3D object views!
Now no time to waste, let’s continue with adding Audio Kit.
Head back to the app-level build.gradle and add Audio Kit implementation.
Code:
dependencies {
...
implementation 'com.huawei.hms:audiokit-player:1.1.0.300'
...
}
As we already added necessary repository while implementing Scene Kit, we won’t need to make any changes in the project-level build.gradle. So let’s go back and complete Audio Kit.
I added a simple play button to activity_main.xml.
Code:
<Button
android:id="@+id/btn_playSound"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Play"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toStartOf="parent" />
I will use this button to play the sound for the current object. Afterwards, all we need to do is make these changes in MainActivity.
Code:
private var mHwAudioManager: HwAudioManager? = null
private var mHwAudioPlayerManager: HwAudioPlayerManager? = null
override fun onCreate(savedInstanceState: Bundle?) {
...
initPlayer(this)
binding.btnPlaySound.setOnClickListener {
mHwAudioPlayerManager?.play(selectedId) // Requires playlist to play, selectedId works for index to play.
}
...
}
private fun initPlayer(context: Context) {
val hwAudioPlayerConfig = HwAudioPlayerConfig(context)
HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig,
object : HwAudioConfigCallBack {
override fun onSuccess(hwAudioManager: HwAudioManager?) {
try {
mHwAudioManager = hwAudioManager
mHwAudioPlayerManager = hwAudioManager?.playerManager
mHwAudioPlayerManager?.playList(getPlaylist(), 0, 0)
} catch (ex: Exception) {
ex.printStackTrace()
}
}
override fun onError(p0: Int) {
Log.e("init:onError: ","$p0")
}
})
}
fun getPlaylist(): List<HwAudioPlayItem>? {
val playItemList: MutableList<HwAudioPlayItem> = ArrayList()
val audioPlayItem1 = HwAudioPlayItem()
val sound = Uri.parse("android.resource://yourpackagename/raw/soundfilename").toString() // soundfilename should not include file extension.
audioPlayItem1.audioId = "1000"
audioPlayItem1.singer = "Taoge"
audioPlayItem1.onlinePath =
"https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-chengshilvren.mp3"
audioPlayItem1.setOnline(1)
audioPlayItem1.audioTitle = "chengshilvren"
playItemList.add(audioPlayItem1)
val audioPlayItem2 = HwAudioPlayItem()
audioPlayItem2.audioId = "1001"
audioPlayItem2.singer = "Taoge"
audioPlayItem2.onlinePath =
"https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-dayu.mp3"
audioPlayItem2.setOnline(1)
audioPlayItem2.audioTitle = "dayu"
playItemList.add(audioPlayItem2)
val audioPlayItem3 = HwAudioPlayItem()
audioPlayItem3.audioId = "1002"
audioPlayItem3.singer = "Taoge"
audioPlayItem3.onlinePath =
"https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-wangge.mp3"
audioPlayItem3.setOnline(1)
audioPlayItem3.audioTitle = "wangge"
playItemList.add(audioPlayItem3)
return playItemList
}
After making these changes, we can play sounds in our project. I used online sounds here; if you want to use sounds bundled in your project instead, use the ‘sound’ variable I gave an example of, change ‘audioPlayItem.setOnline(1)’ to ‘audioPlayItem.setOnline(0)’, and replace ‘audioPlayItem.onlinePath’ with ‘audioPlayItem.filePath’. Then you can play imported sound files too. By the way, that's all for Audio Kit as well! We didn't need play/pause or seekbar features, since we only want to play a sound and be done with it.
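For the local-file variant, the resource URI follows the "android.resource://<package>/raw/<name>" pattern shown in the ‘sound’ variable above. A minimal sketch of building that path (the package and file names here are hypothetical examples, not from the project):

```kotlin
// Builds the URI string for a sound bundled under res/raw.
// Note: rawName must not include the file extension, as mentioned above.
fun localSoundPath(packageName: String, rawName: String): String =
    "android.resource://$packageName/raw/$rawName"

fun main() {
    println(localSoundPath("com.example.sceneaudiodemo", "enginesound"))
}
```

The resulting string can then be assigned to ‘audioPlayItem.filePath’ when ‘setOnline(0)’ is used.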
That completes our guide on implementing a Scene Kit 3D SceneView and Audio Kit to play sounds in our projects, in Kotlin. If you have any questions or suggestions, feel free to contact me. Thanks for reading this far, and I hope it was useful for you!
References
Scene Kit - HMS Core - HUAWEI Developer
Audio Kit - Audio Development Component - HUAWEI Developer
Can we install in non-huawei devices will it support?
sujith.e said:
Can we install in non-huawei devices will it support?
For Scene Kit compatibility, our options are as follows:
As for the Audio Kit, only Huawei devices are supported, referenced from: https://developer.huawei.com/consum.../HMSCore-Guides/introduction-0000001050749665
Introduction
In this article, we will learn how to capture bills as text images using this Money Management app. The app improves the visibility of the captured image by zooming in on the text. So, whenever the user makes a purchase, they can capture the bill using this application and save it to memory.
I will provide a series of articles on this Money Management app; in upcoming articles I will integrate other Huawei kits.
If you are new to this application, follow my previous articles.
Beginner: Find the introduction Sliders and Huawei Account Kit Integration in Money Management Android app (Kotlin) - Part 1
Beginner: Integration of Huawei Ads Kit and Analytics Kit in Money Management Android app (Kotlin) – Part 2
Beginner: Manage the Budget using Room Database in Money Management Android app (Kotlin) – Part 3
ML Kit - Text Image Super-Resolution
Text Image Super-Resolution is a feature of Huawei ML Kit that improves the quality and visibility of old or blurred text in an image. When you photograph a document from far away or cannot adjust the focus properly, the text may not be clear. In this situation, the feature can zoom in on an image containing text by up to three times, significantly improving the definition of the text.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, Jdk 1.8, SDK platform 26 and Gradle 4.6 and above installed.
4. Minimum API Level 19 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the user created name.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, copy and paste in android Project under app directory, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Click Manage APIs tab and enable ML Kit.
9. Add the below Maven URL under the repositories of buildscript and allprojects, and the classpath under the dependencies of buildscript, in the build.gradle (Project) file; refer to Add Configuration.
Java:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
10. Add the below plugin and dependencies in build.gradle(Module) file.
Java:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
// Import the text image super-resolution base SDK.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution:2.0.4.300'
// Import the text image super-resolution model package.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution-model:2.0.4.300'
11. Now Sync the gradle.
12. Add the required permission to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Let us move to development
I have created a project on Android studio with empty activity let us start coding.
In the CaptureActivity.kt we can find the business logic.
Java:
class CaptureActivity : AppCompatActivity(), View.OnClickListener {
private var analyzer: MLTextImageSuperResolutionAnalyzer? = null
private val QUALITY = 1
private val ORIGINAL = 2
private var imageView: ImageView? = null
private var srcBitmap: Bitmap? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_capture)
imageView = findViewById(R.id.bill)
srcBitmap = BitmapFactory.decodeResource(resources, R.drawable.bill_1)
findViewById<View>(R.id.btn_quality).setOnClickListener(this)
findViewById<View>(R.id.btn_original).setOnClickListener(this)
createAnalyzer()
}
// Find the on click listeners
override fun onClick(v: View?) {
if (v!!.id == R.id.btn_quality) {
detectImage(QUALITY)
} else if (v.id == R.id.btn_original) {
detectImage(ORIGINAL)
}
}
private fun release() {
if (analyzer == null) {
return
}
analyzer!!.stop()
}
// Find the method to detect bills or text images
private fun detectImage(type: Int) {
if (type == ORIGINAL) {
setImage(srcBitmap!!)
return
}
if (analyzer == null) {
return
}
// Create an MLFrame by using the bitmap.
val frame = MLFrame.Creator().setBitmap(srcBitmap).create()
val task = analyzer!!.asyncAnalyseFrame(frame)
task.addOnSuccessListener { result -> // success.
Toast.makeText(applicationContext, "Success", Toast.LENGTH_LONG).show()
setImage(result.bitmap)
}.addOnFailureListener { e ->
// Failure
if (e is MLException) {
val mlException = e
// Get the error code, developers can give different page prompts according to the error code.
val errorCode = mlException.errCode
// Get the error message, developers can combine the error code to quickly locate the problem.
val errorMessage = mlException.message
Toast.makeText(applicationContext,"Error:$errorCode Message:$errorMessage", Toast.LENGTH_LONG).show()
// Log.e(TAG, "Error:$errorCode Message:$errorMessage")
} else {
// Other exception
Toast.makeText(applicationContext, "Failed:" + e.message, Toast.LENGTH_LONG).show()
// Log.e(TAG, e.message!!)
}
}
}
private fun setImage(bitmap: Bitmap) {
runOnUiThread {
imageView!!.setImageBitmap(bitmap)
}
}
private fun createAnalyzer() {
analyzer = MLTextImageSuperResolutionAnalyzerFactory.getInstance().textImageSuperResolutionAnalyzer
}
override fun onDestroy() {
super.onDestroy()
if (srcBitmap != null) {
srcBitmap!!.recycle()
}
release()
}
}
In the activity_capture.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".mlkit.CaptureActivity">
<LinearLayout
android:id="@+id/buttons"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:orientation="vertical"
tools:ignore="MissingConstraints">
<Button
android:id="@+id/btn_quality"
android:layout_width="match_parent"
android:layout_height="50dp"
android:layout_margin="15dp"
android:gravity="center"
android:textSize="19sp"
android:text="Quality"
android:textAllCaps="false"
android:textColor="@color/Red"
tools:ignore="HardcodedText" />
<Button
android:id="@+id/btn_original"
android:layout_width="match_parent"
android:layout_height="50dp"
android:layout_margin="15dp"
android:gravity="center"
android:text="Original"
android:textSize="19sp"
android:textAllCaps="false"
android:textColor="@color/Red"
tools:ignore="HardcodedText" />
</LinearLayout>
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@+id/buttons"
android:layout_marginBottom="15dp">
<ImageView
android:id="@+id/bill"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:layout_gravity="center"
tools:ignore="ObsoleteLayoutParam" />
</ScrollView>
</RelativeLayout>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 19 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learnt about the Text Image Super-Resolution feature of Huawei ML Kit and its functionality. It improves the quality and visibility of old or blurred text on an image, and can zoom in on an image containing text by up to three times, significantly improving the definition of the text.
Reference
ML Kit – Documentation
ML Kit – Training Video
Introduction
In this article, we will learn how to handle clicks on RecyclerView items to navigate to different activities in a Quiz app. The items are laid out in a grid so that each one can be clicked. I will provide a series of articles on this Quiz app in upcoming posts.
If you are new to this application, follow my previous articles.
https://forums.developer.huawei.com/forumPortal/en/topic/0202877278014350004
https://forums.developer.huawei.com/forumPortal/en/topic/0201884103719030016?fid=0101187876626530001
https://forums.developer.huawei.com/forumPortal/en/topic/0202890333711770040
Requirements
1. Any operating system (MacOS, Linux, and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, Jdk 1.8, SDK platform 26 and Gradle 4.6 and above installed.
4. Minimum API Level 24 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification on the Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the user created name.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, copy and paste in android Project under app directory, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Add the below Maven URL under the repositories of buildscript and allprojects, and the classpath under the dependencies of buildscript, in the build.gradle (Project) file; refer to Add Configuration.
Java:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.6.0.300'
9. Add the below plugin and dependencies in build.gradle(Module) file.
Java:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.0.300'
// Recyclerview
implementation 'androidx.recyclerview:recyclerview:1.2.1'
// Lifecycle components
implementation "androidx.lifecycle:lifecycle-extensions:2.2.0"
10. Now Sync the gradle.
Let us move to development
I have created a project on Android studio with empty activity let us start coding.
In Home.kt we can find the business logic for the item click listeners.
Java:
class Home : AppCompatActivity(), HomeAdapter.ItemListener {
private lateinit var recyclerView: RecyclerView
private lateinit var arrayList: ArrayList<QuesIcons>
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_home)
recyclerView = findViewById(R.id.recyclerview_list)
arrayList = ArrayList()
arrayList.add(QuesIcons("Android", R.drawable.android_icon, "#09A9FF"))
arrayList.add(QuesIcons("HMS", R.drawable.huawei_icon, "#3E51B1"))
arrayList.add(QuesIcons("Sports", R.drawable.sports_icon, "#673BB7"))
arrayList.add(QuesIcons("Country Flags", R.drawable.flags_icon, "#4BAA50"))
val adapter = HomeAdapter(applicationContext, arrayList, this)
recyclerView.adapter = adapter
recyclerView.layoutManager = GridLayoutManager(this, 2)
recyclerView.setHasFixedSize(true)
}
override fun onItemClick(item: Int) {
when (item) {
0 -> startActivity(Intent(this@Home, AndroidActivity::class.java))
1 -> startActivity(Intent(this@Home, HMSActivity::class.java))
2 -> startActivity(Intent(this@Home, SportsActivity::class.java))
3 -> startActivity(Intent(this@Home, QuizActivity::class.java))
}
}
}
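The when block in onItemClick maps each grid position to a destination Activity. As a hedged sketch of the same mapping in table form (the strings below are stand-ins for the Activity classes above, since Intents need an Android runtime):

```kotlin
// Position-to-destination lookup mirroring the when block in onItemClick.
val routes = listOf("AndroidActivity", "HMSActivity", "SportsActivity", "QuizActivity")

// getOrNull guards against positions outside the list, unlike a bare index.
fun routeFor(position: Int): String? = routes.getOrNull(position)

fun main() {
    println(routeFor(0)) // first grid item
    println(routeFor(4)) // out of range -> null
}
```

A list-based lookup like this can replace a growing when block once the grid has many items, at the cost of keeping the list order in sync with the adapter data.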
In HomeAdapter.kt we can find the business logic that binds the adapter items.
Java:
class HomeAdapter(private val mContext: Context, private val mValues: ArrayList<QuesIcons>, private var mListener: ItemListener?) :
RecyclerView.Adapter<HomeAdapter.ViewHolder>() {
inner class ViewHolder(v: View) : RecyclerView.ViewHolder(v), View.OnClickListener {
private val textView: TextView
private val imageView: ImageView
private val relativeLayout: RelativeLayout
private var item: QuesIcons? = null
fun setData(item: QuesIcons) {
this.item = item
textView.text = item.heading
imageView.setImageResource(item.titleImage)
relativeLayout.setBackgroundColor(Color.parseColor(item.colour))
}
override fun onClick(view: View) {
if (mListener != null) {
item?.let { mListener!!.onItemClick(adapterPosition) }
}
}
init {
v.setOnClickListener(this)
textView = v.findViewById<View>(R.id.text_item) as TextView
imageView = v.findViewById<View>(R.id.img_icon) as ImageView
relativeLayout = v.findViewById<View>(R.id.relativeLayout) as RelativeLayout
}
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): ViewHolder {
val view: View = LayoutInflater.from(mContext).inflate(R.layout.home_list, parent, false)
return ViewHolder(view)
}
override fun onBindViewHolder(viewHolder: ViewHolder, position: Int) {
viewHolder.setData(mValues[position])
}
override fun getItemCount(): Int {
return mValues.size
}
interface ItemListener {
fun onItemClick(position: Int)
}
}
Create the QuesIcons.kt data class to declare the data.
Java:
data class QuesIcons(var heading: String, var titleImage: Int, var colour: String)
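Because QuesIcons is a data class, Kotlin generates equals(), hashCode(), toString() and copy() for it, which is handy when comparing or tweaking list items. A small runnable sketch (the sample values mirror the ones added in Home.kt):

```kotlin
data class QuesIcons(var heading: String, var titleImage: Int, var colour: String)

fun main() {
    val item = QuesIcons("Android", 1, "#09A9FF")
    // copy() changes one field and keeps the rest.
    val recolored = item.copy(colour = "#3E51B1")
    println(recolored.heading)   // unchanged field survives the copy
    println(item == item.copy()) // structural equality, not identity
}
```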
In the activity_home.xml we can create the recycler view list.
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:paddingLeft="8dp"
android:paddingRight="8dp"
tools:context=".Home">
<androidx.recyclerview.widget.RecyclerView
android:id="@+id/recyclerview_list"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</LinearLayout>
In the home_list.xml we can create customize view for items.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.cardview.widget.CardView xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:card_view="http://schemas.android.com/apk/res-auto"
android:id="@+id/cardView"
android:layout_width="match_parent"
android:layout_height="170dp"
android:layout_margin="4dp"
card_view:cardCornerRadius="4dp">
<RelativeLayout
android:id="@+id/relativeLayout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:layout_marginTop="10dp"
android:layout_gravity="center">
<ImageView
android:id="@+id/img_icon"
android:layout_width="90dp"
android:layout_height="90dp"
android:layout_centerInParent="true"
android:contentDescription="@null"
card_view:tint="@color/white" />
<TextView
android:id="@+id/text_item"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerHorizontal="true"
android:textColor="@android:color/white"
android:textSize="16sp"
android:layout_below="@+id/img_icon" />
</RelativeLayout>
</androidx.cardview.widget.CardView>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 24 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learned how to handle clicks on RecyclerView items to navigate to different activities in a Quiz app. The items are laid out in a grid so that each one can be clicked. I will provide a series of articles on this Quiz app in upcoming posts.
I hope you have read this article. If you found it helpful, please provide likes and comments.
Reference
Click here - https://www.geeksforgeeks.org/android-recyclerview/
Introduction
In this article, we will learn how to integrate the Rewarded Ads feature of Huawei Ads Kit into an Android app. Rewarded ads are full-screen video ads that users can choose to view in exchange for in-app rewards.
Ads Kit
Huawei Ads provides developers with wide-ranging capabilities to deliver good-quality ad content to users. It is an easy way to reach a target audience and measure user engagement, and it is very useful when we publish a free app and want to earn some money from it.
HMS Ads Kit offers 7 ad formats. In this application we will implement Rewarded Ads.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, Jdk 1.8, SDK platform 26 and Gradle 4.6 and above installed.
4. Minimum API Level 24 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the user created name.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, copy and paste in android Project under app directory, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Add the below Maven URL under the repositories of buildscript and allprojects, and the classpath under the dependencies of buildscript, in the build.gradle (Project) file; refer to Add Configuration.
Java:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.6.0.300'
9. Add the below plugin and dependencies in build.gradle(Module) file.
Java:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.0.300'
// Huawei Ads Kit
implementation 'com.huawei.hms:ads-lite:13.4.51.300'
10. Now Sync the gradle.
11. Add the required permission to the AndroidManifest.xml file.
XML:
<!-- Ads Kit -->
<uses-permission android:name="android.permission.INTERNET" />
Let us move to development
I have created a project on Android studio with empty activity let us start coding.
In the MainActivity.kt we can find the business logic for Ads.
Java:
class MainActivity : AppCompatActivity() {
companion object {
private const val PLUS_SCORE = 1
private const val MINUS_SCORE = 5
private const val RANGE = 2
}
private var rewardedTitle: TextView? = null
private var scoreView: TextView? = null
private var reStartButton: Button? = null
private var watchAdButton: Button? = null
private var rewardedAd: RewardAd? = null
private var score = 1
private val defaultScore = 10
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
title = getString(R.string.reward_ad)
rewardedTitle = findViewById(R.id.text_reward)
rewardedTitle!!.setText(R.string.reward_ad_title)
// Load a rewarded ad.
loadRewardAd()
// Load a score view.
loadScoreView()
// Load the button for watching a rewarded ad.
loadWatchButton()
// Load the button for starting a game.
loadPlayButton()
}
// Load a rewarded ad.
private fun loadRewardAd() {
if (rewardedAd == null) {
rewardedAd = RewardAd(this@MainActivity, getString(R.string.ad_id_reward))
}
val rewardAdLoadListener: RewardAdLoadListener = object : RewardAdLoadListener() {
override fun onRewardAdFailedToLoad(errorCode: Int) {
showToast("onRewardAdFailedToLoad errorCode is :$errorCode")
}
override fun onRewardedLoaded() {
showToast("onRewardedLoaded")
}
}
rewardedAd!!.loadAd(AdParam.Builder().build(), rewardAdLoadListener)
}
// Display a rewarded ad.
private fun rewardAdShow() {
if (rewardedAd!!.isLoaded) {
rewardedAd!!.show(this@MainActivity, object : RewardAdStatusListener() {
override fun onRewardAdClosed() {
showToast("onRewardAdClosed")
loadRewardAd()
}
override fun onRewardAdFailedToShow(errorCode: Int) {
showToast("onRewardAdFailedToShow errorCode is :$errorCode")
}
override fun onRewardAdOpened() {
showToast("onRewardAdOpened")
}
override fun onRewarded(reward: Reward) {
// You are advised to grant a reward immediately and at the same time, check whether the reward
// takes effect on the server. If no reward information is configured, grant a reward based on the
// actual scenario.
val addScore = if (reward.amount == 0) defaultScore else reward.amount
showToast("Watch video show finished , add $addScore scores")
score += addScore
setScore(score)
loadRewardAd()
}
})
}
}
// Set a Score
private fun setScore(score: Int) {
scoreView!!.text = "Score:$score"
}
// Load the button for watching a rewarded ad
private fun loadWatchButton() {
watchAdButton = findViewById(R.id.show_video_button)
watchAdButton!!.setOnClickListener(View.OnClickListener { rewardAdShow() })
}
// Load the button for starting a game
private fun loadPlayButton() {
reStartButton = findViewById(R.id.play_button)
reStartButton!!.setOnClickListener(View.OnClickListener { play() })
}
private fun loadScoreView() {
scoreView = findViewById(R.id.score_count_text)
scoreView!!.text = "Score:$score"
}
// Used to play a game
private fun play() {
// If the score is 0, a message is displayed, asking users to watch the ad in exchange for scores.
if (score == 0) {
Toast.makeText(this@MainActivity, "Watch video ad to add score", Toast.LENGTH_SHORT).show()
return
}
// The value 0 or 1 is returned randomly. If the value is 1, the score increases by 1. If the value is 0, the
// score decreases by 5. If the score is a negative number, the score is set to 0.
val random = Random().nextInt(RANGE)
if (random == 1) {
score += PLUS_SCORE
Toast.makeText(this@MainActivity, "You win!", Toast.LENGTH_SHORT).show()
} else {
score -= MINUS_SCORE
score = if (score < 0) 0 else score
Toast.makeText(this@MainActivity, "You lose!", Toast.LENGTH_SHORT).show()
}
setScore(score)
}
private fun showToast(text: String) {
runOnUiThread {
Toast.makeText(this@MainActivity, text, Toast.LENGTH_SHORT).show()
}
}
}
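The scoring rules spread across play() and onRewarded() above can be isolated as pure functions, which makes the clamping and default-reward behaviour easy to verify. A sketch (the constants mirror PLUS_SCORE = 1, MINUS_SCORE = 5 and defaultScore = 10 from MainActivity):

```kotlin
// A win adds 1 point; a loss subtracts 5, clamped so the score never goes negative.
fun playScore(current: Int, win: Boolean): Int =
    if (win) current + 1 else maxOf(current - 5, 0)

// A watched ad grants reward.amount, or the default when no reward is configured (amount == 0).
fun rewardScore(current: Int, rewardAmount: Int, defaultScore: Int = 10): Int =
    current + if (rewardAmount == 0) defaultScore else rewardAmount

fun main() {
    println(playScore(3, win = false)) // 3 - 5 clamps to 0
    println(rewardScore(1, 0))         // no configured reward grants the default 10
}
```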
In the activity_main.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<TextView
android:id="@+id/text_reward"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="16dp"
android:textAlignment="center"
android:textSize="20sp"
android:text="This is rewarded ads sample"/>
<Button
android:id="@+id/play_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/text_reward"
android:layout_centerHorizontal="true"
android:layout_marginTop="20dp"
android:text="Play" />
<Button
android:id="@+id/show_video_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/play_button"
android:layout_centerHorizontal="true"
android:layout_marginTop="20dp"
android:text="Watch Video" />
<TextView
android:id="@+id/score_count_text"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/show_video_button"
android:layout_centerHorizontal="true"
android:layout_marginTop="30dp"
android:textAppearance="?android:attr/textAppearanceLarge" />
</RelativeLayout>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 24 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learned how to integrate the Rewarded Ads feature of Huawei Ads Kit into an Android app. In upcoming articles I will integrate other Huawei kits.
I hope you have read this article. If you found it helpful, please provide likes and comments.
Reference
Ads Kit - Rewarded Ads
Ads Kit – Training Video