Introduction
In this article, we will learn how to integrate the landmark recognition feature of Huawei Machine Learning (ML) Kit into an app. Landmark recognition is useful in tourism scenarios: suppose you have visited a place and do not know the name of the monument or natural landmark in front of you. ML Kit lets you capture an image with the camera or upload one from the gallery; the landmark recognizer then analyses the image and returns the landmark name, its longitude and latitude, and a confidence value for the input image. A higher confidence indicates that the landmark in the input image is more likely to be recognized correctly. Currently, more than 17,000 global landmarks can be recognized. For landmark recognition, the device calls an on-cloud API and the detection algorithm model runs on the cloud, so make sure the device has Internet access during commissioning and usage.
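To illustrate how the confidence value can be used, here is a minimal sketch of filtering recognition results by a caller-chosen threshold. The `Landmark` data class and the threshold are hypothetical stand-ins for illustration; the real SDK returns `MLRemoteLandmark` objects.

```kotlin
// Hypothetical stand-in for the SDK's MLRemoteLandmark result type.
data class Landmark(val name: String, val lat: Double, val lng: Double, val confidence: Float)

// Keep only results whose confidence meets the threshold, highest first.
fun filterByConfidence(results: List<Landmark>, threshold: Float): List<Landmark> =
    results.filter { it.confidence >= threshold }
           .sortedByDescending { it.confidence }

fun main() {
    val results = listOf(
        Landmark("Niagara Falls", 43.08, -79.07, 0.92f),
        Landmark("Unknown", 0.0, 0.0, 0.21f)
    )
    println(filterByConfidence(results, 0.5f).map { it.name })  // [Niagara Falls]
}
```

In a real app you would apply the same filter to the list delivered to `addOnSuccessListener`.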
Requirements
1. Any operating system (macOS, Linux or Windows).
2. A Huawei phone with HMS Core 4.0.0.300 or later.
3. A laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26 and Gradle 4.6 installed.
4. Minimum API level 21.
5. A device running EMUI 9.0.0 or later.
Integration Process
1. First, register as a Huawei developer and complete identity verification on the Huawei Developers website; refer to Register a Huawei ID.
2. Create a project in Android Studio; refer to Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio window, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name is the name you chose when creating the project.
5. Create an app in AppGallery Connect.
6. Download the agconnect-services.json file from App information and copy it into the app directory of your Android project, as follows.
7. Enter the SHA-256 certificate fingerprint and click the tick icon, as follows.
Note: Steps 1 to 7 are common to all Huawei kits.
8. Click the Manage APIs tab and enable ML Kit.
9. Add the Maven URL below to the build.gradle (project) file, under the repositories of both buildscript and allprojects, and the classpath under buildscript dependencies; refer to Add Configuration.
Code:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
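For clarity, this is roughly where the two lines above sit in the project-level build.gradle. This is a sketch under the assumption of a classic buildscript/allprojects layout; keep your existing repositories and versions as they are.

```groovy
// build.gradle (project level) - sketch; other entries stay as-is.
buildscript {
    repositories {
        google()
        mavenCentral()
        maven { url 'http://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.huawei.agconnect:agcp:1.4.1.300'
    }
}
allprojects {
    repositories {
        google()
        mavenCentral()
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
```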
10. Add the below plugin and dependencies in build.gradle(Module) file.
Code:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
// Import the landmark recognition SDK.
implementation 'com.huawei.hms:ml-computer-vision-cloud:2.0.5.304'
11. Now sync the project with Gradle files.
12. Add the required permissions to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.INTERNET"/>
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.RECORD_AUDIO"/>
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
Let us move to development
I have created a project in Android Studio with an empty activity; let us start coding.
In MainActivity.kt we write the business logic.
Kotlin:
class MainActivity : AppCompatActivity(), View.OnClickListener {
private val images = arrayOf(R.drawable.forbiddencity_image, R.drawable.maropeng_image,
R.drawable.natural_landmarks, R.drawable.niagarafalls_image,
R.drawable.road_image, R.drawable.stupa_thimphu,
R.drawable.statue_image)
private var curImageIdx = 0
private var analyzer: MLRemoteLandmarkAnalyzer? = null
// You can find api key in agconnect-services.json file.
val apiKey = "Enter your API Key"
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
this.btn_ok.setOnClickListener(this)
//Images to change in background with buttons
landmark_images.setBackgroundResource(images[curImageIdx])
btn_next.setOnClickListener{
curImageIdx = (curImageIdx + 1) % images.size
nextImage()
}
btn_back.setOnClickListener {
curImageIdx = (curImageIdx - 1 + images.size) % images.size
prevImage()
}
}
private fun nextImage(){
landmark_images.setBackgroundResource(images[curImageIdx])
}
private fun prevImage(){
landmark_images.setBackgroundResource(images[curImageIdx])
}
private fun analyzer(i: Int) {
val settings = MLRemoteLandmarkAnalyzerSetting.Factory()
.setLargestNumOfReturns(1)
.setPatternType(MLRemoteLandmarkAnalyzerSetting.STEADY_PATTERN)
.create()
analyzer = MLAnalyzerFactory.getInstance().getRemoteLandmarkAnalyzer(settings)
// Create an MLFrame from a bitmap. The recommended image size is larger than 640 x 640 pixels.
val bitmap = BitmapFactory.decodeResource(this.resources, images[curImageIdx])
val mlFrame = MLFrame.Creator().setBitmap(bitmap).create()
// Set the API key.
MLApplication.getInstance().apiKey = this.apiKey
val task = analyzer!!.asyncAnalyseFrame(mlFrame)
task.addOnSuccessListener{landmarkResults ->
this@MainActivity.displaySuccess(landmarkResults[0])
}.addOnFailureListener{ e ->
this@MainActivity.displayFailure(e)
}
}
private fun displayFailure(exception: Exception){
var error = "Failure: "
error += try {
val mlException = exception as MLException
"""
error code: ${mlException.errCode}
error message: ${mlException.message}
error reason: ${mlException.cause}
""".trimIndent()
} catch(e: Exception) {
e.message
}
landmark_result!!.text = error
}
private fun displaySuccess(landmark: MLRemoteLandmark){
var result = ""
if(landmark.landmark != null){
result = "Landmark: " + landmark.landmark
}
result += "\nPositions: "
if(landmark.positionInfos != null){
for(coordinate in landmark.positionInfos){
result += """
Latitude: ${coordinate.lat}
""".trimIndent()
result += """
Longitude: ${coordinate.lng}
""".trimIndent()
}
}
landmark_result.text = result
}
override fun onClick(v: View?) {
analyzer(curImageIdx)
}
override fun onDestroy() {
super.onDestroy()
if (analyzer == null) {
return
}
try {
analyzer!!.stop()
} catch (e: IOException) {
Toast.makeText(this, "Stop failed: " + e.message, Toast.LENGTH_LONG).show()
}
}
}
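A note on the Next/Back handlers above: Kotlin's % operator can return a negative value when the left operand is negative, so a safe previous-index computation adds the array size before taking the modulo. The wrap-around arithmetic can be checked in isolation (plain Kotlin, independent of the Android code):

```kotlin
// Wrap-around navigation indices for a fixed-size image array.
fun nextIndex(current: Int, size: Int): Int = (current + 1) % size

// Adding size before the modulo keeps the result non-negative
// (in Kotlin, (0 - 1) % 7 == -1, which would crash an array access).
fun prevIndex(current: Int, size: Int): Int = (current - 1 + size) % size
```

For example, with seven images, `prevIndex(0, 7)` wraps to 6 and `nextIndex(6, 7)` wraps back to 0.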
In activity_main.xml we create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<ImageView
android:id="@+id/landmark_images"
android:layout_width="match_parent"
android:layout_height="470dp"
android:layout_centerHorizontal="true"
android:background="@drawable/forbiddencity_image"/>
<TextView
android:id="@+id/landmark_result"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_below="@+id/landmark_images"
android:layout_marginLeft="15dp"
android:layout_marginTop="15dp"
android:layout_marginBottom="10dp"
android:textSize="17sp"
android:textColor="@color/design_default_color_error"/>
<Button
android:id="@+id/btn_back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_alignParentLeft="true"
android:layout_marginLeft="5dp"
android:textAllCaps="false"
android:text="Back" />
<Button
android:id="@+id/btn_ok"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:text="OK" />
<Button
android:id="@+id/btn_next"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_alignParentRight="true"
android:layout_marginRight="5dp"
android:textAllCaps="false"
android:text="Next" />
</RelativeLayout>
Demo
Tips and Tricks
1. Make sure you are registered as a Huawei developer.
2. Set minSdkVersion to 21 or later.
3. Make sure you have added the agconnect-services.json file to the app folder.
4. Make sure you have added the SHA-256 fingerprint.
5. Make sure all the dependencies are added properly.
6. The recommended image size is larger than 640 x 640 pixels.
Conclusion
In this article, we have learnt how to integrate the landmark recognition feature of Huawei Machine Learning (ML) Kit into an app. Landmark recognition is mainly used in tourism apps to identify the monuments or natural landmarks a user visits. The user captures an image, and the landmark recognizer analyses it and provides the landmark name, longitude and latitude, and a confidence value for the input image. For landmark recognition, the device calls an on-cloud API and the detection algorithm model runs on the cloud.
Thank you for reading this article. If you found it helpful, please leave likes and comments.
Reference
ML Kit - Landmark Recognition
Original Source
Introduction
In this article, we will learn how to integrate the Text to Speech feature of Huawei ML Kit into a book reading app. Text to speech (TTS) can convert text information into human voice in real time. The service uses deep neural networks to process the text and produce natural-sounding speech, and rich timbres are also supported to enhance the result. TTS is widely used in broadcasting, news, voice navigation, and audio reading. For example, TTS can convert a large amount of text into speech output and highlight the content being played, freeing users' eyes and raising their interest. It can also record voice segments based on navigation data and synthesize them into navigation voice, making navigation more personalized.
Precautions
1. The text in a single request can contain a maximum of 500 characters and is encoded using UTF-8.
2. Currently, TTS in French, Spanish, German, Italian, Russian, Thai, Malay, and Polish is deployed only in China, Asia, Africa, Latin America, and Europe.
3. TTS depends on on-cloud APIs. During commissioning and usage, ensure that the device can access the Internet.
4. Default specifications of the real-time output audio data are as follows: MP3 mono, 16-bit depth, and 16 kHz audio sampling rate.
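Since a single TTS request is limited to 500 characters (precaution 1 above), a long chapter has to be split before it is queued. Below is a minimal sketch of one way to do that; the 500-character limit comes from the precautions, but the splitting strategy itself is an assumption for illustration, not part of the SDK.

```kotlin
// Split text into chunks no longer than maxLen characters, preferring to
// break at whitespace so words are not cut in half.
fun chunkForTts(text: String, maxLen: Int = 500): List<String> {
    val chunks = mutableListOf<String>()
    var rest = text.trim()
    while (rest.length > maxLen) {
        // Find the last space within the limit; fall back to a hard cut.
        val cut = rest.lastIndexOf(' ', maxLen).let { if (it <= 0) maxLen else it }
        chunks.add(rest.substring(0, cut).trim())
        rest = rest.substring(cut).trim()
    }
    if (rest.isNotEmpty()) chunks.add(rest)
    return chunks
}
```

Each chunk can then be passed to the engine in turn, e.g. with `mlTtsEngine.speak(chunk, MLTtsEngine.QUEUE_APPEND)` so the pieces play back to back.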
Requirements
1. Any operating system (macOS, Linux or Windows).
2. A Huawei phone with HMS Core 4.0.0.300 or later.
3. A laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26 and Gradle 4.6 or later installed.
4. Minimum API level 24.
5. A device running EMUI 9.0.0 or later.
How to integrate HMS Dependencies
1. First, register as a Huawei developer and complete identity verification on the Huawei Developers website; refer to Register a Huawei ID.
2. Create a project in Android Studio; refer to Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android Studio window, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name is the name you chose when creating the project.
5. Create an app in AppGallery Connect.
6. Download the agconnect-services.json file from App information and copy it into the app directory of your Android project, as follows.
7. Enter the SHA-256 certificate fingerprint and click the Save button, as follows.
Note: Steps 1 to 7 are common to all Huawei kits.
8. Click the Manage APIs tab and enable ML Kit.
9. Add the Maven URL below to the build.gradle (project) file, under the repositories of both buildscript and allprojects, and the classpath under buildscript dependencies; refer to Add Configuration.
Code:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.6.0.300'
10. Add the below plugin and dependencies in build.gradle(Module) file.
Code:
apply plugin: 'com.huawei.agconnect'
// Enable data binding (this block goes inside android { } in the module file).
dataBinding {
    enabled = true
}
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.0.300'
// ML Kit - Text to Speech
implementation 'com.huawei.hms:ml-computer-voice-tts:3.3.0.305'
// Data Binding
implementation 'androidx.databinding:databinding-runtime:7.1.1'
11. Now sync the project with Gradle files.
12. Add the required permissions to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.INTERNET" />
Let us move to development
I have created a project in Android Studio with an empty activity; let us start coding.
In ListActivity.kt we handle the button click.
Kotlin:
class ListActivity : AppCompatActivity() {
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_list)
btn_voice.setOnClickListener {
val intent = Intent(this@ListActivity, TranslateActivity::class.java)
startActivity(intent)
}
}
}
In TranslateActivity.kt we write the business logic for text to speech.
Kotlin:
class TranslateActivity : AppCompatActivity() {
private lateinit var binding: ActivityTranslateBinding
private lateinit var ttsViewModel: TtsViewModel
private var sourceText: String = ""
private lateinit var mlTtsEngine: MLTtsEngine
private lateinit var mlConfigs: MLTtsConfig
private val TAG: String = TranslateActivity::class.java.simpleName
private var callback: MLTtsCallback = object : MLTtsCallback {
override fun onError(taskId: String, err: MLTtsError) {
}
override fun onWarn(taskId: String, warn: MLTtsWarn) {
}
override fun onRangeStart(taskId: String, start: Int, end: Int) {
Log.d(TAG, start.toString())
img_view.setImageResource(R.drawable.on)
}
override fun onAudioAvailable(p0: String?, p1: MLTtsAudioFragment?, p2: Int, p3: android.util.Pair<Int, Int>?, p4: Bundle?) {
}
override fun onEvent(taskId: String, eventName: Int, bundle: Bundle?) {
if (eventName == MLTtsConstants.EVENT_PLAY_STOP) {
Toast.makeText(applicationContext, "Service Stopped", Toast.LENGTH_LONG).show()
}
img_view.setImageResource(R.drawable.off)
}
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
binding = DataBindingUtil.setContentView(this, R.layout.activity_translate)
binding.lifecycleOwner = this
ttsViewModel = ViewModelProvider(this).get(TtsViewModel::class.java)
binding.ttsViewModel = ttsViewModel
setApiKey()
supportActionBar?.title = "Text to Speech Conversion"
ttsViewModel.ttsService.observe(this, Observer {
startTtsService()
})
ttsViewModel.textData.observe(this, Observer {
sourceText = it
})
}
private fun startTtsService() {
mlConfigs = MLTtsConfig()
.setLanguage(MLTtsConstants.TTS_EN_US)
.setPerson(MLTtsConstants.TTS_SPEAKER_FEMALE_EN)
.setSpeed(1.0f)
.setVolume(1.0f)
mlTtsEngine = MLTtsEngine(mlConfigs)
mlTtsEngine.setTtsCallback(callback)
// ID to use for Audio Visualizer.
val id = mlTtsEngine.speak(sourceText, MLTtsEngine.QUEUE_APPEND)
Log.i(TAG, id)
}
private fun setApiKey(){
// You can find the API key in the agconnect-services.json file.
MLApplication.getInstance().apiKey = "Enter your API Key"
}
override fun onDestroy() {
super.onDestroy()
mlTtsEngine.shutdown()
}
override fun onPause() {
super.onPause()
mlTtsEngine.stop()
}
}
In activity_list.xml we create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:paddingTop="10dp"
android:paddingBottom="10dp"
tools:context=".ListActivity">
<Button
android:id="@+id/btn_voice"
android:layout_width="310dp"
android:layout_height="wrap_content"
android:layout_marginTop="50dp"
android:textAlignment="center"
android:layout_gravity="center_horizontal"
android:textSize="20sp"
android:textColor="@color/black"
android:padding="8dp"
android:textAllCaps="false"
android:text="Text to Voice" />
</LinearLayout>
In activity_translate.xml we create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<layout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools">
<data>
<variable
name="ttsViewModel"
type="com.example.huaweibookreaderapp1.TtsViewModel" />
</data>
<androidx.constraintlayout.widget.ConstraintLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="@color/white"
tools:context=".TranslateActivity">
<Button
android:id="@+id/btn_click"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:onClick="@{() -> ttsViewModel.callTtsService()}"
android:text="@string/speak"
android:textSize="20sp"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="0.498"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent"
app:layout_constraintVertical_bias="0.43" />
<EditText
android:id="@+id/edt_text"
android:layout_width="409dp"
android:layout_height="wrap_content"
android:layout_marginBottom="36dp"
android:ems="10"
android:textSize="20sp"
android:hint="@string/enter_text_here"
android:inputType="textPersonName"
android:onTextChanged="@{ttsViewModel.noDataChangedText}"
android:paddingStart="70dp"
app:layout_constraintBottom_toTopOf="@+id/btn_click"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="1.0"
app:layout_constraintStart_toStartOf="parent"
android:autofillHints="@string/enter_text_here" />
<ImageView
android:id="@+id/img_view"
android:layout_width="100dp"
android:layout_height="100dp"
android:layout_marginTop="7dp"
app:layout_constraintBottom_toTopOf="@+id/edt_text"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintHorizontal_bias="0.498"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent"
app:layout_constraintVertical_bias="0.8"
app:srcCompat="@drawable/off"
android:contentDescription="@string/speaker" />
</androidx.constraintlayout.widget.ConstraintLayout>
</layout>
Demo
Tips and Tricks
1. Make sure you are registered as a Huawei developer.
2. Set minSdkVersion to 24 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to the app folder.
4. Make sure you have added the SHA-256 fingerprint.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learned how to integrate the Text to Speech feature of Huawei ML Kit into a book reading app. Text to speech (TTS) can convert text information into human voice in real time. The service uses deep neural networks to process the text and produce natural-sounding speech, and rich timbres are also supported to enhance the result.
Thank you for reading this article. If you found it helpful, please leave likes and comments.
Reference
ML Kit – Text to Speech
ML Kit – Training Video
Introduction
In this article, we will learn how to capture bills as text images using this Money Management app. The app improves the quality and visibility of the captured image by zooming, so whenever users go shopping or spend money, they can capture the bill with the app and save it in memory.
I will provide a series of articles on this Money Management app; in upcoming articles I will integrate other Huawei kits.
If you are new to this application, follow my previous articles.
Beginner: Find the introduction Sliders and Huawei Account Kit Integration in Money Management Android app (Kotlin) - Part 1
Beginner: Integration of Huawei Ads Kit and Analytics Kit in Money Management Android app (Kotlin) – Part 2
Beginner: Manage the Budget using Room Database in Money Management Android app (Kotlin) – Part 3
ML Kit - Text Image Super-Resolution
Text image super-resolution is a feature of Huawei ML Kit that improves the quality and visibility of old or blurred text in an image. When you photograph a document from far away or cannot properly adjust the focus, the text may not be clear. In this situation, the service can zoom in on an image containing text up to three times, significantly improving the definition of the text.
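Since the service can triple each side length, it is worth keeping the memory impact in mind: a 3x upscale means nine times as many pixels. A quick sketch in plain Kotlin (`Size` is a hypothetical helper for illustration, not an SDK type; 4 bytes per pixel is the usual ARGB_8888 bitmap cost):

```kotlin
// A 3x super-resolution multiplies each dimension by 3,
// so pixel count (and bitmap memory) grows 9-fold.
data class Size(val width: Int, val height: Int)

fun upscaled(input: Size, factor: Int = 3): Size =
    Size(input.width * factor, input.height * factor)

// Approximate ARGB_8888 bitmap memory in bytes (4 bytes per pixel).
fun bitmapBytes(s: Size): Long = s.width.toLong() * s.height * 4
```

For example, a 640 x 480 source becomes 1920 x 1440, roughly 11 MB as an ARGB_8888 bitmap, so large inputs should be handled with care.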
Requirements
1. Any operating system (macOS, Linux or Windows).
2. A Huawei phone with HMS Core 4.0.0.300 or later.
3. A laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26 and Gradle 4.6 or later installed.
4. Minimum API level 19.
5. A device running EMUI 9.0.0 or later.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the name you chose when creating the project.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, and paste it into the app directory of the Android project, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Click Manage APIs tab and enable ML Kit.
9. Add the Maven URL below to the build.gradle(Project) file: the maven line under the repositories of both buildscript and allprojects, and the classpath line under buildscript dependencies; refer to Add Configuration.
Groovy:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
10. Add the below plugin and dependencies in build.gradle(Module) file.
Groovy:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
// Import the text image super-resolution base SDK.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution:2.0.4.300'
// Import the text image super-resolution model package.
implementation 'com.huawei.hms:ml-computer-vision-textimagesuperresolution-model:2.0.4.300'
11. Now sync the Gradle files.
12. Add the required permission to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Let us move to development
I have created a project in Android Studio with an empty activity; let us start coding.
In the CaptureActivity.kt we can find the business logic.
Kotlin:
class CaptureActivity : AppCompatActivity(), View.OnClickListener {
private var analyzer: MLTextImageSuperResolutionAnalyzer? = null
private val QUALITY = 1
private val ORIGINAL = 2
private var imageView: ImageView? = null
private var srcBitmap: Bitmap? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_capture)
imageView = findViewById(R.id.bill)
srcBitmap = BitmapFactory.decodeResource(resources, R.drawable.bill_1)
findViewById<View>(R.id.btn_quality).setOnClickListener(this)
findViewById<View>(R.id.btn_original).setOnClickListener(this)
createAnalyzer()
}
// Find the on click listeners
override fun onClick(v: View?) {
if (v!!.id == R.id.btn_quality) {
detectImage(QUALITY)
} else if (v.id == R.id.btn_original) {
detectImage(ORIGINAL)
}
}
private fun release() {
if (analyzer == null) {
return
}
analyzer!!.stop()
}
// Find the method to detect bills or text images
private fun detectImage(type: Int) {
if (type == ORIGINAL) {
setImage(srcBitmap!!)
return
}
if (analyzer == null) {
return
}
// Create an MLFrame by using the bitmap.
val frame = MLFrame.Creator().setBitmap(srcBitmap).create()
val task = analyzer!!.asyncAnalyseFrame(frame)
task.addOnSuccessListener { result -> // success.
Toast.makeText(applicationContext, "Success", Toast.LENGTH_LONG).show()
setImage(result.bitmap)
}.addOnFailureListener { e ->
// Failure
if (e is MLException) {
val mlException = e
// Get the error code, developers can give different page prompts according to the error code.
val errorCode = mlException.errCode
// Get the error message, developers can combine the error code to quickly locate the problem.
val errorMessage = mlException.message
Toast.makeText(applicationContext,"Error:$errorCode Message:$errorMessage", Toast.LENGTH_LONG).show()
// Log.e(TAG, "Error:$errorCode Message:$errorMessage")
} else {
// Other exception
Toast.makeText(applicationContext, "Failed:" + e.message, Toast.LENGTH_LONG).show()
// Log.e(TAG, e.message!!)
}
}
}
private fun setImage(bitmap: Bitmap) {
this@CaptureActivity.runOnUiThread(Runnable {
imageView!!.setImageBitmap(
bitmap
)
})
}
private fun createAnalyzer() {
analyzer = MLTextImageSuperResolutionAnalyzerFactory.getInstance().textImageSuperResolutionAnalyzer
}
override fun onDestroy() {
super.onDestroy()
if (srcBitmap != null) {
srcBitmap!!.recycle()
}
release()
}
}
In the activity_capture.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".mlkit.CaptureActivity">
<LinearLayout
android:id="@+id/buttons"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:orientation="vertical"
tools:ignore="MissingConstraints">
<Button
android:id="@+id/btn_quality"
android:layout_width="match_parent"
android:layout_height="50dp"
android:layout_margin="15dp"
android:gravity="center"
android:textSize="19sp"
android:text="Quality"
android:textAllCaps="false"
android:textColor="@color/Red"
tools:ignore="HardcodedText" />
<Button
android:id="@+id/btn_original"
android:layout_width="match_parent"
android:layout_height="50dp"
android:layout_margin="15dp"
android:gravity="center"
android:text="Original"
android:textSize="19sp"
android:textAllCaps="false"
android:textColor="@color/Red"
tools:ignore="HardcodedText" />
</LinearLayout>
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_above="@+id/buttons"
android:layout_marginBottom="15dp">
<ImageView
android:id="@+id/bill"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_centerInParent="true"
android:layout_gravity="center"
tools:ignore="ObsoleteLayoutParam" />
</ScrollView>
</RelativeLayout>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 19 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learnt about the Text Image Super-Resolution feature of Huawei ML Kit and its functionality. It improves the quality and visibility of old or blurred text in an image, zooming an image that contains text up to three times and significantly improving the definition of the text.
Reference
ML Kit – Documentation
ML Kit – Training Video
Introduction
In this article, we will learn how to save hospital details to your contacts directory by scanning a barcode with Huawei Scan Kit. On busy days filled with travel, office work and personal errands, users rarely have time to type in contact details. This app saves the hospital information, such as hospital name, contact number, email address and website, with a single barcode scan from your phone.
This is one of a series of articles on the Patient Tracking app; in upcoming articles I will integrate other Huawei kits.
If you are new to this application, follow my previous articles.
https://forums.developer.huawei.com/forumPortal/en/topic/0201902220661040078
https://forums.developer.huawei.com/forumPortal/en/topic/0201908355251870119
https://forums.developer.huawei.com/forumPortal/en/topic/0202914346246890032
https://forums.developer.huawei.com/forumPortal/en/topic/0202920411340450018
https://forums.developer.huawei.com/forumPortal/en/topic/0202926518891830059
What is Scan Kit?
HUAWEI Scan Kit scans and parses all major 1D and 2D barcodes and generates QR codes, helping you quickly build barcode scanning functions into your apps.
It automatically detects, magnifies and identifies barcodes from a distance, and can also scan very small barcodes in the same way. It supports 13 barcode formats, as follows.
1D barcodes: EAN-8, EAN-13, UPC-A, UPC-E, Codabar, Code 39, Code 93, Code 128 and ITF
2D barcodes: QR Code, Data Matrix, PDF 417 and Aztec
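The 13 formats above can be captured in a small lookup, handy for example when validating a user-selected format before configuring the scanner. This is a plain sketch grounded only in the list above, not part of the Scan Kit API:

```kotlin
// Plain lookup of the 13 formats listed above; a sketch for illustration,
// not part of the Scan Kit API.
val ONE_D_FORMATS = setOf(
    "EAN-8", "EAN-13", "UPC-A", "UPC-E", "Codabar",
    "Code 39", "Code 93", "Code 128", "ITF"
)
val TWO_D_FORMATS = setOf("QR Code", "Data Matrix", "PDF 417", "Aztec")

// True when the given name is one of the supported 2D formats.
fun is2DFormat(name: String): Boolean = name in TWO_D_FORMATS
```

In the real SDK you would instead pass constants such as HmsScan.ALL_SCAN_TYPE (used later in BarcodeScanActivity) when building the RemoteView.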
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26 and Gradle 4.6 installed.
4. Minimum API Level 19 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the name you chose when creating the project.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, and paste it into the app directory of the Android project, as follows.
7. Enter SHA-256 certificate fingerprint and click tick icon, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Add the Maven URL below to the build.gradle(Project) file: the maven line under the repositories of both buildscript and allprojects, and the classpath line under buildscript dependencies; refer to Add Configuration.
Groovy:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
9. Add the below plugin and dependencies in build.gradle(Module) file.
Groovy:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.5.0.300'
// Scan Kit
implementation 'com.huawei.hms:scan:1.2.5.300'
10. Now sync the Gradle files.
11. Add the required permission to the AndroidManifest.xml file.
XML:
<!-- Camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
<!-- File read permission -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
Let us move to development
I have created a project in Android Studio with an empty activity; let's start coding.
In the ScanActivity.kt we can find the button click.
Kotlin:
class ScanActivity : AppCompatActivity() {
companion object{
private val CUSTOMIZED_VIEW_SCAN_CODE = 102
}
private var resultText: TextView? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_scan)
resultText = findViewById<View>(R.id.result) as TextView
requestPermission()
}
fun onCustomizedViewClick(view: View?) {
resultText!!.text = ""
this.startActivityForResult(Intent(this, BarcodeScanActivity::class.java), CUSTOMIZED_VIEW_SCAN_CODE)
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (resultCode != RESULT_OK || data == null) {
return
}
// Get the HmsScan result from the intent returned to onActivityResult, using ScanUtil.RESULT as the key.
val obj: HmsScan? = data.getParcelableExtra(ScanUtil.RESULT)
try {
val json = JSONObject(obj!!.originalValue)
// Log.e("Scan","Result "+json.toString())
val name = json.getString("hospital name")
val phone = json.getString("phone")
val mail = json.getString("email")
val web = json.getString("site")
val i = Intent(Intent.ACTION_INSERT_OR_EDIT)
i.type = ContactsContract.Contacts.CONTENT_ITEM_TYPE
i.putExtra(ContactsContract.Intents.Insert.NAME, name)
i.putExtra(ContactsContract.Intents.Insert.PHONE, phone)
i.putExtra(ContactsContract.Intents.Insert.EMAIL, mail)
i.putExtra(ContactsContract.Intents.Insert.COMPANY, web)
startActivity(i)
} catch (e: JSONException) {
e.printStackTrace()
Toast.makeText(this, "JSON exception", Toast.LENGTH_SHORT).show()
} catch (e: Exception) {
e.printStackTrace()
Toast.makeText(this, "Exception", Toast.LENGTH_SHORT).show()
}
}
private fun requestPermission() {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
requestPermissions(arrayOf(android.Manifest.permission.CAMERA, READ_EXTERNAL_STORAGE),1001)
}
}
@SuppressLint("MissingSuperCall")
override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<String?>, grantResults: IntArray) {
if (grantResults.size < 2 || grantResults[0] != PackageManager.PERMISSION_GRANTED || grantResults[1] != PackageManager.PERMISSION_GRANTED) {
requestPermission()
}
}
}
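The parsing above relies on org.json, which ships with Android. As a dependency-free illustration of the flat payload shape the scanner expects (an object with the keys "hospital name", "phone", "email" and "site"), the hypothetical sketch below extracts string fields with a regular expression; nested objects, arrays and escapes are deliberately not handled:

```kotlin
// Dependency-free sketch of the payload parsing done with org.json above.
// It extracts "key":"value" string pairs from a flat JSON object; nested
// objects, arrays and escape sequences are not handled.
fun parseFlatJson(payload: String): Map<String, String> {
    val pair = Regex("\"([^\"]+)\"\\s*:\\s*\"([^\"]*)\"")
    return pair.findAll(payload).associate { it.groupValues[1] to it.groupValues[2] }
}
```

For example, `parseFlatJson("""{"hospital name":"City Care","phone":"12345"}""")` yields a map where "hospital name" maps to "City Care", mirroring the fields the activity copies into the contacts intent.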
In the BarcodeScanActivity.kt we can find the code to scan barcode.
Kotlin:
class BarcodeScanActivity : AppCompatActivity() {
companion object {
private var remoteView: RemoteView? = null
//val SCAN_RESULT = "scanResult"
var mScreenWidth = 0
var mScreenHeight = 0
// scan viewfinder width and height is 300dp
val SCAN_FRAME_SIZE = 300
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_barcode_scan)
// 1. get screen density to calculate viewfinder's rect
val dm = resources.displayMetrics
val density = dm.density
// 2. get screen size
mScreenWidth = resources.displayMetrics.widthPixels
mScreenHeight = resources.displayMetrics.heightPixels
val scanFrameSize = (SCAN_FRAME_SIZE * density).toInt()
// 3. Calculate viewfinder's rect, it is in the middle of the layout.
// set scanning area(Optional, rect can be null. If not configure, default is in the center of layout).
val rect = Rect()
rect.left = mScreenWidth / 2 - scanFrameSize / 2
rect.right = mScreenWidth / 2 + scanFrameSize / 2
rect.top = mScreenHeight / 2 - scanFrameSize / 2
rect.bottom = mScreenHeight / 2 + scanFrameSize / 2
// Initialize RemoteView instance and set calling back for scanning result.
remoteView = RemoteView.Builder().setContext(this).setBoundingBox(rect).setFormat(HmsScan.ALL_SCAN_TYPE).build()
remoteView?.onCreate(savedInstanceState)
remoteView?.setOnResultCallback(OnResultCallback { result -> //judge the result is effective
if (result != null && result.size > 0 && result[0] != null && !TextUtils.isEmpty(result[0].getOriginalValue())) {
val intent = Intent()
intent.putExtra(ScanUtil.RESULT, result[0])
setResult(RESULT_OK, intent)
this.finish()
}else{
Log.e("Barcode","Barcode: No barcode recognized ")
}
})
// Add the defined RemoteView to page layout.
val params = FrameLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
val frameLayout = findViewById<FrameLayout>(R.id.rim1)
frameLayout.addView(remoteView, params)
}
// Manage remoteView lifecycle
override fun onStart() {
super.onStart()
remoteView?.onStart()
}
override fun onResume() {
super.onResume()
remoteView?.onResume()
}
override fun onPause() {
super.onPause()
remoteView?.onPause()
}
override fun onDestroy() {
super.onDestroy()
remoteView?.onDestroy()
}
override fun onStop() {
super.onStop()
remoteView?.onStop()
}
}
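The viewfinder rect computed in onCreate above can be factored into a pure function, which makes the centring arithmetic easy to verify off-device. In this sketch, Box is a stand-in for android.graphics.Rect so the calculation runs anywhere:

```kotlin
// Pure version of the viewfinder arithmetic in onCreate above; Box is a
// stand-in for android.graphics.Rect so the calculation runs off-device.
data class Box(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun centeredScanBox(screenWidth: Int, screenHeight: Int, frameSizeDp: Int, density: Float): Box {
    // Convert the dp frame size to pixels, then centre it on the screen.
    val sizePx = (frameSizeDp * density).toInt()
    return Box(
        left = screenWidth / 2 - sizePx / 2,
        top = screenHeight / 2 - sizePx / 2,
        right = screenWidth / 2 + sizePx / 2,
        bottom = screenHeight / 2 + sizePx / 2
    )
}
```

On a 1080 x 1920 screen with density 3.0, a 300dp frame becomes a 900px square centred in the layout, exactly as the activity passes to setBoundingBox.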
In the activity_scan.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:gravity="center"
tools:context=".scan.ScanActivity">
<Button
android:id="@+id/btn_click"
android:layout_width="wrap_content"
android:layout_height="50dp"
android:textAllCaps="false"
android:textSize="20sp"
android:layout_gravity="center"
android:text="Click to Scan"
android:onClick="onCustomizedViewClick"
tools:ignore="OnClick" />
<TextView
android:id="@+id/result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textSize="18sp"
android:layout_marginTop="80dp"
android:textColor="#C0F81E"/>
</LinearLayout>
In the activity_barcode_scan.xml we can create the frame layout.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".scan.BarcodeScanActivity">
<!-- Customized layout for camera preview to scan -->
<FrameLayout
android:id="@+id/rim1"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#C0C0C0" />
<!-- Customized scanning mask -->
<ImageView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_centerInParent="true"
android:layout_centerHorizontal="true"
android:alpha="0.1"
android:background="#FF000000"/>
<!-- Customized scanning viewfinder -->
<ImageView
android:id="@+id/scan_view_finder"
android:layout_width="300dp"
android:layout_height="300dp"
android:layout_centerInParent="true"
android:layout_centerHorizontal="true"
android:background="#1f00BCD4"
tools:ignore="MissingConstraints" />
</RelativeLayout>
Demo
Find the demo in attachment or click here for original content.
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 19 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learnt how to save hospital details to the contacts directory by scanning a barcode with Huawei Scan Kit. The app captures the hospital information, such as hospital name, contact number, email address and website, with a single barcode scan from your phone.
Reference
Scan Kit - Customized View
Scan Kit - Training Video
Introduction
In this article, we can learn how to integrate the Rewarded Ads feature of Huawei Ads Kit into an Android app. Rewarded ads are full-screen video ads that users can choose to watch in exchange for in-app rewards.
Ads Kit
Huawei Ads Kit provides developers with wide-ranging capabilities to deliver good-quality ad content to users. It is the best way to reach the target audience easily and to measure user productivity, and it is very useful when we publish a free app and want to earn some money from it.
HMS Ads Kit offers 7 types of ads; in this application we will implement Rewarded Ads.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26 and Gradle 4.6 or later installed.
4. Minimum API Level 24 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the name you chose when creating the project.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, and paste it into the app directory of the Android project, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Steps 1 to 7 above are common to all Huawei kits.
8. Add the Maven URL below to the build.gradle(Project) file: the maven line under the repositories of both buildscript and allprojects, and the classpath line under buildscript dependencies; refer to Add Configuration.
Groovy:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.6.0.300'
9. Add the below plugin and dependencies in build.gradle(Module) file.
Groovy:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.0.300'
// Huawei Ads Kit
implementation 'com.huawei.hms:ads-lite:13.4.51.300'
10. Now sync the Gradle files.
11. Add the required permission to the AndroidManifest.xml file.
XML:
<!-- Ads Kit -->
<uses-permission android:name="android.permission.INTERNET" />
Let us move to development
I have created a project in Android Studio with an empty activity; let us start coding.
In the MainActivity.kt we can find the business logic for Ads.
Kotlin:
class MainActivity : AppCompatActivity() {
companion object {
private const val PLUS_SCORE = 1
private const val MINUS_SCORE = 5
private const val RANGE = 2
}
private var rewardedTitle: TextView? = null
private var scoreView: TextView? = null
private var reStartButton: Button? = null
private var watchAdButton: Button? = null
private var rewardedAd: RewardAd? = null
private var score = 1
private val defaultScore = 10
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
title = getString(R.string.reward_ad)
rewardedTitle = findViewById(R.id.text_reward)
rewardedTitle!!.setText(R.string.reward_ad_title)
// Load a rewarded ad.
loadRewardAd()
// Load a score view.
loadScoreView()
// Load the button for watching a rewarded ad.
loadWatchButton()
// Load the button for starting a game.
loadPlayButton()
}
// Load a rewarded ad.
private fun loadRewardAd() {
if (rewardedAd == null) {
rewardedAd = RewardAd(this@MainActivity, getString(R.string.ad_id_reward))
}
val rewardAdLoadListener: RewardAdLoadListener = object : RewardAdLoadListener() {
override fun onRewardAdFailedToLoad(errorCode: Int) {
showToast("onRewardAdFailedToLoad errorCode is :$errorCode")
}
override fun onRewardedLoaded() {
showToast("onRewardedLoaded")
}
}
rewardedAd!!.loadAd(AdParam.Builder().build(), rewardAdLoadListener)
}
// Display a rewarded ad.
private fun rewardAdShow() {
if (rewardedAd!!.isLoaded) {
rewardedAd!!.show(this@MainActivity, object : RewardAdStatusListener() {
override fun onRewardAdClosed() {
showToast("onRewardAdClosed")
loadRewardAd()
}
override fun onRewardAdFailedToShow(errorCode: Int) {
showToast("onRewardAdFailedToShow errorCode is :$errorCode")
}
override fun onRewardAdOpened() {
showToast("onRewardAdOpened")
}
override fun onRewarded(reward: Reward) {
// You are advised to grant a reward immediately and at the same time, check whether the reward
// takes effect on the server. If no reward information is configured, grant a reward based on the
// actual scenario.
val addScore = if (reward.amount == 0) defaultScore else reward.amount
showToast("Watch video show finished , add $addScore scores")
score += addScore
setScore(score)
loadRewardAd()
}
})
}
}
// Set a Score
private fun setScore(score: Int) {
scoreView!!.text = "Score:$score"
}
// Load the button for watching a rewarded ad
private fun loadWatchButton() {
watchAdButton = findViewById(R.id.show_video_button)
watchAdButton!!.setOnClickListener(View.OnClickListener { rewardAdShow() })
}
// Load the button for starting a game
private fun loadPlayButton() {
reStartButton = findViewById(R.id.play_button)
reStartButton!!.setOnClickListener(View.OnClickListener { play() })
}
private fun loadScoreView() {
scoreView = findViewById(R.id.score_count_text)
scoreView!!.text = "Score:$score"
}
// Used to play a game
private fun play() {
// If the score is 0, a message is displayed, asking users to watch the ad in exchange for scores.
if (score == 0) {
Toast.makeText(this@MainActivity, "Watch video ad to add score", Toast.LENGTH_SHORT).show()
return
}
// The value 0 or 1 is returned randomly. If the value is 1, the score increases by 1. If the value is 0, the
// score decreases by 5. If the score is a negative number, the score is set to 0.
val random = Random().nextInt(RANGE)
if (random == 1) {
score += PLUS_SCORE
Toast.makeText(this@MainActivity, "You win!", Toast.LENGTH_SHORT).show()
} else {
score -= MINUS_SCORE
score = if (score < 0) 0 else score
Toast.makeText(this@MainActivity, "You lose!", Toast.LENGTH_SHORT).show()
}
setScore(score)
}
private fun showToast(text: String) {
runOnUiThread {
Toast.makeText(this@MainActivity, text, Toast.LENGTH_SHORT).show()
}
}
}
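The game and reward rules buried in MainActivity above can be summarised as two pure functions. This is only a sketch of the same arithmetic (a win adds 1, a loss subtracts 5 with a floor of 0, and a reward amount of 0 falls back to the default of 10), not the Ads Kit API itself:

```kotlin
// Pure sketch of the scoring rules used in MainActivity above.
const val WIN_POINTS = 1          // added on a win
const val LOSS_POINTS = 5         // subtracted on a loss, floored at 0
const val DEFAULT_REWARD = 10     // granted when the ad reports an amount of 0

fun scoreAfterRound(score: Int, won: Boolean): Int =
    if (won) score + WIN_POINTS else maxOf(0, score - LOSS_POINTS)

fun scoreAfterReward(score: Int, rewardAmount: Int): Int =
    score + if (rewardAmount == 0) DEFAULT_REWARD else rewardAmount
```

Keeping the rules as pure functions like this makes them trivially unit-testable, while the activity only deals with ad loading and UI updates.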
In the activity_main.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<TextView
android:id="@+id/text_reward"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="16dp"
android:textAlignment="center"
android:textSize="20sp"
android:text="This is rewarded ads sample"/>
<Button
android:id="@+id/play_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/text_reward"
android:layout_centerHorizontal="true"
android:layout_marginTop="20dp"
android:text="Play" />
<Button
android:id="@+id/show_video_button"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/play_button"
android:layout_centerHorizontal="true"
android:layout_marginTop="20dp"
android:text="Watch Video" />
<TextView
android:id="@+id/score_count_text"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@+id/show_video_button"
android:layout_centerHorizontal="true"
android:layout_marginTop="30dp"
android:textAppearance="?android:attr/textAppearanceLarge" />
</RelativeLayout>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set minSDK version to 24 or later, otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learnt how to integrate the Rewarded Ads feature of Huawei Ads Kit in the Book Reading app. I will provide a series of articles on this Book Reading app; in upcoming articles I will integrate other Huawei kits.
I hope you have found this article helpful. If so, please provide likes and comments.
Reference
Ads Kit - Rewarded Ads
Ads Kit – Training Video