What is FIDO BioAuthn?
FIDO provides your app with powerful local biometric authentication capabilities, including fingerprint authentication and 3D facial authentication. It allows your app to provide secure and easy-to-use password-free authentication for users while ensuring reliable authentication results.
Service Features
· Takes the system integrity check result as the prerequisite for using BioAuthn, ensuring more secure authentication.
· Uses cryptographic key verification to ensure the security and reliability of authentication results.
Requirements
· Android Studio version: 3.X or later
· Test device: a Huawei phone running EMUI 10.0 or later
Configurations
For a step-by-step tutorial on integrating Huawei HMS Core, follow this link: link
When you finish those steps, add the following dependency to the build.gradle file in the app directory of your project.
Code:
implementation 'com.huawei.hms:fido-bioauthn-androidx:{LatestVersion}'
*Current latest version: 5.0.5.304
After that, add the lines below to the proguard-rules.pro file in the app directory of your project.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.huawei.hianalytics.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
Sync the project and you are ready to go.
Development
1 – Add the required permissions to AndroidManifest.xml.
XML:
<uses-permission android:name="android.permission.CAMERA"/>
<uses-permission android:name="android.permission.USE_BIOMETRIC"/>
2 – Create two buttons for fingerprint authentication and face recognition.
XML:
<Button
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:onClick="fingerAuth"
android:layout_marginBottom="16dp"
android:textAllCaps="false"
android:text="@string/btn_finger" />
<Button
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:onClick="faceAuth"
android:textAllCaps="false"
android:text="@string/btn_face" />
3 – First, let's ask for the camera permission in the activity's onResume method.
Java:
@Override
protected void onResume() {
    super.onResume();
    if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
        String[] permissions = {Manifest.permission.CAMERA};
        requestPermissions(permissions, 0);
    }
}
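If you also want to react to the user's decision, you can override onRequestPermissionsResult. The snippet below is a minimal sketch that matches the request code 0 used above; the log messages are just examples.
Java:
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);
    // 0 is the request code passed to requestPermissions() in onResume() above.
    if (requestCode == 0) {
        if (grantResults.length > 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            Log.d("ResultTag", "Camera permission granted; face authentication can be used.");
        } else {
            Log.d("ResultTag", "Camera permission denied; face authentication will fail with error code 1012.");
        }
    }
}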
4 – Create a function that returns a BioAuthnCallback object for later use.
Java:
public BioAuthnCallback bioAuthCallback() {
    return new BioAuthnCallback() {
        @Override
        public void onAuthError(int errMsgId, @NonNull CharSequence errString) {
            showResult("Authentication error. errorCode=" + errMsgId + ",errorMessage=" + errString
                    + (errMsgId == 1012 ? " The camera permission may not be enabled." : ""));
        }

        @Override
        public void onAuthHelp(int helpMsgId, @NonNull CharSequence helpString) {
            showResult("Authentication help. helpMsgId=" + helpMsgId + ",helpString=" + helpString + "\n");
        }

        @Override
        public void onAuthSucceeded(@NonNull BioAuthnResult result) {
            showResult("Authentication succeeded. CryptoObject=" + result.getCryptoObject());
        }

        @Override
        public void onAuthFailed() {
            showResult("Authentication failed.");
        }
    };
}
5 – So far we have implemented the prerequisites. Now we can implement the fingerprint authentication button's onClick method.
Java:
public void fingerAuth(View v) {
    BioAuthnPrompt bioAuthnPrompt = new BioAuthnPrompt(this, ContextCompat.getMainExecutor(this), bioAuthCallback());
    BioAuthnPrompt.PromptInfo.Builder builder =
            new BioAuthnPrompt.PromptInfo.Builder().setTitle("FIDO")
                    .setDescription("To proceed please verify identification");
    builder.setDeviceCredentialAllowed(true);
    //builder.setNegativeButtonText("Cancel");
    BioAuthnPrompt.PromptInfo info = builder.build();
    bioAuthnPrompt.auth(info);
}
The user will first be prompted to authenticate with biometrics, but will also be given the option to authenticate with their device PIN, pattern, or password. Note that setNegativeButtonText(CharSequence) must not be set when setDeviceCredentialAllowed(true) is used, and vice versa.
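For reference, this is roughly what the prompt configuration would look like if you prefer a cancel button instead of allowing device credentials; it is a sketch based on the same builder calls used above, and the texts are just examples.
Java:
public void fingerAuthWithCancel(View v) {
    BioAuthnPrompt bioAuthnPrompt = new BioAuthnPrompt(this, ContextCompat.getMainExecutor(this), bioAuthCallback());
    BioAuthnPrompt.PromptInfo info = new BioAuthnPrompt.PromptInfo.Builder()
            .setTitle("FIDO")
            .setDescription("To proceed please verify identification")
            // When a negative button text is set, setDeviceCredentialAllowed(true) must NOT be used.
            .setNegativeButtonText("Cancel")
            .build();
    bioAuthnPrompt.auth(info);
}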
Huawei provides the secure fingerprint authentication capability. If the system is insecure, the callback method BioAuthnCallback.onAuthError() returns the error code BioAuthnPrompt.ERROR_SYS_INTEGRITY_FAILED (Code: 1001). If the system is secure, fingerprint authentication is performed.
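If you want to react to this case explicitly, you can branch on the constant inside the callback from step 4; a minimal sketch (the messages are examples):
Java:
@Override
public void onAuthError(int errMsgId, @NonNull CharSequence errString) {
    if (errMsgId == BioAuthnPrompt.ERROR_SYS_INTEGRITY_FAILED) { // code 1001
        showResult("System integrity check failed. Biometric authentication is not allowed on this device.");
    } else {
        showResult("Authentication error. errorCode=" + errMsgId + ", errorMessage=" + errString);
    }
}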
6 – Now we can also implement the face recognition button's onClick method.
Java:
public void faceAuth(View v) {
    CancellationSignal cancellationSignal = new CancellationSignal();
    FaceManager faceManager = new FaceManager(this);
    int flags = 0;
    Handler handler = null;
    CryptoObject crypto = null;
    faceManager.auth(crypto, cancellationSignal, flags, bioAuthCallback(), handler);
}
You are advised to set CryptoObject to null. KeyStore is not associated with face authentication in the current version. KeyGenParameterSpec.Builder.setUserAuthenticationRequired() must be set to false in this scenario.
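For context, setUserAuthenticationRequired(false) refers to the Android keystore key generation parameters. If your app also generates keys for a CryptoObject flow, the flag would be set roughly as sketched below; the key alias and cipher parameters are arbitrary examples, not part of the FIDO SDK.
Java:
private void generateKeyNotRequiringAuth() throws Exception {
    // Generate an AES key in the Android keystore without binding it to user authentication,
    // as required when face authentication is not associated with the keystore.
    KeyGenerator keyGenerator = KeyGenerator.getInstance(
            KeyProperties.KEY_ALGORITHM_AES, "AndroidKeyStore");
    KeyGenParameterSpec spec = new KeyGenParameterSpec.Builder(
            "demo_key_alias", // hypothetical alias, for illustration only
            KeyProperties.PURPOSE_ENCRYPT | KeyProperties.PURPOSE_DECRYPT)
            .setBlockModes(KeyProperties.BLOCK_MODE_CBC)
            .setEncryptionPaddings(KeyProperties.ENCRYPTION_PADDING_PKCS7)
            .setUserAuthenticationRequired(false) // must be false for face authentication in this version
            .build();
    keyGenerator.init(spec);
    keyGenerator.generateKey();
}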
Huawei provides the secure 3D facial authentication capability. If the system is insecure, the callback method BioAuthnCallback.onAuthError returns the error code FaceManager.FACE_ERROR_SYS_INTEGRITY_FAILED (Code: 1001). If the system is secure, 3D facial authentication is performed.
7 – Finally, let's implement the showResult method used in bioAuthCallback to log the operations and show a toast message.
Java:
public void showResult(String text) {
    Log.d("ResultTag", text);
    Toast.makeText(this, text, Toast.LENGTH_SHORT).show();
}
You can adapt the showResult method however you like, for example to navigate to another activity or fragment, or to do whatever else your application needs.
With everything set up, you are ready to integrate Huawei FIDO BioAuthn into your application.
Conclusion
This article explained what Huawei FIDO BioAuthn is, and with the step-by-step implementation above it should be easy to use it in your own code.
For more information about Huawei FIDO, follow this link.
Thank you.
Related
For more information like this, you can visit the HUAWEI Developer Forum.
Introduction
Customers of Huawei Maps fall into two groups: businesses that use the Maps API, such as e-commerce sites, real estate portals, and travel portals, and end users who use the Huawei Maps application on different devices. End users also experience the maps interface through different kinds of searches, such as local business searches, hotel searches, and flight searches.
Let's start with how to integrate Map Kit:
Step 1: Create a new project in Android Studio.
Step 2: Configure your app in AGC.
Step 3: Enable the required APIs and add the SHA-256 certificate fingerprint.
Step 4: Download the agconnect-services.json file from AGC and paste it into the app directory.
Step 5: Add the dependencies below to the app-level build.gradle file.
Code:
implementation 'com.huawei.hms:maps:4.0.0.301'
implementation 'com.huawei.hms:site:4.0.3.300'
Step 6: Add the repository below to the project-level build.gradle file.
Code:
maven { url 'http://developer.huawei.com/repo/' }
Step 7: Add the app ID and permissions to the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<meta-data
android:name="com.huawei.hms.client.appid"
android:value="appid=*********" />
Step 8: Sync your Project
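Before moving on to Site Kit, here is a minimal sketch of how the map itself can be obtained once the project is synced. It assumes your activity layout contains a com.huawei.hms.maps.SupportMapFragment with the id "mapfragment"; that id and the initMap helper are examples, not part of the original post.
Code:
// Inside MainActivity (sketch).
private HuaweiMap hMap;

private void initMap() {
    SupportMapFragment mapFragment =
            (SupportMapFragment) getSupportFragmentManager().findFragmentById(R.id.mapfragment);
    mapFragment.getMapAsync(new OnMapReadyCallback() {
        @Override
        public void onMapReady(HuaweiMap huaweiMap) {
            // Keep a reference so markers can be added later (see the createMarker sketch further below).
            hMap = huaweiMap;
        }
    });
}
Call initMap() from onCreate() after setContentView().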
Use Cases:
Huawei Site Kit supports the capabilities below.
· Keyword Search: displays a list of places based on the user's input.
· Nearby Place Search: displays nearby places based on the current location of the user's device.
· Place Detail Search: searches for details about a place.
· Place Search Suggestion: returns a list of suggested places.
Let's discuss how to integrate Site Kit:
Declare a SearchService and create an instance of it.
Code:
SearchService searchService = SearchServiceFactory.create(this,URLEncoder.encode("API_KEY", "utf-8"));
Create a TextSearchRequest, which defines the place search request. Its parameters are listed below, and a configuration sketch follows the list.
query: search keyword.
location: longitude and latitude to which search results need to be biased.
radius: search radius, in meters. The value ranges from 1 to 50000. The default value is 50000.
poiType: POI type. The value range is the same as that of LocationType.
HwPoiType: Huawei POI type. This parameter is recommended. The value range is the same as that of HwLocationType.
countrycode: code of the country where places are searched. The country code must comply with the ISO 3166-1 alpha-2 standard.
language: language in which search results are displayed. For details about the value range, please refer to language codes in LanguageMapping
If this parameter is not passed, the language of the query field is used in priority. If the field language is unavailable, the local language will be used.
pageSize: number of records on each page. The value ranges from 1 to 20. The default value is 20.
pageIndex: current page number. The value ranges from 1 to 60. The default value is 1.
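Putting those parameters together, a fully configured request could look roughly like the sketch below. The coordinates, country code, and POI type are arbitrary examples (the HOSPITAL value is assumed to exist in HwLocationType), and only the query is strictly required.
Code:
TextSearchRequest request = new TextSearchRequest();
request.setQuery("hospital");                             // search keyword
request.setLocation(new Coordinate(48.893478, 2.334595)); // bias results toward this lat/lng (example values)
request.setRadius(5000);                                  // search radius in meters (1-50000)
request.setHwPoiType(HwLocationType.HOSPITAL);            // recommended Huawei POI type (example value)
request.setCountryCode("FR");                             // ISO 3166-1 alpha-2 country code
request.setLanguage("en");                                // language of the results
request.setPageSize(10);                                  // 1-20 records per page
request.setPageIndex(1);                                  // 1-60
In the demo below, only the query is set, based on the text the user types: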
Code:
autoCompleteTextView.addTextChangedListener(new TextWatcher() {
@Override
public void beforeTextChanged(CharSequence charSequence, int i, int i1, int i2) {
}
@Override
public void onTextChanged(CharSequence charSequence, int i, int i1, int i2) {
if (timer != null) {
timer.cancel();
}
}
@Override
public void afterTextChanged(final Editable text) {
timer = new Timer();
timer.schedule(new TimerTask() {
@Override
public void run() {
if (text.length() > 3) {
list.clear();
final TextSearchRequest textSearchRequest = new TextSearchRequest();
textSearchRequest.setQuery(text.toString());
searchService.textSearch(textSearchRequest, new SearchResultListener<TextSearchResponse>() {
@Override
public void onSearchResult(TextSearchResponse response) {
for (Site site : response.getSites()) {
LatLng latLng = new LatLng(site.getLocation().getLat(), site.getLocation().getLng());
SearchResult searchResult = new SearchResult(latLng, site.getAddress().getLocality(), site.getName());
String result = site.getName() + "," + site.getAddress().getSubAdminArea() + "\n" +
site.getAddress().getAdminArea() + "," +
site.getAddress().getLocality() + "\n" +
site.getAddress().getCountry() + "\n" +
site.getAddress().getPostalCode() + "\n";
list.add(result);
searchList.add(searchResult);
}
mAutoCompleteAdapter.clear();
mAutoCompleteAdapter.addAll(list);
mAutoCompleteAdapter.notifyDataSetChanged();
autoCompleteTextView.setAdapter(mAutoCompleteAdapter);
Toast.makeText(MainActivity.this, String.valueOf(response.getTotalCount()), Toast.LENGTH_SHORT).show();
}
@Override
public void onSearchError(SearchStatus searchStatus) {
Toast.makeText(MainActivity.this, searchStatus.getErrorCode(), Toast.LENGTH_SHORT).show();
}
});
}
}
}, 200);
autoCompleteTextView.setOnItemClickListener(new AdapterView.OnItemClickListener() {
@Override
public void onItemClick(AdapterView<?> parent, View view, int position, long id) {
mAutoCompleteAdapter.getItem(position);
selectedPostion = searchList.get(position);
try {
createMarker(new LatLng(selectedPostion.latLng.latitude, selectedPostion.latLng.longitude));
} catch (IOException e) {
e.printStackTrace();
}
}
});
}
});
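The item click listener above calls a createMarker method that is not shown in the post. A possible implementation, assuming the hMap reference obtained in the map-loading sketch earlier, could look like this; the title, zoom level, and the IOException declaration (which matches the try/catch at the call site, e.g. if a Geocoder were used) are assumptions.
Code:
private void createMarker(LatLng latLng) throws IOException {
    // Clear previous markers, drop a new one at the selected place, and move the camera to it.
    hMap.clear();
    hMap.addMarker(new MarkerOptions().position(latLng).title("Selected place"));
    hMap.animateCamera(CameraUpdateFactory.newLatLngZoom(latLng, 15f));
}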
Output:
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMS-References/hms-map-cameraupdate
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/android-sdk-introduction-0000001050158571
For more information like this, you can visit the HUAWEI Developer Forum.
Original link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201257887466590240&fid=0101187876626530001
Introduction
Richard Yu introduced Huawei HMS Core 4.0 at a launch event a while ago. Please check the launch event information:
What does the global release of HMS Core 4.0 mean?
Machine Learning Kit (MLKit) is one of the most important services.
What can MLKIT do? Which problems can it solve during application development?
Today, let’s take face detection as an example to show you the powerful functions of MLKIT and the convenience it provides for developers.
1.1 Capabilities Provided by MLKIT Face Detection
First, let’s look at the face detection capability of Huawei Machine Learning Service (MLKIT).
As shown in the animation, face detection can recognize the face orientation, detect facial expressions (such as happy, disgusted, surprised, sad, and angry), detect facial attributes (such as gender, age, and wearables), detect whether the eyes are open or closed, and detect the coordinates of features such as the face, nose, eyes, lips, and eyebrows. In addition, multiple faces can be detected at the same time.
Tips: This function is free of charge and covers all Android models.
2 Development of the Multi-Face Smile Photographing Function
Today, I will use the multi-face detection and expression detection capabilities of MLKIT to write a small smile-snapshot demo and put them into practice.
To download the Github demo source code, click here (the project directory is Smile-Camera).
2.1 Development Preparations
The preparations for developing any Huawei HMS kit are similar; the only differences are the Maven repository to add and the SDK to introduce.
1. Add the Huawei Maven repository to the project-level build.gradle file.
Incrementally add the following Maven addresses:
Code:
buildscript {
repositories {
maven {url 'http://developer.huawei.com/repo/'}
}
}
allprojects {
repositories {
maven {url 'http://developer.huawei.com/repo/'}
}
}
2. Add the SDK dependency to the build.gradle file at the application level.
Introduce the facial recognition SDK and basic SDK.
Code:
dependencies{
// Introduce the basic SDK.
implementation 'com.huawei.hms:ml-computer-vision:1.0.2.300'
// Introduce the face detection capability package.
implementation 'com.huawei.hms:ml-computer-vision-face-recognition-model:1.0.2.300'
}
3. Add the model declaration to the AndroidManifest.xml file so that the model is downloaded automatically.
This is mainly used to update the model: after the algorithm is optimized, the new model can be automatically downloaded to the phone.
Code:
<manifest ...>
    <application ...>
        <meta-data
            android:name="com.huawei.hms.ml.DEPENDENCY"
            android:value="face" />
    </application>
</manifest>
4. Apply for camera and storage permissions in the AndroidManifest.xml file.
Code:
<!-- Camera permission -->
<uses-permission android:name="android.permission.CAMERA" />
<!--Use the storage permission.-->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
2.2 Code development
1. Create a face analyzer and take a photo when a smile is detected.
The photo is taken after detection, as follows:
1) Configure the analyzer parameters.
2) Pass the analyzer parameter settings to the analyzer.
3) In analyzer.setTransactor, override transactResult to process the face detection result. The detection returns a confidence level (the smiling probability); you only need to compare it with a threshold.
Code:
private MLFaceAnalyzer analyzer;
private void createFaceAnalyzer() {
MLFaceAnalyzerSetting setting =
new MLFaceAnalyzerSetting.Factory()
.setFeatureType(MLFaceAnalyzerSetting.TYPE_FEATURES)
.setKeyPointType(MLFaceAnalyzerSetting.TYPE_UNSUPPORT_KEYPOINTS)
.setMinFaceProportion(0.1f)
.setTracingAllowed(true)
.create();
this.analyzer = MLAnalyzerFactory.getInstance().getFaceAnalyzer(setting);
this.analyzer.setTransactor(new MLAnalyzer.MLTransactor<MLFace>() {
@Override
public void destroy() {
}
@Override
public void transactResult(MLAnalyzer.Result<MLFace> result) {
SparseArray<MLFace> faceSparseArray = result.getAnalyseList();
int flag = 0;
for (int i = 0; i < faceSparseArray.size(); i++) {
MLFaceEmotion emotion = faceSparseArray.valueAt(i).getEmotions();
if (emotion.getSmilingProbability() > smilingPossibility) {
flag++;
}
}
if (flag > faceSparseArray.size() * smilingRate && safeToTakePicture) {
safeToTakePicture = false;
mHandler.sendEmptyMessage(TAKE_PHOTO);
}
}
});
}
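The transactResult callback above refers to several members that the snippet does not declare (smilingPossibility, smilingRate, safeToTakePicture, TAKE_PHOTO, STOP_PREVIEW, and mHandler). A possible set of declarations is sketched below; the message codes and threshold values are assumptions for illustration.
Code:
private static final int TAKE_PHOTO = 1;
private static final int STOP_PREVIEW = 2;

// Example thresholds: a face counts as smiling above 0.8 probability,
// and the photo is taken when more than 80% of the detected faces are smiling.
private final float smilingPossibility = 0.8f;
private final float smilingRate = 0.8f;
private boolean safeToTakePicture = true;

private final Handler mHandler = new Handler(Looper.getMainLooper()) {
    @Override
    public void handleMessage(Message msg) {
        switch (msg.what) {
            case TAKE_PHOTO:
                takePhoto();
                break;
            case STOP_PREVIEW:
                // Stop or release the camera preview here if needed.
                break;
            default:
                break;
        }
    }
};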
Photographing and storage:
Code:
private void takePhoto() {
this.mLensEngine.photograph(null,
new LensEngine.PhotographListener() {
@Override
public void takenPhotograph(byte[] bytes) {
mHandler.sendEmptyMessage(STOP_PREVIEW);
Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
saveBitmapToDisk(bitmap);
}
});
}
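takePhoto calls saveBitmapToDisk, which is not shown in the post either. A minimal sketch that writes the JPEG to the app's external files directory could look like this; the directory and file-naming scheme are just examples.
Code:
private void saveBitmapToDisk(Bitmap bitmap) {
    File dir = new File(getExternalFilesDir(null), "SmileCamera");
    if (!dir.exists() && !dir.mkdirs()) {
        Log.e("SmileCamera", "Failed to create output directory");
        return;
    }
    File file = new File(dir, "smile_" + System.currentTimeMillis() + ".jpg");
    try (FileOutputStream out = new FileOutputStream(file)) {
        // Compress the captured frame as a JPEG file.
        bitmap.compress(Bitmap.CompressFormat.JPEG, 100, out);
        out.flush();
    } catch (IOException e) {
        Log.e("SmileCamera", "Failed to save photo", e);
    }
}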
2. Create a visual engine to capture dynamic video streams from cameras and send the streams to the analyzer.
Code:
private void createLensEngine() {
Context context = this.getApplicationContext();
// Create LensEngine
this.mLensEngine = new LensEngine.Creator(context, this.analyzer).setLensType(this.lensType)
.applyDisplayDimension(640, 480)
.applyFps(25.0f)
.enableAutomaticFocus(true)
.create();
}
3. Apply for permissions dynamically, and attach the analyzer and lens engine creation code.
Code:
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
this.setContentView(R.layout.activity_live_face_analyse);
if (savedInstanceState != null) {
this.lensType = savedInstanceState.getInt("lensType");
}
this.mPreview = this.findViewById(R.id.preview);
this.createFaceAnalyzer();
this.findViewById(R.id.facingSwitch).setOnClickListener(this);
// Checking Camera Permissions
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
this.createLensEngine();
} else {
this.requestCameraPermission();
}
}
Code:
private void requestCameraPermission() {
final String[] permissions = new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE};
if (!ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CAMERA)) {
ActivityCompat.requestPermissions(this, permissions, LiveFaceAnalyseActivity.CAMERA_PERMISSION_CODE);
return;
}
}
Code:
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions,
@NonNull int[] grantResults) {
if (requestCode != LiveFaceAnalyseActivity.CAMERA_PERMISSION_CODE) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
this.createLensEngine();
return;
}
}
3 Conclusion
Is the development process simple? A new feature can be developed in 30 minutes. Let's experience the effect of the multi-face smile capture.
Multi-person smiling face snapshot:
Based on the face detection capability, what other features could you build? Use your imagination! Here are a few hints:
1. Add interesting decorative effects by identifying the locations of facial features such as ears, eyes, nose, mouth, and eyebrows.
2. Identify facial contours and stretch the contours to generate interesting portraits or develop facial beautification functions for contour areas.
3. Develop parental control functions based on age identification, to help manage children's use of electronic devices.
4. Develop an eye comfort feature by detecting how long the eyes stay focused on the screen.
5. Implement liveness detection through random commands (such as shaking the head, blinking, and opening the mouth).
6. Recommend offerings to users based on their age and gender.
For details about the development guide, visit the HUAWEI Developers website.
Huawei Search Kit provides awesome capabilities for building your own search engine, or even going beyond that. With Search Kit you can add a mini web browser with web search capabilities to your app; this way, the user can browse the Internet without knowing the complete URL of a website. In this article we will build a web browser application powered by Search Kit.
Previous Requirements
A verified developer account
A configured project in AppGallery Connect
Configuring and adding the Search Kit
Once you have properly configured your project in AGC, go to "Manage APIs" in your project settings and enable Search Kit.
Note: Search Kit requires you to choose a Data Storage Location; make sure to choose a DSL that supports the countries where you plan to release your app.
Once you have enabled the API and chosen the Data Storage Location, download (or re-download) your agconnect-services.json and add it to your Android project. Then, add the latest Search Kit dependency to your app-level build.gradle file; we will also need the related coroutines and lifecycle dependencies.
Code:
implementation "androidx.lifecycle:lifecycle-runtime-ktx:2.2.0"
implementation "android.arch.lifecycle:extensions:1.1.1"
implementation 'com.huawei.agconnect:agconnect-core:1.4.2.301'
implementation 'com.huawei.hms:searchkit:5.0.4.303'
implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-core:1.3.9'
implementation 'org.jetbrains.kotlinx:kotlinx-coroutines-android:1.3.9'
Initializing the SDK
Before using Search Kit, you must apply for an access token in order to authenticate the requests you make with the kit. If you don't provide the proper credentials, Search Kit will not work.
To get your access token, perform the following POST request:
Endpoint:
https://oauth-login.cloud.huawei.com/oauth2/v3/token
Headers:
content-type: application/x-www-form-urlencoded
Body:
"grant_type=client_credentials&client_secret=APP_SECRET&client_id=APP_ID"
Your App ID and App Secret are available from your app dashboard in AGC.
Note: Before sending the request, you must URL-encode your App Secret.
To perform the token request, let's create a request helper.
HTTPHelper.kt
Code:
object HTTPHelper {
fun sendHttpRequest(
url: URL,
method: String,
headers: HashMap<String, String>,
body: ByteArray
): String {
val conn = url.openConnection() as HttpURLConnection
conn.apply {
requestMethod = method
doOutput = true
doInput = true
for (key in headers.keys) {
setRequestProperty(key, headers[key])
}
if(method!="GET"){
outputStream.write(body)
outputStream.flush()
}
}
val inputStream = if (conn.responseCode < 400) {
conn.inputStream
} else conn.errorStream
return convertStreamToString(inputStream).also { conn.disconnect() }
}
private fun convertStreamToString(input: InputStream): String {
return BufferedReader(InputStreamReader(input)).use {
val response = StringBuffer()
var inputLine = it.readLine()
while (inputLine != null) {
response.append(inputLine)
inputLine = it.readLine()
}
it.close()
response.toString()
}
}
}
We will take advantage of Kotlin extensions to add the token request to the SearchKitInstance class. If we initialize Search Kit from the Application class, a fresh token will be obtained on each app startup. Once you have retrieved the token, call SearchKitInstance.getInstance().setInstanceCredential(token).
BrowserApplication.kt
Code:
class BrowserApplication : Application() {
companion object {
const val TOKEN_URL = "https://oauth-login.cloud.huawei.com/oauth2/v3/token"
const val APP_SECRET = "YOUR_OWN_APP_SECRET"
}
override fun onCreate() {
super.onCreate()
val appID = AGConnectServicesConfig
.fromContext(this)
.getString("client/app_id")
SearchKitInstance.init(this, appID)
CoroutineScope(Dispatchers.IO).launch {
SearchKitInstance.instance.refreshToken(this@BrowserApplication)
}
}
}
fun SearchKitInstance.refreshToken(context: Context) {
val config = AGConnectServicesConfig.fromContext(context)
val appId = config.getString("client/app_id")
val url = URL(BrowserApplication.TOKEN_URL)
val headers = HashMap<String, String>().apply {
put("content-type", "application/x-www-form-urlencoded")
}
val msgBody = MessageFormat.format(
"grant_type={0}&client_secret={1}&client_id={2}",
"client_credentials", URLEncoder.encode(APP_SECRET, "UTF-8"), appId
).toByteArray(Charsets.UTF_8)
val response = HTTPHelper.sendHttpRequest(url, "POST", headers, msgBody)
Log.e("token", response)
val accessToken = JSONObject(response).let {
if (it.has("access_token")) {
it.getString("access_token")
} else ""
}
setInstanceCredential(accessToken)
}
In short, the basic setup of Search Kit is as follows:
Init the service by calling SearchKitInstance.init(Context, appID)
Retrieve an app-level access token
Call SearchKitInstance.getInstance().setInstanceCredential(token) to specify the access token which Search Kit will use to authenticate the requests
Auto Complete Widget
Let's create a custom widget which can be added to any Activity or Fragment; this widget will show search suggestions to the user when they begin typing in the search bar.
First, we must design the search bar; it will contain an ImageView and an EditText.
view_search.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:background="@drawable/gray_rec"
android:paddingHorizontal="5dp"
android:paddingVertical="5dp"
>
<ImageView
android:id="@+id/imageView"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:src="@android:drawable/ic_menu_search"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent"
app:srcCompat="@android:drawable/ic_menu_search" />
<EditText
android:id="@+id/textSearch"
android:layout_width="0dp"
android:layout_height="wrap_content"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintEnd_toEndOf="parent"
app:layout_constraintStart_toEndOf="@+id/imageView"
app:layout_constraintTop_toTopOf="parent"
/>
</androidx.constraintlayout.widget.ConstraintLayout>
Now it is time to create the widget layout; it will contain our search bar and a RecyclerView to show the suggestions.
For more details, you can check https://forums.developer.huawei.com/forumPortal/en/topic/0202455305475510024
Overview
In this article, I will create a Doctor Consult Android application in which I will integrate HMS Core kits such as Huawei ID, Crash, and Analytics.
Huawei ID Service Introduction
Huawei ID login provides you with simple, secure, and quick sign-in and authorization functions. Instead of entering accounts and passwords and waiting for authentication, users can just tap the Sign in with HUAWEI ID button to quickly and securely sign in to your app with their HUAWEI IDs.
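Since the post does not show the sign-in call itself, here is a rough sketch of the standard HUAWEI ID sign-in flow with the hwid dependency added below. The request code, the ID-token option, and the log messages are examples, not part of the original project.
Code:
private static final int REQUEST_SIGN_IN = 1002; // hypothetical request code

private void signInWithHuaweiId() {
    // Build the auth params (an ID token is requested here as an example).
    HuaweiIdAuthParams authParams = new HuaweiIdAuthParamsHelper(HuaweiIdAuthParams.DEFAULT_AUTH_REQUEST_PARAM)
            .setIdToken()
            .createParams();
    HuaweiIdAuthService authService = HuaweiIdAuthManager.getService(this, authParams);
    startActivityForResult(authService.getSignInIntent(), REQUEST_SIGN_IN);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_SIGN_IN) {
        Task<AuthHuaweiId> authTask = HuaweiIdAuthManager.parseAuthResultFromIntent(data);
        if (authTask.isSuccessful()) {
            AuthHuaweiId huaweiAccount = authTask.getResult();
            Log.d("SignIn", "Signed in as: " + huaweiAccount.getDisplayName());
        } else {
            Log.e("SignIn", "Sign-in failed: " + authTask.getException().getMessage());
        }
    }
}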
Prerequisite
Huawei Phone EMUI 3.0 or later.
Non-Huawei phones Android 4.4 or later (API level 19 or higher).
HMS Core APK 4.0.0.300 or later
Android Studio
AppGallery Account.
App Gallery Integration process
Sign In and Create or Choose a project on AppGallery Connect portal.
Navigate to Project settings and download the configuration file.
Navigate to General Information, and then provide Data Storage location.
App Development
Create A New Project.
Configure Project Gradle.
Code:
buildscript {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.google.gms:google-services:4.3.5'
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
task clean(type: Delete) {
    delete rootProject.buildDir
}
Configure App Gradle.
Code:
api 'com.huawei.hms:dynamicability:1.0.11.302'
implementation 'com.huawei.agconnect:agconnect-auth:1.4.1.300'
implementation 'com.huawei.hms:hwid:5.3.0.302'
implementation 'com.huawei.hms:ads-lite:13.4.30.307'
implementation 'com.huawei.agconnect:agconnect-remoteconfig:1.6.0.300'
implementation 'com.huawei.hms:hianalytics:5.0.3.300'
implementation 'com.huawei.agconnect:agconnect-crash:1.4.1.300'
Configure AndroidManifest.xml.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
Create Activity class with XML UI.
MainActivity:
Code:
public class MainActivity extends AppCompatActivity {
Toolbar t;
DrawerLayout drawer;
EditText nametext;
EditText agetext;
ImageView enter;
RadioButton male;
RadioButton female;
RadioGroup rg;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
drawer = findViewById(R.id.draw_activity);
t = (Toolbar) findViewById(R.id.toolbar);
nametext = findViewById(R.id.nametext);
agetext = findViewById(R.id.agetext);
enter = findViewById(R.id.imageView7);
male = findViewById(R.id.male);
female = findViewById(R.id.female);
rg=findViewById(R.id.rg1);
ActionBarDrawerToggle toggle = new ActionBarDrawerToggle(this, drawer, t, R.string.navigation_drawer_open, R.string.navigation_drawer_close);
drawer.addDrawerListener(toggle);
toggle.syncState();
NavigationView nav = findViewById(R.id.nav_view);
enter.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
String name = nametext.getText().toString();
String age = agetext.getText().toString();
String gender= new String();
int id=rg.getCheckedRadioButtonId();
switch(id)
{
case R.id.male:
gender = "Mr.";
break;
case R.id.female:
gender = "Ms.";
break;
}
Intent symp = new Intent(MainActivity.this, SymptomsActivity.class);
symp.putExtra("name",name);
symp.putExtra("gender",gender);
startActivity(symp);
}
});
nav.setNavigationItemSelectedListener(new NavigationView.OnNavigationItemSelectedListener() {
@Override
public boolean onNavigationItemSelected(@NonNull MenuItem menuItem) {
switch(menuItem.getItemId())
{
case R.id.nav_sos:
Intent in = new Intent(MainActivity.this, call.class);
startActivity(in);
break;
case R.id.nav_share:
Intent myIntent = new Intent(Intent.ACTION_SEND);
myIntent.setType("text/plain");
startActivity(Intent.createChooser(myIntent,"SHARE USING"));
break;
case R.id.nav_hosp:
Intent browserIntent = new Intent(Intent.ACTION_VIEW);
browserIntent.setData(Uri.parse("https://www.google.com/maps/search/hospitals+near+me"));
startActivity(browserIntent);
break;
case R.id.nav_cntus:
Intent c_us = new Intent(MainActivity.this, activity_contact_us.class);
startActivity(c_us);
break;
}
drawer.closeDrawer(GravityCompat.START);
return true;
}
});
}
}
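The introduction also mentions the Crash and Analytics kits, whose dependencies are added above but which are not wired up in the activity. The snippet below is a minimal, hedged sketch of how they are typically initialized; the event name and parameters are examples only.
Code:
// Somewhere after the app has started, e.g. in onCreate() of the launcher activity.
// Enable Analytics logging and report a sample custom event.
HiAnalyticsTools.enableLog();
HiAnalyticsInstance analyticsInstance = HiAnalytics.getInstance(this);
Bundle params = new Bundle();
params.putString("screen", "MainActivity"); // example parameter
analyticsInstance.onEvent("app_opened", params); // example event name

// Enable crash collection (on by default; shown here for completeness)
// and optionally trigger a test crash while debugging.
AGConnectCrash.getInstance().enableCrashCollection(true);
// AGConnectCrash.getInstance().testIt(this);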
App Build Result
Tips and Tricks
Identity Kit displays the HUAWEI ID registration or sign-in page first. You can use the functions provided by Identity Kit only after signing in using a registered HUAWEI ID.
Conclusion
In this article, we have learned how to integrate Huawei ID into an Android application. After reading this article completely, you can easily implement Huawei ID in the Doctor Consult application.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Docs:
https://developer.huawei.com/consum.../HMSCore-Guides/introduction-0000001050048870
Overview
In this article, I will create a DoctorConsultApp Android application in which I will integrate HMS Core kits such as Huawei ID, Crash, and Analytics.
Huawei ID Service Introduction
Huawei ID login provides you with simple, secure, and quick sign-in and authorization functions. Instead of entering accounts and passwords and waiting for authentication, users can just tap the Sign in with HUAWEI ID button to quickly and securely sign in to your app with their HUAWEI IDs.
Prerequisite
Huawei Phone EMUI 3.0 or later.
Non-Huawei phones Android 4.4 or later (API level 19 or higher).
HMS Core APK 4.0.0.300 or later
Android Studio
AppGallery Account.
App Gallery Integration process
Sign In and Create or Choose a project on AppGallery Connect portal.
Navigate to Project settings and download the configuration file.
Navigate to General Information, and then provide Data Storage location.
App Development
Create A New Project.
Configure Project Gradle.
Code:
// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath 'com.huawei.agconnect:agcp:1.4.2.300'
// NOTE: Do not place your application dependencies here; they belong
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
Configure App Gradle.
Code:
implementation 'com.huawei.hms:identity:5.3.0.300'
implementation 'com.huawei.agconnect:agconnect-auth:1.4.1.300'
implementation 'com.huawei.hms:hwid:5.3.0.302'
Configure AndroidManifest.xml.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Create Activity class with XML UI.
MainActivity:
Code:
public class MainActivity extends AppCompatActivity {
Toolbar t;
DrawerLayout drawer;
EditText nametext;
EditText agetext;
ImageView enter;
RadioButton male;
RadioButton female;
RadioGroup rg;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
drawer = findViewById(R.id.draw_activity);
t = (Toolbar) findViewById(R.id.toolbar);
nametext = findViewById(R.id.nametext);
agetext = findViewById(R.id.agetext);
enter = findViewById(R.id.imageView7);
male = findViewById(R.id.male);
female = findViewById(R.id.female);
rg=findViewById(R.id.rg1);
ActionBarDrawerToggle toggle = new ActionBarDrawerToggle(this, drawer, t, R.string.navigation_drawer_open, R.string.navigation_drawer_close);
drawer.addDrawerListener(toggle);
toggle.syncState();
NavigationView nav = findViewById(R.id.nav_view);
enter.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
String name = nametext.getText().toString();
String age = agetext.getText().toString();
String gender= new String();
int id=rg.getCheckedRadioButtonId();
switch(id)
{
case R.id.male:
gender = "Mr.";
break;
case R.id.female:
gender = "Ms.";
break;
}
Intent symp = new Intent(MainActivity.this, SymptomsActivity.class);
symp.putExtra("name",name);
symp.putExtra("gender",gender);
startActivity(symp);
}
});
nav.setNavigationItemSelectedListener(new NavigationView.OnNavigationItemSelectedListener() {
@Override
public boolean onNavigationItemSelected(@NonNull MenuItem menuItem) {
switch(menuItem.getItemId())
{
case R.id.nav_sos:
Intent in = new Intent(MainActivity.this, call.class);
startActivity(in);
break;
case R.id.nav_share:
Intent myIntent = new Intent(Intent.ACTION_SEND);
myIntent.setType("text/plain");
startActivity(Intent.createChooser(myIntent,"SHARE USING"));
break;
case R.id.nav_hosp:
Intent browserIntent = new Intent(Intent.ACTION_VIEW);
browserIntent.setData(Uri.parse("https://www.google.com/maps/search/hospitals+near+me"));
startActivity(browserIntent);
break;
case R.id.nav_cntus:
Intent c_us = new Intent(MainActivity.this, activity_contact_us.class);
startActivity(c_us);
break;
}
drawer.closeDrawer(GravityCompat.START);
return true;
}
});
}
}
App Build Result
Tips and Tricks
Identity Kit displays the HUAWEI ID registration or sign-in page first. You can use the functions provided by Identity Kit only after signing in with a registered HUAWEI ID.
Conclusion
In this article, we have learned how to integrate Huawei ID into an Android application. After reading this article completely, you can easily implement Huawei ID in the DoctorConsultApp application.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Docs:
https://developer.huawei.com/consum.../HMSCore-Guides/introduction-0000001050048870
Video Training:
https://developer.huawei.com/consumer/en/training/course/video/101583015541549183