For more information like this, you can visit the HUAWEI Developer Forum.
This article covers how to draw polylines for the different direction types: Driving, Bicycling, and Walking. To get the directions response you can use OkHttpClient, Volley, or Retrofit; this article uses the Retrofit library. Check the code below.
Code:
private void callDirectionsApi(final String directionType) {
if (!NetworkUtil.isNetworkConnected(this)) {
showNetworkDialog();
return;
}
Origin origin = new Origin();
origin.setLat(13.242430); // Current Location Latitude
origin.setLng(79.277502); // Current Location Longitude
Destination destination = new Destination();
destination.setLat(destinationLocation.getLat());
destination.setLng(destinationLocation.getLng());
DirectionsRequest request = new DirectionsRequest();
request.setOrigin(origin);
request.setDestination(destination);
ApiInterface apiInterface = ApiClient.getClient().create(ApiInterface.class);
Call<DirectionsResponse> call = apiInterface.getDirections(directionType, request);
call.enqueue(new Callback<DirectionsResponse>() {
@Override
public void onResponse(@NotNull Call<DirectionsResponse> call, @NotNull Response<DirectionsResponse> response) {
if (response.isSuccessful()) {
startEndMarkerList = new ArrayList<>();
DirectionsResponse directionsResponse = response.body();
assert directionsResponse != null;
Log.e(TAG + " Response", directionsResponse.toString());
List<RoutesItem> routesItemList = directionsResponse.getRoutes();
if (routesItemList.size() > 0) {
List<StepsItem> steps = routesItemList.get(0).getPaths().get(0).getSteps();
lineLatLngList = new ArrayList<>();
for (StepsItem stepItem : steps) {
List<PolylineItem> polylineItems = stepItem.getPolyline();
for (PolylineItem polylineItem : polylineItems) {
LatLng latLng = new LatLng(polylineItem.getLat(), polylineItem.getLng());
lineLatLngList.add(latLng);
}
}
Iterable<LatLng> latLngIterable = lineLatLngList;
PolylineOptions polyOptions = new PolylineOptions();
if ("bicycling".equals(directionType)) {
polyOptions.width(3);
} else if ("driving".equals(directionType)) {
polyOptions.width(5);
} else {
List<PatternItem> pattern = Arrays.asList(new Dot(), new Gap(70));
polyOptions.pattern(pattern);
polyOptions.width(7);
}
polyOptions.color(Color.RED);
polyOptions.jointType(ROUND);
directionsPolyline = huaweiMap.addPolyline(polyOptions);
LatLngBounds.Builder builder = new LatLngBounds.Builder();
for (LatLng latLng : latLngIterable) {
builder.include(latLng);
}
LatLngBounds bounds = builder.build();
CameraUpdate cameraUpdate = CameraUpdateFactory.newLatLngBounds(bounds, 50);
huaweiMap.animateCamera(cameraUpdate);
}
addPolylineCap(lineLatLngList.get(0), R.drawable.grey_circle);
addPolylineCap(lineLatLngList.get(lineLatLngList.size() - 1), R.drawable.white_circle);
addMarkerToMap(destinationLocation, false, destinationName);
animatePolyLine();
}
}
@Override
public void onFailure(@NotNull Call<DirectionsResponse> call, @NotNull Throwable t) {
Log.e(TAG + " Response Error", Objects.requireNonNull(t.getMessage()));
}
});
}
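The snippet above assumes a Retrofit client and service interface. Below is a minimal sketch of what ApiClient and ApiInterface could look like; the base URL, the route path, and the way the API key is supplied are assumptions here, so verify the exact endpoint format against the Directions API documentation.
Code:
import retrofit2.Call;
import retrofit2.Retrofit;
import retrofit2.converter.gson.GsonConverterFactory;
import retrofit2.http.Body;
import retrofit2.http.POST;
import retrofit2.http.Path;

// Sketch only: endpoint path and base URL are assumptions, check the Directions API docs.
public interface ApiInterface {
    @POST("mapApi/v1/routeService/{type}")
    Call<DirectionsResponse> getDirections(@Path("type") String directionType,
                                           @Body DirectionsRequest request);
}

public class ApiClient {
    private static Retrofit retrofit;

    public static Retrofit getClient() {
        if (retrofit == null) {
            retrofit = new Retrofit.Builder()
                    // assumed base URL; the real request also needs your API key appended, e.g. "?key=API_KEY"
                    .baseUrl("https://mapapi.cloud.huawei.com/")
                    .addConverterFactory(GsonConverterFactory.create())
                    .build();
        }
        return retrofit;
    }
}
DirectionsRequest, Origin, Destination, and DirectionsResponse are plain model classes that mirror the request and response JSON of the Directions API.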
Dotted and Dashed Polyline:
To draw dotted and dashed polylines, the following three classes are used:
1. Dot
2. Dash
3. Gap
Here, the polyline width is changed based on directionType, and for the walking path a pattern is added using new Dot() and new Gap(70).
If we use new Dash() instead of new Dot(), we get a dashed polyline.
Code:
if ("bicycling".equals(directionType)) {
polyOptions.width(3);
} else if ("driving".equals(directionType)) {
polyOptions.width(6);
} else {
List<PatternItem> pattern = Arrays.asList(new Dot(), new Gap(70));
polyOptions.pattern(pattern);
polyOptions.width(10);
}
Polyline Caps:
To add start and end caps to the polyline, check the code below.
Code:
addPolylineCap(lineLatLngList.get(0), R.drawable.grey_circle);
addPolylineCap(lineLatLngList.get(lineLatLngList.size() - 1), R.drawable.white_circle);
Instead of adding markers like this, you can use the default cap classes:
1. RoundCap
2. SquareCap
To add these caps to the polyline, check the code below.
Code:
polyOptions.startCap(new RoundCap());
polyOptions.endCap(new CustomCap(getEndCapIcon(Color.BLACK), 100));
Here we can set new RoundCap() or new SquareCap(), and we can also supply a custom cap. Check the code below for adding a custom cap.
Code:
public BitmapDescriptor getEndCapIcon(int color) {
// mipmap icon - white arrow, pointing up, with point at center of image
// you will want to create: mdpi=24x24, hdpi=36x36, xhdpi=48x48, xxhdpi=72x72, xxxhdpi=96x96
Drawable drawable = ContextCompat.getDrawable(this, R.mipmap.arrow);
drawable.setBounds(0, 0, drawable.getIntrinsicWidth(), drawable.getIntrinsicHeight());
drawable.setColorFilter(color, PorterDuff.Mode.MULTIPLY);
android.graphics.Bitmap bitmap = android.graphics.Bitmap.createBitmap(drawable.getIntrinsicWidth(), drawable.getIntrinsicHeight(), Bitmap.Config.ARGB_8888);
Canvas canvas = new Canvas(bitmap);
drawable.draw(canvas);
return BitmapDescriptorFactory.fromBitmap(bitmap);
}
Code:
addMarkerToMap(destinationLocation, false, destinationName);
To add a custom marker to the map, check my previous article.
Article Link: To customize Maps with Custom Marker and Custom Info Window - Maps Kit
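For reference, a minimal sketch of what an addMarkerToMap(…) helper could look like is shown below. The parameter types and the default icon are assumptions based on how the method is called above; the full custom-marker and custom-info-window version is in the linked article.
Code:
// Hypothetical helper, sketched from how it is called in callDirectionsApi().
private void addMarkerToMap(Destination location, boolean isOrigin, String title) {
    MarkerOptions options = new MarkerOptions()
            .position(new LatLng(location.getLat(), location.getLng()))
            .title(title)
            .icon(BitmapDescriptorFactory.defaultMarker()); // swap in a custom icon as described in the linked article
    Marker marker = huaweiMap.addMarker(options);
    if (!isOrigin) {
        // keep a reference so the info window can be shown after the polyline animation ends
        selectedMarker = marker;
    }
}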
Animating the Polyline like Uber and Swiggy
To animate polylines, we can use a ValueAnimator. Check the code below.
Code:
private void animatePolyLine() {
ValueAnimator animator = ValueAnimator.ofInt(0, 100);
animator.setDuration(2000);
animator.setInterpolator(new LinearInterpolator());
animator.addUpdateListener(new ValueAnimator.AnimatorUpdateListener() {
@Override
public void onAnimationUpdate(ValueAnimator animator) {
List<LatLng> latLngList = directionsPolyline.getPoints();
int initialPointSize = latLngList.size();
int animatedValue = (int) animator.getAnimatedValue();
int newPoints = (animatedValue * lineLatLngList.size()) / 100;
if (initialPointSize < newPoints) {
latLngList.addAll(lineLatLngList.subList(initialPointSize, newPoints));
directionsPolyline.setPoints(latLngList);
}
}
});
animator.addListener(polyLineAnimationListener);
animator.start();
}
Using AnimatorListener, we can do much more based upon our requirements. Check the below code.
Code:
Animator.AnimatorListener polyLineAnimationListener = new Animator.AnimatorListener() {
@Override
public void onAnimationStart(Animator animator) {
}
@Override
public void onAnimationEnd(Animator animator) {
if (selectedMarker != null) {
selectedMarker.showInfoWindow();
}
}
@Override
public void onAnimationCancel(Animator animator) {
}
@Override
public void onAnimationRepeat(Animator animator) {
}
};
The output is shown in the image below.
Conclusion:
This article covers some of the functionality of the Directions API. There are many more concepts to explore, so give them a try.
Reference links
Driving Direction API:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-References-V5/directions-driving-0000001050161496-V5
Walking Directions API:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-References-V5/directions-walking-0000001050161494-V5
Bicycling Directions API:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-References-V5/directions-bicycling-0000001050163449-V5
Will it support returning the shortest available path from origin to destination?
Start your project
First of all, install Eclipse and the Android SDK.
Then create a new Android project; in my case it's called com.xda.tutorialscript.
Then we need to create the required layout items.
Edit res->layout->main.xml.
Add an EditText, a Button and a TextView.
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="fill_parent"
android:layout_height="fill_parent"
android:orientation="vertical" >
<EditText
android:id="@+input/input1"
android:layout_height="wrap_content"
android:layout_width="fill_parent"
/>
<Button
android:id="@+button/run"
android:layout_height="wrap_content"
android:layout_width="wrap_content"
android:text="Go"
/>
<TextView
android:id="@+text/text1"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:text="Output" />
</LinearLayout>
After that we need to create the code for the root command. Below is my version, with output handling and a TextView parameter for the output; that way you can use the same command from many activities and have the output shown on the TextView.
// Root Access script runner
void execCommandLine(String command, TextView tv)
{
Runtime runtime = Runtime.getRuntime();
Process proc = null;
OutputStreamWriter osw = null;
// Running the Script
try
{
proc = runtime.exec("su");
osw = new OutputStreamWriter(proc.getOutputStream());
osw.write(command);
osw.flush();
osw.close();
}
// If return error
catch (IOException ex)
{
// Log error
Log.e("execCommandLine()", "Command resulted in an IO Exception: " + command);
return;
}
// Try to close the process
finally
{
if (osw != null)
{
try
{
osw.close();
}
catch (IOException e){}
}
}
try
{
proc.waitFor();
}
catch (InterruptedException e){}
// Display on screen if error
if (proc.exitValue() != 0)
{
Log.e("execCommandLine()", "Command returned error: " + command + "\n Exit code: " + proc.exitValue());
AlertDialog.Builder builder = new AlertDialog.Builder(ScriptrunActivity.this);
builder.setMessage(command + "\nWas not executed successfully!");
builder.setNeutralButton("OK", null);
AlertDialog dialog = builder.create();
dialog.setTitle("Script Error");
dialog.show();
}
BufferedReader reader = new BufferedReader(
new InputStreamReader(proc.getInputStream()));
int read;
char[] buffer = new char[4096];
StringBuffer output = new StringBuffer();
try {
while ((read = reader.read(buffer)) > 0) {
output.append(buffer, 0, read);
}
} catch (IOException e) {
// TODO Auto-generated catch block
e.printStackTrace();
}
String exit = output.toString();
if(exit != null && exit.length() == 0) {
exit = "Command executed successfully, but no output was generated";
}
tv.setText(exit);
}
After that we have to wire up the button so it grabs the text inside the EditText and uses it as a command.
// Button
/** Called when the activity is first created. */
@Override
public void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.main);
Button go = (Button) findViewById(R.button.run);
go.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
// Get IDs
TextView tv = (TextView)findViewById(R.text.text1);
EditText text_id = (EditText)findViewById(R.input.input1);
String input = text_id.getText().toString();
execCommandLine(input, tv);
}
});
}
A simple explanation of what happens:
findViewById is used to get a reference to each layout item so we can use it later; with it we obtain the TextView (passed into execCommandLine as the output target) and the Button.
Then we get the text from input1, which is the EditText, and convert it to a String to be used in execCommandLine.
Finally, we call execCommandLine.
The input String is executed, and the output is read from the InputStream, converted to a String, and written to the chosen TextView.
Hope you all enjoyed this do-it-yourself guide.
madteam.co
I suggest Code Tags.
thanks
Original article link: https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202327658275700021&fid=0101187876626530001
If you followed part one here, you know where we left off. In this part of the tutorial, we will be implementing onCreate in detail, including button clicks and listeners, which are a crucial part of playback control.
Let's start by implementing add/remove listener methods to control listener attachment. Then we will implement a small part of the onCreate method.
Code:
public void addListener(HwAudioStatusListener listener) {
if (mHwAudioManager != null) {
try {
mHwAudioManager.addPlayerStatusListener(listener);
} catch (RemoteException e) {
Log.e("TAG", "TAG", e);
}
} else {
mTempListeners.add(listener);
}
}
public void removeListener(HwAudioStatusListener listener) { //will be called in onDestroy() method
if (mHwAudioManager != null) {
try {
mHwAudioManager.removePlayerStatusListener(listener);
} catch (RemoteException e) {
Log.e("TAG", "TAG", e);
}
}
mTempListeners.remove(listener);
}
@Override
protected void onDestroy() {
if(mHwAudioPlayerManager!= null && isReallyPlaying){
isReallyPlaying = false;
mHwAudioPlayerManager.stop();
removeListener(mPlayListener);
}
super.onDestroy();
}
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
binding = ActivityMainBinding.inflate(getLayoutInflater());
View view = binding.getRoot();
setContentView(view);
//I set the MainActivity cover image as a placeholder image to cover for those audio files which do not have a cover image.
binding.albumPictureImageView.setImageDrawable(getDrawable(R.drawable.ic_launcher_foreground));
initializeManagerAndGetPlayList(this); //I call my method to set my playlist
addListener(mPlayListener); //I add my listeners
}
If you followed the first part closely, you already have the mTempListeners variable. So, in onCreate, we first initialize everything and then attach our listener to it.
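For reference, these are the fields that the snippets in this part assume were set up in part one (the names follow the code above; the actual initialization of the managers is covered in part one):
Code:
// Fields assumed from part one of the tutorial.
private HwAudioManager mHwAudioManager;                 // created via the Audio Kit manager factory in part one
private HwAudioPlayerManager mHwAudioPlayerManager;     // playback controller obtained from mHwAudioManager
private final List<HwAudioStatusListener> mTempListeners = new ArrayList<>(); // listeners added before the manager is ready
private boolean isReallyPlaying = false;                // tracks playback state across callbacks
private ActivityMainBinding binding;                    // view binding for the main layout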
Now that our playlist is programmatically ready, if you try to run your code, you will not be able to control anything yet, because we have not touched our views.
Let’s implement our basic buttons first.
Code:
final Drawable drawablePlay = getDrawable(R.drawable.btn_playback_play_normal);
final Drawable drawablePause = getDrawable(R.drawable.btn_playback_pause_normal);
binding.playButtonImageView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if(binding.playButtonImageView.getDrawable().getConstantState().equals(drawablePlay.getConstantState())){
if (mHwAudioPlayerManager != null){
mHwAudioPlayerManager.play();
binding.playButtonImageView.setImageDrawable(getDrawable(R.drawable.btn_playback_pause_normal));
isReallyPlaying = true;
}
}
else if(binding.playButtonImageView.getDrawable().getConstantState().equals(drawablePause.getConstantState())){
if (mHwAudioPlayerManager != null) {
mHwAudioPlayerManager.pause();
binding.playButtonImageView.setImageDrawable(getDrawable(R.drawable.btn_playback_play_normal));
isReallyPlaying = false;
}
}
}
});
binding.nextSongImageView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mHwAudioPlayerManager != null) {
mHwAudioPlayerManager.playNext();
isReallyPlaying = true;
}
}
});
binding.previousSongImageView.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
if (mHwAudioPlayerManager != null) {
mHwAudioPlayerManager.playPre();
isReallyPlaying = true;
}
}
});
Do these in your onCreate(…) method. isReallyPlaying is a global boolean variable that I created to keep track of playback. I update it every time the playback changes. You do not have to have it; I keep it there in case you also want to test your app with online playlists, as in the sample apps provided by Huawei.
For the button visual changes, I devised a method like the one above, and I am not asserting that it is the best; you can use your own method, but the onClick handlers should roughly look like this.
Now we should implement our listener so that we can track changes in playback. It also updates the UI elements, so that the app tracks audio file changes and seekbar updates. Put this listener outside the onCreate(…) method; it will be called whenever it is required. What we should do is implement the necessary methods to respond to these calls.
Code:
HwAudioStatusListener mPlayListener = new HwAudioStatusListener() {
@Override
public void onSongChange(HwAudioPlayItem hwAudioPlayItem) {
setSongDetails(hwAudioPlayItem);
if(mHwAudioPlayerManager.getOffsetTime() != -1 && mHwAudioPlayerManager.getDuration() != -1)
updateSeekBar(mHwAudioPlayerManager.getOffsetTime(), mHwAudioPlayerManager.getDuration());
}
@Override
public void onQueueChanged(List list) {
if (mHwAudioPlayerManager != null && list.size() != 0 && !isReallyPlaying) {
mHwAudioPlayerManager.play();
isReallyPlaying = true;
binding.playButtonImageView.setImageDrawable(getDrawable(R.drawable.btn_playback_pause_normal));
}
}
@Override
public void onBufferProgress(int percent) {
}
@Override
public void onPlayProgress(final long currentPosition, long duration) {
updateSeekBar(currentPosition, duration);
}
@Override
public void onPlayCompleted(boolean isStopped) {
if (mHwAudioPlayerManager != null && isStopped) {
mHwAudioPlayerManager.playNext();
}
isReallyPlaying = !isStopped;
}
@Override
public void onPlayError(int errorCode, boolean isUserForcePlay) {
Toast.makeText(MainActivity.this, "We cannot play this!!", Toast.LENGTH_LONG).show();
}
@Override
public void onPlayStateChange(boolean isPlaying, boolean isBuffering) {
if(isPlaying || isBuffering){
binding.playButtonImageView.setImageDrawable(getDrawable(R.drawable.btn_playback_pause_normal));
isReallyPlaying = true;
}
else{
binding.playButtonImageView.setImageDrawable(getDrawable(R.drawable.btn_playback_play_normal));
isReallyPlaying = false;
}
}
};
Now let’s understand the snippet above. Status listener instance wants us to implement some methods shown. They are not required but they will make our lives easier. I will explain the methods inside of them later.
Understanding HwAudioStatusListener
onSongChange is called upon a song change in the queue, i.e. it is called whenever you skip a song, for example. It is also called the first time you set the playlist, because technically your song changed from nothing to the audio file at index 0.
onQueueChanged is called when the queue is altered. I implemented some code just in case, but since we will not be dealing with adding/removing playlist items or changing queues, it is not very important.
onBufferProgress returns the progress as a percentage when you are using online audio.
onPlayProgress is one of the most important methods here; it is called every time the playback position changes. That is, it is called even as the audio file advances by one second (i.e. plays). So, in my opinion, it is wise to trigger UI updates here rather than dealing with heavy-load fragments.
onPlayCompleted lets you decide the behaviour when the playlist finishes playing. You can restart the playlist, just stop or do something else (like notifying the user that the playlist has ended etc.). It is totally up to you. I restart the playlist in the sample code.
onPlayError is called whenever there is an error occurred in the playback. It could, for example, be that the buffered song is not loaded correctly, format not supported, audio file is corrupted etc. You can handle the result here or you can just notify the user that the file cannot be played; just like I did.
And finally, onPlayStateChange is called whenever the music is paused/re-played. Thus, you can handle the changes here in general. I update isReallyPlaying variable and play/pause button views inside and outside of this method just to be sure. You can update it just here if you want. It should work in theory.
Extra Methods
This listener contains two extra custom methods that I wrote to update the UI and set/render the details of the song on the screen. They are called setSongDetails(…) and updateSeekBar(…). Let’s see how they are implemented.
Code:
public void updateSeekBar(final long currentPosition, long duration){
//seekbar
binding.musicSeekBar.setMax((int) (duration / 1000));
if(mHwAudioPlayerManager != null){
int mCurrentPosition = (int) (currentPosition / 1000);
binding.musicSeekBar.setProgress(mCurrentPosition);
setProgressText(mCurrentPosition);
}
binding.musicSeekBar.setOnSeekBarChangeListener(new SeekBar.OnSeekBarChangeListener() {
@Override
public void onStopTrackingTouch(SeekBar seekBar) {
//Log.i("ONSTOPTRACK", "STOP TRACK TRIGGERED.");
}
@Override
public void onStartTrackingTouch(SeekBar seekBar) {
//Log.i("ONSTARTTRACK", "START TRACK TRIGGERED.");
}
@Override
public void onProgressChanged(SeekBar seekBar, int progress, boolean fromUser) {
if(mHwAudioPlayerManager != null && fromUser){
mHwAudioPlayerManager.seekTo(progress*1000);
}
if(!isReallyPlaying){
setProgressText(progress); //when the song is not playing and user updates the seekbar, seekbar still should be updated
}
}
});
}
public void setProgressText(int progress){
String progressText = String.format(Locale.US, "%02d:%02d",
TimeUnit.MILLISECONDS.toMinutes(progress*1000),
TimeUnit.MILLISECONDS.toSeconds(progress*1000) -
TimeUnit.MINUTES.toSeconds(TimeUnit.MILLISECONDS.toMinutes(progress*1000))
);
binding.progressTextView.setText(progressText);
}
public void setSongDetails(HwAudioPlayItem currentItem){
if(currentItem != null){
getBitmapOfCover(currentItem);
if(!currentItem.getAudioTitle().equals(""))
binding.songNameTextView.setText(currentItem.getAudioTitle());
else{
binding.songNameTextView.setText("Try choosing a song");
}
if(!currentItem.getSinger().equals(""))
binding.artistNameTextView.setText(currentItem.getSinger());
else
binding.artistNameTextView.setText("From the playlist");
binding.albumNameTextView.setText("Album Unknown"); //there is no field assigned to this in HwAudioPlayItem
binding.progressTextView.setText("00:00"); //initial progress of every song
long durationTotal = currentItem.getDuration();
String totalDurationText = String.format(Locale.US, "%02d:%02d",
TimeUnit.MILLISECONDS.toMinutes(durationTotal),
TimeUnit.MILLISECONDS.toSeconds(durationTotal) -
TimeUnit.MINUTES.toSeconds(TimeUnit.MILLISECONDS.toMinutes(durationTotal))
);
binding.totalDurationTextView.setText(totalDurationText);
}
else{
//This is a measure to prevent a bad first opening of the app. After the user
//selects a song from the playlist, this branch should never be executed again.
binding.songNameTextView.setText("Try choosing a song");
binding.artistNameTextView.setText("From the playlist");
binding.albumNameTextView.setText("It is at right-top of the screen");
}
}
Remarks
I format the time every time I receive it because AudioKit currently sends it in milliseconds and the user needs to see it in 00:00 format.
I divide by 1000, and when seeking to a progress value I multiply by 1000, to keep the changes visible on screen. It is also the more convenient and customary way of implementing seekbars.
I take precautions with if checks in case values return null. It should not happen except on the first opening, but still, as developers, we should be careful.
Please follow comments in the code for additional information.
That's it for this part. However, we are not done yet. As I mentioned earlier, we should also implement the playlist to choose songs from and implement advanced playback controls in part 3. See you there!
sujith.e said:
Well explained, do we need login?
Yeah, you need to log in
Introduction
The sound detection service can detect sound events. Automatic environmental sound classification is a growing area of research with real-world applications.
Steps
1. Create App in Android
2. Configure App in AGC
3. Integrate the SDK in our new Android project
4. Integrate the dependencies
5. Sync project
Use case
We can use this service in day-to-day life; it detects different types of sounds such as a baby crying, laughter, snoring, running water, alarm sounds, a doorbell, and so on. Currently the service detects only one sound at a time; detecting multiple sounds simultaneously is not supported. The default detection interval is at least 2 seconds per sound.
ML Kit Configuration.
1. Log in to AppGallery Connect and select MlKitSample in the My Projects list.
2. Enable ML Kit: choose My Projects > Project settings > Manage APIs.
Integration
Create Application in Android Studio.
App level gradle dependencies.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Gradle dependencies
Code:
implementation 'com.huawei.hms:ml-speech-semantics-sounddect-sdk:2.0.3.300'
implementation 'com.huawei.hms:ml-speech-semantics-sounddect-model:2.0.3.300'
Root level gradle dependencies
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Add the below permissions in Android Manifest file
Code:
<manifest xmlns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE"/>
</manifest>
1. Create Instance for Sound Detection in onCreate.
Code:
MLSoundDector soundDector = MLSoundDector.createSoundDector();
2. Check runtime permissions.
Code:
private void getRuntimePermissions() {
List<String> allNeededPermissions = new ArrayList<>();
for (String permission : getRequiredPermissions()) {
if (!isPermissionGranted(this, permission)) {
allNeededPermissions.add(permission);
}
}
if (!allNeededPermissions.isEmpty()) {
ActivityCompat.requestPermissions(
this, allNeededPermissions.toArray(new String[0]), PERMISSION_REQUESTS);
}
}
private boolean allPermissionsGranted() {
for (String permission : getRequiredPermissions()) {
if (!isPermissionGranted(this, permission)) {
return false;
}
}
return true;
}
private static boolean isPermissionGranted(Context context, String permission) {
if (ContextCompat.checkSelfPermission(context, permission)
== PackageManager.PERMISSION_GRANTED) {
Log.i(TAG, "Permission granted: " + permission);
return true;
}
Log.i(TAG, "Permission NOT granted: " + permission);
return false;
}
private String[] getRequiredPermissions() {
try {
PackageInfo info = this.getPackageManager().getPackageInfo(this.getPackageName(), PackageManager.GET_PERMISSIONS);
String[] ps = info.requestedPermissions;
if (ps != null && ps.length > 0) {
return ps;
} else {
return new String[0];
}
} catch (RuntimeException e) {
throw e;
} catch (Exception e) {
return new String[0];
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (requestCode != PERMISSION_REQUESTS) {
return;
}
boolean isNeedShowDiag = false;
for (int i = 0; i < permissions.length; i++) {
if ((permissions[i].equals(Manifest.permission.READ_EXTERNAL_STORAGE)
&& grantResults[i] != PackageManager.PERMISSION_GRANTED)
|| ((permissions[i].equals(Manifest.permission.CAMERA)
|| permissions[i].equals(Manifest.permission.RECORD_AUDIO))
&& grantResults[i] != PackageManager.PERMISSION_GRANTED)) {
isNeedShowDiag = true;
}
}
if (isNeedShowDiag && !ActivityCompat.shouldShowRequestPermissionRationale(this, Manifest.permission.CALL_PHONE)) {
AlertDialog dialog = new AlertDialog.Builder(this)
.setMessage(getString(R.string.camera_permission_rationale))
.setPositiveButton(getString(R.string.settings), new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
Intent intent = new Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS);
intent.setData(Uri.parse("package:" + getPackageName()));
startActivityForResult(intent, 200);
}
})
.setNegativeButton(getString(R.string.cancel), new DialogInterface.OnClickListener() {
@Override
public void onClick(DialogInterface dialog, int which) {
finish();
}
}).create();
dialog.show();
}
}
3. Create the sound detection result callback; this callback receives the detection results.
Code:
MLSoundDectListener listener = new MLSoundDectListener() {
@Override
public void onSoundSuccessResult(Bundle result) {
int soundType = result.getInt(MLSoundDector.RESULTS_RECOGNIZED);
String soundName = hmap.get(soundType);
textView.setText("Successfully sound has been detected : " + soundName);
}
@Override
public void onSoundFailResult(int errCode) {
textView.setText("Failure" + errCode);
}
};
soundDector.setSoundDectListener(listener);
soundDector.start(this);
4. Once a sound has been detected, start the notification service.
Code:
serviceIntent = new Intent(MainActivity.this, NotificationService.class);
serviceIntent.putExtra("response", soundName);
ContextCompat.startForegroundService(MainActivity.this, serviceIntent);
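NotificationService here is the app's own foreground service, not something provided by the ML Kit SDK. A minimal sketch under that assumption is shown below; the channel id, notification texts, and icon resource are illustrative. Remember to declare the service in AndroidManifest.xml.
Code:
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.Service;
import android.content.Intent;
import android.os.Build;
import android.os.IBinder;
import androidx.core.app.NotificationCompat;

// Minimal sketch of a foreground service that shows the detected sound; names are illustrative.
public class NotificationService extends Service {

    private static final String CHANNEL_ID = "sound_detection_channel"; // assumed channel id

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        String soundName = intent.getStringExtra("response");
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
            NotificationChannel channel = new NotificationChannel(
                    CHANNEL_ID, "Sound detection", NotificationManager.IMPORTANCE_DEFAULT);
            getSystemService(NotificationManager.class).createNotificationChannel(channel);
        }
        Notification notification = new NotificationCompat.Builder(this, CHANNEL_ID)
                .setContentTitle("Sound detected")
                .setContentText(soundName)
                .setSmallIcon(R.drawable.ic_launcher_foreground) // assumed icon resource
                .build();
        startForeground(1, notification);
        return START_NOT_STICKY;
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}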
5. To stop sound detection, call stop() (for example, from your activity's onStop()).
Code:
soundDector.stop();
6. The detection result is returned as one of the SDK's sound type constants; the full list is in the ML Kit documentation.
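The hmap used in onSoundSuccessResult above is simply a map from the detector's integer sound-type constants to readable names; it is not provided by the SDK. A rough sketch follows, with only a few entries shown. The MLSoundDectConstants names are taken from the ML Kit documentation, so verify them against the SDK version you integrate.
Code:
// Sketch of the lookup map used in onSoundSuccessResult.
private final Map<Integer, String> hmap = new HashMap<>();

private void initSoundTypeMap() {
    // constant names assumed from the ML Kit sound detection documentation
    hmap.put(MLSoundDectConstants.SOUND_EVENT_TYPE_BABY_CRY, "Baby crying");
    hmap.put(MLSoundDectConstants.SOUND_EVENT_TYPE_LAUGHTER, "Laughter");
    hmap.put(MLSoundDectConstants.SOUND_EVENT_TYPE_SNORING, "Snoring");
    // ... add the remaining SOUND_EVENT_TYPE_* constants you want to report
}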
Result
Conclusion
This article helps you detect sounds from real-time streaming audio; the sound detection service can notify users about sounds in daily life.
Thank you for reading. If you have enjoyed this article, I would suggest you implement it and share your experience.
Reference
ML Kit – Sound Detection
Hi Folks,
I will introduce how to implement Push Kit on both the Android side and the server side. I will also show you how to send a push notification from the backend, catch it on the Android side, and open a specific page in the Android app.
Backend Development
Code:
...
SEND_PUSH_REQUEST_URL = https://push-api.cloud.huawei.com/v1/APP_ID/messages:send
ACCESS_TOKEN_CONTENT = grant_type=client_credentials&client_id=APP_ID&client_secret=APP_SECRET
ACCESS_TOKEN_REQUEST_URL = https://oauth-login.cloud.huawei.com/oauth2/v2/token?grant_type=client_credentials&client_id=APP_ID&client_secret=APP_SECRET
...
These entries are in the application.properties file of the backend project. They are used to get the access token and to send notifications. You must get the APP_ID and APP_SECRET values from AGC.
Preparing JSON Object For Notification
Code:
public class PushNotification {
private final Message message;
}
public class Message {
private Android android;
private List<String> token;
}
public class Android {
private int collapse_key;
private String urgency;
private Notification notification;
}
public class Notification {
private String title;
private String body;
private Click_Action click_action;
}
public class Click_Action {
private int type;
private String intent;
}
The classes above are used to build the required JSON object. The class below sends the actual HTTP requests.
Code:
public class PushNotificationProcess {
private static final String AUTH = "Authorization";
private static final String METHOD = "POST";
private static final String ACCESS_TOKEN_CONTENT_TYPE = "application/x-www-form-urlencoded";
private static final String CONTENT_TYPE_KEY = "Content-Type";
private static final String SEND_PUSH_CONTENT_TYPE = "application/json";
@Autowired
private Environment environment;
private String accessToken = "";
public void getAccessToken() {
try {
OkHttpClient client = new OkHttpClient();
MediaType mediaType = MediaType.parse(ACCESS_TOKEN_CONTENT_TYPE);
RequestBody body = RequestBody.create(mediaType, environment.getProperty("ACCESS_TOKEN_CONTENT"));
Request request = new Request.Builder()
.url(environment.getProperty("ACCESS_TOKEN_REQUEST_URL"))
.method(METHOD, body)
.addHeader(CONTENT_TYPE_KEY, ACCESS_TOKEN_CONTENT_TYPE)
.build();
client.newCall(request).enqueue(new Callback() {
@Override
public void onResponse(@NotNull Call call, @NotNull Response response) throws IOException {
if (response.isSuccessful()) {
ObjectMapper objectMapper = new ObjectMapper();
ResponseBody responseBody = response.body(); // use the response already delivered to the callback instead of re-executing the request
AccessToken entity = objectMapper.readValue(responseBody.string(), AccessToken.class);
accessToken = entity.getToken_type() + " " + entity.getAccess_token();
}
}
@Override
public void onFailure(@NotNull Call call, @NotNull IOException e) {
e.printStackTrace();
}
});
} catch (Exception e) {
e.printStackTrace();
}
}
public void sendPushNotification(String pushBody) {
try {
OkHttpClient client = new OkHttpClient();
MediaType mediaType = MediaType.parse(SEND_PUSH_CONTENT_TYPE);
RequestBody body = RequestBody
.create(mediaType, pushBody);
Request request = new Request.Builder()
.url(environment.getProperty("SEND_PUSH_REQUEST_URL"))
.method(METHOD, body)
.addHeader(AUTH, accessToken)
.addHeader(CONTENT_TYPE_KEY, SEND_PUSH_CONTENT_TYPE)
.build();
client.newCall(request).enqueue(new Callback() {
@Override
public void onResponse(@NotNull Call call, @NotNull Response response) throws IOException {
try {
if (response.isSuccessful()) {
//Success
} else {
//Handler error code
System.out.println(response.code());
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onFailure(@NotNull Call call, @NotNull IOException e) {
e.printStackTrace();
System.out.println("onFailure");
}
});
} catch (Exception e) {
e.printStackTrace();
}
}
}
There are two functions in the PushNotificationProcess class; both make HTTP requests to the Huawei Push Kit server. I used the OkHttpClient library, but you can also use Retrofit or something similar. I won't explain the static strings because I think they are clear.
When onResponse is called in getAccessToken, I first check response.isSuccessful() and then read the response body. The access token must be used in the form "Bearer xxxx", so I combine token_type and access_token, and the value is then ready to use.
sendPushNotification looks like getAccessToken; it is another HTTP request. The method takes a String parameter, pushBody, which I will explain in the next part. The URL and content type come from application.properties via environment.getProperty(…).
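The AccessToken class that objectMapper.readValue(…) maps the token response into is just a POJO mirroring the standard OAuth token response. A minimal sketch under that assumption:
Code:
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;

// POJO for the token endpoint response; field names follow the JSON keys of the OAuth response.
@JsonIgnoreProperties(ignoreUnknown = true)
public class AccessToken {
    private String access_token;
    private String token_type;
    private long expires_in;

    public String getAccess_token() { return access_token; }
    public void setAccess_token(String access_token) { this.access_token = access_token; }
    public String getToken_type() { return token_type; }
    public void setToken_type(String token_type) { this.token_type = token_type; }
    public long getExpires_in() { return expires_in; }
    public void setExpires_in(long expires_in) { this.expires_in = expires_in; }
}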
Code:
public class PushNotification {
...
private void prepareNotification() {
this.message.setToken(new ArrayList<>());
this.message.getAndroid().setCollapse_key(0);
this.message.getAndroid().setUrgency("NORMAL");
}
public void prepareIntent(int topicId) {
this.message.getAndroid().getNotification().getClick_action().setType(1);
this.message.getAndroid().getNotification().getClick_action()
.setIntent(Constant.DataMessage.SCHEME_URL + Constant.DataMessage.SCHEME_URL_FORUM + topicId + Constant.DataMessage.SCHEME_URL_END);
}
public void prepareContent(String title, String body) {
this.message.getAndroid().getNotification().setTitle(title);
this.message.getAndroid().getNotification().setBody(body);
}
public void prepareTokens(List<String> tokenList) {
this.message.getToken().addAll(tokenList);
}
public String getBodyAsString() {
return new Gson().toJson(this);
}
}
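Putting it together, a send could look roughly like the sketch below. It assumes the PushNotification constructor wires up the Message/Android/Notification/Click_Action graph and calls prepareNotification() (the fragments above omit the constructors), that pushNotificationProcess is the Spring-managed PushNotificationProcess bean, and that the title, body, topic id, and tokens are placeholders.
Code:
// Hypothetical usage; all literal values are placeholders.
PushNotification notification = new PushNotification();                   // assumed constructor builds the message graph
notification.prepareContent("New reply", "Someone answered your topic");
notification.prepareIntent(42);                                           // placeholder topic id for the deep link
notification.prepareTokens(Arrays.asList("device-push-token-1"));         // placeholder device push token(s)

pushNotificationProcess.getAccessToken();                                 // obtain and cache the "Bearer ..." token
pushNotificationProcess.sendPushNotification(notification.getBodyAsString()); // serialize with Gson and POST to Push Kit
Because getAccessToken() uses enqueue(…), the token arrives asynchronously; in a real flow you would call sendPushNotification(…) only after the token callback has completed.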
For more details, you can check https://forums.developer.huawei.com/forumPortal/en/topic/0203442155932950046
Introduction
In this article, we will learn how to integrate Huawei Account Kit into an Android application. Account Kit provides simple, secure, and quick sign-in and authorization functions. Instead of entering accounts and passwords and waiting for authentication, users can just tap a button to quickly and securely sign in to your app with their HUAWEI IDs. It gives users a seamless login experience and gives your app access to a large user base.
Supported Devices
Development Overview
You need to install Android Studio IDE and I assume that you have prior knowledge of Android application development.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
Android phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.8 or later.
Android Studio software installed.
HMS Core (APK) 4.X or later
Integration steps
Step 1. Create a Huawei developer account and complete identity verification on the Huawei developer website; refer to Register a HUAWEI ID.
Step 2. Create a project in AppGallery Connect.
Step 3. Add the HMS Core SDK; a minimal Gradle sketch follows below.
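For step 3, the Gradle setup typically looks like the sketch below; the plugin and SDK version numbers here are assumptions, so use the latest ones listed in the Account Kit documentation.
Code:
// project-level build.gradle
buildscript {
    repositories {
        google()
        mavenCentral()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.huawei.agconnect:agcp:1.6.0.300' // version is an assumption
    }
}

// app-level build.gradle
apply plugin: 'com.huawei.agconnect'

dependencies {
    implementation 'com.huawei.hms:hwid:6.4.0.300' // Account Kit SDK; version is an assumption
}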
Let's start coding
How do I call sign in method?
Code:
private void signInWithHuaweiID() {
AccountAuthParams authParams = new AccountAuthParamsHelper(AccountAuthParams.DEFAULT_AUTH_REQUEST_PARAM).setAuthorizationCode().createParams();
service = AccountAuthManager.getService(ClientActivity.this, authParams);
startActivityForResult(service.getSignInIntent(), 1212);
}
How do I get sign in result?
Code:
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
// Process the authorization result to obtain the authorization code from AuthAccount.
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == 1212) {
Task<AuthAccount> authAccountTask = AccountAuthManager.parseAuthResultFromIntent(data);
if (authAccountTask.isSuccessful()) {
// The sign-in is successful, and the user's ID information and authorization code are obtained.
AuthAccount authAccount = authAccountTask.getResult();
Log.i("TAG", "serverAuthCode:" + authAccount.getAuthorizationCode());
userName = authAccount.getDisplayName();
makeConnect();
} else {
// The sign-in failed.
Log.e("TAG", "sign in failed:" + ((ApiException) authAccountTask.getException()).getStatusCode());
}
}
}
How do I start server?
Code:
wManager = (WifiManager) getSystemService(WIFI_SERVICE);
serverIP = Formatter.formatIpAddress(wManager.getConnectionInfo().getIpAddress());
ip_txt.setText(serverIP);
class ServerThread implements Runnable {
@Override
public void run() {
try {
serverSocket = new ServerSocket(POST_NUMBER); // create the server socket once, outside the loop
while (true) {
socket = serverSocket.accept();
output = new PrintWriter(socket.getOutputStream());
input = new BufferedReader(new InputStreamReader(socket.getInputStream()));
Log.d("TAG", " here ");
runOnUiThread(new Runnable() {
@Override
public void run() {
tv_status.setText("Waiting for conn at " + POST_NUMBER);
}
});
handler.post(new Runnable() {
@Override
public void run() {
tv_status.setText("Connected");
}
});
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
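The server code above relies on several fields that are declared elsewhere in the activity; roughly, they would look like this (the port number is a placeholder):
Code:
// Fields assumed by the ServerThread, SendMessage and ReadMessage snippets; names follow the code above.
private static final int POST_NUMBER = 8888;        // placeholder port
private WifiManager wManager;
private String serverIP;
private ServerSocket serverSocket;
private Socket socket;
private PrintWriter output;
private BufferedReader input;
private final Handler handler = new Handler(Looper.getMainLooper());
private AccountAuthService service;                 // from the sign-in step above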
How do I send a message using the socket?
Code:
class SendMessage implements Runnable {
private String message;
SendMessage(String message) {
this.message = message;
}
@Override
public void run() {
output.write(message+"\r");
output.flush();
runOnUiThread(new Runnable() {
@Override
public void run() {
tv_chat.append("\n New Message: " + message);
ed_message.setText("");
}
});
Thread.interrupted();
}
}
How do I receive message using socket?
Code:
private class ReadMessage implements Runnable {
@Override
public void run() {
while (true) {
try {
// Log.d("TAG","Server: Listening for message");
if(input!=null){
final String message = input.readLine();
if (message != null) {
handler.post(new Runnable() {
@Override
public void run() {
tv_chat.append("\n" + message );
}
});
}
}
} catch (IOException e) {
// Log.e("TAG","Error while receiving message");
e.printStackTrace();
}
}
}
}
Close the Socket and other connections
Code:
@Override
protected void onPause() {
super.onPause();
if (socket != null) {
try {
output.close();
input.close();
socket.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
How do I revoke auth permission?
Code:
if(service!=null){
// service indicates the AccountAuthService instance generated using the getService method during the sign-in authorization.
service.cancelAuthorization().addOnCompleteListener(new OnCompleteListener<Void>() {
@Override
public void onComplete(Task<Void> task) {
if (task.isSuccessful()) {
// Processing after a successful authorization cancellation.
Log.i("TAG", "onSuccess: ");
} else {
// Handle the exception.
Exception exception = task.getException();
if (exception instanceof ApiException){
int statusCode = ((ApiException) exception).getStatusCode();
Log.i("TAG", "onFailure: " + statusCode);
}
}
}
});
}
Result
Tricks and Tips
Make sure the agconnect-services.json file is added.
Make sure the required dependencies are added.
Make sure the service is enabled in AGC.
Add the required permissions.
Conclusion
In this article, we have learnt how to integrate Huawei Account Kit into client-server messaging using sockets in an Android application. You can check the desired result in the Result section. I hope the Account Kit capabilities are helpful to you; as in this sample, you can make use of Huawei kits as per your requirements.
Thank you so much for reading. I hope this article helps you to understand the integration of Huawei Account Kit in an Android application.
Reference
Huawei Account Kit – Training video
Checkout in forum