About a year ago I used this code:
Code:
private void merge2WavFiles(String wavFile1, String wavFile2, String newWavFilePath) {
    try {
        File wave1 = new File(wavFile1);
        if (!wave1.exists())
            throw new Exception(wave1.getPath() + " - File Not Found");
        AudioInputStream clip1 = AudioSystem.getAudioInputStream(wave1);
        AudioInputStream clip2 = AudioSystem.getAudioInputStream(new File(wavFile2));
        AudioInputStream emptyClip =
                AudioSystem.getAudioInputStream(new File(emptyWavPath));
        // First append a short silent clip to the end of the first file.
        AudioInputStream appendedFiles =
                new AudioInputStream(
                        new SequenceInputStream(clip1, emptyClip),
                        clip1.getFormat(),
                        clip1.getFrameLength() + 100
                );
        clip1 = appendedFiles;
        // Then append the second file and write the result out as a WAV.
        appendedFiles =
                new AudioInputStream(
                        new SequenceInputStream(clip1, clip2),
                        clip1.getFormat(),
                        clip1.getFrameLength() + clip2.getFrameLength()
                );
        AudioSystem.write(appendedFiles, AudioFileFormat.Type.WAVE, new File(newWavFilePath));
    } catch (Exception e) {
        e.printStackTrace();
    }
}
But today when I try to use it, I get a page saying "Source not found".
After a quick Google search I found out that Android doesn't support the AudioInputStream class anymore. What are the alternatives to this class?
Moreover, I think it doesn't support the whole javax.sound.sampled package.
What are the alternatives?
Anyone?
I think it could be solved by the AudioRecord class?
Merging two or more .wav files in android might be helpful as well...
I tried this and it didn't work
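Since javax.sound.sampled isn't available on Android, one commonly suggested alternative is to concatenate the raw PCM data yourself and rewrite the 44-byte WAV header. Below is a minimal sketch (the method names are placeholders, and it assumes both files use the same PCM format with a standard 44-byte canonical header; files with extra chunks would need proper header parsing):
Code:
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Arrays;

// Hypothetical helper: both inputs must share the same PCM format and use a plain 44-byte header.
private void mergeWavOnAndroid(File in1, File in2, File out) throws IOException {
    byte[] a = readAllBytes(in1);
    byte[] b = readAllBytes(in2);
    int dataLen = (a.length - 44) + (b.length - 44);
    try (FileOutputStream fos = new FileOutputStream(out)) {
        // Reuse the first file's header, patching the RIFF and data chunk sizes.
        byte[] header = Arrays.copyOfRange(a, 0, 44);
        writeIntLE(header, 4, 36 + dataLen);  // RIFF chunk size
        writeIntLE(header, 40, dataLen);      // data chunk size
        fos.write(header);
        fos.write(a, 44, a.length - 44);
        fos.write(b, 44, b.length - 44);
    }
}

private static void writeIntLE(byte[] buf, int pos, int value) {
    buf[pos]     = (byte) (value & 0xff);
    buf[pos + 1] = (byte) ((value >> 8) & 0xff);
    buf[pos + 2] = (byte) ((value >> 16) & 0xff);
    buf[pos + 3] = (byte) ((value >> 24) & 0xff);
}

private static byte[] readAllBytes(File f) throws IOException {
    byte[] buf = new byte[(int) f.length()];
    try (FileInputStream fis = new FileInputStream(f)) {
        int off = 0, n;
        while (off < buf.length && (n = fis.read(buf, off, buf.length - off)) > 0) {
            off += n;
        }
    }
    return buf;
}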
Questions should be posted in Q&A forums, not Development forums.
http://forum.xda-developers.com/announcement.php?a=81
See rule #15
Thread moved.
Have you tried Faasoft Audio Joiner?
I used it to merge WAV, MP3, AAC, AC3, WMA, M4B, VOC, CAF, APE, AIFF, Apple Lossless ALAC, QCP, AMR, AWB, DTS, AU, RA, OGG, XWM, 3GA and more.
Happy with it.
Many of my friends use iDealshare VideoGo to merge audio and video files into one.
It can merge audio files into one, merge audio and video files into one, or merge video files into one.
It supports merging audio and video files in MP3, WAV, AAC, AC3, FLAC, ALAC, WMA, OGG, M4A, MP4, AVI, WMV, MOV, DV, AIFF, OGV, FLV, WebM, MKV, MPG, and other formats.
Hi,
I have to play a live video stream (103.78.14.20:8080/0.m3u8).
If I play the video directly in MX Player it plays fine (via Network Stream).
But when I try to play the video via an Intent using the code below:
Code:
TestActivity.MXPackageInfo packageInfo = getMXPackageInfo();
if (packageInfo == null)
    return;
Intent intent = new Intent(Intent.ACTION_VIEW);
intent.setPackage(packageInfo.packageName);
intent.setClassName(packageInfo.packageName, packageInfo.activityName);
intent.setData(Uri.parse("103.78.14.20:8080/0.m3u8"));
startActivity(intent);
The video does not play; I only get a "can't play video" message.
Please help me, I can't figure out what I am doing wrong.
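One thing worth checking (an assumption on my part, not a confirmed fix): the URI passed to setData has no scheme, so the player receives a relative URI it may not be able to resolve. A minimal sketch with an explicit scheme and an HLS MIME type:
Code:
// Sketch only: assumes the stream is reachable over plain HTTP.
Intent intent = new Intent(Intent.ACTION_VIEW);
// "application/x-mpegURL" is the usual MIME type for HLS playlists.
intent.setDataAndType(Uri.parse("http://103.78.14.20:8080/0.m3u8"), "application/x-mpegURL");
startActivity(intent);
If this still fails, the MX Player package name and activity name resolution would be the next thing to verify.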
I am writing this article in 3 parts: basic, medium, and advanced. In this article, I will cover the basic integration of Video Kit. Follow the 5 steps below to play video on HMS devices.
Introduction
HUAWEI Video Kit provides smooth HD video playback, bolstered by wide-ranging control options, which raises the ceiling for your app and makes it more appealing.
Part I: Basic Level – Just follow 5 steps to enjoy playing video on your HMS device, later check how to show videos in RecyclerView.
Part II: Medium Level – More details about playback process and enhanced playback experience.
Part III: Advanced Level – Create demo app which consists of all features.
Let us start the integration with easy steps:
1. About Video Kit
2. Create project and app in AGC Console.
3. Create Android project and set it up.
4. Add UI element & Initialize the video player.
5. Setup player properties and play video.
1. About Video Kit
Currently, Video Kit provides video playback features, along with cache management, using the WisePlayer SDK. It supports videos in 3GP, MP4, or TS format that comply with HTTP/HTTPS, HLS, or DASH, and it does not support local videos.
In later versions, video editing and video hosting features will be available.
We can play videos using SurfaceView or TextureView. I'll show how to use these widgets to play video.
2. Create project and app in AGC console
Follow the instructions to create an app Creating an AppGallery Connect Project and Adding an App to the Project.
3. Create Android project and set it up
Create an Android project and follow the instructions to add the code to the project build.gradle file, the application build.gradle file, and the application class.
In the project build.gradle file, place the code below:
Code:
buildscript {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
    }
}
allprojects {
    repositories {
        maven { url 'http://developer.huawei.com/repo/' }
    }
}
In the application build.gradle file, add the dependencies and plugin shown below:
Code:
dependencies {
    implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
    implementation 'com.huawei.hms:videokit-player:1.0.1.300'
}
apply plugin: 'com.huawei.agconnect'
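The application class mentioned in the setup step is not shown in this post. A minimal sketch of what it might look like (the class name, the static factory holder, and the import paths are my assumptions, modeled on the WisePlayerFactory initialization snippets shown later in this thread, not taken from official documentation):
Code:
import android.app.Application;

import com.huawei.hms.videokit.player.InitFactoryCallback;
import com.huawei.hms.videokit.player.WisePlayerFactory;
import com.huawei.hms.videokit.player.WisePlayerFactoryOptions;

// Hypothetical application class; names and structure are assumptions.
public class VideoKitPlayApplication extends Application {

    private static WisePlayerFactory wisePlayerFactory;

    @Override
    public void onCreate() {
        super.onCreate();
        // "xxx" is a placeholder device ID, as in the snippets later in this thread.
        WisePlayerFactoryOptions options =
                new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
        // Depending on the SDK version, an error callback may also need to be overridden.
        WisePlayerFactory.initFactory(this, options, new InitFactoryCallback() {
            @Override
            public void onSuccess(WisePlayerFactory factory) {
                wisePlayerFactory = factory;
            }
        });
    }

    public static WisePlayerFactory getWisePlayerFactory() {
        return wisePlayerFactory;
    }
}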
For more information, you can check https://forums.developer.huawei.com/forumPortal/en/topic/0204445987315520067
1 Overview
Video and live streaming apps have become widespread in recent years, with skyrocketing numbers of users spread across a myriad of different platforms. HUAWEI Video Kit offers a wide range of video playback services, designed to assist you with building video features and delivering a superb viewing experience for users of your app.
This article introduces three common and easy-to-implement methods for building a video player (native Android components, third-party SDKs, and Video Kit) and explains the key differences between these methods.
2 Primary Methods for Developing a Video Player
You can apply the following solutions to achieve video playback.
1. Native Android components: VideoView is a simple, encapsulated Android component for video playback that provides basic functions such as a timeline and progress bar. Alternatively, SurfaceView+MediaPlayer can be used, where SurfaceView handles the video display and MediaPlayer controls playback of the media files.
2. Open-source SDKs provided by individual or team developers: These SDKs offer open APIs to meet the varying requirements of developers, and can be integrated into your app's video playback framework with just a few lines of code. For example, ijkplayer, a lightweight, FFplay-based Android/iOS video player released by Bilibili, is capable of implementing cross-platform functions with easy-to-integrate APIs, while also minimizing the size of the installation package. Hardware-based decoding is supported for the purposes of accelerating decoding and preserving power, and an integration solution for the bullet comment function on Android platforms is also provided.
3. Professional video service providers: These providers tend to build their own streaming media libraries to perform video website and live streaming-related tasks. Video Kit falls under this category.
The components in the first solution are built into Android, and can only be used to play video files in simple formats, such as MP4 and 3GP. They do not support such formats as FLV and RMVB. As open-source SDKs come to support a greater range of video formats and playback capabilities, more developers will opt to use these SDKs.
3 Overall Process
The process from loading a video stream to playback involves the following stages: protocol decapsulation > decapsulation (demuxing) > decoding.
The protocols are streaming protocols, such as HTTP, RTSP, and RTMP, with the HTTP protocol most commonly used, and the RTSP and RTMP protocols generally restricted to live streaming or streaming with control signaling, for example, remote video surveillance.
Video encapsulation protocols are streaming media container formats with .mp4, .avi, .rmvb, .mkv, .ts, .3gp, and .flv file extensions. Audio and video streams are packaged together during transmission using these formats, and extracted before playback.
Audio encoding
Audio data is usually encoded in formats such as MP3, PCM, WAV, AAC, and AC-3. Since raw audio data tends to be too large to stream directly, it is compressed. The raw audio size is calculated as follows:
sampling rate x number of channels x bit depth x duration. Let's assume that the audio sampling rate is 48 kHz, the channel is mono, the sample format is 16-bit, and the duration is 24 s. This would make the raw audio size:
48000 x 1 x 16 x 24 / 8 = 2.3 MB
However, thanks to audio encoding, the encoded audio data is reduced to 353 KB.
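As a quick check of the arithmetic above (the values are the ones from the example, not new measurements):
Code:
// Raw PCM size = sampleRate * channels * bytesPerSample * durationSeconds
long sampleRate = 48_000;   // 48 kHz
long channels = 1;          // mono
long bitsPerSample = 16;    // 16-bit samples
long durationSeconds = 24;
long rawBytes = sampleRate * channels * (bitsPerSample / 8) * durationSeconds;
// rawBytes = 2,304,000 bytes, i.e. roughly 2.3 MB before encoding
System.out.println(rawBytes + " bytes");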
Video encoding
Video encoding refers to the encoding and compression methods of video images, including H.263, H.264, HEVC (H.265), MPEG-2, and MPEG-4, among which H.264 is most commonly used. The principles behind video encoding are very complex, so we will not discuss them here. Similar to audio encoding, the purpose of video encoding is to compress the video information.
Hardware decoding and software decoding
In certain media players, hardware decoding and software decoding modes are both available. So what are the differences between them?
Our phones come equipped with a vast array of hardware, including the CPU, GPU, and decoder, and computing is generally performed on the CPU, which functions as the executing chip for the phone's software. The GPU is responsible for image display (hardware acceleration).
Software decoding utilizes the CPU's computing capabilities for decoding purposes. The decoding speed will vary depending on the CPU's capabilities. The decoding process may slow down, or the phone may overheat, if its CPU is relatively weak. However, a comprehensive algorithm can ensure that compatibility remains good.
Hardware decoding uses a dedicated decoding chip on the mobile phone to accelerate the decoding process. In general, hardware decoding is much faster, but compatibility issues may occur because the decoding chips provided by some vendors are of poor quality.
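If you want to see which decoders a given device exposes and whether they are hardware-backed, a small sketch using Android's MediaCodecList could look like this (note that the isHardwareAccelerated check only exists on Android 10 / API 29 and later; this is an illustrative utility, not part of any player SDK):
Code:
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.os.Build;
import android.util.Log;

public class CodecLogger {
    // Logs every decoder and, on API 29+, whether it is hardware-accelerated.
    public static void logDecoders() {
        MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : codecList.getCodecInfos()) {
            if (info.isEncoder()) {
                continue; // only interested in decoders here
            }
            boolean hw = Build.VERSION.SDK_INT >= 29 && info.isHardwareAccelerated();
            Log.d("CodecLogger", info.getName() + " hardwareAccelerated=" + hw);
        }
    }
}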
4 Integration Guide
4.1 Native Android
Let's use VideoView, which combines SurfaceView and MediaPlayer, as an example.
Step 1: Add VideoView to the layout.
XML:
<LinearLayout
    android:layout_width="match_parent"
    android:layout_height="200dp">
    <VideoView
        android:id="@+id/videoView"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content">
    </VideoView>
</LinearLayout>
Step 2: Set the playback source and controller.
Java:
// Network video.
String netVideoUrl = "http://baobab.kaiyanapp.com/api/v1/playUrl?vid=221119&resourceType=video&editionType=default&source=aliyun&playUrlType=url_oss&udid=1111";
// Specify the URL of the video file.
videoView.setVideoURI(Uri.parse(netVideoUrl));
// Set the video controller.
videoView.setMediaController(new MediaController(this));
// Set the playback completion callback.
videoView.setOnCompletionListener(new MyPlayerOnCompletionListener());
Step 3: Add playback buttons for play, pause, and replay.
Java:
switch (v.getId()) {
    case R.id.play:
        if (!videoView.isPlaying()) { // Play.
            Log.d(TAG, "onClick: play video");
            videoView.start();
        }
        break;
    case R.id.pause:
        Log.d(TAG, "onClick: pause video");
        if (videoView.isPlaying()) { // Pause.
            videoView.pause();
        }
        break;
    case R.id.replay:
        Log.d(TAG, "onClick: replay video");
        if (videoView.isPlaying()) {
            videoView.resume(); // Replay.
        }
        break;
}
Step 4: Add the onDestroy method to release resources.
Java:
@Override
public void onDestroy() { // Release resources.
    super.onDestroy();
    if (videoView != null) {
        videoView.suspend();
    }
}
Video playback effects:
4.2 Third-Party Open-Source SDKs
Now, let's take a look at JZVideo (https://github.com/Jzvd/JZVideo).
You can use the ListView method to display a vertically-scrollable list of video sources here.
Step 1: Add the following information to the dependencies package in the build.gradle file for the app.
Code:
implementation 'cn.jzvd:jiaozivideoplayer:7.5.0'
Step 2: Add cn.jzvd.JzvdStd to the layout.
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical"
    android:layout_width="match_parent"
    android:layout_height="wrap_content">
    <cn.jzvd.JzvdStd
        android:id="@+id/item_jz_video"
        android:layout_width="match_parent"
        android:layout_height="200dp"/>
</LinearLayout>
Step 3: Set the video playback URL and playback source.
Java:
// ViewHolder for each ListView item.
class ViewHolder {
    JzvdStd jzvdStd;
    public ViewHolder(View view) {
        jzvdStd = view.findViewById(R.id.item_jz_video);
    }
}

// In the adapter's getView():
// Set the video playback source. The first and second parameters indicate the video URL and video title, respectively.
viewHolder.jzvdStd.setUp(
        videoUrls[position],
        videoTitles[position], Jzvd.SCREEN_NORMAL);
// Set the video thumbnail.
Glide.with(convertView.getContext())
        .load(videoposters[position])
        .into(viewHolder.jzvdStd.posterImageView);
// Record the position of this item in the list.
viewHolder.jzvdStd.positionInList = position;

// Add the onStop method to release video resources.
@Override
protected void onStop() {
    super.onStop();
    JzvdStd.releaseAllVideos();
}
Video playback effects:
JzvdStd features: Previous video stops when the next video in ListView is played; full screen display; automatic video buffering; audio-only playback in the background; thumbnail and video title customization, and automatic brightness and volume adjustments.
4.3 Video Kit
When the Video Kit SDK is integrated, you can select either SurfaceView or TextureView in the layout. The SDK dependency is as follows:
Code:
implementation "com.huawei.hms:videokit-player:1.0.1.300"
Step 1: Initialize the player.
Start the independently running Video Kit process. After the process is initiated, the onSuccess method is called to generate an SDK instance.
The initialization only needs to be performed once, when the app is started for the first time.
Code:
/**
 * Init the player.
 */
private void initPlayer() {
    // The device ID "test" is used in the demo; for specific access, pass in the device ID after encryption.
    WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
    WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}

/**
 * Player initialization callback.
 */
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
    @Override
    public void onSuccess(WisePlayerFactory wisePlayerFactory) {
        LogUtil.i(TAG, "init player factory success");
        setWisePlayerFactory(wisePlayerFactory);
    }
};
Steps 2 through 8 need to be repeated each time that a video source is switched.
Step 2: Create a playback instance.
Code:
private void initPlayer() {
    if (VideoKitPlayApplication.getWisePlayerFactory() == null) {
        return;
    }
    wisePlayer = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
}
Step 3: Set a listener.
After creating an instance, add a listener to the SDK.
Code:
private void setPlayListener() {
    if (wisePlayer != null) {
        wisePlayer.setErrorListener(onWisePlayerListener);
        wisePlayer.setEventListener(onWisePlayerListener);
        wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
        wisePlayer.setReadyListener(onWisePlayerListener);
        wisePlayer.setLoadingListener(onWisePlayerListener);
        wisePlayer.setPlayEndListener(onWisePlayerListener);
        wisePlayer.setSeekEndListener(onWisePlayerListener);
    }
}
Step 4: Set the playback source.
Code:
// Set a URL for the video.
wisePlayer.setPlayUrl("http://baobab.kaiyanapp.com/api/v1/playUrl?vid=221119&resourceType=video&editionType=default&source=aliyun&playUrlType=url_oss&udid=1111");
Step 5: Set the video playback window.
Code:
public void setSurfaceView(SurfaceView surfaceView) {
    if (wisePlayer != null) {
        wisePlayer.setView(surfaceView);
    }
}
Step 6: Request data buffering.
Code:
wisePlayer.ready(); // Start requesting data.
Step 7: Start playback.
Code:
public void start() {
    wisePlayer.start();
}
Step 8: Stop playback to release the instance.
Code:
wisePlayer.stop();
Video playback effects:
5 Other
5.1 Comparison between native Android, third-party SDKs, and the Video Kit SDK
As you can see, Video Kit provides more powerful capabilities than Android VideoView and SurfaceView+MediaPlayer, and the kit is expected to come with improved video playback and broadened support for video encoding formats as well, as compared with open-source SDKs. In addition to enhanced playback capabilities, Video Kit will soon provide E2E solutions for long, short, and live videos, including video editing, live recording, video distribution, video hosting, video review, and video encryption, all of which are crucial to quickly integrating and launching a high-level media app. Therefore, if you are looking to develop a media app of your own, you'd be remiss not to check out Video Kit!
Glossary: