I am making an app that currently only makes squares move on a "road" and make random decisions about where to go. Now I want to replace them with an image of a car.
I'm currently drawing everything in the "onDraw(Canvas canvas)" method, and I'm a bit confused.
So how do I import images into the app?
And what do I do if I want to animate it with 2-3 images?
Put more simply: I want to replace the drawRect calls with drawImage or drawBitmap and make them work. I want to load the image from the drawable folder.
Code:
@Override
protected void onDraw(Canvas canvas) {
    Paint paint = new Paint();
    paint.setColor(android.graphics.Color.RED);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square1.x, square1.y, square1.x + 20, square1.y + 20, paint);
    paint.setColor(android.graphics.Color.BLUE);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square2.x, square2.y, square2.x + 20, square2.y + 20, paint);
    paint.setColor(android.graphics.Color.GREEN);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square3.x, square3.y, square3.x + 20, square3.y + 20, paint);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square4.x, square4.y, square4.x + 20, square4.y + 20, paint);
    paint.setColor(android.graphics.Color.YELLOW);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square5.x, square5.y, square5.x + 20, square5.y + 20, paint);
}
Generally the easiest way to animate a limited number of frames on any system is to use clip rectangles and a single image that contains all the frames... depending on the system this doesn't even create a whole lot of overhead if you use an uncompressed format like TGA or BMP, laid out vertically. Is there anything specific you need to know about that, or is your problem the timing? You could just use System.nanoTime() and decide which frame to draw based on y = (time % fullAnimationTime) / frameTime * tileHeight.
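Spelled out as a minimal sketch (stripBitmap, tileWidth, tileHeight, carX and carY are placeholders here, not names from your project):
Code:
// Pick the current frame from the elapsed time (all values in milliseconds).
long now = System.nanoTime() / 1000000L;                  // or SystemClock.elapsedRealtime()
int frame = (int) ((now % (frameTime * frames)) / frameTime);
int y = frame * tileHeight;                               // top edge of that frame in the strip
// Copy that slice of the strip to wherever the car should be drawn:
Rect src = new Rect(0, y, tileWidth, y + tileHeight);
Rect dst = new Rect(carX, carY, carX + tileWidth, carY + tileHeight);
canvas.drawBitmap(stripBitmap, src, dst, paint);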
That will be a problem for me later.
But right now I don't even know how to import and use images properly.
I'm very new at programming for Android; I've only programmed a little Java before.
Just read up a little on this... seems AnimationDrawable actually does exactly what you need, just look it up in the reference.
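For the record, a minimal sketch of the AnimationDrawable route, assuming three car frames saved as res/drawable/car1.png, car2.png and car3.png and an ImageView with id "car" to show them in (all of those names are just placeholders):
Code:
// Build the frame animation in code (it can also be declared as an
// <animation-list> XML in res/drawable).
final AnimationDrawable anim = new AnimationDrawable();
anim.addFrame(getResources().getDrawable(R.drawable.car1), 100); // 100 ms per frame
anim.addFrame(getResources().getDrawable(R.drawable.car2), 100);
anim.addFrame(getResources().getDrawable(R.drawable.car3), 100);
anim.setOneShot(false); // loop forever

ImageView carView = (ImageView) findViewById(R.id.car);
carView.setBackgroundDrawable(anim);
// start() does nothing if called before the view is attached to the window,
// so posting it (or calling it from onWindowFocusChanged) is the usual trick.
carView.post(new Runnable() {
    @Override
    public void run() {
        anim.start();
    }
});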
If you still want to do the side-by-side thing I suggested first, here's some code to get you started, which uses assets to access a BMP. You still need a timer to actually cause the drawing, but I guess you've already set that up:
Code:
Bitmap animBmp = null;
try {
    InputStream animStream = this.getAssets().open("anim.bmp");
    BufferedInputStream animBufStream = new BufferedInputStream(animStream);
    animBmp = BitmapFactory.decodeStream(animBufStream);
    animBufStream.close();
    animStream.close();
} catch (IOException e) {
    e.printStackTrace();
}
class AnimView extends View {
    public int frameTime = 100;
    public int frames = -1;
    public int frameWidth = 0;
    public int frameHeight = 0;
    public Bitmap animBmp = null;
    public Paint paint = null;
    private int lastFrame = -1;

    public AnimView(Context ctx, Bitmap bmp, int frms) {
        super(ctx);
        animBmp = bmp;
        frames = frms;
        frameWidth = bmp.getWidth();
        frameHeight = bmp.getHeight() / frames;
        paint = new Paint();
    }

    protected void onDraw(Canvas c) {
        Log.i("draw", "draw");
        int frame = (int) ((SystemClock.elapsedRealtime() % (frameTime * frames)) / frameTime);
        if (frame != lastFrame) {
            c.drawBitmap(animBmp,
                    new Rect(0, frame * frameHeight, frameWidth, (frame + 1) * frameHeight),
                    new Rect(0, 0, frameWidth, frameHeight),
                    paint);
            lastFrame = frame;
        }
    }
};
AnimView v=new AnimView(this,animBmp,3);
It says that
"animStream=this.getAssets().open("anim.bmp");"
The method getAssets() is undefined for the type MyTimedView
I am a real noob at this. Where am I supposed to put my test.bmp? (The image is named test.) Which variables should I change so that it always uses the "test" image?
Also, I have an image of a house that I want to place on the screen, and I don't think what you gave me helps with that?
Thanks for all your help!!
getAssets operates on the Activity, not the View.
Change this line to your bitmap's path:
animStream=this.getAssets().open("anim.bmp");
in your case
animStream=this.getAssets().open("test.bmp");
You'll also have to change this line to match the actual number of frames:
AnimView v=new AnimView(this,animBmp,3);
And lastly, the file (test.bmp) should be inside the assets directory of your Eclipse project.
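And if you'd rather load it from inside the View instead of the Activity, one way (just a sketch, with the same file name) is to go through the Context the View was constructed with:
Code:
// Inside a View subclass: the View itself has no getAssets(), but its Context does.
Bitmap bmp = null;
try {
    InputStream in = getContext().getAssets().open("test.bmp");
    bmp = BitmapFactory.decodeStream(new BufferedInputStream(in));
    in.close();
} catch (IOException e) {
    e.printStackTrace();
}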
OK, I have done everything you said. I made 3 images and stored them in assets, named test1, test2, test3. But now I only get a white screen. The stuff I had from the beginning is now gone, and I don't get an image or anything.
Even more complex: how do I set the X and Y values for the animation?
It's supposed to be one image with the frames on top of each other. Give me a second... I'll pack it into a project and attach it.
Here you go. I added a few comments:
Code:
package net.tapperware.simpleanimation;

import java.io.BufferedInputStream;
import java.io.IOException;
import java.io.InputStream;

import android.app.Activity;
import android.content.Context;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Canvas;
import android.graphics.Paint;
import android.graphics.Rect;
import android.os.Bundle;
import android.view.View;

public class SimpleAnimationActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // This will hold our single bitmap which will contain all frames stacked on top of each other
        Bitmap animBmp = null;
        try {
            // First, load the file: this is supposed to be in the assets folder of the project
            InputStream animStream = this.getAssets().open("anim.bmp");
            // BitmapFactory needs a BufferedInputStream, so let's just convert our InputStream
            BufferedInputStream animBufStream = new BufferedInputStream(animStream);
            // OK, now we decode it to an Android Bitmap
            animBmp = BitmapFactory.decodeStream(animBufStream);
            // And finally, we clean up
            animBufStream.close();
            animStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }

        // This will be used to operate the loop... looping on Android needs a Runnable, and this class
        // provides it. When run, it will invalidate the view, which causes onDraw to be called.
        class AnimLoop implements Runnable {
            private View view;
            AnimLoop(View v) { view = v; }
            @Override public void run() { view.invalidate(); }
        }

        // And here the actual view
        class AnimView extends View {
            // Time between frames
            public int frameTime = 100;
            // Number of frames
            public int frames = -1;
            // Width of each frame
            public int frameWidth = 0;
            // Height of each frame
            public int frameHeight = 0;
            // The previous frame number shown
            public int lastFrame = 0;
            // This is our Runnable that will call invalidate()
            public AnimLoop loop = null;
            // Of course, the actual Bitmap
            public Bitmap animBmp = null;
            // Just a default Paint object, since it's required by drawBitmap
            public Paint paint = null;

            public AnimView(Context ctx, Bitmap bmp, int frms) {
                super(ctx);
                // Assign the args to our attributes
                animBmp = bmp;
                frames = frms;
                // Since the images are stacked on top of each other, each frame is
                // widthOfBitmap * (heightOfBitmap / numberOfFrames)
                frameWidth = bmp.getWidth();
                frameHeight = bmp.getHeight() / frames;
                // And finally, our default Paint object
                paint = new Paint();
                // And the Runnable that will cause invalidate to be called
                loop = new AnimLoop(this);
            }

            protected void onDraw(Canvas c) {
                // Draw the rect that corresponds to the current frame
                c.drawBitmap(
                        animBmp
                        // We need to move down by frame height * the frame index
                        , new Rect(0, lastFrame * frameHeight, frameWidth, (lastFrame + 1) * frameHeight)
                        // This is our target rect... draw wherever you want
                        , new Rect(0, 0, frameWidth, frameHeight)
                        // And our standard paint settings
                        , paint);
                // Increase the frame number by one; if it's >= the total number of frames, wrap back to 0
                lastFrame = (lastFrame + 1) % frames;
                // And ask for the next frame to be drawn in frameTime milliseconds.
                this.getHandler().postDelayed(loop, frameTime);
            }
        };

        // Arguments: Activity/Context, Bitmap, Frame Count
        AnimView v = new AnimView(this, animBmp, 3);
        setContentView(v);
    }
}
[2011-08-21 01:53:39 - SimpleAnimation] ERROR: Application requires API version 10. Device API version is 8 (Android 2.2).
[2011-08-21 01:53:39 - SimpleAnimation] Launch canceled!
Right now I feel like a 10-year-old who doesn't know how to spell.
Any quick fix to make it 2.2 compatible?
Yep, go to Project / Properties / Android and set the checkmark to a lower target... the code isn't actually using anything from that API level, I just set it to something. Any 2.x should be fine, probably even 1.x.
Thanks for all your awesome work!
But I have one problem: the picture. anim.bmp never gets placed on the screen. I have altered the x and y values, but it seems like it never gets loaded. I have changed everything I thought needed to be changed.
Let's keep this simple. After I'm done with the above, I want to place one image at a specific x and y, and I want it to be clickable. Nobody really goes into detail about how to even get a picture into the app. Everyone assumes it's so simple...
That's... troubling. You mean the unchanged project right? It works perfectly here and I can't see anything that would cause issues with the code, so it must be the project setup. Could you open the compiled APK with WinRAR or 7zip and verify that there is indeed a folder named assets, containing anim.bmp?
It's supposed to appear at 0,0... to move it, you'd have to change this line
,new Rect(0,0,frameWidth,frameHeight)
to
,new Rect(desiredX,desiredY,desiredX+frameWidth,desiredY+frameHeight)
I'm not really qualified to tell you how to work with resources... I just plain don't like the system, so I usually work around it and use assets instead, which are simply put into the assets folder and that's the end of it (BUT, they don't have any automatic resolution handling, so you have to implement that yourself).
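If you do want the drawable route instead (which is what the original question asked about), a minimal sketch would be something like this, assuming the picture sits at res/drawable/car.png (the name is just an example). Unlike assets, resources do get automatic density scaling:
Code:
// Decode once (e.g. in the constructor of your View), not on every onDraw:
Bitmap car = BitmapFactory.decodeResource(getResources(), R.drawable.car);

// Then, inside onDraw, instead of drawRect:
canvas.drawBitmap(car, square1.x, square1.y, paint);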
Yay, it worked! If I want to use this code in my app, do I create a new class for it or just dump everything into the Activity? Because right now, when I dumped it into the Activity, it just made a white screen. No animation or anything. It just prints out the background.
Phew! Your code will usually consist of a bunch of classes which inherit from View, which should each be contained in their own file. This AnimView will just be one more class among your collection.
According to the guidelines, you should then reference your Classes in res/layout/*.xml, but I've never done that (this strange, broken thing that Android calls XML gives me a headache), so well... you've got to find out how to do that without my help.
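For completeness, a minimal sketch of that XML route (the class and package names below are the ones from the sample project; everything else is a placeholder): a custom View referenced from a layout needs the two-argument constructor that the inflater calls, and it is then declared by its fully qualified class name.
Code:
// Extra constructor Android calls when inflating the view from XML:
public AnimView(Context ctx, AttributeSet attrs) {
    super(ctx, attrs);
    // Load the bitmap here, e.g. via ctx.getAssets() or BitmapFactory.decodeResource().
}
And in res/layout/main.xml, inside the LinearLayout:
Code:
<net.tapperware.simpleanimation.AnimView
    android:layout_width="fill_parent"
    android:layout_height="fill_parent" />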
Why does it still give me a white screen? It is now back to the original and no new view is implemented.
I know my code looks horrible.
You load R.layout.main... but from what I can tell, there's nothing in there?
It is supposed to be loaded, I think. If you mean the Activity.
And yes it isn't that much code. I recently started.
Where?:
StepByStepActivity:
Code:
public class StepByStepActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        (...)
        setContentView(R.layout.main);
    }
}
This loads R.layout.main... as far as I can tell, that's the only thing being executed on load, aside from a few config changes.
res/layout/main.xml:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
(...) android:id="@+id/mainlayout" (...)>
</LinearLayout>
That defines a linear layout... but where is its content?
This is the content: 5 squares and some lines drawn so it would be easier to see where they are supposed to go.
Code:
@Override
protected void onDraw(Canvas canvas) {
    Paint paint = new Paint();
    paint.setColor(android.graphics.Color.RED);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square1.x, square1.y, square1.x + 20, square1.y + 20, paint);
    paint.setColor(android.graphics.Color.BLUE);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square2.x, square2.y, square2.x + 20, square2.y + 20, paint);
    paint.setColor(android.graphics.Color.GREEN);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square3.x, square3.y, square3.x + 20, square3.y + 20, paint);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square4.x, square4.y, square4.x + 20, square4.y + 20, paint);
    paint.setColor(android.graphics.Color.YELLOW);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawRect(square5.x, square5.y, square5.x + 20, square5.y + 20, paint);

    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(95, 200, 650, 200, paint);
    // 6: horizontal
    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(100, 200, 100, 480, paint);
    // 5: vertical
    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(650, 200, 650, 480, paint);
    // 4: vertical
    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(600, 0, 600, 200, paint);
    // 3: vertical
    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(340, 200, 340, 480, paint);
    // 2: vertical
    paint.setStrokeWidth(3);
    paint.setColor(android.graphics.Color.BLACK);
    paint.setStyle(Paint.Style.FILL);
    canvas.drawLine(180, 0, 180, 200, paint);
    // 1: vertical
}
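For reference, that onDraw only shows anything once the View that owns it is actually attached to what the Activity puts on screen. A minimal sketch of hooking it into the layout above, assuming the drawing class is called RoadView (the name is just a placeholder):
Code:
@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    // Add the custom view into the (otherwise empty) LinearLayout from main.xml
    LinearLayout layout = (LinearLayout) findViewById(R.id.mainlayout);
    layout.addView(new RoadView(this));
}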
I'm a very new Java developer, having started learning it last month specifically for this project. I have fifteen years of development experience, but it has almost all been web related (html, JS, JQuery, ColdFusion, etc), so this is a major paradigm change that I'm having trouble wrapping my head around.
Anyway, I'm attempting to create a movie-based live wallpaper to sell on the app store. I have a 15 second mpg (or 450 png frames) derived from some rendered artwork I did, the bottom 35% of which has motion (the rest remains relatively static). I'd like code flexible enough to handle future animations as well, though, as I just rediscovered Vue and may do other videos where the entire frame has motion.
My initial attempts are detailed on my Stack Overflow question at: (link removed due to forum rules; findable with the title: How do you create a video live wallpaper).
That post, in short, boils down to having tried these different approaches:
1. Load frames into a bitmap array and display them on a canvas in a loop; excellent FPS, but hundreds of MB of memory use.
2. Load frames into a byte array as JPEGs and decode them during display; clocking in at only 10 FPS at 60% CPU usage on powerful hardware, but with good memory usage (a sketch of this follows the list).
3. Load a tiled sprite with all 450 frames into AndEngine as a texture and display it; went OOM while trying to allocate 200 MB of memory.
4. AndEngine again. Load a tiled JPEG with 10 frames into a sprite, load the next tiled JPEG into a second sprite, every 400 ms hide one sprite and display the second, then load the upcoming JPEG into the hidden sprite; rinse, repeat. Essentially an attempt to decode into a makeshift buffer.
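As a rough sketch of what approach 2 boils down to (holder stands for the live wallpaper Engine's SurfaceHolder, and frames/currentFrame for the in-memory JPEG frames; none of this is the actual project code):
Code:
// Decode the next JPEG frame and blit it to the wallpaper surface.
BitmapFactory.Options opts = new BitmapFactory.Options();
opts.inPreferredConfig = Bitmap.Config.RGB_565; // roughly halves memory vs. ARGB_8888
// On API 11+ a previously decoded Bitmap of the same size can be reused:
// opts.inBitmap = reusableBitmap; opts.inMutable = true;
byte[] jpg = frames.get(currentFrame);
Bitmap frame = BitmapFactory.decodeByteArray(jpg, 0, jpg.length, opts);

Canvas c = holder.lockCanvas();
try {
    c.drawBitmap(frame, 0, 0, null);
} finally {
    holder.unlockCanvasAndPost(c);
}
currentFrame = (currentFrame + 1) % frames.size();
Doing the decode on a background thread and only the draw on the surface is the usual way to claw back frame rate.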
I feel like maybe method 4 has promise and am including the code I'm using below. However, every time the sprites are swapped out the screen freezes for as long as a second or two. I tried adding timers between every line of code to determine what's taking so much time, but they almost always come back with barely a millisecond or two taken, leaving me confused about where the freeze is occurring. But I don't understand AndEngine well yet (or even Java) so I may be doing something completely boneheaded.
I'd welcome any thoughts, whether a refinement on an existing method or a completely new idea. I've had a horrible time trying to find tutorials on doing this, and questions I find here and on SO generally don't offer much encouragement. I just want to get this thing finished so I can concentrate on the heart of this project: the art. Thanks!
As an aside, how much work would this be (ie: how much would it cost) for an experienced developer to create a template for me? I wouldn't mind paying a small amount for something I can keep using with future animations.
Code:
public void onCreateResources(OnCreateResourcesCallback pOnCreateResourcesCallback) throws Exception {
    scene = new Scene();
    initializePreferences();
    // Water
    waterTexture = new BitmapTextureAtlas(this.getTextureManager(), 1200, 950, TextureOptions.BILINEAR);
    waterRegion = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(waterTexture, this.getAssets(), "testten1.jpg", 0, 0, 2, 5);
    waterTexture.load();

    waterTexture2 = new BitmapTextureAtlas(this.getTextureManager(), 1200, 950, TextureOptions.BILINEAR);
    waterRegion2 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(waterTexture2, this.getAssets(), "testten2.jpg", 0, 0, 2, 5);
    waterTexture2.load();

    water = new AnimatedSprite(0, 0, waterRegion, this.getVertexBufferObjectManager());
    water2 = new AnimatedSprite(0, 0, waterRegion2, this.getVertexBufferObjectManager());

    scene.attachChild(water);
    water.animate(40);
    mHandler.postDelayed(mUpdateDisplay, 400);
}

private final Handler mHandler = new Handler();

private final Runnable mUpdateDisplay = new Runnable() {
    @Override
    public void run() {
        changeWater();
    }
};

public void changeWater() {
    mHandler.removeCallbacks(mUpdateDisplay);
    mHandler.postDelayed(mUpdateDisplay, 400);
    if (curWaterTexture == 1) {
        Log.w("General", "Changed texture to 2 with resource: " + curWaterResource);
        curWaterTexture = 2;
        scene.attachChild(water2);
        water2.animate(40);
        scene.detachChild(water);
        curWaterResource = curWaterResource + 1;
        if (curWaterResource > 4) curWaterResource = 1;
        String resourceName = "testten" + curWaterResource + ".jpg";
        waterRegion = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(waterTexture, this.getAssets(), resourceName, 0, 0, 2, 5);
        waterTexture.load();
        water = new AnimatedSprite(0, 0, waterRegion, this.getVertexBufferObjectManager());
    } else {
        Log.w("General", "Changed texture to 1 with resource: " + curWaterResource);
        curWaterTexture = 1;
        scene.attachChild(water);
        water.animate(40);
        scene.detachChild(water2);
        curWaterResource = curWaterResource + 1;
        if (curWaterResource > 4) curWaterResource = 1;
        String resourceName = "testten" + curWaterResource + ".jpg";
        waterRegion2 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(waterTexture2, this.getAssets(), resourceName, 0, 0, 2, 5);
        waterTexture2.load();
        water2 = new AnimatedSprite(0, 0, waterRegion2, this.getVertexBufferObjectManager());
    }
}
The launch of HMS Core Video Editor Kit 6.2.0 has brought two notable highlights: various AI-empowered capabilities and flexible integration methods. One method is to integrate the fundamental capability SDK, which is described below.
Preparations
For details, please check the official document.
Code Development
Configuring a Video Editing Project
1. Set the authentication information for your app through an API key or access token.
Use the setAccessToken method to set an access token when the app is started. The access token needs to be set only once.
Code:
MediaApplication.getInstance().setAccessToken("your access token");
Use the setApiKey method to set an API key when the app is started. The API key needs to be set only once.
Code:
MediaApplication.getInstance().setApiKey("your ApiKey");
2. Set a License ID.
This ID is used to manage your usage quotas, so ensure that the ID is unique.
Code:
MediaApplication.getInstance().setLicenseId("License ID");
3. Initialize the running environment for HuaweiVideoEditor.
When creating a video editing project, first create a HuaweiVideoEditor object and initialize its running environment. When exiting a video editing project, release the HuaweiVideoEditor object.
Create a HuaweiVideoEditor object.
Code:
HuaweiVideoEditor editor = HuaweiVideoEditor.create(getApplicationContext());
Specify the position for the preview area.
This area renders video images; the fundamental capability SDK implements it by creating a SurfaceView. Ensure that the preview area's position in your app is specified before the area is created.
Code:
<LinearLayout
    android:id="@+id/video_content_layout"
    android:layout_width="0dp"
    android:layout_height="0dp"
    android:background="@color/video_edit_main_bg_color"
    android:gravity="center"
    android:orientation="vertical" />

// Specify the preview area position.
LinearLayout mSdkPreviewContainer = view.findViewById(R.id.video_content_layout);
// Set the layout of the preview area.
editor.setDisplay(mSdkPreviewContainer);
Initialize the running environment. If the license verification fails, LicenseException will be thrown.
After the HuaweiVideoEditor object is created, it does not yet occupy any system resources. You decide when to initialize its running environment; at that point, the necessary threads and timers are created in the fundamental capability SDK.
Code:
try {
    editor.initEnvironment();
} catch (LicenseException error) {
    SmartLog.e(TAG, "initEnvironment failed: " + error.getErrorMsg());
    finish();
    return;
}
4. Add a video or image.
Create a video lane and add a video or image to the lane using the file path.
Code:
// Obtain the HVETimeLine object.
HVETimeLine timeline = editor.getTimeLine();
// Create a video lane.
HVEVideoLane videoLane = timeline.appendVideoLane();
// Add a video to the end of the video lane.
HVEVideoAsset videoAsset = videoLane.appendVideoAsset("test.mp4");
// Add an image to the end of the video lane.
HVEImageAsset imageAsset = videoLane.appendImageAsset("test.jpg");
5. Add audio.
Create an audio lane and add audio to the lane using the file path.
Code:
// Create an audio lane.
HVEAudioLane audioLane = timeline.appendAudioLane();
// Add an audio asset to the end of the audio lane.
HVEAudioAsset audioAsset = audioLane.appendAudioAsset("test.mp3");
6. Add a sticker and text.
Create a sticker lane and add a sticker and text to the lane. A sticker needs to be added using its file path, while the text needs to be added by specifying its content.
Code:
// Create a sticker lane.
HVEStickerLane stickerLane = timeline.appendStickerLane();
// Add a sticker to the end of the lane.
HVEStickerAsset stickerAsset = stickerLane.appendStickerAsset("test.png");
// Add text to the end of the lane.
HVEWordAsset wordAsset = stickerLane.appendWord("Input text",0,3000);
7. Add a special effect.
Special effects are classified into the external special effect and embedded special effect.
Add an external special effect to an effect lane. This special effect can be applied to multiple assets, and its duration can be adjusted.
Code:
// Create an effect lane.
HVEEffectLane effectLane = timeline.appendEffectLane();
// Create a color adjustment effect with a duration of 3000 ms. Add it to the 0 ms playback position of the lane.
HVEEffect effect = effectLane.appendEffect(new HVEEffect.Options(HVEEffect.EFFECT_COLORADJUST, "", ""), 0, 3000);
Add an embedded special effect to an asset. Such a special effect can be applied to a single asset. The special effect's duration is the same as that of the asset and cannot be adjusted.
Code:
// Create an embedded special effect of color adjustment.
HVEEffect effect = videoAsset.appendEffectUniqueOfType(new HVEEffect.Options(HVEEffect.EFFECT_COLORADJUST, "", ""), ADJUST);
8. Play a timeline.
To play a timeline, specify its start time and end time. The timeline will play from its start time to its end time at a fixed frame rate, and the image and sound in the preview will play simultaneously. You can obtain the playback progress, playback pause, playback completion, and playback failure via the registered callback.
Code:
// Register the playback progress callback.
editor.setPlayCallback(callback);
// Play the complete timeline.
editor.playTimeLine(timeline.getStartTime(), timeline.getEndTime());
9. Export a video.
After the editing is complete, export a new video using the assets in the timeline via the export API. Set the export callback to listen to the export progress, export completion, or export failure, and specify the frame rate, resolution, and path for the video to be exported.
Code:
// Path for the video to be exported.
String outputPath =
Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES)
+ File.separator + Constant.LOCAL_VIDEO_SAVE_PATH
+ File.separator + VideoExportActivity.getTime() + ".mp4";
// Resolution for the video to be exported.
HVEVideoProperty videoProperty = new HVEVideoProperty(1920, 1080);
// Export the video.
HVEExportManager.exportVideo(targetEditor, callback, videoProperty, outputPath);
Managing Materials
After allocating materials, use APIs provided in the on-cloud material management module to query and download a specified material. For details, please refer to the description in the official document.
Integrating an AI-Empowered Capability
The fundamental capability SDK of Video Editor Kit provides multiple AI-empowered capabilities including AI filter, track person, moving picture, and AI color, for integration into your app. For more details, please refer to the instruction in this document.
AI Filter
Lets users flexibly customize and apply a filter to their imported videos and images.
Code:
// Create an AI algorithm engine for AI filter.
HVEExclusiveFilter mFilterEngine = new HVEExclusiveFilter();
// Initialize the engine.
mFilterEngine.initExclusiveFilterEngine(new HVEAIInitialCallback() {
    @Override
    public void onProgress(int progress) {
        // Initialization progress.
    }

    @Override
    public void onSuccess() {
        // The initialization is successful.
    }

    @Override
    public void onError(int errorCode, String errorMessage) {
        // The initialization failed.
    }
});

// Create an AI filter of the extract type from an image, by specifying the image bitmap and filter name.
// The filter ID is returned. Using the ID, you can query all information about the filter in the database.
String effectId = mFilterEngine.createExclusiveEffect(bitmap, "AI filter 01");

// Add the filter for the first 3000 ms segment of the effect lane.
effectLane.appendEffect(new HVEEffect.Options(
        HVEEffect.CUSTOM_FILTER + mSelectName, effectId, ""), 0, 3000);
Color Hair
Changes the hair color of one or more persons detected in the imported image, in just a tap. The color strength is adjustable.
Code:
// Initialize the AI algorithm for the color hair effect.
asset.initHairDyeingEngine(new HVEAIInitialCallback() {
    @Override
    public void onProgress(int progress) {
        // Initialization progress.
    }

    @Override
    public void onSuccess() {
        // The initialization is successful.
    }

    @Override
    public void onError(int errorCode, String errorMessage) {
        // The initialization failed.
    }
});

// Add the color hair effect by specifying a color and the default strength.
asset.addHairDyeingEffect(new HVEAIProcessCallback() {
    @Override
    public void onProgress(int progress) {
        // Handling progress.
    }

    @Override
    public void onSuccess() {
        // The handling is successful.
    }

    @Override
    public void onError(int errorCode, String errorMessage) {
        // The handling failed.
    }
}, colorPath, defaultStrength);

// Remove the color hair effect.
asset.removeHairDyeingEffect();
Moving Picture
Animates one or more persons in the imported image, so that they smile, nod, or more.
Code:
// Add the moving picture effect.
asset.addFaceReenactAIEffect(new HVEAIProcessCallback() {
    @Override
    public void onProgress(int progress) {
        // Handling progress.
    }

    @Override
    public void onSuccess() {
        // The handling is successful.
    }

    @Override
    public void onError(int errorCode, String errorMessage) {
        // The handling failed.
    }
});

// Remove the moving picture effect.
asset.removeFaceReenactAIEffect();
This article presents just a few features of Video Editor Kit. For more, check here.
To learn more, please visit:
>> HUAWEI Developers official website
>> Development Guide
>> Reddit to join developer discussions
>> GitHub to download the sample code
>> Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.