[Q] Draw Objects on Screen - Android Q&A, Help & Troubleshooting

I'm starting to build an app that needs 2D drawing in Android Studio, and even after some searching I still have trouble understanding a few things. My intention is to draw balls as objects, so that I can specify the radius and location of each one to draw. Each ball should be a filled circle.
So here are my questions:
1. How do I create the Ball class in the most organized way (its constructors, methods, and so on)?
2. Where do I call that class to create balls? (What should I do in the onCreate() method? Should my code go in my MainActivity class?)
3. How do I draw them on a timer? (Create a ball with a random radius and position and draw it on screen.)
4. Where do I draw them? (What container?)
I've seen a lot of examples using Canvas and the mysterious onDraw() method, but I didn't get much out of them.
I didn't understand how to control when the drawing happens.
I used the code below to learn and to get some control over drawing objects, but it seems too messy and doesn't feel right.
It's a sensor-controlled ball.
BubbleView.java
Code:
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Point;
import android.graphics.drawable.ShapeDrawable;
import android.graphics.drawable.shapes.OvalShape;
import android.graphics.drawable.shapes.RectShape;
import android.view.Display;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;

public class BubbleView extends View {

    private int diametro;           // ball diameter
    private int x;
    private int y;
    private int Xt;                 // last touch coordinates
    private int Yt;
    private ShapeDrawable bubble;
    private ShapeDrawable rect;
    private boolean isTouch = false;
    private int width;
    private int height;
    private WindowManager wm;
    private Display display;
    private Point size;
    private Point centroBola;
    private Point centroRet;
    //private Color Retcor;
    //private Color Bolacor;

    public BubbleView(Context context) {
        super(context);
        /*-- Instances --*/
        bubble = new ShapeDrawable(new OvalShape());
        rect = new ShapeDrawable(new RectShape());
        size = new Point();
        //Retcor = new Color();
        //Bolacor = new Color();
        createBubble();
    }

    private void createBubble() {
        wm = (WindowManager) getContext().getSystemService(Context.WINDOW_SERVICE);
        display = wm.getDefaultDisplay();
        display.getSize(size);
        width = size.x;
        height = size.y;
        x = width / 2;
        y = height / 2;
        //centroBola.set(x + (diametro / 2), y - (diametro / 2));
        diametro = 100;
        bubble.setBounds(x - diametro, y - diametro, x + diametro, y + diametro);
        rect.setBounds(0, 10, size.x - 1, 80);
        bubble.getPaint().setColor(0xff74AC23);
        //rect.getPaint().setColor(0xff74AC23); //0xffEE4400
    }

    protected void move(float f, float g) {
        if (isTouch) {
            x = Xt;
            y = Yt;
        } else {
            if (bubble.getBounds().right > width) {
                x = width - diametro - 1;
                y = (int) (y + g);
            }
            if (bubble.getBounds().left < 0) {
                x = 0;
                y = (int) (y + g);
            }
            if (bubble.getBounds().top < 0) {
                x = (int) (x - f);
                y = 1;
            }
            if (bubble.getBounds().bottom > height) {
                x = (int) (x - f);
                y = height - diametro - 1;
            }
            x = (int) (x - f);
            y = (int) (y + g);
        }
        //rect.setBounds(x + 40, y + 40, x + diametro, y + diametro);
        bubble.setBounds(x, y, x + diametro, y + diametro);
        invalidate();   // request a redraw now that the bounds have changed
    }

    @Override
    public boolean onTouchEvent(MotionEvent event) {
        Xt = (int) event.getX();
        Yt = (int) event.getY();
        int evento = event.getAction();
        switch (evento) {
            case MotionEvent.ACTION_DOWN:
                bubble.setBounds(0, 0, Xt + diametro, Yt + diametro);
                //rect.setBounds(Xt, Yt, Xt + diametro, Yt + diametro);
                //coords.setText("X: " + X + ", Y: " + Y);
                isTouch = true;
                break;
            case MotionEvent.ACTION_MOVE:
                //coords.setText("X: " + X + ", Y: " + Y);
                break;
            case MotionEvent.ACTION_UP:
                //Toast.makeText(this, "Left the screen at: X: " + X + ", Y: " + Y, Toast.LENGTH_SHORT);
                isTouch = false;
                break;
        }
        return true;
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        //rect.draw(canvas);
        bubble.draw(canvas);
    }
}
So I tried to build my app using this as a reference, but without success.
How should I structure the code so it becomes more intuitive?
Thanks in advance.
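One possible way to organize this (a minimal sketch, not from the original post): a plain Ball data class, a custom View that owns a list of balls and paints them in onDraw(), and a Handler used as the "timer" that adds a random ball and calls invalidate(). All names here (Ball, BallView, SPAWN_INTERVAL_MS) are illustrative assumptions, not a definitive implementation.
Code:
class Ball {
    final float cx, cy, radius;
    final int color;

    Ball(float cx, float cy, float radius, int color) {
        this.cx = cx;
        this.cy = cy;
        this.radius = radius;
        this.color = color;
    }
}

public class BallView extends View {

    private static final long SPAWN_INTERVAL_MS = 1000;    // one new ball per second

    private final List<Ball> balls = new ArrayList<>();
    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private final Random random = new Random();
    private final Handler handler = new Handler(Looper.getMainLooper());

    // The "timer": re-posts itself every SPAWN_INTERVAL_MS on the main thread.
    private final Runnable spawner = new Runnable() {
        @Override
        public void run() {
            addRandomBall();
            invalidate();                                   // schedule a redraw
            handler.postDelayed(this, SPAWN_INTERVAL_MS);
        }
    };

    public BallView(Context context) {
        super(context);
        paint.setStyle(Paint.Style.FILL);                   // filled circles
    }

    private void addRandomBall() {
        if (getWidth() == 0 || getHeight() == 0) return;    // view not laid out yet
        float radius = 20 + random.nextInt(80);
        float cx = radius + random.nextFloat() * (getWidth() - 2 * radius);
        float cy = radius + random.nextFloat() * (getHeight() - 2 * radius);
        int color = Color.rgb(random.nextInt(256), random.nextInt(256), random.nextInt(256));
        balls.add(new Ball(cx, cy, radius, color));
    }

    @Override
    protected void onAttachedToWindow() {
        super.onAttachedToWindow();
        handler.postDelayed(spawner, SPAWN_INTERVAL_MS);    // start the timer with the view
    }

    @Override
    protected void onDetachedFromWindow() {
        handler.removeCallbacks(spawner);                   // stop it with the view
        super.onDetachedFromWindow();
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        for (Ball b : balls) {
            paint.setColor(b.color);
            canvas.drawCircle(b.cx, b.cy, b.radius, paint);
        }
    }
}
(Imports needed: android.content.Context, android.graphics.Canvas/Color/Paint, android.os.Handler/Looper, android.view.View, java.util.ArrayList/List/Random.) With this split, the view itself is the container: in MainActivity.onCreate() it can simply be set as the content view, for example setContentView(new BallView(this)), so the Activity only wires things up and all drawing state lives inside BallView.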

Related

[Q] Working with simple XML array

Not sure why it crashes while looping through the array:
Code:
final String tname[] = getResources().getStringArray(R.array.tname);
final int tid[] = getResources().getIntArray(R.array.tid);
final int tlv[] = getResources().getIntArray(R.array.tlv);
int tlength = tname.length;
String pname[] = new String[tlength];
int pid[] = new int[tlength];
for (int i = 0; i < tlength; i++) {
    if (tlv[i] <= 0) {
        pname[i] = tname[i];
        pid[i] = tid[i];
    }
}
setListAdapter(new ArrayAdapter<String>(this, R.layout.pMenu, pname));
Thanks in advance!
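Not part of the original post, just a guess at likely failure points: if tid or tlv is shorter than tname, the loop can throw ArrayIndexOutOfBoundsException, and even when the lengths match, pname keeps null entries for every index that fails the tlv check, which ArrayAdapter will trip over when it tries to render them. A sketch under those assumptions, collecting only the matching entries (the switch to a List and to android.R.layout.simple_list_item_1 are my choices, not the original code):
Code:
// Needs: java.util.ArrayList, java.util.List, android.widget.ArrayAdapter.
final String[] tname = getResources().getStringArray(R.array.tname);
final int[] tid = getResources().getIntArray(R.array.tid);
final int[] tlv = getResources().getIntArray(R.array.tlv);

List<String> pname = new ArrayList<>();
List<Integer> pid = new ArrayList<>();

// Use the shortest length so a mismatch between the three arrays
// cannot cause an ArrayIndexOutOfBoundsException.
int length = Math.min(tname.length, Math.min(tid.length, tlv.length));
for (int i = 0; i < length; i++) {
    if (tlv[i] <= 0) {
        pname.add(tname[i]);
        pid.add(tid[i]);
    }
}

// simple_list_item_1 is only a known-good row layout for illustration; the
// default ArrayAdapter also requires the row resource to be (or contain) a TextView.
setListAdapter(new ArrayAdapter<String>(this, android.R.layout.simple_list_item_1, pname));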

[Q] How to use multi-threaded class in bootanimation?

Hi Droid experts,
For the stock bootanimation in JB there is a single-threaded animation process/class inside. If I'd like to implement the power-on sound/tune as well, what's the best programming scheme/model for it?
So far, to the best of my understanding, I did something in BootAnimation.cpp as shown below.
Did I do something wrong? The handset boots up, shows the animation first, and only then plays the sound.
However, running bootanimation from a normal adb shell session works fine (I mean both the animation and the sound play at the same time).
Is init multi-threaded?
Anthony
Code:
class PlayerListener : public MediaPlayerListener, public Thread
{
public:
    PlayerListener() : Thread(false), mp(NULL) {}
    ~PlayerListener() { delete mp; mp = NULL; }
    virtual void notify(int, int, int, const Parcel *) {}
    virtual bool threadLoop();
    virtual status_t readyToRun();
private:
    MediaPlayer* mp;
};

// ----------------------------------------------------------------------

bool PlayerListener::threadLoop()
{
    if (mp != NULL)
        mp->start();
    sleep(100);
    requestExit();
    return false;
}

status_t PlayerListener::readyToRun()
{
    int index = 7;
    audio_devices_t device;
    bool r;
    mp = new MediaPlayer();
    mp->setListener(this);
    if (mp->setDataSource(SYSTEM_BOOTANIMATION_SOUND_FILE, NULL) == NO_ERROR) {
        mp->setAudioStreamType(AUDIO_STREAM_ENFORCED_AUDIBLE);
        mp->prepare();
        mp->setLooping(false);
    }
    // AudioSystem::getStreamVolumeIndex(AUDIO_STREAM_ENFORCED_AUDIBLE, &index);
    device = AudioSystem::getDevicesForStream(AUDIO_STREAM_ENFORCED_AUDIBLE);
    r = AudioSystem::setStreamVolumeIndex(AUDIO_STREAM_ENFORCED_AUDIBLE, index, device);
    r = AudioSystem::getStreamVolumeIndex(AUDIO_STREAM_ENFORCED_AUDIBLE, &index, device);
    if (index != 0) {
        mp->seekTo(0);
        // mp->start();
    }
    return NO_ERROR;
}

bool BootAnimation::threadLoop()
{
    bool r;
    sp<PlayerListener> player = new PlayerListener();
    player->run("BootAnimatedMelody", PRIORITY_AUDIO);
    if (mAndroidAnimation) {
        r = android();
    } else {
        r = movie();
    }

Trouble with string encode after decompile APK file

Hello,
I'm having trouble with some encoded text strings after decompiling an APK file.
I used Apktool to decompile it.
The encoded strings are in this code:
Code:
public class API
  extends Activity
{
    public static String a = "081b11458016006d513b0290cc7be0b39c28626ea506b9ed291b125114f2a38369a42e77a066a7789e5883ed47113fc3";
    public static String b = "081b11458016006d513b0290cc7be0b36f01584bcfccde8bd9e2bd628ca804e7056b175ad6a1fa1bf3cf31d8a28b94e9";
    public static String c = "081b11458016006d513b0290cc7be0b36f01584bcfccde8bd9e2bd628ca804e7056b175ad6a1fa1bf3cf31d8a28b94e9";
    public static JSONObject d = new JSONObject();
    public static String e = "";
    private static int h = 0;
    private boolean f = false;
    private ProgressDialog g;

    private JSONObject a(JSONObject paramJSONObject)
    {
        int i = 0;
        Iterator localIterator = paramJSONObject.keys();
        int[] arrayOfInt = new int[paramJSONObject.length()];
        int j = 0;
        JSONObject localJSONObject1;
        int m;
        if (!localIterator.hasNext())
        {
            Arrays.sort(arrayOfInt);
            localJSONObject1 = new JSONObject();
            m = arrayOfInt.length;
        }
        for (;;)
        {
            if (i >= m)
            {
                return localJSONObject1;
                int k = j + 1;
                arrayOfInt[j] = Integer.parseInt(((String)localIterator.next()).toString());
                j = k;
                break;
            }
            String str1 = Integer.toString(arrayOfInt[i]);
            try
            {
                JSONObject localJSONObject2 = (JSONObject)paramJSONObject.get(str1);
                if (((paramJSONObject.get(str1) instanceof JSONObject)) && (!a(localJSONObject2.get("package").toString())))
                {
                    String str2 = localJSONObject2.get("package").toString();
                    if (!getApplicationContext().getSharedPreferences("listpacks", 0).getBoolean(str2, false)) {
                        if (localJSONObject2.get("type").toString().equals("1"))
                        {
                            if (!this.f)
                            {
                                this.f = true;
                                d = localJSONObject2;
                            }
                        }
                        else
                        {
                            e = e + localJSONObject2.get("package").toString() + "|";
                            localJSONObject1.put(str1, localJSONObject2);
                        }
                    }
                }
            }
            catch (JSONException localJSONException)
            {
                localJSONException.printStackTrace();
            }
            i++;
        }
    }
I've searched many forums but can't find an answer.
Can anybody help me decode the three strings a, b, and c, please?
Update: I have managed to decode string a step by step by replacing characters, because I'm not a coder.
Result string a:
Code:
081b11458016006d513b0290cc7be0b39c28626ea506b9ed291b125114f2a38369a42e77a066a7789e5883ed47113fc3
is
Code:
http_abc.com/defgh_v2/api_in.php?app_name=9apps&api=19
New update, maybe an easier question: can anybody tell me what type of encoding this is? (Again, I'm not a coder, but I am a Google Man, hihi)
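Not from the original thread, only an observation plus a sketch: each string is 96 hexadecimal characters, i.e. 48 raw bytes, and strings a and b share the same first 16 bytes, which looks more like block-cipher ciphertext than a simple character substitution; the actual key/scheme can't be guessed from here. A minimal Java sketch that only performs the hex-to-bytes step (the hexToBytes helper is illustrative):
Code:
// A sketch only: converts the 96-character hex string into its 48 raw bytes.
// Whatever cipher or obfuscation produced those bytes is a separate, unknown step.
public class HexDecode {
    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    public static void main(String[] args) {
        String a = "081b11458016006d513b0290cc7be0b39c28626ea506b9ed291b125114f2a38369a42e77a066a7789e5883ed47113fc3";
        byte[] raw = hexToBytes(a);     // 48 bytes of (presumably) ciphertext
        System.out.println(raw.length); // prints 48
    }
}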

Camera real time image processing

I need to do some image processing in real time on mobile, using the camera2 API. I managed to put the camera preview directly on a SurfaceView, but when I redirect it through an ImageReader so I can do something with the frames before previewing them, it gets really slow, and there isn't even any real processing yet: I'm just taking the Y plane of the YUV_420_888 frame and converting it to a gray picture in ARGB_8888 format. The code is below.

That one for loop takes about 600 ms, and for the actual image processing I will need to go through the image at least one more time. In the Bits array I build the ARGB_8888 data out of the Y array, which holds the Y values of the YUV_420_888 frame. I'm testing on a phone with a quad-core 1 GHz processor and 1 GB of RAM.

Is there any way to speed this code up so I can go through the picture, say, 2-3 times and still get at least 7-10 fps? If I delete the for loop and just measure fps, I get 8-12 fps. Why is it so slow compared to the 25 fps I get when I render directly to the SurfaceView?
Code:
private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {

    @Override
    public void onOpened(@NonNull CameraDevice cameraDevice) {
        // This method is called when the camera is opened. We start camera preview here.
        mCameraOpenCloseLock.release();
        mCameraDevice = cameraDevice;
        createCameraPreviewSession();
        mImageReader.setOnImageAvailableListener(new ImageReader.OnImageAvailableListener() {
            @Override
            public void onImageAvailable(ImageReader reader) {
                Image image = reader.acquireLatestImage();
                if (image != null) {
                    ByteBuffer buffer0 = image.getPlanes()[0].getBuffer();
                    byte[] Y = new byte[buffer0.remaining()];
                    buffer0.get(Y);
                    byte[] Bits = new byte[Y.length * 4]; // That's where the RGBA array goes.
                    int Ylength = Y.length;
                    for (int i = 0; i < Ylength; i++) {
                        int i1 = i * 4;
                        Bits[i1] = Y[i];
                        Bits[i1 + 1] = Y[i];
                        Bits[i1 + 2] = Y[i];
                        Bits[i1 + 3] = -1; // 0xff, that's the alpha.
                    }
                    Bitmap bm = Bitmap.createBitmap(image.getWidth(), image.getHeight(), Bitmap.Config.ARGB_8888);
                    bm.copyPixelsFromBuffer(ByteBuffer.wrap(Bits));
                    Bitmap scaled = Bitmap.createScaledBitmap(bm, surfaceView.getWidth(), surfaceView.getHeight(), true);
                    Canvas c;
                    c = surfaceHolder.lockCanvas();
                    c.drawBitmap(scaled, 0, 0, null);
                    surfaceHolder.unlockCanvasAndPost(c);
                    image.close();
                    time2 = System.nanoTime();
                    Log.d("Vreme", Double.toString(1000000000 / (time2 - time1)) + "fps");
                    time1 = System.nanoTime();
                }
            }
        }, mBackgroundHandler);
    }

    @Override
    public void onDisconnected(@NonNull CameraDevice cameraDevice) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
    }

    @Override
    public void onError(@NonNull CameraDevice cameraDevice, int error) {
        mCameraOpenCloseLock.release();
        cameraDevice.close();
        mCameraDevice = null;
    }
};
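Not part of the original post, just a sketch of one direction to try, assuming the same surfaceView and surfaceHolder fields as above: the per-frame allocations (a 4x byte[] plus two Bitmaps) and the byte-by-byte packing are likely the main costs, so reusing buffers across frames and packing each Y value into an ARGB int once may help; the Y plane's row stride, which can be wider than the image width on some devices and which the original loop ignores, is also handled here. The drawGrayFrame name and field layout are illustrative.
Code:
// Needs: android.graphics.Bitmap, Canvas, Rect; android.media.Image; java.nio.ByteBuffer.
private int[] mPixels;   // reused ARGB pixel buffer
private Bitmap mBitmap;  // reused bitmap, created once per preview size

private void drawGrayFrame(Image image) {
    ByteBuffer yPlane = image.getPlanes()[0].getBuffer();
    int width = image.getWidth();
    int height = image.getHeight();
    int rowStride = image.getPlanes()[0].getRowStride();

    if (mPixels == null || mPixels.length != width * height) {
        mPixels = new int[width * height];
        mBitmap = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
    }

    byte[] row = new byte[rowStride];
    int p = 0;
    for (int yy = 0; yy < height; yy++) {
        // Copy one row at a time so padding bytes beyond 'width' are skipped.
        yPlane.position(yy * rowStride);
        yPlane.get(row, 0, Math.min(rowStride, yPlane.remaining()));
        for (int xx = 0; xx < width; xx++) {
            int lum = row[xx] & 0xFF;  // Y value, 0..255
            mPixels[p++] = 0xFF000000 | (lum << 16) | (lum << 8) | lum;
        }
    }
    mBitmap.setPixels(mPixels, 0, width, 0, 0, width, height);

    Canvas c = surfaceHolder.lockCanvas();
    if (c != null) {
        // Scale while drawing instead of allocating a scaled Bitmap per frame.
        c.drawBitmap(mBitmap, null,
                new Rect(0, 0, surfaceView.getWidth(), surfaceView.getHeight()), null);
        surfaceHolder.unlockCanvasAndPost(c);
    }
}
If that is still too slow, the usual next step is to move the conversion off the Java heap entirely (RenderScript or native code), but that is beyond this sketch.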

Android VideoView Resize Algorithm - Bilinear or Bicubic?

Hi community,
I was trying to resize a video played on a Google Pixel 5 so that it fits the screen. I was able to achieve that, but I need to dig a bit deeper to understand what happens under the hood. Is the resizing happening in hardware or in software? And which algorithm is used for the resize operation: bilinear, bicubic, Lanczos, or something else?
Java:
public void setVideo(String id)
{
    System.out.println("Starting video");
    System.out.println(id);
    String videoPath = Environment.getExternalStorageDirectory().getPath() + "/" + id;
    videoView.setVideoPath(videoPath);
    videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
        @Override
        public void onPrepared(MediaPlayer mp) {
            DisplayMetrics metrics = new DisplayMetrics();
            getWindowManager().getDefaultDisplay().getMetrics(metrics);
            int screenWidth = metrics.widthPixels;
            int screenHeight = metrics.heightPixels;
            float screenProportion = (float) screenWidth / (float) screenHeight;
            android.view.ViewGroup.LayoutParams lp = videoView.getLayoutParams();
            int VideoWidth = mp.getVideoWidth();
            int VideoHeight = mp.getVideoHeight();
            float videoProportion = (float) VideoWidth / (float) VideoHeight;
            if (videoProportion > screenProportion) {
                lp.width = screenWidth;
                lp.height = (int) ((float) screenWidth / videoProportion);
            } else {
                lp.width = (int) (videoProportion * (float) screenHeight);
                lp.height = screenHeight;
            }
            videoView.setLayoutParams(lp);
        }
    });
    videoView.start();
}
