[Q] Best standalone HDTV setup - Android Q&A, Help & Troubleshooting

I'm looking for advice on what compact device to hook up to my HDTV, turning it into a media center primarily and a light PC otherwise.
I'm interested in something like the RK3066 because it seems to have decent performance compared to the older single core devices.
Things it needs to do:
- run XBMC, stream media from fileserver via wifi
- stream another display to it in 1080p
So.. basically run Android or Linux effectively enough. This one supports bluetooth which seems useful for peripherals.
Things I might do with it:
- put linux and steam on it
- light browsing and basic gaming
- hook a webcam to it
I've built full HTPCs before and I don't think I can do a compact one for anywhere near this price, even though it would be way more powerful. I also don't want a single-purpose 'streaming box' solution that I can't replace the OS on. I see this as a cheaper way of doing 99% of what I want, and I can still choose to replace it at any time without feeling bad about it.
Any better performing options at this price, or similar options at lower price? Alternative hardware that I should be looking at?

Soldier Blue said:
I'm looking for advice on what compact device to hook up to my HDTV, turning it into a media center primarily and a light PC otherwise.
I'm interested in something like the RK3066 because it seems to have decent performance compared to the older single core devices.
Any better performing options at this price, or similar options at lower price? Alternative hardware that I should be looking at?
If you have a bit of patience, a few new ones based on the RK3188 are launching that are a lot better. Not only are they way faster (quad core, faster GPU, 2GB RAM - around 18k AnTuTu just as a reference), they also solved the overheating and EM interference that plague the RK3066 ones.

NixZero said:
If you have a bit of patience, a few new ones based on the RK3188 are launching that are a lot better. Not only are they way faster (quad core, faster GPU, 2GB RAM - around 18k AnTuTu just as a reference), they also solved the overheating and EM interference that plague the RK3066 ones.
That's exactly what I wanted to know, since I'm not normally following the latest for this type of hardware. Do you know when these might be available?

Related

What do you know about the Tegra 3 SoC in the Asus Prime?

-The Tegra 3 SoC (System on a Chip) is a combination of a microprocessor, a memory controller, an audio processor, a video encoder and a graphics renderer. It's designed and manufactured by Nvidia, a world leader in graphics computing, making its first appearance in the Asus Transformer Prime.
-The Tegra 3 SoC has 5 physical cores, but is limited to the performance of a quad-core. The 5th, lower-power core is activated only when the device is idle or handling light tasks, such as syncing and e-mail checking. So power consumption is always kept to a minimum when the performance of the quad-core is not needed, ensuring longer battery life. Once you run a normal or higher-demanding task on the tablet, the 5th core shuts off automatically before the 4 main cores are activated. This is all handled by the chip itself and doesn't require the user or the developer to change anything for the Android OS and applications to work this way. Android already has strong support for multi-tasking and is multi-threading friendly compared to competing operating systems on the market. So this should be good news for Asus Transformer Prime to-be users soon.
-The GPU (Graphics Processing Unit) in the Tegra 3 SoC has 12 shaders. But because Nvidia has not followed a unified-shader architecture in this ARM SoC like they've been doing in their PC and Mac discrete graphics cards, 8 of those 12 shaders are reserved for pixel work and the remaining 4 are for vertex work. Maybe Nvidia will use a unified-shader architecture in the next-generation Tegra SoCs, when ARM-based devices are ready for it. The PowerVR MP2 GPU in the iPad 2 has more raw power than the Tegra 3 GPU (actually, it's the one thing I personally like about the iPad 2, its GPU!), but the Tegra 3 GeForce (the commercial name Nvidia uses for their gaming graphics processors) should give solid 3D performance in games, especially the officially supported ones. Nvidia has a long history in 3D gaming and has been using its solid connections with game developers to bring higher-quality gaming to Android, like what we've seen with Tegra 2 SoC capabilities in the games listed in the TegraZone Android app. Add to that, games are not just GPU-bound: Tegra 3's quad cores and 1GB of system RAM (the iPad 2 has 512MB) will pump up gaming quality for sure, and the pixel density of 149ppi displays crisper images than the 132ppi of the iPad 2. Once the Asus Prime is released, it can officially be considered the highest-performing Android device in the world, especially for 3D gaming.
Well, I thought I'd have more to type, but I paused for a long time and could not think of anything to add. I only wanted to share a few things I know about the Tegra 3. I have a high interest in computer graphics/processors and have been following the Tegra project since 2008.
Some of the Asus Prime to-be-owners don't know or care that much about the technical details of the CPU in the device, and I thought I'd share with them.
Thanks and good luck.
Thanks for the info. Very interesting
As I understand it, the use of the lower power 5th core has decreased battery consumption by over 60% when compared to the earlier 2 core design. I am not sure how they are measuring consumption and the task load.
I am most excited about the tablet because of the Tegra 3.
In smartphones I find the idea of putting in more than one core quite rubbish.
It is not the best solution for a tablet or any other mobile device either. I would much rather have well-programmed software than overpowered hardware.
Still, the Tegra has a nice concept.
I think most of the time I won't use more than that 5th core. I mean, it is even powerful enough to play HD video.
I will primarily use apps that display text and images, like the browser, which is said to utilize 4 cores - but I am sure that's only because of crappy programming.
So if people finally come to their senses and start optimizing their apps, we will have one quite powerful core and 4 in backup for REAL needs. Seems like an investment in the future to me.
Sent from my Nexus One using XDA App
Straight from Wikipedia:
Tegra 3 (Kal-El) series
Processor: quad-core ARM Cortex-A9 MPCore, up to 1.4 GHz single-core mode and 1.3 GHz multi-core mode
12-Core Nvidia GPU with support for 3D stereo
Ultra low power GPU mode
40 nm process by TSMC
Video output up to 2560×1600
NEON vector instruction set
1080p MPEG-4 AVC/h.264 40 Mbps High-Profile, VC1-AP and DivX 5/6 video decode[18]
The Kal-El chip (CPU and GPU) is to be about 5 times faster than Tegra 2[19]
Estimated release date is now to be Q4 2011 for tablets and Q1 2012 for smartphones, after being set back from Nvidia's prior estimated release dates of Q2 2011,[20] then August 2011,[21] then October 2011[22]
The Tegra 3 is functionally a quad-core processor, but includes a fifth "companion" core. All cores are Cortex-A9s, but the companion core is manufactured with a special low power silicon process. This means it uses less power at low clock rates, but more at higher rates; hence it is limited to 500 MHz. There is also special logic to allow running state to be quickly transferred between the companion core and one of the normal cores. The goal is for a mobile phone or tablet to be able to power down all the normal cores and run on only the companion core, using comparatively little power, during standby mode or when otherwise using little CPU. According to Nvidia, this includes playing music or even video content.[23]
Tegra 3 officially released on November 9, 2011
Tegra 2's maximum ram limit was 1GB. Tegra 3's could be 2GB.
xTRICKYxx said:
Straight from Wikipedia:
Tegra 2's maximum ram limit was 1GB. Tegra 3's could be 2GB.
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3...so it's possible. However, the same leak/article also says its chip is clocked at 1.6GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
jerrykur said:
As I understand it, the use of the lower power 5th core has decreased battery consumption by over 60% when compared to the earlier 2 core design. I am not sure how they are measuring consumption and the task load.
You can read the white papers on the Tegra 3 over on Nvidia's website. But the chip has a controller built in that activates either the 4 main cores or the single companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes, but the same design, in order to maximize energy efficiency across the performance curve. Each process is more efficient at a different point on the power curve, so the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff
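If you want to watch this happen, the kernel exposes the hotplug state of the main cores through sysfs. Below is a rough sketch of my own (not from Nvidia's white paper): it just reads the standard Linux "present" and "online" files, so you can see how many of the four main cores are up at any moment. Per the white paper the companion core is managed below the OS, so don't expect it to show up here, and exact paths can vary by device.
Code:
// Minimal sketch (my own, not Nvidia's): read the kernel's CPU hotplug state to see
// how many of the four main cores are currently online. The companion core is handled
// in hardware/low-level software and is not expected to appear in these files.
// Standard Linux sysfs paths; layouts can differ between devices.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class OnlineCores {
    public static void main(String[] args) throws IOException {
        // "present" lists all CPUs the kernel knows about, e.g. "0-3"
        System.out.println("present: " + readLine("/sys/devices/system/cpu/present"));
        // "online" lists the cores currently powered up, e.g. "0" or "0-3"
        System.out.println("online:  " + readLine("/sys/devices/system/cpu/online"));
    }

    private static String readLine(String path) throws IOException {
        try (BufferedReader r = new BufferedReader(new FileReader(path))) {
            return r.readLine();
        }
    }
}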
RussianMenace said:
The rumor mill is churning out some specs on an upcoming Lenovo tablet with some funky specs, like 2GB DDR3...so it's possible. However, the same leak/article also says its chip is clocked at 1.6GHz, which is quite a bit out of spec, so I would take it with the usual grain of salt.
*Correction: Tegra 3 supports DDR2 AND DDR3. The original Transformer had 1GB of DDR2 @ 667MHz. The Prime has 1GB of LPDDR2 @ 1066MHz, a considerable bump in speed. Also, Tegra 3 supports up to DDR3 @ 1500MHz!
xTRICKYxx said:
I think the only compatible RAM would be DDR2. Clock speeds don't matter, as the Tegra 3 can be OC'd to 2Ghz no problem.
I'm sure it can, hopefully they increase the battery capacity to compensate for the increased power use. As far as the memory, Nvidia's site on Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2...so I don't know.
RussianMenace said:
I'm sure it can, hopefully they increase the battery capacity to compensate for the increased power use. As far as the memory, Nvidia's site on Tegra 3 lists DDR3 (though it's still running on a 32-bit bus, which may or may not be an issue with 3D games), up to 2GB. However, every bit of spec info on the Prime I can find lists DDR2...so I don't know.
The Prime's RAM speed is considerably faster than the TF101.
If it does have room to expand, could we expand or upgrade the RAM?
doeboy1984 said:
If it does have room to expand, could we expand or upgrade the RAM?
Judging by the pictures, it doesn't look like the RAM will be removable or upgradeable (the RAM is the Elpida chip right next to the processor).
xTRICKYxx said:
The Prime's RAM speed is considerably faster than the TF101.
I never said it wasn't.
What I said is that both Tegra 2 and now Tegra 3 have a single 32-bit wide memory interface, compared to the two on the A5, Exynos, Qualcomm, and OMAP4 chips. What that means is that theoretically it will have lower bandwidth, which may cause problems with upcoming games, especially considering that you now have to feed extra cores and a beefier GPU. Now, whether or not it will actually be an issue...we will have to see.
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1.. the iPad 2 was released, and it still has one of the strongest SoCs out in the field.
Even for pure CPU benches.. the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU making, can't even compete with PowerVR.. a much smaller company with a lot less money.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1.. the iPad 2 was released, and it still has one of the strongest SoCs out in the field.
Even for pure CPU benches.. the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU making, can't even compete with PowerVR.. a much smaller company with a lot less money.
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware that's the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of optimization of the software. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put, probably way over-simplified, but it lets them do more with less.
Diversion said:
Sad that the SGX543MP2 in the iPad 2 is still faster than the Tegra 3's GPU. Apple is always ahead of the curve.. Just when Android devices started becoming as fast as the iPad 1.. the iPad 2 was released, and it still has one of the strongest SoCs out in the field.
Even for pure CPU benches.. the 1GHz dual-core A5 smokes most chips running faster clocks in dual-core configs.
Regardless, this is still the most powerful Android device to date. Just disappointed that Nvidia, one of the kings of GPU making, can't even compete with PowerVR.. a much smaller company with a lot less money.
Very good point. Also Apple has the apps and games that showcase and utilize all this extra power. Even my original iPad has apps/games that I haven't seen Android dual-core equivalents of. I love my iPad but I also own an Atrix dual-core Tegra 2 phone. I know the open-sourced Android will win out in the end.
I came across a good comment in the Lenovo specs link that a member here posted in this thread.
"Google and NVidia need to seriously subsidize 3rd party app development to show ANY value and utility over iPad. Apple won't rest on its laurels as their GPU performance on the A5 is already ahead with games and APPs to prove it".
What do you all think about this? Not trying to thread-jack, as I see it's relevant to this thread also. What apps/games does Android have up its sleeve to take advantage of this new Tegra 3? The majority of Android apps/games don't even take advantage of Tegra 2 and similar SoCs yet. Are we going to have all this extra power for a while without it ever really being used to its potential? Android needs some hardcore apps and games. The iPad has all the b.s. stuff too, BUT it also has very hardcore apps and games that use it close to its full potential. IMO my jailbroken iPad 1 still trumps most of these Tegra 2 tablets out now - not because of hardware specs, but because of the quality of apps and games I have. I've noticed Android is finally starting to get more hardcore games like ShadowGun, Gameloft games, etc. I can't overclock or customize my iPad as extensively as Android but the software/apps/games I have are great. No, I don't want an iPad 2 or iPad 3. I want an Android tablet now because of the greater potential. Just like with anything in life, potential doesn't mean sh$& if it's not utilized and made a reality.
I was a Windows Mobile person first. Then I experienced dual booting with XDAndroid on my Tilt 2, and I loved it. Then I knew I wanted a real Android phone or tablet. The first Android tablet I owned, for only a day, was the Archos7IT. It was cool but I returned it since it couldn't connect to my WMwifirouter, which uses an ad-hoc network. So I researched and finally settled on taking a chance with the Apple iPad. I used to be an Apple hater to the max..lol. My iPad changed all of that. I still hate the closed system of Apple but I had to admit, the iPad worked great for what I needed and wanted to do. This iPad, which I'm writing this post on now, still works flawlessly after almost 2 years, and its specs are nowhere near the iPad 2 or all these new dual-core tablets out. I'm doing amazing stuff with only 256MB of RAM..SMH. I hated having to hook the iPad up to iTunes for everything like music and videos. So I jailbroke and got iFiles, which is basically a very detailed root file explorer. I also have the USB and SD card adapter. So now I can put my content on my iPad myself without needing to be chained to iTunes; iTunes is only good for software updates. I'm still on jailbroken 4.2.1 firmware on the iPad. Never bothered or really wanted to upgrade to the new iOS 5.0.1 out now. With all my jailbreak mods/tweaks, I've already been doing most of the new stuff people are only now able to do. All Apple did was implement jailbreak tweaks into their OS, for the most part.
Sorry for the long rant. I'm just excited about getting the new Prime Tegra 3 tablet. I just hope the apps/games that really take advantage of this power start rolling out fast. And I don't just mean TegraZone stuff..lol. Android developers are going to have to really step their game up once these new quad cores come out - really, even now with dual cores. I'm a fan of technology in general. Competition only makes things better. Android is starting to overtake Apple in sales and similar categories. The only thing is Android hasn't gotten on par with Apple-quality apps yet. The iPad's tablet-only apps, for example, are very numerous. Lots are b.s. but tons are very great also. I'm just hoping Android tablet-only apps will be at least the same quality or better. I'm not looking to get a new quad-core tablet to play Angry Birds or other kiddy-type games. I'm into productivity, media apps, and hardcore games, like Rage HD, NOVA 2, Modern Combat 3, Order & Chaos, Infinity Blade, ShadowGun, etc. All of which I have and more on my almost 2-year-old iPad 1.
Asus, being the first manufacturer to come out with a quad-core tablet and Super IPS+ display, might just provide the last push needed to get things really rolling for Android, as far as high-quality software and a tablet-optimized OS go. Can't wait to see how this plays out.
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware that's the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of optimization of the software. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put, probably way over-simplified, but it lets them do more with less.
Great point, just as I was saying basically in my long post..lol
nook-color said:
You can read the white papers on the Tegra 3 over on Nvidia's website. But the chip has a controller built in that activates either the 4 main cores or the single companion core based on the power demand of a given processing activity.
The quad cores and the companion core are made with different silicon processes, but the same design, in order to maximize energy efficiency across the performance curve. Each process is more efficient at a different point on the power curve, so the 5th core is very efficient at the low processing levels where it is actively used.
It's pretty cool stuff
That is correct. Actually, the "5th" core is licensed with the ARM A7 instruction set; the quads are A9.
RussianMenace said:
I would have to agree with you that Nvidia dropped the ball on their new GPU, at least on paper.
However, it's not as simple as having "omg wtf i > you" hardware that's the source of the performance. What Apple really has going for them is uniformity of hardware/software. Apple software is designed to work on a very specific and strictly controlled hardware setup, which allows for an incredible level of optimization of the software. This "closed loop" of software/hardware is what really drives the performance of the iProducts. Simply put, probably way over-simplified, but it lets them do more with less.
Again, I agree. It's just like asking why the Xbox 360 and PS3 consoles can still push high-quality graphics compared to a new high-end PC: uniformity of hardware plays a big role there.
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 Playstation 3 with performance and graphics very similar to my PC.
CyberPunk7t9 said:
I have a $4000 custom PC. Sometimes I see my brother play the same games on his $250 Playstation 3 with performance and graphics very similar to my PC.
That's because these days, most PC games are console ports.
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-yr) lifespan of the Teg3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support makes it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games--unless of course you're talking about emus.
e.mote said:
GPU specs don't matter. The iPad has more and better games than Android tabs, and that won't change for the (1-yr) lifespan of the Teg3. Not to be a downer, but it's just reality.
The Prime is better at certain things. HDMI-out and USB host (NTFS) support makes it a pretty good HTPC, for one. But I wouldn't get into a pissing contest over games--unless of course you're talking about emus.
Is that true? NTFS support? Are you sure? Can you link me to a spec for that? If so then I can transfer files from my SD card to an external NTFS drive without using Windows! That would be great for trips when I need to dump digital pics.

Running at full speed - an interesting observation

OK I've got mine in normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is right, those programs are still running in the background, right? Then it starts kicking in the other 4 cores and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we could get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?
markimar said:
OK I've got mine in normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is right, those programs are still running in the background, right? Then it starts kicking in the other 4 cores and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we could get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?
I'll check on this when I get home. This issue, I'm assuming, is with Honeycomb itself. We would assume that ICS would properly use those cores.
Sent from my Samsung Galaxy S II t989
I don't have it yet (mine gets delivered on Wed), but what you observed makes perfect sense. Can they change it to run at, say, a constant 800MHz, dropping "down" to 500MHz when doing the most simple tasks? Obviously I too do not believe that 500MHz will be sufficient at all times to do screen scrolling and such on its own.
I'm really hoping that the few performance issues people are seeing are resolved in firmware updates and a Tegra 3-optimized version of ICS. Maybe Asus/Nvidia need to do more tweaking to HC before the ICS build is pushed, if it will take a while for ICS to arrive on the Prime (past January).
The cores are optimized just fine. They kick in when rendering a web page or a game, but go idle and use the 5th core when done. Games always render.
ryan562 said:
I'll check on this when I get home. This issue, I'm assuming, is with Honeycomb itself. We would assume that ICS would properly use those cores.
Sent from my Samsung Galaxy S II t989
Nothing's changed over HC in the way ICS uses h/w acceleration. And I'd assume apps using h/w acceleration do so via calls to the OS, not to the chip directly. So it appears what you've got is what you're going to get.
markimar said:
OK I've got mine in normal mode, and this kind of confirms my original thought that the 500MHz 5th core is clocked too low. I find the pad actually speeds up when I have multiple items in my recently run tab! If my understanding of the way it works is right, those programs are still running in the background, right? Then it starts kicking in the other 4 cores and not just running on the 5th at 500MHz! I really think we'd see a speed boost if we could get that 5th core over 500. Yes, it's supposed to save battery life, but I really don't think 500 is fast enough to run on its own. Your thoughts and observations?
Do you have Pulse installed? A bunch of people using it were reporting stuttering where their lower-powered devices weren't. If you run it at full speed, does it stutter? One hypothesis is that it's the cores stepping up and down that's causing the stuttering.
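One way to actually test that hypothesis: log cpu0's frequency while you scroll and see whether the stutter lines up with the frequency transitions. A quick sketch of my own below (standard Linux cpufreq sysfs path, values in kHz); run it from a terminal emulator or adapt it into an app.
Code:
// Hypothetical check for the "cores stepping up and down" idea (my sketch, nothing
// official): sample cpu0's current frequency ten times a second and print every change,
// so visible stutter can be compared against frequency transitions.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class FreqWatch {
    private static final String CUR_FREQ =
            "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";

    public static void main(String[] args) throws IOException, InterruptedException {
        String last = "";
        for (int i = 0; i < 100; i++) {            // ~10 seconds at 100 ms per sample
            String now = readLine(CUR_FREQ);
            if (now != null && !now.equals(last)) {
                System.out.println(System.currentTimeMillis() + "  " + now + " kHz");
                last = now;
            }
            Thread.sleep(100);
        }
    }

    private static String readLine(String path) throws IOException {
        try (BufferedReader r = new BufferedReader(new FileReader(path))) {
            return r.readLine();
        }
    }
}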
BarryH_GEG said:
Nothing's changed over HC in the way ICS uses h/w acceleration. And I'd assume apps using h/w acceleration do so via calls to the OS, not to the chip directly. So it appears what you've got is what you're going to get.
Also, correct me if I'm wrong, but I don't think that the OS knows about the fifth core? I believe the chip's own scheduler manages the transition between the quad-core and the companion core, not the Android scheduler.
Mithent said:
Also, correct me if I'm wrong, but I don't think that the OS knows about the fifth core? I believe the chip's own scheduler manages the transition between the quad-core and the companion core, not the Android scheduler.
That's the way I'd guess it would work. I don't think Android addresses different chips differently. I'd assume it's up to the SoC to manage the incoming instructions and react accordingly. If Android was modified for dual-core, I don't think it differentiates between the different implementations of dual-core chips. Someone with more h/w experience correct me if I'm wrong. Also, does anyone know if the chip manufacturer can add additional APIs that developers can write to directly, either instead of or in parallel with the OS? I ask because how can a game be optimized for Tegra if to the OS all chips are treated the same?
I tried out the power savings mode for a while. It seemed to perform just fine. The immediate difference is that it lowers the contrast ratio on the display. This happens as soon as you press the power savings tab. The screen will look like the brightness dropped a bit, but if you look closely, you'll see it lowered the contrast ratio. The screen still looks good but not as sharp as in the other 2 modes. The UI still seems to perform just fine. Plus I think the modes don't affect gaming or video playback performance. I read that somewhere, either AnandTech or Engadget. When watching vids or playing games, it goes into normal mode. So those things won't be affected no matter what power mode you're in, I think..lol
I was thinking of starting a performance mode thread, to see different people's results and thoughts on the different power modes. I read some people post that they just use it in power/battery savings mode. Some keep it in normal all the time. Others in balanced mode. It would be good to see how these different modes perform in real-life usage, from a user perspective. I've noticed, so far, that in balanced mode the battery drains about 10% an hour. This is with nonstop use including gaming, watching vids, web surfing, etc. In battery savings mode, it drains even less per hour. I haven't run normal mode long enough to see how it drains compared to the others. One thing though, web surfing drains the battery just as fast as gaming.
BarryH_GEG said:
I ask because how can a game be optimized for Tegra if to the OS all chips are treated the same?
I hate quoting myself but I found the answer on Nvidia's website. Any optimizations are handled through OpenGL. So games written to handle the additional calls that Teg2 can support are making those calls through OpenGL, with the OS (I'm guessing) used as a pass-through. It would also explain why Tegra-optimized games fail on non-Teg devices, because they wouldn't be able to process the additional requests. So it would appear that Teg optimization isn't being done through the OS. Again, correct me if I'm wrong.
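For what it's worth, this is roughly how an app or game can sniff the GPU at runtime through OpenGL ES and decide whether to switch on the extra effects. It's only an illustration of the idea, not code from any Tegra title, and the GL_NV_ prefix check is my assumption about how Nvidia's vendor extensions are named; it has to run with a current GL context, e.g. inside a GLSurfaceView.Renderer.
Code:
// Illustrative only: query the GL vendor/renderer/extension strings at runtime, which is
// roughly how a "Tegra optimized" code path would get selected. The GL_NV_ prefix check
// is an assumption; a real game would look for the specific extensions it needs.
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.util.Log;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class GpuProbeRenderer implements GLSurfaceView.Renderer {
    @Override
    public void onSurfaceCreated(GL10 gl, EGLConfig config) {
        String vendor = GLES20.glGetString(GLES20.GL_VENDOR);     // e.g. "NVIDIA Corporation"
        String renderer = GLES20.glGetString(GLES20.GL_RENDERER);
        String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);

        Log.i("GpuProbe", "vendor=" + vendor + " renderer=" + renderer);
        boolean hasNvExtensions = extensions != null && extensions.contains("GL_NV_");
        Log.i("GpuProbe", "Nvidia vendor extensions present: " + hasNvExtensions);
        // A game could branch here and only enable the extra effects/texture formats
        // when the extensions it needs are actually reported by the driver.
    }

    @Override public void onSurfaceChanged(GL10 gl, int width, int height) { }
    @Override public void onDrawFrame(GL10 gl) { }
}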
BarryH_GEG said:
That's the way I'd guess it would work. I don't think Android addresses different chips differently. I'd assume it's up to the SoC to manage the incoming instructions and react accordingly. If Android was modified for dual-core, I don't think it differentiates between the different implementations of dual-core chips.
I did some research on it; here's what Nvidia say:
The Android 3.x (Honeycomb) operating system has built-in support for multi-processing and is capable of leveraging the performance of multiple CPU cores. However, the operating system assumes that all available CPU cores are of equal performance capability and schedules tasks to available cores based on this assumption. Therefore, in order to make the management of the Companion core and main cores totally transparent to the operating system, Kal-El implements both hardware-based and low level software-based management of the Companion core and the main quad CPU cores.
Patented hardware and software CPU management logic continuously monitors CPU workload to automatically and dynamically enable and disable the Companion core and the main CPU cores. The decision to turn on and off the Companion and main cores is purely based on current CPU workload levels and the resulting CPU operating frequency recommendations made by the CPU frequency control subsystem embedded in the operating system kernel. The technology does not require any application or OS modifications.
http://www.nvidia.com/content/PDF/t...e-for-Low-Power-and-High-Performance-v1.1.pdf
So it uses the existing architecture for CPU power states, but intercepts that at a low level and uses it to control the companion core/quad-core switch?
Edit: I wonder if that means that tinkering with the scheduler/frequency control would allow the point at which the companion core/quad-core switch happens to be altered? If the OP is correct, this might allow the companion core to be utilised less if an increase in "smoothness" was desired, at the cost of some battery life?
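For anyone wondering what "tinkering with the scheduler/frequency control" would actually touch, these are the stock per-CPU cpufreq knobs the kernel exposes. This is my own sketch using standard Linux sysfs paths (not every kernel exposes every file), and per the white paper the companion/main-core switch itself stays hidden behind whatever frequency decisions these knobs produce.
Code:
// Sketch of the standard cpufreq knobs (my own, standard Linux paths): governor and
// min/max frequency limits for cpu0. Changing them generally needs root; this only reads.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CpufreqKnobs {
    private static final String BASE = "/sys/devices/system/cpu/cpu0/cpufreq/";

    public static void main(String[] args) {
        print("scaling_governor");              // e.g. "interactive" or "ondemand"
        print("scaling_min_freq");              // lower frequency limit, in kHz
        print("scaling_max_freq");              // upper frequency limit, in kHz
        print("scaling_available_frequencies"); // the steps the governor can pick from
    }

    private static void print(String file) {
        try (BufferedReader r = new BufferedReader(new FileReader(BASE + file))) {
            System.out.println(file + ": " + r.readLine());
        } catch (IOException e) {
            System.out.println(file + ": (not readable on this kernel)");
        }
    }
}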
Mithent said:
I wonder if that means that tinkering with the scheduler/frequency control would allow the point at which the companion core/quad-core switch happens to be altered? If the OP is correct, this might allow the companion core to be utilised less if an increase in "smoothness" was desired, at the cost of some battery life?
So what we guessed was right. The OS treats all multi-cores the same and it's up to the chip maker to optimize requests and return them. To your point, what happens between the three processors (1+1x2+1x2) is a black box controlled by Nvidia. To any SetCPU-type program it's just going to show up as a single chip. People have tried in vain to figure out how to make the Qualcomm dual-cores act independently, so I'd guess Teg3 will end up the same way. And Nvidia won't even publish their drivers, so I highly doubt they'll provide any outside hooks to control something as sensitive as the performance of each individual core in what they're marketing as a single chip.
BarryH_GEG said:
Do you have Pulse installed? A bunch of people using it were reporting stuttering where their lower-powered devices weren't. If you run it at full speed, does it stutter? One hypothesis is that it's the cores stepping up and down that's causing the stuttering.
I have been running mine in balanced mode and have had Pulse installed since day one; no lag or stuttering in anything. Games and other apps work fine.
Well, my phone lags when clocked at 500, so I wouldn't be surprised.
Sent from my VS910 4G using xda premium

Can the Lenovo K900 emulate PS2 on Android?

I got around to looking at some pretty powerful mobile devices, and found that the Lenovo K900 is pretty darn excellent. Its Intel Atom processor can be overclocked to beyond 2.2 GHz, and it has dual cores too.
Here's a short overview of the device:
1. Intel Atom Z2580 dual core @ 2.0GHz
2. 2GB RAM (LPDDR2)
3. Android 4.2 (not really a big make-or-break for PS2 emulation, but whatever)
I know this is not powerful enough for ideal gaming, but isn't it possible to port PCSX2 and its supporting libraries to Android, and manage to get some sort of emulation going and some games running at low FPS?
I don't see why not ... I have heard of people using PCSX2 on Windows with single-core processors under 2.5 GHz, and it was arguably "playable" to some.
Assuming this is not the highest-end smartphone on the market but still has pretty good specs, wouldn't a device a bit more powerful than this one come close to taking the cake?
In short, I believe PS2 emulation could be done on some high-end smartphones (like this one) now, just not with "good FPS/emulation" yet.
Any rebuttal? The S5 from Samsung should be arriving soon, and those will be even more powerful. Somebody should help me port PCSX2, or at least create an open, community project to do so. As time goes on, updates can be made for the more powerful hardware.

Underclocking a PC to Reduce Heat

I am currently running Remix OS for PC Hacked Edition 2.0.205 on a Toshiba Satellite Radius 12 (P25W-2300C-4k) 2-in-1 notebook. When booted into Remix the fans are generally running all the time. I thought reducing the CPU clock would help with the heat and therefore reduce the fan use. I did something similar in Windows 10 and Ubuntu 16.04 and it worked perfectly. The notebook has a dual-core 6th-gen Core i7 CPU, so I should be able to reduce my clock by half without any problem. When I downloaded an overclocking app from the Play Store my only options were 400MHz and 3.0GHz; there were no speeds in between. I tried another app and found the same thing. Has anyone else tried to under- or overclock their CPU in Remix and found the same issue, only seeing two possible clock speeds?
Thanks
After testing out a bunch of the overclocking apps in the Play Store I found one that seems to work. For anyone interested, it is called 3C CPU Manager. I am only able to underclock my CPU because the kernel of Remix OS doesn't seem to support overclocking.
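In case anyone would rather script it than use an app: as far as I know, underclocking apps like this basically just write a lower cap into each core's scaling_max_freq, which needs root. Here's a hedged sketch of that idea; the 1GHz cap is only an example value, I haven't verified it against what 3C CPU Manager actually does, and I haven't tested it on Remix OS.
Code:
// Hedged sketch of what an underclocking app presumably does under the hood: cap
// scaling_max_freq for every core via a root shell. The governor still scales the clock,
// it just never exceeds the cap. Example value only; requires root and a cpufreq kernel.
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class UnderclockCap {
    public static void main(String[] args) throws IOException, InterruptedException {
        String capKhz = "1000000"; // example: limit every core to 1 GHz
        StringBuilder script = new StringBuilder();
        for (int cpu = 0; cpu < Runtime.getRuntime().availableProcessors(); cpu++) {
            script.append("echo ").append(capKhz)
                  .append(" > /sys/devices/system/cpu/cpu").append(cpu)
                  .append("/cpufreq/scaling_max_freq\n");
        }
        Process su = Runtime.getRuntime().exec("su");   // root shell
        try (OutputStream in = su.getOutputStream()) {
            in.write(script.toString().getBytes(StandardCharsets.UTF_8));
            in.write("exit\n".getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("su exited with code " + su.waitFor());
    }
}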
This is a fairly major issue with Remix OS and the myriad of hardware currently running it. With such a small team working on it, I imagine it's impossible to enumerate all the hardware configurations out there, especially legacy ones. Personally, I just fried another chip on Remix OS, an Nvidia Quadro FX 3600M out of a Dell Precision M6300 (rather upsetting), tinkering around and running the hacked edition as well. These cards were suspect due to solder issues anyway, but I believe it had a lot more life left in it. I never noticed until it was too late; in my case, the fans on my system weren't enumerated/activated whatsoever until a hard shutdown-reboot, and now it has the fabled blue lines of death. I'd be especially careful about watching your CPU/GPU temps, and doubly cautious about which programs/games you decide to run through Remix OS, as I have no doubt that with the plethora of apps available, there are many floating around that can run your card in ways it was never intended.
Thanks for the info, I am only interested in underclocking my notebook rather than overclocking it; it just happens that the apps used are typically called overclocking apps. With how hot Remix runs on my notebook I wouldn't even consider overclocking. Remix currently runs at 1GHz max (the CPU is capable of 3.1GHz) using the app I listed above, and my fans come on far less than they did before. I am fairly certain underclocking worked, for a few reasons: my computer doesn't run as hot, the fans are not running 24/7, and I validated the clock speed using ZCPU and cpuinfo in /proc. In addition to the extra power draw from the fans, I was also worried about the fans failing from running so much. Replacing a fan in a desktop is one thing; replacing a fan in a compact notebook like mine is, well, not so easy. Sorry to hear about your card, I would be quite upset if my notebook/graphics chip fried. Hope you don't mind, I'm going to use your info to update my article. Will give credit where due.
I saw my notebook CPU running at >80°C most of the time in Remix OS. That never happens in Windows or Linux. Isn't that related to the underlying structure of Android on x86?

Vorke Z3 RK3399 Powered 4K Smart TV Box with Type C SATA Interface

Rumours about the RK3399 chipset from Rockchip have swept over the internet, and we seem to be barely a soundbite away from it. Promising to shake up the way we think about the TV box, Vorke has included the RK3399-powered Z3 in its ambitious plans. Next February is expected to mark the launch of the gadget.
Adopting the Rockchip RK3399, the Vorke Z3 is set to outperform almost all of its competitors on the market. Unveiled at CES 2016, the RK3399 is a hexa-core processor based on a dual-core ARM Cortex-A72 MPCore plus quad-core ARM Cortex-A53 MPCore arrangement, paired with a Mali-T860MP4 quad-core GPU. Reportedly, it offers a significant performance boost over its predecessors, including the RK3288, outranking SoCs from Amlogic and Allwinner. It scored 72483 in AnTuTu - what madness!
Unlike other contenders, the Z3 offers 4GB of LPDDR3 RAM and 32GB of eMMC storage to back up your TV entertainment, so you can use it right out of the box. Alongside the storage, it's worth pointing out that Vorke brings together the compelling Kodi entertainment center and the RK3399's 4K image quality in one nifty box, with support for H.265 and VP9 4K video decoding.
One of the problems with any Android box is that it needs a good internet connection and enough speed to stream bandwidth-hogging video. With this in mind, the Z3 comes with 802.11ac WiFi, the newest WiFi protocol, offering data transfer speeds up to 1200Mbps under some circumstances. On top of that, there is also support for Gigabit Ethernet.
When it comes to expansion, there is no compromise. With 1x USB 2.0, 1x USB 3.0, 1x Type-C and 1x SATA placed at the sides, the Z3 lets you attach USB peripherals or even more storage. The Type-C port is another highlight for an Android 6.0 box, and when you factor in the sheer number of connections on the Z3, you begin to realize why it is a little bigger than the Z1. With support for SATA, USB 2.0, USB 3.0, Gigabit Ethernet, SPDIF and HDMI 2.0, there are few devices you won't be able to connect to it.
What to Expect?
Rockchip RK3399
Android 6.0
4GB + 32GB
AC WIFI + Gigabit Ethernet
4K VP9
USB3.0, Type C and SATA interface
Just got confirmation that my Z1 is supposed to arrive in 3-5 days... So any idea what the price will be on this puppy, the Z3???? When will it launch??? That AnTuTu score is badass.
How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
Z1 in some webshops is US$ 120 - I'd expect the Z3 to be about $160-180
It looks like a decent unit - but also look around here, Antutu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) as doing 71000 in AnTuTu; others say it should be about 50k.
Fluffbutt said:
How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
Z1 in some webshops is US$ 120 - I'd expect the Z3 to be about $160-180
It looks like a decent unit - but also look around here, Antutu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) as doing 71000 in AnTuTu; others say it should be about 50k.
Z1 is on sale at $74.99.
Z3 is expected to be twice the price of the Z1.
We will take on CPU-Z.
Some devices did some optimizations for the AnTuTu benchmark, so that they score pretty high.
linhuizhen said:
Z1 is on sale at $74.99.
Z3 is expected to be twice the price of the Z1.
We will take on CPU-Z.
Some devices did some optimizations for the AnTuTu benchmark, so that they score pretty high.
I had an email from Vorke, they say "under $200".. so I replied, "Hopefully under $150 as well" hahah! Hope it's not too dear!
What about that 3DMark bench, Firestorm (or is it Firestrike)... that seems to be a decent test, but it's mainly GPU, isn't it?
Fluffbutt said:
I had an email from Vorke, they say "under $200".. so I replied, "Hopefully under $150 as well" hahah! Hope it's not too dear!
What about that 3DMark bench, Firestorm (or is it Firestrike)... that seems to be a decent test, but it's mainly GPU, isn't it?
We will check :fingers-crossed:
As a slight sideways jump - I notice its competitor boxes both have heat sinks + fans listed in their specs - does anyone know if the Vorke is using active cooling?
I think I was right in a different forum's post - maybe the RK chip runs a little hotter than passive cooling can deal with?
So I got my Z1 here in No. VA on Mon. the 16th, ordered Jan. 5 at Geekbuying. After three days I'm pretty happy. Games I haven't been able to play, because of a bricked MINIX X8-H and not being able to root the MiBox so a PS3 SIXAXIS controller could work, run flawlessly on the Z1. Showbox runs smooth, as do YouTube and Kodi 16.1; AnTuTu 3D score with no tweaks: 41723. Jide has a crowd-funded Rockchip RK3399 TV box coming for March or May, USD $99-129.
I don't trust a company that needs Kickstarter/crowdfunding to develop a device - they smell like "fly-by-nighters" to me... produce the device, rake in some dosh, run like buggery when people start to complain or want tech support.
That Mio/MixIO whatever it's called... US$129 on Kickstarter... nope, zero trust...
Two things I like about Vorke - they exist as a company with self-funded development... and they responded to my silly queries; pre-sales support suggests good after-sales support.
Fluffbutt said:
I don't trust a company that needs Kickstarter/crowdfunding to develop a device - they smell like "fly-by-nighters" to me... produce the device, rake in some dosh, run like buggery when people start to complain or want tech support.
That Mio/MixIO whatever it's called... US$129 on Kickstarter... nope, zero trust...
Two things I like about Vorke - they exist as a company with self-funded development... and they responded to my silly queries; pre-sales support suggests good after-sales support.
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, early days, it might just be a place-holder price.
Edit - why did that double post? Weird.
Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half decent laptop and use that as a TV box, FFS!
I agree. What's more, I really don't know what all the fuss is about the RK3399, when for more than a year there has been the Amazon Fire TV available with the MediaTek MT8173 SoC (2x A72 @ 2GHz + 2x A53 @ 1.6GHz).
Maybe this is not such a good forum to discuss that because MediaTek isn't fond of open source, but people prefer a working solution to one which you must fiddle with all the time to make it work for a while.
Yes - but the AmFireTV is sort of locked, isn't it - own UI, low customisation, Amazon bias (understandable).
I've heard that the Vorke will be completely unlocked, rooted, open... maybe...
Anyway, the specs say different:
Qualcomm Snapdragon 8064 Quad Core 4x @ 1.7Ghz Qualcomm Adreno 320
MediaTek 8173C Quad Core 2x @ 2GHz & 2x @ 1.6Ghz GPU PowerVR Rogue GX6250
Neither of those will match the RK3399, and the Mali-T860MP4 is a very good SoC GPU. Not "superb" or "the best", but certainly good enough for nearly everything.
I do NOT like Antutu as a benchmark (it's heavily biased to certain chips) but the AFTV gets 53K while the RK3399 gets 73K
Fluffbutt said:
I do NOT like Antutu as a benchmark (it's heavily biased to certain chips) but the AFTV gets 53K while the RK3399 gets 73K
I would be more impressed if my RK3288 device couldn't already do 62k in AnTuTu. AFAIK AnTuTu is quite strongly GPU-biased, which might indicate that the RK3399 GPU is slower than the RK3288's or that there are problems with drivers.
Besides, I prefer "clear" CPU benchmarks which can give me an indication of internet browsing performance. When I did some research about the MT8173 more than a year ago I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914. (Note: Nexus 9 is the spoofed identity of my RK3288 device). Then I was quite pleased with that performance improvement, but devices with the MT8173 came only with 2GB RAM, which is too small an amount for me. Even so the RK3399 isn't really more impressive to me than the MT8173 (check: browser.geekbench.com/v4/cpu/compare/939975?baseline=993600) and we are more than a year on.
Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, early days, it might just be a place-holder price.
Edit - why did that double post? Weird.
Ugoos is a trusted brand.
Jagee said:
I would be more impressed if my RK3288 device couldn't already do a 70k AnTuTu score. AFAIK AnTuTu is quite strongly GPU-biased, which might indicate that the RK3399 GPU is slower than the RK3288's or that there are problems with drivers.
Besides, I prefer "clear" CPU benchmarks which can give me an indication of internet browsing performance. When I did some research about the MT8173 more than a year ago I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914. (Note: Nexus 9 is the spoofed identity of my RK3288 device). Then I was quite pleased with that performance improvement, but devices with the MT8173 came only with 2GB RAM, which is too small an amount for me. Even so the RK3399 isn't really more impressive to me than the MT8173 (check: browser.geekbench.com/v4/cpu/compare/939975?baseline=993600) and we are more than a year on.
We will try other benchmarks also. Recommend one please
linhuizhen said:
We will try other benchmarks also. Recommend one please
I overestimated the AnTuTu score for my RK3288 device a little; it should be 62k, not 70k. Nevertheless I would really like to see Vellamo and Octane benchmark scores for the RK3399.
My RK3288 device can do about 4800 points with Chrome in Vellamo.
The Geekbench v4 score for the RK3399 is rather well known. You can brag only if you beat a 1600 single- and 3000 multi-core score. browser.geekbench.com/v4/cpu/993600
linhuizhen said:
Ugoos is a trusted brand.
I'm not disputing that - they have a good track record... but that doesn't stop $349 being too high for this device. And I give them mucho-credit for NOT trying to effing kickstarter the device!
But the AMOLED screen is really a non-item - what's the point? A TV box is stuck under the TV table; I don't even look at mine, just use it...
And what can a little screen show anyway apart from the time or some form of channel display? Any more info would mean you'd have to get down on the floor, closer to it, to read it!
An AMOLED screen is perhaps $50 of that $150 overpricing ($200 is all I'd be paying for this spec of TV box) - for $350 you can get a TV box with a full i7 and Intel 520 GPU (400 GFlops)! The Mali GPU in the RK3399 is rated at about 90 GFlops.
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
So I'm less knocking Ugoos themselves and more knocking their "vision" of the yet-to-come TV Box.
*********************************************************************************
Geekbench 1520 and 2840 isn't too bad - just 160 lower than that magical 3000 isn't to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC). (Some reviews say 4300, 4800, 5300, rounded off.)
Fluffbutt said:
An AMOLED screen is perhaps $50 of that $150 overpricing
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
It might be a gimmick to make it stand out, but your estimation of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (5" 16:9 proportion).
I've seen OLED displays on MP3 player devices with prices lower than $30.
Fluffbutt said:
Geekbench 1520 and 2840 isn't too bad - just 160 lower than that magical 3000 isn't to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC). (Some reviews say 4300, 4800, 5300, rounded off.)
3000 is just a number for the multi-core Geekbench v4 score, which depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned in the new Samsung Galaxy Tab S2 9.7 can get over 5000 points in Geekbench v3 with 8 cores (4xA57 + 4xA53) working simultaneously (Source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA73.
Another problem when comparing benchmarks is that some devices contain different hardware than previous "batches", like the new (Qualcomm MSM8976) and "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462 where one test is for an older design and with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which might vary per device.
Jagee said:
It might be a gimmick to make it stand out, but your estimation of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (5" 16:9 proportion).
I've seen OLED displays on MP3 player devices with prices lower than $30.
3000 is just a number for the multi-core Geekbench v4 score, which depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned in the new Samsung Galaxy Tab S2 9.7 can get over 5000 points in Geekbench v3 with 8 cores (4xA57 + 4xA53) working simultaneously (Source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA73.
Another problem when comparing benchmarks is that some devices contain different hardware than previous "batches", like the new (Qualcomm MSM8976) and "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462 where one test is for an older design and with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which might vary per device.
Oh, I agree on the guesstimate - I have no real idea what an OLED screen costs - your $30 is probably more valid than my $50 - but that makes it worse, not better.
What else is there to make this over $150 more than the other RK3399 boxes? "Under $200" stated by Vorke, $115 pre-sale price for another, and so on.
It MIGHT be a "place holder" number - but it's working against them... I'm not going back to look and see what the release price is, the $349 has put me off... especially if it's a "state a high price so the release over-price will sound better" thing.
That's a marketing con, really - you state $349 before release then drop it to a still-too-high $299 or $259 after release and people will flock to it thinking they're getting a great deal. (Not saying they're doing it that way)
But the other RK3399 boxes? £115 pre-release, expected to be £140 after; Vorke has stated to me, in email, "definitely under $200".
And I FULLY agree with you about the benchies.
Perhaps the better benchies are the real-world ones... game FPS -- 5.2 for one SoC, 14.8 for another, 25.7 for a third, and so on.
****************
Still - I'm liking this 6-core box... it's better than almost everything I've looked at - allowing for the Tegra devices that are semi-locked (heavily controlled launchers and biased "features" - understandable, of course, like the Amazon devices).
