[Q] CPU vs GPU - Android Q&A, Help & Troubleshooting

I'm curious as to the importance of CPU vs GPU in our phones. The two big processors from MWC were Qualcomm's S4 and Nvidia's Tegra 3. From my research, it seems like the S4 has a better CPU but Tegra 3 wins in graphical processing.
Is the speed of android usually limited by CPU or GPU?
Which one would be more important in apps like web browser and games?
Is it worth it to wait for Cortex A15 processors this fall, or is that much speed just overkill?

I'd say the CPU is more important, since there are phones out there that don't have a GPU, and because pre-Honeycomb the GPU was mostly used for minor rendering and inside games.
Sent from my Ainol Novo7 Elf using xda premium

Please use the Q&A Forum for questions &
Read the Forum Rules regarding Posting
Moving to Q&A

The GPU is a graphics processing unit: it processes the graphics information displayed on the screen. Having one takes that responsibility away from the CPU, leaving it free to do the other things it needs to do.
I don't think a GPU on a phone is really that necessary yet, as we are not doing anything that intensive with our phones graphically. Who's doing CAD/CAM, 3D rendering, graphic animation or anything else like that on a phone? However, as time goes on and phones get used for more and more, a GPU will be a must...
Just look at the PC world, say 20 years ago: we were using Hercules two-tone graphics cards, then 4-colour CGA (wow), then 16-colour EGA, then (OMG) 256-colour VGA. Then started the 3D era, when a really fantastic GPU was a must, and in stepped Voodoo 3D graphics...
Anyway, I could go on for hours; I think you get the idea.
Sorry for long post
sent from my legend, currently using extream legend fuseā„¢

I'd say both: CPU and GPU are equally important, to allow for super smooth UI experience AND some decent gaming.
So now we have the Tegra 3 (Cortex-A9 cores + a nice GPU) and the Qualcomm S4 Krait (A15-class custom cores + an average GPU), both being great.
If you want to wait, there will be the Tegra 4 coming out next year, the S4 Pro (quad core + a more powerful GPU), and don't forget the TI OMAP5.
I usually buy whatever is best at the moment and enjoy the new hardware while waiting for the next thing to come out. The waiting for better stuff will never end, as new devices come out every few months, so it's up to you how often you want to upgrade.


RLY?! Xperia X10 gets ICS port but not Atrix?

X10 is garbage! this is outrageous!
Yes really, they got it working. You want it so bad? Try porting it yourself.
Sent from my MB860 using XDA App
Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
dLo GSR said:
Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
Oh snap. That was awesome.
Sent from my MB860 using XDA App
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
firefox3 said:
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
Good news man
Sent from my MB860 using XDA App
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, Desire, HD, X10, N1 have ports of a sort at the moment, in fact there shouldn't be too many problems getting them working aside from the graphics drivers but they're just for fun with the framebuffer driver given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say moreso than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind the fact that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower resolution screen, and can overclock past its stock speeds.
Javi97100 said:
Good news man
Sent from my MB860 using XDA App
It's turning out to be harder than I thought... I think no one will get it until official updates come out for other phones.
Azurael said:
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, Desire, HD, X10, N1 have ports of a sort at the moment, in fact there shouldn't be too many problems getting them working aside from the graphics drivers but they're just for fun with the framebuffer driver given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say moreso than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind the fact that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower resolution screen, and can overclock past its stock speeds.
So EGL = GPU driver? If that's the only setback, would it be possible to get an ICS ROM with software rendering as a proof of concept, or are there other pieces missing?
GB/CM7 is pretty good on the Atrix, so if we don't see ICS for a few months it doesn't hurt us in any way. I'd like to think most of us can be patient if we lack the skills to help.
I noticed the Captivate got a port of it too, since i9000 ROMs and Captivate ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
According to AnandTech, Tegra 2 support is essentially ready, so I think as long as Nvidia releases the source for ICS (libs?), someone will try to port it. Hell, I have a good 5 weeks during break; I might as well try then.
Sent from my MB860 using XDA App
Azurael said:
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, Desire, HD, X10, N1 have ports of a sort at the moment, in fact there shouldn't be too many problems getting them working aside from the graphics drivers but they're just for fun with the framebuffer driver given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say moreso than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind the fact that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower resolution screen, and can overclock past its stock speeds.
Actually, no, despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 due to its higher clock rate by 7% or 45% depending on the GLBenchmark being run. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
edgeicator said:
Actually, no, despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 due to its higher clock rate by 7% or 45% depending on the GLBenchmark being run. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Buddy, check out any of the kernels available in the dev thread and you'll see that the GPUs are overclocked.
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Captivate ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
I doubt the iPhone will see ICS; the newest model that can run Android, as far as I know, is the iPhone 3G, which was incredibly slow under Gingerbread.
mac208x said:
X10 is garbage! this is outrageous!
222 posts and zero thanks? Is this what you do: go around XDA and post useless threads, like the guy complaining about returning home early despite nobody asking him "to get MIUI ported on his grandma's phone"?
Are you guys related by any chance?
edgeicator said:
Actually, no, despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 due to its higher clock rate by 7% or 45% depending on the GLBenchmark being run. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape of the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased toward one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU performance bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
Azurael said:
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape of the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased toward one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU performance bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS at 300MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
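The GFLOPS figures quoted above follow from simple peak-throughput arithmetic. A quick sketch, under my own assumption that each shader lane retires one multiply-add (2 FLOPs) per clock; the lane count used for the 19.2 GFLOPS parts is back-derived for illustration, not a datasheet value:

```python
def peak_gflops(lanes, flops_per_lane_per_clock, clock_ghz):
    """Theoretical peak = lanes x FLOPs/lane/clock x clock (GHz)."""
    return lanes * flops_per_lane_per_clock * clock_ghz

# Tegra 3 figure quoted above: 12 SIMD lanes, one MADD (2 FLOPs) per clock, 300 MHz
tegra3 = peak_gflops(12, 2, 0.3)   # ~7.2 GFLOPS
# A 19.2 GFLOPS part at the same 300 MHz implies ~64 FLOPs retired per clock
rival = peak_gflops(32, 2, 0.3)    # ~19.2 GFLOPS
print(tegra3, rival)
```

The same formula makes the 8-to-12 lane bump obvious: 50% more lanes at the same clock is only 50% more peak FLOPS.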
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS at 300MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
Don't you get tired of writing those long rants? We understand you know something about CPU architecture, and that Tegra isn't the best one out there, but damn man, it's the same thing in every thread. Just chill out and try to stay on topic for once.
Sent from my MB860 using Tapatalk
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS at 300MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand new architecture as well.
I think you are not seeing the whole picture...
The Tegra 3 (et al.) is not just about its quad-core implementation; remember that the GPU offers 12 cores, which will translate into performance not yet seen on any other platform.
Benchmarks don't tell the whole story! Especially those benchmarking tools which are not yet optimized for the Tegra 3.
Cheers!
Sent from my Atrix using Tapatalk
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Captivate ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
LOL, I ran all the iDroid ports on my iPhone. Not one was even at the alpha stage; I would not even count iDroid as a port, since you can't use anything on it.

dual core vs quad core

So I've been lurking on the Prime's forums for a while now and noticed the debate over whether the new Qualcomm dual core will be better than the current Tegra 3 that the Prime has. Obviously if both were clocked the same then the Tegra 3 would be better, and I understand that the GPU of the Tegra 3 is better. However, for a normal user (surfing the web, playing movies, songs, etc.), isn't a dual core at 1.5GHz better, in that an average user will rarely use more than 2 cores? The way I understand it, each core is able to handle 1 task, so in order to activate the 3rd core you would have to have 3 things going on at the same time? Could someone please explain this to me?
First of all, the Tegra 3 can go up to 1.6GHz. Secondly, all 4 cores can be utilized by a multi-threaded app. Lastly, battery life is great on the Tegra 3 due to the companion core.
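To illustrate the multi-threading point: an app that splits its work into independent chunks can keep all four cores busy, while a single-threaded app stays on one core no matter how many exist. A minimal sketch in Python, using worker processes as a stand-in for the threads an Android app would use (the function names here are made up for illustration):

```python
import multiprocessing as mp

def burn(n):
    # A CPU-bound chunk of work: sum of squares up to n.
    return sum(i * i for i in range(n))

def parallel_sum(chunks):
    # One worker per available core; the OS scheduler spreads the
    # workers across cores, so a quad core runs four chunks at once.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        return sum(pool.map(burn, chunks))

if __name__ == "__main__":
    # Four independent chunks -> up to four cores working simultaneously.
    print(parallel_sum([100_000] * 4))
```

The third core only wakes up when there is a third runnable chunk of work, which is exactly the behaviour the question above describes.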
jdeoxys said:
First of all, the Tegra 3 can go up to 1.6GHz. Secondly, all 4 cores can be utilized by a multi-threaded app. Lastly, battery life is great on the Tegra 3 due to the companion core.
But the native clock for that Qualcomm would be 1.5GHz, meaning overclocking can take it higher. Also, doesn't being dual core compared to quad core give it an edge in battery? You do bring up a good point with multi-threaded apps. Also, to clarify, I am not standing up for the Qualcomm chip or putting down the Tegra 3, just trying to get things straight.
Thanks
Hey, I'm the... idiot aboard here... lol
But the Tegra 3 has a companion core, a fifth core, to take over when the tablet is not stressed, thus saving the battery.
I am just repeating what I have read; I have no knowledge of how it all works. I guess that is how we can get better battery life.
Just trying to help the OP; maybe someone way smarter can chime in. Shouldn't be hard... lol
Quad core is better by far. On low-level tasks, simple things, and screen-off/deep sleep, the companion core takes over, meaning the tablet is running on a single low-powered core. This companion core has a maximum speed of only 500MHz, so during deep sleep or low-level tasks the companion core alone runs everything at 102MHz-500MHz, most of the time at the lower end. Therefore the Tegra 3 has the better battery life, since all its low-power tasks are run by a single low-powered companion core. That's 1 low-powered core, compared to 2 high-powered cores trying to save battery. Quad core is better all around.
We haven't even begun real overclocking yet. The 1.6GHz speed was already in the kernel, so if you're rooted and using ViperControl, ATP Tweaks or the Virtuous ROM, you can access those speeds at any time. Once we really start overclocking higher than 1.6GHz we will have an even greater advantage. Anyone knows 4 strong men are stronger than 2... lol. Tegra 3 and Nvidia are the future; the Tegra 3 is just the chip that kicked down the door on an evolution of mobile SoCs.
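The hand-off described above can be pictured with a toy model. To be clear, this is not Nvidia's actual switching algorithm; the load threshold and the cores-per-load rule are my own illustrative guesses, and only the 500MHz companion cap and 1.6GHz main-core ceiling come from the post above:

```python
COMPANION_MAX_MHZ = 500   # companion core ceiling mentioned above
MAIN_MAX_MHZ = 1600       # kernel-exposed main-core ceiling mentioned above

def pick_cluster(load_pct, screen_on):
    """Toy vSMP-style switcher: which cores might be enabled for a given load."""
    if not screen_on or load_pct < 25:
        # Deep sleep / light load: single low-power companion core only.
        return {"cluster": "companion", "cores": 1, "max_mhz": COMPANION_MAX_MHZ}
    # Heavier load: main cluster, roughly one core per 25% of load.
    cores = min(4, max(1, (load_pct + 24) // 25))
    return {"cluster": "main", "cores": cores, "max_mhz": MAIN_MAX_MHZ}
```

So a screen-off sync job stays on the one slow core, while a game that loads all four main cores never touches the companion core at all.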
---------- Post added at 10:13 PM ---------- Previous post was at 10:06 PM ----------
If you really want to learn the ins and outs of the Tegra 3, all the details, and how it's better than any dual core, check out this thread I made. I have a whitepaper attachment in that thread that you can download and read. It's made by Nvidia themselves and goes into great detail on the Tegra 3, straight from the people who created it. Check it out.
http://forum.xda-developers.com/showthread.php?t=1512936
aamir123 said:
But the native clock for that Qualcomm would be 1.5GHz, meaning overclocking can take it higher. Also, doesn't being dual core compared to quad core give it an edge in battery? You do bring up a good point with multi-threaded apps. Also, to clarify, I am not standing up for the Qualcomm chip or putting down the Tegra 3, just trying to get things straight.
Thanks
The maximum clock speed isn't all that important, since during tasks like web browsing, watching videos & movies and listening to music you will never push the processor to its highest available clock speed anyway. All mobile devices underclock their processors so that you rarely have unused clock cycles eating up battery life. So, all things being relatively equal, performance would be about the same between both tablets during these types of lightweight tasks.
If you have a lot of background processes running, then the quad-core system might have an edge in performance, since theoretically different tasks can be pushed off to different processors. However this use case is rarely found in Android. You might have an app checking weather or syncing photos in the background, or you might have music playing while you web surf, but those are generally fairly lightweight tasks that usually won't test the processor performance of your device.
In tasks that will stress your processor, such as 3D gaming, quad cores have a very large advantage over dual-core systems, despite the slight difference in maximum clock speeds. In addition, the Tegra 3 has a more powerful GPU than the new Qualcomm chip, which will definitely make a noticeable difference in gaming performance.
Now when it comes to ultra-low-power tasks, or when the tablet is on standby, the Tegra 3 uses its "companion core", which has incredibly low power requirements, so it can continue to sync your email, twitter and weather updates for days (or weeks) while having very little impact on the Transformer Prime's battery.
So in short, the Tegra 3 is more likely to outperform the Qualcomm in situations where you actually need extra performance. In light tasks performance between the two should be about the same. Battery life is yet to be definitively determined; however, the Tegra 3's ultra-low-power companion core should give it an edge when only doing light tasks or on standby.
Keep in mind, the Tegra 3 in the TF Prime has a maximum clock speed of 1300MHz with all four cores active; a single core can reach 1400MHz. All things being equal, a difference of 100-200MHz on a 1GHz+ processor is practically unnoticeable in daily usage.
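The underclocking behaviour described above is what a cpufreq governor does: it steps the clock to follow load, so light tasks never see the top frequency. A toy "ondemand"-style sketch; the frequency table and thresholds are illustrative values I picked, not the Prime's actual ones:

```python
FREQS_MHZ = [102, 340, 760, 1000, 1300]  # illustrative frequency step table

def next_freq(current_mhz, load_pct, up=80, down=30):
    """Step the clock up under heavy load, down under light load."""
    i = FREQS_MHZ.index(current_mhz)
    if load_pct > up and i < len(FREQS_MHZ) - 1:
        return FREQS_MHZ[i + 1]   # busy: one notch up
    if load_pct < down and i > 0:
        return FREQS_MHZ[i - 1]   # mostly idle: one notch down
    return current_mhz            # moderate load: hold

# Music playback or idle reading hovers near the bottom of the table,
# so a 1.3 vs 1.5 GHz ceiling difference never comes into play there.
```

This is also why maximum clock speed only matters for sustained heavy loads: the governor spends light workloads at the low end of the table regardless of the ceiling.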
almightywhacko said:
The maximum clock speed isn't all that important, since during tasks like web browsing, watching videos & movies and listening to music you will never push the processor to its highest available clock speed anyway. All mobile devices underclock their processors so that you rarely have unused clock cycles eating up battery life. So, all things being relatively equal, performance would be about the same between both tablets during these types of lightweight tasks.
If you have a lot of background processes running, then the quad-core system might have an edge in performance, since theoretically different tasks can be pushed off to different processors. However this use case is rarely found in Android. You might have an app checking weather or syncing photos in the background, or you might have music playing while you web surf, but those are generally fairly lightweight tasks that usually won't test the processor performance of your device.
In tasks that will stress your processor, such as 3D gaming, quad cores have a very large advantage over dual-core systems, despite the slight difference in maximum clock speeds. In addition, the Tegra 3 has a more powerful GPU than the new Qualcomm chip, which will definitely make a noticeable difference in gaming performance.
Now when it comes to ultra-low-power tasks, or when the tablet is on standby, the Tegra 3 uses its "companion core", which has incredibly low power requirements, so it can continue to sync your email, twitter and weather updates for days (or weeks) while having very little impact on the Transformer Prime's battery.
So in short, the Tegra 3 is more likely to outperform the Qualcomm in situations where you actually need extra performance. In light tasks performance between the two should be about the same. Battery life is yet to be definitively determined; however, the Tegra 3's ultra-low-power companion core should give it an edge when only doing light tasks or on standby.
Keep in mind, the Tegra 3 in the TF Prime has a maximum clock speed of 1300MHz with all four cores active; a single core can reach 1400MHz. All things being equal, a difference of 100-200MHz on a 1GHz+ processor is practically unnoticeable in daily usage.
Wow! Thanks for taking the time to break it down for me like that! I understand exactly where you're coming from and now have to agree.
demandarin said:
Quad core is better by far.
At least that is what Nvidia would like you to think.
The Tegra 3 uses an older ARM core for its quad-core design, while Qualcomm uses their own ARM instruction set-compatible core for their Krait S4 design. In most current benchmarks the Qualcomm Krait S4 dual core seems to outpace the Tegra 3 by quite a large margin. And of course Krait will be expanded to quad core later this year.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3
Dave_S said:
At least that is what Nvidia would like you to think.
The Tegra 3 uses an older ARM core for its quad-core design, while Qualcomm uses their own ARM instruction set-compatible core for their Krait S4 design. In most current benchmarks the Qualcomm Krait S4 dual core seems to outpace the Tegra 3 by quite a large margin. And of course Krait will be expanded to quad core later this year.
http://www.anandtech.com/show/5563/qualcomms-snapdragon-s4-krait-vs-nvidias-tegra-3
There's already another thread on what you just mentioned, and the Krait claims were easily shot down. The Tegra 3 is still a better chip overall; plus, the Krait GPU was subpar compared to the Tegra 3's. We have more links and stuff in the other thread showing the Prime is still right up there.
demandarin said:
There's already another thread on what you just mentioned, and the Krait claims were easily shot down. The Tegra 3 is still a better chip overall; plus, the Krait GPU was subpar compared to the Tegra 3's. We have more links and stuff in the other thread showing the Prime is still right up there.
As unlikely as that seems considering the slower cores that Nvidia uses, links to real benchmarks ( not self serving white papers ) would be appreciated. I have glanced at your Tegra3 thread but have not read it all the way through after I saw that it seemed to depend a lot on a white paper and not real comparison tests. It is true that the current GPU the Krait uses is not as fast as the one in the Tegra 3, but graphics is only one element of overall performance. The only benchmarks that I have seen Tegra beat out Krait on were benchmarks that emphasized more than two threads, and then not by much.
Dave_S said:
As unlikely as that seems considering the slower cores that Nvidia uses, links to real benchmarks ( not self serving white papers ) would be appreciated. I have glanced at your Tegra3 thread but have not read it all the way through after I saw that it seemed to depend a lot on a white paper and not real comparison tests. It is true that the current GPU the Krait uses is not as fast as the one in the Tegra 3, but graphics is only one element of overall performance. The only benchmarks that I have seen Tegra beat out Krait on were benchmarks that emphasized more than two threads, and then not by much.
It's not my Tegra 3 thread I'm talking about; I think it's the Prime alternatives thread created by shinzz. We had a huge debate over it. More benchmarks and supporting arguments are in that thread. Check it out if you get the chance.
demandarin said:
Its not my tegra3 thread I'm talking about. I think its the Prime alternatives thread created by shinzz. We had a huge debate over it. More benchmarks n supporting arguments in that thread. Check it out if you get the chance.
Thanks, will do. Gotta run to a doctor's appointment right now though.
I frankly think the power savings with the fifth core are mostly hype. According to many battery tests I've read online and my own experience with my Prime, it doesn't get much different battery life from dual-core tablets.
Quad core is better for the future, though there may be backwards-compatibility problems... it's definitely good for a tablet.
jedi5diah said:
Quad core is better for the future, though there may be backwards-compatibility problems... it's definitely good for a tablet.
Here is another benchmark showing that there is at least one current dual core that can soundly beat the Nvidia quad core on benchmarks that are not heavily multithreaded.
http://www.extremetech.com/computin...ragon-s4-has-the-competition-on-the-defensive
Buddy Revell said:
I frankly think the power savings with the fifth core is mostly hype. According to many battery tests I've read online and my own experiences with my Prime, it doesn't get much different battery life from dual core tablets.
No dual-core Android tablet's battery lasts longer than an iPad 1's. My Prime easily outlasts my iPad in battery life. The battery hype is real. Tons of people here are seeing 9-11+ hours on a single charge with moderate to semi-heavy use on balanced mode, and even longer in power-saving mode.
demandarin said:
No dual-core Android tablet's battery lasts longer than an iPad 1's. My Prime easily outlasts my iPad in battery life. The battery hype is real. Tons of people here are seeing 9-11+ hours on a single charge with moderate to semi-heavy use on balanced mode, and even longer in power-saving mode.
Really? I get 9-12 hours constant use on balanced. Plus 6 more with the dock.
Sent from my PG8610000 using xda premium
I think if Krait were to come out with a quad core then it would beat the Tegra 3, otherwise no. Also, they are supposed to improve the chip with an updated GPU (the 3xx series) in future releases. Also, benchmarks have been proven wrong in the past, so who knows? It's not like benchmarks can determine real-life performance, nor does the average user need that much power.
Companion core really does work
jdeoxys said:
Really? I get 9-12 hours constant use on balanced. Plus 6 more with the dock.
Sent from my PG8610000 using xda premium
Strange; we just started uni here (Australia) and I've been using my Prime all day, showing it off to friends (to their absolute amazement!): showing off Glowball, camera effects with eyes, mouth etc., 2 hours of lecture typing, gaming on the train, watching a few videos and an episode of Community, playing music on the speaker for about 40 mins, web browsing, etc. I started using it lightly at 9 am (only properly at say 1:30 pm), it's 10:00 pm now, and GET THIS!!:
72% battery on the tablet and 41% on the dock. It's just crazy, man. No joke, it just keeps going; I can't help but admit the power saving must be real :/
Edit: Whoops, I quoted the wrong guy, but you get the idea.
That's what I'm saying. Battery life on the Prime is great. Add a dock and battery life is sick!
I do agree a quad-core variant of Krait/S4 will give Tegra 3 a really good battle. Regardless, I'm more than satisfied with the power of Tegra 3. I'm not the type to feel like mine is useless or outdated as soon as I see a newer or higher-spec tab. We have development going hard for this device now. Just wait till the 1.8-2GHz+ overclock ROMs and kernels drop; then we'd give even the new higher-clocked quad-core chips a good run.
Above all of that, Android needs more apps developed to take advantage of the more powerful chips like Tegra 3 and those upcoming. Software is still trying to catch up to hardware specs; not all Android apps have even been made to take advantage of Tegra 2's power yet, lol. With NVIDIA/Tegra 3 we have an advantage, because developers are encouraged to make apps and games that exploit Tegra 3's power.
Regardless, we're all Android. We need to focus on the bigger enemy: Apple and iOS.

[Q] Exynos 4412 Quad, NVIDIA Tegra 3, Snapdragon S4 dual - which is the best & why?

I need your suggestions. Can anyone please explain which is the best processor out of the Exynos 4412 Quad, NVIDIA Tegra 3, and Snapdragon S4 dual, and why?
Please tell me; that would be very helpful.
From benchmarks, the Exynos CPU was quite a bit better than the other two, and as far as I can remember the Mali GPU in the S3 also outperformed the others. Search for some benchmarks comparing them to find out for yourself.
It should go this way:
Processing power: Exynos 4412 Quad > Qualcomm S4 Krait > NVIDIA Tegra 3 Quad
GPU power: Mali-400 > Adreno 225 >= ULP GeForce
But I read somewhere that the S4 Krait CPU, which is in Cortex-A15 performance territory, could offer more power without consuming as much energy as the two quad-core beasts.
My first thought when I heard about NVIDIA's 4+1 CPU was: how can it decide when to switch from single to quad core? This sounds to me like a prototype for a constantly lagging device.
But I'm not deep enough into this matter to make a qualified statement.
It is just a feeling, since none of Intel, AMD, Qualcomm, or Samsung build their CPUs like this.
Sent from my GT-I9100 using XDA
Coming off a Tegra 2 device and patiently waiting for the Verizon version of this phone, all I can say is Tegra is terrible. At least on my phone it was, heating up on simple tasks like browsing the homescreen.
harise100 said:
My first thought when I heard about NVIDIA's 4+1 CPU was: how can it decide when to switch from single to quad core? This sounds to me like a prototype for a constantly lagging device.
But I'm not deep enough into this matter to make a qualified statement.
It is just a feeling, since none of Intel, AMD, Qualcomm, or Samsung build their CPUs like this.
Sent from my GT-I9100 using XDA
It actually works very well; the standby time on this phone is the best I've ever seen. It's needed, though, because this chip is thirsty. Whether that's down to poor drivers or the design I don't know; maybe a bit of both. Anyway, I like Tegra 3: it IS very fast, and you have those Tegra 3 games. Just look at Dark Meadow: the graphics are amazing and it runs smooth as hell.
Sent from my HTC One X using xda premium
I can't imagine how this will ever work without occasional lag.
How does the task scheduler on a Tegra 3 predict when to activate the four main cores?
Starting an app and waiting to see whether it needs more power will cause a lag once it maxes out the single core.
It's not 4+1 but rather 1+4.
Sent from my GT-I9100 using XDA
harise100 said:
I can't imagine how this will ever work without occasional lag.
How does the task scheduler on a Tegra 3 predict when to activate the four main cores?
Starting an app and waiting to see whether it needs more power will cause a lag once it maxes out the single core.
It's not 4+1 but rather 1+4.
Sent from my GT-I9100 using XDA
I have no idea how it works, as the fifth core is handled directly by the SoC and not by the system. Maybe someone more knowledgeable than me can shed some light on this. I haven't encountered any noticeable lag compared to my SII, though.
Sent from my HTC One X using xda premium
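For anyone still wondering how the switch could happen without the OS scheduler predicting anything: one plausible mechanism is a load watcher with hysteresis. This is only an illustrative sketch, not NVIDIA's actual vSMP logic; the thresholds, cluster names, and function are all made up for the example.

```python
# Hypothetical sketch of "4-PLUS-1"-style cluster switching: the chip
# watches recent load and migrates between a low-power companion core
# and the main quad cluster. All numbers here are invented.

LOW_POWER, PERFORMANCE = "companion", "quad"

def next_cluster(current, load, up=0.70, down=0.30):
    """Threshold-with-hysteresis decision: switch up only when load is
    clearly high, switch down only when it is clearly low, so brief
    spikes don't bounce execution back and forth."""
    if current == LOW_POWER and load > up:
        return PERFORMANCE
    if current == PERFORMANCE and load < down:
        return LOW_POWER
    return current

cluster = LOW_POWER
for load in [0.05, 0.10, 0.85, 0.90, 0.50, 0.20, 0.10]:
    cluster = next_cluster(cluster, load)
    print(f"load={load:.2f} -> {cluster}")
```

Because the up and down bounds differ, a load hovering around 50% never triggers a switch, which is exactly the property that would avoid the constant migration lag worried about above.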
Finally the search tool works. Anyway, thanks for clearing up my doubts about the differences between the two quad cores.

Whats next after quad-core?

So in 2011 we had Tegra 2 and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octo-core, or improved versions of quad-core CPUs?
Fasty12 said:
So in 2011 we had Tegra 2 and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octo-core, or improved versions of quad-core CPUs?
Well, as octo-core desktop CPUs haven't really caught on yet, I would guess just better quad cores, likely with more powerful GPUs.
Tegra 3 is already very powerful; presumably they will increase RAM, make the chips more battery-efficient, or push clock speeds higher. The 12-core Tegra GPU is pretty amazing already, and anything better must be godly.
Sent from my HTC Desire using xda app-developers app
If you mean for the mobile platform: will we really need to go beyond quad core? Seeing how smoothly the SGS III runs with it, what more perfection and speed would you need to do your work (though more can always be expected)? Since Android uses the other cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think curiosity, and wanting the most advanced/latest hardware, are the only reasons to have such a high-end CPU in your mobile.
What I'd like to see is more RAM installed and lower RAM usage by the system...
Sounds like Octomom... the debate lives on: battery vs performance. But to answer your question, I think it would be hexa-core, which is six. Let's wait and see what is to come...
Sent from my SGH-T989 using Tapatalk 2
s-X-s said:
If you mean for the mobile platform: will we really need to go beyond quad core? Seeing how smoothly the SGS III runs with it, what more perfection and speed would you need to do your work (though more can always be expected)? Since Android uses the other cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think curiosity, and wanting the most advanced/latest hardware, are the only reasons to have such a high-end CPU in your mobile.
What I'd like to see is more RAM installed and lower RAM usage by the system...
I agree. Cores are at their peak right now. The amount of CPU power we have, especially in the higher-end phones, is enough to accomplish many, many things. RAM is somewhat of an issue, especially since multitasking is a huge part of Android. I really think 2GB of RAM should become standard soon. Also, better GPUs won't hurt.
Sent from my HTC T328w using Tapatalk 2
If they decide to keep going on the core upgrade in the next two or so years, I see one of two possibilities happening:
1) Dual Processor phones utilizing either dual or quad cores.
or
2) Hexa-core chips, since the desktop market already has a few 6-core chips (though whether they would actually be practical in a phone's architecture, no clue).
Generally speaking, whatever they come out with next will need either better battery materials or lower-power processors.
I mean, I'm pretty amazed by what my brother's HTC One X is capable of with the quad core, and here I am still sporting a single-core G2. But yes, I would like to see more advancement in RAM usage; we've got a nice bit of power, but how about a standard 2GB of RAM for better multitasking?
I believe 2013 will be all about more efficient quad-cores.
May I ask what going from 1GB to 2GB will improve? Loading times?
Hello everyone, could you tell me what quad core is?
Quad core means that a processor has four processing units (cores).
Because there are more cores, up to four threads can run in parallel, so a well-threaded workload can in theory finish up to 4 times faster.
Read more about it: http://simple.wikipedia.org/wiki/Multi-core_processor
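To make the "in theory up to 4 times faster" caveat concrete, here is a quick Amdahl's-law sketch. The fractions used are illustrative only: the speedup from extra cores is capped by whatever share of the work cannot be parallelized.

```python
def speedup(cores, parallel_fraction):
    """Amdahl's law: overall speedup when only part of the work
    can be spread across multiple cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A fully parallel task could approach 4x on a quad core...
print(round(speedup(4, 1.0), 2))   # 4.0
# ...but with 30% of the work stuck serial, the ceiling drops sharply.
print(round(speedup(4, 0.7), 2))   # 2.11
```

This is also why the thread keeps circling back to software: unless apps are written to use the extra cores, the extra silicon mostly sits idle.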
Maybe an i7 in mobile devices?
I'm sure it will stay at quad-core CPUs; anything more is just overkill. They may introduce hyperthreading. It's going to boil down to efficiency.
Sent from my SPH-D700 using xda premium
I'd say the future lies in more efficient use of processors. Right now, Android is still far from optimized on multi-core processor-equipped devices. Project Butter is the start of a great movement by Google to optimize the operating system. Hopefully it spreads out to other OEMs and becomes the main focus for Android development.
Improving and optimizing current processors is the way hardware companies should go.
In my opinion, processor development is outrunning battery development. Optimized processors could reduce power consumption while preserving excellent speed and usability.
Sent from my Transformer TF101 using Tapatalk 2
Building processors on more efficient ARM architectures is going to be the way to go from what I see. Throwing four less-efficient cores at a problem is the caveman method of dealing with it... looking at you, Samsung Exynos Quad, based on tweaked A9 cores.
The A15-class Qualcomm S4 Krait is more efficient on a clock-for-clock, core-for-core basis, and once the software catches up and starts using the hardware to its full capacity, fewer, more efficient cores will be preferred.
I don't see anything beyond quads, simply because developers haven't even scratched the surface of what can be done with a modern dual-core processor yet. Throwing more cores at it only makes excuses for poor code. I can shoot **** faster than water with a big enough pump, but that doesn't mean it's the better solution.
We don't need more cores! Having more than two cores will not make a difference, so quad cores are a waste of space on the CPU die.
Hyperthreading, duh.
More RAM. You've got to have the hardware before the software can be made to use it.
With the convergence of x86 into the Android core and the streamlining of low-power Atom CPUs, the logical step would be to first optimize the current software base for multi-core processors before marketing takes over with its silly x2 multiplier game...
Not long ago, a senior Intel exec went on record saying that today a single-core Android smartphone perhaps performs better overall (battery life, user experience, etc.) than any dual/quad-core one. Mind you, these guys seldom if ever stick their necks out with such bold statements, especially ones not pleasing to the ear...
For those interested, you can follow this one (of many) articles on the subject: http://www.zdnet.com/blog/hardware/intel-android-not-ready-for-multi-core-cpus/20746
Android needs to mature, and I think it actually is maturing. With 4.1 the focus has drastically shifted to optimization, UX, and performance with *existing/limited* resources. This will translate to devices beating all else in battery life, performance, and graphics, but since this was neglected in the first several iterations, we will likely see 4.0 followed by 4.1 and maybe 4.2 before we see a 5.0 that showcases the maturity and evolution of the experience.
Just my 2c. :fingers-crossed:

How does the S4 Pro compare to the Exynos 5?

Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z at the moment.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs the 2560x1600 Nexus 10 with a 1.7GHz Exynos 5 and a Mali-T604 GPU.
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z at the moment.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs the 2560x1600 Nexus 10 with a 1.7GHz Exynos 5 and a Mali-T604 GPU.
The S4 is halfway between the Cortex A9 cores and the new Cortex A15 cores that we have, so it is a decent enough CPU. I am not sure how good that GPU is; none of my devices over the past couple of years have had Adreno GPUs. At least it won't have to work as hard at the lower resolution.
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z at the moment.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs the 2560x1600 Nexus 10 with a 1.7GHz Exynos 5 and a Mali-T604 GPU.
What stuttering are you talking about?
Draw your own conclusions.
S4 Pro - http://www.anandtech.com/show/6112/...agon-s4-apq8064adreno-320-performance-preview
Exynos 5 - http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual
From everything I've seen and experienced, the Exynos 5 is the better of the two. The A15 is a more powerful core than the Krait; that, together with the higher clock speed and the better GPU, makes for a better chip. Personally, I have never had my N10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
enik_fox said:
From everything I've seen and experienced, the Exynos 5 is the better of the two. The A15 is a more powerful core than the Krait; that, together with the higher clock speed and the better GPU, makes for a better chip. Personally, I have never had my N10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm went with its own core design was power consumption: a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z at the moment.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs the 2560x1600 Nexus 10 with a 1.7GHz Exynos 5 and a Mali-T604 GPU.
Every now and then I read people complaining about lags and stutters... I have not experienced one since I've had the device. Can you please explain what you are doing when this happens?
avdaga said:
Every now and then I read people complaining about lags and stutters... I have not experienced one since I've had the device. Can you please explain what you are doing when this happens?
Try opening and closing Google Maps after the map has loaded; there is a NOTICEABLE frame-rate drop compared to other apps.
kaspar737 said:
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm went with its own core design was power consumption: a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
The Snapdragon S4 does not use an A15 core or any derivative of one. Qualcomm has ALWAYS designed its cores fully custom, with almost nothing taken from ARM's current reference design. Additionally, the S4 was designed and released before the A15 MPCore had even finished its design phase.
The Krait core uses a similar (but not identical) triple-wide decode stage to the A15, but a different 11-stage execution pipeline compared to the A15's 15-stage pipeline. The larger number of stages lets the A15 break work into smaller steps and reach higher frequencies, but when something like a mispredicted branch forces a restart, the A15 has to refill a longer pipeline, whereas the Krait core doesn't wait as long (though it isn't as efficient in "normal" circumstances). Honestly, the integer performance of the two cores is pretty close, but I think I remember seeing that the A15 has much stronger floating-point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player that does not base its processors on the ARM design.
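A rough way to see the pipeline-depth trade-off described above is to model the average cost of a branch-mispredict flush as roughly proportional to pipeline depth. The branch statistics below are purely illustrative assumptions, not measured figures for either chip.

```python
def effective_cpi(base_cpi, branch_freq, mispredict_rate, flush_penalty):
    """Average cycles per instruction once branch-mispredict pipeline
    flushes are folded in; a deeper pipeline means a costlier flush."""
    return base_cpi + branch_freq * mispredict_rate * flush_penalty

# Illustrative inputs: ~20% of instructions are branches, 5% of those
# are mispredicted, and the flush cost equals the pipeline depth.
krait = effective_cpi(1.0, 0.20, 0.05, flush_penalty=11)
a15   = effective_cpi(1.0, 0.20, 0.05, flush_penalty=15)
print(round(krait, 2), round(a15, 2))  # 1.11 1.15
```

The deeper pipeline pays a slightly higher per-instruction penalty under this model, which the A15 offsets by clocking higher; that's the trade-off in one line of arithmetic.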
Fasty12 said:
Try opening and closing Google Maps after the map has loaded; there is a NOTICEABLE frame-rate drop compared to other apps.
Do you mean a drop in framerate during the animation when closing Maps? I notice a minor framerate drop that lasts as long as the animation does, but if that's it, I'm kind of wondering why you bought an Android device in the first place... I had not noticed this before, and I can't imagine anyone would while using the device for its intended purposes. Take any Android device and you will find an fps drop at some point... Maybe return it and get an iPad? iPads don't have this issue; on the other hand, there's a lot that iPads don't have ^^
kaspar737 said:
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm went with its own core design was power consumption: a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
The Exynos has higher memory bandwidth, so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
EniGmA1987 said:
The Snapdragon S4 does not use an A15 core or any derivative of one. Qualcomm has ALWAYS designed its cores fully custom, with almost nothing taken from ARM's current reference design. Additionally, the S4 was designed and released before the A15 MPCore had even finished its design phase.
The Krait core uses a similar (but not identical) triple-wide decode stage to the A15, but a different 11-stage execution pipeline compared to the A15's 15-stage pipeline. The larger number of stages lets the A15 break work into smaller steps and reach higher frequencies, but when something like a mispredicted branch forces a restart, the A15 has to refill a longer pipeline, whereas the Krait core doesn't wait as long (though it isn't as efficient in "normal" circumstances). Honestly, the integer performance of the two cores is pretty close, but I think I remember seeing that the A15 has much stronger floating-point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player that does not base its processors on the ARM design.
THANK YOU!! My god, I've had to explain this so many times! Qualcomm licenses ONLY the ARMv7 instruction set, not ARM's designs. They design their own chips and GPU from the ground up, so please, people, stop saying a Qualcomm chip is a Cortex-series processor, because it isn't. Samsung and the rest license ARM's design and modify it; in Samsung's case they tend to increase the IPC slightly and give it more memory bandwidth.
Also, to answer the question: the Exynos 5 will do better at higher resolutions, and the two will be very close at lower resolutions. The S4 will more than likely be better in multi-threaded workloads, and the Exynos will have better float performance. The Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
The Exynos has higher memory bandwidth, so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
THANK YOU!! My god, I've had to explain this so many times! Qualcomm licenses ONLY the ARMv7 instruction set, not ARM's designs. They design their own chips and GPU from the ground up, so please, people, stop saying a Qualcomm chip is a Cortex-series processor, because it isn't. Samsung and the rest license ARM's design and modify it; in Samsung's case they tend to increase the IPC slightly and give it more memory bandwidth.
Also, to answer the question: the Exynos 5 will do better at higher resolutions, and the two will be very close at lower resolutions. The S4 will more than likely be better in multi-threaded workloads, and the Exynos will have better float performance. The Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
The 50 percent extra memory bandwidth doesn't matter so much considering that the Exynos has to move almost twice as many pixels.
Sent from my LG-P990 using xda app-developers app
kaspar737 said:
The 50 percent extra memory bandwidth doesn't matter so much considering that the Exynos has to move almost twice as many pixels.
Sent from my LG-P990 using xda app-developers app
Memory bandwidth makes a pretty big difference when it comes to resolution, e.g. the 8600GT with DDR2 vs DDR3. A wider memory bus and faster memory make a big difference in the high-resolution performance of any GPU.
It will also help GPU compute performance for future apps utilizing the Mali-T604's compute abilities.
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
Memory bandwidth makes a pretty big difference when it comes to resolution, e.g. the 8600GT with DDR2 vs DDR3. A wider memory bus and faster memory make a big difference in the high-resolution performance of any GPU.
It will also help GPU compute performance for future apps utilizing the Mali-T604's compute abilities.
Sent from my Galaxy Nexus using Tapatalk 2
But let's say the Exynos uses the whole 12.8 GB/s of bandwidth. To move half as many pixels you would need only 6.4 GB/s, so memory bandwidth isn't the issue.
Sent from my LG-P990 using xda app-developers app
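For anyone wanting to check the arithmetic behind this bandwidth argument, here's a back-of-the-envelope sketch. The "surfaces" factor is a made-up approximation of how many times each frame crosses the memory bus (render, compose, scan-out); real traffic depends on the compositor and workload.

```python
def fb_bandwidth_gbs(width, height, bytes_per_pixel=4, fps=60, surfaces=3):
    """Rough GB/s just to keep the screen fed: 32-bit pixels at the
    panel resolution, 60 frames a second, with each frame assumed to
    cross the memory bus a few times ('surfaces')."""
    return width * height * bytes_per_pixel * fps * surfaces / 1e9

n10 = fb_bandwidth_gbs(2560, 1600)   # Nexus 10 panel
fhd = fb_bandwidth_gbs(1920, 1080)   # 1080p panel
print(round(n10, 1), round(fhd, 1), round(n10 / fhd, 2))  # 2.9 1.5 1.98
```

So even if the Exynos' 12.8 GB/s is ~50% higher in total, roughly twice as much of it is eaten by the panel, which is the trade-off being argued over here.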
kaspar737 said:
But let's say the Exynos uses the whole 12.8 GB/s of bandwidth. To move half as many pixels you would need only 6.4 GB/s, so memory bandwidth isn't the issue.
Sent from my LG-P990 using xda app-developers app
But that bandwidth is shared, unlike on a dedicated GPU. The total system bandwidth (not counting buses for the modem or whatever else is there) being higher on the Exynos chip is going to give it the edge in any situation, considering how close the two are in performance. It also can't be denied that the Mali-T604 has an edge in horsepower over the Adreno 320, because even at the N10's resolution it comes within a couple of fps of the Adreno at 1080p. Not saying it's a big difference, but the Exynos is the more powerful all-around chip, and that's just in its dual-core form.
Edit: It's also well known that Adreno has poor fill rate compared to Mali or PowerVR; Adreno's strength is geometry performance, so it takes more of a hit at higher resolutions than either the Mali-T604 or the SGX 554MP4, which both have higher fill rates, and the SoCs we're comparing them in have higher bandwidth to feed that, so we don't get bottlenecked.
Sent from my Galaxy Nexus using Tapatalk 2
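A quick back-of-the-envelope for the fill-rate point above: the shading work scales with resolution, frame rate, and overdraw. The overdraw factor below is an assumption for illustration; real games vary a lot.

```python
def fillrate_needed_mpix(width, height, fps=30, overdraw=2.5):
    """Megapixels per second the GPU must shade: pixels per frame,
    times frames per second, times an overdraw factor (each screen
    pixel is typically touched more than once per frame)."""
    return width * height * fps * overdraw / 1e6

hd  = fillrate_needed_mpix(1280, 720)    # S3 / Note 2-class panel
fhd = fillrate_needed_mpix(1920, 1080)   # 1080p-class panel
print(round(hd), round(fhd))  # 69 156
```

Going from a 720p-class panel to 1080p more than doubles the required fill rate at the same frame rate, which is why a fill-rate-limited GPU suffers disproportionately at higher resolutions.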
Finally, a lot of experts here about GPUs.
I know it is not related to the topic, but my girlfriend and I have a Galaxy Note 2 and an S3. As you all know, they are the last devices to use the older Mali-400 GPU. I love playing games and I am getting my girlfriend into them too. So I was wondering: how is our Mali-400 holding up against the new 1080p Adreno 320 devices? It is clear the future is 1080p. I am planning to replace our devices with either a couple of Nexus 4s or Xperia Zs, because I fear they are about to be outdated, with the next big game right around the corner. So far they are doing just fine with Modern Combat 4 and all the other graphics-intensive games, running at over 28-30 FPS. But according to GLBenchmark 2.5 Egypt, they are useless against the new Adreno 320. However, I have read that most games were designed around high fill rates, where the Mali 400 can beat the Adreno 320; it's on the triangle tests that it bottlenecks.
So what is your opinion? Will our devices manage another year and a half of new games? Or should I make the trade? Or should I just buy a Nexus 10 with two user accounts and continue gaming on it?
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
_delice_doluca_ said:
Finally, a lot of experts here about GPUs.
I know it is not related to the topic, but my girlfriend and I have a Galaxy Note 2 and an S3. As you all know, they are the last devices to use the older Mali-400 GPU. I love playing games and I am getting my girlfriend into them too. So I was wondering: how is our Mali-400 holding up against the new 1080p Adreno 320 devices? It is clear the future is 1080p. I am planning to replace our devices with either a couple of Nexus 4s or Xperia Zs, because I fear they are about to be outdated, with the next big game right around the corner. So far they are doing just fine with Modern Combat 4 and all the other graphics-intensive games, running at over 28-30 FPS. But according to GLBenchmark 2.5 Egypt, they are useless against the new Adreno 320. However, I have read that most games were designed around high fill rates, where the Mali 400 can beat the Adreno 320; it's on the triangle tests that it bottlenecks.
So what is your opinion? Will our devices manage another year and a half of new games? Or should I make the trade? Or should I just buy a Nexus 10 with two user accounts and continue gaming on it?
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
They will hold up; my SGS2 runs all of the current games at the highest settings without any issues (I haven't tried GTA). The Adreno 320 is far better than the Mali-400 MP4, though.
Yeah, I'm pretty sure they will still play games a year from now. Until the market is completely saturated with devices as powerful as the Nexus 10, we won't really see large jumps in system requirements. That will probably only happen a year or two from now, once all the new phones and tablets are made with A15 processors (or the Qualcomm equivalent) and beefy GPUs.
Fidelator said:
They will hold up; my SGS2 runs all of the current games at the highest settings without any issues (I haven't tried GTA). The Adreno 320 is far better than the Mali-400 MP4, though.
The S2 (Mali-400) plays GTA 3 without a hiccup.
The Exynos dual is very power hungry compared to the S4 Pro, but it is also the most powerful ARM processor out today; nothing else yet released (I said RELEASED) is as powerful or can match its bandwidth. Having said that, I'm sure a normal-resolution 1080p screen in this form factor with the S4 Pro would make a nice, fast tablet. Right now the Exynos dual is pretty much the only thing outside Apple that can push the resolution the N10 has. I think if they had put another gig of DDR3 in this thing there wouldn't be so much stuttering in certain instances. Besides the thermal cutoff, the N10 is starved for memory, as it has to share its RAM between normal duties and the graphical load of pushing all the pixels of this monster resolution. You are lucky to have 300MB of RAM available at idle on the N10, versus over a gig available with the S4 Pro on the 720p screen of the Nexus 4.
Sent from my often RMA'd Nexus 4, So that I can use the one I'm using now when I get the 6th and hopefully final one.
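The shared-RAM point is easy to sanity-check: just the display pipeline's buffers at the N10's resolution pin a meaningful slice of memory. This ignores textures and all other GPU allocations, and the buffer count is an assumption, so treat it as a lower bound.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate RAM held by the display pipeline: triple-buffered
    32-bit surfaces at the panel's native resolution. Textures and
    other GPU allocations come on top of this."""
    return width * height * bytes_per_pixel * buffers / (1024 * 1024)

print(round(framebuffer_mb(2560, 1600)))  # Nexus 10 panel: ~47 MB
print(round(framebuffer_mb(1280, 768)))   # Nexus 4 panel:  ~11 MB
```

Every fullscreen app and every compositor layer multiplies numbers of this size, which is why the higher-resolution device feels the RAM squeeze first.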
