About processors and their capabilities - Android Q&A, Help & Troubleshooting

Hi, I'm fairly new to this, but it has got me interested. I've been reading about some "cheap high-performance" processors, such as the Renesas EV2 and the RK2918. With the EV2, some manufacturers say it is possible to run Android Honeycomb; the EV2 is a decent dual-core processor for the price, with a GPU that performs really close to the one in the first iPad (about 15M p/s), and the CPU is an ARM9. The Rockchip excels in 3D performance: its GPU is said to reach 60M p/s, and the processor itself is not bad, an A8 clocked at 1.2 GHz, single core. With that said, I'm wondering whether RK2918 devices will be able to run the soon-to-be-released ICS smoothly, or at least Honeycomb (its source has not been released yet), or whether the EV2, even though it has a worse GPU, is still a better pick. Thanks in advance for your comments (except if they are insults).

Still wondering if processors such as the RK2918 or the EV2 will be good enough to run a newer release of Android than Gingerbread. Regards.

Related

Evo 3D GPU

So is it true that the GPU on the EVO 3D sucks, or that it's outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there is no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
I guess I should rephrase one of my questions. I'm asking how it will run the emulators because I saw someone using an SG to play N64oid and it seemed pretty laggy, and if I'm not mistaken that has the same or a similar GPU to the NS4G?
tannerw_2010 said:
So is it true that the GPU on the EVO 3D sucks, or that it's outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there is no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
The emulators use the CPU, so the Evo 3D will be fine; the PSX emulator runs fine on my 18-month-old Desire.
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
[email protected] said:
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
Yeah, I've heard that too, so it makes me wonder what's really true. It might tell you something that I heard the GPU isn't very good from the NS boards... but I think I've heard it on these boards too, just not nearly as much.
Look up YouTube videos of the GPU in action. 'Nuff said.
Sent from my Nexus S 4G using XDA Premium App
Maybe this will calm your fears
http://www.youtube.com/watch?v=DhBuMW2f_NM
Here's a better one
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=youtube_gdata_player
Sent from my Nexus S 4G using XDA Premium App
The GPU in the Evo3D should be the best out right now. Supposed to be up to twice as fast/powerful as Tegra2. It does appear that some optimizations need to be done to take advantage of this GPU though, hence some of the early, low benchmarks.
The GPU is the fastest right now. No need to speculate; it will be until Tegra 3 comes out, but I think it will still match Tegra 3 in most benchmarks. The SGX540 is good, but the Adreno 220 is faster.
What about the CPU? Is it worse than the Galaxy S CPU or better?
jamhawk said:
What about the CPU? Is it worse than the Galaxy S CPU or better?
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
a5ehren said:
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
donatom3 said:
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
Now that's gonna make all the difference.
Sent from my PC36100 using XDA Premium App
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt it would be unable to do anything the Samsung processor can do, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses; I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
nkd said:
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt it would be unable to do anything the Samsung processor can do, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses; I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though, and either is plenty of muscle for any mobile platform.
firmbiz94 said:
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though, and either is plenty of muscle for any mobile platform.
This thread slightly confuses me. The OP mentions the NS4G in the first post, then we have someone coming in asking about comparisons to the Galaxy S (S or S2?), and everyone answers about the GS2. Quick stat breakdown to answer whatever question is actually being asked here:
Nexus S 4G has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S2 (Euro) has:
1.2 GHz dual-core Orion CPU
Mali 400 GPU
Evo 3D has:
1.2 GHz dual-core MSM8660 CPU
Adreno 220 GPU
(Infoz from GSMArena)
The Nexus S and Galaxy S are last generation's phones, so to answer the OP: no, the Evo 3D doesn't have the same GPU/CPU as the NS4G. Not even similar; it's a generation (maybe even two) up. The Evo 4G is slightly slower than the NS4G, and it's running a 1.0 GHz Snapdragon with an Adreno 200 (not even a 205, which is the next in line before the 220).
As for the GS2 vs. Evo 3D, they're supposed to be on par with each other, with the GS2 maybe being a bit faster, since Qualcomm isn't the best with GPUs (personal opinion). However, AFAIK nobody has done any real testing of the Sensation vs. the GS2 (same CPU/GPU), so there's no real data backing up that claim... The GS2 DOES have better benchmark scores, though, so take that as you will.
Disclaimer: I found all the numbers on the internets. They may be wrong.
You can't really prove anything without concrete proof. There is still no scientific or dedicated performance comparison of all the GPUs found in dual-core chips.
I'd say all the posts in this thread are just personal opinions.
The only things we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI omap
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; the Cortex-A8 is such old hardware now.
Don't worry too much about the A8 vs. A9 thing... the differences are not huge (45nm vs. 40nm). Also, Qualcomm heavily optimized Scorpion so that it can actually perform operations the A9 can't. It will provide plenty of power. I would go into more detail, but that seems to upset some people in other threads.
peacekeeper05 said:
You can't really prove anything without concrete proof. There is still no scientific or dedicated performance comparison of all the GPUs found in dual-core chips.
I'd say all the posts in this thread are just personal opinions.
The only things we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI omap
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; the Cortex-A8 is such old hardware now.
Almost all of the GPU benchmarks I've seen go like this:
1. Qualcomm
2. TI omap
3. Exynos
4. Tegra 2
5. Hummingbird
Qualcomm uses the A8 because they don't use the reference designs from ARM; Snapdragon outperforms the Cortex-A8 reference by 20-30%, making it pretty close to the A9 reference.

Lenovo 'Transformer' IdeaTab S2

So will you guys be swapping your Asus Transformer Prime for a similar product? I'm sure most people are purchasing this due to the extra keyboard dock or Tegra 3.
EDIT: Personally I'll be sticking with the Asus Prime for now; it's a good device.
Specification:
10.1" IPS display
Qualcomm Snapdragon 8960 (28nm TSMC) dual-core 1.7 GHz / Adreno 225 GPU at 400 MHz (overclocked Adreno 220 + better drivers)
20-hour battery life
Keyboard dock like the Asus Transformer
16/32/64 GB
The GPU is just on par with the Mali-400MP, which is a shame (GLBenchmark), but that is an early benchmark.
Overclocking should be a lot better for the CPU; since it's 28nm, I guess reaching over 2.0 GHz is fine!
Information:
Lenovo Idea Tab S 2
We need to start by mentioning that there may be certain ambiguities in the specification listed here for the Lenovo Idea Tab S 2, since it's actually not the official release. But as prior experience suggests, this information normally turns out to be true, so let us proceed with it. The Lenovo Idea Tab S 2 is to have a 10.1-inch IPS display with a resolution of 1280 x 800 pixels, which would be a state-of-the-art screen panel and resolution. It will have a 1.5GHz Qualcomm Snapdragon 8960 dual-core processor with 1GB of RAM. This beast of hardware is controlled by Android OS v4.0 Ice Cream Sandwich, and Lenovo has included a completely modified UI called Mondrain UI for their Idea Tab.
It comes in three storage configurations, 16 / 32 / 64 GB, with the ability to expand the storage using a microSD card. It features a 5MP rear camera with autofocus and geotagging with Assisted GPS, and while the camera isn't that good, it performs decently. The Idea Tab S 2 will come with 3G connectivity, not 4G, which certainly is a surprise, and it also has Wi-Fi 802.11 b/g/n for continuous connectivity. They claim this tablet can control a smart TV, so we assume they have some variation of DLNA included in the Idea Tab S 2 as well. Following in the footsteps of Asus, the Lenovo Idea Tab S 2 also comes with a keyboard dock that adds battery life as well as additional ports and an optical trackpad. It's a good concept to replicate from Asus, and we reckon it could be a deal changer for the Lenovo Idea Tab S 2.
Lenovo has also made their new tablet rather thin, scoring a mere 8.69mm in thickness and 580g in weight, which is surprisingly light. The built-in battery can last up to 9 hours as per Lenovo, and if you hook it up with the keyboard dock, 20 hours of total battery life is promised by Lenovo, which is a very good move.
Video: http://www.youtube.com/watch?v=vWAOmO4LUIo
I certainly won't be going through the trouble of changing to this. It doesn't really look to add anything of value for me (I don't need GPS and my Wi-Fi works fine), and if Lenovo's past pricing holds true, this will likely be more expensive than the equivalent Primes.
MrPhilo said:
The GPU is just on par with the Mali-400MP, which is a shame (GLBenchmark), but that is an early benchmark.
That's surprising because of the GFLOPS specs for the GPUs:
Tegra 3 Kal-El: 7.2 GFLOPS
Qualcomm 8960 Adreno 225: 19.2 GFLOPS
PowerVR SGX543MP2: 19.2 GFLOPS
And per Anandtech "Qualcomm claims that MSM8960 will be able to outperform Apple's A5 in GLBenchmark 2.x at qHD resolutions." Of course, Qualcomm would say that but even if it is on par with the iPad2 (543MP2) it will still significantly outperform the Tegra3.
L3rry said:
That's surprising because of the GFLOPS specs for the GPUs:
Tegra 3 Kal-El: 7.2 GFLOPS
Qualcomm 8960 Adreno 225: 19.2 GFLOPS
PowerVR SGX543MP2: 19.2 GFLOPS
And per Anandtech "Qualcomm claims that MSM8960 will be able to outperform Apple's A5 in GLBenchmark 2.x at qHD resolutions." Of course, Qualcomm would say that but even if it is on par with the iPad2 (543MP2) it will still significantly outperform the Tegra3.
Yes, but drivers are the most important thing. Since Tegra 3 (Kal-El) is clocked higher than 300 MHz, the 7.2 GFLOPS figure doesn't count for much.
I doubt it'll significantly outperform the Tegra 3 GPU, just like the Adreno 220 was meant to be better but isn't much different.
Even Qualcomm admitted that it'll only have 50% more performance than the current Adreno 220.
FML, GLBenchmark took down the Asus TF202 entry with this GPU. It just performed lower than the Mali GPU; I wish I had saved the page.
With Adreno 225 Qualcomm improves performance along two vectors, the first being clock speed. While Adreno 220 (used in the MSM8660) ran at 266MHz, Adreno 225 runs at 400MHz thanks to 28nm. Secondly, Qualcomm tells us Adreno 225 is accompanied by "significant driver improvements". Keeping in mind the sheer amount of compute potential of the Adreno 22x family, it only makes sense that driver improvements could unlock a lot of performance. Qualcomm expects the 225 to be 50% faster than the outgoing 220
MrPhilo said:
FML, GLBenchmark took down the Asus TF202 entry with this GPU. It just performed lower than the Mali GPU; I wish I had saved the page.
Yes, I saw that comment posted in another thread and I tried to Google it but could not find it. Hopefully, AnandTech will put something out soon once demos of these newer tablets are available.
I've personally had a lot of headaches in the past with Lenovo laptops so I doubt I'll be making another Lenovo purchase. (Google "Y530 Lenovo Hinges" if you're interested in the issue- it was a common problem due to faulty design.)
The PowerVR and Adreno have much more efficient rendering methods than the Tegra chips, so this tablet is no pushover at all.
I wouldn't be surprised if real-world performance is better than the Tegra 3's outside of Tegra 3-specific apps.
hey
The Adreno 225 and SGX543MP2 both get 19.2 GFLOPS @ 300 MHz. We don't know the clock speed of the A5's GPU, but we can speculate that it's probably in the 250-300 MHz range.
That makes the Adreno (@400 MHz) more powerful in FLOPS than even the A5/Tegra 3. However, FLOPS don't tell the whole story, as the A5 has twice the number of TMUs, so it has a higher fill rate clock-for-clock and better texturing capability.
The A5 will likely have more ROPs as well, but I don't know that.
The A5 will also have slightly higher bandwidth, I think.
Looking at what Anand has said, the Adreno 220 only had single-channel memory (= low bandwidth), and it probably also had poor efficiency in getting data to the shaders; I think PowerVR is more efficient than the Adreno 2xx series.
The drivers on Adreno were not very good either; indeed, some developers on this forum have managed to DOUBLE the Adreno's performance using the newest Adreno drivers from Qualcomm. I think shaky153 was leading the charge on that.
I would be very surprised if the Adreno 225 equaled the A5, but it might equal or slightly beat the Tegra 3, especially at higher resolutions due to Tegra's lack of bandwidth.
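As a rough sanity check on the clock-scaling argument above, here is a small sketch using only the figures quoted in this thread (19.2 GFLOPS at 300 MHz for the Adreno 225/SGX543MP2, and the 266 MHz vs. 400 MHz Adreno clocks); treat these as approximate, not official specs.
Code:
# Sketch: peak GFLOPS scales roughly linearly with GPU clock, all else equal.
# Figures are the ones quoted in this thread, not official specifications.

def scaled_gflops(gflops_at_ref, ref_mhz, target_mhz):
    """Linearly rescale a peak-GFLOPS figure from one clock to another."""
    return gflops_at_ref * target_mhz / ref_mhz

# Adreno 225 / SGX543MP2: 19.2 GFLOPS quoted at 300 MHz.
print(scaled_gflops(19.2, 300, 400))  # Adreno 225 at 400 MHz -> 25.6 GFLOPS

# Clock bump alone from Adreno 220 (266 MHz) to Adreno 225 (400 MHz):
print(400 / 266)  # ~1.50x, which lines up with the quoted "50% faster" claim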
I don't understand why Nvidia doesn't announce the GPU clock speed! They detailed it with Tegra 2, which suggests there is something to hide.
The AP25 was 400 MHz, so Tegra 3 shouldn't be under 400 MHz.
This discussion would be a lot easier if we knew the actual clock speed.
Prime/Nvidia rules!
Plus, Lenovo has had no development support at all, and they are one of the slowest to release firmware updates. Everything is basically dead in Lenovo land.
It seems OK, but there's nothing enticing enough to make me think twice about trading my Prime. The Prime is just too cool all around.

What's next after quad-core?

So in 2011 we had Tegra 2 and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octa-core, or an improved version of quad-core CPUs?
Fasty12 said:
So in 2011 we had Tegra 2 and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octa-core, or an improved version of quad-core CPUs?
Well, as octa-core desktop CPUs haven't really caught on yet, I would guess just better quad cores, likely with more powerful GPUs.
Tegra 3 is already very powerful; presumably they will increase RAM, make them more battery efficient, or even raise the clock speed. The 12-core Tegra GPU is pretty amazing already, and anything better must be godly.
Sent from my HTC Desire using xda app-developers app
If you mean for the mobile platform: will we really need to go beyond quad core? Having seen how smoothly the SGS III runs with it, what more perfection (yes, more can still be expected) and speed will you need to do your work? As Android uses the other cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think it's just curiosity, and wanting the most advanced/latest hardware will be the only reason to have such a high-end CPU in your phone.
What I'd like to see is more RAM installed and lower RAM usage by the system...
Sounds like Octomom... the debate lives on: battery vs. performance. But to answer your question, I think it would be hexa-core, which is 6. Let's wait and see what is to come...
Sent from my SGH-T989 using Tapatalk 2
s-X-s said:
If you mean for the mobile platform: will we really need to go beyond quad core? Having seen how smoothly the SGS III runs with it, what more perfection (yes, more can still be expected) and speed will you need to do your work? As Android uses the other cores on an as-needed basis, why would you want to see 2-3 of your cores never used? I think it's just curiosity, and wanting the most advanced/latest hardware will be the only reason to have such a high-end CPU in your phone.
What I'd like to see is more RAM installed and lower RAM usage by the system...
I agree. Cores are at their peak right now. The amount of CPU power we have, especially in the higher-end phones, is enough to accomplish many, many things. RAM is somewhat of an issue, especially since multitasking is a huge part of Android. I really think 2GB of RAM should be standard soon. Also, better GPUs won't hurt.
Sent from my HTC T328w using Tapatalk 2
If they decide to keep going on the core upgrade in the next two or so years, I see one of two possibilities happening:
1) Dual Processor phones utilizing either dual or quad cores.
or
2) Hexa-core chips, since on the desktop market there are already a few 6-core chips (though whether or not they would actually be practical in a phone's architecture, no clue).
Generally speaking whatever they come out with next will either need a better battery material, or lower power processors.
I mean, I'm pretty amazed by what my brother's HTC One X is capable of with the quad core, and here I am still sporting a single-core G2. But yes, I would like to see more advancement in RAM usage; we've got a nice bit of power, but how about a standard 2GB of RAM for better multitasking?
I believe 2013 will be all about more efficient quad-cores.
May I ask what going from 1GB to 2GB will improve? Loading times?
Hello everyone, could you tell me what quad core is?
Quad core means that a processor has four processing units.
Because there are more cores, parallel workloads can, in theory, be executed up to 4 times faster.
Read more about it: http://simple.wikipedia.org/wiki/Multi-core_processor
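The "4 times faster in theory" figure only applies to perfectly parallel work. A quick illustrative sketch of Amdahl's law (the parallel fractions below are made-up examples, not measurements of any real app) shows how the serial part of a workload limits what extra cores can buy:
Code:
# Illustrative Amdahl's-law sketch: speedup from N cores is capped by the
# fraction of the work that cannot be parallelised.

def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for p in (0.5, 0.9, 1.0):
    print(f"{p:.0%} parallel work, 4 cores: {amdahl_speedup(p, 4):.2f}x speedup")
# 50% parallel work, 4 cores: 1.60x speedup
# 90% parallel work, 4 cores: 3.08x speedup
# 100% parallel work, 4 cores: 4.00x speedup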
Maybe i7 in mobile devices?
I'm sure it will stay at quad-core CPUs; anything more is just overkill. They may introduce hyperthreading. It's going to boil down to efficiency.
Sent from my SPH-D700 using xda premium
I'd say the future lies in more efficient use of processors. Right now, Android is still far from optimized on multi-core processor-equipped devices. Project Butter is the start of a great movement by Google to optimize the operating system. Hopefully it spreads out to other OEMs and becomes the main focus for Android development.
Improving and optimizing current processors is the way hardware companies should go.
In my opinion, processor development is out running battery development. Optimized processors could reduce power consumption while preserving excellent speed and usability.
Sent from my Transformer TF101 using Tapatalk 2
Building processors on more efficient ARM architectures is going to be the way to go from what I see... throwing four less efficient cores at a problem is the caveman method of dealing with it (looking at you, Samsung Exynos Quad, based on tweaked A9 cores).
The A15-class Qualcomm S4 Krait is more efficient on a clock-for-clock, core-for-core basis, and once the software catches up and starts using the hardware to its full capacity, fewer, more efficient cores will be preferred.
I don't see anything beyond quads, simply because they haven't even scratched the surface of what can be done with a modern dual-core processor yet... throwing more cores at it only makes excuses for poor code. I can shoot **** faster than water with a big enough pump, but that doesn't mean that's the better solution.
We don't need more cores! Having more than 2 cores will not make a difference so quad cores are a waste of space in the CPU die.
Hyperthreading, duh.
More ram. Got to have the hardware before the software can be made to use it.
With the convergence of x86 into the Android core and the streamlining of low-power Atom CPUs, the logical step would be to first optimize the current software base for multi-core processors before marketing takes over with their stupid x2 multiplying game...
Not long ago, a senior Intel exec went on record saying that today a single-core Android smartphone perhaps performs better overall (battery life, user experience, etc.) than any dual- or quad-core device. Mind you, these guys seldom if ever stick their necks out with such bold statements, especially ones not pleasing to the ear...
For those interested, you can follow this one (of many) articles on the subject: http://www.zdnet.com/blog/hardware/intel-android-not-ready-for-multi-core-cpus/20746
Android needs to mature, and I think it actually is. With 4.1 we see the focus drastically shifted to optimization, UX and performance with *existing/limited* resources. This will translate to devices beating all else in battery life, performance and graphics but since it was neglected in the first several iterations, it is likely we see 4.0 followed by 4.1 then maybe 4.2 before we hear/see the 5.0 which will showcase maturity and evolution of the experience.
Just my 2c. :fingers-crossed:

iPad 4 vs 5250 (Nexus 10 SoC) GLBenchmark full results. UPDATE: now with AnandTech!!

XXXUPDATEXXX
AnandTech has now published the performance preview of the Nexus 10; let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result for the iPad 4 has appeared on GLBenchmark, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so should be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, which is an important criterion for gaming.
Which device wins? Click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is roughly twice as fast as its older brother. The Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS; however, Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500 MHz vs. 250 MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output, etc. are not double those of the iPad 3, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
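For quick reference, here is the same comparison worked out as ratios, using only the offscreen scores quoted above (a sketch; the ~40 FPS figure for the final Nexus 10 is the speculation from this post, not a measurement):
Code:
# Ratios from the GLBenchmark 2.5 Egypt HD offscreen (1080p) scores quoted above.
fps = {"iPad 4": 48.6, "iPad 3": 25.9, "Exynos 5250 (Arndale)": 33.7}

print(fps["iPad 4"] / fps["iPad 3"])                 # ~1.88x the iPad 3
print(fps["iPad 4"] / fps["Exynos 5250 (Arndale)"])  # ~1.44x the Arndale board

# If driver maturity lifts the 5250 to the speculated ~40 FPS:
print(fps["iPad 4"] / 40.0)                          # gap narrows to ~1.2x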
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports on the Nexus 10 are overwhelmingly positive.
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
hung2900 said:
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized for the SoC as those expected for the N10. Samsung has a habit of optimizing drivers with further updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than Tegra 2.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you check the SGS2 numbers again now, there's another 50% improvement in performance since the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU at 533 MHz, so it shouldn't be coming in that low.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
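The fill-rate check described above can be written out directly. This is a sketch using only the figures quoted in this post: ARM's 2 GPixels/s at 500 MHz (which implies 4 pixels per clock) and the 533 MHz clock attributed to the Exynos 5250's Mali-T604; the ~60% figure is the shortfall described above.
Code:
# Sketch of the fill-rate check above. 4 pixels/clock is implied by ARM's
# quoted 2 GPixels/s at 500 MHz; 533 MHz is the clock quoted in this thread.

PIXELS_PER_CLOCK = 4  # 2,000 MPixels/s / 500 MHz

def fill_rate_gpixels(clock_mhz):
    return PIXELS_PER_CLOCK * clock_mhz / 1000.0

print(fill_rate_gpixels(500))        # 2.0 GPixels/s (ARM's figure)
print(fill_rate_gpixels(533))        # ~2.13 GPixels/s (close to Samsung's 2.1)

# The Arndale result quoted above is only ~60% of the theoretical peak:
print(0.6 * fill_rate_gpixels(533))  # ~1.28 GPixels/s actually observed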
hung2900 said:
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had very bad GLBenchmark scores initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
In areas where the Mali-400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance, whereas in these low-level tests the iPad 4 is not a solid 2x the power of the iPad 3, yet it achieves twice the FPS of its older brother in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as the drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized for the SoC as those expected for the N10. Samsung has a habit of optimizing drivers with further updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than Tegra 2.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you check the SGS2 numbers again now, there's another 50% improvement in performance since the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU at 533 MHz, so it shouldn't be coming in that low.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
I agree with most of what you have said. On the GPixel figure, this is like ATI GPU teraflop figures always being much higher than Nvidia's: in theory, with code written to hit the device perfectly, you might see those high figures, but in reality the Nvidia cards with lower on-paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The amount of resources needed to drive those high-megapixel displays means lots of compromises will be made in terms of effects, polygon complexity, etc. to ensure decent FPS, especially when you consider that driving Battlefield 3 at 2560 x 1600 with AA and high textures requires a PC that burns 400+ watts of power, not a 10-watt SoC.
Overall, when we consider that the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both; the biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
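To put the resolution point in numbers, here is a quick illustrative calculation of pixels per frame for the panel sizes mentioned in this thread (a sketch, not a benchmark; shading and fill-rate work per frame scales roughly with these counts):
Code:
# Pixels per frame for the screen sizes discussed above.
panels = {
    "Nexus 10 (2560x1600)":        2560 * 1600,
    "1080p offscreen (1920x1080)": 1920 * 1080,
    "Galaxy S2 (800x480)":          800 * 480,
}

baseline = panels["1080p offscreen (1920x1080)"]
for name, px in panels.items():
    print(f"{name}: {px / 1e6:.2f} MP, {px / baseline:.2f}x the 1080p workload")
# The 2560x1600 panel is ~4.1 MP per frame, roughly 2x a 1080p render target,
# which is why effects and polygon budgets get trimmed to hold frame rate.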
What's the point of speculation? Just wait for the device to be released and run all the test you want to get confirmation on performance. Doesn't hurt to wait
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Both A9 and A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if the new SoCs are too powerful and max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0 GHz quad-core variant in the semi-near future... Ohhhh, the possibilities. Samsung has one hell of a piece of silicon on their hands.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
True... Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would have a much better result in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimization isn't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware; in fact it scores lower than the Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about the fill rate. Need to check what you guys say about this.
Is it running some debugging code on the devices at this time?
Turbotab said:
Both A9 and A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if the new SoCs are too powerful and max out the old benches, but for GLBenchmark that has not happened yet, and there are already new updates in the pipeline.
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750 MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
I have to disagree, this is from ARM's info site.
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
Keion said:
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750 MHz would destroy everything.
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that such an awesome resolution does tax the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that such an awesome resolution does tax the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Nah, I think we can beat that too.
Drivers + OC.

Can the Lenovo K900 emulate PS2 on Android?

I got around to looking at some pretty powerful mobile devices, and found that the Lenovo K900 is pretty darn excellent. Its Intel Atom processor can be overclocked to beyond 2.2 GHz, and it has dual cores too.
Here's a short overview of the device:
1. Intel Atom Z2580 dual-core 2.0 GHz
2. 2GB RAM (LPDDR2)
3. Android v4.2 (not really a big make-or-break for PS2 emulation, but whatever)
I know this is not powerful enough for ideal gaming, but isn't it possible to port PCSX2 and its supporting libraries to Android and manage to get some sort of emulation going, with some games at low FPS?
I don't see why not... I have heard of people running PCSX2 on Windows with single-core processors under 2.5 GHz, and it was arguably "playable" to some.
Assuming this is not the highest-end smartphone in the market but still has pretty good specs, wouldn't a device a bit more powerful than this one come close to taking the cake?
In short, I believe PS2 emulation could be done on some high-end smartphones (like this one) now, just not "good FPS/emulation" yet.
Any rebuttal? The S5 from Samsung should be arriving soon, and it will be even more powerful. Somebody should help me port PCSX2, or at least create an open community project to do so. As time goes on, updates can be made for the more powerful hardware.
