Related
So is it true that the GPU on the EVO 3D sucks? Or is it outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
I guess I should rephrase one of my questions. I'm asking how it will run the emulators because I saw someone on an SG playing N64oid and it seemed pretty laggy, and if I'm not mistaken that has the same/similar GPU to the NS4G?
tannerw_2010 said:
So is it true that the GPU on the EVO 3D sucks? Or is it outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says about it being worse than the NS4G's GPU is true, then it's kind of a disappointment in that regard.
The emulators use the CPU, so the Evo 3D will be fine; the PSX emulator runs fine on my 18-month-old Desire.
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
[email protected] said:
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now.
Yeah, I've heard that too. So it makes me wonder what's really true. It might tell you something that I heard the GPU isn't very good from the NS boards... but I think I've heard it on these boards too, just not nearly as much.
Look up YouTube videos of the GPU in action. 'Nuff said.
Sent from my Nexus S 4G using XDA Premium App
Maybe this will calm your fears
http://www.youtube.com/watch?v=DhBuMW2f_NM
Here's a better one:
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=youtube_gdata_player
Sent from my Nexus S 4G using XDA Premium App
The GPU in the Evo 3D should be the best out right now. It's supposed to be up to twice as fast/powerful as Tegra 2. It does appear that some optimizations need to be done to take advantage of this GPU, though, hence some of the early low benchmarks.
The GPU is the fastest right now. No need to speculate; it will be until Tegra 3 comes out, but I think it will still match Tegra 3 in most benchmarks. The SGX540 is good, but the Adreno 220 is faster.
What about the CPU? Is it worse than the Galaxy S CPU, or better?
jamhawk said:
What about the CPU? Is it worse than the Galaxy S CPU, or better?
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
a5ehren said:
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
donatom3 said:
Depends on whether the US Galaxy S2s are going to be Tegra or Exynos.
Now that's gonna make all the difference.
Sent from my PC36100 using XDA Premium App
Well, this CPU has a totally different design. If you look at the videos, it is plenty fast; I highly doubt it would not be able to do what the Samsung processor can do, other than bench a little higher. If and when this phone gets ICS, it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
nkd said:
Well, this CPU has a totally different design. If you look at the videos, it is plenty fast; I highly doubt it would not be able to do what the Samsung processor can do, other than bench a little higher. If and when this phone gets ICS, it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though. Either way, plenty of muscle for any mobile platform.
firmbiz94 said:
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other, though. Either way, plenty of muscle for any mobile platform.
This thread slightly confuses me. The OP mentions the NS4G in the first post, then we have someone coming in asking about comparisons to the Galaxy S (S or S2?), and everyone answers about the GS2. Quick stat breakdown to answer whatever question is actually being asked here:
Nexus S 4G has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S2 (Euro) has:
1.2 GHz dual-core Orion CPU
Mali 400 GPU
Evo 3D has:
1.2 GHz dual-core MSM8660 CPU
Adreno 220 GPU
(Info from GSMArena)
The Nexus S and Galaxy S are last generation's phones, so to answer the OP... no, the Evo 3D doesn't have the same GPU/CPU as the NS4G. Not even similar. It's a generation (maybe even two) up. The Evo 4G is slightly slower than the NS4G, and it's running a 1.0 GHz Snapdragon with an Adreno 200 (not even a 205, which is the next step below the 220).
As for the GS2 vs. Evo 3D, they're supposed to be on par with each other, with the GS2 maybe being a bit faster, since Qualcomm isn't the best with GPUs (personal opinion). However, AFAIK nobody has done any real testing on the Sensation vs. the GS2 (same CPU/GPU), so there's no real data backing up that claim... The GS2 DOES have better benchmark scores, though, so take that as you will.
Disclaimer: I found all the numbers on the internets. They may be wrong.
You can't really prove anything without concrete proof. There is still no scientific, dedicated performance comparison of all the GPUs found on dual-core SoCs.
I say all the posts in this thread are just personal opinions.
The only thing we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI omap
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; the Cortex-A8 is such old hardware now.
Don't worry too much about the A8 vs. A9 thing... the differences are not huge (45 nm vs. 40 nm). Also, Qualcomm heavily optimized Scorpion so that it can actually perform operations the A9 can't. It will provide plenty of power. I would go into more detail, but that seems to upset some people in other threads.
peacekeeper05 said:
You can't really prove anything without concrete proof. There is still no scientific, dedicated performance comparison of all the GPUs found on dual-core SoCs.
I say all the posts in this thread are just personal opinions.
The only thing we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI omap
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; the Cortex-A8 is such old hardware now.
Almost all of the GPU benchmarks I've seen go like this:
1. Qualcomm
2. TI omap
3. Exynos
4. Tegra 2
5. Hummingbird
Qualcomm uses the A8 because they don't use the reference designs from ARM. Snapdragon outperforms the Cortex-A8 reference by 20-30%, making it pretty close to the A9 reference.
X10 is garbage! This is outrageous!
Yes, really, they got it working. If you want it so bad, try porting it yourself.
Sent from my MB860 using XDA App
cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
dLo GSR said:
cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
Oh snap. That was awesome.
Sent from my MB860 using XDA App
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
firefox3 said:
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
Good news man
Sent from my MB860 using XDA App
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10, and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire, though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe, though!
The Atrix should fly under 4.0.1, though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Javi97100 said:
Good news man
Sent from my MB860 using XDA App
It's turning out to be harder than I thought... I think no one will get it until official updates come out for other phones.
Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10, and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire, though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe, though!
The Atrix should fly under 4.0.1, though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
So EGL = GPU driver? If that's the only setback, would it be possible to get an ICS ROM with software rendering as a proof of concept, or are there other pieces missing?
GB/CM7 is pretty good on the Atrix; if we don't see ICS for a few months it doesn't hurt us in any way. I'd like to think most of us can be patient if we lack the skills to help.
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure, if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
According to AnandTech, Tegra 2 support is essentially ready, so I think as long as Nvidia releases the source for ICS (libs?), someone will try to port it. Hell, I have a good 5 weeks during break; I might as well try then.
Sent from my MB860 using XDA App
Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10, and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire, though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe, though!
The Atrix should fly under 4.0.1, though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Actually, no. Despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
edgeicator said:
Actually, no. Despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Buddy, check out any of the kernels available in the dev thread and you'll see that the GPUs are overclocked.
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure, if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
I doubt the iPhone will see ICS; the newest model that can run Android, as far as I know, is the iPhone 3G, which was incredibly slow under Gingerbread.
mac208x said:
X10 is garbage! This is outrageous!
222 posts and zero thanks? Is this what you do, go around XDA and post useless threads, like the guy complaining about returning home early despite nobody asking him "to get MIUI ported on his grandma's phone"?
Are you guys related by any chance?
edgeicator said:
Actually, no. Despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460, thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400 MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased toward one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400 MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-performance-bound with a 1.4 GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
Azurael said:
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460, thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400 MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased toward one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400 MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-performance-bound with a 1.4 GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
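The per-clock arithmetic behind those GFLOPS figures is easy to sanity-check. A minimal sketch, treating the quoted 7.2 and 19.2 GFLOPS @ 300 MHz numbers as reported rather than verified:

```python
# Per-clock throughput implied by the GFLOPS figures quoted above.
# The 7.2 / 19.2 GFLOPS @ 300 MHz numbers are taken from the post as-is.

def flops_per_clock(gflops: float, clock_mhz: float) -> float:
    """FLOPs issued per GPU clock cycle at the given clock."""
    return (gflops * 1e9) / (clock_mhz * 1e6)

tegra3 = flops_per_clock(7.2, 300)      # ~24 FLOPs per clock
adreno225 = flops_per_clock(19.2, 300)  # ~64 FLOPs per clock

# Same clock, so the gap is purely architectural width: ~2.7x.
print(round(tegra3), round(adreno225), round(adreno225 / tegra3, 1))
```

On these numbers the Adreno 225 and iPad GPU would issue roughly 2.7x the FLOPs per cycle, which is the poster's point: the gap is architecture, not clock speed.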
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
Don't you get tired of writing those long rants? We understand you know something about CPU architecture, and that Tegra isn't the best one out there, but damn, man, it's the same thing in every thread. Just chill out and try to stay on topic for once.
Sent from my MB860 using Tapatalk
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
I think you are not seeing the whole picture...
Tegra 3 (et al.) is not just about its quad-core implementation; remember that the GPU will offer 12 cores, which will translate into performance not yet seen on any other platform.
Benchmarks don't tell the whole story! Especially those benchmarking tools which are not yet optimized for Tegra 3.
Cheers!
Sent from my Atrix using Tapatalk
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure, if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
LOL, I ran all the iDroid ports on my iPhone. Not one was even in alpha stage; I would not even count iDroid as a port, since you can't use anything on it.
XXXUPDATEXXX
AnandTech have now published the performance preview of the Nexus 10, let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result has appeared on GLBenchmark for the iPad 4, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so will be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, which is an important criterion for gaming.
To see which device wins, click the link.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test GLBenchmark 2.5 Egypt HD C24Z16 - Offscreen (1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
Exynos 5250: 33.7 FPS
So the iPad 4 is twice as fast as its older brother; the Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS; however, Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposed to be clocked at 500 MHz vs. 250 MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output, etc. are not double the power of the iPad 3, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
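A quick check of the ratios in those Egypt HD numbers (figures as quoted above):

```python
# Egypt HD offscreen (1080p) FPS figures quoted in the post above.
ipad4_fps, ipad3_fps, exynos5250_fps = 48.6, 25.9, 33.7

gen_gain = ipad4_fps / ipad3_fps   # ~1.88x: close to, but short of, "twice as fast"
clock_gain = 500 / 250             # 2.0x reported GPU clock increase

# The in-game gain approaching the clock gain, while low-level tests don't
# double, is consistent with drivers and the faster A6 CPU doing part of the work.
print(round(gen_gain, 2), clock_gain)
```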
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimize the beast. In the end, however, actual app and user-interface performance is what matters, and reports are overwhelmingly positive on the Nexus 10.
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had a very bad GLBenchmark score initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
hung2900 said:
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; software updates may be necessary for a proper benchmark.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized for the SoC as we expect for the N10. Samsung has a habit of optimizing its drivers with later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than Tegra 2.
Anand ran a benchmark on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less-than-Tegra-2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x, and Pro also got 20% higher. They could have been even higher if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in the numbers.
If you check the numbers for the SGS2 now, there's another 50% improvement in performance since Anand did his review.
Check the SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
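The compounding in those quoted improvements, as arithmetic (a sketch using the 3.6x and ~50% figures cited above, taken as reported):

```python
# SGS2 Egypt-score improvements quoted above, compounded.
prerelease_to_review = 3.6   # 3.6x from MWC 2011 pre-release to the review
review_to_now = 1.5          # a further ~50% after later driver updates

overall = prerelease_to_review * review_to_now  # ~5.4x, largely from drivers
print(overall)
```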
Also, check the fill rate in the Arndale board test. It's much less than expected. ARM says that a Mali-T604 clocked at 500 MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533 MHz, so it shouldn't be this low.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
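The fill-rate gap above can be put into numbers. A sketch assuming the 4 pixels/clock implied by ARM's 2 GPixels/s-at-500 MHz claim:

```python
# Mali-T604 fill rate, from the figures quoted above.
pixels_per_clock = 2e9 / 500e6          # ARM's claim implies 4 pixels/clock

# Expected at Samsung's 533 MHz clock: ~2.13 GPixels/s, in line with the
# ~2.1 GPixels/s figure in Samsung's own slide.
expected_gpix = pixels_per_clock * 533e6 / 1e9

measured_gpix = 0.60 * 2.0              # "about 60%" of 2 GPixels/s observed
print(round(expected_gpix, 2), measured_gpix)
```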
hung2900 said:
So the Mali-T604 didn't turn out 5 times better than the Mali-400, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU had a very bad GLBenchmark score initially (even worse than the PowerVR SGX540), but after a firmware update it was way better than other SoCs in Android handsets.
In areas where the Mali-400 lacked performance, like fragment- and vertex-lit triangle output, the T604 is comfortably 5x the performance; whereas in these low-level tests the iPad 4 is not a concrete 2x the power of the iPad 3, yet achieves twice the FPS of its older brother in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimized for the SoC as we expect for the N10. Samsung has a habit of optimizing its drivers with later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic. It was even worse than Tegra 2.
Anand ran a benchmark on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less-than-Tegra-2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x, and Pro also got 20% higher. They could have been even higher if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in the numbers.
If you check the numbers for the SGS2 now, there's another 50% improvement in performance since Anand did his review.
Check the SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show how driver optimization can have a big effect on performance. My point is that we have to wait for proper testing on the final release of the N10 device.
Also, check the fill rate in the Arndale board test. It's much less than expected. ARM says that a Mali-T604 clocked at 500 MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533 MHz, so it shouldn't be this low.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
I agree with most of what you have said. The GPixel figure is like ATI's GPU teraflops figures always being much higher than Nvidia's: in theory, with code written to hit the device perfectly, you might see those high figures, but in reality the Nvidia cards with lower on-paper numbers equaled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The amount of resources needed to drive those high-MP displays means lots of compromises will be made in terms of effects / polygon complexity etc. to ensure decent FPS, especially when you consider that driving Battlefield 3 at 2560 x 1600 with AA and high textures requires a PC that burns 400+ watts of power, not a 10-watt SoC.
Overall, when we consider that the Nexus 10 has twice the RAM for game developers to use and faster CPU cores, games should look equally nice on both. The biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
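To put the resolution argument in numbers, here's a rough pixel-count comparison (a sketch; for fill-rate-bound scenes, per-frame GPU work scales roughly with pixel count):

```python
# Per-frame pixel load at common device resolutions.
def megapixels(width, height):
    return width * height / 1e6

displays = {
    "Nexus 10 (2560x1600)": (2560, 1600),
    "iPad 3/4 (2048x1536)": (2048, 1536),
    "1080p (1920x1080)":    (1920, 1080),
    "720p (1280x720)":      (1280, 720),
}

for name, (w, h) in displays.items():
    print(f"{name}: {megapixels(w, h):.2f} MP per frame")

# The Nexus 10 pushes nearly twice the pixels of a 1080p panel:
print(megapixels(2560, 1600) / megapixels(1920, 1080))  # ~1.98
```

That near-2x pixel load is the compromise being described: the same GPU budget spread over twice the pixels leaves less headroom for effects and geometry.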
What's the point of speculation? Just wait for the device to be released and run all the tests you want to get confirmation on performance. Doesn't hurt to wait
BoneXDA said:
Not sure about this, but don't benchmark tools need to be updated for new architectures too? A15 is quite a big step; SW updates may be necessary for a proper bench.
Both A9 & A15 use the same instruction set architecture (ISA), so no, they won't. Benchmarks may need to be modified if the new SoCs are too powerful and max out the old benches, but for GL Benchmark that has not happened yet, and there are already new updates in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0GHz quad-core variant in the semi-near future... Ohhhh the possibilities. Samsung has one hell of a piece of silicon on their hands.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult; the stock AOSP browser would be so much faster.
True.. Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would get much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimizations aren't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware. In fact, it scores lower than the Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about fill rate. Need to check what you guys say about this.
Is it running some debugging code on the devices at this time?
Turbotab said:
Both A9 & A15 use the same instruction set architecture (ISA) so no they won't. Benchmarks may need to be modified, if the new SoC are too powerful and max out the old benches, but for GL Benchmark, that has not happened yet and there are already new updates in the pipeline.
Actually not. A8 and A9 share the same ISA (ARMv7), while A5, A7 and A15 are in another group (ARMv7-A).
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750 MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 share the same ISA (ARMv7), while A5, A7 and A15 are in another group (ARMv7-A).
I have to disagree, this is from ARM's info site.
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
Keion said:
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali-T604 at 750 MHz would destroy everything.
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that a resolution like that taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle
Harry GT-S5830 said:
Do remember that a resolution like that taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Nah, I think we can beat that too.
Drivers + OC.
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z currently.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs 2560x1600 on the Nexus 10 with a 1.7GHz Exynos 5 and its Mali-T604 GPU.
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z currently.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs 2560x1600 on the Nexus 10 with a 1.7GHz Exynos 5 and its Mali-T604 GPU.
The S4 is halfway between the Cortex-A9 cores and the new Cortex-A15 cores that we have, so it is a decent enough CPU. I am not sure how good that GPU is; none of my devices the past couple of years have had Adreno GPUs. At least it won't have to work as hard with the lower resolution
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z currently.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs 2560x1600 on the Nexus 10 with a 1.7GHz Exynos 5 and its Mali-T604 GPU.
What stuttering are you talking about?
Draw your own conclusions.
S4 Pro - http://www.anandtech.com/show/6112/...agon-s4-apq8064adreno-320-performance-preview
Exynos 5 - http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual
From everything I've seen and experienced, the Exynos 5 is the better of the two. The A15 is a more powerful core than the Krait core; that, with the higher clock speeds and the better GPU, makes for a better chip. Personally I have never had my N10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
enik_fox said:
From everything I've seen and experienced, the Exynos 5 is the better of the two. The A15 is a more powerful core than the Krait core; that, with the higher clock speeds and the better GPU, makes for a better chip. Personally I have never had my N10 lag at all. Maybe you just got a dud?
Sent from my Nexus 10 using Tapatalk HD
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm modified the core was power consumption; a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Fasty12 said:
Planning to return my N10 because the stuttering on it is driving me insane, and I'm really interested in the Tablet Z currently.
1920x1080 on a 1.5GHz Qualcomm APQ8064 with an Adreno 320 GPU vs 2560x1600 on the Nexus 10 with a 1.7GHz Exynos 5 and its Mali-T604 GPU.
Every now and then I read people complaining about lags and stutters. I have not experienced one since I've had the device; can you please explain what you are doing when this happens?
avdaga said:
Every now and then I read people complaining about lags and stutters. I have not experienced one since I've had the device; can you please explain what you are doing when this happens?
Try opening and closing Google Maps after the map has loaded; there is a NOTICEABLE frame-rate drop compared to other apps.
kaspar737 said:
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm modified the core was power consumption; a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
The Snapdragon S4 does not use an A15 core or any derivative of an A15. Qualcomm has ALWAYS designed its cores completely custom; they have almost nothing to do with the current major core from ARM's reference design. Additionally, the S4 was designed and released before the A15 MPCore had even finished its design phase.
The Krait core uses a similar (but not identical) triple-wide decode stage to the A15 core, but it uses a completely different 11-stage execution pipeline compared to the A15's 15-stage pipeline. The greater number of pipeline stages allows the A15 design to break work down smaller and achieve higher frequency, but if a computation fails (e.g. a branch mispredict), the A15 must wait longer before it can start over, where the Krait core doesn't have to wait as long but also isn't as efficient in "normal" circumstances. Honestly, the integer performance between the two cores is pretty close, but I think I remember seeing that the A15 has much stronger floating-point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player who does not base their processors on the ARM design
Fasty12 said:
Try opening and closing Google Maps after the map has loaded; there is a NOTICEABLE frame-rate drop compared to other apps.
Do you mean a drop in framerate during the animation when closing Maps? I notice a minor framerate drop which lasts as long as the animation does, but if that is it, I'm kind of wondering why you bought an Android device in the first place. I have not noticed this before, and I cannot imagine anyone would while using the device for its intended purposes. If you take any Android device, you will find an fps drop at some point. Maybe return it and get an iPad? iPads do not have the issue; on the other hand, there's a lot that iPads do not have ^^
kaspar737 said:
But the Exynos 5 has to drive that massive screen resolution. Also, I think the reason Qualcomm modified the core was power consumption; a stock A15 core consumes quite a lot of power.
Sent from my LG-P990 using xda app-developers app
Exynos has higher memory bandwidth so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
EniGmA1987 said:
The Snapdragon S4 does not use an A15 core or any derivative of an A15. Qualcomm has ALWAYS designed its cores completely custom; they have almost nothing to do with the current major core from ARM's reference design. Additionally, the S4 was designed and released before the A15 MPCore had even finished its design phase.
The Krait core uses a similar (but not identical) triple-wide decode stage to the A15 core, but it uses a completely different 11-stage execution pipeline compared to the A15's 15-stage pipeline. The greater number of pipeline stages allows the A15 design to break work down smaller and achieve higher frequency, but if a computation fails (e.g. a branch mispredict), the A15 must wait longer before it can start over, where the Krait core doesn't have to wait as long but also isn't as efficient in "normal" circumstances. Honestly, the integer performance between the two cores is pretty close, but I think I remember seeing that the A15 has much stronger floating-point performance. So I guess it really depends on your workload.
FYI, the Exynos 5, Tegra 4, and TI OMAP 5 processors are all based on the A15 core design. Qualcomm is the only major player who does not base their processors on the ARM design
THANK YOU!! My god, I've had to explain this so many times! Qualcomm licenses ONLY the ARMv7 instruction set, not ARM's designs. They design their own chips, and GPU, from the ground up, so please, people, stop saying Qualcomm is a Cortex-series processor, because it isn't. Samsung and the rest license ARM's design and modify it; in Samsung's case they tend to increase the IPC slightly and give it more memory bandwidth.
Also, to answer the question: the Exynos 5 will do better at higher resolutions, and they will be very close at lower resolutions. The S4 will more than likely be better in multi-threaded workloads, and the Exynos will have better float performance. The Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
Exynos has higher memory bandwidth so the difference isn't substantial.
Sent from my Galaxy Nexus using Tapatalk 2
---------- Post added at 01:33 PM ---------- Previous post was at 01:29 PM ----------
THANK YOU!! My god, I've had to explain this so many times! Qualcomm licenses ONLY the ARMv7 instruction set, not ARM's designs. They design their own chips, and GPU, from the ground up, so please, people, stop saying Qualcomm is a Cortex-series processor, because it isn't. Samsung and the rest license ARM's design and modify it; in Samsung's case they tend to increase the IPC slightly and give it more memory bandwidth.
Also, to answer the question: the Exynos 5 will do better at higher resolutions, and they will be very close at lower resolutions. The S4 will more than likely be better in multi-threaded workloads, and the Exynos will have better float performance. The Exynos is better for games once the thermal throttling is fixed.
Sent from my Galaxy Nexus using Tapatalk 2
The 50 percent extra memory bandwidth doesn't matter so much considering that the Exynos has to move almost twice the number of pixels.
Sent from my LG-P990 using xda app-developers app
kaspar737 said:
The 50 percent extra memory bandwidth doesn't matter so much considering that the Exynos has to move almost twice the number of pixels.
Sent from my LG-P990 using xda app-developers app
Memory bandwidth makes a pretty big difference when it comes to resolution, e.g. the 8600GT with DDR2 vs DDR3. A wider memory bus and faster memory make a big difference in the higher-resolution performance of any GPU.
It will also help with GPU compute performance for future apps utilizing the Mali-T604's compute abilities
Sent from my Galaxy Nexus using Tapatalk 2
ECOTOX said:
Memory bandwidth makes a pretty big difference when it comes to resolution, e.g. the 8600GT with DDR2 vs DDR3. A wider memory bus and faster memory make a big difference in the higher-resolution performance of any GPU.
It will also help with GPU compute performance for future apps utilizing the Mali-T604's compute abilities
Sent from my Galaxy Nexus using Tapatalk 2
But let's say the Exynos uses the whole 12.8 GB/s of bandwidth. To move half as many pixels you would need 6.4 GB/s, so memory bandwidth isn't the issue.
Sent from my LG-P990 using xda app-developers app
kaspar737 said:
But let's say the Exynos uses the whole 12.8 GB/s of bandwidth. To move half as many pixels you would need 6.4 GB/s, so memory bandwidth isn't the issue.
Sent from my LG-P990 using xda app-developers app
But that bandwidth is shared, unlike on a dedicated GPU where it isn't. The total system bandwidth (not including buses for the modem or whatever else is there) being higher on the Exynos chip is going to give it the edge in any situation, considering the closeness in performance between the two. It also can't be denied that the Mali-T604 has an edge in horsepower over the Adreno 320, because even at the N10's resolution it comes within a couple of fps of the Adreno running at 1080p resolution. Not saying it's a big difference, but the Exynos is the more powerful all-around chip, and that's just in its dual-core form.
Edit: Also, it's well known that Adreno has poor fill rate compared to Mali or PowerVR; Adreno's strength is geometry performance, so it takes more of a hit at higher resolutions than either the Mali-T604 or the SGX 554MP4, which both have higher fill rate, and the SoCs we are comparing both have higher bandwidth to facilitate that so we don't get bottlenecked.
Sent from my Galaxy Nexus using Tapatalk 2
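The shared-bandwidth point above can be made concrete with a back-of-the-envelope framebuffer calculation. This is a sketch assuming 32-bit color at 60 fps and counting only framebuffer writes; real workloads add texture reads, overdraw and CPU traffic on the same bus:

```python
BYTES_PER_PIXEL = 4  # 32-bit RGBA
FPS = 60

def fb_gbs(width, height, passes=1):
    """GB/s needed to write the framebuffer `passes` times per frame at 60 fps."""
    return width * height * BYTES_PER_PIXEL * FPS * passes / 1e9

print(fb_gbs(2560, 1600))            # Nexus 10: ~0.98 GB/s per pass
print(fb_gbs(1920, 1080))            # 1080p:    ~0.50 GB/s per pass
print(fb_gbs(2560, 1600, passes=4))  # with 4x overdraw: ~3.9 GB/s
```

Even a few passes of overdraw at 2560x1600 starts consuming a real fraction of a 12.8 GB/s budget that the CPU also shares, which is why the extra bandwidth matters more at the N10's resolution.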
Finally, a lot of experts here about GPUs
I know it is not related to the topic, but my girlfriend and I have a Galaxy Note 2 and an S3. As you all know, they are the last ones to use the older Mali-400 GPU. I love playing games and I am getting my girlfriend used to them too. So I was wondering: how is our Mali-400 GPU holding up against the upcoming 1080p Adreno 320 devices? It is clear the future is 1080p. I am planning to switch our devices for a couple of Nexus 4s or Xperia Zs, because I fear our devices are about to be outdated with the next game right around the corner. So far they are doing just fine with Modern Combat 4 and all the other graphics-intensive games, playing at over 28-30 FPS. But according to GLBenchmark 2.5 Egypt they are useless against the new Adreno 320. However, I have read that most games were designed around high fill-rate power, where the Mali-400 is able to beat the Adreno 320; it is on the triangle tests that it bottlenecks.
So what is your opinion? Will our devices last another year and a half with the new games? Or should I make the trade? Or should I just buy a Nexus 10 with 2 users assigned and continue games on it?
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
_delice_doluca_ said:
Finally, a lot of experts here about GPUs
I know it is not related to the topic, but my girlfriend and I have a Galaxy Note 2 and an S3. As you all know, they are the last ones to use the older Mali-400 GPU. I love playing games and I am getting my girlfriend used to them too. So I was wondering: how is our Mali-400 GPU holding up against the upcoming 1080p Adreno 320 devices? It is clear the future is 1080p. I am planning to switch our devices for a couple of Nexus 4s or Xperia Zs, because I fear our devices are about to be outdated with the next game right around the corner. So far they are doing just fine with Modern Combat 4 and all the other graphics-intensive games, playing at over 28-30 FPS. But according to GLBenchmark 2.5 Egypt they are useless against the new Adreno 320. However, I have read that most games were designed around high fill-rate power, where the Mali-400 is able to beat the Adreno 320; it is on the triangle tests that it bottlenecks.
So what is your opinion? Will our devices last another year and a half with the new games? Or should I make the trade? Or should I just buy a Nexus 10 with 2 users assigned and continue games on it?
Thank you for reading.
Sent from my GT-N7100 using xda app-developers app
They will hold up; my SGS2 runs all of the current games at the highest settings (I haven't tried GTA though) without any issues. The Adreno 320 is far better than the Mali-400 MP4, though.
Yeah, I'm pretty sure they will still play games a year from now. Until the market is completely saturated with devices as powerful as the Nexus 10, we won't really see large jumps in system requirements. That will probably only happen a year or two from now, once all the new phones and tablets are made with A15 processors (or the Qualcomm equivalent) and beefy GPUs.
Fidelator said:
They will hold on, my SGS2 runs all of the current games at the highest settings ( I haven't tried GTA though) without any issues, the Adreno 320 is far better than the Mali 400 MP4 though
The S2 (Mali-400) plays GTA 3 without a hiccup.
The Exynos dual is very power-hungry compared to the S4 Pro, but it is also the most powerful ARM processor out today. Nothing else yet released (I said RELEASED) is as powerful or can match its bandwidth. Having said that, I'm sure a normal-resolution 1080p screen in this form factor with the S4 Pro would make a nice, fast tablet. Right now the Exynos dual is pretty much the only thing outside Apple that can push the resolution the N10 has. I think if they had put another gig of DDR3 in this thing there wouldn't be so much stuttering in certain instances. Besides the thermal cutoff, the N10 is starved for memory, as it has to share its RAM between normal duties and the graphical load of pushing all the pixels of this monster resolution. You are lucky to have 300MB of RAM available at idle on the N10, versus over a gig available with the S4 Pro on the 720p screen of the Nexus 4.
Sent from my often-RMA'd Nexus 4, so that I can use the one I'm using now when I get the 6th and hopefully final one.
I have only used dumb-phones until now and plan to get a smartphone.
I did a little surfing and found that the Adreno 320 is better than the Mali-400 GPU, and Qualcomm Snapdragons are better than Samsung Exynos, right?
But in the real world, the Note 2 is considered a better phone than the Nexus 4. I don't understand this.
Different phones top different benchmarks, and some things like display quality are not considered in benchmarks, I guess, right?
Now, I haven't used any smartphone before, and I would be using it mostly for gaming or watching videos/movies, as I'm a student.
Which of the phones I mentioned (all in the 23-29,000 INR range) are better than the others in reality?
Those who have used these phones or are tech-savvy might be able to guide me in making the right choice.
Just found out the S3 in India has different specs than in the US, Canada etc.
India: 1.4 GHz quad-core Exynos 4412, Mali-400, 1GB
US: 1.5 GHz dual-core Snapdragon, Adreno 225, 2GB
Which S3 is better?
sher_dil said:
I have only used dumb-phones until now and plan to get a smartphone.
I did a little surfing and found that the Adreno 320 is better than the Mali-400 GPU, and Qualcomm Snapdragons are better than Samsung Exynos, right?
But in the real world, the Note 2 is considered a better phone than the Nexus 4. I don't understand this.
Different phones top different benchmarks, and some things like display quality are not considered in benchmarks, I guess, right?
Now, I haven't used any smartphone before, and I would be using it mostly for gaming or watching videos/movies, as I'm a student.
Which of the phones I mentioned (all in the 23-29,000 INR range) are better than the others in reality?
Those who have used these phones or are tech-savvy might be able to guide me in making the right choice.
Just found out the S3 in India has different specs than in the US, Canada etc.
India: 1.4 GHz quad-core Exynos 4412, Mali-400, 1GB
US: 1.5 GHz dual-core Snapdragon, Adreno 225, 2GB
Which S3 is better?
The US version is better in this case, based on your requirements. You don't multitask much, so you don't need that many cores, and more RAM is better.
Sent from my GT-I9300 using xda app-developers app