So is it true that the GPU on the EVO 3D sucks, or is outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says is true about it being worse than the NS4G's GPU, then it's kind of a disappointment in that regard.
I guess I should rephrase one of my questions. I'm asking how it will run the emulators because I saw someone on an SG playing N64oid and it seemed pretty laggy, and if I'm not mistaken that has the same/similar GPU to the NS4G?
tannerw_2010 said:
So is it true that the GPU on the EVO 3D sucks, or is outdated? I've heard some people say it's actually worse than the NS4G's GPU. I want to play some demanding games, so the GPU is important to me. How will it run N64oid and the PSX emulator? I'm coming from the Hero, so there's no question there, but if what everyone says is true about it being worse than the NS4G's GPU, then it's kind of a disappointment in that regard.
The emulators use the CPU, so the Evo 3D will be fine. The PSX emulator runs fine on my 18-month-old Desire.
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now...
[email protected] said:
From everything that I have read, the 3D's GPU is supposed to be one of the best out right now...
Yeah, I've heard that too, so it makes me wonder what's really true. It might tell you something that I heard the GPU isn't very good from the NS boards... but I think I've heard it on these boards too, just not nearly as much.
Look up YouTube videos of the GPU in action. 'Nuff said.
Sent from my Nexus S 4G using XDA Premium App
Maybe this will calm your fears
http://www.youtube.com/watch?v=DhBuMW2f_NM
Here's a better one
http://www.youtube.com/watch?v=Ehfyxvh2W4k&feature=youtube_gdata_player
Sent from my Nexus S 4G using XDA Premium App
The GPU in the Evo 3D should be the best out right now. It's supposed to be up to twice as fast/powerful as Tegra 2. It does appear that some optimizations need to be done to take advantage of this GPU though, hence some of the early, low benchmarks.
The GPU is the fastest right now. No need to speculate; it will be until Tegra 3 comes out, but I think it will still match Tegra 3 in most benchmarks. The SGX540 is good, but the Adreno 220 is faster.
What about the CPU? Is it worse than the Galaxy S CPU or better?
jamhawk said:
What about the CPU? Is it worse than the Galaxy S CPU or better?
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
a5ehren said:
It's probably a touch slower than the CPU in the Galaxy S2, but probably not enough to be important.
Depends if the US Galaxy 2's are going to be Tegra or Exynos
donatom3 said:
Depends if the US Galaxy 2's are going to be Tegra or Exynos
Now that's gonna make all the difference.
Sent from my PC36100 using XDA Premium App
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt there's anything the Samsung processor could do that this one couldn't, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
nkd said:
Well, this CPU has a totally different design. If you look at the videos it is plenty fast; I highly doubt there's anything the Samsung processor could do that this one couldn't, other than bench a little higher. If and when this phone gets ICS it will probably be better off because of the GPU it uses. I believe the GS2 still uses the SGX540, and the Adreno is certainly newer and better. The SGX540 is still one hell of a chip, but the Adreno 220 is actually better.
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other though. Plenty of muscle for any mobile platform either way.
firmbiz94 said:
Actually, the GS2 uses a Mali GPU. I still think the Adreno outclasses it; they both have advantages over each other though. Plenty of muscle for any mobile platform either way.
This thread slightly confuses me. The OP mentions the NS4G in the first post, then we have someone coming in asking about comparisons to the Galaxy S, (S or S2?) and everyone answers about the GS2. Quick stat breakdown to answer whatever question is actually being asked here
Nexus S 4G has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S has:
1.0 GHz single-core Hummingbird CPU
SGX540 GPU
Galaxy S2 (Euro) has:
1.2 GHz dual-core Orion CPU
Mali-400 GPU
Evo 3D has:
1.2 GHz dual-core MSM8660 CPU
Adreno 220 GPU
(Infoz from GSMArena)
The Nexus S and Galaxy S are last generation's phones, so to answer the OP... no, the Evo 3D doesn't have the same GPU/CPU as the NS4G. Not even similar. It's a generation (maybe even two) up. The Evo 4G is slightly slower than the NS4G, and it's running a 1.0 GHz Snapdragon with an Adreno 200 (not even a 205, which is the next in line before the 220).
As for the GS2 Vs. Evo 3D, they're supposed to be on par with each other, with the GS2 maybe being a bit faster, since Qualcomm isn't the best with GPU's. (Personal opinion) However, AFAIK nobody has done any real testing on the Sensation vs the GS2 (same CPU/GPU) so there's no real data backing up that claim... The GS2 DOES have better benchmark scores though, so take that as you will.
Disclaimer: I found all the numbers on the internets. They may be wrong.
You can't really prove anything without concrete proof. There are still no scientific or dedicated performance comparisons covering all the GPUs found in the dual cores.
I say all the posts in this thread are just personal opinions.
The only thing we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI OMAP
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; Cortex-A8 is such old hardware now.
Don't worry too much about the A8 vs A9 thing... the differences are not huge (45 nm vs 40 nm). Also, Qualcomm heavily optimized the Scorpion so that it can actually perform operations that the A9 can't. It will provide plenty of power. I would go into more detail, but that seems to upset some people on other threads.
peacekeeper05 said:
You can't really prove anything without concrete proof. There are still no scientific or dedicated performance comparisons covering all the GPUs found in the dual cores.
I say all the posts in this thread are just personal opinions.
The only thing we can compare now are benchmark results, which are not even that credible.
Benchmarks (AnandTech, Quadrant, etc.):
1. Exynos
2. TI OMAP
3. Tegra 2
4. Qualcomm
5. Hummingbird
Now if only the Qualcomm dual core used Cortex-A9. I wonder why they chose Cortex-A8 instead of A9; Cortex-A8 is such old hardware now.
Almost all of the GPU benchmarks I've seen go like this:
1. Qualcomm
2. TI omap
3. Exynos
4. Tegra 2
5. Hummingbird
Qualcomm uses A8 because they don't use the reference designs from ARM. The Snapdragon outperforms the Cortex-A8 reference by 20-30%, making it pretty close to the A9 reference.
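To put rough numbers on that (napkin math only; the DMIPS/MHz values below are the commonly quoted figures for the reference cores, and the 25% uplift is just the middle of the 20-30% claim above, not official data):
Code:
# Napkin math for the per-clock claim above. Nothing here is official;
# the DMIPS/MHz values are the commonly quoted reference figures.
a8_reference = 2.0               # DMIPS/MHz usually quoted for Cortex-A8
a9_reference = 2.5               # DMIPS/MHz usually quoted for Cortex-A9
scorpion = a8_reference * 1.25   # assume ~25% better per clock than stock A8

print(scorpion)                                # ~2.5 DMIPS/MHz, roughly A9 territory
print(scorpion * 1200, a9_reference * 1200)    # per-core DMIPS at 1.2 GHz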
Related
I was just thinking about something. Is it really a fair comparison between an asynchronous dual core and a conventional dual core such as the Tegra or the OMAP4? We all know how everyone loves to compare benchmarks on phones. Also, we all know that the 3D does horribly on Quadrant scores. Is this because of the type of CPU we have? If it is... is it really fair to even try to compare them?
My thinking is that if both of our cores ran at the same speed all of the time, our CPU would dominate everything on benchmarks. Am I wrong in thinking that? Is there any way we would truly know?
Ps. Hope this isn't dumb thinking. If it is, please just state why and move on. I am NOT trying to start any flame war or troll thread. This is a 100% completely sincere question.
Thanks in advance!
Sent by my supercharged dual core from the 3rd dimension.
Benchmark scores mean **** anyways. I don't know why people insist on using them. If the phone runs well, it runs well
A tad slower, mostly because it's based on a similar ARM Cortex-A8 design. Those other ones, like the Galaxy S2 and other SoCs, are based on the newer Cortex-A9 design. It's been analyzed several times over on AnandTech and other sites. Besides, those benchmarks are not dual core at all, so we are apples to apples; the difference is in the designs. If you compare two CPUs clocked at the same speed (Snapdragon/A8 vs A9), the A9 will come out ahead.
Sent from my PG86100 using XDA App
I understand that benchmarks don't mean anything. I just want to know if the fact that our CPU is asynchronous has anything to do with the exceptionally low scores compared to other devices.
Sent from my PG86100 using XDA App
I'd chalk it up to the fact that the most recent OMAP and Exynos are based on A9 while our scorpion cores are heavily modified A8 designs by qualcomm.
Ours are in between A8 and A9.
Sent from my PG86100 using xda premium
I only briefly understand the difference between A9- and A8-based chips, but I personally think the current Snapdragon in the Shooter (MSM8660?) is a much superior chip to the Tegra 2. I got tired of my OG Evo, so I bought the Shooter off contract from a buddy for cheap and plan to get the Nexus Prime, which I believe will land at Sprint before January (contract up). The rumors are that it will use an OMAP 4460 clocked at 1.5 GHz. Just rumors, I know. But how will that compare to the Snapdragon in terms of speed and battery?
Sent from my PG86100 using Tapatalk
ROM synergy 318 OC 1.8 (2.3.3 base) literally SMOKED the sgs2, was hitting 4000+ with quadrant advanced, but yeah, scores mean nothing. We should have OC again soon, and get nice shiny scores again.
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to any particular performance gains. The companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and an AMD Athlon X2, but they are both based on x86 architecture. There's a lot of confusion regarding the whole A8/A9 terminology, so honestly, I don't think it matters much what ARM revision or whatever our SoC is using in the Evo 3D.
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Hard to really say which processor is more powerful, but at this stage in smartphones all the dual cores seem to be powerful enough that it doesn't matter. Asynchronous vs the other guys may be a different story though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
So to elaborate on what you want, I guess: asynchronous cores have nothing to do with the benchmarks, because these benchmarks are only running on one core anyway (I'm pretty sure).
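If you want to see the asynchronous clocking for yourself, something like this works from a terminal or adb shell. It's a minimal sketch assuming the standard Linux cpufreq sysfs layout that Android kernels expose; paths can differ per ROM and reading them may need root:
Code:
# Minimal sketch: print each core's current clock to watch asynchronous
# scaling. Assumes the usual Linux cpufreq sysfs layout; some ROMs need
# root to read these, and an offline core may have no readable file.
import glob

for path in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq")):
    core = path.split("/")[5]              # e.g. "cpu0", "cpu1"
    try:
        with open(path) as f:
            print(core, int(f.read()) // 1000, "MHz")   # file value is in kHz
    except OSError:
        print(core, "offline or not readable")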
sprinttouch666 said:
Hard to really say which processor is more powerful, but at this stage in smartphones all the dual cores seem to be powerful enough that it doesn't matter. Asynchronous vs the other guys may be a different story though. Asynchronous cores mean each core can run at a different clock speed, so when we get the next version of Android (in October or November) and get to take full advantage of dual-core support, we may have significantly better battery life than them.
Ok. Now, what about performance wise? Will we be at an advantage or disadvantage?
lyon21 said:
Ok. Now, what about performance wise? Will we be at an advantage or disadvantage?
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is:
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
lyon21 said:
What I would really like to know is if the Asynchronous part of it is making a difference in the scores. Does anyone know this? That is the biggest question I have.
Depends... If you are benchmarking with a non-multithreaded app like Quadrant, it doesn't matter, as you're running on a single core on both, and the A9 will be faster. And if you're running a multithreaded benchmark that fully uses both cores, then the "asynchronous" thing goes out of play, as you're using both cores on both devices.
Sent from my PG86100 using XDA App
il Duce said:
ROM synergy 318 OC 1.8 (2.3.3 base) literally SMOKED the sgs2, was hitting 4000+ with quadrant advanced, but yeah, scores mean nothing. We should have OC again soon, and get nice shiny scores again.
Well, then if you overclock an A9 to 1.8 GHz you're back to square one and the A9 is still faster. I think Qualcomm has already announced their roadmap and an A9 killer is on its way. I think it's a quad core with Adreno 3xx (they will also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race.)
Sent from my PG86100 using XDA App
sn0b0ard said:
From what I have been reading, A8, A9, v6, v7 or whatever there is now doesn't really equate to any particular performance gains. The companies license from ARM, or they can create their own SoC based on ARM, so it's kind of like saying there's an Intel Core 2 Duo and an AMD Athlon X2, but they are both based on x86 architecture. There's a lot of confusion regarding the whole A8/A9 terminology, so honestly, I don't think it matters much what ARM revision or whatever our SoC is using in the Evo 3D.
Yes it matters; like your comparison, each chip has new sets of instructions, pipelines and optimizations. Clock for clock, and like the other guy said, our Snapdragons are between an A8 and an A9, and the A9 is simply faster. Ours is an older architecture. By no means a slouch, but it's the truth.
Sent from my PG86100 using XDA App
jamexman said:
Yes it matters; like your comparison, each chip has new sets of instructions, pipelines and optimizations. Clock for clock, and like the other guy said, our Snapdragons are between an A8 and an A9, and the A9 is simply faster. Ours is an older architecture. By no means a slouch, but it's the truth.
Sent from my PG86100 using XDA App
See, here's the thing. Qualcomm doesn't just go stock ARM architecture. They licensed the technology and made their own Snapdragon chipset. Is the Snapdragon chipset family old? Yes, it has been around for a while. Is the chipset that is in the Evo 3D old? Not really. It was developed by Qualcomm relatively recently and expands on their existing, proven QSD chipset. This is like comparing apples to oranges; they are just two different SoCs. If you were to take an absolutely stock Cortex-A9 and put it against an absolutely stock Cortex-A8, then yes, the A9 obviously is going to win, but these companies try to market that their CPUs are one version higher than others, when in all reality they modify the hell out of the ARM architecture to make their chipsets.
sn0b0ard said:
Check this out if you are worried about performance. I think this pretty much sums up how powerful the new Snapdragon chipset is:
http://www.qualcomm.com/blog/2011/04/27/next-gen-snapdragon-dual-core-mdp
Totally off topic Sorrrry!!!
Just followed the link above and WOW!! How can we con Qualcomm into giving us a copy of that home launcher they use, with the live wallpaper as well... HMMMMM
jamexman said:
Well, then if you overclock an A9 to 1.8 GHz you're back to square one and the A9 is still faster. I think Qualcomm has already announced their roadmap and an A9 killer is on its way. I think it's a quad core with Adreno 3xx (they will also have a dual core with an updated architecture to beat the A9, but then ARM is coming up with the A15. Hahaha, the never-ending race.)
Sent from my PG86100 using XDA App
It is much harder to push an A9-based SoC to 1.8 GHz compared to the A8-based MSM8660. Clock for clock, the A9 will be faster; the A9 has greater IPC and a shorter pipeline, but this also prevents the A9 from running at as high a frequency as an A8-based SoC. How many 1.8 GHz Exynos chips do you see? In some regards the MSM8660 clearly beats some A9-based SoCs like the Tegra 2, which even lacks hardware support for NEON instructions. Snapdragons have also traditionally had high floating-point performance.
Also, there is no competition between Qualcomm and ARM. Qualcomm simply licenses designs from ARM and then customizes them for its own needs.
I picked up my Galaxy SII after seeing the disappointing specs on the iPhone 4S. But today I read preliminary benchmarks and it smokes the SII.
Sorry, unable to post a link yet.
How can an 800 MHz CPU beat the SII's 1.2 GHz processor?
I am confused. Am I missing something?
026TB4U said:
I picked up my Galaxy SII after seeing the disappointing specs on the iPhone 4S. But today I read preliminary benchmarks and it smokes the SII.
Sorry, unable to post a link yet.
How can an 800 MHz CPU beat the SII's 1.2 GHz processor?
I am confused. Am I missing something?
Because benchmarks don't mean jack ****.
Look at how Quadrant scores are all over the damned place with no correspondence to actual usability.
It's all about the software. I expect some good gains when moving over to ICS.
Edit, corrected iPhone processor family name.
Trying to benchmark across different operating systems and hardware is not easy to accomplish, but I can tell you that an (Apple A5) A9 800 MHz dual-core Samsung processor is not faster than an (Exynos) A9 1.2 GHz dual-core Samsung processor.
Yes, both phones' processors are made by Samsung.
Sent from my SAMSUNG-SGH-I777 using XDA App
Entropy512 said:
Because benchmarks don't mean jack ****.
Look at how Quadrant scores are all over the damned place with no correspondence to actual usability.
+1 10 char
dayv said:
Trying to benchmark across different operating systems and hardware is not easy to accomplish, but I can tell you that an A5 800 MHz dual-core Samsung processor is not faster than an A9 1.2 GHz dual-core Samsung processor.
Yes, both phones' processors are made by Samsung.
Sent from my SAMSUNG-SGH-I777 using XDA App
This is true, but your wording is a bit confusing. An "Apple A5" processor is a dual-core A9 processor with a PowerVR 543MP2 GPU. A Cortex-A5 processor is an ARM core made for ultra low power. Basically both the Apple A5 and the Exynos have the same processor architecture, but there are many other factors that can influence performance, like the GPU, memory, cache, decoders, etc. In this case I think the main discrepancy will be the software, which is so different between the two.
footballrunner800 said:
This is true, but your wording is a bit confusing. An "Apple A5" processor is a dual-core A9 processor with a PowerVR 543MP2 GPU. A Cortex-A5 processor is an ARM core made for ultra low power. Basically both the Apple A5 and the Exynos have the same processor architecture, but there are many other factors that can influence performance, like the GPU, memory, cache, decoders, etc. In this case I think the main discrepancy will be the software, which is so different between the two.
I did not doubt that both processors were of the same type and architecture, but I did not realize that Apple A5 was just an Apple brand and that both processors were A9. Both are still Samsung-made processors, one clocked at 800 MHz and one clocked at 1.2 GHz.
Thank you for the correction
Sent from my SAMSUNG-SGH-I777 using XDA App
The iPhone is probably utilizing the processor to its full extent, while Gingerbread (and Android in general) does a terrible job of utilizing the power of the hardware.
ICS should see a nice performance increase on dual cores.
The OP is probably referring to the benchmarks for gaming. It's not the processor that's lacking on the GS2. If the iPhone 4S does come with the same A5 as the iPad 2, its GPU will smoke the Mali-400 in the GS2 in almost every benchmark test (in most benchmarks, it is twice as fast as the Mali-400). Just check out the review of the international GS2 by AnandTech.com, with benchmark comparisons of the GS2 vs the iPad 2 and other smartphones. It is not the Quadrant or Linpack benchmarks but rather the professional benchmarks measuring fill rates and triangle throughputs, etc.
Processor performance wise, it is probably a wash because both are based on the same ARM design.
Although I do agree that benchmarks are just benchmarks, I am still surprised.
Is it true that Gingerbread only utilizes one cpu? And will Ice Cream Sandwich utilize both?
And BTW, I am by no means an Apple fanboy. I had been waiting for this phone to come out to replace my dinosaur BB 9000, so I wouldn't have to get an iPhone and deal with iTunes.
iOS5 > gingerbread. Sad but true.
Hope ICS comes out soon. It seems to be on par from what I hear.
Sent from my Galaxy S II using Tapatalk
I think I saw the benchmark in question - it was a GPU-heavy benchmark for a workload that most users won't experience 99% of the time. (It was a GPU-bound OpenGL benchmark. The GPU of the iPhone 4S IS faster than ours for 3D work - but unless you do lots of 3D gaming, it's wasted. Also, 3D is kind of a waste on a 3.5" screen.)
Apple has an extremely long history of misleading the public with selective benchmarking. Back in the Pentium II or III days, they claimed one of their machines was twice as fast as an Intel machine clocked at least 50% higher. While I agree that MHz isn't everything, there's a limit to that. In that case, on a single Photoshop benchmark that was optimized for PowerPC by using AltiVec and running non-optimized on the Intel chip (despite an optimized MMX or SSE implementation being available), the Apple did better - and Apple used that to try and make users believe the machine was twice as fast for all workloads.
026TB4U said:
Is it true that Gingerbread only utilizes one cpu? And will Ice Cream Sandwich utilize both?
It is true.
I guess the benchmarking was for JavaScript using the Safari browser, so it's Apple vs. oranges. Also, they are two completely different OSes. Let's run Quadrant, if it's available for iOS, and see how it handles it. In the meantime, enjoy the best and fastest smartphone currently on the market no matter what others say.
Sent from my SAMSUNG-SGH-I777 using xda premium
It could be ten times faster than a GII, but it still has a 3.5" screen, and I-jail. My wife and kids have Iphone 4's and there is no way I would trade no matter how fast this new one is.
aintwaven said:
It could be ten times faster than a GII, but it still has a 3.5" screen, and I-jail. My wife and kids have Iphone 4's and there is no way I would trade no matter how fast this new one is.
Except for the wife and kids part(I have neither) this. Very much this.
Just ran the SunSpider Javascript on CM7.1. Results seem to be quite a bit better than the ones I see posted on AnandTech. Obviously they were running the GS2 stock but I was surprised to see my numbers so low. Also did the GLBenchmark and while the Egypt was slower, the Pro was faster on CM7.1. Coin flip to me it seems...
Those are just plain synthetic benchmarks; what do they mean for real-life usage? Not a damn thing.
You think all the fashionistas buying the iPhone 4S are going to care how fast their CPUs are?
footballrunner800 said:
It's all about the software. I expect some good gains when moving over to ICS.
That's the problem with Android; it's always "wait for the next version of the software, it'll be better then." How about making a good version now?
Sent from my SAMSUNG-SGH-I777 using Tapatalk
arctia said:
iOS5 > gingerbread. Sad but true.
Hope ICS comes out soon. It seems to be on par from what I hear.
Sent from my Galaxy S II using Tapatalk
Are you high and drunk?? As far as I'm aware, iOS5 is just playing catch up to Android. There isn't one feature that they implemented that hasn't already been introduced in Android since the Froyo days.
http://www.youtube.com/watch?v=FUEG7kQegSA&feature=share
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy Nexus will have a TI OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2 GHz. The OMAP 4460 has the PowerVR SGX540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the TI OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with TI to debut this new SoC. The OMAP 4470 has a clock speed of 1.8 GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models to be in their new flagship phone. What are your thoughts?
Zacisblack said:
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy Nexus will have a TI OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2 GHz. The OMAP 4460 has the PowerVR SGX540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the TI OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with TI to debut this new SoC. The OMAP 4470 has a clock speed of 1.8 GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models to be in their new flagship phone. What are your thoughts?
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
Nope, it's Samsung. You can take off your tinfoil hat since that was officially confirmed about a year ago.
OP, where did you get that information? It's been stated that it will have an Exynos processor, the latest and greatest from Samsung. I don't have a source, but the whole point of the Nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
sageDieu said:
Nope, it's Samsung. You can take off your tinfoil hat since that was officially confirmed about a year ago.
OP, where did you get that information? It's been stated that it will have an Exynos processor, the latest and greatest from Samsung. I don't have a source, but the whole point of the Nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
Not saying it's 100% but 4/5 Android websites have concluded that the OMAP series is the platform of choice for Google's new OS. No tech blog/website has stated it will have Exynos. And the OMAP 4470 would be more powerful either way. But below, Android Police strongly asserted that the new device will have the OMAP 4460 downclocked to 1.2GHz. But like I said, I'm asking for everyone's thoughts because I can definitely see Google surprising us.
http://www.androidpolice.com/2011/1...eam-sandwich-phone-sorry-prime-is-not-likely/
You can also check Engadget, AndroidCentral, Anandtech, Android Authority,and PhanDroid.
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
You could be partially right. Some rumors have suggested that the Prime and Galaxy Nexus are two different devices. What saddens me is that the Galaxy Nexus I-9250 passed through the FCC with GSM only.
The 4460 has a 100 MHz boost in terms of GPU clock compared to ours, and I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
TheSonicEmerald said:
The 4460 has a 100 MHz boost in terms of GPU clock compared to ours, and I can't think of any game/app that would need more than that.
184 MHz, I think -- almost double. Except the Nexus is going to have 2.4 times the pixels of the Fascinate (or 2.22 if you don't count the soft key area).
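The pixel math behind those two ratios, for anyone curious (the 1184-line figure assumes the on-screen nav bar takes 96 of the 1280 vertical pixels):
Code:
# Where the 2.4x / 2.22x figures come from. The 1184-line number assumes
# the nav bar eats 96 of the Galaxy Nexus's 1280 vertical pixels.
fascinate  = 800 * 480     # WVGA panel
nexus_full = 1280 * 720    # Galaxy Nexus panel, soft keys included
nexus_app  = 1184 * 720    # usable area with the nav bar on screen

print(nexus_full / fascinate)   # 2.4
print(nexus_app / fascinate)    # ~2.22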
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
oh tonu, still trying to have conversations about things you know nothing about.
Sent from my Incredible 2 using XDA App
TheSonicEmerald said:
The 4460 has a 100 MHz boost in terms of GPU clock compared to ours, and I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Hah. Imagine having the PowerVR SGX 543MP4 from the PS vita in the prime. That would run laps around the MP2 XD
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
cherrybombaz said:
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
They probably put it in to work around the hardware. This means that the Galaxy Prime will run extremely well with ICS, probably better than some dual-core GPU phones, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter that much; it will be really fast. They've also increased the clock speed from 200 MHz to 384 MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?" The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably a better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away but it didn't. I like the way it looks, but the hardware is just lacking and it's not worth my upgrade or $300.
Very well stated. I'm also not all in on the GN. We'll see once I can actually play with one in a store next month.
Sent from my SCH-I500 using XDA Premium App
Zacisblack said:
They probably put it in to work around the hardware. This means that the Galaxy Prime will run extremely well with ICS, probably better than some dual-core GPU phones, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter that much; it will be really fast. They've also increased the clock speed from 200 MHz to 384 MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?" The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably a better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away but it didn't. I like the way it looks, but the hardware is just lacking and it's not worth my upgrade or $300.
True. But Infinity Blade 2 looks pretty amazing, and if more developers can take advantage of the 543MP2, that would be great. But you can always wait a few more months and something better will always come out, so I don't think it's a good idea to wait for the GS3 - and it'll take much more than a few months to get onto US carriers. I agree that $300 is a bit of a hard pill to swallow, especially when you can get a GSII with better hardware for cheaper.
X10 is garbage! this is outrageous!
Yes, really, they got it working. If you want it so bad, try porting it yourself.
Sent from my MB860 using XDA App
Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
dLo GSR said:
Cry about it?
If you want it so bad for your phone, learn to port it yourself. Until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
Oh snap. That was awesome.
Sent from my MB860 using XDA App
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
firefox3 said:
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
Good news man
Sent from my MB860 using XDA App
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Javi97100 said:
Good news man
Sent from my MB860 using XDA App
It's turning out to be harder than I thought... I think no one will get it until official updates come out for other phones.
Azurael said:
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
So EGL = GPU driver? If that's the only setback, would it be possible to get an ICS ROM with software rendering as a proof of concept, or are there other pieces missing?
GB/CM7 is pretty good on the Atrix; if we don't see ICS for a few months it doesn't hurt us in any way. I'd like to think most of us can be patient if we lack the skills to help.
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I think it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
According to AnandTech, Tegra 2 support is essentially ready, so I think as long as Nvidia releases the source for ICS (libs?), someone will try to port it. Hell, I have a good 5 weeks during break, I might as well try then.
Sent from my MB860 using XDA App
Azurael said:
Being that there are currently no EGL libs for anything except PowerVR SGX devices under ICS yet, and they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment - in fact there shouldn't be too many problems getting them working aside from the graphics drivers - but they're just for fun with the framebuffer driver, given how much of ICS's UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS's eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
edgeicator said:
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Buddy, check out any of the kernels available in the dev thread and you'll see that the GPUs are overclocked.
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I think it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
Doubt the iPhone will see ICS, the newest model that can run android as far as I know is the iPhone 3G, which was incredibly slow under Gingerbread.
mac208x said:
X10 is garbage! this is outrageous!
222 posts and zero thanks? Is this what you do, go around XDA and post useless threads, like the guy complaining about returning home early, despite nobody asking him to, "to get MIUI ported on his grandma's phone"?
Are you guys related by any chance?
edgeicator said:
Actually, no, despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 due to its higher clock rate by 7% or 45% depending on the GLBenchmark being run. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400 MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock clock, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-performance bound with a 1.4 GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
Azurael said:
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower-resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400 MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock clock, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-performance bound with a 1.4 GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
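For anyone wondering where figures like 7.2 and 19.2 GFLOPS come from, it's just lanes x ops x clock, counting a multiply-add as 2 FLOPs. Treat the lane counts below as napkin math: 12 is the commonly cited Tegra 3 figure and 32 is simply back-solved from the quoted 19.2 GFLOPS, not an official spec:
Code:
# Napkin math behind the GFLOPS figures above. A multiply-add (MAD) counts
# as 2 FLOPs; 12 lanes is the commonly cited Tegra 3 ULP GeForce count,
# and 32 is back-solved from the quoted 19.2 GFLOPS @ 300 MHz.
def gflops(mad_lanes, clock_ghz):
    return mad_lanes * 2 * clock_ghz    # 2 FLOPs per MAD, per lane, per clock

print(gflops(12, 0.3))   # 7.2  -> Tegra 3 GPU at 300 MHz
print(gflops(32, 0.3))   # 19.2 -> SGX543MP2 / Adreno 225 class at 300 MHz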
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
Don't you get tired of writing those long rants? We understand you know something about CPU architecture, and that Tegra isn't the best one out there, but damn man, it's the same thing in every thread. Just chill out and try to stay on topic for once.
Sent from my MB860 using Tapatalk
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull: the Tegra 3 GPU pulls 7.2 GFLOPS @ 300 MHz, while the iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
I think you are not seeing the whole picture...
The Tegra 3 (et al.) is not just about its quad-core implementation; remember that the GPU will offer 12 cores, which will translate into performance not yet seen on any other platform.
Benchmarks don't tell the whole story! Especially those benchmarking tools which are not Tegra 3-optimized yet.
Cheers!
Sent from my Atrix using Tapatalk
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Cap ROMs are interchangeable. I think it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already, it will soon be on the iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
LOL, I ran all the iDroid ports on my iPhone. Not one was even in alpha stage; I would not even count iDroid as a port since you can't use anything on it.
I need your suggestions. Can anyone please help me understand which is the best processor out of the Exynos 4412 Quad, Nvidia Tegra 3 and Snapdragon S4 dual core, and why?
Please tell me. That will be very helpful to me.
From benchmarks, the Exynos CPU was quite a bit better than the other two, and the Mali GPU in the S3 also out-performed the others as far as I can remember. Search for some benchmarks comparing them to find out for yourself.
It should go this way:
Processing power: Exynos 4412 Quad > Qualcomm S4 Krait > Nvidia Tegra 3 Quad
GPU power: Mali-400 GPU > Adreno 225 >= ULP GeForce
But I read somewhere that the S4 Krait CPU, which is in the same class as ARM's Cortex-A15 chips, could offer more power without consuming as much energy as the two quad-core beasts.
My first thought when I heard about Nvidia's 4+1 CPU was: how can it decide when to switch from single to quad core? This sounds to me like a prototype for a constantly lagging device.
But I'm not deep enough into this matter to make a qualified statement.
It is just a feeling, since neither Intel, AMD, Qualcomm nor Samsung build their CPUs like this.
Sent from my GT-I9100 using XDA
Coming off a Tegra 2 device and patiently waiting for the Verizon version of this phone, all I can say is Tegra is terrible. At least on my phone it was, heating up on simple tasks like browsing the home screen.
harise100 said:
My first thought when I heard about Nvidia's 4+1 CPU was: how can it decide when to switch from single to quad core? This sounds to me like a prototype for a constantly lagging device.
But I'm not deep enough into this matter to make a qualified statement.
It is just a feeling, since neither Intel, AMD, Qualcomm nor Samsung build their CPUs like this.
Sent from my GT-I9100 using XDA
It actually works very well, the standby time on this phone is the best I've ever seen. It's needed though, because this chip is thirsty. Whether that's down to poor drivers or the design I don't know. Maybe a bit of both. Anyway I like Tegra 3, it IS very fast and you have those Tegra 3 games. Just look at Dark Meadow, the graphics are amazing and it runs smooth as hell.
Sent from my HTC One X using xda premium
I can't imagine how this will ever work without occasional lags.
How does the task scheduler on a Tegra 3 predict when to activate the 4 cores?
Starting an app and waiting to see whether it will need more power will lead to a lag when it maxes out the single core.
It's not 4+1 but rather 1+4.
Sent from my GT-I9100 using XDA
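Nobody outside Nvidia knows the exact heuristic, but load-based core hotplugging generally looks something like the toy sketch below. The thresholds, sampling window and core count are completely made up, just to show why a sudden spike can run on fewer cores for one sampling interval before more come online:
Code:
# Hypothetical, simplified hotplug governor -- NOT Nvidia's actual algorithm.
# Thresholds, window and core count are invented purely for illustration.
import random, time

cores_online = 1                       # start on a single low-power core
UP, DOWN, MAX_CORES = 80, 30, 4        # % load thresholds and core limit
SAMPLE_MS = 50                         # governor re-evaluates every 50 ms

for _ in range(20):
    load = random.randint(0, 100)      # stand-in for a real measured load
    if load > UP and cores_online < MAX_CORES:
        cores_online += 1              # bring another core online
    elif load < DOWN and cores_online > 1:
        cores_online -= 1              # drop back toward a single core
    print(f"load {load:3d}% -> {cores_online} core(s) online")
    time.sleep(SAMPLE_MS / 1000)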
harise100 said:
I can't imagine how this will ever work without occasional lags.
How does the task scheduler on a Tegra 3 predict when to activate the 4 cores?
Starting an app and waiting to see whether it will need more power will lead to a lag when it maxes out the single core.
It's not 4+1 but rather 1+4.
Sent from my GT-I9100 using XDA
I have no idea how it works, as the fifth core is handled directly by the soc and not the system. Maybe someone more knowledgeable than me can shed some light on this. I haven't encountered any noticeable lags compared to my SII though.
Sent from my HTC One X using xda premium
Finally, the search tool works. Anyway, thanks for clearing up my doubts about the differences between the two quad cores.