These are super limited. They were released on the 20th on the HP site at $599, then pulled before they sold many. I am not sure why they did this. Maybe they will relist them when the last wave of HP TouchPads goes on sale.
Aside from it being 1.5 GHz and having 64 GB of space, what differences are there? Think it's a higher-binned CPU? Maybe it will hit 1.9 GHz easily. Perhaps it even has better cooling, since it is clocked higher. I also wonder if the PCB is different. Maybe it has a USB-powered 4G "slot" somewhere on it. That would be a fun mod to give the tablet a USB port with a little soldering. Hopefully they used the same PCB as the 4G version.
I get mine Tuesday. Anyone else here get one and willing to disassemble?
If you are wondering what the hell I am talking about, you can refer to the post on SD that points to a site where 200 were sold.
slickdeals.net/forums/showthread.php?t=3236371
Wow, white would look amazing! Is the digitizer white as well? You're lucky, dude! Does it have 3G?
No, my understanding is that it's the same SoC as the 16/32 GB version. The same thing was going to happen for the HSPA+ AT&T version.
http://forums.precentral.net/hp-touchpad/287396-tps-apq8060-cpu-ment-clocked-1-5ghz.html
This is why OC'ing to 1.5 GHz is virtually risk-free.
jmhalder said:
No, my understanding is that it's the same SoC as the 16/32gb version. The same thing was going to happen for the HSPA+ AT&T version.
http://forums.precentral.net/hp-touchpad/287396-tps-apq8060-cpu-ment-clocked-1-5ghz.html
This is why OC'ing to 1.5ghz is virtually danger free.
1.7 should be virtually risk free as well.
Dual-core Scorpions have a baseline of 1.2 GHz and a max of 1.5 GHz. Anything above 1.5 GHz is a risk.
Wrong; the 1.2 GHz on our CPU is underclocked. 1.5 GHz is normal, and 1.7 GHz is a super easy 200 MHz OC.
tazzmissionx said:
Dual-core Scorpions have a baseline of 1.2 GHz and a max of 1.5 GHz. Anything above 1.5 GHz is a risk.
While that may be true for some chips, it's obviously not true here. No manufacturer will release a chip and use it to its max potential right off the bat. Not only is that dangerous (overheating/melting/high failure rates), it's not good business sense.
All newer smartphones/tablets, especially ones running the Snapdragon chip and the Tegra 2 chip, are ALL underclocked slightly.
In our case, HP UNDERclocked our chip. Normally it sees duty running @ 1.5 GHz (as in the white TP), but they chose to run it at 1.2 GHz in ours. That's why it's deemed safe to "overclock" our TP to 1.5 GHz, and some, like me, clock it @ 1.712.
sanvara said:
1.7 should be virtually risk free as well.
It's possible that HP is binning the CPUs for the 1.5 GHz model
(like how AMD bins its quad-core CPUs: if a chip doesn't make the cut, disable that core and sell it as a tri-core CPU)
paperWastage said:
It's possible that HP is binning the CPUs for the 1.5 GHz model
(like how AMD bins its quad-core CPUs: if a chip doesn't make the cut, disable that core and sell it as a tri-core CPU)
Doubt it; they wouldn't go through all that trouble...
Well, I started using it today. So far I have logging off, dev mode on, and Preware installed. I am not going to change the kernel because, for all I know, it bricks the 64 GB versions. I think I will hold out until overclocking is as simple as an app, like with the HD2.
If anyone has any specific questions or things that I can look up, please ask.
I've got a question - what on earth possesses someone to spend four times as much money to get a device that is a different color and comes with $10 more worth of storage?
;-)
Has anyone tried the kernel from the 64 GB on the 16/32 GB models?
Any official speed-up?
Just so I can say I have one of the super rare white ones. TBH, I tried getting the other ones and was out of luck. I was on vacation Friday-Sunday that weekend. I placed orders on Sunday night, and this was the only one to go through. I was on them the second they were posted on SD. $260 shipped for a tablet that hardware-wise is better than the iPad 2 64 GB is a good deal in my book. Right now it sucks for apps, but I am sure the Android port will fix that within the next 6 months.
Supposedly someone has overclocked it. Go to page 41 of that thread I linked in my first post here. Someone claims to have overclocked it to 1.92, causing it to lock up. 1.7 seems stable. I guess that is no different from the 16 and 32.
Crucible1001 said:
Well, I started using it today. So far I have logging off, dev mode on, and Preware installed. I am not going to change the kernel because, for all I know, it bricks the 64 GB versions. I think I will hold out until overclocking is as simple as an app, like with the HD2.
If anyone has any specific questions or things that I can look up, please ask.
The TouchPads are all the same except for the obvious 64 GB of space and the white shell.
Then what's allowing the 64 GB model to run at the stock clock while the rest run underclocked?
bigsnack said:
Then what's allowing the 64 GB model to run at the stock clock while the rest run underclocked?
The frequency HP let them run at? 1.5 GHz is the stock clock for our processor, which is why everyone can "OC" to it so easily. HP underclocked it for battery/heat reasons.
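Mechanically, "overclocking" back to stock here just means letting the cpufreq governor reach a higher step in the kernel's frequency table. A minimal sketch of that selection logic, with a hypothetical frequency table (a real one is read from the kernel's `scaling_available_frequencies`, in kHz):

```python
# Sketch: choose the highest supported cpufreq step at or below a target.
# The frequency table below is hypothetical; on-device it comes from
# /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies (kHz).

AVAILABLE_KHZ = [192000, 384000, 768000, 1188000, 1512000]  # invented table

def pick_max_freq(target_khz, available=AVAILABLE_KHZ):
    """Return the highest supported step not exceeding target_khz."""
    supported = [f for f in available if f <= target_khz]
    if not supported:
        raise ValueError("target below lowest supported step")
    return max(supported)

# Asking for 1.7 GHz on this stock table still tops out at the 1.5 GHz step:
print(pick_max_freq(1700000))  # -> 1512000
```

This is why an OC kernel is needed for anything past 1.5 GHz: the stock table simply has no higher step to select.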
1.84 GHz ****ers!
Nburnes said:
Frequency HP let them run at? 1.5 GHz is stock clocks for our processor. Which is why everyone can "OC" to it so easily. HP underclocked for battery/heat reasons.
So then it's not like HP programmed anything in the kernel to make the 64 GB run at the stock clock? Dang, y'all.
}{Alienz}{ said:
1.84Ghz ****ers!
Same, just more politely.
You may want to sell this baby. I saw one on eBay that had an ending bid of $820... a couple of days ago there was one that sold for $520.
Related
How high do you think we can clock the processors on the EVO 3D? I recall they are 1.5 GHz chips underclocked to conserve battery life. Think these can hit that magical 2.0? Or at least 1.8?
I could see maybe 1.6, but honestly nothing over 1.4 GHz is worth it... (battery > speed)
And nothing currently requires anything over 1.2 GHz, or 1.5 GHz for that matter, other than people's e-penis.
I'd like to see a 1.4 GHz UV kernel over a 1.8 GHz one-hour battery killer, but I will use and test all of them
sent from anything but an iPhone
nate420 said:
I could see maybe 1.6, but honestly nothing over 1.4 GHz is worth it... (battery > speed)
And nothing currently requires anything over 1.2 GHz, or 1.5 GHz for that matter, other than people's e-penis.
I'd like to see a 1.4 GHz UV kernel over a 1.8 GHz one-hour battery killer, but I will use and test all of them
sent from anything but an iPhone
Well, that's your opinion. I highly doubt overclocking the processor to 1.8 would bring the phone down to one hour of battery life. It's not like it would be constantly running at that speed. I would prefer speed over battery life, as I charge my phone every night and have plenty left over, even overclocked to almost 1.3 on my EVO.
nate420 said:
I could see maybe 1.6, but honestly nothing over 1.4 GHz is worth it... (battery > speed)
And nothing currently requires anything over 1.2 GHz, or 1.5 GHz for that matter, other than people's e-penis.
I'd like to see a 1.4 GHz UV kernel over a 1.8 GHz one-hour battery killer, but I will use and test all of them
sent from anything but an iPhone
I think this is less about practicality and more about pushing our phone to the limits. Overclocking an already fast enough processor on a device which runs for the most part on battery is not needed; however, it is fun and nice to see the benchmarks soar.
I say 1.8-2 GHz
If they're anything like the EVO 4G, then it won't be a very high overclock.
But assuming all are capable of 1.5 GHz, then it would be at least a 400-450 MHz overclock!
freeza said:
If they're anything like the EVO 4G, then it won't be a very high overclock.
But assuming all are capable of 1.5 GHz, then it would be at least a 400-450 MHz overclock!
My G2x was overclocked to 1.6 GHz, and it's only a 1 GHz dual-core phone...
I'd say we could see maybe 1.8 GHz if this phone is really a 1.5 dropped down to 1.2
sent from anything but an iPhone
fmedina2 said:
Well, that's your opinion. I highly doubt overclocking the processor to 1.8 would bring the phone down to one hour of battery life. It's not like it would be constantly running at that speed. I would prefer speed over battery life, as I charge my phone every night and have plenty left over, even overclocked to almost 1.3 on my EVO.
Again, for e-penis and bragging rights on benchmarks, nothing more...
As for saying a 1.8 OC would kill it in an hour, I was joking...
And I bet dollars to donuts you don't see a change in "speed" past 1.6 GHz, other than a hot battery.
Gingerbread can't fully optimize dual cores; it does the job, but until a new OS is out
there's no point ruining a battery for "speed" you won't see
sent from anything but an iPhone
While performance is key, I'd say this phone is well above the bar of expectations for most Android Apps at the current time. I'm more interested in squeezing the most battery life I possibly can via Underclocking. It will be nice to see how far this can be pushed with Two Cores to spread the workload across.
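For rough intuition on why underclocking (and undervolting) pays off so well: dynamic CPU power scales roughly with voltage squared times frequency (P ≈ C·V²·f), so a clock drop that also permits a voltage drop saves disproportionately. A back-of-the-envelope sketch; the voltage figures are invented for illustration, not HTC's actual frequency/voltage table:

```python
# Rough dynamic-power comparison: P ~ C * V^2 * f (C is constant, so it
# cancels in ratios). Voltage numbers below are illustrative guesses only.

def relative_power(volts, freq_ghz):
    """Dynamic power up to a constant factor."""
    return volts * volts * freq_ghz

base = relative_power(1.05, 1.2)   # hypothetical: 1.2 GHz @ 1.05 V
stock = relative_power(1.20, 1.5)  # hypothetical: 1.5 GHz @ 1.20 V

# In this sketch the 25% clock bump costs about 63% more dynamic power:
print(round(stock / base, 2))  # -> 1.63
```

The exact numbers depend on the real voltage table, but the shape of the tradeoff (small speed gain, large power cost) is why shipping at 1.2 GHz is a defensible choice.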
nate420 said:
I could see maybe 1.6, but honestly nothing over 1.4 GHz is worth it... (battery > speed)
And nothing currently requires anything over 1.2 GHz, or 1.5 GHz for that matter, other than people's e-penis.
I'd like to see a 1.4 GHz UV kernel over a 1.8 GHz one-hour battery killer, but I will use and test all of them
sent from anything but an iPhone
BTW, I have the biggest e-penis, lol; it is a googolplex inches
Why are people saying such low numbers? The second-gen Snapdragons can go to what, 1.9? If ours is 1.5 stock dropped down to 1.2, then I think we can at least hit 2.
I'd bet that the chips in these phones will be the ones that were unstable at 1.5 GHz. That's how chip makers do these things: they make them all the same, then those with unstable silicon are sold at a lower clock speed. Not sure I'd expect over 1.5, and that might require higher voltage. Hope I'm wrong. We'll see, I guess.
hdad2 said:
I'd bet that the chips in these phones will be the ones that were unstable at 1.5 GHz. That's how chip makers do these things: they make them all the same, then those with unstable silicon are sold at a lower clock speed. Not sure I'd expect over 1.5, and that might require higher voltage. Hope I'm wrong. We'll see, I guess.
Wrong, too lazy to explain for now.
toxicfumes22 said:
Wrong.....
Hope so!
toxicfumes22 said:
Wrong, too lazy to explain for now.
OK, a little less lazy right now. The way manufacturers choose the speeds for processors is actually simple. In the case of the 3D it IS underclocked. The processor is an asynchronous dual core with clock speeds initially set at 1.5 by Qualcomm, and it is used in the phone Qualcomm produces for developers. It is underclocked by HTC because of the battery problems reported on the 4G and the unnecessary need for 1.5 GHz in a f*ing phone. Manufacturers for the most part do not underclock the CPU. The reason it is set at the level it is, is because that is where it is most stable and efficient and meets the heat-extraction needs (people forget CPUs are just circuits and produce more heat with more voltage). OK, let's back this up, shall we?
That is why I'm too lazy to post this stuff; I have to search up links, because most of this is general knowledge to me. Anyway, the QSD8650 found in the EVO 4G is clocked at 1 GHz and has been posted at a stable 1.3 GHz, I believe, in a recent post. Now, the MSM8660 is posted as a 1.5 GHz CPU, so its overclocking potential is nearer 2 GHz, but I would suspect it to get a little warm (sweaty palms, anyone?), and I wouldn't know how stable it would be either (I don't know phones the best). Why is it underclocked? Because people kept *****ing about how much battery the EVO used, and as technology improves, so does the efficiency of CPUs, so they go with the most recent one and just underclock it. I've seen a comparison graph somewhere by Qualcomm, but I spent about 10 minutes looking for it and couldn't find it; it was really nifty. If someone finds it, please post it; it shows energy vs. clock speed and is very cool.
Anyway, to respond to whoever said that 1.5 GHz is the max and that all manufacturers underclock the CPU based upon the silicon: that is WRONG, wrong, WrOnG (I'm sorry, I don't remember the exact response). It's the heat extraction, and the silicon hurts it because it doesn't let all the heat through, which is one of the reasons your PS3 may have yellow-lighted on you (yes, it's because of the CPU disconnecting from the motherboard, but why do you think that extra heat was generated?).
Sorry this is so long; I got distracted a few times while writing it, so if I messed up or something doesn't make sense, I apologize. Being lazy is really a pain in the ass.
hdad2 said:
I'd bet that the chips in these phones will be the ones that were unstable at 1.5 GHz. That's how chip makers do these things: they make them all the same, then those with unstable silicon are sold at a lower clock speed. Not sure I'd expect over 1.5, and that might require higher voltage. Hope I'm wrong. We'll see, I guess.
That would be the case if this weren't an MSM8660. You're thinking of when AMD makes chips for the HD 6970 and some are found not to be stable at 880 MHz, so they bin them for use in the HD 6950, which runs at 800 MHz. These are actually sold as two separate products. In the case of the processor in the Evo, it's an MSM8660, which is sold by Qualcomm to run at speeds as high as 1.5 GHz. If they wanted to sell chips binned for lower speeds, they'd have to sell them as a different model, since they wouldn't be capable of the 1.5 GHz.
jersey221 said:
Why are people saying such low numbers? The second-gen Snapdragons can go to what, 1.9? If ours is 1.5 stock dropped down to 1.2, then I think we can at least hit 2.
1.9?
No sir, it was 1.19 stable...
Sent from my PC36100 using Tapatalk
donatom3 said:
That would be the case if this weren't an MSM8660. You're thinking of when AMD makes chips for the HD 6970 and some are found not to be stable at 880 MHz, so they bin them for use in the HD 6950, which runs at 800 MHz. These are actually sold as two separate products. In the case of the processor in the Evo, it's an MSM8660, which is sold by Qualcomm to run at speeds as high as 1.5 GHz. If they wanted to sell chips binned for lower speeds, they'd have to sell them as a different model, since they wouldn't be capable of the 1.5 GHz.
Can you explain this to me, please?
toxic and donatom,
Your explanations make perfect sense. So I hope to be wrong. Does Qualcomm sell a processor with that same architecture and a lower advertised clock?
Just seems like they're not gonna throw them away if they are stable at 1.2 or 1.4 but less stable at 1.5+. The 3VO seems like a good way for them to unload those processors.
hdad2 said:
toxic and donatom,
Your explanations make perfect sense. So I hope to be wrong. Does Qualcomm sell a processor with that same architecture and a lower advertised clock?
Just seems like they're not gonna throw them away if they are stable at 1.2 or 1.4 but less stable at 1.5+. The 3VO seems like a good way for them to unload those processors.
To my knowledge, if this happens it gets recycled. But... if this happens a lot, then they need to change their manufacturing process, or the technology isn't there yet. Like now we have the technology to do 64 GB microSD, but why do it when most devices can only do 32 GB? For the companies that do sell them, well... I don't have good words for them; I also don't know of this happening. I can understand that it could be useful for donations to universities or others that could use them at damn-near-free prices, but not resold, even under a different name.
toxicfumes22 said:
Can you explain this to me, please?
Well, in the case of AMD, with many of their chip lines they produce a higher-end chip. The ones that don't fully pass the tests at the higher speed get sold as a different model with a lower clock and voltage.
I have the most experience with the HD 6970 and 6950. They both use the same GPU, but the ones in the 6950 didn't pass AMD's tests at higher speeds, so they are set at a lower clock and voltage than the 6970 (they also have some shaders disabled). They are sold as two different models even though they were made the exact same way with the same silicon. This is not new; chip manufacturers have been doing this for a while.
Think of it this way: I make 100k chips; out of those 100k, I'm going to have a percentage that can't perform at their top speed, so instead of throwing them away I make a different model, underclock it, and still make money on the chips that didn't pass at the higher speed. Sometimes I will sell more of the lower-end model, so I actually have to take some chips that probably would have passed as the higher-end model and sell them at the lower end. In this case the user gets lucky and can unlock their chip to the performance of the higher-priced model.
EDIT: What HTC is doing here is buying a 1.5 GHz chip but purposely underclocking it to save battery, since they figured most users wouldn't see the 0.3 GHz difference but would see the difference in battery life. Again, in video cards you see this, but usually the other way around: a manufacturer such as Asus, Gigabyte, whoever, takes the best of the chips they bought and overclocks them, because some were made even better than the standards set by AMD or Nvidia.
I guess what I'm trying to say here is that ALL these chips should do 1.5 GHz stable without question, unless there isn't enough space inside for the cooling requirements at 1.5 GHz (which I doubt), and most should easily go above 1.6.
Edit again since I just saw this post:
toxicfumes22 said:
To my knowledge, if this happens it gets recycled. But... if this happens a lot, then they need to change their manufacturing process, or the technology isn't there yet. Like now we have the technology to do 64 GB microSD, but why do it when most devices can only do 32 GB? For the companies that do sell them, well... I don't have good words for them; I also don't know of this happening. I can understand that it could be useful for donations to universities or others that could use them at damn-near-free prices, but not resold, even under a different name.
This is something that happens mostly in higher-end processors, because the tolerances at those speeds are less forgiving. No manufacturing process is perfect; you're going to have some chips that won't perform at the very high speeds, and recycling would cost the company and the environment more than simply selling them at lower speeds. These chips are not bad and not defective, just found not to be stable at the highest speeds; they are perfectly fine at the speeds they are being sold at, so why throw them away? If they don't meet the standards at the lower speed, then yes, they would be recycled.
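The binning story above can be sketched numerically. Assuming a made-up yield distribution (real yield numbers are foundry secrets), the sort works like this:

```python
# Sketch of speed binning: every die is made identically, then tested and
# sold at the highest grade it passes. All yield numbers here are invented.
import random

random.seed(42)  # reproducible sketch

def bin_chips(n, grades=(1512000, 1188000)):  # kHz: a 1.5 GHz and a 1.2 GHz bin
    """Test n dies; each die's max stable frequency is drawn at random,
    then the die goes into the fastest bin it can hold (or gets scrapped)."""
    bins = {g: 0 for g in grades}
    scrap = 0
    for _ in range(n):
        max_stable = random.gauss(1550000, 150000)  # invented process spread
        for g in grades:
            if max_stable >= g:
                bins[g] += 1
                break
        else:
            scrap += 1  # not stable even at the lowest grade
    return bins, scrap

bins, scrap = bin_chips(100_000)
# Most dies pass the top bin; most of the rest land in the 1.2 GHz bin,
# and only a small tail is scrapped.
```

This matches the point above: the lower bin is a legitimate product, not a defect pile, and when demand shifts, some top-bin-capable dies get sold in the lower bin anyway.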
I have a question about the 3D's dual core; I'd like more clarification than the vague answers I'm getting by searching this site and Google. I've read that the cores are asynchronous, basically meaning the second core doesn't do much work unless needed, whereas others like the Tegra 2 and Exynos have both cores running (or something similar to that), and that this is affecting the benchmark scores. I also read that one would basically double the 3D's score to get a more accurate reading. Can anyone confirm or further explain this?
Yes, asynchronous is when something operates on another thread while the main thread is still available for other work. This allows for better performance in terms of managing tasks. Now, just because it doesn't score high on a benchmark doesn't mean it isn't going to perform. This also allows for better battery performance.
I haven't slept for the past 12 hours, so if this doesn't help you, just let me know and I will fully elaborate on how the processor operates in the phone. Now time for bed :'(
In short, asynchronous operation means that a process operates independently of other processes.
Think of transferring a file. A separate thread will be utilized for doing so. You will then be able to do other things in the background, such as playing with the UI (Sense, say), since you will be using the main thread. If anything were to happen to the transferring file (such as it failing), you will be able to cancel it, because it is independent, on another thread.
I hope this makes sense, man; kind of tired. Now I'm really going to bed.
Sent from my PC36100 using XDA App
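The file-transfer analogy above maps directly to threads in code. A minimal sketch in plain Python (nothing phone-specific; the chunk count and delay are arbitrary):

```python
# Sketch of the file-transfer analogy: a worker thread does the "transfer"
# while the main thread stays free, and the main thread can cancel it
# at any time via a shared flag.
import threading
import time

cancel = threading.Event()
progress = []

def transfer():
    for chunk in range(5):       # pretend each chunk is a piece of the file
        if cancel.is_set():
            return               # user cancelled from the main thread
        progress.append(chunk)
        time.sleep(0.01)         # simulate I/O per chunk

worker = threading.Thread(target=transfer)
worker.start()
# The main thread is still free here (in a phone UI, this is where Sense runs).
worker.join()
print(len(progress))  # -> 5: all chunks transferred, nothing cancelled
```

Note this is the software sense of "asynchronous" (concurrency); the next post clarifies that for this SoC the term means something different at the hardware level.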
To be more specific, by asynchronous they mean that each core can run at a different clock speed. Core 1 could be at 1.2 GHz while core 2 is at 200 MHz. Most multi-core processors are synchronous, meaning all the cores run at the same speed.
donatom3 said:
To be more specific, by asynchronous they mean that each core can run at a different clock speed. Core 1 could be at 1.2 GHz while core 2 is at 200 MHz. Most multi-core processors are synchronous, meaning all the cores run at the same speed.
^This too
Sent from my PC36100 using XDA App
I was also very curious to learn a little more about the async cores and how they differ from a standard "always-on" dual-core architecture.
The first page/video I found talks about the Snapdragon core specifically.
http://socialtimes.com/dual-core-snapdragon-processor-qualcomm-soundbytes_b49063
From what I've gathered, it comes down to using the second core, and thus more power, only when needed, minimizing voltage and heat to preserve battery life.
The following video goes into similar and slightly deeper detail about the processor found in the EVO 3D specifically. The demo runs a processor benchmark with a visual real-time readout of the two cores. You can briefly see how the two cores trade off the workload between each other. It was previously mentioned somewhere else on this forum, but I believe that by separating a workload between two cores, less power is used across the two than by putting the same workload on a single core. I'm sure someone else will chime in with additional detail. Also, after seeing some of these demos, I'm inclined to think that the processor found in the EVO 3D is actually stable at 1.5 but has been underclocked to 1.2 to conserve battery. Only time spent with it in our hands will tell.
Another demo of the MSM8660 and Adreno 220 GPU found in the EVO 3D. It's crazy to think we've come this far in mobile phone technology.
What occurred to me is how complex community ROMs for such a device may become with the addition of video drivers that may continue to be upgraded and improved (think early video card tweaks for PC). Wondering how easy/difficult it will be to get our hands on them, possibly through extraction from updated stock ROMs.
EDIT: As far as benchmarks are concerned, I blame the inability of today's benchmarking apps to consider async cores or properly utilize them during testing when computing the overall score. Because the current tests are most likely spread across both cores, which favors efficiency, the scores are going to be much lower than the true power and performance the chips can produce. I think of it as putting a horsepower governor on a Ferrari.
Thanks for the explanation, everyone.
The best demonstration is in the first video posted; notice when Charbax looks at the monitor. There in the top right are the frequencies of the two cores, and you'll notice both of them jumping around a lot, independent of each other. Using the cores "on demand," only when needed, ends up saving a lot of battery power but doesn't cost you any performance.
Harfainx said:
The best demonstration is in the first video posted; notice when Charbax looks at the monitor. There in the top right are the frequencies of the two cores, and you'll notice both of them jumping around a lot, independent of each other. Using the cores "on demand," only when needed, ends up saving a lot of battery power but doesn't cost you any performance.
Actually, I was thinking that there could be a performance gain, not just battery savings. Think of it this way: if the manufacturer knows they only have to clock one core up to speed when needed, they can be more aggressive about their timings and have that core clock up faster than a normal dual core would, since they don't have to clock up both cores when only one needs the full speed.
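On a rooted Linux-based device you can watch this directly: each core exposes its own current frequency in sysfs. A small sketch (the sysfs path is the standard Linux cpufreq one; the sample readings are invented):

```python
# Sketch: detect asynchronous clocking by comparing per-core frequencies.
# On-device, each reading would come from
# /sys/devices/system/cpu/cpu<N>/cpufreq/scaling_cur_freq (kHz).

def cores_async(freqs_khz):
    """True if at least two cores report different current frequencies."""
    return len(set(freqs_khz)) > 1

# Invented sample: core 0 ramped up for a foreground task, core 1 idling low.
sample = [1188000, 192000]
print(cores_async(sample))  # -> True
```

On a synchronous design, every sample would show identical values across cores; the jumping, mismatched readings in the video are the asynchronous behavior being described.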
I wonder if the drop to 1.2 GHz also serves to keep heat under control. It might not just be battery savings, maybe the small case of a phone doesn't allow for proper cooling to hit 1.5 safely.
I'd love to see some confirmation that the asynchronous nature of this chipset is what's responsible for the seemingly lackluster benchmarking.
mevensen said:
I wonder if the drop to 1.2 GHz also serves to keep heat under control. It might not just be battery savings, maybe the small case of a phone doesn't allow for proper cooling to hit 1.5 safely.
I'd love to see some confirmation that the asynchronous nature of this chipset is what's responsible for the seemingly lackluster benchmarking.
The "horrible" benchmark scores are simply due to the tests inability to consider async core performance. Wait till the tests are able to take this into consideration.
Sent from my HERO200 using XDA Premium App
RVDigital said:
The "horrible" benchmark scores are simply due to the tests inability to consider async core performance. Wait till the tests are able to take this into consideration.
Sent from my HERO200 using XDA Premium App
Click to expand...
Click to collapse
I went through all of your links, and I didn't see anything that confirms the benches are somehow affected by the asynchronous nature of the chipset. It's not that I don't believe you; I actually had the same theory when the benches first came out. I just don't have any proof or explanation of it. Do you have a link that provides more solid evidence that this is the case?
NVIDIA actually tells a different story (of course)
http://www.intomobile.com/2011/03/24/nvidia-tegra-2-outperforms-qualcomm-dualcore-1015/
AnandTech's article does explain some of the differences
http://www.anandtech.com/show/4144/...gra-2-review-the-first-dual-core-smartphone/4
It appears that Snapdragon (Scorpion) will excel in some tasks (FPU, non-bandwidth-constrained applications) but will fall short in others.
I'm pretty sure none of the benchmark apps have even been updated since the release of the Sensation, so yeah... How could they update the app to use the asynchronous processors if the only phones to use them have only recently been released?
Sent from my zombified gingerbread hero using XDA Premium App
I had the G2x for like 3 days and never got to root it. Poor service where I live. But could the cores be set to specific frequencies independently when rooted, like on computers?
tyarbro13 said:
I had the G2x for like 3 days and never got to root it. Poor service where I live. But could the cores be set to specific frequencies independently when rooted, like on computers?
Yeah, if someone were to develop an app for that. I do not see why not.
Sent from my PC36100 using XDA App
Hmm...
If a program such as Smartbench (which takes advantage of dual cores) is stressing both cores to 1.2 GHz, then regardless of whether both cores are active or not, the bench will be accurate.
I would rather NOT have asynchronous cores, as there would be lag during frequency changes...
Ex:
2 cores running at 500 MHz vs 1 core @ 1 GHz with the other not active.
The 2 cores will produce less heat and use less energy...
Maedhros said:
Hmm...
If a program such as Smartbench (which takes advantage of dual cores) is stressing both cores to 1.2 GHz, then regardless of whether both cores are active or not, the bench will be accurate.
I would rather NOT have asynchronous cores, as there would be lag during frequency changes...
Ex:
2 cores running at 500 MHz vs 1 core @ 1 GHz with the other not active.
The 2 cores will produce less heat and use less energy...
They're dual; it would be better for them to run asynchronously. Not only that, but it is a phone, so there will be no lag between frequency changes. 2 cores running at 500 MHz will perform better than 1 core at 1 GHz.
Sent from my PC36100 using XDA App
tyarbro13 said:
I had the G2x for like 3 days and never got to root it. Poor service where I live. But could the cores be set to specific frequencies independently when rooted, like on computers?
This is something that the hardware needs to be capable of; software can only do so much. As far as I've seen, Tegra isn't capable of it.
I read the AnandTech article, and I came to the conclusion that in everyday tasks you might not see the difference between the two, even while Tegra 2 might bench higher. The main thing people don't talk about is the GPU. The Adreno 220 is a powerhouse GPU; it will probably stand strong when Tegra 3 comes out.
DDiaz007 said:
They're dual; it would be better for them to run asynchronously. Not only that, but it is a phone, so there will be no lag between frequency changes. 2 cores running at 500 MHz will perform better than 1 core at 1 GHz.
Sent from my PC36100 using XDA App
Huh... what are you saying? Sorry, I don't understand... On one hand you say asynchronous is better, and on the other you're saying 2 cores @ 500 will work better?
nkd said:
I read the AnandTech article, and I came to the conclusion that in everyday tasks you might not see the difference between the two, even while Tegra 2 might bench higher. The main thing people don't talk about is the GPU. The Adreno 220 is a powerhouse GPU; it will probably stand strong when Tegra 3 comes out.
What?!?
The Adreno 220 is a horrible GPU. AT BEST it is equal to the GPU in the original SGS.
The reason the benches are so different is that Qualcomm has made NO improvements to the CPU. The Desire HD's CPU is the same as the Sensation's. While... the SGS2 + Tegra have IMPROVED CPUs.
ARM 7 vs ARM 9?
Maedhros said:
Huh... what are you saying? Sorry, I don't understand... On one hand you say asynchronous is better, and on the other you're saying 2 cores @ 500 will work better?
What?!?
The Adreno 220 is a horrible GPU. AT BEST it is equal to the GPU in the original SGS.
The reason the benches are so different is that Qualcomm has made NO improvements to the CPU. The Desire HD's CPU is the same as the Sensation's. While... the SGS2 + Tegra have IMPROVED CPUs.
ARM 7 vs ARM 9?
Dude go back to sleep. You have no clue what you are talking about.
Sent from my PC36100 using XDA Premium App
Simple question: is the 3VO's processor really 1.5 GHz underclocked to 1.2? I had seen this information floating around, but none of my searches have found anything firmly confirming or denying it.
Thanks
That's what I've also heard; however, I still can't find anything to confirm or deny it.
Nobody knows, eh?
Yes it is underclocked.
Appreciate my help? Thank me
DDiaz007 said:
Yes it is underclocked.
Appreciate my help? Thank me
Click to expand...
Click to collapse
Sources????
You can't be serious? This has been discussed and answered dozens of times... Google MSM8660..
Appreciate my help? Thank me
DDiaz007 said:
You can't be serious? This has been discussed and answered dozens of times... Google MSM8660..
Appreciate my help? Thank me
Click to expand...
Click to collapse
That doesn't help; the MSM8660 comes in a 1.2GHz and a 1.5GHz variant.
poweroutlet said:
That doesn't help; the MSM8660 comes in a 1.2GHz and a 1.5GHz variant.
Click to expand...
Click to collapse
........
Appreciate my help? Thank me
It comes in two different factory clocks, which is what you said. One is lower than the other because of manufacturer requests and because 1.5GHz is pointless on a phone. If I were to pull the CPU's supported frequencies, it would say it supports 1512000, which is 1.5GHz. The 8672 comes factory clocked at 1.5GHz... They are all the same SoC, just with different applications, such as one supporting CDMA and another GSM. The ones that come at 1.2GHz are clocked that way because they are used in phones. If it were a tablet or netbook, the clock would be 1.5GHz, which would be the 8672 or 8660.
Rest assured that 1.5GHz is a supported frequency for the 8660...
In the end, they are the same SoC, running the same architecture. There is nothing different between the MSM 8260, 8660 and 8672 (which was cancelled). They are all on the 45nm process as well.
Appreciate my help? Thank me
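For what it's worth, the supported-frequency table mentioned above is exposed by the standard Linux cpufreq interface on rooted devices. A small sketch of reading it (the sample table below is illustrative, not a dump from a real MSM8660; on a device the string would come from sysfs):

```python
# Parse a cpufreq frequency table. On a rooted device this string would come
# from /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_frequencies;
# the sample below is an illustrative MSM8660-style table, not a real dump.
SAMPLE = "192000 384000 768000 1024000 1188000 1512000"

def parse_freqs_ghz(table):
    """cpufreq reports frequencies in kHz; return them in GHz, ascending."""
    return sorted(int(f) / 1_000_000 for f in table.split())

freqs = parse_freqs_ghz(SAMPLE)
print(freqs[-1])  # top supported step: 1.512, the "1512000" the post refers to
```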
DDiaz007 said:
It comes in two different factory clocks, which is what you said. One is lower than the other because of manufacturer requests and because 1.5GHz is pointless on a phone. If I were to pull the CPU's supported frequencies, it would say it supports 1512000, which is 1.5GHz. The 8672 comes factory clocked at 1.5GHz... They are all the same SoC, just with different applications, such as one supporting CDMA and another GSM. The ones that come at 1.2GHz are clocked that way because they are used in phones. If it were a tablet or netbook, the clock would be 1.5GHz
Appreciate my help? Thank me
Click to expand...
Click to collapse
Too bad you can't be sure of that. That MAY be the case, but it may also be the case that the 1.2GHz MSM8660s are the lower-binned chips and the 1.5GHz parts are the higher-binned units. This is done all the time in the CPU world. Someone gave an example here of how AMD sold the Barton 2500+ CPU, which was really just a lower-binned 3200+, a CPU that was far more expensive.
Your point that they are all the same SoC is not relevant. Intel and AMD, for example, have sold many processors that are identical in architecture and every spec down to TDP, where the only difference is the frequency. The higher-binned chips become the higher-specced CPUs and the lower-binned ones become the lower-end ones. This doesn't mean that a lower-binned CPU can't exceed its specification, but it does mean it's likely that the higher-binned CPU can go even higher. In any case, they are certainly not equal.
Just because they are the same SoC does not mean you can assume the 1.2 and 1.5GHz units are the same. That's like assuming the Intel Pentium 4 2.4C and the 3.0C are the same. They are the exact same CPU, same architecture, same cache, FSB, etc., except one is clocked a bit higher and is of a higher bin. The 3.0C was the superior unit (higher bin, better ability to overclock, etc.).
My point is, we don't actually know if Qualcomm is giving us simply downclocked versions of the 1.5, or if our 1.2s are just lower-binned 1.5s. The latter would make more sense for them in terms of profits, so it's not surprising that this is a common practice in the industry.
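The binning argument above can be illustrated with a toy model (the yield distribution here is invented purely for illustration, not Qualcomm data): every die is the same design, but process variation spreads their maximum stable clocks, and the fab sorts them into speed grades.

```python
# Toy model of speed binning: every die is the same design, but process
# variation gives each a different maximum stable frequency. The fab tests
# each die and sells it at the highest speed grade it passes.
# The yield distribution (mean/sigma) is invented for illustration.
import random

random.seed(42)  # reproducible illustration

BINS_MHZ = [1500, 1200]  # try to sell as 1.5GHz first, else 1.2GHz

def bin_die(max_stable_mhz):
    """Assign a die to the highest bin it passes, or None if it fails both."""
    for grade in BINS_MHZ:
        if max_stable_mhz >= grade:
            return grade
    return None  # fails even the lowest bin; scrapped

# Simulated maximum stable clocks for 1000 dies off the line.
dies = [random.gauss(1450, 150) for _ in range(1000)]
grades = [bin_die(d) for d in dies]
counts = {g: grades.count(g) for g in BINS_MHZ}

print(counts)
```

In this toy run, a die sold at 1.2GHz may still be stable well above its rated clock, which is exactly why overclocking headroom varies from one unit to the next.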
poweroutlet said:
Too bad you can't be sure of that. That MAY be the case, but it may also be the case that the 1.2GHz MSM8660s are the lower-binned chips and the 1.5GHz parts are the higher-binned units. This is done all the time in the CPU world. Someone gave an example here of how AMD sold the Barton 2500+ CPU, which was really just a lower-binned 3200+, a CPU that was far more expensive.
Your point that they are all the same SoC is not relevant. Intel and AMD, for example, have sold many processors that are identical in architecture and every spec down to TDP, where the only difference is the frequency. The higher-binned chips become the higher-specced CPUs and the lower-binned ones become the lower-end ones. This doesn't mean that a lower-binned CPU can't exceed its specification, but it does mean it's likely that the higher-binned CPU can go even higher. In any case, they are certainly not equal.
Just because they are the same SoC does not mean you can assume the 1.2 and 1.5GHz units are the same. That's like assuming the Intel Pentium 4 2.4C and the 3.0C are the same. They are the exact same CPU, same architecture, same cache, FSB, etc., except one is clocked a bit higher and is of a higher bin. The 3.0C was the superior unit (higher bin, better ability to overclock, etc.).
My point is, we don't actually know if Qualcomm is giving us simply downclocked versions of the 1.5, or if our 1.2s are just lower-binned 1.5s. The latter would make more sense for them in terms of profits, so it's not surprising that this is a common practice in the industry.
Click to expand...
Click to collapse
I see what you are talking about... I forgot about bins. I knew about them on PCs, but didn't think much of it for smartphones.
Appreciate my help? Thank me
I'm going to say you may be right about the bins. There are some people on here who can't get past 1.5 for the life of them.
Appreciate my help? Thank me
DDiaz007 said:
I see what you are talking about... I forgot about bins. I knew about them on PCs, but didn't think much of it for smartphones.
Appreciate my help? Thank me
Click to expand...
Click to collapse
Yeah, regardless though, our CPUs are already doing 1.8 stable and maybe even higher, that's plenty fast for me so I don't really care if the 1.5s are even better at clocking (well I might care if I start seeing the 1.5 phones breaking 2 Ghz haha).
poweroutlet said:
Yeah, regardless though, our CPUs are already doing 1.8 stable and maybe even higher, that's plenty fast for me so I don't really care if the 1.5s are even better at clocking (well I might care if I start seeing the 1.5 phones breaking 2 Ghz haha).
Click to expand...
Click to collapse
Yea me too
Appreciate my help? Thank me
You've been thanked for reminding me of the bins. Not once did that come to mind.
#fail
Appreciate my help? Thank me
DDiaz007 said:
You've been thanked for reminding me of the bins. Not once did that come to mind.
#fail
Appreciate my help? Thank me
Click to expand...
Click to collapse
No worries man.
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a T.I. OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR 540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the T.I. OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with T.I. to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models to be in their new flagship phone. What are your thoughts?
Zacisblack said:
According to an article today by Android Police, they have strong confirmation that the Nexus Prime/Galaxy will have a T.I. OMAP 4460 SoC (system on a chip) downclocked from 1.5 to 1.2GHz. The OMAP 4460 has the PowerVR 540 GPU, which is what is present in our phones. If this is true, I will probably pass on it. But I did a little research and found out that the T.I. OMAP 4470 SoC is due for late 2011 or early 2012. Perhaps Google/Samsung will work with T.I. to debut this new SoC. The OMAP 4470 has a clock speed of 1.8GHz and contains the PowerVR 544 (more powerful than the iPad 2/iPhone 4S). Surely Google would not want a GPU found in last year's models to be in their new flagship phone. What are your thoughts?
Click to expand...
Click to collapse
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
nope, it's samsung. you can take off your tinfoil hat since that was officially confirmed about a year ago.
op, where did you get that information? it's been stated that it will have an exynos processor, the latest and greatest from samsung. I don't have a source but the whole point of the nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
sageDieu said:
nope, it's samsung. you can take off your tinfoil hat since that was officially confirmed about a year ago.
op, where did you get that information? it's been stated that it will have an exynos processor, the latest and greatest from samsung. I don't have a source but the whole point of the nexus line is to have the best and latest hardware.
Sent from my MIUI SCH-i500
Click to expand...
Click to collapse
Not saying it's 100% but 4/5 Android websites have concluded that the OMAP series is the platform of choice for Google's new OS. No tech blog/website has stated it will have Exynos. And the OMAP 4470 would be more powerful either way. But below, Android Police strongly asserted that the new device will have the OMAP 4460 downclocked to 1.2GHz. But like I said, I'm asking for everyone's thoughts because I can definitely see Google surprising us.
http://www.androidpolice.com/2011/1...eam-sandwich-phone-sorry-prime-is-not-likely/
You can also check Engadget, AndroidCentral, Anandtech, Android Authority,and PhanDroid.
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
Click to expand...
Click to collapse
You could be partially right. Some rumors have suggested that the Prime and Galaxy Nexus are two different devices. What saddens me is that the Galaxy Nexus I-9250 passed through the FCC with GSM only.
The 4460 has a 100MHz GPU clock boost compared to ours. And I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
TheSonicEmerald said:
The 4460 has a 100MHz GPU clock boost compared to ours. And I can't think of any game/app that would need more than that.
Click to expand...
Click to collapse
184MHz, I think -- almost double. Except the Nexus is going to have 2.4 times the pixels of the Fascinate (or 2.22 if you don't count the soft-key area).
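The pixel ratios above are easy to verify, assuming the Fascinate's 800x480 panel against the Galaxy Nexus's 1280x720, with a soft-key strip of roughly 96 pixels (the exact strip height is an assumption here):

```python
# Pixel-count check: Galaxy Nexus (1280x720) vs Fascinate (800x480).
# The 96px soft-key strip height is an assumption for illustration.
fascinate = 800 * 480          # 384,000 pixels
nexus_full = 1280 * 720        # 921,600 pixels
nexus_app = (1280 - 96) * 720  # usable area above the soft-key strip

print(round(nexus_full / fascinate, 2))  # 2.4
print(round(nexus_app / fascinate, 2))   # 2.22
```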
tonu42 said:
Don't believe half the things you read online. For all we know the nexus prime is a Motorola phone.....
Sent from my SCH-I500 using Tapatalk
Click to expand...
Click to collapse
oh tonu, still trying to have conversations about things you know nothing about.
Sent from my Incredible 2 using XDA App
TheSonicEmerald said:
The 4460 has a 100MHz GPU clock boost compared to ours. And I can't think of any game/app that would need more than that.
Sent from my Fascinate with MIUI Gingerbread
Click to expand...
Click to collapse
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Click to expand...
Click to collapse
Hah. Imagine having the PowerVR SGX 543MP4 from the PS vita in the prime. That would run laps around the MP2 XD
Zacisblack said:
Clock speed isn't going to improve graphics. The PowerVR 543MP2 dual core GPU in the A5 chip would still run laps around an overclocked PowerVR540 in terms of speed, throughput and things such as shadows, textures and triangles.
Click to expand...
Click to collapse
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
cherrybombaz said:
I don't understand why google put such a crappy GPU in their flagship phone. They easily could have put the Mali GPU or maybe even the 543MP2. Now I really can't decide between the 4S and the Galaxy Nexus...
Click to expand...
Click to collapse
They probably put it in so they could optimize the software around the hardware. This means the Galaxy Prime will run extremely well with ICS, probably better than some dual-core GPU phones, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter that much; it will be really fast. They've also increased the clock speed from 200MHz to 384MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?". The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably a better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away, but it didn't. I like the way it looks, but the hardware is just lacking and it's not worth my upgrade or $300.
Very well stated. I'm also not all in on the GN. We'll see once I can actually play with one in-store next month.
Sent from my SCH-I500 using XDA Premium App
Zacisblack said:
They probably put it in so they could optimize the software around the hardware. This means the Galaxy Prime will run extremely well with ICS, probably better than some dual-core GPU phones, but it will lack in the gaming department. If you don't really game a lot it shouldn't matter that much; it will be really fast. They've also increased the clock speed from 200MHz to 384MHz, which is almost twice as fast.
I thought about the 4S thing too, but then I realized, "why have all that power if the system makes little use of it?". The only thing it's really good for is gaming, but who wants to do that on a 3.5" screen? At this point, the Nexus is probably a better real-world choice, but if you wait a few more months the GSII HD LTE or the GS3 will be out and will probably be on par with the iPad 3 in terms of hardware. I was hoping the Nexus would blow me away, but it didn't. I like the way it looks, but the hardware is just lacking and it's not worth my upgrade or $300.
Click to expand...
Click to collapse
True. But Infinity Blade 2 looks pretty amazing, and if more developers can take advantage of the 543MP2, that would be great. But you can always wait a few more months and something better will always come out, so I don't think it's a good idea to wait for the GS3 - and it'll take much more than a few months for it to get onto US carriers. I agree that $300 is a bit of a hard pill to swallow, especially when you can get a GSII with better hardware for cheaper.
X10 is garbage! this is outrageous!
Yes really, they got it working. If you want it so bad, try porting it yourself.
Sent from my MB860 using XDA App
cry about it?
if you want it so bad for your phone, learn to port it yourself. until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
dLo GSR said:
cry about it?
if you want it so bad for your phone, learn to port it yourself. until then, since you rely solely on other people's hard work and sweat, shut up and be patient.
Click to expand...
Click to collapse
Oh snap. That was awesome.
Sent from my MB860 using XDA App
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
firefox3 said:
I might start to look into trying to port it this weekend
Sent from my MB860 using XDA App
Click to expand...
Click to collapse
Good news man
Sent from my MB860 using XDA App
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Javi97100 said:
Good news man
Sent from my MB860 using XDA App
Click to expand...
Click to collapse
It's turning out to be harder than I thought... I think no one will get it until official updates come out for other phones.
Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Click to expand...
Click to collapse
So EGL = GPU driver? If that's the only setback, would it be possible to get an ICS ROM with software rendering as a proof of concept, or are there other pieces missing?
GB/CM7 is pretty good on the Atrix, so if we don't see ICS for a few months it doesn't hurt us in any way. I'd like to think most of us can be patient if we lack the skills to help.
I noticed the Captivate got a port of it too, since i9000 ROMs and Captivate ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not on the iPhone already, it will be soon. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
According to AnandTech, Tegra 2 support is essentially ready, so I think as long as Nvidia releases the ICS source (libs?), someone will try to port it. Hell, I have a good 5 weeks during break; I might as well try then.
Sent from my MB860 using XDA App
Azurael said:
Given that there are currently no EGL libs for anything except PowerVR SGX devices under ICS, and that they're closed source and tightly dependent on the kernel, there doesn't seem to be a huge point until the official updates start to hit for a range of devices.
Sure, the Desire, HD, X10 and N1 have ports of a sort at the moment; in fact there shouldn't be too many problems getting them working aside from the graphics drivers, but they're just for fun with the framebuffer driver, given how much of ICS' UI rendering is done with GPU acceleration in mind. You wouldn't want to use it day-to-day. The browser is surprisingly responsive on the Desire though (I'd say more so than GB, despite the software rendering), as is the Market (the new one always lagged really badly for me on the Desire before) - glimmers of hope for ICS' eventual performance on older devices. The keyboard lags like you wouldn't believe though!
The Atrix should fly under 4.0.1 though, if it ever happens - bearing in mind that the SGX 540 in the Galaxy Nexus is pretty much in a dead heat with Tegra 2's GPU, we've got a lower-resolution screen, and we can overclock past its stock speeds.
Click to expand...
Click to collapse
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
edgeicator said:
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Click to expand...
Click to collapse
Buddy, check out any of the kernels available in the dev thread and you'll see that the GPUs are overclocked.
WiredPirate said:
I noticed the Captivate got a port of it too, since i9000 ROMs and Captivate ROMs are interchangeable. I thought it's funny that it's running on the HD, a Windows Mobile 6.5 phone, lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not on the iPhone already, it will be soon. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it; I love my Atrix in its current state.
Click to expand...
Click to collapse
Doubt the iPhone will see ICS; the newest model that can run Android, as far as I know, is the iPhone 3G, which was incredibly slow under Gingerbread.
mac208x said:
X10 is garbage! this is outrageous!
Click to expand...
Click to collapse
222 posts and zero thanks? Is this what you do, go around XDA and post useless threads, like the guy complaining about returning home early, despite nobody asking him to, just "to get MIUI ported on his grandma's phone"?
Are you guys related by any chance?
edgeicator said:
Actually, no: despite being a much older GPU, the SGX 540 found in the GNexus outpaces the Tegra 2 by 7% or 45%, depending on the GLBenchmark test being run, thanks to its higher clock rate. Both GPU tests were done at 720p resolution. Also, you can't overclock the GPU, only the CPU.
Click to expand...
Click to collapse
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
Azurael said:
Depends on the benchmark, yes - texture-heavy rendering tends to perform better on the 540 in the OMAP4460 thanks to its dual-channel memory controller and high clock (and that's probably the part directly relevant to UI rendering, to be honest, though as I said - lower resolution screen), but the Tegra 2 is quite substantially ahead in geometry-heavy rendering (and games on mobiles are starting to move that way now, following the desktop landscape over the past 5 years or so). Averaged out, the performance of the two is very close.
Plus, as I said, the GPU in my phone is running at 400MHz, which ought to even things out in the GLMark 720p tests somewhat, even if they are biased to one architecture or the other. While the GPU in the OMAP4460 may overclock just as well from its stock 400MHz, I'm only really concerned that the phone can run as fast as a stock GNexus, to maybe skip the next generation of mobile hardware and tide it over until Cortex-A15-based SoCs on a 28nm process start to emerge with stronger GPUs. I don't really think I'm CPU-bound with a 1.4GHz dual-core A9 - and increasing the number of equivalent cores without a really substantial boost in GPU horsepower seems worthless right now, even if ICS takes better advantage of SMP (re: disappointing early Tegra 3 benchmarks - although it does seem GLMark stacks the odds against Nvidia GPUs more than other benchmarks?)
Click to expand...
Click to collapse
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
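The GFLOPS figures in the post are consistent with simple peak-throughput arithmetic: FLOPs issued per cycle times clock. The per-cycle throughputs below are commonly cited estimates (12 lanes x 2 FLOPs for Tegra 3's GPU, 64 FLOPs/cycle total for the 543MP2), not vendor-confirmed numbers:

```python
# Peak-GFLOPS arithmetic: FLOPs per cycle * clock.
# Per-cycle figures are commonly cited estimates, not vendor-confirmed:
#   Tegra 3 GPU: 12 lanes * 2 FLOPs (MADD) = 24 FLOPs/cycle
#   SGX543MP2:   2 cores * 32 FLOPs        = 64 FLOPs/cycle

def peak_gflops(flops_per_cycle, clock_mhz):
    return flops_per_cycle * clock_mhz / 1000  # MHz -> GHz

print(peak_gflops(24, 300))  # 7.2  (matches the Tegra 3 figure)
print(peak_gflops(64, 300))  # 19.2 (matches the 543MP2/Adreno 225 figure)
```

Peak numbers like these ignore memory bandwidth and utilization, so they are an upper bound, but they do show where the roughly 2.7x gap in the post comes from.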
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
Click to expand...
Click to collapse
Don't you get tired of writing those long rants? We understand you know something about CPU architecture, and that Tegra isn't the best one out there, but damn man, it's the same thing in every thread. Just chill out and try to stay on topic for once.
Sent from my MB860 using Tapatalk
edgeicator said:
I would expect the Tegra to beat a nearly 5-year-old GPU, but it only does so in triangle throughput. Tegra just uses a very poor architecture in general. Look at how little actual horsepower it can pull. The Tegra 3 GPU pulls 7.2 GFLOPS @300MHz. The iPad GPU and the upcoming Adreno 225 both pull 19.2 GFLOPS at that same clock speed. I honestly have no idea what the engineers are thinking over at Nvidia. It's almost as bad as AMD's latest Bulldozer offerings. It's really more Tegra's shortcomings than GLMark stacking the odds. PowerVR's offerings from 2007 are keeping up with a chip that debuted in 2010/2011. The GeForce just doesn't seem to scale very well at all on mobile platforms. But yeah, all Nvidia did with Tegra 3 was slap in 2 extra cores, clock them higher, throw in the sorely missed NEON instruction set, increase the SIMDs on the GPU by 50% (8 to 12), and then tack on a 5th hidden core to help save power. Tegra 3 stayed with the 40nm process, whereas every other SoC is dropping down to 28nm, with some bringing in a brand-new architecture as well.
Click to expand...
Click to collapse
I think you are not seeing the whole picture...
The Tegra 3 (et al.) is not just about its quad-core implementation; remember that the GPU will offer 12 cores, which will translate into performance not yet seen on any other platform.
Benchmarks don't tell the whole story! Especially those benchmarking tools which are not yet optimized for Tegra 3.
Cheers!
Sent from my Atrix using Tapatalk
WiredPirate said:
I noticed the Captivate got a port of it too since i9000 ROMs and Cap ROMs are interchangeable. I thought its funny that it's running on the HD a Windows Mobile 6.5 phone lol. Let's all try to be patient and we will eventually see it.
Edit: not to mention I'm sure if it's not already it will soon be on iPhone too. It seems like iPhones always get the new Android versions kinda early. I'm not sweating it I love my Atrix in its current state.
Click to expand...
Click to collapse
LOL, I ran all the iDroid ports on my iPhone. Not one was even in alpha stage; I would not even count iDroid as a port since you can't use anything on it.