[Q] MTK 6752 vs Snapdragon 800

So, it's time to upgrade again. I have two main options: the Elephone P7000 or the OnePlus One. The Elephone has an MTK 6752 at 1.7GHz and 3GB of single-channel 800MHz RAM. The OPO has a Snapdragon 800-series processor with 3GB of dual-channel RAM at 933MHz. In an average, day-to-day scenario, realistically, will there be any noticeable difference in the fluidity of the devices? They're both 4G, too, so will general internet browsing feel about the same speed on both? Thanks for any insights.

I would definitely get the OnePlus One. I've never heard of Elephone. The OnePlus One is a well-known device, so it's better supported by developers, and I suppose it's easier to find repair parts as well. Also, Snapdragon processors generally perform better.
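To put rough numbers on the RAM difference: the dual-channel configuration roughly doubles peak memory bandwidth on top of the higher clock. A back-of-envelope sketch, assuming 32-bit LPDDR3 channels with double data rate (typical for these SoCs, but not confirmed anywhere in this thread); these are theoretical peaks only, real-world throughput is lower:

```python
# Rough theoretical memory bandwidth, assuming 32-bit LPDDR3 channels
# (double data rate, so transfers per second = 2 x clock). Peak figures
# only; real-world throughput is lower.

def peak_bandwidth_gbs(channels, clock_mhz, bus_width_bits=32):
    """Peak bandwidth in GB/s = channels * bus width (bytes) * 2 * clock."""
    transfers_per_sec = 2 * clock_mhz * 1e6      # DDR: two transfers per clock
    bytes_per_transfer = channels * bus_width_bits / 8
    return transfers_per_sec * bytes_per_transfer / 1e9

print(peak_bandwidth_gbs(1, 800))   # Elephone P7000: ~6.4 GB/s
print(peak_bandwidth_gbs(2, 933))   # OnePlus One:    ~14.9 GB/s
```

Whether that shows up as day-to-day fluidity is another matter, but it does give the OPO noticeably more headroom for the GPU and for multitasking.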

Related

Lenovo 'Transformer' IdeaTab S2

So will you guys be swapping your Asus Transformer Prime for a similar product? I'm sure most people purchased the Prime for the extra keyboard dock or the Tegra 3.
EDIT: Personally I'll be sticking with the Asus Prime for now, it's a good device.
Specification:
10.1" IPS display
Qualcomm Snapdragon 8960 (28nm TSMC) dual-core 1.7GHz / Adreno 225 GPU at 400MHz (overclocked Adreno 220 + better drivers)
20-hour battery life (with the keyboard dock)
Keyboard dock like the Asus Transformer
16/32/64GB storage
The GPU is just on par with the Mali-400MP, which is a shame (per GLBenchmark), but those are early benchmarks.
Overclocking headroom should be a lot better for the CPU since it's 28nm; I'd guess reaching over 2.0GHz is fine!
Information:
Lenovo IdeaTab S2
We need to start the review by mentioning that there may be certain ambiguities in the specifications listed here for the Lenovo IdeaTab S2, since this is not the official release. But as prior experience suggests, this information normally turns out to be true, so let us proceed with it. The Lenovo IdeaTab S2 is to have a 10.1-inch IPS display with a resolution of 1280 x 800 pixels, a state-of-the-art screen panel and resolution. It will have a 1.5GHz Qualcomm Snapdragon 8960 dual-core processor with 1GB of RAM. This beast of hardware is controlled by Android OS v4.0 Ice Cream Sandwich, and Lenovo has included a completely modified UI called Mondrian UI for their IdeaTab.
It comes in three storage configurations, 16/32/64GB, with the ability to expand the storage using a microSD card. It features a 5MP rear camera with autofocus, geo-tagging and Assisted GPS, and while the camera isn't outstanding, its performance is decent. The IdeaTab S2 will come with 3G connectivity, not 4G, which certainly is a surprise, and it also has Wi-Fi 802.11 b/g/n for continuous connectivity. They claim this tablet can control a smart TV, so we assume some variation of DLNA is included as well. Following in the footsteps of Asus, the Lenovo IdeaTab S2 also comes with a keyboard dock that adds extra battery life as well as additional ports and an optical trackpad. It's a good concept to replicate from Asus, and we reckon it could be a game changer for the IdeaTab S2.
Lenovo has also made their new tablet rather thin, at a mere 8.69mm, and surprisingly light at 580g. The built-in battery can last up to 9 hours according to Lenovo, and if you hook it up to the keyboard dock, Lenovo promises 20 hours of total battery life, which is a very good move.
Video: http://www.youtube.com/watch?v=vWAOmO4LUIo
I certainly won't be going through the trouble of changing to this. It doesn't really look to add anything of value for me (I don't need GPS and my Wi-Fi works fine), and if Lenovo's past pricing holds true, this will likely be more expensive than the equivalent Primes.
MrPhilo said:
The GPU is just on par with the Mali-400MP, which is a shame (per GLBenchmark), but those are early benchmarks.
That's surprising because of the GFLOPS specs for the GPUs:
Tegra 3 Kal-El: 7.2 GFLOPS
Qualcomm 8960 Adreno 225: 19.2 GFLOPS
PowerVR SGX543MP2: 19.2 GFLOPS
And per AnandTech, "Qualcomm claims that MSM8960 will be able to outperform Apple's A5 in GLBenchmark 2.x at qHD resolutions." Of course, Qualcomm would say that, but even if it is only on par with the iPad 2 (543MP2), it will still significantly outperform the Tegra 3.
Yes, but drivers are the most important factor. And since the Tegra 3 (Kal-El) GPU is clocked higher than 300MHz, the 7.2 GFLOPS figure doesn't really apply.
I doubt it'll significantly outperform the Tegra 3 GPU, just like the Adreno 220 was meant to be better but isn't much different.
Even Qualcomm admitted that it'll only have 50% more performance than the current Adreno 220.
FML, GLBenchmark took down the Asus TF202 entry with its GPU results. It just performed lower than the Mali GPU; wish I'd saved the website.
With Adreno 225 Qualcomm improves performance along two vectors, the first being clock speed. While Adreno 220 (used in the MSM8660) ran at 266MHz, Adreno 225 runs at 400MHz thanks to 28nm. Secondly, Qualcomm tells us Adreno 225 is accompanied by "significant driver improvements". Keeping in mind the sheer amount of compute potential of the Adreno 22x family, it only makes sense that driver improvements could unlock a lot of performance. Qualcomm expects the 225 to be 50% faster than the outgoing 220.
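For a rough sanity check of the numbers being thrown around: theoretical peak shader throughput scales linearly with GPU clock, everything else (ALU count, drivers, memory bandwidth) being equal, which is exactly the caveat being made in this thread. A minimal sketch, assuming the 19.2 GFLOPS figures are quoted at a 300MHz reference clock, as a later post notes:

```python
# Back-of-envelope only: peak FLOPS scale linearly with GPU clock if the
# shader hardware is unchanged. Drivers and bandwidth decide real results.

def scale_gflops(gflops_at_ref, ref_mhz, target_mhz):
    return gflops_at_ref * target_mhz / ref_mhz

print(400 / 266)                      # Adreno 220 -> 225 clock bump: ~1.50x,
                                      # matching the "50% faster" claim on its own
print(scale_gflops(19.2, 300, 400))   # Adreno 225 peak at 400MHz: ~25.6 GFLOPS
```

So the headline claim is basically just the clock bump; any real gain beyond that has to come from the driver work.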
MrPhilo said:
FML, GLBenchmark took down the Asus TF202 entry with its GPU results. It just performed lower than the Mali GPU; wish I'd saved the website.
Yes, I saw that comment posted in another thread and I tried to Google it but could not find it. Hopefully Anandtech will put something out soon once demo units of these newer tablets are available.
I've personally had a lot of headaches in the past with Lenovo laptops, so I doubt I'll be making another Lenovo purchase. (Google "Y530 Lenovo Hinges" if you're interested in the issue; it was a common problem due to a faulty design.)
The PowerVR and Adreno GPUs have much more efficient rendering methods than the Tegra chips, so this tablet is no pushover at all.
I wouldn't be surprised if real-world performance is better than the Tegra 3 outside of Tegra 3-specific apps.
hey
The Adreno 225 and SGX543MP2 both hit 19.2 GFLOPS at 300MHz. We don't know the clock speed of the A5's GPU, but we can speculate it's probably in the 250-300MHz range.
That makes the Adreno (at 400MHz) more powerful in FLOPS than even the A5 or Tegra 3. However, FLOPS don't tell the whole story: the A5 has twice the number of TMUs, so it has a higher fill rate clock-for-clock and better texturing capability.
The A5 will likely have more ROPs as well, but I don't know that for sure.
The A5 will also have slightly higher bandwidth, I think.
Looking at what Anand has said, the Adreno 220 only had single-channel memory (so low bandwidth), and it probably also had poor efficiency in getting data to the shaders; I think PowerVR is more efficient than the Adreno 2xx series.
The drivers on Adreno were not very good either; indeed, some developers on this forum have managed to DOUBLE the Adreno's performance using the newest Adreno drivers from Qualcomm. I think shaky153 was leading the charge on that.
I would be very surprised if the Adreno 225 equaled the A5, but it might equal or slightly beat the Tegra 3, especially at higher resolutions due to the Tegra's lack of bandwidth.
I don't understand why Nvidia doesn't announce the GPU clock speed!! They detailed it with the Tegra 2, which suggests there is something to hide.
The AP25 was 400MHz, so the T3 shouldn't be under 400MHz.
This discussion would be a lot easier if we knew the actual clock speed.
Prime/Nvidia rules!
Plus, Lenovo has had no development support at all, and they are one of the slowest to release firmware updates. Everything is basically dead in Lenovo land.
It seems OK, but there's nothing enticing enough to make me think twice about trading my Prime. The Prime is just too cool all around.

What's next after quad-core?

So in 2011 we had Tegra 2 and in 2012 we have Tegra 3, so my question is: what will come in 2013? Octa-core, or improved versions of quad-core CPUs?
Well, as octa-core desktop CPUs haven't really caught on yet, I would guess just better quad-cores, likely with more powerful GPUs.
Tegra 3 is already very powerful; presumably they will increase RAM, make the chips more battery efficient, or push clock speeds even higher. The 12-core Tegra GPU is pretty amazing already, and anything better must be godly.
Sent from my HTC Desire using xda app-developers app
If you mean for the mobile platform: will we really need to go beyond quad-core? Having seen how smoothly the SGS III runs with one, how much more perfection and speed do you need to do your work (yes, more can always be expected)? Since Android uses the extra cores on an as-needed basis, why would you want to see 2-3 of your cores sitting unused? I think it's mostly curiosity, and wanting the most advanced/latest hardware is the only real reason to have such a high-end CPU in your phone.
What I'd like to see is more RAM installed and lower RAM usage by the system.
Sounds like Octomom... the debate lives on: battery vs performance. But to answer your question, I think it would be hexa-core, which is 6. Let's wait and see what comes.
Sent from my SGH-T989 using Tapatalk 2
s-X-s said:
If you mean for the mobile platform: will we really need to go beyond quad-core? Having seen how smoothly the SGS III runs with one, how much more perfection and speed do you need to do your work (yes, more can always be expected)? Since Android uses the extra cores on an as-needed basis, why would you want to see 2-3 of your cores sitting unused? I think it's mostly curiosity, and wanting the most advanced/latest hardware is the only real reason to have such a high-end CPU in your phone.
What I'd like to see is more RAM installed and lower RAM usage by the system.
I agree. Cores are at their peak right now. The amount of CPU power we have, especially in higher-end phones, is enough to accomplish many, many things. RAM is somewhat of an issue, especially since multitasking is a huge part of Android. I really think 2GB of RAM should become the standard soon. Also, better GPUs won't hurt.
Sent from my HTC T328w using Tapatalk 2
If they decide to keep going on the core upgrade in the next two or so years, I see one of two possibilities happening:
1) Dual Processor phones utilizing either dual or quad cores.
or
2) Hexa-core chips, since there are already a few 6-core chips on the desktop market (though whether they would actually be practical in a phone's architecture, no clue).
Generally speaking, whatever they come out with next will need either better battery materials or lower-power processors.
I mean, I'm pretty amazed by what my brother's HTC One X is capable of with its quad core, and here I am still sporting a single-core G2. But yes, I would like to see more advancement in RAM usage; we've got a nice bit of power, but how about a standard 2GB of RAM for better multitasking?
I believe 2013 will be all about more efficient quad-cores.
May I ask what going from 1GB to 2GB will improve? Loading times?
Hello everyone, could you tell me what a quad core is?
Quad core means that a processor has four processing units (cores).
Because there are more cores, work that can be split across them can, in theory, run up to 4 times faster; in practice the speedup depends on how much of the workload can actually run in parallel.
Read more about it: http://simple.wikipedia.org/wiki/Multi-core_processor
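To put a number on that "in theory" caveat, the usual rule of thumb is Amdahl's law: the serial part of a workload caps the speedup no matter how many cores you add. A small illustrative sketch:

```python
# Amdahl's law: speedup from N cores is limited by the serial fraction
# of the workload. A quad core is only "4x faster" when nothing is serial.

def amdahl_speedup(parallel_fraction, cores):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for p in (1.0, 0.9, 0.5):
    print(f"{int(p * 100)}% parallel on 4 cores: {amdahl_speedup(p, 4):.2f}x")
# 100% parallel: 4.00x, 90%: 3.08x, 50%: 1.60x
```

So a quad core only gets near 4x on embarrassingly parallel work; typical app workloads see much less.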
Maybe an i7 in mobile devices?
I'm sure it will stay at quad-core CPUs; anything more is just overkill. They may introduce hyper-threading. It's going to boil down to efficiency.
Sent from my SPH-D700 using xda premium
I'd say the future lies in more efficient use of processors. Right now, Android is still far from optimized on multi-core processor-equipped devices. Project Butter is the start of a great movement by Google to optimize the operating system. Hopefully it spreads out to other OEMs and becomes the main focus for Android development.
Improving and optimizing current processors is the way hardware companies should go.
In my opinion, processor development is outrunning battery development. Optimized processors could reduce power consumption while preserving excellent speed and usability.
Sent from my Transformer TF101 using Tapatalk 2
Building processors on more efficient ARM architectures is going to be the way to go from what I see. Throwing four less efficient cores at a problem is the caveman method of dealing with it (looking at you, Samsung Exynos quad based on tweaked A9 cores).
The Qualcomm S4 Krait (a custom core roughly in the A15 class) is more efficient on a clock-for-clock, core-for-core basis, and once the software catches up and starts using the hardware to its full capacity, fewer, more efficient cores will be preferred.
I don't see anything beyond quads simply because they haven't even scratched the surface of what can be done with a modern dual-core processor yet. Throwing more cores at it only makes excuses for poor code. I can shoot **** faster than water with a big enough pump, but that doesn't mean it's the better solution.
We don't need more cores! Having more than 2 cores will not make a difference, so quad-cores are a waste of space on the CPU die.
Hyperthreading, duh.
More RAM. You've got to have the hardware before the software can be made to use it.
With the convergence of x86 into the Android core and the streamlining of low-power Atom CPUs, the logical step would be to first optimize the current software base for multi-core processors before marketing takes over with their stupid x2 multiplying game...
Not long ago, a senior Intel exec went on record saying that today a single-core Android smartphone may well perform better overall (battery life, user experience, etc.) than any dual- or quad-core one. Mind you, these guys seldom if ever stick out their necks with such bold statements, especially ones that aren't pleasing to the ear...
For those interested, you can follow this one (of many) articles on the subject: http://www.zdnet.com/blog/hardware/intel-android-not-ready-for-multi-core-cpus/20746
Android needs to mature, and I think it actually is maturing. With 4.1 we see the focus drastically shifted to optimization, UX and performance with *existing/limited* resources. This will translate into devices beating all else in battery life, performance and graphics, but since this was neglected in the first several iterations, we'll likely see 4.0 followed by 4.1 and maybe 4.2 before we see a 5.0 that showcases the maturity and evolution of the experience.
Just my 2c. :fingers-crossed:

Question About Galaxy SIII

I just got the international version of the SIII. I was wondering if I should return it and get the AT&T version, because some people say that it is faster graphics- and memory-wise, and that even though it's dual-core it has a newer processor architecture (A9 vs A15). I want to know which one really has the better graphics performance and per-thread performance, and just overall which one is more powerful in the end. I don't care about network speeds.
CAN YOU PLEASE USE THE DAMN SEARCH FUNCTION BEFORE POSTING?
*calms down*
because some people say that it is faster graphics- and memory-wise
It has more memory (2GB vs 1GB) but a slower graphics engine. Note that "slower" is relative: the speed difference may be negligible in most real-world applications, but synthetic benchmarks do show it.
it has a newer processor architecture (A9 vs A15).
Yes, the US version's CPU is a newer design (Qualcomm's Krait, which is roughly A15-class rather than a literal A15). However, none of the new features (such as virtualization) are even used.
Note that the architecture generation alone doesn't say much about the CPU itself, since these designs are heavily modified from, or custom replacements for, the reference ARM cores.
Also note that the US version is dual-core at ~1.5GHz, while the Exynos (international) version is quad-core clocked at 1.4GHz.
The Exynos is also produced with a newer technology called high-k metal gate, which allows higher clock speeds with less power consumption compared to the older production technology used for the dual-core version.
My best guess is that everything you'll want to do will run just fine on both versions, but the international one definitely has more horsepower.
The US version, with its 2GB of RAM, should be better for multitasking though.
I got the international version, and I think I made a good decision.
Thanks so much. I looked around but couldn't find exactly what I was looking for; there were always people saying the Snapdragon S4 was better, but I didn't know if they just meant battery/data-wise or what. Thanks for the clarification. So far I am loving my international version.

[Q] Which S3 is better? quad core exynos or dual core snapdragon?

I have only used dumb-phones until now and plan to get a smartphone.
I did a little surfing and found that the Adreno 320 is better than the Mali-400 GPU and Qualcomm Snapdragons are better than Samsung Exynos, right?
But in the real world, the Note 2 is considered a better phone than the Nexus 4. I don't understand this.
Different phones top different benchmarks, and some things like display quality aren't captured by benchmarks, I guess, right?
Now, I haven't used any smartphone before, and I would be using it mostly for gaming or watching videos/movies, as I'm a student.
Which of the phones I mentioned (all in the 23,000-29,000 INR range) are better than the others in reality?
Those who have used these phones or are tech-savvy might be able to guide me in making the right choice.
Just found out that the S3 in India has different specs than in the US, Canada, etc.
India: 1.4GHz quad-core Exynos 4412, Mali-400, 1GB
US: 1.5GHz dual-core Snapdragon, Adreno 225, 2GB
Which S3 is better?
The US version is better in this case, based on what your requirements are. You don't multitask much, so you don't need that many cores, and more RAM is better.
Sent from my GT-I9300 using xda app-developers app

My theory/rant about Qualcomm and their Snapdragon 808/810 processors.

So on my thread for the Nexus 6P bootloop fix, @btvolta asked me this question:
btvolta said:
I am still on the previous modified EX Kernel and my phone seems to run just the same as before my BLOD (bootloop of death) experience. How many cores are normally running if not for the BLOD issue?
He had a good question, as many other people were reporting that the 6P was running almost the same, if not even better, with only half the cores enabled.
Below is the reply I gave to him. I decided to post it into this thread, because I would like to know what you guys think about my theory about Qualcomm's chips, and even correct me if I'm wrong, as I would like to understand this situation as accurately as possible. (Although I do ask that those of you who do disagree with me, do it respectfully, and I will treat you the same)
XCnathan32 said:
So me typing this reply ended up in a long rant about my theories about Qualcomm. TL;DR to your question: a stock 6P uses 8 cores, a fixed 6P uses 4; the Snapdragon 810 running 8 cores probably overheats so much that it thermal throttles heavily, resulting in performance only slightly higher than the same processor with 4 cores, which throttles much less.
Trigger warning for anyone about to read this: I harshly bash Qualcomm in this semi-angry rant; if you are a diehard Qualcomm fan, you should probably not read this.
On a stock Nexus 6P, 8 cores are enabled in ARM's big.LITTLE configuration. big.LITTLE is where there is a cluster of power-efficient, slower cores to handle smaller tasks (in the 6P's case, 4 Cortex-A53s running at 1.55GHz), and a cluster of more power-hungry, high-performance cores (for the 6P, 4 Cortex-A57s running at 2GHz).
On a bootlooping 6P, a hardware malfunction in the big cluster causes the bootloop, so this fix remedies the problem by disabling the high-performance big cores.
The stock 6P is supposed to use the Cortex-A57 cores and some of the Cortex-A53 cores for foreground tasks. So you would think that a working phone should have double the performance of a phone with this fix, right? After all, it's using 4 more cores, and those cores are clocked almost 30% higher. The reason that (I think) performance is not noticeably affected is that Qualcomm's Snapdragon 808/810 SoCs are a horrible, rushed project that could have been designed better by a group of monkeys.
Even with 4 cores disabled, my phone can still thermal throttle (for those who don't know, thermal throttling is when CPU/GPU performance is intentionally limited by software to keep temperatures in check) when playing games or even watching YouTube. The big cores run way hotter, and they thermal throttle insanely easily; see this graph here. In 30 seconds, the big cores are already down to 1.8GHz (from 2GHz); in 60 seconds, they're down to 1.4GHz; and in 3 minutes, 3 freaking minutes, the big cores are throttled down to 850MHz, which is less than half of the advertised 2000MHz (a 2.35x drop) and well below even the little cores' 1.55GHz.
So my guess is that the big cores thermal throttle very easily, and their heat output causes the little cores to overheat too, which results in the little cores being throttled along with the big cores. So 4 cores that typically do not thermal throttle are better than 8 that do. Either that, or when the big cores overheat, the device turns off the big cores and only uses the little cores, which is essentially this fix.
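If you want to check this on your own device: core online state and current clocks are exposed through the standard Linux cpufreq sysfs interface. A minimal sketch, assuming the usual /sys/devices/system/cpu layout (run it from an adb or root shell; exact nodes vary by kernel and device):

```python
# Quick check of which cores are online and what they're clocked at,
# via the standard Linux cpufreq sysfs interface. On an Android phone
# this would be run from an adb/root shell; paths can vary by kernel.
import glob
import os

for cpu_dir in sorted(glob.glob("/sys/devices/system/cpu/cpu[0-9]*")):
    cpu = os.path.basename(cpu_dir)
    try:
        # cpu0 often has no "online" node because it can't be hotplugged
        with open(os.path.join(cpu_dir, "online")) as f:
            online = f.read().strip() == "1"
    except FileNotFoundError:
        online = True
    freq_khz = None
    if online:
        try:
            with open(os.path.join(cpu_dir, "cpufreq/scaling_cur_freq")) as f:
                freq_khz = int(f.read().strip())
        except FileNotFoundError:
            pass
    freq = f"{freq_khz // 1000} MHz" if freq_khz else "n/a"
    print(f"{cpu}: {'online' if online else 'offline'}, {freq}")
```

On a BLOD-fixed 6P you'd expect cpu4-cpu7 (the A57 cluster) to show as offline; on a working one you can watch the big cores' clocks sag as things heat up.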
For those of you who think my description of the 808/810 was slightly (extremely) harsh, you're right. However, here's why I was so hard on them: I feel like Qualcomm rushed development of the 808 and 810 to get them into flagship devices. The 808 and 810 were also the first (and last) of its processors to use the TSMC 20nm manufacturing process. So my guess would be that Qualcomm designed the processor around that manufacturing process, and by the time it found out about the poor thermals of the new chips, it was too late to redesign them because they had to be delivered to manufacturers. After all, a "flagship device" can't use a last-gen processor. So the overheating chips were given to manufacturers just so their phones could look better on a spec sheet.
Also, Qualcomm VP McDonough said, "The rumours are rubbish, there was not an overheating problem with the Snapdragon 810 in commercial devices" (source). However, his explanation for the heat issues and benchmarking problems in the early Flex 2 and One M9 was that they weren't final commercial versions of the devices: "Everything you're saying is fair. But we all build pre-released products to find bugs and do performance optimisation. So when pre-released hardware doesn't act like commercial hardware, it's just part of the development process." In that context, performance optimisation most likely means "allow the devices to run hotter than they should before they throttle" (source), which results in problems later down the line (like maybe half of the cores failing, causing a record number of bootloops?)
The whole reason I typed this rant was to express my frustration at how Qualcomm (most likely) caused tens of thousands of people to have devices that performed worse than they should have on paper, and even ended up with broken devices. And I haven't seen many people blame Qualcomm for the bootlooping problem; everyone blames Huawei/LG/Google, while Qualcomm twiddles their thumbs and keeps raking in money from their domination of the mobile SoC market. Now obviously, I'm not 100% sure that Qualcomm is to blame for the bootlooping problems, and no one will probably ever know who caused them. So this is just a theory that I have. But it is awfully suspicious how the same chip has had problems in multiple devices, even when different companies manufactured those devices.
Even if Qualcomm isn't to blame for the bootlooping problems, it is hard to deny that their chips had serious overheating issues. Samsung themselves basically admitted that the 810 had problems: every one of their Galaxy S flagships (at least the US models) had used a Snapdragon processor, until the Galaxy S6, where Samsung opted to use their own Exynos processor instead of the 810, even in the US model.
Please feel free to reply and discuss/argue my points, as I would really like to hear what you guys think about my theory.
Click to expand...
Click to collapse
Well, if what you're saying is true, Qualcomm is the bad guy here. It all points towards an overheating issue with the performance cores, which are designed and made by them. However, I feel that the OEMs who purchase these SoCs should take responsibility for their choice to use them in their devices and step up. If this theory of yours can be proven by extensive testing, a lawsuit should be fairly easy to win, and Qualcomm should be forced to improve their development and testing.
I may be jumping the gun a bit here, but seeing as Qualcomm has a bit of a monopoly on the SoC market, we, the consumers, should stop putting our trust in devices using their chipsets. I've had several devices with a Qualcomm chipset and every single one of them was crap. I've had a Samsung Galaxy S2 (which I hated because of the software Samsung put on it), but the hardware (Exynos) was top-notch at the time.
Ok, that's about all of my two cents. Thanks for the good read btw.
It just happened that Huawei was making the device, and even though Huawei has their own in-house chip, the Huawei brand was not really familiar in the US, so maybe Google wasn't convinced to market a Nexus with a HiSilicon Kirin processor, but they needed to get another Nexus device out that year.
If it had been Samsung making the Nexus device back then, maybe Google would have been OK with a Samsung Exynos chip.
How great would the 6P be IF it could fully utilize the A57 cores? I'm using Franco Kernel, and he has it set up to barely use the big cores. I'm guessing mostly for battery savings, of course, but on a 6P that thus far hasn't had the infamous battery meltdown, having half of the cores (and the most powerful ones) sitting at the lowest frequency 95% of the time is kind of a shame. I'm willing to dust off my pitchfork.
