Hey guys, so I was planning to get the Tegra Note tablet, but reading up I see the Snapdragon 800 CPU is pretty good too.
Which one do I pick?! I have around the £200-300 mark, and I can't find any Snapdragon 800 tablets out there. I'm a PC gamer looking for a kind of gaming tablet. Any help would be amazing.
Thanks, 181jenkins
I'd go for the Tegra 4 if you're a gamer.
Please Don't Forget To Press The Thanks Button!! :thumbup:
Either one will be fine... The Snapdragon 800 is a beast too.
baileyjr said:
Either one will be fine... The Snapdragon 800 is a beast too.
Gahh, I'm a performance guy, I like to have the highest benchmark scores; that's why I was looking at the Snapdragon 800. But the Tegra 4 has 72 GPU cores... I hate looking for tablets, lol.
The Tegra 4 has fewer games than the Snapdragon.
They are about the same in performance,
but going with the S800 would be a wise choice due to higher dev support.
Go for the Galaxy Note 3; it has an S800 CPU with 3GB of RAM.
Sent from my IM-A850K using Vegaviet App
baileyjr said:
Either one will be fine... The Snapdragon 800 is a beast too.
adeelraj said:
The Tegra 4 has fewer games than the Snapdragon. They are about the same in performance, but going with the S800 would be a wise choice due to higher dev support. Go for the Galaxy Note 3; it has an S800 CPU with 3GB of RAM.
Fewer games? How come? I thought the Tegra 4 would have more "graphics" than the Snapdragon?
I'm looking for a tablet in the £200-300 range.
Thanks, 181jenkins
181jenkins said:
Fewer games? How come? I thought the Tegra 4 would have more "graphics" than the Snapdragon? I'm looking for a tablet in the £200-300 range.
I would like to suggest a tab:
Try Google Nexus 7 (2013)
Please Don't Forget To Press The Thanks Button!! :thumbup:
adi160699 said:
I would like to suggest a tab:
Try Google Nexus 7 (2013)
I have the Nexus 7 2013 version, and I want to move away from it. I mean, I went for the Nexus 7 2012 version for the performance, and it seems they are moving away from that and towards the "screen resolution" side of things. I don't really care about the resolution; the higher it is, the more FPS it takes away from performance.
Thanks, 181jenkins.
Consider the Tegra Note or Shield
181jenkins said:
I have the Nexus 7 2013 version, and I want to move away from it...
I am waiting on a Tegra Note for $199 (~£150, I suspect). Frame rates should be nearly 2x those of the Nexus 7 2013, which has 2.5x the number of pixels and a weaker GPU. The HP Slate 7 Extreme may be the first variant available. It also has computational photography features (real-time HDR, track-focus, and slow motion with a 5MP camera), very good sound, and a highly accurate stylus. Recent builds have shown AnTuTu scores of >36K (see the Chinese review on YouTube), versus ~20K for the highest N7 2013 scores. And yes, that is at least as high as the Sony Xperia Z Ultra phablet, which uses an S800.
The S800 in theory has a stronger GPU but a weaker CPU. The real picture is clouded by the fact that certain vendors (see arstechnica.com_gadgets_2013_10_galaxy-note-3s-benchmarking-adjustments-inflate-scores-by-up-to-20 - replace underscores with slashes) have gamed the benchmarks when using the S800. The T4 has generally shown higher benchmarks than the original reference builds (e.g. AnTuTu on my Shield is 41.5K vs 36.5K), whereas the S800 has yet to beat its reference design performance. Of course, the reference phone build was about a half-inch thick, so that might have something to do with it...
If you want ultimate FPS performance, get the Shield. It actually has a (very quiet) fan, so the device can run at close to 100% capacity for hours - something no phone or tablet can do - and there is no benchmark gaming, as confirmed by Ars Technica. The Tegra Note would be a good second choice. After that, things get murkier, with the Sony S800-based Xperia Z Ultra being a decent choice. However, its stylus looks quite inferior to the Note's (no pressure sensitivity, for example), and the higher PPI will translate into lower FPS. The camera is 8MP vs 5MP for the Note, but the Z doesn't have the same real-time capabilities, AFAIK.
Some people claim that the T4 is a "power hog", and Toshiba's poorly engineered T4 tablet heat sink hasn't helped this view. But Laptop Magazine has shown the Shield running a 4.2W TDP for the T4 at 1.9 GHz under hours of heavy use, which compares favorably to the S800 design TDP (see fudzilla.com_home_item_31532-qualcomm-aims-at-25-to-3w-tdp-for-phones).
Hope that helps!
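The pixel-count argument above can be sketched as a quick back-of-the-envelope estimate. This is only an illustration: the resolutions are the nominal panel specs for each device, and the `throughput_ratio` parameter is a placeholder you would fill in with real GPU numbers.

```python
# Fill-rate intuition: at equal GPU throughput, frame rate scales
# roughly inversely with the number of pixels being driven.

def pixels(width, height):
    return width * height

def relative_fps(pixels_a, pixels_b, throughput_ratio=1.0):
    """Estimated FPS of device A relative to device B, assuming
    fill-rate-bound rendering. throughput_ratio is GPU A's throughput
    divided by GPU B's (1.0 = identical GPUs)."""
    return throughput_ratio * pixels_b / pixels_a

n7_2013 = pixels(1920, 1200)    # Nexus 7 (2013) panel
tegra_note = pixels(1280, 800)  # Tegra Note 7 panel

print(f"Pixel ratio (N7 2013 / Tegra Note): {n7_2013 / tegra_note:.2f}x")
print(f"Tegra Note FPS vs N7 2013 (equal GPUs): "
      f"{relative_fps(tegra_note, n7_2013):.2f}x")
```

Real games are not purely fill-rate-bound, so this is an upper bound on the difference rather than a prediction.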
deppman said:
I am waiting on a Tegra Note for $199 (~£150, I suspect). Frame rates should be nearly 2x those of the Nexus 7 2013...
I've got the Z Ultra as a daily driver and it's very good... and the Snapdragon 800 CPU has been overclocked to 2.7 GHz over in the Z1 forum. Not that that's recommended, lol.
That Tegra Note looks like a really sweet gaming platform for the price, though. I suspect they have kept the resolution down to keep the price down, but that resolution is perfect for better gaming performance. The stylus is a plus as well...
It's a pity the Tegra 4 hasn't made its way to more phones/phablets. I think they will shift far more tablets than Shields, as it's a more versatile device for more people... Anyone know if it will have the streaming capability of the Shield? I suspect not.
Bro, go for the XOLO Play Tab with the Tegra 4. Obviously NVIDIA is the best for gaming, and the Tegra is overclockable too.
Sent from my GT-I8552 using xda premium
deppman said:
I am waiting on a Tegra Note for $199 (~£150, I suspect). Frame rates should be nearly 2x those of the Nexus 7 2013...
I was looking at the Shield, but I'm in the UK and the price of getting it shipped would be £100-ish with import tax etc. I can't see the point in spending around £350 on a Shield when it retails at around the £250 mark :/ I wish I could get one, but for now I'm waiting for the T4 Note. Anyone have any idea when it's coming out?! I keep leaning towards the T4 for TegraZone and the better-looking games.
Thanks, 181jenkins
More Tegra Note Info
The Chinese reviews are below. Use Google Translate. Also, I had to replace '/' with ' ' to get around the no-link ban.
A good overview is at pie.pconline.com.cn 365 3650116.html. Note there is a mistake - the Note 7 does have a front-facing VGA cam. And here is a video (remove the spaces): youtube.com-watch ?v= 4xOL3MtXEPU.
The video shows good benchmark scores and highlights the use of the stylus. Personally, I wish this damn thing would come out! I really want a tablet with a stylus!
deppman said:
The Chinese reviews are below. Use Google Translate...
Seen it all; I just hate waiting for it, lol. There were rumours that it was going to be released on the 16th of October, but obviously not :/ Has anyone had any more info about this??
Related
I realize that this is the Vibrant forum, but it is the General section... so don't get too pissed about me posting a thread not really Vibrant-related.
So, just thinking about processors (that may come in the Nexus Prime), the two being rumored are Samsung's Exynos and the TI OMAP, from what I can tell.
How do these processors compare? The Exynos is generally regarded as better than the Snapdragon (not trying to argue either way), and there are plenty of comparisons and topics on that, but how does the Exynos compare to the OMAP? I can't really find many topics on it.
Thanks
It doesn't really matter how they compare right now, because Google can optimize ICS to run fast on the TI OMAP 4460 while running slow on other processors.
In the same way, when Froyo came out, the Snapdragon processors gained a huge boost in CPU-intensive tasks because they took full advantage of the Dalvik optimizations in Froyo. The Hummingbird, although newer than the Snapdragon, did not take full advantage of those optimizations, so it ran slower despite being the newer processor.
A Nexus Prime running a TI OMAP 4460 will be faster than any Exynos because Google will make sure ICS is perfectly tuned to the TI OMAP 4460.
Edit:
This is assuming the Nexus Prime has a TI OMAP 4460.
SamsungVibrant said:
It doesn't really matter how they compare right now, because Google can optimize ICS to run fast on the TI OMAP 4460 while running slow on other processors...
Thanks for the response. Anyone else have any thoughts on the two processors?
So, I watched the presentation last night and did not see them announce the processor...
Have any of the reviews confirmed which processor and GPU it has?
It's the OMAP 4460; TI made an announcement on it. GPU-wise it's weaker than the Exynos in the texture department, as it has the SGX540. The biggest advantage it has over the NS or Vibrant is the CPU and RAM (hardware-wise). Benching the NS vs the Droid 3 or Bionic shows the NS doing fairly close, with the differences probably due to the OMAP having a higher GPU clock and a CPU that can feed data to the GPU faster.
I can tell you that the chip has great performance, even at that higher resolution. I believe the BlackBerry PlayBook has it, and that thing runs beautifully =D
Sent from my Nexus S 4G using Tapatalk
Everything I'm reading about the OMAP says it's built for better HD performance; however, in clock speed, number crunching, and GPU it's weaker than the Exynos found in the SGS II. Actually, they're comparing its GPU to the one found in our Vibrant.
As dismal as this sounds, I'm still going for the Galaxy Nexus due to the stock interface and HD resolution...
Or I can wait longer (God knows how much longer) and grab the SGS II HD that's currently only in Korea.
---------- Post added at 07:00 PM ---------- Previous post was at 06:54 PM ----------
Quotes from the ExtremeTech website:
"So now the OMAP4460 is getting quite a lot of scrutiny, even though it isn’t exactly a new chip. This dual-core SoC is clocked at 1.2GHz, and uses ARM Cortex-A9 architecture, just like the Exynos. That’s not a problem, but the older GPU, the PowerVR SGX540 is. We were hoping for a step up in the graphics department.
Why did Google choose the OMAP for its new Nexus? Well, it might not live up to the high graphical standards set out by the iPhone, but it is a solid chip in its own right. The OMAP4 platform makes use of an additional hardware accelerator called IVA 3 that makes encoding and decoding HD video a snap. The Galaxy Nexus has an HD screen, so this hardware focus on video is a big plus.
Google engineers were likely also drawn to the OMAP for its use of a dual-channel memory controller. Android’s multitasking system means that data is constantly being moved into, and out of, active memory. This is definitely a strength of TI’s OMAP parts"
Hopefully that answers some of your questions.
Weak GPU, ****-tastic camera, no microSD slot, small battery, really high pricing (preliminary)... and once again, plastic?
I don't get why Google felt they needed to repeat the iPhone 4S announcement failure. The screen of the thing is amazing, and so is its OS. But the actual phone? Not so much.
:/
And I was so hyped about the "one phone to rule them all"...
}{Alienz}{ said:
Weak GPU, ****-tastic camera, no microSD slot, small battery, really high pricing (preliminary)... and once again, plastic?
The phone is still worth getting. It will always have the latest version of Android, and Android will run smoothly on it.
I'm never repeating my Vibrant mistake again. Running CM7 with half-assed GPS and no 911 calling? No thanks. Next time, a Nexus-only phone. I just wish it wasn't made by stupid Samsung, errrrr.
Or maybe a Motorola phone, now that Google owns them; a higher chance of getting updates. Just my opinion, though.
One last thing: I do agree about the lack of microSD. I was shocked when the Nexus S didn't get it, and now again. Hmmmmm. You would think they would want a dev phone to have a microSD slot.
}{Alienz}{ said:
Weak GPU, ****-tastic camera, no microSD slot, small battery, really high pricing (preliminary)... and once again, plastic?
Weak GPU? It is more than enough to drive a 720p screen at 60fps, as demonstrated consistently throughout the Galaxy Nexus hands-on videos.
How is the camera at all, as you so eloquently put it, '****-tastic'? From what I have seen (and trust me, it isn't nearly enough to make a final call on its quality), the pictures look decent, with little chroma noise and balanced colours. The zero shutter-lag feature sounds most excellent as well, as most cellphone pictures turn out awful because of the nature of the beast (shaky hands and such). If your judgment is purely based on "Hurr, because it's 5MP," then you are a moron.
Though I can lament the loss of a microSD card slot, most cards readily available to consumers (read: not Newegg or Amazon buyers) cannot even fathom recording 720p video, much less the 1080p featured on the Galaxy Nexus. And using your smartphone as a primary MP3 player is only viable if you have no other use for the phone besides MP3 playing and occasional internet browsing, which would be a flagrant waste of the ~$80 monthlies people pay for their plans.
And the 1,750mAh battery is actually above average (considering 1,450-1,500mAh the standard); along with ICS' built-in 'app-freezing', carrier bloat will never be a cause for unnecessary battery drain again. The battery could last for days depending on your usage (your mileage may ****ing vary, of course).
Really, high pricing? Really? If the previous two Nexus phones are any indication, it will cost $529 unlocked. Of course, it seems like a lot of money when you work retail or some other **** job, but then you shouldn't be playing with such expensive toys in the first place.
And plastic? Well, this explains everything:
http://www.youtube.com/watch?v=elKxgsrJFhw
Your post gave me cancer.
Camera - look at the Nexus S photos at Engadget and the Galaxy Nexus ones. They look IDENTICAL, except the colors on the GN actually look a bit worse. Last I remember, the Nexus S camera is on the level of the Vibrant's... it's great for a 5MP, but it's nothing compared to the competition nowadays. No backlit sensor, no f/2.2 or lower, not even high resolution. No shutter lag? I use Camera360 on my Vibrant and have had that feature for MONTHS. As Engadget comments, the lack of shutter lag is because the camera on the Galaxy Nexus does not focus. It is just NO competition for a Galaxy S2 or iPhone 4S, sadly.
GPU - it is 1.6 or 1.7 times faster than the Vibrant's. We already have a good GPU, but... for crying out loud, it is half as fast as the Galaxy S2's. And THAT is already getting old - it's been on the market for over 6 months. Shall we compare to the new iPhone 4S? A difference of 7 TIMES? I HATE iPhones, but Samsung and Google seriously didn't try here.
Battery - I am CURRENTLY running a Samsung-made OEM 1800mAh in my Vibrant, the same size as our original 1500mAh. Should I remind you that the Vibrant has a 4.0-inch screen and is NOT HD resolution? For a device as big as the Galaxy Nexus (4.6 inches) and with that huge and beautiful screen, 1750 is just TINY. At LEAST a good 2000 or more should have been put in it. And it's not impossible to do at all. Samsung HAS the technology. The phone HAS the space. It's fatter than the Galaxy S2 (and God forbid the new RAZR)... and it doesn't have a microSD slot. There is no excuse except laziness.
Pricing - several retailers in Europe have already priced it. The cheapest is ~700; a typical one is 800, and some go all the way up to 950. Look up the gsmarena.com article if you wish. Off contract it will be an arm and a leg. On contract it will be $300. That makes it the MOST expensive phone both on and off contract. $530? Yeah... maybe at that price it would be something to consider, but 800? For an amazing screen and software? Heh.
Like I said, the real star of that presentation was the software. Ice Cream Sandwich is amazing. GPU acceleration, anyone?
The phone it comes on, though? A letdown, sadly. Trust me, I was going to buy it; I had been waiting for it for 3 months. No more. I may either get a Galaxy S2 now, or wait for a Galaxy S2 HD to get the Nexus' HD screen with the Galaxy S2's performance and features.
In the OMAP 4460, the SGX540 is clocked at 384 MHz, which gives it a total output of ~6.2 GFLOPS. In comparison, the Mali-400 MP4 clocked at 200 MHz produces about ~7.2 GFLOPS, and ~10.8 GFLOPS at 300 MHz. So yeah, it's a step back from the Exynos, but still very good.
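Those GFLOPS figures are just clock rate times floating-point operations issued per clock. A sketch of the arithmetic, noting that the per-clock throughput values here are rough community estimates, not vendor-confirmed specs:

```python
# Peak GFLOPS = clock (MHz) / 1000 * FLOPs issued per clock.
# The FLOPs-per-clock figures are rough public estimates.

def gflops(clock_mhz, flops_per_clock):
    return clock_mhz / 1000 * flops_per_clock

sgx540 = gflops(384, 16)    # PowerVR SGX540 (OMAP 4460), ~16 FLOPs/clock
mali_200 = gflops(200, 36)  # Mali-400 MP4 at 200 MHz, ~36 FLOPs/clock
mali_300 = gflops(300, 36)  # ... and at 300 MHz

print(f"SGX540 @ 384 MHz:   ~{sgx540:.1f} GFLOPS")
print(f"Mali-400 MP4 @ 200: ~{mali_200:.1f} GFLOPS")
print(f"Mali-400 MP4 @ 300: ~{mali_300:.1f} GFLOPS")
```

This lands within rounding of the figures quoted above; keep in mind peak GFLOPS is only one axis, since fill rate and memory bandwidth often matter more in games.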
Everything you're saying doesn't matter; it's all about optimization, and the camera looks great to me.
Sent from my SGH-T959 using XDA App
}{Alienz}{ said:
Camera - look at the Nexus S photos at Engadget and the Galaxy Nexus ones. They look IDENTICAL...
lmfao, best post all day. Gotta pay to play, and $530 is pocket change for what you're getting in return.
The Galaxy Nexus is clearly the device to get, imo.
The new Galaxy Nexus was just uncovered to have only 768MB of RAM, not a full GB.
}{Alienz}{ said:
The new Galaxy Nexus was just uncovered to have only 768MB of RAM, not a full GB.
Uncovered by whom?
Probably that's all that is available after boot up.
}{Alienz}{ said:
The new Galaxy Nexus was just uncovered to have only 768MB of RAM, not a full GB.
It's 1 GB.
My Tab 10.1 only shows 768 MB as well, but it's 1 GB. The Android system uses part of the RAM to operate; the rest you get as free RAM.
How do I know? Supercurio's Twitter:
"supercurio François Simond:
I read several websites listing #GalaxyNexus with 1GB RAM.. hmm, it's not quite what I found. Linux says: 648MB in total, 630 Available"
Now, he investigated further, and Samsung did the same thing they did with the Vibrant. All of the memory combined is indeed 1GB. HOWEVER, they are reserving a ****load of it for the GPU and other functions. Hence, of that 1GB (the phone DOES have 1GB), there is SIGNIFICANTLY less available. How much less? Read the Twitter status posted here. This makes it the equivalent of the HTC Sensation, which has 768MB of RAM and actually IS listed as having 768MB.
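The carve-out arithmetic behind this is simple; here it is made explicit, using the 1 GB physical figure and the 648 MB Linux MemTotal figure quoted from supercurio (both taken as given, not independently verified):

```python
# Memory accounting on the Galaxy Nexus per the quoted figures:
# the SoC reserves RAM for the GPU, radio, and firmware before Linux
# boots, so the kernel never sees the full physical amount.

TOTAL_MB = 1024   # physical RAM fitted
VISIBLE_MB = 648  # MemTotal reported by Linux (supercurio's figure)

reserved_mb = TOTAL_MB - VISIBLE_MB
print(f"Reserved before boot: ~{reserved_mb} MB "
      f"({reserved_mb / TOTAL_MB:.0%} of physical RAM)")
```

On the device itself you can check the visible figure with `grep MemTotal /proc/meminfo`.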
}{Alienz}{ said:
How do I know? Supercurio's Twitter...
Great find. Thank you.
Sent from my SGH-T959 using xda premium
Can anyone point me to some really good hands-on videos? I saw the ones on Phandroid and Engadget the night it was all announced, but are there any good videos that go a little more in depth?
}{Alienz}{ said:
How do I know? Supercurio's Twitter...
Supercurio already has a Galaxy Nexus one day after the announcement? I didn't know the devs got them that fast.
Does anyone have any benchmarks? I want to see some raw speed results of the SGS3's quad-core ARM Mali-400 MP4 vs the iPhone 5's triple-core PowerVR SGX543MP3.
The iPhone 5 looks so unimpressive compared to the SGS3 on all fronts, but I'm not seeing any results for the new GPU they're using.
http://www.glbenchmark.com/compare....ple iPhone 5&D2=Samsung GT-I9300 Galaxy S III
The iPhone is roughly twice as fast.
AndreiLux said:
http://www.glbenchmark.com/compare....ple iPhone 5&D2=Samsung GT-I9300 Galaxy S III
The iPhone is roughly twice as fast.
Thanks. Looks like if mobile gaming is your thing, then the iPhone 5 is the way to go, if you don't mind the smaller screen; otherwise, the SGS3 seems to beat the iPhone 5 in every area, for me anyway.
ExEvolution said:
Thanks. Looks like if mobile gaming is your thing, then the iPhone 5 is the way to go, if you don't mind the smaller screen; otherwise, the SGS3 seems to beat the iPhone 5 in every area, for me anyway.
+1
Even though the i5 doesn't have enough resolution to take advantage of that much power, the S3 does...
Anyway, it's faster in GPU performance, but worse overall for me.
I don't believe that test. I ran the same bench here, and everything was running pretty smoothly until a warning popped up saying "vsync enabled".
After that everything became crappy: 11 fps, every single time I tried the bench. Pretty weird to me. I saw some complaints somewhere about wrong results from the test.
Sent from my Nexus 7 using xda premium
tntgdh said:
I don't believe that test. I ran the same bench here, and everything was running pretty smoothly until a warning popped up saying "vsync enabled"...
This is the first time I've heard of such a thing. GLBenchmark is pretty much the industry standard and one of the most reliable benchmarks out there.
Woah, that iPhone 5 GPU seems damn powerful, if that's a correct benchmark result. One thing I don't get is why there isn't a result yet for much of the iPhone 5. I mean, if they were going to test it, why in the hell wouldn't you run all the tests? Very strange, if you ask me.
btemtd said:
Woah, that iPhone 5 GPU seems damn powerful, if that's a correct benchmark result...
http://www.anandtech.com/show/6324/the-iphone-5-performance-preview
The iP5 GPU just destroyed the Mali in the S3, damn. But if the SGS3 had been released in September, I think it would have beaten the iP5 in that department... And I really do think the I9305 will help the benchmarks slightly, and overall performance, but that's about it until the next SGS4... I really think Sammy should go properly future-proof with the next S4, especially if they release it before the iP5.
I own an I9300 and will soon get the I9305. I have played with the iP5, I9300, and I9305, and for overall feel and UI performance I LOVE the I9300, and especially the I9305... not a single stutter. The GS3 just feels better. You can't go from Android Jelly Bean back to iOS; it's just going backwards and feels so, so, so DULL. I was recently watching a 1080p movie on my 40-inch LG HD TV connected to my GS3 - OMG, it's unbelievable. A simple thing like that made my day, lol. Seriously, I'm happy with the GS3 and will be a little bit happier with the I9305... for a while. I am in no hurry to get another phone after this next one.
Blah still would rather use Android!
Impressive performance indeed, but the OS lets it down severely. It's like having an 800 HP motor in a Chinese-made Chery; I'd rather drive a Peugeot 206 GT 180. It'll handle the corners, unlike the Chery, and the navigation system works!
So I'm guessing Apple is fragmented now? So many devices to cater for, different screen resolutions, etc. They messed up Maps, so I wonder how screen-resolution scaling is working out for the i5?
Judging from reviews, the i5 might turn out to be an iFail for them. Check out any Apple forum: there is a flood of negative feedback on build quality. It seems one in three phones has visible scratches out of the box, fingernails scratch the paint off the bezel with ease, etc.
Gotta feel sorry for their pedantic fans, who are focused purely on image. The Apple is rotten, scratched and bruised.
Swyped on I9300 - XXDLIB - Siyah kernel - JKay & Thunderbolt tweaks.
Samsung needs to drop Mali and go PowerVR Their GPU is seriously better in every way
irzero said:
Samsung needs to drop Mali and go PowerVR Their GPU is seriously better in every way
Click to expand...
Click to collapse
Samsung doesn't need to drop Mali. The T604 in the 5250 will already be faster than the 543MP3. People tend to forget the Mali-400 and its derivatives are about three years old by now. You don't suddenly change GPU licensor every six months because the competition has a temporary performance advantage. Rogue is still far away into next year, and we'll also see second-generation Vithar architectures from ARM by then.
PowerVR always seems to be way ahead of the pack on the GPU side.
Look at the new Adreno in the S4: it only just about matches the iPhone 5 in some benches.
The S4 hasn't been out long, and look at how badly it compares.
Sent from my GT-I9300 using Tapatalk 2
Well, the NA S3 has a better GPU too, but I would not trade my international one for it.
Besides, if this has three 543 GPU cores, then the PS Vita has four of them, is cheaper, and has better games!
Sent from my GT-I9300 using xda premium
I honestly didn't think the iPhone 5 would beat the GS3, let alone thrash it like it did here.
The S3 was advertised as a beast of hardware; I can't believe the iP5 smokes it.
Who gives a **** if the SGS3 was twice as fast, when the entire experience was not up to scratch?
Sent from my GT-I9300 using Tapatalk 2
irzero said:
Samsung needs to drop Mali and go PowerVR Their GPU is seriously better in every way
Click to expand...
Click to collapse
You should be happy to see the older Mali architecture (400MP) still giving newcomers like the Adreno 320 a run for their money. The next-gen Malis (T604, T658), which will debut soon, will take the performance crown back.
I heard the Mali is really weak in the triangle part of the benchmark, and that is obvious from the results. I do hope the new Mali T604 isn't.
I still don't get why no one uses the A15 architecture yet. Anyway, graphics are always going to be faster on newer chips; that's just how it goes. It is surprising that the other benchmarks aren't beating the S4/Exynos/Tegra. I bet the new Mali, Tegra 4 and the next Adrenos are going to be harder to beat.
The thing is that people who have Android flagships now have no reason to move to iOS, and people who want a new super-flagship should probably just wait until Christmas and get something that beats them all! I really don't see the point of spending all that money on something as locked down and limited as the iPhone, knowing it isn't even the king of its sector any more. The rest should just buy something like the RAZR M: a lot cheaper, and it gives you everything you'd ever need (the fact that it's not going to be sold in Europe really p""""s me off)!
The new Lumia, on the other hand, really sounds nice (I hope I don't need to defragment the disk every week with Windows Phone!).
And even as I write this, iPhone sales are probably hitting 5 million... people are stupid...
btemtd said:
I heard the Mali is really weak in the triangle part of the benchmark, and it is obvious seeing the results. I do hope the new mali T604 isnt.
Click to expand...
Click to collapse
The Mali 400 is weak in vertex processing because it has fixed-function units: one vertex unit, AFAIK, with the rest dedicated to pixel processing. That's not the case with the new T600-series GPUs, whose unified cores can all share the workload.
《swagged from aokp》
XXXUPDATEXXX
Anandtech has now published its performance preview of the Nexus 10. Let the comparison begin!
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Well, the first full result for the iPad 4 has appeared on GLBenchmark, so I have created a comparison with the Samsung Arndale board, which uses exactly the same SoC as the Nexus 10 and so should be very close in performance to Google's newest tablet. GLBenchmark, as its name suggests, tests OpenGL graphics performance, an important criterion for gaming.
Which device wins? Click the link to find out.
http://www.glbenchmark.com/compare....ly=1&D1=Apple iPad 4&D2=Samsung Arndale Board
If you're really impatient: the iPad 4 maintains its lead in tablet graphics. The Nexus 10 may perform slightly better in final spec, but the underlying low-level performance will not change much.
I've also made a comparison between the iPad 3 & 4.
Interestingly, the in-game test (GLBenchmark 2.5 Egypt HD C24Z16, Offscreen 1080p), which runs independent of native screen resolution, shows the following:
iPad 4: 48.6 FPS
iPad 3: 25.9 FPS
5250 : 33.7 FPS
So the iPad 4 is twice as fast as its older brother. The Exynos will probably score nearer 40 FPS in final spec, with new drivers and running 4.2 (the board runs ICS, though Jelly Bean did not really boost GL performance over ICS). What is interesting is that the iPad 4, whose GPU is supposedly clocked at 500 MHz vs 250 MHz in the iPad 3, does not perform twice as fast in the low-level tests.
Fill rate, triangle throughput, vertex output etc. are not double those of the iPad 3, so although the faster A6 CPU helps, I reckon a lot of the improvement in the Egypt HD test comes from improved drivers for the SGX 543MP4 in the iPad 4. The Galaxy S2 received a big jump in GL performance when it got updated Mali drivers, so I imagine we should see good improvements for the T604, which is still a new product and not as mature as the SGX 543.
http://www.glbenchmark.com/compare....tified_only=1&D1=Apple iPad 4&D2=Apple iPad 3
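To make the scaling argument concrete, here's a quick sanity check in Python using the FPS and clock figures quoted in this post (the 500 vs 250 MHz clocks are the claimed values from the post, not independently verified):

```python
# FPS figures quoted above: Egypt HD C24Z16 offscreen (1080p)
ipad4_fps = 48.6
ipad3_fps = 25.9
clock_ratio = 500 / 250   # claimed SGX 543MP4 clock, iPad 4 vs iPad 3

fps_ratio = ipad4_fps / ipad3_fps
print(f"Egypt HD speed-up: {fps_ratio:.2f}x (clock alone would suggest {clock_ratio:.0f}x)")
# The low-level tests don't double, so a near-2x game result points to
# driver improvements on top of the raw clock bump.
```

The gap between the near-2x game score and the well-under-2x low-level results is exactly the "drivers matter" argument being made here.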
I'd imagine the new iPad will take the lead in benchmarks for now, as it'll take Sammy and Google some time to optimise the beast. In the end, however, actual app and user-interface performance is what matters, and reports on the Nexus 10 are overwhelmingly positive.
So the Mali T604 didn't manage five times the Mali 400's performance, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU initially posted very bad GLBenchmark scores (even worse than the PowerVR SGX540), but after a firmware update it was way ahead of the other SoCs in Android handsets.
hung2900 said:
So the Mali T604 didn't manage five times the Mali 400's performance, or maybe Samsung underclocked it.
Still very good, but not the best.
Click to expand...
Click to collapse
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; software updates may be necessary for a proper bench.
Damn..now I have to get an iPad.
I believe we have to take the Arndale board numbers with a pinch of salt. It's a dev board, and I doubt it has drivers as optimised as those expected for the N10. Samsung has a habit of optimising drivers in later updates.
The SGS2 makes for a good case study. When it was launched at MWC 2011, its numbers were really pathetic, even worse than Tegra 2's.
Anand ran benchmarks on the pre-release version of the SGS2 at MWC 2011; check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was posting worse-than-Tegra-2 numbers! It was that bad initially.
Then look at when Anand finally reviewed the device a few months later:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up 3.6x, and Pro also got 20% higher; they could have been higher still if not limited by vsync. GLBenchmark moved from 2.0 to 2.1 during that phase, but I am sure that would not make such a big difference in the numbers.
If you check the SGS2's numbers now, there's another 50% improvement in performance since Anand did his review.
Check these SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This just shows how big an effect driver optimisation can have on performance. My point is that we have to wait for proper testing on the final release of the N10.
Also, look carefully at the fill rate in the Arndale board test. It's much less than expected: ARM says a Mali-T604 clocked at 500 MHz should reach a fill rate of 2 GPixels/s, yet it's showing only about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU at 533 MHz, so it shouldn't be falling so far short.
According to Samsung, it's more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
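The fill-rate arithmetic above can be sketched as follows. The 4-pixels-per-clock figure for the T604 is my assumption, backed out of ARM's quoted 2 GPixels/s at 500 MHz, not a number stated in this thread:

```python
# Theoretical fill rate = pixels written per clock * GPU clock
pixels_per_clock = 2.0e9 / 500e6      # back out ARM's figure: 2 GPix/s @ 500 MHz -> 4 px/clk
clock_hz = 533e6                      # Samsung's stated 533 MHz

theoretical_gpix = pixels_per_clock * clock_hz / 1e9
observed_gpix = 0.60 * 2.0            # board shows ~60% of the 2 GPix/s figure
print(f"theoretical ~{theoretical_gpix:.2f} GPix/s, Arndale board ~{observed_gpix:.1f} GPix/s")
```

At 533 MHz that gives about 2.13 GPixels/s theoretical, consistent with Samsung's 2.1 figure, against roughly 1.2 GPixels/s measured on the board.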
hung2900 said:
So the Mali T604 didn't manage five times the Mali 400's performance, or maybe Samsung underclocked it.
Still very good, but not the best.
________________
Edit: I forgot that the Exynos 4210 with the Mali-400MP4 GPU initially posted very bad GLBenchmark scores (even worse than the PowerVR SGX540), but after a firmware update it was way ahead of the other SoCs in Android handsets.
Click to expand...
Click to collapse
In the areas where the Mali 400 lacked performance, like fragment-lit and vertex-lit triangle output, the T604 is comfortably 5x faster, whereas in these low-level tests the iPad 4 is not a solid 2x the power of the iPad 3, yet achieves twice the FPS of its older brother in Egypt HD. I suspect drivers are a big factor here, and the Exynos 5250 will get better as its drivers mature.
hot_spare said:
I believe we have to take the Arndale board numbers with pinch of salt. It's a dev board, and I doubt it has optimized drivers for the SoC like it's expected for N10. Samsung has this habit of optimizing the drivers with further updates.
SGS2 makes for a good case study. When it was launched in MWC2011, it's numbers were really pathetic. It was even worse than Tegra2.
Anand ran benchmark on the pre-release version of SGS2 on MWC2011, check this:
http://www.anandtech.com/show/4177/samsungs-galaxy-s-ii-preliminary-performance-mali400-benchmarked
It was showing less than Tegra2 numbers! It was that bad initially.
Then look when Anand finally reviewed the device after few months:
http://www.anandtech.com/show/4686/samsung-galaxy-s-2-international-review-the-best-redefined/17
Egypt (native resolution) numbers went up by 3.6x and Pro also got 20% higher. Now they could have been higher if not limited by vsync. GLbenchmark moved from 2.0 to 2.1 during that phase, but I am sure this would not make such a big difference in numbers.
If you again check the numbers now for SGS2, it's again another 50% improvement in performance from the time Anand did his review.
Check this SGS2 numbers now:
http://www.anandtech.com/show/5811/samsung-galaxy-s-iii-preview
http://www.anandtech.com/show/6022/samsung-galaxy-s-iii-review-att-and-tmobile-usa-variants/4
This is just to show that how driver optimization can have a big affect on the performance. My point is that we have to wait for proper testing on final release of N10 device.
Also, check the fill rate properly in the Arndale board test. It's much less than what is expected. ARM says that Mali-T604 clocked at 500MHz should get a fill rate of 2 GPixels/s. It's actually showing just about 60% of what it should be delivering.
http://blogs.arm.com/multimedia/353-of-philosophy-and-when-is-a-pixel-not-a-pixel/
Samsung has clocked the GPU @ 533MHz. So, it shouldn't be getting so less.
According to Samsung, it more like 2.1 GPixels/s: http://semiaccurate.com/assets/uploads/2012/03/Samsung_Exynos_5_Mali.jpg
Fill rate is a low-level test, and there shouldn't be such a big difference from the quoted value. Let's wait and see how the final device shapes up.
Click to expand...
Click to collapse
I agree with most of what you said. The GPixel figure is like ATI's teraflops figures always being much higher than Nvidia's: in theory, with code written to hit the device perfectly, you might see those high figures, but in reality the Nvidia cards with lower on-paper numbers equalled or beat ATI in actual game FPS. It all depends on whether the underlying architecture is as efficient in real-world tests, versus maximum technical numbers that can't be replicated in actual game environments.
I think the current resolution of the iPad / Nexus 10 is actually crazy, and we would see prettier games at lower resolutions. The resources needed to drive those high-MP displays mean lots of compromises in effects, polygon complexity etc. to ensure decent FPS, especially when you consider that driving Battlefield 3 at 2560 x 1600 with AA and high textures requires a PC that burns 400+ watts, not a 10-watt SoC.
Overall, considering the Nexus 10 has twice the RAM for game developers to use, plus faster CPU cores, games should look equally nice on both. The biggest factor will be the level of support game developers provide for each device, and the iPad will probably be stronger in that regard. Nvidia was able to coax prettier games out of Tegra 3 through developer support; hopefully Google won't forget the importance of this.
What's the point of speculation? Just wait for the device to be released and run all the tests you want to confirm performance. It doesn't hurt to wait.
BoneXDA said:
Not sure about this, but don't benchmark tools need to be upgraded for new architectures too? A15 is quite a big step; software updates may be necessary for a proper bench.
Click to expand...
Click to collapse
Both the A9 and A15 use the same instruction set architecture (ISA), so, no, they won't. Benchmarks may need to be modified if new SoCs become powerful enough to max out the old tests, but that hasn't happened yet for GLBenchmark, and there are already new versions in the pipeline.
I can't wait to see this Exynos 5250 in a 2.0 GHz quad-core variant in the semi-near future... Ohhh, the possibilities. Samsung has one hell of a piece of silicon on its hands.
Chrome
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult. The stock AOSP browser would be so much faster.
Turbotab said:
http://www.anandtech.com/show/6425/google-nexus-4-and-nexus-10-review
Google, if you want to use Chrome as the stock browser, then develop it to be fast and smooth, not an insult. The stock AOSP browser would be so much faster.
Click to expand...
Click to collapse
True, Chrome on mobile is still not up to desktop level yet. I believe it's v18 or something, right? The stock browser would post much better results in SunSpider/Browsermark. The N4 numbers look even worse; somewhere the optimisations aren't working.
The GLBenchmark tests are weird. The Optimus G posts much better results than the N4 when both are the same hardware; the N4 in fact scores lower than an Adreno 225 in some cases. This is totally whacked.
For the N10, I am still wondering about the fill rate. Need to check what you guys say about this.
Is it running some debugging code on these devices at this time?
Turbotab said:
Both the A9 and A15 use the same instruction set architecture (ISA), so, no, they won't. Benchmarks may need to be modified if new SoCs become powerful enough to max out the old tests, but that hasn't happened yet for GLBenchmark, and there are already new versions in the pipeline.
Click to expand...
Click to collapse
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Once we get rid of the underclock, no tablet will be able to match it. I'm sure the Mali T604 at 750 MHz would destroy everything.
hung2900 said:
Actually not. A8 and A9 are the same ISA (Armv7), while A5 A7 and A15 are in another group (Armv7a)
Click to expand...
Click to collapse
I have to disagree; this is from ARM's info site:
The ARM Cortex-A15 MPCore processor has an out-of-order superscalar pipeline with a tightly-coupled low-latency level-2 cache that can be up to 4MB in size. The Cortex-A15 processor implements the ARMv7-A architecture.
The ARM Cortex-A9 processor is a very high-performance, low-power, ARM macrocell with an L1 cache subsystem that provides full virtual memory capabilities. The Cortex-A9 processor implements the ARMv7-A architecture and runs 32-bit ARM instructions, 16-bit and 32-bit Thumb instructions, and 8-bit Java bytecodes in Jazelle state.
http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.set.cortexa/index.html
Keion said:
Once we get rid of the underclock no tablet will be able to match. I'm sure the Mali t604 at 750 MHz would destroy everything.
Click to expand...
Click to collapse
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Do remember that that awesome resolution taxes the GPU a lot. Heck, most lower-end desktop GPUs would struggle.
Harry GT-S5830 said:
Do remember that Awesome resolution does tax the GPU a lot. Heck most lower end desktop GPUs would struggle
Click to expand...
Click to collapse
Indeed it does, but not in offscreen testing, where Anand made his proclamation.
Sent from my iPad Mini using Tapatalk
Hemlocke said:
Except the iPad 4, which has a GPU that is currently 57% faster than the T604.
Sent from my iPad Mini using Tapatalk
Click to expand...
Click to collapse
Nah, I think we can beat that too.
Drivers + OC.
I have read many spec sheets showing Denver's massive CPU performance.
However, all of these are single-core numbers at 3 GHz.
Multi-core performance is merely on par with quad-core benchmarks.
When I'm multitasking, is this going to be a serious issue, with all the overhead of context switching and such, especially since the Nexus 9 clocks it at 2.3 GHz (roughly a 23% reduction from the benchmarked clock)? I mean, quad core is quad core; single-core performance can only do so much.
From the looks of things it'll pull its weight in multi-core loads as well. The only thing that outperforms it on multi-core workloads is the Galaxy Note 4, and even then not by much.
http://www.phonearena.com/news/Nexu...gra-K1-outperforms-Apple-iPhone-6s-A8_id61825
WisdomWolf said:
From the looks of things it'll pull its weight in multi-core loads as well. The only thing that outperforms it on multi-core workloads is the Galaxy Note 4, and even then not by much.
http://www.phonearena.com/news/Nexu...gra-K1-outperforms-Apple-iPhone-6s-A8_id61825
Click to expand...
Click to collapse
Thanks. That should be enough to do pretty much anything seamlessly. I am primarily interested in graphics performance, one reason being the QHD screen, and that is mind-blowing anyway.
Don't know why they removed 4K recording, as the CPU and GPU can handle it.
There might be a mod for it in the very near future? 4k
MRobbo80 said:
There might be a mod for it in the very near future? 4k
Click to expand...
Click to collapse
I sure hope so!
Far_SighT said:
one reason being the QHD screen, and that is mind blowing anyway.
Click to expand...
Click to collapse
QXGA (2048 x 1536), not QHD (2560 x 1440). Close enough, though; only about half a million pixels' difference between the two.
Multicore performance doesn't matter much in real world scenarios.
bblzd said:
QXGA (2048 x 1536), not QHD (2560x1440). Close enough though, only about a half million pixels difference between the two.
Click to expand...
Click to collapse
Thanks, you are indeed correct; I was just looking at the Nexus 6 and Note 4 and missed it. Besides, that half-million-pixel difference (540,672 pixels, to be exact) is only 17.19%, so the Tegra K1 won't notice it anyway.
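For anyone who wants to check the resolution arithmetic in this exchange, it works out; a quick verification in plain Python:

```python
qhd  = 2560 * 1440   # QHD, e.g. Nexus 6
qxga = 2048 * 1536   # QXGA, Nexus 9

diff = qhd - qxga
pct = 100 * diff / qxga
print(f"{diff} extra pixels = {pct:.4f}% more than QXGA")
# -> 540672 extra pixels = 17.1875% more than QXGA (~17.19%)
```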
And now I'm unsure: why would they put a lower resolution in the N9 when they are putting QHD in the N6?
Far_SighT said:
And now I'm unsure. Why would they put a lower resolution in N9, when they are putting QHD in N6?
Click to expand...
Click to collapse
same with ram.
Maybe the price would be too high, like it already is?
Far_SighT said:
I have read many spec sheets showing Denver's massive CPU performance.
However, all of these are single-core numbers at 3 GHz.
Multi-core performance is merely on par with quad-core benchmarks.
Click to expand...
Click to collapse
It is really impressive to see a dual-core matching a quad-core in multi-threaded tasks; it shows just how powerful the cores are. I don't think it's really a setback, since it isn't performing much worse and single-threaded gets such a large boost. This is Nvidia's first ever CPU core, and it is really impressive. I am sure we will see a quad-core version in a generation or so, once they can make the processor on a smaller node.
Given that Android L doesn't appear to have multi-window or floating-window multitasking, I don't think the lack of a quad-core is a big issue. Many MacBook Airs and Surface Pros have run dual-core forever (even current models), and you could argue that people do far more multitasking on those full desktop OSes than on an Android tablet.
Also, related, Antutu benchmark of the N9 from the other thread - http://i.imgur.com/c6b57dT.jpg
MRobbo80 said:
Maybe the price would be too high, like it already is?
Click to expand...
Click to collapse
tyvar1 said:
same with ram.
Click to expand...
Click to collapse
More likely, they are trying to match it with Apple's iPad Air 2, which seems reasonable.
However, they probably should have gone with 1080p in the N6; that is wayyy overpriced.
I would have preferred QHD in the N9 and QXGA in the N6, which would have given the N9 an edge.
SenK9 said:
Being that Android L doesn't appear to have multi-window or floating-window multi-tasking, I don't think the lack of a quad-core is a big issue. Many Macbook Airs and Surface Pros have been running dual-core forever (even current models), and you could argue that people do far more multi-tasking on those full desktop OS's than an Android tablet.
Also, related, Antutu benchmark of the N9 from the other thread - http://i.imgur.com/c6b57dT.jpg
Click to expand...
Click to collapse
I am actually waiting for an A8X vs N9 (Denver) benchmark, as I've read things like "they turn off graphics features on Android despite more powerful hardware".
If the above holds true though, I'll probably get N9.
Rumours about the RK3399 chipset from Rockchip have swept the internet, and launch seems barely a soundbite away. Promising to shake up the way we think about TV boxes, Vorke has built the RK3399-powered Z3 into its ambitious plans, with next February expected to mark the gadget's debut.
Adopting the Rockchip RK3399, the Vorke Z3 should outperform almost all of its competitors on the market. Unveiled at CES 2016, the RK3399 is a hexa-core processor combining a dual-core ARM Cortex-A72 cluster and a quad-core ARM Cortex-A53 cluster with a Mali-T860MP4 quad-core GPU. Reportedly it offers a significant performance boost over its predecessors, including the RK3288, outranking SoCs from Amlogic and Allwinner: it scored 72,483 in AnTuTu. What madness!
Unlike other contenders, the Z3 offers 4GB of LPDDR3 RAM and 32GB of eMMC storage to back up your TV entertainment, so you can use it right out of the box. Alongside the storage, it's worth pointing out that Vorke brings together the compelling KODI entertainment centre and the RK3399's insane 4K image quality in one nifty box; the SoC brings support for H.265 and VP9 4K video decoding.
One problem with any Android box is that streaming video demands a fast, reliable internet connection and plenty of bandwidth. Accordingly, the Z3 equips itself with AC WiFi, the newest WiFi protocol, with data-transfer speeds up to 1200Mbps under some circumstances, plus support for Gigabit Ethernet.
When it comes to expansion, there is no compromise. With 1 USB 2.0, 1 USB 3.0, 1 Type-C and 1 SATA port placed along the sides, the Z3 lets you attach USB peripherals or even more storage. The Type-C port is another highlight in an Android 6.0 box, and when you factor in the sheer number of connections on the Z3, you begin to understand why it is a little bigger than the Z1. With support for SATA, USB 2.0, USB 3.0, Gigabit Ethernet, SPDIF and HDMI 2.0, there are few devices you won't be able to connect.
What to Expect?
Rockchip RK3399
Android 6.0
4GB + 32GB
AC WIFI + Gigabit Ethernet
4K VP9
USB 3.0, Type-C and SATA interfaces
Just got confirmation that my Z1 is supposed to arrive in 3-5 days. So, any idea what the price will be on this puppy, the Z3? When will it launch? That AnTuTu score is badass.
How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
The Z1 in some webshops is US$120, so I'd expect the Z3 to be about $160-180.
It looks like a decent unit, but also look around here: AnTuTu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) doing 71,000 in AnTuTu; others say it should be about 50k.
Fluffbutt said:
How much was your Z1?
I'm guessing the Z3 will be about 40% to 50% dearer.
The Z1 in some webshops is US$120, so I'd expect the Z3 to be about $160-180.
It looks like a decent unit, but also look around here: AnTuTu is NOT a reliable bench to quote.
Some have reported an Xperia Z3 (if my memory serves me) doing 71,000 in AnTuTu; others say it should be about 50k.
Click to expand...
Click to collapse
The Z1 is on sale at $74.99.
The Z3 is expected to be twice the price of the Z1.
We will try CPU-Z.
Some devices have special optimisations for the AnTuTu benchmark so that they score pretty high.
linhuizhen said:
The Z1 is on sale at $74.99.
The Z3 is expected to be twice the price of the Z1.
We will try CPU-Z.
Some devices have special optimisations for the AnTuTu benchmark so that they score pretty high.
Click to expand...
Click to collapse
I had an email from Vorke; they say "under $200", so I replied, "Hopefully under $150 as well," haha! Hope it's not too dear!
What about that 3DMark bench, Firestorm (or is it Fire Strike)? That seems to be a decent test, but it's mainly GPU, isn't it?
Fluffbutt said:
I had an email from Vorke, they say "under $200".. so i replied, "Hopefully under $150 as well" hahah! Hope it's not too dear!
What about that 3DMark bench, firestorm (or is it firestrike)... that seems to be a decent test, but it's mainly gpu, isn't it?
Click to expand...
Click to collapse
We will check :fingers-crossed:
As a slight sideways jump: I notice its competitor boxes both list heat sink + fan in their specs. Does anyone know if the Vorke uses active cooling?
I think I was right in a different forum post: maybe the RK chip runs a little hotter than passive cooling can deal with?
So, got my Z1 here in NoVA on Mon the 16th, ordered Jan 5 at Geekbuying. After three days, pretty happy. Games that I couldn't play, because of my bricked MINIX X8-H and not being able to root the MiBox so a PS3 SIXAXIS controller would work, run flawlessly on the Z1. Showbox runs smooth, plus YouTube and Kodi 16.1; AnTuTu 3D score with no tweaks: 41723. Jide has a crowd-funded Rockchip 3399 TV box for March or May, USD $99-129.
I don't trust a company that needs Kickstarter/crowdfunding to develop a device; they smell like fly-by-nighters to me. Produce the device, rake in some dosh, run like buggery when people start to complain or want tech support.
That Mio/MixIO, whatever it's called: US$129 on Kickstarter... nope, zero trust.
Two things I like about Vorke: they exist as a company with self-funded development, and they responded to my silly queries. Good pre-sales support suggests good after-sales support.
Fluffbutt said:
I don't trust a company that needs Kickstarter/crowdfunding to develop a device; they smell like fly-by-nighters to me. Produce the device, rake in some dosh, run like buggery when people start to complain or want tech support.
That Mio/MixIO, whatever it's called: US$129 on Kickstarter... nope, zero trust.
Two things I like about Vorke: they exist as a company with self-funded development, and they responded to my silly queries. Good pre-sales support suggests good after-sales support.
Click to expand...
Click to collapse
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3): $349!
What an absolute joke. You could buy a half-decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, it's early days; it might just be a placeholder price.
Edit: why did that double-post? Weird.
Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half decent laptop and use that as a TV box, FFS!
Click to expand...
Click to collapse
I agree. What's more, I really don't know what all the fuss about the RK3399 is, when for more than a year the Amazon Fire TV has been available with the MediaTek MT8173 SoC (2x Cortex-A72 + 2x Cortex-A53).
Maybe this is not the best forum to discuss that, because MediaTek isn't fond of open source, but people prefer a working solution to one you must fiddle with all the time to keep it working.
Yes, but the Amazon Fire TV is sort of locked down, isn't it? Its own UI, low customisation, an Amazon bias (understandable).
I've heard that the Vorke will be completely unlocked, rooted, open... maybe...
Anyway, the specs say different:
Qualcomm Snapdragon 8064, quad core, 4x @ 1.7GHz, Qualcomm Adreno 320 GPU
MediaTek 8173C, quad core, 2x @ 2GHz + 2x @ 1.6GHz, PowerVR Rogue GX6250 GPU
Click to expand...
Click to collapse
Neither of those will match the RK3399, and the Mali-T860MP4 is a very good SoC GPU. Not "superb" or "the best", but certainly good enough for nearly everything.
I do NOT like AnTuTu as a benchmark (it's heavily biased towards certain chips), but the AFTV gets 53K while the RK3399 gets 73K.
Fluffbutt said:
I do NOT like AnTuTu as a benchmark (it's heavily biased towards certain chips), but the AFTV gets 53K while the RK3399 gets 73K.
Click to expand...
Click to collapse
I would be more impressed if my RK3288 device couldn't already do 62k in AnTuTu. AFAIK AnTuTu is quite strongly GPU-biased, which might indicate that the RK3399's GPU is slower than the RK3288's, or that there are driver problems.
Besides, I prefer "pure" CPU benchmarks, which give me an indication of internet-browsing performance. When I did some research on the MT8173 more than a year ago, I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914 (note: "Nexus 9" is the spoofed identity of my RK3288 device). At the time I was quite pleased with that performance improvement, but devices with the MT8173 came with only 2GB of RAM, which is too little for me. Even so, the RK3399 isn't really more impressive to me than the MT8173 (check browser.geekbench.com/v4/cpu/compare/939975?baseline=993600), and we are more than a year on.
Fluffbutt said:
Just out of interest, I found a US site with a price for the UGOOS UT5 (basically the same as the Vorke Z3) -- $349!!!
What an absolute joke - you could buy a half decent laptop and use that as a TV box, FFS!
I guess that pointless little AMOLED display adds to the price.
Still, early days, it might just be a place-holder price.
Edit - why did that double post? Weird.
Ugoos is a trusted brand.
Jagee said:
I would be more impressed if my RK3288 device couldn't already do a 70k Antutu score. AFAIK Antutu is quite strongly GPU-biased, which might indicate that the RK3399's GPU is slower than the RK3288's, or that there are driver problems.
Besides, I prefer "pure" CPU benchmarks, which give me an indication of Internet browsing performance. When I did some research on the MT8173 more than a year ago I found something like browser.geekbench.com/v4/cpu/compare/939975?baseline=1646914. (Note: the "Nexus 9" is the spoofed identity of my RK3288 device.) At the time I was quite pleased with that performance improvement, but devices with the MT8173 came only with 2GB RAM, which is too little for me. Even so, the RK3399 isn't really more impressive to me than the MT8173 (check: browser.geekbench.com/v4/cpu/compare/939975?baseline=993600), and that was more than a year ago.
We will try other benchmarks too. Please recommend one.
linhuizhen said:
We will try other benchmarks too. Please recommend one.
I slightly overestimated the Antutu score for my RK3288 device - it should be 62k, not 70k. Nevertheless, I would really like to see Vellamo and Octane benchmark scores for the RK3399.
My RK3288 device can do about 4800 points with Chrome in Vellamo.
The Geekbench v4 score for the RK3399 is fairly well known. You can brag only if you beat 1600 single-core and 3000 multi-core. browser.geekbench.com/v4/cpu/993600
linhuizhen said:
Ugoos is a trusted brand.
I'm not disputing that - they have a good track record... but that doesn't stop $349 being too high for this device. And I give them mucho-credit for NOT trying to effing kickstarter the device!
But the AMOLED screen is really a non-item - what's the point? A TV box is stuck under the TV table; I don't even look at mine, just use it...
And what can a little screen show anyway, apart from the time or some form of channel display? Any more info would mean you'd have to get down on the floor, closer to it, to read it!
An AMOLED screen is perhaps $50 of that $150 overpricing ($200 is all I'd pay for a TV box with this spec). For $350 you can get a full i7 TV box with an Intel HD 520 GPU (400 GFLOPS)! The Mali GPU in the RK3399 is rated at about 90 GFLOPS.
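Those GFLOPS figures make the gap easy to quantify with a bit of quick arithmetic (a rough Python sketch; both throughput numbers are the ballpark claims quoted in this thread, not measurements):

```python
# Rough raw-throughput comparison; both figures are the ballpark
# numbers claimed above, not measured values.
intel_hd520_gflops = 400.0   # claimed peak for the i7 box's Intel HD 520
rk3399_mali_gflops = 90.0    # claimed peak for the RK3399's Mali GPU

ratio = intel_hd520_gflops / rk3399_mali_gflops
print(f"i7 box GPU has roughly {ratio:.1f}x the raw FP throughput")  # ~4.4x
```

Raw FLOPS obviously isn't the whole story (drivers, memory bandwidth and thermals matter), but it gives a sense of scale for the price comparison.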
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
So I'm less knocking Ugoos themselves and more knocking their "vision" of the yet-to-come TV Box.
*********************************************************************************
Geekbench 1520 and 2840 isn't too bad - just 160 below that magical 3000, which is nothing to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC), though some reviews say 4300, 4800 or 5300 (rounded off).
Fluffbutt said:
An amoled screen is perhaps $50 of that $150 over pricing
It smells like a gimmick to make it stand out from the other 3 or 4 RK3399 boxes coming.
It might be a gimmick to make it stand out, but your estimate of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (a 5", 16:9 panel).
I've seen OLED displays on MP3 players priced below $30.
Fluffbutt said:
Geekbench 1520 and 2840 isn't too bad - just 160 below that magical 3000, which is nothing to be sniffed at.
Mind you, even Geekbench can be misleading - GB 3 gives over 5300 for the Samsung Galaxy Tab S2 9.7 (Qualcomm SoC), though some reviews say 4300, 4800 or 5300 (rounded off).
3000 is just a number; the multi-core Geekbench v4 score depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned, in the new Samsung Galaxy Tab S2 9.7, can get over 5000 points in Geekbench v3 with its 8 cores (4xA72 + 4xA53) working simultaneously (source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA73.
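The core-count point can be illustrated with a toy model (all scores and the 0.7 efficiency factor below are made-up illustrative assumptions; real Geekbench scaling is more complex):

```python
def estimate_multicore(single_score, cores, efficiency=0.7):
    """Toy estimate: multi-core score grows with core count, discounted by an
    efficiency factor standing in for shared memory bandwidth, thermals and
    scheduling overhead. All numbers here are illustrative, not measured."""
    return single_score * cores * efficiency

# Eight slower cores can post a bigger aggregate number than two fast ones:
octa_big_little = estimate_multicore(single_score=700, cores=8)   # ~3920
dual_a73 = estimate_multicore(single_score=1600, cores=2)         # ~2240
print(octa_big_little > dual_a73)  # True - yet per-core the two big cores win
```

Which is exactly why a fat multi-core number from an octa-core doesn't automatically mean snappier real-world use than a design with fewer, faster cores.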
Another problem when comparing benchmarks is that some devices contain different hardware from previous "batches" - like the new (Qualcomm MSM8976) versus the "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462, where one test is on the older design with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which may vary per device.
Jagee said:
It might be a gimmick to make it stand out, but your estimate of the cost of such a display is quite off. First, it isn't an AMOLED display like in smartphones. Second, it doesn't have a similar size (a 5", 16:9 panel).
I've seen OLED displays on MP3 players priced below $30.
3000 is just a number; the multi-core Geekbench v4 score depends heavily on the number of cores. Even the Qualcomm SoC (MSM8976) you mentioned, in the new Samsung Galaxy Tab S2 9.7, can get over 5000 points in Geekbench v3 with its 8 cores (4xA72 + 4xA53) working simultaneously (source: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462). That doesn't necessarily translate to better performance than even just 2xA73.
Another problem when comparing benchmarks is that some devices contain different hardware from previous "batches" - like the new (Qualcomm MSM8976) versus the "old" Galaxy Tab S2 9.7 (Samsung Exynos 5433 Octa).
Another factor is the benchmarking software and OS used. There is a clear example: http://browser.primatelabs.com/geekbench3/compare/4803220?baseline=5849462, where one test is on the older design with a 32-bit OS rather than 64-bit (AArch64). I don't even want to start on overheating problems, which may vary per device.
Oh, I agree on the guesstimate - I have no real idea what an OLED screen costs; your $30 is probably more valid than my $50 - but that makes it worse, not better.
What else is there to make this over $150 more than the other RK3399 boxes - "under $200" stated by Vorke, a $115 pre-sale price for another, and so on?
It MIGHT be a "place-holder" number - but it's working against them... I'm not going back to check what the release price is; the $349 has put me off... especially if it's a case of "state a high price so the release over-price will sound better".
That's a marketing con, really - you state $349 before release then drop it to a still-too-high $299 or $259 after release and people will flock to it thinking they're getting a great deal. (Not saying they're doing it that way)
But the other RK3399 boxes? £115 pre-release, expected to be £140 after; Vorke has stated to me, in email, "definitely under $200".
And I FULLY agree with you about the benchies.
Perhaps the better benchies are the real-world ones... game FPS -- 5.2 for one SoC, 14.8 for another, 25.7 for a third, and so on.
****************
Still - I'm liking this 6-core box... it's better than almost everything I've looked at, allowing for the Tegra devices that are semi-locked (heavily controlled launchers and biased "features" - understandable, of course, like the Amazon devices).