Epic Games has released the Epic Citadel demo on the Google Play Store. It's a demo for Android devices showing off the Unreal Engine, and it includes a benchmark mode. It looks amazing on the Nexus 10 screen. I'm running Paranoid 2.99 with the KTManta kernel; the screen resolution shows up as 2560 x 1504.
Play Store link: https://play.google.com/store/apps/...SwxLDEsImNvbS5lcGljZ2FtZXMuRXBpY0NpdGFkZWwiXQ..
Screen resolution 100%, Performance HQ:
Stock speeds = 47.7 fps
GPU @ 612 MHz = 54.2 fps
It seems to be capped around 60 fps, because the highest fps I see during the demo is 61.
I also ran the benchmark on my HTC Rezound, all speeds stock. The resolution shows in the demo as 1280 x 720 and it got 28.5 fps.
My Score
My Nexus 10 (stock apart from recovery and root) scored 52.7 on High Quality.
My Nexus 4 by comparison scored 55.4 again on High Quality.
Different resolutions though, and yes it must be capped at 60/61 fps.
Holy hell the demo is amazing
I get around 45-47 fps.
I got an average of 50 fps on the High Quality setting.
The quality was awesome!
styckx said:
Holy hell the demo is amazing
Sent from my Nexus 10 using xda premium
The demo almost makes me wanna play all my games on my Nexus 10 and give away my PS3... OK jk, but it is sick though lol
Sent from my Nexus 10 using Tapatalk HD
The results we get make me believe that we could get better-than-console-level graphics if we were running at 1280x800.
The game devs should start letting us choose the resolution by now; they'll eventually have to do it anyway.
Imagine running Skyrim. Hell, if 720p isn't possible then 960x540 would be great too (though we'll probably have to wait years until we see games like that on mobile).
Fidelator said:
The results we get make me believe that we could get better-than-console-level graphics if we were running at 1280x800.
The game devs should start letting us choose the resolution by now; they'll eventually have to do it anyway.
Imagine running Skyrim. Hell, if 720p isn't possible then 960x540 would be great too (though we'll probably have to wait years until we see games like that on mobile).
I don't know. Even though you would be pushing fewer pixels, chances are that none of the games you currently play are actually built for 2560 by 1600 in either texture quality or polygon complexity. I am betting that while we would see a performance jump from the reduced resolution, it would be gradual rather than scaling linearly with the pixel count.
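To put rough numbers on that intuition (the figures are illustrative, not measurements): treat a frame as a resolution-independent cost plus a per-pixel cost,

$$t_{\text{frame}} = t_{\text{fixed}} + k \cdot N_{\text{pixels}}$$

If $t_{\text{fixed}} = 10\,\text{ms}$ and the pixel term is $10\,\text{ms}$ at 2560x1600, then dropping to 1280x800 cuts the pixel term to $2.5\,\text{ms}$, but the frame only goes from $20\,\text{ms}$ (50 fps) to $12.5\,\text{ms}$ (80 fps): a 1.6x speedup from 4x fewer pixels.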
Fidelator said:
The results we get make me believe that we could get better-than-console-level graphics if we were running at 1280x800...
http://forum.xda-developers.com/showpost.php?p=36983073&postcount=311
espionage724 said:
http://forum.xda-developers.com/showpost.php?p=36983073&postcount=311
Having those tweaks is really useful, thanks for the link. And well, that was unexpected; could anyone explain why this happens?
Fidelator said:
Having those tweaks is really useful, thanks for the link. And well, that was unexpected; could anyone explain why this happens?
When you increase the resolution of the output without increasing the complexity of the underlying models (buildings and trees etc) or increasing the size of the textures used on those models (the cobblestone or green of the leaves or sky etc), the graphics chip is just doing the same work it would do for a lower resolution screen and performing a simple scale. Simple output scaling is cheap and is built into almost any graphics chip.
Unless Epic releases an updated version of the software with super high res models or textures, the result will be the same for screens that have higher resolutions like our Nexus 10.
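For anyone curious what "render low, then scale" looks like in practice, here is a minimal OpenGL ES 2.0 sketch for Android. To be clear, this is not how Epic Citadel itself is implemented; the class is illustrative and the scene/quad drawing is stubbed out:

```java
import android.opengl.GLES20;

/** Sketch: render the scene into a 1280x800 offscreen buffer,
 *  then let the GPU stretch it to the 2560x1600 screen. */
public class LowResRenderer {
    private static final int LOW_W = 1280, LOW_H = 800;
    private int fbo, fboTexture;

    // Call once with a live GL context (e.g. from onSurfaceCreated).
    public void createOffscreenTarget() {
        int[] ids = new int[1];

        // Texture that will hold the low-res frame.
        GLES20.glGenTextures(1, ids, 0);
        fboTexture = ids[0];
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTexture);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                LOW_W, LOW_H, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
        // Bilinear filtering does the cheap upscale.
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

        // Framebuffer object with the texture as its color attachment.
        GLES20.glGenFramebuffers(1, ids, 0);
        fbo = ids[0];
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
                GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D, fboTexture, 0);
    }

    public void drawFrame(int screenW, int screenH) {
        // Pass 1: all expensive per-pixel work happens at 1280x800.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glViewport(0, 0, LOW_W, LOW_H);
        drawScene(); // app-specific geometry, lighting, etc. (stub)

        // Pass 2: one textured quad stretched over the full screen.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glViewport(0, 0, screenW, screenH);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, fboTexture);
        drawFullScreenQuad(); // trivial shader sampling fboTexture (stub)
    }

    private void drawScene() { /* ... */ }
    private void drawFullScreenQuad() { /* ... */ }
}
```

All the expensive shading in pass 1 touches a quarter of the pixels of the full 2560x1600 screen; pass 2 is the cheap stretch described above.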
MrGrimace said:
When you increase the resolution of the output without increasing the complexity of the underlying models (buildings and trees etc) or increasing the size of the textures used on those models (the cobblestone or green of the leaves or sky etc), the graphics chip is just doing the same work it would do for a lower resolution screen and performing a simple scale. Simple output scaling is cheap and is built into almost any graphics chip.
Unless Epic releases an updated version of the software with super high res models or textures, the result will be the same for screens that have higher resolutions like our Nexus 10.
That's not entirely true. While textures are indeed scaled, almost everything increases in complexity as you increase resolution. Many GPU operations involve computation on a per pixel basis.
Sent from my Nexus 10 using Tapatalk HD
dalingrin said:
That's not entirely true. While textures are indeed scaled, almost everything increases in complexity as you increase resolution. Many GPU operations involve computation on a per pixel basis.
Sent from my Nexus 10 using Tapatalk HD
I would think it depends on when those adjustments are made in the Epic engine's pipeline. If the last step in the pipeline is the scaler, then it would matter very little. If, as you suggest, they are applied to the final fully scaled draw buffer, then yeah, that particular operation would require more effort on the chip's part.
To be honest, I have no idea when or what image convolution or per-pixel operations Epic is doing in this demo. The anecdotal evidence thus far seems to indicate it doesn't make much of a difference.
Sent from my Nexus 10 using Tapatalk HD
MrGrimace said:
I would think it depends on when those adjustments are made in the Epic engine's pipeline. If the last step in the pipeline is the scaler, then it would matter very little. If, as you suggest, they are applied to the final fully scaled draw buffer, then yeah, that particular operation would require more effort on the chip's part.
To be honest, I have no idea when or what image convolution or per-pixel operations Epic is doing in this demo. The anecdotal evidence thus far seems to indicate it doesn't make much of a difference.
Sent from my Nexus 10 using Tapatalk HD
I thought you were replying in regard to my hack to make the Nexus 10 run at 1280x800, rather than in-engine adjustments.
Sent from my Nexus 10 using Tapatalk HD
dalingrin said:
I thought you were replying in regard to my hack to make the Nexus 10 run at 1280x800, rather than in-engine adjustments.
Sent from my Nexus 10 using Tapatalk HD
I meant more in general for the OpenGL pipeline. If your hack fools the Epic engine into outputting a lower res and then scaling, then (I would think) the results would only be significantly different if the models or textures were swapped out at different resolutions. (The one exception being adjustments applied to the entire draw buffer after scaling to 2560 by 1600.)
If you think I am way off base let me know. It is the only way I learn sometimes.
Sent from my Nexus 10 using Tapatalk HD
MrGrimace said:
I meant more in general for the OpenGL pipeline. If your hack fools the Epic engine into outputting a lower res and then scaling, then (I would think) the results would only be significantly different if the models or textures were swapped out at different resolutions. (The one exception being adjustments applied to the entire draw buffer after scaling to 2560 by 1600.)
If you think I am way off base let me know. It is the only way I learn sometimes.
Sent from my Nexus 10 using Tapatalk HD
Right, with my hack all of userspace is rendered at 1280x800 and it is only scaled right before drawing. I imagine in-engine resolution changing is approximately the same. The reason there aren't major FPS changes is that vsync limits the max FPS, and likely some other bottleneck affects the minimum FPS.
Sent from my Nexus 10 using Tapatalk HD
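A quick worked example of the vsync point (frame times are illustrative): in the simplest double-buffered case a finished frame still waits for the next 60 Hz refresh, so

$$\text{fps} \approx \frac{1000}{16.7 \cdot \lceil t_{\text{render}} / 16.7 \rceil}$$

A frame that renders in 8 ms at 1280x800 and one that takes 15 ms at 2560x1600 both display at 60 fps; the resolution change only becomes visible in scenes where $t_{\text{render}}$ crosses the 16.7 ms boundary.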
dalingrin said:
Right, with my hack all of userspace is rendered at 1280x800 and it is only scaled right before drawing. I imagine in-engine resolution changing is approximately the same. The reason there aren't major FPS changes is that vsync limits the max FPS, and likely some other bottleneck affects the minimum FPS.
Sent from my Nexus 10 using Tapatalk HD
Ooh, good point. I forgot about VSync.
Sent from my Nexus 10 using Tapatalk HD
Anyone grab the newest update with the new Ultra High Quality/100% scale settings? I only average 20 FPS now and a lot of areas are much lower. It also (even with the Trinity kernel) lasts no more than a few seconds at 1700 MHz on each core (if you're lucky) before throttling down to 1000 MHz on each core.
On my Nexus 4 the Ultra High Quality setting seems to have little to no effect on benchmarks, as I average about the same (52 FPS).
EDIT: Held the unit in front of a blasting AC unit to prevent thermal throttling and got the same score.
Related
Despite both having the same GPU, I have heard that the SGS3 version will somehow be better? Higher clock speed?
Can anyone shed some light on this?
otester said:
Despite both having the same GPU, I have heard that the SGS3 version will somehow be better? Higher clock speed?
Can anyone shed some light on this?
It also has more cores, I believe.
FISKER_Q said:
It also has more cores, I believe.
According to this they both have 4 cores.
http://en.wikipedia.org/wiki/Exynos
otester said:
According to this they both have 4 cores.
http://en.wikipedia.org/wiki/Exynos
Hmm, could've sworn there was a discussion about it having more cores when it was announced, my bad then.
As far as I know, the GPU will have a higher clock frequency. Also, both the S2 and S3 have the quad-core Mali-400MP.
Faryaab said:
As far as I know, the GPU will have a higher clock frequency. Also, both the S2 and S3 have the quad-core Mali-400MP.
This will largely be the decider on whether I get it or not.
(As I am designing a GPU intensive 3D game).
otester said:
This will largely be the decider on whether I get it or not.
(As I am designing a GPU intensive 3D game).
Well, S2 had the fastest Mobile GPU and now S3 has the fastest one. So if you really want the best GPU go for the S3 but S2 will also work really well.
Faryaab said:
Well, S2 had the fastest Mobile GPU and now S3 has the fastest one. So if you really want the best GPU go for the S3 but S2 will also work really well.
I need some proof though, no one seems to really know for sure, just want to be sure before splashing £500
This seems to explain it:
the main thing is the smaller process node design, the increased memory bandwidth, cleverer memory bandwidth architecture
Samsung announced that they have switched to high-k materials and metal gates (HKMG) and further claimed it can provide superior performance with less power than conventional poly-Si/SiON used at 45nm. Samsung demonstrated that Exynos 4212 (32nm version) SoC can produce 35% to 50% more horsepower than the older Exynos 4210 (45nm version). So clearly Exynos 4412 quad core wins the “CPU Horsepower” battle.
While they haven’t officially announced this yet, I believe the GPU is the same one they promised for the dual core 1.5 Ghz Exynos 4212 chip. They said that GPU had a 50% increase in performance over the current one in the Galaxy S2. This improvement is most likely possible because of the jump from 45nm for the dual core Exynos to the 32nm Exynos 4412 (they used the more efficient transistors to increase performance at the same or lower power consumption).
source
I had the Note, which had a dual-core 1.4 GHz Exynos and the same GPU, but at 1280x800. It could handle just about all the games thrown at it, e.g. Riptide, Asphalt 6. I only felt Modern Combat 3 could have had higher fps, although it was very smooth and playable.
eksasol said:
I had the Note, which had a dual-core 1.4 GHz Exynos and the same GPU, but at 1280x800. It could handle just about all the games thrown at it, e.g. Riptide, Asphalt 6. I only felt Modern Combat 3 could have had higher fps, although it was very smooth and playable.
With the game I'm working on, I just tried it on my phone today (I hadn't tested in a few months) and I noticed some lag with the new, faster animations.
otester said:
With the game I'm working on, I just tried it on my phone today (I hadn't tested in a few months) and I noticed some lag with the new, faster animations.
The thing is that if you are hardly able to run the game on a Mali-400MP then the game would lag like hell even on T2/T3.
Faryaab said:
The thing is that if you are hardly able to run the game on a Mali-400MP then the game would lag like hell even on T2/T3.
T2/T3?
Also, I wouldn't say it hardly runs: a minimum of 20 FPS, though there are lots of optimizations still to be done, such as LOD (which basically chops the poly count down on far-away models).
For lower-end devices:
Dynamic lighting turned off.
Light maps baked into the textures.
Normal maps removed.
Specular maps removed.
At the moment it probably matches or exceeds the quality of Shadowgun THD.
Check available benchmarks :)
You should have looked around more for available benchmarks.
This shows clearly that the S3 is much faster than the S2:
source and source
Hironimo said:
You should have looked around more for available benchmarks.
This shows clearly that the S3 is much faster than the S2:
source and source
I wanted to know the technical reason, I've already looked at the benchmarks.
otester said:
I wanted to know the technical reason, I've already looked at the benchmarks.
Higher clock, optimised drivers, maybe hardware optimisation
Mopral said:
Higher clock, optimised drivers, maybe hardware optimisation
People on the HTC Sensation forum extracted the Adreno 225 drivers and used them on their Adreno 220-powered phones, and they gained a huge performance boost!
As we know, the Adreno 225 is an Adreno 220 at double the frequency (thanks to the new manufacturing process of the CPU; the GPU itself still uses the same process...).
So can't someone extract the new drivers so everyone with a Mali-400 GPU can use them?
Because the S3's quad-core GPU frequency is 400 MHz; the S2's is only 275 MHz. I am a Chinese grade 9 student and am looking forward to the GS III.
Sent from my GT-S5570 using xda premium
Sorry for my bad English. The frequency is the reason that the 3D performance of some tablets with the Rockchip RK2918 board (Mali-400 MP2) is higher than the GS II: the Mali-400 on the RK2918 runs at 400 MHz, but the GS2's only at 275 MHz.
Sent from my GT-S5570 using xda premium
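Taking those two clocks at face value, the frequency difference alone would account for roughly 45% on GPU-bound workloads, assuming the same Mali-400 doing the same work per clock:

$$\frac{400\,\text{MHz}}{275\,\text{MHz}} \approx 1.45$$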
Just wondering whether clocking up the GPU might pose heating issues.
Sent from my GT-I9300 using xda premium
Update
It seems the graphics do stutter even in basic games such as Temple Run.
We all know by extrapolation from the One X that this phone is going to be a beast. The processing power is outrageously raw. It comes out of the box with a Quadrant score of 5,000+.
The thing I'm worried about, though, is that the graphics part of Quadrant only scored 2,150 on the EVO LTE.
Blah blah blah real-world use blah. Believe it or not, Quadrant is a standard that all phones will be measured against. It gives good insight into the phone. I know it's not the phone's true capabilities. Anyways...
Think we can get this phone overclocked to 2.0 GHz? Dream with me, people. Reviews are going around stating that even Temple Run stutters. The massive screen drains the GPU big time.
I'm not that worried about it though. Don't really play that many games. That's what I'll tell myself. Ya, sounds good.
The Evo 3D doesn't have any problems playing the games I want to play. The LTE will be no different.
And Quadrant will not be a standard for anything other than fanboy fapping.
Sent from my sticky fingered ice cream sandwich Evo 3D
newfireorange said:
We all know by extrapolation from the One X that this phone is going to be a beast. The processing power is outrageously raw. It comes out of the box with a Quadrant score of 5,000+.
The thing I'm worried about, though, is that the graphics part of Quadrant only scored 2,150 on the EVO LTE.
Blah blah blah real-world use blah. Believe it or not, Quadrant is a standard that all phones will be measured against. It gives good insight into the phone. I know it's not the phone's true capabilities. Anyways...
Think we can get this phone overclocked to 2.0 GHz? Dream with me, people. Reviews are going around stating that even Temple Run stutters. The massive screen drains the GPU big time.
I'm not that worried about it though. Don't really play that many games. That's what I'll tell myself. Ya, sounds good.
2 GHz might be possible for some phones, it just depends.
Quadrant aside, the graphics do stutter.
It has been proven here:
http://gizmodo.com/5910015/htc-evo-4g-lte-lightning-review-the-phone-that-would-be-king-but-isnt/
Ctrl+F "This is weird"
That will jump you to the place in the article that states this. Sorry, couldn't post links earlier. I wanted to though.
Graphics also stutter in the One X because it uses the same chip.
Probably due to filesystem cache getting dumped.
If the Evo 3D can run it fine, the Evo LTE, which is much faster with a better GPU, can run it with no problems. The website is stupid. Anyone can make a game stutter.
Sent from my PG86100 using Tapatalk 2
bloodrain954 said:
If the Evo 3D can run it fine, the Evo LTE, which is much faster with a better GPU, can run it with no problems.
The EVO LTE also has a higher-resolution screen than the EVO 3D, so the logic is not as straightforward.
The units are also not running the final release firmware. There's bound to be a few hiccups in places.
Sent from my HTC Incredible 2 using Tapatalk 2
EVO 4G LTE is a beast no doubt, but I'm waiting for a phone with an S4 processor and Adreno 300 series GPU.
The EVO LTE has an Adreno 225 GPU, the same as the 220, just clocked higher for increased performance. Supposedly better drivers will also be implemented to increase performance. I believe that's good enough for me. The only game I know that can make my Epic 4G stutter is GTA III. No other game can even make it come close to having a hiccup. This lets me rest assured that the EVO LTE will be rock solid for at least 12 months. Unless Gameloft's N.O.V.A. 3 destroys the GPU...
And BTW, if you don't like Quadrant, check out AnTuTu. About the same results. People cry that they mean NOTHING. As far as I'm concerned, it means something when we have nothing else. I don't just buy a phone and say dang, I thought since it was new it would be crazy good. Remember when the original EVO 4G came out and it had that 30 FPS limit?
HA!
Sent From My Sprint Galaxy Nexus via XDA Premium
Seriously, people must think phones are mobile gaming systems now. I can't stand all of the "yeah I'm gonna wait for the 320 Adreno GPU blah blah" talk when I'm sure half of them don't know what GPU even stands for, and a bunch of them will use this awesome power to play Words with Friends and Temple Run. Seriously people, I get the whole "I want the best in the business" thing, but at least consider if it is even USEFUL to you personally.
Sorry to the many of you who DO intend to push the limits of the graphics and such, but so many people Wikipedia things and then start complaining about stuff they don't even understand.
I, for one, am ecstatic about this processor chip, with the "modest" 225 Adreno GPU. Wah wah, poor me and my amazing, top-of-the-line processor.
I believe it had a Quadrant score of 4000 per the PhoneDog review.
---------- Post added at 02:25 PM ---------- Previous post was at 02:20 PM ----------
Actually 4913. There is no doubt this is a very powerful phone.
PsiPhiDan said:
Seriously, people must think phones are mobile gaming systems now. I can't stand all of the "yeah I'm gonna wait for the 320 Adreno GPU blah blah" talk when I'm sure half of them don't know what GPU even stands for, and a bunch of them will use this awesome power to play Words with Friends and Temple Run. Seriously people, I get the whole "I want the best in the business" thing, but at least consider if it is even USEFUL to you personally.
Sorry to the many of you who DO intend to push the limits of the graphics and such, but so many people Wikipedia things and then start complaining about stuff they don't even understand.
I, for one, am ecstatic about this processor chip, with the "modest" 225 Adreno GPU. Wah wah, poor me and my amazing, top-of-the-line processor.
Geez, people complain when others don't use a "search" engine and inform themselves about a certain topic, and now you're complaining when people DO search and return with questions...
Mcoupe said:
Geez, people complain when others don't use a "search" engine and inform themselves about a certain topic, and now you're complaining when people DO search and return with questions...
I'm sorry if I didn't make my target clear, but I'm not complaining about people who are asking questions or using the search function (???). I'm just tired of people bashing a phone's GPU when they don't have a clue what 225 vs 320 GPU will even do for them, or if it even matters for their personal usage. They just want higher numbers for every spec in existence.
Again, I'm not attacking people asking questions AT ALL. No harm intended there.
Sent from my PC36100 using XDA
It's just as bad with the quad core vs dual core garbage. Any of these new phones will be lightning fast
woody296 said:
It's just as bad with the quad core vs dual core garbage. Any of these new phones will be lightning fast
Click to expand...
Click to collapse
One up from what you said, I actually think the dual will be much better than the quad. While the quad may be a small fraction better at very intense gaming, it will also suck down your battery life MUCH faster, because it needs external LTE radios in the US and is based on older 32nm semiconductors, while the new S4 Krait Snapdragons are 28nm; the power savings are enormous.
Well, let me clarify for some people: 2k graphics score, 5k total Quadrant score.
Somebody went on a rant about how you won't even push the graphical limits with this outdated GPU, blah blah. This is an opinion and varies user to user. Chillax. I was just a little disappointed when I found this out. Still a great phone. Gosh, it's like walking on eggshells here. Join together, peeps.
My EVO LTE just pulled around 5500 total with over 3000 in 3D graphics. I haven't gamed much, but I've experienced no graphical glitches in the 8 days I've had it.
Sent from my EVO using Tapatalk 2
kellybrf said:
My EVO LTE just pulled around 5500 total with over 3000 in 3D graphics. I haven't gamed much, but I've experienced no graphical glitches in the 8 days I've had it.
Sent from my EVO using Tapatalk 2
You tease.....
Hi geeks,
check out this wiki for general information about the new Exynos dual core:
http://www.arndaleboard.org/wiki/index.php/Resources
Of course this file might be of special interest...
BTW:
This document is marked as confidential but it's publicly available.
So Mike, if this is against rules... tell me!
Best regards,
scholbert
Antutu benchmark of Nexus 10
http://www.antutu.com/view.shtml?id=2960
Quite impressive, because AnTuTu depends heavily on the number of cores and the clock rate rather than the architecture (a 1.5 GHz Snapdragon S3 got ~6600 while the 1.4 GHz Exynos 4210 in the GNote had only ~6300).
And the NAND flash is pretty good too: 16.6 MB/s write and >50 MB/s read (in fact my Note 2 scores 200 points in SD card read with the mark >50 MB/s too, but this one is 10% faster).
I would love to see one of those fancy graphics comparing the Nexus 10's performance with "the others".
We know that it's better but how much better?
And if you want more specs, here are two other benchmarks:
SunSpider:
GLBenchmark:
If anyone is interested in the AnTuTu scores across the Nexus 4, 7 and 10 devices, I've cut and pasted them together from the AnTuTu site linked above...
These results are as recorded by AnTuTu itself at these links...
Nexus 10 - http://www.antutu.com/view.shtml?id=2960
Nexus 7 - http://www.antutu.com/view.shtml?id=1282
Nexus 4 - http://www.antutu.com/view.shtml?id=2940
The 3D and 2D scores on the Nexus 10 keep up with the other two devices, which seems quite impressive considering the higher resolution.
Keitaro said:
If anyone is interested in the AnTuTu scores across the Nexus 4, 7 and 10 devices, I've cut and pasted them together from the AnTuTu site linked above...
These results are as recorded by AnTuTu itself at these links...
Nexus 10 - http://www.antutu.com/view.shtml?id=2960
Nexus 7 - http://www.antutu.com/view.shtml?id=1282
Nexus 4 - http://www.antutu.com/view.shtml?id=2940
The 3D and 2D scores on the Nexus 10 keep up with the other two devices, which seems quite impressive considering the higher resolution.
Odd, why is the CPU on the Nexus 10 slower than the others? I thought that the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali 604 or whatever it is).
Also, Scumbag Antutu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape, apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Kookas said:
Odd, why is the CPU on the Nexus 10 slower than the others? I thought that the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali 604 or whatever it is).
Also, Scumbag Antutu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape, apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Need to keep in mind that the N10 is running at a MUCH higher resolution; that most likely has something to do with it. Had the processor been in the same device as the 4 and 7, you would see a substantial difference.
tkoreaper said:
Need to keep in mind that the N10 is running at a MUCH higher resolution; that most likely has something to do with it. Had the processor been in the same device as the 4 and 7, you would see a substantial difference.
But I wouldn't expect the load of that higher res to go to the CPU, just the GPU (so 2D and 3D scores). Does the CPU get involved in video processing in SoCs?
via Tapatalk
Does the Nexus label mean that all drivers for this device will be open source? As in, none of the BS that the devs for the i777 are experiencing with Samsung completely unwilling to release specs for the Exynos/Mali combo in that device?
EDIT: Answered my own question. The AOSP site itself tells you to go get the blobs for specific devices if you want to build. So no. Ah well, my concern would be fully functional OS updates, and the Nexus label DOES solve that - at least for a couple of years after release.
These I/O results look promising. A lot better than the Transformer Prime I had.
While it's nice to see numbers, you should always take them with a grain of salt (for obvious reasons). I for one am just going to wait till I have my Nexus 10 in my hands and see how she flies. I have no doubts that it will run today's apps with no issues and last you easily for a year-plus. I for one am drooling over the display (especially if non-PenTile). Just hope Samsung did something to address the possibility of pixel fatigue and ghosting/image retention.
Kookas said:
Odd, why is the CPU on the Nexus 10 slower than the others? I thought that the A15 was supposed to be the fastest thing on the market right now, which would go nicely with the fastest GPU (Mali 604 or whatever it is).
Also, Scumbag Antutu forces the tablet into portrait. I would love it if Google could somehow force apps to run in landscape, apps should never be in portrait on a 16:10 tablet this big unless I deem it so by orienting it in portrait.
via Tapatalk
Because it's the AnTuTu benchmark. As I said in #2, AnTuTu always prefers the number of cores and the frequency of the cores rather than the architecture.
That's why the 1.5 GHz Crapdragon S3 has a higher score than the 1.4 GHz Exynos 4210 in the GNote.
hung2900 said:
Because it's the AnTuTu benchmark. As I said in #2, AnTuTu always prefers the number of cores and the frequency of the cores rather than the architecture.
That's why the 1.5 GHz Crapdragon S3 has a higher score than the 1.4 GHz Exynos 4210 in the GNote.
What are you talking about? The Snapdragon is a better architecture with fewer cores. You have it backwards.
Edit: thought you meant snapdragon s4.
Sent from my Galaxy Nexus using xda premium
Here are the benchmarks from Anand's review of Chromebook:
http://www.anandtech.com/show/6422/samsung-chromebook-xe303-review-testing-arms-cortex-a15/6
I am almost sure that the N10 will be better optimized than the Chromebook, purely because of the resources dedicated to Android. It also shows that Samsung is still Google's preferred partner in terms of hardware.
zetsumeikuro said:
While it's nice to see numbers, you should always take them with a grain of salt (for obvious reasons). I for one am just going to wait till I have my Nexus 10 in my hands and see how she flies. I have no doubts that it will run today's apps with no issues and last you easily for a year-plus. I for one am drooling over the display (especially if non-PenTile). Just hope Samsung did something to address the possibility of pixel fatigue and ghosting/image retention.
It's not an AMOLED display, it's a Super PLS (LCD), isn't it?
blackhand1001 said:
What are you talking about? The Snapdragon is a better architecture with fewer cores. You have it backwards.
Sent from my Galaxy Nexus using xda premium
Do some research, man! You can search for how badly the Crapdragon S3 MSM8660 compares to the Exynos 4210 in the same Galaxy Note (US variant vs. international variant).
philos64 said:
And if you want more specs, here are two other benchmarks:
SunSpider:
GLBenchmark:
That SunSpider score is bad vs. the Chromebook; it must be the resolution, which is high even for high-end PCs.
The epic screen was always going to eat most of the resources. The question is: whatever benchmark the final (and in future surely improved) SW version produces, is it enough for smooth operation? That answer will most likely be yes. The Chromebook shows huge HW potential, but it's also more optimized at this moment. Patience, my lads.
The hardware has the potential
Hi Guys,
Came across this while researching the Exynos 5250. Looks like the hardware is very capable of handling the WQXGA resolution, with memory bandwidth and power to spare. This white paper also mentions support for 1080p 60fps wireless display, so I hope Miracast will be a reality as well; Google just needs to step up and utilize the hardware to its full potential. It's an interesting read nonetheless.
Sorry, can not post links yet.. replace _ with . and then try.
www_samsung_com/global/business/semiconductor/minisite/Exynos/data/ Enjoy_the_Ultimate_WQXGA_Solution_with_Exynos_5_Dual_WP.pdf
---------- Post added at 04:26 AM ---------- Previous post was at 04:20 AM ----------
oneguy7 said:
Hi Guys,
Came across this while researching the Exynos 5250. Looks like the hardware is very capable of handling the WQXGA resolution, with memory bandwidth and power to spare. This white paper also mentions support for 1080p 60fps wireless display, so I hope Miracast will be a reality as well; Google just needs to step up and utilize the hardware to its full potential. It's an interesting read nonetheless.
Sorry, can not post links yet.. replace _ with . and then try.
www_samsung_com/global/business/semiconductor/minisite/Exynos/data/ Enjoy_the_Ultimate_WQXGA_Solution_with_Exynos_5_Dual_WP.pdf
If the link does not work, google exynos 5 dual white paper.
I ran Quadrant and compared the results with those of my old Epic 4G.
The Epic 4G graphics (3D) score is 1666. The N10 graphics (3D) score is 2087. See below.
Epic 4G: rooted, FC09 MTD deodexed, Shadow kernel, io/cpu = deadline/ondemand
Nexus 10: not rooted
Hi Guys,
I had a Nexus 10 for like a week, but unfortunately it was faulty. I returned it and thought I'd wait a month for "new stock" and then re-purchase!
For the week I had it... I loved it... and now I can't get any Nexus 10s from the Play Store - I'm just sat here waiting.
I came across this article regarding a revised Nexus 10 device!! Anyone heard anything else?
Here's the article:
http://www.brightsideofnews.com/new...let-at-mwc-2013-quad-core2c-gpgpu-inside.aspx
Thought I'd throw it out there!
If the article is true - I'm gonna be waiting for the revised version!
Peace Out
I'm not gonna hold my breath for any new Nexus device at MWC. I think the next thing we'll see is the successor to the Nexus 7 at Google I/O in May.
I don't know... I don't see many people complaining about the power of the Tegra 3 in the Nexus 7, while the article is correct that, pushing this resolution, the Nexus 10 is a bit underwhelming on the performance side. I'd see them wanting a top-performing 10" to one-up the iPad 4. They've beaten the iPad 4 in dpi (not color); if they can boost performance and hopefully adjust the color values, they'd do quite well.
I just see making these adjustments to a Nexus 10 (2) being more needed than any adjustments on the Nexus 7.
If true, would this solve the issue with missing Miracast?
Sent from my GT-I9100 using xda premium
I believe all the issues have to do with how little rigidity there is in the chassis. Gorilla Glass 2 is just too thin to stay flat with such a long, narrow panel. It's the uneven backlight that makes the light bleed and colors so bad. If the device were stiffer and/or had thicker glass, I'm sure the screen would be much better. Almost all the screen issues, I believe, stem from this.
Sent from my 5th and final Nexus 4
---------- Post added at 04:53 PM ---------- Previous post was at 04:50 PM ----------
To expand on that... The screen is not flat really. I think in their desire to make it light and thin that they didn't anticipate the screen sag. I had my screen replaced by Samsung and the screen was either pressed in too hard or was warped from the start because it had worse bleed than the original. If I were to grip the tablet from the edges and push back as if to snap the device in half, all bleed went away.
Sent from my 5th and final Nexus 4
I hope Google won't crap on early adopters like that and just work on optimizing software with the current hardware..... I have no more money to upgrade!
c0minatcha said:
Here's the article
With the combination of "brightsideofthenews.com" and "according to the people we spoke with" as attribution for the source in the article I'd say this is probably below the scale of a rumor.
As far as the user experience goes for me so far, the tablet itself hasn't really slowed me down at any point due to speed. Playing 1080p has been smooth... I like the 1600p display, as it matches the native resolution of my desktop monitor, so it works really well for remote desktop.
I understand how Google wants to take the tech leadership from Apple. If that can somehow lead developers to start developing better apps than iOS, then that would be so great. So tired of playing second fiddle to so many of these apps that have much better iOS versions.
As far as the hardware goes, I would gladly pay a little extra if they could include 64 or 128 GB of storage. Better yet, a micro SDXC slot would be nice... Also a 4G version would be good. And if they could ever change the software to allow ad hoc networks... I think these are more important than an 8-core CPU/GPU.
Jayrod1980 said:
I believe all the issues have to do with how little rigidity there is in the chassis. Gorilla Glass 2 is just too thin to stay flat with such a long, narrow panel. It's the uneven backlight that makes the light bleed and colors so bad. If the device were stiffer and/or had thicker glass, I'm sure the screen would be much better. Almost all the screen issues, I believe, stem from this.
Sent from my 5th and final Nexus 4
---------- Post added at 04:53 PM ---------- Previous post was at 04:50 PM ----------
To expand on that... The screen is not flat really. I think in their desire to make it light and thin that they didn't anticipate the screen sag. I had my screen replaced by Samsung and the screen was either pressed in too hard or was warped from the start because it had worse bleed than the original. If I were to grip the tablet from the edges and push back as if to snap the device in half, all bleed went away.
Sent from my 5th and final Nexus 4
It is not nearly as flexible as you make it sound. If this were truly the problem with the display then flexing the device would cause backlight shifts and screen distortions. This, however, is not the case in my experience.
wildpig1234 said:
As far as the user experience goes for me so far, the tablet itself hasn't really slowed me down at any point due to speed. Playing 1080p has been smooth
Same here, after some tweaking.
It's reference quality in playback and smoothness.
Also, the screen is out of this world.
I say it already after 1 month of use:
I have never seen a display producing "more" detail and sharpness from 1080p Blu-ray material.
It's almost as if the N10 knows what color the scaled-up pixel should have.
Blu-ray source material looks more like 4K on this screen.
Why the hell would I change this for a rev 2?
Because NFSMW isn't running at 60fps? ..nah.. that **** should be played on a PC with a wheel.
Also remember:
Performance improvements can come with updates in the near future.
You could always tweak settings for specific games to run even better.
Nvidia has done it for a long time.
Google can also do it when they see that only 99% of the games run at 60fps.
My next tablet upgrade will be when they run at 120Hz at 120FPS,
not before.
Just reading through the above: the small use I had of the Nexus 10 was a great experience!
I've been checking the stock in the UK and it hasn't been replenished... then I read that article... and thought, as usual, now should I wait? (Not that I have any choice right now!)
Maybe instead of concentrating on the N10 v2, they should concentrate on getting the current version back in stock and their accessories in order for all their products!!
And to reiterate: the screen was super dope!!
Could the low stock indicate that they are working on a rev 2?
Sent from my GT-I9100 using xda premium
Performance-wise this tablet is a milestone in the Android world... Dafuq did I just read?!
Here is another place running the story. Sounds like copy/paste "journalism" to me.
http://nvonews.com/2013/01/21/new-g...imus-g2-ascend-w2expected-for-mwc-2013-specs/
More Powerful Google Nexus 10: It was only recently that we received the large 10-inch Nexus tablet from Google. The device, made by Samsung, has an amazing 2560 x 1600 display. This high resolution is highly praised, as it beats both the iPad 3 and iPad 4. But the device lacks a powerful processor. Samsung used its Exynos 5 chipset with a dual-core Cortex-A15 CPU and a Mali-T604-class graphics processor.
Many customers find the device underpowered because of its dual-core CPU and less competent graphics processor. As per rumors, Google is likely to produce a quad-core version of the device. Along with the CPU upgrade, the tablet will have an 8-core Mali-T628 graphics processor. The projected tablet will have a fresh Android version and the same 2GB of RAM.
borchgrevink said:
Could the low stock indicate that they are working on a rev 2?
Sent from my GT-I9100 using xda premium
I don't think so, but it would be great; it would lower their RMA costs, since so many people returned their tablets.
Garbage
I doubt they will update it until a year after the first one came out, other than to add a SIM card.
Incremental upgrade.....cool
Sent from my Nexus 10 using XDA Premium HD app
I'm sure Google is toting around a bunch of tablet prototypes, and these guys might have even seen one, but I doubt it will be a revision 2 of the same product. It will probably be the follow-up model, and we'll see it in about a year from now, which is fine and expected. The only thing that makes this somewhat plausible is that Google is the worst company in the world at keeping secrets. Or they are the most genius speculation-generating company in existence.
I can't see moving off the N10 (personally) in a year. The mobile market's technology advancement is moving at a disgusting rate though. I love bleeding-edge technology and such, but they need to slow it down a notch.
You guys certainly are an emotional lot.
First, Exynos 5 Dual is more than capable of running a 10" tablet. It's a more modern architecture than Exynos 4 Quad with more memory bandwidth. It's on par with S4 Pro which is the other new-gen SoC being heavily used right now.
Samsung's Exynos 5 Dual integrates two ARM Cortex A15 cores running at up to 1.7GHz with a shared 1MB L2 cache. The A15 is a 3-issue, Out of Order ARMv7 architecture with advanced SIMDv2 support. The memory interface side of the A15 should be much improved compared to the A9 [Exynos 4]. The wider front end, beefed up internal data structures and higher clock speed will all contribute to a significant performance improvement over Cortex A9 based designs. It's even likely that we'll see A15 give Krait a run for its money, although Qualcomm is expected to introduce another revision of the Krait architecture sometime next year to improve IPC and overall performance. The A15 is also found in TI's OMAP 5. It will likely be used in NVIDIA's forthcoming Wayne SoC, as well as the Apple SoC driving the next iPad in 2013.
The Mali-T604 is a huge leap above anything Samsung's used before. And if there's a quad-core version of the Exynos 5, it'll use the exact same Mali GPU, just as the dual and quad-core versions of the S4 Pro share the same GPU.
Samsung's fondness of ARM designed GPU cores continues with the Exynos 5 Dual. The ARM Mali-T604 makes its debut in the Exynos 5 Dual in quad-core form. Mali-T604 is ARM's first unified shader architecture GPU, which should help it deliver more balanced performance regardless of workload (the current Mali-400 falls short in the latest polygon heavy workloads thanks to its unbalanced pixel/vertex shader count). Each core has been improved (there are now two ALU pipes per core vs. one in the Mali-400) and core clocks should be much higher thanks to Samsung's 32nm LP process. Add in gobs of memory bandwidth and you've got a recipe for a pretty powerful GPU. Depending on clock speeds I would expect peak performance north of the PowerVR SGX 543MP2 [iPad 3], although I'm not sure if we'll see performance greater than the 543MP4 [iPad 4].
http://www.anandtech.com/show/6148/samsung-announces-a15malit604-based-exynos-5-dual
So there's no rush to get a Rev 2 of the N10 out, as the performance increase from Exynos Dual to Quad wouldn't be that dramatic and the GPU would be exactly the same. The h/w for the N10 is mostly provided by Samsung and the components are all latest generation. The display has an enormous number of pixels, but if you read the article, the h/w is more than capable of supporting it. So any issues with performance will hopefully be addressed with s/w updates vs. the need for an emergency h/w update.
Do people still think that these random reboots are a hardware problem of the N10? It is a 4.2 problem. If you have 4.2 on a Nexus 4 or Nexus 7, they all get these random reboots. I just think that people return/replace their devices without doing any reading.
Version 2? It looks like it'd have a better chipset, but I don't know where it stops. It's like Samsung developing the Octa chipset. I just don't think that we need any more power than the Snapdragon S4 Dual. That already had plenty of power, and now we are holding the A15-based Exynos 5 Dual. It's like with your desktop computer: you could have an i7 6-core 12-thread 37XXk CPU. You bought it for $600 but you are probably never going to use all its power for years to come.
Last summer, I decided to buy a Nexus 7 to use mainly as an ebook reader. It's perfect for that with its very sharp 1280x800 screen. It was my first Android device and I love this little tablet.
I'm a fan of retro gaming and I install emulators on every device I have: Pocket PC, Xbox, PSP Go, iPhone, iPad 3, PS3. So I discovered that the Android platform has one of the most active communities for emulation fans like me, and I bought many emulators, including all those made by Robert Broglia (the .EMU series). They ran great on the N7, but I found that 16GB was too small, as was the screen.
I waited and waited until the 32 GB Nexus 10 became available here in Canada and bought it soon after (10 days ago). With its A15 cores, I was expecting the N10 to be a great device for emulation, but I am now a little disappointed. When buying the N10, I expected everything to run faster than on the N7 by a noticeable margin.
Many emulators run slower on the N10 than on the N7. MAME4droid and MAME4droid Reloaded are no longer completely smooth with more demanding ROMs; Omega 500, Colleen, UAE4droid and SToid are slower, and some others needed much more tweaking than on the N7. I'm a little extreme on accuracy of emulation and I like everything to be as close to the real thing as possible. A solid 60 fps for me is a must (or 50 fps for PAL machines).
On the other side, there are other emus that run very well: the .EMU series and RetroArch, for example. These emulators are much more polished than the average quick port and they run without a flaw. They're great on the 10-inch screen and I enjoy them very much. The CPU-intensive emulators (Mupen64Plus AE and FPse) gained some speed, but less than I anticipated.
So is this because of the monster Nexus 10's 2560x1600 resolution? Or is it because of limited memory bandwidth? Maybe some emulators are not tweaked for the N10 yet. I wish some emulators had the option to set a lower resolution for rendering and then upscale the output. I think that many Android apps just try to push frames at the native resolution without checking first if there is a faster way.
The N7 has a lower-clocked 4-core CPU but only 1/4 the pixels. I think that it's a more balanced device than the N10, which may have a faster dual-core CPU but too many pixels to push. It's much like the iPad 3, which was twice as fast as the iPad 2 but had a 4x increase in resolution.
I am now considering going for a custom ROM on the N10, but I wonder if I will see an increase in emulation speed. Maybe those of you who did the jump can tell me. I'm thinking about AOKP, maybe.
Any suggestion on that would be appreciated, thanks!
The emulators just need to be tweaked a bit to better perform on the completely different processor architecture. Really, our processor is far more powerful than the Nexus 7's, so the emulators should run faster. I too am a fan of the old games, and I play Super Nintendo and Game Boy Advance (and some Color) games quite often. I find performance to be perfect with no issues at all, but then again those aren't exactly "demanding" emulators.
We do not have any sort of memory bandwidth limitation on the Nexus 10. The tablet has been designed to give the full needed 12.8 GB/s of memory bandwidth that is required for 2560x1600 resolution.
EniGmA1987 said:
The emulators just need to be tweaked a bit to better perform on the completely different processor architecture. Really, our processor is far more powerful than the Nexus 7's, so the emulators should run faster. I too am a fan of the old games, and I play Super Nintendo and Game Boy Advance (and some Color) games quite often. I find performance to be perfect with no issues at all, but then again those aren't exactly "demanding" emulators.
We do not have any sort of memory bandwidth limitation on the Nexus 10. The tablet has been designed to give the full needed 12.8 GB/s of memory bandwidth that is required for 2560x1600 resolution.
Hmm, if no memory bandwidth limitation exists on the N10, wouldn't I be able to run GTA 3 at 100% screen resolution and not have significantly lower FPS, as compared to 50% resolution?
Even Beat Hazard Ultra seems to be a bit laggy on the N10. When I inquired about it to the developer, he said:
Having to render to that size of screen [2560x1600] will slow the game down. It’s called being ‘fill rate bound’. Even for a good processor it's a lot of work as the game uses quite a lot of overdraw.
The solution is to draw everything to a smaller screen (say half at 1280x800) and then stretch the final image to fill the screen.
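To put rough numbers on the developer's "fill rate bound" comment (the 3x overdraw factor below is a made-up example, not the game's actual figure):

$$2560 \times 1600 \times 3 \times 60 \approx 737\,\text{Mpix/s} \quad \text{vs.} \quad 1280 \times 800 \times 3 \times 60 \approx 184\,\text{Mpix/s}$$

which is why rendering at half the dimensions and stretching helps so much on fill-rate-bound games.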
A sad truth: my Nexus 10 gets damn hot and I have to play games at 1.4 or 1.2 GHz. That sucks.
Sent from my XT925 using xda app-developers app
espionage724 said:
Hmm, if no memory bandwidth limitation exists on the N10, wouldn't I be able to run GTA 3 at 100% screen resolution and not have significantly lower FPS, as compared to 50% resolution?
Even Beat Hazard Ultra seems to be a bit laggy on the N10. When I inquired about it to the developer, he said:
But fillrate isn't memory bandwidth. We need both more MHz and more raster operations to get a higher fill rate in pixels per second. We can overclock the GPU to get the MHz, and that will help, but we have to find a way to deal with the higher heat output from that too. More ROPs are impossible, as how many we have is fixed in hardware. If we ever get to overclock up to around 750 MHz then we should see a 30-40% improvement in fill rate. At that point we may have memory bandwidth problems, but we won't know for sure until we get there. But the 12.8 GB/s of bandwidth that we currently have is enough to support 2560x1600 resolution at our current GPU power. Our Nexus 10 also has the highest fillrate of any Android phone or tablet to date, about 1.4 Gtexels/s. And if we had memory bandwidth limitations, then we would see no improvement at all from the current overclock we do have up to 612-620 MHz, because the clock speed wouldn't be where the bottleneck is. Yet we can clearly see in benchmarks and real gaming that we get FPS increases with higher MHz; thus our current problem is the fillrate and not the memory bandwidth.
Also, the solution is not to render the game at half the resolution, as that is a band-aid on the real problem. If the developer of a game would code the game properly we wouldn't have this problem, or if they don't feel like doing that then they should at least stop trying to put more into the game than their un-optimized, lazy project is capable of running nicely.
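For reference, the 30-40% figure follows from the clocks alone, assuming fill rate scales linearly with GPU frequency and taking the Exynos 5250's stock Mali-T604 clock as 533 MHz:

$$\frac{750}{533} \approx 1.41 \qquad \frac{620}{533} \approx 1.16$$

i.e. about +41% at the hoped-for 750 MHz versus about +16% at the current 612-620 MHz overclock.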
espionage724 said:
Hmm, if no memory bandwidth limitation exists on the N10, wouldn't I be able to run GTA 3 at 100% screen resolution and not have significantly lower FPS, as compared to 50% resolution?
Even Beat Hazard Ultra seems to be a bit laggy on the N10. When I inquired about it to the developer, he said:
Click to expand...
Click to collapse
With that logic you could buy any video card for a PC and it would run any game at the resolution the video card supports. That isn't the case, because rendering involves more than just memory fill rate: there are textures, polygons, multiple rendering passes, filtering; it goes on and on. As EniGmA1987 mentioned, nothing has been optimized to take advantage of this hardware yet; developers were literally crossing their fingers hoping their games would run 'as is'. Thankfully the A15 CPU cores in the Exynos will be used in the Tegra 4 as well, so we can look forward to the CPU optimizations soon, which will definitely help.
Emulators are more CPU-intensive than anything else; give it a little time and you won't have any problems with your old-school games. Run the new 3DMark bench to see what this tablet can do: it runs at native resolution and it's not even fully optimized for this architecture yet.
2560*1600*4*60/1024/1024 = 937,3 MB/s for a 60 fps game at 32-bit depth. Most emulators don't use 3D functions so fillrate, rendering, overdraw won't be a factor. Most emulators are single-threaded (correct me if I'm wrong) and the A15 should shine in this particular situation and even more so in multi-threaded scenarios. With its out-of-order pipeline and greatly enhanced efficiency it should be perfectly suited for the job.
We have the fillrate, we have enough CPU power, and I'm still wondering why simple apps like emulators aren't much faster than that. Is it Android? Is it the Dalvik VM? Or is it because some emulators need to be written in native code instead of running on the Java VM? I'm not a developer and I have only minimal knowledge in this department. I can only speculate, but I'm curious enough about it that I started googling around to find out why.
Lodovik said:
2560*1600*4*60/1024/1024 = 937,3 MB/s for a 60 fps game at 32-bit depth
Just curious, but what is that calculation supposed to be? Total bandwidth needed? Cause I don't see your bit depth in there, unless the 4 is supposed to be that? If that is true then you are calculating on 4-bit color depth?
And then the result would just be the bandwidth required for pixel data to memory, wouldn't it? It wouldn't include texture data in and out of memory and other special functions like post-processing.
2560*1600 = number of pixels on the screen
4 = bytes per pixel at 32-bit depth
60 = frames per second
/1024/1024 = divide twice to get the result in MB
Actually, I made a typo: the result is 937.5 MB/s, or 0.92 GB/s. This is just a rough estimate to get an idea of what is needed at this resolution just to push all the pixels on the screen in flat 2D at 60 fps, assuming that emulators don't use accelerated functions.
My point was that with 12.8 GB/s of memory bandwidth, we should have more than enough even if this estimate isn't very accurate.
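Written out, with the typo corrected:

$$\frac{2560 \times 1600 \times 4 \times 60}{1024 \times 1024} = \frac{983{,}040{,}000}{1{,}048{,}576} \approx 937.5\,\text{MB/s} \approx 0.92\,\text{GB/s}$$

which is about 7% of the 12.8 GB/s quoted above.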
Thanks for the explanation
If there really were a memory bandwidth limitation, the newer Trinity kernels and the newest KTManta should help. In addition to the higher GPU speeds they both allow (KTManta up to 720 MHz), both kernels have increased memory speeds, which raise memory bandwidth to 13.8 GB/s, up from 12.8 stock.
Thanks for the info. There are so many configuration options available for the Nexus 10. I really enjoy having all those possibilities.
EniGmA1987 said:
If there really were a memory bandwidth limitation the newer Trinity kernels and newest KTManta should help. In addition to the higher GPU speed they both allow (KTManta up to 720MHz) both ROM's have increased memory speeds which increase memory bandwidth to 13.8GB/s, up from 12.8 on stock.
Lodovik said:
2560*1600*4*60/1024/1024 = 937,3 MB/s for a 60 fps game at 32-bit depth. Most emulators don't use 3D functions so fillrate, rendering, overdraw won't be a factor. Most emulators are single-threaded (correct me if I'm wrong) and the A15 should shine in this particular situation and even more so in multi-threaded scenarios. With its out-of-order pipeline and greatly enhanced efficiency it should be perfectly suited for the job.
We have the fillrate, we have enough CPU power, and I'm still wondering why simple apps like emulators aren't much faster than that. Is it Android? Is it the Dalvik VM? Or is it because some emulators need to be written in native code instead of running on the Java VM? I'm not a developer and I have only minimal knowledge in this department. I can only speculate, but I'm curious enough about it that I started googling around to find out why.
You are taking what I said out of context. I was responding to someone else, thus the "quote" above my post.
Since you posted, I loaded up some Super Nintendo, N64, and PlayStation games on my N10 without any issues. It may just be your setup. There are a lot of tweaks out there that could easily increase performance. One great and very simple one is enabling 2D GPU rendering, which is in Developer Options. Just do some searching. GPU overclocking won't help much since, as you said above, your games are only 2D. I am sure you can get them running just fine.