NPU activation, when? - Huawei Mate 10 Questions & Answers

When will Huawei activate the NPU, and what is the difference in NPU speed between the Mate 10 and the Mate 10 Pro?

kinjx11 said:
When will Huawei activate the NPU, and what is the difference in NPU speed between the Mate 10 and the Mate 10 Pro?
The NPU is activated automatically; for now it is used mainly for the camera and for battery management.

sonydesouza said:
The NPU is activated automatically; for now it is used mainly for the camera and for battery management.
I hope so, because it looks super fast:
https://wccftech.com/huaweis-kirin-...ragon-845-worrying-sign-san-diego-chip-maker/

Related

What CPU (number of cores) does this device have, and are all of the cores utilized?

Hi!
I'm keen to know whether this is running the dual-core or the quad-core version of the Snapdragon 400.
I've been looking for a reliable source without any luck, so can someone here who actually owns the device verify the number of active cores?
Thank you very much!
bernard black said:
Hi!
I'm keen to know whether this is running the dual-core or the quad-core version of the Snapdragon 400.
I've been looking for a reliable source without any luck, so can someone here who actually owns the device verify the number of active cores?
Thank you very much!
According to the internet it has 4 cores. Here is one of the sites I found: http://www.androidbeat.com/2014/07/samsung-gear-live-lg-g-watch-get-teardown-treatment/
GIYF
spiderflash said:
According to the internet it has 4 cores. Here is one of the sites I found: http://www.androidbeat.com/2014/07/samsung-gear-live-lg-g-watch-get-teardown-treatment/
GIYF
Thanks! I guess it definitely has four cores. It does, however, say:
It’s also believed that these are simply regulated to use only one core, and one core only.
Can someone perhaps run some of the following commands and figure this out: "top -m -d 1 -n 1", "cat /proc/cpuinfo", "cat /proc/version"? How many cores are actually available? It seems odd to me that all four cores would be utilized, given the very limited battery capacity.
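For anyone who wants to check, here is roughly what I have in mind, run from a PC over adb (just a sketch; it assumes USB debugging is enabled and that the watch exposes the standard Linux cpu/cpufreq sysfs nodes):
adb shell cat /proc/cpuinfo                    # one "processor" entry per core the kernel sees
adb shell cat /sys/devices/system/cpu/present  # e.g. "0-3" means four cores are physically present
adb shell cat /sys/devices/system/cpu/online   # which of those cores are actually online right now
adb shell top -m 5 -d 1 -n 1                   # quick look at load, limited to the top 5 processes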
Isn't it strange that they went ahead with such a "powerful" CPU/GPU combination? There are more energy-efficient options out there, such as the Cortex-A7. It's not as powerful, of course... but for a smartwatch?
bernard black said:
Thanks! I guess it definitely has four cores. It does, however, say:
Can someone perhaps run some of the following commands and figure this out: "top -m -d 1 -n 1", "cat /proc/cpuinfo", "cat /proc/version"? How many cores are actually available? It seems odd to me that all four cores would be utilized, given the very limited battery capacity.
Isn't it strange that they went ahead with such a "powerful" CPU/GPU combination? There are more energy-efficient options out there, such as the Cortex-A7. It's not as powerful, of course... but for a smartwatch?
This site features a /proc/cpuinfo dump. All four cores seem to be enabled in hardware; however, the Geekbench results show it is limited to the maximum performance of one core (probably a scheduler tweak).
http://arstechnica.com/gadgets/2014/06/reviewing-android-wears-first-watches-sometimes-promising-often-frustrating/2/
bernard black said:
Isn't it strange that they went ahead with such a "powerful" CPU/GPU combination? There are more energy-efficient options out there, such as the Cortex-A7. It's not as powerful, of course... but for a smartwatch?
Cool! My watch has more cores than my computer.
Well, I hope Google or the manufacturer can strike a balance between good performance and battery life, as this device replaces a regular watch whose battery lasts half a year. I have no problem charging the watch, but it would be nice, if possible, to extend the battery life by an extra day or two.
Android Wear should also have an option to minimize battery drain at night. There is no reason for my watch display to be on, or for Bluetooth to be active, during the night.
kartongsaft said:
Cool! My watch has more cores than my computer.
Are you kidding me? My other computer is 7 years old and already has a quad-core processor.
spiderflash said:
Are you kidding me? My other computer is 7 years old and already has a quad-core processor.
Unfortunately, no. I hope my current computer stops working one day, so I have a reason to buy a new one.
kartongsaft said:
Unfortunately, no. I hope my current computer stops working one day, so I have a reason to buy a new one.
My laptop fried 2 years ago, so I built a desktop. Much more fun than buying one.
Thanks Devs, from my LG G2

[Snapdragon] Quick Charge 2.0 hardware is built-in but not enabled

LINK : qualcomm.com/documents/quick-charge-device-list
According to this PDF from Qualcomm that lists QC3 and QC2 devices, the Xiaomi Redmi Note 3 "contains the hardware necessary to achieve Quick Charge 2.0" (page 4, bottom right).
Apparently Xiaomi has not enabled the feature. But it can probably be enabled by modifying the kernel.
Can the developers confirm that this can be done?
mottokeki said:
LINK : qualcomm.com/documents/quick-charge-device-list
According to this PDF from Qualcomm that lists QC3 and QC2 devices, the Xiaomi Redmi Note 3 "contains the hardware necessary to achieve Quick Charge 2.0" (page 4, bottom right).
Apparently Xiaomi has not enabled the feature. But it can probably be enabled by modifying the kernel.
Can the developers confirm that this can be done?
Actually the RN3P (kenzo) supports even QC 3.0, but the feature is not enabled by default by the manufacturer. Take a look here: qualcomm.com/products/snapdragon/processors/650
Here are the detailed processor specs; this is a very powerful and versatile processor & chipset. The Snapdragon 650 even supports NFC, but this device does not have that feature enabled.
Snapdragon 650 Processor Specs
CPU: up to 1.8 GHz hexa-core (2x ARM® Cortex™-A72, 4x ARM Cortex-A53)
GPU: Qualcomm® Adreno™ 510
DSP: Qualcomm® Hexagon™ DSP
Camera: up to 21 MP, dual Image Sensor Processor (ISP), Zero Shutter Lag
Video: up to 4K Ultra HD capture and playback, [email protected] capture, H.264 (AVC), H.265 (HEVC)
Display: up to Quad HD (2560x1600) on device, 1080p external display support
Audio: Qualcomm® Immersive Audio, Qualcomm® Snapdragon™ Voice Activation
LTE connectivity: X8 LTE, LTE Category 7 (up to 300 Mbps DL, up to 100 Mbps UL); downlink: 2x20 MHz carrier aggregation, 64-QAM; uplink: 2x20 MHz carrier aggregation, 16-QAM
Global Mode: LTE FDD and TDD, WCDMA (DB-DC-HSDPA, DC-HSUPA), TD-SCDMA, EV-DO and CDMA 1x, GSM/EDGE
Additional features: LTE Broadcast, LTE Dual SIM, HD Voice over 3G and VoLTE, Wi-Fi calling with LTE call continuity
Wi-Fi: Qualcomm® VIVE™ 1-stream 802.11n/ac with MU-MIMO
Location: Qualcomm® IZat™ Gen8C
Charging: Qualcomm® Quick Charge™ 2.0 and 3.0
Security: Qualcomm Haven™ security suite (Snapdragon StudioAccess™ Content Protection, Qualcomm® SafeSwitch™ Technology, Qualcomm® SecureMSM™ hardware and software foundation)
Memory: LPDDR3 933 MHz dual-channel
Process technology: 28nm HPm
RF: Qualcomm® RF360™ front end solution
USB: USB 2.0
Bluetooth: Bluetooth Smart 4.1
NFC: supported
Part number: 8956
@Luiggi79 Agreed. But the RN3 does not have the required antenna for NFC, even though the SD650 supports it.
However, it does have the hardware for Quick Charge 2.0, and I believe it could be enabled in the kernel (I don't know about QC 3.0, since the document does not mention it).
This is similar to the LG G2, which had official support for QC 1.0 but could successfully achieve QC 2.0 using a custom kernel.
What are the chances of getting this working?
And why did Xiaomi not include something which the chipset supports?
Both of those features would have made this a killer handset; are they that much more expensive to implement?
gotbass said:
What are the chances of getting this working?
And why did Xiaomi not include something which the chipset supports?
Both of those features would have made this a killer handset; are they that much more expensive to implement?
If it has the hardware capability, then it should definitely work. But the battery should be able to handle it too.
Xiaomi probably didn't enable it because they did not have the licensing for this handset. It would add to the cost, I guess.
Sent from my Redmi Note 3 using XDA-Developers mobile app
hulgo said:
If it has the hardware capability, then it should definitely work. But the battery should be able to handle it too.
Xiaomi probably didn't enable it because they did not have the licensing for this handset. It would add to the cost, I guess.
Sent from my Redmi Note 3 using XDA-Developers mobile app
Do you think we could unlock these features through development?
The battery might not be able to handle it? It gets pretty hot already when charging.
meangreenie said:
The battery might not be able to handle it? It gets pretty hot already when charging.
What charger are you using? I haven't used the official charger; I'm using my Samsung charger, which is just a standard 2.0 A charger from my Note 3 or something. The phone doesn't warm up even a little bit. In fact, I have never felt this phone warm up, even after 30 minutes of GPS use or charging. Unusual!
Some developers have already been testing quick charge, and apparently it is working. Of course, no one knows what effects it could have in the mid and long term. I believe it was in one of the CyanogenMod ROM threads.
>1080p external display support
Since the RN3 does not officially support an external display but the 650 does, I was wondering whether this could be enabled in software, like quick charge, or whether it needs some special hardware connection, like NFC. Does anyone know? I would love to be able to connect the RN3 to an external display.
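For what it's worth, here are a couple of read-only checks that at least show whether the OS knows about any external display path (just a sketch; the output and sysfs layout vary a lot between devices, and a missing entry doesn't prove the hardware pins are absent):
adb shell dumpsys display | grep -iE "hdmi|external"   # any external display known to the display service?
adb shell ls /sys/class/graphics/                      # extra fb1/fb2 nodes sometimes correspond to HDMI/MHL interfaces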
syl0n said:
What charger are you using? I haven't used the official charger; I'm using my Samsung charger, which is just a standard 2.0 A charger from my Note 3 or something. The phone doesn't warm up even a little bit. In fact, I have never felt this phone warm up, even after 30 minutes of GPS use or charging. Unusual!
I can confirm that it does not get hot at all while charging, using the original 2 A charger that came with the device.
I use a simple trick to counter the charging heat: remove the back cover if you use any, and put your phone on a slightly damp cloth while charging. That's it.
Fast charging works on my device: 1 hour to a full charge.
Hmmm.
Then, sir, tell us how you do that??
AFAIK, to have full support for Quick Charge 2.0/3.0 on any Snapdragon device, in this case the Redmi Note 3:
- The hardware (SoC) should be compatible with QC. (Present)
- The OEM should pay the licensing fee for QC to Qualcomm, around $5 per device for QC 2.0. (Absent)
- The kernel should have QC enabled. (Absent)
- The Power Management Integrated Circuit (PMIC) should be capable of handling QC. (Not sure)
- The battery should be capable of rapid charging via QC. (Not sure)
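As a rough way to see whether the QC/HVDCP handshake actually happens on a given kernel, you can watch the USB input voltage while charging (a sketch only; these power_supply sysfs names are common on Qualcomm devices but are not guaranteed on every ROM, and dmesg may need root):
adb shell cat /sys/class/power_supply/usb/voltage_now     # around 9000000 (9 V) suggests QC 2.0 negotiated; ~5000000 is plain 5 V
adb shell cat /sys/class/power_supply/battery/current_now # charge current in microamps
adb shell dmesg | grep -i hvdcp                           # HVDCP is what Qualcomm kernels call the Quick Charge handshake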
sushil888 said:
Then, sir, tell us how you do that??
I made my own custom kernel. I think that's the reason why Xiaomi allows unlocking the device: so that we can enable this feature ourselves and they can save a few bucks. Also fixed the network issue.
filthyrich77 said:
I made my own custom kernel. I think that's the reason why Xiaomi allows unlocking the device: so that we can enable this feature ourselves and they can save a few bucks. Also fixed the network issue.
Could you share that with us? Is there a link to the CyanogenMod thread discussing QC 2.0/3.0 on this phone?
filthyrich77 said:
I made my own custom kernel. I think that's the reason why Xiaomi allows unlocking the device: so that we can enable this feature ourselves and they can save a few bucks. Also fixed the network issue.
Do you know if external display support, be it MHL or SlimPort, could be enabled the same way, or does it need extra hardware, making it impossible?
The custom-kernel claim here is fake. One hour for a 4050 mAh battery is impossible. As per the Qualcomm list of supported devices, the RN3 supports QC 2.0; maybe because users and forums were plagued by heating issues at launch, they blocked it at the kernel level, and it also saves the cost of a QC 2.0 charger. Please stop bluffing. Even if he is skilled enough to code it, one hour is impossible; the Mi 5 takes over an hour with a 3000 mAh battery. QC 1.0/2.0/3.0 times would be roughly 2 h 50 min, 2 h 10 min, and 1 h 35 min. Thanks to the thread creator for finding it listed on the Qualcomm page.
Surprised to read this. Can you share that kernel?
http://forum.xda-developers.com/xperia-z2/development/sony-secret-revealed-qc-2-0-t3163490
I saw that a while ago, so I thought it might be possible.
Athiril said:
http://forum.xda-developers.com/xperia-z2/development/sony-secret-revealed-qc-2-0-t3163490
I saw that a while ago, so I thought it might be possible.
Interesting info

[ANNOUNCEMENT] Remix IO Facebook Event October 20, 2016 @11:00pm (GMT+8)

Hey XDA Remixers!
We warmly invite you to the Remix IO Facebook event going on tomorrow, Thursday October 20 @11pm (GMT+8). I'd like to tell you now what Remix IO is all about, but then I wouldn't be doing my job.
I will drop a hint here that there will be something nice for developers in the event/campaign.
Here's the link to the Remix IO Facebook event: https://www.facebook.com/events/1781404908740204/
If you aren't on Facebook for any reason, don't worry as I will be sharing another link with you guys around 11pm (GMT+8) as well. So, no one who is interested will be missing out.
Thanks!
Yeah, new toys!
Heh, a new Remix OS device.
Does anyone see where it says what chip they are using for this? I would HATE to get this, like the Mini, and be stuck on a device that will NOT get updates in the future. 2 GB is also very much on the low side for a console you would like to run games on.
thirdlobe said:
Does anyone see where it says what chip they are using for this? I would HATE to get this, like the Mini, and be stuck on a device that will NOT get updates in the future. 2 GB is also very much on the low side for a console you would like to run games on.
RK3368 from Rockchip
https://www.kickstarter.com/project...gat-powered-all-in-one-device?token=bfc75bfd#
thirdlobe said:
Does anyone see where it says what chip they are using for this? I would HATE to get this, like the Mini, and be stuck on a device that will NOT get updates in the future. 2 GB is also very much on the low side for a console you would like to run games on.
Exactly. The 2 GB of RAM could still be tolerated when the Remix Mini was first dreamed up last year (2015), but it's a BIG no-no for a 2017 model. Not with this NEW Remix IO.
Speaking of performance, I wonder why they didn't go for an SoC that has at least an ARM Cortex-A72 cluster in addition to the low-power Cortex-A53 cluster to improve performance.
kevinf28 said:
RK3368 from Rockchip
https://www.kickstarter.com/project...gat-powered-all-in-one-device?token=bfc75bfd#
Definitely not the best chipset.
nakTT said:
Exactly. The 2 GB of RAM could still be tolerated when the Remix Mini was first dreamed up last year (2015), but it's a BIG no-no for a 2017 model. Not with this NEW Remix IO.
Speaking of performance, I wonder why they didn't go for an SoC that has at least an ARM Cortex-A72 cluster in addition to the low-power Cortex-A53 cluster to improve performance.
Possibly for cost, but I have been reading that they may address the RAM. I don't know if that will make it to the Kickstarter buyers.
So what chip would you like to see?
Your Jide Ambassador is here!!!
x86 support
Does the Remix team plan on releasing this software version for x86, like the current Remix OS? It would be awesome to load it on my old Chromebox instead of OpenELEC.
AmoraRei said:
So what chip would you like to see?
Your Jide Ambassador is here!!!
A chip with at least a pair of big cores (ARM Cortex-A72 or something along those lines) in addition to the cluster of small cores (ARM Cortex-A53), from a manufacturer that is better in terms of Android support than the one in the Remix Mini, which refused to cooperate with Remix on the Marshmallow upgrade. We don't want the same disaster, do we?
nakTT said:
A chip with at least a pair of big cores (ARM Cortex-A72 or something along those lines) in addition to the cluster of small cores (ARM Cortex-A53), from a manufacturer that is better in terms of Android support than the one in the Remix Mini, which refused to cooperate with Remix on the Marshmallow upgrade. We don't want the same disaster, do we?
An Intel Atom with UEFI?
tailslol said:
An Intel Atom with UEFI?
If it is at least comparable to the ARM Cortex-A72 + A53 combo in performance and power consumption, as well as cost, why not?
nakTT said:
If it is at least comparable to the ARM Cortex-A72 + A53 combo in performance and power consumption, as well as cost, why not?
Well, Intel tried to build some Atom phones, like the Asus ZenFone 2, so I'm not sure, but it could be comparable. Anyway, low consumption and big.LITTLE are useless in a TV box without a battery... there is no power limit here. It uses a wall socket, after all.
But I can imagine a nice TV box with a Cherry Trail Atom.
And the UEFI part means you can mod a system yourself.
tailslol said:
Well, Intel tried to build some Atom phones, like the Asus ZenFone 2, so I'm not sure, but it could be comparable. Anyway, low consumption and big.LITTLE are useless in a TV box without a battery... there is no power limit here. It uses a wall socket, after all.
But I can imagine a nice TV box with a Cherry Trail Atom.
And the UEFI part means you can mod a system yourself.
To be fair to ARM and its big.LITTLE™, low power is not useless. All things being equal, low power also translates into less heat, and I'm sure you know what less heat means for the longevity, packaging, etc. of a product.
nakTT said:
To be fair to ARM and its big.LITTLE™, low power is not useless. All things being equal, low power also translates into less heat, and I'm sure you know what less heat means for the longevity, packaging, etc. of a product.
Well, no, mostly because I live near a desert area, so all those passive TV boxes overheat here, whatever CPU they have. I know a fan adds a failure point, but a fan is simply needed here.
Most computers we build here have oversized coolers.
So this is why I am interested in the IO having a fan.
And to me, usability comes first, then performance.
Again, this is not a phone, after all.
I have pledged for the Remix IO+, but please, guys, let's ask Jide to include a microphone in the controller so that we can do voice search on YouTube and use Google Now voice search. I sent them an email asking for this. Please send an email too, guys. This is a function that needs to be implemented. Put a microphone in the controller, like the NVIDIA Shield.
Why a VGA port for this great device?
https://www.xda-developers.com/jide...remix-os-as-it-focuses-on-enterprise-markets/
Sent from my I9195I using XDA Labs

What is the maximum benchmark score for mido?

Can you guys say what the maximum benchmark score is for the mido 4/64 GB variant on the latest software update, I mean MIUI 9? I got a score of 59k; is that good?
meheboobalam1 said:
Can you guys say what the maximum benchmark score is for the mido 4/64 GB variant on the latest software update, I mean MIUI 9? I got a score of 59k; is that good?
Hm? Are you still that concerned with benchmarks these days?
Anyway, I got 64K+ on ViperOS, a good score, but in Quadrant I got uninspiring results (SD, 3 GB version).
59K is good enough as long as it performs well as a day-to-day driver without any hiccups.
GabrielScott said:
Hm? Are you still that concerned with benchmarks these days?
Anyway, I got 64K+ on ViperOS, a good score, but in Quadrant I got uninspiring results (SD, 3 GB version).
59K is good enough as long as it performs well as a day-to-day driver without any hiccups.
Using the official stock ROM. Not a fan of custom ROMs anymore; ultimately we come back to the stock ROM, so why bother? Not even a modified stock ROM like Xiaomi.eu.
65k on AEX without any modifications, 4/64 GB variant.
64-65k on a Redmi Note 4X (SD) 3/32, running RR.
Mine is 0 because I don't care... lol
aabenroi said:
Mine is 0 because I don't care... lol
Well, if I were going to use benchmarking apps, I would rather use Geekbench 4 or PCMark 2.0.
DarthJabba9 said:
Well, if I were going to use benchmarking apps, I would rather use Geekbench 4 or PCMark 2.0.
Whatever the benchmark, I don't trust benchmarks anymore...
MTK, for example, scores very high in benchmarks, but the real-world performance is so bad.
Those Huawei Kirin chips score very low in benchmarks, but they play games, especially PSP and Dolphin emulation, way better than Snapdragon and MTK chips with better benchmark scores.
aabenroi said:
Whatever the benchmark, I don't trust benchmarks anymore...
MTK, for example, scores very high in benchmarks, but the real-world performance is so bad.
Those Huawei Kirin chips score very low in benchmarks, but they play games, especially PSP and Dolphin emulation, way better than Snapdragon and MTK chips with better benchmark scores.
MTK scores high because it is faster, but only for short amounts of time, because the chips overheat: after 10-15 minutes of heavy usage like gaming, the CPU/GPU frequency is reduced to cut heat (thermal throttling), and then the performance is much worse.
That's why benchmarks are useless: they are too short to show the real performance. For example, the Snapdragon 625 can keep the same performance for hours of heavy gaming without thermal throttling, because it's extremely power- and thermal-efficient.
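If you want to see that throttling for yourself, just log the CPU frequencies while a game or benchmark is running (a rough sketch; it assumes the usual Linux cpufreq sysfs layout that most Android kernels expose):
adb shell 'while true; do cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq | tr "\n" " "; echo; sleep 5; done'   # prints every core's current frequency (kHz) every 5 seconds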
aabenroi said:
Whatever the benchmark, I don't trust benchmarks anymore...
MTK, for example, scores very high in benchmarks, but the real-world performance is so bad.
Those Huawei Kirin chips score very low in benchmarks, but they play games, especially PSP and Dolphin emulation, way better than Snapdragon and MTK chips with better benchmark scores.
The secret is to look at single-core performance. Multi-core benchmarks are always very high on MTK because their chips usually have more cores than others. However, how many apps actually use all those cores? All the extra cores do is suck your battery.
k3lcior said:
MTK scores high because it is faster, but only for short amounts of time, because the chips overheat: after 10-15 minutes of heavy usage like gaming, the CPU/GPU frequency is reduced to cut heat (thermal throttling), and then the performance is much worse.
That's why benchmarks are useless: they are too short to show the real performance. For example, the Snapdragon 625 can keep the same performance for hours of heavy gaming without thermal throttling, because it's extremely power- and thermal-efficient.
Yeah, I agree.
Besides that, benchmarks just measure raw power, and big raw power is useless without good optimization.
Snapdragon and Kirin have lower raw power but easily beat MTK because they're way better optimized.
DarthJabba9 said:
The secret is to look at single-core performance. Multi-core benchmarks are always very high on MTK because their chips usually have more cores than others. However, how many apps actually use all those cores? All the extra cores do is suck your battery.
Nowadays, MTK scores very high in single-core as well... but it still sucks...

P30 Pro scores higher than Mate 30 Pro?

Just read the article on XDA comparing the SD865 vs. SD855 vs. Kirin 990.
I was curious to test mine, the P30 Pro; well, it scored higher than the Mate and the SD855 on both AnTuTu 8.0.4 and 8.0.5.
Any clues? Mine is Android 10 / EMUI 10, 8/128.
ashouhdy said:
Just read the article on XDA comparing the SD865 vs. SD855 vs. Kirin 990.
I was curious to test mine, the P30 Pro; well, it scored higher than the Mate and the SD855 on both AnTuTu 8.0.4 and 8.0.5.
Any clues? Mine is Android 10 / EMUI 10, 8/128.
Your P30 Pro is a beast.
Objectivity in tests and reported results is a big mess nowadays...
A cheap Chinese chip better than a US chip... impossible... what with the trade ban and the other big commercial fights... The truth is out there, but where? In the customer's hands?
ashouhdy said:
Just read the article on XDA comparing the SD865 vs. SD855 vs. Kirin 990.
I was curious to test mine, the P30 Pro; well, it scored higher than the Mate and the SD855 on both AnTuTu 8.0.4 and 8.0.5.
Any clues? Mine is Android 10 / EMUI 10, 8/128.
Well, that's not true. What you see there is an average score, and most people do not run those tests correctly. They use power-saving mode or normal mode without realizing that it affects the score. My P30 Pro (8 GB RAM with 256 GB storage) scored almost 416,000 in AnTuTu, BTW. I bet the Mate 30 Pro easily beats this score if the test is done correctly, with performance mode active.
This is due to different versions of AnTuTu. On the latest version, the P30 Pro scores 415k while the Mate 30 Pro scores around 470k.
