Alright, so I have a mini-USB car charger rated at 2.0A (it came with a GPS unit), and I just want to make sure the extra amperage isn't going to charge my battery too quickly and kill it in the process. All I have to go on is that my wall charger is only capable of 1.0A and the computer's USB port supplies only a few hundred mA. Does anybody have a car charger, or know the maximum amperage these phones can take?
Thanks in advance.
It's too strong! I crashed a PDA with a 1.5A charger.
I've used a 2.0A charger, but I noticed that the battery didn't charge right.
On that note, I also noticed that a 0.5A (500mA) charger killed my battery after about six months (it burned out and won't take a full charge anymore).
My recommendation is to stick to 1.0A charging as much as possible, since the Wizard's charge circuit is designed for 1.0A charging and monitoring.
It doesn't matter, if you know basic Ohm's law: the rating is just the maximum capacity of the charger (or any source). The current-regulating circuits will take care of the actual charging current fed to the battery.
I charge my phone from my computer's USB connection all the time; a standard USB port supplies a maximum of 500mA.
Amps are pulled from the charger, not pushed to the phone.
Voltages are pushed to the phone, not pulled from the charger.
If the phone draws too much current from the charger, the voltage drops to the point where the charger's maximum power (P [watts] = U [volts] × I [amps]) isn't exceeded.
According to the USB specification, you need at least 200mA, with 500mA recommended, at +5VDC.
More is never a problem; you could even use a 50A power supply without breaking your phone. Practically speaking, though, I wouldn't do that.
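To make the push/pull point concrete, here is a minimal sketch (all names and numbers are made up for illustration, not taken from any real charger) of the idea that the amp rating is only a ceiling the device can draw against, with power following P = U × I:

```kotlin
// Illustrative sketch only: a charger's amp rating is a ceiling, not a push.
// All values are invented for the example.

data class Charger(val volts: Double, val maxAmps: Double)

fun actualDrawAmps(charger: Charger, deviceRequestAmps: Double): Double =
    // The phone never takes more than it asks for; a weaker charger simply
    // cannot deliver more than its rating (in practice its voltage would sag).
    minOf(deviceRequestAmps, charger.maxAmps)

fun main() {
    val wall = Charger(volts = 5.0, maxAmps = 1.0)
    val car = Charger(volts = 5.0, maxAmps = 2.0)
    val phoneRequestAmps = 0.8 // what the phone's charge circuit asks for

    for (c in listOf(wall, car)) {
        val i = actualDrawAmps(c, phoneRequestAmps)
        val watts = c.volts * i // P = U * I
        println("Charger rated ${c.maxAmps} A -> phone draws $i A ($watts W)")
    }
    // Both chargers deliver the same 0.8 A / 4 W to this phone.
}
```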
huh?
I have messed up a few devices before by using the wrong charger (a 2.0A charger on a Droid Eris and a 1A charger on the MOTOACTV). Is this a problem with the device's charging circuit? In general it should only pull what it needs to charge, but my devices still got messed up.
I returned the first MOTOACTV after it wouldn't leave the boot screen, and the second one I got acted funny on 1A as well. It charged fine on the 0.75A charger that came with the device, but when I put it on my HTC DInc charger it shot up from 10% to 20% to 30% and so on, all within minutes. I hope I didn't mess this one up too.
Just trying to figure out what the deal is. My phone definitely charges faster connected to 1A versus 0.5A (computer). I understand that makes sense because the device is able to handle 1A, but I wonder if it would get messed up by a 2A charger.
Thanks!
The last two posts of 2008 are correct. Your phone is capable of drawing more than 500 mA but less than 1000 mA. If the charger is 1A, 1.5A, 2A, or 50A, it won't make a difference to the phone's charging time or life.
Wrong voltage can be bad, but phones are designed to support USB charging, as a minimum, and 1A to 2A is always safe. As was said in 2008, the charger pushes voltage to the battery, but the battery pulls current from the charger.
That's a really good way of describing it
Pushing and pulling current and voltage is a really good way of describing it. Given that I'm theoretically well within the charging parameters, how come my phone (Galaxy Mini/Pop) becomes unusable while it's charging (the touch screen doesn't respond, and the screen jumps to a new screen without being touched)? Am I wrecking my phone?
No, your charger is to blame. Maybe it isn't properly grounded. Phones with capacitive touch screens (not the Wizard! but maybe your phone) can get really weird on some chargers. I have a Nexus One with an aftermarket charger that always makes the touch screen go haywire. When I use an HTC charger, the phone has no problem. As far as I know, the damage isn't permanent, it's just that the sensors get confused. When I disconnect from the charger, turn the phone off, then turn it on, all is well.
My Wizards were never bothered by chargers, no matter what kind; as long as they were mini-USB, they were the right voltage. The Wizard doesn't draw as much current as more modern phones, either.
Hmm. Not properly grounded sounds very plausible. Viva mediterranean circuits. Thanks
USB chargers are rated at 5V, which is exactly the right voltage to charge your device. What you want to make sure of is that you buy the correct rated amperage, and not because a higher rating will damage your phone: amperage is only drawn according to what the device uses. You want a decent 1A (1000mA) or higher rated USB charger. The mains charger for your phone is only rated at 1A, so a 1A charger is adequate. Avoid 500mA chargers, as they will take twice as long to charge, and GPS/satnav applications will drain the battery even while charging at that rate.
Also, if you have a new PC, most of the decent boards will supply a 1A charge even when the PC is off, if on/off charging is stated in the manufacturer's details.
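As a rough sanity check on the "twice as long at 500mA" point, you can estimate charge time as capacity divided by charge current; this ignores the slower top-off phase and conversion losses, and the 1500mAh capacity below is just an assumed figure for the example:

```kotlin
// Rough charge-time estimate: hours ≈ capacity / current.
// Ignores the constant-voltage taper and losses, so real times run longer.

fun roughChargeHours(capacityMah: Double, chargeMa: Double): Double =
    capacityMah / chargeMa

fun main() {
    val capacityMah = 1500.0 // assumed battery capacity, for illustration only
    for (chargeMa in listOf(500.0, 1000.0)) {
        val hours = roughChargeHours(capacityMah, chargeMa)
        println("$chargeMa mA charger: about $hours h")
    }
    // 500 mA -> ~3.0 h, 1000 mA -> ~1.5 h: roughly twice as long at half the current.
}
```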
Newer phones will charge fine with higher amperage
The myth that charging your device at a faster rate will reduce the life of your device’s battery is false!
If you want quicker charging, look for a wall or car charger that delivers 2100mA of current at 5 volts; anything higher won't matter. These lithium-ion batteries can handle it. It was just that, back then, if you tried to use a faster charger with an older battery, it (in many cases) simply wouldn't charge.
The Atrix's default adapter is 5V at 0.85A, while the iPad's is 5V at 2.1A. Is it safe for the battery to use the iPad charger? I have also been using the iPod charger on my Atrix; should I continue to use it, or does that have negative effects too? The iPod charger is 5V at 1A.
Atrix: 5V, 0.85A
iPod/iPhone: 5V, 1A
iPad: 5V, 2.1A
Typically, a device will only pull what it needs, amperage-wise. The ratings on power supplies are, to my knowledge, always indications of maximum amperage, not any form of 'forced' current. Thus, the only time you need to be worried is if it is lower than your device's required input. You should be fine with either.
+1
That's correct. I actually spent a lot of time researching that kind of stuff, because I use electronic cigarettes and finding chargers for them was difficult. Anyway, as long as it's 5V it should be fine. They actually make AC adapters that are iPad "compatible", meaning they are just rated at 2.1A, but they still work with the iPhone, which the OP has stated uses a lower amperage.
ian426 said:
Typically, a device will only pull what it needs, amperage-wise. The ratings on power supplies are, to my knowledge, always indications of maximum amperage, not any form of 'forced' current. Thus, the only time you need to be worried is if it is lower than your device's required input. You should be fine with either.
Thank you, both of you.
Would it charge at the same rate?
The fact that a device will only pull as much as it needs is true, but only for devices and appliances that are using the electricity, not storing it, which is the case with a battery. Any electrical device uses only as much power as it needs. For example, a 55-watt household light bulb will only draw 0.5 amps (at 110 volts AC), even though the circuit is wired for a 15-amp maximum (see the quick calculation sketched after this post).
When it comes to cellphones, the phone is the device that uses the power and the battery is what stores it. During charging, the battery will try to pull in as much as you give it, unless there is a limiting factor involved. A limiting factor can be the charger itself, which will supply a maximum of 1.0A, 0.85A, or whatever the case may be. There may also be a limiting factor built into the phone's circuitry that only lets so much through (though I seriously doubt it).
By plugging into a 2.1A charger, the battery will try to take in all 2.1 amps.
Pro: you charge the battery in half the time.
Con: if it doesn't destroy the battery right away, its lifespan and usefulness decrease dramatically.
This is called overcharging the battery; do some research on it and you will find that overcharging a battery is never a good thing.
2.1A is not enough to destroy the battery right away, but if you plugged in a 5A or 10A charger, it probably would. I'm just saying this to explain the concept.
I personally use a 1.0A charger left over from a previous cellphone (Touch Pro 2), and your iPod charger should be OK too, but I wouldn't use anything bigger than that.
Here is a small experiment you can conduct, which may or may not work: compare the temperature of the battery/phone while it is charging on the 0.85A charger and on the 2.1A charger. On the bigger charger it should get a lot hotter, and that is what destroys the battery.
As far as my knowledge goes, I have taken enough classes on electricity and electronics, and have been working in the field for several years, so I hope I was helpful and explained it in simple enough terms for everyone.
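The light-bulb figure in the post above is just Ohm's-law arithmetic: current is power divided by voltage, and the 15A breaker rating is only an upper limit on the circuit. A quick, purely illustrative version of that calculation:

```kotlin
// Current drawn by a simple load: I = P / V.
// The circuit's 15 A rating is a limit, not something pushed into the bulb.

fun currentAmps(powerWatts: Double, volts: Double): Double = powerWatts / volts

fun main() {
    val bulbAmps = currentAmps(powerWatts = 55.0, volts = 110.0)
    println("A 55 W bulb on 110 VAC draws $bulbAmps A of the 15 A available") // 0.5 A
}
```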
hlywine said:
The fact that a device will only pull as much as it needs is true, but only for devices and appliances that are using the electricity, not storing it, which is the case with a battery. … There may also be a limiting factor built into the phone's circuitry that only lets so much through (though I seriously doubt it). …
I might have to double-check that. There is a chance that there is some sort of limiting circuit between the wall and the charger for the Atrix... I am fairly certain laptops, at least, do this. I will see if I have a stronger charger, and I will check the voltage across the leads on the Atrix... if I can.
It's not the voltage that you should be checking; the voltage should be the same on all USB chargers, about 5 volts. You should be checking amps.
hlywine said:
It's not the voltage that you should be checking; the voltage should be the same on all USB chargers, about 5 volts. You should be checking amps.
My mistake... realized that after I posted it.
Also -- I do not have any USB charger that is over one amp, so I cannot check this. If anyone has a multimeter and a more powerful charger, they could do so.
The important factor is the voltage, which is 5V for both the iPad and Atrix chargers. Whether the charger is rated at 10W or 5W does not matter, because that just reflects its current capacity. The charger being "rated" at 2.1A means it can handle that current, not that it will force it. The current draw is decided by the phone itself; as long as the voltage is identical, the other factors should not matter.
If you read the "Summary" here, it says that with the iPad charger you can charge an iPhone, which is similar to the Atrix in its charging specs:
http://support.apple.com/kb/HT4327
And here are a couple more links:
http://www.youtube.com/watch?v=-ZjRm8nkv9Q
http://munnecke.com/blog/?p=836
hlywine said:
The fact that a device will only pull as much as it needs is true, but only for devices and appliances that are using the electricity, not storing it, which is the case with a battery. …
Thanks live4nyy, I never saw those before. With all the stuff described there, the only possible conclusion is that each device has a built-in limiter on how much it will pull while charging, or that Apple figures that with a bigger charger the battery in your iPod/iPhone will still last past the one-year manufacturer's warranty, but only barely, instead of lasting the 3-5 years it's supposed to. Whatever the case is with Apple, I just hope we have a safety mechanism built into our Atrix phones. I guess the only way to find out is to actually check the amperage while it's charging.
I'm almost positive that the lithium batteries in phones these days are rated for a specific current and have built-in circuits that dictate the flow, which is also what causes the battery to go into a trickle charge when it's near capacity. Just for that, there has to be some sort of regulation happening. See also here:
http://science.howstuffworks.com/environmental/energy/question501.htm
But I agree, better safe than sorry. If you happen to have an iPad charger that you plan on using let me know how it goes. I'm curious as well.
hlywine said:
Thanks live4nyy, I never saw those before. With all the stuff described there, the only possible conclusion is that each device has a built-in limiter on how much it will pull while charging… I guess the only way to find out is to actually check the amperage while it's charging.
I know different chargers supply different amperages, but what I don't know is how much the phone (battery) can take at a time. Even if the charger is 10A, it doesn't mean the phone gets fully charged in 10 minutes.
Finding out how much the charger puts out isn't hard, because it's on the label, but how much does the phone take?
I can't seem to edit my thread (typo) or find it on my profile; maybe it's my browser.
Samsung phones charge at about 800mA. Samsung chargers have the two middle pins (the data pins) shorted together. Any regular charger, even a 2A one, will only charge at about 330mA, because the phone needs to detect that the two data pins are shorted. So higher amps do not mean faster charging unless you are using a Samsung charger or you short the data pins yourself (to pretend it's a Samsung charger).
Sent from my GT-I9100 using xda premium
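The data-pin trick described above boils down to a simple decision: if the phone sees the charger's D+ and D- lines shorted, it treats it as a dedicated charger and allows its full rate; otherwise it stays at a conservative USB rate. The sketch below is only an illustration of that logic; the 800mA and 330mA limits are the figures quoted in the post, not official Samsung values.

```kotlin
// Illustrative sketch of the detection logic described in the post above.
// The 800 mA / 330 mA limits are the poster's figures, not Samsung documentation.

enum class ChargerType { DEDICATED_CHARGER, STANDARD_USB }

fun detectCharger(dataPinsShorted: Boolean): ChargerType =
    if (dataPinsShorted) ChargerType.DEDICATED_CHARGER else ChargerType.STANDARD_USB

fun chargeLimitMa(type: ChargerType): Int = when (type) {
    ChargerType.DEDICATED_CHARGER -> 800 // D+/D- shorted: phone allows its full charge rate
    ChargerType.STANDARD_USB -> 330      // unknown source: phone stays conservative
}

fun main() {
    for (shorted in listOf(true, false)) {
        val type = detectCharger(shorted)
        println("Data pins shorted = $shorted -> $type, limit ${chargeLimitMa(type)} mA")
    }
}
```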
No. The phone regulates the charging on the device side, since Li-poly batteries need a specific charging method. It is possible (for example) that the device may charge at a maximum current draw of 500mA while the charger is rated at 1A. That would mean the charger could output 1A without the voltage falling, so in theory its USB output could be split between two devices. If the charger were rated at 300mA but the device needed 500mA, then the output voltage might be very unstable and could damage either the phone or the charger.
Choosing a higher-rated charger will not charge the phone faster. Any attempt to force it with Li-poly chemistry could result in an explosion.
Sent from my U20i using Tapatalk 2
This is where I got my info from:
http://forum.xda-developers.com/showthread.php?t=1384253
Where did you get yours from?
Sent from my GT-I9100 using xda premium
Just curious if there is an app or something similar that would show how many amps are being provided when charging through a wall charger/USB powered hub? The reason I ask is that I'm thinking of buying a powered USB 3.0 Hub. The adapter that came with our phone says it's 2 Amp, so I am assuming our phone can pull 2 amps for charging. Just wanted to verify in some way that a 2 Amp dedicated port would really work for this phone.
*Madmoose* said:
Just curious if there is an app or something similar that would show how many amps are being provided when charging through a wall charger/USB powered hub? The reason I ask is that I'm thinking of buying a powered USB 3.0 Hub. The adapter that came with our phone says it's 2 Amp, so I am assuming our phone can pull 2 amps for charging. Just wanted to verify in some way that a 2 Amp dedicated port would really work for this phone.
When I get home, I'll download the kernel source and see if I can find out how much power it draws during charging. I doubt, however, that it will draw 2A during charging, as most chargers are rated to supply more power than the phone will accept.
Yeah, 2A seems like it could melt a battery, charging that fast. Someone sent me a private message and told me to try CurrentWidget. I threw that on the phone and it registers 1A while charging. But it appears the widget doesn't break it down into decimals; for instance, it could be charging at 1.8A and I wouldn't know it. I plugged into a standard USB port and it reported charging at 0 amps, but the battery was indeed charging.
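For what it's worth, on Android 5.0 and later you can read the instantaneous battery current straight from the platform instead of relying on a widget. Here is a minimal sketch using BatteryManager; note that the sign convention and resolution of the reported value vary a lot between devices, which is probably why CurrentWidget only showed whole amps:

```kotlin
// Minimal sketch: read the instantaneous battery current on Android 5.0+ (API 21).
// The value is in microamperes; whether charging reads positive or negative,
// and how fine-grained it is, depends on the device's kernel and fuel gauge.

import android.content.Context
import android.os.BatteryManager

fun readBatteryCurrentMa(context: Context): Double {
    val bm = context.getSystemService(Context.BATTERY_SERVICE) as BatteryManager
    val microAmps = bm.getIntProperty(BatteryManager.BATTERY_PROPERTY_CURRENT_NOW)
    return microAmps / 1000.0 // convert µA to mA
}
```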
I took a quick look at the N7100 (International Note 2) source posted on GitHub by CM, and it looks like the AC charger rate is 650mA and USB is 450mA. It's a little hard to tell exactly what it's using for charging, so I'll try to verify that when I get home and have a chance to take a better look.
*Madmoose* said:
Yeah, 2A seems like it could melt a battery, charging that fast. Someone sent me a private message and told me to try CurrentWidget. I threw that on the phone and it registers 1A while charging. But it appears the widget doesn't break it down into decimals; for instance, it could be charging at 1.8A and I wouldn't know it. I plugged into a standard USB port and it reported charging at 0 amps, but the battery was indeed charging.
A 3100mAh lithium-ion battery can easily handle a full 2A charge rate. The ideal charge profile for lithium-ion is CC/CV: it starts with constant current while the cell is between roughly 3V and 4V, and most Li-ion batteries can take a rate of 1C there, meaning this one could handle a 3.1A charge. The recommended charge rate for the most possible charge/discharge cycles is usually 0.2C, so for a 3100mAh battery that would be 620mA. Once the cell reaches the target voltage, the charger switches to constant voltage and charges until the current tapers to the termination current, usually around 100mA. So yes, it can handle a 2A charge, no problem.
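The C-rate arithmetic in that breakdown is easy to check: 1C is simply the capacity expressed as a current, and 0.2C is one fifth of it. A quick worked version using the 3100mAh figure from the post:

```kotlin
// C-rate arithmetic from the post above: a worked example, nothing more.
// 1C of a 3100 mAh cell is 3100 mA (3.1 A); the gentler 0.2C rate is 620 mA.

fun rateMa(capacityMah: Double, cRate: Double): Double = capacityMah * cRate

fun main() {
    val capacityMah = 3100.0
    println("1C   = ${rateMa(capacityMah, 1.0)} mA") // 3100 mA, i.e. 3.1 A
    println("0.2C = ${rateMa(capacityMah, 0.2)} mA") // 620 mA
    println("A 2 A charge is ${2000.0 / capacityMah} C of this cell") // ~0.65C, under the 1C ceiling quoted
}
```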
Hey there. I very much appreciate that breakdown. Makes me wonder why they dropped the amps so much during charge.
bose301s said:
The recommended charge rate to achieve the most possible charge/discharge cycles is usually 0.2C, so for a 3100mAh battery that would be 620mA.
If this is true (it's the first time I've seen it anywhere), it would line up nicely with the 650mA max charge rate I found. Also, I downloaded the VZW source, and it doesn't look significantly different from the N7100 source, at least as far as the charger code is concerned, so I would say they both probably have a max charge rate of 650mA.
I appreciate the info and time you both put into this. I guess it means a 2A USB port will be slight overkill. Even changing the charge rate to a higher value seems to mean lower battery life. It makes you wonder how Apple did its math for the iPad's charge rate; the battery must be huge to accommodate a 1.1A charge rate, or they are sacrificing battery life for fast charging.
Won't the kernel dictate the charge rate no matter what the charger is rated at?
If the kernel is set for a charge rate of 650mA (0.650A), then why does the Note 2 come with a more powerful 2A wall charger, while the GS3 has a 1A wall charger?
FAUguy said:
If the kernel is set for a charge rate of 650mA (0.650A), then why does the Note 2 come with a more powerful 2A wall charger, while the GS3 has a 1A wall charger?
The original 7-inch Nook Color came with a 2A wall charger, and that was two years ago. Both my Note 2 and my Nook Color charge at about the same rate (quick to 99% and slow to 100%). The 2A charger is probably cheaper to make than anything else, and it can also be used to charge future devices. Also, if you used a 1A charger to charge the Note, it might get warm/hot from running at near full capacity.
I'm using my old BlackBerry 700mA wall charger to charge the phone at night while I'm sleeping. No problems with heat.
Hello all,
I have a really stupid question but it's keeping me awake...
The Dash Charge charger, can I use it with other devices? Like an MP3 player? Or another phone? Or is it purely and uniquely proprietary to OnePlus phones, and might it damage the other devices I use it with?
I know it won't charge those devices any faster, but if I can plug anything into it, then I can drop the other chargers I have and keep only the Dash Charge one in my bag.
Thanx for your replies
Dash Charge chargers work at 5 volts and 4 amps, which translates to 20 watts if I'm not mistaken; Qualcomm's Quick Charge 3.0 works at 6.5 volts and 3 amps, which means 19.5 watts of power. The difference in volts and amps between chargers is not an issue for most devices, for two main reasons.
1. Amperage is pulled by the device, which means that with a 4-amp charger you can pretty much charge any device that draws up to 4 amps. Since no phone I know of besides the 1+5 and 1+3(T) draws that much, you'll be fine.
2. On the voltage side, most phone chargers (apart from Qualcomm Quick Charge ones) work at 5 volts, the same as the 1+ Dash chargers. And even if the voltage rating on the charger is higher than the phone's, these chargers have safety features that reduce the output if they don't recognize the device being charged as compatible with their technology.
So, for a quick recap: if the phone you're charging is not compatible with the charger's technology (1+ Dash or Qualcomm QC, for example), the charger will make sure to reduce the power fed to the device to a safe amount. So a non-Dash-compatible phone will normally charge at no more than 5V and 2 to 2.4A.
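The wattage comparison in that post is just P = V × I applied to each charging profile. Here is the same arithmetic spelled out with the voltage/current pairs quoted above (they are the poster's figures, not verified specs):

```kotlin
// Wattage comparison using the figures quoted in the post above (P = V * I).
// The numbers are the poster's, not verified manufacturer specifications.

data class Profile(val name: String, val volts: Double, val amps: Double) {
    val watts: Double get() = volts * amps
}

fun main() {
    val profiles = listOf(
        Profile("OnePlus Dash Charge", 5.0, 4.0),                   // 20.0 W
        Profile("Qualcomm Quick Charge 3.0 (as quoted)", 6.5, 3.0), // 19.5 W
        Profile("Ordinary 5 V / 2 A charger", 5.0, 2.0)             // 10.0 W fallback for non-Dash phones
    )
    for (p in profiles) println("${p.name}: ${p.watts} W")
}
```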
Thanx a lot for your answer
So I can plug my MP3 players into my Dash chargers, and sometimes lend them to colleagues to charge their phones as well, with nothing to worry about. It's good to know
LeKeiser said:
Thanx a lot for your answer
So I can plug my MP3 players into my Dash chargers, and sometimes lend them to colleagues to charge their phones as well, with nothing to worry about. It's good to know
Yup, it just won't charge at full speed as it would with 1+ devices
You can use it with any device you like, but only OnePlus devices get the fast charge advantage!
Has anyone tried a USB-C Power Delivery charger yet?
I've got one for my cheap Vernee phone, which works well. In theory it requires negotiation to draw the correct power, but I wondered if anyone has tested one yet.
And Oppo.