Wow, with the number of smart people here, it is truly amazing how few of you really understand how USB charging works, or even much about electronics.
O.K., raise your hand if you think any 2.1A USB port can charge any USB device that requires 2.0A. O.K. Look around. If you have your hand up, you are WRONG.
Here is an easier one. Let's say you plug a circular saw that draws 18A into a circuit rated for 15A. It blows the fuse, correct? So what happens when you plug a tablet that wants 2.1A into a USB outlet that can only supply 0.5A? It charges at 0.5A. How can that be? If you measure the USB voltage, it's still 5V. But no sparks.
Well, USB charging involves MUCH more than just a socket at a certain amperage. A USB device "asks" the charger how much current it can supply, then adjusts its draw to that charger. A "standard" USB outlet is only guaranteed to supply 0.5A at 5V. Anything over that is optional.
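To make the "no sparks" part concrete, here is a toy model of that negotiation (my own simplified sketch, not real USB stack code; the names are made up for illustration):

```python
# Toy model of USB current negotiation -- a simplified sketch, not real USB
# stack code. A compliant device draws no more than the port offers, which is
# why a 2.1A tablet on a 0.5A port just charges slowly instead of sparking.

GUARANTEED_MA = 500  # all a "standard" USB 2.0 port promises at 5V

def negotiate_current(device_wants_ma: int, port_offers_ma: int = GUARANTEED_MA) -> int:
    """Return the current (in mA) the device is allowed to draw."""
    return min(device_wants_ma, port_offers_ma)

print(negotiate_current(2100, 500))   # -> 500: the tablet charges, just slowly
print(negotiate_current(2100, 2100))  # -> 2100: full rate on a matching port
```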
Back when all USB outlets were in computers, this wasn't a problem, because a computer and a smartphone/tablet had no trouble carrying on a proper conversation about the maximum current allowed. But what about "dumb" cube chargers? Not much smarts in there. So manufacturers came up with a method that uses the two data wires in USB to signal the supply capability of their chargers. Apple, for example, uses resistor dividers to put two different voltage levels on the Data+ and Data- lines to indicate two optional charging rates: the iPhone rate of 1.0A and the iPad rate of 2.1A. If the port isn't coded at all, the device assumes 0.5A max.
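For the curious, here is roughly what that Apple coding looks like from the device's side, based on numbers from public reverse-engineering of those dividers. Treat the exact voltages, and especially which pin carries which level, as assumptions to verify against a real app note before building anything:

```python
# Nominal Apple charger coding: (D+ volts, D- volts) -> advertised mA.
# Values come from public reverse-engineering; which pin gets 2.7V for
# which rate is an assumption here -- verify before trusting it.
APPLE_CODES = {
    (2.0, 2.0): 500,
    (2.0, 2.7): 1000,  # "iPhone" 1.0A rate
    (2.7, 2.0): 2100,  # "iPad" 2.1A rate
}

def classify_apple_coding(d_plus_v: float, d_minus_v: float, tol: float = 0.25) -> int:
    """Return the advertised rate in mA, or the 500 mA default if uncoded."""
    for (dp, dm), ma in APPLE_CODES.items():
        if abs(d_plus_v - dp) <= tol and abs(d_minus_v - dm) <= tol:
            return ma
    return 500  # no recognizable coding: assume the guaranteed minimum

print(classify_apple_coding(2.7, 2.0))  # -> 2100
print(classify_apple_coding(0.0, 0.0))  # -> 500, e.g. a plain data port
```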
Here is where it gets complex. Different manufacturers use different coding schemes, and, surprise, one is not compatible with another. This means in many cases (not all) a device that doesn't understand the charging scheme it is seeing will revert to 0.5A, even if the USB port could supply 10,000 amps. A low charging rate is preferable to a fire. Wouldn't you agree?
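In code terms, the fallback amounts to something like this (a sketch with made-up scheme names, just to show the shape of the decision):

```python
# Made-up scheme names, real logic shape: if the device doesn't recognize
# the coding on the port, it quietly falls back to the 500 mA guarantee.

def pick_charge_rate(port_coding: str, schemes_device_knows: dict) -> int:
    return schemes_device_knows.get(port_coding, 500)

iphone_knows = {"apple_1.0A": 1000, "apple_2.1A": 2100}
print(pick_charge_rate("apple_2.1A", iphone_knows))    # -> 2100
print(pick_charge_rate("samsung_2.0A", iphone_knows))  # -> 500, scheme unknown
```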
A few devices, like recent Android devices, are "smart" enough to understand charging schemes from several companies, but most devices CAN'T. Also, the majority of cube chargers out there, especially the white ones, are designed for Apple devices and can charge iPhones and iPads at their correct rates. (Again, Apple ONLY has two special rates: 1.0A and 2.1A.) I have a few third-party chargers that even mark the coding next to each USB outlet: 1.0A or 2.1A.
Oh, and one more thing. Some charging cables are made with wire so thin that they can't properly carry even an amp or two, no matter how correct the coding is; the resistance of the cable drops the voltage before it ever reaches the device.
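A quick back-of-the-envelope shows why, assuming a round-trip cable resistance of about 0.5 ohms for a cheap thin cable versus roughly 0.15 ohms for a decent one (both figures are my ballpark assumptions):

```python
# Ohm's law on the cable: volts at the device = 5V minus I*R lost in the wire.
# The 0.5 and 0.15 ohm round-trip resistances are ballpark assumptions for a
# cheap thin cable vs. a decent one.

def volts_at_device(current_a: float, cable_ohms: float, supply_v: float = 5.0) -> float:
    return supply_v - current_a * cable_ohms

print(volts_at_device(2.1, 0.15))  # -> ~4.69V: good cable, device charges fine
print(volts_at_device(2.1, 0.5))   # -> 3.95V: device sees a brownout and backs off
```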
SO, the moral of the post is: NOT every charging port rated at over 0.5A can charge EVERY device at over 0.5A, EVEN IF the power supply behind it is electrically big enough.
So is there a solution? Partially. First, more devices are getting smart enough to read several USB codings, but today most CAN'T. Second, a few chip companies, like TI, realized this problem and designed chips to "translate" between the outlet and the device, negotiating on both sides. For a while you could buy these adapters on Amazon, but it seems most were poor sellers and are not available now; if you want to search, try "USB Power Converter Adapter Charger for Smart Phone" as a search term. They might still be available from China. I bought a bunch of them and now use them when charging any USB device. They work great, and I can now charge a Nikon camera I have with ANY charger, not just the Nikon USB charger.
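Conceptually, one of those translator adapters does something like the sketch below: on its device-facing D+/D- lines it answers whatever detection probe the device sends, regardless of how the upstream charger is coded. (The specific TI part number and the probe names here are my assumptions for illustration.)

```python
# Rough model of a "translator" adapter, in the spirit of TI's auto-detect
# charging-port controllers (e.g. TPS2514 -- the exact part is my assumption).
# Device side: answer whatever coding the device probes for.
# Charger side: just pass the 5V through untouched.

DEVICE_SIDE_ANSWERS = {
    "apple_probe": "divider: D+=2.7V, D-=2.0V",  # claim the 2.1A Apple rate
    "bc1.2_probe": "D+ shorted to D-",           # the standard BC 1.2 charger reply
}

def translator_reply(device_probe: str) -> str:
    # Unknown probe: present nothing, and the device falls back to 0.5A.
    return DEVICE_SIDE_ANSWERS.get(device_probe, "no coding")

print(translator_reply("apple_probe"))  # the adapter "speaks Apple" to the phone
```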
You would think they could just build this translation into USB chargers, so chargers would be universal, but that costs extra money, and so few people really understand the problem that, as far as I know, it has never happened.
So when you buy ANY USB charger rated at over 0.5A, it's important to know WHAT company it is coded for. If it's white and labeled 1.0A or 2.1A, it's most likely coded for Apple. But as I've been saying, don't ASSUME that a charger rated as high as or higher than the device will actually charge it at that rate. Maybe it will. Maybe it won't.