Originally Posted By: DWallach
The most authoritative source on all of this seems to be Benson Leung (Google engineer and one-man Underwriters Laboratories). https://plus.google.com/u/0/+BensonLeung

With regard to all the different power delivery profiles, he nicely summarized:
Quote:
Going forward, the new voltage levels to expect on new chargers are 5V, 9V, 15V, and 20V. The 12V level is now optional.

The new rules also introduce a “superset” guarantee. Larger wattage power sources must support all voltage levels below their maximum up to 3A. As the spec says, “Bigger is always better in user’s eyes – don’t want a degradation in performance. Higher power Sources do everything smaller ones do.”

As a result, the consumer only needs to know that their device ships with an x-watt power supply, and that any power supply rated at more than x watts will be at least as good as the one that shipped with the device. When comparing power supplies, they only need to look at the watt rating to know when one charger is objectively more capable than another. Under no circumstances should a more expensive charger charge your device slower than a cheaper one.


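To make the wattage-comparison shortcut concrete, here is a rough Python sketch of how I understand the PD power rules. The 5/9/15/20 V levels (12 V optional) come straight from the quote above; the wattage thresholds (9 V required above 15 W, 15 V above 27 W, 20 V above 45 W) and the 3 A cap are my own reading of the published power rules, so treat the numbers as illustrative rather than authoritative:

```python
# (voltage, wattage threshold above which that voltage becomes mandatory)
# Thresholds are my reading of the PD power rules, not quoted text.
TIERS = [(5, 0), (9, 15), (15, 27), (20, 45)]

def required_profiles(max_watts):
    """Return the (volts, amps) pairs a charger of this wattage must offer:
    every mandatory fixed voltage at up to 3 A, capped by total wattage."""
    profiles = []
    for volts, threshold in TIERS:
        if max_watts > threshold:
            amps = min(3.0, max_watts / volts)
            profiles.append((volts, round(amps, 2)))
    return profiles

def is_superset(big_watts, small_watts):
    """The quoted guarantee: a bigger charger must cover everything the
    smaller one is required to do, at every shared voltage level."""
    big = dict(required_profiles(big_watts))
    return all(big.get(volts, 0) >= amps
               for volts, amps in required_profiles(small_watts))

print(required_profiles(15))   # [(5, 3.0)]
print(required_profiles(45))   # [(5, 3.0), (9, 3.0), (15, 3.0)]
print(is_superset(60, 29))     # True
```

The `is_superset` check is the quoted guarantee in code form: for chargers built to the new rules, more watts should never mean a worse profile at any voltage.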
For the Quick Charge kerfuffle, he wrote that the USB-C spec prohibits Qualcomm's QuickCharge, flat out. (https://plus.google.com/+BensonLeung/posts/cEvVQLXhyRX)
So how does the consumer know whether their USB-C stuff complies with the new rules or the slightly older rules?

Originally Posted By: Benson Leung
Apple's 29 watt charger is a PD charger, but it was designed and built before the current rules went into effect, so it is grandfathered in...


The watt-rating shortcut presumes that all devices, cables, and power sources are playing by the same rules, which, as Benson has done an excellent job of showing, is not always the case.

Originally Posted By: Benson Leung
The problem is that if Samsung invested in their own proprietary (and spec violating) technology, it's entirely possible that they didn't invest any time or effort getting the rest of Type-C or PD working properly.

I've seen noncompliant implementations on Type-C phones that support QC, but don't even charge at 5V 3A, or don't charge from certain classes of spec compliant Type-C chargers because the manufacturer didn't do the work to make the device actually Type-C compliant.

This is the reason why Section 4.8.2 (which forbids proprietary charging methods that change voltage) and the PD rules I describe exist: to prevent this sort of confusion that becomes the burden of the consumer.


12 volts is optional ... for whom? If a USB-C device requires 12 V, can the power source deny the request?

Multi-port USB-C chargers add a further potential complication: the aggregate power available across all ports combined may be less than the sum of each port's individual maximum. Now that multiple voltages and current capacities are in play, it is important that the charger deliver the appropriate power on each port, even when the devices on the other ports are also highly demanding.

Originally Posted By: Benson Leung
The Type-C spec does give some guidance as to how to handle dynamically changing load situations, but for multi-port charger scenarios with a limited pool of shared power, there's really no easy way to tell what will happen as a consumer. It's entirely up to the manufacturer to sort out what to do with the ports when the load changes on them.


It is common for USB 2.x multi-port chargers not to allow full power on all ports at the same time. Even the rather nice Choetech six-port 50-watt charger cannot deliver a full 12 watts from all six ports simultaneously, since six ports at 12 watts would require 72 watts.
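To put numbers on the shared-pool problem: six ports asking for 12 W each need 72 W, well over a 50 W budget. Here's a toy Python sketch; the figures match the Choetech example above, but the proportional-scaling policy is pure invention on my part, since (per Benson's point) real chargers do whatever their manufacturer decided:

```python
def allocate(pool_watts, demands):
    """Toy policy: scale every port's draw down proportionally when combined
    demand exceeds the shared pool. A real charger might instead prioritize
    certain ports, renegotiate contracts, or shut ports off entirely."""
    total = sum(demands)
    if total <= pool_watts:
        return list(demands)
    scale = pool_watts / total
    return [round(d * scale, 1) for d in demands]

# Six ports all asking for their 12 W maximum from a 50 W pool:
print(allocate(50, [12] * 6))  # [8.3, 8.3, 8.3, 8.3, 8.3, 8.3]
```

With USB-C PD the picture is messier still, because shedding load can mean renegotiating a port down to a different voltage/current contract rather than just trimming current.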

I have done no reading at all on what power management means for a multi-port USB-C hub (as opposed to just a charger), never mind routing DisplayPort-style video through the hub or back-charging the laptop through it. I don't even know whether those things are possible.


Edited by K447 (04/05/2016 04:02)