OK, I understand electronics, SMPS, etc., and have designed quite a number of PWM devices, and I have had solar for in excess of 20 years. I am a retired EE. Just saying this to clear the decks of assumptions and unneeded explanations.

So, looking at the specs, with no particular explanation of how the specific units work, I see that the input voltage of the units, from the PV panels, is allowed up to around 48 or 50V, depending on model. These models are intended for use with 12 or 24V systems, considerably lower in voltage. Yet I see in another thread that "Tom", a WindyNation engineer, suggests that the panels must be connected in parallel, with output voltage close to battery voltage. This makes no sense to me. What is the "PWM" control technique used for in these controllers supplied by WindyNation?

"PWM", to me, suggests that the chargers are "buck regulator based", and can accept the higher voltages while converting them to the selected charging voltage. In the process, converting to, say, 13 volts from a 34 volt source would lead to a charge current over double the panel current, due to the voltage step-down and the preservation of panel power. In that case, since panels typically put out around 17V at the rated performance point, one would logically put two in series, for around 34V at rated performance. A 100W panel would produce 5.8A at 17V. A typical flooded battery is around 12.3V at 60% of full, and might come up to 13V under charge. So the charger would convert 34V at 5.8A to 13V at about 14.4A (allowing for roughly 95% conversion efficiency). Just using the panels in parallel without conversion would charge at only 11.6A under the same conditions. That is a significant difference.

I suppose that an alternate meaning of "PWM" is that the regulation of voltage is simply done by switching a "pass" device on and off, with no attempt at converting voltage. While possible, that would make no sense as far as the maximum input voltage, since most of the panel power at higher voltages would be wasted, unused. Even with a single panel you might get only about 75% of its actual power, and far less from a higher-voltage series string.

Note I am NOT confusing this with MPPT, which actively seeks a voltage that will produce maximum charge current. MPPT is just a way of controlling a PWM-type "buck regulator based" charger so as to find the maximum possible power point as sun conditions change, potential panel shading occurs, etc., throughout the day. It is NOT the same as voltage conversion. An MPPT controller may improve over the standard voltage-converter type by another 15% to 20% as sun conditions change.

I would assume that the "ordinary PWM" (non-MPPT) charger would simply be set to an expected panel voltage, which it would hold the panels to, adjusting current to maintain that voltage and accepting whatever increased current that can provide at the lower charging voltage coming out of the converter. A smarter charger would read the open-circuit voltage and set itself to hold the panels to a FIXED voltage some amount under that, varying current draw to achieve the result. In contrast, an MPPT charger would actively vary the panel current draw, and the resulting panel voltage, for max power instead of holding a fixed voltage.

But I see "explanations" that suggest the "PWM" controllers are not actually using the power at the higher voltage, and are therefore not able to convert power and charge at any current higher than what the solar panels produce. They may simply be turning the charge current off and on at a duty cycle to provide the desired charge current.
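To put numbers on the comparison I'm making, here is a quick sketch in Python. All of the figures (17V / 5.8A panels, 13V battery under charge, 95% converter efficiency) are my illustrative assumptions, not measurements from any WindyNation unit:

```python
# Back-of-envelope comparison of the three scenarios above. All numbers
# are illustrative assumptions: 100 W panels (17 V, 5.8 A at the rated
# point), a flooded battery at 13 V under charge, ~95% buck efficiency.

V_MP, I_MP = 17.0, 5.8   # per-panel rated-point voltage and current
V_BATT = 13.0            # battery voltage while charging
EFF = 0.95               # assumed converter efficiency

# (a) Two panels in parallel, direct connection: panels are dragged down
#     to battery voltage, so charge current is just the summed panel current.
i_parallel = 2 * I_MP                        # 11.6 A
# Even one panel direct-connected runs at ~13/17 = 76% of its rated power.

# (b) Two panels in series feeding a true buck converter: power is
#     preserved (minus losses), so current scales up with the voltage ratio.
p_series = (2 * V_MP) * I_MP                 # 197.2 W at 34 V
i_buck = EFF * p_series / V_BATT             # ~14.4 A

# (c) Two panels in series feeding a duty-cycle-only "pass switch": while
#     the switch is on, the 34 V string is clamped to 13 V and the battery
#     sees only the panel current, so the series headroom buys nothing.
i_pass = I_MP                                # ~5.8 A at best

print(f"parallel, direct:      {i_parallel:.1f} A")
print(f"series + buck:         {i_buck:.1f} A")
print(f"series + pass switch:  {i_pass:.1f} A")
```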
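And here is roughly what I mean by the two control strategies, sketched in Python. The hardware-access functions (read_panel_voltage, read_panel_current, set_duty) and the 80%-of-Voc setpoint are hypothetical, just to make the distinction concrete:

```python
import time

K_VOC = 0.80   # assumed: hold the panel at ~80% of its open-circuit voltage

def fixed_voc_charger(read_panel_voltage, set_duty):
    """Non-MPPT 'smarter charger': sample Voc, then regulate the panel
    to a fixed fraction of it by nudging the converter duty cycle."""
    set_duty(0.0)                        # unload the panel to read Voc
    time.sleep(0.1)
    v_target = K_VOC * read_panel_voltage()
    duty = 0.5
    while True:
        v = read_panel_voltage()
        # More duty = more current drawn = lower panel voltage.
        duty += 0.01 if v > v_target else -0.01
        duty = min(max(duty, 0.0), 1.0)
        set_duty(duty)
        time.sleep(0.1)

def mppt_charger(read_panel_voltage, read_panel_current, set_duty):
    """MPPT (perturb and observe): keep nudging the operating point in
    whichever direction increases panel power; reverse when power drops."""
    duty, step, last_power = 0.5, 0.01, 0.0
    while True:
        power = read_panel_voltage() * read_panel_current()
        if power < last_power:
            step = -step                 # power fell, perturb the other way
        duty = min(max(duty + step, 0.0), 1.0)
        set_duty(duty)
        last_power = power
        time.sleep(0.1)
```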
Such a duty-cycle-only device would provide no advantage in charge current from putting panels in series; it would really be just a solid-state replacement for a "voltage sensitive relay", with the only significant advantage being some added metering. So, what is the true situation? Do the "PWM" chargers (not the MPPT units) do actual voltage conversion, or do they just gate the panel current to control charging?