It all comes down to geography and partly to history. The 120V home service is a kind of residue of Edison's DC power era. With DC, higher voltages become problematic to handle (mainly because of quite persistent arcs), so 120V is really about the maximum that can reasonably be dealt with in a home-like installation. And because electrification in the US started on a large scale already with the DC systems, 120V became quite widespread, so it was easier to stick with it, at least for home lighting (the main use of electricity at the time).

Most US consumers are spread across a large area, so the "last mile" of wiring (from the district distribution transformer to the consumers) is quite long, which made direct use of the original Edison concept far from optimal. Even 220V would be too low a voltage for that; the voltage in the distribution wiring needs to be higher (in the kV range), so high that it is unusable within a home. So it requires a dedicated transformer at each consumer. And having a transformer at the consumer, close to the loads, means the lower voltage behind that transformer is no longer a problem, because the distances are short. Therefore the 2x120V/240V single-phase service remained until today.

And because this scheme means there are a ton of transformers, it was beneficial to operate the mains at a somewhat higher frequency - transformers become smaller, cheaper and more efficient. Hence the 60Hz. And because each consumer needed his own transformer, it would have been quite expensive to bring in all 3 phases (except for really high-power customers, like commercial sites or larger apartment buildings). Therefore everything is operated on a single phase, even though that means extra cost and reliability complications with motor capacitors and starters. So the 2x120/240V split-phase system became the standard.
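The "last mile" argument above is really just I²R arithmetic: for a fixed power, halving the voltage doubles the current and quadruples the resistive loss in the same wire. A minimal sketch, with load and wire-resistance numbers that are my own illustrative assumptions, not from the original:

```python
# Illustrative sketch: resistive loss for delivering the same power over
# the same wire at different voltages. The 10 kW load and 0.5 ohm
# round-trip wire resistance are assumed numbers, chosen only to show
# the scaling, not taken from the text above.

def line_loss_watts(power_w, volts, wire_ohms):
    """I^2 * R loss in a feeder carrying `power_w` at `volts`."""
    current = power_w / volts          # I = P / V
    return current ** 2 * wire_ohms   # P_loss = I^2 * R

P = 10_000  # assumed 10 kW of household load
R = 0.5     # assumed round-trip feeder resistance, ohms

for v in (120, 240, 7200):
    loss = line_loss_watts(P, v, R)
    print(f"{v:>5} V: {loss:9.1f} W lost ({100 * loss / P:.3f} % of load)")
```

The kV-range figure stands in for typical US medium-voltage distribution; the point is simply that at 120V or even 240V the loss over a long last mile is prohibitive, while at a few kV it is negligible, which is why each consumer ends up with their own step-down transformer.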
People in Europe, on the contrary, tend to live more densely packed into clusters with a large enough consumer base. That means the "last mile" of the distribution (from the distribution transformer, usually 22kV/400V) became much shorter, so using 3x230V/400V and feeding it directly to the homes became the preferred way. And because that last mile runs at lower voltage and higher currents (3x230V/400V instead of a few kV), keeping the efficiency of these sections high enough is a very important factor. That led to the preference for the lower frequency, hence the 50Hz.

Another consequence was that for really high-power loads (kVA and up, mainly motors) it became very advantageous to use all 3 phases, as that just means splitting the wiring across 3 phases; with no customer transformer there is no penalty for using all of them. So unless it is something really small, most motor equipment just uses 3 phases straight away, with no hassle with complications like large motor capacitors and relays. Hence the standard became the 3-phase 3x230/400V. Only very small customers, where many of them are packed together in a single building (units within apartment complexes, ...), use just a single-phase service.
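The three-phase advantage mentioned above can be put in numbers: a balanced three-phase load draws P = √3 · V_LL · I, so the same power needs far less current per conductor than a single-phase connection. A small sketch, assuming a hypothetical 10 kW motor load and unity power factor (both my assumptions, for illustration only):

```python
import math

# Illustrative sketch: conductor current needed to deliver the same power
# single-phase (230 V) vs. balanced three-phase (400 V line-to-line).
# The 10 kW load and unity power factor are assumed for illustration.

def single_phase_current(power_w, volts):
    return power_w / volts                      # I = P / V

def three_phase_current(power_w, volts_ll):
    return power_w / (math.sqrt(3) * volts_ll)  # I = P / (sqrt(3) * V_LL)

P = 10_000  # assumed 10 kW motor load
print(f"single-phase 230 V: {single_phase_current(P, 230):5.1f} A per conductor")
print(f"three-phase  400 V: {three_phase_current(P, 400):5.1f} A per conductor")
```

Roughly a third of the per-conductor current, plus a self-starting rotating field, is why European motor equipment above the smallest sizes simply takes all three phases.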
In early installations many places in Europe also started with 120V DC, but there were far fewer installations, usually not exceeding the premises of a single factory or farm, so not much of a customer base. Later, a safety concept of a "not that deadly" 70V against ground became a thing, as a way to protect people when a short circuit to the conductive case of a piece of equipment happened. But I suspect this was an afterthought of distributing 3-phase (3x70/)120V power, where the 120V, the same voltage as the earlier DC systems, was taken from between the phases. And then there were two world wars sweeping across Europe, both requiring a complete infrastructure rebuild from scratch, so there were not many legacy systems left and it was an easier decision to go for a system better optimized for more modern needs.
And recently a lot of ELV (extra-low-voltage) DC systems are emerging across the globe, mainly in what is called the "third world" - simply because that is the easiest way to build small isolated systems, usually running on solar power with batteries and powering lights and some IT (phones, personal computers, TVs, ...). Keeping an inverter running there means quite a high idle power drain, so the main installation is directly at the DC battery level (usually 12, 24, 36 or 48V nominal), and the inverter is turned ON only when devices needing AC are actually used. This market has become big enough that it is gradually getting standardized, to the point that a lot of off-the-shelf equipment designed to run on these systems is available. It is intended for very low-income (by Western standards) users, but there are quite a lot of such people, forming a customer base significant enough to justify things like standardization (it seems to be settling on 48V, and maybe also 24V). Moreover, unlike with AC, it is easier to design a DC appliance that can run across multiple voltage levels, like 24 to 48V; some low-power ones even cover the whole 12..48V range.
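The pull toward 48V over 12V in these ELV installations is the same I²R story on a smaller scale: quadrupling the voltage cuts the branch-circuit current to a quarter and the wiring loss to a sixteenth. A sketch with assumed numbers (a hypothetical 100 W load and branch-wire resistance, not from the original):

```python
# Illustrative sketch: resistive loss in an ELV DC branch circuit at the
# nominal battery voltages mentioned above. The 100 W load and 0.2 ohm
# round-trip wire resistance are assumed numbers for illustration.

def dc_loss_watts(power_w, volts, wire_ohms):
    current = power_w / volts          # I = P / V
    return current ** 2 * wire_ohms   # P_loss = I^2 * R

P = 100  # assumed 100 W of lights and electronics
R = 0.2  # assumed round-trip resistance of a thin branch wire, ohms

for v in (12, 24, 36, 48):
    loss = dc_loss_watts(P, v, R)
    print(f"{v:>2} V: {loss:6.2f} W lost ({100 * loss / P:.2f} % of load)")
```

That scaling, plus thinner (cheaper) copper, is a plausible reason the emerging standardization appears to favor 48V while still staying under common ELV safety limits.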