Why a 220-volt outlet carries 50 hertz

The development of electrical engineering gained momentum in the second half of the 19th century. It was then that scientists made a series of discoveries that opened the way to practical applications of electricity. Thomas Edison invented a practical incandescent light bulb and, promising everyone very cheap lighting, set about building power plants.

The first lamps were arc lamps, in which the discharge burned in open air between two carbon rods. It was found empirically that the most suitable voltage for sustaining the arc was 45 V. To limit the short-circuit current that flowed at the moment of ignition (when the carbon rods touched), and to make the arc burn more steadily, a ballast resistor was connected in series with the lamp. It was also found that the ballast resistance should be chosen so that the voltage drop across it in normal operation was about 20 V. Thus the total voltage of early DC installations was 65 V, and this standard held for a long time. Often, however, two arc lamps were connected in series in one circuit, requiring 2 × 45 = 90 V; adding the 20 V drop across the ballast resistor brought the total to 110 V.

Thomas Edison's mistake was that he generated direct current and tried to transmit it through the wires as such. The supply radius did not exceed a few hundred meters, and the losses were enormous. Attempts to extend the supply area led to the so-called three-wire DC system (110 × 2 = 220 V).

At the same time, Nikola Tesla led the development and deployment of alternating-current generators and systems. Using alternating current at several thousand volts simplified and cheapened the electrical network and extended the supply radius (more than 2 km with losses of up to 3% in the main wires, versus 17–20% in DC networks). At the consumer end, transformers stepped the voltage down to 127 V: with a line-to-line voltage of 220 V, the phase voltage is 220 / √3 ≈ 127 V.
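The line-to-phase relation above can be checked with a few lines of Python (the function name is illustrative, not from any standard library):

```python
import math

def phase_voltage(line_voltage: float) -> float:
    # In a three-phase system, the phase (line-to-neutral) voltage
    # equals the line (line-to-line) voltage divided by sqrt(3).
    return line_voltage / math.sqrt(3)

print(round(phase_voltage(220)))  # historical network: 220 V line -> 127 V phase
print(round(phase_voltage(380)))  # later network: 380 V line -> ~219 V, nominally 220 V phase
```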

This arrangement lasted in the USSR until the 1960s, when the number of electrical appliances outgrew what the network could carry. To reduce the load on the wires, one had to either thicken the conductors in the cable lines or raise the voltage: at a given power, the current falls as the voltage rises (I = P / U). The lesser evil was chosen, and the phase voltage was raised to the same 220 volts.
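A small sketch of why raising the voltage was the cheaper option: at fixed delivered power, current falls inversely with voltage, and the ohmic loss in the wires falls with its square. The wire resistance of 0.5 Ω here is an assumed value for illustration only:

```python
def line_loss(power_w: float, voltage_v: float, wire_resistance_ohm: float) -> float:
    # Current drawn at a given power: I = P / U
    current = power_w / voltage_v
    # Ohmic loss dissipated in the wires: P_loss = I^2 * R
    return current ** 2 * wire_resistance_ohm

# The same 1 kW load over the same wire (0.5 ohm assumed):
print(line_loss(1000, 127, 0.5))  # ~31 W lost at 127 V
print(line_loss(1000, 220, 0.5))  # ~10 W lost at 220 V
```

Tripling the loss by staying at 127 V would otherwise have had to be offset with thicker, costlier copper.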

The Russian engineer Mikhail Dolivo-Dobrovolsky was the first to propose decomposing current into active and reactive components, and he recommended the sinusoid as the basic shape of the current curve. As for frequency, he argued for 30–40 Hz. Later, after critical selection, only two industrial frequencies survived: 60 Hz in America and 50 Hz in other countries. These frequencies proved optimal: raising the frequency excessively increases the rotation speed of electric machines (for the same number of poles), while lowering it causes noticeable lamp flicker, hurting the uniformity of illumination.
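The link between frequency and machine speed follows from the synchronous-speed formula n = 60 · f / p (rpm), where f is the frequency in Hz and p the number of pole pairs. A quick illustration:

```python
def synchronous_rpm(frequency_hz: float, pole_pairs: int) -> float:
    # Synchronous speed of an AC machine: n = 60 * f / p (revolutions per minute)
    return 60 * frequency_hz / pole_pairs

print(synchronous_rpm(50, 1))  # 3000 rpm at 50 Hz with one pole pair
print(synchronous_rpm(60, 1))  # 3600 rpm at 60 Hz with one pole pair
```

At the same pole count, a 60 Hz machine must spin 20% faster than a 50 Hz one, which is why frequency could not be raised without limit.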

That is why our outlets deliver 220 V at 50 Hz.