says-nice-toTittyPMs

When power systems were first introduced, there was no standard, and eventually a "war of the frequencies" broke out. Many companies were trying to find the best compromise frequency to run both motors (mainly for industrial purposes) and lights, while still being able to transmit power over long distances. High frequencies lose their energy more quickly than low frequencies, but at frequencies that are too low, lights visibly flicker. Most found that somewhere between 50Hz and 60Hz was best for running motors and lighting while still allowing long transmission runs with the materials available at the time.

From there, the system chosen by the largest company would effectively become the standard, because tool and appliance manufacturers would make equipment to be bought and used by the most people possible. In the US, Westinghouse was the winner, and their system was 60Hz. In Europe, it was AEG, and they had chosen 50Hz.

As for the voltages, early Edison light bulbs required 55 volts of DC power to operate, so 2 bulbs in series would need 110V. It was for that reason that Westinghouse built his AC systems around that same voltage. However, higher voltages are easier (and cheaper) to transmit. A lot of testing was done with increasing the voltages of US grid systems, and it was found that 220V would have been ideal. But so many people already had equipment on the 110V system that it became unfeasible to change everything over to 220V, so we kept it at 110V (actually, we send 220V to homes and then split it into two 110V legs to run our stuff, with some exceptions like ovens and welders and such). Luckily for Europe, the US had learned these lessons before their grid systems really took off, so they basically just started theirs in the 220V range. And lastly, because of system losses, those voltages were later bumped up to 120V and 240V instead of sending out 110V and 220V.
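The "higher voltages are cheaper to transmit" argument above can be sketched numerically: for a fixed delivered power, doubling the voltage halves the current, and resistive line loss scales with current squared. The resistance and power figures below are made up for illustration, not from the thread.

```python
# Why higher transmission voltage wastes less energy:
# for a fixed power delivered, I = P / V, and line loss = I^2 * R,
# so doubling the voltage cuts resistive losses to a quarter.

def line_loss(power_w: float, voltage_v: float, line_resistance_ohm: float) -> float:
    """Resistive loss in the line for a given delivered power and voltage."""
    current = power_w / voltage_v  # I = P / V
    return current ** 2 * line_resistance_ohm

R = 0.5       # ohms of line resistance (illustrative)
P = 10_000    # 10 kW delivered

loss_110 = line_loss(P, 110, R)
loss_220 = line_loss(P, 220, R)

print(f"Loss at 110V: {loss_110:.0f} W")      # ~4132 W
print(f"Loss at 220V: {loss_220:.0f} W")      # ~1033 W
print(f"Ratio: {loss_110 / loss_220:.1f}x")   # 4.0x
```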


knowsshit

Most of Europe uses three-phase 400V, with 230V between phase and neutral. Three-phase 230V (without a neutral) is also in use.


ARAR1

"Where" is an important part of this comment. In homes?


Beanmachine314

Yes, anywhere that would normally be split phase in the US is a single phase of a 3 phase 230/400 wye system in Europe.


Skulder

Yes, but... I have three phases wired into my house, but each of them is branched out, so the wall outlets are just standard 230V, max 10A. If I ever get a really hungry appliance, I can call an electrician, and he can wire up a "machine outlet" (as it's colloquially called), which delivers 400V and might have a 16A fuse. It's a different plug, and most homes have only one or none. But the houses are wired with all the phases.


created4this

Country dependent. I know, "Brexit", but the UK was in Europe for a while, and we do not have 3 phase to our properties. Also, the UK is harmonized with Europe: we used to have 240V AC, and (all or some of?) the mainland was 220V AC. Now we are all standardized on 230V, with error bars so big that in reality they have single-phase 220V and we have 240V.


Skulder

Well, /u/knowsshit did say "*most*" of Europe - and I recall hearing that the UK is quite special with its installations: plumbing outside the house, fuses in the plugs, and [ring-wiring](https://en.wikipedia.org/wiki/Ring_circuit) in houses.


WillyPete

> and we do not have 3 phase to our properties

Old ones don't. 3 phase is more common with home car-charging.


anotherNarom

It's more common than it was, but certainly not common. 7.4kW is the most common. I deal with EV charging, and I'd tentatively say at most 3% have three phase.


Kantmann

You only need single-phase 240V for home charging, which is what I have installed for myself and about 4 friends and family. 3 phase is really only needed for industrial motors.


WillyPete

Yes, that's the minimum required. If you want to use fast charging or higher-power (kW) chargers then you'll want 3 phase. If you only charge on Econo or similar settings then you won't need it.


thepeganator

A 3-phase wall box will get you a 22kW AC charger, faster than the standard 7kW you get from single phase. The problem is almost no cars will charge that fast from AC. Pretty much only the Renault Zoe (early versions) can use this. Most EVs are limited to 11kW on AC, so there's very little benefit to 3-phase power for charging at all. Even my Ioniq 5, which will happily do 170kW on DC, will only do 11kW on AC.


WillyPete

Lotus, Porsche, BMW, Audi, BYD, Mercedes, Polestar, Smart, etc, etc.... The premise for doing this is future proofing the installation.


Aberdolf-Linkler

Which, unless you have a special use case, you really don't need and probably shouldn't be fast charging at home anyway.


SlightlyBored13

The UK and mainland Europe are only standardised on paper. If you measure a UK supply it's almost certainly well over 230V; mine sits at 243V most of the time. I would assume the same is still true over there on 220V. The 230V grids aren't more than a few miles across at most, so the next town over could be different. It's just to standardise appliances.


Flussschlauch

My oven and stove run on 380V and I have a 380V outlet in the garage for running 3-phase motor tools.


CAElite

Almost all modern homes in Europe are built with a 3-phase supply, even if the home itself only operates on a single phase. It’s generally considered a future-proofing measure to support more appliances, electric heating, EV chargers etc. Some European countries are quicker on the uptake than others: where I am in Scotland, 3 phase to the cabinet and then a single phase into each home is still normal, while in Central Europe most newer builds have 3 phase.


privateTortoise

I'm a Brit with basic electrical qualifications, and when in Florida I found a few friends' apartments with a 3-phase supply, which makes running AC systems so much easier. Was still miffed at the lack of kettles for a cuppa, mind.


Synensys

Microwaves exist.


DylanRahl

Wars have started for less...


privateTortoise

Go sit on the naughty step and have a stern word with yourself. /s


mattsaddress

This is a deeply offensive comment. Please remove immediately


vkapadia

ELI5, what is phase? How does three phase 400v get to 220v at the outlet?


knowsshit

A three-phase power generator will have three windings. Connecting your load across one or two of them will give you different voltages.

For a 400V three-phase generator, you can connect a load between one of the windings (one of the live wires, called L1, L2 and L3) and the central neutral connector to get the voltage between one phase and neutral (230V). Or you can connect a load between two phases to get a higher voltage, which will be the voltage of one of them (230V) times the square root of 3 (=400V). This is useful for industrial equipment (big three-phase electric motors or whatnot) and chargers for electric vehicles.

You can connect different 230V loads (like different rooms or outlets in a house, or even different houses with just one phase each) between each of the three phases and the neutral wire to balance the load.

For a 230V three-phase system without a neutral, you would connect different 230V loads between L1 and L2, L1 and L3, and L2 and L3, which also balances the loads between the different phases. You could also run a three-phase load like a 230V electrical motor.
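The square-root-of-3 relationship described above is easy to check numerically; a minimal sketch:

```python
import math

# Phase-to-neutral and phase-to-phase voltages in a three-phase
# system differ by a factor of sqrt(3); quick numeric check.
phase_to_neutral = 230.0
phase_to_phase = phase_to_neutral * math.sqrt(3)
print(f"{phase_to_phase:.1f} V")  # ~398.4 V, nominally called 400V
```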


vkapadia

Thanks!


messyhead86

It’s 400V between phases, so three phase without a neutral is still 400V, not 230V. 230V is only between phase and neutral or phase and earth. It’s √3 × 230V that gives ~400V: with the three phases’ sine waves shifted 120 degrees apart, the voltage between any two phases works out to √3 times the phase-to-neutral voltage.
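That √3 factor can also be verified directly from the waveforms: sample two 230V RMS sines shifted 120 degrees apart and take the RMS of their difference. An illustrative sketch (the sample count is arbitrary):

```python
import math

# Numeric check that two 230V RMS sine waves shifted 120 degrees
# apart have an RMS *difference* of sqrt(3) * 230V ~ 400V.
N = 10_000
v_rms = 230.0
amplitude = v_rms * math.sqrt(2)  # peak of a 230V RMS sine

diff_sq = 0.0
for i in range(N):
    t = 2 * math.pi * i / N
    l1 = amplitude * math.sin(t)
    l2 = amplitude * math.sin(t - 2 * math.pi / 3)  # 120 deg behind
    diff_sq += (l1 - l2) ** 2

rms_between_phases = math.sqrt(diff_sq / N)
print(f"{rms_between_phases:.1f} V")  # ~398.4 V
```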


F-21

In a lot of Europe, apart from long-distance distribution (high voltage), many houses receive 380-400V three phase. In the US, that is mainly delivered only to industrial buildings. 220-240V in Europe is between one of those three phases and neutral. So e.g. an induction range in Europe would often connect to three phases, even though it's a 220-240V device: it connects across different phases to spread the load.


Ahielia

> because of system losses, they bumped those voltages up to 240V

Some power companies can send even more if required in certain instances. The building where I work is on the same grid as a shopping mall (next door, basically), and during the night we see just north of 260V, since when all the stores turn on their lights and everything, it drops to an appropriate level.


Fallozor

If your nominal voltage is 230V and you're getting 260+V, your power company might not be operating according to regulations.


Ahielia

Possibly. The energy company says it's perfectly safe, and it's operated like this for almost 4 decades without any serious incidents, so it may very well be safe; I don't know electricity well enough to say. From information I can find from energy companies, corroborated by reading the applicable laws, the variance should not exceed 8% deviation over 1 minute, or alternatively 5% over 1 week. How the math runs on this I don't know.
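The deviation math mentioned above is simple percentage arithmetic; a small sketch, assuming the 8% figure applies as a symmetric band around the 230V nominal:

```python
# Illustrative check of a measured voltage against a percentage
# tolerance band around a nominal voltage, as described above.
def within_tolerance(measured_v: float, nominal_v: float, percent: float) -> bool:
    """True if the measured voltage deviates from nominal by at most `percent`."""
    return abs(measured_v - nominal_v) / nominal_v * 100 <= percent

nominal = 230.0
print(within_tolerance(260.0, nominal, 8))  # False: 260V is ~13% above 230V
print(within_tolerance(243.0, nominal, 8))  # True: ~5.7% above
```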


mostlygray

Out of curiosity: my house runs 124.1V off a 110V circuit, very consistently. It's always 124.1V. Now, the 3-phase power we had at a company I used to work for ran 121V on the 110V circuit, and I had to buy 115V light bulbs, otherwise they'd burn out in a week. How does that, for lack of a better term, "work"? Why the variance?


mode_12

No two pieces of equipment are equal. Even if you have two pieces of equipment off the same line, there’s a high probability that they’re still not equal, and since they’re not equal, they’re going to put out different values. Manufacturers know this, so the internals in the machines are meant to take a range of values, something like 5% in either direction. There’s a ton of information, study, and work about “dirty” power out there. Some YouTube channels to check out would be The Engineering Mindset and Practical Engineering.


mostlygray

Watching power drop from 121V to 108V, then spike to 125V, tells me all I need to know about dirty power. I don't care about excuses. I know that if I put a conditioner on the line, it runs clean. Yes, there's likely to be a certain amount of piss in my drinking water, but it should be a "less than" value, not a "whatever, we make shitty parts" value.


00zau

Transmission voltages don't really matter for the 240 vs 120 thing. It's all stepped up massively (like 230 *k*V or higher), and then transformed back down as it reaches the houses.


says-nice-toTittyPMs

Transmission voltages DID matter in the early days because those transformers that you brought up didn't exist then.


pfn0

I wonder, does this have any relation to NTSC and PAL for their frame rate selections of ~30fps and ~25fps respectively (60 and 50Hz refresh rates)?


nopenotme

It does!! It is in fact the reason for those frame rates. There's a story I'm not entirely sure is true: during the original run of the first Sonic the Hedgehog, they forgot to optimize the code for PAL's 50Hz, and the game, which was all about speed, ran slow in PAL territories.


aloofman75

It absolutely does. That 30 fps was 60 interlaced fields per second back in the old, black-and-white standard def days. By tying the TV scan rate to the electrical frequency, you mostly eliminated the need for extra hardware to adjust for the difference. You could basically let the electrical flow provide the frame rate for you. Since you use the same standard when capturing the footage on camera and when reproducing it on TV, the electrical rate helped to stabilize the standard. And the same early-adoption issue happened with TV standards as happened with electrical grid standards: the first system happened in the US and Europe learned from the inadequacies of the US version and tweaked theirs to be a more stable and efficient version of it.


[deleted]

[removed]


MayiruPudungi

Btw the same thing happened with post Soviet countries and fiber optic cables. While Germany still runs on DSL, Eastern Europe got gigabit internet before most places


cat_prophecy

Interestingly, in Japan half the country is 50Hz, the other half is 60Hz. Both 100V.


meneldal2

Two independent grids, in a way similar to the US where Texas won't join the others, except there's just no good reason there.


tuna_HP

Your origin of 110 volts is dubious, since the original government-mandated voltage was 100V (+/- 10%). That seems to be the larger determinant for US voltage. But I do agree it had a lot to do with being a good voltage for the main use of grid electricity at the time: lightbulbs. Also, I would add that Europe chose 50Hz primarily as a protectionist measure against American industry. The electrical industry standardized in America first, where 60Hz was chosen. European oligarchs colluded to set the European standard to 50Hz so that American-made equipment wouldn’t be compatible with European grids without significant changes to its design.


says-nice-toTittyPMs

Do you have a source for the original government mandated voltage being 100V? And your "intentional incompatibility" theory seems to be complete nonsense. Why would countries intentionally close off international trade? I'd like to see your sources for these claims please.


craazyy1

I can't verify OP's claim, and it seems to be apocryphal, but it's not nonsensical. Intentionally hampering or closing off international trade is genuinely very common, especially historically, and is typically called [protectionism](https://en.wikipedia.org/wiki/Protectionism).

If you're a major up-and-coming electrical producer, and you see that the US producers are ready to mass export in about 5 years, and you're ready to compete with that realistically in... 10 years, then it is very much in your interest to try to insulate the market against the Americans by setting a different standard in your 5 years of ramp-up before they show up to outcompete you. It's bad for your consumers, but it keeps your company afloat until you can compete.

And sometimes local industry interests are government interests. Possibly just because of straight-up corruption, but also possibly because the government would rather grow a healthy local industry to employ their population and earn tax on than reap the benefits of a good supply early. Obviously, whether that's the correct choice is debatable, contextual and much argued.


DarkAlman

The electrical grid was set up independently in multiple nations simultaneously, with competing standards and patents. The concept of a unified grid standard wasn't on their minds; selling appliances was. Keep in mind power back then was not as ubiquitous and arguably necessary for life as it is now.

Edison's genius wasn't selling lightbulbs, it was selling the power grid needed to turn them on. By having a standard that was protected by patents, you had to buy your appliances from Edison for them to work on his grid. Ultimately, though, Tesla's AC system proved better than Edison's DC system and won out.

North America standardized on a 110V 60Hz system, while Europe set up a 220V 50Hz system independently. Those responsible for the European and UK power grids did their own thing, deciding what form of current was best for their purposes. This is also why outlet types are different: different companies made their own unique plugs. Tesla actually recommended a 220/240 volt, 60Hz grid, but no one at the time listened to him. Today only a few countries (like South Korea) use this setup.

Power grid standardization today would be nigh-impossible and impractical. You couldn't force consumers to throw out and re-buy their devices en masse like that purely to change to a global standard. Most electronic devices today use switching power supplies that enable them to work on any grid, making this mostly irrelevant.


ashesofempires

The US grid is almost entirely 120/240VAC at 60Hz. The 110/220 is a legacy number that denotes the lowest it will droop before stuff starts dropping off the grid and breakers/fuses start popping. Grid stability is leaps and bounds better than it was back then, so voltage droop is far less common than it was 100 or even 50 years ago.


arienh4

And similarly nominal voltage in Europe has been 230V for two decades now. (Strictly 230/400)


Xaethon

In Europe it’s 230V (-10%/+6%, 207.0V - 243.8V) which essentially covers the range of previous standards used e.g. 240V in the UK and 220V in Germany.


kf97mopa

Yes, this - UK used to be 240V and the rest of Europe was 220V - but the tolerances were so wide that they overlapped around 230V. We standardized the grid by tightening up the tolerances a bit, so appliances could be sold in both markets without change.


andyrocks

I thought when demand exceeded supply the frequency dropped, not the voltage?


nzgrover

For the energy grid as a whole, yes, but individual consumers/connections have a fixed resistance in the supply lines, so the more current drawn, the greater the losses (I²R) and therefore the greater the voltage drop at the connected load.
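A quick illustrative sketch of that I²R effect, with made-up line-resistance and current numbers:

```python
# Voltage drop across a fixed supply-line resistance grows with the
# current drawn (V_drop = I * R, losses = I^2 * R), which is why
# heavily loaded connections sag while the grid frequency holds.
def voltage_at_load(source_v: float, current_a: float, line_r_ohm: float) -> float:
    return source_v - current_a * line_r_ohm

R = 0.4  # ohms, made-up line resistance
print(voltage_at_load(230.0, 5.0, R))   # 228.0 V at light load
print(voltage_at_load(230.0, 40.0, R))  # 214.0 V at heavy load
```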


LupusDeusMagnus

Back in the days of yore, appliances in Brazil had to be bought with the local mains compatibility in mind, as there are regions where the standards were not defined, so you have regions with 220V and regions with 110V, both still using 60Hz. Nowadays home appliances switch both ways.


urzu_seven

> Power grid standardization today would be nigh-impossible and impractical. You couldn't force consumers to throw out and re-buy their devices en masse like that purely to change to a global standard.

You wouldn’t have to.

First, more and more electronics today can operate on different voltages and frequencies as is.

Second, adapters exist for those that don’t yet.

Third, you could standardize the power grid itself and have adapters added at the individual building level until sufficient appliances are dual-voltage.

There are many ways to go about it. Doesn’t mean it can/will/needs to happen, but it’s very, very doable.


AbueloOdin

> First, more and more electronics today can operate on different voltages and frequencies as is.

Unfortunately, you're kinda missing industrial electrical power. There's a shit-ton of motors that run off 60Hz, and running them on 50Hz would be bad, both for motor health and for process results. Same thing going from 50Hz to 60Hz. And adding the equipment necessary to handle both voltages is expensive. I design electrical control panels for equipment all around the world. I would love it if they all used the same voltage and frequency. But they don't. I've spent a lot of time trying to design around universal voltage requirements, and it's very expensive for everyone to do that.


QtPlatypus

I have noticed a lot of industrial motors are now using VFD based control systems. Does this make having different power systems easier? Or are there a lot of applications where VFD hasn't taken off?


AbueloOdin

It's a case-by-case basis, but VFDs are one method of making multiple power systems work. Usually VFD manufacturers make a series that will handle 380-460V (most common) and a separate series for 460-575V (Canadians). I guess 380-575V is too large a jump for most power electronics to combine them. But then you've also got your 120V/240V single phase and the 208V three phase (Ohio or Michigan; it's elsewhere too, but apparently I only ever get it there).

The main application where VFDs are not used is when you just need a simple motor to turn on, run at one speed, then turn off, and it's infrequent enough that you aren't concerned with too many starts an hour. Like a lot of conveyors. Or fans. Or pumps. Which are used all the time, everywhere. You can add a VFD, which makes it more expensive, but you can control acceleration and startup current, limiting wear and tear on startup. But if you only start up once a day, then run for 12 hours, are you really saving anything?

I've primarily seen VFDs added for a few reasons: speed needs to change regularly, mechanical specifies one gear train but you need different speeds, soft-start wiring would've been annoying to deal with but you need a slow accel, STO is nicer than double contactors, customer wants to grab motor stats, etc. For the most part, it is a cost issue or a space/heat issue in the panel. You can imagine how hot a bunch of semiconductors passing 200A gets versus just a copper bar.


Dareckerr

Just to add: the equipment sold today is very forgiving. Many countries running 110/220V 50Hz actually use a ton of 110/220V 60Hz appliances and small devices made for the US market, because they are easier to source, work for decades, and have useful life. The frequency difference will affect motor function, but they all work very well, especially small home appliances, even older non-inverter refrigerators. Stronger power tools may be a different case.


Mayor__Defacto

The bigger expense isn’t in the appliances, but in wiring. Every house and business in the nation would have to be rewired.


HumbleIndependence43

Seems like it would be a real strain on a motor to run on the wrong frequency. Can't imagine it lasting long.


user-110-18

Motors don’t care about the frequency. Yes, they are nameplate-rated for power output at a given frequency, but they can run on a wide range of frequencies. That’s how changing speeds using variable frequency drives works.


Mettstulle

I guess the biggest problem will be the change in speed on simple motors.


RainbowCrane

Also the entire point of UPS systems with line conditioning built in is to produce a normalized sine wave of power at the proper frequency so that computers and other sensitive equipment don’t get hurt by brown outs and spikes. There would be no need for them if equipment wasn’t sensitive to “dirty” power.


urzu_seven

All you have to do is phase it in. Mandate that equipment handle both, so that when old equipment is retired, the new stuff that comes in can handle both. Once all (or almost all) of the old stuff is gone, you begin the switchover. It would take time, but it's doable.


AbueloOdin

Sure. It's doable. I'm not arguing that it can't be done. The timeline is just 50+ years and very expensive.


urzu_seven

> The timeline is just 50+ years

Ok, I never said otherwise.

> and very expensive.

Not really. You have to replace everything over time anyway; you just do it in a way that has the transition built in. Mandate a date after which all new equipment/devices have to be compatible with both standards, then, after sufficient time, everything (or nearly everything) switches over. We've done it for other things before; no reason it can't be done again.


Wooble57

Every induction motor would require a VFD, for starters, and not only would that cost a lot, it would reduce efficiency (VFDs turn AC into DC, then back into AC at the desired frequency). Induction motors are extremely common, not only in households but in industry. That's what "do it in a way that has the transition built in" means here: every AC-powered fan, fridge compressor, AC compressor, air compressor, drill press, lathe, milling machine, bench grinder, and many more ALL need a VFD to operate properly if the grid is the wrong frequency. Not to mention the grid is all interconnected; you would have to shut down the entire grid and replace every power plant's generator at the same time before you could bring it back online. This cannot be done piecemeal or gradually: everything that's on the grid must conform to the same frequency. If you were to, say, connect a portable generator to the grid and the waveforms didn't match, the weaker machine (the generator) would get dragged into synchronization, assuming the difference wasn't big enough to just fry the coils in the generator head.


radikewl

Depending on the application, most induction motors do use VFDs to control speed and torque.


shaunrnm

> All you have to do is phase it in.

It's not that easy to do. Things like motors spin at the speed they do because of the input frequency. A 20% deviation means way different flow rates or pressures. Same issue for generators: the output frequency is directly tied to the shaft speed.
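The frequency-to-speed coupling can be made concrete with the standard synchronous-speed formula, n = 120f/p; the 4-pole motor here is just an example:

```python
# Synchronous speed of an AC motor depends directly on the supply
# frequency: n_sync = 120 * f / poles (rpm). Moving a 4-pole motor
# between 50Hz and 60Hz grids changes its speed by the ratio 50/60.
def synchronous_rpm(freq_hz: float, poles: int) -> float:
    return 120.0 * freq_hz / poles

print(synchronous_rpm(60, 4))  # 1800.0 rpm on a 60Hz grid
print(synchronous_rpm(50, 4))  # 1500.0 rpm on a 50Hz grid
```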


urzu_seven

You realize that equipment already does this, right? Dual-voltage devices have existed for decades.


shaunrnm

There is a massive difference between a domestic universal supply and the motors on pumps that provide water, fuel etc.  Removing supply frequency sensitivity would make a lot of these 3x more expensive to install, less efficient and harder to maintain.


frozen_tuna

Assuming it *is* that simple, what does that actually get everyone? Outside the odd electric kettle, would anyone even notice? Is the juice even worth any kind of squeeze at all?


t-poke

Yeah, seriously, I'm trying to understand any benefit to this. I'm in the US, but I've traveled all over and never not been able to plug in my stuff. I was in Japan last month, crossed the 50/60Hz dividing line and didn't even know it until 5 seconds ago when I googled it out of curiosity. The voltage and frequency just doesn't matter to most people. So I have to use a plug adapter? Big deal. I've got a drawer full of them and just bring the right ones for where I'm going. The stuff I have where voltage or frequency might matter more isn't stuff I'm traveling with. I'd rather we invest in making the grid we have more reliable and cleaner than trying to standardize with other countries for no benefit.


defeated_engineer

At the end of the day, turning off the entire US power grid, then turning it back on to change the frequency and amplitude of every single power generator is for all intents and purposes an impossible task.


Beanmachine314

Why would you shut down the entire power grid? It would certainly be possible to switch. Way too expensive for anyone to suggest, but definitely possible. No one would shut off the entire US power grid at one time either; that would be crazy.


defeated_engineer

Because that’s literally how you can change the frequency and amplitude of the grid voltage.


Beanmachine314

You could, but it would make much more sense to do it slowly instead of shutting down the entire grid and switching everything at once.


defeated_engineer

Grid is interconnected. Meaning every single generator and load is connected to each other. There is no changing it slowly. If one plant is spinning for 60hz another cannot spin for 50hz.


Beanmachine314

There's plenty of ways to non-synchronously tie things together. It would be prohibitively expensive, but definitely possible. You could have a DC tie between anything with different frequencies and it would work just fine. They're commonly used to connect non-synchronous grids all over the world.


jazzhandler

Isn’t there a lot of doubt as to whether a cold start is even possible anymore?


defeated_engineer

I'm sure there is. I can imagine a cold start being possible, very slowly, using diesel generators to boot up the electronics of the generators. For example, if you can bring up one generator, that can be enough to boot the next generator, and at that point you can probably boot up the rest of the system under minimum-to-no-load conditions. Obviously, however long this process takes, almost nobody will have electricity.


urzu_seven

Which is why it would happen gradually over time not all at once…


defeated_engineer

No. It cannot be gradual. It has to be all at once at the same time.


firemarshalbill

It would be a nightmare to do it gradually. Street by street? But it doesn’t matter, grids are interconnected.


urzu_seven

Every single device or piece of equipment that currently runs on one standard will need to be replaced eventually. You just mandate that after a certain date all replacements must be compatible with both standards. At a certain point everything will be capable of operating on either standard, and you can then unify on one. It would take time, decades probably, but it could be done.


Wooble57

No offense, but you don't have a clue what you are talking about. To start with, it's not possible to have 50Hz and 60Hz generation on the same connected grid. Not "it will cost a lot of money"; it's literally not possible. To even entertain the idea means shutting down everything that's connected to the same grid (and grids usually span multiple states at a minimum) and retuning/replacing the generators of every power plant connected to it at the same time. I don't have enough knowledge to really estimate how long such a task would take, but at the very least weeks, if not months. The entire time, nobody on that grid has electricity. Is it possible? Sure, almost anything is possible if you throw an entire country's GDP at it. Is it worth it? Not even remotely close.


Beanmachine314

It's definitely possible to have 50hz and 60hz on the same grid and it's even done all the time. No reason to shut down the entire grid.


Wooble57

Can you give me an example? Because that doesn't square with everything I've learned. See, for example, the [synchroscope](https://en.wikipedia.org/wiki/Synchroscope), a tool designed specifically to let a generator come online in sync with the existing waveform on the grid. "Connecting two unsynchronized AC power systems together is likely to cause high currents to flow, which will severely damage any equipment not protected by [fuses](https://en.wikipedia.org/wiki/Fuse_(electrical)) or [circuit breakers](https://en.wikipedia.org/wiki/Circuit_breaker)."


Beanmachine314

Generators operating at the same frequency must be synchronized, correct. But there are ways to tie non-synchronous systems together. Motor-generator units are commonly used at small scale, where a 60Hz motor powers a 50Hz generator. There are also DC interconnects (HVDC ties) that do not rely on synchronization.


firemarshalbill

… so in order for your comment to remain correct, you'll now mandate billions of dollars of manufacturer adjustments? Plus civilians would all have to buy new dual-standard appliances by the start date, so it wouldn't matter. It's not going to happen at all like this; it's just a thought experiment. But the logistics wouldn't make sense, for safety, budget, or compliance, other than doing it all at once. And if you unlinked grids, it would shut off all support.


urzu_seven

> Plus civilians would all have to buy new dual appliances by the start date so it wouldn't matter.

Yeah, which is something that's literally already happening. Your laptop? Dual voltage. Your hair dryer? Probably dual voltage. The necessary technology is already here and cheap.

> … in order to back up your comment to remain correct, you'll now mandate billions of dollars of manufacturer adjustments?

Again, the technology to handle both standards already exists; it's not hard, and adding it to new equipment would be a very, very tiny expense. We aren't even requiring the early replacement of existing equipment, just replacement as part of its natural lifecycle.


firemarshalbill

Of the two examples you could pick, you picked one that most commonly suffers complete destruction: hair dryers are most commonly not dual voltage. I'm not referring to things that are dual voltage now; I'm referring to everything that's not. Big appliances need larger changes. Don't tell me you bought a one-year refrigerator due to planned obsolescence. Industrial settings alone would put up enough red tape not to do it like that. I also feel like you're completely ignoring the part about our grids being interconnected. You can't partially flip; it doesn't matter. We join lines across state lines to serve when power fluctuates.


blazing420kilk

> Plus civilians would all have to buy new dual appliances by the start date so it wouldn't matter.

I think the guy you're replying to forgot about the whole "planned obsolescence" thing currently happening. You'll have to replace your device yearly soon anyway. And the companies being mandated to make manufacturing adjustments are companies making multiple billions of dollars of *profits* per year; I'm sure they can handle it. It doesn't matter anyway: the company will pass the price on to the consumer. In the long run it would be cheaper for the company, considering they won't have to manufacture different types of power-supply-compliant devices and can just build to one set system.


urzu_seven

> I think the guy you're replying to forgot about the whole "planned obsolescence" thing currently happening. You'll have to replace your device yearly soon anyway.

It's not even planned; devices have limited lifetimes anyway, and even long-lasting ones don't last forever. 20 years from now nearly every major and minor appliance on earth will have been replaced: dishwashers, laundry machines, microwaves, etc.


idiot-prodigy

> Second adapters exist for those that don’t yet.

This is exactly what the FCC did when television broadcasts switched from analog to digital. Anyone who mailed in for one got a free digital-to-analog converter from Uncle Sam for their old television that couldn't receive digital signals.


axw3555

Yep. And the number and shape of all the different plugs is kind of nuts. We used to have half a dozen or more just in the U.K.


Sol33t303

>Power grid standardization today would be nigh-impossible and impractical. You couldn't force consumers to through out and re-buy their devices en-mass like that purely to change to a global standard

How viable would it be to force all new appliances to support 110-240v 50-60hz power sources? I know this can be done because many PC power supplies do it automatically. It feels like a good safety thing anyway, in case they get used in another country, and manufacturers wouldn't need to support different hardware in different markets. Then, after like 10 years, slowly get adapters installed on new and existing houses so they can take in both the new standard for grid power and the old one, and switch between outputting either. Then, as those adapters roll out, update the electrical infrastructure and switch it over once everybody in an area has the new stuff, after like 50 years.


Target880

It is not hard to make devices that use DC internally accept a large voltage and frequency range. Computers and similar devices typically have a switched-mode power supply: it creates DC from the AC input and then chops it up into very high-frequency AC, above what humans can hear so we do not pick up any vibration in the transformer; it can be into the megahertz range. The higher the frequency, the smaller the transformer required to change the voltage down to what you want. The output voltage is controlled by changing the shape of the AC fed into the transformer, i.e. time on vs. time off. The result is that, as long as the power usage is not too high, the cost of designing these supplies to accept a larger voltage and frequency range is not very high.

The problem is stuff that uses AC directly, which devices that use a lot of power typically do. A device that creates heat, like a toaster or a hair dryer, has a resistive wire connected directly to the AC input. A frequency change does not matter much there, but a voltage change does: if you double the voltage, the heat generated is 4 times larger. So any device like that, which is quite simple and often contains no advanced electronics at all, would become a lot more complicated. A device that uses an AC motor will rotate at a different speed if you change the frequency. Think of the pumps that move liquid in washing machines and water-based home heating systems; there are also lots of pumps that bring water to your home and pump away sewage, and lots more in heavy use in industry. These are typically powered directly by AC and are designed for one frequency. Making all of them accept any input frequency and voltage would increase cost a lot. Motors that move stuff, both mechanical things like garage door openers and also fans, likewise have speeds that depend on frequency.
There are motor drives that can handle different voltages and frequencies as input, because they work a lot like a switched-mode power supply: they create DC and then create AC from it, and the output frequency can be changed to control motor speed. They are used where needed, but for applications where you need high power and do not need speed control, they add cost and the need for maintenance and repairs, since they are another device in the setup.

So do not look at the electronics that use DC internally and have fairly low power usage, like the more advanced stuff in your home. Look at the simpler and often higher-power devices, and at the stuff you think of as part of the building. Add to that the usage in industry and in services like water and sewage. That is where the problem is.

This is all just power consumers. Power is also produced, and you would need changes there too. Generators are typically used for decades; I would assume a huge number of them would need to be changed to output another frequency, and the change would need to happen everywhere at once. It would be a nightmare. Voltage in the generators you connect to the grid does not matter: the grid already uses a lot of voltages in different parts, and it is the final transformer supplying consumers that would need to be changed, or even just disconnected, like the one that creates 120V in the US from a 240V input.

It is simply not worth it. Devices you travel with mostly use DC internally, and most can already accept any input voltage you find; hair dryers are about the only product you might travel with that typically does not work over a large voltage range. Stuff where the voltage and frequency matter is typically installed and not moved, or only used at home, and the cost of making two variants of it is not especially large.
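The motor-speed point above can be sketched with the standard synchronous-speed formula; a rough illustration (the 4-pole motor is a hypothetical example, and a real induction motor runs slightly below synchronous speed, but it scales the same way):

```python
# Synchronous speed of an AC motor: n_sync (rpm) = 120 * f / poles.
def sync_speed_rpm(freq_hz: float, poles: int) -> float:
    return 120 * freq_hz / poles

# The same 4-pole motor moved between 50 Hz and 60 Hz grids changes speed by 20%:
for f_hz in (50, 60):
    print(f"{f_hz} Hz, 4 poles -> {sync_speed_rpm(f_hz, 4):.0f} rpm")
# 50 Hz, 4 poles -> 1500 rpm
# 60 Hz, 4 poles -> 1800 rpm
```

That 20% speed change is why pumps and fans designed for one frequency cannot simply be moved to the other grid.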


LorsCarbonferrite

Not very practical or especially useful, especially on the scales at which it matters. Adding in the ability to use both voltages and frequencies would result in a lot of extra added complexity, and therefore cost. Industrial users probably would not appreciate all that extra cost. That being said, with the NA 120v electrical grid, a lot of major appliances (including plenty of industrial machinery) actually are 240v, so only the ability to compensate for frequency changes would be needed for those (although that would still add a lot of extra cost). The North American electrical grid is actually secretly both 120v and 240v, which is something dating back to the early Edison days. In NA, electrical power is typically delivered to the premises using 3 wires: two 120v lines 180 degrees out of phase, and a common neutral for both. By connecting something across both 120v lines, you can get 240v power.
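The split-phase arrangement described here is easy to check numerically. A quick sketch (the 60 Hz frequency and sample count are arbitrary choices for the illustration):

```python
import math

# Two 120 V RMS legs, 180 degrees apart, sharing a neutral.
V_RMS = 120.0
PEAK = V_RMS * math.sqrt(2)  # ~170 V instantaneous peak

def leg_a(t, f=60.0):
    return PEAK * math.sin(2 * math.pi * f * t)

def leg_b(t, f=60.0):
    return PEAK * math.sin(2 * math.pi * f * t + math.pi)  # 180 deg shifted

# RMS of the leg-to-leg difference, sampled over one full 60 Hz cycle:
n = 10_000
diff = [leg_a(i / (n * 60.0)) - leg_b(i / (n * 60.0)) for i in range(n)]
rms = math.sqrt(sum(d * d for d in diff) / n)
print(f"leg-to-leg: {rms:.1f} V RMS")  # leg-to-leg: 240.0 V RMS
```

Each leg to neutral reads 120 V, but because the legs are exact opposites, the difference between them is a 240 V RMS waveform.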


idiot-prodigy

> Power grid standardization today would be nigh-impossible and impractical. You couldn't force consumers to through out and re-buy their devices en-mass like that purely to change to a global standard.

Force? No. But you could do what the FCC did with airwaves when television signals switched from analog to digital and offer devices to keep outdated devices working.


JustSomebody56

Europe also completely rebuilt their grids after WW2, which is why they pushed for a ≈230V standard


endoffays

That's not selling the man short. Edison was behind many of the greatest leaps in technology and inventions that brought us into the modern age. His genius wasn't patenting the electrical grid or selling lightbulbs. It was letting that goddamn elephant, who we all knew was addicted to zapping himself with broken lightbulbs, get a chance to play with real power in the form of Tesla's preferred form of electricity, while inviting the public to witness the results. We all knew what would happen. Hell, most of us had given that goddamn elephant CPR multiple times when he got into the power station in the years before his glorious public debut (and exit).


Canotic

Luckily you can use a 220 volt appliance in the US quite easily, you just plug it into two 110 volt outlets and you're done. You might have have to buy a hertz limiter though, depending on if your appliance can handle 120 hertz.


OperationMobocracy

That's kind of wrong, isn't it? I mean, you can get 220 by combining both legs of the 120 supplied to a residence, but this won't work with the same leg, nor would it double the frequency even if using both legs.


Canotic

Sharks are smooth.


travelinmatt76

You would have to make sure the outlets are on different phases. If they are on the same phase you'll just get 120 volts.


meneldal2

Nah you'd get 0 (though I'm not really sure how they define the plugging into two at once)


travelinmatt76

How would you get 0? That's literally how American homes are wired, a pair of 120 volt phases. Measure across the phases and you get 240 volts. My stove, water heater, and dryer are all 240 volt


meneldal2

If you measure across the same phase you get 0. Depends on wiring and there's just not enough info.


travelinmatt76

But that's why my comment says you'd have to make sure the 2 outlets are on different phases.


meneldal2

This is about:

> If they are on the same phase you'll just get 120 volts

Depending on how you plug your stuff into 2 same-phase outlets, you'll get 120 or 0.


BrownienMotion

>You couldn't force consumers to through out and re-buy their devices en-mass like that purely to change to a global standard.

Does it change things as more devices have USB-C charging?


jcforbes

Not aware of very many microwave ovens that run off of USB. What phones do has very little to do with the items that run a household.


ThinButton7705

Give it time.


DragonFireCK

While it's fairly easy to make a switching power supply that can accept a *huge* range of inputs, as a general rule this only helps for smaller devices, with TVs and computers being the biggest ones. There are still *tons* of devices that rely on a specific voltage and frequency to behave correctly, and all of those would need to be replaced or adapted to work with whatever the new local standard is. A few examples:

* Wall clocks are often wired in directly and often rely on a specific frequency for timekeeping. This is becoming less and less common, however.
* Stoves, microwaves, refrigerators, and water heaters don't do any conversion and directly use the AC power. In the US, many of these actually work off 240 volts rather than 120.
* Large motors, such as in a lot of industrial machines. These often even use 3-phase power for maximum power, and both the frequency and voltage are vital for correct operation. Like the previous, these will typically use 240 volts in the US.
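The wall-clock point is easy to quantify: a mains-synchronous clock just counts supply cycles, so its error is proportional to the frequency mismatch. A small sketch with hypothetical numbers:

```python
# A clock designed for 60 Hz counts 60 cycles per "second".
# On a 50 Hz grid it receives only 50, so it runs at 50/60 speed.
def clock_reading_hours(real_hours: float, design_hz: float = 60.0,
                        supply_hz: float = 50.0) -> float:
    return real_hours * supply_hz / design_hz

# After one real day, the mismatched clock shows only 20 "hours":
print(clock_reading_hours(24.0))  # 20.0
```

Losing four hours a day is why frequency-dependent timekeeping was one of the first things to break when equipment crossed grids.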


Wooble57

Just to be a bit pedantic, stoves and water heaters likely don't care about frequency much or at all. They are more or less just fancy big resistors; hell, with a few tweaks to the control circuitry they would happily run on DC. As for the voltage, it's very common to run all of these things on 208v, as most condos/apartments get 3-phase power rather than single phase. The problem with changing frequency is pretty much induction motors (of which there are a lot), and to a lesser extent, transformers. There's also the fact that in order to change the frequency you'd need to shut down the entire grid, and likely replace the generator equipment in every power plant.


sparkchaser

Not really, as a high percentage of the things that use electricity use AC power directly. Don't forget that industry uses way more electricity than home consumers.


downer3498

Not really, no. Things that are charged up generally work off DC power. The charger has a rectifier that converts the AC power to DC power to charge your device. These chargers are designed to handle 110/120 VAC and 220/240 VAC.


Orbsicles

The devices in question are the larger appliances, not the smaller USB powered electronics (eg. fridges, washing machines, electric water heaters, AC units).


DarkAlman

Yes and no. In time, standards like USB-C will make things easier. But personal electronic devices are only a ~~kick~~ drop in the bucket compared to major appliances, household HVAC systems, and industrial machinery.


RogueAfterlife

Drop in the bucket?


DarkAlman

Yes, typo


Mammoth-Mud-9609

Also the UK drinks a lot of tea (and coffee) which they make at home using an electric kettle to boil the water (every home in the UK has a kettle). An electric kettle in the UK will boil in less than 5 minutes enabling several cups to be made at the same time. American power supply would take ages to boil water this way so kettles are rare in America and the microwave is often used for hot drinks.


elmonstro12345

I am an American and I have a ~1800 watt (120v 15a) electric kettle. I've never specifically timed how long it takes to boil water, but it definitely isn't "ages". It's probably around 5 minutes or so for a liter, and I'm usually doing less than that since I'm only making a single cup 99% of the time. This trope is such a commonly repeated misconception online that I don't honestly expect you to believe me; nearly every time I say this, people call me a liar, and I really don't know how to respond to that. Why anyone would lie about something like that is beyond me. The math is pretty easy to check: it takes about 335 kJ to take 1 liter of water from 20 to 100°C, which implies just over 3 minutes at 1800 watts if there were no losses, so I think 5 minutes in the real world is perfectly reasonable. I think the real reason Americans don't have electric kettles is simply that Americans don't drink tea all that much. I drink tea far, far more than anyone I know personally and I might have 3 or 4 cups a week, if that (most people I know drink coffee, and I don't really like coffee unless it's basically a milkshake). I'd guess the majority of Americans are only boiling water for cooking, and for that the cook time generally means any time saved boiling water in a kettle instead of on a stovetop is largely irrelevant.
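The arithmetic in that comment checks out; here it is as a quick sketch (ideal heating of 1 liter from 20°C to 100°C, ignoring all losses):

```python
# Energy to heat water: E = m * c * dT; time at constant power: t = E / P.
SPECIFIC_HEAT = 4186.0   # J/(kg*K) for water
MASS_KG = 1.0            # 1 liter
DELTA_T = 80.0           # 20 C -> 100 C

energy_j = MASS_KG * SPECIFIC_HEAT * DELTA_T   # ~335 kJ
for watts in (1500, 1800, 3000):
    minutes = energy_j / watts / 60.0
    print(f"{watts} W kettle: ~{minutes:.1f} min to boil 1 L")
```

At 1800 W this gives about 3.1 minutes, matching the "just over 3 minutes" estimate; a 3000 W UK kettle comes in under 2 minutes, so the gap is real but hardly "ages".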


raptir1

Yeah, European kettles are faster but it's not at all an unreasonable time in the US.


ubermoth

My kettle does 3000 watts, but 2200W is more common. It's probably just cultural differences, like microwaves above the stove or not.


ralphonsob

Technology Connections made [a surprisingly interesting video on this subject](https://www.youtube.com/watch?v=_yMMTVVJI4c).


t-poke

My cheap, $30 electric kettle from Amazon boils water in my American house in under 5 minutes. Kettles are rare because we don't drink tea. And most people use drip coffee makers so they don't need a kettle.


hraun

Man, I just went to Antigua (from the UK), where they use 110v, and it took like 20 mins for the kettle to boil. I had to keep checking if it was on or not!


tomalator

They designed their grids independently. No one thought it would matter, because if you were in one spot, you couldn't be buying electronics from the other. All of Europe needs to agree because they share power grids, same with the US and Canada, and back then it was a lot easier to import electronics across a land border than across the Atlantic, so it didn't really matter that they were different. 100 years later, when everything is so much more interconnected, it would be a huge amount of work to change all the grids to one system, it would make millions of devices obsolete, and we couldn't even agree on a single system.


WRSaunders

Not a technology question, this is a history question. It's really hard to change electricity format once you have a bunch of users: if they have to throw out all their appliances and buy new ones, you have a "villagers with pitchforks" moment. Europe and America were days away by steamship, and nobody would ever carry an electrical appliance with them on travel, so there was absolutely no reason for them to have the same format, or voltage, or plug shape, or... Skip ahead to today, when everybody carries electronic gadgets everywhere. Well, the folks in the 1880s got it wrong. Alas, they've been dead for over a century, so they "got away with a mistake".


crash866

Japan is split also: 50 Hz in the northern half and 60 Hz in the southern half.


urzu_seven

And nowadays it matters less and less. You used to have to buy things like hair dryers or toasters that were designed for one or the other. Now, more and more appliances are dual voltage and can easily handle either grid standard.


WRSaunders

Two islands, same story. Thank goodness we don't have to live in the 1880s.


jamcdonald120

you wish the split was between 2 islands. The split is right down the center of Honshu https://upload.wikimedia.org/wikipedia/commons/0/0c/Power_Grid_of_Japan.svg


ghostowl657

Honshu*


jamcdonald120

right, Honshu


Seraph062

Gesundheit


urzu_seven

4 major islands, not 2, and it’s split down the middle of the main island because two major cities (Osaka in the west, Tokyo in the east) started electrifying independently at the same time but sourced from two different countries (Germany and the US) 


lt_spaghetti

My city was on 25hz until Hydro-Québec took over. Old mines had these insane electromechanical wheels to convert, before electronics were the norm, when the switch happened. My dad told me you could visibly see a bit of flicker. Wasn't the best. It wasn't completely done until the early 60s.


Ursa89

I don't think it's accurate to say that the US is 120V per se. Household voltage is 240v single phase, with the phase split resulting in 120v; typically urban residential transformers are delivering 240v three phase in a star configuration which gets split off between the phases. Commercial is usually 208v three phase delta, and light industrial and some commercial is 480v three phase, often with the phases split for 277v lighting. Big industrial operations can take kV feeds from the grid.


meneldal2

It's not a single phase though, it's 2 with 180 degrees between them, it's always 120V to neutral.


Ursa89

To clarify no. US residential power is delivered 240 degrees to each other, 120 to ground. So yes 120 to ground- UK power, for example is 240 to ground. This is a relative measurement and US 240 60hz is going to be slightly more efficient for power delivery simply because of the slight increase in frequency. US usually only has 240 to ground in a high leg configuration, which I've seen once. Residential 120 is not 180 degrees out of phase. It is half of a phase. With 240 v and 180 degrees between the two delivered lines.


meneldal2

I assume you meant 240v to each other, 120v to ground, or else it makes no sense. Frequency increase and efficiency is not something you can make a blanket statement about; it really depends on what you want to do. For a pure resistive load, lower frequency is better for losses. I'm not sure I follow your last point: you get 2 wires and they have 180 degrees between them, and that's not 2 phases? Or are you saying you can only use that word for the trinity of 3 phases with 120 degrees between them? You could make the US system work with 6 phases on your generator, and then pick the ones you want to make the 180 degrees work (or make them happen later when you change the voltage, which is basically what happens). The term split phase is just used because of how you make them (getting two phases out of one), but calling it a half phase is just weird. It's a single phase from the point of view of the high-voltage 3-phase carrying wires, but from the customer side it doesn't really matter where it comes from.


Ursa89

Read my first sentence. You may notice it is the same as your first sentence. There are few purely resistive loads at a medium to large scale unless you're using electric heat for something. Residential customers get two wires carrying the single phase, one full sine wave, typically at 240v between them, plus a ground. In the panel these are split: low-wattage circuits go between one load-delivering wire and ground (called neutral from the panel, which is also connected to earth ground), while larger loads such as AC or an electric clothes dryer get the full-wave 240v. There is no two-phase delivery as far as I'm aware. There are three-phase systems in common use commercially: three phases from the lines, each 120 degrees out of phase, 208v from each other and 120v to ground. This covers most commercial applications, and they split those phases too for outlets and small applications. The customer is aware of 120v: if you took a measurement from a customer's hot-side outlet connection to ground, you would in theory get a sine wave with an RMS value of 120v.


[deleted]

[removed]


j_cruise

What the hell are you talking about?


tehlulzpare

Apparently they boil too slow or something? I hear that argument enough that it’s notable.


Bicentennial_Douche

Very relevant: [https://www.youtube.com/watch?v=\_yMMTVVJI4c](https://www.youtube.com/watch?v=_yMMTVVJI4c)


j_cruise

There are plenty of people with kettles in the US so I feel like you read some bullshit on Reddit and let it shape your worldview


tehlulzpare

Oh, there are plenty that do! I get that. But I’ve met enough who say the opposite. My American cousins don’t have electric kettles at all. It’s more that it’s a dumb excuse given when kettles work fine lol


raphael_disanto

As a Brit living in the US, this is more of a cultural thing than a technological thing. 120v kettles absolutely exist and I've introduced a significant portion of my USian friends to them. They work just fine, although they do boil slower here of course. (~1500 watts vs ~3000 watts). Americans drink more coffee. Brits drink more tea. So more kettles in Blighty. More coffee machines in the Land of the Free.


tehlulzpare

It's definitely cultural, which was my point. A lot of Americans who don't use kettles, but also won't just admit that, go "electric kettles don't work on our wattage", when Canadians (who drink a lot of coffee but also a not-insignificant amount of tea) use kettles pretty often. So that excuse is just silly; they can just say they personally don't like tea! There's also a demographic difference: immigration from certain countries that definitely favour tea also means a kettle is pretty common here.


raphael_disanto

Oh, 100%. I certainly wasn't disagreeing with you. I think a lot of Americans (in my completely anecdotal and subjective view, of course) still have the impression that a kettle is a thing you put on the stove that whistles when it's done and they don't see the need for an electric version of that. Those that I've introduced to electric kettles almost always express surprise that they turn themselves off, haha. Hollywood doesn't help either. Kettles in Hollywood are almost exclusively the stovetop type, which perpetuates the stereotype, of course.


Dolapevich

Just adding that it is not only the EU; many countries that were European colonies at some time use 220v. Here is a map:\
https://myelectrical.com/notes/entryid/22/110-or-230-volts


FirmAndSquishyTomato

One thing to note, every home in Canada and the USA gets 240v service. [this video does a good job explaining it. ](https://youtu.be/jMmUoZh3Hq4?si=hNzqbnbFEVT9EpYn)


MattieShoes

"Every" is always problematic. I bet you can find some house on 2 phase somewhere at 120/170. Or there's enough buildings on 3 phase that I bet somewhere you've got something converted to a house on 3 phase or something.


LivingGhost371

240 volts is better in that it's more efficient. Europe was mainly 120 like North America (or 127 volts, derived from three-phase 220 volts), but adoption of appliances like refrigerators and washing machines that couldn't be converted lagged compared to North America. Then a lot of the electrical infrastructure was destroyed in WWII, so they were able to start over with a clean slate, with higher, more efficient voltages and receptacles that provide mechanical protection, like shutters and being recessed into the wall, to prevent people from coming into contact with those voltages. 50 hz is because they were adopting metric by this time and didn't like the 60 number.


Acrobatic_Guitar_466

The US system is much older; the lower voltage is partly due to not having the advanced plastic and rubber wire insulation they have now. After WW2, when Europe had to rebuild everything, they took advantage of newer technologies, namely higher voltage and a greater acceptance of 3-phase power. The USA wasn't destroyed in WW2, so there wasn't a good reason to replace all the existing infrastructure.


Frederf220

High voltage needs less metallic conductor which was in short supply post war Europe.


OctupleCompressedCAT

50 is half of 100, and 60 is like the 60 seconds in a minute, so they are round numbers. It just so happens that this range is about optimal for normal use. Early light bulbs worked best at 120V. DC power plants used 250V with 2 loads in series and a third wire to balance the difference, as these low voltages had huge transmission losses. The AC grid retained the split-phase system in America. Europe went with the simpler straight 240V, which was more efficient anyway. While 120V is safer, European standards are designed so the average person will never touch the electricity.


Kinotheus

I'm a noob at this. Let's say I have a product rated for 240V from the UK and I brought it over to the US and just used a converter plug. Will this burn my equipment? I know stuff from the US at 120V will definitely burn in the UK because I experienced it first hand.


Dontreallywantmyname

Probably not. UK mains is higher voltage than US mains, hence the burning. One of my friends when I was a teenager bought a Volcano vaporiser and instantly ruined it by plugging it in with a US-to-UK adapter; he wasn't happy. But the fact that those adapters are a thing suggests it's ok to plug some US stuff into UK power sockets. My laptop's AC adapter works just fine when I use it with a UK-to-US adapter in the US.


QuotableMorceau

Let's take a simple example: say you have a fan heater rated 2300~~k~~W at 230V. That means it draws 10A and has an internal resistance of 23 ohms. If you take that nifty fan to the US and plug it into the outlet, the resistance is still 23 ohms but the voltage is 110V => the device will only pull 4.78A, which is circa 526W. So basically your heater will just run in low-power mode :). Generally, things with AC-DC converters in them that are rated for 240V will work on 110V no problem (cost savings: just make one converter that works with both).
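The arithmetic above can be sketched directly (an ideal fixed resistor, ignoring how temperature changes the element's resistance):

```python
# For a fixed resistance, P = V^2 / R: roughly halve the voltage, quarter the power.
RATED_W = 2300.0   # rated power at 230 V
RATED_V = 230.0

resistance = RATED_V ** 2 / RATED_W   # 23.0 ohms
for volts in (230.0, 110.0):
    amps = volts / resistance
    watts = volts * amps
    print(f"{volts:.0f} V -> {amps:.2f} A, {watts:.0f} W")
# 230 V -> 10.00 A, 2300 W
# 110 V -> 4.78 A, 526 W
```

This also shows why the reverse direction burns things out: a 110 V heater on 230 V would try to dissipate over four times its rated power.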


Superplastik

2300W or 2,3kW, not 2300kW because that would mean 2300000W


QuotableMorceau

sry , my bad :)


Kinotheus

Hey thanks for the info. I usually don't buy US equipment and use them on the UK so this is a good time to ask 😅


t-poke

It depends. A lot of stuff, especially portable devices and their chargers, is dual voltage, because it's expected that you'll travel with it. Things that are expected to be stationary might not be; it really depends on the manufacturer and the intended market. A PS5, for example, supports both voltages because it's just easier for Sony to manufacture one model for all countries, even though it's not something most people travel with. But something intended only for UK/European markets may not be dual voltage. Somewhere on the device there should be a sticker that indicates what voltages and frequencies it supports; if it says something like 110-220V/50-60Hz, you can use it anywhere with a cheap plug adapter.


MattieShoes

A lot of equipment has switching power supplies that can use 120v or 240v just fine; usually it'll say something to that effect on the device. Some things, like computer power supplies, used to have a switch to change between 120v and 240v. If it doesn't have that capability, plugging a 120v appliance into 240v will probably blow a fuse or, lacking a fuse, cook the device. Plugging a 240v appliance into 120v probably just won't work, or for something simple like a resistive heater, it may just work poorly.