This may not really be a good question for this group, since it is about power generation at the utility level rather than about electronics, but here goes, since I am sure many of you will have some insight into this stuff...
Today's Boston Globe had an article about places like Fenway Park (Red Sox), the Fleet Center (Celtics & Bruins) and Gillette Stadium (Pats) and the amount of power they use during a game. The captions under the photos gave stats like '3000 kilowatts used during a game, enough to power 2250 average homes'. So I am sitting there drinking my coffee, reading this, and thinking to myself, self, this doesn't seem right, since that works out to only about 1.33 kW per home. A single hair dryer will draw more power than that.
But as it turns out, this appears to be a number that is often quoted when stating the generation capability of a power plant: 1 MW for 750 homes. While I understand that not every home is going to be pulling as much as it can all the time, it does seem likely that a significant number of them would need more than a single hair dryer's worth of power at once. At 3am, most homes are not using much power, but at say 8pm, MOST of those 750 homes probably have the TV on, a half dozen or more lights on, the fridge running since it has been opened a lot for dinner, etc. Even if only half of those 750 homes are each drawing just 3 kW, that is well over the 1 MW capacity of the utility.

So my question is, how does this number hold up? Yes, it is possible that over the course of a day many homes will not be using that much, but it seems just as likely that many homes would be using a significant amount of electricity at the same time, many times a day, and you would expect more brownouts and power failures than we actually have. So what is the deal?
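For what it's worth, here is the back-of-envelope arithmetic above as a quick script, just to show the numbers I'm working from (the per-home figures fall straight out of the article's stats and the usual rule of thumb):

```python
# The article's caption: 3000 kW serves 2250 average homes.
stadium_kw = 3000
homes_article = 2250
per_home_kw = stadium_kw / homes_article
print(per_home_kw)  # ~1.33 kW per home -- less than one hair dryer

# The often-quoted rule of thumb: 1 MW for 750 homes. Same ratio.
rule_of_thumb_kw = 1000 / 750
print(rule_of_thumb_kw)  # also ~1.33 kW per home

# My 8pm scenario: only half of 750 homes, each drawing 3 kW.
evening_demand_kw = (750 / 2) * 3
print(evening_demand_kw)  # 1125 kW -- already over the 1 MW capacity
```

So either the rule of thumb is based on average consumption over the whole day rather than peak demand, or I am missing something about how utilities plan capacity.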
PT