Bender a day ago

As someone that used to help manage data-centers I think this is fine. I had to load test the generators quarterly. It's good to ensure the transfer switches and generators are working, and nothing beats a real-world load test. The data-centers have contracts with diesel fuel providers to keep the main tanks full. They can run on diesel any time the infrastructure load is high and write off the cost on their taxes (check with your tax lawyer). There may even be other tax advantages, since the need for generators would be compelled by the state; perhaps a tax lawyer could find ways to make a generator tech refresh a bigger write-off, or to write off better noise abatement walls since usage will increase. If a company was running with scissors and did not buy enough generator capacity to run everything, then they will have to get their act together now vs. later.

  • Shank a day ago

    This is true, but how long will firm load shed events last, and how many of them will happen? In California, when CAISO has events that lead to firm load shedding, they're predictable, rolling blackouts and everyone knows how long they'll last, and they're assigned on a customer-specific basis. You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.

    I could see operators of datacenters in Texas wondering about this. Also, it's underrated how much critical infrastructure is dependent on datacenters running. Like, are you going to pull someone's EHR system down that serves a local hospital, while keeping the local hospital on a critical circuit?

    • Bender a day ago

      > You know your power will be cut, and you know it will be a certain amount of time, and you know roughly where you are in the schedule.

      Knowing when the power will be cut will not help, unless I am misunderstanding you. If the data-center loses power for even a minute, the generators will all fire up and then every ATS will count down and transfer in 30 seconds. Battery backup lasts just long enough to do a quick return to service on a failed generator, and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.
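
      To put rough numbers on why the batteries are only a bridge, here is a minimal back-of-the-envelope sketch; the load and ride-through figures are illustrative assumptions, not from any specific site:

          # Hedged sketch: the UPS only has to carry the load from grid loss until
          # the generators are up and the ATS transfers (tens of seconds), plus a
          # margin for a failed-start retry. All figures are assumed for illustration.
          LOAD_MW = 10.0           # assumed critical IT load
          RIDE_THROUGH_MIN = 5.0   # assumed margin: ~30 s transfer plus one restart attempt

          usable_battery_mwh = LOAD_MW * (RIDE_THROUGH_MIN / 60)
          print(f"~{usable_battery_mwh:.2f} MWh usable battery for "
                f"{RIDE_THROUGH_MIN:.0f} min at {LOAD_MW:.0f} MW")
          # ~0.83 MWh -- minutes of autonomy, not hours, which is why an advance
          # outage schedule changes little once you are running on generator.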

      Some data-centers are indeed on circuits deemed to be critical, but I could see regulations changing this so that they are "business critical" vs. "life support critical", and some changes could be made at substations so that data-centers could participate in shedding. I think you are right that they will be thinking about this and, on top of that, probably filing preemptive lawsuits to protect their business. Such changes can violate SLA contracts businesses have with power companies, and Texas is very pro-business, so I cannot compare it to California.

      • imglorp a day ago

        Texas has shown no interest in life support critical. They prioritized operator profits over uptime. Hundreds died as a result.

        https://en.wikipedia.org/wiki/2021_Texas_power_crisis

        • thegreatpeter a day ago

          I lived in San Antonio during the winter storm in 2021 and my power never went out, and the hospital in my area didn't lose power either.

          • grepfru_it a day ago

            H-town here. Our house at the time never lost power. We also shared the block with emergency communication, so we figured that was why our neighborhood didn't lose power. Hospitals (and their neighborhoods) did not lose power either. Where I live now lost power, and so did a lot of suburbs.

        • doodlebugging a day ago

          >Texas has shown no interest in life support critical.

          I'm not sure that this is correct. I was initially worried about how Mom would fare since she lives alone and is over 80. During the entire one week period of power problems in Feb. 2021 my Mom never lost power, not even a quick brown-out. Her home is within a half mile of a local hospital which also never lost power. The area around the hospital did not lose power so businesses and homes close by had no issues with heating, cooking, bathing, etc during the cold blast. That fact allowed me to stay here at my place a couple hours away and manage my own situation which was fairly easy compared to many others in the state.

          Your other statements are quite true and to date no one who played a part in mismanagement of utility power in Texas has been held accountable nor will they ever be in a libertarian state where regulations exist only to guarantee a profitable situation for a commercial entity. In fact, most electricity customers in Texas ended up paying for the huge cost increases that occurred as those in charge tweaked the system in real time to maximize their own profits.

          Texas needs regulations worse than most other states. Grifters, fraudsters, and thieves have filled too many critical positions for too long.

          • opo a day ago

            >...in a libertarian state

            I don't think any organization that considers themselves to be libertarian has ever called Texas a "libertarian state". For example:

            >...Texas’ institutions and policies continue to bear something of an old statist legacy. In the Cato Institute’s Freedom in the 50 States study, Texas scores a mere 17th, behind even the southern states of Florida (#2), Tennessee (#6), Missouri (#8), Georgia (#9), and Virginia (#12).

            https://www.cato.org/commentary/texas-really-future-freedom

            Are there any Texas national or state politicians who are members of the Libertarian Party or even refer to themselves as Libertarian?

          • grepfru_it a day ago

            Heating your house/cooking/bathing etc. during this time put extraordinary strain on the grid. A big reason why others did not have power is that those who kept theirs did not reduce their consumption by much. So many of my neighbors/friends/colleagues made comments like "we didn't lose power, so we kept the heat cranking at 75". So it would make sense that load shedding primarily affected neighborhoods, but my recollection of the events from people who lived near emergency centers was "use it up before it goes away".

            • doodlebugging a day ago

              You must live near and work with some selfish people.

              I have more family up there where Mom lives, and they lost power for all or most of the week, so they all shuffled operations to the homes that had the most reliable power and pooled resources so no one had to be hungry or cold.

            • bsder a day ago

              > A big reason why others did not have power is because those that did did not reduce their consumption by much.

              First, that was the big manufacturers. ERCOT couldn't force big companies off the grid, and they didn't go off grid until the press noticed and started complaining.

              Second, the Texas grid has insufficient granularity to actually shed enough non-critical load to do rolling blackouts. There are too many "critical" things connected to the same circuits as non-critical ones, and it would cost money to split those loads (something Texas just ain't gonna do).

              Third, the base production got hit because fundamental natural gas infrastructure wasn't winterized, froze and exacerbated the whole situation. It would cost money to fix. (aka: something Texas just ain't gonna do)

              Finally, when you don't have big industrial consumers defining your power grid (aka massive overprovisioning), you can't "shed load" your way out of trouble.

              The fundamental problem is that, like so many things in the US economy, personal consumption is so low that it doesn't help when the problem is systemic. We've optimized houses with insulation, LED lighting, high-efficiency appliances, etc. Consequently, the difference in consumption between "minimal to not die" and "fuck it, who cares" isn't large enough to matter when a crisis hits.

          • const_cast 18 hours ago

            My house had zero power for 3 days straight. No cooktop either, because that's electric, and no water heating. It got to be ~30 degrees inside.

          • buerkle a day ago

            I'm in the Austin area and lost power for 2 days. Some friends of mine lost power for almost a week.

            • doodlebugging a day ago

              We're near DFW. Mom is north of us by a bit. We lost power too, for days. Towards the end we had rolling outages that were predictable, so we prepped anything that needed heat or power, and as soon as the lights came on we made fresh coffee and tea and water for oatmeal or whatever, and recharged the water supply since we are on a private well. Our power bricks handled most of the phone/laptop power delivery, so we basically topped off the charge on the bricks whenever we had power. My greenhouse is solar/battery powered, though I did use 1 lb propane cylinders for the coldest periods since the heater in the greenhouse was way too small to manage temps that went below 10F. I lost some things but I learned some things too. We are much more resilient today.

      • toast0 a day ago

        > Knowing when the power will be cut will not help unless I am misunderstanding you. If the data-center loses power for even a minute the generators will all fire up and then every ATS will count-down and transfer in 30 seconds. Battery backup only lasts just long enough to do a quick return to service on a failed generator and even that is sketchy at best. A properly engineered data-center can run on reduced generator capacity.

        If you know when the power will be cut, you can start the generators before the cut, and depending on your equipment, you may be able to synchronize the generator(s) with the grid and switch over without hitting the batteries. I assume big datacenters are on three phase; can you switch over each phase as it crosses zero, or do you need to do them all at once?

        • Bender a day ago

          At least in the data-centers I helped manage, the inverters were in-line running at 100% duty cycle, meaning frequency sync is not required as there is no bypass. The servers never see the raw commercial power. Data-centers in the US are indeed 3-phase. FWIW the big Cats did have controllers that would maintain sync even when commercial power was gone, but we did not need it. There wasn't even a way to physically merge commercial and generator power; ATS inputs and outputs were a binary choice.

          I know what you mean though. The generators I worked with in the military had a manual frequency sync that required slowly turning a dial and watching light bulbs that got brighter with the frequency offset. Very old equipment for Mystic Star, post-WWII era gear from the 50's to the 90's.
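
          For anyone who hasn't seen that done: the lamps sit across the open paralleling breaker, so they see the difference between the two waveforms. A minimal sketch of the idea (the frequencies and voltage are illustrative assumptions, not from the gear I used):

              import math

              # The lamp across the open paralleling switch sees the *difference*
              # of the two sine waves. Numbers here are illustrative assumptions.
              F_BUS = 60.0    # running bus frequency, Hz
              F_GEN = 60.5    # incoming generator running slightly fast, Hz
              V_PEAK = 170.0  # ~120 V RMS expressed as peak volts

              def lamp_voltage(t):
                  """Instantaneous voltage across a sync lamp at time t (seconds)."""
                  return V_PEAK * (math.sin(2 * math.pi * F_BUS * t)
                                   - math.sin(2 * math.pi * F_GEN * t))

              # The envelope beats at |F_GEN - F_BUS| = 0.5 Hz, so the lamps pulse
              # bright/dark once every 2 seconds; you close the breaker as they pass
              # through dark, when the two waveforms momentarily line up.
              for ms in range(0, 2001, 250):
                  print(f"t={ms / 1000:.2f}s  lamp voltage ~ {lamp_voltage(ms / 1000):8.1f} V")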

          • dylan604 a day ago

            In the facilities I have been in (not managed), they were all in-line as you describe as well. Mains power is dirty. Having a data center without line conditioning on mains would be insane.

            • Bender a day ago

              > Mains power is dirty. Having a data center without line conditioning on mains would be insane.

              Agreed. Even my home computer and networking equipment is 100% in-line with inverters and never sees commercial power. PG&E in California got me into this habit with all the Public Safety Power Shutoffs, wildfires, surges from really old transformers, and unplanned outages. Now each of my tiny indoor rings of power has 200 to 800 amp-hours of capacity and an over-sized inverter. I put the whole-house inverter plans on hold for now.

              • dylan604 a day ago

                Way back when, I worked for a VHS dubbing facility where we had a voltage meter with an alarm set to warn when the voltage dropped below a certain level, though I don't remember the exact value. At that point the VCRs would glitch and the recordings would be bad, but the dip would be momentary and not enough to force the machines to stop like a full outage would. When the alarm sounded, we would stop all of the decks, re-rack the room, and restart all of them. Without the alarm it was impossible to catch these without 100% QC of a tape. That is when I grokked how much worse a dip can be than a spike. Some equipment will start to pull harder when the voltage drops, which kills more power supplies than spikes do. Surge protectors are great for the spikes, but line conditioners or battery backups are the only protection from the dips. Management decided that the full-time battery conditioning expense was not worth it, so we were constantly running with some set of equipment down because of a dead power supply.
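
                A rough illustration of that last point (the wattage and voltages are made-up example figures): a switch-mode supply is approximately a constant-power load, so as the line voltage sags, the input current climbs.

                    # Rough sketch: a switch-mode power supply is roughly a
                    # constant-power load, so input current rises as voltage sags.
                    # The wattage and voltage values are illustrative assumptions.
                    LOAD_WATTS = 500.0

                    for volts in (120.0, 105.0, 90.0):
                        amps = LOAD_WATTS / volts
                        print(f"{volts:5.0f} V -> {amps:4.1f} A")
                    # 120 V -> 4.2 A, 105 V -> 4.8 A, 90 V -> 5.6 A: the supply runs
                    # hotter during a sag, which a surge protector does nothing about.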

          • jimmygrapes a day ago

            Manually syncing several of the MEP012 generators was always far more stressful to me than any physical dangers!

            • Bender a day ago

              I bet. I never messed with the trailer or skid mounted generators. It sounds like you were also USAF. At least modern day noise cancelling headphones are much better. Guessing you probably have tinnitus from working on them. At least I think that is partially where mine came from.

        • bluGill a day ago

          Again, that doesn't matter, because everyone knows the grid isn't 99% reliable. They just pull the big power switch to the whole building and watch all the backup systems work. If anything fails, it was already broken and they fix/replace it. Because this happens often, they have confidence that most systems will work. Even where it doesn't, computers fail unexpectedly anyway, so they have redundancy of computers (and if it is really important, redundancy of a data center in a different state/country).

          Synchronizing generators is a thing, but it isn't useful for this situation, since they need to be able to handle sudden, no-warning power loss, where generators cannot be synchronized in advance anyway.

          • stevetron a day ago

            How often does that inverter burn out a transistor? Is there a backup inverter? Do you keep replacement transistors on-site?

            • Bender a day ago

              Commercial inverters are massive and highly redundant. They do fail but it is very rare and there are contractors that can be on site to fix things very quickly. A properly engineered system can run in a degraded state for a prolonged period of time.

            • hdgvhicv a day ago

              My data centres have two separate supplies through two separate UPSes with two separate generators; kit is striped across each one.

              Of course that doesn’t help for fire/flood etc which is why we have critical workloads in two dcs.

              • bluGill 4 hours ago

                I once worked in a building that had two separate connections to the grid which went to different substations. Their servers have more than one power supply as well so that either grid could go down. While somewhat of an outlier, it is an option if you care about this.

    • tw04 a day ago

      If power is out everywhere for an extended period of time, they aren’t doing anything but life saving surgery. Pulling up an EMR will be near the bottom of the list of concerns.

      • vineyardmike a day ago

        Yea, that’s not how medicine works. There are patients that are in the hospital beds for a variety of reasons, and they don’t just go home during a storm. Those people still need some level of care, even if they’re not getting XRays and basic preventative care.

        EMRs contain a record of when patients last took some critical but dangerous drug, what their allergies and reactions are, and many other important bits of information. When one of the patients starts to exhibit some new symptom or reaction (very stressful situation!), doctors and nurses look at the EMR to understand the best course of treatment or intervention.

        When the EMR goes down, doctors and nurses revert to pen and paper. It’s very slow, and requires a lot of human handoff - which, critically, they’re less practiced in.

        • tw04 a day ago

          I literally help design IT resiliency for hospitals. This is absolutely how they work and part of their disaster planning. When there is an extended power outage they stop anything but vital surgery and work off pen and paper.

          Which you got to after spending 3 paragraphs talking about what an EMR is for.

    • nradov a day ago

      All of the major cloud EHRs run in multiple availability zones.

      • mschuster91 a day ago

        The data center might be... but are all fiber routers, amplifiers, PoPs etc. along the line from the datacenter to the hospitals and other healthcare providers backed up as well?

        Particularly the "last mile" is often enough only equipped with batteries to bridge over small brownouts or outages, but not with full-fledged diesel generators.

        And while hospitals, at least those that operate on patients, are on battery banks and huge-ass diesel generators... small private practices usually are not. If you're lucky, the main server has a half-broken UPS whose "BATTERY FAULT" red light no one has looked at for a year. But the desktop computers, VPN nodes, card readers, or medical equipment? If it's not something that a power outage could ruin (such as an MRI machine), it's probably not even battery backed.

        There's a German saying "the emperor is naked, he has no clothes". When it comes to the resilience of our healthcare infrastructure, the emperor isn't just naked, the emperor's skin is rotting away.

        • marcosdumay a day ago

          Just pointing out that none of those non-data-center buildings are data centers.

          I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

          • mschuster91 12 hours ago

            > I really doubt the classification of "data centers and other large, non-critical power consumers" extends to telecom infrastructure.

            That's the point. Okay, cool, the datacenter is highly available, multiple power and data feeds, 24/7/365. But that highly available datacenter is useless when it cannot be reached because its data feed elements or the clients don't have power.

    • vel0city a day ago

      If they cannot handle the grid power being pulled due to load shedding, they have no business handling critical applications.

      • bluGill a day ago

        In particular there is no way you can predict when a backhoe will take out your power line. (Even if they call to get lines located, sometimes the mark is in the wrong spot, though most of the time they didn't call in the first place.) There are lots of other ways the power can go down suddenly, and nothing you can do about it except have some other redundancy.

  • lokar a day ago

    For any big facility there will be pretty strict EPA limits on how long you can run the generators each year.

    • dylan604 a day ago

      The EPA? Are they still a thing? I doubt anyone is concerned about the EPA under current management.

      • xxpor a day ago

        The state regulators can also get you.

        • dylan604 a day ago

          The state regulators of Texas? Unless you're trying to manage your own health, the state is not concerned about you. If you're a gas/power company, they only want to know what regulations you want removed/enacted. They definitely aren't "getting you" for being part of bigEnergy

          • xxpor a day ago

            Texas, I could agree with. I'm just saying that Virginia has fined DC operators specifically for running their generators too much.

      • bilbo0s a day ago

        Which is great.

        Until there's new management.

        You can't run a business by seesaw.

        Best to just count on that rule being enforced and place the necessary battery backups and wind or solar in place to backstop the diesel. Then make any users who need to use those data centers eat that extra cost. There's no problem with us-east costing less than us-west, and us-texas costing most of all. That's how markets work.

        • dylan604 a day ago

          But the seesaw is what the future is going to look like. If some bit of mass voting breaks out for the next election and moves to the other party, there will be a swing back the other direction. Then the following election the masses will get upset about something and swing it back. The country is too polarized to expect anything other than seesaw policies. Unless we go full revolution and just deny the other party from ever taking charge.

    • Bender a day ago

      Indeed. I have faith that Texas will find a way around such rules especially if they are being regulated into running them. A Texas company I worked for was highly proficient in maximum shrugs.

    • more_corn a day ago

      The EPA doesn't really have the resources to enforce that. And it certainly won't have that capability under the Trump administration.

  • duxup 18 hours ago

    When I worked in a different career I worked with big banks and did disaster recovery tests (we didn't run the tests; we just had equipment that the tests sort of centered around). We'd basically cut off a data center for a weekend and they would run from another data center that was supposed to have all the data too. We'd even move their check processing to a backup site, and they'd truck entire truckloads of paper checks to it.

    They were legit tests at the offline site too: they'd power down equipment and power it up, and we'd fix what didn't come back up. Even the data centers would be fully powered off to test.

    At least at that time those banks did not skimp on those big tests and it was a big effort and pretty dang well run / complete.

  • abeppu a day ago

    I wasn't involved in the specific details, but I remember being told that during the power outage from Hurricane Sandy, even datacenters that had sufficient generators had trouble getting the diesel to keep them running, because everyone wanted diesel at the same time and both the supply and distribution were bottlenecked.

    How long can most DCs run with just the fuel on hand? Have standards around that changed over time?

    • jabart a day ago

      I worked at a call center that had a whole datacenter in the basement. They had two weeks of fuel on hand at all times. Being on a state border, they also had a second grid connection in case one failed.

      The whole area lost power for weeks, but the gym was open 24/7 and became very busy during that time.

    • Bender a day ago

      > How long can most DCs run with just the fuel on hand?

      There really is not a universal answer for this. Every generator will have what is called a "day tank" that, as you might guess, lasts for one day under a nominal load.

      The day tanks are connected in pods to large main diesel fuel tanks; every {n} generators share a main tank. Those tanks vary in size depending on how much the company wishes to spend and how resilient they need to make their data-center, excluding fuel trucks. Cities have regulations about how much fuel can be above or below ground at each location. My main tanks were 10K gallons. Each generator used over a gallon per minute under load.
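
      As a back-of-the-envelope from those figures (the pod sizes below are assumptions for illustration; real sites vary):

          # Rough runtime of one shared 10,000 gallon main tank at full load, using
          # the ~1 gal/min per generator figure above (optimistic, since the real
          # burn was "over a gallon per minute"). Pod sizes are assumed examples.
          MAIN_TANK_GALLONS = 10_000
          BURN_GAL_PER_MIN = 1.0

          def runtime_hours(generators_per_tank):
              """Hours a shared main tank lasts for a pod running at full load."""
              return MAIN_TANK_GALLONS / (BURN_GAL_PER_MIN * 60 * generators_per_tank)

          for pod in (1, 2, 4):
              print(f"{pod} generator(s) on the tank: ~{runtime_hours(pod):.0f} h")
          # 1 -> ~167 h, 2 -> ~83 h, 4 -> ~42 h before refueling, on top of whatever
          # is left in each generator's day tank.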

      And you are right, during a regional or global disaster fuel trucks will be limited. Whoever bribes the most gets the last fuel, but that too will run out. Ramping up production and distribution takes weeks, and that assumes roads are still viable and the internet outside of the data-center is still functional.

    • stevetron a day ago

      It seems to me that Hurricane Sandy caused an issue with fueling backup generators in a New York City datacenter, where there was a bucket brigade of people carrying fuel in pails up several stories of stairs.

      • abeppu a day ago

        I remember hearing this story at the time but have forgotten all the details. I think it involved a pretty well known company? I also remember hearing that some DC in NJ got special priority in getting diesel in the following days because some federal government services were hosted there and so it was treated as a national security issue to keep them supplied.

    • stogot a day ago

      I recall ranges I’ve heard from operators as 24h to upwards of 72 (rare)

  • stronglikedan a day ago

    > did not buy enough generator capacity to run everything

    Maybe they can cap it for most cases - we'll turn it off for n number of days max - so that companies have a target to prepare for (or maybe that was already mentioned). Of course, no one can prepare for everything, but if it's longer than that then most other things are probably also affected anyway and no one is worried about their favorite data center.

  • f1shy a day ago

    I get your point, but saying "it's OK to cut power, they have backup" - isn't that a bit of a normalization of deviance? For example, in a hospital this kind of thinking would be totally unacceptable IMHO.

    • doubled112 a day ago

      Are data centers as life or death as a hospital?

      Also, the data center I use has run from generator power for days after storms with NO quality of service loss. Nothing like some real world testing to remind me they have this figured out.

      Is a hospital in the same situation? Or is only part of the hospital on those generators?

      • f1shy a day ago

        Of course not all - it depends - but I worked in a telco, and pretty much yes. Without the datacenter, no cell phones (or trunking, in the case I know), so no ambulance, no police, no firefighters (or a very delayed version).

        We had batteries and 2 generators, and once we were minutes from blackout, as the primary generator failed and the secondary was not dimensioned to cope with the load of the air conditioning on a 45-Celsius day.

    • quickthrowman 21 hours ago

      Hospitals are obligated by law and building code to have backup generators with multiple (at least 3) separate backup power feeds for critical branch, life safety branch, and equipment branch. These are defined as ‘essential loads’ by the National Electrical Code. They can all be fed from the same generator but must use separate overcurrent protection and automatic transfer switches.

      Critical branch is defined as loads that are used for direct patient care, plus ‘additional task lighting, receptacles, and circuits needed for effective hospital operation.’

      Life safety branch is the fire alarm system and emergency lighting, plus elevator lights and controls, medgas alarms, PA/notification systems used during building evacuation, and some generator accessories.

      Equipment branch has some required items including OR cooling, patient room heating, and data rooms. Some hospitals will add MRIs, non-patient-care HVAC, and chillers (for air conditioning) on the generator backup system as well.

      There’s typically a fourth system for everything else (‘normal’ power) that is not backed by a generator. Non-emergency lighting, convenience receptacles, and other non-essential loads are on this system.

  • chaz6 a day ago

    Are they clean enough to stay within the limits of regulations around pollution when run for long periods of time?

    • Bender a day ago

      The generators I worked with were just massive diesel engines, each the size of a tractor trailer. This was pre-DEF, pre-catalyst requirements. When I would switch on multiple load banks at once I could get a small puff of soot as the engines revved up. Provided the load was not varying heavily, they would at least run clean enough that one could not see or smell any exhaust. This was in the 2000 time frame. Mind you, the load tests are not real-world realistic at all, and this can be further mitigated using step-start programmable PDUs in the data-center. Companies that own data-centers usually have a lot of political pull with the cities due to all the assorted taxes they pay and the direct and indirect revenue and employees they bring in.

      Cat has since added options for hydrogen [1] but I have no idea how many people have bought them.

      [1] - https://www.cat.com/en_US/by-industry/electric-power/electri...

  • lowwave a day ago

    Nice to know that, but are the data-centers Carrington Event proof? Always wondered that, but never worked in a data-center.

w10-1 a day ago

Regulating high-demand customers does seem most reasonable. But when the Texas legislature gets involved, it's typically to build a political franchise by creating government power that can be used arbitrarily against a fixed asset. In this case, nothing prevents the grid authority from harassing the data center during the many overload periods when a power contract is coming up for renewal.

Not that businesses aren't abusing the process, either. The article mentions that 80-90% of planned data centers won't be built, due to duplicate applications. They duplicate both to secure a "phantom" slot for power, and to get localities to compete with incentives for business. It's hard to plan a grid when financial speculation and gamesmanship is driving the planning.

The worst aspect is that Texas corruption is explicitly being offered as a model for other states in the PJM interconnect.

I wonder how much of the high economic growth rate in the 1950's-1980's came from the lack of gamesmanship and political franchises. But it's probably unrealistic to think professionals could step back from the very maximalist positions that are selecting them as leaders.

  • mike_d a day ago

    > But when the Texas legislature gets involved, it's typically to build a political franchise by creating government power than can be used arbitrarily against a fixed asset.

    When I saw that it only applies to datacenters using more than 75 MW, my first thought was who are they writing the rule around?

    I imagine we will see things like "xAI Under 75 MW Datacenter III, LLC", "xAI Under 75 MW Datacenter IV, LLC", etc.

jqpabc123 a day ago

The main problem with the Texas grid is really very simple.

Their "free market" real time power auction makes no allowances for long term concerns like reliability. Any provider who spends money to address these sort of issues is immediately priced out of the market.

Some things are too important to leave to the "free market".

  • infecto a day ago

    I hear opinions like yours often, but I am not sure it's that simple or that the reasons you're citing are grounded in reality.

    What is the alternative - a state-imposed monopoly with a single power company like PG&E, who I would not see as an ideal operator either? The same can be true for a lot of other similar generators across the country.

    You'll probably bring up the winter storm outage, which is inexcusable, but their neighbor to the north, SPP, had very similar failings in being prepared and only fared better because they have interconnects.

    Texas has had some of the fastest adoption for wind and solar. It is far from perfect but I also think there is benefit to having multiple generation companies supplying to the grid. You have companies with different expertise and perhaps innovation.

    • MBCook a day ago

      They could just connect to the rest of the national grid to get power from other states when running short. That would work fine.

      Except then they’d be subject to regulations. And we can’t have that now can we?

      • infecto a day ago

        Don’t confuse some of Texas’s shortcomings with faults in the free market system for energy. This is my only issue in these conversations. Not everything the Texas market does is bad and they were not the only grid impacted by the weather storm but certainly the most severe because of the lack of interconnects.

      • dogleash a day ago

        >Except then they’d be subject to regulations. And we can’t have that now can we?

        Why so salty? There are n>0 things the federal government does that I disagree with. On a subject I know nothing about, why would I assume that Texas is wrong to avoid its rules?

        • macintux a day ago

          I’d be curious how many people in Texas have died from power loss compared with the rest of the country combined over the last few years.

      • MrMorden 16 hours ago

        Congress could remove Texas's exemptions from those rules at any time. It's got nothing to do with whether and how much the Texas grid connects with others.

    • bityard a day ago

      There is lots of middle ground. Here in Michigan, (most) electricity and gas providers are for-profit companies. But they are heavily regulated by the state, and must get approval from the state before they are allowed to change rates. Our rates are not dirt-cheap, but they are not Coastal either.

      When I lived in the capital, we got our power from the Lansing Board of Water and Light, which is 100% publicly owned. Their rates are still some of the lowest in the Midwest. The main downside is that until recently their main energy source was coal. (We used to live downwind of the smokestacks. You couldn't smell it, but lung cancer was definitely in the air.)

    • apendleton a day ago

      > What is the alternative

      Other markets in the US are generally energy + capacity markets -- you get paid both for what you actually provide and for your ability to provide a certain level of power, whereas Texas is an energy-only market (EOM). It needn't be the case that if you don't do an EOM, you have to have a monopoly.

      • infecto a day ago

        Definitely you could operate on a capacity model instead of generation. There are a lot of levers. My issue is mostly around how much uninformed hate a “free market” energy system gets.

    • e40 a day ago

      Their refusal to connect to nearby grids is baffling and rooted in their free market ideology as well as their go-it-alone philosophy.

      It does the citizens of TX a disservice and has resulted in deaths of many of them.

      EDIT: s/rooting/rooted/

      • conductr a day ago

        I’m a Texas native and I feel one easy change here is simple and cheap. We should popularize the concept of utilizing residential transfer switches and portable generators for emergency backup. It goes along with our go-it-alone philosophy that ultimately the property owner should be responsible for ensuring power when it’s needed. Also, it’s such a super rare weather event (historically) that would ever cause that type of issue again. It also solves for all the minor power losses we have due to old infrastructure, branches falling on overhead cables, etc.

        It's really cheap. I've done it for a grand total of $2000, most of which was to get a real beefy generator so I could just power my whole house instead of only a few circuits. Most people think an installed appliance like a Generac or some battery/solar option are the only options, and those often run $15-20k and up. We don't always need instant switchover, but if power doesn't come back on in a couple hours I pull out the generator.

        Apartments and other MF properties will need to approach it differently, but I don’t think it’s possible and reasonable to just let the property owners take ultimate responsibility. After all, most my outages aren’t grid failures they’re some localized wire/transformer issue that is unavoidable.

        • e40 a day ago

          I agree, but I read that a lot of the people who had terrible problems that winter a few years ago were low-income residents of TX. I think $2,000 for a generator is a nonstarter for them.

          • conductr a day ago

            Everyone had problems that winter. It was pretty universally felt. How you recover from it is where your economic status changes your experience. If you are underinsured or can't come up with your insurance deductible you're pretty screwed, but we can't solve all the world's problems with this alone.

            So my general albeit cold sounding response is “Doesn’t matter.” We should have the expectation that it’s owner responsibility first. After that, we can devise subsidies and such to ensure everyone can retrofit their house. There’s a ton of levers to work with once you admit that the grid and power transmission isn’t some god like thing that never fails

            You can't hinder progress because someone can't afford it. They maybe would have had the money if it meant a few bucks a month on their bill, but they were never told this risk existed; we all thought we lived in a modern enough country that we would never be without power for an entire week. But we also had never seen freezing temperatures for a solid week, not in the lifetime of anyone I know, including some 90-year-olds.

            Once I know the problem exists, I’d rather spend the $2k and have a solution at hand than take on the full system costs of winterizing/prepping for a once in a century(?) snow storm. That would perpetually make my energy cost go up by 10% or more. It’s the smarter solution with better ROI if people DIY the contingency.

        • quickthrowman 21 hours ago

          > It’s really cheap. Ive done it for a grand total of $2000 most of which was to get a real beefy generator so I could just power my whole house instead of only a few circuits.

          Are you talking about something like a 7.2kW portable with a 60A manual transfer switch? I could see that costing around $2000, which is substantially cheaper than a Generac. I found a portable Kohler with 7.2kW for ~$1500.

          Instead of a transfer switch, you could shut off the main circuit breaker or pull the meter and backfeed the panel through a 60A 2P breaker, that would save some dollars.

          Just make sure that you disconnect from utility power before backfeeding, and be absolutely certain to disconnect the generator before switching back to utility power; you don't want to find out what happens when a generator isn't in sync with the utility frequency :) Rapid unintentional disassembly, lol.
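
          As a quick sanity check on that pairing (plain arithmetic, nothing vendor-specific): a 7.2 kW generator backfeeding a 240 V panel draws well under the 60 A rating of the breaker or transfer switch.

              # Simple sizing check for the setup described above.
              GEN_WATTS = 7200.0     # 7.2 kW portable generator
              PANEL_VOLTS = 240.0    # split-phase panel voltage
              BREAKER_AMPS = 60.0    # 60 A 2-pole breaker / transfer switch

              full_load_amps = GEN_WATTS / PANEL_VOLTS
              print(f"Full generator output is {full_load_amps:.0f} A "
                    f"against a {BREAKER_AMPS:.0f} A breaker")
              # 30 A vs 60 A: the generator, not the breaker, is the limiting factor.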

          • conductr 14 hours ago

            I might not be using the correct term - I had an electrician install it - but I just flip a switch and it cuts off the grid, including backfeed, and then the power comes in through what I think is commonly called an RV panel, allowing for the generator connection. I do agree on the operation and order of things; I have it written in the boxes.

            My generator is the biggest one Harbor Freight sells. Might not be that beefy, but I also don't need much electricity during these events. I wouldn't run laundry or my central AC, but it's enough to keep my pool pump running (we don't winterize here; the pumps run nonstop during freezes, as they are short enough that the pool water temperature never drops enough). Also, this model runs off natural gas, which is a big plus for me since I don't have to keep up with fuel. Keeping the fridge and furnace running is my highest priority, but since all my lights are LED I don't have to think about them. I'm giving the Harbor Freight one a shot first; my hope is that it lasts a long time since I'm putting little time on it. If longevity is an issue I'd probably spring for a Kohler or Honda next time.

            I may add a soft start kit to hvac as last summer we had some outages long enough to be uncomfortable, twice cost us our fridge contents which is expensive and annoying. But primarily, winter protection is highest priority for me as the risk is highest

    • const_cast 18 hours ago

      Every person in the country has an interest in having electricity. Therefore, every person in the country should pay for electricity.

      What is the most cost-effective and reliable way to administer something that every single person in a country requires? Taxes and the public sector.

      It's not rocket science.

      It's just like Visa and other payment processors. What do you think that 3% on every single transaction is? That's a tax. The difference is you're being taxed and you're buying someone's third private jet. If everyone is gonna be taxed regardless, just nationalize it. At least then we won't have to pay the burden of profit.

    • jqpabc123 a day ago

      > You have companies with different expertise and perhaps innovation.

      Lack of reliability is not innovation. This is something the Texas "real time power auction" has created and continues to promote.

      Measures like the one the governor just signed are bandaids meant to cover up the fundamental lack of reliability.

      There is nothing left to innovate in terms of reliability. The solution really is very simple --- build additional capacity to cover worst case scenarios.

      This can't happen if providers are forced to continually offer a rock bottom price or lose out in the auction.

      Some things are too important to leave to the "free market".

      • infecto 6 hours ago

        I am going to have to disagree. This is really not about Texas but folks get hung up on Texas. There are ways to have market based systems in the power grid while still maintaining capacity. Having a market based system does not mean you cannot have interconnects. I think Texas gets some things right and some things wrong but I believe it’s too easy to lump everything into the bad bucket.

    • lokar a day ago

      PG&E only runs the grid(s), not generation

      The government can structure a public market in many different ways (they do this in many aspects of the economy). It’s not limited to real time auction vs single provider.

      • infecto a day ago

        Incorrect. They do have generation, but they're not a majority producer. My point being that PG&E still helps set the rate via the CPUC. They are purchasing power through some of the spot markets. They are unable to even manage their own transmission lines effectively, though.

  • robocat 17 hours ago

    > Their "free market" real time power auction makes no allowances for long term concerns like reliability

    Blaming the "free market" is an ignorant kneejerk reaction.

    There is a whole discipline revolving around designing the incentives for electricity markets - countries pay for consultants from companies like NERA Economic Consulting to create or improve their electricity markets to achieve the goals needed by that country.

    https://duckduckgo.com/?q=wholesale+electricity+market+desig...

    The problem is that wholesale electricity markets need to be designed so that the incentives of the market participants are aligned with desired outcomes. Use game theory so that selfish participants are rewarded for good outcomes in the market, and are penalized for bad outcomes in the market.

    When there is a failure (e.g. Spain) the responsibility lies at the feet of the regulators.

    Unfortunately after a failure we all want to blame someone, and it is easiest to blame the market participants or the market.

    Read how Enron manipulated the market to the detriment of California and its residents. The answer isn't to Blame Enron. Blame the people that set up shitty incentives.

    The big issue is ensuring that large generators/retailers don't twist the market rules to their advantage or obtain regulatory capture. The large players tend to have a knowledge advantage and are very experienced at manipulating their regulatory authorities.

  • bryanlarsen a day ago

    This can be a "have your cake and eat it too" situation. California, most of Northern Europe and many other jurisdictions have a power market without this problem. They do this by also establishing a market for capacity and/or reliability.

    • zdragnar a day ago

      > California

      > without a problem

      That doesn't really line up.

      • bryanlarsen a day ago

        California's myriad problems are delivery side, not supply side.

      • mrguyorama a day ago

        Then try one of the other 48 states. The rest of us have figured this out. People should stop pretending the US is only Texas and California like there aren't plenty of well run states that aren't single-party kleptocracies

    • dlcarrier a day ago

      California has both a free energy market and energy capacity problems, although NIMBYism is the main factor in the lack of stable energy production.

  • floatrock a day ago

    There's a lot of things to take a critical eye towards on the Texas power market, but

    > ...no allowances for long term concerns like reliability. Any provider who spends money to address these sort of issues is immediately priced out of the market.

    ...is a bit of an exaggeration.

    When Texas had those cold-snap/freeze days a few years ago, wholesale rates went up to $9,000 per megawatt-hour. So $9/kWh. Wholesale.

    A large number of energy suppliers went out of business because they didn't properly hedge for such an event.

    You can bet those who are left have started to react to market price signals like that. Whether it's through financial engineering or boots-and-poles engineering is a fair discussion to have, but to say "no allowances for reliability signals" is a bit disingenuous.

    When someone says "It's all really very simple...", it's almost certainly not.

  • mothballed a day ago

    In a free market Texas could connect to the interstate grid which would average some of the localized reliability issues.

    • MBCook a day ago

      It’s because they want their “free market” that they don’t connect to the national grid.

      • barbacoa a day ago

        Texas is connected to the Eastern and Mexican grids and can share power through high voltage DC interconnect stations. They simply don't sync to the phase of other grids.

      • mothballed a day ago

        Other states/feds were happy to take the "free market" of Texas during WWII when they direly needed it and other Gulf states begged Texas to connect.

        It's a holier-than-thou thing for the other states: "free market" buying from Texas when they need it, and when they don't, it's "fuck the free market", cut Texas off via legislation, and refuse to provide the same assistance that Texans provided.

MBCook a day ago

Good. If people want to build massive AI data centers to pollute and waste power for theoretical gain propped up by VC money and hype cycles then let them power themselves in an emergency too.

If they’re that important, shouldn’t be a problem right? AI makes so much money right?

People come first. Not virtual hype driven land grabs.

tzury a day ago

Hospitals, first responders, and such, are all cloud driven operations nowadays.

It is really strange to see them categorized under

    Data centers and other large, non-critical power consumers
  • criddell a day ago

    Whose fault is it if hospitals place critical data and services on systems for which the SLA cannot guarantee the service they require?

    • tzury a day ago

      Google Cloud in Texas:

          Region: us-south1 (Dallas)
          Zones: us-south1-a, us-south1-b, us-south1-c
          Availability: Dallas, Texas is also listed as a location for Vertex AI. 
      
      Azure in Texas:

          Region: "South Central US" (paired with "North Central US")
          Azure Government: Region in Austin, Texas
          Availability: Azure has a physical presence in Texas. 
      
      
      Imagine a vendor picking multi zone, or even Multi cloud...

      This sounds like someone was trying to mitigate the "AI is eating our small town electricity" problem and threw the baby out with the bathwater.

  • mike_d a day ago

    Hospitals are categorized as critical care facilities, regardless of how much on prem compute they have.

    Under national fire and electrical codes hospitals have to have at least three completely isolated power systems (these are the different colors of electrical outlets you see).

    The life safety branch is the most critical and powers things like fire alarms, exit signs, stairwell lighting - anything required to evacuate patients.

    The critical branch is patient care like ICU beds, ventilators, and operating rooms.

    Lastly is the equipment branch which is essential to patient care but won't kill people immediately: HVAC, kitchens, and sterilization/cleaning equipment.

    There is usually a fourth system for everything else, but these are the required ones. Each has its own requirements for battery and generator backup (life safety can't take more than 10 seconds to cut over), and usually the systems are wired so that they can draw power from lower priority circuits if needed.

  • thrance a day ago

    Aren't there special clouds accredited for such use cases? Ones that probably won't be targeted by this sort of bills?

leecarraher a day ago

On the surface, cutting less essential resources during a power supply event makes sense; it's the ranking of essentialness that seems problematic. While the decision to stop dumping megawatts of power into training a company's next-gen LLM so it can instead be used for life saving/sustaining systems makes sense, it's pretty hard to implement in all but the most extreme cases. Hospital vs. GPT-6 training is an easy decision, but what about deciding between someone who wants to run AC at their unoccupied home vs. cutting power to a multi-day training epoch worth hundreds of thousands of dollars? It all feels very un-capitalistic, which in the US, like it or not, is how many edge cases get resolved. Right now datacenters are just the easy target, but why not Texas' numerous fracking sites, or other less desirable industries? My guess is that an injunction on the constitutionality of this will hold it up in court for a while.

bee_rider a day ago

This seems like the sort of thing that ought to be negotiated into the data center’s contract. Why force the grid operator to make every data center subject to curtailment?

Of course, it is easy enough to deal with curtailment for many services. But, it should be on the table, either way.

  • lokar a day ago

    Yeah, if they needed a law it was to say you must have X% of your load subject to a load shed agreement.

lacker a day ago

It sounds better than the northern California system where occasionally PG&E will cut off the power of random neighborhoods because the grid is overloaded.

loa_in_ a day ago

Is Texas lawmaking an example of an agile lawmaking process that adapts to the ever-changing landscape of needs, or is it an example of a system where this will stay codified and never be repealed for the foreseeable future because it was passed once?

net01 a day ago

Honestly, not that big of a deal. The price of batteries is so low per kWh and the cost of generation is so low that, in the case of a major crisis, they could just keep them running. https://ourworldindata.org/grapher/average-battery-cell-pric...

And prioritizing humans (home heating, food freezing, etc ) over servers is a good thing.

lupusreal a day ago

Don't grid operators already have this power? That's the sort of thing they do during "rolling blackouts", isn't it?

latchkey a day ago

[flagged]

  • add-sub-mul-div a day ago

    I partly agree, ideally the worthless crypto/AI data centers can be isolated and shut off separately from data centers hosting random small businesses and web sites etc.

    But I'd rather have an unbalanced grid without Bitcoin waste to start with than a balanced grid that can briefly shut down Bitcoin if needed.

  • miltonlost a day ago

    > Mining actually helps balance the grid because they can immediately shut it down during high demand.

    Mining is also part of what creates that demand, so they're "balancing" simply by not creating their crypto nonsense in the first place. To actually help "balance" they would have to put more in than they take out.

  • davidcbc a day ago

    [flagged]

    • latchkey a day ago

      Seems to violate a bunch of guidelines, take your pick.

      https://news.ycombinator.com/newsguidelines.html

      • davidcbc a day ago

        Then flag it I guess, but adding intentionally wasteful load to the power grid so that you can turn it off when the grid is overloaded is equally as silly as my example

        • latchkey a day ago

          wasteful is your opinion.

          Let me give you an example from the real world. The Massena Dam in upstate NY generates a lot of hydropower. It was built at a time when aluminum smelting was a thing and the pots could never go cold. This constant draw on the grid and the generators was actually a good thing. When they shut down the smelting - you know, because it created multiple Superfund sites - it became a lot harder to operate the dam.

          A few enterprising bitcoin miners took over the old Alcoa smelter buildings and started using the power. When I went to speak with the people who ran the dam, they were super happy about it. Not only were they now receiving revenue to keep funding the operation of dam, but they were also extending the life of it because of the continuous draw.

          The obvious argument would be to allocate the power elsewhere. But this is upstate NY, where nobody wants to live. There just isn't enough demand for ~2GW of power up there, plus it is insanely expensive to build transmission for that amount of power to other locations. Even just the maintenance costs on the lines going the few miles to the smelters was insane.

          AI is probably the next best thing to take over that power, but this was all happening long before AI became the thing that it has.

bloomingeek a day ago

This is a terrible idea. We live in the digital age, so almost everything depends on data. As with all mechanical devices, generators have a tendency to either break down (let's not forget about the associated switching gear) or become subject to maintenance cutbacks instituted by buffoons in management. We cannot depend on humans not making errors, but we can depend on the electric grid if it is properly handled and maintained. Depending on generators just adds another link in the failure chain.

Texas is the perfect example of how not to run an electrical grid by not allowing other states to assist in an emergency.

  • hermitdev a day ago

    > This is a terrible idea.

    No, it isn't. Any decent datacenter will have on-site generation in the event of a power grid failure anyway. When I was an intern, the company I worked for would routinely go off-grid during the summer at a call from the electric company. The electric company actually gave us significant incentives to do so, because us running on our own 12MW generator was effectively like the grid operator farming out a 12MW peaker unit.

    • bluGill a day ago

      Not only will a data center have a generator, they will test it regularly and, if it doesn't work, get it fixed.

      The power company has a long list of who has backup power. I know of one factory where the generator was installed in the 1920s on a boiler from the 1880's - it is horribly inefficient, but the power company still gives the owners incentive to keep it working because for 4x the normal cost of power and 12 hours notice that generator can run the entire town it is in, which they do every 5 years when things really go wrong with the grid.

      • dylan604 a day ago

        > they will test it regularly and if it doesn't work get it fixed.

        What is your definition of regularly, and what qualifies as getting it fixed? I know lots of places that had things scheduled, but on the day of, something "came up" that the test was pushed. I've seen others where they tested by only firing up the generator, but didn't actually use it to power the facility. I've also seen repair tags that sat "unlooked" at for years.

        Not every facility is managed/financed the same for such a blanket statement as yours.

        • bluGill a day ago

          Most places I know of were like that until 'something' happened, and now they take a lot more care.

        • mjcl a day ago

          So this will force datacenters that employ "reliability theater" to either actually be reliable or give up the facade and take repeated outages?

          Ok!

        • bloomingeek a day ago

          Exactly! And that is one of the points, based on real work experience, I was trying to make.

    • scotty79 a day ago

      There was recent news that a datacenter is going to be built that will consume a few times more power than all the homes in the state. I don't think they are going to have on-site backup power, although they'll probably have an on-site power plant for normal operations.

  • _verandaguy a day ago

        > We live in the digital age, so almost everything depends on data
    
    Data that I can't consume if my house is browned out and my router doesn't work (on top of heating/cooling, lights, and other basic living-related services that are less essential than the almighty ONT).

        > As in all mechanical devices, generators have a tendency to either break down(Let's not forget about associated switching gear.) or become subject to maintenance cut backs instituted my buffoons in management
    
    Famously, power infrastructure relies on no moving parts whatsoever since the abolition of contactors, relays, rotors (but not stators), turbines (both water and wind), and control rod actuators, though even before abolition, none of these devices needed any maintenance.

        > We cannot depend on human errors, we can depend on the electric grid, if properly handled and maintained
    
    The electric grid, which famously has no human or mechanical errors like line sag or weirdly-designed interconnects or poorly-timed load shedding.

        > Depending on generators just adds another link in the failure chain.
    
    Weird way to frame a redundancy layer, but sure.

        > Texas is the perfect example of how not to run an electrical grid by not allowing other states to assist in an emergency.
    
    Again, weird way to frame this. You're actually technically right about this, but the redundancy offered through a better-integrated interconnect goes both ways, rather than just externalizing weaknesses in TX's own interconnect design.
  • joezydeco a day ago

    I'm happier with Texas being independent. Why should my state brown out because a bunch of companies put data centers in the hottest part of the continent?

  • Workaccount2 a day ago

    Don't put all your digital services in a texas datacenter with no fallback then...

  • darth_avocado a day ago

    The whole point of using a cloud data center is to be able to handle grid outages. I'd be using the cabinet under my table otherwise.

  • vel0city a day ago

    > We live in the digital age, so almost everything depends on data

    Agreed, the datacenters need to be extremely durable. What's more durable than proving you're able to withstand a power outage event? The grid does go down from time to time; they need to be ready to handle it. That's not a Texas-only kind of thing; power outages happen all over the US.

    If the datacenter can't handle the outage that was announced as a probability ahead of time, they have no business running critical applications.

  • mothballed a day ago

    IIRC the reason Texas cannot get 'assistance' from other states is that the feds made it illegal to connect to most interstate grids without following their regulatory regimes. I believe Texas does connect to Mexico and possibly some other regional grids, although I don't really understand the exemption for those.

    In this case it's not really Texas 'not allowing' other states to help but the other states not allowing Texas. Conceivably federal law could be updated to remove those regulations and Texas would absolutely connect to the interstate grid at that point.

    • RHSeeger a day ago

      > Conceivably federal law could be updated to remove those regulations and Texas would absolutely connect to the interstate grid at that point.

      That, of course, ignores the fact that those regulations are in place for a reason. Texas refuses to play by the rules, and the impact of that is that they don't get help when it's important. It is unfortunate, but a direct consequence of the choices they made.

    • vel0city a day ago

      The Texas Interconnection does tie in to surrounding grids through DC-ties. Those are limited in how much power can be sent through them and ultimately isolate the AC frequency.

    • more_corn a day ago

      Rules like hardening your system to be resilient in high or low temperatures.

      Super abusive. Let’s do away with safety systems that literally save human lives. Heck Texas doesn’t like them so let’s do away with them for the whole nation? How many people died during the last couple heat and cold induced grid outages in Texas? I lost count after a couple dozen. But those people were weak or poor anyway right? Texas strong!