Ofgem says August power cut caused by cascade of calamity


The UK’s biggest blackout in a decade hit trains, the London Underground, traffic lights, a hospital and an airport in August. Ofgem has now published the National Grid Electricity System Operator’s (ESO) report into the 9 August power cut.

The report will form part of Ofgem’s investigation into the outage, which it began on 20 August. The systems and procedures of National Grid Electricity Transmission, 12 distribution network operators and the generators RWE Generation and Orsted are also under scrutiny.

A National Grid Electricity System Operator spokesperson said “rare circumstances” led to the blackout.

“The electricity system operated as it should and within the operational guidelines by which we are governed.”
National Grid Electricity System Operator spokesperson

What does Ofgem say happened?

At 4:52pm on 9 August, over 1 million customers lost power due to multiple failures in the UK’s electricity network.

The power cut was the largest since 2008 and caused widespread disruption to many people and businesses.

Some rail services in and around London were especially badly hit with many Friday evening commuters left stranded. Traffic lights in some areas also stopped working.

According to the ESO’s final technical report to Ofgem, approximately 1.1m customers were without power for between 15 and 45 minutes. Supply was restored to all customers by 5:37pm.

Parts of the rail network suffered major disruption. Services were cancelled or significantly delayed, affecting thousands of passengers, after about 60 trains shut down.

Lines out of Farringdon and King’s Cross stations were blocked, and King’s Cross and London St Pancras were temporarily closed, resulting in overcrowding at rush hour.

While the overhead power supply to the network had continued operating, the trains had reacted to a drop in frequency below 49Hz caused by the wider problems with the grid.

Some lines were still experiencing delays and cancellations the following morning. The London Underground Victoria Line was also temporarily halted due to the event, but it was back in service at 5:35pm.

Ipswich Hospital experienced a power fluctuation but the hospital’s emergency generators kicked in. While a circuit breaker did cut the electrical supply to the outpatients and X-ray departments, the issue was brief.

In Newcastle, the airport was plunged into darkness when it lost supplies from the network for 18 minutes due to the Low Frequency Demand Disconnection (LFDD) safety protocol.

The electricity system operator can cut supply when the network is disrupted to balance the energy distributed across the system. However, under the Electricity Supply Emergency Code (ESEC), supply should be maintained to sites classed as “protected”. It seems the airport was not registered as a protected site until after this incident.

Why did it happen?

The initial cause of all this chaos has been traced to a lightning strike hitting a transmission circuit.

According to MeteoGroup, a private weather monitoring service, on 9 August there were 12,370 lightning strikes across Great Britain. In the two hours before the blackout, 2,106 strikes hit the UK mainland. There was heavy rain and lightning storms around the transmission network north of London.

ESO said that in all other cases of strikes on the transmission network that day, safety systems had protected the network without problems.

However, on 9 August, one unlucky strike between Eaton Socon in Cambridgeshire and Wymondley in Hertfordshire at 4:52pm began the crisis. The grid’s protection systems discharged the lightning in less than 0.1 seconds and the line returned to normal operation.

Unfortunately, at the same time a small energy producer connected to the network went down, removing around 150MW from the system.

This was to be expected in the event of a lightning strike and shouldn’t have caused much of a problem, except the dominoes kept falling.

Right after the lightning strike, a software problem forced a wind farm off the Yorkshire coast to cut its supply to the grid by 737MW. A further 244MW was lost when a steam turbine at Little Barford power station in Bedfordshire tripped.

The ESO said this last event would not have been expected after a lightning strike, calling it “extremely rare”. The power station operators are still investigating the shutdown of their turbines.

Altogether, the grid had now lost 1,131MW of power. To put this in perspective, before the lightning strike there was around 32GW of capacity available on the system and overall demand was forecast to reach 29GW, similar to the demand on the previous Friday. This means around 3.9% of forecast demand was suddenly unserved.
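As a quick sanity check on that 3.9% figure, using the MW values from the report:

```python
lost_mw = 1_131            # generation lost after the strike (MW)
forecast_demand_mw = 29_000  # 29GW forecast peak demand

# Share of forecast demand lost, as a percentage
share = round(lost_mw / forecast_demand_mw * 100, 1)
print(f"{share}% of forecast demand lost")  # 3.9%
```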

However, the cascade of calamity didn’t stop there. The dramatic loss of power caused a sharp drop in frequency.

What’s the deal with frequency?

The UK electricity system uses alternating current (AC), which means the voltage alternates between positive and negative. The rate of this alternation is the frequency, and in the UK the current completes 50 cycles per second. So, the UK grid has a frequency of 50 hertz (50Hz).

Every electrical appliance sold in the country is designed to operate at this frequency; the same goes for generators supplying power to the grid. So, the frequency needs to be kept stable within very narrow limits.

If the supply of or demand for electricity changes too much too quickly, it can mean big problems for the frequency of the grid. If demand exceeds what providers can supply, the frequency drops; if supply exceeds demand, it rises.

A frequency change as small as 1% above or below 50Hz can cause damage if it is allowed to go on too long.
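That 1% tolerance band works out to 49.5Hz–50.5Hz, which makes the frequencies mentioned later in this story easy to check. A minimal sketch (purely illustrative, not from the ESO report):

```python
NOMINAL_HZ = 50.0
TOLERANCE = 0.01  # 1% above or below nominal

def within_band(frequency_hz: float) -> bool:
    """Return True if the frequency is within 1% of 50Hz (49.5Hz-50.5Hz)."""
    return abs(frequency_hz - NOMINAL_HZ) <= NOMINAL_HZ * TOLERANCE

print(within_band(50.1))  # True  - normal operation
print(within_band(49.1))  # False - the dip after the initial losses
print(within_band(48.8))  # False - the low point that triggered disconnections
```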

Many pieces of equipment have mechanisms to protect them in such circumstances. This is why about 60 trains shut down when the frequency dropped below 49Hz and why Ipswich Hospital lost power due to the operation of its own protection systems.

So, on 9 August, the drop in frequency caused some small energy generators to disconnect from the system as an automatic safety measure to protect their equipment, draining a further 350MW from the supply.

While the ESO held enough backup power to cover the loss of 1,000MW, the total lost at this point was a whopping 1,481MW. This caused the frequency to drop sharply to 49.1Hz, outside the safe range of 49.5Hz-50.5Hz.

To prevent damaging frequency fluctuations the National Grid relies on power generators to provide frequency response services to stabilize the system.

When the frequency increases, the generator reduces its output. If the frequency drops, output is increased. These reactions can happen in less than a second from the time a change in frequency is detected.
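A toy sketch of that proportional response, with a made-up gain figure (real frequency response contracts are far more sophisticated), might look like this:

```python
NOMINAL_HZ = 50.0
GAIN_MW_PER_HZ = 500.0  # illustrative only: 500MW of response per 1Hz of deviation

def response_mw(measured_hz: float) -> float:
    """Output adjustment requested from a responding generator.

    Positive means increase output (frequency below 50Hz);
    negative means reduce output (frequency above 50Hz).
    """
    return GAIN_MW_PER_HZ * (NOMINAL_HZ - measured_hz)

print(response_mw(49.1))  # frequency low  -> increase output
print(response_mw(50.2))  # frequency high -> reduce output
```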

What happened next?

The ESO resorted to its backups, including 472MW of battery storage, and frequency response services to stabilize the situation. However, just as the frequency was beginning to climb again, at 49.2Hz, another turbine tripped at Little Barford due to the failure of a valve. The total loss of power now shot up to 1,691MW.

With no more backup power to call on, the frequency plummeted to 48.8Hz. At this point, more safety systems kicked in and automatically stopped supply to 5% of mainland UK customers, including Newcastle Airport. This measure was to allow the frequency to recover and prevent damage to the network.

An estimated 1GW of mainland UK’s electricity demand went temporarily unmet. A further turbine at Little Barford then tripped, removing another 187MW and taking the total loss to 1,878MW.
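Pulling the report’s running totals together makes the cascade easier to follow. Note the 210MW figure for the valve-failure trip is inferred here from the difference between the stated 1,481MW and 1,691MW totals; the other figures are as reported:

```python
from itertools import accumulate

# Generation losses in MW, in the order the report describes them.
losses_mw = [
    ("Small embedded producer trips with the strike", 150),
    ("Yorkshire coast wind farm software problem", 737),
    ("Little Barford steam turbine trip", 244),
    ("Small generators disconnect on low frequency", 350),
    ("Little Barford turbine trip (valve failure, inferred MW)", 210),
    ("Further Little Barford turbine trip", 187),
]

running = list(accumulate(mw for _, mw in losses_mw))
for (event, mw), total in zip(losses_mw, running):
    print(f"{event}: -{mw}MW (running total {total}MW)")
```

The running total passes through the 1,131MW, 1,481MW, 1,691MW and 1,878MW milestones quoted in the report.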

This chain of misfortune is what led to roughly 1.1m customers being left without power for up to 45 minutes, according to ESO.

However, the disconnection of customers and the activation of backup generation stabilized the system at 50Hz by 4:57pm. This entire saga happened within just 5 minutes of the lightning striking the transmission line.

Once the grid had stabilized, the UK’s 14 Distribution Network Operators started reconnecting everyone and supply to all customers had resumed by 5:37pm.

What can be learned?

The report to Ofgem argues procedures “generally worked well to protect the vast majority of consumers”, but it highlights some lessons learned and puts forward some recommendations.

It suggests the list of facilities protected from emergency disconnection should be reviewed to ensure critical infrastructure and services are not at risk in future.

Internal protection systems on electric trains should be checked to make sure they can deal with “normal” fluctuations in the electricity system, according to ESO.

It also pointed out that as the UK develops a greater reliance on small-scale energy generators, a review of the LFDD safety measures should be carried out to prevent inadvertent tripping and disconnection of such generators.

The overall report is very favourable towards ESO, and these recommendations seem to put the blame, if any is to be had, elsewhere.

That said, Norwegian advisory and certification society DNV GL was tasked by the operator to review its report and it concluded that “the technical analyses performed by ESO have been diligent and robust, and we support the findings and recommendations in the ESO Technical Report”.

Ofgem’s investigation is continuing and it will be interesting to see if its conclusions match those of ESO and DNV GL.

The government is also investigating the actions of ESO. The Energy Emergencies Executive Committee is due to report its initial findings to the Secretary of State for Business, Energy and Industrial Strategy by the end of September.

