
Interview with EMP expert John Kappenman


As part of our research for the guide to understanding electromagnetic pulse threats (a great place to start if you’re new to EMPs), I had the chance to interview John Kappenman, a notable expert from the think tank Metatech. I discovered Kappenman’s work pretty quickly because his models are cited in multiple government reports and form the basis for FEMA’s thinking about worst-case scenarios for a large space weather event.

Kappenman’s CV is impressive, and includes multiple faculty positions, awards, and high engineering honors. He has devoted his career to this topic, which is one reason why Metatech is relied on by the US government for classified and unclassified work on EMP impacts.

Our interview was interesting enough that we’re sharing the (lightly edited) transcript below as a companion to the main guide. There’s certainly a lot in here for all of us to think about — namely how the power industry underestimates the solar EMP problem and isn’t ready for it — as we try to understand the hows and whys of preparing for a major EMP event.

Here’s a quick overview of what Kappenman told me about the catastrophic risks we as a society are taking every minute that our electrical grid orbits a star we still don’t understand well enough to guard against:

  • The famous 2013 Lloyds of London report on solar storms and the grid that everyone relies on for their forecasts is deeply flawed. The real outcome of a storm the size that Lloyds modeled is likely to be a lot worse.
  • Modeling and simulating the grid is one thing, but modeling and simulating the earth beneath it is far harder yet just as critical for figuring out how much danger we’re in. Our ground models just aren’t good enough.
  • The power industry just doesn’t want to hear that it needs to spend hundreds of millions of dollars to guard against the full range of space weather we know the sun can fling at us with no notice.
  • We’re not entirely sure what the pulse from a high-altitude nuclear blast will do to modern electronics, but much of the available evidence points to widespread failures.
  • We’re at least a decade away from even starting to harden the grid with major upgrades. Until we get this taken care of, we’re vulnerable to a range of catastrophes, from large regions of the country without power for weeks or months to a total nationwide (or even worldwide) blackout that will take years to recover from.

A range of possibilities

TP: One of the things that has come up for me in investigating the topic of EMP is the wide range of perspectives on the impact of a Carrington-class geomagnetic storm or a nuclear EMP — and I understand that these are very different — on the grid and on technology.

It seems that some of your reports from Metatech from 2010 are pretty dire — over 120 million people across the nation without power after a large solar storm. But then there is this Lloyds of London report, which has its own simulation, and they have only a few specific, highly populated counties offline in the Northeast. So I was wondering if you could speak to that range of opinion that’s out there.

The power industry is using ground models that understate the problem by a factor of 2-8x.

JK: In regards to the Lloyds of London report, their ground models were not very good. There were no validations done at all on them. We did an extensive amount of validation of ground models across the US. There are lots of geomagnetically induced current (GIC) measurements that have been made since the early ’80s, and some dating back even further than that, which allowed us to validate the ground models.

And that’s a real important thing, because ground conductivity ultimately defines how much geoelectric field and GIC flow into the power grids, and that can easily vary by up to a factor of 8 from the least responsive to the most responsive ground models.

The ground conductivity profiles are very non-homogeneous across the US and North America. I’ve filed extensive dockets with FERC [Federal Energy Regulatory Commission] showing that the power industry is using ground models that understate the problem by anywhere from a factor of 2 to a factor of 8, so they’re underestimating the threat potential.

And further, they’re also developing a geomagnetic storm threat [model] that is way lower than what has been measured in modern times, let alone going back to something that might truly represent a 100-year sort of threat.
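The scaling Kappenman describes is roughly linear: since GIC flow tracks the geoelectric field the ground model predicts, a ground model that understates the field by 2–8x understates the induced currents by about the same factor. A minimal sketch (the 50 A baseline is a hypothetical value, not from any real model):

```python
# Toy illustration of Kappenman's point (not his actual model): GIC scales
# roughly linearly with the modeled geoelectric field, so an understated
# ground model understates predicted GIC by the same factor.
def corrected_gic(modeled_gic_amps: float, understatement_factor: float) -> float:
    """Scale a modeled GIC estimate by the ground-model understatement factor."""
    return modeled_gic_amps * understatement_factor

modeled = 50.0   # amps in a transformer neutral -- hypothetical modeled value
low, high = 2, 8  # the understatement range cited in the FERC dockets
print(corrected_gic(modeled, low), corrected_gic(modeled, high))  # 100.0 400.0
```

The point of the arithmetic: a grid operator planning around 50 A of GIC could actually be facing 100–400 A, which is the difference between a manageable disturbance and transformer damage.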

Incentives, risks, and regulatory gamesmanship

TP: I understand the incentives for the power industry to underestimate this, but what about the insurance industry? The incentives for the power industry seem straightforward — obviously they don’t want to put money into risk mitigation, and they’d rather it be somebody else’s problem. But what about an insurer like Lloyds?

“[Power companies] build threat models that are way more conservative than what has [already] been measured in modern times, let alone going back to something that might truly represent a 100-year sort of threat.”

JK: I don’t know why they’d have an incentive to be wrong. I don’t think it would be in their best interest. I think they’re just wrong because they hired people who didn’t do a very good job.

TP: It’s interesting to me that the Lloyds report does not cite your work. The internal FEMA report that was released in 2017 does cite it, but Lloyds does not.

JK: Then that is a problem! *laughs*

I had to take issue with a lot of what [researcher] Jennifer Gannon had done [in the models that formed the basis for the Lloyds report].

The only way to get these models right is to validate them against actual data and measurements. You can develop models to do anything, but they’re all wrong unless they’re validated. That can’t be overemphasized.

Unfortunately, the research arm of the power industry has this policy of keeping GIC measurements out of the public domain as much as possible. [Keeping that data secret] allows them to get away with developing models and simulation tools that greatly understate the nature of the threat.


I did recommend to FERC that new GIC measurements must be in the public domain. That’s the only way to make sure that they are not understating the risk.

Improving our models

TP: So it sounds like it’s not so much that modeling the grid is hard, but it’s modeling the ground that’s difficult.

JK: I would say modeling the ground is the hardest part. We have a great deal of accuracy on the elements of the grid that need to be modeled. The locations of substations, the routes of transmission lines — all of that is very easy to determine. I had to do some of this stuff using Google Earth, just to count the number of transformers and spare transformers that the industry held and where they’re located.

The ground modeling is the most difficult part, and that’s the part where frankly not a lot of good science has been done. Most of the space weather and geomagnetic storm emphasis always ended at the ionosphere, and didn’t do a good job of the solid earth physics down to the ground, which becomes very complex and messy in a hurry.

TP: I’m looking at this report from Idaho National Labs, and in one section (4.2.1) they say:

“There are more unknowns than knowns. The largest, most critical grid components do not have past experiment data on EMP mitigations to draw upon… Much of the threat information is not available to persons without security clearances or the ‘need to know’. HEMP information is decades old.”

So it sounds like there is a bunch of stuff that industry and everyone else would want to know, that’s classified or restricted, that makes it difficult to model.

JK: There are some elements of that that are valid, but not really.

Silicon-based electronics do not fail gracefully. They’re very brittle. They spark over, [so then] you’re in the process of buying new equipment to get it back up and running.

Let me give you a simple illustration of what the challenge is. If you walk into a power plant control center — and I’ve seen this on combustion turbines at a relatively small facility, because they all have a very sophisticated electronic distributed control system for each of these generators — every one of them will have a sign on them that shows a cell phone with the big circle and line through it, saying “don’t use your cellphone next to this piece of equipment.”

Well, the cellphone will generate a field of about 1 to 3 volts per meter. The EMP device can do about 50,000 volts per meter (it’s unclassified).

So that’s your basic thing you need to understand about what you’re facing if you own this piece of apparatus. *laughs* It’s probably not going to have a good day if an EMP event occurs.

Silicon-based electronics do not fail gracefully. They’re very brittle. They spark over, and you’re in the process of buying new equipment to get it back up and running.

That’s a huge problem, and you face this across electronic systems throughout the power grid enterprise — substations are no different than many of the power plants. They have a bit higher withstand capability, mainly because they’re switching high-voltage equipment at those locations. But the threat field is substantially larger than the inherent withstand of any of those devices.

So you really can’t engineer these systems to be resilient to 50K volts/meter. Rather, what you have to do is put them in a protected space that will not allow them to experience that 50k volts/meter threat. You keep it walled off, or [in a] Faraday cage isolated from the environment.

That’s a fairly simple concept, and it doesn’t take a lot of access to highly classified information to figure this out. That’s for the E1.

For the E3 there is some classification on field levels, but generally there is a blocking device that’s capable of operating against even those classified levels. Again, that’s not a terribly difficult thing for anyone to understand. I think it comes down to perhaps some resistance to just doing anything, and using this as an excuse.

What can and should be done

TP: I saw a recent Government Accountability Office report on what has been done to address EMP risks since the 2008 EMP commission report, and the answer was some research, planning, and guidelines, but no actual practical mitigation done. How difficult will it be for us to shield these facilities?

JK: The problem with doing metal shielding [for Faraday cages around equipment] is that it’s just a lot messier than using concrete. If you’re familiar with the ribbed metal that has overlaps, these are all painted steel sheets, and that paint is an insulator. So even though you’re overlapping them, you’re still allowing a slit [made by the paint coatings] for E1 [EMPs] to migrate through. So it doesn’t do a very good job of shielding. You’d have to weld each and every seam.

Then, these are typically constructed on a non-conducting concrete floor, whereas you really need a six-sided box. You have to shield the floor, too, to prevent reflections and refractions into that protected space.

Further, an incident E1 wave that hits metal will reflect, and that makes it difficult to allow for entrances for people, air handling, wires and things like that. You have to build what we call “tortuous paths” to keep that radiation out of the facility.

TP: My sense is that if we were to suffer a Carrington-class event, or something close to it, your estimate is that it would be pretty bad.

If you go back, there’s an event called the Charlemagne event, which appears to be something on the order of even ten times greater than the Carrington Event.

JK: I think it could be, yeah. I think it could be.

TP: So you, then, don’t consider it an extreme stance to say that 130 million people could lose power for an extended period of time.

JK: No, I don’t think that’s out of the realm of the possible, unfortunately.

TP: I was an electrical engineer for undergrad, and I have a fair number of friends in Silicon Valley, and when we talk about this stuff it’s often “yeah of course it would be really bad.”


When I talk to other engineers about this, they’ve often looked up the Carrington Event and read about the telegraph wires sparking [during that EMP], and so on. For them that’s a fairly straightforward mental model of the level of catastrophe we’d be facing.

But now that I’ve hopefully become a little more sophisticated in my understanding of this, it seems like probably the telegraph wires maybe weren’t insulated, or maybe there was other stuff going on that makes that not as good of a mental template for thinking about what would happen in the modern era?

JK: When I look at telegraph wire data, what I gain from it is the geoelectric field response that occurred due to the threat environment from the storm. We don’t know at all, from the Carrington Event, how many nanoteslas per minute the geomagnetic field change was. That can’t be reverse engineered.

But we do have data from old storms. I did a paper on the 1921 storm, where the actual data was much better, and on the 1982 storm. We were looking at geoelectric fields from those events that were in the 10 to 20 volts/kilometer range.

The beauty of that is that the earth’s conductivity doesn’t change noticeably over time. Infrastructure changes a great deal, so I can use that data from those old infrastructures to extrapolate what would occur to the modern-day infrastructure.
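Kappenman's extrapolation logic can be made concrete with back-of-the-envelope arithmetic: the quasi-DC voltage that drives GIC along a transmission line is roughly the geoelectric field times the line length (for a field aligned with the line). The 500 km line length below is an assumed illustrative figure, not from the interview:

```python
# Rough sketch: driving voltage = geoelectric field x line length,
# for a field component aligned with the line (a simplification).
def driving_voltage(field_v_per_km: float, line_length_km: float) -> float:
    return field_v_per_km * line_length_km

# The 10-20 V/km fields Kappenman cites for the 1921 and 1982 storms,
# applied to a hypothetical 500 km EHV transmission line:
for e in (10, 20):
    print(f"{e} V/km over 500 km -> {driving_voltage(e, 500)} V quasi-DC")
```

Thousands of volts of quasi-DC drive across a line whose transformers were designed for zero DC is what pushes transformer cores into half-cycle saturation — which is why the same measured fields imply much bigger trouble on today's longer, higher-voltage grid.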

The power grid is a big, big antenna that’s sitting out there, and it’s well-coupled to this threat environment, and it has never been engineered to take this threat environment into consideration.

In fact, given the process that’s going on now between the power industry and FERC, they’re probably not going to have even begun implementing any sort of fixes to this until 2028. So we’ve got another decade or so to go before they begin to do anything serious about this.

If you’re worried about a 1 in 100 year threat, you’ve got a 1% chance per year, and you’re already moving your way through the game of Russian roulette that we’re playing with the sun.

Playing the odds

TP: So you take this as a one percent chance per year?

JK: I’m worried about a threat environment that could be a 1-in-30 to a 1-in-100 year event, that could cause significant impact to the US power grid. So we’re in that 1% to 3% per year risk of this happening.
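The "Russian roulette" framing compounds over time: with an annual probability p of a 1-in-N-year storm, the chance of at least one such storm over a span of years is 1 − (1 − p)^years. A quick sketch using the 1%–3% per-year range Kappenman gives:

```python
# Cumulative probability of at least one event, assuming independent years:
# 1 - (1 - p) ** years
def cumulative_risk(annual_p: float, years: int) -> float:
    return 1 - (1 - annual_p) ** years

for annual_p in (0.01, 0.03):  # Kappenman's 1%-3% per-year range
    print(f"p={annual_p:.0%}/yr -> {cumulative_risk(annual_p, 10):.1%} over a decade")
```

Over the roughly ten years Kappenman says it will take before serious grid fixes even begin, that works out to about a 10%–26% chance of a major storm — which is the risk he's arguing we are silently accepting.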

I remember a while back, my colleagues who were looking at solar impact on GPS — that’s driven by a different solar phenomenon, radio frequency interference from the sun — and they had two or three of what they classified as “one in 100 year events” in a one month period of time back around 2010.

We have a very poor understanding of what the sun is capable of, I fear. I know we’ve experienced events like the 1921 storm and the Carrington storm, and the sun is fully capable of reproducing those events again at almost any time.

And if you go back, there’s an event called the Charlemagne event, which appears to be something on the order of even ten times greater than the Carrington Event.

So who knows what could come our way, but we do know that the engineered system out there in the power grid is not well prepared for these events, and has never taken this threat into consideration in its design. And the way that the design has evolved over the decades has only made it more vulnerable and more electromagnetically coupled to this threat.

Deep problems

TP: When I came into this topic, I was thinking about this as a power grid problem and a problem of the sun, but what I’ve come to understand from talking to you is that there’s the sun, there’s the earth’s magnetic field, and there’s the earth, which is this giant conductor. And we’ve got this magnetic field wrapped around this conductor, and the solar wind moves it around, which causes a current in the ground, I guess.

JK: Well, it causes a current in the upper atmosphere, and that couples to the ground. That coupling to the ground is very complex, because of the complexity of the nature of the deep earth.

I don’t know if you come from the world of silicon physics, and solid-state physics and so forth?

TP: I do, yeah.

JK: Ok, then here’s a real easy way of thinking about it:

You’re dealing with a low-frequency signal from the magnetosphere that propagates down to the surface of the earth. This is very low frequency, below 1 Hertz. So if you look at it that way, the magnetic field and the geoelectric field response requires a propagation to depths of 400km or so.
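The depth figure follows from the electromagnetic skin depth: at sub-1-Hz frequencies, fields penetrate hundreds of kilometers into resistive rock. A hedged sketch (the ~1000 Ω·m crustal resistivity and 0.001 Hz storm frequency below are assumed illustrative values):

```python
import math

# Skin depth for a plane wave in a uniform conductor:
#   delta = sqrt(rho / (pi * f * mu0))
# This shows why sub-1-Hz storm signals "see" the earth to ~400-500 km depth.
MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def skin_depth_km(resistivity_ohm_m: float, freq_hz: float) -> float:
    return math.sqrt(resistivity_ohm_m / (math.pi * freq_hz * MU0)) / 1000

# Resistive deep crust (~1000 ohm-m, assumed) at a 0.001 Hz storm frequency:
print(f"{skin_depth_km(1000, 0.001):.0f} km")  # on the order of 500 km
```

The real earth is layered rather than uniform, which is exactly why Kappenman says the solid-earth part of the problem gets "complex and messy in a hurry" — but the single-layer number already lands in the 400 km range he cites.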

So you’ve got a very complex geological strata that you’re dealing with, and a lot of it is made of silicon-based layers of earth. That silicon can have variations in impurities, and those impurities can impart some degree of semiconducting and insulating properties, which is just like your silicon physics that you exploit for electronics.

That’s a very difficult thing to characterize, and it can really only be done well by trying to establish these measurements over broad regions.

The best way to do it is to take the GIC measurements that we already have. They fully capture the mesoscale properties of this whole environment. If you know the intensity of the storm, and you have GIC measurements coupled with magnetometer measurements of the magnetic field variations, then by marrying those two measurements together it’s easy to establish what the response characteristics of the ground are in your particular region.

And if you do that over the entire power grid, then you can develop a very good model of that, and simulate with very high fidelity storms that have occurred before and are almost certain to occur again at some future date.
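The marriage of magnetometer and GIC data Kappenman describes is, at its simplest, a regression: estimate the coefficient relating measured field variation to measured GIC. A minimal sketch on synthetic data (the 0.8 A per nT/min response and all data below are invented for illustration; real ground response is frequency-dependent, so practitioners fit a transfer function rather than a single scalar):

```python
import numpy as np

# Synthetic stand-ins for paired measurements (NOT real GIC records):
rng = np.random.default_rng(0)
db_dt = rng.normal(0, 50, 500)      # magnetometer field variations, nT/min
true_response = 0.8                 # assumed amps of GIC per nT/min
gic = true_response * db_dt + rng.normal(0, 2, 500)  # noisy GIC readings

# Least-squares estimate of the ground/network response coefficient:
estimate = np.dot(db_dt, gic) / np.dot(db_dt, db_dt)
print(f"estimated response: {estimate:.2f} A per nT/min")
```

With the regional response coefficients in hand for every monitored node, a grid-wide model can replay historical storms (1921, 1989) against today's network — which is the validation step Kappenman argues the industry's unvalidated models skip.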


