[00:00:00] Kyle King: Welcome to the Crisis Conflict and Emergency Management Podcast, where we have global conversations and share perspectives about international crisis preparedness and how to build more resilient societies. My name is Kyle, and I will be your host. Just how vulnerable are we to the changing international environment, and what can we learn from this experience?
[00:00:18] From AI, to space warfare, to community development and crisis communications, there's something here for everyone. Join us for unique international conversations and perspectives into the current threats, challenges, and risks to our society. This podcast is brought to you by Capacity Building International and sponsored by The International Emergency Management Society.
[00:00:45] All right, so welcome to the Crisis Conflict Emergency Management Podcast, brought to you by Capacity Building International. And I am Kyle King. I'm your host, and today we are going to talk about complex problems in emergency management and complex systems analysis with our guest CJ Unis. CJ is currently a Systems Engineer with the United States Space Force.
[00:01:02] He has a Master's degree in Systems Engineering from the Stevens Institute of Technology. The degree focused on systems engineering practices, systems-of-systems methodology, sub-component architecture, and systems design. His main area of concentration is systems complexity and cascading failures of enterprise systems.
[00:01:20] He has over 20 years of professional experience in various disciplines, and he is a subject matter expert in continuity of operations (COOP), continuity of government (COG), devolution, infrastructure, community resiliency, alternative energy systems, and supply chain logistics. Mr. Unis spent 10 years at Sandia National Laboratories as a systems engineer, where he performed complex systems analysis for the Department of Energy (DOE), the Department of Defense, and other government agencies. He also performed complex systems analysis for the intelligence community, cybersecurity emergency response, physical security, and alternative energy divisions at Sandia during his tenure.
[00:01:55] He performed complex supply chain logistics analysis and analysis of the consequences of disruptions in enterprise systems, and he was also part of a team responsible for real-world nuclear incident response globally. He performed complex systems analysis as part of the nuclear red team assessing the vulnerabilities of nuclear weapons systems, and the title of his Master's thesis was Understanding Complex Problems in Emergency Management in Order to Mitigate Cascading Consequences. As we get started today, the traditional caveat for our conversation is that the opinions expressed here are our own and do not necessarily reflect those of our organizations, any authority, or any formal positions, et cetera. I think you know the drill.
So CJ, welcome to the show. Thanks for joining us.
[00:02:33] CJ Unis: Thanks for having me, Kyle. Great to be here.
[00:02:35] Kyle King: One of the reasons I wanted to go through your bio a little bit more in detail is because there's quite a lot to unpack there. I mentioned your Master's thesis on Understanding Complex Problems in Emergency Management in Order to Mitigate Cascading Consequences.
[00:02:48] Now, first of all, let's start out with, for those that aren't really well informed in this area, and even for myself, what is systems thinking?
[00:02:56] CJ Unis: Systems thinking is looking at the way in which we address problems. There are five foundational pillars that I look at, in any enterprise system, that have the capability to fail.
[00:03:06] And those five things are operations, infrastructure, systems, human capital, and cyber. You can add another one, but I assert that all those things are connected internally, and when they start failing internally, they start failing catastrophically. And when you start looking at how some of these problems propagate, they're connected kinda like a big spider web.
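CJ's five pillars and their spider-web connections can be pictured as a small dependency graph, with failures propagating along the edges. Here is a minimal sketch in Python; the edges below are entirely hypothetical illustrations, not CJ's actual model (real dependencies would come from analysis of the specific enterprise system):

```python
# Hypothetical example: the five pillars as a dependency graph.
# An edge A -> B means "if A fails, B is at risk of failing".
from collections import deque

dependencies = {
    "cyber": ["systems", "operations"],
    "systems": ["operations", "infrastructure"],
    "infrastructure": ["operations", "human_capital"],
    "human_capital": ["operations"],
    "operations": [],
}

def cascade(initial_failure):
    """Breadth-first walk of the web: every pillar reachable
    from the initial failure ends up in the failed set."""
    failed, queue = {initial_failure}, deque([initial_failure])
    while queue:
        node = queue.popleft()
        for downstream in dependencies[node]:
            if downstream not in failed:
                failed.add(downstream)
                queue.append(downstream)
    return failed

# With these assumed edges, a single cyber failure cascades to all five pillars:
print(sorted(cascade("cyber")))
```

The point of the toy model matches the conversation: because the pillars are interconnected, a failure rarely stays local, and which node fails first determines how far the cascade spreads.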
[00:03:25] So when you start looking at how these things are connected and why they're important and who needs to be a part of the equation with regard to policy folks or operations folks or senior leadership, it's really critical that the problem is framed properly. And if we don't look at framing the problem properly, you're not gonna get buy-in from the top down.
[00:03:45] And it's really critical that you're able to frame those things because infrastructure runs our daily world, quite frankly. And if we are not looking at community resilience, we're not looking at resiliency in general and we're not looking at infrastructure resilience. We're setting ourselves up for short and long-term failure.
[00:04:02] Kyle King: Most of our audience is either in the academic field or professional field related to crisis or emergency management. So can you give us a bit of a brief overview of your thesis and what were some of the key findings that you took away?
[00:04:14] CJ Unis: Yeah, so one of the interesting things about the thesis was the academic community understood it to a point.
[00:04:21] But there's an operational perspective that needs to be highlighted because there's a tremendous amount of tacit knowledge that goes into a lot of these operations, and I know you've seen it firsthand. I've seen it firsthand as a former operator myself. There are a lot of unspoken words that occur within these operations or within these disasters.
[00:04:41] Like it's intuitive, like the folks that have been doing this for years and years, just it clicks kind of like a battalion chief at a fire company or senior folks in S.W.A.T. They just know it. It's that spidey sense that just tingles. It's the unwritten word. It's tacit knowledge that drives the way in which they do things.
[00:04:58] None of that stuff is written down. And quite frankly, if we were able to capture the way in which some of this is done, the folks who are on the academic side would be able to do more cogent analysis to be able to understand, look, I have the ability to help you, but I have to be able to understand where you're coming from because I don't have that level of experience.
[00:05:20] And I'm trying to help us help you kind of thing. I coined a phrase in my Master's thesis: operational context is a critical constraint to any type of analytical tool development. If you don't know what's going on on the ground, it's really critical to have some insight into that world, to be able to do better analysis in order to make better decisions for that OODA loop, that continuous feedback loop.
[00:05:45] Kyle King: We constantly say, at least in the work that we do internationally, that context is extremely important, and there's even a cultural context that we add onto it. Which I guess one could argue is also applicable in the states based on the different regions that you're working in and environments and territories and things like that also applies, and the way that people understand things and do business.
[00:06:04] If we look at some of the recent events, I mean even just the Michigan Ice Storm, 700,000 people without power, or the Ohio train derailment, blizzards in Southern California. Or even just really a real potential for global conflict. I think we've learned over the last few years that we are really in a highly integrated society.
[00:06:21] And so from your view, and in your profession, you're looking at this continually, professionally, day after day. What is on your radar these days in terms of systems integration? Because in terms of what I see in our work, it's very, very difficult to disconnect things, and they fail rapidly. What we see is that when things fail, they fail hard.
[00:06:46] And it's very, very difficult to plan for that because you can't unpack it anymore. You can't disconnect things anymore. So it's a very difficult sort of problem to get our heads around. In some of our national assessments, or when we're looking at wartime resilience or whatever the topic is, it is extremely difficult to try and segregate or isolate some of these topics to work on one problem, because everything is connected in so many different ways. We're constantly surprised by things we just did not even see coming, because one, two, or three different situations came up and created a domino effect. So what are your thoughts on that?
[00:07:25] CJ Unis: That's a great question. So I think the bigger thing is that we need to get back to the basics here with regard to the way things propagate in our physical world which is almost parallel to the way they occur in nature.
[00:07:37] And nature is a very complex system. But if you're looking at the if-then construct about how we go about addressing some of these problems, it's an interconnected, interdisciplinary, multi-dimensional systems architecture and infrastructure problem, and it needs to be addressed as such. But a lot of people don't want to dive into that deep end of the pool because number one, it scares the hell out of 'em, quite frankly.
[00:08:00] And number two, they don't really know where to begin. So some foundational pillars need to be around infrastructure, around systems, around operations, all of those things that we've talked about, those five pillars. Those can all be built upon.
[00:08:14] And there are also four foundational questions that I ask when you're trying to tackle one of these projects: What is our baseline? Do we have requirements? Do we have money to fund this? And do we have senior leadership buy-in on these particular things that you're concerned about?
[00:08:29] If you answer no to any of those four questions, you need to go back to the drawing board, and you need to get buy-in at whatever level you're at in order to do these things. Because our physical world is a very complicated and complex thing, which is almost exponentially compounded by technology, right? So if we think we have the ability to solve our problems with technology, we're looking at the problem in the wrong way, right?
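CJ's four foundational questions behave like a simple go/no-go gate: a single "no" sends you back to the drawing board. A toy sketch of that logic (the question keys are illustrative labels, not a formal checklist):

```python
# Hypothetical go/no-go gate over the four foundational questions.
QUESTIONS = ("baseline", "requirements", "funding", "leadership_buy_in")

def ready_to_proceed(answers):
    """True only if every one of the four questions is answered 'yes';
    any 'no' (or unanswered question) means back to the drawing board."""
    return all(answers.get(q, False) for q in QUESTIONS)

print(ready_to_proceed({q: True for q in QUESTIONS}))         # → True
print(ready_to_proceed({"baseline": True, "funding": True}))  # → False (missing answers count as 'no')
```

The design choice worth noting is that missing answers default to "no": in CJ's framing, not knowing whether you have requirements or leadership buy-in is itself a reason to stop.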
[00:08:53] So I've got this app that’s gonna do everything for you. Well, yeah, that's great. But that app didn't necessarily consider all of the things that you're talking about, the stuff coming outta left field. So I think it's really incredibly important for academia, the operations community, the analysis community, and the intelligence community to look at this problem through a systems lens.
[00:09:13] And that's all of the things that are wrapped up into one thing. But they all feed off of each other, Kyle, so that's also critical. We're taking that if-then construct: if I go in and do this, then there are going to be positive or negative cascading implications to the way in which that outcome is either perceived or executed upon. Bad data in, bad outcome out.
[00:09:36] So a complex, wicked cascading problem on top of bad decisions made equals a really bad day for everybody involved in emergency management. And I'm sure you've been there and I know I have.
[00:09:48] Kyle King: You raised a really interesting sort of topic in terms of technology because I'm trying to find the right words here, but I think one of the key challenges is the speed of innovation. Because if we look at the ideas of what you are talking about and taking a more comprehensive look and taking the time to look through the systems, the effects and what may happen, resourcing these plans, discussions, and community engagement, that tends to slow things down with the implementation of projects because you need a thoughtful approach to be able to make sure all your bases are covered.
[00:10:19] Whereas alternatively, there's this speed of innovation, right? The speed of technology. So how do we balance these? Because a lot of times, and you just mentioned it, you're like, oh, I've got this great new app, which will do everything. Or we have a new smart city technology that will come out and make everything easier for traffic management or something, I don't know. But it is just like there's rapid innovation, which can be pushed forward as innovative and great, but then we also often don't take the time to really think about the effects of implementation. So how do we balance these?
[00:10:49] CJ Unis: Yeah, so let's pull on that string a little bit more. It's safe to say that 20th-century policy is driving 21st-century innovation, and I think the bigger issue you're dealing with here is really archaic policies, whether in emergency management, in the Department of Defense, or in any agency you can name. A lot of the continuity of operations, continuity of business, and continuity of government plans really aren't exercised. They'll do a plan and it'll sit on the shelf until it's needed. From your time in the Marine Corps, my time in the Marine Corps, [it's] train as you fight, right?
[00:11:23] So if you're not proficient, if you're not doing these things on a regular basis, you're setting yourself up for failure internally, institutionally, and culturally. So it's a cultural thing too. The Japanese have a term called kaizen, which is continuous improvement. The person on the bottom has a stake, as well as the person all the way at the top, and they all come together to share their ideas.
[00:11:44] I think in the US we've kind of lost sight of everybody's buy-in or everybody has a voice here with the way in which we address a lot of these problems, but we have the ability to innovate. We've already proven that. But the policies are the roadblock to innovation, in my opinion. And what I've seen firsthand, and I'm sure you've probably seen it as well. But those are things that are really important to driving the way in which we need to address some of these problems because we are literally moving at the speed of technology right now.
[00:12:13] And last time I checked, that's really, really fast. We've got this thing that we hold in our hands most of the time called a smartphone. It actually has more computing power than the Apollo 11 guidance computer, which is kind of crazy to think about. The average smartphone has 190 million lines of code on it.
[00:12:28] That's crazy as far as I'm concerned. So that thing that you hold in your hand is more complex than the spacecraft that first took us to the moon. So it's pretty wild when you think about it in those terms. But again, there's complexity in everything: the automobiles that you drive, the telecommunication systems, the infrastructure systems, and the distribution supply chain systems.
[00:12:46] So when you take a look back, we have to address these problems differently. Because the people who want to continue to push this technology have to understand that there are consequences to that technology. It's like what they were talking about in Terminator in the early nineties, right? The rise of the machines.
[00:13:03] So when you've got AI and you've got machine learning and things of that nature, it's great. But we have to have people that are formally trained to understand what those things mean once they propagate. And once you have an answer faster than people are able to comprehend the way in which those answers come about. Well, how did we get here?
[00:13:24] It's kinda like showing your math problems in grade school. You have the answer. Well, how did you get that answer? There has to be a strategic roadmap on, okay, this is how I got the answer. How do I functionally decompose that to get back to my root cause to say, this is how we got here in the first place.
[00:13:39] Kyle King: Yeah, you mentioned AI, and I think one of the great examples that has just happened in the last few months is platforms like ChatGPT and others just blowing up in the marketplace, and Microsoft and Google jumping into that competitive space as well with their own AI.
[00:13:54] And then we've seen how rapidly it was debunked, taken apart by prompt engineering, how people just took it apart and revealed biases and everything else that goes along with it. Because, as you mentioned, it's sort of garbage in, garbage out.
[00:14:09] It only knows what we know. And there's this whole aspect to it that...I think it exemplifies a rapid adoption of technology without fully understanding the consequences. If we thought for just a minute, say that in the emergency management field we just said AI is great, we're gonna use it for our automated messaging systems and IPAWS or whatever, and just let it take over control.
[00:14:32] Then you find out it's sending out messages at the wrong place and at the wrong time, whatever the case is. That rapid adoption of technology, I think is a concern because you also highlighted a point, which I think is a critical concern from my perspective especially when we work internationally in conflict areas.
[00:14:48] Systems go down, there's no banking system, there are barely any communication systems, and it comes back to the fundamentals of community. Which are like, do you know your neighbors? Do you have a few days of food? Do you have sources of water? These are real fundamental elements of society. If we continue along this pace with just rapid adoption of technology, leaning into that without really understanding root causes and why things happen, we're gonna have a huge knowledge gap, and it will probably expand in the future. As you know from the old land nav stuff, right? If you're just a couple of degrees off, you'll be miles off.
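Kyle's land-nav point is simple trigonometry: cross-track error grows with the distance traveled, so a small heading error compounds into a large miss. A quick sketch of the arithmetic (the 60-mile movement is just an illustrative figure, not from the conversation):

```python
import math

def lateral_error(distance_traveled, degrees_off):
    """Cross-track error after moving `distance_traveled` on a heading
    that is `degrees_off` away from the intended azimuth."""
    return distance_traveled * math.tan(math.radians(degrees_off))

# A two-degree heading error over a 60-mile movement:
print(round(lateral_error(60, 2), 1))  # → 2.1 (miles off target)
```

This is the same compounding logic the conversation applies to technology adoption: a small early misunderstanding of root causes produces an ever-widening gap the further you go.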
[00:15:24] CJ Unis: Absolutely. I think you're spot on and I think it's really important to consider that there are a lot of things within how we're looking at addressing these problems, right? Because they're not getting any less complicated or any less complex. So there's got to be some understanding here, not only from the C-suite level but just when you're talking about baseline infrastructure in third-world countries, right? Those are the basics. The things that we take for granted are so treasured in other countries. It's not even funny. I mean, electricity. There are places in the world that have rolling blackouts every day. We were working on a project when I was at Sandia for Pakistan on alternative energy grids.
[00:16:06] So when you talk about the implementation of technology, that's great. But when you're talking about the sociopolitical and economic construct, that's a whole other angle of the problem. And when you're talking about being in the middle of a war and trying to implement this technology, that's also a huge challenge, because now you've got ISIS, you've got Al-Qaeda, and you've got these terror groups that want to do harm to Americans forward deployed around the world, not only in the DOD but also civilians and other folks in the Corps of Engineers and things of that nature. They're in harm's way as well, and they're trying to engineer and help these folks with these problems.
[00:16:42] But again, there are a lot of existential things that are driving the way in which the project either succeeds or fails.
[00:16:48] Kyle King: So if we bring that back to emergency management or crisis management as I tend to call it in the international context. So what is the impact here in terms of the future of this field and the work that's being done?
[00:17:01] As you said, we're leaning into technology and we've got greater dependencies. What does the future hold for us in your view?
[00:17:07] CJ Unis: Yeah, I think it's critically important that we understand, number one, the complexity of the problem that we're trying to address. We need to look through a systems lens when it comes to emergency management.
[00:17:16] We have to get back to the basics, but the problem has to be framed. I think a lot of it comes back to training. Training, exercising, and education. Because a lot of these things that I'm talking about are relatively new to a lot of people in the emergency management field. There's a handful of people that understand the complexity, but when you're talking about the practical application of systems thinking, being able to frame that problem on a whiteboard, or walking a first responder or an emergency manager through it, a lot of these concepts are relatively new to them. So I believe education is gonna be key here moving forward: getting this out to the masses and doing some type of formalized training where we can have a baseline, with guys like you getting the word out on what this systems thinking is. It's making connections, making interconnections, and taking that to an interdisciplinary level, because quite frankly, I don't care what industry you're in or what you do for a living, we all have the same problems.
[00:18:13] They're all relative. They're all similar. They just exist in a different domain space, and I believe it's critically important that that gets highlighted, because your problems are my problems. Yours might be in emergency management; mine might be in space.
[00:18:25] We have infrastructure problems. We have systems problems. We have communications problems. We have policy problems. We have funding problems. We have bureaucratic issues that we deal with. I guarantee you, you're dealing with all the same things that I'm dealing with. It just happens to exist in a different domain space.
[00:18:41] Kyle King: That reflects a lot of what we've seen in our international work as well. While the US legislative frameworks, response plans, and essential functions are all very contextually specific,
[00:18:53] there's a lot of logic that you can take out of that when you work with other nations and say, you probably should identify some critical services, or you should probably have a continuity of government plan and continuity of operations plans and things like that. So we can extract a lot from that. But I often think about the community level: what can emergency managers do at a community level, and how does that relate to the national level? So what can they do specifically to integrate systems thinking? We talked about education, but there's also the issue of resources, right?
[00:19:23] And so there's the issue of that upwards management, that we need resource allocation to be able to explore these things, to workshop these ideas and to be able to educate those in our communities, including the responders and the local authorities. What is your view on being able to get the resources needed to do something like this?
[00:19:41] CJ Unis: I believe a lot of the resources are already there, Kyle. I think we need to look at the problem differently, because a lot of the infrastructure and the resources are already there. A lot of the policies that are being created are custom tailored to emergency management.
[00:19:58] My insight into this is that a lot of them have not been integrated. A lot of them have not been vocalized to other response entities or response mechanisms. It's like, well, that policy is really great. Could that apply over here? Oh, I really never thought about it from that angle. But yeah, it actually could.
[00:20:15] So a lot of the stuff that's already been done is there, and a lot of these things are sitting on the shelf quite frankly. So it's a matter of dusting off some of those EOPs and EAPs to get folks to go, huh, I never thought about that policy in this light, but I can see where it would be applicable to the things that you're doing and getting folks from strategic planning and operations and hazard mitigation to start talking to each other. Because a lot of times in emergency management, we don't have time to sit down and have some really solid conversations about what's going on and why it's important and why that policy is a part of this, or why that policy is going to hurt some of the things that I might be doing.
[00:20:57] I don't know if that's vocalized enough in state, local, and tribal entities. I experienced it when I was in the emergency management world, and I would imagine those things are probably still going on, because a lot of it is foundationally based in communication. Those are improvements that we have the ability to make, or opportunities that we have to do better.
[00:21:16] Kyle King: I was thinking, when you were talking there, you mentioned bringing people together, and I laughed at myself a little bit because it's often that people don't want to hear from the emergency response and emergency management community about, "Hey, we need to update plans and we need to sit together and discuss these things."
[00:21:33] And it's not that we often don't have time, but it's like people don't have time for us. I think that part of the problem is getting stakeholder buy-in.
[00:21:40] CJ Unis: Absolutely, and unfortunately in regular government, folks take a very reactive approach to the way in which they address problems instead of being very proactive about it.
[00:21:49] If you have the ability to look three or four steps forward into the problem, once the problem hits you, you're already prepared for some of the things that you need to do. But when you're reacting to something, you're already behind the eight ball, and that doesn't benefit anybody. And that's why people are trying to play catch up all the time when it comes to emergency management and emergency response-type activities.
[00:22:10] So there are things within this world that we call emergency management and first response that I believe we have the ability to improve upon, things that could definitely benefit from some systems thinking. I've actually beta-tested this on a couple of occasions, and the results have been wildly successful.
[00:22:30] They've actually decreased their response time, increased their understanding, and also increased their ability to execute because now they know what type of resourcing there is and where they're able to go as a result of the systems thinking or looking at the problem from a multitude of different angles.
[00:22:46] Kyle King: Is that something you can elaborate on a little bit to just in terms of the test of what you did?
[00:22:51] CJ Unis: Yeah, so a friend of mine was having some issues implementing a grant to get some new propane heaters and propane cylinders out on the pueblo that she was the emergency manager for. And as a result of putting a systems hat on the way in which she executed on the problem, she was ahead of schedule and under budget, and she was able to help probably 8 to 10 more households as a result of her efficiencies of scale.
[00:23:22] So again, when you start looking at the problem from a systems lens, opportunities open up from a funding perspective and an execution perspective because now you have additional resources with less manpower and less headache than you would have, because you've looked at this problem from a multitude of different angles and you're able to create efficiencies of scale.
[00:23:43] Kyle King: Yeah, that's really interesting. I can only say that I endorse the idea of thinking outside the box with this, because in our work we encounter things almost daily that we just never thought about. It's forcing us to rethink the entire way that we do our work and the things that we advise on.
[00:23:58] And so I think it's really important that people look into this, especially if you're in a local authority or jurisdiction, look into this and just take the time to explore the interconnectedness and how things impact your communities and take that systems thinking approach that you're bringing to it.
[00:24:12] CJ Unis: Yeah.
[00:24:12] Kyle King: [If] somebody wants to get in touch with you. How can they reach you?
[00:24:15] CJ Unis: They can reach out to me on LinkedIn; my profile is there. [I've] had a lot of interesting posts as of late. A lot of it had to do with the recent train derailment, which was a big one that had a lot of engagement, talking about systems. I'm available through LinkedIn mostly, and if you wanna reach out to me there, I'm happy to have a conversation.
[00:24:32] Again, this is a one team, one fight mentality. Getting these ideas out there, I believe, is critically important to driving the way in which we deal with these problems, and looking at them differently and executing upon them differently is critical as well.
[00:24:46] Kyle King: All right, CJ, thanks for joining us.
[00:24:47] [I] really appreciate the conversation and well done in terms of bringing up this subject for the emergency management community. I think it's incredibly important, especially as we keep advancing in technology and the way our societies operate. So thanks for joining us.
[00:25:00] Thanks again, and we'll include everything in the show notes in case you wanna be in touch with CJ and get some insights in terms of maybe improving your local authorities and how you respond as well.
[00:25:09] So thanks CJ.
[00:25:10] CJ Unis: Thanks a lot, Kyle.