Computational Modeling of a Large-Scale Casualty Disaster

Moving beyond table-top exercises, this LaSER sub-project addresses the need for a robust computer model capable of capturing and informing the public health response to high-casualty catastrophic events. To meet this need, our interdisciplinary team has developed a computer simulation platform named PLAN C (Planning with Large Agent-Networks against Catastrophes).

PLAN C provides a powerful computational reasoning and analysis platform that helps policy makers consider a wide range of parameters, many different objective functions, and the effects of several concomitant catastrophes. While existing models address specific components of casualty events, responses, and outcomes, we are not aware of any model with the capabilities of PLAN C: the simulation of a large, complex environment, scalability to the eventuality of 1,000,000 casualties, and statistical outcome data at the medical, emergency-responder, and community levels.

PLAN C offers several integral capabilities, described below.

PLAN C has successfully modeled and analyzed a 1998 Brazilian food-poisoning incident in which a biological agent caused over 8,000 casualties and 16 fatalities. The model captures the dynamics of the interaction between people and hospitals in the presence of different communication channels, in different initial scenarios, and under different triage policies. The results were presented in the paper "Multi-Agent Modeling and Analysis of the Brazilian Food-Poisoning Scenario" (by Mysore V., Gill O., Daruwala R.S., Antoniotti M., Saraswat V. and Mishra B.) at the Agent 2005 Conference on Generative Social Processes, Models and Mechanisms in Chicago.
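The person-hospital interaction described above can be sketched as a minimal agent-based loop. The class names, health scale, deterioration and treatment rates, and the two triage policies below are illustrative assumptions for exposition, not PLAN C's actual implementation.

```python
import random

class Person:
    """A casualty agent with health on a 0.0 (dead) .. 1.0 (healthy) scale."""
    def __init__(self, health):
        self.health = health

    def step(self):
        # Untreated casualties deteriorate a little each tick (assumed rate).
        self.health = max(0.0, self.health - 0.05)

class Hospital:
    """A resource agent that treats a limited number of patients per tick."""
    def __init__(self, capacity, triage):
        self.capacity = capacity
        self.triage = triage  # policy: a key function ranking the queue
        self.queue = []

    def admit(self, person):
        self.queue.append(person)

    def step(self):
        # Treat the highest-priority patients under the current triage policy.
        self.queue.sort(key=self.triage)
        treated, self.queue = self.queue[:self.capacity], self.queue[self.capacity:]
        for p in treated:
            p.health = min(1.0, p.health + 0.2)

# Two simple triage policies: sickest-first vs. most-salvageable-first.
worst_first = lambda p: p.health
best_first = lambda p: -p.health

random.seed(1)
people = [Person(random.uniform(0.2, 0.9)) for _ in range(100)]
hospital = Hospital(capacity=10, triage=worst_first)
for p in people:
    hospital.admit(p)
for tick in range(20):
    for p in people:
        p.step()
    hospital.step()
```

Swapping `worst_first` for `best_first` in the constructor is all it takes to compare triage policies on the same population, which is the kind of what-if comparison the scenario analysis performs at much larger scale.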

Beyond accurately modeling real-world events (like the Brazilian food-poisoning scenario), PLAN C can construct hypothetical complex disaster-scenario models. Two such models have been created (and others are in progress) using New York City as a backdrop. One considers a sarin gas attack in Manhattan (a scenario specific to a Port Authority Bus Terminal incident). The result of the interaction of more than 1,000 agents is analyzed through repeated simulation and parameter sweeps in the paper "Agent Modeling of a Sarin Attack in Manhattan" (by Mysore V., Narzisi G., Nelson L., Rekow D., Triola M., Shapiro A., Coleman C., Gill O., Daruwala R. S. and Mishra B.), published and presented at the First International Workshop on Agent Technology for Disaster Management (ATDM) in Hakodate, Japan. A second scenario simulates a smallpox outbreak in Manhattan and enables simulation of divergent disease progression, cross-contamination, and secondary exposure in sub-populations based on vaccination status.
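A repeated-simulation parameter sweep of the kind used in that analysis can be sketched as follows. The parameter names, the toy `run_simulation` stand-in, and the statistics collected are all hypothetical; a real sweep would invoke the full agent-based model at each configuration.

```python
import itertools
import random
import statistics

def run_simulation(num_agents, dispersal_rate, seed):
    """Toy stand-in for one stochastic simulation run; returns a fatality count."""
    rng = random.Random(seed)
    exposed = int(num_agents * dispersal_rate)
    # Assumed 10% fatality probability per exposed agent, for illustration only.
    return sum(1 for _ in range(exposed) if rng.random() < 0.1)

agent_counts = [500, 1000]
dispersal_rates = [0.1, 0.3]
results = {}
for n, d in itertools.product(agent_counts, dispersal_rates):
    # Repeat each configuration with different seeds to average out
    # stochastic variation, then summarize the outcome distribution.
    fatalities = [run_simulation(n, d, seed) for seed in range(30)]
    results[(n, d)] = (statistics.mean(fatalities), statistics.stdev(fatalities))
```

The resulting table of means and standard deviations per configuration is the raw material for the kind of outcome statistics reported in the sarin study.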

PLAN C's development by LaSER's multidisciplinary team ensures that its application to the study of a disaster or disease outbreak is reality-based. Medical, public health, legal, mathematical, computational, and other experts model inputs ranging from EMS triage to risk communication in a way that is consistent with real practice and data. As mentioned, PLAN C's optimization tool allows for probing a variety of emergency plans and determining the optimal allocation of resources. LaSER's expert team considers the ethical and legal dimensions of these plans, allowing for study, using PLAN C, of difficult questions that arise in the allocation of limited resources in a disaster or pandemic. For example, the implications of varying guidelines governing ventilator allocation can be studied. PLAN C is scalable, in the sense that it can be applied not only in New York City but also in smaller environments. Furthermore, any city's street map and public transportation system can easily be integrated into PLAN C via publicly available GIS data.

This tool is also built to optimize multiple objective functions, considering the best outcomes in categories such as total number of casualties, population unhealthiness, fairness, economic impact, legal consequences, and others. By considering these objective functions, PLAN C automatically generates a Pareto-optimal set of response plans. The preliminary results of these theoretical and experimental approaches are presented and analyzed in the paper "Multi-Objective Evolutionary Optimization of Agent Based Models: an application to emergency response planning", presented at the IASTED International Conference on Computational Intelligence (CI 2006), November 20-22, 2006, San Francisco, California, USA, and in "Complexities, Catastrophes and Cities: Unraveling Emergency Dynamics" (by Narzisi G., Mysore V., Nelson L., Rekow D., Triola M., Halcomb L., Portelli I., and Mishra B.), International Conference on Complex Systems (ICCS 2006), Boston, MA, USA, June 25-30, 2006.
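The core Pareto-filtering idea can be illustrated with a short sketch. The objective names and plan scores below are invented for illustration, and PLAN C's actual evolutionary optimizer is considerably more involved than this brute-force filter.

```python
def dominates(a, b):
    """Plan a dominates plan b if it is no worse on every objective
    (all objectives minimized) and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    """Return the plans not dominated by any other plan."""
    return [p for p in plans
            if not any(dominates(q["objectives"], p["objectives"])
                       for q in plans if q is not p)]

# Each candidate plan scored on (casualties, economic impact, unfairness),
# all to be minimized; the numbers are made up for this example.
plans = [
    {"name": "plan A", "objectives": (120, 5.0, 0.3)},
    {"name": "plan B", "objectives": (150, 2.0, 0.2)},
    {"name": "plan C1", "objectives": (130, 5.5, 0.4)},  # dominated by plan A
]
front = pareto_front(plans)
```

Neither plan A nor plan B dominates the other (A has fewer casualties, B less economic impact), so both survive the filter; the trade-off between them is exactly what the Pareto-optimal set hands to a decision maker.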

The modeling team also focused their work on the incorporation of specific subpopulations of person agents, reflecting the existence of individuals with specific defining characteristics and needs, and their interactions with the available resources. The team's work on vulnerable populations is described in the paper by Narzisi G., Mincer J., Smith S., and Mishra B. (2007), "Resilience in the Face of Disaster: Accounting for Varying Disaster Magnitudes, Resource Topologies, and Sub Population Distributions in Plan C Emergency Planning Tool" (accepted at the Industrial Applications of Holonic and Multi-Agent Systems, HoloMAS Conference 2007, Regensburg, Germany). In PLAN C, the performance of these subpopulations is compared in both point-source attack and distributed disaster scenarios of different magnitudes. Specific health "recovery points" can be derived for both total and sub-populations, estimating the duration of a response system's or city's vulnerability. The effect of varying topologies of available resources, i.e., different hospital maps, provides particular insight into the dynamics that can emerge in this complex system. PLAN C produces interesting emergent behavior that is often consistent with the emergency-medicine literature on previous events.
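A health "recovery point" of the kind described above might be derived from a simulated population-health time series along these lines; the baseline, tolerance, and toy trace are assumptions for illustration, not PLAN C outputs.

```python
def recovery_point(health_series, baseline, tolerance=0.05):
    """Return the index of the first post-nadir tick at which mean population
    health is back within tolerance of the pre-event baseline, or None if
    the population has not yet recovered by the end of the series."""
    nadir = health_series.index(min(health_series))
    for t in range(nadir, len(health_series)):
        if health_series[t] >= baseline - tolerance:
            return t
    return None

# Toy trace: mean health drops after an event at t=2, then recovers.
series = [0.95, 0.95, 0.60, 0.55, 0.70, 0.85, 0.92, 0.94]
rp = recovery_point(series, baseline=0.95)
```

Computing the same quantity per subpopulation (rather than over the whole trace) is what makes it possible to see that one group's vulnerability can outlast the total population's apparent recovery.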

We are also determined to go beyond the current theoretical framework and bridge the theory-practice gap by refining the scenarios already built on this platform and focusing thoroughly on the medical component. Thus, we are directing our efforts towards thorough sensitivity testing and validation. This process has already begun with the chemical (sarin) simulations, where upgrades to the systems in use have been made to reflect real lethal dosages, exposure times, etc. To this effect, we have been invited by the North American Congress of Clinical Toxicology (NACCT) to present our abstract entitled “A New Approach to Multi-Hazard Modeling and Simulation” in New Orleans, Louisiana, USA.

The modeling team has now internally validated the sarin scenario and is focusing on external validation of the platform. We hope to achieve this by the end of this year. We have also begun implementing a substantially larger biological model capable of handling epidemic and pandemic scenarios. This requires more computing power, and we are looking at using parallel computing clusters at NYU Courant Bioinformatics to achieve these final milestones.

Future investigations will include: