
September 08, 2012

Simulation-Based Science

[This is cross-posted from the original entry on the Z Shift corporate blog.]

This is adapted and extended from ideas I first put forth within a keynote delivered at MODSIM World Canada in Montréal in June 2010.

On the Navajo Indian Reservation in northern Arizona is a place called Long House Valley. This valley was inhabited by a people known as the Anasazi from 1800 BC to 1300 AD, at which point they abandoned it for reasons that remain unclear (1). This abandonment is a mystery that archaeologists have been studying for decades. Why did the Anasazi disappear? There's no written record, and no obvious catastrophe -- no meteors, no volcanoes, no anachronistic herds of saber-toothed cats. Traditional archaeological approaches haven't yielded a definitive answer. We know what killed the dinosaurs 65 million years ago, but we don't know why the Anasazi left Long House Valley 700 years ago.

We can't go back and observe the Anasazi directly. But what if we could simulate their society in an attempt to understand what happened? As it happens, many researchers have done just that.

Long House Valley has been described as "one of the icon models of the agent-based modeling community" (2). It's relatively small (just 96 square kilometers) and well-bounded. There is a rich paleoenvironmental record that can be used as the basis of simulations. And there's a mystery to be solved. As a result, numerous Anasazi simulations have been built. These simulations typically cover periods of hundreds of years and model everything from family size and composition to population growth, weather patterns, agricultural productivity, and the like.
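The agent-based approach these models take can be illustrated with a toy sketch: households as agents, an annual harvest driven by rainfall, and emigration when food stores run out. Everything here -- the class names, the parameters, the drought window -- is invented for illustration and is not drawn from the published Anasazi models.

```python
import random

class Household:
    """One farming household; an 'agent' in the simulation."""
    def __init__(self, grain=2):
        self.grain = grain  # years of stored food

    def step(self, harvest):
        """Consume one year of food, store any surplus (capped at 3 years)."""
        self.grain = min(self.grain + harvest - 1, 3)
        return self.grain > 0  # False -> household leaves the valley

def simulate(years=300, n_households=200, seed=1):
    random.seed(seed)
    households = [Household() for _ in range(n_households)]
    population = []
    for year in range(years):
        # Crude rainfall-driven harvest: a multi-decade "drought" halves yields.
        drought = 150 <= year < 250
        for h in list(households):
            harvest = random.uniform(0.3, 1.7) * (0.5 if drought else 1.0)
            if not h.step(harvest):
                households.remove(h)  # emigrate when stores run out
        population.append(len(households))
    return population

pop = simulate()
print(f"population before drought: {pop[149]}, after drought onset: {pop[160]}")
```

Even a sketch this crude shows the shape of the real research question: you can watch population track environmental inputs, then ask whether the modeled pressures fully empty the valley or leave a residual population the environment could still have supported.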

What have we learned from these simulations? What they tend to show, consistently, is that environmental factors by themselves don't explain the complete abandonment of Long House Valley. There was a 300-year drought in North America beginning around 1150 AD, and that seems to have contributed to the departure of the Anasazi. There was a drop in water table levels that also contributed. And overfarming seems to have taken its toll. But even with all these factors taken into account, the valley could have supported a smaller population.

In other words, despite all the work that has been done, there is still a significant mystery attached to Long House Valley. But thanks to simulation, the mystery is smaller. It's not why the Anasazi left, but instead, and more precisely, why all the Anasazi left. A logical reason would be social pressures, but that remains to be addressed by future simulations.

Can we ever prove what happened to the Anasazi via simulation? No. Simulation can tell us what might have happened, what likely happened, but it can't tell us definitively what did happen. But knowing what might have happened, what likely happened -- these are valuable in and of themselves. They help us narrow our future efforts. They provide a base upon which future researchers can build. And as agent-based computational simulations improve in quality, and as more and more independently developed simulations return similar results, we can move from likely to probably.

Studying the reasons for the disappearance of the Anasazi probably doesn't hold a great deal of relevance for most people on a daily basis. But it does point to something that is important to many people: the use of simulation to understand things we otherwise can't.

The use of simulation as a scientific research tool is spreading rapidly, from simulations of sociological events like the disappearance of the Anasazi to simulations of galactic formation taking place over tens of millions of years; from simulations of rat brains at a level of detail sufficient to replicate their functioning to simulations of global climate patterns and how they are changing (and might change) as the result of human activity. This presents tremendous opportunities for us to know the previously unknowable. The problem with this new new kind of science (with apologies to Stephen Wolfram) is that simulation has not traditionally been a component of the scientific method.

The first person I heard articulate the problem of integrating simulation into the scientific method was Dr. Rick Satava, Professor in the Department of Surgery at the University of Washington. As he put it in the abstract to his 2005 paper on the subject (3):

The scientific method has been the mainstay of scientific inquiry and clinical practice for nearly a century. A new methodology has been emerging from the scientific (nonmedical) community: the introduction of modeling and simulation as an integral part of the scientific process. Thus, after the hypothesis is proposed and an experiment is designed, modern scientists perform numerous simulations of the experiment. An iterative optimization of the design of the experiment is performed on the computer and is seen in virtual prototyping and virtual testing and evaluation. After this iterative step, when the best design has been refined, the actual experiment is conducted in the laboratory. The value is that the modeling and simulation step saves time and money for conducting the live experiment. The practice of medicine should look to the tools being used by the rest of the scientific community and consider adopting and adapting those new principles.

As Rick says in the talk he gives based on these concepts, "Would you rather run eight iterations of an experiment, or two to the eighth iterations?"

Rick's focus is on medicine, so he sees (if I understand him correctly) simulation as a way to shrink the experimental space when the experiment moves from the virtual to the real. In other words, if we can use simulation to discard thousands or even millions of scenarios, we can focus our limited dollars for expensive real-world experimentation on the most promising possibilities. But what happens when real-world experimentation isn't possible, as in the case of the Anasazi?

The problem isn't limited to just the study of historical events such as those at Long House Valley. Climate change is an area of tremendous interest to researchers right now. Arguably, one of the reasons that humanity has yet to address climate change with a level of seriousness appropriate to the findings of the scientific community is that scientists can't prove beyond a shadow of a doubt whether carbon dioxide levels will continue to rise, or what will happen to the Earth if they do. Proof is difficult to come by when the duration of a real-world experiment stretches beyond weeks and months into decades and even centuries. Or take the challenge of understanding how galaxies form. We have theories about how they do so, and these theories often involve galactic collisions that take place over tens of millions of years. How can we "prove" our theories when the timescale of the experiment is an order of magnitude (or more) longer than human beings have existed?

I believe that we're going to have to come to a new understanding of the scientific method. For domains in which we can validate the results of a simulation using real-world experimentation, Rick's description of Stephen Wolfram's methodology is good: "Build the computer model, add the data from a real world experiment, see if the results match real-world expectations, change the input data to more closely approximate the model, and run the next iteration... until there is concurrence with the evidence of real-world results." But for domains in which real-world validation is impractical for one reason or another, we will have to agree on a standard of evidence that allows us to accept a theory as provisionally proven, as the best evidence available at the time, and move on, knowing that revisions may be necessary as better evidence is developed.
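The iterate-until-concurrence loop can be sketched in a few lines: run the model, measure its disagreement with real-world evidence, adjust a parameter, and repeat until the results concur. The toy growth model and the "observed" data here are invented purely for illustration.

```python
# Minimal sketch of the "iterate until concurrence" loop: calibrate a model
# parameter against real-world observations. The model (simple exponential
# growth) and the observations are made up for this example.

def model(growth_rate, years=10, start=100.0):
    """Toy simulation: a population under a constant annual growth rate."""
    series, pop = [], start
    for _ in range(years):
        pop *= 1 + growth_rate
        series.append(pop)
    return series

def error(simulated, observed):
    """Mean squared discrepancy between simulation and evidence."""
    return sum((s - o) ** 2 for s, o in zip(simulated, observed)) / len(observed)

# Stand-in for real-world evidence (generated with a hidden rate of 0.03).
observed = model(0.03)

# Iterate: run the simulation, compare to evidence, nudge the parameter,
# repeat until the results concur with the real-world data.
rate, step = 0.10, 0.05
for iteration in range(100):
    if error(model(rate), observed) < 1e-6:
        break  # concurrence reached
    # Shrinking pattern search: move toward whichever direction improves.
    if error(model(rate + step), observed) < error(model(rate - step), observed):
        rate += step
    else:
        rate -= step
    step *= 0.7

print(f"calibrated growth rate: {rate:.4f}")
```

The loop structure, not the particular search strategy, is the point: each iteration is cheap compared with a live experiment, which is exactly the "eight versus two-to-the-eighth" trade Rick describes.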

The obvious danger with this approach is that we could find ourselves building scientific houses of cards. Simulations based on inaccurate starting data, inaccurate simulation algorithms, or both, will produce inaccurate results. It is entirely conceivable that disparate researchers around the world could simulate the same events using different techniques and come to such similar results that all concerned accept these results as provisionally proven and begin building upon them -- and yet all of those researchers could be wrong, even profoundly so.

This will be a growing challenge to the simulation and scientific communities for many years to come. But the rewards are too great to not solve the problem. And I'm sure that we are -- collectively -- up to the task.


1. Axtell RL, Epstein JM, Dean JS, Gumerman GJ, et al. Population growth and collapse in a multiagent model of the Kayenta Anasazi in Long House Valley. Proceedings of the National Academy of Sciences. 2002;99:7275-7279. Full text (PDF).

2. Janssen MA. Understanding artificial Anasazi. Journal of Artificial Societies and Social Simulation. 2009;12:13. Full text.

3. Satava RM. The scientific method is dead -- long live the (new) scientific method. Surgical Innovation. 2005;12(2):173-176. Abstract. Full text (PDF).

September 05, 2012

Research Results: Can Simulation Improve Healthcare System Performance?

[This is cross-posted from the original entry on the Z Shift corporate blog.]

Using simulation to improve medical training is a fairly well researched and understood phenomenon at this point. Many articles point to the effectiveness of using simulator-based systems to improve the results of clinician training. (I'm planning on providing an overview of this in a future blog post.) But can simulation be used to improve the systemic performance of clinical care? In other words, can we use simulation to model clinical healthcare systems and then improve their real-world performance based on the results of the simulation? This is one of our focus areas at Z Shift and so it's of great importance to us. Together with my co-author Dr. Robert Szczerba of Lockheed Martin, I hypothesized about this in a paper that won the Best Paper award at I/ITSEC (Interservice/Industry Training, Simulation, and Education Conference) in 2010, "Simulated clinical environments and virtual system-of-systems engineering for health care". (Full text (PDF). Slides (PDF).) But what about results from actual clinical trials?

As it turns out, a number of papers have been written on various aspects of this subject, over a period of decades, and the results have been positive. A brief survey of the available literature revealed the following:

  • A 1989 article published in Annals of Emergency Medicine described a "computer simulation model of emergency department [ED] operations". The authors concluded that "simulation is a potentially useful tool that can help predict the results of changes in the ED system without actually altering it and may have implications for planning, optimizing resources, and improving the efficiency and quality of care". (1)

  • A 2007 study published in Pediatric Emergency Care found that a model of patient flow within a pediatric emergency department based on discrete event simulation "accurately represents patient flow through the department and can provide simulated patient flow information on a variety of scenarios. It can effectively simulate changes to the model and its effects on patient flow". (2)

  • A 2008 article from AORN Journal [Association of periOperative Registered Nurses] described a simulation project "performed to assist with redesign of the surgery department of a large tertiary hospital and to help administrators make the best decisions about relocating, staffing, and equipping the central sterilization department". The authors found that "simulation can facilitate the design of a central sterilization department and improve surgical sterilization operations". (3)

  • An article published in the Journal of Healthcare Management in 2011 discussed the use of discrete event simulation as a "cost-effective way to diagnose inefficiency and create and test strategies for improvement". The authors concluded that while discrete event simulation "is not a cure-all for clinic throughput problems," it can nevertheless "be a strong tool to provide evidentiary guidance for clinic operational redesign". (4)

  • A 2011 article in the Journal of Healthcare Quality described the use of discrete event simulation to "model the operations and forecast future results for four orthopedic surgery practices". The authors concluded that simulation "was found to be a useful tool for process redesign and decision making even prior to building occupancy". (5)

As stated, this was from a relatively brief survey. I suspect that a more thorough search of the available literature would reveal additional similar articles. I should also point out that I haven't cherry-picked these results; all the relevant articles I found had positive conclusions on the question of using simulation to systemically improve healthcare. It's highly encouraging.
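The discrete event simulation technique behind most of these studies can be illustrated with a toy emergency-department model: a time-ordered event queue, random arrivals and treatment times, and a comparison of staffing scenarios. All of the numbers here (arrival rate, treatment time, staffing levels) are invented; only the event-queue structure reflects standard discrete event simulation practice.

```python
import heapq
import random

def simulate_ed(n_doctors, hours=24 * 30, seed=42):
    """Return the mean patient wait (in hours) for a given staffing level."""
    random.seed(seed)
    events = []  # (time, kind) min-heap, processed in time order
    t = 0.0
    while t < hours:  # random arrivals, roughly 2 patients per hour
        t += random.expovariate(2.0)
        heapq.heappush(events, (t, "arrival"))

    free_doctors = n_doctors
    waiting = []   # arrival times of queued patients
    waits = []     # realized waiting times
    while events:
        now, kind = heapq.heappop(events)
        if kind == "arrival":
            waiting.append(now)
        else:  # a treatment finished, freeing a doctor
            free_doctors += 1
        # Start treatment for as many queued patients as doctors allow;
        # treatment takes about an hour on average.
        while free_doctors > 0 and waiting:
            arrived = waiting.pop(0)
            waits.append(now - arrived)
            free_doctors -= 1
            heapq.heappush(events, (now + random.expovariate(1.0), "done"))
    return sum(waits) / len(waits)

# Test a staffing change virtually, before altering the real department --
# the core promise of the ED studies surveyed above.
for doctors in (2, 3, 4):
    print(f"{doctors} doctors -> mean wait {simulate_ed(doctors):.2f} h")
```

Running the same arrival stream through different staffing levels is exactly the kind of "change the ED system without actually altering it" experiment the 1989 Annals article describes, just at toy scale.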


1. Saunders CE, Makens PK, Leblanc LJ. Modeling emergency department operations using advanced computer simulation systems. Annals of Emergency Medicine. 1989 Feb;18(2):134-40. Abstract.

2. Hung GR, Whitehouse SR, O'Neill C, Gray AP, et al. Computer modeling of patient flow in a pediatric emergency department using discrete event simulation. Pediatric Emergency Care. 2007 Jan;23(1):5-10. Abstract.

3. Lin F, Lawley M, Spry C, McCarthy K, et al. Using simulation to design a central sterilization department. AORN Journal. 2008 Oct;88(4):555-67. Abstract.

4. Parks JK, Engblom P, Hamrock E, Satjapot S, et al. Designed to fail: how computer simulation can detect fundamental flaws in clinic flow. Journal of Healthcare Management. 2011 Mar-Apr;56(2):135-44; discussion 145-6. Abstract.

5. Montgomery JB, Linville BA, Slonim AD. Desktop microsimulation: a tool to improve efficiency in the medical office practice. Journal of Healthcare Quality. 2011 Sep 13. doi: 10.1111/j.1945-1474.2011.00166.x. Abstract.

September 03, 2012

Healthcare Simulation in the Post-Mannequin Era

[This, which I co-authored with Dr. Robert Szczerba of Lockheed Martin, is cross-posted from the original entry on my corporate Z Shift blog, where it was in turn posted after originally being published on the Intelligent Hospital Today website.]

In 2010, Apple Inc. CEO Steve Jobs used the words "post-PC era" to describe the impact of new computing devices such as tablet computers. (1) In this description, he was clear that personal computers (PCs) would remain with us for a long time to come. By "post-PC era," Jobs meant the era in which PCs are no longer the focal point of computing for most people.

In a manner similar to the early PC market, the healthcare industry has mainly focused its simulation efforts on interaction with a single technology: the mannequin. When healthcare professionals use the terms "simulation" and "simulation center" to describe how they train clinicians, what they typically mean is "mannequin" and "mannequin center."

Good reasons exist for this. The focus of a clinician's work is, by definition, interacting with patients, and the mannequin is the best technology currently available for the immersive simulation of patients, providing for both input (clinician actions) and output (patient responses) using a variety of mechanisms. Mannequin technology is well-understood, and studies across a variety of clinical task domains have validated the value of mannequin-based training. (2, 3)

Yet with this in mind, we believe that the focus on the mannequin for clinician training is retarding progress in the evolution of healthcare simulation technologies in general, impacting our ability to significantly improve patient safety and the overall quality of care.

We believe that the focus of healthcare simulation can, should, and must switch away from the mannequin and toward the software-based simulation of clinical systems, which integrates virtual patients, clinicians, devices, settings, and processes. Moving our focus away from hardware and toward software will enable more rapid progress in fidelity, functionality, and price-performance.

To illustrate our point, it is useful to examine the history of flight simulation, which serves as the leading example of how training via simulation can improve real-world performance in complex environments.

Four decades ago, creators of flight simulators began to shift their development approach in a way that would profoundly impact the future of aviation. Prior to this time, flight simulators had been based upon simple analog electronics. However, computer developments in the 1960s offered the opportunity to move the task of the simulation of flight from analog electronics to computer software, with the software controlling a physical simulator. This is not dissimilar from how mannequins have worked since the first modern computer-controlled mannequin, Sim One, was introduced in 1967. (4, 5)

After switching to software-based technologies, the next generation of flight simulators used computer graphics to replace the mechanical and video-based systems heretofore used to display terrain to simulator-based pilots. Prior to this change, a typical high-end flight simulator would have included a "model board," a large physical model of terrain, over which a computer-controlled camera would "fly," providing images to the pilots.

While model boards provided a reasonable method of providing imagery given the technology of the time, they were ultimately limiting. Simulators could only "fly" over terrain that had been physically modeled. The model boards took up substantial space and required effort to handle. A model board could only be used by one flight crew in one simulator at a time. The view through the camera was limited by the degrees of freedom provided by the camera control mechanism.

The change to computer-generated graphics removed these limitations and more. Any environment that could be imagined could be drawn algorithmically, from any viewpoint, in a variety of simulated conditions. Further, this development eliminated flight simulation's last remaining dependency upon hardware for user interaction. Although flight simulators continued to make use of physically simulated cockpits and motion control systems, these were no longer essential. A flight simulator could be defined purely in terms of software and interacted with using mass-market input devices such as keyboards and mice. The result of this was that the rate of improvement in the fidelity of flight simulators was limited only by increases in computer processing power, which has grown exponentially since the invention of digital computing. (6)

Around this time, one other development took place that dramatically impacted the evolution of flight simulators. The introduction of personal computers in the late 1970s enabled anyone to develop or extend flight simulator technology. The first amateur flight simulator for a popular personal computer was released in 1980. (7) While it was primitive, within a decade, its successors were capable enough to be certified for pilot training. By the 2000s, entrepreneurs and hobbyists had released tens of thousands of add-ons that extended the functionality of multiple flight simulators.

With this history in mind, we re-examine healthcare simulation.

In flight simulation terms, healthcare simulation is approximately at the model board level of the early 1970s. Mannequins are computer-controlled, but rely on expensive, proprietary hardware for user interaction. The focus of healthcare simulation development is on improving the interactivity and fidelity of mannequins themselves. This would be as if the only focus of flight simulation today were on making better cameras and more realistic-looking model boards.

We do not claim that mannequins are going away. As with PCs in the post-PC era, the post-mannequin era will be one in which mannequins play an important but not central role. In the post-mannequin era, the mannequin will be seen as one of many possible methods of enabling user interaction with rapidly improving simulations of clinical system-of-systems. (8) As a result, we expect to see less investment in proprietary mannequins and associated hardware, and more investment in the research, development, and deployment of software simulations. (9)

Given current practices in the larger software development community and the certain involvement of academia, we expect that most useful and popular software simulations of clinical systems will be partly or fully open. This will move responsibility for the advancement of healthcare simulation away from a relatively small group of for-profit firms and toward the broadest possible community, including healthcare researchers and providers. Experience with prominent open software projects (10) suggests that this will have a dramatic and positive effect on the pace of development and the quality of the results.

The next generation of healthcare simulation has the potential to revolutionize the practice of medicine, from improving the quality of care to reducing overall waste and errors. However, for this "evolution to revolution" to occur, the industry needs to stop thinking in mannequin-centric terms and start thinking in terms of the holistic software simulation of clinical systems, with the mannequin as one piece of a much larger puzzle. To the extent that the healthcare industry adopts this attitude, it will enable dramatic advances in simulation for healthcare, and we will usher in a new era: the post-mannequin era.


1. Steve Jobs at D8: Post-PC era is nigh. CNET News Web site. Full text. June 1, 2010.
2. Issenberg SB, Mcgaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Medical Teacher. 2005;27(1), 10-28. Abstract. Full text (PDF).
3. Murin S, Stollenwerk NS. Simulation in procedural training: at the tipping point. Chest. 2010;137(5):1009-1011. Full text.
4. Denson JS, Abrahamson SM. A computer-controlled patient simulator. JAMA. 1969;208(3):504-508. Abstract.
5. Cooper JB, Taqueti VR. A brief history of the development of mannequin simulators for clinical education and training. Quality and Safety in Health Care. 2004;13(suppl 1):i11-i18. Abstract. Full text (PDF).
6. Kurzweil R. The law of accelerating returns. Kurzweil Accelerating Intelligence website. Full text. March 7, 2001.
7. Flight Simulator history introduction. simFlight Web site. Full text. February 20, 2005.
8. Boosman F, Szczerba RJ. Simulated clinical environments and virtual system-of-systems engineering for health care. Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) 2010. Abstract. Full text (PDF).
9. Kneebone RL. Practice, rehearsal, and performance: an approach for simulation-based surgical and procedure training. JAMA. 2009;302(12):1336-1338. Abstract.
10. Lerner J, Tirole J. Some simple economics of open source. Journal of Industrial Economics. 2002;50(2):197-234. Abstract. Full text (PDF).