But what the movie got wrong was how it portrayed the public health community’s access to critical information about the spread of the disease. In one scene, a scientist from the World Health Organization goes to a computer and pulls up real-time reports on the disease’s spread across the U.S. and the world. If you’re to believe the movie, these scientists had immediate access to information about the pandemic, including the infection rate, the mortality rate, and the geographic spread and area of impact. While CDC has some pretty exciting visualization systems and an ultra-modern command center, the way data are gathered in the real world looks very different from what was on screen.
We know how it really works because we experienced it during the H1N1 outbreak two years ago. H1N1 was nowhere near as dramatic as what’s portrayed in Contagion, but the outbreak did reveal the very real limitations of our public health information infrastructure. Contagion implied that daily information from across the country was flowing in real time into a central database, where it was analyzed by scientists and used to brief the administration and the general public.
What really happened with H1N1 is that some states and localities gathered and submitted data manually on paper while others used computers. The data were largely non-standard, which meant a tremendous effort was required to transform them into usable formats. And when the data were compiled across multiple different systems, additional manual work was needed to integrate them into information that CDC and state and local health officials could use to give the public the information it demanded.
So why is public health so far behind the technology curve? It’s a great question, and in a follow-up post I’ll talk more about the reasons why public health is where it is today and the work being undertaken to build a 21st century information infrastructure.