The Informatics Academy: People, process and technology, part II

In this three-part blog series, Sarah Gilbert, Director of PHII’s Informatics Academy, explores how the three focus areas of informatics – people, process, and technology – illustrate the Academy’s approach to training the public health workforce.

At PHII’s Informatics Academy, we produce solutions that are realistic, practical and tailored to the training need. To ensure people are able to do, not just know, we ground our workforce development projects in real-world application and performance improvement. We take a structured, rigorous approach to every Academy project, and we think about our process in two ways: the models that provide a framework for our offerings, and the project methodology that defines how we approach each learning project.


Training Models

Our training solutions are structured using two models:

The Applied Public Health Informatics Competency Model focuses on individual skills and abilities. This model contains eight core competencies: standards and interoperability; project management; information systems; policy; communications; evaluation; principles and strategies; and analysis, visualization, and reporting. Our offerings include clear learning outcomes that tie back to the Competency Model and help practitioners connect theory to practice.

The Informatics Savvy Health Department model focuses on organizational capacity. PHII’s informatics-savvy health department tool helps health departments assess the organizational capabilities they need for informatics.

These models guide solutions that help public health practitioners develop skills to improve practice – both individually and as part of their organization.

Project Methodology

Along with structured projects comes a rigorous, detailed approach to project management that ensures we meet our ultimate goal: creating solutions that improve public health practice. Every project initiation phase includes a needs assessment conducted with the target audience to understand the training need. We build in evaluation from the beginning by identifying specific, measurable performance targets that are incorporated into the solution and take shape during the design and development phases. Evaluation for every Academy project must first answer two questions:

1. How is it applicable to the work?

2. Will it be useful in practice?

These questions, along with others, guide us to produce solutions that are realistic and practical and that improve performance.

With these goals in mind and the audience needs identified, we employ an iterative project process called the Successive Approximation Model (SAM). During the design phase, we cycle through design, prototyping and review until the project is ready to move into development. Iteration continues in the development phase with testing the solution and evaluating outcomes. What we learn during pilot testing or evaluation may send us back to the design phase to refine the solution so it better meets the training need. Although we aim to run efficient, well-controlled projects, the quality of each solution is always the most important consideration.

Process is integral to what we do. Our models provide a framework for offerings that lead to individual and organizational development. Through developing communities of practice, we connect practitioners looking to build competency in the same area and provide them with the support needed for social learning. Applying rigorous yet flexible processes to manage our projects ensures we create solutions that meet the needs of public health practitioners.