Glossary of Terms

Pain points:

A problem, real or perceived, that can be turned into actionable insights.

Participants:

People who use the services provided by a particular program. The term is often used interchangeably with program users and clients.

Participant-centered design process:

An approach that involves users more deeply in the process as co-designers, empowering them to propose and generate design alternatives themselves. It supports diverse ways of thinking, planning, and acting, making work, technologies, and social institutions more responsive to human needs.

Participant interview:

A conversation conducted with the potential users of a design to gather information on users’ feelings, motivations, and daily routines, or on how they use various products, with the specific purpose of informing a design project.

Participant selection:

The total group of individuals from which the sample might be drawn.

Participatory design methods:

An approach where all stakeholders are involved in the design process.

Participatory Evaluation:

An evaluation in which managers, implementing staff and beneficiaries work together to choose a research design, collect data, and report findings.

Patient-centered approach:

This unique model of care aims to improve access, population disease control, patient self-management, and the role of primary care physicians as coordinators and managers of care.

Patient-Centered Outcomes:

A practice that focuses attention on patients’ beliefs, preferences, and needs, in contrast to physician-centered care.

Patient experience:

Encompasses the range of interactions that patients have with the health care system, including their care from health plans, and from doctors, nurses, and staff in hospitals, physician practices, and other health care facilities.

Patient:

A person under health care.

Peer researchers:

Members of the research target group who adopt the role of active researchers, interviewing their peer group about their experiences.

Performance Indicator:

A particular characteristic or dimension used to measure intended changes. Performance indicators are used to observe progress and to measure actual results compared to expected results.

Performance Management:

Systematic process of collecting and analyzing performance data to track progress towards planned results to improve resource allocation, implementation, and results.

Performance Measurement:

Ways to objectively measure the degree of success that a program has had in achieving its stated objectives, goals, and planned program activities.

Physiology:

The science of the functions of living organisms and their parts.

Pilot study:

A small-scale preliminary study conducted to evaluate feasibility, time, cost, and adverse events, and to improve upon the study design prior to a full-scale research project.

Pilot project:

A small scale implementation of a program over a set period of time. Pilot projects use robust monitoring and evaluation processes to test whether a hypothesis or model is effective at addressing a specific issue and to determine how the model works in the real world.

Place-based approach:

Focuses on the social and physical environment of a community and on better integrated and more accessible service systems, rather than focusing principally on the problems faced by individuals.

Positive deviants:

Individuals or groups in a community whose uncommon behaviors and strategies enable them to find better solutions to problems than their peers, despite having access to the same resources and facing similar or worse challenges.

Pragmatic Intervention:

Scalable approaches based on practical considerations.

Preconditions for success:

Conditions that must be brought into the desired state before an effort can succeed.

Primary Data:

Information collected directly by the researcher (or assistants), rather than culled from secondary sources (data collected by others). In program evaluation, it refers to the information gathered directly by an evaluator to inform an evaluation.

Primary qualitative research:

Research collected firsthand by the researcher for a specific research purpose. It involves going directly to a source to ask questions and gather information.

Problem solving framework:

A method, tool, or concept to work through a problem and reach a solution.

Problem statement:

A guiding statement that provides a focus on the specific needs that are uncovered in the design process.

Process:

The programmed, sequenced set of things actually done to carry out a program or project.

Process Evaluation:

An assessment conducted during the implementation of a program to determine if the program is likely to reach its objectives by assessing whether or not it is reaching its intended beneficiaries (coverage) and providing the intended services using appropriate means (processes).

Processing capabilities:

The ability to sift through available data to eliminate useless, irrelevant, or incorrect information and to put the data into a logical order.

Product adjustments:

An adjustment made to an existing product, usually for greater appeal or functionality. A modification may include a change to a product’s shape, adding a feature, or improving its performance.

Product experiences (user experience):

All aspects of the end-user’s interaction with the company, its services, and its products.
It focuses on having a deep understanding of users, what they need, what they value, their abilities, and also their limitations.

Product requirements:

A document that clearly and unambiguously articulates the product’s purpose, features, functionality, and behavior.

Program:

A set of interventions, activities, or projects that are typically implemented by several parties over a specified period of time and may cut across sectors, themes, and/or geographic areas.

Program delivery:

A process for monitoring the achievement of the desired outputs as well as intended outcomes.

Program Evaluation:

Evaluation of a set of interventions designed to attain specific global, regional, country, or sector development objectives. A program is a time-bound intervention involving multiple activities that may cut across sectors, themes and/or geographic areas.

Project:

A discrete activity (or ‘development intervention’) implemented by a defined set of implementers and designed to achieve specific objectives within specified resources and implementation schedules. A set of projects makes up the portfolio of a program. Related terms: activity, intervention.

Project Appraisal:

A comprehensive and systematic review of all aspects of the project — technical, financial, economic, social, institutional, environmental — to determine whether an investment should go ahead.

Project Evaluation:

An evaluation of a discrete activity designed to achieve specific objectives within specified resources and implementation schedules, often within the framework of a broader program.

Proof-of-concept protocol:

A demonstration, the purpose of which is to verify that certain concepts or theories have the potential for real-world application.

Proof points:

Qualitative or quantitative evidence that helps organizations understand how and what customers will measure to analyze the benefit of a product.

Protocol:

A document that describes the background, rationale, objectives, design, methodology, statistical considerations, and organization of a clinical research project.

Prototype:

A test or preliminary model of an idea, design, process, interface, technology, product, service, or creative work.

Prototype characteristics:

An easily modified and extensible model (representation, simulation, or demonstration).

Provider bias:

Associations outside conscious awareness that lead to a negative evaluation of a person on the basis of irrelevant characteristics such as race or gender.

Proximity to the field:

Immersion by a multidisciplinary research team, allowing for immediate feedback.

Purpose statement:

A statement of “why” a study is being conducted, or the goal of a study.