Optimal Information Collection Policies in a Markov Decision Process Framework.
ABSTRACT:
BACKGROUND: The cost-effectiveness and value of additional information about a health technology or program may change over time because of trends affecting patient cohorts and/or the intervention. It may be optimal to delay information collection even for parameters that do not change over time.
METHODS: We present a stochastic dynamic programming approach to simultaneously identify the optimal intervention and information collection policies. We use our framework to evaluate birth cohort hepatitis C virus (HCV) screening. We focus on how the presence of a time-varying parameter (HCV prevalence) affects the optimal information collection policy for a parameter assumed constant across birth cohorts: liver fibrosis stage distribution for screen-detected diagnosis at age 50.
RESULTS: We prove that it may be optimal to delay information collection until a time when the information more immediately affects decision making. For the example of HCV screening, given initial beliefs, the optimal policy (at 2010) was to continue screening and collect information about the distribution of liver fibrosis at screen-detected diagnosis in 12 years, increasing the expected incremental net monetary benefit (INMB) by $169.5 million compared with current guidelines.
CONCLUSIONS: The option to delay information collection until the information is sufficiently likely to influence decisions can increase efficiency. A dynamic programming framework enables an assessment of the marginal value of information and determines the optimal policy, including when and how much information to collect.
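The core idea — choosing each period among continuing the intervention, stopping, or paying to collect information whose value depends on current beliefs — can be sketched as a small finite-horizon dynamic program. Everything below (the 5-period horizon, the payoff numbers, the two-point resolution of uncertainty, and the function names) is a hypothetical illustration for intuition, not the authors' HCV screening model:

```python
# Hypothetical sketch: a finite-horizon dynamic program over a scalar
# belief p = P(the intervention is cost-effective). Each period the
# decision maker screens, stops, or pays for a study that fully
# resolves the uncertainty. All parameter values are illustrative only.
from functools import lru_cache

T = 5                  # planning horizon, in periods (hypothetical)
INMB_IF_GOOD = 100.0   # per-period INMB if screening is cost-effective
INMB_IF_BAD = -90.0    # per-period INMB if it is not
INFO_COST = 10.0       # one-time cost of the information-collection study

def expected_inmb(p):
    """Expected per-period INMB of screening under belief p."""
    return p * INMB_IF_GOOD + (1 - p) * INMB_IF_BAD

@lru_cache(maxsize=None)
def value(t, p):
    """Optimal expected net monetary benefit from period t onward."""
    if t == T:
        return 0.0
    # Action 1: screen this period; belief is unchanged (no new data).
    screen = expected_inmb(p) + value(t + 1, p)
    # Action 2: do not screen this period; belief is unchanged.
    stop = value(t + 1, p)
    # Action 3: pay for a study that resolves the uncertainty; the
    # belief collapses to 1 or 0 and we act optimally thereafter.
    collect = (-INFO_COST
               + p * value(t + 1, 1.0)
               + (1 - p) * value(t + 1, 0.0))
    return max(screen, stop, collect)
```

With these illustrative numbers, `value(0, 0.5)` exceeds both screening and stopping without information, so collecting information is optimal at intermediate beliefs; the paper's contribution is that, with time-varying parameters such as HCV prevalence, the *timing* of that collection also becomes a choice variable, and delaying it can be optimal.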
SUBMITTER: Cipriano LE
PROVIDER: S-EPMC6690493 | biostudies-literature | 2018 Oct
REPOSITORIES: biostudies-literature