
Check effect of multiple interventions #187

Closed

pratikunterwegs opened this issue Feb 28, 2024 · 4 comments
Labels
Testing Related to package tests and quality control


@pratikunterwegs
Collaborator

This issue references the new "Modelling scenarios" vignette in PR #176 and requests a re-check of overlapping intervention functionality (see also the "Modelling multiple interventions" vignette), including the addition of tests for statistical correctness, i.e. that multiple interventions reduce infections more than single or no interventions.

Because parameters are held the same across scenarios (if I've understood correctly), it would be useful to plot the impact of each control measure. Given identical parameters, we should have total cases with no intervention >= total cases with intervention X, etc.

Actually, this made me notice that the scenarios don't always behave transitively for the same parameter values, as we'd expect (i.e. cases averted with school closure + workplace closure > cases averted with either one alone):

```
   param_set scenario  difference
       <int>    <int>       <num>
1:         1        2    50453.43
2:         1        3   673484.63
3:         1        4   230256.17
4:         1        5  4723950.45
5:         1        6   189674.39
```
All the differences are positive (i.e. fewer cases than in the 'no intervention' scenario), which makes sense. But I'm not sure why multiple closures avert fewer cases than a single one?

Originally posted by @adamkucharski in #176 (comment)

@pratikunterwegs
Collaborator Author

Hi @adamkucharski - just looking into this issue raised from the vignette in #176. What seems to be happening is that the interventions do suppress new infections while active, but when $R$ is relatively high, $\sim 1.7$ (from {EpiEstim}), the later stages of the epidemic see many new infections, whereas this doesn't happen when $R$ is lower ($\sim 1.3$, the model default). This leads to changes in the relative differences in final sizes.

I would also expect transitivity, so this surprises me too - any thoughts on whether this is seen in other epidemic models? My main concern is that I'm not sure whether this effect of $R$ should be expected in age-stratified models, or whether something is wrong. I've added tests to check that the cumulative effect of interventions is greater than that of single interventions, but note that this is for the lower $R \sim 1.3$.

[Figure Rplot01: new infections over time in each scenario, for different values of $R$ (i.e. $\beta$).]

[Figure Rplot02: final size per scenario, for different values of $R$ (i.e. $\beta$).]
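For intuition on why final sizes are so sensitive to $R$ in this range, here is a sketch solving the classic homogeneous-mixing SIR final-size relation $z = 1 - e^{-R z}$ numerically. This is a generic single-population illustration in Python, not the package's age-stratified R model; the two values of $R$ are just the ones discussed above.

```python
import math

# Classic (homogeneous-mixing) SIR final-size relation: z = 1 - exp(-R * z).
# Solved by fixed-point iteration, which converges for R > 1 from a positive guess.
def attack_rate(r, iters=200):
    z = 0.5  # initial guess for the final epidemic size (fraction ever infected)
    for _ in range(iters):
        z = 1 - math.exp(-r * z)
    return z

for r in (1.3, 1.7):
    print(f"R = {r}: final size ~ {attack_rate(r):.2f} of the population")
```

Moving $R$ from 1.3 to 1.7 raises the expected attack rate from roughly 42% to roughly 69% in this toy relation, so modest changes in $R$ can substantially reshuffle relative differences between scenarios.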

@adamkucharski
Member

Looks like a 'flatten the curve' dynamic is happening, where individual interventions reduce the difference between the final size and the herd immunity threshold, and hence alter the dynamics of a resurgence in a non-linear way. I discuss this more in this post: https://kucharski.substack.com/p/disrupted-dynamics

It's also been observed for epidemics like H1N1, where school holidays reduced the final size (so a bigger but briefer intervention may have led to a larger outbreak): https://pubmed.ncbi.nlm.nih.gov/24230961/

One quick check on the underlying transmission process would be to keep the interventions in place for the rest of the simulation period, and check that the effects are then indeed transitive (because there is no potential for resurgence).

Basically, the above shows the value of outputting a plot of the dynamics (ideally with multiple sampled trajectories rather than just a summary statistic), so the user can build some intuition alongside the results. It might also be worth showing this as an example of how time-limited interventions can do counterintuitive things?
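The resurgence mechanism, and the suggested keep-the-intervention-active check, can be illustrated with a toy deterministic SIR model. This is a hypothetical Python sketch, not the package code or the vignette's setup: all parameter values ($R = 1.7$, recovery rate 0.1/day, an intervention window of days 40-200, 20% vs 70% reductions in transmission) are illustrative assumptions.

```python
# Toy deterministic SIR (Euler steps) with a single intervention that scales
# transmission down by `effect` between `start` and `end` (in days).
def final_size(r0, effect, start, end, gamma=0.1, days=1000, dt=0.1):
    beta0 = r0 * gamma
    s, i, r = 0.999, 0.001, 0.0  # susceptible, infectious, recovered fractions
    for step in range(int(days / dt)):
        t = step * dt
        beta = beta0 * (1 - effect) if start <= t < end else beta0
        new_inf = beta * s * i * dt
        rec = gamma * i * dt
        s -= new_inf
        i += new_inf - rec
        r += rec
    return r  # cumulative fraction ever infected

r0 = 1.7  # the higher R discussed above

# Time-limited interventions (days 40-200): the stronger one suppresses
# transmission while active, but leaves more susceptibles behind for a
# resurgence, so it can end with MORE total cases than the weaker one.
weak_tmp = final_size(r0, effect=0.2, start=40, end=200)
strong_tmp = final_size(r0, effect=0.7, start=40, end=200)

# Interventions kept active to the end of the simulation: no resurgence is
# possible, and the expected ordering (stronger => fewer cases) is restored.
weak_perm = final_size(r0, effect=0.2, start=40, end=1000)
strong_perm = final_size(r0, effect=0.7, start=40, end=1000)

print(f"temporary: weak={weak_tmp:.2f}, strong={strong_tmp:.2f}")
print(f"permanent: weak={weak_perm:.2f}, strong={strong_perm:.2f}")
```

Under these assumed values the temporary ordering inverts while the permanent ordering does not, matching the no-potential-for-resurgence check suggested above.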

@pratikunterwegs
Collaborator Author

Thanks - I expected that interventions would shift cases in time, but not that they could lead to larger final sizes. I'll be sure to include this per your final suggestion.

> One quick check on the underlying transmission process would be to include intervention for the rest of the simulation period, to check indeed transitive (because no potential for resurgence).

Will add this to the tests and perhaps also to this vignette if it seems to fit.

@pratikunterwegs
Collaborator Author

Closing as fixed in #176.
