
Viral load V grows as dV/dt = rV, which solves to V = V0*exp(rt). The bigger your V0, the faster the initial rate of increase.

The threshold Vmin at which your immune system detects the infection is small. You want the most time possible between reaching Vdetected and reaching Vdanger.
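A minimal sketch of that point, with made-up numbers (growth rate r and the 1e9 danger threshold are illustrative assumptions, not data): under V = V0*exp(rt), the time to reach any fixed threshold shrinks as log(V0) grows, so a bigger initial dose buys the virus a head start.

```python
import math

# Time for viral load to grow from v0 to v_threshold under pure
# exponential growth V(t) = v0 * exp(r * t).
def time_to_threshold(v0, v_threshold, r):
    # Solve v_threshold = v0 * exp(r * t) for t.
    return math.log(v_threshold / v0) / r

# Illustrative numbers only: r = 1.0 per day, "danger" at 1e9 virions.
low_dose = time_to_threshold(1, 1e9, 1.0)     # start from 1 infected cell
high_dose = time_to_threshold(500, 1e9, 1.0)  # start from 500 infected cells
print(low_dose, high_dose)
```

The high dose reaches the same threshold roughly six days sooner here, which is the window the immune response loses.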



My question is about how much r matters.

For influenza, each infected cell apparently infects ~22 other cells:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1563736/

So in 2 generations you have ~500x the virus (22^2 = 484).

If it were 'less' exponential, and each cell only infected, say, 5 others, you wouldn't reach 500x until late in the 4th generation (5^4 = 625).

In a cartoon example where a low exposure leads to 1 infected cell and a high exposure leads to 500 infected cells, the low exposure matters more if the reproductive factor is lower.
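The cartoon example can be checked directly. A sketch (the burst sizes 22 and 5 come from the comment above; the 500x factor is the low-vs-high exposure gap): the head start a high dose buys, measured in generations, is log(500)/log(burst size), so it is larger when the per-cell reproductive factor is smaller.

```python
import math

# How many infection generations it takes to multiply the infected-cell
# count by a given factor, for a given burst size (cells infected per cell).
def generations_to_reach(multiple, cells_per_generation):
    return math.log(multiple) / math.log(cells_per_generation)

fast = generations_to_reach(500, 22)  # ~2 generations (22^2 = 484)
slow = generations_to_reach(500, 5)   # ~3.9, i.e. late in the 4th generation
print(fast, slow)
```

With burst size 22, a 500-cell head start is worth only ~2 generations; with burst size 5 it is worth ~4, so the initial dose matters more when growth is slower.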


There is an interesting accidental natural experiment on this with norovirus. (Back in 2014, I think; not the 2017 Tennessee Thanksgiving event.)

Some guy vomited in a restaurant. The time from exposure to symptom onset in everyone else could then be cleanly modeled from how far away they were when he vomited. Though in this case the viral exposure was very high for everyone in the restaurant.




