UV in Summer Sunlight Disinfects Covid-19
... and can UV Damage to Covid-19 be estimated from RNA Absorption?
In the time since my last posts on this subject, I’ve been convinced that Ratnesar-Shumate’s (2020) estimate of the time taken for sunlight to inactivate the Covid-19 virus is probably correct (despite my initial skepticism, given that the result was first aired at a Trump White House briefing). She and her team came up with a time of about 6 minutes for summer sunlight to inactivate 90 percent of a virus sample sitting on a horizontal surface.
As I noted in my first post, if the virus is airborne it can be irradiated from all directions, which approximately doubles the dose and therefore halves the inactivation time. So the question posed in my first post is answered: in summer at least, you are better off outside breathing air that’s been disinfected by exposure to sunlight than indoors breathing recirculated air that never sees it. If you need to be inside in summer, turn off that air conditioner and open the windows. You only have to look at Florida and Texas to see that sunlight is no panacea, but I wonder how many of the burgeoning cases there and in Asia are among indoor workers? And disinfecting the air with UV from sunlight is not so easy in winter.
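In symbols (a minimal sketch in my own notation, where $D_{90}$ is the dose needed for 90 percent inactivation and $E$ is the damage-weighted UV dose rate):

$$t_{90} = \frac{D_{90}}{E}, \qquad E \to 2E \;\Rightarrow\; t_{90} \to \tfrac{1}{2}\,t_{90}$$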
For me, the big remaining question is why the inactivation times I calculated were so much longer than those observed. Some recent correspondence with colleagues Martin Hessling (which should really be spelt Heßling; he’s from Germany) and Arjan van Dijk (from The Netherlands) sheds a little more light on that question. I had worked with Arjan previously, but contacted Martin a couple of weeks ago to ask him some questions about a very useful explanatory diagram in his paper.
I’m not a biologist, so I won’t attempt to explain those intricacies. Please read his paper here to see what it all means.
It’s the UV component of radiation that damages RNA, the genetic material of the Covid-19 virus. Arjan drew my attention to a point I’d missed in an action spectrum measured by Lytle and Sagripanti in 2005. It had escaped my attention that their measurements included a data point at 254 nm, the wavelength that’s used to irradiate viruses in the lab. In their paper they had shown a plot of the relative damage at a few wavelengths, but had omitted that crucial one (and I obviously hadn’t read the paper very carefully!).
Arjan digitized the data from that log plot and interpolated to intermediate wavelengths. We then calculated the inactivation time in summer sunlight. This is a fast-moving area of research, and the inactivation energy required to damage RNA is a topic of hot debate. I’d previously used thresholds of 20 and 67 J/m2 as the dose needed to inactivate 90 percent of the virus, but Martin Hessling had suggested that 37 J/m2 is a better number (much larger than others had presumed a few years earlier). Using that number, the inactivation time in sunlight is a little more than an hour: comparable with the times I had calculated previously, and still much longer than the times measured directly by Ratnesar-Shumate.
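For concreteness, here’s a minimal sketch of that calculation in Python. The function name, the solar spectrum, and the digitized action-spectrum points are all illustrative stand-ins of my own, not the real data: the idea is simply to interpolate the action spectrum linearly on the log plot, weight the solar spectral irradiance by it, integrate, and divide the 90 percent dose by the result.

```python
import numpy as np

def t90_minutes(wl, irradiance, action, d90):
    """Time (minutes) for sunlight to inactivate 90 percent of the virus.

    wl         : wavelength grid, nm
    irradiance : solar spectral irradiance on that grid, W m^-2 nm^-1
    action     : action spectrum relative to its 254 nm value (dimensionless)
    d90        : 90 percent inactivation dose at 254 nm, J m^-2
    """
    weighted = irradiance * action
    # Trapezoidal integral over wavelength -> effective dose rate, W m^-2
    e_eff = np.sum(0.5 * (weighted[1:] + weighted[:-1]) * np.diff(wl))
    return d90 / e_eff / 60.0

# Illustrative inputs only -- not the real digitized spectra.
wl = np.arange(290.0, 321.0, 1.0)                  # nm: the UV-B band in sunlight
irr = 0.5 * np.exp(-(320.0 - wl) / 6.0)            # made-up summer spectral irradiance
pts_wl = np.array([254.0, 280.0, 300.0, 320.0])    # digitized wavelengths (illustrative)
pts_log = np.log10([1.0, 0.5, 0.01, 1e-4])         # log10 relative damage (illustrative)
action = 10.0 ** np.interp(wl, pts_wl, pts_log)    # interpolate on the log plot

print(f"t90 ~ {t90_minutes(wl, irr, action, d90=37.0):.0f} minutes")
```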
Martin Hessling’s paper also included that very useful explanatory diagram (above), which showed a plot of the absorption spectrum of RNA at high spectral resolution, extending from UV-C wavelengths near 200 nm right through the visible region. That extended wavelength range is just what’s needed. I contacted him, and he gladly provided the data (and even re-did the measurements as a check). There were just a couple of problems. First, there was a small measurement offset that had to be removed. Second, the spectrum is for RNA absorption rather than RNA damage. The two would be the same only if every absorbed photon led to unrepaired damage to the RNA molecule.
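To give a flavour of the first of those fixes, here is one plausible way to remove a constant offset and put the spectrum on the same relative scale as the action spectrum. The baseline band and the whole procedure are my assumptions for illustration; I don’t know exactly how the offset in Martin’s measurements was determined.

```python
import numpy as np

def clean_spectrum(wl, absorbance, baseline_band=(550.0, 600.0), ref_wl=254.0):
    """Remove a constant measurement offset and normalize at ref_wl (nm).

    Assumes true RNA absorption is negligible within baseline_band --
    an illustrative assumption of mine, not necessarily how the offset
    in Hessling's measurements was actually determined.
    """
    in_band = (wl >= baseline_band[0]) & (wl <= baseline_band[1])
    offset = absorbance[in_band].mean()        # constant instrumental offset
    corrected = absorbance - offset
    ref = np.interp(ref_wl, wl, corrected)     # value at the 254 nm lab wavelength
    return corrected / ref                     # spectrum relative to 254 nm
```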
The two spectra are compared in the figures below.
On a linear scale (upper graph) the two spectra look similar in shape, but on a log scale (lower graph) it’s clear that the drop-off at wavelengths longer than 290 nm (those transmitted through the atmosphere to the Earth’s surface) is much less steep for Hessling’s RNA absorption spectrum than for Lytle’s action spectrum for RNA damage. As I suggested in an earlier post, that slower drop-off at longer wavelengths would increase the rate of damage from UV in sunlight and lead to shorter inactivation times (and a smaller seasonal dependence).
The corresponding calculated time for inactivation by UV in summer sunlight using Hessling’s RNA absorption spectrum, assuming every absorbed photon permanently damages the RNA, is 6.8 minutes (compared with more than ten times as long using Lytle’s spectrum).
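Again in my own notation, the only change from the earlier calculation is substituting the offset-corrected absorbance, normalized at 254 nm, for the action spectrum:

$$t_{90} = \frac{D_{90}}{\int E(\lambda)\,\dfrac{A_{\mathrm{abs}}(\lambda)}{A_{\mathrm{abs}}(254\,\mathrm{nm})}\,d\lambda}$$

where $E(\lambda)$ is the solar spectral irradiance at the surface and $A_{\mathrm{abs}}$ is Hessling’s RNA absorbance.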
That 90 percent inactivation time calculated from the RNA absorption spectrum is remarkably close to the directly measured one. It might be just a coincidence, or it might be that our assumption that absorption is proportional to permanent damage is correct.
So there are two schools of thought. One is that RNA absorption is the right proxy for damage. The other is that Lytle’s action spectrum is more appropriate but the inactivation energy is much lower, as the same group has very recently concluded. Yet even when they assume an inactivation energy as low as 7 J/m2, the 90 percent inactivation times are twice as long as observed. To get agreement with the observations using their spectrum, an inactivation energy of about 3 J/m2 would be required.
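That last number is just a back-of-envelope inversion. The inactivation time scales linearly with the assumed dose, so if 7 J/m2 predicts times twice as long as observed, matching the observations takes roughly half that dose:

$$D_{90}^{\mathrm{required}} \approx 7\ \mathrm{J/m^2} \times \frac{t_{\mathrm{observed}}}{t_{\mathrm{predicted}}} = 7 \times \tfrac{1}{2} \approx 3.5\ \mathrm{J/m^2},$$

consistent with the ~3 J/m2 figure from the full calculation.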
At mid-latitudes in winter, the time required for virus inactivation would be about two or three times longer (i.e., ~20 minutes) if Hessling’s spectrum applied; but if Lytle’s spectrum applied, the time would be too long for significant inactivation even over an entire day.
Which school of thought wins? The jury’s still out …
I am never again going indoors