This post is an update to Geiger Counter Case Study: Inspector Alert, published on SurvivalJapan, in which some questions remained open, mainly about the relatively high values (although still in the safe range) that I measured with the system kindly lent to me by Safecast, from whom I have since received further advice.
The Safecast bGeigie system is designed mainly to measure gamma rays (high-energy photons, akin to X-rays) and is therefore held at least one meter above ground for their radiation maps. Since I live in the monitored area, several hundred miles away from Fukushima, gamma radiation is low and not really a concern. I had therefore measured beta radiation instead (high-energy electrons or positrons emitted by radioactive fall-out deposited on the ground) at about one foot above ground. For convenience, I monitored the radiation level on the Safecast display, which communicates by radio with the Inspector Alert sealed in its lunchbox-style (in Japanese, "bento") case, along with the GPS unit and SD memory card that geo-locate and store the results. The Safecast team advised against this methodology for picking up beta radiation and recommended using the Inspector Alert on its own for that purpose, which I did.
I read the user manual to learn how to set the Inspector Alert display to uSv/h rather than CPM (counts per minute), as I am more familiar with that unit and it is more relevant to effects on the body. The manual explains that the factor the device uses to convert CPM into uSv/h is based on Cesium-137, the radionuclide used for its calibration, so the uSv/h display is less accurate for other nuclides (such as Cesium-134, Strontium-90, Iodine-131 and of course Uranium and Plutonium). This is why Safecast uses the raw CPM data instead.
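The conversion is just a division by a calibration factor. As a rough sketch, here is what the device does internally; the figure of 334 CPM per uSv/h is my assumption for a Cs-137-calibrated pancake-tube counter like the Inspector's, not a value taken from the manual, so check your own documentation:

```python
# Convert a raw CPM reading to an approximate dose rate in uSv/h.
# ASSUMPTION: 334 CPM per uSv/h is a commonly quoted Cs-137
# calibration factor for pancake-tube counters; the real factor
# for your unit is in its user manual.
CS137_CPM_PER_USVH = 334.0

def cpm_to_usvh(cpm: float) -> float:
    """Approximate dose rate for a Cs-137-calibrated tube."""
    return cpm / CS137_CPM_PER_USVH

print(round(cpm_to_usvh(40), 3))  # → 0.12
```

This also makes it obvious why the uSv/h display is only exact for Cs-137: other nuclides emit at different energies, so the true CPM-to-dose factor differs, while the raw CPM number stays honest.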
The first measurement I made was inside my home, and the display varied widely even in a single spot. A Geiger counter is not like a weighing scale: it gives neither an instant result nor a stable one. So when a value is broadcast, whether by citizens or a governmental organization, it should be taken with a grain of salt. For instance, at one spot I could read 0.120 uSv/h, or any value between 0.090 and 0.150 uSv/h, that is, about 25% above or below the central value. Sometimes wilder values would come up: how do we interpret these?
Radioactive decay is a random phenomenon that occurs naturally, so each particle that hits the Geiger counter's sensor plate is registered and shifts the overall value. The Inspector Alert averages its readings over 30 seconds to obtain a more statistically meaningful measurement. Even then, the result is only updated every 3 seconds, so if one is moving there is a delay between the measurement and the display. Then there is the quoted 15% accuracy, which is probably an average: occasional wild values (several standard deviations out) can still occur from time to time. Other factors that can affect the results are solar flares (there was just a solar storm, by the way) and, probably, thermal drift if the device's electronics are not properly compensated as temperature changes (any electronic sensor is subject to this). The bottom line is that readings could easily be twice as high depending on temperature, solar activity, the randomness of natural radioactivity, the mix of radionuclides (including artificial ones from nuclear plants) and radiation types (the reading is a combined result of alpha, beta, gamma and X-rays), accuracy, resolution, and so on.
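The ±25% swings are in fact about what counting statistics predict. Radioactive decay follows a Poisson distribution, so a window that collects N counts fluctuates by roughly 1/sqrt(N). Assuming, for illustration, a background of about 40 CPM (roughly what a reading near 0.12 uSv/h would correspond to on a pancade-tube counter; the figure is my assumption), a short sketch shows why a 30-second window is jumpy and a 10-minute one is much calmer:

```python
import math

# Poisson counting statistics: a window collecting N counts has a
# relative standard deviation of about 1/sqrt(N).
# ASSUMPTION: ~40 CPM background, an illustrative figure only.
cpm = 40.0

for window_s in (30, 600):           # 30 s display window vs 10 min timed count
    counts = cpm * window_s / 60.0   # expected counts in the window
    rel_sigma = 1.0 / math.sqrt(counts)
    print(f"{window_s:4d} s window: ~{counts:.0f} counts, "
          f"about ±{rel_sigma:.0%} typical fluctuation")
```

With only ~20 counts in 30 seconds, a typical fluctuation of around ±22% falls right in the ±25% band observed above, and a 10-minute count shrinks it to a few percent.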
Indeed, I could still measure, inside and outside, values from 0.055 to 0.225 uSv/h, and even up to 0.355 uSv/h when placed right on granite blocks, which are naturally radioactive. These new measurements were consistent with the range I had already obtained with the full Safecast system. I could also check that the outer casings of the Safecast suitcase and bento box did not emit stronger radiation than the surrounding room, so the Geiger counter is likely not contaminated (and according to the user manual there should not be any calibration issue either).
I could not yet double-check with another type of Geiger counter, but these new results convinced me that they are normal. The maximum internationally accepted value (except in post-Fukushima Japan) is 1 mSv/year, which equates to 0.114 uSv/h. Given the 15% accuracy, the Inspector Alert should read between 0.097 and 0.131 uSv/h, which is indeed what it does most of the time (so we can dismiss the occasional lower and higher results as statistical outliers).
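The arithmetic behind these figures can be checked in a few lines:

```python
# Convert the 1 mSv/year public dose limit into an hourly rate,
# then apply the Inspector Alert's quoted ±15% accuracy.
HOURS_PER_YEAR = 365.25 * 24           # ≈ 8766 h

limit_usvh = 1000.0 / HOURS_PER_YEAR   # 1 mSv = 1000 uSv
low, high = limit_usvh * 0.85, limit_usvh * 1.15
print(f"limit: {limit_usvh:.3f} uSv/h, "
      f"expected reading band: {low:.3f}-{high:.3f} uSv/h")
```

This reproduces the 0.114 uSv/h figure and the 0.097–0.131 uSv/h band quoted above.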
A final piece of advice, which I received from Safecast and which is also documented in the user manual, is to use the timed count function of the Inspector Alert over at least 10 minutes to further smooth out the results. Two such timed counts should differ by no more than about 15%.
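That consistency rule is easy to automate. A minimal sketch, where the function name and the 15% default tolerance are mine (the 15% threshold itself comes from the advice above):

```python
def counts_agree(count_a: float, count_b: float, tol: float = 0.15) -> bool:
    """Return True if two timed counts differ by at most `tol`
    (15% by default) relative to their mean -- a rough consistency
    check between repeated 10-minute counts."""
    mean = (count_a + count_b) / 2.0
    return abs(count_a - count_b) / mean <= tol

print(counts_agree(400, 430))  # → True  (~7% apart)
print(counts_agree(400, 500))  # → False (~22% apart)
```

If two long timed counts at the same spot disagree by much more than this, something other than normal statistical noise is likely at play.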
I hope this update gives you a better idea of the capabilities and limitations of Geiger counters in general, and of the Inspector Alert in particular, as well as of the analytical mindset and basic radiation knowledge needed to use them properly. In any case, purchasing a Geiger counter to try to measure radioactivity in food does not make sense (unless the food is irradiated to such a level that just staring at it is dangerous); tracing the origin of the food is a safer and more reliable procedure. Thankfully, this is getting easier.