According to a very recent paper published in JAMA Pediatrics, food allergy costs the US a staggering $25 billion annually. This estimate was based on a survey of 1,643 caregivers of children with food allergies. These costs are, by any measure, breathtaking. For comparison, in the case of obesity, for which we have very accurate prevalence data, it is estimated that by 2030 the annual cost to the US will range somewhere between $48 and $66 billion. How on earth can childhood food allergy cost as much as obesity?

The authors break the costs down as 17% due to the direct costs of clinical care and 83% due to hidden costs such as time taken off work by parents to look after sick children. However, the key figure in calculating the true national cost to the US is the prevalence of food allergy, which the authors put at 8%. That figure was generated by the authors in an earlier study of US children and is based on a large survey involving the carers of over 38,000 children. Therein lies a huge problem: the figure of 8% is self-reported.

Two recent studies have completed systematic reviews and meta-analyses of the literature in this area. The first focused on the prevalence of food allergy in Europe and found a lifetime prevalence of 17.3%, meaning that at some point in their lives about 1 in 6 people had reported an allergy to a food. However, the point prevalence, that is, the percentage at a specific moment in time, was only 5.9%. Both of these values refer to self-reported food allergy. When the diagnosis of food allergy was made using a blood test (specific IgE), the point estimate was 10.1%, falling to 2.7% when the more specific skin-prick test was applied. Importantly, when the gold-standard test of a food challenge was used, the value plummeted to just 0.9%. The second study focused on allergies to plant foods and drew on data from 36 studies covering over 250,000 children.
The majority of these studies used self-reported values for the prevalence of food allergy. Taking fruit allergies as an example, 21 studies using self-reporting methods found the prevalence of fruit allergy to be as high as 6.6%, with most studies coming in at about 1%. However, with skin-prick tests, the prevalence fell to less than 0.1%. A similar pattern was found for vegetable and nut allergies. The authors conclude: “Prevalence estimate of plant food allergy based on self-reported symptoms should be treated with caution”.
A final study explored the extreme event of fatal food-allergic reactions. The authors searched the literature from 1946 to 2012 and identified 13 studies, covering 165 million food-allergic person-years, which recorded 240 fatal food-allergic reactions, and a further 14 studies which looked for such fatal events but found none. The estimated fatality rate among allergy sufferers was 1.8 per million person-years, which is lower than the rate associated with accidental death. Peanut allergy, often cited in the popular media as the most dramatic of food allergies, had a mortality rate of 2.3 per million person-years. Note that the term “person-years” here refers specifically to the population who are in fact food allergic and excludes the many who are not.
All of these data show that popular estimates of food allergy and its consequences are grossly inflated, as are the putative economic costs of food allergy to carers. The true diagnosis of food allergy requires a slow, methodical approach, ultimately confirmed by an exclusion diet with the re-introduction of foods under clinical supervision. Skin-prick tests are the next level of reliability, and after that the diagnostic tools range from low-reliability ones such as blood IgE to downright quackery. For the few who are truly allergic to foods on the basis of a proper diagnosis, the task of avoiding the offending food is a lifelong struggle. For the vast majority, where the diagnosis is self-made or based on some popular quackology, the allergy is no more than a social accessory.