Science fiction often provides a canvas for exploring possible futures before they become part of our lived experience. This happened to me recently with the HBO show “The Last of Us”, in which a seemingly benign fungus, once confined to the insect world, evolves under the pressures of a warming climate, morphing into a pandemic-scale threat that decimates humanity.
I’ve been wondering if the idea that natural organisms, spurred by environmental changes, stand ready to exploit gaps in our defenses is more than a chilling sci-fi narrative. What if it starkly mirrors our current vulnerabilities in biosecurity? And if so, can agtech help?
Between the radical interconnectedness of global markets, climate change, and agricultural development’s increasing interface with wild spaces, biosecurity threats are not just on the rise; they’re compounding.
We’ve seen signs of this in recent years with the outbreak and spread of African swine fever, foot-and-mouth disease, and even the COVID-19 pandemic. We’re seeing even more urgent signs now, for example as the US deals with avian influenza spilling over into cattle.
It’s not hard to imagine that, going forward, we’ll face multiple simultaneous threats rather than one cataclysmic incursion at a time. This will spread our limited biosecurity resources thinner and thinner.
I recently had the opportunity to speak with Sarah Britton, biosecurity expert and former Chief Veterinarian for NSW, Australia, about whether science and technology have major roles to play, both in helping us prepare for possible risks and in dealing with them when they arise.
“In the more intensive industries, like poultry or egg laying, they can measure everything from feed and water, to egg laying production and temperature. If anything goes out of whack, it sets off an alarm. So if potentially a disease comes through, an alarm will activate.”
While we’re not there yet, it’s not hard to imagine a future where increasingly low-cost and ubiquitous monitoring tools and advanced image recognition software combine to have eyes on livestock at all times, carefully watching for the slightest indication of an illness. Likewise, advancements in AI and machine learning could have powerful applications, helping to recognize and eliminate potential threats before crops or products move across states or across oceans, saving individuals and industries millions if not billions of dollars. And once digitized, this information can be readily put into biosecurity plans, lowering the bar for regulatory compliance and shoring up market access credentials.
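To give a flavor of what that kind of monitoring could look like in practice, here’s a minimal sketch of an anomaly “alarm” over daily barn sensor readings. The sensor names, thresholds, and data below are all hypothetical assumptions for illustration, not a description of any existing product; the point is just the general pattern of flagging metrics that drift away from their recent baseline.

```python
# Illustrative sketch only: a toy anomaly "alarm" over daily barn readings
# (feed intake, water intake, temperature). Sensor names, thresholds, and
# data are hypothetical; real systems would use far richer signals and models.
from dataclasses import dataclass
from statistics import mean, stdev


@dataclass
class Reading:
    day: int
    feed_kg: float   # total feed consumed that day
    water_l: float   # total water consumed that day
    temp_c: float    # average barn temperature


def zscore(value: float, history: list[float]) -> float:
    """How far a value sits from the recent baseline, in standard deviations."""
    if len(history) < 2 or stdev(history) == 0:
        return 0.0
    return (value - mean(history)) / stdev(history)


def check_for_alarms(readings: list[Reading], window: int = 7, threshold: float = 2.5) -> list[str]:
    """Flag any metric that drifts beyond `threshold` std devs of its trailing window."""
    alarms = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        current = readings[i]
        for metric in ("feed_kg", "water_l", "temp_c"):
            history = [getattr(r, metric) for r in baseline]
            z = zscore(getattr(current, metric), history)
            if abs(z) > threshold:
                alarms.append(f"Day {current.day}: {metric} out of range (z={z:.1f})")
    return alarms


if __name__ == "__main__":
    # Synthetic week of normal readings, then a sudden drop in feed and water
    # intake of the kind that might precede an outbreak.
    data = [Reading(d, 500 + d, 900 + d, 21.0) for d in range(1, 8)]
    data.append(Reading(8, 350.0, 600.0, 21.5))
    for alarm in check_for_alarms(data):
        print(alarm)
```

A real deployment would fold in far more signals (image recognition, individual-animal tracking, weather, movement records) and more sophisticated models, but the core idea is the same: watch for deviations from normal and raise the flag early, before a problem has a chance to spread.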
Another critical issue Sarah highlighted is vet shortages, which lengthen the time between when a sick animal is identified and when it is tested or treated. Tech could help empower producers with more tools to provide some veterinary care themselves, as well as improve the UX and efficacy of tele-vet visits.
Though the benefits of biosecurity tech are undeniable, and prevention is hugely cost-efficient compared to the alternative, there are still major human-psychology challenges to deal with when it comes to getting would-be customers on board.
The essential incentive issue is that people generally don’t feel urgency around biosecurity until it’s already too late. As Sarah explains:
“It usually takes an outbreak or a threat to galvanize actual interest, or really get people motivated to act.”
This is not just an ag issue; it taps into the fundamentals of how humans make decisions. Put two pieces of technology on the table: one that will earn you $10 this year, and one that could save you $1,000 at some point in the future. The immediate reward of the $10 (hyperbolic discounting) outcompetes the uncertain, distant threat of losing $1,000 (loss aversion).
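To make that concrete, here’s a rough back-of-the-envelope version of the comparison using the standard hyperbolic discounting form V = A / (1 + kD). The discount rate k, the perceived outbreak probability p, and the time horizon D are purely illustrative assumptions, not estimates.

```latex
% Purely illustrative numbers: k (discount rate per year), p (perceived
% probability of an outbreak), and D (years until it might happen) are assumptions.
\[
  V_{\text{now}} = \$10,
  \qquad
  V_{\text{future}} = \frac{p \cdot \$1{,}000}{1 + kD}
                    = \frac{0.05 \times \$1{,}000}{1 + 1 \times 5}
                    \approx \$8.30
\]
```

Under those assumed numbers, an uncertain $1,000 saving five years out feels worth less in the moment than the $10 in hand, which is roughly the calculation many producers make, even if never consciously.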
The medical and insurance sectors struggle against these same challenges, and they’ve explored strategies (mostly financial ones) to motivate consumers to participate in preventative activities: think annual check-ups, teeth cleanings, and flu shots.
For agtech to combat biosecurity threats, we’ll need both to learn from examples in these industries and to explore business models that go beyond simply paying customers to participate.
Biosecurity issues pose significant risks that the agricultural finance industry (banks, insurance companies, and the like) has strong incentives to be thinking about. If tech can verifiably reduce the risks of catastrophic losses or bankruptcy, there could be serious synergy between agtech companies and financial and risk institutions.
For example, an insurance product that covers losses due to biosecurity breaches could make the cost of inaction more immediate and concrete. Premiums could be reduced for businesses that adopt proactive biosecurity measures, providing a tangible, immediate financial return on prevention.
Governments, of course, also have a major stake in national biosecurity, and a role to play. But as Sarah pointed out, governments can be fickle supporters. The problem is, when biosecurity systems are working, there are fewer biosecurity issues. As the threat ebbs, competing demands on government resources often pull attention elsewhere, and investment decreases.
Governments could offer subsidies or tax breaks for investing in biosecurity technologies. By reducing the upfront financial burden, such incentives could make investing in preventive measures sufficiently attractive.
While there are many ways to kill fantastical fungal zombies, it’s clear that tech alone won’t be enough to combat the real-world pathogens threatening to disrupt the future of the food and agriculture industries.