Contributing lab leader: Shara Evans
Advances in lab automation have helped laboratories work more efficiently, reduce turnaround times, and improve quality. As we move forward, labs will need to think about how they can prepare for the new challenges and opportunities that could revolutionize the way the crucial data they generate is used to improve patient care.
We spoke with Shara Evans, futurist and expert in artificial intelligence, robots, cybersecurity, and Jobs of the Future, to learn more about how she thinks labs will evolve in the coming years. We started by asking her about the importance of interconnected data infrastructure.
Q: What are the advantages of clinical labs having an interconnected data infrastructure for operations and for patients?
Shara Evans:
We’re all familiar with the concept of lab automation. It has been going on for a very long time, but this doesn’t always translate into more connected data. Even today, if I were to get pathology results from a blood test, it may have my history of previous blood results, but it certainly doesn't take into account any other medical issues that may have been picked up by other tests, such as ultrasounds, x-rays, and MRIs. I think we have an opportunity to do things differently, so that you can see the whole picture of a patient’s health.
What we really need now are data infrastructures that are interconnected, that incorporate patient information, including insights from sources such as electronic health records and medical imaging systems, and that make this information available to lab staff, who can then tailor their comments to a holistic clinical context.
But there is a major, major flip side that's going to impact the availability of this kind of interconnected data infrastructure, and that is cybersecurity and privacy. And frankly, with all of the huge data breaches that are happening, it's going to be a big uphill battle to be able to win patient trust to allow that kind of interconnection to happen. Because we're talking about really sensitive data, we’re going to need the very highest standards in cybersecurity across many different areas, both in the lab and any external organization that is providing inputs or data to the lab. This is a huge area and one that we really need to be paying attention to.
Q: What would you say are the implications of an interconnected data structure on cybersecurity, and are there any recommendations on what can be done now to further enhance data privacy and security in healthcare labs?
Shara Evans:
I'm based in Australia, and at the end of 2022 we had a number of huge cybersecurity breaches involving one of our top telecommunications companies, as well as one of our largest, if not the largest, health insurance providers. I estimate that over 60% of our adult population here is at great risk of identity theft. So security, privacy, ethics, corporate governance, all of these fall into a lot of the work that I've been doing lately. One of the guiding principles that I would recommend is to collect the least amount of information possible and to use multiple types of security defenses. There are lots of different vendors, and they offer many different kinds of solutions, so I think a mix of solutions will be required.
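To make the data-minimisation principle concrete, here is a minimal sketch in Python, assuming a simple record layout (the field names are illustrative, not a real lab schema): only the fields a given workflow actually needs are kept before the record is stored or shared.

```python
# A minimal sketch of data minimisation: keep only the fields a workflow
# actually needs before storing or sharing a record.
# Field names and the record layout are illustrative assumptions.

REQUIRED_FIELDS = {"patient_id", "test_code", "result_value", "result_units"}

def minimise(record: dict) -> dict:
    """Return a copy of the record containing only the required fields."""
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

incoming = {
    "patient_id": "P-1042",
    "test_code": "HbA1c",
    "result_value": 5.4,
    "result_units": "%",
    "home_address": "12 Example St",   # not needed for reporting
    "insurance_number": "XYZ-998877",  # not needed for reporting
}

print(minimise(incoming))
# {'patient_id': 'P-1042', 'test_code': 'HbA1c', 'result_value': 5.4, 'result_units': '%'}
```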
I also recommend that we use the strongest possible encryption, and keep audit trails of who has access to the data, and when and why they have accessed it. You might consider putting together a whitelist of people who can access the data. Ultimately, it's likely that this kind of sensitive data will be accessed by an AI-based program, so the audit trail needs to account for what's being done with the data and whether any of that sensitive data is shared externally, other than with the intended recipient.
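As a rough illustration of the whitelist and audit-trail ideas above, the sketch below (the user names, record IDs, and in-memory log are hypothetical) checks each read against an approved list and records who accessed what, when, and why.

```python
# A minimal sketch of an allow-list plus audit trail: every read of a
# sensitive record is checked against approved users and logged.
# In practice the log would be an append-only, tamper-evident store.

import datetime

ALLOWED_USERS = {"dr_smith", "lab_tech_01"}
AUDIT_LOG = []

def read_record(user: str, record_id: str, reason: str) -> bool:
    """Check the allow list, log the attempt, and return whether access is granted."""
    granted = user in ALLOWED_USERS
    AUDIT_LOG.append({
        "user": user,
        "record_id": record_id,
        "reason": reason,
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "granted": granted,
    })
    return granted

read_record("dr_smith", "RPT-2291", "review pathology result")
read_record("unknown_app", "RPT-2291", "bulk export")  # denied, but still logged
for entry in AUDIT_LOG:
    print(entry)
```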
Sensitive data also needs to be stored in separate data repositories, each with its own strongest-possible encryption and its own multifactor login. Taking these kinds of precautions would mean that even if the worst were to happen and a hacker did manage to access a particular database, they wouldn't have access to everything.
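A minimal sketch of the separate-repositories idea, assuming the third-party cryptography package (pip install cryptography) and illustrative repository names: each store gets its own key, so compromising one key does not expose the others.

```python
# A minimal sketch of splitting sensitive data across separate stores,
# each encrypted with its own key. Repository names are illustrative;
# real keys would live in a managed key vault or HSM, never beside the data.

from cryptography.fernet import Fernet

keys = {"demographics": Fernet.generate_key(), "results": Fernet.generate_key()}
repos = {name: {} for name in keys}

def store(repo: str, record_id: str, payload: bytes) -> None:
    repos[repo][record_id] = Fernet(keys[repo]).encrypt(payload)

def load(repo: str, record_id: str) -> bytes:
    return Fernet(keys[repo]).decrypt(repos[repo][record_id])

store("demographics", "P-1042", b'{"name": "Jane Doe"}')
store("results", "P-1042", b'{"HbA1c": 5.4}')
print(load("results", "P-1042"))  # only the "results" key can open this record
```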
It’s also absolutely critical that people stop sending very sensitive data by unencrypted email, including any documents that are attached to these emails. If it's not encrypted, it's not secure. And it's amazing the amount of healthcare data that I see being sent just over plain email with no encryption whatsoever, and it's so easy to fix.
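As a rough sketch of "if it's not encrypted, it's not secure" applied to attachments, the example below (again using the cryptography package; the file name is an illustrative stand-in) encrypts a report before it is attached, with the key shared over a separate channel.

```python
# A minimal sketch of encrypting a report file before it is ever attached
# to an email. The decryption key must be shared with the recipient over a
# separate, secure channel, never in the same email.

from pathlib import Path
from cryptography.fernet import Fernet

# Stand-in file so the sketch is self-contained; a real report would already exist.
Path("pathology_report.pdf").write_bytes(b"%PDF- stand-in report content")

key = Fernet.generate_key()  # share out of band
report = Path("pathology_report.pdf").read_bytes()
Path("pathology_report.pdf.enc").write_bytes(Fernet(key).encrypt(report))
# Attach only the .enc file; the plaintext report never leaves the lab unencrypted.
```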
Institutions and companies should also invest in a secure portal for both clinical and patient access to sensitive data. This is something that I'm seeing as a best practice, especially when it also has two-factor authentication. Another recommendation that I think could be very valuable to both laboratories and other organizations is to set up automated alerts that are triggered when any unusual access activity is detected and that block further access to personal data.
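A minimal sketch of the automated-alert idea, assuming a simple per-user baseline (the threshold and in-memory counters are illustrative; a real deployment would feed a monitoring or SIEM platform): when an account exceeds its normal read volume, an alert fires and further access is suspended.

```python
# A minimal sketch of alerting on unusual access activity: if an account
# reads far more records than its baseline, flag it and block further reads.

from collections import defaultdict

BASELINE_READS_PER_HOUR = 20   # illustrative threshold
access_counts = defaultdict(int)
blocked = set()

def record_access(user: str) -> bool:
    """Return True if access is allowed, False if the user has been suspended."""
    if user in blocked:
        return False
    access_counts[user] += 1
    if access_counts[user] > BASELINE_READS_PER_HOUR:
        blocked.add(user)
        print(f"ALERT: unusual access volume for {user}; access suspended")
        return False
    return True

for _ in range(25):  # simulate a burst of reads from one account
    record_access("lab_tech_01")
```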
Q: In the event of a cyber attack or security breach in a healthcare lab setting, who takes care of the cost, what will this mean for smaller labs, and what could this mean for the competitive landscape overall?
Shara Evans:
If companies have data breaches, they could be liable for huge payouts. This could significantly impact their profitability and possibly even their viability. Smaller labs may just not have the funds to pay for cyber damages or to upgrade their infrastructure to prevent them from falling victim to such an attack in the first place.
Q: Who should be responsible for ensuring the security of data once they leave the lab?
Shara Evans:
Once the data leaves the lab and reaches the correct recipient, it's up to the receiving party to keep it safe and secure. At that point, the lab is no longer responsible: if, say, a hospital or a doctor's office or someone else suffers a data breach that happens to involve data that originated from a lab report, the lab has nothing to do with that. Responsibility lies with whoever has charge of the safekeeping of the data.
Q: Quality is critical in ensuring optimal productivity within labs, and as management of big data sets becomes more prevalent within laboratories, what are your recommendations for maintaining credibility and accuracy?
Shara Evans:
When it comes to accurate lab results, high quality data is an absolute prerequisite. One of the biggest challenges that I think we'll see going into the future is maintaining the quality of data sets where AI algorithms, such as machine learning or deep learning, use datasets as their fuel. And if that data is inaccurate or biased, it can lead to really disastrous health outcomes.
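As a rough illustration of data-quality gates before a dataset becomes "fuel" for a model, the sketch below (using pandas; the column names and limits are illustrative, not clinical reference ranges) flags missing values, implausible readings, and a heavily skewed sample.

```python
# A minimal sketch of basic data-quality checks before training a model:
# missing values, implausible readings, and dominance of one group (a
# possible source of bias). Columns and limits are illustrative assumptions.

import pandas as pd

df = pd.DataFrame({
    "hba1c": [5.4, 6.1, None, 5.9, 48.0],   # one missing, one implausible
    "sex":   ["F", "F", "F", "F", "M"],      # imbalanced sample
})

problems = []
if df["hba1c"].isna().any():
    problems.append("missing HbA1c values")
if ((df["hba1c"] < 3) | (df["hba1c"] > 20)).any():
    problems.append("HbA1c readings outside a plausible range")
if (df["sex"].value_counts(normalize=True) > 0.75).any():
    problems.append("one group dominates the sample (possible bias)")

print(problems or "dataset passed basic quality checks")
```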
Looking to the not-too-distant future, AI and AI-based reinforcement learning are going to play an increasingly important role. We will need computer scientists, philosophers, ethicists, and many others, to come together to ensure that the health decisions that may be generated through AI are accurate and unbiased.
One of the latest trends we're seeing is a new kind of AI that many are starting to call the third wave of sophisticated AI: generative AI. Here, I'm talking about tools like ChatGPT that can do many things, including automating report writing. But I've already seen examples in the healthcare space where this type of AI program can get things really wrong, even with very narrow, specific medical experiments. These tools will continue to get more sophisticated, and there'll be lots of new players on the market. Eventually, they'll become an essential part of a lab's workflow, but I think that human beings will still be needed to keep a close watch on the outputs and to double-check findings.
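A minimal sketch of that human-in-the-loop point, with a hypothetical DraftReport structure (not any specific product integration): a generated draft cannot be released until a human reviewer has signed it off.

```python
# A minimal sketch of a human review gate: a model-generated draft is never
# released directly; it must be approved, and reviewer corrections are kept.

from dataclasses import dataclass, field

@dataclass
class DraftReport:
    text: str
    approved: bool = False
    reviewer_notes: list = field(default_factory=list)

def release(report: DraftReport) -> str:
    if not report.approved:
        raise PermissionError("Draft has not been signed off by a human reviewer")
    return report.text

draft = DraftReport(text="Automated summary of panel results ...")
draft.reviewer_notes.append("Corrected reference range in paragraph 2")
draft.approved = True
print(release(draft))
```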