The challenge of regaining trust in medical services in a new era of tech
Mita Srinivasan
10x Industry

In this exclusive opinion piece for SME10X and UPS, Serene Touma and Dr. Baher Al Hakim from Medicus AI share their views on restoring trust in technology, particularly where personal medical data is concerned.

Just over a decade ago, we stood at the dawn of a new era: Tech 2.0. This new frontier of technological innovation brought with it a special sense of optimism: tech could solve any problem, for anyone, anywhere. Technology, and in particular the emerging smartphone and app ecosystem, was seen as an enabler, heralded as an objective and fair arbiter and a much-needed layer of convenience and efficiency in our everyday lives.

Over the past few years, however, we have seen story after story of companies, and entire industries, that in the pursuit of profit and success were exposed as abusing their privilege and position, repeatedly compromising user data and violating privacy. This erosion of trust took us from an era of optimism and hope to a darker, post-trust era of tech, where distrust became the new default and privacy was no longer sacrosanct.

This culminated in dramatic fashion with the exposure of Facebook's role in the Cambridge Analytica scandal. A Senate hearing or two later, Mark Zuckerberg's ambiguity on data privacy became water cooler talk, no longer confined to online forums but discussed at dinner tables and on morning TV shows.

The Facebook scandal did more than harm Facebook; it shrouded the entire tech industry in a fog of apprehension and distrust. It wasn't just the idea that user data was being sold in exchange for ad dollars that irked users; it was also that they didn't know exactly what the platform knew about them or how they were being targeted. Facebook came under intense scrutiny, and other services and platforms scrambled to distance themselves from the behemoth, positioning themselves as better advocates of user privacy. Google tried hard, Twitter less so.

As privacy became a hot topic and was brought to users' attention in a very public way, more and more of them decided that the old paradigm of handing over data in exchange for services was no longer one they were willing to accept. In short, the value exchange was simply not worth it. The privacy of critical and sensitive data was fast becoming a priority for users.

When we started building and bringing to market our commercial offering at Medicus AI, we realized that we faced an important challenge: restoring users' trust in technology, and in particular in how it handles their personal health data. As we began thinking about how to build our products, we all agreed that making ethical design the norm had to be our highest priority.

In pursuit of trust: our guiding principles

  1. Ethical design: we think hard not only about the ethics of user data but also about the ethics of design itself, considering unintended consequences so that we can see around corners. We apply this approach in everything we do, from the public products we create to the work we do with our clients, to better understand the impact we have and any potential harm we could cause.

  2. Ecosystem-driven: we believe in taking a collaborative and leading position with our clients and within the industries we operate in. We look for innovative ways to involve the ecosystem and decision-makers, playing what we hope is an inclusive and collaborative role.

  3. Regulation-first: before embarking on any product design or development, we look at the relevant regulations first. We outline what we can do and, more importantly, decide what we should do. We determine how to ensure compliance (our CE mark certification is one example of how we make sure we operate legally and ethically), and we make a concerted effort to always hold ourselves accountable.

  4. Transparency: we go to great lengths to be clear and open about our sources and how we carry out our medical validation, from product design decisions down to the smallest details. Internally, we have a culture of sharing, demystifying how we build our products and going the extra mile to earn trust.

  5. Real privacy: we decided early on that health data belongs to the end-user, always and without compromise. Once we took that decision, we had a framework and a driving force that underpinned most of the innovation in our offering: solutions that create value for all stakeholders without compromising the privacy or data of any of them. Today this is something we are particularly proud of at Medicus AI.