10 Use Cases for AI in Healthcare as part of your Digital Strategy

AI has the potential to save millions of lives by applying complex algorithms | Photo Credit: via Brother UK

Good health is a fundamental need for all of us. Hence, it’s no surprise that the total market size of healthcare is huge. Developed countries typically spend between 9% and 14% of their total GDP on healthcare.

The digital transformation of the healthcare sector is still in its early stages. A prominent obstacle is the Electronic Health Record (EHR) in particular, and poor data quality in general. Other obstacles include data privacy concerns, the risk of bias, a lack of transparency, and legal and regulatory risks. Although all of these matters have to be addressed in a Digital Strategy, the implementation of Artificial Intelligence (AI) should not be delayed!

AI has the potential to save millions of lives by applying complex algorithms that emulate human cognition in the analysis of complicated medical data. AI furthermore simplifies the lives of patients, doctors, and hospital administrators by performing or supporting tasks that are typically done by humans, but more efficiently, more quickly, and at a fraction of the cost. The applications for AI in healthcare are wide-ranging. Whether it is being used to discover links between genetic codes, to power surgical robots, or to maximize hospital efficiency, AI is reinventing modern healthcare through machines that can predict, comprehend, learn, and act.

Let’s have a look at ten of the most straightforward use cases for AI in healthcare that should be considered for any Digital Strategy:

1. Predictive Care Guidance:

AI can mine demographic, geographic, laboratory, doctor-visit, and historical claims data to predict an individual patient’s likelihood of developing a condition. Using this data, predictive models can suggest the best possible treatment regimens and the success rates of certain procedures.
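As a minimal illustration (not a clinically validated model), such a prediction can be sketched as a logistic risk score over a handful of binary risk-factor flags; the feature names and weights below are invented for demonstration, whereas a production model would learn them from the data sources listed above:

```python
import math

# Hypothetical risk factors and hand-picked weights; a real model would
# learn these from demographic, lab, visit, and claims data.
WEIGHTS = {"age_over_65": 1.2, "prior_admissions": 0.8,
           "abnormal_lab_result": 1.5, "chronic_condition": 1.0}
BIAS = -3.0  # baseline log-odds for a patient with no risk factors

def risk_score(patient):
    """Return the predicted probability (0..1) that a patient
    develops the condition, given binary risk-factor flags."""
    z = BIAS + sum(weight * patient.get(factor, 0)
                   for factor, weight in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))  # logistic function
```

In practice the weights would come from logistic regression or gradient boosting trained on historical outcomes, and the score would feed into the treatment-suggestion step.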

2. Medical Image Intelligence:

AI brings advanced insights to medical imagery, specifically radiological images. Using AI, providers can conduct automatic, quantitative analyses such as tumor identification, fast radiotherapy planning, and precise surgical planning and navigation.

3. Behavior Analytics:

AI helps to solve patient registry mapping issues and supports efforts such as the Human Genome Project in mapping complicated genomic sequences to identify links to diseases like Alzheimer’s.

4. Virtual Nursing Assistants:

Conversational-AI-powered nurse assistants can support patients and deliver answers with 24/7 availability. Mobile apps keep patients and healthcare providers connected between visits. Such AI-powered apps are also able to detect certain patterns and alert a doctor or the medical staff.

5. Research and Innovation:

AI helps to identify patterns in treatments, such as which treatments are better suited and more efficient for certain patient demographics, and this can be used to develop innovative care techniques. Deep Learning can be used to classify the large amounts of research data available in the community at large and to develop meaningful reports that can be easily consumed.

6. Population Health:

AI helps to learn why and when something happened, and then to predict when it will happen again. Machine Learning (ML) applied to large data sets helps healthcare organizations find trends in their patients and populations and anticipate adverse events such as heart attacks.

7. Readmissions Management:

By analyzing historical and treatment data, AI models can predict readmissions and flag their causes and patterns. This can be used to reduce hospital readmission rates and to improve regulatory compliance by developing mitigation strategies for the identified causes.

8. Staffing Management:

Predictive models can be developed by analyzing various factors such as historical demand, seasonality, weather conditions, and disease outbreaks to forecast the demand for healthcare services at any given point in time. This enables better staff management and resource planning.
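A common starting point for such demand forecasting is a seasonal naive baseline that simply repeats the most recent full season; the sketch below assumes daily data with a weekly pattern, and the visit numbers in the usage example are invented:

```python
def seasonal_naive_forecast(history, season_length, horizon):
    """Forecast future demand by repeating the most recent full season.

    history:       observed demand values, oldest first
    season_length: length of one seasonal cycle (e.g. 7 for daily data
                   with a weekly rhythm)
    horizon:       number of future periods to forecast
    """
    if len(history) < season_length:
        raise ValueError("need at least one full season of history")
    last_season = history[-season_length:]
    return [last_season[t % season_length] for t in range(horizon)]

# Three weeks of (hypothetical) daily emergency-room visits:
history = [40, 42, 45, 50, 80, 95, 60] * 3
next_week = seasonal_naive_forecast(history, season_length=7, horizon=7)
```

More serious models would layer weather, outbreak, and trend features on top of such a baseline, but the baseline is what they are measured against.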

9. Claims Management:

AI detects aberrations such as duplicate claims, policy exceptions, fictitious claims, or fraud. Machine learning algorithms recognize patterns in the data, looking at trends, non-conformance with Benford’s law, etc., to flag suspicious claims.
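The Benford check in particular is easy to sketch: the leading digits of naturally occurring amounts follow a logarithmic distribution, so a batch of claim amounts that deviates strongly from it is worth a second look. The threshold below is illustrative, not an actuarial standard:

```python
import math
from collections import Counter

# Benford's law: P(leading digit = d) = log10(1 + 1/d)
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def benford_deviation(amounts):
    """Total absolute deviation between the leading-digit distribution
    of the given amounts and Benford's law (0 = perfect conformance)."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    counts = Counter(digits)
    return sum(abs(counts.get(d, 0) / len(digits) - p)
               for d, p in BENFORD.items())

def flag_suspicious(amounts, threshold=0.25):
    """Flag a batch of claim amounts whose leading-digit distribution
    deviates strongly from Benford's law (illustrative threshold)."""
    return benford_deviation(amounts) > threshold
```

A real claims pipeline would combine such a check with duplicate detection and supervised fraud models rather than rely on it alone.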

10. Cost Management:

AI automates cost management through Robotic Process Automation (RPA) and cognitive services, which helps with faster cost adjudication. It also enables analysis, optimization, and detection by identifying patterns in cost data and flagging any anomalies.


As these examples show, a wide range of possible AI use cases can improve healthcare quality and healthcare access while addressing the massive cost pressure in the healthcare sector. Strategic sequencing of use cases is mandatory to avoid implementation bottlenecks due to the scarcity of specialized talent.

Which use cases for AI in healthcare would you add to this list?

Share your favorite AI use case in the blog post comments or reply to this tweet:

This post is also published on LinkedIn.

How China is winning in the Age of Artificial Intelligence

Alibaba Campus

Currently, I’m on a four-week China trip, visiting many cities. In Hangzhou, I met CEIBS peers who work for Alibaba. While the Alibaba campus is quite impressive, I was even more impressed by Alibaba’s leadership culture, which encourages its employees to innovate as intrapreneurs.

If you start your own project (a new mobile app, a patent, a scientific paper, etc.), you work at your own pace, you are not micro-managed, and you receive a bonus based on success. Intrapreneurship at Alibaba is just one of many examples where we Europeans can learn a lot from China!

Yue and me, Hangzhou West Lake

While traveling in China I was reading AI Superpowers: China, Silicon Valley, and the New World Order by Kai-Fu Lee, a must-read to get an idea of where China’s AI ambitions are heading. What matters most for AI innovation these days, the author argues, is access to vast quantities of data, and here China’s advantage is overwhelming.

AI Superpowers: China, Silicon Valley, and the New World Order
  • Kai-Fu Lee
  • Publisher: Houghton Mifflin Harcourt
  • Hardcover: 272 pages

A quite entertaining book focusing on the new mindset of China’s young generation is this one: Young China: How the Restless Generation Will Change Their Country and the World by Zak Dychtwald.

  • Publisher: MACMILLAN USA
  • Hardcover: 304 pages

[Update 2 May 2019]: Which other cities in China did I visit? Check out my Tableau Public viz:

Data Operations: How to Boost the Performance of Your Data Analysis and Dashboards

#dataops: follow the discussion on Twitter

Are you unhappy with the speed of your data analysis? Or do your dashboards take a long time to load? Then you, or your database administrator, can follow the tips below, which may differ depending on the data source.

General Recommendations for Performance Optimization

Would you like to improve the speed of your analysis? Then consider the following points:

  • Use several "smaller" data sources for individual questions instead of a single data source that is supposed to cover all questions.
  • Avoid unnecessary joins.
  • Enable the "Assume Referential Integrity" option in Tableau's "Data" menu (see Figure 2.20). When you use this option, Tableau includes joined tables in the query only if they are explicitly used in the view*. If your data does not have referential integrity, the query results may be inaccurate.
Figure 2.20: The "Assume Referential Integrity" option enabled in the "Data" menu

* For example, revenue is then determined with the SQL query SELECT SUM([Sales Amount]) FROM [Sales] alone, instead of SELECT SUM([Sales Amount]) FROM [Sales] S INNER JOIN [Product Catalog] P ON S.ProductID = P.ProductID.

Performance Optimization Recommendations for Files and Cloud Services

When working with file formats such as Excel, PDF, or text files, or with data from cloud services such as Google Sheets, also pay attention to the following points:

  • Avoid unions across many files, as processing them is very time-consuming.
  • Use a data extract instead of a live connection unless you are working with a fast database system (see When should you use data extracts and when live connections).
  • Make sure to choose the "Single table" option instead of the "Multiple tables" option when creating the extract (see Figure 2.21). The resulting extract will be larger and take longer to create, but querying it will be many times faster.
Figure 2.21: The "Single table" option selected in the "Extract Data" dialog

Performance Optimization Recommendations for Database Servers

Are you working with data on a database server such as Oracle, PostgreSQL, or Microsoft SQL Server and want to improve access times? Then you, or the responsible database administrator, should also pay attention to the following points:

  • Define sensible index columns for your database tables.
  • Create partitions for your database tables.

This post is the third part of the Data Operations series:

Part 1: How to optimally prepare data for analysis
Part 2: When should you use data extracts and when live connections
Part 3: How to boost the performance of your data analysis and dashboards

This blog post is also based on a subchapter of the book "Datenvisualisierung mit Tableau":

Datenvisualisierung mit Tableau
  • Alexander Loth
  • Publisher: mitp
  • Edition no. 2018 (31.07.2018)
  • Paperback: 224 pages

How to research LinkedIn profiles in Tableau with Python and Azure Cognitive Services

Tableau uses Python to access the Web Services API provided by Microsoft Azure Cognitive Services

A few weeks after the fantastic Tableau Conference in New Orleans, I received an email from a data scientist who had attended my TC18 social media session and who is using Azure+Tableau. She had quite an interesting question:

How can a Tableau dashboard that displays contacts (name & company) automatically look up LinkedIn profile URLs?

Of course, researching LinkedIn profiles for a huge list of people is a very repetitive task. So let’s find a solution to improve this workflow…

1. Python and TabPy

We use Python to build API requests, to communicate with Azure Cognitive Services, and to verify the returned search results. In order to use Python within Tableau, we need to set up TabPy. If you haven't done this yet, check out my TabPy tutorial.

2. Microsoft Azure Cognitive Services

One of the many APIs provided by Azure Cognitive Services is the Web Search API. We use this API to search for name + company + "linkedin". Our Python script then validates the first three results; one of them should contain the corresponding LinkedIn profile.

3. Calculated Field in Tableau

Let’s wrap our Python script together and create a Calculated Field in Tableau:
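A minimal sketch of such a script, assuming the Bing Web Search API v7 endpoint of Azure Cognitive Services (the URL, request header, and response fields follow that API; the key is a placeholder you must replace with your own):

```python
import json
import urllib.parse
import urllib.request

SEARCH_URL = "https://api.cognitive.microsoft.com/bing/v7.0/search"
API_KEY = "YOUR_AZURE_API_KEY"  # placeholder: your own subscription key

def pick_linkedin_url(urls, top_n=3):
    """Return the first of the top-N result URLs that points to a
    LinkedIn profile, or an empty string if none matches."""
    for url in urls[:top_n]:
        if "linkedin.com/in/" in url:
            return url
    return ""

def linkedin_profile(name, company):
    """Search for name + company + 'linkedin' via the Web Search API
    and validate the top three results."""
    query = urllib.parse.urlencode(
        {"q": f"{name} {company} linkedin", "count": 3})
    request = urllib.request.Request(
        f"{SEARCH_URL}?{query}",
        headers={"Ocp-Apim-Subscription-Key": API_KEY})
    with urllib.request.urlopen(request) as response:
        results = json.load(response)
    urls = [page["url"]
            for page in results.get("webPages", {}).get("value", [])]
    return pick_linkedin_url(urls)
```

Deployed to TabPy, a function like this can be called from a Tableau Calculated Field (e.g. via SCRIPT_STR with ATTR([Name]) and ATTR([Company]) as arguments), and the returned URL then drives the dashboard's URL action.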

4. Tableau dashboard with URL action

Adding a URL action with our new Calculated Field will do the trick. Now you can click on the LinkedIn icon and a new browser tab (or the LinkedIn app if installed) opens.

LinkedIn demo on Tableau Public

Is this useful for you? Feel free to download the Tableau workbook (don’t forget to add your API key), leave a comment and share this tweet:

The Empathy Machine: Are Digital Technologies the Best Bet in Telling about your Cause?

The panel discussion “The empathy machine: are digital technologies the best bet in telling about your cause?” took place on the opening day of the 2018 Fundamental Rights Forum. The forum was organized by the European Union Agency for Fundamental Rights (FRA) and took place at the METAStadt Vienna, 25-27 September 2018.

In this panel discussion, Kadri Kõusaar (an Oscar-nominated film director), Fanny Hidvegi (European Policy Manager), and I discussed whether digital technologies really are the “empathy machine” and how innovative applications can help human rights defenders achieve challenging goals such as changing public attitudes or meeting tough fundraising targets. The panel discussion was moderated by the virtual reality artist Dr. Frederick Baker.

In this blog post I want to share some of the panel’s questions, along with my answers:

1. How do algorithms interfere with human rights?

When algorithms make certain decisions, they tend to mirror what they are shown in their training sets. This is especially apparent in issues such as bias and machine discrimination. Both might result from the content of the training data, which reflects existing inequalities.

2. So, it’s about the data? What else makes data so important today?

The effective use of data is vital for our understanding of fundamental issues, such as human rights violations and political instability, for informing our policy-making, and for enhancing our ability to predict the next crisis. Furthermore, the scope, complexity, and life-changing importance of the work being done on topics like these across the European Union has made it more important than ever for everyone participating in the public conversation and in democratic decision-making to have access to key data sources and to be able to derive insights from them.

3. Where is data coming from and how can people benefit?

Every time we google something, send a tweet, or just browse a website, we create data. With the rise of visual analytics, we can benefit from this vast amount of information. Visual analytics is a hands-on approach to interacting with data that does not require any programming skills. Furthermore, communicating with data is seen as one of the most relevant skills in today’s information age.

Global Refugee Crisis visualization on Tableau Public

4. What is the easiest way to find interesting data?

I would check out Google’s new search engine for datasets, which was released just recently! Tableau Public is a good source for existing visualizations, many of which are based on public data.

5. What is required to enable organizations to use data for good?

Data can be used for the good of society, but private- and public-sector firms, nonprofits, and NGOs still lack analytics resources and expertise. Data and analytics leaders must cross traditional boundaries to use data for good, to better compete for limited talent, and to foster an ethical culture. VizForSocialGood and the Tableau Foundation are good examples.

6. How can the private sector contribute for good?

Some private sector organizations are making data open and available to researchers, nonprofits and NGOs. Examples include:

  • Mastercard anonymizing credit card data to be analyzed in smart city initiatives.
  • Google making search data available to hospitals to predict infectious disease outbreaks such as flu and dengue fever.
  • Insurance companies providing anonymized healthcare data to improve patient outcomes and prevention strategies.
  • Yelp providing ratings data to cities to prioritize food safety inspectors.

The panel discussion was followed by workshops in the afternoon: