How to Research LinkedIn Profiles in Tableau with Python and Azure Cognitive Services

Azure Cognitive Services in Tableau: using Python to access the Web Services API provided by Microsoft Azure Cognitive Services

A few weeks after the fantastic Tableau Conference in New Orleans, I received an email from a data scientist who attended my TC18 social media session, and who is using Azure+Tableau. She had quite an interesting question:

How can a Tableau dashboard that displays contacts (name & company) automatically look up LinkedIn profile URLs?

Of course, researching LinkedIn profiles for a huge list of people is a very repetitive task. So let’s find a solution to improve this workflow…

Step by Step: Integrating Azure Cognitive Services in Tableau

1. Python and TabPy

We use Python to build API requests, communicate with Azure Cognitive Services, and verify the returned search results. In order to use Python within Tableau, we need to set up TabPy. If you haven’t done this yet, check out my TabPy tutorial.
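
Before building anything in a workbook, it can help to confirm that TabPy is actually reachable. Here is a minimal sketch of such a check, assuming TabPy runs locally on its default port 9004 and exposes its /info status endpoint (both TabPy defaults, nothing specific to this tutorial):

import http.client
import json

# Ping the local TabPy server (default port 9004) via its /info endpoint.
connection = http.client.HTTPConnection('localhost', 9004)
connection.request('GET', '/info')
response = connection.getresponse()

if response.status == 200:
    info = json.loads(response.read().decode('utf-8'))
    print('TabPy is up:', info)
else:
    print('Unexpected status code:', response.status)
connection.close()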

2. Microsoft Azure Cognitive Services

One of many APIs provided by Azure Cognitive Services is the Web Search API. We use this API to search for name + company + 'linkedin'. The first three results are then validated by our Python script. One of the results should contain the corresponding LinkedIn profile.
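
If you want to verify your API key and inspect the raw search results before wiring anything into Tableau, you can run the request as a standalone script. This is a minimal sketch: the name and company values are hypothetical placeholders, and the endpoint and header mirror the Bing Web Search v7 call used in the calculated field below:

import http.client
import json
import urllib.parse

YOUR_API_KEY = 'xxx'  # replace with your Azure Cognitive Services key

# Same query pattern as the calculated field: name + company + 'linkedin'
headers = {'Ocp-Apim-Subscription-Key': YOUR_API_KEY}
params = urllib.parse.urlencode({'q': 'Jane Doe Acme Corp linkedin', 'count': '3'})

connection = http.client.HTTPSConnection('api.cognitive.microsoft.com')
connection.request('GET', '/bing/v7.0/search?%s' % params, headers=headers)
response = json.loads(connection.getresponse().read().decode('utf-8'))
connection.close()

# Inspect the top three results that the calculated field will later validate.
for result in response.get('webPages', {}).get('value', []):
    print(result['name'], '->', result['displayUrl'])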

3. Calculated Field in Tableau

Let’s pull our Python script together and create a Calculated Field in Tableau:

SCRIPT_STR("
import http.client, urllib, base64, json
YOUR_API_KEY = 'xxx'
name = _arg1[0]
company = _arg2[0]
try:
headers = {'Ocp-Apim-Subscription-Key': YOUR_API_KEY }
params = urllib.urlencode({'q': name + ' ' + company + ' linkedin','count': '3'})
connection = http.client.HTTPSConnection('api.cognitive.microsoft.com')
connection.request('GET', '/bing/v7.0/search?%s' % params, '{body}', headers)
json_response = json.loads(connection.getresponse().read().decode('utf-8'))
connection.close()
for result in json_response['webPages']['value']:
if name.lower() in result['name'].lower():
if 'linkedin.com/in/' in result['displayUrl']:
return result['displayUrl']
break
except Exception as e:
return ''
return ''
", ATTR([Name]), ATTR([Company]))

4. Tableau dashboard with URL action

Adding a URL action with our new Calculated Field will do the trick. Now you can click on the LinkedIn icon and a new browser tab (or the LinkedIn app if installed) opens.

LinkedIn demo on Tableau Public

Is this useful for you? Feel free to download the Tableau workbook – don’t forget to add your API key!

Get More Insights

This tutorial is just the tip of the iceberg. If you want to dive deeper into the world of data visualization and analytics, don’t forget to order your copy of my new book, Visual Analytics with Tableau (Amazon). This comprehensive guide offers an in-depth exploration of data visualization techniques and best practices.

I’d love to hear your thoughts. Feel free to leave a comment, share this post, and follow me on Twitter and LinkedIn for more tips, tricks, and tutorials on Azure Cognitive Services in Tableau and other data analytics topics.

Also, feel free to comment and share my Azure Cognitive Services in Tableau tweet:

#TC18 Sessions: Rock your Social Media Data with Tableau

My TC18 sessions in New Orleans: "Rock your Social Media Data with Tableau"

Anyone can analyze basic social media data in a few steps. But once you’ve started diving into social analytics, how do you bring it to the next level? This session will cover strategies for scaling a social data program. You’ll learn skills such as how to directly connect to your social media data with a Web Data Connector, considerations for building scalable data sources, and tips for using metadata and calculations for more sophisticated analysis.

First session: Tues, 23 Oct, 12:30-1:30 (Location: MCCNO – L3 – 333)

Second session: Wed, 24 Oct, 10:15-11:15 (Location: MCCNO – L3 – 346)

Twitter Analysis #TC18 Dashboard featured as Tableau Public Viz of the Day

Here are some key takeaways and links (i.e. additional resources) featured during my TC18 sessions to help you shape your social media data program, build a stronger presence, and retrieve powerful insights:

Prologue: Introducing data artist Noah

Step 1: Understand How to Succeed with Social Media

Apple officially joined Instagram on 7 August 2017. This isn’t your average corporate account, as the company doesn’t want to showcase its own products. Instead, Apple shares photos shot on an iPhone:

The Customer-Centric Data Strategy

Apple’s Instagram account is more of an extension of the “Shot on iPhone” billboard ad campaign.

And there are plenty of takeaways for every business:

  • Wrap your data around your customers, in order to create business value
  • Interact with your customer in a natural way
  • Understand your customer and customer behaviour better by analyzing social media data

Step 2: Define Your Social Objectives and KPIs

A previous record-holding tweet: In 2014, actor and talk show host Ellen DeGeneres took a selfie with a gaggle of celebrities while hosting the Oscars. That photo has 3.44 million retweets at the time of writing:

Social Objectives:

  • Define specific KPIs for social media platforms
  • KPI objectives need to be measurable
  • Metrics should be in line with the business goals

Step 3: Assemble Your KPIs

Brand Awareness and Reputation

Step 4: Connect Your Social Media with Tableau

Option 1 – Directly from the platform: Get data directly from Facebook, Twitter, YouTube, and more

Option 2 – Via web automation: Use a service like IFTTT to store data on Google Sheets

Option 3 – Via web data connector: Use Tableau’s web data connector, e.g. the Twitter Web Data Connector by Alex Ross (a.k.a. Tableau Junkie) -> http://bit.ly/tc18_twitter

Option 4 – Code your own solution: Use an API provided by the platform (see the sketch below) -> http://bit.ly/tc17_r_fetch

Option 5 – Via a third party platform: Get data from an integrated social media platform, such as Talkwalker -> http://bit.ly/tc17_talkwalker

Talkwalker - Via a Third Party Platform
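
To illustrate option 4, here is a minimal sketch that pulls recent tweets via the Twitter API and writes them to a CSV file Tableau can connect to. It assumes the tweepy library (pip install tweepy) and valid API credentials; all keys and the account name are placeholders:

import csv
import tweepy

# Authenticate with placeholder credentials (replace with your own).
auth = tweepy.OAuthHandler('CONSUMER_KEY', 'CONSUMER_SECRET')
auth.set_access_token('ACCESS_TOKEN', 'ACCESS_TOKEN_SECRET')
api = tweepy.API(auth)

# Fetch the latest 200 tweets from a timeline (hypothetical account).
tweets = api.user_timeline(screen_name='tableau', count=200)

# Write a flat CSV that Tableau can use as a data source.
with open('tweets.csv', 'w', newline='', encoding='utf-8') as f:
    writer = csv.writer(f)
    writer.writerow(['created_at', 'retweet_count', 'favorite_count', 'text'])
    for tweet in tweets:
        writer.writerow([tweet.created_at, tweet.retweet_count,
                         tweet.favorite_count, tweet.text])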

Step 5: Apply some Tips to Level Up

Gather Historical Data

Step 6: Explore Social Media Listening

Social listening means that you look beyond your own content. For example, Talkwalker offers AI-based image recognition and aggregation for online/offline media: http://bit.ly/tc17_talkwalker

Step 7: Leverage Your Analytics Tool Chain

Use Your R and Python Skills

Demo/Tutorial: Let’s See this in Tableau!

How to analyse Social Media traffic with Google Analytics in Tableau (YouTube):

How to analyse Social Media data from Twitter in Tableau (YouTube):

Slide Set

The slides presented at Tableau Conference are also available on SlideShare.

Are you on Social Media?

Feel free to retweet/share:

[Update 25 Oct 2018]: Missed the sessions? Watch the recording online!

#MakeoverMonday: An Interview with Authors Eva Murray and Andy Kriebel

#MakeoverMonday with Eva Murray and Andy Kriebel

#MakeoverMonday, one of the biggest community endeavors in data visualization, is hosted by Eva Murray and Andy Kriebel. Andy started doing makeovers almost 10 years ago as a way to document his learning progress on his blog vizwiz.com. In 2016, together with Andy Cotgreave, he turned #MakeoverMonday into a social data project by sharing weekly datasets and providing examples of data visualization best practices, as well as tips and tricks with Tableau. Andy is also Head Coach at The Information Lab Data School and a five-time Tableau Zen Master.

Eva joined #MakeoverMonday in 2017. She loves to blog about Tableau, travel, and triathlon on trimydata.com. Furthermore, Eva is the Head of Business Intelligence at Exasol and a 2018 Tableau Zen Master.

In a few days, Eva’s and Andy’s #MakeoverMonday book will be released. I interviewed both about their data background, where data analytics is heading, and of course, about #MakeoverMonday!

Alex Loth: Hi Eva, hi Andy, first of all thank you for the interview. Let’s start with your “data background”. How did you get interested in working with data?

Eva Murray: For me it started at university. I studied Psychology, HR, Accounting and Commercial Law. Psychology was by far the most interesting subject and for some reason I really took a liking to my statistics papers. I had never been very successful in maths during secondary school, but at university something clicked. The right-or-wrong nature of numbers was satisfying and provided a good balance to the fluffiness of essay writing. I aced all my stats papers and really enjoyed that part of my psychology degree. After university I joined Deloitte as a consultant for Information Management. That let me stumble into data. It was a mix of 80% PowerPoint and 20% data analysis and I loved both parts. From there I decided to move into the financial services industry and took on a role as an analyst because I wanted to sharpen those skills. And that’s when things really started because I was surrounded by data every day.

Andy Kriebel: I got interested in numbers from an amazing geometry teacher I had in high school. The beauty of him as a teacher was that he was blind. That’s right, a blind geometry teacher. From there, I had THE BEST professor at university who is still a mentor to me today. As for data itself, I’ve pretty much been involved with data since my career started. My first job was as an underwriter for an insurance company (slimy business that is!), then I went into a revenue planning role at Coca-Cola, where I first found Tableau in 2007.

Alex: What was the first data set you remember working with? What did you do with it?

Eva: The very first one was probably when I was 10 and I collected data about my gerbils. Dad helped me research on the internet to predict the fur color of the gerbil babies that were about to be born. I had quite the breeding operation going on (my biology teacher thought the ones he gave me were brothers, but they turned out to be a couple).
My first proper data analysis was done with survey data at university looking at responses, but I don’t really remember the topic. At uni we used SPSS to work with the data and the visualizations we built were typically scatterplots and histograms, focusing on the statistical relevance of the relationships between two metrics.

Andy: My dad was president of the local Little League (youth baseball) for about 20 years or so. I would go to games all weekend and help with scoring games. I would take all of the results and tabulate them by hand to calculate the stats, type them up on a typewriter and post them for any kid in the league to see. I was probably 9 or 10 when I started doing that.

Alex: Was there a specific “aha” moment when you realized the power of data?

Eva: Yes, definitely. It wasn’t until a bit later after finishing uni. I was working on a project which basically involved an IT audit, looking at individual line items of spending on various hardware. Having to manage, analyse and find insights in the huge amount of data in Excel was a massive challenge but it showed me just a snippet of the type of data that’s out there, ready to be taken apart. Finding insights and creating data stories became something that fascinated me. Another couple of years later when I got my hands on Tableau and was able to make data visible so much more easily, data started to really come to life for me.

Andy: Absolutely! It was the day I found Tableau. Getting insight into the data in a few minutes after downloading the software totally blew me away. I showed that to the Director of our group and we immediately began using it to measure our sales teams.

Alex: How important is data in your personal life?

Eva: As an endurance athlete, data is very important for me. I track my sleep, my weight, my training, distances ridden/run/swum, elevation conquered and the effort it took to get me there. I’m often fascinated by what the human body is capable of and having a way to put it in words and numbers through data is something I really enjoy. Of course I also try to learn something from the data so I can improve my performance.

I like using data to identify patterns, which in turn helps me build good habits and behaviours and stop the bad ones (at least I try to).

Andy: I’d say it’s less important than it used to be. I track a lot of quantified self data, but I don’t do much with it. I found that I became too obsessed with tiny things that didn’t really matter in my life, like step counts, weight, etc. I exercise enough to not worry about those things, so why worry about that? Just about the only thing I do now is create art with my fitness data.

Alex: Thank you for sharing. Next, let’s talk a little about helpful resources and where you think data analytics is heading. What is the book (or books) that has greatly influenced the way you work with data?

Eva: I have to say that for me it wasn’t a book in particular. When I work with data, I sit in front of a screen, so my go-to-resources are typically blogs and forums to find the answers.
In the early days of my Tableau journey, I heavily relied on the Tableau forums for help. Once I had a better understanding and knew what I was looking for, I shifted to blogs, such as the one Andy writes.
Quite honestly, if I need an answer, I ask Google first and, based on my knowledge of people in the community, I then quickly pick from the results based on the names that pop up.

Andy: If I had to pick one book, I’d say #MakeoverMonday :-). But to answer your question less selfishly, here are some books that influenced me:

Alex: What advice would you give to a student about to enter the “analytics world”? What advice should they ignore?

Eva: Don’t think you need to have a computer science or statistics background to be successful. Yes, it can help, but if you’re someone with curiosity, you’re well on your way. Different disciplines play into the field of data analysis. Here are some that come to mind for me:

  • Thinking and researching like a journalist, finding sources of information, checking them, building a story and sharing it effectively
  • Analysing and challenging the data like a researcher, not just taking it at face value but testing different hypotheses, running through different scenarios and checking the statistical validity of your conclusions
  • Structuring your results like an attorney, making sure you have solid foundations for your arguments and proof and facts to back up your claims
  • Looking at data like a graphic designer, making sure the story becomes visible in a beautiful and impactful visualization, using colors, white space, text and charts in the most effective way to elicit emotion in your audience and to draw them into your data story.

Andy: I’d agree with Eva: don’t let your “degree” get in the way of enjoying a career in data analytics. If you love numbers, jump right in. Find your niche, practice relentlessly, build a portfolio.

When approaching any project, try to answer five key questions: When? What? Where? Who? Why?

Alex: What are bad recommendations you hear in the area of analytics?

Eva: I don’t think they’re necessarily recommendations I hear but a phenomenon I have been witnessing is the almost compulsive move by everyone to do a Masters degree. Sure, if you’d like to do one after you graduate, go right ahead. Don’t feel like you have to do it, however, to be successful. If you instead spend those 12-18 months working, learning and applying your knowledge to real-life scenarios and gaining experiences in the real business world, you’ll probably benefit more than just financially. Having experience in applying your knowledge to client scenarios, finding solutions to problems and helping your organisations save money, improve processes, make greater contributions to their communities, etc. will probably be more exciting than spending more time at university for another certification to hang on your wall. No one ever asked me for my missing Masters title and getting my hands ‘dirty’ instead by working, learning on the job, seeking opportunities and pursuing them, has helped me greatly. Everyone should find their way and if you’re unsure whether or not you should stay another year or so at university, please don’t feel like you don’t have options or should do it because everyone is doing it.

Andy: That’s a good question. I often hear of people giving bad advice for how to approach data analysis and data visualization. People give advice that can be too complicated, which leads to frustration and kills someone’s interest. The most important thing anyone can do is keep it simple.

Alex: What does the future of analytics look like?

Eva: In my opinion, we’ll see a shift for analysts towards greater requirements for data science knowledge and skills. A lot of standard reporting will be automated but the stories we can tell with data will still come from human beings, from analysts who work with data and understand the human connection within the numbers.

We’ll hopefully see a lot fewer silos and much more collaboration within and across organisations. Data will become the lifeblood of humanitarian causes, with volunteers and nonprofits using data and analytics to drive change at scale and improve living conditions and wellbeing for millions of people around the world, because they will know when to act, what resources to send, and how to most effectively deploy the right people, machinery and processes in different parts of the world.

Andy: I’m hoping that the future of data leads to a better world to live in. I hope we can get through all of the noise and lies by using data and facts to educate people. I hope data is used to improve education and health, especially for those that don’t have the best access to those resources now. Maybe I’m living in a utopian world, but one can dream and I promise I’ll do my best to make it happen.

Alex: Very insightful. Finally, let’s talk about your initiative #MakeoverMonday. How did you come to found #MakeoverMonday?

Eva: I’ll let Andy answer the question on how Makeover Monday came about. He brought it to life, I joined him in 2017 and injected my own personality and ideas into the initiative. It’s been so much fun to see the project grow to hundreds of regular participants and to follow people’s growth and development.

Andy: As of this writing, I’ve done 224 vizzes for “Makeover Monday”, but really, I’ve been doing makeovers since my first blog post in August 2009. Credit for the name “Makeover Monday” goes to Emily Kund. She saw that I tended to do the makeovers on Mondays, and came up with the alliteration. It looks like my first official Makeover Monday was on April 28, 2014. #MakeoverMonday the community project started in January 2016 with me and Andy Cotgreave. Eva replaced Andy in January 2017 and the project has really taken off since.

Alex: What specific problem is #MakeoverMonday trying to solve? How would you describe it to someone who is not familiar with it?

Eva: Our mission is to improve the way we visualize and analyze data – one chart at a time. ‘We’ in this case is everyone. Not just Andy and me. Not just the Makeover Monday community. There are so many people in the world who work with data and there are countless examples of bad data visualizations. We want to change that. Beyond beautiful charts we want to help people create truthful, easy-to-understand representations of the data which bring various topics to their audiences in a way that resonates with them. There is so much knowledge in the world and to make it accessible, we need to find easy ways to distill complex scenarios into clear, simple representations.

The way Makeover Monday works is that every week, Andy and I provide a visualization and the accompanying data to our community. We ask participants to create an improved visualization of the same data.
To support the community, we run a weekly 75-minute feedback webinar where we help people with their questions, provide recommendations for improvement, and explain why some visualizations work better than others to represent the data at hand.
We also write a weekly blog post with lessons learned, provide feedback on social media, maintain a gallery with each week’s favorite vizzes, and have written a book that distills everything into a paper version people can use for reference.

Andy: Ultimately, we’re helping people learn, not only technically, but with data literacy and communication. There are way too many charts that communicate poorly and we’re hoping people can use #MakeoverMonday to improve on those charts, take what they learn into their day job, and ultimately find the career they’ve always wanted. It’s quite simple when you think about it.

Alex: What should we know about your new #MakeoverMonday book?

Eva: Our book has been a labor of love, bringing together lessons learned from thousands of Makeover Monday visualizations, close to 150 data sets and over 100 hours of webinar content we created, and showcasing the work from our community since the project started almost three years ago.

It puts the essentials into your hands, focusing on the foundations every analyst should build when it comes to their analysis and visualization skills. It is packed with over 300 examples from the community and can be read cover to cover or referred to as and when needed.

The book has been a very personal project, as we worked closely with our participants, as well as friends to create the final version. We had great support from the team at i-for-ideas.com who helped create a design that reflects the essence of what we and this project are all about.

Andy: I’m quite proud of how Eva and I pulled the book together so quickly. It’s a culmination of everything we have learned through the project. We took the most frequently discussed lessons and turned them into a practical guide for anyone with an interest in data visualization. I don’t want to give away too many spoilers.

Alex: What has been the most surprising insight you have found while writing the #MakeoverMonday book?

Eva: It wasn’t as much a surprise as it was a very nice realisation: Andy and I are very good at teamwork and playing to our strengths. We didn’t argue once about who would work on which tasks. We simply created a plan with everything we needed to do, split up the jobs according to our interests and preferences and got to work.
When one person was pressed for time, the other would take on a couple of extra jobs to ease the pressure and that’s how we went from book proposal to finished manuscript in 120 days.
While I’m not sure Andy is keen on a second book at this point, I’d be happy to write another one with him :-).

Andy: I was surprised at how little time it took. Don’t get me wrong, we spent countless nights and weekends writing, but it wasn’t nearly as bad as people had led me to believe. I quite enjoyed the writing too; it helped reinforce my personal learning and I find writing therapeutic.

Data Operations: When to Use Data Extracts and When to Use Live Connections

#dataops: follow the discussion on Twitter

Once you have optimally prepared your data for analysis, the question arises in what form you want to keep your data available so that you can obtain first insights quickly.

For most data sources, Tableau gives you the choice between a live connection, i.e. a direct connection to the database, and a data extract, i.e. a snapshot of the data. As Figure 1.1 shows, you can easily switch between the two connection types.

Figure 1.1: Buttons for switching between a live connection and a data extract

Live connections let you work with the data exactly as it currently exists in the database or file. When you extract data, you import some or all of it into Tableau’s Data Engine. This applies to Tableau Desktop as well as Tableau Server. Which connection method you should prefer depends on your situation and use case, your requirements, and on the availability of the database and the characteristics of your network.

Always Up to Date with a Live Connection

By connecting directly to your data source, you always visualize the most current data the database provides. If your database is updated in real time, you only need to refresh the Tableau visualization via the F5 function key, or by right-clicking the data source and selecting the Refresh option.

If you connect to large volumes of data, if the visualization contains a great amount of detail, or if your data is stored in a powerful database on appropriately equipped hardware, a direct connection can give you faster response times.

Choosing a direct connection does not rule out extracting the data later. Conversely, you can also switch from an extract back to a live connection by right-clicking the data source and unchecking the Use Extract option.

Independent with a Data Extract

Data extracts naturally lack the real-time updates of a live connection. However, using Tableau’s Data Engine offers a number of advantages:

Performance gains for slow data sources:

Your database may be under heavy query load or already busy with transactional operations. With the Data Engine, you can take the load off your database and let Tableau handle the data storage. Extracts are best refreshed outside peak hours. Tableau Server can also refresh extracts at fixed times, for example at 3 a.m.

Incremental extracts:

Incremental extracts also speed up refresh times, because Tableau does not refresh the entire extract file; it only adds new records. To run incremental extracts, you must specify a field to be used as the index. Tableau refreshes a row only if that index has changed, so keep in mind that changes to a data row that do not change the index field are not picked up by the refresh.

Limiting the data volume with filters:

Another way to speed up extracts is to apply filters when extracting the data. If the analysis does not require the entire data set, you can filter the extract so that it contains only the necessary records. If you have a very large data set, you will rarely need to extract the entire contents of the database. For example, your database may contain data for many regions, but you may only need the data for the "South" region.

To create such an extract, select Extract as the connection type and then click Edit next to it. The Extract Data window opens. With another click on Add, you can now create a filter that is applied to your extract (see Figure 1.2).

Figure 1.2: The data extract can be limited with filters

Additional functions for certain data sources:

Depending on the data source, certain aggregation functions such as median (for example, with Access databases) are not available over a live connection. If you work with an extract, you can use these functions even if the original data source does not support them.
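
For example, a calculated field like the following works against an extract even when the underlying source does not support the aggregation natively ([Sales] is a hypothetical measure):

MEDIAN([Sales])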

Data portability:

You can save extracts locally and use them even when the connection to your data source is unavailable. A live connection does not work if you cannot reach your data source via a local network or the internet. Extracts are also compressed and are usually much smaller than the original database tables, which makes it easier to transport the data.

Mind Data Privacy and Data Governance

In companies, data privacy and data governance, and with them the integrity and security of data, play an important role. If you distribute extracts to employees or business partners, you should take the potential confidentiality of your data into account. Consider restricting the contents of the extract with filters and aggregating it to the visible dimensions.

If you are unsure, it is better to err on the side of a live connection, since in that case your database controls permissions management and your data cannot be seen by people without sufficient authorization.

This post is the second part of the Data Operations series:

Part 1: Preparing your data optimally for analysis
Part 2: When to use data extracts and when to use live connections
Part 3: How to boost the performance of your data analysis and dashboards

This blog post is also an excerpt from the book "Datenvisualisierung mit Tableau", which was published on 31 July 2018:

The Tableau Book Is Now Available in Stores

Datenvisualisierung mit Tableau: the first German-language Tableau book is also available on Amazon

The first German-language book on data visualization with Tableau has found its way into bookstores and is now available to everyone interested. Whether you are a beginner or a seasoned expert in the world of data visualization, this book offers you valuable insights and hands-on guidance.

The first German-language Tableau book, Datenvisualisierung mit Tableau, is available now from:

Preview

Learn More about Datenvisualisierung mit Tableau

Learn more about Datenvisualisierung mit Tableau on the book’s website! Explore the topics the book covers and read reviews from other readers. This book will take your understanding of data visualization with Tableau to a new level.

Update 11 Aug 2018: In the Amazon top 20 computer science books!
Update 17 Aug 2018: In the Amazon top 10 computer science books!