The dust is slowly settling after an impressive TC18. During my wrap-up, I remembered the data warehouse benchmarks from the Azure & Tableau session by James Rowland-Jones, especially because my customers ask me about such performance metrics over and over again.
The first benchmark (graph above) shows how Microsoft Azure SQL Data Warehouse (aka SQL DW) outperforms Amazon Redshift in terms of both performance and price, while the second benchmark shows further performance tests for Amazon Redshift, Snowflake, Azure, Presto, and Google BigQuery:
Since James’ session is already available on Tableau’s YouTube channel, feel free to watch the entire Azure & Tableau session:
The Welcome Reception at #TC18 has officially started—from a parade (New Orleans themed, of course!) to networking with our #DataFam! 🎉
This morning we kicked off #TC18 with 17,000 data rockstars! 🎉 We shared some exciting announcements including Ask Data, Tableau Prep Conductor, Tableau Developers Program, big news for Tableau Foundation, and more. Learn all about them: https://t.co/CiXWo8qtxO
Honoured & humbled to win the @mcristia Community Leader Award at the #Vizzies yesterday. This came as a complete surprise to me. Thank you to everyone that voted & a special thank you to @emily1852 & @Matt_Francis for renaming the award in honour of Michael #TC18 #Tableau
Anyone can analyze basic social media data in a few steps. But once you’ve started diving into social analytics, how do you take it to the next level? This session will cover strategies for scaling a social data program. You’ll learn skills such as how to directly connect to your social media data with a Web Data Connector, considerations for building scalable data sources, and tips for using metadata and calculations for more sophisticated analysis.
Here are some key takeaways and links (i.e., additional resources) featured during my TC18 sessions to help you formulate your social media data program in order to build a stronger presence and gain powerful insights:
Step 1: Understand How to Succeed with Social Media
Apple officially joined Instagram on 7 August 2017. This isn’t your average corporate account, as the company doesn’t want to showcase its own products. Instead, Apple shares photos shot with an iPhone:
And there are plenty of takeaways for every business:
Wrap your data around your customers in order to create business value
Interact with your customer in a natural way
Understand your customer and customer behaviour better by analyzing social media data
Step 2: Define Your Social Objectives and KPIs
A previous record-holding tweet: In 2014, actor and talk show host Ellen DeGeneres took a selfie with a gaggle of celebrities while hosting the Oscars. That photo had 3.44 million retweets at the time of writing:
Social listening means that you look beyond your own content. For example, Talkwalker offers AI-powered image recognition and aggregation for online/offline media: http://bit.ly/tc17_talkwalker
#MakeoverMonday, one of the biggest community endeavors in data visualization, is hosted by Eva Murray and Andy Kriebel. Andy started doing makeovers almost 10 years ago on his blog vizwiz.com to document his learning progress. In 2016, together with Andy Cotgreave, he turned #MakeoverMonday into a social data project, sharing weekly datasets and providing examples of data visualization best practices as well as tips and tricks for Tableau. Andy is also Head Coach at The Information Lab Data School and a five-time Tableau Zen Master.
Eva joined #MakeoverMonday in 2017. She loves to blog about Tableau, travel, and triathlon on trimydata.com. Furthermore, Eva is the Head of Business Intelligence at Exasol and a 2018 Tableau Zen Master.
In a few days, Eva’s and Andy’s #MakeoverMonday book will be released. I interviewed both about their data background, where data analytics is heading, and, of course, about #MakeoverMonday!
Alex Loth: Hi Eva, hi Andy, first thank you for the interview. Let’s start with your “data background”. How did you get interested in working with data?
Eva Murray: For me it started at university. I studied Psychology, HR, Accounting and Commercial Law. Psychology was by far the most interesting subject and for some reason I really took a liking to my statistics papers. I had never been very successful in maths during secondary school, but at university something clicked. The right or wrong nature of numbers was satisfying and provided a good balance to the fluffiness of essay writing. I aced all my stats papers and really enjoyed that part of my psychology degree. After university I joined Deloitte as a consultant for Information Management. That let me stumble into data. It was a mix of 80% PowerPoint and 20% data analysis and I loved both parts. From there I decided to move into the financial services industry and took on a role as an analyst because I wanted to sharpen those skills. And that’s when things really started because I was surrounded by data every day.
Andy Kriebel: I got interested in numbers from an amazing geometry teacher I had in high school. The beauty of him as a teacher was that he was blind. That’s right, a blind geometry teacher. From there, I had THE BEST professor at university who is still a mentor to me today. As for data itself, I’ve pretty much been involved with data since my career started. My first job was as an underwriter for an insurance company (slimy business, that is!), then I went into a revenue planning role at Coca-Cola, where I first found Tableau in 2007.
Alex: What was the first data set you remember working with? What did you do with it?
Eva: The very first one was probably when I was 10 and I collected data about my gerbils. Dad helped me research on the internet to predict the fur color of the gerbil babies that were about to be born. I had quite the breeding operation going on (my biology teacher thought the ones he gave me were brothers, but they turned out to be a couple). My first proper data analysis was done with survey data at university, looking at responses, but I don’t really remember the topic. At uni we used SPSS to work with the data and the visualizations we built were typically scatterplots and histograms, focusing on the statistical relevance of the relationships between two metrics.
Andy: My dad was president of the local Little League (youth baseball) for about 20 years or so. I would go to games all weekend and help with scoring games. I would take all of the results and tabulate them by hand to calculate the stats, type them up on a typewriter and post them for any kid in the league to see. I was probably 9 or 10 when I started doing that.
Alex: Was there a specific “aha” moment when you realized the power of data?
Eva: Yes, definitely. It wasn’t until a bit later after finishing uni. I was working on a project which basically involved an IT audit, looking at individual line items of spending on various hardware. Having to manage, analyse and find insights in the huge amount of data in Excel was a massive challenge but it showed me just a snippet of the type of data that’s out there, ready to be taken apart. Finding insights and creating data stories became something that fascinated me. Another couple of years later when I got my hands on Tableau and was able to make data visible so much more easily, data started to really come to life for me.
Andy: Absolutely! It was the day I found Tableau. Getting insight into the data in a few minutes after downloading the software totally blew me away. I showed that to the Director of our group and we immediately began using it to measure our sales teams.
Alex: How important is data in your personal life?
Eva: As an endurance athlete, data is very important for me. I track my sleep, my weight, my training, distances ridden/run/swum, elevation conquered and the effort it took to get me there. I’m often fascinated by what the human body is capable of and having a way to put it in words and numbers through data is something I really enjoy. Of course I also try to learn something from the data so I can improve my performance.
I like using data to identify patterns, which in turn helps me build good habits and behaviours and stop the bad ones (at least I try to).
Andy: I’d say it’s less important than it used to be. I track a lot of quantified self data, but I don’t do much with it. I found that I became too obsessed with tiny things that didn’t really matter in my life, like step counts, weight, etc. I exercise enough to not worry about those things, so why worry about that? Just about the only thing I do now is create art with my fitness data.
Alex: Thank you for sharing. Next, let’s talk a little about helpful resources and where you think data analytics is heading. What book (or books) has greatly influenced the way you work with data?
Eva: I have to say that for me it wasn’t a book in particular. When I work with data, I sit in front of a screen, so my go-to resources are typically blogs and forums to find the answers. In the early days of my Tableau journey, I heavily relied on the Tableau forums for help. Once I had a better understanding and knew what I was looking for, I shifted to blogs, such as the one Andy writes. Quite honestly, if I need an answer, I ask Google first and, based on my knowledge of people in the community, I then quickly pick from the results based on the names that pop up.
Andy: If I had to pick one book, I’d say #MakeoverMonday :-). But to answer your question less selfishly, here are some books that influenced me:
Alex: What advice would you give to a student about to enter the “analytics world”? What advice should they ignore?
Eva: Don’t think you need to have a computer science or statistics background to be successful. Yes, it can help, but if you’re someone with curiosity, you’re well on your way. Different disciplines play into the field of data analysis. Here are some that come to mind for me:
Thinking and researching like a journalist, finding sources of information, checking them, building a story and sharing it effectively
Analysing and challenging the data like a researcher, not just taking it at face value but testing different hypotheses, running through different scenarios and checking the statistical validity of your conclusions
Structuring your results like an attorney, making sure you have solid foundations for your arguments and proof and facts to back up your claims
Looking at data like a graphic designer, making sure the story becomes visible in a beautiful and impactful visualization, using colors, white space, text and charts in the most effective way to elicit emotion in your audience and to draw them into your data story.
Andy: I’d agree with Eva, don’t let your “degree” get in the way of enjoying a career in data analytics. If you love numbers, jump right in. Find your niche, practice relentlessly, build a portfolio.
When approaching any project, try to answer five key questions: When? What? Where? Who? Why?
Alex: What are bad recommendations you hear in the area of analytics?
Eva: I don’t think they’re necessarily recommendations I hear, but a phenomenon I have been witnessing is the almost compulsive move by everyone to do a master’s degree. Sure, if you’d like to do one after you graduate, go right ahead. Don’t feel like you have to do it, however, to be successful. If you instead spend those 12-18 months working, learning and applying your knowledge to real-life scenarios and gaining experience in the real business world, you’ll probably benefit more than just financially. Having experience in applying your knowledge to client scenarios, finding solutions to problems and helping your organisations save money, improve processes, and make greater contributions to their communities will probably be more exciting than spending more time at university for another certification to hang on your wall. No one ever asked me for my missing master’s title, and getting my hands ‘dirty’ instead by working, learning on the job, seeking opportunities and pursuing them has helped me greatly. Everyone should find their own way, and if you’re unsure whether or not you should stay another year or so at university, please don’t feel like you don’t have options or that you should do it just because everyone else is.
Andy: That’s a good question. I often hear of people giving bad advice for how to approach data analysis and data visualization. People give advice that can be too complicated, which leads to frustration and kills someone’s interest. The most important thing anyone can do is keep it simple.
Alex: What does the future of analytics look like?
Eva: In my opinion, we’ll see a shift for analysts towards greater requirements for data science knowledge and skills. A lot of standard reporting will be automated but the stories we can tell with data will still come from human beings, from analysts who work with data and understand the human connection within the numbers.
We’ll hopefully see a lot fewer silos and much more collaboration within and across organisations. Data will become the lifeblood of humanitarian causes, with volunteers and nonprofits using data and analytics to drive change at scale and improve the living conditions and wellbeing of millions of people around the world, because they will know when to act, what resources to send, and how to most effectively deploy the right people, machinery and processes in different parts of the world.
Andy: I’m hoping that the future of data leads to a better world to live in. I hope we can get through all of the noise and lies by using data and facts to educate people. I hope data is used to improve education and health, especially for those that don’t have the best access to those resources now. Maybe I’m living in a utopian world, but one can dream and I promise I’ll do my best to make it happen.
Alex: Very insightful. Finally, let’s talk about your initiative #MakeoverMonday. How did you come to found #MakeoverMonday?
Eva: I’ll let Andy answer the question on how Makeover Monday came about. He brought it to life, I joined him in 2017 and injected my own personality and ideas into the initiative. It’s been so much fun to see the project grow to hundreds of regular participants and to follow people’s growth and development.
Andy: As of this writing, I’ve done 224 vizzes for “Makeover Monday”, but really, I’ve been doing makeovers since my first blog post in August 2009. Credit for the name “Makeover Monday” goes to Emily Kund. She saw that I tended to do the makeovers on Mondays, and came up with the alliteration. It looks like my first official Makeover Monday was on April 28, 2014. #MakeoverMonday the community project started in January 2016 with me and Andy Cotgreave. Eva replaced Andy in January 2017 and the project has really taken off since.
Alex: What specific problem is #MakeoverMonday trying to solve? How would you describe it to someone who is not familiar with it?
Eva: Our mission is to improve the way we visualize and analyze data – one chart at a time. ‘We’ in this case is everyone. Not just Andy and me. Not just the Makeover Monday community. There are so many people in the world who work with data and there are countless examples of bad data visualizations. We want to change that. Beyond beautiful charts we want to help people create truthful, easy-to-understand representations of the data which bring various topics to their audiences in a way that resonates with them. There is so much knowledge in the world and to make it accessible, we need to find easy ways to distill complex scenarios into clear, simple representations.
The way Makeover Monday works is that every week, Andy and I provide a visualization and the accompanying data to our community. We ask participants to create an improved visualization of the same data. To support the community, we run a weekly 75-minute feedback webinar where we help people with their questions, provide recommendations for improvement, and explain why some visualizations work better than others to represent the data at hand. We also write a weekly blog post with lessons learned, provide feedback on social media, maintain a gallery with each week’s favorite vizzes, and have written a book that distills everything into a paper version people can use for reference.
Andy: Ultimately, we’re helping people learn, not only technically, but with data literacy and communication. There are way too many charts that communicate poorly and we’re hoping people can use #MakeoverMonday to improve on those charts, take what they learn into their day job, and ultimately find the career they’ve always wanted. It’s quite simple when you think about it.
Alex: What should we know about your new #MakeoverMonday book?
Eva: Our book has been a labor of love, bringing together lessons learned from thousands of Makeover Monday visualizations, close to 150 data sets, and over 100 hours of webinar content we created, and showcasing the work of our community since the project started almost three years ago.
It puts the essentials into your hands, focusing on the foundations every analyst should build when it comes to their analysis and visualization skills. It is packed with over 300 examples from the community and can be read cover to cover or referred to as and when needed.
The book has been a very personal project, as we worked closely with our participants, as well as friends to create the final version. We had great support from the team at i-for-ideas.com who helped create a design that reflects the essence of what we and this project are all about.
Andy: I’m quite proud of how Eva and I pulled the book together so quickly. It’s a culmination of everything we have learned through the project. We’re taking the most frequently discussed lessons and turning them into a practical guide for anyone with an interest in data visualization. I don’t want to give away too many spoilers.
Alex: What has been the most surprising insight you have found while writing the #MakeoverMonday book?
Eva: It wasn’t as much a surprise as it was a very nice realisation: Andy and I are very good at teamwork and playing to our strengths. We didn’t argue once about who would work on which tasks. We simply created a plan with everything we needed to do, split up the jobs according to our interests and preferences and got to work. When one person was pressed for time, the other would take on a couple of extra jobs to ease the pressure and that’s how we went from book proposal to finished manuscript in 120 days. While I’m not sure Andy is keen on a second book at this point, I’d be happy to write another one with him :-).
Andy: I was surprised at how little time it took. Don’t get me wrong, we spent countless nights and weekends writing, but it wasn’t nearly as bad as people had led me to believe. I quite enjoyed the writing too; it helped reinforce my personal learning and I find writing therapeutic.
Once you have prepared your data for analysis, the next question is how you want to keep it available so that you can gain initial insights quickly.
For most data sources, Tableau lets you choose between a live connection, i.e. a direct connection to the database, and a data extract, i.e. a snapshot of the data. As Figure 1.1 shows, you can easily switch between the two connection types.
Live connections let you work with the data as it currently exists in the database or file. When you extract data, you import some or all of it into Tableau's Data Engine; this applies to Tableau Desktop as well as Tableau Server. Which connection method you should prefer depends on your situation and use case, your requirements, and the availability of the database and the characteristics of your network.
Always up to date with a live connection
By connecting directly to your data source, you always visualize the most current data the database provides. If your database is updated in real time, you only need to refresh the Tableau visualization, either by pressing the F5 function key or by right-clicking the data source and selecting the Refresh option.
If you are connecting to large volumes of data, if your visualization contains a great deal of detail, or if your data is stored in a powerful database on suitably equipped hardware, a direct connection can deliver faster response times.
Choosing a direct connection does not rule out extracting the data later. Conversely, you can also switch back from an extract to a live connection by right-clicking the data source and unchecking the Use Extract option.
Independent with a data extract
By their nature, data extracts do not offer the real-time updates of a live connection. However, using Tableau's Data Engine brings a number of advantages:
Improved performance with slow data sources:
Your database may be under heavy query load or already busy with transactional operations. With the Data Engine, you can offload your database and let Tableau take over the data storage. It is best to refresh extracts outside peak hours. Tableau Server can also refresh extracts at scheduled times, for example at 3 a.m.; a refresh can likewise be triggered from a script, as the sketch below shows.
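If you manage many published data sources, queuing such a refresh programmatically can be convenient. Here is a minimal sketch using the tableauserverclient Python package; the server URL, token credentials, and the data source name "Sales" are illustrative assumptions, not values from this text.

```python
# Minimal sketch: queue an extract refresh on Tableau Server with
# tableauserverclient (pip install tableauserverclient).
# Server URL, token credentials, and data source name are placeholders.
import tableauserverclient as TSC

auth = TSC.PersonalAccessTokenAuth("my-token-name", "my-token-secret",
                                   site_id="my-site")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

with server.auth.sign_in(auth):
    # Find the published data source whose extract should be refreshed.
    all_datasources, _ = server.datasources.get()
    target = next(ds for ds in all_datasources if ds.name == "Sales")

    # Queue the refresh; Tableau Server executes it asynchronously.
    job = server.datasources.refresh(target)
    print(f"Refresh job {job.id} queued.")
```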
Incremental extracts:
Incremental extracts also shorten the refresh time, because Tableau does not rebuild the entire extract file; it only appends new records. To run incremental extracts, you have to specify a field that serves as the index. Tableau updates a row only if its index has changed. Keep in mind, therefore, that changes to a data row that do not alter the index field are not picked up by the refresh, as the conceptual sketch below illustrates.
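To make that caveat concrete, here is a purely conceptual Python sketch of index-based incremental refresh; it illustrates the behavior described above and is not Tableau's actual implementation.

```python
# Conceptual sketch of index-based incremental refresh
# (illustration only, not Tableau's implementation).
extract = [
    {"order_id": 1, "amount": 100.0},
    {"order_id": 2, "amount": 200.0},
]
source = [
    {"order_id": 1, "amount": 999.0},  # amount changed, index unchanged
    {"order_id": 2, "amount": 200.0},
    {"order_id": 3, "amount": 300.0},  # new index value
]

# Only rows with an index beyond the highest value already extracted
# are appended; existing rows are left untouched.
last_index = max(row["order_id"] for row in extract)
extract += [row for row in source if row["order_id"] > last_index]

# Order 3 is picked up, but order 1 still shows the stale amount 100.0.
print(extract)
```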
Limiting the data volume with filters:
Another way to speed up extracts is to apply filters while extracting the data. If the analysis does not require the entire data set, you can filter the extract so that it contains only the necessary records. If you have a very large data set, you will rarely need to extract the entire contents of the database. For example, your database may contain data for many regions, while you may only need the data for the "South" region.
To create such an extract, select Extract as the connection type and then click the adjacent Edit link. The Extract Data window opens. With another click on Add you can now create a filter that is applied to your extract (see Figure 1.2).
Additional functions for certain data sources:
Depending on the data source, some aggregation functions, such as Median (for example with Access databases), are not available over a live connection. If you work with an extract, you can use these functions even if the original data source does not support them.
Data portability:
You can store extracts locally and use them even when the connection to your data source is unavailable. A live connection does not work if you cannot reach your data source via a local network or the internet. Extracts are also compressed and are usually considerably smaller than the original database tables, which makes it easier to pass the data around. Extracts are stored as .hyper files (as of Tableau 10.5), which you can also create programmatically, as the sketch below shows.
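Here is a minimal sketch of creating such a local extract file with the official tableauhyperapi Python package; the schema, table, column, and file names are illustrative assumptions.

```python
# Minimal sketch: create a local .hyper extract file with the official
# tableauhyperapi package (pip install tableauhyperapi).
# Schema, table, column, and file names are placeholders.
from tableauhyperapi import (Connection, CreateMode, HyperProcess, Inserter,
                             SchemaName, SqlType, TableDefinition, TableName,
                             Telemetry)

table = TableDefinition(
    table_name=TableName("Extract", "Sales"),
    columns=[
        TableDefinition.Column("Region", SqlType.text()),
        TableDefinition.Column("Amount", SqlType.double()),
    ],
)

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint,
                    database="sales.hyper",
                    create_mode=CreateMode.CREATE_AND_REPLACE) as connection:
        connection.catalog.create_schema(SchemaName("Extract"))
        connection.catalog.create_table(table)
        with Inserter(connection, table) as inserter:
            inserter.add_rows([("South", 1234.5), ("North", 987.6)])
            inserter.execute()
```

The resulting sales.hyper file can then be opened directly in Tableau Desktop like any other extract.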
Pay attention to data privacy and data governance
In companies, data privacy and data governance, and with them the integrity and security of the data, play an important role. If you distribute extracts to employees or business partners, take the potential confidentiality of your data into account. Consider restricting the contents of the extract with filters and aggregating it to the visible dimensions.
If you are unsure, it is safer to work with a live connection, because in that case your database controls the permissions management, and your data cannot be seen by people without sufficient privileges.