9 Key Elements of a Successful Data Strategy for Business Growth

Get the Competitive Edge with Decisively Digital – The Ultimate Guide to Data Strategy

Data is a valuable asset that can give businesses a competitive edge and drive growth in today’s digital age. But without a clear and well-defined data strategy, companies risk missing out on the benefits that data provides. To help your business succeed in the digital world, here’s an overview of nine essential elements of a comprehensive data strategy.

    1. Goals and Objectives: Define specific goals and objectives that the company wants to achieve through its data efforts, such as improving customer experiences or optimizing business processes.
    2. Data Sources: Identify the most valuable data types and determine where they will come from, such as internal transaction or customer data and external market research.
    3. Data Management and Storage: Outline how data will be collected, organized, and stored consistently, accurately, and compliantly, with data management tools and technologies.
    4. Data Analysis and Reporting: Define how data will be analyzed and used to inform business decisions, with data visualization tools, dashboards, and reporting systems.
    5. Data Governance: Establish clear roles and responsibilities for data management, guidelines for data use and access, and ensure ethical and regulatory compliance.
    6. Data-driven Culture: Foster a data-driven culture by providing training and resources for data-driven decision making.
    7. Data Security and Privacy: Ensure data is collected, stored, and used securely and in compliance with privacy regulations.
    8. Data Integration and Interoperability: Define how data will be integrated and shared across systems and platforms.
    9. Data Quality and Accuracy: Ensure data is accurate and up-to-date, with processes for data cleansing and enrichment.

A data strategy is a must-have tool for any company that wants to fully realize the benefits of its data. It provides a clear roadmap for data collection, management, and analysis and helps organizations make better use of their data, drive growth, and succeed in today’s digital world. Get more insights and in-depth information by reading the book Decisively Digital (on Amazon).

Data Operations: How to Boost the Performance of Your Data Analysis and Dashboards

#dataops: Follow the discussion on Twitter

Are you unhappy with the speed of your data analysis? Do your dashboards take a long time to load? Then you, or your database administrator, can follow the tips below, which vary depending on the data source.

General Recommendations for Performance Optimization

Would you like to improve the speed of your analysis? Then keep the following points in mind:

  • Use several “smaller” data sources for individual questions instead of a single data source that is supposed to cover all questions.
  • Avoid unnecessary joins.
  • In Tableau, enable the “Assume Referential Integrity” option in the “Data” menu (see Figure 2.20). With this option enabled, Tableau only includes a joined table in the query if it is explicitly used in the view.* If your data does not have referential integrity, the query results may be inaccurate.
Figure 2.20: The “Assume Referential Integrity” option enabled in the “Data” menu

* For example, total sales are then determined with the SQL query SELECT SUM(Sales) FROM Sales instead of SELECT SUM(Sales) FROM Sales S INNER JOIN Products P ON S.ProductID = P.ProductID. (The table and column names here are illustrative.)

Performance Optimization Recommendations for Files and Cloud Services

When working with file formats such as Excel, PDF, or text files, or with data from cloud services such as Google Sheets, also pay attention to the following points:

  • Avoid unions across many files, since processing them is very time-consuming.
  • Use a data extract instead of a live connection if you are not working with a fast database system (see “When to Use Data Extracts and When to Use Live Connections”).
  • Make sure to select the “Single table” option rather than the “Multiple tables” option when creating the extract (see Figure 2.21). The resulting extract will be larger and take longer to create, but querying it will be many times faster.
Figure 2.21: The “Single table” option selected in the “Extract Data” dialog

Performance Optimization Recommendations for Database Servers

Are you working with data on a database server such as Oracle, PostgreSQL, or Microsoft SQL Server and want to improve access times? Then you, or the responsible database administrator, should also pay attention to the following points:

  • Define sensible index columns for your database tables.
  • Create partitions for your database tables (see the sketch after this list).
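
The following minimal sketch shows what both points can look like on PostgreSQL, driven from Python with psycopg2. The table and column names (sales, order_date) are placeholders, and the partition example assumes PostgreSQL 11 or later with a sales table that was declared with PARTITION BY RANGE (order_date):

# Minimal sketch: add an index and a range partition on PostgreSQL.
# All object names are placeholders; adapt them to your schema.
import psycopg2

conn = psycopg2.connect("dbname=analytics user=admin")
with conn, conn.cursor() as cur:
    # Index the columns your dashboards filter or join on most often.
    cur.execute("CREATE INDEX IF NOT EXISTS idx_sales_date ON sales (order_date)")
    # A range partition lets queries scan only the relevant time slice;
    # assumes 'sales' was created with PARTITION BY RANGE (order_date).
    cur.execute("""
        CREATE TABLE IF NOT EXISTS sales_2018
        PARTITION OF sales
        FOR VALUES FROM ('2018-01-01') TO ('2019-01-01')
    """)
conn.close()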

This post is the third part of the Data Operations series:

Part 1: How to Optimally Prepare Your Data for Analysis
Part 2: When to Use Data Extracts and When to Use Live Connections
Part 3: How to Boost the Performance of Your Data Analysis and Dashboards

This blog post is also based on a subchapter of the book “Datenvisualisierung mit Tableau”.

How to research LinkedIn profiles in Tableau with Python and Azure Cognitive Services

A few weeks after the fantastic Tableau Conference in New Orleans, I received an email from a data scientist who had attended my TC18 social media session and who is using Azure together with Tableau. She had quite an interesting question: How can a Tableau dashboard that displays contacts (name & company) automatically look up LinkedIn profile URLs?

Of course, researching LinkedIn profiles for a huge list of people is a very repetitive task. So let’s find a solution to improve this workflow…

1. Python and TabPy

We use Python to build the API requests, communicate with Azure Cognitive Services, and verify the returned search results. In order to use Python within Tableau, we need to set up TabPy. If you haven’t done this yet, check out my TabPy tutorial.
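
Once TabPy is running, you can quickly verify the connection from Python. This is a minimal sketch assuming a recent TabPy release with the tabpy_tools client and the default port 9004:

# Minimal sketch: check that a local TabPy server is reachable by
# deploying a trivial test function (assumes TabPy's default port 9004).
from tabpy.tabpy_tools.client import Client

client = Client("http://localhost:9004/")

def add(x, y):
    # Tableau passes each column as a list, so we operate element-wise.
    return [xi + yi for xi, yi in zip(x, y)]

client.deploy("add", add, "Adds two columns element-wise", override=True)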

2. Microsoft Azure Cognitive Services

One of many APIs provided by Azure Cognitive Services is the Web Search API. We use this API to search for name + company + “linkedin”. The first three results are then validated by our Python script. One of the results should contain the corresponding LinkedIn profile.
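
Before embedding the request in Tableau, it helps to test the query outside of Tableau. Here is a minimal sketch using the requests library; the key and the contact are placeholders, while the endpoint and header match the calculated field shown in the next section:

# Minimal sketch: query the Web Search API for 'name company linkedin'
# and print the top three result URLs. Replace API_KEY with your own key.
import requests

API_KEY = "xxx"
name, company = "Jane Doe", "Example Corp"  # hypothetical contact

response = requests.get(
    "https://api.cognitive.microsoft.com/bing/v7.0/search",
    headers={"Ocp-Apim-Subscription-Key": API_KEY},
    params={"q": name + " " + company + " linkedin", "count": "3"},
)
for result in response.json()["webPages"]["value"]:
    print(result["displayUrl"])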

3. Calculated Field in Tableau

Let’s put our Python script together and wrap it in a Calculated Field in Tableau:


SCRIPT_STR("
import http.client, urllib, base64, json
YOUR_API_KEY = 'xxx'
name = _arg1[0]
company = _arg2[0]
try:
headers = {'Ocp-Apim-Subscription-Key': YOUR_API_KEY }
params = urllib.urlencode({'q': name + ' ' + company + ' linkedin','count': '3'})
connection = http.client.HTTPSConnection('api.cognitive.microsoft.com')
connection.request('GET', '/bing/v7.0/search?%s' % params, '{body}', headers)
json_response = json.loads(connection.getresponse().read().decode('utf-8'))
connection.close()
for result in json_response['webPages']['value']:
if name.lower() in result['name'].lower():
if 'linkedin.com/in/' in result['displayUrl']:
return result['displayUrl']
break
except Exception as e:
return ''
return ''
", ATTR([Name]), ATTR([Company]))

4. Tableau dashboard with URL action

Adding a URL action with our new Calculated Field will do the trick. Now you can click on the LinkedIn icon and a new browser tab (or the LinkedIn app if installed) opens.

LinkedIn demo on Tableau Public

Is this useful for you? Feel free to download the Tableau workbook (don’t forget to add your API key), leave a comment and share this tweet:

The Empathy Machine: Are Digital Technologies the Best Bet in Telling about your Cause?

The panel discussion “The empathy machine: are digital technologies the best bet in telling about your cause?” took place on the opening day of the 2018 Fundamental Rights Forum. The forum was organized by the European Union Agency for Fundamental Rights (FRA) and was held at the METAStadt in Vienna on 25-27 September 2018.

In this panel discussion, Kadri Kõusaar (an Oscar-nominated film director), Fanny Hidvegi (European Policy Manager) and I discussed whether digital technologies really are the “empathy machine” and how innovative applications can help human rights defenders achieve challenging goals, such as changing public attitudes or meeting tough fundraising targets. The panel discussion was moderated by the virtual reality artist Dr. Frederick Baker.

In this blog post I want to share some of the panel’s questions, along with my answers:

1. How do algorithms interfere with human rights?

When algorithms make decisions, they tend to mirror what they are shown in their training sets. This is especially apparent for issues such as bias and machine discrimination. Both can result from the content of the training data, which may reflect existing inequalities.

2. So, it’s about the data? What else makes data so important today?

The effective use of data is vital for our understanding of fundamental issues, such as human rights violations and political instability, for informing our policy-making, and for enhancing our ability to predict the next crisis. Furthermore, the scope, complexity, and life-changing importance of the work being done on topics like these across the European Union have made it more important than ever for everyone participating in the public conversation and in democratic decision-making to have access to key data sources and to be able to derive insights from them.

3. Where is data coming from and how can people benefit?

Every time we google something, send a tweet, or just browse a website, we create data. With the rise of visual analytics, we can benefit from this vast amount of information. Visual analytics is a hands-on approach to interacting with data that does not require any programming skills. Furthermore, communicating with data is seen as one of the most relevant skills in today’s information age.

Global Refugee Crisis visualization on Tableau Public

4. What is the easiest way to find interesting data?

I would check out Google Dataset Search, Google’s new search engine for datasets, which was released just recently! Tableau Public is another good source for existing visualizations, many of which are based on public data.

5. What is required to enable organizations to use data for good?

Data can be used for the good of society, but private- and public-sector firms, nonprofits, and NGOs still lack analytics resources and expertise. Data and analytics leaders must cross traditional boundaries to use data for good, to better compete for limited talent, and to foster an ethical culture. VizForSocialGood and the Tableau Foundation are good examples.

6. How can the private sector contribute to data for good?

Some private sector organizations are making data open and available to researchers, nonprofits and NGOs. Examples include:

  • Mastercard anonymizing credit card data to be analyzed in smart city initiatives.
  • Google making search data available to hospitals to predict infectious disease outbreaks such as flu and dengue fever.
  • Insurance companies providing anonymized healthcare data to improve patient outcomes and prevention strategies.
  • Yelp providing ratings data to cities to prioritize food safety inspectors.

The panel discussion was followed by workshops in the afternoon.

#TC18 Sessions: Rock your Social Media Data with Tableau

My TC18 sessions in New Orleans: “Rock your Social Media Data with Tableau”

Anyone can analyze basic social media data in a few steps. But once you’ve started diving into social analytics, how do you bring it to the next level? This session will cover strategies for scaling a social data program. You’ll learn skills such as how to directly connect to your social media data with a Web Data Connector, considerations for building scalable data sources, and tips for using metadata and calculations for more sophisticated analysis.

First session: Tues, 23 Oct, 12:30-1:30 (Location: MCCNO – L3 – 333)

Second session: Wed, 24 Oct, 10:15-11:15 (Location: MCCNO – L3 – 346)

Twitter Analysis #TC18 Dashboard featured as Tableau Public Viz of the Day

Here are some key takeaways and links to additional resources featured during my TC18 sessions, to help you formulate your social media data program, build a stronger presence, and retrieve powerful insights:

Prologue: Introducing data artist Noah

Step 1: Understand How to Succeed with Social Media

Apple officially joined Instagram on 7 August 2017. This isn’t your average corporate account, as the company doesn’t want to showcase its own products. Instead, Apple shares photos shot with an iPhone:

The Customer-Centric Data Strategy

Apple’s Instagram account is more of an extension of the “Shot on iPhone” billboard ad campaign.

And there are plenty of takeaways for every business:

  • Wrap your data around your customers in order to create business value
  • Interact with your customers in a natural way
  • Understand your customers and their behaviour better by analyzing social media data

Step 2: Define Your Social Objectives and KPIs

A previous record-holding tweet: In 2014, actor and talk show host Ellen DeGeneres took a selfie with a gaggle of celebrities while hosting the Oscars. That photo has 3.44 million retweets at the time of writing:

Social Objectives:

  • Define specific KPIs for social media platforms
  • KPI objectives need to be measurable
  • Metrics should be in line with the business goals

Step 3: Assemble Your KPIs

Brand Awareness and Reputation
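
For example, KPIs such as an engagement rate can be assembled directly from exported platform data. Here is a minimal sketch in pandas; the file name and column names (tweets.csv, retweets, favorites, followers) are placeholders that depend on your export:

# Minimal sketch: compute a simple engagement-rate KPI from exported
# tweet data. All file and column names are placeholders.
import pandas as pd

tweets = pd.read_csv("tweets.csv")

# Engagement rate: interactions per tweet relative to audience size
tweets["engagement_rate"] = (tweets["retweets"] + tweets["favorites"]) / tweets["followers"]

print("Average engagement rate:", tweets["engagement_rate"].mean())
print("Top tweet:", tweets.loc[tweets["engagement_rate"].idxmax(), "text"])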

Step 4: Connect Your Social Media with Tableau

Option 1 – Directly from the platform: Get data directly from Facebook, Twitter, YouTube, and more

Option 2 – Via web automation: Use a service like IFTTT to store data on Google Sheets

Option 3 – Via web data connector: Use Tableau’s web data connector, e.g. the Twitter Web Data Connector by Alex Ross (a.k.a. Tableau Junkie) -> http://bit.ly/tc18_twitter

Option 4 – Code your own solution: Use an API provided by the platform (see the sketch after this list) -> http://bit.ly/tc17_r_fetch

Option 5 – Via a third party platform: Get data from an integrated social media platform, such as Talkwalker -> http://bit.ly/tc17_talkwalker
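
To illustrate Option 4, here is a minimal sketch that pulls tweets via the Twitter API with the tweepy library and stores them as a CSV file that Tableau can read. tweepy is not covered in the session itself, and the credentials and screen name are placeholders:

# Minimal sketch: fetch recent tweets with tweepy and save them as a CSV
# file that Tableau can read. Replace the placeholder credentials.
import csv
import tweepy

CONSUMER_KEY = "xxx"
CONSUMER_SECRET = "xxx"
ACCESS_TOKEN = "xxx"
ACCESS_TOKEN_SECRET = "xxx"

auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)
api = tweepy.API(auth)

with open("tweets.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["created_at", "user", "text", "retweets", "favorites"])
    # Page through the account's timeline; 200 tweets keeps the demo small.
    for tweet in tweepy.Cursor(api.user_timeline, screen_name="tableau").items(200):
        writer.writerow([tweet.created_at, tweet.user.screen_name,
                         tweet.text, tweet.retweet_count, tweet.favorite_count])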


Step 5: Apply some Tips to Level Up

Gather Historical Data

Step 6: Explore Social Media Listening

Social listening means that you look beyond your own content. For example, Talkwalker offers AI for image recognition and aggregation of online/offline media: http://bit.ly/tc17_talkwalker

Step 7: Leverage Your Analytics Tool Chain

Use Your R and Python Skills

Demo/Tutorial: Let’s See this in Tableau!

How to analyse Social Media traffic with Google Analytics in Tableau (YouTube):

How to analyse Social Media data from Twitter in Tableau (YouTube):

Slide Set

The slides presented at Tableau Conference are also available on SlideShare.

Are you on Social Media?

Feel free to retweet/share:

Missed the sessions? Watch the recording online!