As a data enthusiast and very early adopter of Tableau, I was excited to join Tableau, a Seattle-based startup building the next generation of self-service data analytics software – a step beyond classic BI tools. My first weeks have been nothing short of amazing, with an incredible opportunity to contribute to company building and to be one of the first employees in Tableau’s recently opened Frankfurt office, established to ramp up Tableau’s business in Europe.
Being part of a startup is an incredible experience, and I am thrilled to have the opportunity to work on such an innovative and disruptive product. It is a privilege to be involved in building a company from the ground up, especially in an industry as exciting as data analytics.
Tableau’s new Frankfurt office has brought exciting opportunities, especially for someone like me who has just earned an MBA. I have been able to apply my newfound knowledge to contribute to the growth of the company. I am honored to be part of the team bringing this product to market and to learn from some of the best minds in the business.
Tableau’s bootcamp in Seattle was outstanding. The three-week program is intense, but the wealth of knowledge and experience I gained from it has been invaluable. I learned a great deal about the company’s culture, the product, and the industry as a whole. The bootcamp has given me a solid foundation for my work at Tableau and has helped me hit the ground running in my role as one of the first employees in Frankfurt, Europe’s hub for finance and technology.
Tableau is known for its unique company culture that encourages creativity, innovation, and collaboration. From weekly hackathons to Tableau’s famous Data Night Out events, there’s always something exciting happening at the company. As someone who is passionate about data and thrives in a collaborative environment, I couldn’t be more thrilled to be a part of this culture.
Being involved in company building is a great thing when you’re in a startup, and I am honored to be part of this exciting journey. I look forward to continuing to contribute to the growth of the company and to be part of a team that is making such a huge impact in the world of data analytics. If you’re looking for a challenging and rewarding career in data analytics, Tableau is definitely the place to be.
My Data Science journey started at CERN, where I finished my master’s thesis in 2009. CERN, the European Organization for Nuclear Research, is the home of the Large Hadron Collider (LHC) and seeks to answer fundamental questions: how does the universe work, and what is it made of? CERN collects almost unbelievable amounts of data – 35 petabytes per year that need to be analysed. After submitting my thesis, I continued my Data Science research at CERN.
Today, companies have realized that Business Analytics needs to be an essential part of their competitive strategy, and the demand for Data Scientists is growing exponentially. To me, Data Science is more about asking the right questions than about the data itself. The MBA taught me that data does not provide insights unless appropriately questioned. Delivering excellent Big Data projects requires a full understanding of the business: developing the questions, distilling the adequate amount of data to answer them, and communicating the proposed solution to the target audience.
“The task of leaders is to simplify. You should be able to explain where you have to go in two minutes.” – Jeroen van der Veer, former CEO of Royal Dutch Shell
How about some visual takeaways from the IMF’s World Economic Outlook? I recently prepared two nifty data visualizations with Tableau that I would like to share with you.
These visualizations let you explore plenty of economic data, including IMF staff estimates through 2020. Don’t forget to choose “Units” after switching “Subject” in the right-side bar. A detailed description of each subject is displayed below.
There are four key issues to overcome if you want to tame Big Data: volume (the quantity of data), variety (the different forms of data), velocity (how fast the data is generated and processed) and veracity (the variation in data quality). In short, you have to deal with lots and lots of data, of all kinds, arriving very quickly and at varying levels of quality.
That is why Big Data Analytics has a huge impact on how we plan CERN’s overall technology strategy, as well as specific strategies for High-Energy Physics analysis. We want to profit from our data investment and extract knowledge from it. This has to be done in a proactive, predictive and intelligent way.
The following presentation shows how we use Big Data Analytics to improve the operation of the Large Hadron Collider.
The raw data from the experiments is stored in structured files (using CERN’s ROOT framework), which are better suited to physics analysis than relational tables. Transactional relational databases (Oracle 11g with Real Application Clusters) store the metadata used to manage that raw data. For metadata residing in the Oracle database, Oracle TimesTen serves as an in-memory cache. The raw data is analysed on PROOF (Parallel ROOT Facility) clusters, while the Hadoop Distributed File System (HDFS) stores the monitoring data.
Just as in the CERN example, there are some significant trends in Big Data Analytics:
Descriptive Analytics, such as standard business reports, dashboards and data visualizations, has been widely used for some time and forms the core of traditional Business Intelligence. This kind of analysis looks at the static past and reveals what has occurred. One recent trend, however, is to include findings from Predictive Analytics, such as sales forecasts, on the dashboard.
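To make the distinction concrete, here is a minimal Python sketch using invented monthly sales figures (not data from any of the sources above). It computes the kind of descriptive summary a standard report would show, plus a simple trailing moving average of the sort now being placed next to those figures on dashboards as a forecast:

```python
from statistics import mean

# Hypothetical monthly sales figures (illustrative data only)
monthly_sales = [120, 135, 128, 150, 162, 158, 171, 180]

# Descriptive analytics: summarize what has already occurred
summary = {
    "total": sum(monthly_sales),
    "average": mean(monthly_sales),
    "best_month": max(monthly_sales),
    "worst_month": min(monthly_sales),
}

def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` values."""
    return mean(series[-window:])

print(summary)
print(moving_average_forecast(monthly_sales))
```

The moving average is deliberately naive; its point here is only to show how a forward-looking number can sit alongside purely descriptive ones.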
Predictive Analytics identifies trends, spots weaknesses and determines conditions for making decisions about the future. Its methods, such as machine learning, predictive modeling, text mining, neural networks and statistical analysis, have existed for some time. Software products such as SAS Enterprise Miner have made these methods much easier to use.
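As an illustration of the predictive side, the sketch below fits an ordinary least-squares line to a hypothetical quarterly revenue series (the numbers are invented for the example) and extrapolates one quarter ahead. This is the simplest member of the statistical-modeling family mentioned above:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (a basic predictive model)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical quarterly revenue in millions (illustrative numbers only)
quarters = [1, 2, 3, 4, 5, 6]
revenue = [10.0, 12.1, 13.9, 16.2, 18.0, 19.8]

a, b = fit_line(quarters, revenue)
forecast_q7 = a + b * 7  # extrapolate the trend one quarter ahead
print(round(forecast_q7, 2))
```

Real predictive work would validate such a model out of sample; this snippet only shows the mechanics of fitting and extrapolating a trend.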
Discovery Analytics is the ability to analyse new data sources. This creates additional opportunities for insights and is especially important for organizations with massive amounts of various data.
Prescriptive Analytics suggests what to do and can identify optimal solutions, often for the allocation of scarce resources. Prescriptive Analytics has been researched at CERN for a long time but is now finding wider use in practice.
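A toy Python example of the prescriptive idea: given a limited budget (the scarce resource) and a set of candidate projects, fund them in descending order of value per unit of cost. For a divisible resource this greedy rule yields an optimal allocation; all project names and numbers below are invented for illustration:

```python
def allocate(budget, projects):
    """Allocate a divisible scarce resource greedily by value density.

    `projects` is a list of (name, cost, expected_value) tuples.
    Returns a dict mapping project name to the amount funded.
    """
    plan = {}
    # Highest value per unit of cost first
    for name, cost, value in sorted(projects, key=lambda p: p[2] / p[1], reverse=True):
        spend = min(cost, budget)
        if spend > 0:
            plan[name] = spend
            budget -= spend
    return plan

# Hypothetical projects: (name, required budget, expected value)
projects = [
    ("dashboard-upgrade", 40, 120),  # value density 3.0
    ("ml-pipeline", 60, 240),        # value density 4.0
    ("data-cleanup", 30, 60),        # value density 2.0
]

print(allocate(100, projects))
```

With a budget of 100, the rule fully funds the densest project and spends the remainder on the next one; the lowest-density project gets nothing.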
Semantic Analytics suggests what you are looking for and provides a richer response, bringing a human level of understanding into analytics that we have not necessarily been able to extract from raw data streams before.
As these trends bear fruit, new ecosystems and markets are being created for broad, cross-enterprise Big Data Analytics. Use cases like CERN’s LHC experiments show how important Big Data Analytics is to the scientific community as well as to business.