One of the most highly anticipated and highly regarded reviews of the business intelligence market was published a couple of days ago. Gartner released the 2013 iteration of its famous Magic Quadrant for BI and Analytics Platforms (a.k.a. the Gartner BI MQ), and Tableau was cited as a "Leader" for the first time.
After four enjoyable years in research, especially during my time at CERN, I have made a significant decision: I am resigning from my postgraduate position and moving from academia to the exciting world of Capgemini. My passion for Data and Analytics remains strong and will be the core focus of my new role.
Capgemini: A New Adventure After Academia
Capgemini, one of the world’s largest consulting corporations, has caught my attention. Unlike many other consulting companies, Capgemini does not yet have a dedicated team to offer effective strategies and solutions employing Big Data, Analytics, and Machine Learning. This presents an exciting opportunity for me to contribute and innovate.
My Vision: Building a Data-Driven Future at Capgemini
I love these technologies and am confident in my ability to craft a business development plan that drives growth. Alongside defining the target customers and markets, my plan includes new services such as:
Data Science Strategy: Enabling organizations to solve problems with insights from analytics.
Consulting: Answering questions using data.
Development: Building custom tools such as interactive dashboards, data pipelines, customized Hadoop setups, and data preparation scripts.
Training: Offering training at all skill levels, from basic dashboard design to deep dives into R, Python, and D3.js.
This plan also includes a go-to-market strategy, which I’ll keep under wraps for now. Stay tuned for a retrospective reveal in the future!
Reflecting on My Transition from Academia
Making this transition from academia to a corporate role was a carefully considered decision. As I previously shared in my reflection on my software engineering internship at SAP, the blend of technological challenges and team collaboration has always intrigued me. Joining Capgemini allows me to continue pursuing my passion for data in a dynamic business environment.
Conclusion: Exciting Times Ahead
This transition from academia to Capgemini marks a thrilling new chapter in my career. I look forward to leveraging my expertise in Data and Analytics to contribute to Capgemini’s growth and innovation.
Follow my journey as I explore the intersection of data, technology, and business. Connect with me on Twitter and LinkedIn.
Physics projects don’t get any bigger than this. The European Organization for Nuclear Research, better known as CERN, was founded in 1954, is headquartered in Geneva, Switzerland, and employs thousands of world-class scientists at the forefront of breakthrough research. Its claim to fame is unmatched: it is the birthplace of the World Wide Web and home to the 17-mile-long underground particle accelerator known as the Large Hadron Collider. Here, see photos of the many aspects of an international institution probing whether anything can travel faster than the speed of light and how our universe was pieced together.
There are four key issues to overcome if you want to tame Big Data: volume (the quantity of data), variety (the different forms of data), velocity (how fast the data is generated and processed), and veracity (the varying quality of the data). You have to be able to deal with lots and lots of data, in all kinds of formats and of varying quality, moving really quickly.
That is why Big Data Analytics has a huge impact on how we plan CERN’s overall technology strategy as well as specific strategies for High-Energy Physics analysis. We want to profit from our data investment and extract knowledge from it, and this has to be done in a proactive, predictive, and intelligent way.
The following presentation shows how we use Big Data Analytics to improve the operation of the Large Hadron Collider.
I have become a Python geek and Gnuplot maniac since joining CERN around three years ago. I have to admit, however, that I really enjoy the flexibility of D3.js and its capability to render histograms directly in the web browser.
D3 is a JavaScript library for manipulating documents based on data. It helps you bring data to life using HTML, CSS, and SVG, and embed the result in your website.
The following example loads a CSV file, which includes 10,000 dimuon events (i.e. events containing two muons) from the CMS detector, and displays the distribution of the invariant mass M (in GeV, in bins of size 0.1 GeV):
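Since the embedded code is not reproduced here, below is a minimal sketch of how such a histogram could be built. It uses the modern D3 v7 API (the original post predates this version), and it assumes a local file named dimuons.csv with a numeric column M holding the invariant mass in GeV; both names are placeholders for illustration.

```html
<!-- Minimal sketch: dimuon invariant-mass histogram with D3 v7.
     File name "dimuons.csv" and column "M" are assumptions. -->
<div id="histogram"></div>
<script src="https://d3js.org/d3.v7.min.js"></script>
<script>
  const width = 700, height = 400;
  const margin = { top: 20, right: 20, bottom: 40, left: 50 };

  d3.csv("dimuons.csv", d3.autoType).then(data => {
    // Extract the invariant mass column (column name "M" is an assumption).
    const masses = data.map(d => d.M);

    // x scale over the observed mass range.
    const x = d3.scaleLinear()
        .domain(d3.extent(masses)).nice()
        .range([margin.left, width - margin.right]);

    // Bin the masses into 0.1 GeV-wide bins.
    const bins = d3.bin()
        .domain(x.domain())
        .thresholds(d3.range(x.domain()[0], x.domain()[1], 0.1))
        (masses);

    // y scale: number of events per bin.
    const y = d3.scaleLinear()
        .domain([0, d3.max(bins, b => b.length)]).nice()
        .range([height - margin.bottom, margin.top]);

    const svg = d3.select("#histogram").append("svg")
        .attr("width", width)
        .attr("height", height);

    // One rectangle per bin.
    svg.append("g")
        .attr("fill", "steelblue")
      .selectAll("rect")
      .data(bins)
      .join("rect")
        .attr("x", b => x(b.x0) + 1)
        .attr("width", b => Math.max(0, x(b.x1) - x(b.x0) - 1))
        .attr("y", b => y(b.length))
        .attr("height", b => y(0) - y(b.length));

    // Axes: invariant mass M [GeV] vs. events per 0.1 GeV bin.
    svg.append("g")
        .attr("transform", `translate(0,${height - margin.bottom})`)
        .call(d3.axisBottom(x));
    svg.append("g")
        .attr("transform", `translate(${margin.left},0)`)
        .call(d3.axisLeft(y));
  });
</script>
```

The bin thresholds are generated with d3.range in steps of 0.1 GeV, matching the bin size mentioned above; everything else is standard D3 scale, selection, and axis plumbing.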
Feel free to download the sample CSV dataset here.