Transition from Academia to Capgemini: A New Chapter in Data and Analytics

CERN Main Auditorium: my transition from academia to Capgemini

After four rewarding years in research, especially during my time at CERN, I have made a significant decision: I am resigning from my postgraduate position and moving from academia to the exciting world of Capgemini. My passion for Data and Analytics remains strong and will be the core focus of my new role.

Capgemini: A New Adventure After Academia

Capgemini, one of the world’s largest consulting corporations, has caught my attention. Unlike many other consulting companies, Capgemini does not yet have a dedicated team to offer effective strategies and solutions employing Big Data, Analytics, and Machine Learning. This presents an exciting opportunity for me to contribute and innovate.

My Vision: Building a Data-Driven Future at Capgemini

I love these technologies and am confident in my ability to draw up a business development plan that drives growth. Starting from a clear definition of customers and markets, my plan introduces new services such as:

  • Data Science Strategy: Enabling organizations to solve problems with insights from analytics.
  • Consulting: Answering questions using data.
  • Development: Building custom tools such as interactive dashboards, data pipelines, customized Hadoop setups, and data preparation scripts.
  • Training: Offering courses at various skill levels, from basic dashboard design to deep dives into R, Python, and D3.js.

This plan also includes a go-to-market strategy, which I’ll keep under wraps for now. Stay tuned for a retrospective reveal in the future!

Reflecting on My Transition from Academia

Making this transition from academia to a corporate role has been a considered decision. As I previously shared in my reflection on my software engineering internship at SAP, the blend of technological challenges and team collaboration has always intrigued me. Joining Capgemini allows me to continue pursuing my passion for data in a dynamic business environment.

Conclusion: Exciting Times Ahead

This transition from academia to Capgemini marks a thrilling new chapter in my career. I look forward to leveraging my expertise in Data and Analytics to contribute to Capgemini’s growth and innovation.

Follow my journey as I explore the intersection of data, technology, and business. Connect with me on Twitter and LinkedIn.

Challenges of Big Data Analytics in High-Energy Physics

Challenges of Big Data Analytics: volume, variety, velocity and veracity
Screenshot of CERN Big Data Analytics presentation

There are four key issues to overcome if you want to tame Big Data: volume (the quantity of data), variety (the different forms of data), velocity (how fast the data is generated and processed) and veracity (the variation in data quality). In short, you have to handle vast amounts of data, in all kinds of formats, arriving very quickly, and of uneven quality.

That is why Big Data Analytics has a huge impact on how we plan CERN’s overall technology strategy, as well as specific strategies for High-Energy Physics analysis. We want a return on our data investment and to extract the knowledge it holds, and this has to be done in a proactive, predictive and intelligent way.

The following presentation shows how we use Big Data Analytics to improve the operation of the Large Hadron Collider.

Displaying Dimuon Events from the CMS Detector using D3.js

Physicists working on the CMS Detector

I have become a Python geek and GnuPlot maniac since joining CERN around three years ago. I have to admit, however, that I really enjoy the flexibility of D3.js and its capability to render histograms directly in the web browser.

D3 is a JavaScript library for manipulating documents based on data. It helps you bring data to life using HTML, CSS and SVG, and embed the result in your website.

The following example loads a CSV file containing 10,000 dimuon events (i.e. events with two muons) from the CMS detector and displays the distribution of their invariant mass M (in GeV, in bins of 0.1 GeV):
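A minimal sketch of the idea is shown below. The file name dimuon-events.csv, the column name M (the invariant mass in GeV) and the element id histogram are assumptions for illustration, and the sketch is written against the promise-based D3 v7 API rather than the release that was current when this post was written:

```html
<!-- Minimal sketch: file name "dimuon-events.csv", column "M" (invariant mass
     in GeV) and the element id "histogram" are assumptions for illustration. -->
<svg id="histogram" width="720" height="400"></svg>
<script src="https://d3js.org/d3.v7.min.js"></script>
<script>
  const svg = d3.select("#histogram"),
        margin = {top: 20, right: 20, bottom: 40, left: 50},
        width = +svg.attr("width") - margin.left - margin.right,
        height = +svg.attr("height") - margin.top - margin.bottom,
        g = svg.append("g")
               .attr("transform", `translate(${margin.left},${margin.top})`);

  d3.csv("dimuon-events.csv", d3.autoType).then(events => {
    // One invariant mass value per dimuon event.
    const masses = events.map(d => d.M);

    // x axis: invariant mass in GeV.
    const x = d3.scaleLinear()
        .domain(d3.extent(masses)).nice()
        .range([0, width]);

    // Fixed-width bins of 0.1 GeV, as described above.
    const bins = d3.bin()
        .domain(x.domain())
        .thresholds(d3.range(x.domain()[0], x.domain()[1], 0.1))(masses);

    // y axis: number of events per bin.
    const y = d3.scaleLinear()
        .domain([0, d3.max(bins, b => b.length)]).nice()
        .range([height, 0]);

    // One SVG rectangle per bin.
    g.selectAll("rect")
      .data(bins)
      .join("rect")
        .attr("x", b => x(b.x0))
        .attr("width", b => Math.max(0, x(b.x1) - x(b.x0) - 1))
        .attr("y", b => y(b.length))
        .attr("height", b => height - y(b.length));

    // Axes and an x-axis label.
    g.append("g").attr("transform", `translate(0,${height})`).call(d3.axisBottom(x));
    g.append("g").call(d3.axisLeft(y));
    g.append("text")
        .attr("x", width / 2).attr("y", height + 35)
        .attr("text-anchor", "middle")
        .text("Invariant mass M [GeV]");
  });
</script>
```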

Feel free to download the sample CSV dataset here.

Further reading: D3 Cookbook

CERN: Where Big Bang Theory meets Big Data Analytics

Screenshot of SQL Plan Baselines with Oracle Enterprise Manager at CERN

The volume, variety, velocity and veracity of data generated by the LHC experiments at CERN continue to reach unprecedented levels: some 22 petabytes of data this year, after throwing away 99% of what is recorded by the LHC detectors. This phenomenal growth means that not only must we understand Big Data in order to decipher the information that really counts, but we must also understand what we can achieve with Big Data Analytics.

The raw data from the experiments is stored in structured files (using CERN’s ROOT Framework), which are well suited to physics analysis. Transactional relational databases (Oracle 11g with Real Application Clusters) store the metadata used to manage that raw data. For metadata residing in the Oracle Database, Oracle TimesTen serves as an in-memory cache database. The raw data is analysed on PROOF (Parallel ROOT Facility) clusters, while the Hadoop Distributed File System (HDFS) is used to store the monitoring data.

The CERN example also illustrates some significant trends in Big Data Analytics:

  • Descriptive Analytics, covering standard business reports, dashboards and data visualization, has been widely used for some time and forms the core of traditional Business Intelligence. This ad hoc analysis looks at the static past and reveals what has occurred. One recent trend, however, is to include findings from Predictive Analytics, such as sales forecasts, on the dashboard.
  • Predictive Analytics identifies trends, spots weaknesses and determines conditions for making decisions about the future. The underlying methods, such as machine learning, predictive modeling, text mining, neural networks and statistical analysis, have existed for some time, and software products such as SAS Enterprise Miner have made them much easier to use.
  • Discovery Analytics is the ability to analyse new data sources. It creates additional opportunities for insight and is especially important for organizations with massive amounts of varied data.
  • Prescriptive Analytics suggests what to do and can identify optimal solutions, often for the allocation of scarce resources. It has been researched at CERN for a long time and is now finding wider use in practice.
  • Semantic Analytics suggests what you are looking for and provides a richer response, bringing a level of human understanding to Analytics that we have not necessarily been getting out of raw data streams before.

As these trends bear fruit, new ecosystems and markets are being created for broad, cross-enterprise Big Data Analytics. Use cases like CERN’s LHC experiments give us greater insight into how important Big Data Analytics is to the scientific community as well as to businesses.

bitcoin.de: The First German Marketplace for Bitcoins

Bitcoins are currently a red-hot topic, even here at CERN. Within just a few months, the value of one bitcoin (BTC) rose from 20 cents in December 2010 to as much as 30 dollars. Even so, mining is hardly worthwhile, at least not at current electricity prices.

The bitcoin exchange bitcoin.de now offers a remedy! A good six months later, on 26 August 2011, the first German marketplace for buying and selling bitcoins opened for trading. On bitcoin.de, users can easily sell bitcoins to other users or buy bitcoins from them.

To do so, users must register with bitcoin.de and, if they want to act as sellers, transfer a bitcoin balance to their user account. As soon as a buyer is found for their bitcoins, all payment details are automatically sent to the buyer.

Payment for the bitcoins is made directly between buyer and seller. Only once the payment has reached the seller are the bitcoins, minus a small fee, transferred from the seller’s balance to the buyer’s.