At the start of a new engagement, managers consider many activities: project planning, effort estimation, defining goals and metrics, cost, expected outcome, and so on. One factor that is crucial for any project's success is choosing the right onshore/offshore staffing ratio. Yet this factor is rarely given adequate attention in current delivery models. To meet project profit margins, managers try to limit the cost spent on project resources and execution. With a limited resourcing budget, there is no default onshore/offshore ratio that fits all projects.
After gathering some experience working offshore (2007-2008 in Bangalore, India) and onshore (in Germany and Switzerland), I started to wonder whether there is an optimal onshore/offshore ratio. I quickly concluded that this question is not easy to answer. So I broke it down into several aspects and, instead of answering them myself, set up a survey. I hope to get your support!
Start the survey: http://bit.ly/offshoreratio [Update 15 Nov 2014]: After collecting data over four weeks (18 Oct – 14 Nov), the survey is closed. Results will follow soon.
Basically, I’d like to address three groups with this survey:
Employees of traditional consulting firms
Employees of Indian pure players (such as Infosys, TCS, HCL, Wipro, etc.)
Employees of clients of consulting firms
Of course, I’m going to share the results after evaluation. Thank you for participating and for sharing the link with your colleagues! Retweets are also highly appreciated…
Are you involved in projects that are delivered (partly) offshore? I'd be happy if you would share your experience! http://t.co/wCqJbUa0TO
Asian countries, particularly those in South Asia and Southeast Asia, remain favored destinations for organizations looking to outsource business processes offshore. India remains the top outsourcing destination, with unrivaled advantages in scale and people skills, according to the 2014 Global Services Location Index (GSLI) released by A.T. Kearney. China and Malaysia rank second and third, respectively.
The GSLI, which tracks offshoring patterns to lower-cost developing countries and the rise of new locations, evaluates the underlying fundamentals of 51 countries across three general categories: financial attractiveness, people skills and availability, and business environment.
Published since 2004, the GSLI reveals that leading Indian IT-services companies, to whom IT-related functions were outsourced, are extending their traditional offerings to include research and development, product development, and other niche services. The line between IT outsourcing and business-process outsourcing is blurring there, as providers offer bundled and specialized services to their customers and develop skills in niche domains.
Furthermore, the GSLI identified a trend of multinationals reassessing their outsourcing strategies: after aggressively outsourcing back-office operations in the mid-2000s, some companies are starting to reclaim some of these functions and undertake them in-house again.
Recently, Tableau released an exciting new feature: R integration via RServe. Tableau with R seems to bring my data science toolbox to the next level! In this tutorial I’m going to walk you through the installation and how to connect Tableau with RServe. I will also show an example of calling an R function with a parameter from Tableau and visualizing the results in Tableau.
1. Install and start R and RServe
You can download base R from r-project.org. Next, invoke R from the terminal to install and run the RServe package:
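As a minimal sketch of this step, the following R commands (run inside an R session) install the Rserve package from CRAN and start the daemon; by default, Tableau connects to Rserve on TCP port 6311:

```r
# Install the Rserve package from CRAN (one-time setup)
install.packages("Rserve")

# Load the package and start the Rserve daemon;
# Tableau connects to it on port 6311 by default
library(Rserve)
Rserve()
```

Leave this R session (and the Rserve daemon) running while you work in Tableau.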
[Update 26 Jun 2016]: Tableau 8.1 screenshots were updated with Tableau 10.0 (Beta) screenshots due to my upcoming Advanced Analytics session at TC16, which is going to reference back to this blog post.
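To illustrate the idea of calling an R function with a parameter from Tableau: once the Rserve connection is configured in Tableau, a calculated field can pass a Tableau aggregate to R through the `.arg1` placeholder. The field name `[Sales]` below is just a hypothetical example from your own data source:

```r
# Tableau calculated field: sends SUM([Sales]) per partition
# to R as .arg1 and returns the result of mean(.arg1)
SCRIPT_REAL("mean(.arg1)", SUM([Sales]))
```

`SCRIPT_REAL` is one of Tableau’s R script functions (alongside `SCRIPT_INT`, `SCRIPT_STR`, and `SCRIPT_BOOL`); the first argument is an R expression evaluated on Rserve, and the remaining arguments are mapped to `.arg1`, `.arg2`, and so on.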
I have enjoyed research for the last four years. Yet, I have decided to resign from my postgraduate position at CERN, and to move to Capgemini. I will continue on the areas I love: Data and Analytics!
Capgemini is one of the world’s largest consulting corporations. Like many other consulting companies, Capgemini does not yet have a dedicated team offering effective strategies and solutions based on Big Data, Analytics and Machine Learning.
I love these technologies, and I am very confident that I can develop a business development plan to drive business growth through customer and market definition, including new services such as:
Data Science Strategy (enable organizations to solve business problems increasingly with insights from analytics)
Consulting (answering questions using data)
Development (building custom data-related tools like interactive dashboards, pipelines, customized Hadoop setup, data prep scripts…)
Training (across a variety of skill levels; from basic dashboard design to deep dive in R, Python and D3.js)
This plan is also accompanied by a go-to-market strategy, which I don’t want to unveil on my blog. Maybe in retrospect in a few years, so stay tuned…
There are four key issues to overcome if you want to tame Big Data: volume (the quantity of data), variety (the different forms of data), velocity (how fast the data is generated and processed) and veracity (the varying quality of data). You have to be able to deal with lots and lots of all kinds of data, moving really quickly.
That is why Big Data Analytics has a huge impact on how we plan CERN’s overall technology strategy as well as specific strategies for High-Energy Physics analysis. We want to profit from our data investment and extract the knowledge. This has to be done in a proactive, predictive and intelligent way.
The following presentation shows you how we use Big Data Analytics to improve the operation of the Large Hadron Collider.