More and more companies have deployed People Analytics projects that have had a real impact on the success of their employees and their businesses. We’re also seeing fast growth in the number of companies building their own People Analytics units, to continuously strengthen the alignment between the talents of their (current and future) employees and their organization’s objectives.
Examples of the great results I have witnessed myself are: improved quality of hire due to powerful algorithms, increased retention due to crystal clear insight into the key drivers of engagement (followed by interventions), more diversity due to less biased selection processes, and real-time insight into how to build the most effective teams. This is all great news!
As always, amongst all these successes we’re also confronted with limitations, pitfalls, and risks. From day one, a key challenge we’ve encountered in People Analytics projects is winning trust. We need to show people that analytics can be trusted, and to reiterate that People Analytics is about more than automating people decisions while excluding or replacing human insight, experience, and even gut feeling.
More recently, I’ve noticed an increase in the number of questions and concerns with respect to topics such as privacy, data security, data ownership, ethics, and the reliability and integrity of algorithms. There’s a greater awareness that algorithms can and do sometimes inherit the biases of their developers and of the data they are trained on. There’s been huge publicity about Google’s biased search algorithms and Facebook’s biased news feeds. It would be naive to assume that People Analytics projects are immune to this kind of problem.
It’s easy to see why we need to be careful when applying data and algorithms 'blindly'. I think it is good to realize that People Analytics is not the holy grail, but that it should instead complement existing academic knowledge and our own day-to-day expertise. Brought together, the theory, experts, and data create valuable insights. It is vitally important not to isolate them from one another.
When discussing the topic of trust, it helps to distinguish between the different aspects of trust.
Moving forward, we expect that fewer people will be willing to share their data by default. The General Data Protection Regulation (GDPR), coming into force in May 2018, will enable people to exercise their right not to share their data. Increasingly they will want to know “what’s in it for me?” and “can I trust you?”.
I anticipate that people will increasingly take greater ownership of their personal data. Privacy by design will become the standard, and it will be a real challenge to convince people to share their data with you unless they trust you 100%.
Research shows that top decision makers want their organisations to make more fact-based people decisions. However, many of them also feel uncomfortable trusting the data in their systems. KPMG has published some interesting research on how to restore trust using four anchors.
People have a hard time accepting and trusting automated decisions, even more so when it comes to HR-related decisions. This distrust is often referred to as ‘algorithm aversion’: the tendency to distrust evidence-based algorithms, even when they are known to outperform human forecasts and decisions.
I recently came across some interesting research focused on overcoming this aversion. One key takeaway is that we need to give people the ability to modify algorithms, even if the scope for modification is relatively small.
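The idea of a small, bounded modification can be made concrete. Below is a minimal sketch, assuming a numeric model forecast (e.g. a retention-risk score) and a configurable adjustment band; the function name, the 10% band, and the example score are illustrative assumptions, not details from the research itself.

```python
def bounded_adjustment(model_forecast: float,
                       user_adjustment: float,
                       max_band: float = 0.10) -> float:
    """Return the final forecast: the model's output plus the user's
    adjustment, clamped to within +/- max_band (a fraction of the
    forecast). The user keeps a sense of control, but the algorithm's
    signal stays largely intact."""
    limit = abs(model_forecast) * max_band  # size of the allowed band
    clamped = max(-limit, min(limit, user_adjustment))
    return model_forecast + clamped

# A user nudges a hypothetical retention-risk score of 0.80 down by 0.30,
# but the band caps the effective change at 0.08, yielding 0.72.
final = bounded_adjustment(0.80, -0.30)
```

The design choice here mirrors the research finding: even a narrow band of influence can make people markedly more willing to accept an algorithm's output.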
It’s vital that people trust that the insights generated from the data will be used for the right reasons. This comes down to trusting the people who will be using that information to support their decisions.
It is crucial to communicate that we are not aiming to automate the human but to humanize the data.
We should make it clear we are using the data to create insights that are beneficial for the company’s strategic objectives, but above all that this is beneficial for the people who are involved.
For instance, gathering insights into work behaviors to find out how people collaborate at work, what they share with each other and how they communicate with customers can create valuable knowledge for the employees involved. This kind of exercise can help them become happier and more effective at work. On the other hand, it could also be used to let people go, or to become stricter about what they can and can’t do in their roles.
To summarize, I think we must be careful that we do not underestimate the challenges we need to overcome to win the trust of all our stakeholders, including, or especially, the subjects of the analyses. If we don’t take this seriously and put it at the core of everything we do, it could very well be that we have already reached the limits of People Analytics.
Good and effective People Analytics teams must have the highest moral standards and must always put people first. People will not accept a ‘black box’ making decisions about them. To win trust, it is essential that we understand and explain the algorithms that power these decisions. Complete transparency and the ability to modify the algorithms (even slightly) are key; otherwise, analytics will soon run out of data.
If you'd like to know more about how Cubiks can help you get People Analytics up and running in your organisation, email us on email@example.com.
Jouko van Aggelen is one of the Cubiks partners and before heading the analytics team he was Director of Cubiks in the Netherlands. As a data enthusiast, Jouko is passionate about connecting people and data to create actionable insights on the trends, opportunities and challenges that matter to HR. He believes that People Analytics is about much more than algorithms, and he’s passionate about helping HR understand the unique talent stories data analytics can tell. If you want to reach out to him please connect via LinkedIn, or simply drop him an email.
Article originally published on analyticsinHR.com.