How to Govern Analytical Models [GDPR Step 12]

The General Data Protection Regulation (GDPR), introduced by the European Union, took effect on May 25, 2018. With the introduction of GDPR, organizations have to govern their analytical models to ensure that these do not interfere with the rights of data subjects, such as customers, employees, and prospects.

Talend recently hosted an on-demand webinar, Practical Steps to GDPR Compliance, that focuses on a comprehensive, 16-step plan to set up a data governance program that supports GDPR compliance.

Governing analytical models is Step 12 in this plan. To learn more about the first eleven steps, check out the links in the sidebar.

Watch Practical Steps to GDPR Compliance now.

GDPR’s Perspective on Analytical Models

Analytical models feed off customer profiles to predict future behavioral patterns. Since these models use personal data for computations, it becomes important to assess how they use this data and whether they compromise data subjects’ privacy.

Automated Decision-Making and Profiling

Article 22 of the GDPR addresses automated individual decision-making, including profiling. Under this article, the data subject has the right not to be subject to a decision based solely on automated processing, including profiling, that produces legal effects concerning him or her or similarly significantly affects him or her.

This implies that the data subject has to provide an explicit opt-in, agreeing to be a part of any profiling by an organization, unless it’s for legitimate business purposes such as screening a potential employee who has a criminal record.

Another use case might be a company using an analytical model to automatically compute bonus payouts. If the company ensures that the data is fully protected using techniques, such as anonymization, that do not reveal the individual’s identity, then an opt-in is unnecessary. However, if the company feeds identifying characteristics into the model’s computation, the employee’s buy-in is needed. The article can therefore also be read as an incentive for companies to establish better data masking standards.
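To make the masking idea concrete, here is a minimal sketch of one approach: replacing identifying fields with salted hashes before a record reaches the model, so computation fields stay usable while raw identities are hidden. The `mask_record` helper, field names, and salt are all hypothetical, and strictly speaking this is pseudonymization rather than full anonymization (pseudonymized data remains personal data under the GDPR):

```python
import hashlib

def mask_record(record, identifying_fields, salt):
    """Replace identifying fields with salted hashes so the model
    never sees raw identities. Hypothetical helper for illustration."""
    masked = dict(record)
    for field in identifying_fields:
        if field in masked:
            digest = hashlib.sha256((salt + str(masked[field])).encode()).hexdigest()
            masked[field] = digest[:12]  # truncated token in place of the identity
    return masked

# Hypothetical bonus-payout input record
employee = {"name": "A. Smith", "employee_id": "E1234", "sales_total": 48000}
safe = mask_record(employee, ["name", "employee_id"], salt="s3cret")
print(safe["sales_total"])  # computation fields pass through untouched
```

Because the hash is deterministic for a given salt, masked records can still be joined or deduplicated without exposing the underlying identity.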

Recital 71 of the GDPR further requires organizations to disclose automated processing and to make the results available to data subjects (the “right to an explanation”).

Disparate Treatment and Impact

Article 9 of the GDPR mandates that legal and compliance sign off on the usage of special categories of data, such as race, ethnic origin, political opinions, and sexual orientation. When a company makes a decision using one of these special category fields, it can be construed as a discriminatory act, and the sign-off is therefore mandatory to confirm that the field is absolutely required for the processing.

However, there may be scenarios where the company may not intend to discriminate (disparate treatment), but the bias may occur as an accidental consequence (disparate impact).

For example, consider a bank that uses postal codes in its analytical models to make credit offers to customers. The use of postal codes may result in disparate impact if the bank excludes certain postal codes that have a significant amount of minority population residing in them.
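One simple way to screen a model for this kind of disparate impact is to compare selection rates across groups, for example with the “four-fifths rule” used in US employment-discrimination practice. That threshold is an assumption borrowed for illustration, not something the GDPR prescribes, and the outcome figures below are invented:

```python
def disparate_impact_ratio(outcomes_by_group):
    """outcomes_by_group maps group -> (selected, total).
    Returns the selection-rate ratio of the worst-off group to the best-off;
    values below ~0.8 are commonly flagged for review (four-fifths rule)."""
    rates = {g: selected / total for g, (selected, total) in outcomes_by_group.items()}
    return min(rates.values()) / max(rates.values())

# Hypothetical credit-offer outcomes per postal-code cluster
offers = {"zone_a": (80, 100), "zone_b": (30, 100)}
ratio = disparate_impact_ratio(offers)
print(f"{ratio:.2f}")  # well below the 0.8 screening threshold
```

A low ratio does not prove discrimination on its own, but it tells the governance team which models and input fields (here, postal code) deserve a closer look before release.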

How to Govern Analytical Models

Analytical models used within organizations have so far remained a black box, with no visibility into the logic that drives decisions. These models can no longer hide behind the pretext of complexity: if a bank rejects a prospect’s housing loan application, the reason has to be clearly communicated.

Here are a few strategies that companies can employ to govern analytical models:

  • Create controls through a data governance team so that all models need to be signed off by legal and compliance before being released into production.
  • Establish focused governance over analytical models that deal with risk and marketing propensities.
  • Design and test predictive models thoroughly to avoid situations of disparate impact.
  • Develop an inventory of models to evaluate impact if new data elements get added or the existing ones get modified.
  • Facilitate communication between data analytics, governance, legal, and compliance teams.

Data governance teams can build a model inventory using Talend Metadata Manager, recording the name of the model, the model owner, input and output variables, the model methodology, the creation date, and proof of sign-off by legal and compliance. Since the metadata manager already has the data landscape defined, it becomes an ideal framework into which to integrate the model inventory.
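Setting tooling aside, the inventory attributes listed above can be sketched as a simple record, with the legal/compliance sign-off control from the strategy list enforced as a gate. The class and field names are illustrative, not Talend Metadata Manager APIs:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ModelInventoryEntry:
    # Fields mirror the inventory attributes described above; names are illustrative.
    name: str
    owner: str
    input_variables: list
    output_variables: list
    methodology: str
    created: date
    legal_signoff: bool = False
    compliance_signoff: bool = False

    def approved_for_production(self) -> bool:
        """A model is releasable only once both teams have signed off."""
        return self.legal_signoff and self.compliance_signoff

# Hypothetical inventory entry awaiting compliance review
entry = ModelInventoryEntry(
    name="credit_propensity_v2", owner="risk-analytics",
    input_variables=["income", "tenure"], output_variables=["offer_score"],
    methodology="gradient boosting", created=date(2018, 11, 1),
    legal_signoff=True, compliance_signoff=False,
)
print(entry.approved_for_production())  # False until compliance also signs off
```

Keeping input variables explicit in each entry is what makes the impact evaluation above tractable: when a data element changes, the affected models can be found by a simple lookup.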

Next Steps to Governing Analytical Models

The challenge with governing analytical models is that it requires collaboration between dissimilar teams, such as data science (strategic) and legal and compliance (operational). For this partnership to work, organizations need to make a fundamental shift in mindset, prioritizing personal data privacy ahead of other decision-making criteria.

The next step of Talend’s comprehensive 16-step plan to achieve GDPR compliance is managing end user computing.

To learn more about this, and see all 16 steps together, don’t miss the on-demand webinar, Practical Steps to GDPR Compliance. The video covers information on developing standards and controls, identifying data owners and critical data elements, conducting risk assessments, improving data quality, and more.


Last Updated: November 29th, 2018