European Privacy Regulators Step Up Scrutiny of Business Data Practices

Many companies have gradually adjusted business practices to comply with the GDPR since it took effect in 2018

In January, Facebook parent Meta Platforms was fined $414 million by Ireland’s privacy regulator, a decision that could make it much more difficult for Meta to collect data.
Photo: Dado Ruvic/Reuters

European privacy regulators are reaching beyond investigations into run-of-the-mill violations of the General Data Protection Regulation, such as data breaches, and eyeing companies’ business models, scrutinizing their contracts and considering more nuanced aspects of how the nearly five-year-old law applies. 

Cybersecurity failures and misuse of artificial intelligence show up in privacy fines, with regulators sanctioning companies that don’t require long, complex passwords to protect data.

Increased attention to technical details, combined with companies’ adoption of new technologies, means corporate privacy officers need to continually reassess how they meet the requirements of the GDPR, said Odia Kagan, a partner at law firm Fox Rothschild LLP. 

Compliance with “GDPR is not a one-time thing,” she said.

Regulators are gradually receiving budget increases after complaining for years that they lacked the funds and staff to properly enforce the GDPR. Eighty-seven percent of regulators said they don’t have enough staff in a survey published in September by the European Data Protection Board, the umbrella group of regulators. Some offices received budget increases last year and plan to hire more experts, which would allow them to step up enforcement.

The size of Ireland’s regulator, which oversees European operations for many large multinationals including Alphabet Inc.’s Google and Meta Platforms Inc., grew from 110 staff members in 2018 to an estimated 258 by the end of 2022, the government said last year. The office’s 2023 budget was increased to 26 million euros, around $28 million, to hire two commissioners and other employees and respond to “investigative complexity,” the Irish Department of Justice said in July.

The Swedish regulator plans to hire 50 employees in the first half of this year, including legal, technology and cybersecurity specialists, after receiving increased funding between 2022 and 2024, a spokesman said. 

Businesses adjust

Many businesses have adapted some data practices to comply with the law, which appeared onerous to many corporate executives and came with the threat of high fines when it went into effect in 2018. Gradually, the GDPR has forced changes such as decreasing the time many human-resources departments hold on to job applicants’ information to comply with data minimization rules, said Barry Cook, an independent data protection consultant who works with multinational companies in different industries. 

Companies also bring privacy and product teams together earlier and more regularly now, he said, a change prompted by the law’s requirement to build privacy features into products, instead of having privacy teams review them for the first time before they go to market.

In the past, companies often didn’t know what data they held, which brought privacy risks and often led to higher IT costs, Mr. Cook said. “It has gotten easier because companies are seeing the positive spinoffs,” he added. 

At German car maker Porsche, for example, privacy-specialized engineers are involved in software development and designing products with built-in privacy features. 

For many companies, it was confusing four years ago to assess basic questions like how to justify collecting personal information for common practices, including monitoring the corporate network for security problems, said Tomas Hedman, head of privacy at Embracer Group AB, the Swedish parent company of videogame development studios. 

Monitoring the network captures employees’ data, but those questions are now clear, he said, in part because regulators have issued decisions and guidelines to explain specific topics. Corporate privacy leaders initially rushed to comply with specific elements of the GDPR, and over time have developed processes to integrate discussions about data protection into the business, Mr. Hedman said. “It’s changed from being a completely legal work to being a more business and change management thing,” he said.

Big fines

In November, the French regulator tied cybersecurity practices to data privacy violations at Discord Inc., fining the chat service 800,000 euros, in part for accepting passwords of just six characters, which it considered too weak. Discord changed its policy to satisfy the regulator’s recommendation, requiring passwords of at least eight characters drawn from at least three of four character types: lowercase letters, uppercase letters, numbers and special characters. After 10 unsuccessful login attempts, a user must also complete a Captcha verification to sign in. 
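The policy described above can be expressed as a simple check. This is a minimal sketch of the rule as reported (eight-character minimum, at least three of four character classes), not Discord's actual implementation; the function name and structure are illustrative.

```python
import string

def meets_password_policy(password: str) -> bool:
    """Illustrative check of the policy described in the article:
    at least eight characters, drawn from at least three of four
    classes (lowercase, uppercase, digits, special characters).
    Not Discord's actual code."""
    if len(password) < 8:
        return False
    # Count how many of the four character classes are present.
    classes = [
        any(c.islower() for c in password),
        any(c.isupper() for c in password),
        any(c.isdigit() for c in password),
        any(c in string.punctuation for c in password),
    ]
    return sum(classes) >= 3
```

Under this rule, a six-character password fails on length alone, and a long password made up of only lowercase letters fails because it uses just one character class.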

In an early January decision against Facebook parent Meta Platforms, Ireland’s privacy regulator reached into one of the company’s core business processes. The regulator slapped the company with a $414 million fine and a decision that could make it much more difficult for Meta to collect data. Meta can no longer rely on contracts with users to claim that people agree to sharing their data if they accept contractual terms, the regulator said. 

Meta objected to the decision, saying that it doesn’t prevent targeted advertising, but concerns how the company justifies data collection. The company said it is evaluating options to continue its personalized ads. 

“An entire business model falls by the wayside,” said Omer Tene, a partner at law firm Goodwin Procter LLP.

European privacy regulators have also edged into growing areas of technology to scrutinize technical details affecting data. The Hungarian regulator, for example, last year fined Budapest Bank Zrt. 250 million forint, or around $682,000, for using artificial intelligence software to analyze emotions during calls with customers without properly informing them about the tool. 

The regulator said in its decision that artificial intelligence needs special attention because it poses new privacy risks, and the bank needed to obtain explicit consent from consumers to analyze their voices. Instead, the company had told consumers in a privacy notice that calls were recorded, but didn’t explain the AI analytics tool it used. 

Budapest Bank didn’t respond to a request for comment. 

As regulators gain more experience with the GDPR, they are looking more frequently into how corporate practices align with principles of the GDPR, instead of whether they comply with certain requirements such as carrying out a risk assessment, said Gabriela Zanfir-Fortuna, vice president for global privacy at the Future of Privacy Forum, a think tank based in Washington, D.C. Several regulators are moving into more sophisticated types of investigations and looking at how to apply the privacy law to algorithms, she said. 
