Big Tech’s last-minute attempt to tame EU tech rules
Lobbying in times of trilogues
As the EU discussions to set up new tech rules come to an end, new lobby documents expose the intense last-minute corporate campaign to influence the secretive final phase - the trilogues.
In short:
A new set of lobby documents, released by the European Commission and the Swedish Government via freedom of information requests, shows intense corporate lobbying to shape the final stage of the EU discussions on new tech rules.
Google, Apple, Facebook, Spotify and others jumped on the trilogues process to try to neutralise the EU Parliament’s proposals to limit surveillance ads and expand external scrutiny of how the platforms’ systems amplify or demote content. They also sought to increase the companies’ future leverage to avoid market obligations.
The lobby documents confirm fears that trilogue secrecy only benefits rich and well-connected lobbyists. The EU Institutions and national governments must take stock and finally make the trilogues process transparent and open to the public.
Just over a year after the European Commission proposed new rules for online platforms, the process for the EU Institutions to approve them is coming to an end. The European Parliament and Council are set to reach an agreement on the Digital Services Act (DSA) on Friday (22 April 2022). This new set of rules, touted as the EU’s attempt at reining in Big Tech, will tackle content issues including moderation policies, content ranking (recommenders) and surveillance advertising. The DSA follows on the heels of the agreement reached in March on the sister proposal dealing with platforms’ market power, the Digital Markets Act (DMA).
New lobbying documents released to Corporate Europe Observatory and Global Witness via freedom of information requests expose Big Tech’s last-minute attempt to influence the final stage of the EU policy-making process. Google, Apple, Facebook and others jumped on the trilogues process to try to neutralise the EU Parliament’s proposals to limit surveillance ads and expand external scrutiny of how the platforms’ systems amplify or demote content. They also sought to increase the companies’ future leverage to avoid market obligations like interoperability and access for smaller competitors.
On the other side, civil society organisations are seeking to improve users’ rights to privacy and increase protections from discrimination and manipulation. Their resources and access seem to be no match for those of the corporate lobbyists.
Digital Services Act and Digital Markets Act: heavy lobbying by corporate interests
The two digital proposals have the potential to address some key public interest issues regarding our online lives, and could significantly impact the way tech giants like Google and Facebook conduct their business and generate profits. The tech giants didn’t take this lying down: they heavily lobbied the EU Commission during the drafting of the proposals, and then moved on to the Council and Parliament.
New self-declared lobby data shows that during this period Google, Facebook, Apple, Amazon and Microsoft all increased their spending on EU lobbying. Combined, the Big Tech firms spent more than 27 million euros in just one year. All five companies upped their budgets, but the biggest increase by far came from Apple, which nearly doubled its lobbying expenditure.
But that wasn’t it: lobbyists then set their sights on the trilogues, the last stage of the EU policy-making process, when Council and Parliament try to reconcile their positions, under the stewardship of the Commission.
This process is one of the most secretive stages of EU policy-making, held entirely behind closed doors and with nearly no public access to the discussions. The EU Institutions have argued that this secrecy is needed in part to prevent lobbying pressure on the policy-makers.
New lobby documents obtained from the European Commission and the Swedish government via freedom of information requests show that intense corporate lobbying is happening regardless of the lack of transparency.
Google, Apple, Amazon and Facebook – alongside European firms like Spotify and the copyright industry – actively sought to influence the trilogues. They did so by:
- pitting the EU Institutions against one another;
- becoming more technical and offering amendments to the text;
- using meetings to gain access to information that was not available to the public;
- going high level: bringing in the CEOs to meet Commissioners, inviting them to off-the-record dinners.
These new documents provide unique insights into corporate lobbying by one of the most well-resourced global industries during one of the least open and transparent stages of the EU policy-making process. They show how the lack of transparency benefits big corporate lobbies and adds weight to the urgency of finally opening trilogues up to public scrutiny.
Lobbying in times of (secretive) trilogues
Trilogue negotiations are a crucial moment in EU policy-making: the time when Parliament and Council discuss and reach agreements on EU policy proposals. The EU Institutions are represented via a small number of negotiators, normally the lead MEP for the file (rapporteur), the minister in charge of the specific file for the country holding the Presidency of the Council of the EU and the responsible Commissioner.
According to the EU Parliament, in 2018 between 70 and 80% of the European Union’s legislative acts were adopted following a trilogue. In the majority of cases, the final trilogue agreement was swiftly adopted. Yet, this process is more secretive than regular EU policy-making: meetings are held behind closed doors and access to documents relating to these discussions is mostly not allowed.
The trilogues for the Digital Services Act and Digital Markets Act were no different. The agendas for meetings were not made public, and neither the Parliament nor the Council has published updates with the agreements achieved so far (known as the four-column document as it presents the original Commission text, the Parliament and Council’s positions, and a fourth column with the final agreed compromise). Transparency of the process is achieved through incidental leaks, either to media outlets (mostly behind paywalls) or to civil society organisations like Lobbycontrol.
Access to documents requests should enable democratic scrutiny and participation, notably on issues like these where there is a divergence of opinion between Commission, Parliament and Council. With the help of FragDenStaat, 40 people have directly requested the DSA and DMA trilogue documents from the European Parliament. So far the Parliament has yet to release the documents. The Council actually showed better transparency practices, releasing parts of the relevant document - although with substantial delays.
This secrecy means that only the well-resourced and well-connected lobbying actors can follow and intervene in trilogues, and excludes citizens from crucial discussions that will have an impact on their lives. In February 2022, Corporate Europe Observatory, alongside 40 other civil society organisations, wrote to the negotiators demanding an end to this secrecy. We have yet to receive a reply.
Pushback against limits to surveillance ads continues
Surveillance advertising dominated discussions on the Digital Services Act and the Digital Markets Act. An intense corporate pushback had already successfully blocked Parliament’s proposals to ban behavioural advertising, or at least severely limit it (by turning tracking off by default) (see box 2 below). A last-minute surprise saw new limits to surveillance ads being approved by the European Parliament, including a ban on targeting minors and on using sensitive categories of data (e.g. religion, sexuality, racial or ethnic origin), collected or inferred, to target vulnerable people with ads.
Civil society organisations welcomed this development, while observing that it still doesn’t go far enough. The EU consumer organisation, BEUC, for instance, commented that “banning ads that track minors or that are based on sensitive personal data is a very positive step forward, but this will not put an end to widespread online commercial surveillance.”
Surveillance ads
Surveillance advertising – also known as tracking or behavioural advertising – relies on the massive collection of data on people, using sources including the websites they visit, their search engine queries, the videos they watch, the type of device they use, their location, other apps they have downloaded, purchasing history, etc. Users’ online lives are mined for information which is then used to build up a user profile. These profiles can contain a variety of personal information, observed and/or inferred, including age, economic status, political views, religion, sexual orientation, mental and physical health, etc. This data is then used to micro-target users with advertising.
Criticism of surveillance ads has been growing in recent years. First and foremost due to the extreme data collection (essentially surveillance) such advertising relies on, and the way it undermines people’s data protection and privacy. Personalised ads have also been linked with other societal ills, including manipulative political campaigns, the exploitation of people in vulnerable states, and discrimination.
Read CEO’s analysis of the corporate lobbying to stop limits to surveillance advertising here.
The Parliament vote approving limits to targeting minors and the use of sensitive categories of data was not the end of the saga. While Parliament agreed that new rules should tackle surveillance ads, the Council had already adopted its own position and mostly opted to keep the Commission’s proposed approach: transparency. Unlike the Parliament’s position, the Commission’s proposal – supported by the Council – only went as far as asking for users to be able to ascertain who is behind an ad and which parameters were used to target them. Very large online platforms would further be obliged to create ad archives to allow external scrutiny and research into “emerging risks brought about by the distribution of advertising online”.
Critics - including the EU’s independent data protection authority, the European Data Protection Supervisor (EDPS) – argued that this does not sufficiently address the many, serious risks associated with surveillance ads.
This divergence of opinion between the Parliament and Council was Big Tech’s opening.
Google takes its concerns to the European Commission
In fact Big Tech companies had been laying the groundwork for their lobby campaigns for a few months. Notes of lobby meetings obtained by Global Witness from the EU Commission show that in three different high-level Commission meetings, taking place between November and early January, Google brought up its concerns regarding the European Parliament’s proposals on advertising.
First, in November 2021, while Parliament was still discussing a possible ban, Google’s CEO Sundar Pichai sat down for a meeting with EU Commission Executive Vice-President Vestager. The Commission’s notes show that one of his main concerns was the Parliament’s discussion of a possible ban of surveillance advertising. Google brought up the issue again in December to Vestager’s cabinet. This time the company directly addressed the Parliament’s draft position, expressing “their concern on the latest draft in the IMCO [European Parliament Committee on the Internal Market and Consumer Protection] Committee”. IMCO was the main committee responsible for the DSA and, within it, the lead MEP’s draft position included a default opt-out from surveillance ads.
Google stressed its concerns again just as the Parliament was gearing up for its vote in January. This time Google raised the issue with Breton’s cabinet, flagging it as one of the company’s priorities on the upcoming DSA and DMA.
From the notes it is clear that Google’s main argument was that limits to advertising – either a ban or an opt-in by default – would “be detrimental also to SMEs [small and medium enterprises]” (here) and “would be most problematic for actors such as news publishers” (here). This marked a continuation of Google and Facebook’s strategy throughout the whole discussion on new digital regulations - trying to reframe the debate away from Big Tech’s immense profits and business model and to instead hype up potential negative impacts for smaller businesses and consumers. As Google’s leaked lobbying strategy showed, one of its priorities was to focus the discussion on the costs to the economy and consumers.
Mobilising EU capitals against the European Parliament
Around 31 January 2022, as the EU Institutions finally sat down together to start reconciling their positions on the Digital Services Act, Google was busy sending national governments detailed analysis of the positions of the EU Parliament, Commission and Council, adding in its own ‘helpful’ suggestions.
This is one of the main revelations uncovered from the documents received from the Swedish government via freedom of information requests. National governments have a say in EU policy-making via the Council, often referred to as the EU’s ‘black box’, since it is difficult for citizens to know who is lobbying their national government on EU policies, or even what position their national government takes in the Council. This opacity, combined with the fact that lobbying at member state level requires massive resources and good connections, creates the conditions for undue corporate influence.
Google pitched in with “specific language on articles currently discussed” and suggested “concrete amendments”.
We can now confirm that corporate lobbying of EU capitals continues even after the Council agrees its positions and starts trilogue negotiations with the Parliament and Commission. While only Sweden gave us extensive access to these documents, we can expect that all EU governments are on the receiving end of similar lobby efforts. A similar request to the Czech Republic proved less fruitful, but showed that there too Google had raised concerns about the Parliament’s DSA proposals.
The lobby documents also reveal that Google remained in frequent contact with the Swedish government from January to the end of March (when we placed our freedom of information request). During this period, the tech giant would send in breakdowns of the different institutional positions, adding the company’s own analysis, all the while replicating the EU institutions’ four-column document format. As the discussions went on behind closed doors, Google pitched in with “specific language on articles currently discussed” and suggested “concrete amendments”, showing a strikingly live knowledge of what was happening in the negotiation process.
On four different occasions, Google argued against Parliament’s proposal to ban advertising targeted at minors and other limits. Their recommendation to national governments was to “support the Council / Commission position (i.e. no restraints on targeted ads)”. Google argued “that the DSA is not the right forum to deal with these issues”.
On 22 March 2022, the day of the final DMA trilogue, Google sent the Swedish government its thoughts for future trilogue meetings. Google’s positions reflected the up-to-date state of the ongoing discussions. Google continued to oppose concrete new proposals regarding user consent to tracking and banning the use of sensitive data for advertising. Perhaps more interestingly, Google now seemed to accept that there would likely be some new limits to targeted advertising. So Google offered suggestions about how these should be drafted: the ban on targeting minors should be limited to “known minors” and behavioural advertising should be defined as the use of individual profiling.
On four different occasions, Google argued against Parliament’s proposal to ban advertising targeted at minors and other limits.
This is no accident. Google has been moving away from relying on individual profiles for advertising (based on cookies) and has started experimenting with creating cohort groups (which would aggregate people with similar characteristics) or tracking topics that could be of interest. Digital rights groups have argued that while these new models could reduce some of the privacy risks created by tracking cookies that get shared with a wide number of third parties, they still wouldn’t stop individuals’ behaviour from being constantly tracked and then used to target them with personalised ads. Risks like targeting vulnerable people, discrimination or predatory targeting would remain.
Google supported ad transparency but only to a very limited extent – it pushed back against proposals that would allow users to know the criteria used to target them specifically. In its detailed suggestions, Google proposed that national governments should seek to delete the obligation to disclose the criteria used for targeting, even when ads target vulnerable people like children.
Beyond Google: Facebook, Spotify and publishers
The documents show Google taking a central position in lobbying against limits to surveillance ads. But it wasn’t alone: Facebook, as well as European companies and publishers, also tried to persuade national governments to oppose the Parliament’s position.
Facebook, for instance, told the Swedish Ministry of Infrastructure in late December that it was “open to transparency” but that targeted marketing should be allowed.
“A broad ban on tailored advertising to minors could badly affect the development of free streaming services, which are very popular with young people of different ages.” - Spotify
Spotify, on the other hand, focused on the limits on the targeting of minors. In February 2022, the Swedish streaming platform shared detailed policy suggestions with the Swedish government, including that consent requirements and limits on targeting minors should not be included in the DSA. Spotify argued that: “A broad ban on tailored advertising to minors could badly affect the development of free streaming services, which are very popular with young people of different ages.”
The publishers entered the discussion via a joint letter by the European Magazine Media Association (EMMA) and the European Newspapers’ Publishers Association (ENPA). In the letter, EMMA-ENPA argued against limits to surveillance advertising by pointing to existing privacy laws like the General Data Protection Regulation and the ePrivacy Directive. The lobby outfit encouraged “Member States to stand by the Council’s General approach and to avoid any data protection regulation in the DSA.”
Similarly, Schibsted (a Norwegian conglomerate that includes digital marketplaces and publishers, and which has been active in these discussions since the beginning) jumped in to push back against the Parliament’s position on advertising. It opposed all the proposed limits, including the Parliament’s requirement that users who do not want to be tracked and targeted be able to opt out of tracking and still be given access. Schibsted argued that allowing such opt-outs “may constitute a prohibition on “cookie walls” (making access to a site conditional upon accepting the use of cookies and other identifiers).”
“It’s an encouraging positive surprise that the Council fully follows its mandate and is prepared to push back in a forceful manner on the ad-related points brought by the Parliament” - anonymous tech lobbyist
Advertising quickly became a key point of contention between the Parliament and Council. The latter strongly opposed the Parliament’s proposal for a ban or any new limits to advertising in the DSA or the DMA. During negotiations on the DMA, the Parliament’s rapporteur had agreed to drop new limits to advertising in this text based on the commitment from the French Presidency of the EU Council that this would be dealt with in the DSA.
It had seemed that perhaps this was a turning point, and civil society organisations celebrated it as a victory. Yet the compromise suggested by the French Presidency quickly transpired to be quite underwhelming. The Council added new language to the text around targeting minors with advertising and using sensitive data, but the text was significantly weaker than the Parliament’s proposal. It excluded sensitive data that is inferred through tracking a user’s online activity - for example deducing a user’s religion from their web-browsing history. It also made the proposal weaker by moving the main obligations on platform companies into the text’s preamble.
Tech lobbyists seemed to be rejoicing at a job well done, as one anonymously told Euractiv: “It’s an encouraging positive surprise that the Council fully follows its mandate and is prepared to push back in a forceful manner on the ad-related points brought by the Parliament”.
The good news for those concerned about digital rights, however, is that the Parliament does not seem to be backing down - the lead MEP continues to push for stronger limits to surveillance ads. Who will win in this tug of war is so far unclear.
Big Tech tried to stop data access for NGOs and public scrutiny
As well as advertising, another issue worried policy-makers, civil society and companies: how to regulate the automatic systems that rank content in the users’ feeds, also known as recommenders.
This issue was a key priority and concern for civil society in the DSA, especially after Facebook whistleblower Frances Haugen showed that Facebook’s recommender system had led to the amplification of hate content and disinformation. Haugen’s revelations, published by the Wall Street Journal under the series Facebook Files, showed that Facebook knew about the negative and harmful impact of its systems, yet, over and over again, the company chose profits over safety.
By this point, Facebook and its recommender were already under scrutiny. Many media and academic investigations had been examining how its content ranking and advertising systems were amplifying hate and disinformation. Moreover, just before Haugen’s revelations, Facebook had blocked independent watchdogs New York University’s Ad Observatory and AlgorithmWatch from scrutinising the platform’s ads and recommenders. This move effectively blocked the public’s ability to understand how the platform operates, and to hold the multinational accountable for its actions and impacts.
The Commission’s DSA proposal had originally sought to increase external scrutiny of the very large platforms by forcing them to give vetted external researchers access to data on how recommenders - that is, the automatic ranking systems that determine which content is prioritised in users’ feeds - select content. But the Commission’s proposal limited this to researchers affiliated with academic institutions. This left a wide gap – for instance, AlgorithmWatch would still not be able to resume its work. The text also included protections for the trade secrets of very large online platforms. Companies can invoke ‘trade secrets’ to avoid complying with transparency measures, and thus they can be a severe threat to media and social scrutiny.
“The Parliament text moves in a dangerous direction, by expanding access from researchers to include ‘non-profit’ organizations" - Google
Civil society organisations, including AlgorithmWatch, are concerned that the DSA would “fall short of the measures needed to open up big online platforms to meaningful scrutiny”. A letter to MEPs sent on 26 November, signed by civil society organisations and academics, defended expanding data access to vetted public interest civil society organisations and journalists and removing trade secrets exceptions, in order to ensure greater transparency and scrutiny of the actions and impact of Big Tech companies like Facebook.
The January vote proved to have a mixed result – on the positive side it expanded access to vetted civil society organisations, but not journalists. A trade secrets amendment was voted down, although mentions of it remained in the texts.
Just before the vote, Spotify raised concerns with national governments regarding the Parliament’s positions on recommenders. Spotify called for “flexibility” regarding how obligations like transparency and opt-out of personalisation are to be implemented. The company followed up again a few days later, this time with more detailed input. The world’s biggest music streaming service didn’t want the transparency requirements to include detailed lists of parameters, as introduced by the Parliament. On the other hand, it welcomed the Parliament’s last-minute introduction of exceptions to recommender transparency, including the protection of intellectual property and trade secrets.
In March this year, Spotify followed up to add its comments “regarding the latest compromise proposals on Recommender Systems”. The company supported the “evolution of the text” regarding recommender transparency and welcomed “a clarification in a Recital that these rules do not prejudice IP [intellectual property] rights and trade secrets”.
Google, in turn, took issue with the external scrutiny measures. In a meeting with the Swedish Ministry of Infrastructure, the owner of YouTube questioned whether non-profit organisations should be included, as proposed by the Parliament.
Spotify supported the “evolution of the text” “regarding the latest compromise proposals on Recommender Systems”
In a follow-up e-mail, Google pushed back even further: “The Parliament text moves in a dangerous direction, by expanding access from researchers to include ‘non-profit’ organizations, a category so broad it puts user data and privacy and confidentiality of information at risk”. Google also wanted to be given the possibility to block specific vetted researchers from accessing its data. The company asked national governments to oppose the Parliament’s position and instead support the Council’s mandate. Taken all together, Google’s suggestions would make external scrutiny of the ways in which services like YouTube amplify or de-prioritise content nearly impossible.
Google later hardened its position, opposing “prescriptive proposals which would force platforms to make both the info on the main parameters for recommender systems and the functionality to opt-out from personalised recommendations directly accessible from the content itself”.
As the trilogue discussions commenced, the Council and Parliament positions were quite different on these key issues. The Council’s mandate left the Commission’s limited position on access to data for vetted researchers untouched. Notes of internal Council discussions show that Luxembourg, Spain and Lithuania echoed concerns regarding allowing civil society organisations to become vetted researchers. Finland, Lithuania, Estonia and Poland also seemed concerned by the Parliament’s deletion of a trade secrets exception.
So far it is not known whether the EU Institutions have reached a compromise agreement on these issues. We know that the Council accepted forcing companies to disclose the criteria used for personalising recommenders, but it is unclear what will happen to possible opt-outs and to access to data by vetted civil society organisations.
In the background: marketplaces, copyright and categorising very large online platforms
There is much more to the DSA and to the lobbying that took place during the trilogues. Google also pushed back against attempts to have it categorised as a marketplace, and against publishers’ requests to have media content exempted from content moderation. The company also wanted to limit users’ redress options when faced with bad content moderation.
Spotify, on the other hand, took a particular interest in which companies would be categorised as very large online platforms. Under the DSA, very large online platforms face stricter rules, including mandatory risk assessment and extra layers of transparency.
Copyright industry lobby groups such as the International Federation of the Phonographic Industry (IFPI) pushed, consistently and frequently, to try to ensure that search engines would not be automatically exempted from possible liability and to allow the possibility of automatic content recognition tools, like copyright filters. See, for instance, here and here. IFPI’s message was echoed by other players including Universal Music, Impala, Digital Music Europe and the Swedish Professional Football Leagues.
The copyright industry could be one of the big winners from the trilogue lobbying campaign as, at the very last minute, the conservative MEP Geoffroy Didier proposed changes to the DSA agreement that echoed the copyright industry’s requests. Commission and Council are expected to support them too. This last-minute addition revives a heated discussion that took place in 2018, while the EU Institutions prepared the Copyright Directive, which pitted the copyright industry against Big Tech and human rights activists concerned about possible overblocking and risks to freedom of speech.
High-level DMA lobbying: Apple pushes back
While the Digital Services Act is still yet to be approved, its market power counterpart, the Digital Markets Act, was wrapped up in March 2022. The DMA aims to make digital markets more open and fair by identifying companies that act as digital gatekeepers - broadly meaning market-dominant companies. These gatekeepers are then subject to a set of rules governing how they should behave, including a prohibition on merging personal data collected across services, and a requirement to refrain from self-preferencing (e.g. giving their own products top billing in internet searches). The proposal also obliges gatekeepers to allow competing services to interact with the gatekeepers' own operating system, hardware, or software (interoperability) and to allow users to install third-party apps (sideloading).
The DMA proposal sped through the institutions much faster, and it seems the lobbying efforts from the big companies were less successful. Yet, this didn’t stop them from trying.
The documents released to us show that the same lobby strategies used to try to influence the DSA were also deployed on the DMA. Google was joined in its lobby efforts by Amazon, Apple and Facebook, who all tried to soften the proposal during its last stage. To a much lesser degree, smaller competing businesses like Ecosia, DuckDuckGo and Qwant tried instead to strengthen the proposed measures.
Apple was particularly active in pushing back against measures that could loosen its grip on the App Store or on Apple’s mobile operating system. The company’s main argument was that increasing data access, sideloading and interoperability would reduce user privacy and security. The iPhone maker tried to spread its message far and wide. Back in September 2021, when the Parliament and Council were still discussing their positions on the DMA, the think tank Atlantic Council invited Commissioner for Justice Didier Reynders to an off-the-record dinner discussion with Apple. The roundtable was to open with Apple’s comments on “cybersecurity and the tech sector’s understanding of the EU’s digital regulation or proposals”. Reynders accepted the invitation. No notes of the discussion exist.
On 11 January 2022, policy-makers from the Commission, Council and Parliament held the first trilogue meeting for the Digital Markets Act. Two weeks later, Apple wrote to the Swedish government outlining its concerns regarding the DMA. It was especially concerned about the Parliament’s proposed extension of interoperability, but also about the fact that sideloading – as proposed by the Commission – remained in the text. The Swedish officials agreed to hear Apple’s concerns, and in early February, ahead of their meeting, Apple sent over its “latest paper” on the interoperability requirement “given the important differences between the Council and Parliament text.”
While Apple could not successfully stop interoperability and sideloading entirely, the final text does introduce a security safeguard, which will enable the company to try to justify not complying with these obligations.
Big Tech’s main wish: regulatory dialogue
Yet the top-level message from the Big Tech companies to policy-makers regarding the DMA was the same across the board: Big Tech wanted to build into the text and the regulatory approach a dialogue between the DMA’s regulator – the European Commission – and the companies covered by it, the gatekeepers. They brought this wish up consistently at high-level meetings, such as the December meeting between Google and Vestager’s cabinet. There Google said that regarding the DMA their “core argument towards the Parliament was the need for regulatory dialogue and the opportunity to individually justify certain practices”. Google repeated the same message to Breton’s cabinet in January - “Proper regulatory dialogue is important to ensure the enforcement of the DMA.”
Google's “core argument towards the Parliament was the need for regulatory dialogue and the opportunity to individually justify certain practices”.
On the very same day, Nick Clegg, Facebook’s head lobbyist, told Commissioner Reynders that for Facebook “it would be helpful to have the possibility of having a dialogue with regulators on questions concerning compliance.”
Amazon, in turn, told the Swedish government that it was “more comfortable with content of the Council compromise proposal than with the European Parliament's amendments.” The company also raised concerns that specific measures had been moved from Article 6 to Article 5, which would mean they would be automatically applicable and not dependent on a regulatory dialogue.
The DMA envisions that one set of rules would be implemented automatically (under Article 5), while others (Article 6) would be subject to dialogue with the relevant companies to determine how best to deliver compliance. The aim of Big Tech’s lobby campaign was to expand this dialogue as much as possible. As early as last August, Corporate Europe Observatory flagged this as a key priority of Facebook, Google and Apple. Lobbycontrol has argued that Big Tech’s aim here is to “gain time – and first of all an entrance point for challenging the DMA’s obligations.”
Ultimately the scope of regulatory dialogue in the DMA has been changed to allow the gatekeepers to initiate it. However, it will still be up to the Commission to decide whether or not to engage. We will have to wait and see how this plays out in practice.
The final tech rules are in sight but the fight is far from over
We are now on the brink of an agreement on how Big Tech will be regulated, but still too many crucial details are up in the air. It seems clear that the lobbying campaign from the deep pocketed tech companies had a significant impact on the discussions, especially when it came to advertising. Yet, there are still hopeful signs from the European Parliament.
It is important to remember that regardless of the result, the process won’t end there. The actual implementation of these new rules will be crucial both for the DSA and the DMA. The firms know it too and they have been getting ready. It is no surprise that Politico Europe found that the tech giants recently went on a hiring spree, “building up their contingents of competition, legal and lobby experts”, adding up to almost a hundred new posts over just a few weeks.
The tech giants recently went on a hiring spree, “building up their contingents of competition, legal and lobby experts”, adding up to almost a hundred new posts over just a few weeks.
Civil society, policy-makers and media should also be getting ready.
But as the DSA and DMA processes come to an end, there is a need to learn lessons about the role that corporate lobbying is playing in tech policy discussions and, more broadly, in the opaque trilogue phase.
That includes the bad: companies with deep pockets get more access to policy-makers and have a wide net of third parties like think tanks and intermediaries that can echo and reframe their message. On top of that, the secrecy of certain policy-making processes, especially trilogues, and the EU’s approach to lobbying, only serve to benefit well-resourced companies.
We should also reflect on what went well – including that political leadership and scepticism regarding corporate lobbying can move the process forward but that, above all, intense media scrutiny and civil society participation can limit the impact of money in politics. However, that also means that political and policy issues that are subject to less or no civil society participation or media scrutiny, due to capacity or resource constraints, will be more vulnerable to corporate influence.
These lessons will be important in the future as the DSA and DMA are implemented, but also for other ongoing and future discussions on how to regulate Big Tech companies, and how to make digital technologies work for people.
Recommendations:
Corporate Europe Observatory has been tracking and analysing the DSA and DMA process since the drafting stages. Based on this research we make the following recommendations to improve EU policy-making and ensure that well-resourced tech lobbyists don’t enjoy excessive influence in future discussions:
- Improve transparency of trilogue processes by making public an up-to-date calendar of trilogue meetings, including summary agendas, and proactively publishing the four-column document on a rolling basis;
- Increase transparency and democratic accountability at member state and Council level - this must include disclosing each country’s position;
- Limit one-on-one lobby meetings and replace them as much as possible with public hearings;
- EU institutions should proactively seek out those who have fewer resources, such as small and medium-sized enterprises, independent academics, civil society and community groups;
- Ensure effective lobby transparency without loopholes, including a mandatory and better equipped Transparency Register;
- Introduce proper funding transparency requirements that mandate think tanks and other organisations to reveal their funding sources;
- Block the revolving door between EU institutions and Big Tech firms by strengthening ethics rules and setting up an independent ethics committee that can launch investigations and implement sanctions;
- EU officials and policy-makers should be sceptical of those lobbying them: they should question their funding sources, check their information and data sources, and denounce any type of wrongdoing or non-transparent/unethical lobbying they encounter;
- EU officials and policy-makers should not attend or participate in events or debates that are closed to the public, held under Chatham House rules, or that do not disclose their sponsorship;
- Experts participating in policy discussions should always disclose their clients and potential conflicts of interest. Whenever expert organisations become involved in communication with decision-makers and at policy events on behalf of clients, they should register in the EU’s Transparency Register and disclose information about any clients who are providing the money for these activities.