1.
Demonetization as a regulatory tool ^
It is impossible to talk about regulatory tools aimed at disinformation without first properly defining disinformation. In 2009, Fallis defined disinformation as “misleading information that is intended to be misleading.” In 2015, he adjusted his definition to “misleading information that has the function to mislead,” because the previous definition, when strictly applied, would also cover some instances of humour or satire.1 The distinguishing features of disinformation are therefore the intent of the author or source to mislead and the function to do just that. A corresponding definition can be found in the Report of the High-Level Expert Group on Fake News and Disinformation, which defined disinformation as “False, inaccurate, or misleading information designed, presented, and promoted to intentionally cause public harm or for profit.”2 This definition narrows the function of disinformation, but at its core it still focuses on the intentionally misleading nature and the function to mislead. When the intent to mislead the audience is missing, the content is not disinformation but misinformation.3
The spread of disinformation was not caused by the internet and social media, but it is undeniable that both have changed the media landscape. With them came the rise of the Attention Economy, which treats the attention of the user as a resource,4 and monetizes it through programmatic advertising based on behavioural targeting.5 Digital platforms allow content creators to generate revenue, which also creates the opportunity to monetize disinformation content. Financial gain was identified as one of the two main motivations for spreading disinformation by Allcott and Gentzkow in their influential paper on the role of disinformation in the 2016 US presidential election.6 Due to the structure of the media market, brand managers, either knowingly or unintentionally, allow their advertisements to be placed alongside disinformation content,7 generating revenue for those who deliberately disseminate disinformation. It is difficult to estimate how much ad revenue disinformation content generates across the entire European Union (EU). Taking the Czech Republic as an example, the ad revenue generated by the six most visited disinformation websites has been estimated at up to € 29,000 per month, and the most visited disinformation website is a for-profit project.8 Ad revenue is not the only way to earn money by sharing disinformation: some disinformation websites also benefit from voluntary contributions from their supporters9 or, in the case of propaganda, from funding by foreign entities.
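To make the underlying arithmetic concrete, the following minimal sketch shows how monthly revenue estimates of this kind are typically derived from traffic and advertising rates. All figures and names in it are invented for illustration; they are not the inputs or method of the cited Czech study.

```python
# Hypothetical illustration of how a website's monthly ad revenue can be
# estimated from traffic and advertising rates. All figures are invented;
# they do not reproduce the cited study's data.

def estimate_monthly_ad_revenue(monthly_pageviews: int,
                                ads_per_page: int,
                                cpm_eur: float) -> float:
    """Estimate revenue as (ad impressions / 1000) * cost per mille (CPM)."""
    impressions = monthly_pageviews * ads_per_page
    return impressions / 1000 * cpm_eur

# E.g. 2,000,000 monthly pageviews, 3 ad slots per page, EUR 1.50 CPM:
print(f"EUR {estimate_monthly_ad_revenue(2_000_000, 3, 1.50):,.2f}")  # EUR 9,000.00
```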
Demonetization, meaning the restriction or removal of the ability to earn revenue from content, is a tool currently employed by certain internet platforms and social media to counter the spread of disinformation. Demonetization can be perceived as a content moderation tool, but since it has been included in the Code of Practice on Disinformation, a soft-law, self-regulatory Code for internet platforms and social media created by the European Commission, it can be labelled a regulatory tool as well. The conditions under which content can be demonetized are established by the platforms themselves in their guidelines or codes of conduct.10
In terms of regulatory tools aimed at countering disinformation, Helm and Nasu have divided them into three categories: (1) information corrections, (2) content removal or blocking and (3) criminal sanctions.11 The demonetization of disinformation content does not fit into any of these categories but may be perceived as a step between information correction and content removal. Like information correction, demonetization still allows the content to remain visible and accessible, but it is more severe, as it restricts financial revenue. Furthermore, demonetization can fulfil the function of information correction, since it warns the audience about the problematic nature of the content. Compared to content removal or blocking, demonetization constitutes a more moderate interference with the right to freedom of expression, as the content in question remains visible. Demonetization is not aimed at outright illegal content, but at content that violates the community guidelines and content policies of the platforms concerned.
As with most regulatory tools, demonetization has both benefits and shortcomings. Among its benefits is that it can be deployed directly on the social media and internet platforms where disinformation spreads most easily. The loss of profit caused by demonetization may lead profit-oriented actors to stop sharing disinformation. On the other hand, disinformation is not illegal content, so it is up to the platforms themselves to set the rules for demonetizing harmful content. That raises concerns about the privatization of censorship, as the internet platforms decide how their users can exercise their freedom of expression.12 Furthermore, content policies on demonetization can also hit creators who report on controversial topics, creating a chilling effect and discouraging the production of legitimate content. This has already occurred when YouTube changed its content policies to demonetize “controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies,”13 which affected even long-established content creators.14 Additionally, demonetization and stricter content moderation rules can lead actors who disseminate disinformation to leave mainstream platforms for encrypted or decentralised platforms such as Telegram, PeerTube or Odysee.15 Smaller platforms like these can escape stricter regulation under the new European legislation (see below), allowing disinformation to spread more effectively while reducing the authorities’ ability to monitor it.
2.
The demonetization of disinformation in European law ^
The EU has included demonetization in its policy aimed at countering disinformation since its beginning in 2018. Demonetization was part of the Code of Practice on Disinformation, a voluntary, self-regulatory soft-law Code targeted at social media and internet platforms (2018 Code).16 Compliance with the 2018 Code was monitored only through annual self-assessments,17 with no further enforcement tools. While the signatories perceived the Code as a positive measure,18 criticism arose: the 2018 Code contained no clear and meaningful commitments, no measurable objectives and no compliance or enforcement tool.19 In addition, the relationship between the 2018 Code and the e-commerce directive20 was unclear, as the 2018 Code created commitments regarding content moderation which could jeopardize the platforms’ liability privilege.21
The demonetization of disinformation was not tackled directly; rather, it was to be achieved by improving the transparency of online advertising systems and enhancing the scrutiny of ad placement. Signatories were requested to incorporate these aspects into their established processes and content policies. As a result, signatories claimed they had already established and enforced ad-placement policies.22 In its evaluation, the European Commission acknowledged that the platforms had established policies in line with their commitment;23 however, the platforms did not share data that would allow researchers to independently assess its fulfilment. Furthermore, the Assessment of Implementation notes that advertising monetization is key to the platforms’ business model, and that efforts to demonetize disinformation should therefore be emphasised.24
In 2022, the “Strengthened” Code of Practice on Disinformation (2022 Code) was introduced, following the adoption of the Digital Services Act25 (DSA). The 2022 Code was meant to address the shortcomings of its predecessor; together with the DSA, it brought the following changes:
- Semi-binding nature
Very large online platforms (VLOPs) and very large online search engines (VLOSEs)26 must annually conduct systemic risk assessments,27 which include evaluating risks associated with disinformation. Becoming a signatory of the 2022 Code counts as a measure to mitigate those risks. Furthermore, the 2022 Code aims to become a code of conduct under Article 45 of the DSA. The DSA gives the European Commission a tool to enforce such codes of conduct: the Commission can invite signatories to take necessary action in cases of systematic failure to uphold the code, giving voluntary codes a binding character.28
- Keeping liability exemptions
The DSA has cleared up the abovementioned relationship between the obligations in the Code and the liability privilege. Article 7 of the DSA allows providers of intermediary services to take good-faith measures to detect and disable illegal content without becoming ineligible for the exemptions from liability.29
- Clear and measurable objectives
The 2022 Code tackles the issue of measurable objectives by introducing Qualitative Reporting Elements (QREs) and Service Level Indicators (SLIs). QREs describe the specific actions signatories have taken to fulfil their commitments, while SLIs provide quantitative data on those actions. Additionally, the Code proposes creating Structural Indicators to assess the spread of disinformation and the effectiveness of the 2022 Code both in the EU as a whole and in individual member states.30
- Cooperation
The 2022 Code encourages signatories to cooperate on the improvement of the Code and sets up a permanent Task-force that supervises the implementation of the 2022 Code and works on its updates. Several commitments in the 2022 Code encourage cooperation with researchers, fact-checkers and other relevant actors.
- Third-party access and evaluation
The signatories’ reports on compliance with the 2022 Code are made publicly available on the Transparency Centre website. VLOPs and VLOSEs are obliged to file their reports twice a year, other signatories once a year.31 Both the DSA and the 2022 Code oblige VLOPs and VLOSEs to ensure data access for researchers and other relevant third parties so that they can review and evaluate their measures.
Regarding demonetization, the 2022 Code’s adjective “strengthened” is fitting. The current Code addresses demonetization directly, not only through improved ad placement scrutiny and transparency. It directly obliges its signatories not to publish harmful disinformation and to prevent the placement of advertisements on sites that share disinformation. These commitments address the previously mentioned ad placement scrutiny by tightening the eligibility requirements for content monetization, and improve transparency through measures that enable verification of the landing pages of advertisements and provide buyers with information on where their advertisements were placed. Signatories involved in buying advertisements are obliged to place their advertisements through ad sellers who have taken measures to prevent the placement of advertisements next to disinformation. All relevant signatories should advance the use of brand safety tools and provide access to their services and data to third parties who can review the accuracy of their reporting and the effectiveness of their policies.
3.
2022 Code in action ^
As the commitments regarding demonetization are vast, this section focuses strictly on the restriction of earning revenue from content.
All relevant signatories32 have updated their policies regarding disinformation and their enforcement. Those who generate revenue on a per-click or per-impression basis have emphasised strengthening the policies and mechanisms that prevent placing advertisements on sites sharing content that breaks their content policies. To identify such content, Google monitors its AdSense network through a combination of automated mechanisms and human content review.33 Microsoft Advertising uses a combination of internal evaluation and trusted third-party data to restrict advertisements on sites that contain disinformation.34 Meta demonetizes content rated as false by third-party fact-checkers.35 MediaMath evaluates websites against internal factors and places those that breach its policies on a universal blacklist.36 Adobe claims it focuses on false or misleading ads, but neither its Ad Requirement Policy nor its report on the 2022 Code specifies any measures taken to demonetize disinformation websites.37 TikTok does not share content revenue directly; creators can monetize their content by joining the TikTok Creator Fund, which admits only creators who comply with the TikTok Community Guidelines, so those who fail to comply are unable to monetize their content.38 Twitch demonetizes users who persistently share harmful disinformation, both on and off the platform.39
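To illustrate the common logic behind these policies, the following minimal sketch shows how an ad placement might be screened against an internal blacklist and third-party fact-checker ratings before it is monetized. The domains, ratings and function names are invented; the sketch does not represent any signatory’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of the blocklist-style screening described above.
# All domains, ratings and names are invented for illustration.

DISINFORMATION_BLACKLIST = {"example-disinfo.test"}    # internal evaluation (cf. MediaMath)
FACT_CHECKER_RATINGS = {"example-hoax.test": "false"}  # third-party ratings (cf. Meta)

@dataclass
class Placement:
    landing_domain: str  # domain of the advertisement's landing page

def is_monetizable(placement: Placement) -> bool:
    """Refuse monetization if the landing page is blacklisted or rated false."""
    if placement.landing_domain in DISINFORMATION_BLACKLIST:
        return False
    if FACT_CHECKER_RATINGS.get(placement.landing_domain) == "false":
        return False
    return True

# A placement landing on a page rated false by fact-checkers is demonetized:
print(is_monetizable(Placement("example-hoax.test")))  # False
```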
Google, MediaMath and Microsoft Advertising have reported how many advertisements were restricted and estimated their value. MediaMath estimated the value of demonetized ads at € 18,451,042.48,40 Microsoft at € 4,878 (01/2023 – 06/2023)41 and Google at € 13,298,797 (07/2022 – 09/2022) and € 31,132,197 (01/2023 – 06/2023).42
4.
Discussion ^
As social media and online platforms have created the opportunity to make money from sharing content, the demonetization of disinformation content is a regulatory tool aimed at discouraging profit-oriented disinformation websites and creators. Implementing demonetization on large, mainstream platforms complicates the opportunity to profit from disinformation, although those who share it can still move to smaller online platforms with less strict content and demonetization policies or find other methods of earning revenue besides advertising. Demonetization does not target only disinformation, as it is very difficult to prove the intent to mislead. Therefore, demonetization also affects those who share misinformation. The problem is that demonetization in cases of misinformation hits those who share content without being aware of its misleading nature, which, from the user’s point of view, may seem like arbitrary interference. This problem can be addressed by demonetizing only accounts that persistently share disinformation (as Twitch already does) or by outsourcing the evaluation of content to fact-checkers (as Meta does). Similarly, if content policies are inappropriately configured, demonetization can affect creators of legitimate content. This problem could be resolved by the opportunity to file a complaint against a decision restricting the ability to monetize content. The DSA may resolve this issue, as it obliges online platforms to create an internal complaint-handling system through which users can contest such decisions and, if the complaint is justified, have them reversed.43
The EU has included demonetization among its regulatory tools to counter online disinformation from the beginning. With the adoption of the DSA and the update of the Code of Practice, it may become a widespread tool, resulting in a reduction in the number of profit-oriented disinformation websites and creators in Europe. The effort to work together with the signatories to create an appropriate set of rules is to be appreciated. The 2022 Code and the DSA addressed the shortcomings of the 2018 Code and created extensive commitments and measurable objectives. The obligation to report on the fulfilment of the commitments and to provide data access to independent third parties allows further research on the effectiveness of demonetization. However, it needs to be monitored whether demonetization also results in negative consequences for users.
5.
Literature ^
Allcott, Hunter/Gentzkow, Matthew, Social Media and Fake News in the 2016 Election. In: Journal of Economic Perspectives, Vol. 31, No. 2, 211–236 (2017).
Braun, Joshua A./Eklund, Jessica L., Fake News, Real Money: Ad Tech Platforms, Profit-Driven Hoaxes, and the Business of Journalism. In: Digital Journalism, Vol. 7, No. 1, 1–21 (2019).
Brogi, Elda, Structural indicators to assess the effectiveness of the EU’s Code of Practice on Disinformation. In: SSRN Electronic Journal, https://dx.doi.org/10.2139/ssrn.4530344 (Accessed 13th December 2023) (2023).
Caplan, Robyn/Gillespie, Tarleton, Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy. In: Social Media + Society, Vol. 6, No. 2, https://doi.org/10.1177/2056305120936636 (2020).
Carlberg, Malin/Goubet, Marion/Hill, Jordan/Plasilova, Iva/Procee, Richard, Study for the Assessment of the Implementation of the Code of Practice on Disinformation: Final Report, European Commission, https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=66649 (Accessed 13th December 2023) (2020).
Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market http://data.europa.eu/eli/dir/2000/31/oj
European Commission, Assessment of the Code of Practice on Disinformation – Achievements and areas for further improvement, https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=69212 (Accessed 13th December 2023), (2020).
European Commission, High-Level Expert Group on Fake News and Online Disinformation. Shaping Europe’s Digital Future, https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation (Accessed 6th December 2023) (2018).
European Union External Action, Action plan against Disinformation, https://www.eeas.europa.eu/node/54866_en (Accessed 13th December 2023) (2018).
Fallis, Don, A Functional Analysis of Disinformation, iSchools, https://www.ideals.illinois.edu/handle/2142/47258 (Accessed 6th December 2023) (2014).
Fallis, Don, What is disinformation? In: Library Trends, Vol. 63, No. 3, 401–426 (2015).
Gerster, Lea/Hammer, Dominik/Schwieter, Christian, Inside the Digital Labyrinth: Right-Wing Extremist Strategies of Decentralisation on the Internet & Possible Countermeasures, Institute for Strategic Dialogue, https://www.isdglobal.org/wp-content/uploads/2023/02/Inside-the-Digital-Labyrinth.pdf (Accessed 13th December 2023) (2023).
Griffin, Rachel/Vander Maelen, Carl, Codes of Conduct in the Digital Services Act: Exploring the Opportunities and Challenges, SSRN, https://papers.ssrn.com/abstract=4463874 (Accessed 13th December 2023) (2023).
Helm, Rebecca K./Nasu, Hitoshi, Regulatory Responses to ‘Fake News’ and Freedom of Expression: Normative and Empirical Evaluation. In: Human Rights Law Review, Vol. 21, No. 2, 302–328 (2021).
Monti, Matteo, The EU Code of Practice on Disinformation and the Risk of the Privatisation of Censorship. In: Giusti, Serena/Piras, Elisa (Eds.), Democracy and Fake News: Information Manipulation and Post-Truth Politics, Routledge, Taylor & Francis Group, London, 214–225 (2021).
Ruiz, Carlos Diaz, Disinformation on digital media platforms: A market-shaping approach. In: New Media & Society, 1–24, https://doi.org/10.1177/146144482312076 (2023).
Ryan, Camile D./Schault, Andrew/Butner, Ryan/Swarthout, John T., Monetizing Disinformation in the Attention Economy: The Case of Genetically Modified Organisms (GMOs). In: European Management Journal, Vol. 38, No. 1, 7–18 (2020).
Saurwein, Florian/Spencer-Smith, Charlotte, Combating Disinformation on Social Media: Multilevel Governance and Distributed Accountability in Europe. In: Digital Journalism, Vol. 8, No. 6, 820–841 (2020).
Šefčíková, Kristína/Tkáčová, Natália, Disinformation as a Business: Business Models of the Czech Disinformation Landscape, Prague Security Studies Institute, https://www.pssi.cz/download//docs/10680_business-models-of-the-czech-disinformation-landscape.pdf (Accessed 13th December 2023), (2023).
Transparency Centre, Reports: Adobe, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=adobe#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: Google, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=google#sli-googleadvertisingmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: MediaMath, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=mediamath#sli-googleadvertisingmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: Meta, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=meta#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: Microsoft, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=microsoft#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: TikTok, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=tiktok#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
Transparency Centre, Reports: Twitch, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=twitch#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 1 Fallis, A Functional Analysis of Disinformation, iSchools, https://www.ideals.illinois.edu/handle/2142/47258 (Accessed 6th December 2023) (2014).
- 2 European Commission, High-Level Expert Group on Fake News and Online Disinformation. Shaping Europe’s Digital Future, https://digital-strategy.ec.europa.eu/en/library/final-report-high-level-expert-group-fake-news-and-online-disinformation (Accessed 6th December 2023) (2018).
- 3 Fallis, What is disinformation?, Library Trends, Vol. 63, No. 3, 422 (2015).
- 4 Ryan/Schault/Butner/Swarthout, Monetizing Disinformation in the Attention Economy: The Case of Genetically Modified Organisms (GMOs). In: European Management Journal, Vol. 38, No. 1, 9 (2020).
- 5 Braun/Eklund, Fake News, Real Money: Ad Tech Platforms, Profit-Driven Hoaxes, and the Business of Journalism, In: Digital Journalism, Vol. 7, No. 1, 3 (2019).
- 6 Allcott/Gentzkow, Social Media and Fake News in the 2016 Election. In: Journal of Economic Perspectives, Vol. 31, No. 2, 217 (2017).
- 7 Ruiz, Disinformation on digital media platforms: A market-shaping approach. In: New Media & Society, 7, https://doi.org/10.1177/146144482312076 (2023).
- 8 Šefčíková/Tkáčová. Disinformation as a Business: Business Models of the Czech Disinformation Landscape, Prague Security Studies Institute, https://www.pssi.cz/download//docs/10680_business-models-of-the-czech-disinformation-landscape.pdf (Accessed 13th December 2023) (2023).
- 9 Via paid content or by direct contributions.
- 10 Ruiz, Disinformation on digital media platforms: A market-shaping approach, 13.
- 11 Helm/Nasu, Regulatory Responses to ‘Fake News’ and Freedom of Expression: Normative and Empirical Evaluation, In: Human Rights Law Review, Vol. 21, No. 2, 302 (2021).
- 12 Monti, The EU Code of Practice on Disinformation and the Risk of the Privatisation of Censorship. In: Giusti, Serena/Piras, Elisa (Eds.), Democracy and Fake News: Information Manipulation and Post-Truth Politics, Routledge, Taylor & Francis Group, London, 214–215 (2021).
- 13 Caplan/Gillespie, Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy. In: Social Media + Society, Vol. 6, No. 2, 5, https://doi.org/10.1177/2056305120936636 (2020).
- 14 Caplan/Gillespie, Tiered Governance and Demonetization: The Shifting Terms of Labor and Compensation in the Platform Economy, 6.
- 15 Gerster/Hammer/Schwieter, Inside the Digital Labyrinth: Right-Wing Extremist Strategies of Decentralisation on the Internet & Possible Countermeasures, Institute for Strategic Dialogue, https://www.isdglobal.org/wp-content/uploads/2023/02/Inside-the-Digital-Labyrinth.pdf (Accessed 13th December 2023) (2023).
- 16 European Union External Action, Action plan against Disinformation, https://www.eeas.europa.eu/node/54866_en (Accessed 13th December 2023) (2018).
- 17 European Commission, Assessment of the Code of Practice on Disinformation - Achievements and areas for further improvement, https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=69212 (Accessed 13th December 2023) (2020).
- 18 Carlberg/Goubet/Hill/Plasilova/Procee, Study for the Assessment of the implementation of the Code of Practice on Disinformation Final Report. European Commission, https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=66649 (Accessed 13th December 2023) (2020).
- 19 Brogi, Structural indicators to assess effectiveness of the EU’s Code of Practice on Disinformation, SSRN Electronic Journal, https://dx.doi.org/10.2139/ssrn.4530344 (Accessed 13th December 2023) (2023).
- 20 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market http://data.europa.eu/eli/dir/2000/31/oj
- 21 Saurwein/Spencer-Smith, Combating Disinformation on Social Media: Multilevel Governance and Distributed Accountability in Europe, In: Digital Journalism, Vol. 8, No. 6, 836 (2020).
- 22 Carlberg/Goubet/Hill/Plasilova/Procee, Study for the Assessment of the implementation of the Code of Practice on Disinformation Final Report.
- 23 European Commission, Assessment of the Code of Practice on Disinformation - Achievements and areas for further improvement.
- 24 Carlberg/Goubet/Hill/Plasilova/Procee, Study for the Assessment of the implementation of the Code of Practice on Disinformation Final Report.
- 25 Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC.
- 26 Platforms and search engines with more than 45 million monthly active users in the EU.
- 27 Art. 34 DSA.
- 28 Griffin/Vander Maelen, Codes of Conduct in the Digital Services Act: Exploring the Opportunities and Challenges, SSRN, https://papers.ssrn.com/abstract=4463874 (Accessed 13th December 2023) (2023).
- 29 Art. 7 DSA.
- 30 Brogi, Structural indicators to assess effectiveness of the EU’s Code of Practice on Disinformation.
- 31 Measure 40.1 and 40.2 of the “Strengthened” Code of Practice on Disinformation.
- 32 Adobe, Google, MediaMath, Meta, Microsoft, Twitch and TikTok. The remaining signatories include NewsGuard, DoubleVerify and IAB Europe; however, they are not directly involved in placing advertisements and demonetization.
- 33 Transparency Centre, Reports: Google, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=google#sli-googleadvertisingmeasure11-163-1 (Accessed 18th December 2023), (2023).
- 34 Transparency Centre, Reports: Microsoft, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=microsoft#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023), (2023).
- 35 Transparency Centre, Reports: Meta, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=meta#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 36 Transparency Centre, Reports: MediaMath, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=mediamath#sli-googleadvertisingmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 37 Transparency Centre, Reports: Adobe https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=adobe#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 38 Transparency Centre, Reports: TikTok, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=tiktok#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 39 Transparency Centre, Reports: Twitch https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=twitch#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 40 The report did not mention any timeframe.
- 41 Transparency Centre, Reports: Microsoft, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=microsoft#qre-linkedinmeasure11-163-1 (Accessed 18th December 2023) (2023).
- 42 Transparency Centre, Reports: Google, https://disinfocode.eu/reports-archive/reports/?chapter=advertising&commitment=commitment-1&signatory=google#sli-googleadvertisingmeasure11-163-1 (Accessed 18th December 2023), (2023).
- 43 Art. 20 DSA.