Legal Framework of the Digital Services Act for Online Platforms
Elif Beyza Akkanat Öztürk, Eylül Erva Akın
This paper comprehensively evaluates the legal obligations prescribed for online platforms under the Digital Services Act (DSA). It examines the current impact of the DSA on online platforms and its potential future implications. The DSA represents one of the European Union's most comprehensive and innovative initiatives to regulate digital spaces. The objective of this regulatory framework is to clarify the responsibilities and obligations of online platforms and to ensure that digital services are provided securely and transparently. In other words, the DSA aims to reduce legal uncertainties in the internet environment, protect user rights and control the impact of platforms on society. From this perspective, the regulation of online platforms and the existing legal framework for digital services within the European Union are analysed, and the evolving role of online platforms under the DSA, together with the responsibilities that accompany this new role, is discussed.
This study examines content moderation and the liability regime under the DSA, highlighting the balance struck among fundamental rights and freedoms and the importance of transparency obligations. Special attention is given to the protection of children online, assessing the adequacy of the DSA in safeguarding children's rights, the scope of platform obligations and how the concept of harmful content is addressed. Finally, terms of use that involve content moderation are examined, together with the protection of special categories of personal data and behavioural advertising practices. The impact of the DSA on the operation of online platforms and on user rights is evaluated within the framework of the principles of transparency and accountability.
The Legal Framework of the Digital Services Act for Online Platforms
Elif Beyza Akkanat Öztürk, Eylül Erva Akın
This study is devoted to a comprehensive evaluation of the legal obligations prescribed for online platforms under the Digital Services Act (DSA). In this context, the current effects of the DSA on online platforms and the potential future effects of the regulation are examined. The DSA is one of the European Union's most comprehensive and innovative initiatives aimed at regulating the digital space. The purpose of the regulation is to clarify the responsibilities and obligations of online platforms and to ensure that digital services are provided in a more secure and transparent manner. In other words, the DSA aims to reduce legal uncertainties in the internet environment, to protect user rights and to oversee the impact of platforms on society.
First, the regulation of online platforms in the European Union since the DSA's entry into force and the existing legal framework for digital services are analysed. The "transformed role" of online platforms under the DSA and the responsibilities brought by this new role are assessed. Interventions concerning content and the liability regime under the DSA's provisions are examined; how a balance among fundamental rights and freedoms is sought and the importance of the transparency provisions are emphasised. In addition, the provisions on the protection of children in the online environment are examined under a separate heading; the adequacy of the DSA in protecting children's rights, the scope of platforms' obligations and the ways in which the concept of harmful content is addressed are analysed.
Finally, terms of use that involve interventions concerning content are evaluated, and the protection of special categories of personal data and behavioural advertising practices are addressed. In this context, the impact of the DSA on the operation of online platforms and on user rights is assessed within the framework of the principles of transparency and accountability.
The Digital Services Act represents a landmark regulatory initiative by the European Union that aims to establish a comprehensive framework for the regulation of online platforms and digital services. The DSA is designed to address the growing complexities and challenges of the digital ecosystem, particularly in relation to platform accountability, content moderation, user rights and the protection of vulnerable groups such as children. This article provides an in-depth evaluation of the legal obligations imposed on online platforms under the DSA.
One of the primary motivations behind the DSA is the need to address the imbalance of power between large online platforms and their users. The rise of digital platforms has led to significant changes in how content is created, shared, and consumed. However, these platforms often operate with minimal oversight, leading to concerns about misinformation, harmful content, and user data exploitation. The DSA aims to rectify this by imposing clear legal obligations on platforms to manage content and ensure user safety.
A critical aspect of the DSA is its emphasis on the evolving role of online platforms, from passive intermediaries to active regulators of content. Historically, platforms have been viewed as neutral conduits of information, merely hosting content generated by users without taking responsibility for that content or its impact. However, the proliferation of harmful and illegal content has necessitated a re-evaluation of this role. The DSA requires platforms to take a more active stance in moderating content, including the swift removal of illegal content and the implementation of measures to prevent its dissemination. This shift reflects a broader trend towards greater accountability in the digital space, recognising that platforms, by virtue of their influence, have a duty to protect the public from harm.
The DSA also introduces a liability regime for online platforms, focusing in particular on the responsibilities of very large online platforms (VLOPs), which have significant reach and impact on society. These platforms are required to conduct regular risk assessments to identify and mitigate potential harms arising from their services, including the dissemination of illegal content, disinformation and violations of fundamental rights. This risk-based approach is complemented by transparency requirements, which mandate that platforms provide clear and accessible information about their content moderation policies, the use of algorithms, and advertising practices. By enhancing transparency, the DSA aims to ensure that users are better informed about how their data are used and how content decisions are made.
An important dimension of the DSA is its focus on balancing fundamental rights and freedoms with the need for content regulation. The regulation explicitly acknowledges the importance of safeguarding freedom of expression while simultaneously addressing the need to prevent harm and protect vulnerable groups. However, the DSA contains a tension: it imposes no general obligation on online platforms to monitor illegal activity on their services, yet it simultaneously expects them to prevent infringements, which may lead to inconsistent enforcement across the EU. This balancing act is one of the most challenging aspects of the DSA, as it requires platforms to navigate the fine line between moderating content to prevent harm and ensuring that such moderation does not unjustly infringe on users' rights. The DSA's transparency requirements are crucial in this regard because they give users the ability to understand and challenge the decisions platforms make about content moderation.
The protection of children in online environments is another significant focus of the DSA. The DSA sets out specific obligations for platforms to implement safeguards that protect children's rights online. The adequacy of these measures in effectively safeguarding children's rights is critically examined in this article, highlighting both the strengths and the potential gaps in the DSA's provisions.
The DSA also addresses the issue of harmful content and how platforms manage such content. The concept of harmful content, which can include hate speech and disinformation, presents a complex challenge for regulators. The DSA requires platforms to develop clear policies for handling harmful content, including mechanisms for reporting and removal. However, the subjective nature of harmful content means that platforms must make difficult decisions about content moderation while avoiding over-removal.
Another crucial area explored in this article is the regulation of personal data and behavioural advertising under the DSA. The use of personal data for targeted advertising has been one of the most contentious issues in the digital economy, raising concerns about privacy, consent, and the manipulation of user behaviour. The DSA introduces stricter rules on how platforms may collect and use personal data, particularly sensitive data, and emphasises the need for greater transparency and user control over how their data are used. This is seen as a necessary step towards rebalancing the relationship between platforms and users and ensuring that users are not exploited for commercial gain without their informed consent.
Finally, the article considers the broader implications of the DSA for the future of the digital ecosystem. The DSA is a significant step towards regulating the power of large online platforms, but it also raises questions about the future role of regulation in the digital space. Implementing the DSA will require ongoing collaboration between platforms, regulators, and other stakeholders to ensure that the rules are effective and do not stifle innovation or freedom of expression. The potential for the DSA to serve as a model for other jurisdictions is also discussed, considering how the principles of transparency, accountability, and user protection can be applied globally.
In conclusion, the DSA represents a bold attempt by the European Union to shape the future of the digital landscape by imposing clear and enforceable obligations on online platforms. While the DSA offers a robust framework for protecting users and ensuring platform accountability, its success depends on careful implementation and the willingness of platforms to embrace their responsibilities. The article argues that the DSA, despite its challenges, is a necessary evolution in the regulation of digital services and has the potential to significantly enhance the safety, transparency, and fairness of the online environment.