Europe’s major technology law has been approved. Now comes the hard part

The potential gold standard for online content management in the EU – the Digital Services Act – is now a reality after the European Parliament voted overwhelmingly in favor of the legislation earlier this week. The last obstacle, which is merely a formality, is for the European Council of Ministers to sign the text in September.

The good news is that the landmark legislation includes some of the most comprehensive obligations for transparency and platform accountability to date. It will give users real control over, and insight into, the content they engage with, and offer protection against some of the most pervasive and harmful aspects of our online spaces.

The focus now shifts to implementing the comprehensive law, as the European Commission begins to seriously develop enforcement mechanisms. The proposed scheme is a complex structure in which responsibilities are shared between the European Commission and national regulators, known here as Digital Services Coordinators (DSCs). It will rely heavily on the creation of new roles, the expansion of existing responsibilities, and seamless cross-border cooperation. What is clear is that, at present, there is simply no institutional capacity to implement this legislation effectively.

In a sneak peek, the Commission has provided insight into how it proposes to overcome some of the more obvious implementation challenges – such as how it plans to oversee large online platforms and how it will try to avoid the problems plaguing the GDPR, such as out-of-sync national regulators and selective enforcement – but its proposals only raise new questions. The Commission must recruit a large number of new staff, and a new European Center for Algorithmic Transparency must attract world-class computer scientists and experts to help enforce the comprehensive new commitments on algorithmic transparency and data availability.

The Commission’s provisional vision is to organize its regulatory responsibilities by thematic area, including a community affairs team tasked with overseeing some of the new due diligence commitments. Insufficient resources here are cause for concern and ultimately risk turning those hard-won commitments into empty box-checking exercises.

A critical example is the platforms’ obligation to perform assessments addressing systemic risks stemming from their services. This is a complex process that must take into account all the fundamental rights protected by the EU Charter. To do this, technology companies will need to develop human rights impact assessments (HRIAs) – an evaluation process to identify and mitigate potential human rights risks arising from a service or company, or in this case a platform – something civil society encouraged them to do throughout the negotiations. However, it will be up to the Board, which consists of the DSCs and is chaired by the Commission, to annually assess the most prominent systemic risks identified and outline best practices for mitigation measures. As someone who has helped develop and evaluate HRIAs, I know this will not be an easy feat, even with independent auditors and researchers contributing to the process.

If they are to have an impact, the assessments must establish comprehensive baselines, concrete impact measurements, evaluation procedures, and stakeholder engagement strategies. The very best HRIAs integrate a gender-sensitive approach and place particular emphasis on systemic risks that will disproportionately affect people from historically marginalized communities.

This is the most concrete way of ensuring that all potential infringements are taken into account.

Fortunately, the international human rights framework, such as the UN Guiding Principles on Business and Human Rights, provides guidance on how best to develop these assessments. Nevertheless, the success of the provision will depend on how platforms interpret and invest in these assessments, and even more so on how well the Commission and national regulators enforce these obligations. With current capacity, however, the institutions’ ability to develop guidelines and best practices and to evaluate mitigation strategies falls far short of what the DSA will require.
