Post by soyeb19 on Feb 17, 2024 13:53:03 GMT 2
Borja Adsuara, lawyer, consultant, university professor, expert in digital law, privacy and data protection, and one of the members of the Committee of Experts of the Charter of Digital Rights, states that "as long as the Digital Services Coordinator has not been appointed and it is not known which administrative authority will supervise this regulation, the problem is who is going to decide whether we are dealing with hate speech or disinformation; something that will significantly affect not only individuals, but also the media." This jurist points out that it will be the algorithms of these digital platforms that detect this type of irregular content. "If it is already difficult for a judge to decide whether content is legal or illegal, what this rule does is allow social networks or digital platforms to delete any content, even if it is not illegal. This means we are facing a problem of freedom of expression."
In his opinion, "it must be clear that, although the Spanish Constitution states that the only limit to freedom of expression is the law, the DSA allows any social network to delete any content it wants, as long as this is set out beforehand in its internal rules and terms of use. Social networks are going to decide on the veracity of this content. It can be a way of limiting freedom of information and expression." From his point of view, "it is foreseeable that many social networks or digital platforms, before the national government in power opens a sanctioning proceeding with fines of over six percent of annual turnover, will dedicate themselves to deleting certain content that may bother the authority that can open said proceeding. We would be facing indirect censorship." Adsuara explains that, in this context, a self-regulation initiative has been proposed under the Digital Services Regulation, similar to that of Advertising Self-Regulation, to supervise platforms' content-moderation decisions. "There should be an independent sector body, made up of a jury of professionals, that can review user complaints about the deletion of certain content by a social network or platform.
The key, therefore, is that this deletion of content is not decided by governments or by the social networks or digital platforms themselves." In his opinion, "the rule is already in force for large platforms, and neither the figure of the Digital Services Coordinator nor the codes of conduct have yet been approved, nor has the independent sectoral body for out-of-court dispute resolution been launched. I think we have to be diligent, because on February 7, 2024, the Regulation applies in full, and we must have our implementation homework done by that date." Regarding the direct impact of the DSA on the existing global digital framework, Adsuara states that "it is true that there will be greater control of activity and content. The problem is that, in cases of doubt, this control can end up removing certain content or messages that would not be strictly illegal. With the large fines being considered, of up to six percent of global turnover, it is foreseeable that digital platforms, when in doubt about the legality of content, will delete it to avoid these sanctions." In this context, the expert points out that "this will increase the number of complaints and internal claims, and digital platforms will have to hire specialized personnel to supervise the 'content moderation' activity carried out by the algorithms. We will have to see whether they can field a team that reviews all the claims that reach them, and within what time frames."