Fake news running wild on the internet: the regulatory measures needed to hold platforms accountable – Technology News, Firstpost
In an attempt to curb fake news, Indian lawmakers are discussing regulatory measures to make digital platforms more accountable. The discussion has been spurred by numerous cases of incendiary messages circulating through WhatsApp and of social media posts inciting mob violence.

Moreover, with the 2019 elections looming, political parties and advocacy groups have begun defamation campaigns on social media and messaging platforms. In this scenario, the extent of the liability of the platforms that serve as conduits for such information needs to be closely examined.

Representational image.

The Information Technology Act, 2000, defines an 'intermediary' as an entity that, on behalf of another person, 'receives, stores or transmits' an electronic record or provides any service with respect to that record. Social media apps and instant messaging platforms, although not explicitly included in this definition, may be interpreted as intermediaries under the law. An intermediary is exempt from liability if its role is limited to providing access to information; if it does not initiate, modify or select the recipient of the transmission; if it has complied with the due diligence requirements stipulated in subordinate legislation; and if it does not have 'actual knowledge' of any unlawful content it hosts.

It is important to understand that digital platforms such as social networking sites, chat rooms and instant messaging applications are content aggregators, not publishers: they do not exercise editorial control over the content shared on their platforms. The liability of any digital intermediary must therefore be kept to a minimum where its role is that of a mere channel of communication.

Likewise, any legislative or policy intervention must not undermine the fundamental rights to freedom of expression and the privacy of individuals. Overly stringent regulations and sanctions push platforms towards self-censorship.

For example, Germany has enacted the Network Enforcement Act, which requires social media platforms to remove 'clearly illegal' content from their websites within 24 hours of receiving a notification. Non-compliance can attract penalties running into millions of euros. This has led to a situation where intermediaries are overly cautious about the content users post, for fear that lawful information may be taken down. Imposing a general obligation on intermediaries to monitor, edit or take down content on their platforms, including private communications, opens the door to political oversight and abuse. In cases where instant messaging platforms such as WhatsApp are misused by individuals to push a political agenda or incite violence, it would be impractical to impose an obligation on the platform to monitor person-to-person communications or communications made in WhatsApp groups. Likewise, any regulatory solution proposed by the government must not discount the privacy-protective technological features of some of these platforms, such as end-to-end encryption of messages and other privacy-enhancing tools.

A demonstrator holds placards during a protest against the recent mob lynching of Muslims and Dalits who were accused of possessing beef, in Ahmedabad, India, 9 July 2017. Image: Reuters/Amit Dave

The spread of misinformation and hate on the internet is a collective failure of society. Regulatory measures to curb it should take into account the roles and responsibilities of the individuals who create content, the intermediaries hosting such content, the third-party advertisers whose spending supports it, and, ultimately, the consumers of content.

The law must be careful to impose obligations commensurate with the role played by intermediaries such as social networking sites and instant messaging applications. To this end, policy measures should focus on enhancing accountability: for example, ensuring that individuals using platforms such as Twitter, Facebook and WhatsApp do not operate through fake accounts; providing more accessible options for individuals using WhatsApp to flag or report messages that are incendiary or fake; and running awareness campaigns to educate individuals about the ramifications of spreading fake news and incendiary messages. Some of these measures are already being implemented in India.

Combating the high incidence of fake news on the internet has emerged as a major challenge for governments and regulators all over the world.

Tim Berners-Lee, the founder of the World Wide Web, is of the opinion that today's tech giants are not programmed to maximise the social good. He hopes that legal or regulatory frameworks informed by social goals will go a long way in ensuring the ethical and responsible participation of tech giants in the digital society.

The author is a lawyer working with the Vidhi Centre for Legal Policy in New Delhi.