The Facebook Effect: Political Advertisement on Social Media

The Election Commission (EC) released the dates for the 2019 Lok Sabha elections on the 10th of March, and with this announcement, India has officially entered the home stretch of the 2019 elections. With the Model Code of Conduct (MCC) now in effect, political parties are bound by certain restrictions on advertising and campaigning. The MCC has been in effect since 1968 and serves as a model code, based upon a consensus between all political parties, to ensure ethical behavior during elections. Due to its consensual nature, the MCC is not a statutory document, and the violation of many of its provisions does not attract punitive action. The MCC seeks to ensure free and fair elections and is enforced by the Election Commission. It has evolved with time to incorporate all aspects of political campaigning, with the current edition also extending to social media posts and online advertising. This reflects a paradigm shift in the mode of campaigning employed by many political parties; with the rise of social media activity in India, the online sphere has become a virtual battleground of political advertisement.

The regulations which govern other advertisements, such as those in print media and on radio, will also be extended to social media platforms like Facebook and Twitter. These platforms have been contacted by the Election Commission, and both Twitter and Facebook have released their own codes of conduct to be followed at the time of elections. The increased scrutiny of social media platforms for political advertisement is a welcome change, and comes on the heels of the Cambridge Analytica scandal, which broke early last year and is believed to have had significant impacts on the US elections and the Brexit vote.

The new guidelines bring political advertisements on social media platforms under the MCC. As with the provisions relating to newspaper and radio advertisements, parties are now required to disclose expenditure accounts for social media advertisement as well. These advertisements will also have to go through a mandated pre-certification procedure, and they will be monitored by the Media Certification and Monitoring Committees (MCMCs) placed at the district and state levels, to which a social media expert will be added. Candidates are also required to submit details of their social media accounts (as applicable) at the time of filing nominations. The EC has also extended the norms applicable to social media advertising to cover voice and SMS messages propagated through phones.

The platforms have also agreed to the guidelines laid down in the MCC and agreed to take stringent action against any content reported by designated officers of the EC. To this end, they have agreed to set up priority channels with the EC for quick responses to complaints. The EC has also sought to extend to social media the ban on political advertisement during the 48 hours before polling, as mandated under section 126 of the Representation of the People Act.

The new guidelines have been both lauded and criticized. One of the chief complaints stems from the fact that private posts of individuals would not count as advertisements and would thus not be covered under these regulations. Parties and party workers are still able to use their pages to upload posts supporting their candidates, and such posts are exempt from the MCC. However, it would be difficult to bring personal posts within this ambit without violating the right to freedom of speech guaranteed under the Constitution. Any discrepancies or hate speech propagated through these private posts would still be governed by the platforms’ community guidelines and can be reported under the same. Additionally, depending upon the subject matter of the individual posts, they may also be punishable under the hate speech laws in India, such as those relating to promoting enmity between groups, assertions prejudicial to national integration, hurting religious sentiment, etc. There is a comprehensive and robust legal framework which deals with hate speech in India, and as such the EC should not be required to step in to govern a citizen’s private posts made in a personal capacity.

The effects of the guidelines can already be felt: the EC has contacted Facebook, asking it to take down a controversial poster featuring members of the armed forces that was being used as part of a political campaign. The poster in question was uploaded by the social media account of O.P. Sharma and carries a picture of Wing Commander Abhinandan Varthaman, in direct contravention of an advisory issued by the Commission in 2013, in which it had asked political parties to desist from using photographs of defence personnel or any events associated with them for political propaganda.

What you can do: The 17th Lok Sabha elections also mark the debut of the cVIGIL app, a mobile application created by the Election Commission to provide a reliable mechanism enabling citizens to submit and track complaints relating to breaches of the MCC. In addition, the Voter Helpline (1950) can also be used to register complaints about potential breaches of the MCC. With the upcoming election shaping up to be one of the most expensive to date, the EC is working to ensure a free and fair election process, without voter coercion or fake news to skew the balance.

Happy Voting!

Launching our Mapping Report on ‘Hate Speech Laws in India’

We are launching our report on hate speech laws in India. The report maps the criminal and procedural laws, along with the medium-specific laws, that the state uses to regulate hate speech.

This report was launched last week at a panel on ‘Harmful Speech in India’, held as a part of UNESCO’s World Press Freedom Day. The panel comprised Pamela Philipose, Aakar Patel, Chinmayi Arun and Sukumar Muralidharan. The panelists discussed the state of harmful speech in the country and the regulatory issues arising from the proliferation of hate speech.

We hope that this report can serve as a basis for further research on hate speech in India, and as a resource for practicing lawyers, journalists and activists.

We would appreciate any feedback; please feel free to leave a comment or to write to us.

The report can be found here.

Pachauri defamation suit: Court rejects interim gag order plea

The Patiala House court at Delhi has rejected R. K. Pachauri’s plea for an interim gag order against NDTV, Bennett Coleman and Co., and the India Today Group. The media houses had been made defendants in a defamation suit filed by him in 2016.

In 2015, an FIR had been filed against Pachauri by a woman employee of TERI (The Energy and Resources Institute, of which he was then the Chief) accusing him of sexual harassment. Following these allegations, several other women had spoken out about similar experiences while they had worked at the organization. The allegations and ongoing proceedings had received extensive coverage in the media.

Pachauri filed for defamation against multiple parties, including the media houses, one of the women who had spoken out, as well as her lawyer. He sought a gag order against the media houses, and damages of Rs. 1 Crore from the victim and her lawyer.

We have written previously about how suits such as these are in the nature of ‘SLAPP’ suits – Strategic Lawsuits Against Public Participation. These are cases where powerful individuals and corporations use litigation as a way of intimidating or silencing their critics. The defendants are usually media houses or individuals, who are then forced to muster the resources to mount a legal defence. Even if they are able to secure a victory in court, it is at the cost of a protracted and expensive process.

The court has now refused to grant an interim injunction against the media houses, noting the right of the public to be aware of the developments. It further noted that public figures can be held to a higher degree of scrutiny by the public. However, it also held that further reportage must carry Pachauri’s views, and indicate that the matter is still pending before the Court. The text of the order may be found here.

Facebook and its (dis)contents

In 2016, Norwegian writer Tom Egeland uploaded a post on Facebook listing seven photographs that “changed the history of warfare”. The post featured the Pulitzer-winning image, ‘The Terror of War’, which depicts a naked nine-year-old girl fleeing a napalm attack during the Vietnam War. Facebook deleted the post and suspended Egeland’s account.

A Norwegian newspaper, Aftenposten, while reporting on the suspension, used the same image on its Facebook page. The newspaper soon received a message from Facebook demanding that the image be either removed or pixelated. The editor-in-chief refused to comply, in an open letter to Mark Zuckerberg, noting his concern at the immense power Facebook wielded over speech online. The issue escalated when several Norwegian politicians, including the Prime Minister, shared the image and were temporarily suspended from Facebook as well.

Facebook initially stated that it would be difficult to distinguish between instances where a photograph of a nude child could be allowed and those where it could not. However, in the face of widespread censure, the platform eventually decided to reinstate the image owing to its “status as an iconic image of historical importance.”

This incident brought to light the tricky position Facebook finds itself in as it attempts to police its platform. Facebook addresses illegal and inappropriate content through a mix of automated processes and human moderation. The company publishes guidelines, called its ‘Community Standards’, about what content is not appropriate for its platform. Users can ‘flag’ content that they think does not meet the Community Standards, which is then reviewed by moderators. Moderators may delete, ignore, or escalate flagged content to a senior manager. In some cases, the user account may be suspended, or the user asked to submit identity verification.
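
To make the workflow described above concrete, here is a minimal, hypothetical sketch of a flag-review-escalate pipeline. This is not Facebook’s actual system or API; the names, types and decision logic below are illustrative assumptions only.

    # Hypothetical sketch of the flag -> review -> escalate flow described
    # above; not Facebook's actual implementation. All names are invented.
    from enum import Enum, auto

    class Action(Enum):
        DELETE = auto()    # content violates the Community Standards
        IGNORE = auto()    # flag cleared, content stays up
        ESCALATE = auto()  # ambiguous case, referred to a senior manager

    def senior_review(post: dict) -> Action:
        # Placeholder for a senior manager's final judgment on hard cases.
        return Action.DELETE

    def moderate(post: dict, first_line_call: Action) -> Action:
        """Route a user-flagged post through first-line moderation."""
        if first_line_call is Action.ESCALATE:
            # Hard calls go up a level rather than being decided here.
            return senior_review(post)
        return first_line_call

    flagged = {"id": 1, "text": "reported content", "flags": 3}
    print(moderate(flagged, Action.ESCALATE))  # Action.DELETE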

As evident from the ‘Terror of War’ incident, Facebook has often come under fire for supposedly ‘wrong’ moderation of content, as well as for opacity in how its community review process is applied. It has been argued that content that is evidently in violation of the Community Standards is often not taken down, while content that should be safe is censored. For instance, Facebook courted controversy again when it was accused of blocking content and accounts documenting the persecution of the Rohingya Muslim community in Myanmar.

Closer home as well, multiple instances of Facebook’s questionable moderation practices have come to light. In October 2017, Raya Sarkar, a law student based in the United States, created what came to be called ‘the List’, which named over 70 prominent academics who had been accused of sexual harassment. The approach proved extremely controversial, sparking debates about due process and the failure of institutional mechanisms to address harassment. Facebook blocked her account for seven days, which proved equally contentious. Sarkar’s account was restored only after Facebook staff in Palo Alto were contacted directly. Similar instances of seemingly arbitrary application of the Community Standards have been reported; in many cases, accounts have been suspended and content blocked without notice, explanation or recourse.

Content moderation inherently involves much scope for interpretation and disagreement. Factors such as context, as well as cultural differences, render it a highly subjective exercise. Algorithms do not appear to have reached sufficient levels of sophistication, and there exist larger issues associated with the automated censoring of speech. Human moderators are by all accounts burdened by the volume and the psychologically taxing nature of the work, and are therefore prone to error. The way forward should therefore be, first, to ensure that transparent mechanisms exist for recourse against the removal of legitimate speech.

In light of the ‘Terror of War’ incident, Facebook responded by updating its Community Standards. In a statement, it said that it would allow graphic material that would be “newsworthy, significant, or important to the public interest — even if they might otherwise violate our standards.” Leaked moderator guidelines in 2017 opened the company up to granular public critique of its policies. There is evidently scope for Facebook to be more responsive and consultative in how it regulates speech online.

In June 2017, Facebook reached 2 billion monthly users, making it the largest social network, and a platform for digital interaction without precedent. It has announced plans to reach 5 billion. With the influence it now wields, it must also embrace its responsibility to be more transparent and accountable to its users.

Update from the Supreme Court – Aadhaar linking and Sabu Mathew George vs. Union of India

Aadhaar linking 

With regard to the pending matter of linking Aadhaar with certain services, the Bench stated that the hearing for interim relief would take place tomorrow (14/12). In addition, the Centre issued a notification on the 12th of December stating that the deadline for linking Aadhaar with bank accounts, previously the 31st of December, was extended indefinitely. On the 13th of December, however, this deadline was fixed as the 31st of March. Our coverage of the Aadhaar linking matter can be found here and here.

Sabu Mathew George vs. Union of India

Today, the Supreme Court heard the ongoing matter of Sabu Mathew George vs. Union of India. In 2008, a petition was filed to ban advertisements endorsing sex-selective abortions from search engine results. Such advertisements are illegal under Section 22 of the Pre-conception and Pre-natal Diagnostic Techniques (PNDT) Act, 1994. Several orders have been passed over the last few years, the last of which was passed on April 13th, 2017. Through these orders, the Court had directed the Centre to set up a nodal agency where complaints against sex-selective ads could be lodged. The Court had also ordered the search engines involved to set up an in-house expert committee in this regard. The order dated April 13th stated that compliance with the mechanism in place would be checked thereafter. Our blog posts covering these arguments and other issues relevant to search neutrality can be found on the following links (1, 2 and 3).

In today’s proceedings, the matter was disposed of.

Senior counsel Sanjay Parikh, appearing for the petitioners, began by commenting on the working of the nodal agencies and the limits within which they function. He stated that search engines were ‘washing their hands off’ and trying to pawn off their responsibilities onto the government.

Counsel for the respondents argued that the petitioners displayed a fundamentally incorrect understanding of how the internet functioned. They stated that a blanket ban on content, as desired by the petitioners, would not be possible.

The respondents then stated that problematic content was taken down in the time period stipulated in the earlier orders. The petitioners refuted this statement.

The respondents once again stated that the petitioners ‘betrayed a lack of understanding’ of how search engines functioned.

The petitioners stated that search engines have been much more proactive and have had more success in taking down content related to child sexual abuse material and terrorism. As per the petitioners, this implies that search engines are capable of removing content in an efficient manner.

The respondents stated that material relating to sexual abuse usually consists of images and other visuals, as opposed to search terms or words. They stated that this was an important distinction, and one that would determine the extent to which search engines could efficiently take down content.

Referring to the affidavit filed, the petitioners reiterated that the government and the nodal agency were ‘helpless’ and would need further cooperation to prevent content from disseminating.

To this, the respondents stated that the government of India should block problematic URLs.

The petitioners then drew attention to the magnitude of illegitimate content on the internet, by discussing statistics from a YouTube search.

At this point, Chief Justice Dipak Misra interjected by stating that nodal agencies had to function in a competent manner and ensure that complaints were addressed in the requisite time period.

The petitioners responded stating that nodal agencies were finding it difficult to efficiently regulate content, since the takedown of URLs did not affect the availability of related illegitimate content on the internet.

The respondents then outlined the constraints within which search engines function. They stated that a search engine could only de-index illegitimate content, and that the content would otherwise continue to exist on the internet. They remarked on safe-harbour exceptions and also stated that filtering and indexing is an algorithmic process, which could only be regulated to a certain extent. Reiterating the algorithmic nature of the process, they stated that ‘one step could not be removed from the process’.
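
The distinction the respondents drew, that de-indexing removes a URL from search results without deleting the underlying page, can be illustrated with a small, hypothetical sketch. The toy ‘web’, URLs and function names below are assumptions for illustration, not any search engine’s real code:

    # Toy illustration of de-indexing: the engine stops returning a URL,
    # but the page itself remains hosted and reachable at its source.
    WEB = {
        "https://example.com/ad-1": "offending advertisement",
        "https://example.com/article": "legitimate news article",
    }

    INDEX = set(WEB)  # URLs the engine is willing to surface

    def deindex(url: str) -> None:
        """Remove a URL from search results only; the hosted copy persists."""
        INDEX.discard(url)

    def search(term: str) -> list:
        """Return indexed URLs whose content mentions the term."""
        return [url for url in INDEX if term in WEB[url]]

    deindex("https://example.com/ad-1")
    print(search("advertisement"))            # [] -> no longer surfaced
    print("https://example.com/ad-1" in WEB)  # True -> still on the web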

They also reassured the petitioners that any problematic URLs that they were intimated of would be removed. However, proxy websites with similar content could still crop up. They stated that the possible permutations and combinations were endless, and that eliminating search results altogether was not possible. However, sponsored ads could be dealt with effectively. They also stated that dealing with every instance of infringement on an individual level would be impossible.

At this point, the Chief Justice asked the respondents to elaborate on what could be done.

The respondents stated that there was a need to understand the technology better.

The Bench then asked the petitioners if they could interact with the committee to better understand technical solutions.

Mr. Parikh, referring to an affidavit filed, stated that Google, in 2014, had displayed the ability to ‘proactively’ take down content, without being informed by external bodies.

The respondents stated that they would look into this.

The Bench concluded by stating that the nodal agency should hold a meeting with the respondents and the petitioners within 6 weeks.

Chief Justice Dipak Misra read out the order.

Mr. Sanjay Parikh, appearing for the petitioners, stated that the nodal agency, despite the orders passed, had not been able to stop the offending material from being used. According to Mr. Parikh, search engines alone have the capability to proactively remove offending material. Mr. Parikh also stated that there are other ways in which offending content can be removed by the search engines.

The counsel for the respondents stated that content can only be removed once it is pointed out, and once a specific URL is specified. There are other permutations and combinations to consider while regulating search results.

Senior Counsel Pinky Anand has stated that the nodal agency is hard at work and addresses complaints efficiently whenever it receives them.

The matter was disposed of.

The Supreme Court’s Free Speech To-Do List

Written by the Civil Liberties team at CCG

The Supreme Court of India is often tasked with adjudicating disputes that shape the course of free speech in India. Here’s a roundup of some key cases currently before the Supreme Court.

Kamlesh Vaswani vs. Union of India

A PIL petition was filed in 2013 seeking a ban on pornography in India. The petition also prayed for a direction to the Union Government to “treat watching of porn videos and sharing as non-bailable and cognizable offence.”

During the course of the proceedings, the Department of Telecommunications (DoT) ordered ISPs to block over 800 websites allegedly hosting pornographic content, despite the freedom of expression and privacy-related concerns raised before the Supreme Court. The Government argued that the list of websites had been submitted to the DoT by the petitioners, and that the DoT had blocked the websites without any verification. The ban was revoked after much criticism.

The case, currently pending before the Supreme Court, also has implications for the intermediary liability regime in India. Internet Service Providers may claim safe harbour from liability for content they host, as long as they satisfy certain due diligence requirements under Sec. 79 of the IT Act, read with the Information Technology (Intermediaries Guidelines) Rules, 2011. After the Supreme Court read down these provisions in Shreya Singhal v. Union of India, the primary obligation is to comply with Court orders seeking takedown of content. The petition before the Supreme Court seeks to impose an additional obligation on ISPs to identify and block all pornographic content, or risk being held liable. Our work on this case can be found here.

Sabu Mathew George vs. Union of India

This is a 2008 case, where a writ petition was filed to ban ‘advertisements’ relating to pre-natal sex determination from search engines in India. Several orders have been passed, and the state has now created a nodal agency that would provide search engines with details of websites to block. The ‘doctrine of auto-block’ is an important consideration in this case. In one of the orders, the Court listed roughly 40 search terms and stated that the respondents should ensure that any attempt at looking up these terms would be ‘auto-blocked’, which raises concerns about intermediary liability and free speech.

Currently, a note has been filed by the petitioner’s advocate, which states that search engines have the capacity to take down such content, but even upon intimation, only end up taking down certain links and not others. Our work on this case can be found on the following links – 1, 2, 3.

Prajwala vs. Union of India

This is a 2015 case, where an NGO (named Prajwala) sent the Supreme Court a letter raising concerns about videos of sexual violence being distributed on the internet. The letter sought to bring attention to the existence of such videos, as well as their rampant circulation on online platforms.

Based on the contents of the letter, a suo motu petition was registered. Google, Facebook, WhatsApp, Yahoo and Microsoft were also impleaded as parties. A committee was constituted to “assist and advise this Court on the feasibility of ensuring that videos depicting rape, gang rape and child pornography are not available for circulation”. The relevant order, which discusses the committee’s recommendations, can be found here. One of the stated objectives of the committee was to examine technological solutions to the problem – for instance, auto-blocking. This raises issues related to intermediary liability and free speech.

Call for Applications – Civil Liberties

Update: Deadline to apply extended to January 15, 2018! 

The Centre for Communication Governance at the National Law University Delhi (CCG) invites applications for research positions in its Civil Liberties team on a full time basis.

About the Centre

The Centre for Communication Governance is the only academic research centre dedicated to working on information law and policy in India, and in a short span of four years it has become a leading centre on information policy in Asia. It seeks to embed human rights and good governance within communication policy and to protect digital rights in India through rigorous academic research and capacity building.

The Centre routinely works with a range of international academic institutions and policy organizations. These include the Berkman Klein Center at Harvard University, the Programme in Comparative Media Law and Policy at the University of Oxford, the Center for Internet and Society at Stanford Law School, the Hans Bredow Institute at the University of Hamburg and the Global Network of Interdisciplinary Internet & Society Research Centers. We engage regularly with government institutions and ministries such as the Law Commission of India, the Ministry of Electronics & IT, the Ministry of External Affairs, the Ministry of Law & Justice and the International Telecommunication Union. We work actively to provide the executive and judiciary with useful research in the course of their decision making on issues relating to civil liberties and technology.

CCG has also constituted two advisory boards, a faculty board within the University and one consisting of academic members of our international networks. These boards will oversee the functioning of the Centre and provide high level inputs on the work undertaken by CCG from time to time.

About Our Work

The work at CCG is designed to build competence and raise the quality of discourse in research and policy around issues concerning civil liberties and the Internet, cybersecurity and global Internet governance. The research and policy output is intended to catalyze effective, research-led policy making and informed public debate around issues in technology and Internet governance.

The work of our civil liberties team covers the following broad areas:

  1. Freedom of Speech & Expression: Research in this area focuses on human rights and civil liberties in the context of the Internet and emerging communication technology in India. Research on this track squarely addresses the research gaps around the architecture of the Internet and its impact on free expression.
  2. Access, Markets and Public Interest: The research under this area will consider questions of access, including how the human right to free speech could help to guarantee access to the Internet. It would identify areas where competition law would need to intervene to ensure free, fair and human rights-compatible access to the Internet, and opportunities to communicate using online services. Work in this area will consider how existing competition and consumer protection law could be applied to ensure that freedom of expression in new media, and particularly on the internet, is protected given market realities on the supply side. Under this track, we will put out material on the net neutrality concerns that are closely associated with competition, innovation, media diversity and the protection of human rights, especially the rights to free expression and to receive information, and particularly with substantive equality across media. It will also engage with existing theories of media pluralism in this context.
  3. Privacy, Surveillance & Big Data: Research in this area focuses on surveillance as well as data protection practices, laws and policies. The work may be directed either at the normative questions that arise in the context of surveillance or data protection, or at empirical work, including data gathering and analysis, with a view to enabling policy and law makers to better understand the pragmatic concerns in developing realistic and effective privacy frameworks. This work area extends to the right to be forgotten and data localization.

Role

CCG is a young and continuously evolving organization, and members of the Centre are expected to be active participants in building a collaborative, merit-led institution and a lasting community of highly motivated young researchers.

Selected applicants will ordinarily be expected to design and produce units of publishable research with the Director(s)/senior staff members. They will also recommend, and assist with designing and executing, policy positions and external actions on a broad range of information policy issues.

Equally, they will be expected to participate in other work, including writing opinion pieces, blog posts, press releases and memoranda, and to help with outreach. The selected applicants will also represent CCG in the media and at events, roundtables and conferences, and before relevant governmental and other bodies. In addition, they will have organizational responsibilities such as providing inputs for grant applications, networking, and designing and executing Centre events.

Qualifications

The Centre welcomes applications from candidates with advanced degrees in law, public policy and international relations.

  • Candidates should preferably be able to provide evidence of an interest in human rights / technology law and / or policy / Internet governance / national security law as well. In addition, they must have a demonstrable capacity for high-quality, independent work.
  • In addition to written work, a project/programme manager within CCG will be expected to play a significant leadership role. This ranges from proactive agenda-setting to administrative and team-building responsibilities.
  • Successful candidates for the project/programme manager position should show great initiative in managing both their own and their team’s workloads. They will also be expected to lead and motivate their team through high-stress periods and in responding to pressing policy questions.

However, the length of your resume is less important than the other qualities we are looking for. As a young, rapidly-expanding organization, CCG anticipates that all members of the Centre will have to manage large burdens of substantive as well as administrative work in addition to research. We are looking for highly motivated candidates with a deep commitment to building information policy that supports and enables human rights and democracy.

At CCG, we aim very high and we demand a lot of each other in the workplace. We take great pride in high-quality outputs and value individuality and perfectionism. We like to maintain the highest ethical standards in our work and workplace, and love people who manage all of this while being as kind and generous as possible to colleagues, collaborators and everyone else within our networks. A sense of humour will be most welcome. Even if you do not fit the requirements mentioned in the two bulleted points above but bring to us the other qualities we look for, we would love to hear from you.

[The Centre reserves the right to not fill the position(s) if it does not find suitable candidates among the applicants.]

Positions

Based on experience and qualifications, successful applicants will be placed in the following positions. Please note that our interview panel has the discretion to determine which profile would be most suitable for each applicant.

  • Programme Officer (2-4 years’ work experience)
  • Project Manager (4-6 years’ work experience)
  • Programme Manager (6-8 years’ work experience)

A Master’s degree from a highly regarded programme might count towards work experience.

CCG staff work at the Centre’s offices on National Law University Delhi’s campus. The positions on offer are for a duration of one year, and we expect a commitment for two years.

Remuneration

The salaries will be competitive, and will usually range from ₹50,000 to ₹1,20,000 per month, depending on multiple factors including relevant experience, the position and the larger research project under which the candidate can be accommodated.

Where candidates demonstrate exceptional competence in the opinion of the interview panel, there is a possibility for greater remuneration.

Procedure for Application

Interested applicants are required to send the following information and materials by December 30, 2017 to ccgcareers@nludelhi.ac.in.

  1. Curriculum Vitae (maximum 2 double-spaced pages)
  2. Expression of Interest in joining CCG (maximum 500 words).
  3. Contact details for two referees (at least one academic). Referees must be informed that they might be contacted for an oral reference or a brief written reference.
  4. One academic writing sample of between 1000 and 1200 words (essay or extract, published or unpublished).

Shortlisted applicants may be called for an interview.