Reining in the Political ‘Wild West’: Campaign Rules for the 21st Century

Author:
Michela Palese, former Research and Policy Officer

Posted on the 4th February 2019

Foreword

Rt Hon Dame Cheryl Gillan MP

Our democracy is at a critical juncture. We live in a time when our democratic processes face considerable threats from a range of sources – from ‘dark ads’ and fake news, to foreign interference and the misuse of personal data on a monumental scale.

The year 2000 – when our election rules were largely written – was a time before social media as we know it. Twitter and Facebook did not exist. It was a time of floppy discs and dial-up internet. Smartphones were practically unheard of. In short, it was far more difficult to rapidly transmit disinformation or to channel millions of pounds into campaigns without scrutiny. So much has changed – yet our campaign rules have remained in the analogue age.

If we do not update the rules governing our elections and referendums, the credibility of our elections faces a ‘perfect storm’ of threats, as the chair of the Electoral Commission has warned.

The increasing use of online campaigning has exposed the many loopholes that can allow people to circumvent our rules perfectly legally in the digital realm. The traditional principles of fairness and transparency underlying our democratic processes are being undermined by a multiplicity of actors and processes. Online political campaigning has effectively become an unbridled Wild West.

The effects of this online Wild West go beyond these attempts at undermining our rules and regulations. The quality of political discourse itself is under attack, as the Electoral Reform Society showed with their 2016 report It’s Good to Talk: debate has become increasingly polarised and tribal, facilitated by online filter bubbles and echo chambers. This, combined with poor quality information, has created a toxic climate which is stifling genuine political debate.

As rapporteur on referendums for the Committee on Political Affairs and Democracy of the Council of Europe’s Parliamentary Assembly, I am well aware of the challenges that face our election and referendum rules – and of the remarkable work that is being done to address these threats.

The need to look at this issue in the round is vital given the huge overlap between the issues involved. Yet to date there has been little to thread these responses and research together. That is why this report brings together an unprecedented range of voices – from regulators to campaigners and academics – to address these challenges and offer solutions on what digital-age campaign regulation would look like.

When faith in the integrity of elections is undermined, democracy suffers. Tom Hawthorn of the Electoral Commission looks at what should be done to enhance voters’ confidence in digital campaigning. He sets out the actions that can and should be taken by legislators and social media companies to update our election rules and increase transparency in digital campaigning.

Academics Martin Moore and Damian Tambini similarly raise the pressing need to update our outdated electoral law, and reinforce the calls made by our regulators to ensure they have sufficient powers to deal with the changes brought about by online campaigning.

In his contribution, Stephen Kinnock MP considers how we can enhance transparency around campaign finance. To this end, in collaboration with Fair Vote and the Electoral Reform Society, he announces the planned launch of an All-Party Parliamentary Group on Electoral Campaigning Transparency that seeks to investigate how our democratic processes can best be protected.

Legal expert Bethany Shiner examines the laws surrounding ‘big data’ in politics, and what a statutory code of practice – as proposed by the ICO – should take into account.

Kyle Taylor, founder of Fair Vote, considers how we can constrain the role of money in politics in the digital age and update our election rules. He sets out three areas where urgent reform is needed – such as setting funding caps and modernising spending reporting.

Cassie Staines of Full Fact focuses on misinformation and how best to respond to it. Rather than overreacting to the ‘panic’ about fake news, she highlights that we need to increase our democracy’s resilience against misinformation. She argues any action against misinformation must be proportionate and cautious about potential unintended consequences.

In conversation with the ERS, Deputy ICO Commissioner Steve Wood discusses how the data regulator is navigating the changing environment around the political use of personal data. He speaks about why an ethical pause in political advertising might be the best way of getting different actors to come together, and assesses the powers of the ICO and Electoral Commission.

In their contribution, Doteveryone reiterate their call for the establishment of an Office for Responsible Technology to address the challenges of the existing regulatory landscape. This Office would be an independent regulator tasked with empowering existing and future regulators, informing the public and policymakers about online benefits and harms, and supporting people to seek redress when their digital rights have been breached.

Finally, Josh Smith of the Centre for the Analysis of Social Media at Demos considers the future of political campaigning in the face of new and advanced forms of technology, such as AI and deep learning. As he argues, technologies should be used to improve our political processes – not to manipulate them – and should be clearly understood by users and targets.

The contributors to this report have suggested proposals that are realistic and feasible, some of which would require little change to our existing rules.

A common thread running through all contributions is the urgent need to increase transparency in relation to campaign messaging. As stated in the report I presented to the Committee on Political Affairs and Democracy of the Council of Europe’s Parliamentary Assembly, “voters should be able to find out what claims campaigners are making and who is making them. They should also be able to see if campaigners are putting out contradictory messages to different groups of voters or seeking to portray different images of themselves to different voters.”

Though they acknowledge that tech platforms have recently taken some steps in the right direction, all contributors agree that it would be unwise to leave the regulation of online campaigning to private tech companies alone. That is no way to defend the integrity of our democracy, and action is needed to ensure that efforts to rein in the online Wild West have the best interests of the democratic community as a whole at their heart.

The calls for a comprehensive review of our campaign laws have never been so widespread. Regardless of when the next election or referendum takes place, now is the time to bring our rules into the 21st century – before this spirals out of our control.

Introduction

Dr Jess Garland, Director of Policy and Research, ERS

Political party campaigning has shifted dramatically online in recent years. To give just one example: political parties spent around £1.3 million on Facebook adverts during the 2015 general election. This more than doubled two years later, with parties spending around £3.2 million on Facebook in the 2017 campaign.

This report coincides with the 15th anniversary of Facebook’s launch. It is also 19 years since our main election rules were created. Since 2003, the Electoral Commission has been calling for online ‘imprints’ (showing who has paid for and published political materials), yet no government has grasped the nettle and taken the necessary action to update the rules.

As the use of online political campaigning has grown, so too have concerns about the implications for our democratic processes. Though online campaigning is not new, it has changed significantly in terms of scale, reach and sophistication.

Underlying these concerns is the fact our rules have not kept up with the shifting nature of campaigning, and our regulators appear powerless in the face of threats. Despite recent scandals, online political campaigning remains an almost entirely unregulated Wild West.

This has implications for the principles of transparency, fairness and the notion of a level playing field that are supposed to govern our elections.

Imprint requirements, for example, apply to campaign material in print, but not online. This means that voters cannot be certain of who created an online political advert. It also has implications for the spread of mis- and dis-information: purveyors of false or misleading information cannot be held to account if their identity cannot be verified.

‘Dark ads’ can be micro-targeted to individual voters who may not be aware of the fact they are being targeted, and why. As these ads are visible only to the creator and the individual or group being targeted, different voters can be targeted with conflicting information without the sender facing any scrutiny.

These techniques, as well as playing into tribalism and polarisation in politics, are also moving democratic life outside of our shared public space.

The shift to online campaigning also creates problems for regulating money in politics and for attempts to create a level playing field. With online material, cost does not have the same direct correlation with reach that it does with printed materials: lower spending does not necessarily mean fewer people seeing the ads.

Additionally, reporting of spending online is subject to limited regulatory oversight (parties, for example, do not need to provide a breakdown of social media spend). This makes it easier than ever to blur the line between what is spent at the local/constituency level and what is spent nationally. In this context, spending caps appear increasingly meaningless.

Traditionally, most political campaigning occurred around fixed political moments (elections and referendums) given the time, financial and resource costs involved. This is reflected in the two main pieces of legislation governing party campaigning – the Political Parties, Elections and Referendums Act (PPERA) 2000 and the Representation of the People Act (RPA) 1983. But the cheap and easy access to new campaigning techniques available online means that political parties have shifted to low-cost, 24/7 campaigning, outside of regulated periods.

The growing role of a handful of private tech companies and external agencies exacerbates these concerns. Tech platforms do not have the same liability as traditional news outlets, and the fact that they are not based in the UK raises jurisdictional and regulatory enforcement concerns. External agencies are increasingly involved in data collection and analysis, including outside of regulated campaign periods, but the extent of their involvement remains hard to ascertain.

For all these reasons, we support the proposals that are put forward by the contributors to this report:

  • In the short term, extending the imprint requirement to online campaign materials and improving how campaigners report funding and spending are two of the most readily achievable solutions. The government seems to recognise this and its consultation on imprints was a welcome and important first step in this regard.
  • The creation of a single online database of political adverts, which would be publicly available and easily searchable, would similarly increase transparency and allow voters to identify who has produced a piece of content.
  • Those charged with enforcing the rules should have sufficient enforcement powers and resources. That must involve strengthening the fines or sanctions so they can act as a meaningful deterrent against wrongdoing. The ICO’s powers were increased considerably in the past year, showing what can be achieved if there is political will.
  • Parties and the government must properly engage in efforts to establish a statutory code of practice for political parties and campaigners without delay.
  • More broadly, the ERS is calling for a comprehensive review and overhaul of our electoral law, which needs to be updated and future-proofed for the digital age. The fundamental principle must be to ensure that the public have faith in the democratic process. Alongside efforts to improve the quality of public debate itself, this could transform the murky world of online campaigning into a force for good.

There has been considerable work around campaign regulation in recent months – such as by the Digital, Culture, Media and Sport select committee, and by many contributors to this report.

We cannot risk another election or referendum being undermined by dodgy donors, dark ads and disinformation. Now is the time for politicians and parties to take charge of this issue. Political will is needed to fix these problems and secure our democratic processes.

Online political campaigning has the potential to increase citizens’ participation in our political processes significantly. The cheap and easy access to information, and the fact that everyone has – at least theoretically – equal voice online are important democratising features of digital campaigning. But our rules and laws need to be sufficiently robust to protect us from the potential threats of online campaigning while allowing us to reap its benefits.

One thing is for sure: this is even bigger than Brexit. Let’s give it the priority it deserves.

[Image: a wall of promoted Facebook posts]

Giving voters greater confidence

Tom Hawthorn, Head of Policy at the Electoral Commission

Not so long ago, election campaigning was all about knocking on doors, sending out leaflets and putting up billboard adverts. Over the last few years, we have seen a rapid increase in the use of digital tools in political campaigning. Overall, that’s a good thing: after all, elections depend on participation and on campaigners connecting with voters.

However, our rules and laws have not always kept pace with the increasing use of digital campaigning. For example, while printed campaign materials must carry an ‘imprint’ that sets out who paid for them, the same is not true for online materials. And our public opinion research shows that voters want to know more about who is targeting them with political campaigns online.

Digital campaigning is covered by the UK laws that govern party, campaigner and candidate spending. But we need to see greater transparency for voters. Last June we published a package of practical recommendations to address this.

There are two changes to the law that we want the UK’s governments to bring forward as soon as possible. Firstly, online materials produced by parties, candidates and campaigners should include an imprint stating who has created them. This would mean that, when voters scroll through their social media feeds and see an eye-catching advert trying to influence their vote, they know who is targeting them.

Secondly, the UK’s governments should update the law so that campaigners are required to provide more detailed information about how they have spent money on digital campaigns. The invoices and spending returns of campaigners often do not provide a clear picture of their digital activities. This needs to change so that these documents give voters greater transparency about campaign spending. It would also give us more information to check whether campaigners are following the spending rules, and a better basis to investigate where we suspect something is not right.

When the UK’s key piece of election law was first designed in 2000, parliament decided that political parties and campaigners needed to report spend on ‘advertising’. Today, this catch-all category makes it difficult to understand what is being spent and where. Campaigners report a sophisticated mixture of digital and offline campaigning as ‘advertising’. We estimate that 42 per cent of campaigners’ spending reported to us at the 2017 UK general election was on digital campaigning, which is an increase from 23 per cent in 2015. We want the law to be changed so campaigners sub-divide their spending returns in a more useful way, and we can understand more about how digital campaigning is growing.

In addition to legislative changes, we want social media companies to take concrete action. Facebook, Twitter and Google implemented differing approaches – with mixed results – before the referendum on abortion in Ireland and at the US midterm elections.

In the UK, we want to see them deliver on their proposals for clarity about where political adverts come from, and to publish online databases of political adverts in time for planned elections in 2019 and 2020.

Over the past year, we have been talking with the social media companies to make sure they understand the UK’s electoral laws and can design their tools to work well here. Facebook has launched a UK political ad library, and we are looking at how much useful information it provides to voters and regulators. We are still waiting to see how others, such as Twitter and Google, will adapt the tools they have trialled in the US to work in the UK. If voluntary action by social media companies is insufficient, the UK’s governments should consider direct regulation.

Another important area where reform is needed is campaigns’ sources of funding. UK election law is based on a clear principle that funding from abroad is not allowed. Since 2013, we have recommended that company donations should be funded from UK-generated activities only. This would require a change to the law in this area, as the current requirement is for companies to be registered and carry on business in the UK – but not for the companies’ funds to have originated through UK-based activities. In the digital era, this is an overdue safeguard to help ensure that online and other campaign activities are not funded by foreign sources.

Finally, the Electoral Commission needs the right tools to enable us to enforce electoral law in the digital era. This includes a significant increase to the maximum fine that we can impose on those who break the rules. This is currently £20,000 per offence. We are concerned that political parties and campaigners will simply accept our current fines as the cost of doing business. We need the power to impose sanctions that genuinely deter breaches of electoral law. We also want clearer powers to compel campaign suppliers, including social media companies, to provide us with information when we suspect the rules may have been broken. These powers would be similar to those recently given to the Information Commissioner.

Taking forward these changes will not resolve all concerns around political use of the internet, and technology and campaigning techniques will continue to evolve. But they would have an important positive impact on transparency, and therefore public confidence in digital campaigning.

The issues raised fall under the responsibilities of a range of bodies, including the Information Commissioner’s Office, so it is right for the Electoral Commission to continue working closely with these organisations.

For our part, we will continue to monitor how campaigners are communicating with voters, and will stand up for voters’ interests and ensure greater transparency in the digital age.

The risks of inaction

Martin Moore, Director of the Centre for the Study of Media, Communication and Power, and a Senior Research Fellow in the Policy Institute at King’s College London and Damian Tambini, Associate Professor in the Department of Media and Communications at LSE

Well over two years after the EU referendum, and almost a year after the Cambridge Analytica scandal broke, the Electoral Commission (EC) and the Information Commissioner’s Office (ICO) – the statutory bodies for fair elections and data protection – now agree that the relevant regulatory frameworks are broken.

At the same time, both bodies are acutely conscious that there may be another election or referendum in the not-too-distant future. Legislative and regulatory change is, therefore, urgent. Yet the government, and most parliamentarians, appear to share none of this sense of urgency. Even the most basic change, to extend the imprint law to cover online campaign communications, has yet to be implemented at the time of writing.

Why do the ICO and the Electoral Commission believe reform is necessary?

Both the ICO and the Electoral Commission have conducted lengthy investigations over the last eighteen months into alleged breaches of electoral and data protection law. The ICO investigation was, the regulator said, the largest and most complex in its history. It involved over 30 organisations, led to 11 warning letters to political parties, and fines up to the maximum possible for the ICO.

By the end, the ICO had become concerned enough about the political use of personal data to call on all parties to reflect on their responsibilities. “We are now”, the ICO warned, “at a crucial juncture, where trust and confidence in the integrity of our democratic process risks being undermined if an ethical pause is not taken.”

The Electoral Commission found its investigations into alleged breaches equally challenging and complicated. It lacked the powers to require relevant evidence from the technology platforms (most notably Facebook), it struggled to keep track of spending by all campaign participants, and it was unable to trace campaign funding back to its ultimate source. Eventually, when it found evidence to suggest criminal offences had been committed, it had to pass responsibility on to the National Crime Agency, since it lacked the remit and capacity to trace the origins of the money.

What do the regulators believe needs to be changed?

Since all major political parties have been criticised for processing data unlawfully and breaching consent rules, the ICO is calling for a statutory code of conduct to clarify the existing rules on data use by political parties and tie them in to a shared and agreed regime. In order to make sure such a code – and the law – is adhered to, it has said there should be an independent audit of each campaign’s use of personal data following future elections or referendums. For the Electoral Commission, the government urgently needs to make the sourcing of all campaign material transparent – and enforceable in law. In addition, the Commission wants parliament to extend its powers so that it can require information from digital platforms.

Would these changes address all existing problems?

A statutory code could do a lot to clear up confusion about the existing rules and stop the arms race between the parties. If it is carefully implemented and parties and campaigns are provided with training and clear information, it could clean up some of the muddiest aspects of data-driven campaigns. Some of this will require further consultation: the ICO has stressed that there are issues around ‘inferred data’ such as lookalike audiences and the consent regime.

But the wider issue of what is and is not permissible in the grey area of targeted manipulation and untruth will not be fixed by this code. Much of what would be considered unfair or wrong is, in fact, perfectly legal. Similarly, more transparency of political advertising would certainly help voters and, if done right, allow for wider scrutiny. Still, implementation will be hard – finding the source of printed campaign literature is far easier than finding the source of a fleeting digital message.

The ICO and the EC are to be admired for the work they have done. And pitied. The problem is that they are each working on a small piece of the complex jigsaw of regulatory reform that is desperately needed, and this is going to be a long hard slog. Not only the ICO and EC, but Ofcom, government services, parliament and a wide range of public bodies are part of the response. As they respond, so digital campaigning moves on at pace.

Meanwhile, none of the main political parties has taken leadership on electoral law reform.

Clearly, attention on Brexit has deprived many other issues – including reform of our election rules – of parliamentary focus. Plus, given that any reform of the system is likely to constrain parties’ future use of digital campaign tools, we should perhaps not be surprised that few parliamentarians are enthusiastic for change. But the longer that government and parliament take to acknowledge the transformation of our political system as a consequence of the digital revolution, the more this country’s democratic legitimacy will suffer.

Martin Moore and Damian Tambini jointly edited Digital Dominance: the power of Google, Amazon, Facebook and Apple, published by Oxford University Press in 2018.

Empowering the regulators

Stephen Kinnock, Labour MP for Aberavon and member of the Brexit Select Committee

Our democracy is crumbling. The EU referendum exposed many of Britain’s flaws, but perhaps none more so than our creaking system of campaign finance. Drip by drip we have learnt about the extent to which our democratic system has been flooded with ‘dodgy’ money and dirty data.

First, there was Cambridge Analytica, the disgraced big data company that used harvested Facebook data to sell elections to the highest bidder – a scandal that saw Facebook receive the maximum fine of £500,000.

Then there was Leave.EU – the unofficial Brexit campaign bankrolled by the insurance tycoon Arron Banks – which was recently found guilty of “multiple breaches of electoral law”, fined £70,000 and referred to the Metropolitan Police for suspected criminal offences. In July 2018, Vote Leave – the official Brexit campaign – was fined £61,000 for co-ordinating with BeLeave, which was in turn fined £20,000.

Banks, the self-styled ‘bad boy of Brexit’, has since been reported to the National Crime Agency (NCA) by the Electoral Commission because of suspicions that “money given … came from impermissible sources” and that Banks “knowingly concealed the true circumstances under which this money was provided”. Since February 2017, I have been asking the Commission to do more to investigate how Banks’ money was spent during the EU referendum and, more recently, why he has refused to disclose the source of his money to the Commission and the Digital, Culture, Media and Sport select committee, or why he met the Russian Ambassador eleven times in the run-up to the EU referendum, not the two, three or four times initially stated.

But the very fact that this case took so long to be referred to the National Crime Agency, or indeed that this saga came to pass in the first place, shows just how badly our institutions are struggling to keep on top of election finance. And these questions go far beyond Banks, Brexit and Britain.

A recent report by the Atlantic Council – Democracy in the Crosshairs: How Political Money Laundering Threatens the Democratic Process – included three case studies of opaque political funding: Banks, the German far-right party Alternative für Deutschland, and the staggering number of small donations received by Donald Trump’s campaign.

It is notable that 59 per cent of the $624 million in donations Trump received were smaller than the $200 threshold at which donors must be identified under US federal law – either evidence of staggering success in generating grassroots support, or evidence of financial wrongdoing on a grand scale.
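
In absolute terms – a simple back-of-the-envelope calculation from the figures above, not a figure taken from the Atlantic Council report – that share amounts to roughly

$$0.59 \times \$624\ \text{million} \approx \$368\ \text{million}$$

in contributions falling below the itemisation threshold.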

In an attempt to find solutions, I have been working with the Electoral Reform Society and Fair Vote to set up an All-Party Parliamentary Group (APPG) on Electoral Campaigning Transparency to investigate how we can better ensure our democratic processes are protected.

To start this debate, Fair Vote and I have set out a number of proposals:

  1. We should look at which powers sit best with the Electoral Commission – which works best as a regulator and policy body – and which should sit with the police. There should also be unlimited fines for electoral offences, rather than a maximum of £20,000, which is an insufficient deterrent.
  2. All political campaigns should be made to report spending online. We have a precedent for this with the Independent Parliamentary Standards Authority, which tracks MPs’ spending. This would make it easier for campaigns to track their spending and bring more transparency into elections.
  3. Financial transfers from designated campaign groups during referendums must be banned. Current rules allow the designated campaign to give up to £700,000 to groups as long as they do not coordinate their work, but it is surely unreasonable to think gifts of this size are entirely without expectation, particularly as they create the potential to evade spending limits.
  4. We should regulate paid political digital advertising in the election period with a digital bill of rights for democracy.

The Electoral Commission was established at a time when political campaigning centred around door knocking and leafletting. Digital campaigning and online political engagement have revolutionised politics to the extent that the Electoral Commission does not currently have the capacity to deal with investigations of this kind – an analogue regulator in a digital age, some might say.

It is imperative that we act now and give resources and legislative bite to the regulatory bodies designed to protect our democracy’s integrity. We cannot allow this issue to be side-lined or overlooked. This is far bigger than Brexit, and if we do not act now to fix our democracy the consequences could be even worse in the future.

As the Information Commissioner Elizabeth Denham recently said: “We are at a crossroads. Trust and confidence in the integrity of our democratic processes risk being disrupted because the average voter has little idea of what is going on behind the scenes.”

Our new APPG will seek to bring more transparency and 21st century solutions to the murky world of campaign finance.

The legal landscape

Bethany Shiner, Lecturer in Law, Middlesex University, London

The evolution of data-driven political campaigning has spawned an entire industry that has capitalised on personal data for political ends.

The scandal around the deceptive and opaque use of personal data and the global web of connections between political campaigns and corporate interests has exposed an approach towards the electorate that seeks to cajole and steer it, not through open and robust debate but through personalised, localised and private digital advertisements.

Such forms of political communication can be positive and empowering, but can also contain misleading, inaccurate or false information that cannot be easily scrutinised. The ICO’s proposal for a statutory code of practice seeks to promote dialogue between the regulators and the government, and encourages a comprehensive reflection on corporate and political practices.

However, due to the complexity of the issue, any attempts to further regulate political campaigning need to be carefully thought through to avoid being ineffective and having unintended consequences.

The ICO’s proposal for a statutory code of practice is an attempt to change the practices of ‘datafied’ political campaigning through clear, enforceable rules. The statutory code should establish standards in political campaigning and limits on the use of data in politics. What is unclear is how a statutory code of practice will sit alongside section 8(e) of the Data Protection Act 2018 which enables the processing of personal data for activities that “support or promote democratic engagement” such as communicating with electors, campaigning activities, and opinion gathering inside and outside election periods.

Of course, there are numerous actors involved in political campaigning, not just registered political parties and campaign groups but also lobby groups, interest groups, online platforms, individuals, foreign countries and private interests. The statutory code must provide further guidance on how section 8(e) may apply to private organisations paid to process data.

It would also be helpful to clarify how section 8(e) sits alongside the additional provisions applicable to sensitive personal data, which includes political opinions. Specifically, how are the methods of using personal data to reveal or infer sensitive information, such as political views, consistent with the Data Protection Act 2018?

For example, when data obtained from multiple sources was combined and analysed, political parties did not regard any information inferred from this process as ‘personal data’ – a conclusion with which the ICO disagrees.

In short, there is a distinction that needs to be clarified between using personal data for political purposes through surface-level data processing (such as using the electoral register alongside information submitted to a mailing list to send out political messages), and using personal data that is processed to infer sensitive personal data, such as political opinions.

Political parties and campaign groups invest heavily in data, financially and strategically, and there is nothing to stop them from doing so.

There is nothing to prevent the marketisation of data for political purposes. There is also nothing that regulates political communication outside of TV and radio political party broadcasts. Therefore, guidance on how micro-targeting can be consistent with the Data Protection Act 2018 would be welcome, bearing in mind the need for a distinction between when an individual is targeted based on data given freely with explicit consent and when an individual is targeted after the processing of other data sets to infer their political views.

One particular aspect of micro-targeted politics is disinformation – i.e. the content, not the method, of communication. The government has said it is already tackling disinformation through legislative and non-legislative initiatives. The democratic necessity of protecting freedom of expression means any initiative must not curb free speech.

There is a nuance here: some manipulative tools that seek to shape and engineer political discourse – including amplification, bots, troll farms and micro-targeting – carry content that is misleading and manipulative, but not false or illegal. In the fight against disinformation, we should not regulate political communication in an expansive or suppressive way. Instead, we should focus on the mechanisms of manipulation, not the content.

There have been suggestions that social media companies and intermediaries work closely with regulators and advise political parties on transparency and accountability when using data to target voters on those platforms. The Digital, Culture, Media and Sport committee has recommended that the Electoral Commission establishes a code for advertising through social media during election periods and considers whether social media campaigning should be restricted during the regulated period to registered political organisations or campaigns.

The Committee has proposed a new category for technology companies which is neither platform nor publisher, but something in between that establishes some liability to act against “harmful and illegal content”. Social media platforms are being urged to introduce transparency features – with the ICO and the Electoral Commission being consulted on those features and completing evaluations.

If intermediaries will be expected to monitor political content online (to determine what is harmful or illegal, and whether any restrictions are complied with), very careful thought must be given to how this can be done while preserving freedom of speech and not enforcing rules unfairly or in a discriminatory way.

Further, serious thought must be given to whether such power should be delegated to technology companies. The same digital interventions that can be heralded as promoting democratic engagement, such as Facebook’s ‘Get out the vote’ campaigns, can also be used to suppress democratic engagement or shape democratic discourse opaquely.

Such a role could make it more likely that intermediaries, such as Facebook, will disrupt political campaigns – as it did during the Irish referendum on the Eighth Amendment when, after public pressure, it blocked advertisements that originated from outside of Ireland.

Although this was a legitimate concern, such an intervention is in the gift of intermediaries that make judgement calls as moderators and are not subject to review in the way an administrative decision would be. Facebook’s action in the Irish referendum came late in the campaign cycle and was an unforeseen intervention disadvantaging some campaign groups because it disrupted campaign strategies. Interventions such as this should be predictable, consistent and transparent.

Although technology giants have vowed to self-regulate by taking steps towards greater transparency and better monitoring of electoral interference, it has been repeatedly shown that they are irresponsible and have no more regard for the democratic process than their fluctuating stock market value dictates they should. Facebook’s CEO has repeatedly refused to appear before parliamentary committees and reports have illustrated Facebook’s decision to not act on certain forms of electoral interference.

Regulators have so far encouraged dialogue with intermediaries but this should only be done to the extent of consultation. We cannot afford to dilute or compromise electoral integrity according to the desires of private interests and corporate profit.

Also see Bethany Shiner’s forthcoming article in Public Law journal entitled Big data, small law: how gaps in regulation are affecting political campaigning methods and the need for fundamental reform.

Avoiding the American model

Kyle Taylor, Founder and Director of Fair Vote Project

If you are interested in a preview of what British politics could look like in 20 years without urgent reform, look no further than the United States.

In less than two decades, a system that was already controlled by moneyed interests and required constant fundraising (some elected representatives report spending 80 per cent of their time asking for donations) has spiralled beyond a point of no return where corporations have no limit on what they can donate. In addition, ‘non-partisan’ third-party groups can spend an unlimited amount of money as long as they do not ‘coordinate’ with official campaigns.

Sound familiar? We got our first taste of this in the EU referendum where we now know Vote Leave and supposedly unaligned groups coordinated their work, overspending to the tune of hundreds of thousands of pounds. While there has been little political will or desire to take action (mainly because of partisan vested interests), the need for reform is the most urgent issue of our time. Democracy is already more unpopular with the citizenry than at any point since records have been kept. If our very way of life – the very foundation of how we ‘set up’ our society – falters, there is little chance we will be able to come back from the brink without some cataclysmic event.

Any type of electoral reform is almost immediately put into the ‘too hard’ basket as politicians and parties conflate a multitude of issues to discourage change and overcomplicate basic, simple problems.

There are three areas where urgent reform is not only fairly easy, but already has broad support:

  1. Re-democratise democracy by keeping big money out of politics.

    The most powerful constituency in a democracy should be the voters themselves. This should be self-evident, but we are dealing with a democratic landscape where that is no longer the case. Historically, the strict spending controls in UK elections have ensured a level of fairness. As a result of structural changes, both to how spending rules work and to the means by which campaigners campaign, this system is no longer fit for purpose.

    While spending caps should remain (and be better enforced – see point two below), the EU referendum and, in particular, the £8 million donation from Arron Banks have made it abundantly clear that our democracy also needs funding caps. These should not only limit the amount an individual, company, organisation or entity can give, but also require a clear trail to be certain of the source. These caps should be in place all the time, not just during an election campaign. There is no point restricting funding if, from five years to one minute before the regulated period, someone can completely escape scrutiny.
  2. Increase transparency by modernising reporting.

    At present, election spending is reported offline, consisting of a spreadsheet ‘top sheet’ and physical copies of individual invoices. These are often heavily marked up to represent ‘splits’ between local and national spending as well as ‘wastage’ – the percentage of a particular leaflet that was not actually distributed. In addition, there is no requirement to supply detailed evidence of online election spend – the fastest growing area of spending – beyond non-itemised receipts from digital platforms. These physical returns are then held by local councils for candidate expenditure and centrally for national expenditure.

    There are countless cases of individuals and entities having to pursue freedom of information requests to gain access to what are obviously public documents. Election spending should be reported in near real-time on a national online database that is easily and publicly accessible and searchable. This should include copies of all leaflets and digital ads produced, alongside audience details (who received what and why) and detailed reports of spend, reach and so on, which can then be cross-referenced against publicly available records held by online platforms themselves. This is easily the simplest way to push for rapid rule-following.

  3. Be 22nd century ready by closing digital loopholes.

    At the moment we are roughly three decades behind in properly legislating for our election system. We not only need to catch up. We need to be decades ahead. While the broader digital space needs adequate regulation, driven and overseen by an independent regulator, there are immediate changes we can make to close digital loopholes.

    The two most basic (and obvious) are: a) Applying the same standard of transparency to digital advertisements as is applied to physical leaflets, in the form of an election imprint. This ensures that any ad – no matter where it appears – can be traced back to a campaign and the campaign’s legally responsible party (the agent); and b) Making targeting data for digital adverts available within ‘two clicks’. A voter should know who is targeting them and on what basis. (A sketch of what a single machine-readable record combining these details might contain follows this list.)
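
To make the idea of a single, searchable register more concrete, here is a minimal sketch of the kind of fields one machine-readable entry might carry, combining the imprint, spend and ‘two clicks’ targeting information described above. It is purely illustrative – every field name and value is an assumption, not drawn from any existing Electoral Commission or platform schema.

```python
# Hypothetical sketch only: all field names and values are illustrative assumptions,
# not taken from any existing Electoral Commission or platform schema.
from dataclasses import dataclass, field
from datetime import date
from typing import List


@dataclass
class PoliticalAdRecord:
    """One entry in a hypothetical public database of political adverts."""
    ad_id: str                 # identifier assigned by the platform or regulator
    promoter: str              # imprint: who paid for and promoted the advert
    responsible_agent: str     # imprint: the campaign's legally responsible party
    first_shown: date          # date the advert first ran
    spend_gbp: float           # reported spend on this advert
    estimated_reach: int       # impressions reported by the platform
    targeting_criteria: List[str] = field(default_factory=list)  # who was targeted, and why
    creative_url: str = ""     # archived copy of the advert or leaflet


# Example entry with entirely made-up values.
example = PoliticalAdRecord(
    ad_id="AD-0001",
    promoter="Example Campaign Ltd",
    responsible_agent="J. Smith (registered agent)",
    first_shown=date(2019, 2, 4),
    spend_gbp=1250.00,
    estimated_reach=40_000,
    targeting_criteria=["age 25-34", "lives in Cardiff", "interested in housing policy"],
    creative_url="https://example.org/archive/AD-0001.png",
)
print(example)
```

Records published in a common format like this could then be cross-referenced against the platforms’ own ad archives, as suggested in point two above.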

The above are not only sensible recommendations – they are actionable almost immediately. Regardless of whether we have a general election or referendum any time soon, the public’s trust in our democratic processes and outcomes will continue to decline unless we take these problems seriously. Without real change now, the only path forward is the American one. That is not the future we want nor one we deserve.

[Image: American tweets]

Increasing the resilience of our democracy

Cassie Staines, Senior Policy Officer at the independent fact-checking charity Full Fact

There is a moral panic about ‘fake news’ which is prompting frightening over-reactions by some governments around the world. Notwithstanding this, misinformation represents a real risk to open democratic societies and we need to develop effective responses.

As the UK’s independent factchecking charity, at Full Fact we have been building an evidence base of misinformation in all its guises for over eight years.

While misinformation has existed in various forms for a very long time, the internet has opened the doors to a proliferation of sources and emerging technologies that have changed the rules, particularly around elections. It is easier than ever to hide knowledge in plain sight and this makes it harder for people to know where to place their trust.

We believe that – while immediate action is needed to tackle some more urgent issues – rushing to come up with quick solutions to the range of issues could do more harm than good. The realistic goal is not to eliminate mis- and dis-information, but to build resilience against them.

What is the harm?

It is important to understand the types of harm associated with misinformation and the evidence of their impact before deciding whether government action is necessary or appropriate.

In the context of elections, we see three main areas of potential harm:

  • Disengagement: The fast pace of technological development has led to the proliferation of information sources online and has changed the way voters consume campaign content, making it harder than ever for people to know where to place their trust. The sheer quantity of information available online, combined with having to assess the quality of content in terms of sources, means that there is a risk that people simply disengage and switch off.
  • Interference: There have been warnings from official sources of concerted election interference campaigns and there is also strong evidence that misinformation has had a wide reach during elections, at least in other countries. The evidence of the impact of these interferences – for example the impact on voter choice – is less clear.
  • Effects on beliefs and attitudes: We know that people have been seriously misinformed about the state of the world for as long as we have had data. But we do not yet know enough about how online political misinformation or state-sponsored disinformation affect people’s attitudes.

It is also important to consider that what happens between elections and before referendums may be just as important as what happens during official campaign periods. Harm can also extend beyond election disruption, to economic harm, abuse of power and even risk to life.

Actions to protect democracy

Misinformation can cause harm in an open democratic society. But it is important not to panic – the risk of harm from overreacting is potentially much greater. Any step to tackle misinformation must be cautious about potential unintended consequences on free speech and civil rights. Even relatively simple choices about what content should be amplified can inadvertently suppress the speech of certain groups. However, we do believe that immediate action is needed to tackle some urgent problems.

Electoral Law

In the UK, there are a number of steps that the government needs to take to update our election law to make it fit for purpose in the digital age, and to protect the integrity of our democracy.

The first is to mandate transparency for political advertising by collecting adverts into an online, publicly searchable database in real time, in machine-readable formats. The second is for the imprint rule (information about who is campaigning) to apply online as well as in print.

It is welcome that internet companies have taken some steps to increase transparency without waiting for parliament to catch up – but it is no substitute for proper democratic decision-making about how our democracy works.

Any election or referendum conducted under the current rules would be vulnerable to abuse. Currently, it is possible for a candidate to run a thousand different political campaigns in the same election, promising something different to each group they target. If we do not act, we risk undermining the principle that democracy is a shared experience.

Open information can tackle misinformation

As well as tackling misinformation and addressing the harms directly, it is also crucial that we build resilience against it in society. Providing high-quality and trusted information is an important part of an open democratic response.

The UK has an array of independent public bodies capable of informing public debate. We believe that we need to make much more use of these bodies and equip them, and the government and parliament, for 21st century communication if we are to maintain trust in public life in the face of campaigns to undermine it.

We need a proportionate response

At Full Fact we have called for action where we believe it is proportionate and can be beneficial. We have not called for government intervention in the content of information shared online or during political campaigns. This debate, and our thinking on it, has further to go.

Full Fact is the UK’s independent factchecking organisation. We have a cross-party board of trustees, and are funded by a range of charitable trusts, individual donors and corporate sponsors. We have received funding from Google and Facebook: details of our funding are available on our website.

You can read more about our recommendations to combat misinformation in our report: Tackling Misinformation in an Open Society.

Reining in the tech giants

Steve Wood, ICO Deputy Commissioner (Policy), in conversation with the ERS Head of Communications, Josiah Mortimer

Josiah (ERS): Do you feel you have the powers you need to adequately regulate the use of data by political campaigns?

Steve (ICO): GDPR [the EU General Data Protection Regulation that came into force in 2018] and the Data Protection Act 2018 equip us very well, in terms of being able to fine organisations up to four per cent of global turnover, or £17 million, and of our powers of compulsory audit, no-notice inspections, demands for access, and so on.

One missing piece is that we want to be able to issue a statutory code of practice for political use of data. We’re waiting for a response from government to that recommendation, but we feel this should be a statutory code in the same way as we have codes of practice in other areas – for example, on data protection in the media and on children’s privacy.

We also feel that this will set out a level playing field for all of the different actors who are using data for political purposes, about what they can do with data. It shouldn’t create a chilling effect: it should enable parties to understand how they can use digital campaigning techniques, because we’re not as a regulator saying, ‘you can’t use digital campaigning tools’, but it’s about how you use them safely and transparently in accordance with the law.

Even though we haven’t had a response from the government yet, we’ve already started work on it, because we’re aware of the appetite for guidance… We’ve got a call for views open at the moment to do that [establish a voluntary code].

Compared to the Electoral Commission – and I think the Electoral Commission has been quite open about this – the ICO is much better equipped.

Josiah: Is there a danger with referendums that the campaigns and directors can just shut up shop and evade responsibility once a campaign is over?

Steve: It’s certainly one of the challenges we found in investigating the referendum campaigns: lots of people who we wanted to speak with weren’t engaged with the campaigns anymore.

During referendum campaigning particularly that seems to be a challenge. Political parties are always there, they continue to be data controllers under GDPR, so they’ll always have responsibilities in between elections. But we’re not just focused on what campaigns do during regulated periods: we’re different to the Electoral Commission and we will look at the use of data whenever we need to.

Josiah: You’d like to see an ‘ethical pause’ in political advertising. Can you tell us about that?

Steve: Yes, we’ve not [said] political parties, for example, should stop using those tools completely, but more [that there should be] a pause in which all the different actors can come together, including politicians, to say, ‘What kind of digital campaigning do we want in the future? What should the next general election look like?’

There’s also an ethical dimension to this in terms of the question: ‘What’s the way we want our digital campaigning system to evolve, in terms of what’s a reasonable use of data?’ Particularly highlighting the importance that this has in a democratic context, not just a commercial context.

We have our own concerns about online advertising and manipulation in the commercial context, particularly because it’s intersecting with democracy. It needs that wider debate.

The Electoral Commission also talk about the positive side of digital campaigning. More people are now engaged in actions and some of that has come about through digital tools encouraging people to get involved.

So it’s important to use that pause to look at all of those different issues in the round: to enable all of the key players who are all going to have to make decisions, whether it’s parliament passing additional laws, government taking measures, different regulators taking measures, businesses taking measures. There’s probably not just one key player in this: it’s not just the ICO, because it’s so complicated.

All that must come together to have a fuller understanding of the type of digital campaigning we want. Because we can’t ignore it, it’s not going to go away. So we’re not saying there should be a moratorium on the use of digital advertising, but there needs to be that stepping back, and it needs to happen soon.

In terms of that wider debate, it has started to happen: the DCMS select committee has done a lot of work on this as has civil society.

We’ve also made a recommendation to work with the Centre for Data Ethics and Innovation on a ‘citizens’ jury’ to look at these issues as well. That would really be part of the ethical pause. How can we really sit down with the public over quite a long deliberative process, to understand the public’s views and how they develop?

Josiah: Do you think there’s a knowledge gap there in terms of people’s rights?

Steve: I think the public are relatively unaware about how micro-targeting works online. And they’re probably unaware of how political parties are using micro-targeting too.

We’ve had the benefit of GDPR coming into force in May 2018. What the research indicates to us is that people are becoming more aware, and more people are making requests for their own data.

There is a good level of improvement from organisations in providing better transparency. We’re not there yet, but GDPR has pushed that forward. People are becoming more aware of their rights, so there’s improving transparency by organisations, but there’s still quite a way to go.

Josiah: Given that electoral and referendum legislation hasn’t changed since around 2000, do you think that there needs to be fresh election legislation to keep up with all this change?

Steve: We responded to the Cabinet Office consultation on that recently, supporting digital imprints for political ads. And [there’s] the work which the online platforms have started to do – Facebook has now got their process where you can go to a database of online political ads. So transparency is improving, but it might need to be actually required in law. We don’t know how well that [social media transparency] is working yet. A number of journalists have been able to circumvent the registration system for political advertising there.

Josiah: Given the plethora of regulators involved in this field, do you think it would be easier to have one overarching political campaign regulator, or do you think it’s working as it is now?

Steve: I think it can work well with a strong working relationship between the regulators. They are quite distinct activities – regulating personal data and regulating elections. So, I think those need to be separate, but they need to complement each other. It’s important that the regulators are not constrained, so that legally they can work together, which we are able to do – e.g. we can share information with each other.

For example, if we see evidence that the Electoral Commission might want to know about for an investigation, we can share that data with them. [But while we] work together, I think they’re quite different systems.

Also, our investigations don’t just cover political parties. So, we look at the commercial players, like Cambridge Analytica, and who supplied the data. For example, we fined a company called Emma’s Diary which released data to the Labour Party. We can follow the data wherever it goes. I think that has to be something the data protection regulator does, rather than an overarching political and electoral regulator.

We’re quite comfortable with there being two regulators, but joining up is key. And making sure our powers are comparable too: the Electoral Commission should also have the opportunity to make a case for strengthening its powers.

We’re working on stronger international networks on these issues, because the concerns about a lack of transparency, and fair [data] processing by technology companies are the same in Brazil or wherever you are. These issues keep cropping up.

The best system we have for cooperation at the moment is European-based, but we hope that will grow into a global system over the next five to ten years.

Josiah: We’re obviously leaving the EU quite soon. Is there a risk that these powers will disappear and not immediately be replaced?

Steve: The government is being very clear that the GDPR, the data protection law we have now, will be copied over into UK law at the point of exit, so there’s a commitment there to continuity. What we don’t know is how we might be able to access the European system.

Josiah: That will presumably be difficult if there’s no deal?

Steve: Yes, it will be, probably.

Josiah: Looking at the next five to ten years for elections, what do you think the biggest data trends are going to be, in terms of parties and campaigners using these tools?

Steve: The Demos report for us highlighted a lot of the different trends: the increased sophistication in techniques, the use of machine learning, and artificial intelligence. Bigger pools of data are being used as well, so you can learn an awful lot about people if you plug in more data from their devices – data which describes a lot about lifestyle, behaviour, health, etc.

There’s a lot more data which can be plugged into the data sets, which could be analysed and used to predict which categories voters might fall into. That’s the area the commercial sector is already looking at in terms of customer segmentation, and it can be used for voter segmentation. It’s particularly as [these] areas become ever more sophisticated that transparency is lacking.

It’s important that the political parties take the right steps before they deploy these technologies, so that they’re properly assessing the privacy risk.

We’re equally aware of the positive aspects of digital campaigns and engaging groups who haven’t been active before. We mustn’t lose sight of those points.

We have got ways we can start to improve trust and confidence in the system. Taking these actions across the board – everybody doing their bit – is key. [We need] to work on accountability for all of the different parts of the system: the parties, the data brokers, the analytics companies. It’s got to work in a systematic way to achieve that.

But we feel optimistic that we’ve made a start. We’ve pulled back the curtain and now it’s also for others to do what they need to do to take things forward.

A new Office for Responsible Technology

Jacob Ohrvik-Stott, Researcher for Doteveryone, an independent think tank championing responsible technology for a fairer future

The world is changing fast and our regulations cannot keep pace. Technologies transform at speed but legislation lags behind. This revolution is happening in every part of what is now a digital-first society. But there are few areas that expose the gap between digital practices and analogue rules as glaringly as digital political campaigning. And there are few areas where the effects of regulation’s failure to adapt are more acutely felt.

In November 2018, Doteveryone published recommendations for a new Office for Responsible Technology to tackle these challenges. The Office would sit across the regulatory landscape, empowering existing and future regulators, informing the public and policymakers about online benefits and harms, and supporting people to seek redress when their digital rights have been breached.

Digital political campaigning is in urgent need of this kind of reform, but regulation in this area has not even remained in step with that of the offline world, let alone been updated to reflect the realities of today’s digital technologies.

As online political campaigning practices continue to evolve, one-off changes to legislation and regulators’ powers are likely to become obsolete by the time they have reached the statute book. Deep-rooted, systemic change is needed to make the regulatory ecosystem more agile, resilient and intelligent in the long-term. An Office for Responsible Technology could lead this much-needed change, bringing regulation of political campaigning into the digital age.

Empowering regulators and informing the public

In the UK, the work of the Electoral Commission (EC), Ofcom, Information Commissioner’s Office (ICO), Advertising Standards Authority (ASA) and Independent Press Standards Organisation (IPSO) all overlaps with various aspects of the digital political campaigning landscape. Despite, or perhaps because of, this, many parts of this landscape sit in the grey areas between regulators with too little scrutiny. An Office for Responsible Technology could close these gaps in regulation by leading an independent review, making recommendations to government on how to bring regulations up to date, who should enforce them, and the changes to resources and capabilities needed to deliver them.

It may, for example, recommend that the Political Parties, Elections and Referendums Act 2000 (PPERA) be amended to give the EC the flexibility to apply its offline powers online, including mandatory imprinting of online political advertising, reporting of online campaign spending, and setting minimum group sizes for targeted online advertising. It could also call for the ICO, ASA and the EC to develop a joint code of practice for online political advertisers, or place a statutory obligation on social media companies to report the number of fake accounts spreading political messages and to set up ‘open APIs’ for third parties to scrutinise political advertising data.
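
To make the ‘open API’ idea concrete, a minimal sketch of what third-party scrutiny might look like is shown below. No such statutory API exists yet, so the endpoint, parameters and field names are invented purely for illustration.

    # Hypothetical sketch only: there is currently no statutory 'open API' for
    # political advertising data, so the endpoint, parameters and field names
    # below are invented for illustration.
    import requests

    BASE_URL = "https://ads.example-platform.org/v1/political-ads"  # hypothetical endpoint

    def fetch_political_ads(advertiser=None, date_from=None, date_to=None, page_size=100):
        """Retrieve a page of political ad disclosures for third-party scrutiny."""
        params = {
            "advertiser": advertiser,
            "date_from": date_from,
            "date_to": date_to,
            "page_size": page_size,
        }
        response = requests.get(BASE_URL, params={k: v for k, v in params.items() if v is not None})
        response.raise_for_status()
        return response.json()  # e.g. spend, impressions and targeting criteria per ad

    if __name__ == "__main__":
        ads = fetch_political_ads(advertiser="Example Campaign Ltd", date_from="2019-01-01")
        for ad in ads.get("results", []):
            print(ad.get("id"), ad.get("spend_gbp"), ad.get("targeting_summary"))

The point is the design rather than the detail: standardised, machine-readable disclosure would let researchers, journalists and regulators query the same data that platforms currently publish in varying, voluntary formats.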

But the Office would not just play catch-up with current practice. It would work with regulators to anticipate future developments in the landscape, leading horizon scanning and training regulators to develop their own capacity to plan for the future. Enabling regulators to get ahead of emerging challenges, such as fabricated political ‘deep-fake’ videos and the next generation of behavioural profiling practices, is vital to ensure they are not perpetually left in the wake of fast-moving digital technologies.

The knowledge the Office holds would not be shared with policymakers alone; it would also be used to empower the public to understand how digital campaigning affects them individually and as a society. In addition to communicating risks and harms, this engagement would raise awareness of how people can take more control over their interactions with online political campaigns – where to report suspicious campaigning activity, how to check the veracity of evidence used, and how to change the data they share with political campaigners.

Supporting people to find redress

Engaging the public with contemporary issues around digital political campaigning is vital. But raising individuals’ awareness of their rights is of little use if they are unable to hold bad actors to account when those rights have been breached.

Another role of an Office for Responsible Technology is to raise standards for complaint resolution, mediation and redress. As a first step, the Office could audit measures developed by digital services and platforms to maintain the integrity of digital political campaigning. This auditing is needed to shine a light on lax approaches and uncover loopholes in these measures.

Criticisms of Facebook’s political ad transparency tools (described by US lawmakers as “failing to carry out the basic disclosure and disclaimer provisions of the [Honest Ads Act]”) show why this spot-checking function is much needed. Where these measures are not up to scratch, the Office would explore ways to enable backstop mediation and redress, making recommendations to government and regulators on how to make them a reality. This ‘Ombudsman-style function’, delivered by a new body or through the reconfiguration of existing ones such as the Parliamentary and Health Service Ombudsman, could be used to mediate between platforms and the people and organisations who believe their campaigning content has been unfairly taken down.

The gap between regulation and the realities of ubiquitous digital campaigning practices cannot be ignored. The current failure to hold political campaigners to account risks undermining public faith in democracy. To bring political campaigning regulation into the digital age we need to fundamentally rethink the role of the regulator. An Office for Responsible Technology can lead this transformation and steward the system to uphold the public good in a digital age.

The technical future of political campaigns

Josh Smith, Senior Researcher at the Centre for the Analysis of Social Media, Demos

Targeted online advertising has produced a fundamental shift in the ways in which political campaigns are conducted. The ability to divide the electorate into precisely shaped groups, lovingly adjusted by inferred characteristics and patterns of behaviour, has helped parties and campaigns reach ever smaller groups with the right message, and measure the response in real time.

The importance of digital messaging to parties can be seen in their budget sheets. In 2015, the Electoral Commission found that 23 per cent of the total spend for the election went toward digital campaigns, and in the run-up to the EU referendum, Dominic Cummings estimated that Vote Leave ran around one billion targeted adverts, mostly via Facebook.

In this, campaigns are following the market. Across the political spectrum, electoral politics has become, in the words of US academics Chester and Montgomery, “fully integrated into a growing, global commercial digital media and marketing ecosystem that has already transformed how corporations market their products and influence consumers”.

The capabilities offered to parties by technology are set to change over the very short term. In 2018, Demos produced a report for the Information Commissioner’s Office sketching out how advances in data analytics, AI and micro-targeting might affect political campaigns. We found a sector buzzing with possibility.

One company promises to help find sympathetic voters through their phone’s presence at political events, and follow them home with advertising. Another offers ‘voter file enhancements’, providing electoral rolls fleshed out with data on income, occupation, education, likely marital status, ethnic and religious identification, magazine category subscriptions, pet ownership and so on.

As the character and volume of data increases, along with our capacity to make inferences from it about voter mood, preference and behaviour, strategies for targeting are likely to become ever more sophisticated and complex.

Incoming technology has implications not only for the accuracy with which campaigns can direct messaging, but also for the basis on which that targeting is performed. Take, for example, the rise of ‘deep learning’. Crudely put, one powerful form of algorithm, a ‘neural net’, can be used to work out how to turn inputs (e.g. detailed voter profiles) into outputs (e.g. the wording to use when contacting them). Deep learning takes this process one step further, not only automating the process of working out how to get from voters to vocabulary, but also deciding, itself, which information on a voter should be used to inform this content.
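
As a rough illustration of the mapping described above – and only an illustration, using synthetic data and invented message labels rather than any real campaign’s method – a small neural network can be trained to suggest a message variant from a handful of voter-profile features:

    # Minimal sketch, not a real campaign tool: a small neural network learns to
    # map synthetic voter-profile features to a choice of campaign message.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Each row is a synthetic voter profile: [age (scaled), turnout history, urban flag]
    X = rng.random((500, 3))
    # Which of three message variants the voter (synthetically) responded to before
    y = rng.integers(0, 3, size=500)

    # The network itself learns which combinations of input features predict the
    # 'best' message for a given profile, rather than a human specifying rules.
    model = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=1000, random_state=0)
    model.fit(X, y)

    messages = ["economy leaflet", "local services ad", "environment video"]
    new_voter = np.array([[0.7, 0.2, 1.0]])
    print("Suggested message:", messages[int(model.predict(new_voter)[0])])

Scaled up to thousands of features and much deeper networks, the same pattern is what makes it hard for anyone – including the campaign itself – to say exactly why a given voter saw a given message.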

As voter profiles become increasingly detailed, use of this technology by campaigns could lead to their targeting people based on characteristics, such as health, race or sexual preference, which campaign organisers may not be aware they are using. This could be true even where protected characteristics are not stored, as machine learning makes it possible to infer detailed demographic information based on seemingly innocuous data. For example, MIT has used deep learning to accurately infer a user’s age and gender simply from data on the timing and duration of their phone calls.
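
The general technique the MIT example points to can be sketched very simply – here with synthetic data and a basic classifier standing in for the original deep-learning model, purely to show how ‘innocuous’ metadata can support demographic inference:

    # Illustrative sketch with synthetic data: inferring a demographic label from
    # call metadata alone. A simple logistic regression stands in for the deep
    # model used in the MIT research.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 2000

    # Synthetic call metadata: [mean call length (mins), calls per day, share of evening calls]
    X = np.column_stack([
        rng.normal(4, 2, n),
        rng.poisson(6, n).astype(float),
        rng.random(n),
    ])
    # A synthetic demographic label, loosely correlated with the metadata for demonstration
    y = (0.3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1, n) > 1.8).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
    clf = LogisticRegression().fit(X_train, y_train)
    print("Held-out accuracy:", round(clf.score(X_test, y_test), 2))

Because the sensitive attribute is inferred rather than stored, rules that hinge on whether protected characteristics are ‘held’ in a dataset can be sidestepped entirely.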

Deep learning is just one example of the technologies which are changing campaigns. Political groups may soon be able to incorporate data gleaned from ‘wearables’ (such as smartwatches) and the ‘internet of things’ into their messaging, or to algorithmically generate entirely new slogans and designs. The direction of travel is towards campaigns being able to address each voter individually, based on their specific profile; advertising to an audience of one. This individual targeting risks breaking down public political discourse – having a conversation about what a party stands for in the run up to a campaign is difficult if your neighbours and family have seen completely different messages.

This could, of course, be a shot in the arm for politics. A detailed and accurate view of the electorate could enable parties not only to better allocate restricted messaging budgets, but also to develop policies more in tune with the lives and concerns of people they will affect. For this to happen, however, rapidly developing profiling techniques will need to move beyond their status as a marketing tool, and be understood, and thoughtfully implemented, by policymakers.

Making individually crafted, algorithmically-honed offers to people is a sound marketing technique, but politicians need a way to ensure those offers are followed up. At present, it is difficult for campaigners themselves (let alone regulators) to know on what basis they are targeting voters, and what they are telling them.

It is imperative that these technologies are used to improve our political process, and understood by their users as well as their targets. Otherwise, we run the risk of becoming an electorate resigned to a politics which says the right things but never fulfils its promises, singled out for persuasion in ways we do not expect or understand, and to which we cannot meaningfully consent.
