About Event

Public discourse receives the blame for many of society's problems, but also holds many of our hopes for human progress. Newspeak House presents an evening of lectures on how to think about what people are talking about. Talks will be followed by food and networking. Please take a test before coming, and consider your personal thresholds for acceptable risk.

Areeq Chowdhury: The Effect of Digital Technologies on Scientific Misinformation

How are digital technologies changing the way people interact with information? Which technologies can be used to fabricate misinformation, and which can detect it? And what role does technology have to play in creating a better information environment?

Areeq will talk about The Royal Society's recent report, The Online Information Environment, which addresses these questions. The report provides an overview of how the internet has changed, and continues to change, the way society engages with scientific information, and how it may be affecting people's decision-making behaviour, from taking up vaccines to responding to evidence on climate change. It also highlights key challenges for creating a healthy online information environment and makes recommendations for policymakers, academics, and online platforms.

Areeq Chowdhury (@AreeqChowdhury) is a Senior Policy Adviser at The Royal Society (the UK's national academy for science), where he focuses on data and digital technologies. He founded the influential technology policy think tank WebRoots Democracy.

Luke Thorburn: Metaphors for Public Discourse

Consider the totality of all human communication, opinion, knowledge, and the mechanisms by which these things change over time. Some people call it the "information environment" or the "marketplace of ideas", but to use such terms is to assume that environmental or economic metaphors are appropriate. Over the past few years, Luke Thorburn has been accumulating an inventory of such metaphors.

In this talk, he will provide an overview of the different metaphors commonly (and less commonly) used, where they fall short, and the different interventions and regulatory responses they imply—aiming to promote critical thinking about such metaphors when they are encountered 'in the wild'.

Luke Thorburn (@LukeThorburn) is a PhD candidate at King's College London within the Centre for Doctoral Training in Safe and Trusted AI. His primary research interest is ensuring that the algorithms used to mediate our shared epistemic systems are aligned with human values, and better understanding the extent to which this matters. Previously, he was a member of the Hunt Lab for Intelligence Research at the University of Melbourne.