Navigating the Ethics of Emerging Technologies

‘The Good Place’ is a half-hour sitcom about four people who have died and think they’ve made it to ‘the good place’, only to discover that (*spoiler alert*) they have not. In one scene, one of the four - Chidi, a professor of moral philosophy and ethics - is explaining the trolley problem. This century-plus-old thought experiment requires individuals to choose between two terrible outcomes - kill one, or kill five.

Beneath the light-touch sitcom antics is a very relevant, modern yet arguably timeless question - when choosing between the lesser of two evils, how do we choose? And what should the consequences of those choices be?

Answering such a question requires a framework of principles, morals and ethics. In our last event for 2018, we looked at what frameworks are being applied to emerging technologies, and the implications of failing to integrate critical and ethical thinking.

We explored:

  • What ethics are

  • The ethical implications of emerging technologies

  • The influence of data

  • The role of government

  • The role of industry

With panelists:

Let’s start with a quick Ethics 101…

Ethics is defined as moral philosophy, or a set of moral principles or rules of conduct.

The study of ethics is a ‘branch of philosophy that involves systematising, defending, and recommending concepts of right and wrong conduct’.

Think of ethics as the ‘should’ questions, rather than the practical reasoning of ‘can’. They allow you to critically assess the options, or actions, before you in order to determine the way forward according to your (individual or collective) morals and values.

Why are ethics becoming such an important issue in technology?

In May of this year, the world watched as Google’s Duplex booked a hair appointment.

In October it was reported that Amazon was pitching its facial recognition technology to officials from the US Immigration and Customs Enforcement (ICE).  

Back in April we watched Mark Zuckerberg sit before a US Senate hearing to explain how Facebook allowed 87 million profiles to be data-mined by Cambridge Analytica - just another chapter in the long and still-developing story of how the social media platform influenced the outcome of the 2016 US Presidential election.

These are just three examples of the high-profile tech stories that are now becoming commonplace in our news cycle.  At the heart of all of them are ethical questions - should they have been allowed to do that?  Should they have been allowed to access that?  Should this technology be allowed to make those decisions, or be used in this way?

Emerging technologies are moving out of low-stakes daily scenarios, like predicting what we should watch or buy, and into high-stakes situations like welfare and criminal justice, where they shape society and communities.  

How can technology do this, and why should we be concerned?

The core concerns with emerging technology - predominantly machine learning or AI - being used in areas like policing and welfare are:

  • Recidivism

  • Bias

Recidivism refers to an individual repeating an undesirable behaviour even after they have been punished or efforts have been made to rehabilitate them. There is concern that many of the models being employed by government are recidivist in nature - essentially self-fulfilling prophecies that help produce the very behaviour they predict. Driving these models is data that can be inherently biased.

In October, the ABC revealed that children as young as 10 were being included on a NSW Police blacklist known as the Suspect Target Management Plan (STMP). Individuals on the list are monitored and (it’s claimed) harassed by police. They are not notified that they are on the list, many are from lower socioeconomic areas like Redfern, and sample data showed that young people, and Aboriginal and Torres Strait Islander people, are over-represented. The criteria for inclusion on the list are confidential, and many on the list have no criminal history.

The NSW Police describe the program as a ‘crime prevention strategy’. But without transparency around the program, or the data used to develop it, critics argue that it unfairly targets the disadvantaged and influences the very outcome it was developed to prevent.

Applications of machine learning like the STMP have the potential to create a feedback loop that stifles social mobility, making it harder for individuals to climb out of poverty.
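To make the feedback-loop concern concrete, here is a minimal sketch - in Python, with entirely hypothetical numbers and area names - of how a model that allocates patrols in proportion to past records can entrench an initial imbalance, even when the underlying rates are identical:

```python
import random

# Hypothetical illustration: two areas with the SAME underlying incident rate.
# Patrols are allocated in proportion to *recorded* incidents, and recorded
# incidents depend on how many patrols were present - a feedback loop.

TRUE_RATE = 0.10                          # identical in both areas
recorded = {"area_a": 12, "area_b": 8}    # area_a starts with a few more records

random.seed(1)
for year in range(1, 11):
    total = sum(recorded.values())
    snapshot = dict(recorded)
    for area, count in snapshot.items():
        patrols = round(100 * count / total)   # patrols follow past records
        # Each patrol has the same fixed chance of recording an incident.
        recorded[area] += sum(random.random() < TRUE_RATE for _ in range(patrols))
    print(f"year {year}: {recorded}")

# The initial imbalance never washes out: area_a keeps attracting more
# patrols and generating more records, despite identical true rates.
```

Run this for a simulated decade and ‘area_a’ retains roughly 50% more records than ‘area_b’ purely because it started with more - the data keeps confirming the model, not the world.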

How can data be biased, and doesn’t it need to be?

The machine learning models that power Netflix’s predictions and drive autonomous vehicles need to have some bias, otherwise they can’t select or decide. To assess whether that bias is ethical, we ask if it is FAT:

  • Fair

  • Accountable

  • Transparent

Transparency is the key to assessing fairness and accountability, but while considered best practice, it is not commonplace.  

Consider your weekly shop: whatever jar, tub or package you pick up has a list of ingredients, where and by whom it was made, and any requisite warnings. Many machine learning programs come with no such label, and are being developed using proxy data in lieu of actual data - instead of orange juice made from 100% oranges, you’re getting 50% oranges and 50% orange flavouring. And while we’re told that machine learning programs continue to learn, many don’t: they require training, retraining, data updates and maintenance that often doesn’t happen post-implementation.
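Returning to the ‘Fair’ leg of FAT, one simple and widely used spot-check is to compare a model’s positive-decision rate across groups (demographic parity). The sketch below uses invented decisions and the common ‘four-fifths’ rule of thumb; a real fairness audit would look at several metrics:

```python
# Hypothetical fairness spot-check: does the model approve the two groups at
# roughly the same rate? (One simple metric - demographic parity.)

decisions = [
    # (group, model_said_yes)
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = {}
for group in {g for g, _ in decisions}:
    outcomes = [yes for g, yes in decisions if g == group]
    rates[group] = sum(outcomes) / len(outcomes)

print(rates)  # e.g. {'group_a': 0.75, 'group_b': 0.25}

# The 'four-fifths' rule of thumb flags a problem when one group's rate falls
# below 80% of another's - here 0.25 / 0.75 = 0.33, a clear red flag.
disparity = min(rates.values()) / max(rates.values())
print(f"disparity ratio: {disparity:.2f}")
```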

If you apply for a job and are rejected, you can enquire as to why, and HR will usually be able to tell you - you didn’t have the right experience, or perhaps had too much. Some companies are now using RPA (robotic process automation) to screen applications before they reach the desk of HR. If HR don’t understand how the RPA was developed, or what data was used to build it, they can’t explain why you were rejected. Nor can the company assess whether the program is creating efficiencies in the recruitment process, or causing them to miss out on valuable talent. To enable accountability, there needs to be explainability - end users need to understand how algorithms have been built.
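As a sketch of what that explainability might look like in practice - the criteria here are invented for illustration, not drawn from any real screening product - the decision function can return its reasons alongside the verdict, so HR can answer the candidate’s question:

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    years_experience: float
    has_required_certification: bool

# Hypothetical screening rules - in a real system these would be the
# documented criteria HR signed off on, not opaque model internals.
def screen(applicant: Applicant) -> tuple[bool, list[str]]:
    """Return (accepted, reasons) so every decision is explainable."""
    reasons = []
    if applicant.years_experience < 3:
        reasons.append(f"only {applicant.years_experience} years' experience (minimum 3)")
    if not applicant.has_required_certification:
        reasons.append("missing required certification")
    return (not reasons, reasons or ["met all screening criteria"])

accepted, reasons = screen(Applicant(years_experience=1.5, has_required_certification=True))
print(accepted)   # False
print(reasons)    # ["only 1.5 years' experience (minimum 3)"]
```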

Is there a role for government?

Government can play a role in allowing ethics to guide emerging technology.

Legislation and regulation

The Australian Government has already put in place data security legislation not dissimilar to the EU General Data Protection Regulation, but there is room for more.

Australia’s Chief Scientist, Alan Finkel, is one of a growing chorus calling for governments around the world to develop a regulatory framework for AI. While regulation is difficult, particularly for emerging technology, Finkel is especially focused on ensuring that ethics is scalable - from an ethical ‘Fair Trade’ style stamp for AI (such as for social media), to a global accord for weaponised drones.

Education

Creating more inroads between the two dominant areas of study in the western education system - Humanities and Social Sciences (HASS), and Science, Technology, Engineering and Mathematics (STEM) - may be the key to equipping future generations with the critical skills required to navigate, create and invent ethical technology.   

The wunderkinds of the fourth industrial revolution have been educated in a system that silos areas of study. As individuals progress in their academic careers they are less likely to be exposed to other areas - for example, a software engineer is unlikely to have ethics consistently integrated as a skill in their courses, and similarly a philosophy or public policy student is unlikely to apply their critical thinking to the issues of emerging technology outside of (perhaps) an assignment scenario.

Access

Data is the engine that runs every emerging technology. Clean, complete, FAT data is crucial in ensuring that technology is not only effective, but ethically sound.

Government can lead in the collection and secure storage of data, and in the provision of access to open data. The City of Melbourne already does this, and the UK Government recently completed a data review, with plans to establish an agency responsible for the collation and maintenance of data.

How can industry lead ethically?

While Government needs to be involved, its ability to meet the demands and pace of emerging technology has been called into question. The Privacy Chief for My Health Record quit earlier this year over claims the Health Minister refused to listen to her recommendations, and the deadline for ‘opting out’ (another issue in and of itself) has been repeatedly delayed due to public pressure.

Industry, however, can apply ethics to future-proofing activities now, while Government plays catch-up.

Think about the problem you’re solving

A consistent theme in Churchill Club’s 2018 events has been bringing everything back to the basic question - what problem are you trying to solve?  Starting with the problem, rather than the technical application, can ensure the solution is accountable.

Educate

Emerging technology consultants and developers have a responsibility to inform and guide their clients. In the example of developing an RPA to screen job applications, a consultant should ensure the client understands how the program is developed - the data, the criteria and so on - empowering them in the process.

Start thinking of technology as core business

In order to manage risk and ensure the right questions are asked, leadership needs to be part of the process. However, if technology isn’t considered ‘core business’, it can be difficult to gain their attention and buy-in. Reframing tech according to the problem you’re trying to solve can help position it as core business - e.g. it’s not a ‘tech problem’, it’s a ‘service issue’.

Maintenance

Like any car you’ve ever owned, tech needs servicing: models need to be retrained, and data updated. Making regular maintenance part of the implementation of any technological solution ensures you’re not only safeguarding your investment, but meeting best practice and improving your systems.
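As a minimal sketch of what that servicing might involve - with hypothetical data and an arbitrary threshold - a scheduled check could compare live inputs against the training data and flag when they have drifted far enough to warrant retraining:

```python
import statistics

# Hypothetical drift check: if live inputs look very different from the data
# the model was trained on, the model is due for retraining.

training_ages = [34, 41, 29, 38, 45, 31, 36, 40]   # what the model learned from
live_ages     = [52, 58, 61, 49, 55, 63, 57, 60]   # what it now sees in production

def drifted(train: list[float], live: list[float], threshold: float = 1.0) -> bool:
    """Flag drift when the live mean moves more than `threshold`
    training standard deviations away from the training mean."""
    shift = abs(statistics.mean(live) - statistics.mean(train))
    return shift > threshold * statistics.stdev(train)

if drifted(training_ages, live_ages):
    print("Input distribution has drifted - schedule retraining and a data refresh.")
```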

What should we be reading?

If you’re after some beach reading to carry you through to 2019, we have some page turners for you: