13 March 2023

The importance of ethics and morals in IT - Can you help?

Written by

Computing at School

Are you someone who is passionate about the importance of ethics and morals in IT? Would you like to help young people understand how ethical principles, guided by a socially responsible culture, shape the way in which organisations and IT professionals should behave? If so, we would love to hear from you.

We are looking for enthusiastic people to share their experience and expertise around a wide range of issues concerning ethics, morals and organisational culture, with students in our Digital T Level support programme.

The BCS Digital T Level support programme helps add real-world context to classroom learning by providing teachers and students with industry insight, showing how curriculum topics actually apply within the world of work, and demonstrating desirable workplace behaviours. Through interviews, webinars and masterclasses with great people in industry, young people about to embark upon work placements and early careers can gain valuable insight in practical areas that can help them to understand what to expect and how to conduct themselves in the workplace.

T Levels are a relatively new post-GCSE qualification, having been introduced in 2020. Roughly equivalent to three A Levels, the T Level combines classroom and work-based learning, with students spending 80% of their two-year course learning in the classroom and 20% in relevant industry placements. Through this process, students learn valuable industry-specific and practical knowledge that can give them a head start when entering the work environment.

We are currently looking to create video-based content that can be used to help T Level students understand the work-based implications of ethical and moral responsibilities. In particular, we would love to speak to anyone who could provide expertise in discussing:

  • Ethical qualities – what does being ethical really mean?
  • Ethical responsibilities - including a wide range of areas, such as protecting data and intellectual property, cyber security, fair use of technology, reducing bias in AI
  • The ethical impact of an increasing reliance on digital technology, in terms of things such as acceptable use, environmental issues, inclusion and diversity, changes in societal norms, the collection and use of data, autonomous operation
  • Company culture – how this is established, communicated and sustained, and how it impacts upon the individual
  • Codes of conduct
  • Human-centred design and inclusivity
  • Corporate Social Responsibility and its benefits and challenges
  • Threats to acting ethically and safeguards to prevent these threats
  • Examples or case studies of ethical dilemmas
  • The impact of unsafe or inappropriate use of digital technology and mitigation techniques to reduce impact

If you would like to help in any of these areas by being part of a video interview or delivering a presentation that could be used as part of a classroom teaching resource for 16 to 18 year-olds, we would love to hear from you, and your time would be greatly appreciated.

To register your interest, ask any questions or find out more information, please email us at tlevels@bcs.uk


Many thanks in advance.

Discussion


Adrian Mee
15/03/2023 11:56

Indeed Simon! You’ve outlined some of the “bigger” and “wider” issues which tend to lurk in the background. “Ethics” as it appears in most workplaces actually means little more than “compliance”.
I’d offer that pupils are far more interested in the big-picture issues, and that they are a good reason to learn some computer science.
The problem is that computing teaching tends to favour “knowledge transfer” of “declarative knowledge” which can be tested with simple answers and easily marked.

  1. Name 3 requirements of the DPA (3 marks) isn’t really a question about digital ethics - it’s a simple factual knowledge recall issue.
  2. Should all individuals have broadband access as a human right? (10 marks) …is a digital ethics problem. Demanding to answer…difficult to teach…but incredibly satisfying.

You rightly reference the political context in which technology emerges.
This has been on the cards since the 1930s.
Lewis Mumford’s book Technics and Civilization is a powerful read, made more, not less, relevant by recent developments. His notion of authoritarian and democratic technics offers a great lens through which to view technologies.

I have raised this with some teachers who felt the issues to be “far too complex for pupils”…when they are dealing with such issues all the time in English, history and RE/ethics.

RGds
A

Adrian Mee
14/03/2023 18:18

Just something from somewhere else…

How do we empower ‘digital citizens’? The only way is ethics!

The notions of “ethical”, “moral” and “legal” are hugely interesting topics and should feature more strongly in the education of those who “do computer science”, work in the IT field and in the schooling of Jo Citizen. I’d offer a view that the need for every citizen in a democracy to be able to understand ethical dilemmas in the fields of science and technology is the dominant justification for all children to have a grasp of computer science. Most science teachers at school level seek to develop a scientifically informed public rather than the “little scientist”. Moving to a specialism can come later! How can we make judgments as citizens about global warming, nuclear energy and genetic modification if we don’t have a grasp of “the science”? A dose of science among the public might constitute an inoculation against voices who claim to “follow the science”… but only that science that is convenient!
As Beveridge offered “Ignorance is an evil weed, which dictators may cultivate among their dupes, but which no democracy can afford among its citizens”.
There are several problems in the way the “social and ethical dimensions” are dealt with in qualifications.
Firstly there is often a lack of clarity over what is meant by:
Ethical – a general socio-cultural set of norms.
Moral – an overlapping but more personalised version of the above.
Legal – the current rules which will or will not get your collar felt!

Often I’ve encountered pupils who feel that the difference between the three is really just “teacher nit picking”. I’d suggest the opposite as exploring the differences is where the fun (and arguments!!) are to be had.
Putting aside the “moral” for a moment, let’s play a game! Here are four activities we might engage in.
a) Designing a game intended to be addictive to players.
b) Developing a game aimed at helping children with spelling.
c) Developing an eBay for stolen goods!!
d) Hacking a pharmaceutical company website and sharing the formula for a new wonder-drug they were planning to sell for ££££.

Now try to place these in the little contingency table below.

[Figure: “Dilemmas” contingency table]

Playing with the chart throws up some interesting issues in relation to technology and society but to get the full benefit of the game players need to have the ‘ethical schema’ which casts light on the process of reaching “an answer”……not “the answer”.
Some useful notional tools will probably already be familiar to pupils taking A level philosophy and religious studies.
The toolkit includes:

  • The contrasting lenses of deontology and consequentialism. The ideas of Bentham and his hedonic calculus will appeal to pupils, who may offer “Oi… this guy invented ‘computational thinking’ before computers!!!” Others might like to explore the state of Kant’s universal rules… in a digital world!
  • Ogburn’s idea of “cultural lag” helps them cast light on Zone C in our chart. It might lead them to think of the AI challenges that our legal system hasn’t even thought of… let alone legislated for!
  • Langdon Winner’s “technological somnambulism” might shed some light on why, in 1948, Orwell wrote of a coming dystopia where the population was watched and monitored, and on the Brave New World of 2023… where we spend a fortune buying the devices which do the same!!
  • A critical perspective on the term “logical”, which has become the abracadabra of computing. “Logical thinking” becomes the fount of all wisdom. Until we ponder how Jonathan Swift’s Modest Proposal might have been grabbed at by app developers “to make things easier”!!

A final look at the focus:

To……”understand the work-based implications of ethical and moral responsibilities”.

We could ponder the relevance of “work based”. It invites us to explore the boundaries of our morals and ethics in our various lives. The capacity to suspend morality when donning the uniform, or even the corporate lanyard and ID card, with a cheery “I’m just doing what I’m told” or “I just invent it… it’s up to you how you use it”!! leads us to the dark places Hannah Arendt explored.

Science and technology is fascinating and, as Eric Schmidt offered, the emerging digital environment may be a source of “tremendous good and potentially dreadful evil”. As little computer scientists grapple with their Python syntax, watched over by the classroom posters of Alan and Ada, perhaps Albert should be there too with his motto “If only I had known, I should have become a watchmaker”.
If one of the rules in the computing classroom is “Don’t be evil”… first they’ve got to know evil and where it can hide in the bushes on the road paved with good intentions!

:blush:

a

Simon Morris
14/03/2023 11:53

I’m sure there are others here who are more qualified to talk about ethics in computing than I am. My own – strictly amateur – interest relates to how modern digital technologies are used to undermine democracy, invade privacy, and constrain freedom of speech. These concepts are much higher-level than the day-to-day digital ethics you might find in the average workplace, although as technology advances I wonder to what extent they may start to become considerations in areas such as everyday software development.

Documentaries like The Social Dilemma and books like The Chaos Machine by Max Fisher show the problems around managing the firehose of data generated by modern internet tools. When the content was moderated by humans there were complaints of political bias (as the humans employed as moderators were generally young, educated, and liberal.) To counter the accusations of human bias the technology companies began to rely more and more on algorithms, but to fulfil their goal of increasing engagement these algorithms picked up on and amplified the worst aspects of human nature. If the most extreme content gets the biggest reactions – who cares if those reactions are boos or cheers? – then the sensible thing for an algorithm to do is to recommend ever more extreme content.
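The engagement-ranking logic described above can be sketched in a few lines of toy Python. Everything here – the post titles, the `extremity` scores, and the reaction model – is invented purely for illustration; real recommender systems are vastly more complex, but the core incentive is the same:

```python
# Toy model: an engagement-maximising ranker.
# A signed "extremity" score stands in for content: positive values are
# cheered, negative values are booed. Engagement counts both the same.

def engagement(extremity: float) -> float:
    """Predicted reactions grow with how extreme an item is.
    The absolute value is the point: the ranker can't tell boos from cheers."""
    return abs(extremity)

posts = [
    {"title": "Cat photos", "extremity": 0.1},
    {"title": "Local news", "extremity": 0.3},
    {"title": "Heated political rant", "extremity": 0.9},
    {"title": "Outrage-bait conspiracy", "extremity": -0.95},  # widely booed
]

# Rank purely by predicted engagement: the most extreme content floats to
# the top regardless of whether the reaction is approval or fury.
ranked = sorted(posts, key=lambda p: engagement(p["extremity"]), reverse=True)
for post in ranked:
    print(post["title"])
```

Even in this four-line ranking rule, the booed conspiracy post outranks everything else – which is exactly Simon’s point that optimising for reactions alone rewards extremity.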

I’m currently reading Pegasus by Laurent Richard and Sandrine Rigaud (there’s also the Storyville documentary The Spy in Your Mobile on iPlayer if you are unfamiliar with Pegasus.) Pegasus is able to take over a target’s smartphone completely and covertly, and like a lot of other cyber surveillance tools was touted as a way to combat drug gangs, people trafficking, terrorists, etc. But it became increasingly clear that the companies behind these tools were happy to sell them to oppressive regimes to spy on their political opponents, human rights advocates, journalists, etc.

Over the last couple of weekends I’ve been playing with ElevenLabs AI voice generation. Its ability to make a fairly accurate impersonation of a living person’s voice using just a few minutes of sample audio is really quite impressive. Even more impressive, though, is the way that the AI is able to infer how to modulate and emote the text you give it to read, rather than using a flat public-address-system voice. Naturally there’s been a spate of deep-fake meme videos of celebrities (not!) saying all manner of crazy things, but you can already see how this technology is likely to dominate politics and democracy in the coming months and years.

These are three technology ethical dilemmas that are starting to seriously impact on the everyday lives of everyday people. If you’re a software developer creating an app, you now have to worry about how spyware could abuse your app and the data it collects. You may have to worry about how your app could influence the behaviour of its users in negative ways, rather than just positive ways. And you have to consider how your app might be used as a tool to spread dangerous (but very convincing looking) misinformation.

Adrienne Tough
13/03/2023 19:52

For ethical dilemmas there’s a great TED talk on driverless cars on YouTube, using the principles of Philippa Foot’s trolley problem.