GCSE Computer Science
Exploring the wider impacts of technology on society and the environment.
Learn about ethical issues in computing including privacy, surveillance, artificial intelligence, and digital rights. Understand key legislation such as the Data Protection Act, the Computer Misuse Act, and the Copyright, Designs and Patents Act.
This unit covers the environmental impact of technology, e-waste, energy consumption, and how technology affects culture and society.
⚠️
AI-Generated Content - Not Yet Reviewed
This lesson content has been partially generated using AI and has not yet been reviewed by a human educator. It likely contains numerous issues, inaccuracies, and pedagogical problems. Do not use this content for actual lesson planning or teaching at this time.
You mention wanting new trainers to a friend, and an hour later you see an ad for them. Coincidence... or is your phone spying on you?
Open with the hook question. Take a quick poll: how many students have experienced this 'creepy' ad coincidence? Share a few examples. Then reveal: we're going to investigate whether phones really listen, and discover something possibly even more surprising about how our data is used.
Resources:
Simple yes/no poll about experiencing targeted ads that felt 'too accurate'
Teacher Notes:
Don't immediately debunk or confirm the 'listening' theory - maintain mystery to keep engagement. Students often have strong opinions here.
Interactive discussion: What does privacy mean? Students write their own definition first, then share in pairs. Build a class definition together. Introduce the concept of digital footprints - every search, click, like, and purchase creates data. Show examples of what data is collected: location, contacts, browsing history, purchase patterns, social connections.
Resources:
Visual showing all the data points collected in a typical day of smartphone use
Teacher Notes:
Key insight: privacy isn't just about 'secrets' - it's about control over your own information. Avoid making students feel paranoid; focus on awareness and informed choice.
Scenario-based group activity. Give each group a different real-world case study: (1) A fitness app selling location data to advertisers, (2) A social media platform using facial recognition without consent, (3) A school monitoring students' online activity, (4) A game app collecting data from children, (5) A company using AI to predict employees who might quit. Groups must identify: What data is being collected? Who benefits? Who might be harmed? Is it legal? Is it ethical? Groups share conclusions; highlight that legal ≠ ethical.
Resources:
Printed scenarios with guiding questions for each group
2x2 grid (legal/illegal on one axis, ethical/unethical on the other) for students to plot scenarios
Teacher Notes:
This is the key learning moment. Push students to articulate WHY something feels wrong, not just that it does. Introduce the distinction: something can be legal but still ethically questionable, or ethical but not (yet) legal.
Return to the opening question. Reveal the likely reality: phones probably aren't literally recording conversations (it would require too much processing and battery), but companies know so much about you from other data that they can predict what you want. This is arguably more concerning. Show brief video clip or article about how prediction algorithms work.
Resources:
2-3 minute clip explaining how ad targeting works without audio recording
Teacher Notes:
The goal is nuance - neither dismissing concerns as paranoia nor promoting conspiracy theories. The real privacy issues are often more mundane but equally significant.
Individual reflection: 'What's one thing you might do differently online after today's lesson?' Students write response. Quick verbal check: What's the difference between something being legal and something being ethical? Tease next lesson: We've talked about ethics - next time we'll look at what happens when people cross legal lines with technology.
Teacher Notes:
The reflection question should be personal and actionable. Accept all responses without judgement - the goal is awareness, not behaviour change.
Explore the real technology behind personalised ads - tracking cookies, device fingerprinting, and cross-platform data sharing. Spoiler: your phone probably isn't literally listening, but the truth is almost creepier.
Connection: Understanding the technical reality helps students make informed judgements about privacy trade-offs and ethical boundaries.
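To make device fingerprinting concrete for stronger students, a minimal sketch of the idea: combine device attributes into a hash that identifies a browser without any cookie. All attribute names and values here are hypothetical, and real fingerprinting systems use far more signals.

```python
# Sketch of device fingerprinting: hash a set of device attributes
# (all values below are hypothetical) into a stable identifier.
# No cookie is stored, yet the same device produces the same id.
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash a sorted set of device attributes into a short identifier."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

device = {
    "user_agent": "Mozilla/5.0 (ExampleBrowser)",  # hypothetical values
    "screen": "1920x1080",
    "timezone": "Europe/London",
    "language": "en-GB",
    "fonts_installed": 214,
}

print(fingerprint(device))  # same attributes -> same id, no cookie needed
```

Changing a single attribute (say, the timezone) changes the hash, which is why fingerprints combine many weakly identifying signals: together they become nearly unique.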
Further Reading:
Introduction to roles like Privacy Engineers and Data Protection Officers - people whose entire job is thinking about user privacy.
Connection: Shows that ethical technology isn't just about rules - there are careers dedicated to building more ethical systems.
Further Reading:
Support:
Stretch:
In 2017, a 22-year-old stopped a global cyberattack that had crippled the NHS - from his bedroom. Was he a hero... and what exactly is illegal when it comes to computers?
Tell the story of the WannaCry attack (2017): ransomware spreading across the world, NHS computers frozen, operations cancelled. Then introduce Marcus Hutchins - a 22-year-old who found the 'kill switch' while analysing the malware from his bedroom. Ask: Is what he did legal? What skills did he need? Should there be laws about this kind of thing?
Resources:
Visual showing the spread of the attack and its impact on the NHS
Teacher Notes:
The WannaCry story is dramatic and recent enough that some students may remember it. Use it to show that cybersecurity isn't abstract - it affects hospitals, lives, real people.
Interactive teaching of the Computer Misuse Act 1990's three main offences: (1) Unauthorised access to computer material - 'digital trespassing', (2) Unauthorised access with intent to commit further offences - breaking in to do something worse, (3) Unauthorised acts with intent to impair computer operation - deliberately damaging systems. Use relatable analogies: the difference between finding an unlocked door (seeing it's there), opening it (offence 1), opening it to steal something (offence 2), and opening it to smash up the house (offence 3). Show maximum penalties for each.
Resources:
One-page summary with the three offences and penalties
Teacher Notes:
The three-tier structure is logical and memorable. Emphasise that intent matters - accidentally stumbling into a system isn't the same as deliberately exploiting it.
Quick-fire scenarios for students to judge using the CMA. Examples: Using your friend's Netflix password (they said you could). Guessing a weak password to access your ex's social media. Finding a security flaw in a website and reporting it to the company. Creating a virus 'just to see if you can' but never releasing it. Writing code that could be used for hacking but never using it. Students hold up cards (Legal/Illegal/It Depends) and discuss the edge cases.
Resources:
Printed scenarios for the quiz
Legal/Illegal/It Depends cards for each student
Teacher Notes:
Some scenarios are deliberately ambiguous to promote discussion. The 'using someone else's password' example often surprises students - even with permission, it may technically breach terms of service if not the CMA.
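If the class wants a digital version of the card activity (a natural tie-in with the programming side of the course), a minimal sketch is below. The scenario wordings come from the activity above; the 'suggested' answers are illustrative discussion prompts for the debrief, not legal rulings.

```python
# Sketch of the Legal/Illegal/It Depends quiz as a short program.
# Suggested answers are discussion prompts, not legal advice.
scenarios = [
    ("Guessing a weak password to access your ex's social media", "Illegal"),
    ("Finding a security flaw and reporting it to the company", "It Depends"),
    ("Using a friend's Netflix password with their permission", "It Depends"),
    ("Writing code usable for hacking but never using it", "It Depends"),
]

def run_quiz(answers):
    """Score a list of student answers against the suggested answers."""
    score = 0
    for (scenario, suggested), given in zip(scenarios, answers):
        score += given == suggested
        print(f"{scenario}: you said {given} - suggested: {suggested}")
    return score

# Example run with a fixed answer sheet instead of input(), for demonstration.
print(run_quiz(["Illegal", "Legal", "It Depends", "It Depends"]),
      "/", len(scenarios))
```

Replacing the fixed answer sheet with `input()` calls turns this into the live class quiz.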
Introduce the concept of penetration testing and bug bounties. Companies like Google, Microsoft, and Apple pay people to find security flaws. Show examples of bug bounty payouts (some are six figures!). Discuss: How do ethical hackers avoid breaking the law? (Permission, contracts, responsible disclosure). Link to the UK National Cyber Security Centre's apprenticeships and career paths.
Resources:
List of major companies with bounty programs and example payouts
Information about cybersecurity education pathways
Teacher Notes:
This section reframes hacking skills positively. Many students interested in 'hacking' don't realise there's a legitimate career path that uses the same skills.
Return to Marcus Hutchins. After stopping WannaCry, he was arrested by the FBI for creating malware years earlier (before he reformed). He pleaded guilty and received no prison time. Discuss: Should your past affect how you're treated for doing something good? Can people change? What does this tell us about the complexity of 'heroes and villains' in cybersecurity?
Teacher Notes:
End on complexity rather than simple answers. The real world is messier than clear-cut legal categories suggest.
The full story of how a young security researcher accidentally stopped a global ransomware attack that was destroying NHS systems - and the complicated story of his own past that emerged afterwards.
Connection: Illustrates the grey areas between security research and hacking, and shows why clear laws are needed.
Further Reading:
Penetration testers, security analysts, and ethical hackers - there's a whole industry of people paid to hack into systems legally. The UK has a significant skills shortage in this area.
Connection: Shows that understanding computer crime creates career opportunities in legitimate security work.
Further Reading:
Support:
Stretch:
https://www.cps.gov.uk/legal-guidance/computer-misuse-act-1990
Prerequisites: 1
In 2018, Facebook admitted that data from 87 million users had been harvested without consent. What rights do you actually have over your personal data?
Open with the Cambridge Analytica scandal story. Show the scale: 87 million people's data used without their knowledge. Ask: Did those people agree to this? What should happen when companies misuse data? What could you do if this happened to you? Reveal: There are actual laws that give you rights - and we're going to learn what they are.
Resources:
Visual timeline of events
Teacher Notes:
This story resonates because students use social media. Emphasise that this could happen to anyone - it's not about being 'careful' online, it's about companies being held accountable.
Interactive teaching of key individual rights: Right to be informed (know what data is collected), Right of access (see your data), Right to rectification (correct wrong data), Right to erasure ('right to be forgotten'), Right to restrict processing, Right to data portability (take your data elsewhere), Right to object (e.g. to direct marketing), Rights related to automated decision-making and profiling. Use practical examples for each: requesting your data from Spotify, asking Instagram to delete your account, correcting wrong medical records. Introduce the role of the ICO (Information Commissioner's Office) as the enforcer.
Resources:
Visual summary of the 8 key rights
Real template students could use
Teacher Notes:
Focus on rights students can actually exercise. The 'right to be forgotten' often surprises them - you can actually ask Google to remove search results about you (with limitations).
Practical demonstration: Show how to request your data from major platforms (Google Takeout, Facebook Download Your Information, Spotify data request). If possible, show real data from a volunteer teacher account - the quantity and detail often shocks students. Discuss: Why might companies not want to make this easy? Group task: Draft a data access request letter for a fictional scenario (school, employer, online game).
Resources:
Step-by-step guides for major platforms
Anonymised example showing the type of data collected
Teacher Notes:
Showing real data exports is very powerful. The sheer amount of data collected makes the abstract concrete. If you can't show real data, there are journalistic examples online.
Flip the perspective: What must organisations do? Key principles: lawfulness, purpose limitation, data minimisation, accuracy, storage limitation, security. Introduce concept of 'data breaches' - what happens when it goes wrong. Show example ICO fines (British Airways: £20m, Marriott: £18.4m). Discuss: Is a fine enough, or should executives face prison?
Resources:
List of major fines with amounts and reasons
Teacher Notes:
The fine amounts usually impress students. Link back to earlier lesson - this is the 'legal' side of the ethical issues we discussed.
Quick scenarios - students judge if DPA has been breached: Your school shares your grades with a newspaper without asking. A hospital leaves patient files on a train. A shop keeps your payment details 'just in case' without telling you. A company emails you marketing after you asked them to stop. Students vote and justify. Exit ticket: Name two rights you have under the DPA.
Teacher Notes:
These scenarios link to real cases where possible. The 'files on train' example references multiple real NHS incidents.
The full story of how personal data from millions of Facebook users was harvested and used for political advertising, leading to stricter data protection worldwide.
Connection: Shows why data protection laws matter and how they've been strengthened in response to real abuses.
Further Reading:
Data Protection Officers, compliance managers, and privacy consultants - an entire profession has grown around ensuring organisations follow the rules.
Connection: Shows practical application of DPA knowledge in careers.
Further Reading:
Support:
Stretch:
Prerequisites: 1
Can you get sued for sharing a meme? Can a student's hobby project become a billion-dollar industry? The answer to both is yes - and it's all about who owns what.
Start with an attention-grabbing case: photographers suing over meme usage, music in YouTube videos being claimed, or fan art being removed. Ask: Who owns creative work? Can you 'own' an idea? When you post something online, who owns it? Reveal: There's a 1988 law that answers these questions, and it affects everyone who creates or shares anything online.
Resources:
Brief case studies of copyright claims on popular memes/content
Teacher Notes:
Students often don't realise how copyright applies to everyday online activity. The goal isn't to scare them but to inform them about how the system works.
Core teaching: What is copyright? (Automatic protection for original creative works). What does it cover? (Literary works, artistic works, music, films, software, databases). What rights does it give? (Copy, distribute, adapt, perform). How long does it last? (Generally 70 years after creator's death). Key point: You don't have to register - copyright exists automatically when you create something. Application to software: Code is protected like literature. You can't just copy and use someone else's code.
Resources:
Visual guide to what's protected and for how long
Teacher Notes:
Emphasise the automatic nature - students often think you need to 'register' copyright. Also clarify: you can't copyright an idea, only the expression of it.
Structured comparison activity. Split class into two groups: Team Open Source and Team Proprietary. Each group receives information cards about their licence type. Groups prepare arguments for: Who benefits? What are the advantages? What are the risks? Structured debate format with points for each side. After debate, reveal: Most real-world projects use BOTH - show examples (Android = open source Linux + proprietary Google apps).
Resources:
Detailed cards for each group with features, examples, pros, cons
Structure for the debate
Teacher Notes:
This activity works best when students genuinely advocate for their assigned position. The reveal about mixed models prevents black-and-white thinking.
Story of Linus Torvalds: 21-year-old student in Finland, 1991, posts on a message board about his 'hobby' operating system. Today: Linux runs around 96% of the top million web servers, every one of the world's top 500 supercomputers, all Android phones, most smart TVs, most routers. Show that students have probably used Linux today without knowing it. The power of open source: thousands of contributors, free to use and modify.
Resources:
Visual showing where Linux runs in everyday life
Teacher Notes:
This story shows the scale of impact that licensing decisions can have. It's also inspiring - a student project became world-changing.
Scenario exercise (exam-style practice): For each scenario, recommend open source or proprietary and justify. Scenarios: (1) A games company creating a new console, (2) A student making a photo-editing tool they want others to improve, (3) A bank developing security software, (4) A charity creating educational resources for developing countries, (5) A startup with a unique algorithm as their main product. Share answers and discuss edge cases.
Teacher Notes:
This directly addresses the exam requirement to 'recommend a type of licence for a given scenario including benefits and drawbacks'. Emphasise that justification is as important as the choice.
In 1991, a Finnish student posted his hobby project online for free. Today, Linux runs most of the world's servers, all Android phones, and most of the internet. The power of open source.
Connection: Shows the massive real-world impact of licensing choices and why understanding them matters.
Further Reading:
In 2021, a critical security flaw was found in Log4j - open source software used by almost every major company. The maintainers were volunteers. This sparked major debates about supporting open source.
Connection: Illustrates both the power and potential problems with open source software in critical infrastructure.
Further Reading:
Support:
Stretch:
Prerequisites: 2, 3
Streaming an hour of video creates roughly as much CO2 as driving 100 metres. With over a billion hours of YouTube watched every day, what's the real environmental cost of our digital lives?
Start with surprising statistics (estimates vary, but these are widely quoted): one email with an attachment ≈ 50g CO2, one hour of streaming ≈ 36g CO2, one year of cloud storage ≈ 200kg CO2. Ask: Where does this 'pollution' come from? Most students imagine data as weightless - reveal that the 'cloud' is actually massive physical buildings full of computers running 24/7. Interactive: Estimate the class's collective digital carbon footprint from yesterday's activity.
Resources:
Interactive tool or spreadsheet for calculating digital carbon footprint
Teacher Notes:
The goal is cognitive dissonance - students don't associate their phone use with pollution. Be careful not to induce guilt; focus on awareness and systemic issues rather than individual blame.
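The class footprint estimate can be run as a short Python sketch instead of a spreadsheet. The emission factors are the illustrative figures from the starter; real published estimates vary widely, so treat the output as a conversation starter, not a measurement.

```python
# Rough digital carbon footprint estimator for the class activity.
# Emission factors are the illustrative figures from the starter;
# published estimates vary widely.
FACTORS_G_CO2 = {
    "emails_with_attachment": 50.0,  # grams CO2 per email (illustrative)
    "streaming_hours": 36.0,         # grams CO2 per hour (illustrative)
    "searches": 0.2,                 # grams CO2 per web search (illustrative)
}

def footprint_grams(activity: dict) -> float:
    """Total estimated grams of CO2 for one day's digital activity."""
    return sum(FACTORS_G_CO2[k] * qty for k, qty in activity.items())

# One student's (hypothetical) day:
day = {"emails_with_attachment": 3, "streaming_hours": 2, "searches": 25}
print(f"{footprint_grams(day):.0f} g CO2")  # 3*50 + 2*36 + 25*0.2 = 227 g
```

Summing every student's `day` dictionary gives the whole-class figure, which scales the abstract statistics to something the room can picture.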
Demystify the cloud: Show images/video of real data centres. Discuss: electricity for computing, electricity for cooling, water usage, land use. Show scale: Google's data centres use roughly as much electricity as a city the size of San Francisco. Compare tech giants' carbon commitments - some claiming to be 'carbon neutral', what does this actually mean? (Often offsetting rather than reducing). Interactive map: Where are data centres located and why? (Cold climates, renewable energy access, submarine cables).
Resources:
Video or interactive tour of a major data centre
Map showing major data centre locations
Teacher Notes:
Many students have never seen a data centre. The scale often surprises them. Be balanced - also mention that companies ARE making efforts to reduce environmental impact.
Shift to physical waste: How often do you get a new phone? Where do the old ones go? Statistics: 53.6 million tonnes of e-waste generated globally in 2019. Only 17.4% properly recycled. Show the journey: UK e-waste often ends up in Ghana, Nigeria, China - processed in unsafe conditions. Discuss rare earth elements: what's in a phone, why mining them is environmentally damaging, why recycling is crucial. Brief discussion of 'planned obsolescence' - are devices designed to become outdated?
Resources:
Visual showing the rare elements in a typical phone
Key numbers on global e-waste
Teacher Notes:
E-waste is an emotional topic - images of waste dumps can be powerful but also distressing. Focus on systemic issues rather than making students feel guilty about their phones.
Balance the discussion: Technology also helps the environment. Examples: Smart grids reducing energy waste, AI optimising logistics, video calls replacing flights, apps helping track and reduce consumption, satellite monitoring of deforestation, precision agriculture. Discussion: Is technology net positive or negative for the environment? What would be needed for it to be positive overall?
Resources:
Case studies of technology helping environmental causes
Teacher Notes:
This section prevents doom and gloom. The goal is nuanced understanding - technology is a tool that can help or harm depending on how we use it and design it.
Group brainstorm: Changes at individual level (keeping devices longer, streaming less?), company level (renewable energy, recyclable design), government level (regulations, incentives). Vote: Which level of change would have the most impact? Exit ticket: Name one environmental impact of digital technology and one way technology helps the environment.
Teacher Notes:
End with agency rather than despair. Students should leave feeling informed and empowered, not guilty and hopeless.
Virtual tour of what data centres look like - the scale of cooling, power consumption, and why location matters. Some are built in cold climates or underwater to reduce cooling needs.
Connection: Makes the abstract 'cloud' concrete and shows the physical infrastructure behind digital services.
Further Reading:
Sustainability is becoming a major focus in tech. Roles like Green IT Consultant, Sustainability Analyst, and Circular Economy Designer are growing fields.
Connection: Shows career opportunities at the intersection of technology and environmental responsibility.
Further Reading:
Support:
Stretch:
https://www.bbc.com/future/article/20200305-why-your-internet-habits-are-not-as-clean-as-you-think
https://www.itu.int/en/ITU-D/Environment/Pages/Spotlight/Global-Ewaste-Monitor-2020.aspx
Prerequisites: 1
Your grandparents probably met someone in person before knowing them. You might know someone's life story before ever meeting them. Is technology bringing us together or pulling us apart?
Discussion starter: How did your parents meet? How do people meet now? Broader: How did people find jobs, get news, stay in touch, learn things 30 years ago vs now? Create a quick comparison timeline. Ask: Which way is better? (There's no single right answer). Key insight: Technology doesn't just change how we do things, it changes our culture, relationships, and expectations.
Resources:
Grid for students to fill in cultural changes
Teacher Notes:
This works well with student anecdotes. The comparison should be observational, not judgemental - neither era is 'better' in every way.
Explore positive cultural impacts: Global communication (friends worldwide, diaspora communities staying connected). Access to information (anyone can learn almost anything). Platform for marginalised voices (social movements, representation). Accessibility (life-changing technology for people with disabilities - show examples). Creative expression (everyone can publish, create, share). Use specific examples: how a disability advocate uses technology, how protest movements organise, how isolated communities stay connected.
Resources:
Brief stories showing technology improving lives
Teacher Notes:
It's important to genuinely explore positives - this unit can become very negative otherwise. Choose examples that feel real and specific.
Explore negative cultural impacts: Digital divide (who's left out? age, income, location, disability without accessible tech). Echo chambers and polarisation (algorithms showing you what you already believe). Mental health impacts (comparison culture, addiction). Misinformation spread. Cultural homogenisation (same platforms, same content worldwide). Use specific examples: teens and social media pressure, elderly isolation from digital services, fake news impact on elections. Discussion: Are these problems with technology or problems with how we use it?
Resources:
UK-focused data on internet access disparities
Teacher Notes:
Balance specificity with sensitivity - students may have personal experience with these issues. Focus on systemic effects rather than individual blame.
Mock trial activity: 'Technology' is on trial for crimes against society. Groups prepare: (1) Prosecution - gathering evidence of harm, (2) Defence - gathering evidence of benefit, (3) Witnesses - real examples from different perspectives (elderly person, young entrepreneur, disabled person, rural resident, urban professional). Conduct brief mock trial. Jury (whole class) delivers verdict: Not guilty / Guilty / Guilty but with mitigating circumstances. Debrief: What does the nuance tell us?
Resources:
Roles and evidence prompts for each group
Teacher Notes:
This synthesises the whole unit's learning. The 'guilty with mitigating circumstances' verdict usually wins - technology isn't good or bad inherently, but has both effects.
Individual writing task (exam practice): In your view, is digital technology having a mostly positive or mostly negative impact on society? Justify your answer with examples from across the unit. Minimum: One privacy point, one legal point, one environmental point, one cultural point. Share a few responses. Final reflection: What's one thing YOU could do to make technology's impact more positive?
Teacher Notes:
This prepares for the 8-mark extended response question format. Encourage balance - strong answers acknowledge both sides before coming to a judgement.
10% of UK households have no internet access. During COVID lockdowns, this meant no school, no work, no contact with services. Explore the geography, age, and income factors that affect digital access.
Connection: Shows that cultural impacts aren't universal - technology affects different groups very differently.
Further Reading:
Screen readers, voice assistants, and accessibility features - technology that makes life possible for people with disabilities. The story of how features designed for accessibility (like voice control, autocorrect) ended up helping everyone.
Connection: Demonstrates positive cultural impact and the importance of inclusive design.
Further Reading:
Support:
Stretch:
https://www.ofcom.org.uk/research-and-data/media-literacy-research/adults
Prerequisites: 1, 2, 3, 4, 5