Generative AI in education: The government guidance, explained


Generative Artificial Intelligence (AI) is an exciting and rapidly emerging technology with immense potential. Many in education have been experimenting with advanced algorithms and machine learning to support teaching and learning – but many have also called for caution.

The Department for Education has released a policy statement setting out its position on the use of tools like ChatGPT or Google Bard, providing much-needed guidance for schools and colleges.

So, what are the key points from this statement? Read on to learn more about the Department for Education’s advice on:

  • The pros and cons of using Generative AI in education
  • How to effectively use AI in education
  • How to protect your pupils, staff and data
  • The future of Generative AI in education

What are the pros and cons of Generative AI?

The Department for Education reports that Generative AI tools provide numerous opportunities for the education sector. They excel at quickly analysing, structuring and writing text, as well as turning prompts into audio, video and images. When used appropriately, these tools can reduce workload across the education sector and free up teachers’ time, allowing them to focus on delivering excellent teaching.

However, Generative AI tools have certain limitations to consider. Here are the key limitations raised by the Department for Education:

  • The results they produce are based on the dataset they have been trained on. For example, a tool may not have been specifically trained on the English curriculum, so AI-generated results may not match the quality of resources developed by a human working within the context of that curriculum.
  • Generative AI-produced content could be inaccurate, inappropriate, biased, taken out of context or unreliable. AI can make some tasks quicker and easier, but it cannot replace the judgment and deep subject knowledge of a human expert.

Regardless of the tools or resources used to create plans, policies or documents, the quality and content of the final product ultimately remain the professional responsibility of the person or organisation who makes them. Schools and colleges may also want to review their homework policies and other unsupervised study practices to account for the availability of Generative AI.

How to effectively use AI in education

The Department for Education’s statement addresses the important issue of teacher workload, and the role that Generative AI can play in minimising the time spent on non-pupil facing tasks. They are actively seeking AI-supported opportunities to reduce this workload by collaborating with the education sector and experts.

However, it’s important to note that to harness the full potential of Generative AI, we need base knowledge to draw upon. The Department for Education suggests that to craft high-quality prompts, it’s vital to have clear writing skills, domain-specific understanding and the ability to validate the results against a well-defined schema.

In a nutshell, the quality and reliability of AI-generated output depends on your judgement and subject knowledge.


How to protect pupils, staff and data

The Department for Education addresses the key concerns of data protection, student and staff safety, and the responsible use of Generative AI in the policy statement, outlining the following key points:

Data protection

Just like with any emerging technology, it is crucial to understand the data privacy implications of using Generative AI tools, so that personal and special category data are safeguarded in compliance with data protection legislation.

It’s important to prioritise openness and transparency with data subjects, particularly pupils, and make them aware of what you process with AI tools.

Intellectual property

Schools and colleges must also respect intellectual property rights. Intellectual property, such as students’ work, should not be used to train Generative AI models without appropriate consent or a copyright exemption.

Scams

Generative AI can generate highly realistic content, including persuasive scams. It’s important to be aware of how staff and students engage with Generative AI, and to educate them to question content that may appear authoritative and believable.

Cyber standards

Cybersecurity measures should be reviewed and reinforced, considering the potential impact of Generative AI on the sophistication and credibility of attacks. The cyber standards provide guidelines for schools and colleges in this regard.

Keeping children safe

One key concern is to ensure that children and young people are not exposed to harmful or inappropriate content online, including through Generative AI.

The “Keeping Children Safe in Education” document offers valuable information on how schools and colleges can protect their students and limit risks associated with their IT systems. Referring to the filtering and monitoring standard helps ensure that the appropriate systems are in place to control access to content.

By following these guidelines, schools and colleges can effectively address the challenges and opportunities presented by Generative AI while prioritising the protection and well-being of their community.

The impact of AI on formal assessments

The policy statement also outlines the responsibility of educational institutions and awarding bodies to take appropriate measures to prevent malpractice involving the use of Generative AI.

To safeguard the integrity of qualifications, the Joint Council for Qualifications has released comprehensive guidance on the use of AI in assessments, providing teachers and exam centres with valuable information on preventing and identifying AI misuse.

The future of Generative AI in education

It’s clear that Generative AI isn’t going away and will play an increasingly important role in education. To harness its potential, students need a knowledge-rich curriculum that prepares them to become well-informed users of technology and to understand its impact on society. This includes teaching about its limitations, reliability and potential biases, and about how information is organised and ranked on the internet. To achieve this, the Department for Education suggests:

  • Assisting students, especially young learners, in identifying and utilising suitable resources to aid their ongoing education.
  • Promoting the use of age-appropriate resources, including the potential incorporation of Generative AI in certain cases.
  • Discouraging excessive reliance on a limited set of tools or resources.

The Department for Education also states that it will continue collaborating with experts to examine and address the implications of Generative AI and other emerging technologies, as well as supporting primary and secondary schools in delivering a knowledge-rich computing curriculum to children up to the age of 16.

Final thoughts

Generative AI presents many exciting opportunities for educators, but also many challenges that need careful consideration. This policy statement is helpful in understanding and proactively addressing these issues, so that your school or college can use this technology in a safe and responsible way.

This technology has the power to enhance your students’ learning outcomes, create an environment that fosters innovation and creativity, open up new avenues in education, and prepare your students for the ever-evolving digital landscape they will deal with throughout their lives.


About the editor

Bradley Busch

Bradley Busch is a Chartered Psychologist and a leading expert on illuminating Cognitive Science research in education. As Director at InnerDrive, his work focuses on translating complex psychological research in a way that is accessible and helpful. He has delivered thousands of workshops for educators and students, helping improve how they think, learn and perform. Bradley is also a prolific writer: he has co-authored four books, including Teaching & Learning Illuminated and The Science of Learning, and regularly features in publications such as The Guardian and The Telegraph.
