

AI in Learning and Teaching

The role of AI in education is unfolding rapidly. Here you can find an overview of generative AI and advice on how we can use these tools responsibly in teaching, learning and assessment.

Frequently Asked Questions

Generative AI, like ChatGPT, is an artificial intelligence that can process text and generate human-like responses

Generative artificial intelligence (AI) can process text, generate human-like responses and produce multimodal content (e.g. audio, images, simulations and videos). Common models of generative AI include natural language processing, speech recognition, Text-to-Image (T2I) and Text-to-Video (T2V) generation. This is an important technology with diverse and growing capabilities, creating opportunities and challenges for education.

ChatGPT, a versatile chatbot developed by OpenAI, is a generative artificial intelligence (AI) tool that went viral after its launch in November 2022. Since the release of ChatGPT, generative AI technology has continued to advance at a breakneck pace, propelling the development of other popular generative AI tools.

For an introduction to the use of ChatGPT in Higher Education, please see our introductory guide.

We encourage the responsible use of generative AI

Generative AI will play a significant role in society and we encourage the responsible use of the technology in teaching, learning and assessment. Responsible use of generative AI can provide us and our students with many opportunities.

Opportunities for educators

Generative AI creates new opportunities to develop educational content and teaching activities, provide students with an interactive and engaging learning experience, and improve learning outcomes through adaptive and personalised means of teaching.

You can find all our upcoming AI-related workshops in our Continuing Professional Development in Learning and Teaching calendar.

Opportunities for students

Generative AI can interact with students in a conversational manner, providing personalised learning support and feedback. By adapting to students’ performance and adjusting learning paths accordingly, AI tools could respond to individual learning needs and improve learning progress.

Opportunities beyond education

Use of AI tools is becoming more common, and an increasing number of organisations are adopting them. This is anticipated to transform the job market and to affect the employability of our graduates as they enter an AI-driven economy.

Reviewing your curriculum to incorporate AI

Consider whether there is a need to change your curriculum design to teach knowledge and skills that are relevant in an AI-driven economy. Programme content, learning objectives, teaching plans and learning activities could all be adapted to take account of generative AI. Programmes could focus on higher-order cognitive skills, such as concept acquisition and application, synthesis of evidence, critical and creative thinking, systematic decision making, and solving complex, practical problems.

Students should acknowledge all uses of generative AI tools in assessment to avoid academic misconduct

AI tools such as ChatGPT and GPT-4 can be used for research and to write essays and academic articles. Some universities and schools around the world have banned students from using generative AI tools, but in practice detecting AI-written text may prove to be challenging. If you suspect a student of inappropriate use of AI tools, you must proceed as you would in any other case of suspected misconduct.

All students will be required to append a declaration page to their assignment when they have used AI in undertaking the assignment, acknowledging how they have used it. You should use this declaration to aid your academic judgement of the extent to which students have met the assessment criteria.

Failure to declare the use of AI, where this results in the work appearing to demonstrate greater attainment against the assessment criteria than is actually the case, is academic misconduct, as is copying or paraphrasing AI-generated content without reference. We will be updating the academic misconduct policy over summer 2023 to be more explicit about this. However, such use already falls under the policy as an “improper activity or behaviour by a student which may give that student, or another student, an academic advantage in assessment” (para 1.2) and as “attempting to complete any assessment by unfair means” (para 3.1).

Consider what barriers your students might face in accessing generative AI

Generative AI tools are not equally accessible to all users. Although ChatGPT is free of charge, it remains unavailable in certain countries and can be temporarily inaccessible at peak times. GPT-4, a more powerful and multimodal AI chatbot, is restricted to fee-paying subscribers. If you intend to use such tools in your teaching, consider beforehand what barriers students might face.

Never enter personally identifiable information into an AI model

There are concerns about the legality and ethics of using generative AI in terms of intellectual property, data privacy and the General Data Protection Regulation (GDPR). You should avoid using any external tools that promise to check whether material is AI-generated.

We must not feed business data or student work into any system with which we do not have a contractual relationship and whose privacy impact and security implications we have not assessed. Personal or sensitive information should never be entered into an AI tool, as the tool may store this data; doing so would be deemed unethical.
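
If colleagues do experiment with external tools, a simple precaution is to strip obvious identifiers from any text first. The short Python sketch below is a minimal, illustrative example of this kind of redaction; the patterns are assumptions, will not catch every identifier, and do not replace the rule above about personal and sensitive data.

    # Minimal, illustrative sketch: redact obvious identifiers (email addresses,
    # phone-like numbers, student-ID-like numbers) before text goes anywhere near
    # an external AI tool. The patterns are assumptions and deliberately simple.
    import re

    REDACTIONS = [
        (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),   # email addresses
        (re.compile(r"\+?\d[\d\s-]{7,}\d"), "[PHONE]"),        # phone-like numbers
        (re.compile(r"\b\d{7,9}\b"), "[ID]"),                  # student-ID-like numbers
    ]

    def redact(text: str) -> str:
        """Replace matched identifiers with placeholders, in order."""
        for pattern, placeholder in REDACTIONS:
            text = pattern.sub(placeholder, text)
        return text

    print(redact("Contact Jane on jane.doe@example.ac.uk or 0114 222 0000 (ID 21034567)."))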

Moreover, students are accountable for the originality of their assignments and should acknowledge all content generated by AI tools. We have shared guidance on this with our students.

Remind your students that generative AI is not always reliable and may fabricate information

Not all generative AI tools can respond to prompts about recent events, and they may ‘hallucinate’, fabricating misleading answers. Explain to students why generative AI tools should be used to assist, but not replace, formal learning. Students should understand that they need to be knowledgeable in the subject area in order to use AI intelligently (i.e. to ask the right questions and to discern the quality of AI-generated answers). Students should also be reminded that generative AI tools cannot be listed as co-authors in top academic journals.

Using AI to increase teaching effectiveness and efficiency

AI could be used to generate ideas and drafts for curriculum design, module outlines, lesson plans and teaching activities. It could be used to create assessments such as quizzes, topics for group projects, essay and exam questions and sample answers at different performance levels. AI could also produce teaching materials such as case studies, code, summaries and translations.
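
As a concrete illustration, the short Python sketch below shows one way a colleague might draft quiz questions programmatically. It assumes access to the OpenAI Python client and an API key; the model name, prompt wording and helper function are illustrative only, and any output would still need careful review before use with students.

    # Minimal sketch: drafting quiz questions with the OpenAI Python client.
    # Assumes `pip install openai` and an OPENAI_API_KEY environment variable;
    # the model name and prompt are placeholders, not a recommendation.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def draft_quiz_questions(topic: str, level: str = "undergraduate", n: int = 5) -> str:
        """Ask the model for draft questions that a tutor then reviews and edits."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # substitute whichever model you have access to
            messages=[
                {"role": "system",
                 "content": "You draft quiz questions for university teaching staff."},
                {"role": "user",
                 "content": f"Write {n} {level}-level multiple-choice questions "
                            f"on {topic}, with model answers."},
            ],
        )
        return response.choices[0].message.content

    print(draft_quiz_questions("photosynthesis"))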

Using the AI Risk Measure Scale to evaluate potential risks

The university encourages the responsible use of generative AI in teaching, learning and assessment. However, the use of such tools in assessment also raises concerns about academic integrity. To tackle this challenge, we encourage colleagues to reflect on their approach to assessment and determine whether their assessments remain fit for purpose in light of the challenge posed by generative AI. Against this backdrop, the AI Risk Measure Scale (ARMS) has been developed and piloted by academic staff across different programmes in the university.

The AI Risk Measure Scale (ARMS) is a tool specifically designed to help academic staff evaluate the potential risk associated with students utilising generative AI tools in their assignments when not expected as part of the assessment. The ARMS categorises assessment tasks into five levels, ranging from very low risk to very high risk.
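
To make the idea concrete, the sketch below models the five ARMS levels as a simple Python enumeration. Only the ordering from very low to very high comes from the description above; the middle level name, the comments and the threshold function are hypothetical and are not taken from the ARMS documentation itself.

    # Illustrative sketch only: the five ARMS levels as an ordered enumeration.
    # The range follows the "very low" to "very high" description above;
    # everything else (the MODERATE label, the threshold) is hypothetical.
    from enum import IntEnum

    class ARMSLevel(IntEnum):
        VERY_LOW = 1
        LOW = 2
        MODERATE = 3
        HIGH = 4
        VERY_HIGH = 5

    def flag_for_redesign(level: ARMSLevel, threshold: ARMSLevel = ARMSLevel.HIGH) -> bool:
        """Flag assessment tasks whose AI risk meets or exceeds a chosen threshold."""
        return level >= threshold

    print(flag_for_redesign(ARMSLevel.VERY_HIGH))  # True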

Guidance Documents and Resources

Contact details

If you have any questions, please contact our Academic and Learning Enhancement (ALE) team.

Learning and teaching resource centre


Learn and develop your practice in your own time using our Learning and Teaching Resource Centre. Here you will find a variety of tools, guides and materials tailored to help you create engaging and effective learning experiences for your students.