AI support for supervisors

AI can support you as a supervisor in several ways.
Generative AI can potentially give research supervisors more time for high-quality mentorship, as doctoral students can use it to work more independently (see, for example, Dai et al., 2024; Okoth, 2025). Supervisors should also strive to create a research culture that prioritizes integrity, originality, and responsible use of AI by encouraging open discussions about the possibilities and limitations of AI. In this way, generative AI can enhance, rather than undermine, doctoral students' academic writing and professional development.
Supervisors should:
- uphold academic integrity and originality
- guide students on how to use generative AI tools in an efficient but responsible way
- emphasize the importance of independent and critical thinking
- inform students of the risks of over-reliance on AI-generated content
As AI becomes an integral part of academic work, supervisors must also be clear about whether and how generative AI tools may be used by their doctoral students.
Ten quick tips for supervisors and doctoral students
(adapted from Okoth, 2025)
- Take time to learn and understand how AI tools work, their limitations, and their potential biases
- Use AI as support, to assist with tasks like proofreading, data analysis, or idea generation, not to replace critical thinking or original research
- Be transparent about using AI tools in your work, whether for writing assistance or data processing, by acknowledging the source
- Always validate AI-generated outputs by checking for accuracy, especially in tasks like summarizing research or analyzing data
- Avoid putting sensitive data or unpublished work into AI tools without ensuring data privacy
- Balance AI assistance with your own intellectual contributions to maintain academic integrity, and avoid over-relying on it
- Supervisors should discuss ethical AI use with students to establish clear expectations
- Stay updated with policies and guidelines on the ethical use of AI in research and supervision
- Use AI to explore new ideas and perspectives but ensure that the final work reflects personal insights and creativity
- Stay updated with developments in AI and ethics
References
Dai, Y., Lai, S., Lim, C.P. & Liu, A. (2024). ChatGPT and its impact on research supervision: Insights from Australian postgraduate research students. Australasian Journal of Educational Technology, 39(4). Available at: https://ajet.org.au/index.php/AJET/article/view/8843/2026 (Accessed 14-02-2025).
King's College London (2025). Generative AI: Guidance for doctoral students, supervisors and examiners. Available at: https://www.kcl.ac.uk/about/strategy/learning-and-teaching/ai-guidance/doctoral-assessment (Accessed 14-02-2025).
Okoth, M.O. (2025). Use of AI in Research and Postgraduate Supervision: Balancing Efficiency and Academic Integrity. (Accessed 04-02-2025).
The University of Manchester (2025). Code of practice for postgraduate research degrees. Available at: https://www.staffnet.manchester.ac.uk/rbe/rdrd/code/ (Accessed 04-02-2025).