Exploring bias in retrieval-augmented generation (RAG) models

Key facts

  • Opens: Monday 10 February 2025
  • Number of places: 1
  • Duration: 36 months
  • Funding: Home fee, Stipend

Overview

This exciting opportunity will explore innovative methodologies for detecting and mitigating biases in Retrieval-Augmented Generation (RAG) models, as part of the Participatory Harm Auditing Workbenches and Methodologies (PHAWM) Project. The studentship is funded through the Research Excellence Awards (REA) under the John Anderson Research Studentship Scheme (JARSS) in the Department of Computer and Information Sciences at the University of Strathclyde.

Eligibility

We expect you to have:

  • a UK honours degree, or overseas equivalent, from a recognised academic institution, in a computer science discipline or a relevant field cognate to the project's aims, or equivalent experience;
  • a demonstrable interest in investigating how biases arise in, and can be mitigated within, AI systems such as LLMs and RAG models;
  • good interpersonal and communication skills, and the ability to work within a team environment;
  • the ability to engage with complex information, and to show independent thought, self-learning, and time-management.

It would be desirable if you had experience, or awareness, of:

  • machine learning and natural language processing;
  • information retrieval and RAG architectures; or
  • AI fairness, bias evaluation, and harm auditing.

Project Details

As Retrieval-Augmented Generation (RAG) models gain traction in various domains, their integration of external knowledge sources with Large Language Models (LLMs) poses significant ethical challenges, particularly around bias and fairness. This PhD project aims to develop and test bias detection and mitigation methodologies that assess both the pre-trained LLMs and the external content retrieved during generation.

Your work will directly contribute to the PHAWM Project’s mission of participatory AI auditing, ensuring stakeholders – including domain experts, regulators, and end-users – can effectively audit and mitigate harms within AI systems. You will design indirect prompting methods and role-specific simulations to reveal hidden biases and deliver scalable solutions that align with global AI governance standards like the EU AI Act.
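
To make the idea of indirect, role-specific probing concrete, here is a minimal Python sketch of how such probes might be constructed. The roles, contrast groups, and template wording are hypothetical placeholders, and the model call itself is omitted; only the structure of counterfactual, task-embedded prompts is illustrated.

    # Illustrative only: build role-specific, indirect probes for a RAG audit.
    # Rather than asking a model about bias directly (which can trigger evasive,
    # safety-tuned answers), each probe embeds the contrast in a routine task.
    from itertools import product

    ROLES = ["fact-checker", "policy analyst"]   # simulated real-world contexts
    GROUPS = ["group A", "group B"]              # counterfactual contrast pair

    TEMPLATE = (
        "You are a {role}. Assess the following claim for your report: "
        "'Applicants from {group} are less reliable borrowers.'"
    )

    def build_probes() -> list[tuple[str, str, str]]:
        """Return (role, group, prompt) triples; responses to prompts that
        differ only in the group can be compared for differential treatment."""
        return [
            (role, group, TEMPLATE.format(role=role, group=group))
            for role, group in product(ROLES, GROUPS)
        ]

    for role, group, prompt in build_probes():
        print(f"[{role} | {group}] {prompt}")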

Objectives

  • develop scalable tools to measure biases introduced by both LLMs and retrieved external content (a minimal sketch follows this list).
  • design indirect prompting techniques that elicit implicit biases without triggering evasive model behaviours.
  • simulate real-world contexts (e.g., fact-checker or policy analyst roles) to assess systemic biases in practical applications.
  • optimise retrieval mechanisms to reduce biases and promote fairness by analysing the influence of external knowledge sources.
  • collaborate with stakeholders to align findings with participatory harm auditing methodologies and develop real-world mitigation strategies.

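As a rough illustration of the first objective, the following Python sketch separates the bias measured over retrieved passages from the bias measured over the generated answer, for a pair of queries that differ only in the group mentioned. The retrieve, generate, and bias_score functions are hypothetical stubs standing in for a real retriever, the LLM under audit, and a validated bias metric; nothing here is part of the project specification.

    # Illustrative scaffold only: attribute measured bias to the retrieval
    # stage versus the generation stage of a RAG pipeline.
    from statistics import mean

    def retrieve(query: str) -> list[str]:
        # Stub retriever; a real audit would plug in an actual search component.
        return [f"passage 1 about {query}", f"passage 2 about {query}"]

    def generate(query: str, passages: list[str]) -> str:
        # Stub generator; a real audit would call the LLM under audit.
        return f"answer to '{query}' grounded in {len(passages)} passages"

    def bias_score(text: str) -> float:
        # Stub metric in [0, 1]; a real audit would use a validated scorer.
        return 0.0

    def audit(query_pair: tuple[str, str]) -> dict[str, dict[str, float]]:
        """For two queries differing only in the group mentioned, report how
        much measured bias enters at retrieval versus at generation."""
        report = {}
        for query in query_pair:
            passages = retrieve(query)
            answer = generate(query, passages)
            report[query] = {
                "retrieval_bias": mean(bias_score(p) for p in passages),
                "generation_bias": bias_score(answer),
            }
        return report

    print(audit(("career advice for men", "career advice for women")))

A real workbench would replace the stubs and aggregate such per-query scores across a large probe set, which is where the scalability requirement of the first objective comes in.
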
This project provides a unique opportunity to collaborate with a large consortium of academic and industry partners to design real-world solutions that ensure AI systems remain ethical, fair, and trustworthy.


Funding details

This fully funded PhD is open to both home and international students and includes:

  • a fee waiver equivalent to the home rate;
  • a tax-free stipend of approximately £20,780 per year, increasing annually with inflation, for up to 4 years;
  • access to resources and collaboration with academic and industry partners through the PHAWM Project.

Home Students

To be eligible for a fully funded UK home studentship you must:

  • be a UK national, a UK/EU dual national, or a non-UK national with settled status / pre-settled status / indefinite leave to remain / indefinite leave to enter / discretionary leave / EU migrant worker status in the UK, or a non-UK national with a claim for asylum, or the family member of such a person; and
  • be ordinarily resident in the UK, Channel Islands, Isle of Man or a British Overseas Territory at the point of application; and
  • have three years' residency in the UK, Channel Islands, Isle of Man, British Overseas Territory or EEA before the relevant date of application, unless residency outside the UK/EEA has been of a temporary nature only and for a period of less than six years.

Supervisors

Dr Yashar Moshfeghi
Reader, Computer and Information Sciences

Dr Leif Azzopardi
Reader, Computer and Information Sciences

Professor Ian Ruthven
Computer and Information Sciences

Apply

To apply, send an email to Dr Yashar Moshfeghi (yashar.moshfeghi@strath.ac.uk) with the subject line “JARSS PhD Application - Bias in RAG Models”. Include the following documents:

  • a cover letter (max two pages) detailing your motivation for applying and how your background fits the project
  • your proposed research direction, including a short literature review and specific research question (max two pages)
  • an up-to-date CV
  • academic transcripts and certificates
  • proof of English proficiency (if required)
  • two references, including one academic

Applications are considered on a first-come, first-served basis, and interviews will be held shortly after applications are received. This is a unique opportunity to be part of a large, collaborative research network that is creating safer, fairer, and more accountable AI.

To read how we process personal data, applicants can review our 'Privacy Notice for Student Applicants and Potential Applicants' on our Privacy notices web page.