AI in Higher Education: Trends, Tools, Ethics and the Future of Learning

To help you stay up to date with how AI is evolving across UK higher education, we’ve brought together the latest statistics on student adoption, institutional policy and subject-level skills gaps. You’ll find all sources listed at the end of this article for easy reference.

If You Only Have 5 Minutes, Read This

  • AI use is now near-universal: 92% of students use it, and 88% use GenAI in assessments.
  • The data suggests AI is primarily used for learning tasks (such as explanations, summaries and source identification) rather than for misconduct.
  • 44% say AI helped them develop more skills; only 12% say it reduced learning.
  • Most institutions say they have policies, but only 36% of students receive training.
  • Students see benefits, but fear misconduct accusations and inaccurate outputs.
  • STEM subjects show the highest AI confidence and exposure to future AI-aligned jobs.

AI Adoption Among Undergraduates: Latest Statistics

According to the Higher Education Policy Institute (HEPI)–Kortext 2025 survey, AI adoption has surged dramatically: 92% of undergraduates now use AI in some form (up from 66% in 2024).

  • 88% have used generative AI specifically in their assessments.
  • Use of AI to generate text has more than doubled to 64%.

This marks one of the fastest technology adoption curves recorded in UK higher education.

Generative tools such as Adobe Firefly are increasingly part of this landscape, helping students create, refine and understand content more efficiently while maintaining responsible AI standards.

AI Use Cases: What Students Use AI For

HEPI–Kortext data shows that AI is most commonly used to:

  • Explain concepts: 58%
  • Summarise articles (sharpest increase year-on-year)
  • Suggest research ideas or structure work
  • Draft or edit assessment content: 25% use AI to draft work, and 18% include AI-edited text in submitted work

YouGov’s 2025 national student survey reinforces these patterns:

  • 81% use AI to explain concepts
  • 69% use AI to summarise sources
  • 55% use AI to identify relevant sources
  • 52% use AI to improve work they already wrote
  • 45% use AI to find evidence for arguments

Generative AI is therefore not a fringe behaviour; it is deeply embedded in day-to-day academic practice.

Students often combine AI tools with practical document workflows, such as using AI to summarise research and then convert PDFs into editable Word documents for annotation and revision.

AI Usage Frequency Statistics

Daily AI use remains rare, but roughly a third of students use AI at least weekly for their degree:

  • 5% use AI every day for their degree
  • 13% use it every few days
  • 15% about once a week

AI Adoption by Subject Area

Department for Education (DfE) exposure data shows higher AI task-alignment in:

  • Computing
  • Engineering
  • Business
  • Creative disciplines

This supports the HEPI–Kortext finding that students in STEM and health subjects report the strongest familiarity and comfort with AI, while arts and humanities students remain more sceptical of its academic value.

Overall, AI has moved from a niche tool to a mainstream part of the UK academic experience.

AI Adoption and Impact in Higher Education: Key Findings

Students frequently report that AI improves the quality and efficiency of their academic work. This is consistently supported by both the HEPI–Kortext and YouGov datasets.

Reasons Students Use AI: 2025 Statistics

According to HEPI–Kortext:

  • 51% use AI because it saves them time
  • 50% say it improves the quality of their work
  • 40% rely on it for instant support
  • 32% for personalised help
  • 28% to improve their AI skills
  • Women are more likely to use AI for out-of-hours support; men more for skill development

Impact of AI on Academic Performance

YouGov’s study shows:

  • 30% say AI has improved their marks
  • 48% say their marks stayed the same
  • Only 11% believe AI worsened their marks

Impact of AI on Learning and Skill Development

Students also report positive learning effects:

  • 44% say they learned or developed more skills thanks to AI
  • 32% say learning stayed the same
  • Only 12% say AI reduced learning

In data-heavy subjects such as business and engineering, students increasingly use AI alongside tools that convert PDF data into Excel spreadsheets, enabling faster analysis and interpretation.

Variation across subjects and demographics

HEPI–Kortext highlights widening divides:

  • STEM students report higher usage and greater confidence
  • Arts & Humanities students are more sceptical that AI-generated content would score well
  • Wealthier students and men use AI more frequently and for more advanced tasks
  • Women are more concerned about hallucinations and being accused of misconduct

Together, these patterns show that AI is influencing both academic behaviour and student experience in ways that vary across subgroups.

Generative AI and ChatGPT Usage Statistics

The rise of tools like ChatGPT, Copilot, and Gemini has fundamentally changed how students approach academic tasks.

How students use generative AI

Across both major datasets:

HEPI–Kortext (assessments):

  • 58% use GenAI to explain concepts
  • Summarising articles saw the largest year-on-year increase
  • 25% use AI to help draft assessments
  • 18% include AI-edited text directly in submitted work

YouGov (study activities):

  • 81% use AI to explain concepts
  • 69% to summarise sources
  • 52% to improve their own writing
  • 20% for graded coursework sections
  • 13% because it is required by their course

For assessment workflows, students often draft with AI support and then convert Word files to PDF for submission, reflecting how generative tools integrate into existing academic processes.

Acceptability of AI Use vs Actual Behaviours

The data reveals a key tension between what students do and what they consider acceptable:

  • 18% admit submitting AI-edited text
  • Yet only 25% consider that practice acceptable

With three-quarters of students viewing AI-edited submissions as unacceptable, some are clearly using AI in ways they feel uncertain about, signalling a gap between behaviour and institutional clarity.

Barriers and Concerns Around AI Use

According to HEPI–Kortext:

  • 53% fear being accused of cheating
  • 51% fear hallucinations
  • 37% worry about biased results
  • 31% say their institution discourages or bans AI

Women and younger students show the highest concern, especially around misconduct risk.

Trust in AI and Hallucination Experience

Experience with hallucinations is now clearer:

  • 30% say AI hallucinates “quite often”
  • 39% say hallucinations occur rarely
  • Only 12% say they “don’t know,” down sharply from 35% last year

Students are becoming more aware of AI’s limits and risks, a sign of growing digital maturity.

Subject-level differences (Department for Education + HEPI)

STEM subjects, with higher exposure to AI-aligned job tasks, show:

  • higher early adoption
  • more trust in AI’s academic usefulness
  • greater skill confidence

Arts & humanities students are more hesitant, shaping very different discipline cultures around AI.

How AI Is Shaping Student Experience

Students are increasingly conscious that AI is not just a study tool; it is also a career skill.

Student Perceptions of AI and Future Careers

YouGov’s data:

  • 47% say AI skills will be important for jobs after graduation
  • 39% believe AI won’t matter or will matter very little

This reflects a divided but shifting landscape: many students anticipate an AI-intensive workforce, while nearly as many remain unconvinced.

DfE occupational-exposure data supports this tension, showing that sectors such as Computing, Engineering and Business face the highest levels of AI-related job transformation, suggesting employer expectations will rise as AI becomes embedded in everyday professional tasks.

Institutional Changes Driven by AI

The HEPI–Kortext report reveals rapid institutional shifts:

  • 59% say assessments have already changed “a lot” due to AI
  • 80% say their institution now has a clear policy
  • 76% believe their institution could detect AI use (even though detection tools remain unreliable)

Student Expectations for AI in Education

Qualitative responses highlight:

  • desire for more support
  • frustration with inconsistent rules
  • recognition that AI is becoming integral to future work
  • concern that institutions are moving too slowly

Subject-based future impact

Department for Education exposure data shows:

  • Computing, Engineering, Technical disciplines → highest future AI task exposure
  • Arts & Humanities → lowest exposure
  • Business → strong alignment with AI-augmented roles

Students in high-exposure subjects anticipate more dramatic change to their academic and career pathways.

AI Tools in Everyday Academic Workflows

AI is now part of everyday academic workflows, even though institutional training remains inconsistent.

Common study uses

Across HEPI–Kortext and YouGov, top tasks include:

  • Explaining concepts: up to 81%
  • Summarising sources: 69%
  • Editing or refining text: 52%
  • Identifying relevant sources: 55%
  • Creating non-graded content: 25%

Institutional use cases

Universities increasingly deploy:

  • AI chatbots for support
  • AI-enhanced LMS platforms
  • Plagiarism and similarity detection
  • Microsoft Copilot through institutional licences

Alongside AI-powered study tools, students frequently rely on practical utilities to manage coursework, such as splitting PDFs into smaller sections for focused reading and revision.

But support is limited

HEPI–Kortext 2025:

  • Only 36% have received support to develop AI skills
  • Only 26% say their institution provides AI tools
  • Staff skills improving: 42% say staff are now “well-equipped” (up from 18%)

These gaps show a misalignment: students adopt quickly, institutions follow slowly.

Benefits of AI for Students

Students consistently report strong benefits linked to using AI.

Key benefits:

  • Time savings: 51% (HEPI–Kortext)
  • Higher-quality work: 50% (HEPI–Kortext)
  • More effective learning: 44% say they developed more skills (YouGov)
  • Better explanation and comprehension: 81% use AI to understand concepts
  • Better writing clarity: 52% incorporate AI-suggested improvements
  • Improved research workflows: 55% use AI for source identification

Longitudinal insights

HEPI–Kortext shows:

  • declining confusion about AI use
  • growing understanding of hallucinations
  • rising expectations for institutional support
  • improved staff readiness over time

The narrative has shifted from novelty to practical academic enhancement across the student population.

Barriers to AI Adoption in Higher Education

Despite widespread use, significant challenges remain.

Top barriers (HEPI–Kortext 2025)

  • 53% fear being accused of cheating
  • 51% fear hallucinations
  • 31% say their institution bans or discourages AI
  • 23% worry about data privacy
  • 20% find tools too expensive

YouGov data reinforces concerns around ethics and fairness, with large majorities saying certain uses of AI in assessments are unacceptable, especially submitting AI-generated work without editing (93% unacceptable).

Policy inconsistencies

Qualitative feedback from HEPI–Kortext highlights:

  • mixed messages from lecturers
  • variations between departments
  • vague or confusing guidance
  • unclear boundaries around acceptable use

Students repeatedly call for clearer, more consistent rules.

AI Governance and Institutional Readiness

AI governance is still emerging across UK higher education.

Institutional clarity

HEPI–Kortext:

  • 80% say their institution claims to have a clear AI policy
  • But only 36% say they have received actual AI skills support
  • 29% feel “encouraged” to use AI
  • 40% disagree that they are encouraged

Detection confidence

YouGov 2025:

  • 66% believe their institution would detect fully AI-generated work
  • 23% believe detection is unlikely
  • 11% don’t know

Fairness and transparency

HEPI–Kortext student comments reveal concerns over:

  • inconsistent enforcement
  • unclear expectations
  • fairness between subjects
  • rapid policy changes

Governance is improving, but not yet aligned with student behaviour.

AI Literacy and Skills Gaps

A clear literacy gap is emerging in UK higher education.

Key divides

  • Students are learning AI faster than institutions teach it
  • Only 36% have received AI-skills support
  • STEM subjects show the highest confidence
  • Socio-economic divides persist
  • Women and C2DE students report less confident use and greater concern
  • Postgraduates show more advanced generative AI behaviours

Together, these divides signal a broader equity risk for the sector. As AI adoption accelerates unevenly, disparities in skills and confidence across subjects, socio-economic backgrounds, and qualification levels will deepen unless universities provide consistent, universal training.

Department for Education exposure alignment

Subjects with highest AI-task exposure (Computing, Engineering, Business) will require more intensive institutional support to remain competitive.

Non-STEM subjects risk falling behind unless training becomes universal.

AI Policy and Governance in Higher Education

AI policy must now evolve beyond warnings and misconduct enforcement. Students want:

  • transparent rules
  • consistent enforcement across departments
  • proactive skills training
  • access to trusted tools
  • assessment redesign to reflect AI-enabled study behaviours

HEPI–Kortext comments emphasise frustration with vague policies and mixed messages, while DfE data shows workforce transformation that will require stronger national coordination.

Summary of Key AI Findings in Higher Education

Drawing together all datasets, the state of AI in UK higher education can be summarised clearly:

  • AI use is nearly universal, with 92% of students using AI and 88% using GenAI in assessments.
  • AI supports learning, with 44% saying they gained more skills and 30% earning better marks.
  • Generative AI dominates, with high usage for explanation, summarisation and improvement of work.
  • Institutions lag behind, with limited training and uneven policy clarity.
  • STEM leads adoption, but all subjects are touched by AI exposure.
  • Students expect AI to shape their future, but are divided on how essential those skills will be.

AI is no longer optional in UK higher education; it is a structural part of how students learn, how assessments evolve, and how universities prepare graduates for the future.

What’s next?

The data points to a sector now moving from experimentation to consolidation. Universities will likely focus on strengthening AI literacy, embedding responsible-use policies, and developing assessment models that reflect widespread generative AI use. Ensuring equitable access to AI skills will become a defining priority for the coming years.