Use of AI tools to support business processes

Purpose and scope

1) We recognise the range of AI tools available to support business processes at the University, and the opportunity this represents for us. We are equally aware that the use of AI for this purpose is fast-developing and that we must identify and manage emerging risks as an institution and act in compliance with our Information Security Policy and IT Acceptable Use Policy. This guidance relates to the use of AI within business operations. Staff or student use of AI for the purposes of learning and teaching is not covered by this guidance.

2) The responsible use of Generative AI tools by staff at the University of Essex will comply with all laws and regulations applicable to the use of AI.

Definition of AI

3) For the purposes of this guidance:

  • ‘AI’ or ‘Generative AI’ refers to a system that generates new outputs in a range of formats, such as text, images, or sounds. An example is using a Large Language Model (LLM) to summarise text and data and present it in a different format
  • a Large Language Model (LLM) is an advanced computer program that processes and generates human-like text based on the vast amount of language data it has learned, by identifying patterns in words and phrases
  • AI is not the same as Robotic Process Automation (sometimes called RPA or automation), whereby a task is performed repeatedly without human intervention according to set parameters.

Our principled approach

4) Staff using AI in their work must ensure that they comply with our IT Acceptable Use Policy and follow the principles below:

  • integrity: AI tools must be used in a way that aligns with our ethical principles and our values of equality, diversity, and inclusion
  • co-piloting: AI should augment, not replace, human judgment. AI can make mistakes and introduce bias; human oversight must mitigate these risks
  • accountability: staff are accountable for decisions made with the assistance of AI
  • transparency: the University is transparent about its use of AI. Where AI is used to support decision making, users must clearly state how. For example, when AI is used to produce meeting notes, attendees must be made aware, and the use of AI must be declared on the draft minutes. Find out more about recording meetings
  • data privacy and security: AI usage by staff must comply with our Information Security Policy and all relevant legal requirements

Using AI tools

5) The use of approved Generative AI tools may be considered for the following purposes:

  • to enhance operational efficiency
  • to support and inform decision-making
  • to improve service delivery to students, staff and other stakeholders
  • to support every student from every background to fulfil their potential at the University.

6) The potential uses of AI are many and varied. The following are examples of how AI might be used to support University business processes to:

  • summarise and improve the ‘readability’ of information for presentation in a different format, for example using the contents of a word document to generate a PowerPoint presentation
  • summarise reading (for example to help prepare for a committee or prepare a paper)
  • produce draft meeting notes or minutes before they are reviewed with human intervention
  • identify trends in a dataset that does not contain sensitive data, which may improve the efficiency of report writing or similar activities
  • support student and staff enquiries, whilst ensuring that queries of a complex or personal nature are routed for human intervention
  • help to draft an email that is concise, clear and audience-targeted

Where the use of AI is not permitted

7) AI tools may not be used for the following purposes:

  • sensitive decision-making: decisions relating to individuals; for example, admissions, staff recruitment, promotions, or performance management
  • isolated decision-making: where AI is used but not additionally subject to human judgment
  • processing or analysing confidential or personal information as outlined in our Information Security Policy (.pdf)
  • where the AI tool is accessed through a ‘personal’ account and has not been approved for use by the University.

Approval process for AI tools

8) Our approved AI tools are Zoom AI Companion and Microsoft Copilot. These are Generative AI tools developed by suppliers with whom we have established contractual agreements, and they are licensed to the University.

9) Staff wishing to use an AI tool not listed on this page must first raise this with their line manager and then follow the appropriate procurement process.

Non-compliance with this guidance

10) Non-compliance with this guidance may result in the revocation of access to AI tools and disciplinary action.

Governance

11) On an annual basis, the AI Advisory Group will:

  • review the institutional use of agreed AI tools to assess engagement, effectiveness, and adherence to the principles
  • review the guidance provided to staff regarding the use of AI.