Strategy and Innovation
Artificial Intelligence
Generative Artificial Intelligence (GAI) is a broad term that refers to a type of artificial intelligence (AI) application that uses machine learning algorithms to create new content. GAI tools, such as ChatGPT, Microsoft Copilot and others, will have a profound impact on administrative, academic and research work going forward, but they are not intended to replace the important work you do. They are designed to assist with everyday tasks like drafting emails, summarizing information or creating outlines.
Although the University of Missouri does not yet have a formal AI governance structure in place, the Offices of Finance and Human Resources are fully engaged with IT and System/Campus leadership to establish one and to advise on how AI can and should be used across campus.
The university supports responsible AI use by faculty, staff and students. This page serves as a resource to understand how the university supports the use of AI in the learning environment and the workplace, compliance requirements and approved AI tools that can be used now.
Please note that given the rapid pace of advancements and instructional applications in generative AI, these guidelines will continue to evolve.
At this time, the university supports responsible experimentation with and use of generative AI (GAI) tools, such as ChatGPT and Google Gemini, but there are important considerations to keep in mind when using these tools, including information security, data privacy, compliance, intellectual property/copyright implications, academic integrity and bias. In particular, student data should NOT be entered into generative AI tools, and we strongly encourage you to not enter your own personal information into such tools.
What can you use GAI tools for?
GAI tools can be used for any needs involving public or generic data (DCL 1 under the university’s Data Classification System). Examples include:
- Writing software code that uses common routines
- Research on nonsensitive topics
- Queries (without confidential information) to better understand our customers, partners, vendors, etc.
- Writing generic documentation such as job descriptions, strategic plans or other administrative documents
What should you avoid when using GAI tools?
- Do not enter personal, health, student or financial information into AI tools (DCL 2, 3 and 4 under the university’s Data Classification System). The technology and vendors may not protect the data or the privacy rights of individuals. Data entered into these tools may be shared with unauthorized third parties.
- Do not reuse your password associated with your university account to sign up for AI accounts.
- Do not share sensitive research information, intellectual property or trade secrets with AI tools. The university may lose its rights to that information, which may be disclosed to unauthorized third parties.
The MU Task Force on Artificial Intelligence and the Learning Environment gathered input from across campus, consulted with AI experts and analyzed industry best practices to create a roadmap for MU to become an “AI forward” institution.
Visit the AI Taskforce site to access the taskforce report, 2024-25 priorities, campus resources for instructors and more.
Visit Academic Technology to learn more about using AI in teaching and learning activities.
The Division of IT evaluates all IT-related products and solutions, including AI-related technologies, under UM policy BPM 12004. Determining the risk level of IT-related free tools and purchases is essential to maintaining an environment that supports university activities safely and securely.
If you are interested in purchasing or using new AI technology or have questions about how AI could assist you in your roles, reach out to request an IT compliance consultation. Our team can assist you by discussing your needs and reviewing potential AI products for university and regulatory compliance.
MU is actively reviewing the role third-party AI tools play at the university, and part of that review involves examining formal contracts and agreements with AI vendors. There are, however, still many opportunities to experiment and innovate using third-party AI tools at the university.
The university’s guidance on third-party AI usage will adapt and change as we engage in broader institutional review and analysis of these tools. MU encourages its community members to use AI responsibly and review the data inputted into AI systems to ensure it meets the current general guidelines.
Guidelines for Secure AI Use (Third-Party Tools)
- Third-party AI tools should only be used with institutional data classified as DCL1 (Data Classification Level 1-Public under the university’s Data Classification System).
- Third-party AI tools like ChatGPT should not be used with sensitive information such as student information regulated by FERPA, human subject research information, health information, HR records, etc.
- AI-generated code should not be used for institutional IT systems and services.
- OpenAI’s usage policies disallow the use of its products for many other specific activities, including but not limited to:
  - Illegal activity
  - Generation of hateful, harassing or violent content
  - Generation of malware
  - Activity that has a high risk of economic harm
  - Fraudulent or deceptive activity
  - Activity that violates people’s privacy
  - Telling someone that they have or do not have a certain health condition, or providing instructions on how to cure or treat a health condition
  - High-risk government decision-making
- Read the third-party AI’s privacy policy to understand what data it collects. Be aware that some tools may collect sensitive information, such as keystrokes, usernames, passwords and geolocation. Use this review to make an informed decision about whether the tool should be used.
AI Services and Roadmap
The Division of IT actively follows and reviews current and new technologies to find opportunities to incorporate them into our technology architecture and learning environment.
Note: Before using these tools, please check with your department for any specific policies or guidelines. To meet BPM 12004, IT Compliance must review and approve each software use case.
| Service | Description | Status | Data that can be used with the service |
|---|---|---|---|
| ChatGPT Enterprise | AI-powered language model for text generation. How to purchase. | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1, 2 and 3. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
| Google Gemini | AI-powered language model for text generation | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1 (public data). Examples: presentations, published research, job postings, press releases |
| Google NotebookLM | AI-powered research and notetaking tool from Google | Approved; can be accessed via Google login with username@umsystem.edu | DCL 1 and 2. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2) |
| Grammarly for Education | AI-powered writing assistance that augments writing and learning. How to purchase. | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1, 2 and 3. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
| Microsoft Teams Premium | Meeting notes and transcriptions. Available to purchase at DoIT Software Sales. More information about Teams Premium. | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1, 2 and 3. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
| Zoom AI Companion | Meeting notes, transcription, summaries and other features. More information about Zoom AI. | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1, 2 and 3. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
| Microsoft M365 Copilot | Generative AI features within Microsoft 365 programs such as Word, Excel, PowerPoint and more. How to purchase. | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1, 2 and 3. Examples: presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
| TeamDynamix Conversational AI | Chatbot feature within the TeamDynamix service management and ticketing platform | Approved when accessed via Single Sign-On and username@umsystem.edu | Pending |
| Apple Intelligence | Generative AI features within iPhone, iPad and Mac devices | Approved when accessed via Single Sign-On and username@umsystem.edu | DCL 1 (public data). Examples: presentations, published research, job postings, press releases |
| Read AI | Meeting transcription service | Not approved | Not approved because the service could pose privacy or security concerns |
| Otter.AI | Meeting transcription service | Not approved* | Not approved because the service could pose privacy or security concerns. *This software may be reviewed for specific use cases; reach out to IT Compliance using the process outlined here. |
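The approval matrix above lends itself to a simple programmatic check. As a purely illustrative sketch (not an official university tool: the service names and DCL values come from the table, but the `MAX_DCL` mapping and the `may_use` function are assumptions for demonstration), a department could encode the maximum permitted data classification level per service:

```python
# Illustrative sketch only -- not an official university tool.
# Maximum Data Classification Level (DCL) permitted per approved AI service,
# taken from the approval matrix above. DCL 1 = public; higher = more sensitive.
MAX_DCL = {
    "ChatGPT Enterprise": 3,
    "Google Gemini": 1,
    "Google NotebookLM": 2,
    "Grammarly for Education": 3,
    "Microsoft Teams Premium": 3,
    "Zoom AI Companion": 3,
    "Microsoft M365 Copilot": 3,
    "Apple Intelligence": 1,
    # TeamDynamix Conversational AI is omitted: its data level is still pending.
}

NOT_APPROVED = {"Read AI", "Otter.AI"}


def may_use(tool: str, data_dcl: int) -> bool:
    """Return True only if the service is approved for data at this DCL."""
    if tool in NOT_APPROVED:
        return False
    # Unknown or pending services default to "no" (MAX_DCL.get returns 0).
    return data_dcl <= MAX_DCL.get(tool, 0)


# Example: Gemini is approved for public data (DCL 1) only.
assert may_use("Google Gemini", 1)
assert not may_use("Google Gemini", 2)   # internal memos are DCL 2
assert not may_use("Otter.AI", 1)        # not approved at all
```

Defaulting unknown services to "not permitted" mirrors the page's guidance: any tool not yet reviewed under BPM 12004 should go through an IT compliance consultation first.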