Strategy and Innovation

Artificial Intelligence

Generative artificial intelligence (GAI) is a broad term for artificial intelligence (AI) applications that use machine learning algorithms to create new content (text, images, video, music, artwork, synthetic data, etc.) in response to user input, rather than returning output that was explicitly programmed into the application. Generative AI systems are “trained” on large existing corpora of data (often millions of examples), using complex algorithms to learn the patterns, rules and statistical structures of the sample data so that they can generate new content similar in style and characteristics to the original training datasets.
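To make the “learn patterns, then generate similar content” idea concrete, here is a deliberately minimal sketch: a toy word-level bigram model that counts which words follow which in a sample text, then samples new text with the same statistical tendencies. This illustrates the principle only; production systems such as ChatGPT use vastly larger corpora and neural networks, and nothing here reflects any university tool.

```python
import random
from collections import defaultdict

def train(corpus: str) -> dict:
    """'Training': record which words tend to follow which in the sample."""
    words = corpus.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model: dict, seed: str, length: int = 12) -> str:
    """Generate new text that mimics the training data's word patterns."""
    word, output = seed, [seed]
    for _ in range(length):
        followers = model.get(word)
        if not followers:
            break  # no observed continuation; stop generating
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

sample = "the model learns patterns and the model generates new text from patterns"
print(generate(train(sample), "the"))
```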

GenAI tools will transform how work gets done. Tools such as ChatGPT, Gemini and GitHub Copilot will have a profound impact on administrative, academic and research work going forward. Many have asked whether there are policies, procedures or tools available for use. Although the University of Missouri does not yet have approved tools or a formal governance structure in place, the Offices of Finance and Human Resources are fully engaged with IT and System/Campus leadership to establish these and to advise on how AI can and should be used across campus.

UNESCO has identified GenAI, and artificial intelligence more broadly, as having the potential to “address some of the biggest challenges in education today, innovate teaching and learning practices, and accelerate progress,” while also calling for a human-centered approach that responds to inequalities.

AI Services and our Roadmap

The Division of IT is actively following and reviewing current and new technologies to find opportunities to incorporate them into our technology architecture and learning environment.

| Service | Description | Status | Data that can be used with the service |
| --- | --- | --- | --- |
| Microsoft Teams Premium | Available for meeting notes and transcriptions. Available for purchase at DoIT Software Sales. Learn more. | Approved | Up to DCL 3 – Restricted |
| Zoom AI Companion | Used for meeting notes, transcription, summary and other features. Learn more. | Approved | Up to DCL 3 – Restricted |
| Microsoft Bing Copilot | AI-powered search engine available under Microsoft M365 | Approved | DCL 1 – Public data |
| Microsoft M365 Copilot | Generative AI features within Microsoft 365 programs such as Word, Excel, PowerPoint and more | Under IT review | Up to DCL 3 – Restricted |
| TeamDynamix Conversational AI | Chatbot feature within the TeamDynamix service management and ticketing platform | Under IT review | Pending |
| Apple Intelligence | Generative AI features within iPhone, iPad and Mac devices | Under IT review | DCL 1 – Public data |
| Read AI | Meeting transcription service | Not approved | Not approved |
| Otter.AI | Meeting transcription service | Not approved | Not approved |

At this time, the university supports responsible experimentation with and use of GAI tools, such as ChatGPT and Google Gemini, but there are important considerations to keep in mind when using these tools, including information security, data privacy, compliance, intellectual property/copyright implications, academic integrity and bias. In particular, student data should NOT be entered into generative AI tools, and we strongly encourage you not to enter your own personal information into such tools.

What can you use GAI tools for?

GAI tools can be used for any task involving only public or generic data (DCL1, under the university’s Data Classification System). Examples include:

  • Writing software code that uses common routines (see the sketch after this list)
  • Research on nonsensitive topics
  • Queries (without confidential information) to better understand our customers, partners, vendors, etc.
  • Writing generic documentation such as job descriptions, strategic plans or other administrative documents
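As an illustration of the first item above, the sketch below shows the kind of generic “common routine” (here, splitting a sequence into fixed-size chunks) that a GAI tool can draft from public knowledge alone. The function name and details are hypothetical examples, not university code, and any AI-drafted code should be reviewed before use (and, per the guidelines later on this page, not used for institutional IT systems and services).

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def chunked(items: Iterable[T], size: int) -> Iterator[List[T]]:
    """Yield successive fixed-size chunks from an iterable.

    A generic utility routine involving no university data (DCL1 territory),
    the kind of task well suited to a GAI coding assistant.
    """
    if size < 1:
        raise ValueError("size must be at least 1")
    chunk: List[T] = []
    for item in items:
        chunk.append(item)
        if len(chunk) == size:
            yield chunk
            chunk = []
    if chunk:  # emit any leftover items as a final, shorter chunk
        yield chunk

print(list(chunked(range(7), 3)))  # [[0, 1, 2], [3, 4, 5], [6]]
```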

What should you avoid when using GAI tools?

  • Do not enter personal, health, student or financial information in AI tools (DCL 2, 3 and 4 under the university’s Data Classification System). The technology and vendors may not protect the data or the privacy rights of individuals. Data entered into these tools may be shared with unauthorized third parties. (A minimal screening sketch follows this list.)
  • Do not reuse your password associated with your university account to sign up for AI accounts.
  • Do not share sensitive research information, intellectual property or trade secrets with AI tools. The university may lose its rights to that information, which may be disclosed to unauthorized third parties.
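One practical habit that supports these rules is screening a prompt locally before pasting it into a third-party AI tool. The sketch below is a hypothetical illustration, not a university-provided or vendor tool: the patterns (SSN-like, email-like and card-like strings) are examples only, catch only obvious cases, and are no substitute for the Data Classification System or human review.

```python
import re

# Hypothetical pre-submission screen with illustrative patterns only.
# Matching a few regexes is NOT a reliable test for DCL 2-4 data;
# treat any hit as a hard stop and any miss with continued caution.
SENSITIVE_PATTERNS = {
    "SSN-like number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "card-like number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def screen_prompt(text: str) -> list:
    """Return the names of any sensitive-looking patterns found in text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

prompt = "Summarize this: contact jdoe@example.edu about invoice 42."
hits = screen_prompt(prompt)
if hits:
    print("Do not submit; possible sensitive data:", ", ".join(hits))
else:
    print("No obvious sensitive patterns found; still review manually.")
```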

The MU Task Force on Artificial Intelligence and the Learning Environment gathered input from across campus, consulted with AI experts, and analyzed industry best practices to create a roadmap for MU to become an “AI forward” institution.

Visit the AI Taskforce site to access the taskforce report, 2024-25 priorities, campus resources for instructors and more.

Visit Missouri Online to learn more about using AI in teaching and learning activities.

More AI guidelines and considerations

The Division of IT evaluates all IT-related products and solutions, including AI-related technologies, under UM policy BPM 12004. Determining the risk level of IT-related free tools and purchases is essential to maintaining an environment capable of supporting university activities in a safe and secure manner.

If you are interested in purchasing or using a new technology, start by talking to your IT professional. Your IT pro will assist you in submitting your product for review to meet university and regulatory compliance needs. Please note that software incorporating AI may take longer to assess, to ensure we comply with security, privacy and data requirements.

More about IT compliance guidelines

MU is actively reviewing the role third-party AI tools, like ChatGPT, play at the university, and part of that review involves examining formal contracts and agreements with AI vendors. Currently, MU does not have a contract or agreement with any AI provider, which means that standardized MU security and privacy provisions are not present for this technology. There are, however, still many opportunities to experiment and innovate using third-party AI tools at the university.

The university’s guidance on third-party AI usage will adapt and change as we engage in broader institutional review and analysis of these tools. MU encourages its community members to use AI responsibly and to review the data entered into AI systems to ensure it meets the current general guidelines.

Guidelines for Secure AI Use (Third-Party Tools)

  1. Third-party AI tools should only be used with institutional data classified as DCL1 (Data Classification Level 1 – Public under the university’s Data Classification System).
  2. Third-party AI tools like ChatGPT should not be used with sensitive information such as student information regulated by FERPA, human subject research information, health information, HR records, etc.
  3. AI-generated code should not be used for institutional IT systems and services.
  4. OpenAI’s usage policies disallow the use of its products for many other specific activities. Examples of these activities include, but are not limited to:
    • Illegal activity
    • Generation of hateful, harassing or violent content
    • Generation of malware
    • Activity that has high risk of economic harm
    • Fraudulent or deceptive activity
    • Activity that violates people’s privacy
    • Telling someone that they have or do not have a certain health condition, or providing instructions on how to cure or treat a health condition
    • High-risk government decision-making