Artificial Intelligence
Generative Artificial Intelligence (GAI) is a broad term for a type of artificial intelligence (AI) application that uses machine learning algorithms to create new content (text, images, video, music, artwork, synthetic data, etc.) in response to user input, rather than returning output that was explicitly programmed into the application. Generative AI systems are "trained" on large existing corpora of data (often consisting of millions of examples), learning the patterns, rules and statistical structure of the sample data so they can generate new content that is similar in style and characteristics to the original training datasets.
AI Services and our Roadmap
ITS is following and reviewing current and new technologies to find opportunities to incorporate them into our technology architecture and learning environment.
Note: Please check with your department for any specific policies or guidelines before using these tools. To meet BPM 12004, each software use case must be reviewed and approved by ITS.
Service | Description | Status | Data that can be used with the service |
---|---|---|---|
ChatGPT | AI-powered language model for text generation | Approved | DCL 1 – Public data Examples: Presentations, published research, job postings, press releases |
Google Gemini | AI-powered language model for text generation | Approved when accessed through Single Sign-On | DCL 1 – Public data Examples: Presentations, published research, job postings, press releases |
Grammarly for Education | AI-powered writing assistance that augments writing and learning. Learn more and how to purchase. | Approved (Note: Grammarly for Education is approved; personal and premium licenses are not approved.) | DCL 1, 2 and 3 Examples: Presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
Microsoft 365 CoPilot Chat (Free Tool) | AI-powered search engine available under Microsoft M365 | Approved when accessed through Single Sign-On | DCL 1 – Public data Examples: Presentations, published research, job postings, press releases |
Microsoft Teams Premium | Available for meeting notes and transcriptions. Available to purchase through IT Procurement Learn more. | Approved when accessed through Single Sign-On | DCL 1, 2 and 3 Examples: Presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
Zoom AI Companion | Used for meeting notes, transcription, summary and other features. Learn more. | Approved when accessed through Single Sign-On | DCL 1, 2 and 3 Examples: Presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
Microsoft M365 Copilot (Paid Version) | Generative AI features within Microsoft 365 programs such as Word, Excel, PowerPoint and more. Learn more. | Under IT review during pilot testing | DCL 1, 2 and 3 Examples: Presentations, published research, job postings (DCL 1); budgets, salaries, internal memos (DCL 2); FERPA data, personally identifiable information (DCL 3) |
TeamDynamix Conversational AI | Chatbot feature within the TeamDynamix service management and ticketing platform | Under IT review | Pending |
Apple Intelligence | Generative AI features within iPhone, iPad and Mac devices | Under IT review | DCL 1 – Public data Examples: Presentations, published research, job postings, press releases |
NotebookLM | AI-Powered research and note-taking tool by Google. | Approved when accessed through Single Sign-On | DCL 1 – Public data Examples: Presentations, published research, job postings, press releases |
Otter.AI | Meeting transcription service | Not approved* | Not approved because the service could pose privacy or security concerns. *This software may be reviewed for specific use cases. Please reach out to ITS with questions. |
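The approval matrix above can be sketched as a simple lookup. This is an illustrative example only, not an official ITS tool: the dictionary and the `is_allowed` helper are hypothetical names, and the values simply mirror the table.

```python
# Hypothetical sketch of the approval matrix above as a lookup table.
# Each entry maps a service to the highest Data Classification Level (DCL)
# of institutional data approved for use with it, per the table.

MAX_DCL = {
    "ChatGPT": 1,
    "Google Gemini": 1,
    "Grammarly for Education": 3,
    "Microsoft 365 Copilot Chat": 1,
    "Microsoft Teams Premium": 3,
    "Zoom AI Companion": 3,
    "NotebookLM": 1,
    "Otter.AI": 0,  # not approved for institutional data
}

def is_allowed(service: str, dcl: int) -> bool:
    """Return True if data at the given DCL may be used with the service."""
    # Unknown or unapproved services default to "no" (max DCL 0).
    return dcl <= MAX_DCL.get(service, 0)

print(is_allowed("ChatGPT", 1))            # public data with ChatGPT -> True
print(is_allowed("ChatGPT", 3))            # FERPA data with ChatGPT -> False
print(is_allowed("Zoom AI Companion", 3))  # FERPA data with Zoom -> True
```

A check like this only encodes the table as published; the authoritative answer for any tool or use case remains the BPM 12004 review process described below.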
At this time, the university supports responsible experimentation with and use of generative AI (GAI) tools, such as ChatGPT and Google Gemini, but there are important considerations to keep in mind when using these tools, including information security, data privacy, compliance, intellectual property/copyright implications, academic integrity and bias. In particular, student data should NOT be entered into generative AI tools, and we strongly encourage you to not enter your own personal information into such tools.
What can you use GAI tools for?
GAI tools can be used for any needs incorporating public or generic data (DCL 1 under the university's Data Classification System). Examples include:
- Writing software code that uses common routines
- Research on nonsensitive topics
- Queries (without confidential information) to better understand our customers, partners, vendors, etc.
- Writing generic documentation such as job descriptions, strategic plans or other administrative documents
What should you avoid when using GAI tools?
- Do not enter personal, health, student or financial information in AI tools (DCL 2, 3 and 4 under the university's Data Classification System). The technology and vendors may not protect the data or the privacy rights of individuals. Data entered into these tools may be shared with unauthorized third parties.
- Do not reuse your password associated with your university account to sign up for AI accounts.
- Do not share sensitive research information, intellectual property or trade secrets with AI tools. The university may lose its rights to that information, which may be disclosed to unauthorized third parties.
- Visit Missouri Online to learn more about using AI in teaching and learning activities.
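As a habit, review text before pasting it into a third-party AI tool. A minimal pre-paste check along the lines of the guidance above might look like the following sketch; the patterns and the `flag_sensitive` helper are hypothetical examples, not an official ITS filter, and a real review should not rely on pattern matching alone.

```python
import re

# Hypothetical pre-paste check: scan text for patterns that suggest
# sensitive (DCL 2/3) data before it is entered into a third-party AI tool.
SENSITIVE_PATTERNS = {
    "email address": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN-like number": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "student ID-like number": re.compile(r"\b\d{8}\b"),
}

def flag_sensitive(text: str) -> list[str]:
    """Return the names of any sensitive-looking patterns found in text."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

print(flag_sensitive("Draft a press release about our new lab."))
# -> [] (no flags; DCL 1 content)
print(flag_sensitive("Summarize notes for jdoe@umsl.edu, SSN 123-45-6789."))
# -> ['email address', 'SSN-like number']
```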
Like artificial intelligence (AI) itself, many concepts in this field are commonly referred to by their acronyms. Here are a few acronyms and their definitions from the moxielearn site:
- ML (Machine Learning) - A subset of AI that focuses on algorithms that improve through experience and data.
- DL (Deep Learning) - A subset of ML using neural networks with multiple layers to learn from large amounts of data.
- LLM (Large Language Model) - A type of AI model trained on vast amounts of text data to understand and generate human-like text.
- Gen AI (Generative AI) - AI systems that can create new content, such as text, images, or music.
- NLP (Natural Language Processing) - The field of AI focused on the interaction between computers and human language.
You can read the remaining 12 terms on the AI Terms and Acronyms page.
ITS evaluates all IT-related products and solutions, including AI-related technologies, under UM policy BPM 12004. Determining the risk level of IT-related free tools and purchases is essential to maintaining an environment capable of supporting university activities in a safe and secure manner.
All IT and Telecom purchases, irrespective of dollar value and including no-cost items, must be reviewed and approved based on the requirements established by BPM 12004 and must meet all other unique IT/Telecom-related requirements and technology standards in place at each business unit. All software must be approved by IT prior to use, even if the software is free.
The IT approval is meant to ensure the software meets the legal, data and security standards for the organization.
To submit a technology request, see UMSL Technology Purchase Request.
ITS is actively reviewing the role third-party AI tools, like ChatGPT, play at the university, and part of that review involves examining formal contracts and agreements with AI vendors.
The university's guidance on third-party AI usage will adapt and change as we engage in broader institutional review and analysis of these tools. ITS encourages its community members to use AI responsibly and to review the data entered into AI systems to ensure it meets the current general guidelines.
Guidelines for Secure AI Use (Third-Party Tools)
- Third-party AI tools should only be used with institutional data classified as DCL1 (Data Classification Level 1 - Public under the university's Data Classification System).
- Third-party AI tools like ChatGPT should not be used with sensitive information such as student information regulated by FERPA, human subject research information, health information, HR records, etc.
- AI-generated code should not be used for institutional IT systems and services.
- OpenAI's usage policies disallow the use of its products for many specific activities. Examples of these activities include, but are not limited to:
- Illegal activity
- Generation of hateful, harassing or violent content
- Generation of malware
- Activity that has high risk of economic harm
- Fraudulent or deceptive activity
- Activity that violates people's privacy
- Telling someone that they have or do not have a certain health condition, or providing instructions on how to cure or treat a health condition
- High-risk government decision-making
- Read the third-party AI's privacy policy to understand what data it collects. Be aware that some tools may collect sensitive information, such as keystrokes, usernames, passwords and geolocation. Use this review to make an informed decision about whether the tool should be used.