Technology
Microsoft 365 Copilot (Pilot Group)
Microsoft Copilot is a suite of AI-powered productivity tools developed by Microsoft, designed to enhance efficiency and creativity across various platforms and use cases. It functions as a conversational, AI-driven assistant that leverages advanced language models, such as GPT-4, to provide contextual assistance, automate routine tasks, and analyze data. There are multiple versions of Copilot tailored to different needs. The versions currently in use at Illinois State University are:
- Microsoft Copilot: Microsoft Copilot serves as a general digital companion/chatbot for personal use. All current students, faculty, and staff have access to this version of Microsoft Copilot.
- Access Microsoft Copilot at: Microsoft Copilot.
- For more information about the Microsoft Copilot chat, please see: Copilot for the Web
- Microsoft 365 Copilot: Microsoft 365 Copilot is designed for business productivity. This version is currently available only to a predefined pilot group at Illinois State University. The core capabilities of Copilot include generating content, summarizing documents and emails, creating presentations, analyzing data in Excel, and taking meeting notes in Teams. It integrates deeply with Microsoft 365 applications like Word, Excel, PowerPoint, Outlook, and Teams, using data from the Microsoft Graph to deliver personalized and intelligent suggestions. Microsoft 365 Copilot may be accessed at the same link shared above, or from within Microsoft 365 applications.
Use of Institutional Data in Microsoft 365 Copilot
When using any AI tool at Illinois State University, it is essential to adhere to applicable laws, policies, and best practices regarding the data accessed by the tool. Whatever your role at the University, it is very likely that you encounter or have access to data that is covered by the Data Classification policy. While Copilot is considered secure, there are several key guidelines to follow when using institutional data.
- Only use ISU-provided AI tools (e.g., Microsoft 365 Copilot) for University work.
- Copilot mirrors your access. It can only see what your account can access in Microsoft 365.
- Do not share data sources with others; they may not be approved for the same access granted to you.
- Ensure all institutional data is protected in accordance with ISU data classification procedures. Misuse of confidential data can result in severe consequences.
- Institutional data includes information created, collected, maintained, transmitted, or recorded by or for the university to conduct university operations.
- Report Access Issues: If Copilot shows you something you believe you shouldn’t have access to, report it to the Information Security Office immediately so it can be corrected.
OneDrive and Copilot
Treat OneDrive as your safety boundary. As a rule of thumb, only put into Copilot what you would store in OneDrive, and work on files that are saved to OneDrive (if AutoSave is on, the file is stored in OneDrive). Copilot respects your existing Microsoft 365 permissions; it cannot see files or mail that you cannot access. Any time you upload a file to Copilot, it is also saved in your work OneDrive account. Highly restricted/confidential data should not reside in OneDrive.
- For more information about OneDrive, please refer to: Microsoft OneDrive
Data you may NOT use in Copilot
Copilot is not appropriate for use with all data types. Do not use the following types of data with Microsoft 365 Copilot at ISU:
- Highly restricted/confidential data of any kind.
- Examples include student or employee identifiers such as Social Security numbers and similar data protected by law, regulation, or contractual agreement.
- For more information regarding data classification at Illinois State University, please refer to: 9.8.1 Data Classification Procedure
- When in doubt, do not paste it into a chat or upload a document—consult the Information Security Office.
Sensitivity Labels and DLP
Illinois State University is in the process of adopting Data Loss Prevention (DLP) practices; while that process is ongoing, you may see new behaviors in existing products that will help ensure that data is not shared inappropriately.
- Expect guardrails: if you try to move or paste sensitive content in ways that violate policy, you may see a warning or a block.
- Labels and DLP travel with your content across OneDrive, SharePoint, Teams, and supported endpoints.
Accountability for Output
Copilot accelerates drafting and analysis, but you remain responsible for the results. Always review for accuracy, bias, confidentiality, and tone. Cite sources where appropriate and never treat generated content as authoritative without verification.
Safe‑Use “Do / Don’t” Review
Do:
- Store files in OneDrive before using in‑app Copilot, adhering to data governance procedures for file storage.
- Apply the correct sensitivity label and verify sharing settings.
- For more information regarding sharing settings in OneDrive, please refer to: Sharing Files in OneDrive
- Verify and edit all outputs before sharing. Data analysis results will vary significantly depending on the source data and prompts; make this clear when sharing results.
Don’t:
- Share sensitive/highly restricted data or anything you should not store in OneDrive.
- Assume Copilot can see your local files or private mail—permissions apply.
- Forward outputs without reading and reviewing for accuracy, bias, confidentiality, and tone.
Copilot Studio and Agents
Copilot Studio, Copilot Studio Lite, and SharePoint Agents are capabilities in Copilot that allow a user to create a custom bot (agent) to handle a specific type of inquiry or to carry out a process. These bots are connected to the data that enables them to function. While you will have access to some Microsoft-provided agents and will be able to create some limited custom bots for your own use, publishing bots for general use is disabled for this pilot.
- You may create personal agents for your own use during the pilot.
- Publishing agents has not been enabled and is subject to governance approval going forward.
- Sharing with unlicensed users consumes metered credits; understanding this cost will be crucial in institutional decision-making about using Copilot Studio.