Educational Development Digest: January 2024

Addressing AI in Syllabi

Pedagogy in Practice | By Scott Wojtanowski, Minnesota State System Director for Educational Technology and Development

Faculty and students alike are exploring the perils and promises of artificial intelligence (AI). As you adapt your pedagogical practices so that AI can support student learning, consider making clear in your syllabus the degree to which students are allowed to use AI tools to complete coursework.

Listed below are statements that faculty members are welcome to use in their syllabi, either as a starting point or in their entirety. The descriptions and practices are adapted from colleagues at the University of Delaware, who shared their ideas under a Creative Commons license.

Considerations for Using Artificial Intelligence in this Course

Students, please note that large language models still tend to make up incorrect facts and fake citations, other artificial intelligence models tend to produce inaccurate outputs, and image generation models can occasionally produce highly offensive content. You will be responsible for any inaccurate, biased, offensive, or otherwise unethical content you submit, regardless of whether it originally comes from you or an AI tool. This syllabus lists the course assignments and the degree to which you may use artificial intelligence to complete each one.

Practice 1: Prohibited

All work submitted during this course must be your own. Contributions from anyone or anything else, including AI sources, must be properly quoted and cited every time they are used. Failure to do so violates the institution’s academic misconduct/integrity policy.  Any allegations of academic misconduct will be adjudicated using the process outlined in the institution’s student handbook.

Practice 2: Prescribed

There are situations and contexts within this course where students will be permitted to use AI tools to explore how they can be used to complete coursework. Any student work submitted using AI tools should clearly indicate which parts are the student’s own work and which parts were generated by the AI. Any allegations of academic misconduct will be adjudicated using the process outlined in the institution’s student handbook.

Outside of those permitted instances, students are discouraged from using AI tools to generate content (text, video, audio, images) that will appear in any student work (assignments, activities, responses, etc.) used to assess student learning. Any allegations of academic misconduct will be adjudicated using the process outlined in the institution’s student handbook.

Practice 3: Open

Within this class, you are welcome to use artificial intelligence tools in a totally unrestricted fashion, for any purpose. Any student work submitted using AI tools should clearly indicate which parts are the student’s own work and which parts were generated by the AI.


New AI Tools in Zoom

Academic Technology Tips | By Brock Behling, Minnesota State Program Director for Instructional Technology

The Minnesota State Media Management and Web Conferencing Committee evaluated several new AI tools and features included within our Educational Zoom account. They recommended that Meeting Summary, Meeting Coach, and Smart Recording be made available as opt-in features for our educational accounts.

Zoom’s new tools can help you be more productive in your meetings and recordings. These features must be enabled in your educational Zoom account’s profile settings before they are available for use.

Hosts who decide to use these tools are responsible for handling their specific meeting content appropriately. The information the AI tools can access is listed in Zoom’s article discussing how data is handled. Hosts should ensure their particular use case aligns with existing board policies on intellectual property, copyright, acceptable use, data security classification, and information security controls.

Please be sure not to record or store information that includes Personally Identifiable Information (PII). PII includes social security numbers (SSNs), passport numbers, driver’s license numbers, taxpayer identification numbers, financial account numbers, credit card numbers, and personal contact information.
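If you export transcripts or summaries, a quick automated scan can help catch obvious PII before a file is shared or stored. The short Python sketch below is illustrative only: the file name, the scan_for_pii helper, and the regular expressions are assumptions of ours, they only catch well-formatted patterns, and they are no substitute for a careful manual review.

```python
import re

# Illustrative patterns only: these catch common, well-formatted identifiers
# and will miss many variants, so treat this as a first pass, not a filter.
PII_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "phone number": re.compile(r"\b\d{3}[ .-]\d{3}[ .-]\d{4}\b"),
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_for_pii(text):
    """Return (label, match) pairs for anything that looks like PII."""
    hits = []
    for label, pattern in PII_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((label, match))
    return hits

# Hypothetical file name; point this at an exported transcript or summary.
with open("meeting_transcript.txt", encoding="utf-8") as f:
    findings = scan_for_pii(f.read())

if findings:
    print("Possible PII found; review before sharing or storing:")
    for label, match in findings:
        print(f"  {label}: {match}")
else:
    print("No obvious PII patterns detected (manual review is still advised).")
```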

Meeting Summary

Meeting Summary in Zoom uses AI to generate a summary of the meeting audio that can be edited and shared after the meeting. Once the feature is enabled, select the start or stop summary button in Zoom to begin or end the meeting summary. Meeting participants will see a notification indicating that the tool has been enabled.

Screenshot showing the notification participants see when Meeting Summary with AI Companion is enabled.

After the meeting has concluded, a meeting summary report is emailed to the meeting host. It is recommended to keep the Share summary with setting set to Only me (meeting host) for additional control over the content.

Screenshot of the Zoom settings option to share summary only with the meeting host.

The meeting summary includes the main discussion topics, organized in the order in which they were presented. Meeting summaries are also available on the web under the Meeting Summary with AI Companion section once the feature is enabled.

Screenshot of the Meeting Summaries view on Zoom Desktop

Meeting Coach

Meeting Coach provides analytics for informational purposes only. The AI model used to classify the metrics may contain inaccuracies, and the results should not be used for decisions associated with employment or grade performance.

One metric is the host’s talk-to-listen ratio. This can help identify the total time the host was speaking and may provide insights into how well the host engaged participants or how complex the presented information was.

Another metric is talking speed, which shows the average number of words spoken per minute by the host. Research suggests that listenability increases when speakers take more frequent and longer pauses. The recommended range is 110 to 160 words per minute.

The tool also shows the average number of filler words used per minute by the host. Frequent use of fillers such as “ah,” “um,” and “hmm,” as well as not using any fillers at all, can diminish a speaker’s credibility. The recommended range is 3 to 6 filler words per minute.

Two other metrics are the host’s longest speech segment, for which a maximum length of 2.5 minutes is recommended to increase interactivity and promote engagement, and patience, which measures how long the host waits for participants to respond. The recommended range for patience is between 0.5 and 1.5 seconds.
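To make these measures concrete, here is a small Python sketch that computes rough equivalents of the talking-speed and filler-word metrics from a plain-text transcript. This is not Zoom’s implementation: the filler-word list, the transcript file name, and the speaking-time figure are assumptions, and the recommended ranges simply restate the guidance above.

```python
# A rough, illustrative calculation of two Meeting Coach-style metrics from a
# plain-text transcript. This is not Zoom's model; the filler-word list below
# is an assumption, and the ranges mirror the recommendations in this article.
FILLER_WORDS = {"ah", "um", "hmm", "uh", "er"}

def speaking_metrics(transcript, minutes_speaking):
    """Compute words per minute and filler words per minute for the host."""
    words = [w.strip(",.?!").lower() for w in transcript.split()]
    filler_count = sum(1 for w in words if w in FILLER_WORDS)
    return {
        "words_per_minute": len(words) / minutes_speaking,
        "fillers_per_minute": filler_count / minutes_speaking,
    }

# Hypothetical transcript file and speaking time for the host.
with open("host_transcript.txt", encoding="utf-8") as f:
    metrics = speaking_metrics(f.read(), minutes_speaking=42.0)

# Compare against the recommended ranges noted above.
pace_ok = 110 <= metrics["words_per_minute"] <= 160
fillers_ok = 3 <= metrics["fillers_per_minute"] <= 6
print(metrics)
print("Pace within recommended range:", pace_ok)
print("Filler rate within recommended range:", fillers_ok)
```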

Smart Recording

Smart Recording for Zoom cloud recordings will produce a transcript, a general meeting summary with AI-determined highlights of the meeting, chapters with individual summaries, and potential next steps.

Once enabled, the process for cloud recording is the same as before: select the Record button in the meeting controls, then select the Record to the Cloud option. Meeting participants will see a notification informing them that the Smart Recording AI tool may be used on the recording.

Screenshot showing the user what appears when Meeting Summary and Smart Recording are enabled
Screenshot of the message to participants letting them know the meeting is being recorded.

These AI tools should be used responsibly. Their output requires careful, thorough review to ensure it is accurate and conveys what you intended.


Microsoft’s Copilot – an alternative to OpenAI’s ChatGPT

Did You Know? | By Scott Wojtanowski, Minnesota State System Director for Educational Technology and Development

You’ve probably heard about OpenAI’s free version of ChatGPT; perhaps you’ve used it yourself. As with many online services today, by way of “clickwrap,” you agree to OpenAI’s terms and conditions when you create an account to use their services. Like other “free” services, you – the end user – end up being the product. For instance, according to their privacy policy, by using OpenAI’s free ChatGPT service you are allowing them to use your contributions to make their products better (e.g., train/improve their models).

If you want to throw caution to the wind in your personal life (not advisable), we’d say “to each their own” or “you do you, boo.” However, many folks across Minnesota State want to be sure their data is protected and want assurance that the terms and conditions of the tools and services they use comply with Minnesota State Board of Trustees Policies and Procedures.

One such tool we’d encourage you to explore, or assign students to use, is Microsoft’s Copilot (formerly Bing Chat Enterprise), available at https://copilot.microsoft.com/.

When you log in to the web version of Copilot with your Minnesota State credentials (StarID@minnstate.edu or StarID@go.minnstate.edu), Microsoft is held to the terms and conditions for similar services provided through Microsoft 365. After signing in, you should be alerted that your data is protected.

A green oval with an icon of a shield and checkmark appears next to the word Protected. A callout box displays “Your personal and company data are protected in this chat.”

Both ChatGPT and Copilot use “generative pre-trained transformers,” a type of large language model. At the time of this writing, when marked as protected, Copilot will not save your chat history, while the free version of ChatGPT will. Additionally, Copilot allows users to generate images using the DALL-E 3 model; the free version of ChatGPT does not.


Contact

Educational Development and Technology, Minnesota State.

View past editions of the Educational Development Digest.

Visit the NED Events Calendar to view upcoming educational development opportunities. Visit the NED Resource Site for recordings of previous webinars and additional resources.
