The Importance of Inclusive Data and Diverse Perspectives with AI
By Brock Behling, Program Director for Instructional Technology, Minnesota State
In an era where data-driven decisions and artificial intelligence (AI) models are being woven into every aspect of society, it is crucial that AI technologies be developed, deployed, tested, and taught responsibly so that they remain fully inclusive.
Some AI models were developed to exclude outlying data in order to optimize final outputs. These models are beginning to amplify the effect of that programmed exclusion, a kind of digital eugenics, by removing or systematically devaluing contributions that come from outlying data.
Jutta Treviranus, an artificial intelligence ethics expert, has described how many AI models favor the statistical average, which can make existing social inequalities worse.
We must prioritize the inclusion of marginalized voices in our data to foster innovation and critical thinking as we address a broader spectrum of needs beyond the typical patterns of the majority.
Differentiated instruction and universal design for learning are strategies that build flexibility into the learning journey. Similarly, AI needs to be deployed strategically to empower marginalized individuals and preserve their creative voices rather than devaluing unique contributions. AI solutions often recommend conformance to a programmatically optimized output, which can exclude varied perspectives and push users back toward pre-defined standards that are not always inclusive.
Fostering rich and equitable learning opportunities requires that any evaluation process looking for an optimized output critically assess and attribute value to diverse perspectives. Including and valuing diverse perspectives is especially important when AI models are being trained and tested. Transparency in the process, and accountability from key stakeholders throughout the product lifecycle, can build trust in a tool's outputs by continuously addressing the observed or implicit biases that influence it.
Picture Smart AI from Freedom Scientific is one example: the developers of the popular JAWS screen reader brought in multiple models to mitigate the negative effects of AI hallucinations and biases that arise from machine inference on any single model’s training data and logic. They have been refining this companion tool with feedback from early adopters since it was first introduced in 2019, and they have publicly shared some of the challenges they have experienced with the tool’s AI outputs.
By teaching AI the importance of embracing the unique diversity found at the edges of our datasets, we can foster flexibility in our responses to societal needs and reduce the waste that occurs through mass conformance. When systems are optimized solely for the general population, they often fail to account for the needs of marginalized communities. However, when we prompt AI to value and incorporate diversity, it does better.
In education, we can teach critical thinking and the ethical use of AI, following guidance from resources like the White House Blueprint for an AI Bill of Rights or the European Union’s comprehensive Artificial Intelligence Act. These guidance documents set expectations for models trained on specific data sets under human supervision. They also reiterate that individuals with disabilities and other marginalized communities need to be involved in training discussions, so that the biases programmed and trained into existing AI systems and societal structures do not perpetuate stigma through discriminatory practices and ableist outputs.
The University of Washington showed how generative AI amplifies real-world biases when assessing applications; when prompted not to be ableist in its ranked decisions, it returned more equitable results. Being intentional with prompts can improve AI outputs, building guardrails that protect against extremes and surface a deeper commonality, leading to innovative solutions that address the needs of everyone.
A more effective, innovative, and equitable use of AI can be achieved by constantly assessing whether the tool supports and values the full spectrum of human diversity, including diverse perspectives and lived experiences, with equitable weight given to these unique inputs, so that its outcomes match the needs of all. Without monitoring programmatic outputs, marginalized groups may begin to lose their voices and representation. By being intentional and aware of the inherent bias in some of these automations, we can do our part in creating a more just and adaptable society that includes, rather than removes, the creative nuances and disruptors that have been integral to past societal advances.
For additional reference, Minnesota State has a guidance document on AI, featured in the 2024 EDUCAUSE Horizon Report, that includes policy intersections, considerations, and recommendations for acceptable use of generative AI services.
The RISE Act goes into effect in January – use Copilot to ensure plain language
By Megan Babel, ASA Communications Coordinator, Minnesota State
The Minnesota Respond, Innovate, Succeed, and Empower (RISE) Act will simplify the process for students with disabilities to obtain accommodations at colleges and universities. Currently, the accommodations students receive depend on where they are enrolled since policies and required documentation are not the same across schools.
The Minnesota RISE Act will give students more information about what documentation they need and requires all resources and information to be in plain language and accessible formats. The bill goes into effect January 1, 2025. We encourage everyone to read more about the RISE Act.
One way we can all help is to ensure our assignments, syllabi, emails, and policy information and expectations are written in plain language. According to Minnesota IT Services, “plain language is communication that all readers can understand the first time they read it and know what they need to do next.” By avoiding jargon, writing in short sentences, presenting easy-to-find information, and clearly stating directions and deadlines, we can make our communications more useful to readers.
Consider Copilot
Consider using Microsoft Copilot to translate communications into plain language. Ensure you are using Copilot with commercial data protection by accessing the secure version. This requires being logged in with a Minnesota State account to the service.

- Navigate to https://copilot.microsoft.com/
- Select Sign in
- Log in with your StarID credentials
- Follow multi-factor authentication steps, if prompted
- You will see a commercial data protection seal in the top right corner.
Why Copilot?
Copilot is the preferred generative artificial intelligence (AI) platform for Minnesota State employees because, when employees and students are logged into Microsoft with their StarID, our Microsoft contract provides some data protection. When you are logged in, Microsoft does not retain any of the data that is entered, such as prompts. Microsoft has no eyes-on access, and the chat data isn’t used to train the underlying large language models.
Employees must use caution and discretion when entering personally identifiable private data into Copilot. De-identifying data and using aliases in spreadsheets are two options for leveraging Copilot while maintaining data privacy.
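As one illustration of the de-identification approach mentioned above, the short Python sketch below replaces a spreadsheet's name column with stable aliases before the content is shared with an AI service, keeping the alias map locally so results can be re-identified afterward. The column names and "Person N" alias format are assumptions for the example, not Minnesota State guidance.

```python
import csv
import io

def deidentify(csv_text, id_column):
    """Replace values in id_column with stable aliases (Person 1, Person 2, ...)
    so the real names never reach an external AI service."""
    reader = csv.DictReader(io.StringIO(csv_text))
    aliases = {}          # real name -> alias, kept locally for re-identification
    rows = []
    for row in reader:
        name = row[id_column]
        if name not in aliases:
            aliases[name] = f"Person {len(aliases) + 1}"
        row[id_column] = aliases[name]
        rows.append(row)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue(), aliases

# Hypothetical spreadsheet data with a "Student" name column.
data = "Student,Grade\nAlice Smith,B+\nBob Jones,A\nAlice Smith,A-\n"
safe_csv, alias_map = deidentify(data, "Student")
# safe_csv can now be pasted into Copilot; alias_map stays on your machine.
```

The same idea applies in Excel itself: paste an aliased copy of the sheet into Copilot and keep the original, identified version private.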
Use some or all of the following prompts to help get started:
- Review the following passage and modify it to ensure plain language is used.
- Avoid jargon, write in short sentences, and be easy to follow.
- My audience is _______.
- I want my audience to know (how to) _____________.
- Add short, informative headings.
What other prompts would you use? Join the conversation on the Generative AI Channel on the NED Team.
Always proofread responses. While generative AI services can be helpful, they can produce inaccurate, misleading, or nonsensical content. Find more Generative AI Guidance from Minnesota State.
Removing unnecessarily confusing steps from the accommodations process will “make securing accommodations on campus less discouraging, more welcoming, more supportive and consistent, and less burdensome,” says Minnesota Representative Jessica Hanson.
Teaching with AI Workshops
By Elizabeth Harsma, Program Director for Technology Integrated Learning, Minnesota State
Teaching with AI Workshop
On November 8, the Network for Educational Development (NED) will hold its first Teaching with Artificial Intelligence (AI) workshop at Fond du Lac Tribal and Community College. Thank you to the Fond du Lac campus for hosting!
Upcoming Teaching with AI Workshop Series
The next Teaching with AI Workshop will be a four-part series on Zoom in February 2025. The workshops will be held on Thursdays from 10 to 11:15 a.m. Each workshop is limited to 30 participants. Those looking to join need to register for each part individually.
Workshop Description
This hands-on introductory workshop gives you time and space to apply AI in your teaching. You will explore generative AI tools and redesign a learning activity or assessment to support AI literacy. You will also learn how to incorporate or resist generative AI tools and apply equitable teaching practices. At the end of the workshop, you will receive additional resources to explore these topics further on your own.
Whether you want to maintain your curriculum, explore changes, or transform your classrooms with AI, we hope you join us for this collaborative and hands-on learning experience.
Future In-Person Workshops
Stay tuned for more in-person Teaching with AI workshops from the Network for Educational Development in March and May 2025.
Reference/Attribution
This text was human-written with support from Microsoft Copilot. Citation:
Copilot, response to “Based on the definition and tips for plain language I shared, please rewrite my draft in plain language,” Microsoft, October 21, 2024, Copilot (microsoft.com). AI-generated writing was edited by humans before inclusion.
Network for Educational Development
View past editions of the Educational Development Digest.
Visit the NED Events Calendar to view upcoming educational development opportunities. Visit the NED Resource Site for recordings of previous webinars and additional resources.
