Educational Development Digest: October 2022

Assessments: You have to have the right tool for the right job

Pedagogy in Practice | By Scott Wojtanowski, System Director for Educational Technology and Development

Last month, as many of us eagerly returned to our campuses, part of the higher ed world was abuzz when a federal court in Ohio found that a state university's requirement that a student perform a "room scan" before taking a remotely proctored assessment was an unconstitutional search and seizure. In this month's Did You Know? section, you can review the guidance that was provided to the colleges and universities of Minnesota State.

With this ruling came a plethora of perspectives on how assessments should and should not be conducted online. These debates often go off the rails because the arguments focus on the modality of the course (e.g., online) without considering what knowledge or behaviors an instructor is trying to assess. In other words, the popular saying applies: "you have to use the right tool for the right job."

No one assessment method is better than another; instead, assessment types should be selected so that they align with your student learning goals and objectives. There are many different ways to assess student learning. Generally, we categorize assessments as either traditional or performance-based. Consider the information provided to colleagues who participate in the course Hacking Your Course Assessments.

Traditional Assessments

Traditional assessments typically include selected-response items such as multiple choice, true-false, and matching. With selected-response items, students select a response provided by the instructor or test developer rather than constructing a response in their own words or actions. Selected-response items do not require that students recall the information, only that they recognize the correct answer. Tests built from these items are called objective because the results are not influenced by scorers' judgments or interpretations, and so they are often machine scored. The other traditional assessment type is constructed-response items, such as completion, short answer, or extended response. These assessments ask students to recall information and create an answer, not just recognize whether an answer is correct, so guessing is reduced.
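The "objective" character of selected-response items can be seen in how they are scored: grading is a mechanical comparison against an answer key, with no scorer judgment involved. Below is a minimal sketch (not any real quiz engine; the item numbers and key are hypothetical) of that idea.

```python
# Hypothetical answer key mapping item number -> correct choice.
ANSWER_KEY = {1: "B", 2: "True", 3: "C"}

def score_selected_response(responses):
    """Count items whose response matches the key.

    Any scorer -- human or machine -- applying this key produces the
    same result, which is what makes the test "objective."
    """
    return sum(1 for item, key in ANSWER_KEY.items()
               if responses.get(item) == key)

# A student who misses item 2 earns 2 of 3 points.
print(score_selected_response({1: "B", 2: "False", 3: "C"}))  # 2
```

Constructed-response items, by contrast, cannot be scored this way; a rater must interpret the student's answer, which is why they are typically paired with a rubric.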

Performance Based Assessments

Typically, in performance-based assessments students complete a specific task while instructors observe both the process or procedure (e.g., data collection in an experiment) and the product (e.g., the completed report) (Popham, 2005; Stiggins, 2005). The tasks that students complete in performance-based assessments are not simple, in contrast to traditional assessments like selected-response items. Two related terms, alternative assessment and authentic assessment, are sometimes used instead of performance-based assessment, but they have different meanings (Linn & Miller, 2005).

  • Alternative assessment refers to tasks that are not pencil-and-paper. While many performance-based assessments are not pencil-and-paper tasks, some are (e.g., writing a term paper, essay tests).
  • Authentic assessment describes tasks students do that are similar to those in the "real world." Classroom tasks vary in their level of authenticity (Popham, 2005).

Below is a handy concept map describing assessment.

Concept map (described in text):
  • Assessment for learning is formative; assessment of learning is summative. Both require data to be collected.
  • Informal collection of data is done in an unsystematic manner through observation and questioning.
  • Formal collection of data is conducted in a systematic manner through either a traditional assessment or a performance assessment.
  • Traditional assessments can be either selected response or constructed response.
  • Selected-response assessments ask students to recognize the correct answer; examples include multiple choice, true/false, and matching.
  • Constructed-response assessments ask students to recall the correct answer; examples include completion/short answer and extended response.
  • Performance-based assessments have many examples, including athletic skills, repairing a machine, writing a term paper, using interaction skills to work together, conducting an experiment, and engaging in a debate.
  • Both constructed responses and performance-based assessments are often scored with a scoring rubric, which can be holistic or analytical.
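The distinction between holistic and analytical rubrics at the bottom of the concept map can be made concrete with a small sketch. This is purely illustrative (the criteria and point scales are hypothetical): an analytical rubric rates each criterion separately and combines the parts, while a holistic rubric assigns one overall judgment.

```python
def analytic_score(ratings):
    """Analytical rubric: rate each criterion, then total the parts.

    Instructors can also report each criterion's rating to the student,
    which makes this style useful for formative feedback.
    """
    return sum(ratings.values())

def holistic_score(overall_level):
    """Holistic rubric: one judgment of the whole performance."""
    return overall_level

# Hypothetical term-paper criteria, each rated 0-4.
paper_ratings = {"thesis": 4, "evidence": 3, "organization": 4, "mechanics": 2}
print(analytic_score(paper_ratings))  # 13
print(holistic_score(5))              # e.g., 5 on a 1-6 holistic scale
```

The trade-off in the sketch mirrors the pedagogical one: the analytical version is slower to score but tells students where they lost points; the holistic version is faster but gives only an overall level.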

Learn more about Assessments

Sign up for the NED short course Hacking Your Course Assessments, which starts Monday, October 17. You may also be interested in the My Assessment Plan tool, which is available to download from Opendora.

Submission View: What do students see after they submit a quiz in D2L Brightspace?

Academic Technology Tips | By Scott Wojtanowski, System Director for Educational Technology and Development

Whether it is assessment for learning (formative) or assessment of learning (summative), prompt feedback helps learners correct their understanding. Providing constructive feedback that helps students know what they do and do not understand, and that encourages them to learn from their errors, is fundamental. Effective feedback should be given as soon as possible: the longer the delay between students' work and the feedback, the longer students will hold on to misconceptions. Delays also weaken the connection between students' performance and the feedback, since students can forget what they were thinking during the assessment.

Submission Views in D2L Brightspace

If you’ve used the Quiz tool inside D2L Brightspace, you may have seen the term “submission view.” Submission views are settings that govern what students see after they submit their quiz. For instance, do you want learners to see their final quiz score immediately after submitting? What about the questions they got wrong? How about the answers to those questions? As the instructor of a course, you can edit the default view to show questions answered incorrectly, show questions answered correctly, show all questions without user responses, or show all questions with user responses.

Displaying a detailed submission view to all learners immediately after submitting a quiz would work if all learners take the quiz at the same time. However, if learners are completing the quiz at different times, and the instructor wants to ensure that learners do not share the answer key with other learners, the instructor may choose to create an additional submission view. This additional view only appears to learners when instructors publish the feedback on a quiz attempt. 
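The default-view-plus-additional-view pattern described above can be sketched as a simple data model. To be clear, this is a hypothetical illustration, not the D2L Brightspace API: the field names are invented to show how an additional detailed view can stay hidden until the instructor publishes feedback.

```python
from dataclasses import dataclass

@dataclass
class SubmissionView:
    """Illustrative model of what one submission view reveals."""
    show_score: bool
    show_incorrect_questions: bool
    show_answers: bool
    requires_feedback_published: bool  # additional views wait for publication

# Default view: learners see only their score right after submitting.
default_view = SubmissionView(True, False, False, False)
# Additional view: full detail, gated behind feedback publication.
detailed_view = SubmissionView(True, True, True, True)

def visible_views(views, feedback_published):
    """Return the views a learner can currently see."""
    return [v for v in views
            if feedback_published or not v.requires_feedback_published]

# Before publication only the default view appears; after, both do.
print(len(visible_views([default_view, detailed_view], False)))  # 1
print(len(visible_views([default_view, detailed_view], True)))   # 2
```

Gating the detailed view this way lets an instructor wait until every learner has attempted the quiz before releasing the answer key.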

Learn more

For instructions and more details on configuring submission views, review D2L Brightspace Knowledge Article #1294.

Remote Proctoring

Did You Know? | By Scott Wojtanowski, System Director for Educational Technology and Development

Awareness of remote proctoring was heightened during the pandemic. The Higher Learning Commission provides some direction to colleges and universities regarding remote proctoring via Institutional Practices for Verification of Student Identity and Protection of Student Privacy; however, Minnesota State does not yet have guidance to institutions formalized in an operating instruction of the Board of Trustees Policies and Procedures. During the 2022-2023 academic year, the Minnesota State system office will work within the existing governance structure to develop a remote proctoring operating instruction. In the meantime, the following guidance was issued to the colleges and universities of Minnesota State.


View past editions of the Educational Development Digest.

Visit the NED Events Calendar to view upcoming educational development opportunities. Visit the NED Resource Site for recordings of previous webinars and additional resources.
