Challenging Assumptions: Minimal Signs of Academic Dishonesty in Unproctored Online Examinations, Researchers Discover

by Tatsuya Nakamura

A study shows that student performance in unproctored online exams closely mirrors performance on in-person, proctored tests, suggesting that effective cheating was rare and reinforcing the credibility of online assessment. Despite lingering reservations, the consistency of results across academic fields and course levels strengthens the case for continued use of online exams, even as faculty retain measures to deter dishonesty.

When Iowa State University switched to virtual learning in the middle of the spring semester in 2020, Jason Chan, a psychology professor, was apprehensive. Would unmonitored online tests result in widespread cheating?

Contrary to his initial concerns, the test results showed that individual students scored slightly higher online, but their performance remained consistent with their earlier in-person, proctored results. Students who had earned B grades before the pandemic-induced lockdown continued to earn similar grades in the online, unproctored environment, and this pattern held across the full spectrum of student performance.

“The evidence suggests that there was either an absence of cheating or that such activities were insufficient to significantly alter student scores,” Chan states.

To understand the implications on a larger scale, Chan, along with Dahwi Ahn, a doctoral candidate in psychology, examined test scores from nearly 2,000 students enrolled in 18 different courses during the spring semester of 2020. The courses spanned from large, lecture-based classes like Introduction to Statistics, to specialized courses in fields such as engineering and veterinary medicine.

Across diverse academic disciplines, class sizes, course levels, and exam formats (e.g., primarily multiple-choice or short-answer), the results were uniform. Unproctored online tests yielded scores strikingly similar to those of in-person, proctored tests, affirming their value as reliable instruments for student assessment.

The findings have been recently published in the Proceedings of the National Academy of Sciences.

“Before undertaking this study, I harbored reservations about the efficacy and integrity of online, unproctored exams. However, the data has fortified my confidence, and I believe it will do the same for other educators,” Ahn comments.

Both researchers continue to administer exams online, even in the context of in-person classes, citing the flexibility it offers to students with part-time employment or those involved in sports and extracurricular activities. Ahn also conducted her first entirely online course during the summer.

Why was cheating apparently so ineffective in altering test scores?

The researchers hypothesize that students prone to cheating are often those who are already struggling academically and fearful of failing. These students may have neglected lectures, lagged in their studies, or found it challenging to seek assistance. Even with the possibility of online searches during an unproctored exam, a lack of understanding of the subject matter can hinder the identification of correct answers. In their paper, the researchers reference prior studies that compared open-book and closed-book exam scores.

Another deterrent against cheating could be the value that many students place on academic integrity and fairness, according to Chan. Students who have devoted time to study and take pride in their achievements may be less inclined to share their answers with those they perceive as trying to take shortcuts.

Nevertheless, the researchers caution instructors to be vigilant about potential vulnerabilities in online, unproctored test environments. For instance, certain platforms reveal the correct answer immediately after a multiple-choice question is answered, making it simpler for students to disseminate answers collectively.

To counter this and other cheating tactics, instructors are advised to:

  1. Withhold exam answers until the examination period concludes.
  2. Utilize extensive, randomized question banks.
  3. Increase the number of options in multiple-choice questions and obscure the correct answer.
  4. Modify grade thresholds.
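The second and third measures above (randomized question banks and shuffled answer options) can be sketched in a few lines of code. The data structures here are hypothetical, chosen only to illustrate the idea of generating a distinct per-student exam from a larger bank:

```python
import random


def build_exam(question_bank, num_questions, seed=None):
    """Draw a per-student exam from a randomized question bank.

    question_bank: list of dicts with 'prompt' and 'choices' keys
    (a hypothetical structure used here purely for illustration).
    seed: optional per-student seed so an exam can be regenerated
    for grading or review.
    """
    rng = random.Random(seed)
    # Sample without replacement so each student sees a different
    # subset of the bank, limiting the value of shared answers.
    selected = rng.sample(question_bank, num_questions)
    exam = []
    for q in selected:
        choices = list(q["choices"])
        rng.shuffle(choices)  # shuffle answer order per student
        exam.append({"prompt": q["prompt"], "choices": choices})
    return exam
```

With a sufficiently large bank, two students are unlikely to see the same questions in the same order, which blunts the answer-sharing vulnerability described above.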

Jason Chan and Dahwi Ahn acknowledge that the unique circumstances of the spring 2020 semester offered a singular opportunity to examine the effectiveness of online exams. They note limitations such as the unknown impact of stress and other pandemic-related factors on students, faculty, and teaching assistants. It’s also unclear whether the courses sampled normally become easier or harder as the semester advances. To account for these factors, they examined older test data from a subset of the 18 courses during fully in-person semesters and found the grade distribution to be consistent with that of the spring 2020 semester.

When this study was conducted, ChatGPT was not yet available to students. Nonetheless, the researchers recognize that AI writing tools like ChatGPT could transform education and create new challenges for evaluating student performance. Ahn intends to explore this issue in future research.

The study was underwritten by a grant from the National Science Foundation’s Science of Learning and Augmented Intelligence Program.

Reference: “Unsupervised Online Examinations as Effective Tools for Assessing Student Learning” by Jason C. K. Chan and Dahwi Ahn, published on 24 July 2023 in Proceedings of the National Academy of Sciences.
DOI: 10.1073/pnas.2302020120

Frequently Asked Questions (FAQs) about Unproctored online exams

What was the primary finding of the study conducted at Iowa State University?

The primary finding was that student performance in unproctored online exams closely mirrored that of in-person, proctored exams. This suggests a low incidence of effective cheating and bolsters the credibility of online assessment methods.

Who conducted the study and how large was the sample size?

The study was conducted by psychology professor Jason Chan and doctoral candidate Dahwi Ahn. They analyzed test score data from nearly 2,000 students across 18 different courses during the spring semester of 2020.

Were the results consistent across different academic disciplines and course levels?

Yes, the consistency in results was observed across varied academic fields, class sizes, educational stages, and even types of exams (multiple-choice or short-answer), affirming the reliability of online exams.

What precautionary measures are recommended by the researchers to deter cheating in online exams?

The researchers recommend several measures, including withholding exam answers until the test period concludes, utilizing extensive randomized question banks, increasing the number of options in multiple-choice questions, and modifying grade thresholds.

Did the study consider the impact of stress or other COVID-19 related factors on students and faculty?

The study acknowledged these as limitations but attempted to account for them by examining older test data from similar courses during fully in-person semesters. The grade distribution was found to be consistent with that of the spring 2020 semester.

How might AI writing tools like ChatGPT affect future online exams?

The researchers acknowledge that AI writing tools could revolutionize education and pose new challenges for student evaluation. Dahwi Ahn intends to explore this issue in future research.

Where were the study’s findings published?

The research findings were published in the Proceedings of the National Academy of Sciences.

Was the study supported by any grants?

Yes, the study was supported by a grant from the National Science Foundation’s Science of Learning and Augmented Intelligence Program.

More about Unproctored online exams

  • Proceedings of the National Academy of Sciences
  • National Science Foundation’s Science of Learning and Augmented Intelligence Program
  • Iowa State University Department of Psychology
  • Research on Academic Integrity in Online Education
  • Overview of Assessment Methods in Higher Education


8 comments

CarFan October 3, 2023 - 9:48 pm

Interesting stuff. If students can be trusted to take online exams, maybe remote work trust issues will ease up too. Just a thought.

Megan_S October 4, 2023 - 12:33 am

so the old saying ‘cheaters never win’ actually holds up, huh? Interesting study for sure!

ProfLinda October 4, 2023 - 10:33 am

Important findings for educators here. With the move to online learning, we need solid data to back our strategies. Good to know that we’re not facilitating an environment for dishonesty.

FionaM October 4, 2023 - 12:09 pm

Could they extend this study to high school? Would love to know if the findings hold true for younger age groups as well.

CryptoExpert October 4, 2023 - 1:10 pm

Never thought psychology profs would dive into this but kudos. Makes me wonder how this could affect professional certifications that are going online.

JohnDoe October 4, 2023 - 4:38 pm

Wow, this study really shatters some of my assumptions. always thought online tests were a breeding ground for cheaters. Guess I was wrong.

Tech_Guru October 4, 2023 - 6:28 pm

With AI tools like ChatGPT on the rise, this might be a short-lived victory. Dahwi Ahn’s future research on that sounds intriguing.

Jake_the_Snake October 4, 2023 - 7:25 pm

I was skeptical at first but the sample size and multiple disciplines make it pretty credible. Not bad, ISU.


