School surveillance of students via laptops may do more harm than good

  • Written by Nir Kshetri, Professor of Management, University of North Carolina – Greensboro

Ever since the start of the pandemic, more and more public school students have been using laptops, tablets or similar devices issued by their schools.

The percentage of teachers who reported their schools had provided their students with such devices doubled from 43% before the pandemic to 86% during the pandemic[1], a September 2021 report shows.

In one sense, it might be tempting to celebrate how schools are doing more to keep their students digitally connected during the pandemic. The problem is, schools are not just providing kids with computers to keep up with their schoolwork. Instead – in a trend that could easily be described as Orwellian – the vast majority of schools are also using those devices to keep tabs on what students are doing in their personal lives.

Indeed, 80% of teachers and 77% of high school students[2] reported that their schools had installed artificial intelligence-based surveillance software on these devices to monitor students’ online activities and what is stored on them.

This student surveillance is taking place – at taxpayer expense – in cities and school communities throughout the United States.

For instance, in the Minneapolis school district, school officials paid over $355,000[3] to use tools provided by the student surveillance company Gaggle[4] until 2023. Three-quarters of the incidents reported – that is, cases where the system flagged students’ online activity – took place outside school hours[5].

In Baltimore, where the public school system uses the GoGuardian[6] surveillance app, police officers are sent to children’s homes[7] when the system detects students typing keywords related to self-harm.

Safety versus privacy

Vendors claim these tools keep students safe[8] from self-harm or online activities that could lead to trouble. However, privacy groups[9] and news outlets[10] have raised questions about those claims.

Vendors often refuse to reveal[11] how their artificial intelligence programs were trained[12] or what type of data was used to train them.

Privacy advocates fear these tools may harm students[13] by criminalizing mental health problems[14] and deterring free expression[15].

As a researcher who studies[16] privacy[17] and security[18] issues in various settings[19], I know that intrusive surveillance techniques cause emotional and psychological harm[20] to students, disproportionately penalize minority students[21] and weaken online security[22].

Artificial intelligence not intelligent enough

Even the most advanced artificial intelligence[23] lacks the ability[24] to understand human language and context[25]. This is why student surveillance systems generate a lot of false positives[26] instead of catching real problems.

In some cases, these surveillance programs have flagged students for discussing music deemed suspicious and even for talking about the novel[27] “To Kill a Mockingbird.”
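
To see why, consider a stripped-down sketch of keyword-based flagging, written here in Python with a hypothetical word list. Real products use more elaborate models, but without genuine language understanding the failure mode is the same: the program matches words, not meaning.

```python
# A minimal sketch of keyword-based flagging. The word list is
# hypothetical and purely illustrative.
FLAGGED_WORDS = {"kill", "shoot", "weapon"}

def flags(text: str) -> set[str]:
    """Return any flagged words found in the text, ignoring all context."""
    return FLAGGED_WORDS & set(text.lower().split())

# A book report triggers the same alert that a genuine threat would.
print(flags("our class is reading to kill a mockingbird"))  # {'kill'}
```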

Harm to students

When students know they are being monitored, they are less likely[28] to share their true thoughts online and are more careful about what they search for. This can discourage vulnerable groups[29], such as students with mental health issues, from getting needed services.

When students know that their every move, and everything they read and write, is being watched, they are also less likely to develop into adults with a high level of self-confidence[30]. In general, surveillance undermines students’ ability to act independently and use analytical reasoning[31]. It also hinders the development[32] of the skills and mindset needed to exercise their rights.

More adverse impact on minorities

U.S. schools disproportionately discipline[33] minority students. African American students are more than three times as likely to be suspended[34] as their white peers.

After evaluating flagged content, vendors report any concerns to school officials[35], who take disciplinary action on a case-by-case basis. The lack of oversight[36] of schools’ use of these tools could lead to further harm for minority students.

The situation is worsened by the fact that Black and Hispanic students rely more heavily on school devices than their white peers do[37]. That reliance makes minority students more likely to be monitored and exposes them to greater risk of some sort of intervention.

Students of color are more likely to rely on school-issued laptops than their white peers are. Igor Alecsander/E+ via Getty Images[38]

When both minority students and their white peers are monitored, the former are more likely to be penalized, because the training data used to develop artificial intelligence programs often fails to include enough minorities[39]. Such programs are more likely to flag[40] language written and spoken by minority groups[41], both because that language is underrepresented[42] in the datasets used to train them and because of the lack of diversity among people working in the field[43].

Leading AI models are 50% more likely to flag tweets written by African Americans as “offensive”[44] than those written by others. They are 2.2 times more likely to flag tweets written in African American slang.

These tools also affect sexual and gender minorities more adversely. Gaggle has reportedly flagged “gay,” “lesbian” and other LGBTQ-related terms[45] because its system associates them with pornography, even though the terms are often used to describe one’s identity.

Increased security risk

These surveillance systems also increase students’ cybersecurity risks. First, to monitor students’ activities comprehensively, surveillance vendors compel students to install certificates known as root certificates. A root certificate sits at the top of a device’s chain of trust, functioning as a “master certificate”[46] that determines what the entire system treats as secure. The drawback is that these vendor certificates undermine the cybersecurity checks built into these devices[47].
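
The short Python sketch below shows where that trust lives on a typical device: the list of root certificates the system will believe. Any root added to this list, including a monitoring vendor’s, can vouch for any website a student visits.

```python
import ssl

# A minimal sketch: print the root certificates this machine trusts.
# Every TLS connection is ultimately judged against this list, so an
# added vendor root can vouch for any site. (On some Linux systems,
# roots loaded lazily from a directory may not appear here.)
context = ssl.create_default_context()

for ca in context.get_ca_certs():
    print(ca["subject"])  # one trusted root, i.e., one "master certificate"
```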

Gaggle, which scans the digital files of more than 5 million students[49] each year, installs such certificates. This tactic is similar to the approach that authoritarian[50] regimes, such as the Kazakhstani government[51], use to monitor and control their citizens[52], and that cybercriminals use to lure victims to infected websites[53].

Second, surveillance system vendors use insecure systems that hackers can exploit. In March 2021, the computer security company McAfee found several vulnerabilities[54] in student monitoring vendor Netop’s Vision Pro Education software. For instance, Netop did not encrypt communications between teachers and students, leaving them open to unauthorized access[55].

The software was used by over 9,000 schools worldwide to monitor millions of students. The vulnerabilities allowed hackers to gain control of the webcams and microphones in students’ computers[56].
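
For illustration, here is a minimal Python sketch of the kind of safeguard McAfee found missing: wrapping a connection in TLS so the traffic is encrypted in transit and the server’s identity is verified. Here, example.com merely stands in for a vendor’s server; this is not Netop’s actual protocol.

```python
import socket
import ssl

# A minimal sketch of encrypting client-server traffic with TLS.
# example.com is a placeholder, not an actual monitoring server.
context = ssl.create_default_context()  # verifies the server's certificate

with socket.create_connection(("example.com", 443)) as raw:
    with context.wrap_socket(raw, server_hostname="example.com") as tls:
        # Everything sent from here on is encrypted on the wire.
        tls.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        print(tls.recv(200).decode(errors="replace"))
```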

Finally, students’ personal information stored by the vendors is susceptible to breaches[57]. In July 2020, criminals stole the personal data of 444,000 students[58] – including names, email addresses, home addresses, phone numbers and passwords – by hacking the online proctoring service ProctorU. This data was then leaked online.

Schools would do well to look more closely at the harm caused by their surveillance of students and to question whether these tools actually make students safer – or less safe.

References

  1. ^ 86% during the pandemic (cdt.org)
  2. ^ 80% of teachers and 77% of high school students (www.theguardian.com)
  3. ^ paid over $355,000 (www.the74million.org)
  4. ^ student surveillance company Gaggle (www.gaggle.net)
  5. ^ outside school hours (www.the74million.org)
  6. ^ GoGuardian (www.washingtonpost.com)
  7. ^ sent to children’s homes (therealnews.com)
  8. ^ keep students safe (www.eff.org)
  9. ^ privacy groups (www.govtech.com)
  10. ^ news outlets (www.vice.com)
  11. ^ refuse to reveal (www.bloomberg.com)
  12. ^ artificial intelligence programs were trained (www.buzzfeednews.com)
  13. ^ harm students (www.the74million.org)
  14. ^ criminalizing mental health problems (www.govtech.com)
  15. ^ deterring free expression (scholarship.law.unc.edu)
  16. ^ studies (scholar.google.com)
  17. ^ privacy (www.sciencedirect.com)
  18. ^ security (utorontopress.com)
  19. ^ settings (link.springer.com)
  20. ^ emotional and psychological harm (scholarship.law.unc.edu)
  21. ^ disproportionately penalize minority students (www.wired.com)
  22. ^ weaken online security (www.bleepingcomputer.com)
  23. ^ most advanced artificial intelligence (bdtechtalks.com)
  24. ^ lacks the ability (sloanreview.mit.edu)
  25. ^ context (www.pwvconsultants.com)
  26. ^ false positives (www.vice.com)
  27. ^ talking about the novel (www.govtech.com)
  28. ^ less likely (www.bestcolleges.com)
  29. ^ discourage vulnerable groups (www.the74million.org)
  30. ^ less likely to develop into adults with a high level of self-confidence (scholarship.law.unc.edu)
  31. ^ ability to act and use analytical reasoning (eric.ed.gov)
  32. ^ hinders the development (scholarship.law.unc.edu)
  33. ^ disproportionately discipline (www.usatoday.com)
  34. ^ more than three times higher (www.nytimes.com)
  35. ^ vendors report any concerns to school officials (www.lgbtqnation.com)
  36. ^ lack of oversight (www.wired.com)
  37. ^ more on school devices than their white peers do (www.wired.com)
  38. ^ Igor Alecsander/E+ via Getty Images (www.gettyimages.com)
  39. ^ fails to include enough minorities (www.dataversity.net)
  40. ^ are more likely to flag (civic.mit.edu)
  41. ^ written and spoken by such groups (homes.cs.washington.edu)
  42. ^ underrepresentation of languages written and spoken by minorities (www.unite.ai)
  43. ^ the lack of diversity of people working in this field (www.thenationalnews.com)
  44. ^ 50% more likely to flag tweets written by African Americans as “offensive” (www.vox.com)
  45. ^ “gay,” “lesbian” and other LGBTQ-related terms (www.lgbtqnation.com)
  46. ^ functions as a “master certificate” (www.makeuseof.com)
  47. ^ compromise cybersecurity checks that are built into these devices (attack.mitre.org)
  48. ^ You can read us daily by subscribing to our newsletter (theconversation.com)
  49. ^ 5 million students (www.mprnews.org)
  50. ^ to the approach that authoritarian (www.makeuseof.com)
  51. ^ such as the Kazakhstani government (www.zdnet.com)
  52. ^ monitor and control their citizens (www.eff.org)
  53. ^ lure victims to infected websites (www.trendmicro.de)
  54. ^ several vulnerabilities (www.mcafee.com)
  55. ^ encrypt communications between teachers and students to block unauthorized access (www.mcafee.com)
  56. ^ allowed hackers to gain control over webcams and microphones in students’ computers (www.fastcompany.com)
  57. ^ susceptible to breaches (cdt.org)
  58. ^ 444,000 students’ personal data (www.bleepingcomputer.com)


Read more https://theconversation.com/school-surveillance-of-students-via-laptops-may-do-more-harm-than-good-170983

Metropolitan republishes selected articles from The Conversation USA with permission
