Can Facebook’s smart glasses be smart about security and privacy?

  • Written by Apu Kapadia, Professor of Computer Science, Indiana University

Facebook’s recently announced Ray-Ban Stories glasses[1], which have two cameras and three microphones built in, are in the news again.

Facebook has kicked off a worldwide project[2] dubbed Ego4D[3] to research new uses for smart glasses.

Ray-Ban Stories glasses capture audio and video so wearers can record their experiences and interactions. The research project aims to add augmented reality features to the glasses, potentially including facial recognition and other artificial intelligence technologies that could provide wearers with a wealth of information, including the ability to get answers to questions like “Where did I leave my keys?”

Several other technology companies like Google[4], Microsoft[5], Snap[6], Vuzix[7] and Lenovo[8] have also been experimenting with versions of augmented or mixed reality glasses. Augmented reality glasses can display useful information within the lenses, providing an electronically enhanced view of the world. For example, smart glasses could draw a line over the road to show you the next turn or let you see a restaurant’s Yelp rating as you look at its sign.

However, some of the information that augmented reality glasses give their users could include identifying people in the glasses’ field of view[9] and displaying personal information about them. It was not too long ago that Google introduced Google Glass, only to face a public backlash[10] for simply recording people. Compared to being recorded by smartphones in public, being recorded by smart glasses feels to people like a greater invasion of privacy[11].

As a researcher who studies computer security and privacy[12], I believe it’s important for technology companies to proceed with caution and consider the security and privacy risks of augmented reality.

Smartphones vs. smart glasses

Even though people are now used to being photographed in public, they typically expect the photographer to raise a smartphone to compose the shot. Augmented reality glasses fundamentally disrupt this sense of normalcy. The public setting may be the same, but the scale and manner of recording have changed.

Facebook’s Ray-Ban Stories glasses capture photos and video and play audio, but the company has much bigger plans for smart glasses, including AI that can interpret what the wearer is seeing. Courtesy Facebook[13]

Such deviations from the norm have long been recognized by researchers[14] as a violation of privacy. My group’s research has found that people in the vicinity of nontraditional cameras want a more tangible sense of when their privacy is being compromised[15] because they find it difficult to know whether they are being recorded.

Absent the familiar physical gestures of taking a photo, devices need better ways to convey when a camera or microphone is recording. Facebook has already been warned by the European Union[16] that the LED indicating a pair of Ray-Ban Stories is recording is too small.

In the longer term, however, people might become accustomed to smart glasses as the new normal. Our research found that although young adults worry about others recording their embarrassing moments on smartphones, they have adjusted[17] to the pervasive presence of cameras.

Smart glasses as a memory aid

An important application of smart glasses is as a memory aid. If you could record or “lifelog” your entire day from a first-person point of view, you could simply rewind or scroll through the video at will. You could examine the video to see where you left your keys, or you could replay a conversation to recall a friend’s movie recommendation.

Our research studied volunteers who wore lifelogging cameras for several days. We uncovered several privacy concerns – this time, for the camera wearer[18]. Considering who, or what algorithms, might have access to the camera footage, people may worry about the detailed portrait it paints of them.

Who you meet, what you eat, what you watch and what your living room really looks like without guests are all recorded. We found that people were especially concerned about the places being recorded[19], as well as their computer and phone screens, which formed a large fraction of their lifelogging history.

Popular media already has its take on what can go horribly wrong with such memory aids. “The Entire History of You[20]” episode of the TV series “Black Mirror” shows how even the most casual arguments can lead to people digging through lifelogs for evidence of who said exactly what and when. In such a world, it is difficult to just move on. It’s a lesson in the importance of forgetting.

Psychologists have pointed to the importance of forgetting[21] as a natural human coping mechanism to move past traumatic experiences. Maybe AI algorithms can be put to good use identifying digital memories to delete. For example, our research has devised AI-based algorithms to detect sensitive places[22] like bathrooms and computer and phone screens[23], which were high on the worry list in our lifelogging study[24]. Once detected, footage can be selectively deleted from a person’s digital memories.

X-ray specs of the digital self?

However, smart glasses have the potential to do more than simply record video. It’s important to prepare for the possibility of a world in which smart glasses use facial recognition, analyze people’s expressions, look up and display personal information, and even record and analyze conversations. These applications raise important questions about privacy and security.

We studied the use of smart glasses by people with visual impairments. We found that these potential users were worried about the inaccuracy of artificial intelligence algorithms[25] and their potential to misrepresent other people.

Even if the algorithms were accurate, these users felt it was improper to infer someone’s weight or age. They also questioned whether it was ethical for such algorithms to guess someone’s gender or race. Researchers have also debated whether AI should be used to detect emotions[26], which can be expressed differently by people from different cultures.

Augmenting Facebook’s view of the future

I have only scratched[27] the surface[28] of the privacy and security considerations for augmented reality glasses. As Facebook charges ahead with augmented reality, I believe it’s critical that the company address these concerns.


I am heartened by the stellar list of privacy and security researchers[30] Facebook is collaborating with to make sure its technology is worthy of the public’s trust, especially given the company’s recent track record[31].

But I can only hope that Facebook will tread carefully and ensure that its view of the future includes the concerns of these and other privacy and security researchers.

References

  1. ^ Ray-Ban Stories glasses (about.fb.com)
  2. ^ worldwide project (ai.facebook.com)
  3. ^ Ego4D (ai.facebook.com)
  4. ^ Google (www.google.com)
  5. ^ Microsoft (www.microsoft.com)
  6. ^ Snap (www.spectacles.com)
  7. ^ Vuzix (www.vuzix.com)
  8. ^ Lenovo (www.lenovo.com)
  9. ^ could include identifying people in the glasses’ field of view (www.theverge.com)
  10. ^ public backlash (www.newyorker.com)
  11. ^ feels to people like a greater invasion of privacy (doi.org)
  12. ^ studies computer security and privacy (scholar.google.com)
  13. ^ Courtesy Facebook (tech.fb.com)
  14. ^ recognized by researchers (digitalcommons.law.uw.edu)
  15. ^ tangible sense of when their privacy is being compromised (doi.org)
  16. ^ warned by the European Union (techcrunch.com)
  17. ^ they have adjusted (www.usenix.org)
  18. ^ for the camera wearer (dx.doi.org)
  19. ^ especially concerned about the places being recorded (dx.doi.org)
  20. ^ The Entire History of You (www.youtube.com)
  21. ^ importance of forgetting (www.psychologytoday.com)
  22. ^ sensitive places (www.ndss-symposium.org)
  23. ^ computer and phone screens (dx.doi.org)
  24. ^ lifelogging study (www.ndss-symposium.org)
  25. ^ inaccuracy of artificial intelligence algorithms (doi.org)
  26. ^ whether AI should be used to detect emotions (dl.acm.org)
  27. ^ only scratched (vr4sec.hcigroup.de)
  28. ^ the surface (vr4sec.hcigroup.de)
  29. ^ Sign up today (theconversation.com)
  30. ^ stellar list of privacy and security researchers (research.fb.com)
  31. ^ recent track record (theconversation.com)


Read more https://theconversation.com/can-facebooks-smart-glasses-be-smart-about-security-and-privacy-170002

Metropolitan republishes selected articles from The Conversation USA with permission
