Pop-ups, privacy notices, and consent checks provide a tiny bit of order in our unwieldy digital world, especially when it comes to pictures and videos of you. Some company-issued notifications, for example, might prompt you to agree to a platform or organization’s use of your likeness in a captured photo.
“How do you roll that out, when you have, say, a million individuals with glasses just walking around, living their lives? Are they to wear T-shirts or signage that says, ‘Hey, I’m not myopic, I’m not [near]-sighted. I’m wearing these glasses because I’d like to take pictures of everyone as I walk about doing my daily life,’” Joe Jones, director of research and insights at nonprofit privacy organization IAPP, told IT Brew with a laugh.
Jones spoke with us about security and privacy risks—as well as the upside—of wearables as this technology becomes more advanced.
This interview has been edited for length and clarity.
Are you seeing mainstream adoption of AR glasses to help people do their jobs?
Do any privacy concerns come to mind as this technology gets adopted?
Your name, age, inferences, pupil dilation, etc., could all be picked up by someone’s AR glasses. And the moment is so fleeting, just as much as it’s so ubiquitous…There are questions around the lawfulness and the efficacy of not just the documentary safeguards, but the governance and compliance safeguards that exist when you’re collecting data, when you should be telling people what you’re doing with that data, how you’re going to handle it, what rights they have, and what recourse they have.
What about security?
I think a lot of the more mature players in this space are processing a lot of that data as locally as possible. Many of them are processing their data on-device, so in the headset, on the glasses, and once that data is no longer being used or no longer has utility, a lot of that data is being deleted.
Can documentation and compliance safeguards be somehow implemented in everyday life? Are we just kind of stuck with this risk?
The big challenge to all of this is the stylistic design, which makes it harder and harder for individuals to know that their data has been collected in the first place. It’s one thing to talk about CCTV. You see the camera. It’s one thing to talk about the selfie; you see a phone go up…A lot of this technology is going back to a more analog design so that we don’t know it’s technological, and it becomes even harder to understand what safeguards, what documentation and checks and balances exist.
Would you have any advice for, say, a dentist who is using these glasses?
Making sure that they are on top of their own governance, their own infrastructure security, their own privacy compliance is going to be really key. There’s only so much they can control when it comes to the device manufactured by someone else, but to the extent that they’re pulling down and pulling out that data for their own use in their own systems, that’s when they have more control and more is expected of them.