Just by looking at you, someone could pull up your personal information within seconds. That’s exactly what the I-XRAY AI glasses demonstrate.
Harvard students AnhPhu Nguyen and Caine Ardayfio proved it with their project I-XRAY, which pairs Meta’s Ray-Ban smart glasses with AI and facial recognition technology.
The result?
A system that can glance at someone’s face and pull their personal information from public sources. I-XRAY is not a commercial product, though; the students built it to show how easily accessible public data can be used to invade our privacy.
The point of building these AI glasses wasn’t to encourage abuse but to expose how dangerous they are to privacy.
The students want to start a conversation about data security by showing how powerful AI has become at gathering personal information from publicly available sources.
They simply want to show that privacy gaps are real and to encourage people to take steps to protect their own information.
This article looks at AI-driven glasses that can instantly surface private information about a person, such as their social media accounts, public records, and more.
You will learn how these smart glasses combine advanced AI and facial recognition to gather large amounts of data in real time, and why that raises serious privacy concerns.
What exactly are I-XRAY AI Glasses?
I-XRAY is an AI-powered system that uses Meta’s Ray-Ban smart glasses to pull up personal information about someone in real time, just by looking at them.
This technology was created by AnhPhu Nguyen and Caine Ardayfio, two students at Harvard.
It uses large language models (LLMs), reverse image searches, and facial recognition to collect details such as names, addresses, phone numbers, and even information about family members.
The whole process is designed to run seamlessly, which raises serious privacy concerns: personal information can be retrieved without the subject’s knowledge or consent.
Overview of I-XRAY’s Core Functionality
I-XRAY is built around the camera in Meta’s Ray-Ban smart glasses, which streams video to a companion program. That program then uses AI-powered facial recognition to identify the faces it sees.
Each captured face is compared against publicly available photos and records, and personal details such as names and phone numbers are collected and sent back to a mobile app within seconds.
The speed of this process has sparked debate about mass surveillance and the misuse of publicly available personal information.
I-XRAY’s creators want to draw attention to the privacy risks that come with AI and facial recognition, stressing that these technologies are not futuristic: they are already here and easy for anyone to use.
Although Nguyen and Ardayfio have said they will not release the software, their work should be seen as a wake-up call about how easily such tools can be abused.
The demonstration raises important questions about how to protect privacy in an increasingly connected world.
Technologies like I-XRAY show how personal details can be extracted from everyday public encounters without people ever realizing they are being watched.
Purpose Behind I-XRAY: Raising Privacy Awareness
AnhPhu Nguyen and Caine Ardayfio created I-XRAY with a clear goal: to make people more aware of how modern technology can invade their privacy.
The project is not meant to abuse or exploit the information it surfaces, but to show how easily accessible data can be stitched together and used in ways most people are unaware of.
By demonstrating that facial recognition can pull names, addresses, and phone numbers out of public records, the students want to draw attention to the broader problem of privacy in an increasingly connected world.
Publicly Available Data and Its Exploitation
The project shows how AI tools can harvest seemingly harmless, publicly available data and turn it into detailed profiles of individuals.
Facial recognition, reverse image searches, and large language models (LLMs) can be combined to retrieve personal information in real time, often without the subject’s knowledge or consent.
I-XRAY shows that no specialized equipment is needed; smart glasses and other everyday technologies are enough.
Why Use Smart Glasses at All?
The technology could work without smart glasses, but they were chosen deliberately: Meta’s Ray-Ban smart glasses are a striking example of how easily commonplace gadgets can be turned to privacy-invading tasks.
With the glasses, real-time data collection is almost unnoticeable to the person being targeted.
This makes I-XRAY’s message more concrete and adds a visual reminder that devices used every day can make privacy breaches easier.
Nguyen and Ardayfio use the smart glasses as a stage to show how dangerous common technologies can be, and they encourage people to pay attention to how their data is handled and shared.
The Technology Behind I-XRAY
Smart Glasses as the Interface
The I-XRAY system is built on top of Meta’s Ray-Ban smart glasses, which have a built-in camera that records high-quality video in real time.
The glasses stream live video to a connected device, such as a computer or smartphone, where an AI system continuously monitors the feed and picks out faces in the crowd.
Because the camera is built into the glasses themselves, most people never notice any of this happening, which makes the setup a remarkably effective way to invade privacy.
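To make this stage more concrete, the snippet below is a minimal sketch of how a companion program could read a streamed video feed and flag frames that contain a face. It is an illustration only, not I-XRAY’s actual code: the stream URL, the output filename, and the detector settings are assumptions.

```python
# Illustrative sketch only -- not I-XRAY's code. Shows how a companion
# program might read a streamed video feed and flag frames containing a face.
import cv2

STREAM_URL = "rtsp://example.local/glasses-feed"  # hypothetical stream endpoint

# OpenCV ships a pretrained Haar-cascade face detector with the library.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(STREAM_URL)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) > 0:
        # A real pipeline would hand this frame to the recognition stage;
        # here it is simply written to disk.
        cv2.imwrite("face_frame.jpg", frame)
cap.release()
```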
Facial Recognition
Once the smart glasses capture video, an AI-based facial recognition system analyzes the stream, recognizing faces by comparing them against huge collections of images from public records, social media, and image-hosting sites.
Using services such as PimEyes, the system compares each detected face with publicly available photos to identify the person. The process is fast: it takes only seconds to match a face to personal details such as a name, address, and phone number.
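Because PimEyes has no public API, the matching step cannot be reproduced faithfully here. The sketch below instead uses the open-source face_recognition library to compare a captured frame against a local folder of reference photos, which is the same kind of comparison a reverse face search performs at web scale. The folder name, file names, and the 0.6 distance threshold are assumptions for illustration.

```python
# Hedged sketch of the matching step using the open-source `face_recognition`
# library. A local photo folder stands in for a web-scale reverse face search.
import os
import face_recognition

GALLERY_DIR = "public_photos"   # hypothetical folder of labeled reference images
PROBE_IMAGE = "face_frame.jpg"  # frame saved by the capture stage above

probe = face_recognition.load_image_file(PROBE_IMAGE)
probe_encodings = face_recognition.face_encodings(probe)
if not probe_encodings:
    raise SystemExit("No face found in the probe frame.")
probe_encoding = probe_encodings[0]

for filename in os.listdir(GALLERY_DIR):
    image = face_recognition.load_image_file(os.path.join(GALLERY_DIR, filename))
    encodings = face_recognition.face_encodings(image)
    if not encodings:
        continue  # skip reference images with no detectable face
    distance = face_recognition.face_distance([encodings[0]], probe_encoding)[0]
    if distance < 0.6:  # the library's commonly used default threshold
        print(f"Likely match: {filename} (distance {distance:.2f})")
```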
Large Language Models (LLMs)
After facial recognition produces a likely identity, large language models (LLMs) take over, piecing together information from many different sources to build a fuller picture of the person.
That information can come from online articles, voter registration records, social media profiles, and people-search listings such as FastPeopleSearch.
By combining all of these sources, the LLMs can produce a detailed profile that includes contact details and even family connections.
The whole pipeline shows how easily AI tools can be pointed at public data to uncover someone’s private life in real time.
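As a rough illustration of this aggregation step, the sketch below asks an LLM to merge a few scattered public snippets into a short profile. The snippets, the model name, and the prompt are invented for the example; the original project’s prompts and data sources have not been published.

```python
# Hedged sketch of the aggregation step: an LLM merges scattered public
# snippets into one profile. Snippets and prompt are invented for illustration.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

snippets = [
    "Voter roll entry: J. Doe, registered 2020, Springfield precinct 4.",
    "Local news article: charity run organized by J. Doe in 2022.",
    "People-search listing: J. Doe, age 34, possible relative M. Doe.",
]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Summarize the following public records into a short "
                       "profile covering name, location, and likely relatives.",
        },
        {"role": "user", "content": "\n".join(snippets)},
    ],
)
print(response.choices[0].message.content)
```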
How to Remove Your Personal Information
You can protect yourself from technologies like I-XRAY by removing your personal information from the databases that reverse face search and people search engines rely on. Here are specific steps you can take to protect your privacy.
Removal from Reverse Face Search Engines
You can have your images removed from two of the most popular reverse face search engines, PimEyes and FaceCheck.ID, for free.
- PimEyes: You can opt out of PimEyes by submitting a photo of yourself along with a copy of a government-issued ID. Once the request is approved, your photos should no longer appear in PimEyes search results. To start the opt-out process, go to their opt-out request form and follow the instructions.
- FaceCheck.ID: Like PimEyes, FaceCheck.ID lets you request that your pictures be taken down; the opt-out options are available directly on its website.
Removal from People Search Engines
Many people search engines collect your address, phone number, and the names of your family members and display them to anyone who looks. Here is how to remove your information from the most popular databases:
- FastPeopleSearch: Visit their opt-out page and follow the steps there. Removal is usually confirmed within 72 hours of submitting your request.
- CheckThem and Instant Checkmate: These services also provide opt-out pages where you can request removal. As with FastPeopleSearch, you fill out a form and wait for an email confirmation.
- Large-Scale Data Broker Removal: To remove your information from data brokers at scale, services such as OneRep or DeleteMe can automate the process, clearing your details from hundreds of sites in just a few steps.
Protecting Against Identity Theft
For extra protection, especially against SSN-related data breaches, it’s important to freeze your credit with the three major credit bureaus (Equifax, Experian, and TransUnion).
A freeze prevents anyone else from opening new accounts in your name. Many financial services also recommend turning on two-factor authentication (2FA) to add an extra layer of security to your important accounts.
Taken together, these steps make it much less likely that tools like I-XRAY can put your privacy at risk.
Conclusion
Projects like I-XRAY may accelerate the convergence of AI and smart hardware, pushing the limits of what consumer electronics can do.
As AI becomes more widespread, new products may focus on real-time data processing, facial recognition, and instant information retrieval.
That convenience comes with serious concerns about privacy and the misuse of data.
Demonstrations like this highlight the critical need for stronger privacy laws.
Clearer rules on data security could curb the abuse of public data and AI technologies by defining how they may be used ethically.
In this era of AI, consumer rights are more important than ever, and the capabilities shown by I-XRAY might spark larger legislative initiatives to protect information privacy.