Two Harvard students have created a privacy nightmare, according to 404 Media: smart glasses with real-time facial recognition that pull up names, contact details, home addresses, and more about a stranger just by looking at them.
Beyond the capabilities of the facial recognition software, the students' project is all the more striking considering the hardware it runs on: the Ray-Ban Meta smart glasses.
That is why the students, AnhPhu Nguyen and Caine Ardayfio, say they will not release the product or the software behind it. The project, titled I-XRAY, is meant to raise awareness of what is possible with today's technology. In fact, this is something large tech companies such as Google and Facebook have long had the capability to build, but they chose not to release it because of the high potential for misuse.
How it works
Nguyen and Ardayfio were able to build I-XRAY thanks in part to the Meta smart glasses and the facial recognition software PimEyes.
Although a few facial recognition search engines exist, PimEyes is perhaps the largest that makes its technology accessible to the public. Users simply upload a photo to PimEyes and, using facial recognition, the service scans the web for images of the person in the photo.
Using the information from PimEyes, I-XRAY can identify the individual and dig up personal information about them by scanning the internet for articles and querying data brokers such as FastPeopleSearch. This information can include full names, phone numbers, home addresses, social media profiles, and more.
The entire system is automated and starts pulling this data as soon as the smart glasses detect a person's face in frame. The students uploaded a video to social media to demonstrate the process.
In its response to the I-XRAY project, Meta noted that any similar camera product could be adapted to use PimEyes in this way; there is nothing particularly unique about the Ray-Ban Meta Smart Glasses technology that made the project possible.
However, Nguyen said there was a specific reason the two students chose Meta's smart glasses: the shock factor of being able to interact with complete strangers using a device that looks like ordinary everyday glasses. Unlike many wearables, the Ray-Ban Meta smart glasses don't look like a tech gadget. In addition, their $300 price puts them in a fairly reasonable range compared with other similar products.
How to protect yourself
As mentioned above, no public product or service can do this yet. But if you're worried about people being doxxed on the fly like this, the two Harvard students explain how to protect yourself.
According to Nguyen and Ardayfio, it is as simple as contacting these data brokers to have your information removed.
PimEyes, for example, provides an opt-out page where people can have their photos removed from its facial recognition search engine. Data brokers such as FastPeopleSearch typically provide similar forms through which users can request that their data be removed from the service.
Two Harvard engineering students used Ray-Ban Meta smart glasses to build an app that can reveal sensitive information about people without their knowledge. The students posted a video demonstration on X (formerly Twitter) showing off the app's capabilities. Notably, the app was not released to the public; it was built to highlight the dangers of AI-powered wearables with discreet cameras that can photograph and film people.
The app, dubbed I-XRAY, uses artificial intelligence (AI) for facial recognition and then uses the processed visual data to dox individuals. Doxxing, popular internet slang derived from "dropping dox" (informal for documents), is the act of revealing someone's personal information without their consent.
It is integrated with the Ray-Ban Meta smart glasses, but the developers said it would work with any smart glasses fitted with a discreet camera. It uses an AI model similar to PimEyes and FaceCheck for reverse face search; this technology can match an individual's face to publicly available images online and return the source URLs.
Those URLs are then fed to a large language model (LLM), and an automated prompt extracts the person's name, occupation, address, and other similar details. The AI model also analyses publicly available government data, such as voter registration databases. In addition, an online tool called FastPeopleSearch was used for this purpose.
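To make the LLM step described above a little more concrete, here is a minimal Python sketch of how text scraped from matched pages could be turned into structured fields. It is purely illustrative: the article does not say which model or prompt the students used, so the OpenAI chat API, the model name, the prompt wording, and the field names below are all assumptions rather than I-XRAY's actual code.

```python
# Illustrative sketch only. The model, prompt, and field names are assumptions;
# the article does not disclose which LLM or prompt I-XRAY actually used.
import json

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

PROMPT = (
    "From the web page text below, extract the person's name, occupation, and "
    "address if they are present. Reply with a JSON object using the keys "
    '"name", "occupation", and "address", with null for anything missing.\n\n{page_text}'
)


def extract_fields(page_text: str) -> dict:
    """Ask the model to pull structured details out of raw page text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed stand-in model
        messages=[{"role": "user", "content": PROMPT.format(page_text=page_text)}],
        response_format={"type": "json_object"},  # request JSON-only output
    )
    return json.loads(response.choices[0].message.content)
```

In the pipeline the article describes, the output of a step like this would then seed further lookups against people-search sites and public records.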
In a short video demonstration, Harvard students AnhPhu Nguyen and Caine Ardayfio also showed how the app works. They were able to walk up to strangers with the camera already on, ask for their names, and let the AI-powered app take it from there to find personal details about the individual.
"This synergy between LLMs and reverse face search enables fully automated, comprehensive data extraction that was previously not possible with traditional methods alone," the developers said in a Google Docs file.
The students stated that they have no intention of making the app available to the public and developed it only to highlight the dangers of AI wearables that can record people covertly. However, that does not mean bad actors could not build a similar app using a similar methodology.
Meta is reportedly staying quiet on whether it collects video and image data from its AI-powered Ray-Ban Meta smart glasses to train its large language models (LLMs). The company recently announced a new real-time video feature for the device that lets users ask the AI to answer queries and make suggestions based on their surroundings. However, there is no clarity on what happens to this data once the AI has answered the query.
The feature in question is a real-time video capability that lets Meta AI "look at" the user's surroundings and process that visual information to answer whatever query the user might have. For example, a user could ask it to identify a famous landmark, show it their wardrobe and ask for outfit suggestions, or even ask for recipes based on the ingredients in their fridge.
However, each of these functions requires the Ray-Ban Meta smart glasses to capture video and images of the surroundings in order to understand the context. Under normal circumstances, once a response is generated and the user ends the conversation, that data should remain on private servers if it is not deleted right away, since much of it can be private information about the user's home and other belongings.
But Meta has reportedly not addressed this. When asked whether the company stores this data and trains its AI models on it, a Meta spokesperson reportedly told TechCrunch that the company is not discussing the matter publicly. Another spokesperson reportedly confirmed that this information is not shared externally, adding: "We're not saying either way."
The company's refusal to state clearly what is happening with user data is concerning given the potentially private and sensitive nature of what the smart glasses can capture. While Meta has already confirmed that it trains its AI models on public data from its US-based users on Facebook and Instagram, data from the Ray-Ban Meta smart glasses is not public.
Gadgets 360 has reached out to Meta for comment and will update this story as soon as the company responds.
After months of waiting the moment is here: Meta AI features have arrived on the Ray-Ban Meta smart glasses for everyone – well, everyone in the US and Canada, for now.
The exclusivity to those regions is not the only caveat unfortunately. Another big one is that while the Meta AI tools are no longer locked behind an exclusive beta, Meta notes in its blog post announcement that they are still beta features – suggesting that you’ll likely run into several problems with regard to reliability and accuracy.
But while the update isn’t quite as complete as we’d have liked, it’s still a major leap forward for Meta’s smart glasses – finally having them deliver on the impressive AI promises Meta CEO Mark Zuckerberg made when they were revealed back at Meta Connect 2023 in September last year.
The main Meta AI feature you’ll want to take advantage of is ‘Look and Ask.’ To activate it simply start a phrase with “Hey Meta, look and …” then ask the glasses a question about something you can see.
You could try “… tell me about this animal,” or “…tell me about this building,” or even “…tell me what I can make for dinner with these ingredients.”
The glasses will then use your command alongside an image captured by the camera to search for an answer in its database – which includes data the Meta AI has been trained on, as well as information it has gathered from Google and Bing.
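For a rough sense of how a "look and ask" style request is commonly put together (a camera frame plus a question sent to a vision-capable model), here is a short, hedged Python sketch. Meta has not published how its own pipeline works, so the OpenAI vision-chat API, the model name, and the prompt below are stand-in assumptions rather than anything the glasses actually run.

```python
# Illustration only: Meta hasn't documented its pipeline, so the API and model
# below are stand-ins for whatever multimodal system the glasses actually use.
import base64

from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment


def look_and_ask(image_path: str, question: str) -> str:
    """Send a camera frame plus a spoken-style question to a vision-capable model."""
    with open(image_path, "rb") as f:
        image_b64 = base64.b64encode(f.read()).decode()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed stand-in model
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": question},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content


# Example: look_and_ask("fridge.jpg", "Tell me what I can make for dinner with these ingredients.")
```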
As with all AI responses, we’d recommend taking what the Meta AI says with a pinch of salt. AI assistants are prone to hallucinating – which in the AI context you can read simply as “getting stuff completely wrong” – and this Meta model is no different. It will get stuff right too, but don’t take its advice as gospel.
Beyond Look and Ask you can use the Meta AI assistant like the Google or Siri assistant on your phone. This means starting video calls (above), sending texts and images, or playing music all with just voice commands.
Just be prepared to get some attention as you walk around talking to your smart glasses – we got some odd looks when we were testing a different pair of specs the other day.
After a handful of rumors and speculation suggested Meta was working on a pair of AR glasses, it unceremoniously confirmed that Meta AR glasses are on the way – doing so via a short section at the end of a blog post celebrating the 10th anniversary of Reality Labs (the division behind its AR/VR tech).
While not much is known about them, the glasses were described as a product merging Meta’s XR hardware with its developing Meta AI software to “deliver the best of both worlds” in a sleek wearable package.
We’ve collected all the leaks, rumors, and some of our informed speculation in this one place so you can get up to speed on everything you need to know about the teased Meta AR glasses. Let’s get into it.
Meta AR glasses: Price
We’ll keep this section brief as right now it’s hard to predict how much a pair of Meta AR glasses might cost because we know so little about them – and no leakers have given a ballpark estimate either.
Current smart glasses like the Ray-Ban Meta Smart Glasses or the Xreal Air 2 AR smart glasses will set you back between $300 and $500 / £300 and £500 / AU$450 and AU$800; Meta's teased specs, however, sound more advanced than what we have currently.
Meta’s glasses could cost as much as Google Glass (Image credit: Future)
As such, the Meta AR glasses might cost nearer $1,500 (around £1,200 / AU$2300) – which is what the Google Glass smart glasses launched at.
A higher price seems more likely given the AR glasses' novelty, and the fact that Meta would need to create small yet powerful hardware to cram into them – a combo that typically leads to higher prices.
We’ll have to wait and see what gets leaked and officially revealed in the future.
Meta AR glasses: Release date
Unlike price, several leaks have pointed to when we might get our hands – or I suppose eyeballs – on Meta’s AR glasses. Unfortunately, we might be waiting until 2027.
That’s according to a leaked Meta internal roadmap shared by The Verge back in March 2023. The document explained that a precursor pair of specs with a display will apparently arrive in 2025, with ‘proper’ AR smart glasses due in 2027.
In February 2024 Business Insider cited unnamed sources who said a pair of true AR glasses could be shown off at this year’s Meta Connect conference. However, that doesn’t mean they’ll launch sooner than 2027. While Connect does highlight soon-to-release Meta tech, the company takes the opportunity to show off stuff coming further down the pipeline too. So, its demo of Project Orion (as those who claim to be in the know call it) could be one of those ‘you’ll get this when it’s ready’ kind of teasers.
Obviously, leaks should be taken with a pinch of salt. Meta could have brought the release of its specs forward, or pushed it back, depending on a multitude of technological factors – we won't know until Meta officially announces more details. That said, the fact that Meta has teased the specs suggests their release is at least a matter of when, not if.
Meta AR glasses: Specs and features
We haven’t heard anything about the hardware you’ll find in Meta’s AR glasses, but we have a few ideas of what we’ll probably see from them based on Meta’s existing tech and partnerships.
Meta and LG recently confirmed that they’ll be partnering to bring OLED panels to Meta’s headsets, and we expect they’ll bring OLED screens to its AR glasses too. OLED displays appear in other AR smart glasses so it would make sense if Meta followed suit.
Additionally, we anticipate that Meta’s AR glasses will use a Qualcomm Snapdragon chipset just like Meta’s Ray-Ban smart glasses. Currently, that’s the AR1 Gen 1, though considering Meta’s AR specs aren’t due until 2027 it seems more likely they’d be powered by a next-gen chipset – either an AR2 Gen 1 or an AR1 Gen 2.
The AR glasses could let you bust ghosts wherever you go (Image credit: Meta)
As for features, Meta’s already teased the two standouts: AR and AI abilities.
What this means in actual terms is yet to be seen but imagine virtual activities like being able to set up an AR Beat Saber jam wherever you go, an interactive HUD when you’re navigating from one place to another, or interactive elements that you and other users can see and manipulate together – either for work or play.
AI-wise, Meta is giving us a sneak peek of what's coming via its current smart glasses. That is, you can talk to its Meta AI and ask it a variety of questions or for advice, just as you can with other generative AI tools, but in a more conversational way because you use your voice.
It also has a unique ability, Look and Ask, which is like a combination of ChatGPT and Google Lens. This allows the specs to snap a picture of what’s in front of you to inform your question, allowing you to ask it to translate a sign you can see, for a recipe using ingredients in your fridge, or what the name of a plant is so you can find out how best to care for it.
The AI features are currently in beta but are set to launch properly soon. And while they seem a little imperfect right now, we’ll likely only see them get better in the coming years – meaning we could see something very impressive by 2027 when the AR specs are expected to arrive.
Meta AR glasses: What we want to see
A slick Ray-Ban-like design
The design of the Ray-Ban Meta Smart Glasses is great (Image credit: Meta)
While Meta’s smart specs aren’t amazing in every way – more on that down below – they are practically perfect in the design department. The classic Ray-Ban shape is sleek, they’re lightweight, super comfy to wear all day, and the charging case is not only practical, it’s gorgeous.
While it’s likely Ray-Ban and Meta will continue their partnership to develop future smart glasses – and by extension the teased AR glasses – there’s no guarantee. But if Meta’s reading this, we really hope that you keep working with Ray-Ban so that your future glasses have the same high-quality look and feel that we’ve come to adore.
If the partnership does end, we’d like Meta to at least take cues from what Ray-Ban has taught it to keep the design game on point.
Swappable lenses
We want to change our lenses Meta! (Image credit: Meta)
While we will rave about the design of Meta's smart glasses, we'll admit there's one flaw we hope future models (like the AR glasses) improve on: easily swappable lenses.
While a handsome pair of shades will be faultless for your summer vacations, they won’t serve you well in dark and dreary winters. If we could easily change our Meta glasses from sunglasses to clear lenses as needed then we’d wear them a lot more frequently – as it stands, they’re left gathering dust most months because it just isn’t the right weather.
As the glasses get smarter, more useful, and pricier (as we expect will be the case with the AR glasses) they need to be a gadget we can wear all year round, not just when the sun’s out.
Speakers you can (quietly) rave to
These open ear headphones are amazing, Meta take notes (Image credit: Future)
Hardware-wise the main upgrade we want to see in Meta’s AR glasses is better speakers. Currently, the speakers housed in each arm of the Ray-Ban Meta Smart Glasses are pretty darn disappointing – they can leak a fair amount of noise, the bass is practically nonexistent and the overall sonic performance is put to shame by even basic over-the-ears headphones.
We know it can be a struggle to get the balance right with open-ear designs. But when we've been spoiled by open-ear options like the JBL SoundGear Sense – which have an astounding ability to deliver great sound while letting you hear the real world clearly (we often forget we're wearing them) – we've come to expect a lot, and we're disappointed when gadgets don't deliver.
The camera could also get some improvements, but we expect the AR glasses won’t be as content creation-focused as Meta’s existing smart glasses – so we’re less concerned about this aspect getting an upgrade compared to their audio capabilities.
Meta’s Reality Labs division – the team behind its VR hardware and software efforts – has turned 10 years old, and to celebrate the company has released a blog post outlining its decade-long history. However, while a trip down memory lane is fun, the most interesting part came right at the end, as Meta teased its next major new hardware release: its first-ever pair of AR glasses.
According to the blog post, these specs would merge the currently distinct product pathways Meta’s Reality Labs has developed – specifically, melding its AR and VR hardware (such as the Meta Quest 3) with the form factor and AI capabilities of its Ray-Ban Meta Smart Glasses to, as Meta puts it, “deliver the best of both worlds.”
Importantly for all you Quest fans out there, Meta adds that its AR glasses wouldn’t replace its mixed-reality headsets. Instead, it sees them being the smartphones to the headsets’ laptop/desktop computers – suggesting that the glasses will offer solid performance in a sleek form factor, but with less oomph than you’d get from a headset.
Before we get too excited, though, Meta hasn’t said when these AR specs will be released – and unfortunately they might still be a few years away.
A report from The Verge back in March 2023 shared an apparent Meta Reality Labs roadmap that suggested the company wanted to release a pair of smart glasses with a display in 2025, followed by a pair of ‘proper’ AR smart glasses in 2027.
We’re ready for Meta’s next big hardware release (Image credit: Meta)
However, while we may have to wait some time to put these things on our heads, we might get a look at them in the next year or so.
A later report that dropped in February this year, this time via Business Insider, cited unnamed sources who said a pair of true AR glasses would be demoed at this year’s Meta Connect conference. Dubbed ‘Orion’ by those who claim to be in the know, the specs would combine Meta’s XR (a catchall for VR, AR, and MR) and AI efforts – which is exactly what Meta described in its recent blog post.
As always, we should take rumors with a pinch of salt, but given that this latest teaser came via Meta itself it’s somewhat safe to assume that Meta AR glasses are a matter of when, not if. And boy are we excited.
Currently Meta has two main hardware lines: its VR headsets and its smart glasses. And while it’s rumored to be working on new entries to both – such as a budget Meta Quest 3 Lite, a high-end Meta Quest Pro 2, and the aforementioned third-generation Ray-Ban glasses with a screen – these AR glasses would be its first big new hardware line since it launched the Ray-Ban Stories in 2021.
And the picture Meta has painted of its AR glasses is sublime.
Firstly, while Meta’s current Ray-Ban smart glasses aren’t yet the smartest, a lot of major AI upgrades are currently in beta – and should be launching properly soon.
The Ray-Ban Meta Smart Glasses are set to get way better with AI (Image credit: Future / Philip Berne)
Its Look and Ask feature combines the intelligence of ChatGPT – or in this instance its in-house Meta AI – with the image-analysis abilities of an app like Google Lens. This apparently lets you identify animals, discover facts about landmarks, and help you plan a meal based on the ingredients you have – it all sounds very sci-fi, and actually useful, unlike some AI applications.
Then take those AI abilities and combine them with Meta's first-class Quest platform, which is home to the best software and developers working in the XR space.
While many apps likely couldn’t be ported to the new system due to hardware restrictions – as the glasses might not offer controllers, will probably be AR-only, and might be too small to offer as powerful a chipset or as much RAM as its Quest hardware – we hope that plenty will make their way over. And Meta’s existing partners would plausibly develop all-new AR software to take advantage of the new system.
Based on the many Quest 3 games and apps we've tried, even if just a few of the best make their way to the specs they'd help make Meta's new product feel instantly useful – a factor that's a must for any new gadget.
Lastly, we’d hopefully see Meta’s glasses adopt the single-best Ray-Ban Meta Smart Glasses feature: their design. These things are gorgeous, comfortable, and their charging case is the perfect combination of fashion and function.
We couldn’t ask for better-looking smart specs than these (Image credit: Meta)
Give us everything we have already design-wise, and throw in interchangeable lenses so we aren’t stuck with sunglasses all year round – which in the UK where I’m based are only usable for about two weeks a year – and the AR glasses could be perfect.
We’ll just have to wait and see what Meta shows off, either at this year’s Meta Connect or in the future – and as soon as they’re ready for prime time, we’ll certainly be ready to test them.