Here’s looking at you, AI – Innovative Eyewear’s Lucyd range of smart glasses with ChatGPT
Seeing is believing, is learning, is style, writes Caroline Hayes, exploring the world of fashion and AI. Last week, Innovative Eyewear’s fashion show in Miami, Florida introduced the world to Lucyd Lyte 2.0, the ChatGPT-enabled line of smart glasses.
Smart glasses got a little smarter last week, as models wearing Innovative Eyewear’s Spring 2023 collection showed off the fruits of the company’s progress in AI: the latest glasses in the Lucyd range, paired with the Lucyd app, make up the Lucyd Lyte 2.0 ChatGPT-enabled smart eyewear.
The smart glasses are sleek in design, with integrated open-ear headphones, polarised UV 400 protection, prescription lens options and voice assistance. The lightweight design also includes quadrasonic speakers, open-ear audio, noise-cancelling microphones and a 12-hour battery life, all working seamlessly with the AI platform.
Innovative Eyewear has filed a patent for the ChatGPT-enabled smart eyewear (System, Apparatus and Method for Using a Chatbot, 18/189,547). The patent applies to software running on one or more smart devices operating with the ChatGPT AI chatbot. The patent-pending methodology for communicating with one or more chatbots chooses and prioritises data drawn from multiple chatbots or AI language models. Text, audio and image data can be returned in response to enquiries.
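The broad idea in the filing, fanning a query out to several chatbots or language models and prioritising among the replies, can be sketched in Python. Everything below is a hypothetical illustration: the backend names, the confidence scores and the highest-score-wins rule are assumptions for the sketch, not the patented method itself.

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class ChatbotReply:
    source: str        # which backend answered
    text: str          # the reply body
    confidence: float  # backend-reported score in [0, 1] (illustrative)


def ask_all(query: str,
            backends: dict[str, Callable[[str], ChatbotReply]]) -> list[ChatbotReply]:
    """Send one query to every registered chatbot backend."""
    return [ask(query) for ask in backends.values()]


def prioritise(replies: list[ChatbotReply]) -> ChatbotReply:
    """Pick the reply to surface: highest confidence wins (an assumed rule)."""
    return max(replies, key=lambda r: r.confidence)


# Stub backends standing in for real chatbot APIs.
backends = {
    "chatgpt": lambda q: ChatbotReply("chatgpt", f"ChatGPT answer to: {q}", 0.9),
    "other_llm": lambda q: ChatbotReply("other_llm", f"Other answer to: {q}", 0.6),
}

best = prioritise(ask_all("What is UV 400 protection?", backends))
```

In a real product each backend would be a network call, and the prioritisation rule could weigh latency, cost or answer type rather than a single score.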
CEO Harrison Gross believes the company is the first to provide voice-accessible AI on Bluetooth-enabled eyewear. “With our new Lucyd app, we are continuing to make eyewear more flexible and smarter than ever before,” he says.
The Lucyd app provides spoken output from the ChatGPT language model for natural communication between the user and the AI chatbot. Although the app can be used with any smartphone or hearable device for verbal interaction with ChatGPT, the company demonstrated its hardware and software advances at the eMerge conference in Miami, Florida last week (20 and 21 April).
The Lucyd app is voice-controlled, making it significantly easier to use ChatGPT, says Innovative Eyewear. Typical requests input into ChatGPT are long-form text paragraphs, which can be cumbersome to type on a mobile device. Introducing voice control makes ChatGPT mobile-friendly for the first time, says the company.
The flagship Lucyd Lyte smart eyewear enables wearers to listen to music, make and take calls, and use voice assistants to perform tasks; now the wearer can also access the ChatGPT chatbot’s AI to ask questions and receive spoken answers.
The AI chatbot can be used to translate text on products or signs and to look up information. Wearers of the Lucyd glasses ask questions via the microphones and hear the response through the stereo speakers. The app can also be used through a streamlined visual interface on iOS and Android devices, giving the option of both voice and text interfaces for ChatGPT.
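The loop the article describes, capture a spoken question, send it to the chatbot, speak the answer back through the glasses, can be sketched as below. The three stages are stubbed out: a real app would use a speech-recognition service, the ChatGPT API and an audio-synthesis path to the speakers, so the function bodies here are placeholders, not Lucyd’s implementation.

```python
def speech_to_text(audio: bytes) -> str:
    """Stub: a real app would call a speech-recognition service here."""
    return audio.decode("utf-8")  # pretend the audio is already a transcript


def ask_chatbot(question: str) -> str:
    """Stub: a real app would call the ChatGPT API here."""
    return f"Answer to: {question}"


def text_to_speech(text: str) -> bytes:
    """Stub: a real app would synthesise audio for the glasses' speakers."""
    return text.encode("utf-8")


def handle_voice_query(audio: bytes) -> bytes:
    """The full voice loop: microphone in, spoken answer out."""
    question = speech_to_text(audio)
    answer = ask_chatbot(question)
    return text_to_speech(answer)
```

The point of the structure is that the same `handle_voice_query` pipeline serves both the glasses and the app’s on-screen text interface, with only the input and output stages swapped.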
The Lucyd app is currently in beta testing. “By connecting to ChatGPT via voice commands on Lucyd smart eyewear, users can now access a wealth of detailed information on just about any subject, making smart eyewear a wonderful mobile learning tool,” says the company.
The Lucyd Starseeker titanium audio glasses feature a brushed gunmetal titanium front and temples.
Lucyd Lyte 2.0 eyewear is available in 15 distinct, patent-pending styles, all of which are available with prescription or sunglass lenses. “Lucyd Smart Eyewear is fusing AI technology and style to disrupt the designer eyewear market by seamlessly integrating ChatGPT via voice commands through Siri or Google Voice. Our smart glasses go beyond listening to music or taking a phone call – now consumers can go from stylish and smart to gifted, with audio ChatGPT spoken directly in their ears,” the company adds.
Danish start-up Be My Eyes has been exploring the potential of OpenAI’s GPT-4 to help blind or partially sighted people. It has developed a GPT-4-powered Virtual Volunteer within its Be My Eyes app, which can generate the same level of context and understanding as a human volunteer.
Initial trials show the app to be an effective image-to-text object recognition tool. “The implications for global accessibility are profound,” says Michael Buckley, CEO of Be My Eyes. “In the not so distant future, the blind and low vision community will utilise these tools not only for a host of visual interpretation needs, but also to have a greater degree of independence in their lives.”
A potential use is for someone to send an image, for example of the contents of their fridge; GPT-4 will not only identify and name what is in there but also extrapolate and analyse what meals could be made with those ingredients. The user could then ask GPT-4 for a recipe.
The difference between GPT-4 and other language and machine learning models, explains Jesper Hvirring Henriksen, CTO of Be My Eyes, is the conversational ability and the greater analytical prowess. “Basic image recognition applications only tell you what’s in front of you”, he says. “They can’t have a discussion to understand if the noodles have the right kind of ingredients or if the object on the ground isn’t just a ball, but a tripping hazard – and communicate that,” he explains.
The AI platform could be used to navigate public transport systems, for example, as well as to help with wardrobe choices and other everyday living tasks.
The Be My Eyes app has been trained using deep learning algorithms to understand the important content of a web page, simplifying tasks such as reading the news online, but also helping users navigate websites, such as shopping and e-commerce sites, which can be populated with adverts as well as content that a sighted person can scan and assess to find relevant data. Be My Eyes says that GPT-4 is able to summarise search results the way sighted users naturally scan them, without reading every detail, supporting users in making the right purchase in real time.
“This is a fantastic development for humanity”, Buckley says, “but it also represents an enormous commercial opportunity,” he adds.