Apple Inc. is accelerating development of three new wearable devices as part of a shift toward artificial intelligence-powered hardware, a category also being pursued by OpenAI and Meta Platforms Inc.
The company is ramping up work on smart glasses, a pendant that can be pinned to a shirt or worn as a necklace, and AirPods with expanded AI capabilities, according to people with knowledge of the plans. All three devices are being built around the Siri digital assistant, which will rely on visual context to carry out actions.
Each of the products, which will be linked to Apple’s iPhone, depends on a camera system with varying capabilities, said the people, who asked not to be identified because the plans haven’t been announced. A spokesperson for Cupertino, California-based Apple declined to comment.
The AirPods and pendant are envisioned as simpler offerings, equipped with lower-resolution cameras designed to help the AI work rather than for taking photos or videos. The glasses, meanwhile, will be more upscale and feature-rich.
In an all-hands meeting with employees earlier this month, Chief Executive Officer Tim Cook hinted that the company would be pushing hard into AI devices, saying Apple is working on new “categories of products” enabled by artificial intelligence. “We’re extremely excited about that,” he told staff.
Cook added that the company was investing in new technology. “The world is changing fast,” he said.
While iPhone sales remain robust, Apple is playing catch-up in AI. Revamping Siri has been a key challenge: Upgrades to the voice assistant have been plagued by development snags, delaying their rollout.
The company is preparing a version of the assistant for iOS 27, due later this year, that will feature a chatbot-like interface. Apple will rely on underlying models co-developed with Alphabet Inc.’s Google.
In the longer run, AI is expected to change the way consumers use phones — with more activities shifting to peripherals. Meta’s glasses have already become a hit, and OpenAI is developing a series of devices, including wearables, with the help of ex-Apple design chief Jony Ive and other former Apple executives.
Apple has been trying to find a winning formula in this area. Its last major push into a new category, the pricey Vision Pro headset, didn’t resonate with consumers. The company is looking for a breakthrough with its accelerated push into wearable devices, aiming to keep users locked into the Apple ecosystem.
Smart Glasses
The smart glasses are planned to be positioned as an advanced offering in the company’s AI hardware lineup, intended to compete with Meta’s camera-equipped eyewear. They would include a high-resolution camera capable of capturing photos and video.
Apple has made significant progress in recent months on its glasses, code-named N50, and has recently distributed a broader set of prototypes within its hardware engineering division. The company is targeting the start of production as early as December, ahead of a public release in 2027.
Like most of Meta’s current offerings, the glasses won’t include a display. Instead, the interface will rely on speakers, microphones and cameras — letting users make phone calls, access Siri, take actions based on surroundings, play music and take photos. Apple aims to differentiate the product in two key areas: build quality and camera technology.
Employees say the company initially developed the hardware by embedding electronics and cameras into off-the-shelf frames from a variety of popular brands. Apple at one point even discussed relying on partnerships to launch the product, following a broader industry trend. Meta works with EssilorLuxottica SA, while Google has teamed up with Warby Parker Inc.
More recently, however, Apple decided to develop its own frames in-house in a variety of sizes and colors.
Early prototypes of the glasses connect via a cable to a standalone battery pack and an iPhone, but newer versions have the components embedded in the frame. The design uses high-end materials, including acrylic elements intended to give the glasses a premium feel. Apple is already discussing launching the device in additional styles over time.
The glasses will include two camera lenses: one for high-resolution imagery and another dedicated to computer vision — a technology similar to what’s used in the Vision Pro. The second sensor is designed to give the device environmental context, helping it more accurately interpret surroundings and measure distance between objects.
The goal is for the glasses to function as an all-day AI companion, capable of understanding what a user is seeing and doing in real time. Wearers could look at an object and ask what it is, or get help with everyday tasks, such as inquiring about the ingredients in a meal.
Apple is also exploring more advanced uses. The glasses could read printed text and convert it into digital data — say, by adding the information on an event poster directly to a calendar. The device also could create context-aware reminders, such as prompting a user to grab an item when they’re looking at the right shelf in a grocery store.
For navigation, Siri could reference real-world landmarks rather than giving only generic directions. The assistant could tell users to walk past a described building or vehicle before making a turn.
Apple already has some visual AI capabilities, including the Visual Intelligence feature for analyzing images on iPhones, but the glasses would make that technology more readily accessible.
Pendant and AirPods
Of course, some users prefer not to wear something on their face, especially if they don’t already wear glasses. Apple is aiming to serve that market with its other wearable AI devices: the pendant and camera-equipped AirPods.
Apple’s industrial design team hatched the pendant idea while working on the glasses — before they had settled on a design for that product. The device is reminiscent of the failed Humane AI Pin, but it’s designed as an iPhone accessory rather than a standalone product.
The pendant would essentially serve as an always-on camera for the smartphone that also includes a microphone for Siri input. Some Apple employees call it the “eyes and ears” of the phone.
While Apple’s industrial design team is leading the strategy for the product, the company is also leaning on the Vision Products Group, which developed the Vision Pro, to handle the engineering. That group is working on the smart glasses as well.
Unlike the Humane AI Pin, the Apple device lacks a projector or a display system. It’s also designed to rely heavily on an iPhone for processing. Though it has a dedicated chip, the system is closer in computing power to AirPods than an Apple Watch.
One area of debate for the product has been whether to include a speaker, which would allow users to hold back-and-forth conversations with the device directly. That would let them leave the iPhone in a pocket or bag and skip wearing AirPods.
Apple is working to allow users to wear the AirTag-sized pendant in two primary ways: with a clip that attaches to clothing or via a necklace threaded through a hole in the hardware.
The Information previously reported on aspects of the pin project, which remains early-stage and could still be canceled. If Apple moves forward with the device, it could launch as early as next year. The plans for the other products also remain fluid.
The company has previously stopped work on other devices, including updated versions of the Apple Watch with embedded cameras. Testers found the concept impractical due to clothing sleeves and the difficulty of capturing usable camera angles from the wrist.
The AirPods, planned for as early as this year, have been in development for a while, with Bloomberg News first reporting in early 2024 that Apple was exploring camera-equipped earbuds. The company has steadily added AI features to the product, including a new live-translation mode introduced last year.
Down the road, Apple aims to create smart glasses with an augmented reality display, giving users access to richer data and visuals. But a potential launch remains many years away.
The company stopped development last year of a cheaper and lighter version of its Vision Pro headset dubbed N100. It was meant to be a bridge toward the AR devices, but Apple ultimately chose to focus on glasses rather than a more enclosed headset design.
Beyond wearables, Apple is developing a range of AI devices for the home. That lineup includes a smart display built around the company’s upcoming Siri revamp and a later version with a larger screen and robotic arm. The company is also working on an updated HomePod speaker and a compact indoor sensor for home security and automation.
Published on February 18, 2026
