A representative for Cupertino, California-based Apple declined to comment.
The glasses processor is based on chips used in the Apple Watch, which require less energy than the components in products like the iPhone, iPad and Mac. The chip has been customized, with some components removed, to further improve power efficiency. The processor is also being designed to control the multiple cameras planned for the glasses.
Apple has spent years trying to develop smart glasses — something lightweight that consumers can wear all day. The original idea was to use augmented reality, which superimposes media, notifications and apps over real-world views. But AR remains years away from being practical.
Apple is currently exploring non-AR glasses that use cameras to scan the surrounding environment and rely on AI to assist users. That would make the device similar to the Meta product, though Apple is still settling on the exact approach it wants to take. The iPhone maker also needs its own artificial intelligence technology to improve substantially before it can roll out a compelling AI-centric device.
Already, the iPhone has a feature called Visual Intelligence that can provide context for photos. For instance, customers can scan a concert poster and have the event details added to their calendar.