Meta Platforms (Nasdaq: META) CEO Mark Zuckerberg revealed his vision for augmented reality glasses that could pack substantial technology into a thin, stylish frame during a discussion with Nvidia (Nasdaq: NVDA) CEO Jensen Huang at the SIGGRAPH 2024 conference in Denver on Monday evening.
He eventually wants to offer a line of AR glasses at different price points depending on the tasks they can perform. The glasses will be equipped with AI assistants and offer a wide range of capabilities while maintaining a sleek look.
“We couldn’t put all the technology into the Ray-Ban glasses that we had,” Zuckerberg said. “We’re getting closer to that goal, maybe two or three years away. We want the glasses to look great.”
Zuckerberg also discussed the impact of AI on Meta’s products.
“We’ll use Meta AI, but we want our customers to create their own agents,” Zuckerberg said. “We call it AI Studio, which is a set of tools that lets each user create an AI version of themselves. There’s only so much time in a day. We can’t always interact with everyone we need to every day. But we can create an agent that’s based on ourselves and can interact with people.”
He also said that Llama 3 and Llama 4 will feel smoother and less like chatbots.
“AI today is role-based,” Huang added. “You give it input, and it responds. In the future, AI will be able to plan.”
“One of the interesting uses we’ve seen is people using AI models for support,” Zuckerberg said. “We see a lot of people using Meta AI to role-play difficult situations, like how to ask a boss for a promotion. This makes it possible to rehearse those situations without being judged.”
He added that Facebook and Instagram will continue to implement more AI features as the technology evolves.