

Daniel W. Rasmus: 2026 CES PreGame Panel (Video)
Dan was invited by the Virtual Events Group (VEG) to participate in the 2026 CES PreGame Panel, covering the AI to look forward to at the upcoming CES 2026 in Las Vegas. Watch the video below. Dan was joined by fellow CES veterans John T. Kelley, Sr. VP and Show Director of CES; Jeremy Kaplan, Content Director of Future; Meredith Perry, Co-Founder & CEO of Elemind Technologies; and Stewart Wolpin, CES historian. The panel provided an overview of what to expect at CES 2026, with Dan covering AI and intelligent devices/wearables. A transcript of Dan’s analysis follows.
CES PreGame Panel Transcript (Daniel W. Rasmus portion):
Robin Raskin: So, Daniel, that gives you a great entrée. We’re hearing a lot about AI being not just talkative, but actually predictive. Like, you just think about it, and AI will figure it out. And that is a big topic of conversation this year. So why don’t you tell us what you’re seeing on the AI-ification of CES?
Daniel Rasmus: Yeah, I’m not sure we’re quite to the point where AI is going to actually anticipate where we’re going. There are some things we’re not going to see at CES, and we’re not going to see anything like this, but I do have prototypes from the 23rd century in my portfolio as well. So I try to keep ahead of what’s happening.
And I think wearables are going to be one of the areas. It’s not just the wearables, but the intelligence inside of wearables. There are AI tools built into AR glasses like XREAL. And I’ll come back to this theme multiple times: the materials as well. Carrillo is showing glasses that can change the way the lenses work. So there are intelligent materials that have nothing to do with generative AI, nothing to do with AI specifically, but they are intelligent in the sense that they can morph.
We’re going to see, of course, earbuds with translation. We’ve got Pebble coming up with their new ring. We’ve got intelligent dash cams from companies like Vantrue. One of the ones I saw that was fascinating: Y-Brush is going to be showing a sonic toothbrush with built-in gas sensors that use AI to sense what’s wrong with you across 300 different parameters while you’re brushing your teeth. So I think sensors are one of the other big factors we’re going to see.
And to Jeremy’s point about nerdiness, there are component vendors at CES as well. So we’re not just talking about seeing the robots themselves; we’re also talking about seeing companies like Xela with uSkin, a new 3D tactile sensor that manufacturers can put on the fingertips of robots so they can sense better.
In the wearable space, you’re going to see a lot of AI around devices like Plaud, Viaim, and Mobvoi’s TicNote, which give you the ability to record. If you’re going to one of the keynotes, you can wear one of these devices; it will transcribe all of the content and then give you the takeaways. It’s very similar to the experience in Zoom or Google Meet when you turn on the AI features there, but this gives it to you as a personal device that you can use.
You can also talk to your lights. Jeremy was asking about talking to computers; we could probably have a whole discussion about that. I just put up some Lepro lights where I can now push a button to set the colors. I think we’re going to see a lot of this, where you can generatively express what you want the lights to do, and they will do it for you. Those, rather than being conversations, are commands. So I think we’ll see voice being used in that way.
And there is a company called Falcon that’s doing an AI camera. I was in the era of camcorders, and now when I watch concerts, I see everybody with their iPhones or Samsung phones watching and recording everything, so there’s a space between you and the event. Falcon has created an AI camera that you just set up and tell what you want it to watch, and it will watch it for you so you can actually watch the event yourself. I think that’s interesting.
Beauty agents are going to be out there. We’ve got one that will take a picture of you: the Perfect Beauty Agent calls the Beauty AI API, so you can take a selfie, and then it’ll tell you what makeup you should use.
One of the devices I’m looking forward to seeing is the Looki L1. Gordon Bell was a researcher at Microsoft who ran around with cameras and digitized his entire life. Looki has a multimodal sensing device that you wear. I don’t know if people are going to like it or not, but it will track your entire life and remind you of it in audio and video, and eventually in text, because it will interpret what happened in your day. For some people that may be a little creepy; others may find it interesting.
Industrial robots were mentioned, so I think we’re going to see, of course, Oshkosh and a lot of those, and those are technologies I think we’re going to see filtering down. You’re going to see big autonomous machines working in safe and unsafe spaces, applied to putting out fires, helping build things, or going up on cranes to help build skyscrapers. I think that same technology will get smaller.
There’s a convergence that’s going to happen. We’re seeing a lot of push for humanoid robots, but I don’t believe humanoid robots are where the future of robotics is going to be. I’m not sure anybody wants humanoid robots. I’m not sure what they’re for, because there are so many things we’re not good at that robots could be better at if they were made for the thing we’re trying to do. So we’re going to see purpose-built robotics, and that goes back to my earlier comments about the components. At CES, you can see all of the pieces as well. As I said, the sensors, the actuators, all of those companies are there.
Then Robin mentioned PlantPetz, which is the name of the product that turns your plant into a user interface. It doesn’t just take your plant and dance around with you as a robot; you can actually touch the leaves, and that influences what the robot does. So that’s an interesting thing. We’re also going to see an autonomous cooking device, a very Jetsons thing: a CHEF that will prepare a meal completely autonomously. And we’re going to see Large Language Model-infused stuffed toys from companies like Folotoy. That’s one that’s going to be interesting to watch.
And I’ll leave you with this: think about not just robotics but also cybernetics, or cyborgism, or transhumanism, or whatever you want to call it. I think wearables are going to get to that point. Vigx is going to be showing knee braces that actually help you walk, help you perform better in sports, and give you feedback and support at the same time. So wearables are not just going to be necklaces or rings or watches or glasses; they’re also going to be walking amplification tools. I think that’s a transition we’ll see over the next few years: less about the robots and more about how we make humans better.
And on the PC side, one of the pieces I’m writing right now as an analyst is on the potential disruption from Microsoft’s recent announcement that they want an agentic OS. I think we’re going to see a potential disruption in what we call an operating system: Windows, macOS, and Linux. We’re going to see AI become a potential substrate for that. So watch for what Lenovo is going to say at Tech World at the Sphere for some of those announcements, and keep an eye on the PC companies to see where that takes us.
And I’ll take a breath and hand it off to our next… Thank you, Robin.
For more Serious Insights news, click here.
