The Verge - Reviews

The Flextail Tiny Bike Pump is a solid pump half the time

The tiny Flextail pump inflated this city bike tire in 45 seconds. | Photo by Thomas Ricker / The Verge

Social media’s algorithms know that I ride a bike almost every day. My quiver includes a city bike, mountain bike, and gravel bike, in addition to one or two e-bikes I’m always in the process of reviewing. I’m also the family mechanic, which puts me on the hook for keeping 16 to 18 tires inflated. So, you’d better believe I took notice when Instagram served me several ads for the Flextail Tiny Bike Pump.

The mini rechargeable pump works with Presta (the thin one) or Schrader (the old fatty) valves and promises ultra-fast inflation that maxes out at 100psi (about 7 bars) — enough for any bike that doesn’t require a stretchy wardrobe coordinated with your shoes and helmet.

The origins of the pump are suspect, as I see what looks to be the exact same product sold with branding like Cyclami, Toptoper, Rrskit, and Epoom at a variety of price points, some as low as $25. Flextail sells its version for $85 and lists the manufacturer as Huzhou Jingwei Outdoor Products on the box and device itself. The first pump Flextail sent me couldn’t pump a tire beyond 19psi before dying. Flextail sent me another that (mostly) lives up to the claims.

The thing that’s not mentioned in the ads I’ve seen is how loud the tiny pump is: 76dB at arm’s length, in my testing, which is akin to bending over to inspect a running vacuum cleaner or garbage disposal. Using it while stopped alongside forest trails generates more scowls than seeing a mountain biker in Lycra.

The Flextail Tiny Bike Pump does work, though. It’s much faster and smaller than the mini hand pumps riders usually carry in case of trouble. At 3.9 ounces (111 grams), it’s also just a bit heavier than the trusty 3.4-ounce (96 grams) Unich pump I regularly carry. But the Flextail pump also doesn’t strain your air valve mounts as much because it doesn’t require long periods of vigorously erratic pumping.

The Flextail pump’s biggest disadvantage is that it’s only good for a few zero-to-full inflations before needing a recharge, but that will vary by tire size and desired pressure. It’ll last much longer if you’re just topping up tires. Its tiny 2.59Wh battery recharges in as little as 25 minutes.

In my testing, on a city bike fitted with wide 700 x 40c tires and Schrader valves, I was able to pump one tire up to 45psi in 45 seconds. Then, moving to a gravel bike fitted with wider 700 x 42c tires and Presta valves, I was able to hit 50psi in 90 seconds before the pump quit in need of a recharge. That’s two real-world inflations per charge, for those keeping score.
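The review’s own figures support a quick back-of-envelope calculation. Here’s a sketch in Python; the assumption that the two inflations split the battery evenly is mine, not a manufacturer spec:

```python
# Back-of-envelope math using the figures from this review.
# Assumption (mine): the two observed inflations drain the battery evenly.

BATTERY_WH = 2.59          # Flextail's rated battery capacity
INFLATIONS_PER_CHARGE = 2  # observed in testing above
RECHARGE_MINUTES = 25      # claimed fastest recharge time

energy_per_inflation_wh = BATTERY_WH / INFLATIONS_PER_CHARGE
implied_charge_watts = BATTERY_WH / (RECHARGE_MINUTES / 60)

print(f"~{energy_per_inflation_wh:.1f} Wh per full inflation")   # ~1.3 Wh
print(f"~{implied_charge_watts:.1f} W average charging power")   # ~6.2 W
```

At roughly 1.3Wh per zero-to-full fill, it also makes sense that simple top-ups stretch the battery much further.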

The Flextail Tiny Bike Pump is so small and lightweight that I initially thought it would be ideal for bikepacking trips or even long day rides. But with only two inflations in the tank, I’d still want to carry a hand pump as backup alongside my patch kit and spare inner tube(s). And there’s no way my gram-obsessed brain would allow me to carry two pumps.

If your rig is an e-bike with a built-in USB charging port, then you’re already traveling with a giant power bank on wheels. That makes it easy to recharge the Flextail pump after depleting it because your side-of-the-road flat tire repair didn’t go as planned (it happens!). Just don’t forget your USB-C cable... and maybe a carbohydrate bar to snack on while you wait.

If you’re still interested, all I can say is that one of the two Flextail Tiny Bike Pumps I tested worked as advertised, and I bet you’ll have similar success from other brands that sell what looks to be the same Huzhou Jingwei Outdoor Products battery-powered pump for much less.

For everyone else, just buy a mini hand pump for much less money. They never need charging, are too big to lose, and will likely last a human lifetime — or two.

All photography by Thomas Ricker / The Verge

Rabbit R1 review: nothing to see here

Artificial intelligence might someday make technology easier to use and even do things on your behalf. All the Rabbit R1 does right now is make me tear my hair out.

“You’re holding a taco.”

My Rabbit R1 told me that the other day. I was sitting at a table at a cafe on the National Mall in Washington, DC, and I had just picked up this $199 orange rectangle of a gadget and pointed it at the food in my hand. With unwavering, absolute confidence, the R1 told me it was a taco.

It was a Dorito. A Cool Ranch Dorito, to be specific. I’d shown the R1 the bag just a few seconds earlier and asked about the calories. (The R1 got that bit right.) I moved the chip around and tried again — still taco. I could not convince this AI-powered gadget, theoretically at the cutting edge of a technological revolution, that I was holding a chip.

Over and over in my testing of the R1, I’ve run into moments like the taco encounter, where the whole thing just feels broken. It misidentified a red dog toy as a stress ball, then as a tomato, then as a red bell pepper that it assured me was totally safe to eat. I’d start playing a song on the R1, and then the device would stop responding but keep playing so that I couldn’t even pause it or turn the volume down.

For a while, the R1 couldn’t even tell the time or the weather. Rabbit finally fixed that with a software update on Tuesday, and the company promised many more updates to come — though now, instead of the weather being wrong by thousands of miles, it gives me the weather from about 15 miles away. I guess that counts for something.

Ever since the R1 debuted at CES, with a keynote filled with big promises and impressive demos, this device has been sold as a super-clever, ultra-helpful AI assistant. Rather than just answer ChatGPT-style questions, it was supposed to do just about everything your phone can do, only faster. A few months later, this device on my desk bears no resemblance to the one we were told about, that more than 100,000 people preordered based on promises and demos.

After reviewing the Humane AI Pin and finding it woefully unable to execute its ambition, I was excited about the R1. It’s cheaper, more whimsical, and less ambitious. After using the R1, I feel like Humane at least deserves credit for trying. The R1 is underwhelming, underpowered, and undercooked. It can’t do much of anything. It doesn’t even know what a taco looks like.

On the LAM

The most intriguing tech in the R1 is what Rabbit calls the “Large Action Model,” or LAM. Where a large language model, or LLM, is all about analyzing and creating text, the LAM is supposed to be about doing stuff. The model learns how an app works in order to be able to navigate it on your behalf. In a LAM-powered world, you’d use Photoshop just by saying “remove that lady from the background” or make a spreadsheet by telling your device to pull the last six quarters of earnings from the investor website.

There is basically no evidence of a LAM at work in the R1. The device currently connects to only four apps: Uber, DoorDash, Midjourney, and Spotify. You connect to them by opening up Rabbit’s web app, called Rabbithole, and logging in to each service individually. When you do, Rabbit opens up a virtual browser inside the app and logs you in directly — you’re not authorizing an integration DoorDash provides but literally entering your credentials on DoorDash’s website while Rabbit snoops on the process. Rabbit says it protects your credentials, but the process just feels icky and insecure.

I logged in to them all anyway, for journalism. Except for Midjourney, which I never managed to get into because I couldn’t get past the CAPTCHA systems that obviously thought I was a bot. The connection doesn’t do much anyway: the R1 won’t show you the images or even send them to you. It’s just typing an image prompt and pressing enter.

The R1’s camera can see things... it’s just not great at knowing what they are.

I’d love to tell you how Uber and DoorDash work better once you’re logged in, but I never got either one to successfully do anything. Every time I pressed that side button on the R1 — which activates the microphone — and asked it to order food, it spat back a warning about how “DoorDash may take a while to load on RabbitOS” and then, a second later, told me there was an issue and to try again. (If you have to include that disclaimer, you probably haven’t finished your product.) Same thing for Uber — though I was occasionally able to at least get to the point where I said my starting and ending addresses loudly and in full before it failed. So far, Rabbit has gotten me zero rides and zero meals.

Spotify was the integration I was most interested in. I’ve used Spotify forever and was eager to try a dedicated device for listening to music and podcasts. I connected my Bluetooth headphones and dove in, but the Spotify connection is so hilariously inept that I gave up almost immediately. If I ask for specific songs or to just play songs by an artist, it mostly succeeds — though I do often get lullaby instrumental versions, covers, or other weirdness. When I say, “Play my Discover Weekly playlist,” it plays “Can You Discover?” by Discovery, which is apparently a song and band that exists but is definitely not what I’m looking for. When I ask for the Armchair Expert podcast, it plays “How Far I’ll Go” from the Moana soundtrack. Sometimes it plays a song called “Armchair Expert,” by the artist Voltorb.

Not only is this wrong — it’s actually dumber than I expected. If you go to Spotify and search “Discover Weekly” or “Armchair Expert,” the correct results show up first. So even if all Rabbit was doing was searching the app and clicking play for me — which is totally possible without AI and works great through the off-the-shelf automation software Rabbit is using for part of the process — it should still land on the right thing. The R1 mostly whiffs.

About a third of the time, I’ll ask the R1 to play something, and it’ll pop up with a cheery confirmation — “Getting the music going now!” — and then nothing will happen. This happened in my testing across all of the R1’s features and reminded me a lot of the Humane AI Pin. You say something, and it thinks, thinks, thinks, and fails. No reason given. No noise letting you know. Just back to the bouncing logo homescreen as if everything’s A-okay.

The long and short of it is this: all the coolest, most ambitious, most interesting, and differentiating things about the R1 don’t work. They mostly don’t even exist. When I first got a demo of the device at CES, founder and CEO Jesse Lyu blamed the Wi-Fi for the fact that his R1 couldn’t do most of the things he’d just said it could do. Now I think the Wi-Fi might have been fine.

The R1 connects to Spotify but doesn’t do it very well.

Hot mic

Without the LAM, what you’re left with in the R1 is a voice assistant in a box. The smartest thing Rabbit did with the R1 was work with Perplexity, the AI search engine, so that the R1 can deliver more or less real-time information about news, sports scores, and more. If you view the R1 as a dedicated Perplexity machine, it’s not bad! Though Perplexity is still wrong a lot. When I asked whether the Celtics were playing one night, the R1 said no, the next game isn’t until April 29th — which was true, except that it was already the evening of April 29th and the game was well underway. Like with Humane, Rabbit is making a bet on AI systems all the way down, and until all those systems get better, none of them will work very well.

For basic things, the kinds of trivia and information you’d ask ChatGPT, the R1 does as well as anything else — which is to say, not that well. Sometimes it’s right, and sometimes it’s wrong. Sometimes it’s fast — at its best, it’s noticeably faster than the AI Pin — but sometimes it’s slow, or it just fails entirely. It’s helpful that the R1 has both a speaker and a screen, so you can listen to some responses and see others, and I liked being able to say “save that as a note” after a particularly long diatribe and have the whole thing dumped into the Rabbithole. There’s a handy note-taking and research device somewhere inside the R1, I suspect.

To that point, actually: my single favorite feature of the R1 is its voice recorder. You just press the button and say, “Start the voice recorder,” and it records your audio, summarizes it with AI, and dumps it into the Rabbithole. $200 is pretty steep for a voice recorder, but the R1’s mic is great, and I’ve been using it a bunch to record to-do lists, diary entries, and the like.

The most enjoyable time I spent with the R1 was running around the National Mall in Washington, DC, pointing the R1’s camera at a bunch of landmarks and asking it for information via the Vision feature. It did pretty well knowing which large president was which, when memorials were built, that sort of thing. You could almost use it as an AI tour guide. But if you’re pointing the camera at anything other than a globally known, constantly photographed structure, the results are all over the place. Sometimes, I would hold up a can of beer, and it would tell me it was Bud Light; other times, it would tell me it’s just a colorful can. If I held up a can of shaving cream, it identified it correctly; if I covered the Barbasol logo, it identified it as deodorant or “sensitive skin spray,” whatever that is. It could never tell me how much things cost and whether they had good reviews or help me buy them. Sometimes, it became really, really convinced my Dorito was a taco.

For the first few days of my testing, the battery life was truly disastrous. I’d kill the thing in an hour of use, and it would go from full to dead in six hours of sitting untouched on my desk. This week’s update improved the standby battery life substantially, but I can still basically watch the numbers tick down as I play music or ask questions. This’ll die way before your phone does.

Photo: David Pierce / The Verge
AI gadgets are coming — but they’re not great.

A vision in orange

Just for fun, let’s ratchet the R1’s ambitions all the way down. Past “The Future of Computing,” past “Cool Device for ChatGPT,” and even past “Useful For Any Purpose At All.” It’s not even a gadget anymore, just a $200 desk ornament slash fidget toy. In that light, there is something decidedly different — and almost delightful — about the R1. A rectangle three inches tall and wide by a half-inch deep, its plastic body feels smooth and nice in my hand. The orange color is loud and bold and stands out in the sea of black and white gadgets. The plasticky case picks up fingerprints easily, but I really like the way it looks.

I also like the combination of features here. The press-to-talk button is a good thing, giving you a physical way to know when it’s listening. The screen / speaker combo is the right one because sometimes I want to hear the temperature and, other times, I want to see the forecast. I even like that the R1 has a scroll wheel, which is utterly superfluous but fun to mess around with.

As I’ve been testing the R1, I’ve been trying to decide whether Humane’s approach or Rabbit’s has a better chance as AI improves. (Right now, it’s easy: don’t buy either one.) In the near term, I’d probably bet on Rabbit — Humane’s wearable and screen-free approach is so much more ambitious, and solving its thermal issues and interface challenges will be tricky. Rabbit is so much simpler an idea that it ought to be simpler to improve.

But where Humane is trying to build an entirely new category and is building enough features to maybe actually one day be a primary device, Rabbit is on an inevitable collision course with your smartphone. You know, the other handheld device in your pocket that is practically guaranteed to get a giant infusion of AI this year? The AI Pin is a wearable trying to keep your hands out of your pockets and your eyes off a screen. The R1 is just a worse and less functional version of your smartphone — as some folks have discovered, the device is basically just an Android phone with a custom launcher and only one app, and there’s nothing about the device itself that makes it worth grabbing over your phone.

Lyu and the Rabbit team have been saying since the beginning that this is only the very beginning of the Rabbit journey and that they know there’s a lot of work left to do both for the R1 and for the AI industry as a whole. They’ve also been saying that the only way for things to get better is for people to use the products, which makes the R1 sound like an intentional bait-and-switch to get thousands of people to pay money to beta-test a product. That feels cruel. And $199 for this thing feels like a waste of money.

AI is moving fast, so maybe in six months, all these gadgets will be great and I’ll tell you to go buy them. But I’m quickly running out of hope for that and for the whole idea of dedicated AI hardware. I suspect we’re likely to see a slew of new ideas about how to interact with the AI on your phone, whether it’s headphones with better microphones or smartwatches that can show you the readout from ChatGPT. The Meta Smart Glasses are doing a really good job of extending your smartphone’s capabilities with new inputs and outputs, and I hope we see more devices like that. But until the hardware, software, and AI all get better and more differentiated, I just don’t think we’re getting better than smartphones. The AI gadget revolution might not stand a chance. The Rabbit R1 sure doesn’t.

Photography by David Pierce / The Verge

How to get more space in your Google storage

Illustration: The Verge

For many of us, Google storage is the modern-day hard drive. It’s the place where our most important thoughts, documents, and memories reside. But just like with a traditional hard drive, the space isn’t infinite, and running out of room can be a real problem.

By default, Google gives you 15GB of space to use for everything associated with your account. That includes content connected to Gmail, Google Drive, and Google Photos (except photos and videos saved before June 1st, 2021). Needless to say, data adds up fast.

You can check your current storage status by visiting this page, and if push comes to shove, you can purchase more space there, too, for as little as $2 a month for an extra 100GB. But shelling out more money might not be necessary. A quick round of old-fashioned housekeeping could be enough to clear away the virtual cobwebs and give you ample room to grow. Here’s how to do it.

Delete Drive debris

Google Drive is a common place for space-sucking files to build up and wear down your quota, but tidying things up doesn’t take long.

  • Open this link, which will show you a list of all of your Drive files sorted by size with the largest items at the top.
  • Look through the heftiest offenders and delete anything you no longer need.
  • Click the gear-shaped icon in Drive’s upper-right corner, and select Settings, followed by Manage Apps.
  • Apps associated with your Google Drive storage can sometimes have hidden data, but all it takes is a couple of clicks to remove it. For any apps that have a note about hidden data, click the gray Options box to the right, and select Delete hidden app data.

Free up Photos storage

Unless you currently have a Pixel 5 or earlier (in which case, you will, for now, keep the unlimited “Storage saver” option), every photo and video backed up to Google Photos after June 1st, 2021, counts against your Google storage. If you’ve been saving photos at their original sizes, you can free up tons of space by converting them to Google’s “Storage saver” option (which used to be called “High quality”). This compresses images down to 16MP and videos to 1080p (a change that’s unlikely to be noticeable for most people and purposes).

  • Go to the Photos settings page and select Storage saver.
  • If you switch to Storage saver, your previous photos won’t automatically be compressed. To do that, on the Photos setting page, look for the “Recover storage” button, which will compress many (but not all) of your existing videos and photos. (Check out the list on Google’s support page to see which images will be affected.)
You can compress many of your saved photos and videos to save space.
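If you’re curious how much the switch might save before committing, a rough rule of thumb is that file size scales with pixel count. This hypothetical Python sketch (the linear-scaling assumption and the function itself are mine, not Google’s) estimates the savings:

```python
def storage_saver_savings_mb(num_photos, avg_megapixels, avg_size_mb,
                             cap_megapixels=16):
    """Rough estimate of space saved (in MB) by compressing photos to
    Google's 16MP "Storage saver" size.

    Assumes file size scales linearly with pixel count -- a simplification,
    since JPEG compression efficiency varies from image to image."""
    if avg_megapixels <= cap_megapixels:
        return 0.0  # photos at or under 16MP aren't downscaled
    compressed_mb = avg_size_mb * (cap_megapixels / avg_megapixels)
    return num_photos * (avg_size_mb - compressed_mb)

# e.g. 1,000 photos from a 24MP camera averaging 6MB each frees
# roughly 2GB from downscaling alone.
```

Storage saver also recompresses images at or below 16MP, so real-world savings may be larger than this pixel-count math suggests.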

Another handy resource is the Manage storage page, which you get to by tapping Storage at the bottom of the left column. This will take you to a page that will tell you approximately how much more time you have before you fill up your storage space and offer to find (and delete) blurry photos, screenshots, and other possibly unwanted images that are taking up space.

Google has a page that helps you get rid of unnecessary images that are taking up space.

Say goodbye to Gmail junk

Emails don’t take up a ton of space, but you know what does? Attachments. Odds are, you’ve got plenty of old attachments sitting in your Gmail account that you don’t really need.

Here’s how to address that:

  • Go to the Gmail website and type “has:attachment larger:10M” into the search box at the top.
  • Identify any messages with disposable attachments and delete them. (There’s no great way to get rid of an attachment without also deleting the associated email, unfortunately, but you can always forward a message back to yourself and manually remove the attachment before axing the original.)
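The size filter above is easy to reason about in code. This sketch mirrors what Gmail’s `larger:` operator does, applied to a local list of messages (the decimal interpretation of “K”/“M” is my assumption; Gmail doesn’t publish the exact byte math):

```python
def parse_gmail_size(spec: str) -> int:
    """Convert a Gmail-style size spec like '10M' or '500K' to bytes.
    Assumes decimal units (1M = 1,000,000 bytes) -- an assumption, since
    Gmail doesn't document the exact conversion."""
    units = {"K": 1_000, "M": 1_000_000}
    unit = spec[-1].upper()
    if unit in units:
        return int(spec[:-1]) * units[unit]
    return int(spec)  # bare number of bytes

def messages_larger_than(messages, spec):
    """Filter (subject, size_in_bytes) pairs the way 'larger:SPEC' would."""
    threshold = parse_gmail_size(spec)
    return [m for m in messages if m[1] > threshold]
```

For example, `messages_larger_than(inbox, "10M")` keeps only the messages whose total size tops 10MB — the same set the search box surfaces.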

If you’re like me, you tend to ignore most, if not all, of the email that finds its way into your Promotions category. While each promotional email may not take up much space, they can build up over the weeks (or months).

  • If you just want to get rid of the entire mess, go to the Promotions section, click the small checkbox just under the search field, then click Select all [number] conversations in Promotions, and finally hit the trash icon to delete them all.
  • If you’d rather remove specific promotions, open one of the emails from the company you no longer want to hear from. After clicking the Unsubscribe button next to the email address (because you don’t want to get any more of those, right?), go to the right of the address, click the three dots, and then Filter messages like this > Search. You can then use the above method to get rid of all the emails from that merchant.

And of course, these directions can be used with any other Gmail folder as well.

By filtering messages, you can delete emails from a specific sender.

Now you can completely dispose of it all.

  • Open your Spam folder, and click the link to Delete all spam messages now.
  • Open your Trash folder, and select Empty Trash now to send everything away for good.

Feeling lighter is liberating, isn’t it?

Update May 1st, 2024, 10:41AM ET: This article was originally published on March 19th, 2019. The information on Google Drive and Google Photos has been updated and information on filtering has been added.

Beats Solo 4 review: playing both sides

Beats’ on-ear headphones get an overdue refresh with a more comfortable design, longer battery life, and wired audio over both USB-C and the 3.5mm jack — but no ANC.

They look like Beats headphones. They sound like Beats headphones. The battery life can stretch to a new high of 50 hours. Those things alone all but guarantee that the new Beats Solo 4 on-ear wireless headphones will prove just as successful as their predecessors — and it’ll be no time at all before you start seeing them worn by athletes and music stars at every turn.

But there’s more to these than a logo. Unlike the Solo 3, the fourth-gen cans uniquely support native software features (like one-tap pairing and Find My / Find My Device) on both Android and iOS; Beats has quietly become Apple’s Android-friendly brand, in case you weren’t paying attention. And for an old-school guy like me, I love that the company is putting an emphasis on wired, lossless listening over either USB-C or the 3.5mm headphone jack. Sonically, these are a world apart from Sennheiser’s Momentum 4 or the Bowers & Wilkins PX7 S2e headphones that I often carry — both of which are more expensive. But they’re also for much different audiences. As ever, Beats is about cultural cachet, that prominent “b” logo, and enjoyable (if not mind-blowing) sound.

That’s not to say Beats knocked everything out of the park. A complete lack of active noise cancellation in any $200 pair of wireless headphones is hard to overlook; the ear cushions have provided relatively good natural noise isolation in my local coffee shop and when traversing Brooklyn, but ANC is always appreciated when the clamor starts to bubble over. The short-lived Solo Pro had it, but not these, which are technically a sequel to the eight-year-old Solo 3.

So Beats’ flagship Studio Pro easily win out on the ANC front. But the Solo 4 do have one thing going for them: they’re passively tuned. On many wireless headphones, there’s an active EQ profile running at all times that provides the fullest sound. That’s all driven by the battery. Once you’re out of power, some headphones will stop playing — even when wired — or will fall back to very meager audio quality until you recharge. The Solo 4 will keep playing endlessly when plugged in even if the battery is dead thanks to that passive tuning, and the sound never changes. “Unlimited wired playback” is actually one of the bullet points advertised on the back of the box. We love to see it, though this inevitably means getting out a headphone dongle nowadays.

The “pink” Solo 4s often look very neutral depending on the light.
The left Beats logo doubles as your play/pause button, with volume above and below.

On your head, the Solo 4 wear well. They’re narrower, sleeker, and significantly lighter than the Studio cans. And they use the same “UltraPlush” memory foam pads as the Studio Pro, which are a key part of the comfort. Beats claims the new cover material — a failure point of some past headphones — should provide better durability and extended longevity compared to the Solo 3. There’s always a moderate amount of clamping force with Beats headphones; plenty of people use them at the gym or during outdoor activities. But despite my huge noggin, I never found the pressure to reach unpleasant territory.

The memory foam ear cups help make the clamping force less noticeable.

I like the included fabric carrying case, too, but why doesn’t it match the color of your headphones? Blue, pink, and black are the hardware choices at launch, but Beats has a history of churning out many other colors as time goes on. The design of the headphones is similar to past models, and so are the controls. The left-side Beats logo acts as a play / pause button, and you’ve got volume controls directly above and below, so using these headphones is about as simple as it gets. You can double-tap the logo to skip forward a track or triple-press to go back — all very familiar controls for Beats fans.

Rather than integrating an Apple chip, which would make these lopsidedly appealing to iPhone owners, Beats is sticking with the same proprietary platform that has been the brains of its recent products. In practice, this means you’ll get some (but not all) ecosystem software tricks, regardless of whether you’re using iOS or Android. This feels like the right approach to me. Apple fans get at least one exclusive: personalized spatial audio with head tracking. But the Android crowd gets automatic device switching between Android, Chromebooks, and other devices.

The “4” is how you know these are the new ones, obviously.

I’ve been listening to the Solo 4 for several days, and the sound is honestly more restrained than I expected. They’re not particularly bassy and avoid overemphasizing any section of the frequency range; the goal was to land on a consistent tuning that fits right across music, podcasts, work meetings / voice calls, and more. Speaking of calls, voice quality is rather decent, with Beats having trained its ML algorithm “using over 7,000 hours of exposure to real-world environments.” Where the Solo 4 fall short compared to pricier headphones is in their overall richness and a fairly condensed soundstage that lacks much breadth. But for the target audience, I think they’ll prove more than adequate.

The omission of noise cancellation on the Solo 4 could be a real obstacle for some, but I don’t think it’ll be enough to dampen their appeal to the masses who’ve been cycling through Beats products for so many years now. Even if you’re buying largely for the cool factor, at least these on-ear headphones are now platform-agnostic, more comfortable, and more versatile since you can just plug in if you manage to run through that 50-hour battery life. As with all Beats products, it’s worth holding out until they go on sale — and the Solo 4 certainly will.

Photography by Chris Welch / The Verge

SwitchBot S10 review: with plumbing hookups, this robovac and mop is actually hands-free

Thanks to its self-cleaning mop, automatic water tanks, and supersize bin, this robot vacuum can go for two months without any manual intervention. 

As I type, a small white robot has just rolled past me, heading from my bathroom, where it just finished mopping and vacuuming the floor, to my laundry room. Once there, it docked to its water station, gurgled a lot as it emptied its dirty water tank and filled itself up with fresh water, then headed back to its charging station, where it settled down for a blow-dry (of its mop) and recharge.

That robot is the new SwitchBot S10. A robot vacuum and mop with a couple of unique features, the S10 was announced last year and, following a Kickstarter launch, is now available to buy for $1,119.99 (€1,099.99 / £999.99).

The S10 is the first combo vacuum / mop I’ve tested that can hook directly into your plumbing, so you don’t need a bulky, multifunctional charging dock to take care of the robot.

Instead of one giant dock, the SwitchBot has two small ones: a water refill station and a charging / auto-empty dock. These don’t have to be placed in the same room, the water station doesn’t even need a power outlet (it has a battery), and they are both more compact than other docks with the same function.

With its dual docks, the S10 can empty its dirty water, fill itself up with clean water, empty its dustbin, dry its mop, and charge itself. And because it’s hooked directly into my plumbing, I never had to deal with emptying or refilling big water tanks. In fact, the only manual labor the S10 requires is replacing its dust bag — something SwitchBot claims you’ll only have to do every two months.

The S10 is the only robot vacuum I’ve tested that cleans its own mop as it cleans the floor. Its roller mop uses a squeegee system, pushing the dirty water into a small tank while spraying the mop with clean water as it mops.

This meant less chance of cross-contamination on the floors, and the S10 finished the job faster than robots that have to return to base to wash their mops. I also didn’t have to deal with the grimy, smelly “sink” that most other multifunctional docks use to clean the mops — and which has to be cleaned manually.

The SwitchBot’s water refill station. While the station fits under my sink, the robot is too wide to get under there.
The separate charging dock empties the robot’s bin and dries its mop.

I’ve been testing the S10 for about a week, and this system, which at first seemed a little circuitous, works really well. The downside is that all this work drains its relatively small 4,000mAh battery pretty quickly, and it couldn’t get through a full clean of my upstairs and downstairs (about 1,000 square feet) without a three-hour top-up.

Overall, I like the approach of the two docking stations, especially the self-filling water station, which was surprisingly easy to install. I put that in my laundry room, where it is largely out of sight, and installed the compact auto-empty dock in my bathroom, where it fits neatly under my heated towel rack.

SwitchBot isn’t the only company offering the option of plumbing its docks. Narwal, Roborock, and Dreame all have plumbable options using their multifunction docks. However, these require power, whereas SwitchBot’s water station is a smaller, battery-powered device — which makes installation easier. (The battery is recharged by the robot.)

The S10 emptying its dirty water and refilling itself all from my sink. You can see the plumbing hookups here, too; it was a tight fit.

SwitchBot’s water station can hook into a number of water sources and drain lines — under a sink, by a toilet (draining into the bowl), or connecting to a dishwasher or washing machine supply line. The company’s Evaporative Humidifier (coming soon) can be refilled by the robot vacuum, and a dehumidifier that could empty itself into the S10 is being developed. This kind of smart home symbiosis is intriguing.

SwitchBot makes a wide range of smart home devices, from lights and locks to curtain motors and robot fingers. The S10 can work with all of these using the SwitchBot app to do things like dock the robot when the front door unlocks or start cleaning when the lights turn off. SwitchBot also supports Matter through its Hub 2, opening the door to more smart home integrations.

I was impressed at how easy the S10’s station was to install, considering I am not a plumber (nor is my husband, who gave me a hand with this one since he’s better with a wrench than I am).

On a scale of one to 10, one being installing a smart light bulb and 10 being plumbing a smart faucet, this is a seven. It’s totally doable if you have easy access to the pipes and can handle a wrench. We installed it under an open utility sink (so no cupboard to deal with), but the water outlets were tucked behind the sink, so it was tight getting in there.

That was the toughest part. SwitchBot’s YouTube installation videos were very clear (don’t bother with the paper directions), and the attachments SwitchBot provided to hook the pipes into the plumbing were easy to use and well made. However, the dirty water pipe attachment wasn’t free-spinning, resulting in some contortions to get it attached. The whole thing took 30 minutes.

I wasn’t thrilled with how much piping there was, and I couldn’t find a good way to hide it all under my open sink. SwitchBot provides cable-tidy fittings, but I would prefer to switch them out for shorter pipes or to be able to cut them to fit. Because the S10 is so wide, it wouldn’t fit under my sink, so I had to put the dock next to the sink, which isn’t ideal.

If you don’t have the option of hooking up to water, SwitchBot has a water tank add-on coming later this year for $80. I tried this, and while it worked fine, you still have to deal with all the piping, so it’s not an elegant solution. If you can’t use the plumbing option, the S10 is not the right robot for you.

As a robovac, the S10’s specs are good, if not the best of the best. It has 6,500Pa suction, a 4,000mAh battery, and AI-powered obstacle avoidance (which uses an onboard camera). While Roborock’s, Dreame’s, and Ecovacs’ flagships have higher suction power and longer battery life, they cost significantly more.

The S10’s obstacle avoidance wasn’t as good as Roborock’s, and while it dodged fake pet poop, it got close — brushing up against it as it navigated away. It did well at avoiding cables, socks, and larger items like shoes.

The bot’s single rubber roller brush performed well in my tests, getting up all the oatmeal and rice on hard floor and doing an excellent job on cat hair on low-pile carpet. Generally, I prefer dual roller brushes, but the S10 is a heavy robot, and its weight seemed to help it dig down into carpet fibers.

Mopping is the main reason to buy this bot, and it’s very effective. Its rolling movement agitates the dirt, and with 10N of downward pressure, the mop easily tackled dried milk and spilled OJ. The water station has the option to add a cleaning solution, which seemed to help with tougher grime. But the mop doesn’t oscillate as the pads on the Dreame X30 or Roborock Q Revo do, and some dirty paw prints were still slightly visible.

On the flip side, oscillating mopping pads can get hung up on things like rug tassels, cables, and room transitions, whereas the SwitchBot’s mop is tucked neatly under it, and it never got stuck. The mop lifts when it goes over carpet (but only by 7mm), or you can program the robot to avoid carpet altogether or draw keep-out zones around high-pile rugs.

The mop is easy to remove — it slides out the side, so I didn’t have to flip the big beast on its back. It stayed clean throughout my testing and dried quickly. The mop is replaceable, and a two-pack costs $30.

The SwitchBot app has the key features you’d expect from a high-end bot, including lidar mapping, virtual no-go zones, and room-specific cleaning. But it is missing a few things. There are two cleaning modes, each with multiple levels: vacuum and mop, and vacuum only. There is no mop-only option, and no smart cleaning modes. There’s voice control with all the main platforms, but it’s limited. It supports up to three maps, but you can’t add furniture to them. Additionally, on floors without a water station, the robot only vacuums — it won’t mop — and you can’t buy a second water station.

Overall, I was impressed with the S10. Its dual dock system is an innovative fix to the design problem of these big multifunction docks, and the possibility of connecting with a humidifier and dehumidifier takes us one step closer to a future where robots can do more in our homes than just clean the floors. But for today, the water hookup and self-cleaning mop make this the most hands-free cleaning robot I’ve tested.

The biggest downside is the short battery life, and I’m disappointed SwitchBot didn’t go for more milliamp hours, especially given the potential for using the water station for more tasks down the line. The self-refilling humidifier and self-draining dehumidifier are great ideas — if they get released. Maybe this ingenious robotics company could come up with an attachment for the robot that can refill your dog’s water bowl or water your plants. But all of that will require more power.

Photos by Jennifer Pattison Tuohy / The Verge

Mercedes-Benz won’t let Apple CarPlay take over all its screens

Apple CarPlay UI across three simulated Porsche in-car screens
Image: Porsche (via Car & Driver)

Mercedes-Benz doesn’t have any plans to adopt Apple’s immersive, next-generation version of CarPlay, the German automaker’s CEO said in an episode of Decoder.

“The short answer is no,” Ola Källenius told The Verge’s Nilay Patel in response to a question about whether Mercedes-Benz will enable Apple CarPlay to take over all the screens inside its vehicles. Instead, he touted the need for a “holistic software architecture” to meet the needs of customers who are increasingly looking for a better technology experience from their vehicles.

Apple announced its next-gen version of CarPlay back in 2022, with the phone-mirroring feature extending beyond the central touchscreen to additional displays like the gauge cluster. It was a bold move, with Apple signaling its desire to control core functions of the vehicle like HVAC, as well as the speedometer and odometer. But since then, the new CarPlay has yet to appear on any production models. Last year, Apple said that Porsche and Aston Martin would be among the first companies to adopt the new immersive display.

But Mercedes doesn’t appear to be in any rush to follow its luxury vehicle peers in letting Apple dominate the in-car experience for its customers. Instead, Källenius said that the company is working closely with Apple’s main rival, Google, in designing a new navigation feature that will build on Google Maps. The key difference there is that Mercedes’ own engineering team will be heavily involved in the process.

“I fundamentally believe that that holistic customer experience is best done by us, and we will serve you,” he said during the interview.

But Källenius said he still sees value in offering phone-mirroring services to his customers and has no plans to exclude their use — despite some in the auto industry turning away from them. Last year, General Motors made the controversial move to prohibit Apple CarPlay and Android Auto in its forthcoming lineup of electric vehicles, arguing that the company could provide a more comprehensive software experience than what exists on someone’s phone.

“We’re not fundamentalists to say, for some reason, we’re not going to allow a customer to use Apple CarPlay if that’s what they choose to do,” Källenius said. “So, we have Apple CarPlay. We have Android Auto. If, for some of the functions, you feel more comfortable with that and will switch back and forth, be my guest. You can get that, too.”

At the end of the answer, he reiterated his position that Apple’s next-gen CarPlay was a bridge too far for Mercedes. “To give up the whole cockpit head unit — in our case, a passenger screen — and everything to somebody else, the answer is no.”

Duel of the dual-screen laptops: Asus Zenbook Duo vs. Lenovo Yoga Book 9i

Their small variations in design make a big difference.

There’s no getting around owning a laptop these days, especially if we travel often or don’t have the space for a desktop computer. But as time has gone on, many of us have invested in an external monitor or two to better handle all the windows we need to have visible at the same time. We could arrange them all on one screen, but the smaller the laptop, the harder it is to read two windows side by side, let alone scroll through them. But who wants the hassle of traveling with a portable monitor? I sure don’t. So I tested the Asus Zenbook Duo and the Lenovo Yoga Book 9i head to head to see which one alleviates those issues best.

The Zenbook Duo looks like a regular laptop until you remove the keyboard and trackpad that cover the entire bottom screen. I enjoyed the look of awe on my friends’ faces every time I did that. It felt like performing a magic trick. The Yoga Book, meanwhile, looks like two tablets stuck together, and its keyboard hides only half of the bottom screen, so you know right away that it’s different. “What is that? Is that a laptop?!” is the most common response I received from people when they first saw it. These and other small design differences have a big effect on the overall experience of using them.

Design features: the Yoga Book 9i has more form factors

The dual screens give you lots of options. Both the Zenbook Duo and Yoga Book 9i can be used as traditional laptops with either a physical keyboard and trackpad (Zenbook) or a physical keyboard and mouse (Yoga Book); taking notes on the bottom display in clamshell mode; or with their displays oriented vertically or horizontally. They both also have virtual keyboards and trackpads.

For the dual screens to remain stable while upright, the Zenbook Duo has a kickstand attached to the bottom chassis, while the Yoga Book comes with a keyboard folio cover that transforms into a stand with a flat, triangular back and a thick lip at the bottom to keep the laptop in place.

A close up of the back of a laptop and its attached stand.
The Zenbook Duo’s attached kickstand.
The back of a laptop propped up by a stand.
The Yoga Book 9i’s keyboard folio when it’s folded into a stand.

Only the Yoga Book has a 360-degree hinge that rotates the displays back to back, so you can use it as a tablet. (The Zenbook Duo’s top display folds back 180 degrees.) It’s thick for a tablet, but when I need to walk around my classroom as I’m teaching, it’s far less unwieldy than the Zenbook. The Yoga Book is also lighter and thinner when folded, at 2.95lbs and 0.63 inches compared to the Zenbook’s 3.62lbs and 0.78 inches.

Neither of these laptops is great for using directly on your lap, though. The Zenbook Duo can get uncomfortably hot if the processor is running as fast as it can, and there’s a vent that blows hot air directly into your lap. The Yoga Book is fine temperature-wise, but the keyboard can easily shift and twist a few centimeters despite attaching magnetically over the bottom display.

Winner: Lenovo Yoga Book 9i

Tech specs: the Zenbook has more power and ports

The Zenbook can be configured with either an Intel Core Ultra 7 or 9 H-series processor, while the Yoga Book has only an Intel Core Ultra 7 U-series option. Both can be configured with 16GB or 32GB of memory, but the Zenbook offers larger storage options, 1TB and 2TB, compared to the Yoga Book’s 512GB and 1TB.

Both laptops have OLED displays, but the Zenbook’s are physically larger at 14 inches, with higher resolution and refresh rate: up to 2880 x 1800 at 120Hz, compared to the Yoga Book’s twin 13.3-inch, 1920 x 1200 at 60Hz. (The Zenbook also has a 1920 x 1200, 60Hz option.) The Zenbook’s displays get brighter, at 500 nits compared to 400.

The Zenbook Duo takes a Swiss Army knife approach with its port options: one USB-A, two Thunderbolt 4 USB-C, one HDMI, and even a 3.5mm combo audio jack. The Yoga Book has only three Thunderbolt 4 USB-C ports, so you’re more reliant on hubs, dongles, and Bluetooth if you connect a lot of accessories to your laptop.

Winner: Asus Zenbook Duo

A dual screen laptop open and powered on.
Zenbook Duo in dual-screen mode.
A dual screen laptop open and powered on.
Yoga Book 9i in dual-screen mode.

Dual-screen gestures: the Yoga Book’s work consistently

Both laptops use tap or swipe gestures to pull up and put away the virtual keyboard and trackpad, flick windows from one screen to the other, extend a window across both screens, and launch more than one app at the same time. The exact gestures — the number of fingers and the swiping motion — differ between the laptops, and Lenovo does a better job of teaching you how to use them.

The Yoga Book’s User Center software is one of the first things that pops up the first time you power on the laptop. It’s both a guide and a settings portal, with clear instructions and visuals, and it lets you turn any of the gestures on or off, including the option to automatically launch its bespoke notetaking app when you open Microsoft Teams or Zoom.

Top down view of two closed laptops side by side on top of a wood table.
The Zenbook Duo (left) is a smidge faster to put away.

Lenovo also recently added an app group launcher that lets you launch two apps at the same time, one on the top screen and one on the bottom. You can customize up to four pairs or let the computer do it for you, but the Zenbook one-ups the Yoga Book here: you can launch more than two apps at the same time and assign them to a specific Windows layout.

But when it comes to teaching you how to use all the gestures and features, the Zenbook tosses you into a pool and says “Swim.” Its instructions are buried within ScreenXpert, Asus’ equivalent to Lenovo’s program. Some of its gestures don’t work consistently, either, like the five-finger gesture for expanding a window across both displays, which would zoom in on the page at the same time because the laptop thought I was also using the two-finger gesture for zooming in and out.

Winner: Lenovo Yoga Book 9i

Stowing and traveling: the Zenbook Duo makes it easier

The 14-inch Zenbook and 13.3-inch Yoga Book are both compact enough to fit inside most bags, but the Zenbook is more convenient to stow because its lid folds right over the physical keyboard and trackpad, like a regular laptop. If the kickstand is popped out, just give it a firm push back in and — bada bing — you’re done.

For the Yoga Book, you have to detach the keyboard from the bottom screen, fold up the laptop and place it to the side, attach the keyboard to the magnetic portion of the folio stand (if not already attached), and then wrap the folio around the keyboard before you can put both in your bag. That doesn’t include finding a pocket inside your bag for the mouse that comes with the laptop.

Lenovo does win a few points for attaching an elastic band to the folio that holds the included stylus. The Zenbook includes a stylus, but you’ll have to figure out where you’re going to put it when you pack up your things to go to Narnia or wherever.

Winner: Asus Zenbook Duo

Two laptops laying flat side by side on a wood table.
It’s easier to type on the Yoga Book 9i’s (right) virtual keyboard.

Speakers and more: the Yoga Book brings the bass

Virtual trackpads suck, straight-up, and they suck on both the Zenbook and Yoga Book for the same reason: it’s too easy to accidentally minimize or close the active window when all you’re trying to do is “click” on a link or something else on the screen. This happened frequently when I was using either laptop.

Both virtual keyboards are fine for tapping out quick messages, but I prefer the larger surface area of the Yoga Book’s keys. I had fewer mistypes compared to the Zenbook, but I liked the Zenbook’s physical low-profile keys more. Their subtle, clicky sound and tactile feel were similar to the Yoga Book’s, but my presses traveled further and made me feel like I had more control over how fast I typed. (I’m a heavy-fingered typist.)

The biggest surprise was the terrible quality of the Zenbook’s Harman Kardon-branded speakers. They made my favorite playlist with a lot of bass-heavy songs sound surprisingly tinny. Spoken dialogue comes through loud and clear, but how they handled my favorite styles of music made me want to cry. The Bowers & Wilkins-branded system in the Yoga Book is well-balanced right out of the box.

Winner: Lenovo Yoga Book 9i

The Lenovo Yoga Book 9i is a better package deal

These laptops have different personalities and are appealing for different reasons. The Zenbook Duo is for people who want a traditional laptop with a variety of ports and the option to dual-screen drive if they want to. It’s for the people who want the best specs for the price and don’t want to stand out while using it.

But the Yoga Book 9i is the winner for me. It’s for people who want to get away from the traditional laptop form factor without losing all of its essential comforts and dual-screen drive all day long. It’s thinner, lighter, and prettier, and — while it takes a little longer to pack up — the keyboard folio doubling as a laptop stand is a clever design. The touchscreen gestures work consistently, and it’s clear Lenovo put a lot of thought into making sure it provided clear and easily accessible instructions on how to use them.

Photos by Joanna Nelius / The Verge

Fiido Air review: so lightweight you’ll forget it’s an e-bike

‘The world’s lightest city e-bike’ is worth a test, if not your $1,799.

Yes, that’s an electric bike, though you wouldn’t know it from looks alone, or from hoisting it up some stairs, since it weighs as much as a regular city bike at just 30 pounds (about 14kg).

What you’re looking at is the Fiido Air, a carbon fiber e-bike from the Chinese company I tested on a whim once, just to see what a $999 direct-to-consumer electric bike was like. Not great, it turned out, and its follow-up had a habit of breaking in two.

But hey, I’m a forgiving type, and the company did make amends to those affected. And Fiido says the Air is “the world’s lightest city e-bike,” with a “super early bird” price tag of just $1,799 at launch (or €1,799 in Europe) — rising to $1,999 and then $2,799 later, ahead of August shipments. That’s just too tempting not to test, especially when it costs half as much as the comparable Gogoro Eeyo.

And after spending more than a month with a Fiido Air as my daily rider, I gotta say — I’m impressed... so long as you ignore the app and the silly smartwatch it ships with, and aren’t afraid of doing a little wrenching and troubleshooting yourself.

The first thing you’ll notice about the Fiido Air is the battery — or rather, the lack of any visible trace of one, because it’s integrated into the slender frame. An integrated battery normally makes a bike a beast to carry, but unlike VanMoofs and some Amplers, this is a bike many people can still haul into an elevator or up a flight of stairs in a pinch, thanks to the liberal use of rigid, lightweight carbon fiber in the frame, front fork, handlebar, and seat post stem.

In fact, you wouldn’t know it’s an e-bike at all if it weren’t for the giant ON / OFF graphic that Fiido inexplicably chose to emblazon across the frame, as if its owner needs to be forever reminded of where that button is. The otherwise clean design is helped by internally routed cables.

The 250W Mivice rear-hub motor is paired with a Mivice torque sensor for an intuitive assist.

My bike arrived partially assembled in its shipping box. A spacer for the front axle assembly was jammed into the packing materials, however, causing me to overlook it when I assembled the front wheel and handlebars. I could tell something was wrong and eventually sorted it out with the help of Fiido support, but less experienced bicycle owners might have just lived with the slightly noisy, slightly wobbly, and potentially dangerous assembly.

My European Fiido Air is fitted with a 250W Mivice rear-hub motor and Mivice torque sensor (as you’d expect in this price range) to make the pedal-assisted power feel more natural. It also features plenty of off-the-shelf parts that should help make it easy to service at any local bike shop. That’s not always the case with Fiido’s cheaper e-bikes that use parts not widely available outside of China (I once had a terrible time finding brake pads). The Fiido Air uses Shimano BR-MT410 hydraulic brakes, a Velo saddle, and a Gates Carbon Drive CDX belt drive, with the latter rarely needing servicing unless your bike is shipped with a loose belt, like mine was.

Tightening the belt isn’t difficult, but it’s also not intuitive. Still, it’s never nice to spend $2,000 and find that your transmission slips with a loud clunk when stepping hard on the crank to quickly cross the street against oncoming traffic. I also recently had to lubricate the bottom bracket (where the crankset attaches to the bicycle) after the pedals began making a horrible creaking sound on each downward stroke. Both fixes were relatively simple, but neither is something that’s usually required after just a few weeks of riding.

The Air is equipped with a fingerprint sensor that’s surrounded by a colorful light ring. To prevent people from riding off with the e-bike after hitting the well-labeled ON/OFF button, the motor can be configured to unlock with the fingerprint sensor. This worked surprisingly well 99 percent of the time. It worked fine in light rain, so long as I was able to dry it off and shield it, but I once tried to unlock it in a heavy downpour, and no amount of wiping allowed the sensor to recognize my finger. That meant opening the app to unlock the motor.

The app is... terrible and should be avoided at all costs. Fortunately, it can be abandoned for day-to-day use, but not until you suffer through it for initial setup, and then occasionally to check the battery level — which seems to be off by as much as 20 percent — since there’s no indication of it on the bike itself. It’s a shame Fiido didn’t repurpose the colored ring around the fingerprint sensor as some kind of battery indicator.

After the fingerprint sensor unlocks the bike, more taps will steadily increase the power assist with corresponding rings of color — yellow, blue, a slightly brighter blue, and green — to show the current selection. Unfortunately, the bike doesn’t remember your preferred setting when turning it on and off. A quick double tap on the sensor turns the integrated running lights on and off.

Fiido ships the e-bike with a cheap plastic-y Fiido Mate smartwatch, which is just laughably bad. It can be used to unlock the motor or as a dashboard on your wrist — but can’t be easily attached to the frame. After testing it once I never used it again. I already wear an Apple Watch, but there’s no app for that.

The Fiido Air puts the rider in a very aggressive and sporty position, with an awkward hand position that’s less than ideal for long commutes or casual city riding. But it is fun! The pedal assist is delivered smoothly, intuitively, and very quietly, but the motor’s modest 40Nm of torque makes this single-speed e-bike best suited for mostly flat commutes. Out of the box, the Fiido Air has a 15.5mph (25km/h) top speed that shoots up to 18.6mph (30km/h) with a simple (and often illegal) software setting.

Fiido says the Air can go up to 80km (about 50 miles) on a single charge, which is wildly optimistic for its 209Wh non-removable (but serviceable) battery, though it may be doable in the lowest power setting (I always tested in max). In my testing, pedal assist was already noticeably degraded after around 40km (25 miles) of riding. Fiido also sells an optional bolt-on range extender that you can take inside to charge from Fiido’s relatively small charging brick.

For what’s supposed to be an e-bike for cities, it ships without a kickstand, bell, or any mudguards, which means a rooster tail of spatter on your back if you get caught in the rain. It does have attachment points for front and rear fenders, though, if you decide to go that route. It also comes with Kenda 700 x 40c tires that look better suited for gravel than city streets.

Overall, I’ve really enjoyed using the Fiido Air as my primary city ride for the last six weeks and change. For $1,799, it’s a good deal for anyone looking for a nicely designed and lightweight e-bike. For $1,999, it’s still worth a hard look, but at $2,799 I’d consider other options first.

All photography by Thomas Ricker / The Verge

Battle of the best robovacs (that iRobot doesn’t make)

We put the Roborock S8 MaxV Ultra head-to-head against the DreameBot X30 Ultra to find out which of these Roomba competitors’ flagship robot vacuums is the best.

There are an absurd number of robot vacuums available today, but based on my testing of dozens of bots, just a handful of manufacturers are leading the pack when it comes to innovation, choice, and really good cleaning machines. These include Roborock, iRobot, and Dreame. Each has recently released a new flagship model: the Roborock S8 MaxV Ultra, the DreameBot X30 Ultra, and iRobot’s Roomba Combo j9 Plus.

I’ve reviewed the Combo j9 Plus, and I still recommend Roombas if you’re looking for either a high-end robovac or a budget bot, in large part due to their repairability, ease of use, and reliability. But the competition is getting very good, and with iRobot’s future looking shaky following its breakup with Amazon, I figured it was time for a deeper dive into its strongest competitors. Here, I pit the X30 Ultra against the S8 MaxV Ultra to see which one is best.

The Roborock S8 MaxV Ultra ($1,799.99) is a robot vacuum and mop with a charging dock that fills the robot’s onboard water tank, cleans and dries its mop pads, and empties its onboard dustbin. It features a whopping 10,000Pa of suction and a camera for obstacle detection and avoidance. Its mop vibrates up to 4,000 times a minute to scrub your floors and raises up to 20mm to avoid carpet.

The S8 MaxV has a new flexi arm that pushes its spinning side brush out further to get into corners better and a side mop that helps clean along edges. A new on-device voice assistant can take direct commands, so you don’t need to use the app or a third-party speaker to control the robot (although it works with Alexa, Google Home, and Siri Shortcuts). It’s also one of the first robot vacuums that will support Matter, although that feature hasn’t been turned on yet.

The DreameBot X30 Ultra ($1,699.99) has many of the same features as the S8 MaxV Ultra, including a charging dock that auto-empties, washes the mops, and fills the robot’s water tank, plus a camera for obstacle detection. It has 8,300Pa suction and uses dual spinning mop pads that it can automatically remove when it vacuums — my favorite feature. It can also lift the mops if needed (up to 10.5mm).

Uniquely, the Dreame can extend its mops out to reach baseboards and even under low furniture, as far as 4cm; this is surprisingly effective at getting up grime from edges.

I let these two bots battle it out in my home over 10 days, testing their cleaning prowess, mopping chops, navigation skills, and unique features — such as an arm and mops that do the splits. I also evaluated the design and usability of their multifunction charging docks and how well they meet their promise of hands-free cleaning. I put their companion apps through their paces, diving into all the settings and features these machines offer in their quest to clean your floors. Read on to find out which one came out on top.

Dock design and function: bigger is beautiful unless you can plumb it

Despite being bigger, the Dreame’s dock (left) looks better.

While Roborock has redesigned its dock into something smaller and more aesthetically pleasing (it was the first to release a multifunction dock, and those early days were characterized by hulking monstrosities), it’s still one of the ugliest out there. Dreame, on the other hand, has perfected the stylish dock look, and while it’s bigger than Roborock’s, it’s much prettier.

Dreame’s dock is also slightly more functional. While both models will wash the mops with hot water and dry them with heated air, which helps deal with the smell and mess, Dreame has little wipers that clean the mop area for you, whereas Roborock’s mop tray needs manual cleaning. However, Roborock offers the option to connect directly to your plumbing, doing away with the bulky water tanks entirely. You do need to buy a specific model for this, which costs $100 more. Dreame sells an add-on kit for its existing model with this function, but it’s only available in Asia. A North American model — the X40 — is coming later this month, but it costs $1,900.

Winner: Tie

Navigation and obstacle avoidance: they both dodged the poop

The Roborock stares down the fake poop and goes on its way.

Both models use lidar to map and navigate your home. They both mapped the house quickly and accurately and responded correctly to requests for room-specific cleaning and zone cleaning — meaning they didn’t get lost. These robots both have front-facing cameras for AI-powered obstacle avoidance, and they both nimbly avoided fake dog turds, socks, shoes, and bundles of cables.

However, each had weak spots. The Dreame successfully sucked up a pile of Cheerios, which the Roborock thought was an obstacle, but the Dreame got stuck on a stray iPhone cable that the Roborock dodged. Roborock also loves to eat pencils. In the end, though, they were both rarely derailed compared to non-camera-powered robots I’ve tested, and that’s the biggest benefit of AI-powered obstacle avoidance unless you regularly let your pet poop in your house.

This is the first Roborock since the excellent S7 MaxV Ultra to feature a camera for object detection (all the other models use 3D obstacle detection, which is not as effective). But Roombas with the same feature are still the best at knowing what’s in their way and successfully avoiding it or cleaning it up when necessary. Also worth noting: if you have a bed skirt or fabric around your sofa, lidar-powered robots will see it as a wall, whereas a VSLAM-powered model, like the Roombas, will push right through and clean under your bed.

Winner: Tie

The S8 MaxV Ultra’s robot arm reaches out to get debris out of corners.

Vacuuming power: Roborock sucks hardest and has an arm …

Both bots have super suction power and did an excellent job getting up every last bit of larger debris, such as rice and oatmeal, on hard floor. But Roborock’s dual-brush system did a better job on carpet, and its rubber roller design means less hair tangle. Dreame sent me its new $50 anti-tangle tri-cut brush (sold separately) that cuts the hair, and I didn’t have to deal with any tangles, which was nice. But the Roborock was tangle-free without buying an extra accessory, and its dual brushes did better at getting dirt and hair up off the carpet.

Roborock’s flexi arm is also a great upgrade. It’s designed to help the bot clean corners better by reaching the spinning brush out to swipe up the dirt. I have seen this in action at CES, but it happens in the blink of an eye, and despite spending a lot of time hovering over the bot, I never actually saw it work in my home. But the debris I put in the corners to test it was gone, so I guess it worked?!

The Dreame (left) has a single roller brush, whereas the Roborock has two rubber brushes that are better at getting dirt off carpet and sucking up messes in one pass than the Dreame.

Auto-cleaning modes are a new feature I’m starting to see on high-end robots. They eliminate the bother of having to set specific cleaning modes for different rooms — such as cleaning the kitchen and entryway twice but the dining room once. Both Roborock and Dreame have versions of this AI-powered cleaning mode. Dreame calls it CleanGenius, and Roborock’s is SmartPlan. I found them both very useful for just hitting go and not having to plan the route but still ending up with spotless floors.

These modes also turn on a feature that sends the robot back to clean areas it determines need more attention. This was hard to test effectively in the time I’ve had with them, but it’s an interesting feature I’ll be keeping an eye on. Anything that involves less of me spending time with an app and more of the robot doing things on its own is a good thing.

I really liked Roborock’s “Recommended Routines,” personalized cleaning sequences that again mean less programming by you. There’s an After Meals one that tackles the kitchen and dining room and a Pet Supply option for cleaning around pet food areas (the robot can identify pets, pet beds, and pet bowls), along with a few other useful options.

Winner: Roborock

Mopping prowess: Dreame’s mop moving and mop removal is genius

The Dreame can push its mops out to scrub baseboards and also swing the robot’s body to extend the mops further to get under things like my dishwasher here.

Dreame’s auto-detachable mop pads are still the best way I’ve seen to deal with the “how does a robot mop and vacuum without messing up your carpet” conundrum. When it’s cleaning carpet, it goes back to its dock, takes off its mop pads, then goes and vacuums. Genius. It can also raise its mop to about 10mm if needed to save time, so it can still traverse carpet to mop rooms further away. Roborock’s mop isn’t detachable, although you can manually remove the pad itself. It does lift a lot higher, up to 20mm, but there’s still a chance of contaminating high-pile carpets unless you tell it to avoid carpets.

Dreame’s dual oscillating mop pads also do a better job of getting wet messes off the floor than Roborock’s single flat pad. While Roborock’s mop vibrates up to 4,000 times a minute, Dreame successfully removed all the dried ketchup and OJ in my tests, whereas Roborock left a trace behind.

The other thing Dreame does very well is clean baseboards and edges. It uses a “MopExtend RoboSwing” technology that extends its mop out to reach the baseboard and also swings the robot toward the edge to push the mops under things like my fridge and dishwasher, getting the grime that other cleaning methods miss. Roborock’s Extra Edge Mop system, new on the S8 MaxV Ultra, does give the bot a bit more mopping reach — a small spinning mop pad extends slightly out from the right of the robot, but it’s not a patch on the Dreame.

Winner: Dreame

Apps, video cameras, voice control, and Matter, oh my!

These high-end robovacs have a dizzying amount of features accessed through their apps, which is where you set up the map (naming rooms and adding furniture to help the robot understand your home better). This was easy to do on both, and they have very similar apps.

However, Roborock’s app is more refined, more stable, and slightly more user-friendly. Both have so, so many settings menus to dive into to customize everything from how often the bot washes its mop and when it empties its bin to which direction it cleans your hardwood floors (yes — you can select “along the grain”). But Roborock makes it easier to get to what you need. It also never crashed on me, whereas Dreame’s often showed the robot offline or made me wait a while before I could access it.

One neat feature is that both can act as roving home security cameras. Roborock even claims it can go look for your pet — although it failed to find my 80lb pup when he was sitting right in front of it. To be fair, it was dark, and he looks like a rug. You can also drop in on the robot’s camera and see and talk to people in your home — yes, that’s as weird as it sounds, but there could be a use case. The camera feature is not enabled by default on either Dreame or Roborock and requires a set of actions and a code to access it remotely.

The Roborock on patrol for a pet. It didn’t spot my dog here, but it can be set to snap pictures of your pet whenever it sees them.

Only Roborock has built-in voice control, a new feature with this model. The wake word is Hello Rocky, and it worked very well, responding promptly and understanding my commands. You do have to wait a beat after activating it to say the command, which takes a bit of getting used to. Dreame can respond to voice commands from Alexa, Google Home, and Siri shortcuts (as can Roborock), but Roborock’s single-purpose assistant makes for a much better experience.

Hello Rocky gave me much more control than any of the third-party integrations. I could ask it to empty the bin, skip here, stop drying, and more, along with all the standard commands like clean the kitchen and go back to the dock.

Finally, Roborock supports Matter, which gives it an edge. While none of the major smart home platforms support robot vacuums in Matter yet, most have said they will soon. The fact that Roborock’s S8 MaxV Ultra is already Matter-certified means you’re ready for that future if and when it arrives. Dreame has said it will support Matter in its newest vacuums but has not made any announcements about the X30.

Winner: Roborock

Which bot’s the best?

The Dreame X30 Ultra (left) and the Roborock S8 MaxV Ultra are both impressive robot vacuum mops.

Both robots perform exceptionally well at mopping and vacuuming, and their all-singing-all-dancing docks make floor maintenance virtually hands-free. But the Roborock beats the DreameBot overall thanks to its superior vacuuming performance, easier-to-use app, and built-in voice control. Its dual roller brushes, side brush, and 10,000Pa suction demolished all the dry dirt in my tests. And while the Dreame is better at mopping, the Roborock is still very good.

If mopping is what you really want, the DreameBot’s oscillating mops do a better job with wet spills and dried-on gunk, like ketchup. The mop removal feature meant I didn’t have to worry about my white, high-pile carpet at all. If you have a lot of carpet or high-pile rugs scattered around your home or prefer the nicer-looking dock, Dreame may be a better choice, but otherwise, the Roborock will suit you very well.

If you are sold on these bots but can’t stomach the price, both brands have cheaper models that do almost as much. The Roborock S8 Pro Ultra costs $1,600 and has lower suction power, no camera (so no AI-powered obstacle detection), and no voice assistant or Matter. Dreame’s previous flagship model, the L20 Ultra, is currently $1,500 and slightly better in a few areas. It does have lower suction power but can remove its mops and extend them (though not as far as the X30). However, its auto-emptying wasn’t as reliable.

I should note that Dreame has just announced the X40 Ultra, which will be available for an eye-watering $1,900 and will have a model with a direct water hookup. The X40 also adds a flexi arm — just like Roborock’s — and 12,000Pa of suction. But it still only has one roller brush, and the brushes are key to cleaning. Also, yes, I do think these robots are breeding.

Is Crossrope’s smart jump rope worth $200?

Photo by Sheena Vasani / The Verge

Skip Crossrope unless you really love skipping rope.

Like everybody else, my New Year’s resolution was to work out more. After moving to a new city, I fell out of my workout routine, and it didn’t help that the gym chain I belonged to was now a 30-minute drive in Los Angeles traffic.

So I started researching workouts I could do from home. Jump roping is fun and a great, full-body cardio workout that can also improve agility and coordination. So when I heard the $199 Crossrope AMP Jump Rope Set would quantify the experience and help me incorporate strength training into my routine with its weighted ropes, I was intrigued.

After testing the set for a month, I can confirm few jump ropes are as well-made as Crossrope’s, and its workouts and community offer a lot of value for jumping enthusiasts. Yet, at $199, plus a $12 monthly subscription, it’s only for those committed to jumping consistently — not casual users.

The Crossrope AMP Jump Rope set box surrounded by its three green, gray, and white weighted jump ropes, with the AMP handles attached to the green one.Photo by Sheena Vasani / The Verge
The Crossrope AMP Jump Rope set comes with a set of Bluetooth-connected handles and three different weighted ropes.

The Crossrope system, which has been around since 2013, consists of interchangeable handles, ropes, and ropeless jumping attachments in a variety of weights from three ounces up to five pounds. The AMP set that I tested comes with a set of Bluetooth-connected handles plus quarter-pound, half-pound, and one-pound ropes.

The ropes and handles are built from strong materials and connect with steel clasps. They feel made to last, but unlike most jump ropes, each rope is a fixed length — you can’t adjust them. They come in six different lengths, but I tripped a few times despite using the size Crossrope recommended for my height. While I began to trip less as I improved as a jumper, when I asked the Crossrope community for help, several members acknowledged they had had the same issue.

A hand holding a set of black jump-rope handles with green squiggly lines, steel interconnects, and a green rope connecting them.Photo by Sheena Vasani / The Verge
Crossrope’s handles feature steel clasps that make swapping out ropes really easy.

The AMP handles are what turn this from an expensive modular jump rope system to an expensive modular smart jump rope system. The Bluetooth-enabled handles connect to iOS and Android devices, allowing you to track jumps, streaks, power output, speed, and calories burned from the companion app. If you connect it with your Apple Watch, you can also import your heart rate data. It’s difficult to judge how accurate these stats were, but Crossrope correctly counted my jumps for the most part, and the other numbers didn’t seem like a stretch.

But that information comes at a price: $11.99 per month. That’s right: along with forking out $199 for the set (or $99 for the handles if you already have Crossrope ropes), you also have to pay a monthly fee to get any value from the smart handles. Even the jump counter is paywalled. That fact was — and still is — jarring to me and is the biggest downside to the set.

A screenshot of Crossrope’s app listing for a workout to strengthen your core, with a 3D avatar of a personal trainer performing crunches.
Crossrope’s workouts incorporate other exercises besides jumping, like crunches for those wanting to strengthen their core.
A screenshot of Crossrope’s curated Spotify playlists.
Crossrope curates Spotify playlists by beats per minute, which was helpful for when I needed extra motivation.

That said, you’re not paying just for metrics. Along with a helpful Facebook community of nearly 100,000 people, Crossrope includes an app with over 2,500 workouts created by its personal trainers and on-demand classes taught by instructors popular in the jumping world. Jumping rope is obviously the focus, but the custom workouts also include other exercises like squats and dumbbell lifting. There are also longer programs focused on specific fitness goals, from burning fat in, say, six weeks to improving endurance. If you don’t like any of the options, you can also create your own workout, which was helpful when I required a slower pace.

I appreciated how well thought out the workouts are, with a timer included for each set and rest sessions. Crossrope’s own programs even feature Spotify playlists curated by beats per minute geared for different rope weights and speeds. Unlike, say, Apple Fitness Plus or Fitbit Premium workouts, Crossrope also displays a (weird) 3D avatar of the trainer performing the same exercise in real time, which helps with form. And unlike Apple’s and Fitbit’s programs, you can even message Crossrope’s trainers with questions for a more personalized experience.

A screenshot of a 3D version of Crossrope’s personal trainer jumping rope in real time during a workout.
Watching a 3D version of Crossrope’s personal trainer exercise in real time with me was simultaneously helpful and bizarre.

But we have to address the elephant in the room: the Crossrope AMP costs $200, plus $12 a month. It exists in a niche market with little direct competition, but it also exists in a world with a lot of cheaper jump ropes. To pull an example almost at random, the Te-Rich Smart Weighted Jump Rope I found on Amazon costs $17 and has a built-in LCD display with a timer and jump counter, while the YaoYao app also tracks jumps and time and only costs $0.99 per month (or $10 for a one-time unlock). Both also estimate calories burned, and YaoYao also lets you set the length of workouts and rest sessions and compete with others via a leaderboard.

A hand holding the Te-Rich Smart Weighted Jump Rope’s pink handles, with one handle featuring a built-in LCD display with a timer and jump counter.Photo by Sheena Vasani / The Verge
The Te-Rich Smart Weighted Jump Rope features a built-in LCD display with a timer and jump counter. It also comes in fun colors, like pink.

While YaoYao often overestimated my jumps, the Te-Rich Smart Weighted Jump Rope’s stats were consistent with Crossrope’s, and sometimes even counted my jumps more accurately. The flimsy 9.8-foot PVC rope tangles easily, but that’s forgivable at this price, especially as the rope is adjustable. The Te-Rich lacks custom workouts, on-demand video classes, and community, but you can find similar ones online. In fact, some on-demand class instructors offer their own YouTube channels. Plus, you can always use the free or paid versions of Crossrope’s app without the AMP handles if you want the workouts and don’t mind losing the jump counter, personalized targets, benchmarks, and leaderboards.

A wrist wearing the Apple Watch Series 8 with the YaoYao app open, displaying heart rate, timer, speed, and (incorrectly) the number of jumps.Photo by Sheena Vasani / The Verge
YaoYao thought I jumped 22 times when the real number was closer to 14.

The most effective workout is the one you’re going to stick with. If a smart jump rope with guided workouts and an encouraging community makes it easier for you to exercise consistently, Crossrope is worth it. It’s overpriced, but it’s also smaller and cheaper than other home gym equipment I considered, like treadmills. Crossrope’s 60-day return policy also means you can get your money back if you decide you’re not going to use it enough to justify the expense.

I enjoyed my time with the Crossrope. It helped put some of the fun back into fitness for me. But I don’t think jumping will replace jogging and walking as my primary cardio workout — though it’s a fun accessory — so I won’t be buying the Crossrope AMP once I send the review unit back. The Te-Rich didn’t come with a bunch of workout programs or a Facebook group or track my heart rate, but it still gave me a rough idea of jumps and calories burned and didn’t cost $200.

The OnePlus Watch 2 is what redemption looks like

OnePlus could be a strong alternative to Google or Samsung for Wear OS smartwatches.

At the end of February, a large package arrived on my doorstep. Inside were 11 boxes containing the same version of the $299.99 OnePlus Watch 2. My eyes watered and I whispered, “Not again.”

This was a shipping accident. My box had 10 more watches than I needed for a review. It happens and normally doesn’t hold any larger meaning. But I was nervous because the original OnePlus Watch was by far the worst smartwatch I’ve ever had the misfortune of reviewing. Everything that could go wrong did. The fitness and health tracking deserved the Pulitzer Prize for fiction. Troubleshooting the buggy software was a nightmare. That whole abysmal experience was seared into my memory. So when OnePlus reached out to say it was making a second watch — and that this one was markedly better — I was hopeful. And then a box of 11 smartwatches arrived on my doorstep.

Still, it wouldn’t be fair to let past mistakes color my opinions of a new watch. I took my time to get to know the OnePlus Watch 2 on its own merits. So when I say this watch is not only competent but also pretty good, I really mean it.

It works!

The bar for the OnePlus Watch 2 was low. All it had to do was be better than its craptacular predecessor. It’s been a while, so I reread my original review to refresh my memory. To beat the original watch, this one only had to:

  • Record reasonably accurate activity and health data. Last time, it recorded 15,314 extra steps compared to a control smartwatch.
  • Accurately sync data between the phone and watch. If I took a mile-long walk, it needed to log one mile on my watch and phone. The data for steps, heart rate, distance, etc., had to also match. The last OnePlus Watch continuously gaslit me by showing radically different metrics on the phone versus the wrist.
  • View historical sleep data in the app and not just on the wrist. The last one couldn’t do this.
  • Deliver push notifications in a timely manner. Not 40 notifications four hours later, all at once.
Person looking at OnePlus Watch 2 on their wrist
The OnePlus Watch 2 has a novel dual chip and dual OS structure.

I’m genuinely chuffed to say wearing the watch these past few weeks was a healing experience. OnePlus did all of that and then some. Whereas the original watch was basically a fitness tracker, this is a genuine smartwatch with a dual-processor architecture, including the latest Wear OS chip and a novel dual OS to prolong battery life.

By upgrading from a proprietary OS to Wear OS 4, the watch delivers a much richer overall experience. I can now access third-party apps from the Play Store. There are multiple music apps to choose from, including Spotify and YouTube Music! There’s contactless payments! I can turn off my smart lights with Google Assistant. That’s huge considering almost no third-party Android smartwatches have launched with Google Assistant since the switch to Wear OS 3. This is the stuff you’d expect from a proper flagship.

Build quality is also better. The original watch was a pretty screen with chintzy, plasticky materials. This has stainless steel and sapphire crystal. The silicone strap is much thicker. The 1.43-inch OLED display is pleasing to look at. Scrolling through screens is smooth, and colors are crisp. I wish screen brightness went above 600 nits — it can look washed out in direct sunlight — but that’s a quibble.

I don’t have any complaints with health, activity, and sleep tracking, either. I wore the OnePlus Watch 2 alongside the Oura Ring, the Garmin Forerunner 165 Music, the Apple Watch Ultra 2, and a few other smart rings. I saw some normal minor discrepancies but nothing to write home about. The OnePlus Watch 2 also adds dual-frequency GPS. It’s a common addition to more premium or rugged smartwatches these days but mostly translates to slightly more accurate GPS data in challenging environments. My results in testing were quite similar to the Ultra 2 and my phone, which both have dual-frequency GPS. Good stuff if you’re the outdoorsy type.

Side view of the OnePlus Watch 2
There’s a new shortcut button that launches workouts, along with a digital crown.

But while these are massive improvements, it’s not perfect. While I liked the addition of a new shortcut button, the “digital crown” isn’t really a crown. It looks like one and spins like one, but it doesn’t actually scroll. It’s functionally a button. The watch, like many newer Wear OS watches, supports Android only. There’s no cellular capability, either, which stinks if you want to leave your phone at home for a run. Likewise, the OnePlus Watch 2 doesn’t have fall detection, EKGs, native period tracking (you could download a third-party app), or body temperature tracking. Most of these omissions aren’t the end of the world if all you want is basic activity tracking. It just means this isn’t a watch where you can comfortably leave your phone at home.

Multiday battery life

The OnePlus Watch 2 has some fancy stuff going on under the hood that translates to excellent battery life. The gist is you’ve got two processors — the Qualcomm Snapdragon W5 and the BES2700 MCU. The W5 handles the power-guzzling tasks and runs Wear OS 4. The BES2700 runs background tasks using a proprietary OS. Stuff gets handed off between the two, and the result is long battery life. It helps that there’s a 500mAh battery, too.

Look at OnePlus’ mobile app with sleep chart pulled up
I wasn’t able to view historical sleep data in the app with the original OnePlus Watch. Not a problem here.
Dashboard view of OnePlus’ OHealth app
I was so pleased to see that my data actually synced properly. The bar was low.

How long depends on your usage. If you turn the always-on display off, keep notifications to a minimum, and exercise about 30 minutes with GPS on, you can get several days. The most I got was about four days before a power saving mode kicked in, and then about one more day after that. That’s very good, and it’s longer than what you’ll get on an Apple, Samsung, or Google smartwatch. With the always-on display turned on, I got closer to 1.5 to two days. That’s standard, though not too shabby.

In power saving mode, the watch loses Wear OS, but you can still receive notifications and track health / activities. OnePlus says you can get around 12 days. I never used the watch in power saving mode only. That sort of defeats the purpose of having a flagship smartwatch, but it’s nice to have if you forget your charger at home.

To get excellent battery life, OnePlus made one big design tradeoff. This watch only comes in a single 47mm size.

Close-up of OnePlus Watch 2 on wrist
The 47mm watch does look quite beefy on my wrist.

That 47mm watch case is why you can stuff in a 500mAh battery. But it bumps the weight to 80g with the strap. I’ve got petite wrists. I felt gravity’s pull on the watch whenever I ran. I really felt it when I wore my leather jacket. The watch was so chunky, I almost didn’t have enough space to pull my wrist through the cuff.

If you’ve got bigger wrists, this won’t be an issue. But there’s no other option for everyone else. That’s a bummer. Most flagship smartwatch makers offer at least a small and big size. They’re making the same tradeoffs — the smaller ones are usually more comfortable, but the bigger ones have better battery life. Consumers understand that, and most are happy to choose the tradeoff that best suits them. Here, that choice has been made for you.

Filling a void

OnePlus has a real opportunity here to take over as the “default alternative.” It’s the first Wear OS 4 watch that isn’t made by Samsung or Google, and it trounces them both in battery life. Unlike its predecessor, it nails the basics. At $300, it’s competitively priced. While there’s room for improvement, it’s well suited for folks who want something stylish without too many bells and whistles.

I actually held onto the 10 extra OnePlus Watch 2 smartwatches during testing. Part of me was afraid that the one I opened would be riddled with bugs. This way, I wouldn’t have to request alternate units. I’d be able to definitively tell if one unit was flawed — or if the watch was yet another unmitigated disaster. But I never needed to open a second watch. Now that my review is done, I can see how silly I was being. This watch is the exact opposite of its predecessor. For once, that’s a good thing.

Correction, April 17th, 2024, 5:08PM ET: A previous version of this review noted the side button wasn’t customizable. It is. We regret the error.

The internet really is a series of tubes

An image of a cable repair ship, on top of The Vergecast logo.
Photo by Go Takayama for The Verge

Hundreds of cables. Hundreds of thousands of miles. The internet runs, in vastly more ways than we realize or think about, through a series of garden-hose-sized tubes on the ocean floor. Without those tubes, the modern world sort of collapses. And the folks responsible for keeping them functioning have bigger, harder, stranger jobs than you might think.

On this episode of The Vergecast, we talk to The Verge’s Josh Dzieza, who has been reporting on the undersea cable world and just published a feature about some of the folks who keep it running. It’s a story worthy of a high-seas action movie, and it’s all about cables.

Then, we chat with The Verge’s Tom Warren and Joanna Nelius about the new generation of PCs that Microsoft and others seem to think are going to be huge improvements over anything we’ve seen before. Can Qualcomm finally make the PC chip we’ve been waiting for? Is this really, actually, finally the year of Windows on Arm? What the heck is an AI PC? We cover all of that and more.

Lastly, Alex Cranz joins to help us answer a hotline question about e-readers. Because it’s always interesting times in e-readers.

If you want to know more about everything we discuss in this episode, here are a few links to get you started, beginning with Josh’s story on undersea cables:

And on AI PCs:

And on e-readers:

Anker’s latest Soundcore Sleep earbuds actually improve slumber

The Soundcore Sleep A20 are decent passive earbuds that are great for side sleepers, even if Anker overpromises.

“Sleep when you’re dead” was the rallying cry of my youth. But now, in the soft haze of dull middle age, I feel like I’ll die without enough sleep. That’s why I took interest in the new Sleep A20 earbuds from Anker’s Soundcore brand, which promise “pressure-less comfort for side sleepers.”

I, like many, fall asleep listening to podcasts. It’s either that or let a three-pound hunk of fat and neurons lodged in my skull harass me about the future. But my Apple AirPods Pro, like most true wireless earbuds, are too big for comfortable side sleeping, so I only wear one and swap them throughout the night as I toss and turn in fits related to some undiagnosed sleeping disorder.

And since they’re designed as sleep aids, the A20 buds offer lots of sleep-focused features like “unmatched noise blocking” and noise masking to “silence common disturbances such as snoring,” according to Anker.

But not really.

It’s important to understand that Anker doesn’t offer any active noise cancellation to silence snoring or chatty neighbors. The Sleep A20 buds block all external sounds passively by fitting snugly inside the ear, just like regular ol’ earplugs. That’s partly why the company can charge just $89.99 at launch and still claim up to 14 hours of continuous white noise to mask sounds or 10 hours of audio listening before needing a recharge.

The app lets you switch between two listening modes: Bluetooth audio and sleep sounds. The former is for listening to podcasts, music, or anything else you’d like to stream, while the latter gives you access to dozens of very lifelike sleep sounds grouped by water, nature, life (trains, airplanes, and such), and meditation — I particularly like Rain on Tent. You can also double-tap a bud to switch between listening modes and configure them to keep playing audio all night or until you fall asleep. This is done manually (via timer) or automatically, which I found to be too unreliable.

The A20 buds also include a variety of masking sounds. You can play with a multitude of sliders to mix white noise with seven other colors and two types of snore-masking tracks. It didn’t really work when I attempted to mask a variety of snoring sound effects playing on a nearby speaker. While it did diminish the snoring by layering on less annoying sounds, it certainly didn’t live up to the claim of silencing common disturbances. It also didn’t silence barking dogs or drunken frat boys passing below my bedroom window, I came to find out.

The Sleep A20 buds in and out of their case. They come with multiple ear tips and wings to dial in your correct size.

In my side-by-side testing, the AirPods Pro with noise cancellation enabled and playing music did a noticeably better job of neutralizing those disturbances than the Sleep A20 buds also playing music. But I can’t sleep on my side wearing Apple’s AirPods Pro buds (they also cost more than double the A20s during Anker’s discounted launch period).

Nevertheless, I have to say that for my needs these buds are a game-changer. Although I suffered a bit of mild discomfort the first week of wearing them, sleeping with the A20 buds on my side now feels normal — as does inserting them with a push and a twist and then digging them back out each morning (they’re snug!). I do have to micro-adjust the pillow-to-ear angle occasionally for optimal comfort, and the bud facing the pillow will often just mute itself due to the pressure, which means listening to audio from just one ear. But the end result is that I’m sleeping longer and waking up less frequently. And, anecdotally, I feel better rested.

According to sleep data measured by my Apple Watch Ultra, I’m now averaging 7 hours and 14 minutes of sleep time for the two weeks I’ve been testing the A20 buds, up from 6 hours and 50 minutes for the two weeks prior (wearing AirPods Pro) with slightly improved deep sleep. Other sleep tracking data is about the same.

Screengrabs from the Soundcore app showing (left) available noise masking sounds and (right) data collected by Anker’s sleep algorithm showing me rolling over 45 times... my poor wife.

Anker also offers sleep tracking data in the Soundcore app, including novelties like Position (left or right side) and Roll Over (times I’ve switched sides). Unfortunately, the data is only available to view when my iPhone is paired with the buds in my ears. It says I’m predominantly a left-side sleeper facing away from my partner, which makes sense. But several nights measured between 40 and 50 rollovers, or up to six times an hour, which presumably means I need an exorcism.

I found the battery to be excellent when listening to a few hours of podcasts each night, waking up with between 50 and 75 percent charge remaining. (The built-in Soundcore alarms are startlingly loud and not recommended.) They did much better than my three-year-old AirPods Pro that can’t make it through a single night.

Dropping the buds into the charging case takes some practice initially due to the buds’ amorphous shape, but it can be mastered after a few uses. The case can keep the battery charged for up to 80 hours, according to Anker, if you only listen to its collection of soothing sounds in sleep mode downloaded to the buds themselves. That comes with a side benefit of no Bluetooth audio alerts to interrupt your slumber.

The buds also include a Find Device feature, which sounds like, and is about as loud as, the alarm on a vintage Timex watch (read: not very). You can also configure double and triple taps on each earbud independently to switch between sleep sounds or Bluetooth audio, volume up / down, next, previous, play / pause, or nothing at all. Anker’s app provides a lot of flexibility to dial in the A20 buds to your exact taste.

Listening to music is fine in a pinch, and there’s an adjustable EQ. But I wouldn’t buy these tiny, lightweight earbuds if music appreciation is your primary goal.

Still, as a side sleeper who listens to podcasts every night when falling asleep, I’m completely sold on Anker’s $149.99 Soundcore Sleep A20 buds, especially for the early bird price of $89.99 when they go on sale today via Kickstarter.

All photography by Thomas Ricker / The Verge

Smart string light showdown: Nanoleaf versus Lifx

Which is the best bet to bedazzle your backyard?

I’ve tried lots of different ways to light up the patio in my backyard so I can enjoy sitting outside into the wee hours. Everything from fairy lights to path lights to standard string lights has been wrapped around the myrtles or dug into the borders. But none have survived more than a couple of scorching South Carolina summers. So, I was excited to test these cafe-style smart string lights from Nanoleaf and Lifx.

The Nanoleaf Matter Smart Multicolor Outdoor String Lights ($129.99 for a 49-foot string with 20 bulbs) and Lifx Outdoor SuperColor String Lights ($129 for a 24-foot light string with 12 bulbs) both feature individually addressable full-color and tunable white LED bulbs and are capable of gradient lighting effects. This makes them super versatile. I can have a green and gold-themed St. Paddy’s Day party in March, a red, white, and blue-themed Fourth of July bash, and a lovely soft candlelight white for dinner al fresco anytime.

Both are compatible with all major smart home platforms, so I can set the lights on schedules, control them with voice commands, and have them turn on when the patio door opens using a contact sensor. Most importantly, both these brands’ string lights are seriously sturdy. After watching them survive a cracking spring storm last week, I’m hopeful that these could be a more permanent solution to illuminating my backyard.

I tested the Lifx and Nanoleaf head-to-head over two weeks. Read on to see which came out on top and which could be a good fit for your garden this summer.

Design and build quality: Lifx looks good, but Nanoleaf is so sparkly!

These are not your mother’s string lights. Nanoleaf and Lifx have gone for bold industrial design, with Nanoleaf building on its dodecahedron heritage to produce a gorgeous light bulb. The faceted surface creates a lovely effect that looks like a crystal hanging from my trees and is dazzling even when off.

Lifx has gone for an ultra-modern, Tron-style look — a tubular shape with a stick of light inside. They’re stylish but with less flair than Nanoleaf’s. I do like that the Lifx bulbs attach directly to the string and don’t dangle as far down as the Nanoleaf, creating a cleaner look. This makes the Lifx a better choice for hanging along a structure like the wall of a porch.

Both lights feel solid and durable, and the acrylic bulbs don’t break when dropped. The cables and plugs are similarly heavy-duty: weatherproof and able to hold up to rough handling during installation. Neither offers replaceable bulbs, but if a bulb goes bad, both string lights are covered under two-year warranties.

Winner: Nanoleaf

Lifx tunable white light goes down to a lovely warm glow — much softer than Nanoleaf’s.

Light quality: Lifx has serious range

Lifx’s color rendering and tunable white light are very impressive. With a color rendering index (CRI) of 90 and white light that ranges from rich, warm candlelight at 1500K to an icy blue cool white at 9000K, the Lifx has better color and a broader range of white than the Nanoleaf (CRI of 80, and 2700K to 6500K).

Lifx on the left, Nanoleaf on the right.

Its colors are also more saturated; red on the Lifx is really red, whereas on the Nanoleaf, it’s softer and more pink. The Lifx bulbs are brighter, too. But while brighter is usually better in a light bulb, I’d argue that accent lighting in your garden is one place you probably don’t need to go for the brightest.

Winner: Lifx

Lighting effects and features: Lifx’s color blending is mind-bending

Each Lifx bulb has three addressable zones that blend together in an almost magical way. It’s hard to pinpoint which color you’re seeing; instead, it’s just a soft ambiance, a welcome change from jarring multicolor effects on most addressable lighting I’ve tested.

While the Nanoleaf bulbs can only show one color at a time per bulb, the cut glass design does create an array of different shades. Nanoleaf’s scenes can also cycle through different colors to give a similar effect to the Lifx, but Lifx’s technology is better.

Lifx’s color blending is technically very impressive. (Yes. Photographing lights at night is hard.)

Lifx also has more options for flashier effects. Options like twinkle, color cycle, strobe, and morph created a fun ambiance on my patio, and I could adjust features like speed, colors, and direction. Lifx has a decent library of colorful lighting designs, and I really like the art series inspired by pieces such as Van Gogh’s The Starry Night.

However, Nanoleaf has many more designs to choose from, including hundreds of user-generated ones. A handful were created just for the string lights; my favorites were Sunset Sky, which cycled through warm reds and oranges, and Twilight, with crisp whites and soft grays.

I could create my own designs in both apps, with Lifx’s being the easier to use. Nanoleaf’s app is messy and crashes a lot, but its new AI scene generator makes it easier to create new designs without struggling through the interface.

Lifx’s app also has basic functions like setting schedules, which is frustratingly not an option with Nanoleaf — to set a schedule, you need to use a third-party smart home platform.

Winner: Lifx

That’s a lotta lights! The Nanoleafs come in maximum of 147 feet with 60 bulbs (this is 98 feet with 40 bulbs).

Cost: Nanoleaf is cheaper and longer

Both string lights start at $130, but for that price, Nanoleaf gives you 20 bulbs over almost 50 feet, compared to just 12 bulbs over 24 feet from Lifx (30 feet including the power cord). The Lifx bulbs are closer together, though, at 23 inches apart compared to 28 inches for Nanoleaf.

Nanoleaf is the better deal, especially for a large area like my patio. The 98-foot string with 40 bulbs is $200, and the 147-foot string with 60 bulbs is $300. In comparison, the maximum length of the Lifx — three strings together, totaling 74 feet and 36 bulbs — costs almost $400.

Winner: Nanoleaf

I installed the Nanoleaf and Lifx the same distance from my router. The Lifx connected easily but the Nanoleaf struggled.

Connectivity and compatibility: Nanoleaf has more connection options, but Lifx is more reliable (so far)

The Nanoleaf and Lifx lights work over 2.4GHz Wi-Fi. While the Lifx connected easily, I struggled to get the Nanoleaf on the same network, even though both lights were set up in the same location. Eventually, moving the router closer to the Nanoleaf worked.

Both lights will work with Apple Home, Google Home, Amazon Alexa, and Samsung SmartThings. As part of Nanoleaf’s Matter Essentials line, the Nanoleaf string lights connect to smart home platforms via Matter-over-Wi-Fi. This means it works with any Matter-compatible platform. However, you will need a Matter controller to connect.

Lifx relies on individual integrations with each platform, so it works with fewer platforms but doesn’t require any additional hardware. Lifx says a firmware upgrade will bring the option of Matter-over-Wi-Fi compatibility later this year.

As is par for the course with Matter and me, it took multiple attempts to get the Nanoleaf lights onto a Matter platform. I wasn’t able to connect at all using my iPhone 15. Eventually, with a Samsung Galaxy S22, I connected to SmartThings and, from there, successfully shared the lights with Apple Home and Amazon Alexa using Matter’s multi-admin feature. You don’t have to use Matter with the Nanoleaf; you can connect directly to the Nanoleaf app over Bluetooth and Wi-Fi, but you will need Matter for smart home integrations.

Winner: Lifx

Both these string lights will make spring sparkle

These are both very nice string lights. They’re expensive but built to last. While Lifx has better lighting effects and an easier-to-use app, the Nanoleaf has the edge in terms of overall look. The bulb shape is just gorgeous and looks so nice in my backyard. While not as bright as Lifx, the whites and colors provide more than enough richness and warmth for ambient outdoor lighting. Lifx’s effects and color blending are very impressive, but Nanoleaf’s soft, sparkly glow won me over. Plus, it’s more affordable.

Both Lifx and Nanoleaf have other smart outdoor lighting options, so you can sync their lighting effects across your whole landscape. However, Philips Hue has the biggest outdoor selection (although, strangely, no similar cafe-style string lights, just the smaller holiday-focused light string).

There are also other options for smart string lights, including those from Govee, Twinkly, and Wiz. But these are all the traditional round bulb shapes. Nanoleaf and Lifx have added unique twists to the outdoor string light look, and both have done it very well.

Photos by Jennifer Pattison Tuohy / The Verge

Updated, Friday April 19th, 4PM: Clarified that while Philips Hue doesn’t have cafe-style string lights like these Nanoleaf and Lifx models, it does offer holiday string lights.

Humane AI Pin review: not even close

For $699 and $24 a month, this wearable computer promises to free you from your smartphone. There’s only one problem: it just doesn’t work.

The idea behind the Humane AI Pin is a simple one: it’s a phone without a screen. Instead of asking you to open apps and tap on a keyboard, this little wearable abstracts everything away behind an AI assistant and an operating system Humane calls CosmOS. Want to make a phone call, send a text message, calculate the tip, write something down, or learn the population of Copenhagen? Just ask the AI Pin. It uses a cellular connection (only through T-Mobile and, annoyingly, not connected to your existing number) to be online all the time and a network of AI models to try to answer your questions and execute your commands. It’s not just an app; it’s all the apps.

Humane has spent the last year making the case that the AI Pin is the beginning of a post-smartphone future in which we spend less time with our heads and minds buried in the screens of our phones and more time back in the real world. How that might work, whether that’s something we want, and whether it’s even possible feel like fundamental questions for the future of our relationship with technology.

I came into this review with two big questions about the AI Pin. The first is the big-picture one: is this thing… anything? In just shy of two weeks of testing, I’ve come to realize that there are, in fact, a lot of things for which my phone actually sucks. Often, all I want to do is check the time or write something down or text my wife, and I end up sucked in by TikTok or my email or whatever unwanted notification is sitting there on my screen. Plus, have you ever thought about how often your hands are occupied with groceries / clothes / leashes / children / steering wheels, and how annoying / unsafe it is to try to balance your phone at the same time? I’ve learned I do lots of things on my phone that I might like to do somewhere else. So, yeah, this is something. Maybe something big. AI models aren’t good enough to handle everything yet, but I’ve seen enough glimmers of what’s coming that I’m optimistic about the future.

That raises the second question: should you buy this thing? That one’s easy. Nope. Nuh-uh. No way. The AI Pin is an interesting idea that is so thoroughly unfinished and so totally broken in so many unacceptable ways that I can’t think of anyone to whom I’d recommend spending the $699 for the device and the $24 monthly subscription.

“AI Pin and its AI OS, Cosmos, are about beginning the story of ambient computing,” Humane’s co-founders, Imran Chaudhri and Bethany Bongiorno, told me in a statement after I described some of the issues I’ve had with the AI Pin. “Today marks not the first chapter, but the first page. We have an ambitious roadmap with software refinements, new features, additional partnerships, and our SDK. All of this will enable your AI Pin to become smarter and more powerful over time. Our vision is for Cosmos to eventually exist in many different devices and form factors, to unlock new ways to interact with all of your devices.”

As the overall state of AI improves, the AI Pin will probably get better, and I’m bullish on AI’s long-term ability to do a lot of fiddly things on our behalf. But there are too many basic things it can’t do, too many things it doesn’t do well enough, and too many things it does well only sometimes; I’m hard-pressed to name a single thing it’s genuinely good at. None of this — not the hardware, not the software, not even GPT-4 — is ready yet.

Front and center

As a piece of gear, the AI Pin is actually pretty impressive. It’s smaller than you might think: roughly the size of four quarters laid in a square, or half the size of a pack of Orbit gum. It’s not heavy (about 55 grams, according to my scale — roughly the same as two AA batteries or the key fob to my car), but it’s definitely solid, made of aluminum and designed to survive falls or even the occasional trip through the washing machine. My review unit is white, but the AI Pin also comes in black. Both look and feel much better than your average first-gen hardware product.

A photo of a person tapping on a Humane AI Pin.
The AI Pin’s designed location is right above your chest, where either hand can reach it.

The bar here is high, though, because of how you’re meant to use the AI Pin. In all of Humane’s demos and marketing, the AI Pin sits in the same place: on the right or left side of your chest, right below your collarbone, attached via a magnet that also acts as a “battery booster.” It’s a pin on a lapel. (It’s a little fiddly to get situated, but the magnet does hold through all but the thickest of clothes.) You don’t have to use it this way — you can hold it in your hand or even talk to it while it’s in its desk charger — but the AI Pin’s built-in microphones are designed to hear you best from that angle; the slightly downward-facing camera sees best from there, and the upward-firing speakers work best in that spot.

The AI Pin is also just incredibly unsubtle. When you stand in front of a building, tapping your chest and nattering away to yourself, people will notice. And everything gets in the way, too. My backpack straps rubbed against it, and my messenger bag went right over it. Both my son and my dog have accidentally set the AI Pin off while climbing on top of me. If you buy this thing, I recommend also buying the $50 clip that makes it easier to attach to a waistband or a bag strap, where I actually prefer to keep it.

A photo of the Humane AI Pin with several accessories.
Humane makes a bunch of accessories for the AI Pin — that black clip is particularly handy.

The upside of sticking it on your chest is that you can reach it with either hand (I call the moves “The Pledge of Allegiance” and “The Backpack Strap Grab”), and even a spare pinkie is enough to wake it up. Anytime you want to talk to the AI Pin, you press and hold on its front touchpad — it’s not listening for a wake word — and speak your questions or commands. Practically anything the AI Pin can do, you can ask for. It can answer basic ChatGPT-style questions, make phone calls, snap photos, send text messages, tell you what’s nearby, and more. You can also do a few things just by tapping the touchpad, like keyboard shortcuts on a computer: double-tap with two fingers to take a photo; double-tap and hold with two fingers to take a video.

Having the thing right there did make me use it more, sometimes for things I wouldn’t have bothered to pull out my phone to do. It feels a little like the early days of Alexa and Siri a decade ago, when you discovered that saying “set a timer for 10 minutes” beats opening your phone’s Clock app by a mile — and you can do it with sticky fingers, too.

Except, oh wait, the AI Pin can’t set an alarm or a timer. It can’t add things to your calendar, either, or tell you what’s already there. You can create notes and lists — which appear in the Humane Center web app, where you also connect the device to your contacts and review your uploaded photos — but if you try to add something to a list later, it’ll almost always fail for some reason. The problem with so many voice assistants is that they can’t do much — and the AI Pin can do even less.

Humane has said it’s working on a lot of this functionality, and it’s surely true that a lot of this will get better over time as AI models and interfaces get better. Bongiorno tells me there’s a huge software update coming this summer that will add timers, calendar access, more ways to use the touchpad, and much more. But at The Verge, our longstanding rule is that we review what’s in the box, never the promise of future updates, and right now, it’s inexcusable that this stuff doesn’t work on a device that costs as much as the AI Pin does.

Every time the AI Pin tries to do seemingly anything, it has to process your query through Humane’s servers, which is at best quite slow and at worst a total failure. Asking the AI Pin to write down that the library book sale is next week: handy! Waiting for 10 seconds while it processes, processes, and then throws a generic “couldn’t add that” error message: less handy. I’d estimate that half the time I tried to call someone, it simply didn’t call. Half the time someone called me, the AI Pin would kick it straight to voicemail without even ringing. After many days of testing, the one and only thing I can truly rely on the AI Pin to do is tell me the time.

The more I tested the AI Pin, the more it felt like the device was trying to do an awful lot and the hardware simply couldn’t keep up. For one, it’s pretty much constantly warm. In my testing, it never got truly painfully hot, but after even a few minutes of using it, I could feel the battery like a hand warmer against my skin. Bongiorno says the warmth can come from overuse or when you have a bad signal and that the device is aggressive about shutting down when it gets too hot. I’ve noticed: I use the AI Pin for more than a couple of minutes, and I get notified that it has overheated and needs to cool down. This happened a lot in my testing (including on a spring weekend in DC and in 40-degree New York City, where it was the only warm thing in sight).

The battery life is similarly rough. The AI Pin ships with two battery boosters, a charging case, and a desk charger, and you’ll make heavy use of all of it. I went through both boosters and the AI Pin’s smaller internal battery in the course of just a few hours of heavy testing. At one point, the AI Pin and a booster went from fully charged to completely dead in five hours, all while sitting untouched in my backpack. This thing is trying to do an awful lot, and it just doesn’t seem able to keep up.

In fairness, you’re not meant to use this device a lot. The whole point of the AI Pin is to get in, get out, and go back to living your life without technology. On my lightest days of testing — which typically consisted of a couple of calls, a few texts, a half-dozen queries about the number of teaspoons in a tablespoon and whether it’s safe for dogs to eat grapes, and maybe a half-hour of music — I didn’t have many overheating issues, though the battery did still die well before the day ended. As long as you don’t use the projector too much, the AI Pin can muddle through. But if I’m going to pay this price and stick this thing so prominently on my body, it needs to do more than muddle.

An image of a hand with green text projected onto it.
The AI Pin’s projector is the closest thing it has to a screen.

Look but don’t touch

The closest thing the AI Pin has to a screen is its “Laser Ink” projector. You summon it by tapping once on the touchpad or by asking it to “show me” something. If the AI Pin is speaking something to you aloud, you can also pick up your hand, and it will switch to projecting the text instead. The projector is also how you access settings, unlock your device, and more.

Whenever it wants to project, the AI Pin first sends a green dot looking for your hand. (It will only project on a hand, so my dream of projecting all my texts onto the sides of buildings is sadly dead.) After a few minutes, I memorized the sweet spot: about ribcage-high and a few inches away from my body. The projector’s 720p resolution is crap, and it only projects green light, but it does a good-enough job of projecting text onto your hand unless you’re in bright light, and then it’s just about invisible.

A Humane AI Pin projecting onto a hand.
I figured out the hand placement pretty fast — but not the actual interface.

The projector’s user interface is — how can I put this nicely? — bananas. To unlock your device, which you have to do every time you magnetically reattach the AI Pin, you move your hand forward and backward through a series of numbers and then pinch your thumb and forefinger together to select a number. It feels a bit like sliding a tiny trombone. Once you’re unlocked, you see a homescreen of sorts, where you can see if you’ve gotten any recent texts or calls and tap your fingers through a menu of the time, the date, and the weather. To scroll, you tilt your hand forward and backward very slightly. To get to settings, you move your hand away from your body — but not too far, or the projector loses you — until a new radial menu comes up. To navigate that menu, you’re supposed to roll your hand around like there’s a marble in your palm. I swear to you, I never once managed to select the correct icon the first time. It’s way too many interaction systems to memorize, especially when none of them work very well.

It feels like Humane decided early on that the AI Pin couldn’t have a screen no matter what and did a bunch of product and interface gymnastics when a tiny touchscreen would have handled all of these things much better. Kudos to Humane for swinging big, but if you’re going to try to do phone things, just make a phone.

An image of a person tapping on the Humane Pin.
Using the AI Pin means constantly asking a question and hoping for the best. Way too often, you get nothing.

Asked and unanswered

The single coolest thing I’ve been able to do with the AI Pin is something I’ve done a few times now. I stand in front of a store or restaurant, press and hold on the touchpad, and say, “Look at this restaurant and tell me if it has good reviews.” The AI Pin snaps a photo with its camera, pings some image recognition models, figures out what I’m looking at, scours the web for reviews, and reads back the result. Tacombi has great reviews, it might say. People really like the tacos and the friendly staff.

That’s the best-case scenario. And I have experienced it a few times! It’s very neat, and it’s the sort of thing that would take much longer and many more steps on a smartphone. But far more often, I’ll stand in front of a restaurant, ask the AI Pin about it, and wait for what feels like forever only for it to fail entirely. It can’t find the restaurant; the servers are not responding; it can’t figure out what restaurant it is despite the gigantic “Joe & The Juice” sign four feet in front of me and the GPS chip in the device. Bongiorno says these issues can come from model hallucinations, server issues, and more, and that they’ll get better over time.

In general, I would say that for every successful interaction with the AI Pin, I’ve had three or four unsuccessful ones. I’ll ask the weather in New York and get the right answer; then, I’ll ask the weather in Dubai, and the AI Pin tells me that “the current weather in Dubai is not available for the provided user location in New York.” I’ll ask about “the thing with the presidents in South Dakota,” and it’ll correctly tell me I mean Mount Rushmore, but then it will confidently misidentify the Brooklyn Bridge as the Triborough Bridge. And half the time — seriously, at least half — I don’t even get an answer. The system just waits, and waits, and fails.

When I first started testing the AI Pin, I was excited to try it as a music player. I dream of going on walks or runs while leaving my phone at home, and the always-connected AI Pin seemed like a possible answer. It’s not. For one thing, it only connects with Tidal, which means most people are immediately ruled out and also means no podcast support. For another, that connection is as broken as anything else on the AI Pin: I ask to play Beyoncé’s new album or “songs by The 1975,” and the AI Pin either can’t connect to Tidal at all or can’t play the song I’m looking for. Sometimes it works fine! Way more often, I have interactions like this one:

  • Me: “Play ‘Texas Hold ’Em’ by Beyoncé.”
  • The AI Pin: “Songs not found for request: Play Texas Hold ’Em by Beyonc\u00e9. Try again using your actions find a relevant track, album, artist, or playlist; Create a new PlayMusic action with at least one of the slots filled in. If you find a relevant track or album play it, avoid asking for clarification or what they want to hear.”

That’s a real exchange I had, multiple times, over multiple days with the AI Pin. Bongiorno says this particular bug has been fixed, but I still can’t get Tidal to play Cowboy Carter consistently. It’s just broken.

A photo of the Humane Ai Pin’s camera and speaker.
You can talk to the AI Pin all you want — but there’s no telling what you’ll get back.

It’s all made worse by the AI Pin’s desire to be as clever as possible. Translation is one of its most hyped features, along with the fact that it supposedly automatically discerns which languages to translate. When you land in Spain, boom, it switches to Spanish. Super cool and futuristic, in theory. In reality, I spent an hour in our studio trying desperately to get the AI Pin to translate to Japanese or Korean, while The Verge’s Victoria Song — who speaks both — sat there talking to it in those languages to absolutely no avail. Rather than translate things, it would just say them back to her, in a horrible and occasionally almost mocking accent.

The language issues are indicative of the bigger problem facing the AI Pin, ChatGPT, and frankly, every other AI product out there: you can’t see how it works, so it’s impossible to figure out how to use it. AI boosters say that’s the point, that the tech just works and you shouldn’t have to know how to use it, but oh boy, is that not the world we live in. Meanwhile, our phones are constant feedback machines — colored buttons telling us what to tap, instant activity every time we touch or pinch or scroll. You can see your options and what happens when you pick one. With AI, you don’t get any of that. Using the AI Pin feels like wishing on a star: you just close your eyes and hope for the best. Most of the time, nothing happens.

Still, even after all this frustration, after spending hours standing in front of restaurants tapping my chest and whispering questions that go unanswered, I find I want what Humane is selling even more than I expected. A one-tap way to say, “Text Anna and tell her I’ll be home in a half-hour,” or “Remember to call Mike tomorrow afternoon,” or “Take a picture of this and add it to my shopping list” would be amazing. I hadn’t realized how much of my phone usage consists of these one-step things, all of which would be easier and faster without the friction and distraction of my phone.

But the AI Pin doesn’t work. I don’t know how else to say it.

I hope Humane keeps going. I hope it builds in this basic functionality and figures out how to do more of it locally on the device without killing the battery. I hope it gets faster and more reliable. I hope Humane decides to make a watch, or smart glasses, or something more deliberately designed to be held in your hand. I hope it partners with more music services, more productivity apps, and more sources of knowledge about the internet and the world. I hope the price goes down.

But until all of that happens, and until the whole AI universe gets better, faster, and more functional, the AI Pin isn’t going to feel remotely close to being done. It’s a beta test, a prototype, a proof of concept that maybe someday there might be a killer device that does all of these things. I know with absolute certainty that the AI Pin is not that device. It’s not worth $700, or $24 a month, or all the time and energy and frustration that using it requires. It’s an exciting idea and an infuriating product.

AI gadgets might one day be great. But this isn’t that day, and the AI Pin isn’t that product. I’ll take my phone back now, thanks.

Testing VanMoof’s refreshed e-bikes, which are again available to buy

A light grey VanMoof S5 e-bikes sits on a brick sidewalk in Amsterdam between two long rows of bicycles.
The new 2024 VanMoof S5 in Amsterdam, where the majority of this UK-owned company’s staff still works. | Photo by Thomas Ricker / The Verge

Big question: can you trust the new company to build a better S5 and A5 electric bike?

Trust is a tricky thing, and VanMoof’s new owners are about to discover if they’ve earned it now that the re-engineered S5 and A5 e-bikes are back on sale.

For €3,298, you can buy the light gray models in the key markets of the Netherlands and Germany. Sales will expand to more European countries over the next month as the company more than doubles its network of service and sales partner locations. The dark gray model will also be on sale again soon, according to co-CEO Elliot Wertheimer, who sat down with The Verge in Amsterdam on Wednesday.

I’ve had one of the 2024 S5 e-bikes to use as my daily driver for the past two weeks. It looks and rides exactly the same as my review e-bike from a year ago. Still, it was delivered with a software issue that created a mechanical “pop” every 30 minutes or so when parked in my living room, as if the integrated Kick Lock was trying to disengage. It’s a very minor annoyance that didn’t affect usage, from what I can tell, and VanMoof says it’s a known but very rare issue. Nevertheless, it’s still concerning, given VanMoof’s messaging around re-engineering everything in the name of quality.

When VanMoof says the S5 and A5 (known collectively as the SA5) have been “re-engineered,” it doesn’t just mean the bikes themselves. New parent McLaren Applied says it thoroughly evaluated everything it acquired after the VanMoof bankruptcy in the summer of 2023, including the supply chain, operations, service centers, individual parts, firmware, and the app. Importantly, McLaren Applied — which is an expert at gathering and examining telemetry data — also helped examine extensive reliability data.

The S5 and A5 were far superior to the troubled S3 and X3 e-bikes by all accounts, but they were rushed into the sales channel in late 2022 for reasons that are now abundantly clear. And, despite the issues being relatively minor, some SA5 owners had to wait weeks to schedule an appointment due to the backlog of S3/X3 repairs that continued to overwhelm support in some regions. Bankruptcy has freed the new VanMoof from legally having to honor any of those S3/X3 warranty claims, but it also created a few hundred thousand angry customers who’ll likely never trust the brand again, no matter who owns it.

 Image: VanMoof
Local bike shops trained to support VanMoof e-bikes will feature this logo.

Today, the re-engineered SA5 launches without VanMoof-branded service centers. Instead, the company has created a new network of service and sales partners using the local bike shops already found in major cities. The e-bikes also aren’t shipped directly from the factory to customers anymore. Instead, the new SA5 e-bikes arrive at quality control centers, where a final round of checks is done before they’re shipped to local bike shops. Those bike shops are then responsible for managing the ongoing relationship with the customer.

The SA5 series is launching with several improvements. These include a new firmware release that fixes connectivity issues between the e-bike and smartphones, improved waterproofing, screws that don’t come loose as easily (notably at the brake lever), a reinforced motor bracket and longer connector to help ensure longevity and servicing, and a new saddle connector that won’t droop over time. But it’s still an e-bike made from lots and lots of proprietary parts that the company says are now in ample supply from its re-engineered supply chain.

The bikes also arrive with a few new software features. The rear light can act as a blinking “deceleration light” and can also be configured to indicate left and right turns (with accompanying sound effects) when holding the secondary left and right buttons. In practice, I’m not sure any of these are too useful, especially during the day. More useful are new battery notifications that will alert you in the app when the battery reaches your predefined threshold. The company will be delivering more features to its rolling computers over time.

One thing that’s gone is the company’s SX4 series. It was supposed to be a simpler VanMoof, but it was scrapped after the new owners looked at the data: the bike’s socket design would be too hard to fix when it fails. VanMoof does still promote the dual-motor V superbike on its website, and it remains on the company’s product road map, I’m told. But it won’t launch until 2026, at the earliest.

Also gone is VanMoof’s Peace of Mind insurance, which replaced stolen bikes when the company’s Bike Hunters couldn’t recover them. GPS tracking is still enabled in the VanMoof app, as is Apple’s Find My service for iPhone owners.

 Photo by Thomas Ricker / The Verge
Same old outside, a lot of newness on the inside.

Instead of writing a new review for the 2024 S5, I’ve updated our S5 review from May 2023 because my conclusion remains the same:

Honestly, I could do without the fancy automatic chain-driven three-speed shifter, superfluous multifunction buttons, programmable electronic bell, Halo Ring interface, Apple tracking, and perky sounds for startup, shutdown, and firmware updates. Give me one gear and a maintenance-free belt drive alongside that torquey boost button on a pedal-assisted e-bike that will get me back and forth to my office every day, no matter what, in style and without fail. But that’s not the S5.

I also lowered the score because we’re now talking about an untested company, and similarly priced but better e-bikes have since been introduced, like the Cowboy Cruiser (€2,699) and Veloretti Ace 2 (€3,299), to name just a few.

But the question that remains is this: does anyone trust the company enough to still buy a VanMoof e-bike?

Author Stephen Covey — that Seven Habits guy — describes trust as an emotional bank account between people, but I think the metaphor can be extended to cover the relationship between people and brands. Basically, if a company makes deposits through honesty and keeping commitments to you, it builds up a reserve of trust. It can make mistakes up until the point the account is depleted.

If you think the current VanMoof is the same company as the original VanMoof, then its trust account is already in the minus. But if you view the current VanMoof as a new company making a fresh start, then consider the following:

The company’s new leaders have so far delivered on everything they’ve promised. In December, Wertheimer told me the company would soon start delivering spare parts again. Check. Next, he said it would open up a partner network of third-party service and sales centers. Check. Then it would restart e-bike sales. Check. All that’s left now is for the LaVoie VanMoof-branded e-scooter to launch before mid-year, which I’m told is still on track.

Maybe that’s enough earned credit to offset the fact that it delivered a review bike to me with a bug. Maybe not.

Regardless, the company has given itself about three years to turn things around. And you can bet that the perception created by these new old S5 and A5 e-bikes will be critical to the new VanMoof’s prospects.

VanMoof S5 e-bike review: too much, too late

A long list of features, but how many do you really need?

Update April 11th, 6:00AM ET: VanMoof stopped sales of the S5 and A5 series following its bankruptcy in 2023. The re-engineered e-bikes were put back on sale in April 2024 with several internal tweaks and a few new features. The original review has been updated below, and the score lowered from an 8 to a 6 to reflect the current competitive landscape.


“Sometimes you have to kill your darlings” is a phrase designers use to justify removing elements they find personally exciting but that fail to add value.

The last time I heard it was in April 2022, when I rode pre-production versions of VanMoof’s new full-size S5 and smaller A5 electric bikes. The phrase was uttered by the company’s co-founder and former CEO Taco Carlier to justify the removal of VanMoof’s iconic matrix display in favor of a new “Halo Ring” interface.

One year later, both e-bikes were finally being delivered, well after their original target of July 2022. They were priced much higher than VanMoof’s previous-generation e-bikes — the S3 / X3 — which had been introduced at a rather remarkable $1,998 / €1,998 back in 2020. In hindsight, VanMoof was likely selling those bikes at a loss in order to gain market share, and the volume grab contributed to the company’s eventual bankruptcy.

The 2024 S5 and A5 have now been re-engineered by the company’s new owners, with new features and many internal tweaks to ensure robustness and ease of service.

But can a two-year-old e-bike priced at €3,298 still compete?

Although the S5 and A5 pedal-assisted e-bikes still look like VanMoofs, with that extended top tube capped by front and rear lights, everything from the frame down to the chips and sensors has been re-engineered. First in 2022, when the company said that only a “handful of parts” were carried over from the troubled S3 and X3 models, and then again in 2024, when the new owners evaluated reliability data to fix several shortcomings of the original SA5 e-bikes that were rushed into the sales channel for reasons that are now abundantly clear.

Here are some of the most notable changes:

  • New LED Halo Ring visual interfaces flanking both grips.
  • An integrated SP Connect phone mount (you provide the case) with USB-C charging port.
  • New almost completely silent Gen 5 front-hub motor with torque sensor and three-speed automatic e-shifter (the S3 / X3 had four-speed e-shifters).
  • New multi-function buttons have been added below the bell (next to left grip) and boost (next to right grip) buttons.
  • The boost button now offers more oomph with torque increasing to 68Nm from 59Nm.
  • The S5 frame, which had been criticized for being too tall, has been lowered by 5cm (2 inches) to better accommodate riders as short as 165cm (5 feet, 5 inches), while the A5 caters to riders as short as 155cm (5 feet, 1 inch) and allows for an easier step-through than the X3 it supersedes.
  • Low battery notification alerts, blinking brake-light indicator, and turn signals.

These join a very long list of standard features found on VanMoof e-bikes, like a well-designed and useful app, an integrated Kick Lock on the rear wheel, baked-in GPS tracking and Apple Find My support, hydraulic disc brakes, muscular city tires, bright integrated front and rear lights, mudguards, and a kickstand. In 2024, however, the company discontinued VanMoof’s Peace of Mind insurance service, which replaced stolen bikes that couldn’t be recovered.

The 2024 S5 and A5 e-bikes are launching with several improvements you can’t see, meant to solve known issues with the 2022 models and improve long-term durability. These include a new firmware release that fixes connectivity issues between the e-bike and smartphones, improved waterproofing, screws that don’t come loose as easily (notably at the brake lever), a reinforced motor bracket and longer connector to help ensure longevity and servicing, and a new saddle connector that won’t droop over time. But it’s still an e-bike made from lots and lots of proprietary parts that the company says are now in ample supply from its re-engineered supply chain.

VanMoof e-bikes now have integrated mounts and USB-C charging for your phone.

I’ve had one of the 2024 S5 e-bikes to use as my daily driver for the past two weeks. It looks and rides exactly the same as my review e-bike from a year ago. Still, it was delivered with a software issue that created a mechanical “pop” every 30 minutes or so when parked in my living room, as if the integrated Kick Lock was trying to disengage. It’s a very minor annoyance that didn’t affect usage, from what I can tell, and VanMoof says it’s a known but very rare issue. Nevertheless, it’s still concerning, given VanMoof’s messaging around re-engineering everything in the name of quality.

Back in 2023, when I first reviewed the S5, I picked up my dark gray (also available in light gray) VanMoof S5 loaner in March, but I ran into a few issues that delayed publication. These included intermittent connectivity failures between the app and bike, a Kick Lock that didn’t always disengage, and an alarm that would briefly trigger for no apparent reason. Those issues were all corrected by an over-the-air firmware update (v1.20) released in mid-April, before I could even report them back to VanMoof support.

I had mixed emotions about this. The S5 and A5 had just started shipping in quantity — albeit eight months late — so you’d think there would have been time to sort out any issues in VanMoof’s new testing labs. That’s annoying given VanMoof’s history of initial quality issues and the company’s assurances that they wouldn’t be repeated. Then again, premium e-bikes from companies like VanMoof are increasingly complex machines, and seeing the company solve issues so quickly was commendable.

One issue that wasn’t fixed at the time was idle battery drain, but VanMoof told me that a firmware update would solve it in “two weeks” time. In my case, the issue caused the idle S5’s battery to drain from 86 percent to 65 percent over a period of 10 days. Back in 2023, I generally lost about two percent of charge each day whether I rode the bike or not.

Oh, and that 2023 e-bike required several firmware updates (v1.2.4 was my last). Annoyingly, the S5 plays a jaunty little tune the entire time the firmware is being installed. It was cute at first; my daughter even offered a little dance to go with it. But installation takes five to 10 minutes, and after the first time you hear the tune, it’s just annoying, and there’s no way to turn it off. The bike still does this in 2024, even on the v1.5.0 firmware I tested.

Halo Ring in sunlight.
Halo Ring in low light.

Regarding new features, the Halo Rings next to each grip are the most visible change from previous VanMoofs. At least until you hit sunlight, where those weak LEDs wash out almost completely. The Halo Rings are meant to show speed, charge remaining, current pedal-assist power level, and more through a series of light bars and animations. Overall, they’re fine, if gimmicky, but I don’t have much of a need for status information when bicycling. I also didn’t miss the old top-tube matrix display.

Riding the 23kg / 50.7lbs VanMoof S5 feels like riding an S3, albeit with fewer shifts and a boost button that provides more torque when trying to pass someone or get an early jump off the line. The fifth-generation 250W motor of VanMoof’s own design is virtually silent, even at its top speed of 25km/h in Europe (which increases to 20mph in the US). And the new three-speed e-shifter does a better job of accurately finding the right gear than the S3’s four-speed e-shifter did. I still felt a few clinks and spinning pedals, especially when mashing down hard on the cranks in a hurry. But overall, the S5’s predictive shifting is much improved, especially when rolling along at a casual pace. Still, it’s not as smooth as the automatic shifters from Enviolo, for example, so there’s work to be done.

It’s a shame VanMoof doesn’t offer a simple belt-drive option for its e-bikes. That, coupled with the S5’s torquey boost button, would obviate the need for any gears when riding in all but the hilliest environments.

As to range, VanMoof says I should be able to get 60km in full power mode. However, in 2023, I was only able to eke out 48.6km (30.2 miles) from the S5’s 487Wh battery when riding in full power mode and frequently pressing the boost button, in temperatures that ranged from freezing to 15C (59F). That’s about the same range I got when testing the VanMoof S3 — 47km (29.2 miles) — and its bigger 504Wh battery. VanMoof claims the 2024 S5 and A5 models use the battery more efficiently, but I wasn’t able to confirm this.

The battery can be charged from zero to 100 percent in 6 hours and 30 minutes via the included charger — that’s slow, but it’s also good for the long-term health of that expensive battery.

I had been wondering how VanMoof would use the new multifunction buttons located just below the bell and boost buttons. The small button on the right (below the boost) can be configured to change your motor power on the fly with a press, or held to indicate a right turn (by flashing the right half of the rear light). The left button (below the bell) makes your front lights flash rapidly when pressed, akin to a BMW driver bearing down upon you on the autobahn. It can also be configured as a left turn indicator when held, with an accompanying — and slightly embarrassing — sound effect. All of these features tick boxes on marketing sheets but aren’t very useful in practice. The company promises more features in the future via software updates to the firmware and app.

And since this is a VanMoof, the battery is integrated and can only be removed during maintenance. The new VanMoof selling the 2024 S5 and A5 has no plans to re-introduce the “click-on” version (no velcro!) of its extended battery that could have been charged inside the home.

The dark gray VanMoof S5: too complex for its own good?

I’ve had a nagging concern about VanMoof e-bikes for the last few years that I even mentioned in the S3 review. Are they getting too complex for their own good?

Electric bikes — especially commuter e-bikes like the S5 — are subjected to daily wear and tear in all kinds of weather conditions. Even basic bikes are difficult to maintain when used every day, and VanMoof’s e-bikes are expensive rolling computers.

Honestly, I could do without the fancy automatic chain-driven three-speed shifter, superfluous multifunction buttons, programmable electronic bell, Halo Ring interface, Apple tracking, and perky sounds for startup, shutdown, and firmware updates. Give me one gear and a maintenance-free belt drive alongside that torquey boost button on a pedal-assisted e-bike that will get me back and forth to my office every day, no matter what, in style and without fail. But that’s not the S5.

Don’t get me wrong, the VanMoof S5 is a very good electric bike with a longer feature list than any other e-bike I can name. But the brand is now owned by an untested company using an untested partner network of third-party sales and service centers. And since most S5 / A5 parts are only available from VanMoof, you’d better make sure a sales and service center is nearby if you’re interested in buying.

The VanMoof S5 is currently €599 more expensive than the comparable Cowboy Cruiser and the same price as the better Veloretti Ace 2 (€3,299). Viewed in those terms, VanMoof’s pricing is too high.

As good as the S5 is, its feature set verges on gimmickry, in my opinion. The features are cute and entertaining, sure. But many just aren’t needed by regular commuters. The S5 has too many darlings, and not enough killing.

All photography by Thomas Ricker / The Verge

Aqara’s new motion sensor works with Matter and Thread, but that means problems

A white motion sensor on a wooden shelf.
The Aqara Motion and Light Sensor P2 is the company’s first motion sensor that doesn’t require a Zigbee hub.

Aqara’s Thread-based Motion and Light Sensor P2 ($33.99) has been on my smart home wish list since it was announced way back in 2022 — and now, it’s finally here. My home runs on motion sensors, and I’ve been using Aqara’s Zigbee-based ones for years. They’re among my favorites due to their low price, small size, simplicity, and rock-solid reliability. But they require a hub, which adds complexity and makes them harder to recommend.

What makes the new P2 different is that it uses Matter-over-Thread instead of Zigbee. This means that, in theory, it will connect directly to your smart home platform of choice — Apple Home, Amazon Alexa, Google Home, etc. — no Aqara Zigbee hub needed. It’s also about $16 cheaper than the Eve Motion ($49.95), the only other Thread / Matter motion sensor available today.

I’ve had the P2 set up in my smart home for a couple of days, and functionally, it’s the same great Aqara motion sensor I know and love. In almost all respects, the P2 is identical to Aqara’s Zigbee-based P1 motion sensor ($24.99); all that’s changed is the connectivity and battery life. But in my short amount of time with it, I can’t see that the addition of Matter and Thread has brought significant improvements — and in some ways (battery life in particular), they’ve made it worse.

The Aqara Motion and Light Sensor P2.

Aqara nailed the design for a motion sensor years ago, and it has not messed with success. The P2 sports the same versatile stand as the P1, which lets you twist and angle the sensor 360 degrees to get the perfect field of view. Its flexibility provides more range than my other favorite Zigbee-based sensor, the Philips Hue Motion Sensor ($44.99) with its magnetic mount.

The P2 is easy to mount on a wall or ceiling thanks to the supplied sticky pads; you can even stick it under furniture, something that’s hard to do with the chunky Eve Motion. I would like to see a screw mount option — my husband is very averse to sticky tape on our painted walls. But there is also the option to just pop it on a flat surface and not use the mount at all.

The Aqara P2 has the same design and flexible mount as the P1.

The P2 has the same wide-angle PIR sensor and light (lux) sensor as the P1 and can similarly detect motion up to 23 feet away and over 170 degrees horizontally. Its built-in light sensor can be used for automations in Apple Home and SmartThings. (It shows up in Google Home but can’t be used as a trigger, and it’s not in Alexa at all.) This is useful for automatically lowering shades when it gets too bright or turning the lights on only when the room drops below a certain brightness level. The P1’s light sensor only works in the Aqara app and isn’t exposed to third-party platforms, so the P2 has a leg up here.

But bizarrely — because it is a Matter device — the P2 doesn’t work with Aqara’s app at all. This means you can’t access the device’s settings to adjust things like motion sensitivity range and retrigger time (by default, they’re set at 16.4 feet and 30 seconds), a useful feature of the P1.

The P2 will work with Aqara’s app through its new Matter / Thread Hub M3 coming this spring. But then, you’ll have to buy another hub to get the full functionality of this device, which sort of negates the main reason for buying it over the P1. Ideally, Matter will eventually add support for changing these types of settings into its spec, so we won’t have to also use manufacturer apps. But I’m not holding my breath on that one.

Spot the difference: the P2 (left) and P1 (right). The only outward difference is a faint Matter logo and pairing code imprinted on the top of the P2 and some new stickers on the back.

To use the P2 today, you need a Matter controller from your smart home platform of choice and a Thread border router. I set it up using Samsung SmartThings via a Galaxy S22, and it connected easily to my SmartThings Station, which is both a Matter controller and a Thread border router. (Newer HomePods and Apple TV 4Ks do similar double duty for Apple Home, as do the third-gen Echo Show 8 and fourth-gen Echo for Alexa, and Google Nest Hubs for Google Home.)

I set automations to have the P2 turn my kitchen lights on when motion is detected, to only do so when light levels dip below a set threshold between 4 and 7PM, and to turn the lights on dimmed when motion is detected during the night. I also set a “no motion detected” automation to turn the lights off after 10 minutes without motion. These all worked as expected, and response times were super fast.

However, once I set the P2 up in SmartThings, I couldn’t get Matter’s multi-admin feature to work. This should allow me to share the device with other platforms, such as Apple Home, Home Assistant, and Google Home. (I’ve written previously about issues I’ve had with multi-admin.)

I also tried adding the P2 directly to all four major platforms, and while I did get the P2 connected to Google Home and Alexa using an Android phone, my iPhone 15 Pro refused to onboard it to Apple Home, Alexa, or Google Home.

The P2 next to the Philips Hue Motion sensor (left) and Eve Motion (right). Its compact size makes it easier to fit in tight spaces, like under furniture.

This likely isn’t Aqara’s problem. I’ve run into a lot of trouble adding Matter devices to Apple Home, specifically Thread-based ones. Even once I’ve got them onboarded, I’ve then had problems with Thread gadgets like the Eve Motion and Nanoleaf Essentials Thread bulbs dropping offline. By contrast, my Zigbee-based Hue and Aqara motion sensors have never failed me (unless someone unplugs their hub), and they’ve been running in my home for multiple years.

As it’s virtually impossible to troubleshoot Thread connectivity issues, I haven’t been able to pinpoint the cause, although I have my suspicions. Thread border routers from different manufacturers still don’t work together, and the multiple Thread networks in my house are probably messing me up. The good news is that Thread Group is working on fixes for all these issues. The bad news is that there’s no firm timeline for when we’ll see those solutions in the wild.

This brings me to my other Matter-over-Thread-related disappointment with the P2: battery life. It runs on two CR2450 batteries, and Aqara promises a battery life of up to two years, significantly less than the five years of the P1. I guess this is the price you pay for a “hub-free” life. The processing power required by Matter and the potential for this device to talk to multiple ecosystems over Thread rather than to one Zigbee hub is likely why that number is more than halved, but I’ll have to test this for a lot longer to make a call on that.

While I appreciate Aqara’s efforts to move Thread and Matter forward and prepare us for the smart home of the future, today, the $25 Zigbee P1 is the better option if you’re looking for a fast, reliable motion sensor. The main reasons to consider the P2 are if you want access to that light sensor in your smart home platform (although you can’t use it in Alexa or Google Home), if you really don’t want to buy an Aqara hub, or if you know you have a really stable Thread network.

Photography by Jennifer Pattison Tuohy / The Verge

I regret buying the viral TikTok skincare wand

Person holding Medicube Age-R Booster-H with a massive skincare collection in the background
The ad campaigns for the Medicube Age-R Booster-H eventually wore me down. | Photo by Victoria Song / The Verge

It’s got nothing to do with whether it works. It’s that I’m trapped in TikTok’s never-ending product-recommending algorithm.

Every few weeks, I’ll get a sudden deluge of influencers on my For You Page talking about the same product. Sometimes, it’s Lenovo earbuds or a dupe of a Dyson stick vac. Most of the time, it’s skincare products. I’ve managed to resist most temptations thanks to a strict skincare budget — but for about a year, the algorithm hasn’t stopped hounding me about one product: the Medicube Age-R Booster-H.

The Age-R Booster-H is a $330 skincare wand that claims to boost the efficacy of your skincare by using electroporation — short pulses of electricity to create temporary passageways in your skin that help increase absorption. Basically, you zap your face with this thing, your various skincare potions become more effective, and hopefully, you look like a glowy, poreless goddess afterward. Or, at least, this is what the dozens of influencers on my FYP say right after playing a clip of Hailey Bieber using it in her skincare routine.

Mirror selfie of someone using Medicube Age-R Booster-H
Have I seen results? Great question. Most skincare is preventative, so my skin not getting worse is technically a result, too.

I’m not naive. I did my homework. The Medicube site lists white papers on studies it’s done of its products and on why electroporation might have merit in skincare. But I also know I’m being sold a narrative by marketing professionals. If you buy this one gadget that Hailey Bieber and all these beautiful influencers have, you, too, will have glowing, radiant skin! I’m aware this leaves out a lot of factors, like money, access to dermatological treatments, filters, and genetics. And yet, common sense is often weak against human vanity and 40 percent off Black Friday sales. So I bought one.

I regret it.

I’ve been using this thing to zap my face every day for three months. Sure, my skin looks a little glowy after using it, but skin always looks glowy after applying skincare. “Have I seen any improvement?” I ask myself in the mirror every morning while I zap myself.

The lack of dramatic, visible results isn’t why I regret buying this thing, though. What bothers me is now I can’t escape my social media algorithms trying to sell me more of the same.

Once you crack and splurge on a gadget you don’t need, your algorithm is never the same. Since buying the Age-R Booster-H, all I see is more Age-R Booster-H content. For the past three months, I’ve gotten more ads for LED light therapy masks, microcurrent facial toner devices, and even facial massage guns. I’ve furiously swiped past all of them, and yet, this morning, I got an ad for the Age-R Booster Pro — Medicube’s latest wand that combines six skincare gadgets in one for $480.

I’m not going to get it. If I did, I’m sure my FYP would become even more of a skincare QVC than it already is. Look, I know this is TikTok working as intended, but I do resent it. It makes me feel even more beholden to the skincare zappy wand. I spent a lot on the thing, so I will be using this until it dies. Knowing all this, it’s frustrating when I find the same marketing tactics creeping into my brain again. Maybe I haven’t seen better results because I don’t use it with the same Medicube collagen cream as the influencers. I already know that results will vary and that OTC skincare can only do so much. Paid influencers also aren’t incentivized to talk about nuance or caveats. Why else am I not seeing more people say the wand was mid or disappointing? I have to yell at myself not to fall into the trap again.

Medicube Age-R Booster-H standing upright in a bathroom
I’m gonna use this thing until it dies. I don’t hate it, but I regret caving to the e-commerce algorithm.

I fell into skincare TikTok because, during the pandemic, applying skincare was a soothing way to wind down after a stressful day. I enjoyed watching nerdy videos about sunscreen filters and listening to cosmetic chemists talk about the efficacy of certain ingredients. I liked how funny people talked about their day while slathering on retinol. The e-commerce aspect was always there, but once upon a time, it felt a bit more like a friend telling you about a product they’d stumbled upon. Somewhere, something shifted. Now I feel like I’m five again, sitting on the living room floor and watching a lady with a bouffant sell me a neck cream on the home shopping network.

And while I don’t begrudge the influencers a living, I do wonder how I ended up zapping my face with this $330 skincare wand.
