
Razer Blade 14 vs. Asus ROG Zephyrus G14: uncompromised value

Where the Blade comes out ahead on performance, the Zephyrus wins on value.

Fourteen-inch gaming laptops are the best gaming laptops. They can run most PC games over 60fps on high settings, but at under four pounds and less than an inch thick, they’re still easy to carry around. There are compromises: their size limits how much power they can handle before the components melt into goo, they have loud cooling fans, and they can’t match the battery life of thinner, lighter non-gaming laptops. But they’re the right compromises for a gaming laptop that can also be your lecture hall or coffee shop laptop.

The Razer Blade 14 and Asus ROG Zephyrus G14 are two of the most highly regarded 14-inch gaming laptops. The Zephyrus — one of our longtime favorites — has a redesigned chassis and new OLED panel, while the 2024 Blade 14 is a small spec bump over the previous generation. It made sense to pit them against each other in a benchmark brawl.

Just a nudge: Razer’s Blade 14 has better performance

The $2,700 Blade I tested comes with an AMD Ryzen 9 8945HS, RTX 4070, 32GB of memory, and 1TB of storage. I also tested two different configurations of the Zephyrus G14: one $2,200 version with the same hardware specs as the Blade 14 and a $1,700 version with an RTX 4060 and 16GB of memory. The Blade and higher-end Zephyrus have different memory and storage speeds but nothing that would produce a meaningful difference.

At 1080p on the highest presets, all three can push over 60fps in most games. Cyberpunk 2077 is an outlier if ray tracing is enabled and DLSS 3.5 is off — but turning DLSS on more than triples the frame rate.

The Blade outperformed the 4070 Zephyrus by between 9 and 18 percent in every benchmark, despite having the same CPU and GPU and the same amount of memory. Razer gives the RTX 4070 as much juice as the GPU can handle, up to 115W, or 140W with Dynamic Boost enabled. Asus caps the Zephyrus’ total graphics power at 65W, with Dynamic Boost up to 90W. The Blade 14 can generate more frames because it literally has more power.
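As a rough illustration of how those percentage deltas are computed, here's a quick sketch of the uplift arithmetic. The frame rates below are hypothetical stand-ins, not the review's actual benchmark data.

```python
# Percentage uplift of one laptop's benchmark results over another's.
# NOTE: these fps figures are hypothetical examples, not measured data.
blade_fps = {"Game A": 71, "Game B": 118}
zephyrus_fps = {"Game A": 62, "Game B": 100}

for game, blade in blade_fps.items():
    zephyrus = zephyrus_fps[game]
    uplift = (blade - zephyrus) / zephyrus * 100
    print(f"{game}: Blade ahead by {uplift:.0f}%")
```

With these made-up numbers the Blade would come out 15 and 18 percent ahead, which is the same kind of spread the benchmarks showed.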

Winner: Razer Blade 14

OH-LED-LA: the Zephyrus G14 has a better display

The Blade 14 uses that extra GPU power to drive its 2560 x 1600 IPS display as close as possible to its 240Hz refresh rate — useful if you’re playing competitive esports (or just like fast refresh rates). It reaches about 500 nits of brightness, so it can handle any normal lighting situation inside or outside with minimal glare.

But the Zephyrus has a 2880 x 1800 OLED display with visibly richer colors and bolder contrasts. Its 120Hz refresh rate balances nicely with its graphical capabilities for gaming and creative work. It only hits 400 nits, but that’s a nonissue indoors.

The Zephyrus G14’s LED strip can pulsate to music playing through the laptop speakers.

Both laptops hit 100 percent of sRGB and P3 color gamuts; the Zephyrus also covers 100 percent of AdobeRGB while the Blade only hits 89. (AdobeRGB matters more for print work, not gaming.)

The Zephyrus’ display is HDR500-certified and supports G-Sync; the Blade 14 does not have HDR, and even though the display supports AMD FreeSync Premium, it doesn’t work with the discrete GPU. Razer’s website says the Blade 14 supports G-Sync on external monitors, but that didn’t work in my testing.

Winner: Asus ROG Zephyrus G14

Design: one of them still looks like a MacBook

The Blade 14 has always looked like a MacBook Pro, and it looks even more like one in silver. (The Blade still comes in black, of course — but now, so does the MacBook Pro.) Its black keys have rounded edges that contrast with the clean, straight lines of the chassis. The keys feel a touch too small, and the silver gaps between them further accentuate how spaced out they are.

The Blade’s trackpad looks smoother than it feels; my fingertips sometimes caught and skipped. The Blade is also a little thicker and larger: 0.70 inches thick compared to the Pro’s 0.61 inches and 4.05 pounds compared to 3.4 pounds. But it comes with a treat: two slots for up to 96GB of socketed RAM; the Zephyrus’ memory is soldered to the board.

The Blade 14 supports USB4 Type-C, USB-A, and HDMI.
The Zephyrus G14 also supports USB4 Type-C, USB-A, and HDMI.

Monica Chin, The Verge’s former laptop reviewer, was worried that the new Zephyrus would look too much like a MacBook, but it pleases me to report that it doesn’t. Even though the dot matrix on the lid is gone, the diagonal LED strip still adds flair. It’s slimmer and lighter than the Blade 14, too: 12.76 x 8.66 x 0.70 inches, weighing 3.53 pounds, and it comes in either white or dark gray. The only MacBook-like thing about the Zephyrus G14 is its smooth trackpad.

Both versions of the Zephyrus I tested had good battery life for a gaming laptop — an average of 6.5 hours when used only for web browsing, word processing, and video streaming. It can last up to three hours while gaming, depending on the game and graphics settings. (On high graphics at 1080p, Cyberpunk 2077 drained the battery in an hour, while Botany Manor drained it in just under three.) As for the Blade 14, its battery lasts only up to four hours, even on simple tasks.

Winner: Asus ROG Zephyrus G14

Thermals: both handle the heat, but the Blade handles more

Small gaming laptops often have trouble cooling their components, but both the Blade 14 and Zephyrus G14 have a handle on their thermals. HWiNFO reported no throttling on either during the Cinebench 2024 benchmark.

While not uncomfortable, the Blade 14’s S and D keys, and the left side of the spacebar, become noticeably warmer than the rest of the keyboard after gaming for two hours straight.

Sorry, Blade 14, but black keys on a silver chassis just aren't doing it for me.
The Zephyrus G14 pulls off the uniform white look without feeling sterile.

But I’ll take that over the dozens of small vents that span the Zephyrus G14’s bottom chassis. I was fine using the Zephyrus in my lap to dash off a few emails or watch an episode of Shōgun, but gaming felt like getting blasted by hot desert air.

The Blade 14 handles a lot more GPU wattage in a chassis nearly the same size as the Zephyrus, with fewer vents on the bottom.

Winner: Razer Blade 14

The Blade is gaming-first, but the Zephyrus is better balanced

The Zephyrus G14 — with an RTX 4070 — is still the best 14-inch gaming laptop.

For $2,000 to $2,200, it maintains a fine balance between features, performance, and price. You get a quality OLED HDR display, commendable battery life, a fun but not too flashy design, and high frame rates despite the GPU power cap. The only true fault with this laptop is the friggin’ vents that span the entire bottom. The 90W TGP limit and soldered-in memory are disappointing, but I’m happy to accept those small concessions for everything else the Zephyrus G14 offers.

The similarly configured Razer Blade costs up to $700 more. It has a higher refresh rate but no HDR or G-Sync support, and its 9 to 18 percent higher frame rate at 1080p is barely noticeable. The Blade 14 is also a little heavier than the Zephyrus G14, its battery life is significantly shorter, and it looks even more like a MacBook with its new mercury colorway.

Winner: Asus ROG Zephyrus G14

Photography by Joanna Nelius / The Verge

Apple iPad Pro (2024) review: the best kind of overkill

Apple’s latest high-end tablet is a marvel of hardware design still in need of the software and accessories to really make it sing. But wow is it fun to use.

The new iPad Pro is a genuine achievement in tablet design. It’s the closest thing I’ve ever seen to the vision that a tablet should feel not like a computer but, rather, like a piece of glass in your hand. I’m honestly not sure how you’d shrink it more; the USB-C plug I use to charge the 13-inch Pro I’ve been testing is already thicker than the iPad itself. It’s a light, fast, remarkable machine.

But does that really count for much anymore? The iPad has been a hardware triumph for years, plenty fast and light and long-lasting for just about anything you could do with it. The problem has always been the software: by forcing the device to run a locked-down, oversimplified operating system, Apple has prevented this ultraportable and ultrapowerful machine from becoming the full-fledged computer so many users want it to be.

The way Apple seems to see it, the iPad’s appeal is greater than the sum of its parts. No, you can’t do some of the things you’d do on a MacBook. But you can hold it in your hands in bed. You can draw on the screen. You can play mobile games. Everyone at Apple speaks of the iPad’s “versatility” as its main selling point — the fact that it’s a jack-of-all-trades is a feature, not a bug. The hard part about trying to do everything, though, is that it’s really hard to do everything well.

Apple’s case for the iPad Pro seems to be that this is the device for the future. It has the processor, screen, accessories — everything you’ll need to be ready for the next decade of your computing life. Because pretty soon, AI will change everything, and you’ll be glad you had all the power to run it well. That might well be true! But none of it is real yet. And besides, the most important parts of that future will happen on the screen, not behind it.

This new iPad Pro feels, in many ways, like the finale of the 14-year history of the iPad, all the pieces finally in place. It also feels, as ever, like a futuristic device plagued by software stuck firmly in the past, one I’m not sure I’d recommend to most people.

I do love it, though.

A magic pane of glass

I’ve done almost all of my testing on one of the highest-end versions of the iPad Pro: a 13-inch space black model with 1TB of storage, 16GB of RAM, and a built-in cellular connection. That’s a $2,099 tablet right there. Add in the $129 Pencil Pro and the new $349 Magic Keyboard, and I’m reviewing $2,577 worth of iPad — the amount you would spend on a high-end laptop. You can get it cheaper, of course, though the Pro is never exactly inexpensive: the 11-inch model starts at $999 and comes with 256GB of storage and 8GB of RAM. (That entry-level storage option is double what it used to be, which is a nice change but still spendy.)

No matter which Pro you buy, though, you get access to the three most important new things about this new model: the chip, the screen, and the design.

The 13-inch iPad Pro is enormous. But it’s much lighter now.

Let’s do the chip first, because it’s important and also slightly confusing. The Pro runs on the M4 processor, a brand-new chip Apple designed specifically to accommodate the Pro’s new screen and design, and it’s as fast as you’d hope. In my benchmark tests, the M4-powered Pro scored about 50 percent higher than the previous M2-running model. In practice, it definitely doesn’t feel 50 percent faster, but it does feel faster.

Apps load and close a half-beat faster with the M4, even complex games run perfectly smoothly (I still can’t believe how good Call of Duty: Warzone Mobile looks on this device), and iMovie renders video noticeably more quickly than on the 11-inch M2 Pro I’ve been using for a couple of years. Individually, these aren’t earth-shattering upgrades, but particularly if you’re doing a lot of intense photo and video work or even love a long Warzone session, it’s a real performance bump. And in all my testing, I’ve never noticed the device getting hot in my hands. Occasionally very slightly warm, maybe, but that’s it.

The top-tier models of Pro — with 1TB or 2TB of storage — get the best M4, with an additional performance core in the CPU. Yay for more power, I guess, but I’d be astonished if there were any way to tell the difference in everyday use. In most cases, the iPad’s raw performance hasn’t been an issue for a very long time.

The M4’s main practical purpose is to power the new OLED display. Apple’s new “Tandem OLED” setup basically smashes two OLEDs together to get a sharper, brighter panel. Apple calls it Ultra Retina XDR, which is a ridiculous name, but whatever, it works beautifully. All of the traditional upsides of OLED are immediately apparent: since OLEDs control each pixel individually, you get much richer blacks, so the letterboxes above and below a video just disappear into the bezel, and photos look much more dynamic. Colors are incredibly vibrant — to the point of occasionally looking too contrasty and HDR-y to my eyes. The Pro’s peak brightness is significantly brighter than the new Air, too, which is tough to pull off with an OLED.

The Pro’s OLED display is a big step up from anything on an iPad before.

The only downside I’ve noticed in the display so far is that the OLED seems to pick up a little more glare and reflection than the Air’s LCD panel. When I’m using it outdoors, that has meant I crank the brightness a little more than I’d like to be able to see everything on the screen. But that’s a tiny complaint; this screen looks fantastic — and I haven’t noticed battery draining faster at max brightness than before.

On the design front, the new Pro is more of a refinement than a redesign, but the difference is still pretty remarkable. The thinness is one thing — at 5.1mm thick for the 13-inch model and 5.3mm for the 11-inch, they’re the thinnest iPads yet — but the weight is what really gets me. The 13-inch Pro I’ve been testing weighs about a quarter of a pound less than last year’s model, which doesn’t sound like much but is very noticeable when I’m holding this big slab of glass in my hands on the couch. I’ve always thought the larger-size iPads were way too big to actually use, but I’ve been holding and using this one a lot. It’s so thin and light that I’ve worried about it being fragile. So far, it’s been sturdy.

LOOK HOW THIN THAT IS.

The only other big design change here is that Apple finally — finally — put the front-facing camera in the correct spot: in the middle of the long side of the iPad. This is very much a landscape-first device now, but that’s a good thing! The iPad, in general, absolutely is a landscape-first device. I’m not particularly impressed with the quality of the front-facing camera, but it’s fine, and it’s much more useful now.

Apple doesn’t seem to have sacrificed anything in the name of being thin and light. As a pure design and engineering exercise, it’s a home run.

Feature creep

There are basically two types of iPad users. (This is an oversimplification, but go with me.) The first type wants a simple way to send emails, read news, do the crossword, look at photos, and browse the web. For those people, the new iPad Pro is total overkill. Everything about it is a little better than the new Air or even the newly cheaper base iPad, but not so much better that I’d recommend splurging unless you really want that OLED screen. (If you do, please know: I get it. I’m with you.)

As ever, the biggest problem with the iPad is iPadOS.

The other type of iPad user does all those things but also has an iPad-specific feature or two that really matters to them. Musicians love it for turning sheet music; students for handwriting notes; filmmakers for quickly reviewing footage; designers for showing interactive renders to clients. When Apple talks about how “versatile” the iPad is, I think this is what the company means. The iPad is not all things to all people, but it should have something for everyone. By putting ever more power into the device, Apple is trying to expand the number of those features that might appeal to you.

New features this year come mostly in the form of the Pencil Pro. It has a nifty new squeeze gesture that is useful and makes it quicker to bring up menus and commonly used tools. Apple’s also letting developers customize what happens when you squeeze in their apps, so expect some cool and deeply weird integrations soon. The new Barrel Roll feature is also going to be a big win for artists of all sorts, now that you can turn your virtual brush or pen just by twisting the Pencil as you draw. (It works really well, though honestly, I’m woefully unqualified to review anything from an artist’s perspective. We’ll have more on that front soon.)

Same goes for the new Magic Keyboard, which is my personal favorite upgrade of the whole lot this year. When you dock the iPad in the attachment, it adds a full keyboard and a trackpad, floating the iPad above it — it’s the most laptop-like way to use an iPad. The new model is sturdier than the last, though it does still wobble a bit when you touch the iPad’s screen. The keyboard feels wonderful, right in step with a MacBook’s keys or the traditional Magic Keyboard. Now that there’s a row of function keys and a bigger trackpad, I can use the device for hours without ever picking my hands up. Best of all, it’s about 50 grams lighter than before (658g on the new model, according to my kitchen scale, compared to 710g on the last), which contributes to the overall smaller footprint of the new Pro.

The row of function keys makes the Pro a very functional laptop replacement.

In my own use, my iPad hardly ever leaves the keyboard case. I use the Magic Keyboard for journaling, emailing, and just as a stand while I’m cooking and watching shows. Having a better keyboard in a smaller package matters a lot to me. But it won’t to a lot of people, especially at $299. With both of its accessories, Apple is making the Pro more appealing to the people who might already have a Pro and not doing much to win over those who don’t.

There is, I should at least note, the possibility that AI could change the whole equation. Maybe generative AI will make Photos so much better that everybody suddenly wants a big, beautiful screen. Maybe Siri will get so good that the iPad will become a smart home controller. Maybe the camera software will be so spectacular that you’ll use a tablet for all your video calls forever. Maybe, maybe, maybe. WWDC is in a few weeks, and I expect Apple to aggressively try to convince you that advances in AI make the iPad Pro more than just an iPad. If it can make the argument that a super-powerful, super-portable, jack-of-all-trades device is what you need in the future, I’ll probably be running to buy an iPad Pro.

For now, it’s just an iPad. The best iPad ever, I think — maybe even the best iPad you could reasonably ask for. But the story of the iPad — the “magic pane of glass,” as Apple is so fond of calling it — is actually all about software. The iPad’s software has let its hardware down for years. Apple has led us to believe that’s about to change, that this year’s WWDC will be the great turning point for AI and iPads and everything. We’ll see. Until then, the iPad Pro is almost too good for its own good.

The new Apple iPad Air is great — but it’s not the one to get

The iPad Air is an excellent iPad — and that’s all. | Photo: David Pierce / The Verge

The iPad Pro is a beast. The two-year-old iPad is more compelling than ever. So what is the Air even for anymore?

The new iPad Air is very good. If you buy one, you’ll almost certainly like it. That’s it, that’s the review.

But is this the iPad you should buy? That’s a more interesting question. The iPad Air is a study in tradeoffs, even more so than before. Starting at $599, it’s not the cheapest iPad you can buy, nor is it the most impressive. It doesn’t support all the accessories, but it does support some of the accessories. It’s fast but not the fastest, thin but not the thinnest, powerful but not the powerful-est. It is Apple’s attempt to find the Goldilocks middle ground — the features that matter most to the most users and nothing else.

Outside of a couple of specific scenarios, I don’t think I’d tell you to buy this year’s iPad Air. Not because it’s not great — it is great! It’s just that for $250 less, you can get the base iPad, which is just about as good at every common iPad activity. The 10th-generation iPad is a couple of years old at this point, but it’s still an excellent device, especially after Apple lowered its price from $449 to $349. The iPad, not the iPad Air, is the right iPad for most people.

The new Air is pretty much last year’s iPad Pro in the body of last year’s iPad Air. The two models are identical other than the screen size. The new 13-inch model is obviously larger in every dimension and about a third of a pound heavier than the 11-inch Air, which is exactly the same size and weight as the last-gen Air.

Both new Airs run the same M2 chip as the old Pro and, in my testing, run it practically identically — it’s a fast and reliable chip, though the new M4 processor in this year’s iPad Pro runs laps around it in benchmark tests. The screen is the same as last year’s Air, the battery life is the same, and the rear camera is the same — it’s just a spec bump on the same thing.

In my testing, there’s really only one change from the old Air that I’ve noticed: Apple moved the front-facing camera to the middle of the landscape edge, which means I can use it for video calls without looking like I’m always staring up and away from the screen. This is a great change, and one Apple should have made a long time ago. If you do want to buy an Air, I’d recommend this one over the previous generation just to get the camera in the right place.

Next to this year’s Pro, on the other hand, the Air definitely feels like a lesser model. The Pro has a much better OLED screen, that ultra-powerful M4 chip, full Thunderbolt support on the USB-C connector, more speakers, more storage in every price tier, and is lighter and smaller at both screen sizes. You pay handsomely for those upgrades, but they’re real upgrades.

The Air is thin, but it’s not that thin. | Photo: David Pierce / The Verge

But honestly? If you’re just looking for a way to send emails, browse the web, play games, and maybe make an iMovie or two, none of that will really change the way you use your iPad. An iPad is an iPad is an iPad, and until Apple either fixes a bunch of things or opens up the operating system — and I wouldn’t hold my breath on either one — you just aren’t going to get enough out of all that extra power to make it a must-have upgrade. You can do lots of things on an iPad, which is great! But the list is pretty much the same no matter which tablet you’re holding. The iPad Pro is the best iPad, no question about it, but it’s also a very expensive iPad. And it’s still an iPad.

There are only two Pro features that I truly missed in everyday use after switching to the iPad Air. The first is Face ID: the Air uses Touch ID in the home button to log you in to your device, which works well enough, but Face ID on the Pro makes it feel like you never have to log in at all. The second is the row of function keys on the Magic Keyboard attachment. On the 13-inch Air in particular, the Magic Keyboard is big and roomy and lovely to type on — which means I’ve missed having quick access to playback, brightness, and more.

Love the Magic Keyboard. Miss the function keys. | Photo: David Pierce / The Verge

In real use, the Air is much closer to the base iPad than the Pro, which puts it in an awkward tweener position. You do get the M2 chip instead of the A14 Bionic, and as Apple continues to push into on-device AI features, it’s possible that having a bonkers amount of processing power will become very useful. The M2 is certainly the more future-proof option, but the A14 Bionic is fully capable of handling a typical iPad workload.

Otherwise, the base iPad and the Air have the same cameras and camera placement, the same Touch ID system, and the same battery life. The iPad is a bit larger than the Air, but we’re talking hundredths of inches and pounds. Neither has a headphone jack, which remains dumb and bad. The Air’s screen is definitely better — it’s probably the most important spec upgrade over the regular iPad. But the regular iPad is good enough — just don’t look at them side by side. Ignorance is bliss; it’ll be fine.

The Air gets points for supporting the Pencil Pro, which the regular iPad doesn’t. The iPad gets points for having a function row on its Magic Keyboard Folio but loses some because it doesn’t feel as sturdy as the larger accessory. (Can I just say, by the way, that it makes exactly no sense which keyboards get which features on which iPads? No sense at all.) The iPad also comes in much nicer colors, though I love the look of the white Magic Keyboard, and that only comes with the Air.

The 10th-gen iPad is still a terrific (and newly cheap) tablet. | Photo by Dan Seifert / The Verge

Ultimately, I think I can answer the Air vs. iPad debate in two questions. Do you want a big screen? Do you use the crap out of your Apple Pencil? If so, buy the Air. The 13-inch model is the cheapest big screen in Apple’s lineup — a whopping $500 less than the comparable iPad Pro — and the 11-inch model is the least expensive way to get access to the Pencil Pro. Done and done.

Otherwise, buy the plain ol’ iPad, which is an already terrific tablet at a newly terrific price. There’s even a better way to upgrade: I’d urge you to spend $150 upgrading the base iPad to the cellular model rather than $250 upgrading to the Air. Having an iPad that is just always connected, without having to think about it, is a game-changer for tablet life.

My standard buying advice is to buy the best stuff you can afford and then keep it as long as possible. But I’m confident that even a two-year-old 10th-generation iPad is capable enough to do most things really well for a long time. So is the Air, obviously! But the bad news for Apple, and the good news for you, is that every iPad is a great iPad — including the cheapest one.

The Garmin Lily 2 was the tracker I needed on vacation

The Lily 2 is a small, unassuming tracker that suits casual users. | Photo by Amelia Holowaty Krales / The Verge

Its limitations made it fall short in daily life but ended up being a plus while trying to disconnect from the world.

On my last day of vacation, I sat on a pristine beach, sipping on a piña colada while staring at a turquoise Caribbean Sea. In four days, I’d charged my Apple Watch Ultra 2 three times, and I was down to about 30 percent. On the other wrist, I had the more modest $249.99 Garmin Lily 2 Sport. It was at about 15 percent, but I hadn’t charged it once. Actually, I’d left the cable hundreds of miles away at home. While pondering this, the Ultra 2 started buzzing. My phone may have been buried under towels and sunscreen bottles at the bottom of a beach bag, but Peloton was having a bad earnings day. The way that watch is set up, there was no way it would let me forget. The Lily 2 also buzzed every now and then. The difference was that reading notifications on it was too bothersome and, therefore, easily ignored.

That tiny slice in time sums up everything that makes the Lily 2 great — and perhaps not so great.

The hidden screen is a bit dim in direct sunlight and doesn’t fit a ton of information on it.

My 10 days with the Lily 2 were split into two dramatically different weeks. The first was a chaotic hell spent zipping here and there to get 10,000 things done before vacation. The second, I did my very best to be an untroubled beach potato. That first week, I found the Lily 2 to be cute and comfortable but lacking for my particular needs. On vacation, its limitations meant it was exactly the kind of wearable I needed.

I wasn’t surprised by that. The Lily 2 is not meant to be a mini wrist computer that can occasionally sub in for your phone. It’s meant to look chic, tell you the time, and hey, here’s some basic notifications and fitness tracking. That’s ideal for casual users — the kind of folks who loved fitness bands and Fitbits before Google started mucking around with the formula.

The main thing with the Lily 2 is you have to accept that it’s going to look nice on your wrist but be a little finicky to actually use. The original Lily’s display didn’t register swipes or taps that well. It’s improved a smidge with the Lily 2, but just a smidge. I found reading notifications, navigating through menus, and just doing most things on the watch itself nowhere near as convenient as on a more full-fledged touchscreen smartwatch. This extra friction is a big reason why the Lily 2 just didn’t fit my needs in daily life.

As a fitness tracker, the Lily 2 is middling. The main additions this time around are better sleep tracking and a few more activity types, like HIIT, indoor rowing and walking, and meditation. There are also new dance fitness profiles with various subgenres, like Zumba, afrobeat, jazz, line dancing, etc. That said, the Lily 2 isn’t great for monitoring your data mid-workout. Again, fiddly swipes and a small screen add too much friction for that.

I also wouldn’t recommend trying to train for a marathon with the Lily 2. Since it uses your phone’s GPS, my results with outdoor runs were a mixed bag. One four-mile run was recorded as 4.01 miles. Great! Another two-mile run was logged as 2.4 miles. Less great. It’s a tracker best suited to an active life, but not one where the details really matter. Case in point: it was great for tracking my general activity splashing around and floating in the ocean — but it’s not really the tracker I’d reach for if I were trying to track laps in the pool.

At 35mm, it’s a skosh bigger than the original Lily but much smaller than just about every other smartwatch on the market. It’s lighter than most at 24.4g, too. That makes this a supremely comfortable lil watch. Most days, I forgot I was wearing it.

While I’m no fashionista, I didn’t feel like my lilac review unit was hard to slot into my daily wardrobe. But if playful colors aren’t your thing, the Classic version is $30-50 more and has a more elegant feel, a more muted color palette, and nylon / leather straps. (It also adds contactless payments.)

As a woman with a small wrist, the 35mm size is a plus. But while I personally don’t think the Lily 2 has to be a women’s watch, it is undeniably dainty. If you want something with a more neutral vibe or a slightly bigger size, Garmin has the Vivomove Trend or Vivomove Sport. Withings’ ScanWatch 2 or ScanWatch Light are also compelling options.

The sensor array uses the last-gen Garmin optical heart rate sensor, but that’s fine on a casual tracker.

Ultimately, the Lily 2 is great for folks who want to be more active while trying to cut down on notifications. It’s also a great alternative if you miss the old Misfits, Jawbones, or Fitbit Alta HR. Deep down, I wish that were me, but the reality is I have too much gadget FOMO and care way too much about my running data. That said, the next time I go on vacation — or feel the urge to disconnect — I think I’ll reach for the Lily 2 and try to leave the rest of my life at home.

Goodbye to Apple’s Smart Keyboard Folio, the best iPad Pro accessory

An image of Apple’s Smart Keyboard Folio attached to an M1 iPad Pro.
Farewell to a real one. | Photo by Chris Welch / The Verge

It’s versatile, doesn’t weigh much, and no one else makes anything quite like it for the iPad Pro. Hopefully it’s not gone for good.

I had a sneaking suspicion this was going to happen. All the rumors about the new iPad Pro — its shift to an OLED display, the more premium Magic Keyboard — had me convinced that Apple was going to quietly move on from the quirky, very not luxurious Smart Keyboard Folio that became my preferred carry for the 2018 iPad Pro and, later, the M1 iPad Pro.

Sure enough, the Smart Keyboard Folio isn’t compatible with the OLED iPad Pros. The 11-inch version can still be used with the sixth-generation iPad Air, but that’s all. So if you’re set on Apple’s very best tablet, it’s not an option anymore. And with no alternative quite like it anywhere in sight, I’m bummed.

Before I get to the praise, let’s touch on the negatives. The Smart Keyboard Folio has no trackpad, so unless you pair a mouse with your iPad, the only way to navigate around is by touching the screen. That’s not ideal over long durations, but the product’s whole purpose, at least to me, has always felt like a keyboard tailored for short bursts of productivity. Fire off an email? Absolutely. Post a blog? Yep, I’ve written plenty of posts on The Verge using it. If you ever wanted to work on a novel, the Magic Keyboard was always there waiting in the wings as a step-up option for the real serious stuff.

My other critique of the Smart Keyboard Folio is one that Apple still hasn’t fully rectified with the new starts-at-$300 Magic Keyboard. The palm rest and keyboard deck are now aluminum, which is objectively an improvement. But on the outside, Apple’s still using the same old material that picks up smudges like no other and tends to age terribly. I’ve long hoped they’d switch to fabric like Logitech or just come up with something (anything) better, but nope.

A photo of an iPad Pro attached to Apple’s Smart Keyboard Folio, with a camera to the left. | Photo by Chris Welch / The Verge
The lightweight Smart Keyboard Folio was there for typing when I needed it, but it never felt like a chore to lug around.

And then there’s the price: the Smart Keyboard Folio for the 12.9-inch iPad Pro ran around $200 — pretty ludicrous when you consider how basic it was. It had no backlit keys. There were only two angles to pick from when using the iPad upright. It didn’t offer a spare USB-C port or any extra connectivity. It was literally just a folio case with a weird keyboard on the inside.

But you know what made up for all of that? Versatility and a lightweight design that even the newer and lighter Magic Keyboard still can’t match. In practice, it just worked exceptionally well. The fabric-covered keyboard felt damn near invincible. Sure, the keys barely had any travel, and I wouldn’t exactly describe the typing experience as “comfortable.” But the Smart Keyboard Folio was a keyboard when I needed it to be — I could write Verge articles using the thing from anywhere if there was breaking news — and it could just fold behind the screen when I was reading The New York Times, browsing the web, or retouching photos using an Apple Pencil with the tablet in my lap. Feeling my fingers against the keys in that flipped-back orientation was a little odd at first, but I got used to it in no time.

With the Magic Keyboard, you’ve got to fully detach the iPad Pro whenever you want to do some reading or use the device in a way where all you really need is the screen. Some people will prefer that, but the Smart Keyboard Folio was thin enough that you never really had to make a choice; you could always just leave it on no matter what you were doing.

A photo of Apple’s Smart Keyboard Folio on the 2018 iPad Pro. | Photo by Amelia Holowaty Krales / The Verge
They made the new iPad Pro thinner than ever but... got rid of this super thin keyboard? Someone make it make sense.

Then there was the fact that the folio keyboard was so damn light. It kept the iPad Pro feeling like an iPad in my bag. That has never, ever been the case with a Magic Keyboard attached. When it goes on, you’ve entered MacBook weight territory. I’m not saying there’s any problem with that, but with the Smart Keyboard Folio, there was something special about toting around such a powerful combo that always stayed so airy on my back.

At best, Apple is being somewhat stubborn in assuming that every iPad Pro buyer wants the tablet to feel like a laptop (and be a similar weight to one) whenever a keyboard is attached, which is what the Magic Keyboard gets you. If you want to view it with more pessimism, the company is intentionally doing away with what was a compelling, more affordable accessory — one that was easy to take anywhere — in hopes that more people will cave and fork over $300 for the only first-party keyboard that’s available for the new Pro.

Now, it falls to other companies to replicate the Smart Keyboard Folio — assuming any of them even decide to bother. For now, Logitech is just churning out a refreshed version of its Combo Touch, which has more of a Surface Pro vibe than anything else. It’s nothing like the folio, so I’m not optimistic that anyone will step in to fill the void.

You really don’t know what you’ve got until it’s gone, I suppose.

The Flextail Tiny Bike Pump is a solid pump half the time

The tiny Flextail pump inflated this city bike tire in 45 seconds. | Photo by Thomas Ricker / The Verge

Social media’s algorithms know that I ride a bike almost every day. My quiver includes a city bike, mountain bike, and gravel bike, in addition to one or two e-bikes I’m always in the process of reviewing. I’m also the family mechanic, which makes me responsible for 16 to 18 tires that I must keep inflated. So, you’d better believe I took notice when Instagram served me several ads for the Flextail Tiny Bike Pump.

The mini rechargeable pump works with Presta (the thin one) or Schrader (the old fatty) valves and promises ultra-fast inflation that maxes out at 100psi (about 7 bars) — enough for any bike that doesn’t require a stretchy wardrobe coordinated with your shoes and helmet.

The origins of the pump are suspect, as I see what looks to be the exact same product sold with branding like Cyclami, Toptoper, Rrskit, and Epoom at a variety of price points, some as low as $25. Flextail sells its version for $85 and lists the manufacturer as Huzhou Jingwei Outdoor Products on the box and device itself. The first pump Flextail sent me couldn’t pump a tire beyond 19psi before dying. Flextail sent me another that (mostly) lives up to the claims.

The thing that’s not mentioned in the ads I’ve seen is how loud the tiny pump is: 76dB at arm’s length, in my testing, which is akin to bending over to inspect a running vacuum cleaner or garbage disposal. Using it while stopped alongside forest trails generates more scowls than seeing a mountain biker in Lycra.

The Flextail Tiny Bike Pump does work, though. It’s much faster and smaller than the mini hand pumps riders usually carry in case of trouble. At 3.9 ounces (111 grams), it’s also just a bit heavier than the trusty 3.4-ounce (96 grams) Unich pump I regularly carry. But the Flextail pump also doesn’t strain your air valve mounts as much because it doesn’t require long periods of vigorously erratic pumping.

The Flextail pump’s biggest disadvantage is that it’s only good for a few zero-to-full inflations before needing a recharge, but that will vary by tire size and desired pressure. It’ll last much longer if you’re just topping up tires. Its tiny 2.59Wh battery recharges in as little as 25 minutes.

In my testing, on a city bike fitted with wide 700 x 40c tires and Schrader valves, I was able to pump one tire up to 45psi in 45 seconds. Then, moving to a gravel bike fitted with wider 700 x 42c tires and Presta valves, I was able to hit 50psi in 90 seconds before the pump quit in need of a recharge. That’s two real-world inflations per charge, for those keeping score.

The Flextail Tiny Bike Pump is so small and lightweight that I initially thought it would be ideal for bikepacking trips or even long day rides. But with only two inflations in the tank, I’d still want to carry a hand pump as backup alongside my patch kit and spare inner tube(s), and there’s no way my gram-obsessed brain would allow me to carry two pumps.

If your rig is an e-bike with a built-in USB charging port, then you’re already traveling with a giant power bank on wheels. That makes it easy to recharge the Flextail pump after depleting it because your side-of-the-road flat tire repair didn’t go as planned (it happens!). Just don’t forget your USB-C cable... and maybe a carbohydrate bar to snack on while you wait.

If you’re still interested, all I can say is that one of the two Flextail Tiny Bike Pumps I tested worked as advertised, and I bet you’ll have similar success from other brands that sell what looks to be the same Huzhou Jingwei Outdoor Products battery-powered pump for much less.

For everyone else, just buy a mini hand pump for much less money. They never need charging, are too big to lose, and will likely last a human lifetime — or two.

All photography by Thomas Ricker / The Verge

Rabbit R1 review: nothing to see here

Artificial intelligence might someday make technology easier to use and even do things on your behalf. All the Rabbit R1 does right now is make me tear my hair out.

“You’re holding a taco.”

My Rabbit R1 told me that the other day. I was sitting at a table at a cafe on the National Mall in Washington, DC, and I had just picked up this $199 orange rectangle of a gadget and pointed it at the food in my hand. With unwavering, absolute confidence, the R1 told me it was a taco.

It was a Dorito. A Cool Ranch Dorito, to be specific. I’d shown the R1 the bag just a few seconds earlier and asked about the calories. (The R1 got that bit right.) I moved the chip around and tried again — still taco. I could not convince this AI-powered gadget, theoretically at the cutting edge of a technological revolution, that I was holding a chip.

Over and over in my testing of the R1, I’ve run into moments like the taco encounter, where the whole thing just feels broken. It misidentified a red dog toy as a stress ball, then as a tomato, then as a red bell pepper that it assured me is totally safe to eat. I’d start playing a song on the R1, and then the device would stop responding but keep playing so that I couldn’t even pause it or turn the volume down.

For a while, the R1 couldn’t even tell the time or the weather. Rabbit finally fixed that with a software update on Tuesday, and the company promised many more updates to come — though now, instead of the weather being wrong by thousands of miles, it gives me the weather from about 15 miles away. I guess that counts for something.

Ever since the R1 debuted at CES, with a keynote filled with big promises and impressive demos, this device has been sold as a super-clever, ultra-helpful AI assistant. Rather than just answer ChatGPT-style questions, it was supposed to do just about everything your phone can do, only faster. A few months later, this device on my desk bears no resemblance to the one we were told about, the one that more than 100,000 people preordered based on promises and demos.

After reviewing the Humane AI Pin and finding it woefully unable to execute its ambition, I was excited about the R1. It’s cheaper, more whimsical, and less ambitious. After using the R1, I feel like Humane at least deserves credit for trying. The R1 is underwhelming, underpowered, and undercooked. It can’t do much of anything. It doesn’t even know what a taco looks like.

On the LAM

The most intriguing tech in the R1 is what Rabbit calls the “Large Action Model,” or LAM. Where a large language model, or LLM, is all about analyzing and creating text, the LAM is supposed to be about doing stuff. The model learns how an app works in order to be able to navigate it on your behalf. In a LAM-powered world, you’d use Photoshop just by saying “remove that lady from the background” or make a spreadsheet by telling your device to pull the last six quarters of earnings from the investor website.

There is basically no evidence of a LAM at work in the R1. The device only currently connects to four apps: Uber, DoorDash, Midjourney, and Spotify. You connect to them by opening up Rabbit’s web app, called Rabbithole, and logging in to each service individually. When you go to do so, Rabbit opens up a virtual browser inside the app and logs you in directly — you’re not logging in to a service provided by DoorDash but rather literally in to DoorDash’s website while Rabbit snoops on the process. Rabbit says it protects your credentials, but the process just feels icky and insecure.

I logged in to them all anyway, for journalism. Except for Midjourney, which I never managed to get into because I couldn’t get past the CAPTCHA systems that obviously thought I was a bot. The connection doesn’t do much anyway: the R1 won’t show you the images or even send them to you. It’s just typing an image prompt and pressing enter.

A photo of the Rabbit R1.
The R1’s camera can see things... it’s just not great at knowing what they are.

I’d love to tell you how Uber and DoorDash work better once you’re logged in, but I never got either one to successfully do anything. Every time I pressed that side button on the R1 — which activates the microphone — and asked it to order food, it spat back a warning about how “DoorDash may take a while to load on RabbitOS” and then, a second later, told me there was an issue and to try again. (If you have to include that disclaimer, you probably haven’t finished your product.) Same thing for Uber — though I was occasionally able to at least get to the point where I said my starting and ending addresses loudly and in full before it failed. So far, Rabbit has gotten me zero rides and zero meals.

Spotify was the integration I was most interested in. I’ve used Spotify forever and was eager to try a dedicated device for listening to music and podcasts. I connected my Bluetooth headphones and dove in, but the Spotify connection is so hilariously inept that I gave up almost immediately. If I ask for specific songs or to just play songs by an artist, it mostly succeeds — though I do often get lullaby instrumental versions, covers, or other weirdness. When I say, “Play my Discover Weekly playlist,” it plays “Can You Discover?” by Discovery, which is apparently a song and band that exists but is definitely not what I’m looking for. When I ask for the Armchair Expert podcast, it plays “How Far I’ll Go” from the Moana soundtrack. Sometimes it plays a song called “Armchair Expert,” by the artist Voltorb.

Not only is this wrong — it’s actually dumber than I expected. If you go to Spotify and search “Discover Weekly” or “Armchair Expert,” the correct results show up first. So even if all Rabbit was doing was searching the app and clicking play for me — which is totally possible without AI and works great through the off-the-shelf automation software Rabbit is using for part of the process — it should still land on the right thing. The R1 mostly whiffs.

About a third of the time, I’ll ask the R1 to play something, and it’ll pop up with a cheery confirmation — “Getting the music going now!” — and then nothing will happen. This happened in my testing across all of the R1’s features and reminded me a lot of the Humane AI Pin. You say something, and it thinks, thinks, thinks, and fails. No reason given. No noise letting you know. Just back to the bouncing logo homescreen as if everything’s A-okay.

The long and short of it is this: all the coolest, most ambitious, most interesting, and differentiating things about the R1 don’t work. They mostly don’t even exist. When I first got a demo of the device at CES, founder and CEO Jesse Lyu blamed the Wi-Fi for the fact that his R1 couldn’t do most of the things he’d just said it could do. Now I think the Wi-Fi might have been fine.

A photo of the Rabbit R1.
The R1 connects to Spotify but doesn’t do it very well.

Hot mic

Without the LAM, what you’re left with in the R1 is a voice assistant in a box. The smartest thing Rabbit did with the R1 was work with Perplexity, the AI search engine, so that the R1 can deliver more or less real-time information about news, sports scores, and more. If you view the R1 as a dedicated Perplexity machine, it’s not bad! Though Perplexity is still wrong a lot. When I asked whether the Celtics were playing one night, the R1 said no, the next game isn’t until April 29th — which was true, except that it was already the evening of April 29th and the game was well underway. Like with Humane, Rabbit is making a bet on AI systems all the way down, and until all those systems get better, none of them will work very well.

For basic things, the kinds of trivia and information you’d ask ChatGPT, the R1 does as well as anything else — which is to say, not that well. Sometimes it’s right, and sometimes it’s wrong. Sometimes it’s fast — at its best, it’s noticeably faster than the AI Pin — but sometimes it’s slow, or it just fails entirely. It’s helpful that the R1 has both a speaker and a screen, so you can listen to some responses and see others, and I liked being able to say “save that as a note” after a particularly long diatribe and have the whole thing dumped into the Rabbithole. There’s a handy note-taking and research device somewhere inside the R1, I suspect.

To that point, actually: my single favorite feature of the R1 is its voice recorder. You just press the button and say, “Start the voice recorder,” and it records your audio, summarizes it with AI, and dumps it into the Rabbithole. $200 is pretty steep for a voice recorder, but the R1’s mic is great, and I’ve been using it a bunch to record to-do lists, diary entries, and the like.

The most enjoyable time I spent with the R1 was running around the National Mall in Washington, DC, pointing the R1’s camera at a bunch of landmarks and asking it for information via the Vision feature. It did pretty well knowing which large president was which, when memorials were built, that sort of thing. You could almost use it as an AI tour guide. But if you’re pointing the camera at anything other than a globally known, constantly photographed structure, the results are all over the place. Sometimes, I would hold up a can of beer, and it would tell me it was Bud Light; other times, it would tell me it’s just a colorful can. If I held up a can of shaving cream, it identified it correctly; if I covered the Barbasol logo, it identified it as deodorant or “sensitive skin spray,” whatever that is. It could never tell me how much things cost and whether they had good reviews or help me buy them. Sometimes, it became really, really convinced my Dorito was a taco.

For the first few days of my testing, the battery life was truly disastrous. I’d kill the thing in an hour of use, and it would go from full to dead in six hours of sitting untouched on my desk. This week’s update improved the standby battery life substantially, but I can still basically watch the numbers tick down as I play music or ask questions. This’ll die way before your phone does.

A photo of the Rabbit R1. | Photo: David Pierce / The Verge
AI gadgets are coming — but they’re not great.

A vision in orange

Just for fun, let’s ratchet the R1’s ambitions all the way down. Past “The Future of Computing,” past “Cool Device for ChatGPT,” and even past “Useful For Any Purpose At All.” It’s not even a gadget anymore, just a $200 desk ornament slash fidget toy. In that light, there is something decidedly different — and almost delightful — about the R1. It’s a rectangle three inches tall and wide by a half-inch deep, and its plastic body feels smooth and nice in my hand. The orange color is loud and bold and stands out in the sea of black and white gadgets. The plasticky case picks up fingerprints easily, but I really like the way it looks.

I also like the combination of features here. The press-to-talk button is a good thing, giving you a physical way to know when it’s listening. The screen / speaker combo is the right one because sometimes I want to hear the temperature and, other times, I want to see the forecast. I even like that the R1 has a scroll wheel, which is utterly superfluous but fun to mess around with.

As I’ve been testing the R1, I’ve been trying to decide whether Humane’s approach or Rabbit’s has a better chance as AI improves. (Right now, it’s easy: don’t buy either one.) In the near term, I’d probably bet on Rabbit — Humane’s wearable and screen-free approach is so much more ambitious, and solving its thermal issues and interface challenges will be tricky. Rabbit is so much simpler an idea that it ought to be simpler to improve.

But where Humane is trying to build an entirely new category and is building enough features to maybe actually one day be a primary device, Rabbit is on an inevitable collision course with your smartphone. You know, the other handheld device in your pocket that is practically guaranteed to get a giant infusion of AI this year? The AI Pin is a wearable trying to keep your hands out of your pockets and your eyes off a screen. The R1 is just a worse and less functional version of your smartphone — as some folks have discovered, the device is basically just an Android phone with a custom launcher and only one app, and there’s nothing about the device itself that makes it worth grabbing over your phone.

Lyu and the Rabbit team have been saying since the beginning that this is only the very beginning of the Rabbit journey and that they know there’s a lot of work left to do both for the R1 and for the AI industry as a whole. They’ve also been saying that the only way for things to get better is for people to use the products, which makes the R1 sound like an intentional bait-and-switch to get thousands of people to pay money to beta-test a product. That feels cruel. And $199 for this thing feels like a waste of money.

AI is moving fast, so maybe in six months, all these gadgets will be great and I’ll tell you to go buy them. But I’m quickly running out of hope for that and for the whole idea of dedicated AI hardware. I suspect we’re likely to see a slew of new ideas about how to interact with the AI on your phone, whether it’s headphones with better microphones or smartwatches that can show you the readout from ChatGPT. The Meta Smart Glasses are doing a really good job of extending your smartphone’s capabilities with new inputs and outputs, and I hope we see more devices like that. But until the hardware, software, and AI all get better and more differentiated, I just don’t think we’re getting better than smartphones. The AI gadget revolution might not stand a chance. The Rabbit R1 sure doesn’t.

Photography by David Pierce / The Verge

How to get more space in your Google storage

Vector illustration with the Google Drive logo.
The Verge

For many of us, Google storage is the modern-day hard drive. It’s the place where our most important thoughts, documents, and memories reside. But just like with a traditional hard drive, the space isn’t infinite, and running out of room can be a real problem.

By default, Google gives you 15GB of space to use for everything associated with your account. That includes content connected to Gmail, Google Drive, and Google Photos (except photos backed up before June 1st, 2021). Needless to say, data adds up fast.

You can check your current storage status by visiting this page, and if push comes to shove, you can purchase more space there, too, for as little as $2 a month for an extra 100GB. But shelling out more money might not be necessary. A quick round of old-fashioned housekeeping could be enough to clear away your virtual cobwebs and give yourself ample room to grow. Here’s how to do it.

Delete Drive debris

Google Drive is a common place for space-sucking files to build up and wear down your quota, but tidying things up doesn’t take long.

  • Open this link, which will show you a list of all of your Drive files sorted by size with the largest items at the top.
  • Look through the heftiest offenders and delete anything you no longer need.
  • Click the gear-shaped icon in Drive’s upper-right corner, and select Settings, followed by Manage Apps.
  • Apps associated with your Google Drive storage can sometimes have hidden data, but all it takes is a couple of clicks to remove it. For any apps that have a note about hidden data, click the gray Options box to the right, and select Delete hidden app data.
Google Drive hidden data
For any apps that have a note about hidden data, click the gray “Options” box to the right, and select “Delete hidden app data.”

Free up Photos storage

Unless you have a Pixel 5 or earlier Pixel phone (in which case, you will, for now, keep the unlimited “Storage saver” option), every photo and video backed up to Google Photos after June 1st, 2021, counts against your Google storage. If you’ve been saving photos at their original sizes, you can free up tons of space by converting them to Google’s “Storage saver” option (which used to be called “High quality”). This compresses images down to 16MP and videos to 1080p (a change that’s unlikely to be noticeable for most people and purposes).

  • Go to the Photos settings page and select Storage saver.
  • If you switch to Storage saver, your previous photos won’t automatically be compressed. To do that, on the Photos setting page, look for the “Recover storage” button, which will compress many (but not all) of your existing videos and photos. (Check out the list on Google’s support page to see which images will be affected.)
You can compress many of your saved photos and videos to save space.
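For a sense of what that 16MP ceiling actually does to an image, here’s a quick back-of-the-envelope sketch in Python. (Google hasn’t published the exact resampling it uses; this only illustrates the pixel math of capping an image at 16 megapixels while preserving its aspect ratio.)

```python
import math

MP_CAP = 16_000_000  # the "Storage saver" pixel ceiling described above

def storage_saver_dims(width, height, cap=MP_CAP):
    """Return the dimensions after capping total pixels at `cap`,
    preserving the aspect ratio. Images already under the cap are untouched."""
    pixels = width * height
    if pixels <= cap:
        return width, height
    scale = math.sqrt(cap / pixels)
    return int(width * scale), int(height * scale)

# A roughly 50MP photo (8192 x 6144) gets scaled down to just under 16MP:
print(storage_saver_dims(8192, 6144))  # → (4618, 3464)
```

In other words, a 50MP shot loses about two-thirds of its pixels, while anything from a typical 12MP phone camera sails through unchanged.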

Another handy resource is the Manage storage page, which you get to by tapping Storage at the bottom of the left column. This will take you to a page that will tell you approximately how much more time you have before you fill up your storage space and offer to find (and delete) blurry photos, screenshots, and other possibly unwanted images that are taking up space.

Google has a page that helps you get rid of unnecessary images that are taking up space.

Say goodbye to Gmail junk

Emails don’t take up a ton of space, but you know what does? Attachments. Odds are, you’ve got plenty of old attachments sitting in your Gmail account that you don’t really need.

Here’s how to address that:

  • Go to the Gmail website and type “has:attachment larger:10M” into the search box at the top.
  • Identify any messages with disposable attachments and delete them. (There’s no great way to get rid of an attachment without also deleting the associated email, unfortunately, but you can always forward a message back to yourself and manually remove the attachment before axing the original.)
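Gmail’s search operators can be stacked to narrow this down further. As an illustration, here’s a small Python helper (my own, not anything Google ships) that composes a query from the documented has:attachment, larger:, older_than:, and category: operators; paste the result into Gmail’s search box.

```python
def gmail_attachment_query(min_size_mb=10, older_than=None, category=None):
    """Build a Gmail search query for messages with large attachments.

    Uses Gmail's documented search operators: has:attachment, larger:,
    older_than: (e.g. "1y", "6m", "30d"), and category:.
    """
    parts = [f"has:attachment larger:{min_size_mb}M"]
    if older_than:
        parts.append(f"older_than:{older_than}")
    if category:
        parts.append(f"category:{category}")
    return " ".join(parts)

# Attachments over 10MB, more than a year old, in the Promotions tab:
print(gmail_attachment_query(10, older_than="1y", category="promotions"))
# → has:attachment larger:10M older_than:1y category:promotions
```

Old promotional emails with big attachments are usually the safest things in an inbox to delete in bulk, which is why the age and category filters are worth combining with the size filter.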

If you’re like me, you tend to ignore most, if not all, of the email that finds its way into your Promotions tab. While no single promotional email takes up much space, they can build up over the weeks (or months).

  • If you just want to get rid of the entire mess, go to the Promotions section, click the small checkbox just under the search field, then Select all [number] conversations in Promotions, and hit the trash icon to delete everything.
  • If you’d rather remove specific promotions, open one of the emails from the company you no longer want to hear from. After clicking the Unsubscribe button next to the email address (because you don’t want to get any more of those, right?), go to the right of the address, click the three dots, and then Filter messages like this > Search. You can then use the above method to get rid of all the emails from that merchant.

And of course, these directions can be used with any other Gmail folder as well.

Gmail message filter showing emails from Walgreens.
By filtering messages, you can delete emails from a specific sender.

Now you can completely dispose of it all.

  • Open your Spam folder, and click the link to Delete all spam messages now.
  • Open your Trash folder, and select Empty Trash now to send everything away for good.

Feeling lighter is liberating, isn’t it?

Update May 1st, 2024, 10:41AM ET: This article was originally published on March 19th, 2019. The information on Google Drive and Google Photos has been updated and information on filtering has been added.

Beats Solo 4 review: playing both sides

Beats’ on-ear headphones get an overdue refresh with a more comfortable design, longer battery life, and wired audio over both USB-C and the 3.5mm jack — but no ANC.

They look like Beats headphones. They sound like Beats headphones. The battery life can stretch to a new high of 50 hours. Those things alone all but guarantee that the new Beats Solo 4 on-ear wireless headphones will prove just as successful as their predecessors — and it’ll be no time at all before you start seeing them worn by athletes and music stars at every turn.

But there’s more to these than a logo. Unlike the Solo 3, the fourth-gen cans uniquely support native software features (like one-tap pairing and Find My / Find My Device) on both Android and iOS; Beats has quietly become Apple’s Android-friendly brand, in case you weren’t paying attention. And for an old-school guy like me, I love that the company is putting an emphasis on wired, lossless listening over either USB-C or the 3.5mm headphone jack. Sonically, these are a world apart from Sennheiser’s Momentum 4 or the Bowers & Wilkins PX7 S2e headphones that I often carry — both of which are more expensive. But they’re also for much different audiences. As ever, Beats is about cultural cachet, that prominent “b” logo, and enjoyable (if not mind-blowing) sound.

That’s not to say Beats knocked everything out of the park. A complete lack of active noise cancellation in any $200 pair of wireless headphones is hard to overlook; the ear cushions have provided relatively good natural noise isolation in my local coffee shop and when traversing Brooklyn, but ANC is always appreciated when the clamor starts to bubble over. The short-lived Solo Pro had it, but not these, which are technically a sequel to the eight-year-old Solo 3.

So Beats’ flagship Studio Pro easily win out on the ANC front. But the Solo 4 do have one thing going for them: they’re passively tuned. On many wireless headphones, there’s an active EQ profile running at all times that provides the fullest sound. That’s all driven by the battery. Once you’re out of power, some headphones will stop playing — even when wired — or will fall back to very meager audio quality until you recharge. The Solo 4 will keep playing endlessly when plugged in even if the battery is dead thanks to that passive tuning, and the sound never changes. “Unlimited wired playback” is actually one of the bullet points advertised on the back of the box. We love to see it, though this inevitably means getting out a headphone dongle nowadays.

A photo of the Beats Solo 4 wireless headphones.
The “pink” Solo 4s often look very neutral depending on the light.
A photo of the Beats Solo 4 wireless headphones.
The left Beats logo doubles as your play/pause button, with volume above and below.

On your head, the Solo 4 wear well. They’re narrower, sleeker, and significantly lighter than the Studio cans. And they use the same “UltraPlush” memory foam pads as the Studio Pro, which are a key part of the comfort. Beats claims the new cover material — a failure point of some past headphones — should provide better durability and extended longevity compared to the Solo 3. There’s always a moderate amount of clamping force with Beats headphones; plenty of people use them at the gym or during outdoor activities. But despite my huge noggin, the pressure never reached unpleasant territory.

The memory foam ear cups help make the clamping force less noticeable.

I like the included fabric carrying case, too, but why doesn’t it match the color of your headphones? Blue, pink, and black are the hardware choices at launch, but Beats has a history of churning out many other colors as time goes on. The design of the headphones is similar to past models, and so are the controls. The left-side Beats logo acts as a play / pause button, and you’ve got volume controls directly above and below, so using these headphones is about as simple as it gets. You can double-tap the logo to skip forward a track or triple-press to go back — all very familiar controls for Beats fans.

Rather than integrating an Apple chip, which would make these lopsidedly appealing to iPhone owners, Beats is sticking with the same proprietary platform that has been the brains of its recent products. In practice, this means you’ll get some (but not all) ecosystem software tricks, regardless of whether you’re using iOS or Android. This feels like the right approach to me. Apple fans get at least one exclusive: personalized spatial audio with head tracking. But the Android crowd gets automatic device switching between Android, Chromebooks, and other devices.

The “4” is how you know these are the new ones, obviously.

I’ve been listening to the Solo 4 for several days, and the sound is honestly more restrained than I expected. They’re not particularly bassy and avoid overemphasizing any section of the frequency range; the goal was to land on a consistent tuning that fits right across music, podcasts, work meetings / voice calls, and more. Speaking of calls, voice quality is rather decent, with Beats having trained its ML algorithm “using over 7,000 hours of exposure to real-world environments.” Where the Solo 4 fall short compared to pricier headphones is in their overall richness and a fairly condensed soundstage that lacks much breadth. But for the target audience, I think they’ll prove more than adequate.

The omission of noise cancellation on the Solo 4 could be a real obstacle for some, but I don’t think it’ll be enough to dampen their appeal to the masses who’ve been cycling through Beats products for so many years now. Even if you’re buying largely for the cool factor, at least these on-ear headphones are now platform-agnostic, more comfortable, and more versatile since you can just plug in if you manage to run through that 50-hour battery life. As with all Beats products, it’s worth holding out until they go on sale — and the Solo 4 certainly will.

Photography by Chris Welch / The Verge

SwitchBot S10 review: with plumbing hookups, this robovac and mop is actually hands-free

Thanks to its self-cleaning mop, automatic water tanks, and supersize bin, this robot vacuum can go for two months without any manual intervention. 

As I type, a small white robot has just rolled past me, heading from my bathroom, where it just finished mopping and vacuuming the floor, to my laundry room. Once there, it docked to its water station, gurgled a lot as it emptied its dirty water tank and filled itself up with fresh water, then headed back to its charging station, where it settled down for a blow-dry (of its mop) and recharge.

That robot is the new SwitchBot S10. A robot vacuum and mop with a couple of unique features, the S10 was announced last year and, following a Kickstarter launch, is now available to buy for $1,119.99 (€1,099.99 / £999.99).

The S10 is the first combo vacuum / mop I’ve tested that can hook directly into your plumbing, so you don’t need a bulky, multifunctional charging dock to take care of the robot.

Instead of one giant dock, the SwitchBot has two small ones: a water refill station and a charging / auto-empty dock. These don’t have to be placed in the same room, the water station doesn’t even need a power outlet (it has a battery), and they are both more compact than other docks with the same function.

With its dual docks, the S10 can empty its dirty water, fill itself up with clean water, empty its dustbin, dry its mop, and charge itself. And because it’s hooked directly into my plumbing, I never had to deal with emptying or refilling big water tanks. In fact, the only manual labor the S10 requires is replacing its dust bag — something SwitchBot claims you’ll only have to do every two months.

The S10 is the only robot vacuum I’ve tested that cleans its own mop as it cleans the floor. Its roller mop uses a squeegee system, pushing the dirty water into a small tank while spraying the mop with clean water as it mops.

This meant less possibility of cross-contamination on the floors, and it finished the job faster than a robot that has to go back to its base to wash its mop would. I also didn’t have to deal with the grimy, smelly “sink” that most other multifunctional docks have to clean the mops in — and which has to be cleaned manually.

The SwitchBot’s water refill station. While the station fits under my sink, the robot is too wide to get under there.
The separate charging dock empties the robot’s bin and dries its mop.

I’ve been testing the S10 for about a week, and this system, which at first seemed a little circuitous, works really well. The downside is that all this work drains its relatively small 4,000mAh battery pretty quickly, and it couldn’t get through a full clean of my upstairs and downstairs (about 1,000 square feet) without a three-hour top-up.

Overall, I like the approach of the two docking stations, especially the self-filling water station, which was surprisingly easy to install. I put that in my laundry room, where it is largely out of sight, and installed the compact auto-empty dock in my bathroom, where it fits neatly under my heated towel rack.

SwitchBot isn’t the only company offering the option of plumbing in its docks. Narwal, Roborock, and Dreame all have plumbable options using their multifunction docks. However, these require power, whereas SwitchBot’s water station is a smaller, battery-powered device — which makes installation easier. (The battery is recharged by the robot.)

The S10 emptying its dirty water and refilling itself all from my sink. You can see the plumbing hookups here, too; it was a tight fit.

SwitchBot’s water station can hook into a number of water sources and drain lines — under a sink, by a toilet (draining into the bowl), or connecting to a dishwasher or washing machine supply line. The company’s Evaporative Humidifier (coming soon) can be refilled by the robot vacuum, and a dehumidifier that could empty itself into the S10 is being developed. This kind of smart home symbiosis is intriguing.

SwitchBot makes a wide range of smart home devices, from lights and locks to curtain motors and robot fingers. The S10 can work with all of these using the SwitchBot app to do things like dock the robot when the front door unlocks or start cleaning when the lights turn off. SwitchBot also supports Matter through its Hub 2, opening the door to more smart home integrations.
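SwitchBot also publishes an open REST API for exactly this kind of automation, so the S10 can be driven by scripts as well as by the app. As a rough illustration, here is a minimal Python sketch of the v1.1 request signing that every API call requires; the signing scheme and endpoint shape follow SwitchBot's published docs, but any specific S10 command name would be an assumption you'd need to verify against the device list endpoint.

```python
import base64
import hashlib
import hmac
import time
import uuid

API_HOST = "https://api.switch-bot.com"

def build_auth_headers(token: str, secret: str) -> dict:
    """Build the signed headers required by SwitchBot's open API (v1.1).

    The signature is an HMAC-SHA256 of token + millisecond timestamp + nonce,
    base64-encoded, per SwitchBot's published auth scheme.
    """
    t = str(int(time.time() * 1000))  # millisecond timestamp
    nonce = str(uuid.uuid4())         # random per-request ID
    payload = token + t + nonce
    sign = base64.b64encode(
        hmac.new(secret.encode(), payload.encode(), hashlib.sha256).digest()
    ).decode()
    return {
        "Authorization": token,
        "sign": sign,
        "t": t,
        "nonce": nonce,
        "Content-Type": "application/json",
    }

# A command would then be POSTed to:
#   {API_HOST}/v1.1/devices/{device_id}/commands
# with a JSON body like {"commandType": "command", "command": "...",
# "parameter": "default"} — the exact commands the S10 accepts are not
# confirmed here; query /v1.1/devices first to see what's exposed.
```

The token and secret come from the Developer Options screen in the SwitchBot app; the snippet above only assembles the headers and makes no network calls.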

I was impressed at how easy the S10’s station was to install, considering I am not a plumber (nor is my husband, who gave me a hand with this one since he’s better with a wrench than I am).

On a scale of one to 10, one being installing a smart light bulb and 10 being plumbing a smart faucet, this is a seven. It’s totally doable if you have easy access to the pipes and can handle a wrench. We installed it under an open utility sink (so no cupboard to deal with), but the water outlets were tucked behind the sink, so it was tight getting in there.

That was the toughest part. SwitchBot’s YouTube installation videos were very clear (don’t bother with the paper directions), and the attachments SwitchBot provided to hook the pipes into the plumbing were easy to use and well made. However, the dirty water pipe attachment wasn’t free-spinning, resulting in some contortions to get it attached. The whole thing took 30 minutes.

I wasn’t thrilled with how much piping there was, and I couldn’t find a good way to hide it all under my open sink. SwitchBot provides cable-tidy fittings, but I would prefer to switch them out for shorter pipes or to be able to cut them to fit. Because the S10 is so wide, it wouldn’t fit under my sink, so I had to put the dock next to the sink, which isn’t ideal.

If you don’t have the option of hooking up to water, SwitchBot has a water tank add-on coming later this year for $80. I tried this, and while it worked fine, you still have to deal with all the piping, so it’s not an elegant solution. If you can’t use the plumbing option, the S10 is not the right robot for you.

As a robovac, the S10’s specs are good, if not the best of the best. It has 6,500Pa suction, a 4,000mAh battery, and AI-powered obstacle avoidance (which uses an onboard camera). While Roborock’s, Dreame’s, and Ecovacs’ flagships have higher suction power and longer battery life, they cost significantly more.

The S10’s obstacle avoidance wasn’t as good as Roborock’s, and while it dodged fake pet poop, it got close — brushing up against it as it navigated away. It did well at avoiding cables, socks, and larger items like shoes.

The bot’s single rubber roller brush performed well in my tests, getting up all the oatmeal and rice on hard floor and doing an excellent job on cat hair on low-pile carpet. Generally, I prefer dual roller brushes, but the S10 is a heavy robot, and its weight seemed to help it dig down into carpet fibers.

Mopping is the main reason to buy this bot, and it’s very effective. Its rolling movement agitates the dirt, and with its 10N of downward pressure, the mop easily tackled dried milk and spilled OJ. The water station has the option to add a cleaning solution, which seemed to help with tougher grime. But the mop doesn’t oscillate as the pads on the Dreame X30 or Roborock Q Revo do, and some dirty paw prints were still slightly visible.

On the flip side, oscillating mopping pads can get hung up on things like rug tassels, cables, and room transitions, whereas the SwitchBot’s mop is tucked neatly under it, and it never got stuck. The mop raises when it goes over carpet (but only by 7mm), or you can program it to avoid carpet altogether or draw keep-out zones around high-pile rugs.

The mop is easy to remove — it slides out the side, so I didn’t have to flip the big beast on its back. It stayed clean throughout my testing and dried quickly. The mop is replaceable, and a two-pack costs $30.

The SwitchBot app has the key features you’d expect from a high-end bot, including lidar mapping, virtual no-go zones, and room-specific cleaning. But it is missing a few things. There are two cleaning modes, vacuum-and-mop and vacuum-only, each with multiple intensity levels, but there is no mop-only option and no smart cleaning modes. There’s voice control with all the main platforms, but it’s limited. It supports up to three maps, but you can’t add furniture to them. Additionally, on floors where there is no water station, it only vacuums — it won’t mop — and you can’t buy a second water station.

Overall, I was impressed with the S10. Its dual dock system is an innovative fix to the design problem of these big multifunction docks, and the possibility of connecting with a humidifier and dehumidifier takes us one step closer to a future where robots can do more in our homes than just clean the floors. But for today, the water hookup and self-cleaning mop make this the most hands-free cleaning robot I’ve tested.

The biggest downside is the short battery life, and I’m disappointed SwitchBot didn’t go for more milliamp hours, especially given the potential for using the water station for more tasks down the line. The self-refilling humidifier and self-draining dehumidifier are great ideas — if they get released. Maybe this ingenious robotics company could come up with an attachment for the robot that can refill your dog’s water bowl or water your plants. But all of that will require more power.

Photos by Jennifer Pattison Tuohy / The Verge

Mercedes-Benz won’t let Apple CarPlay take over all its screens

Image: Porsche (via Car & Driver)

Mercedes-Benz doesn’t have any plans to adopt Apple’s immersive, next-generation version of CarPlay, the German automaker’s CEO said in an episode of Decoder.

“The short answer is no,” Ola Källenius told The Verge’s Nilay Patel in response to a question about whether Mercedes-Benz will enable Apple CarPlay to take over all the screens inside its vehicles. Instead, he touted the need for a “holistic software architecture” to meet the needs of customers who are increasingly looking for a better technology experience from their vehicles.

Apple announced its next-gen version of CarPlay, in which the phone-mirroring feature would extend beyond the central touchscreen to also include additional screens like the gauge cluster, back in 2022. It was a bold move, with Apple signaling its desire to control core functions of the vehicle like HVAC, as well as the speedometer and odometer. But since then, the new CarPlay has yet to appear on any production models. Last year, Apple said that Porsche and Aston Martin would be among the first companies to adopt the new immersive display.

But Mercedes doesn’t appear to be in any rush to follow its luxury vehicle peers in letting Apple dominate the in-car experience for its customers. Instead, Källenius said that the company is working closely with Apple’s main rival, Google, in designing a new navigation feature that will build on Google Maps. The key difference there is that Mercedes’ own engineering team will be heavily involved in the process.

“I fundamentally believe that that holistic customer experience is best done by us, and we will serve you,” he said during the interview.

But Källenius said he still sees value in offering phone-mirroring services to his customers and has no plans to exclude their use — despite some in the auto industry turning away from them. Last year, General Motors made the controversial move to prohibit Apple CarPlay and Android Auto in its forthcoming lineup of electric vehicles, arguing that the company could provide a more comprehensive software experience than what exists on someone’s phone.

“We’re not fundamentalists to say, for some reason, we’re not going to allow a customer to use Apple CarPlay if that’s what they choose to do,” Källenius said. “So, we have Apple CarPlay. We have Android Auto. If, for some of the functions, you feel more comfortable with that and will switch back and forth, be my guest. You can get that, too.”

At the end of the answer, he reiterated his position that Apple’s next-gen CarPlay was a bridge too far for Mercedes. “To give up the whole cockpit head unit — in our case, a passenger screen — and everything to somebody else, the answer is no.”

Duel of the dual-screen laptops: Asus Zenbook Duo vs. Lenovo Yoga Book 9i

Their small variations in design make a big difference.

There’s no getting around owning a laptop these days, especially if we travel often or don’t have the space for a desktop computer. But as time has gone on, many of us have invested in an external monitor or two to better handle all the windows we need to have visible at the same time. We could arrange them all on one screen, but the smaller the laptop, the harder it is to read two windows side by side, let alone scroll through them. But who wants the hassle of traveling with a portable monitor? I sure don’t. So I tested the Asus Zenbook Duo and the Lenovo Yoga Book 9i head to head to see which one alleviates those issues best.

The Zenbook Duo looks like a regular laptop until you remove the keyboard and trackpad that cover the entire bottom screen. I enjoyed the look of awe on my friends’ faces every time I did that. It felt like performing a magic trick. The Yoga Book, meanwhile, looks like two tablets stuck together, and its keyboard hides only half of the bottom screen, so you know right away that it’s different. “What is that? Is that a laptop?!” is the most common response I received from people when they first saw it. These and other small design differences have a big effect on the overall experience of using them.

Design features: the Yoga Book 9i has more form factors

The dual screens give you lots of options. Both the Zenbook Duo and Yoga Book 9i can be used as traditional laptops with either a physical keyboard and trackpad (Zenbook) or a physical keyboard and mouse (Yoga Book); for taking notes on the bottom display in clamshell mode; or with their displays oriented vertically or horizontally. They both also have virtual keyboards and trackpads.

For the dual screens to remain stable while upright, the Zenbook Duo has a kickstand attached to the bottom chassis, while the Yoga Book comes with a keyboard folio cover that transforms into a stand with a flat, triangular back and a thick lip at the bottom to keep the laptop in place.

The Zenbook Duo’s attached kickstand.
The Yoga Book 9i’s keyboard folio when it’s folded into a stand.

Only the Yoga Book has a 360-degree hinge that rotates the displays back to back, so you can use it as a tablet. (The Zenbook Duo’s top display folds back 180 degrees.) It’s thick for a tablet, but when I need to walk around my classroom as I’m teaching, it’s far less unwieldy than the Zenbook. The Yoga Book is also lighter and thinner when folded, at 2.95lbs and 0.63 inches compared to the Zenbook’s 3.62lbs and 0.78 inches.

Neither of these laptops is great for using directly on your lap, though. The Zenbook Duo can get uncomfortably hot if the processor is running as fast as it can, and there’s a vent that blows hot air directly into your lap. The Yoga Book is fine temperature-wise, but the keyboard can easily shift and twist a few centimeters despite attaching magnetically over the bottom display.

Winner: Lenovo Yoga Book 9i

Tech specs: the Zenbook has more power and ports

The Zenbook can be configured with either an Intel Core Ultra 7 or 9 H-series processor, while the Yoga Book has only an Intel Core Ultra 7 U-series option. Both can be configured with 16GB or 32GB of memory, but the Zenbook offers larger storage options, 1TB and 2TB, compared to the Yoga Book’s 512GB and 1TB.

Both laptops have OLED displays, but the Zenbook’s are physically larger at 14 inches, with higher resolution and refresh rate: up to 2880 x 1800 at 120Hz, compared to the Yoga Book’s twin 13.3-inch, 1920 x 1200 at 60Hz. (The Zenbook also has a 1920 x 1200, 60Hz option.) The Zenbook’s displays get brighter, at 500 nits compared to 400.

The Zenbook Duo takes a Swiss Army knife approach with its port options: one USB-A, two Thunderbolt 4 USB-C, one HDMI, and even a 3.5mm combo audio jack. The Yoga Book has only three Thunderbolt 4 USB-C ports, so you’re more reliant on hubs, dongles, and Bluetooth if you connect a lot of accessories to your laptop.

Winner: Asus Zenbook Duo

Zenbook Duo in dual-screen mode.
Yoga Book 9i in dual-screen mode.

Dual-screen gestures: the Yoga Book’s work consistently

Both laptops use tap or swipe gestures to pull up and put away the virtual keyboard and trackpad, flick windows from one screen to the other, extend a window across both screens, and launch more than one app at the same time. The exact finger count and swiping motion for each gesture differ between the laptops — and Lenovo does a better job of teaching you how to use those gestures.

The Yoga Book’s User Center software is one of the first things that pops up the first time you power on the laptop. It’s both a guide and a settings portal, with clear instructions and visuals, and it lets you turn any of the gestures on or off, including the option to automatically launch its bespoke notetaking app when you open Microsoft Teams or Zoom.

The Zenbook Duo (left) is a smidge faster to put away.

Lenovo also recently added an app group launcher that lets you launch two apps at the same time, one on the top screen and one on the bottom. You can customize up to four pairs or let the computer do it for you, but the Zenbook one-ups the Yoga Book here: it can launch more than two apps at the same time and assign them to a specific Windows layout.

But when it comes to teaching you how to use all the gestures and features, the Zenbook tosses you into a pool and says “Swim.” Its instructions are buried within ScreenXpert, Asus’ equivalent to Lenovo’s program. Some of its gestures don’t work consistently, either. The five-finger gesture for expanding a window across both displays, for example, would also zoom in on the page because the laptop thought I was using the two-finger gesture for zooming in and out.

Winner: Lenovo Yoga Book 9i

Stowing and traveling: the Zenbook Duo makes it easier

The 14-inch Zenbook and 13.3-inch Yoga Book are both compact enough to fit inside most bags, but the Zenbook is more convenient to stow because its lid folds right over the physical keyboard and trackpad, like a regular laptop. If the kickstand is popped out, just give it a firm push back in and — bada bing — you’re done.

For the Yoga Book, you have to detach the keyboard from the bottom screen, fold up the laptop and place it to the side, attach the keyboard to the magnetic portion of the folio stand (if not already attached), and then wrap the folio around the keyboard before you can put both in your bag. That doesn’t include finding a pocket inside your bag for the mouse that comes with the laptop.

Lenovo does win a few points for attaching an elastic band to the folio that holds the included stylus. The Zenbook includes a stylus, but you’ll have to figure out where you’re going to put it when you pack up your things to go to Narnia or wherever.

Winner: Asus Zenbook Duo

It’s easier to type on the Yoga Book 9i’s (right) virtual keyboard.

Speakers and more: the Yoga Book brings the bass

Virtual trackpads suck, straight-up, and they suck on both the Zenbook and Yoga Book for the same reason: it’s too easy to accidentally minimize or close the active window when all you’re trying to do is “click” on a link or something else on the screen. This happened frequently when I was using either laptop.

Both virtual keyboards are fine for tapping out quick messages, but I prefer the larger surface area of the Yoga Book’s keys. I had fewer mistypes compared to the Zenbook, but I liked the Zenbook’s physical low-profile keys more. Their subtle, clicky sound and tactile feel were similar to the Yoga Book’s, but my presses traveled further and made me feel like I had more control over how fast I typed. (I’m a heavy-fingered typist.)

The biggest surprise was the terrible quality of the Zenbook’s Harman Kardon-branded speakers. They made my favorite playlist with a lot of bass-heavy songs sound surprisingly tinny. Spoken dialogue comes through loud and clear, but how they handled my favorite styles of music made me want to cry. The Bowers & Wilkins-branded system in the Yoga Book is well-balanced right out of the box.

Winner: Lenovo Yoga Book 9i

The Lenovo Yoga Book 9i is a better package deal

These laptops have different personalities and are appealing for different reasons. The Zenbook Duo is for people who want a traditional laptop with a variety of ports and the option to dual-screen drive if they want to. It’s for the people who want the best specs for the price and don’t want to stand out while using it.

But the Yoga Book 9i is the winner for me. It’s for people who want to get away from the traditional laptop form factor without losing all of its essential comforts and dual-screen drive all day long. It’s thinner, lighter, and prettier, and — while it takes a little longer to pack up — the keyboard folio doubling as a laptop stand is a clever design. The touchscreen gestures work consistently, and it’s clear Lenovo put a lot of thought into making sure it provided clear and easily accessible instructions on how to use them.

Photos by Joanna Nelius / The Verge

Fiido Air review: so lightweight you’ll forget it’s an e-bike

‘The world’s lightest city e-bike’ is worth a test, if not your $1,799.

Yes, that’s an electric bike, though you wouldn’t know it from looks alone, or from hoisting it up some stairs, since it weighs about as much as a regular city bike at just 30 pounds (about 14kg).

What you’re looking at is the Fiido Air, a carbon fiber e-bike from the Chinese company whose $999 direct-to-consumer electric bike I once tested on a whim, just to see what it was like. Not great, it turned out, and its follow-up had a habit of breaking in two.

But hey, I’m a forgiving type, and the company did make amends to those affected. And Fiido says the Air is “the world’s lightest city e-bike” with a “super early bird” price tag of just $1,799 at launch (or €1,799 in Europe) — rising to $1,999 and then $2,799 later, ahead of August shipments. That’s just too tempting not to test, especially when it costs half as much as the comparable Gogoro Eeyo.

And after spending more than a month with a Fiido Air as my daily rider, I gotta say — I’m impressed... so long as you ignore the app and the silly smartwatch it ships with, and aren’t afraid of doing a little wrenching and troubleshooting yourself.

The first thing you’ll notice about the Fiido Air is the battery — or rather the lack of any visible trace of one, because it’s integrated into the slender frame. Normally that’s a problem, but unlike VanMoofs and some Amplers, this bike is still something many people can haul into an elevator or up a flight of stairs in a pinch, thanks to the liberal use of rigid, lightweight carbon fiber in the frame, front fork, handlebar, and seat post.

In fact, you wouldn’t know it’s an e-bike at all if it weren’t for the giant ON / OFF graphic that Fiido inexplicably chose to emblazon across the frame, as if its owner needs to be forever reminded of where that button is. The otherwise clean design is helped by internally routed cables.

The 250W Mivice rear-hub motor is paired with a Mivice torque sensor for an intuitive assist.

My bike arrived partially assembled in its shipping box. A spacer for the front axle assembly was jammed into the packing materials, however, causing me to overlook it when I assembled the front wheel and handlebars. I could tell something was wrong and eventually sorted it out with the help of Fiido support, but less experienced bicycle owners might have just lived with the slightly noisy, slightly wobbly, and potentially dangerous assembly.

My European Fiido Air is fitted with a 250W Mivice rear-hub motor and Mivice torque sensor (as you’d expect in this price range) to make the pedal-assisted power feel more natural. It also features plenty of off-the-shelf parts that should help make it easy to service at any local bike shop. That’s not always the case with Fiido’s cheaper e-bikes that use parts not widely available outside of China (I once had a terrible time finding brake pads). The Fiido Air uses Shimano BR-MT410 hydraulic brakes, a Velo saddle, and a Gates Carbon Drive CDX belt drive, with the latter rarely needing servicing unless your bike is shipped with a loose belt, like mine was.

Tightening the belt isn’t difficult, but it’s also not intuitive. Nevertheless, it’s never nice to spend $2,000 and find that your transmission slips with a loud clunk when stepping hard on the crank to quickly cross the street against oncoming traffic. I also recently had to lubricate the bottom bracket (where the crankset attaches to the bicycle) after the pedals began making a horrible creaking sound on each downward stroke. Both of these fixes were relatively simple to do, but neither is something that’s usually required after just a few weeks of riding.

The Air is equipped with a fingerprint sensor that’s surrounded by a colorful light ring. To prevent people from riding off with the e-bike after hitting the well-labeled ON/OFF button, the motor can be configured to unlock with the fingerprint sensor. This worked surprisingly well 99 percent of the time. It worked fine in light rain, so long as I was able to dry it off and shield it, but I once tried to unlock it in a heavy downpour, and no amount of wiping allowed the sensor to recognize my finger. That meant opening the app to unlock the motor.

The app is... terrible, and should be avoided at all costs. Fortunately, it can be abandoned for day-to-day use, but not until you suffer through it for initial setup, and then occasionally to check the battery level — which seems to be off by as much as 20 percent — since there’s no indication of it on the bike itself. It’s a shame Fiido didn’t repurpose the colored ring around the fingerprint sensor as some kind of battery indicator.

After the fingerprint sensor unlocks the bike, more taps will steadily increase the power assist with corresponding rings of color — yellow, blue, a slightly brighter blue, and green — to show the current selection. Unfortunately, the bike doesn’t remember your preferred setting when turning it on and off. A quick double tap on the sensor turns the integrated running lights on and off.

Fiido ships the e-bike with a cheap, plasticky Fiido Mate smartwatch, which is just laughably bad. It can be used to unlock the motor or as a dashboard on your wrist — but it can’t be easily attached to the frame. After testing it once, I never used it again. I already wear an Apple Watch, but there’s no app for that.

The Fiido Air puts the rider in a very aggressive, sporty position, which creates an awkward hand placement that’s less than ideal for long commutes or casual city riding. But it is fun! The pedal assist is delivered smoothly, intuitively, and very quietly, but the motor’s modest 40Nm of torque makes this single-speed e-bike best suited for mostly flat commutes. Out of the box, the Fiido Air has a 15.5mph (25km/h) top speed that shoots up to 18.6mph (30km/h) with a simple (and often illegal) software setting.

Fiido says the Air can go up to 80km (about 50 miles) on a single charge, which is wildly optimistic for its 209Wh non-removable (but serviceable) battery, though it may be doable in the lowest power setting (I always tested in max). In my testing, pedal assist was already noticeably degraded after around 40km (25 miles) of riding. Fiido also sells an optional bolt-on range extender that you can take inside to charge from Fiido’s relatively small charging brick.

For what’s supposed to be an e-bike for cities, it ships without a kickstand, bell, or any mudguards, which means a rooster tail of spatter up your back if you get caught in the rain. It does have attachment points for front and rear fenders, though, if you decide to go that route. It also comes with Kenda 700 x 40C tires that look better suited for gravel than city streets.

Overall, I’ve really enjoyed using the Fiido Air as my primary city ride for the last six weeks and change. For $1,799, it’s a good deal for anyone looking for a nicely designed and lightweight e-bike. For $1,999, it’s still worth a hard look, but at $2,799, I’d consider other options first.

All photography by Thomas Ricker / The Verge

Battle of the best robovacs (that iRobot doesn’t make)

We put the Roborock S8 MaxV Ultra head-to-head against the DreameBot X30 Ultra to find out which of these Roomba competitors’ flagship robot vacuums is the best.

There are an absurd number of robot vacuums available today, but based on my testing of dozens of bots, just a handful of manufacturers are leading the pack when it comes to innovation, choice, and really good cleaning machines. These include Roborock, iRobot, and Dreame. Each has recently released new flagship models: the Roborock S8 MaxV Ultra, the DreameBot X30 Ultra, and iRobot’s Roomba Combo j9 Plus.

I’ve reviewed the Combo J9 Plus, and I still recommend Roombas if you’re looking for either a high-end robovac or a budget bot, in large part due to their repairability, ease of use, and reliability. But the competition is getting very good, and with iRobot’s future looking shaky following its breakup with Amazon, I figured it was time for a deeper dive into its strongest competitors. Here, I pit the X30 Ultra against the S8 MaxV Ultra to see which one is the best.

The Roborock S8 MaxV Ultra ($1,799.99) is a robot vacuum and mop with a charging dock that fills the robot’s onboard water tank, cleans and dries its mop pads, and empties its onboard dustbin. It features a whopping 10,000Pa of suction and a camera for obstacle detection and avoidance. Its mop vibrates up to 4,000 times a minute to scrub your floors and raises up to 20mm to avoid carpet.

The S8 MaxV has a new flexi arm that pushes its spinning side brush out further to get into corners better and a side mop that helps clean along edges. A new on-device voice assistant can take direct commands, so you don’t need to use the app or a third-party speaker to control the robot (although it works with Alexa, Google Home, and Siri Shortcuts). It’s also one of the first robot vacuums that will support Matter, although that feature hasn’t been turned on yet.

The DreameBot X30 Ultra ($1,699.99) has many of the same features as the S8 MaxV Ultra, including a charging dock that auto-empties, washes the mops, and fills the robot’s water tank, plus a camera for obstacle detection. It has 8,300Pa suction and uses dual spinning mop pads that it can automatically remove when it vacuums — my favorite feature. It can also lift the mops if needed (up to 10.5mm).

Uniquely, the Dreame can extend its mops out to reach baseboards and even under low furniture, as far as 4cm; this is surprisingly effective at getting up grime from edges.

I let these two bots battle it out in my home over 10 days, testing their cleaning prowess, mopping chops, navigation skills, and unique features — such as an arm and mops that do the splits. I also evaluated the design and usability of their multifunction charging docks and how well they meet their promise of hands-free cleaning. I put their companion apps through their paces, diving into all the settings and features these machines offer in their quest to clean your floors. Read on to find out which one came out on top.

Dock design and function: bigger is beautiful unless you can plumb it

Despite being bigger, the Dreame’s dock (left) looks better.

While Roborock has redesigned its dock into something smaller and more aesthetically pleasing (it was the first to release a multifunction dock, and those early days were characterized by hulking monstrosities), it’s still one of the ugliest out there. Dreame, on the other hand, has perfected the stylish dock look, and while it’s bigger than Roborock’s, it’s much prettier.

Dreame’s dock is also slightly more functional. While both models wash the mops with hot water and dry them with heated air, which helps deal with the smell and mess, Dreame has little wipers that clean the mop area for you, whereas Roborock’s mop tray needs manual cleaning. However, Roborock offers the option to connect directly to your plumbing, doing away with the bulky water-tank look entirely. You do need to buy a specific model for this, which costs $100 more. Dreame sells an add-on kit for this function, but it’s only available in Asia. A North American model — the X40 — is coming later this month, but it costs $1,900.

Winner: Tie

Navigation and obstacle avoidance: they both dodged the poop

The Roborock stares down the fake poop and goes on its way.

Both models use lidar to map and navigate your home. They both mapped the house quickly and accurately and responded correctly to requests for room-specific cleaning and zone cleaning — meaning they didn’t get lost. These robots both have front-facing cameras for AI-powered obstacle avoidance, and they both nimbly avoided fake dog turds, socks, shoes, and bundles of cables.

However, each had weak spots. The Dreame successfully sucked up a pile of Cheerios, which the Roborock thought was an obstacle, but the Dreame got stuck on a stray iPhone cable that the Roborock dodged. Roborock also loves to eat pencils. In the end, though, they were both rarely derailed compared to non-camera-powered robots I’ve tested, and that’s the biggest benefit of AI-powered obstacle avoidance unless you regularly let your pet poop in your house.

This is the first Roborock since the excellent S7 MaxV Ultra to feature a camera for object detection (all the other models use 3D obstacle detection, which is not as effective). But Roombas with the same feature are still the best at knowing what’s in their way and successfully avoiding it or cleaning it when necessary. Also worth noting: if you have a bed skirt or fabric around your sofa, lidar-powered robots will see it as a wall, whereas a VSLAM-powered model, like the Roombas, will push right through and clean under your bed.

Winner: Tie

The S8 MaxV Ultra’s robot arm reaches out to get debris out of corners.

Vacuuming power: Roborock sucks hardest and has an arm …

Both bots have super suction power and did an excellent job getting up every last bit of larger debris, such as rice and oatmeal, on hard floors. But Roborock’s dual-brush system did a better job on carpet, and its rubber roller design means less hair tangling. Dreame sent me its new $50 anti-tangle tri-cut brush (sold separately) that cuts the hair, and I didn’t have to deal with any tangles, which was nice. But the Roborock was tangle-free without an extra accessory, and its dual brushes did a better job of getting dirt and hair up off the carpet.

Roborock’s flexi arm is also a great upgrade. It’s designed to help the bot clean corners better by reaching its spinning brush out to swipe up the dirt. I saw this in action at CES, but it happens in the blink of an eye, and despite spending a lot of time hovering over the bot, I never actually saw it work in my home. But the debris I put in the corners to test it was gone, so I guess it worked?!

The Dreame (left) has a single roller brush, whereas the Roborock has two rubber brushes that are better at getting dirt off carpet and sucking up messes in one pass.

Auto-cleaning modes are a new feature I’m starting to see on high-end robots. They eliminate the bother of having to set specific cleaning modes for different rooms — such as cleaning the kitchen and entryway twice but the dining room once. Both Roborock and Dreame have versions of this AI-powered cleaning mode. Dreame calls it CleanGenius, and Roborock’s is SmartPlan. I found them both very useful for just hitting go and not having to plan the route but still ending up with spotless floors.

These modes also turn on a feature that sends the robot back to clean areas it determines need more attention. This was hard to test effectively in the time I’ve had with them, but it’s an interesting feature I’ll be keeping an eye on. Anything that involves me spending less time in an app and the robot doing more on its own is a good thing.

I really liked Roborock’s “Recommended Routines,” personalized cleaning sequences that again mean less programming by you. There’s an After Meals one that tackles the kitchen and dining room and a Pet Supply option for cleaning around pet food areas (the robot can identify pets, pet beds, and pet bowls), along with a few other useful options.

Winner: Roborock

Mopping prowess: Dreame’s mop moving and mop removal is genius

The Dreame can push its mops out to scrub baseboards and also swing the robot’s body to extend the mops further to get under things like my dishwasher here.

Dreame’s auto-detachable mop pads are still the best solution I’ve seen to the “how does a robot mop and vacuum without messing up your carpet” conundrum. When it’s time to clean carpet, it goes back to its dock, takes off its mop pads, then goes and vacuums. Genius. It can also raise its mops about 10mm if needed to save time, so it can still traverse carpet to mop rooms farther away. Roborock’s mop isn’t detachable, although you can manually remove the pad itself. It does lift a lot higher, up to 20mm, but there’s still a chance of contaminating high-pile carpets unless you tell it to avoid carpets.

Dreame’s dual oscillating mop pads also do a better job of getting wet messes off the floor than Roborock’s single flat pad. While Roborock’s mop vibrates up to 4,000 times a minute, Dreame successfully removed all the dried ketchup and OJ in my tests, whereas Roborock left a trace behind.

The other thing Dreame does very well is clean baseboards and edges. It uses a “MopExtend RoboSwing” technology that extends its mop out to reach the baseboard and also swings the robot toward the edge to push the mops under things like my fridge and dishwasher, getting the grime that other cleaning methods miss. Roborock’s Extra Edge Mop system, new on the S8 MaxV Ultra, does give the bot a bit more mopping reach — a small spinning mop pad extends slightly out from the right of the robot, but it’s not a patch on the Dreame.

Winner: Dreame

Apps, video cameras, voice control, and Matter, oh my!

These high-end robovacs have a dizzying number of features accessed through their apps, which is where you set up the map (naming rooms and adding furniture to help the robot understand your home better). This was easy to do on both, and the two apps are very similar.

However, Roborock’s app is more refined, more stable, and slightly more user-friendly. Both have so, so many settings menus to dive into to customize everything from how often the bot washes its mop and when it empties its bin to which direction it cleans your hardwood floors (yes — you can select “along the grain”). But Roborock makes it easier to get to what you need. It also never crashed on me, whereas Dreame’s often showed the robot offline or made me wait a while before I could access it.

One neat feature is that both can act as roving home security cameras. Roborock even claims it can go look for your pet — although it failed to find my 80lb pup when he was sitting right in front of it. To be fair, it was dark, and he looks like a rug. You can also drop in on the robot’s camera and see and talk to people in your home — yes, that’s as weird as it sounds, but there could be a use case. The camera feature is not enabled by default on either Dreame or Roborock and requires a set of actions and a code to access it remotely.

The Roborock on patrol for a pet. It didn’t spot my dog here, but it can be set to snap pictures of your pet whenever it sees them.

Only Roborock has built-in voice control, a new feature with this model. The wake word is “Hello Rocky,” and it worked very well, responding promptly and understanding my commands. You do have to wait a beat after activating it before saying the command, which takes some getting used to. Dreame can respond to voice commands from Alexa, Google Home, and Siri Shortcuts (as can Roborock), but Roborock’s purpose-built assistant makes for a much better experience.

Hello Rocky gave me much more control than any of the third-party integrations. I could ask it to empty the bin, skip here, stop drying, and more, along with all the standard commands like clean the kitchen and go back to the dock.

Finally, Roborock supports Matter, which gives it an edge. While none of the major smart home platforms support robot vacuums in Matter yet, most have said they will soon. The fact that Roborock’s S8 MaxV Ultra is already Matter-certified means you’re ready for that future if and when it arrives. Dreame has said it will support Matter in its newest vacuums but has not made any announcements about the X30.

Winner: Roborock

Which bot’s the best?

The Dreame X30 Ultra (left) and the Roborock S8 MaxV Ultra are both impressive robot vacuum mops.

Both robots perform exceptionally well at mopping and vacuuming, and their all-singing-all-dancing docks make floor maintenance virtually hands-free. But the Roborock beats the DreameBot overall thanks to its superior vacuuming performance, easier-to-use app, and built-in voice control. Its dual roller brushes, side brush, and 10,000Pa suction demolished all the dry dirt in my tests. And while the Dreame is better at mopping, the Roborock is still very good.

If mopping is what you really want, the DreameBot’s oscillating mops do a better job with wet spills and dried-on gunk, like ketchup. The mop removal feature meant I didn’t have to worry about my white, high-pile carpet at all. If you have a lot of carpet or high-pile rugs scattered around your home or prefer the nicer-looking dock, Dreame may be a better choice, but otherwise, the Roborock will suit you very well.

If you are sold on these bots but can’t stomach the price, both brands have cheaper models that do almost as much. The Roborock S8 Pro Ultra costs $1,600 and has lower suction power, no camera (so no AI-powered obstacle detection), and no voice assistant or Matter support. Dreame’s previous flagship, the L20 Ultra, is currently $1,500 and only slightly worse in a few areas. It has lower suction power but can still remove its mops and extend them (though not as far as the X30). However, its auto-emptying wasn’t as reliable.

I should note that Dreame has just announced the X40 Ultra, which will be available for an eye-watering $1,900 and will have a model with a direct water hookup. The X40 also adds a flexi arm — just like Roborock’s — and 12,000Pa of suction. But it still only has one roller brush, and the brushes are key to cleaning. Also, yes, I do think these robots are breeding.

Is Crossrope’s smart jump rope worth $200?

Photo by Sheena Vasani / The Verge

Skip Crossrope unless you really love skipping rope.

Like everybody else, my New Year’s resolution was to work out more. After moving to a new city, I fell out of my workout routine, and it didn’t help that the gym chain I belonged to was now a 30-minute drive in Los Angeles traffic.

So I started researching workouts I could do from home. Jumping rope is fun and a great full-body cardio workout that can also improve agility and coordination. So when I heard the $199 Crossrope AMP Jump Rope Set would quantify the experience and help me incorporate strength training into my routine with its weighted ropes, I was intrigued.

After testing the set for a month, I can confirm few jump ropes are as well-made as Crossrope’s, and its workouts and community offer a lot of value for jumping enthusiasts. Yet, at $199, plus a $12 monthly subscription, it’s only for those committed to jumping consistently — not casual users.

The Crossrope AMP Jump Rope Set box surrounded by its three green, gray, and white weighted jump ropes, with the AMP handles attached to the green one. Photo by Sheena Vasani / The Verge
The Crossrope AMP Jump Rope set comes with a set of Bluetooth-connected handles and three different weighted ropes.

The Crossrope system, which has been around since 2013, consists of interchangeable handles, ropes, and ropeless jumping attachments in a variety of weights from three ounces up to five pounds. The AMP set that I tested comes with a set of Bluetooth-connected handles plus quarter-pound, half-pound, and one-pound ropes.

The ropes and handles are built from strong materials and connect with steel clasps. They feel made to last, but unlike most jump ropes, each rope is a fixed length — you can’t adjust them. They come in six different lengths, but I tripped a few times despite using the size Crossrope recommended for my height. While I began to trip less as I improved as a jumper, when I asked the Crossrope community for help, several members acknowledged they had had the same issue.

A hand holding a set of black jump-rope handles with green squiggly lines, steel interconnects, and a green rope connecting them. Photo by Sheena Vasani / The Verge
Crossrope’s handles feature steel clasps that make swapping out ropes really easy.

The AMP handles are what turn this from an expensive modular jump rope system to an expensive modular smart jump rope system. The Bluetooth-enabled handles connect to iOS and Android devices, allowing you to track jumps, streaks, power output, speed, and calories burned from the companion app. If you connect it with your Apple Watch, you can also import your heart rate data. It’s difficult to judge how accurate these stats were, but Crossrope correctly counted my jumps for the most part, and the other numbers didn’t seem like a stretch.

But that information comes at a price: $11.99 per month. That’s right: along with forking out $199 for the set (or $99 for the handles if you already have Crossrope ropes), you also have to pay a monthly fee to get any value from the smart handles. Even the jump counter is paywalled. That fact was — and still is — jarring to me and is the biggest downside to the set.

A screenshot of a Crossrope’s app listing for a workout to strengthen your core, with a 3D avatar of a personal trainer performing crunches.
Crossrope’s workouts incorporate other exercises besides jumping, like crunches for those wanting to strengthen their core.
A screenshot of Crossrope’s curated Spotify playlists.
Crossrope curates Spotify playlists by beats per minute, which was helpful for when I needed extra motivation.

That said, you’re not paying just for metrics. Along with a helpful Facebook community of nearly 100,000 people, Crossrope includes an app with over 2,500 workouts created by its personal trainers and on-demand classes taught by instructors popular in the jumping world. Jumping rope is obviously the focus, but the custom workouts also include other exercises like squats and dumbbell lifting. There are also longer programs focused on specific fitness goals, from burning fat in, say, six weeks to improving endurance. If you don’t like any of the options, you can also create your own workout, which was helpful when I required a slower pace.

I appreciated how well thought out the workouts are, with a timer included for each set and rest sessions. Crossrope’s own programs even feature Spotify playlists curated by beats per minute geared for different rope weights and speeds. Unlike, say, Apple Fitness Plus or Fitbit Premium workouts, Crossrope also displays a (weird) 3D avatar of the trainer performing the same exercise in real time, which helps with form. And unlike Apple’s and Fitbit’s programs, you can even message Crossrope’s trainers with questions for a more personalized experience.

A screenshot of a 3D version of Crossrope’s personal trainer jumping rope in real time during a workout.
Watching a 3D version of Crossrope’s personal trainer exercise in real time with me was simultaneously helpful and bizarre.

But we have to address the elephant in the room: the Crossrope AMP costs $200, plus $12 a month. It exists in a niche market with little direct competition, but it also exists in a world with a lot of cheaper jump ropes. To pull an example almost at random, the Te-Rich Smart Weighted Jump Rope I found on Amazon costs $17 and has a built-in LCD display with a timer and jump counter, while the YaoYao app also tracks jumps and time and costs only $0.99 per month (or $10 for a one-time unlock). Both also estimate calories burned, and YaoYao lets you set the length of workouts and rest sessions and compete with others via a leaderboard.

A hand holding the Te-Rich Smart Weighted Jump Rope’s pink handles, with one handle featuring a built-in LCD display with a timer and jump counter. Photo by Sheena Vasani / The Verge
The Te-Rich Smart Weighted Jump Rope features a built-in LCD display with a timer and jump counter. It also comes in fun colors, like pink.

While YaoYao often overestimated my jumps, the Te-Rich Smart Weighted Jump Rope’s stats were consistent with Crossrope’s, and it sometimes counted my jumps more accurately. The flimsy 9.8-foot PVC rope tangles easily, but that’s forgivable at this price, especially as the rope is adjustable. The Te-Rich lacks custom workouts, on-demand video classes, and a community, but you can find similar resources online. In fact, some on-demand class instructors have their own YouTube channels. Plus, you can always use the free or paid versions of Crossrope’s app without the AMP handles if you want the workouts and don’t mind losing the jump counter, personalized targets, benchmarks, and leaderboards.

A wrist wearing the Apple Watch Series 8 with the YaoYao app open, displaying heart rate, timer, speed, and (incorrectly) the number of jumps. Photo by Sheena Vasani / The Verge
YaoYao thought I jumped 22 times when the real number was closer to 14.

The most effective workout is the one you’re going to stick with. If a smart jump rope with guided workouts and an encouraging community makes it easier for you to exercise consistently, Crossrope is worth it. It’s overpriced, but it’s also smaller and cheaper than other home gym equipment I considered, like treadmills. Crossrope’s 60-day return policy also means you can get your money back if you decide you’re not going to use it enough to justify the expense.

I enjoyed my time with the Crossrope. It helped put some of the fun back into fitness for me. But I don’t think jumping will replace jogging and walking as my primary cardio workout — though it’s a fun accessory — so I won’t be buying the Crossrope AMP once I send the review unit back. The Te-Rich didn’t come with a bunch of workout programs or a Facebook group or track my heart rate, but it still gave me a rough idea of jumps and calories burned and didn’t cost $200.

The OnePlus Watch 2 is what redemption looks like

OnePlus could be a strong alternative to Google or Samsung for Wear OS smartwatches.

At the end of February, a large package arrived on my doorstep. Inside were 11 boxes containing the same version of the $299.99 OnePlus Watch 2. My eyes watered and I whispered, “Not again.”

This was a shipping accident. My box had 10 more watches than I needed for a review. It happens and normally doesn’t hold any larger meaning. But I was nervous because the original OnePlus Watch was by far the worst smartwatch I’ve ever had the misfortune of reviewing. Everything that could go wrong did. The fitness and health tracking deserved the Pulitzer Prize for fiction. Troubleshooting the buggy software was a nightmare. That whole abysmal experience was seared into my memory. So when OnePlus reached out to say it was making a second watch — and that this one was markedly better — I was hopeful. And then a box of 11 smartwatches arrived on my doorstep.

Still, it wouldn’t be fair to let past mistakes color my opinions of a new watch. I took my time to get to know the OnePlus Watch 2 on its own merits. So when I say this watch is not only competent but also pretty good, I really mean it.

It works!

The bar for the OnePlus Watch 2 was low. All it had to do was be better than its craptacular predecessor. It’s been a while, so I reread my original review to refresh my memory. To beat the original watch, this one only had to:

  • Record reasonably accurate activity and health data. Last time, it recorded 15,314 extra steps compared to a control smartwatch.
  • Accurately sync data between the phone and watch. If I took a mile-long walk, it needed to log one mile on my watch and phone. The data for steps, heart rate, distance, etc., had to also match. The last OnePlus Watch continuously gaslit me by showing radically different metrics on the phone versus the wrist.
  • View historical sleep data in the app and not just on the wrist. The last one couldn’t do this.
  • Deliver push notifications in a timely manner. Not 40 notifications four hours later, all at once.
Person looking at OnePlus Watch 2 on their wrist
The OnePlus Watch 2 has a novel dual chip and dual OS structure.

I’m genuinely chuffed to say wearing the watch these past few weeks was a healing experience. OnePlus did all of that and then some. Whereas the original watch was basically a fitness tracker, this is a genuine smartwatch with a dual-processor architecture, including the latest Wear OS chip and a novel dual OS to prolong battery life.

By upgrading from a proprietary OS to Wear OS 4, the watch delivers a much richer overall experience. I can now access third-party apps from the Play Store. There are multiple music apps to choose from, including Spotify and YouTube Music! There’s contactless payments! I can turn off my smart lights with Google Assistant. That’s huge considering almost no third-party Android smartwatches have launched with Google Assistant since the switch to Wear OS 3. This is the stuff you’d expect from a proper flagship.

Build quality is also better. The original watch was a pretty screen with chintzy, plasticky materials. This has stainless steel and sapphire crystal. The silicone strap is much thicker. The 1.43-inch OLED display is pleasing to look at. Scrolling through screens is smooth, and colors are crisp. I wish screen brightness went above 600 nits — it can look washed out in direct sunlight — but that’s a quibble.

I don’t have any complaints with health, activity, and sleep tracking, either. I wore the OnePlus Watch 2 alongside the Oura Ring, the Garmin Forerunner 165 Music, the Apple Watch Ultra 2, and a few other smart rings. I saw some normal minor discrepancies but nothing to write home about. The OnePlus Watch 2 also adds dual-frequency GPS. It’s a common addition to more premium or rugged smartwatches these days but mostly translates to slightly more accurate GPS data in challenging environments. My results in testing were quite similar to the Ultra 2 and my phone, which both have dual-frequency GPS. Good stuff if you’re the outdoorsy type.

Side view of the OnePlus Watch 2
There’s a new shortcut button that launches workouts, along with a digital crown.

But while these are massive improvements, the watch isn’t perfect. I liked the addition of a new shortcut button, but the “digital crown” isn’t really a crown. It looks like one and spins like one, but it doesn’t actually scroll anything; it’s functionally a button. The watch, like many newer Wear OS watches, supports Android only. There’s no cellular option, either, which stinks if you want to leave your phone at home for a run. Likewise, the OnePlus Watch 2 doesn’t have fall detection, EKGs, native period tracking (you can download a third-party app), or body temperature tracking. Most of these omissions aren’t the end of the world if all you want is basic activity tracking. It just means this isn’t a watch where you can comfortably leave your phone at home.

Multiday battery life

The OnePlus Watch 2 has some fancy stuff going on under the hood that translates to excellent battery life. The gist is you’ve got two processors — the Qualcomm Snapdragon W5 and the BES2700 MCU. The W5 handles the power-guzzling tasks and runs Wear OS 4. The BES2700 runs background tasks using a proprietary OS. Stuff gets handed off between the two, and the result is long battery life. It helps that there’s a 500mAh battery, too.

Look at OnePlus’ mobile app with sleep chart pulled up
I wasn’t able to view historical sleep data in the app with the original OnePlus Watch. Not a problem here.
Dashboard view of OnePlus’ OHealth app
I was so pleased to see that my data actually synced properly. The bar was low.

How long depends on your usage. If you turn the always-on display off, keep notifications to a minimum, and exercise about 30 minutes with GPS on, you can get several days. The most I got was about four days before power saving mode kicked in, and then about one more day in that mode. That’s very good, and it’s longer than what you’ll get on an Apple, Samsung, or Google smartwatch. With the always-on display turned on, I got closer to 1.5 to two days. That’s standard, though not too shabby.

In power saving mode, the watch loses Wear OS, but you can still receive notifications and track health and activities. OnePlus says you can get around 12 days this way. I never used the watch in power saving mode only; that sort of defeats the purpose of having a flagship smartwatch, but it’s nice to have if you forget your charger at home.

To get excellent battery life, OnePlus made one big design tradeoff. This watch only comes in a single 47mm size.

Close-up of OnePlus Watch 2 on wrist
The 47mm watch does look quite beefy on my wrist.

That 47mm watch case is why you can stuff in a 500mAh battery. But it bumps the weight to 80g with the strap. I’ve got petite wrists. I felt gravity’s pull on the watch whenever I ran. I really felt it when I wore my leather jacket. The watch was so chunky, I almost didn’t have enough space to pull my wrist through the cuff.

If you’ve got bigger wrists, this won’t be an issue. But there’s no other option for everyone else. That’s a bummer. Most flagship smartwatch makers offer at least a small and big size. They’re making the same tradeoffs — the smaller ones are usually more comfortable, but the bigger ones have better battery life. Consumers understand that, and most are happy to choose the tradeoff that best suits them. Here, that choice has been made for you.

Filling a void

OnePlus has a real opportunity here to take over as the “default alternative.” It’s the first Wear OS 4 watch that isn’t made by Samsung or Google, and it trounces them both in battery life. Unlike its predecessor, it nails the basics. At $300, it’s competitively priced. While there’s room for improvement, it’s well suited for folks who want something stylish without too many bells and whistles.

I actually held onto the 10 extra OnePlus Watch 2 smartwatches during testing. Part of me was afraid that the one I opened would be riddled with bugs. This way, I wouldn’t have to request alternate units. I’d be able to definitively tell if one unit was flawed — or if the watch was yet another unmitigated disaster. But I never needed to open a second watch. Now that my review is done, I can see how silly I was being. This watch is the exact opposite of its predecessor. For once, that’s a good thing.

Correction, April 17th, 2024, 5:08PM ET: A previous version of this review noted the side button wasn’t customizable. It is. We regret the error.

The internet really is a series of tubes

An image of a cable repair ship, on top of The Vergecast logo.
Photo by Go Takayama for The Verge

Hundreds of cables. Hundreds of thousands of miles. The internet runs, in vastly more ways than we realize or think about, through a series of garden-hose-size tubes on the ocean floor. Without those tubes, the modern world sort of collapses. And the folks responsible for keeping them functioning have bigger, harder, stranger jobs than you might think.

On this episode of The Vergecast, we talk to The Verge’s Josh Dzieza, who has been reporting on the undersea cable world and just published a feature about some of the folks who keep it running. It’s a story worthy of a high-seas action movie, and it’s all about cables.

Then, we chat with The Verge’s Tom Warren and Joanna Nelius about the new generation of PCs that Microsoft and others seem to think are going to be huge improvements over anything we’ve seen before. Can Qualcomm finally make the PC chip we’ve been waiting for? Is this really, actually, finally the year of Windows on Arm? What the heck is an AI PC? We cover all of that and more.

Lastly, Alex Cranz joins to help us answer a hotline question about e-readers. Because it’s always interesting times in e-readers.

If you want to know more about everything we discuss in this episode, here are a few links to get you started, beginning with Josh’s story on undersea cables:

And on AI PCs:

And on e-readers:

Anker’s latest Soundcore Sleep earbuds actually improve slumber

The Soundcore Sleep A20 are decent passive earbuds that are great for side sleepers, even if Anker overpromises.

“Sleep when you’re dead” was the rallying cry of my youth. But now, in the soft haze of dull middle age, I feel like I’ll die without enough sleep. That’s why I took interest in the new Sleep A20 earbuds from Anker’s Soundcore brand, which promise “pressure-less comfort for side sleepers.”

I, like many, fall asleep listening to podcasts. It’s either that or let a three-pound hunk of fat and neurons lodged in my skull harass me about the future. But my Apple AirPods Pro, like most true wireless earbuds, are too big for comfortable side sleeping, so I only wear one and swap them throughout the night as I toss and turn in fits related to some undiagnosed sleeping disorder.

And since they’re designed as sleep aids, the A20 buds offer lots of sleep-focused features like “unmatched noise blocking” and noise masking to “silence common disturbances such as snoring,” according to Anker.

But not really.

It’s important to understand that Anker doesn’t offer any active noise cancellation to silence snoring or chatty neighbors. The Sleep A20 buds block all external sounds passively by fitting snugly inside the ear, just like regular ol’ earplugs. That’s partly why the company can charge just $89.99 at launch and still claim up to 14 hours of continuous white noise to mask sounds or 10 hours of audio listening before needing a recharge.

The app lets you switch between two listening modes: Bluetooth audio and sleep sounds. The former is for listening to podcasts, music, or anything else you’d like to stream, while the latter gives you access to dozens of very lifelike sleep sounds grouped by water, nature, life (trains, airplanes, and such), and meditation — I particularly like Rain on Tent. You can also double-tap a bud to switch between listening modes and configure the buds to keep playing audio all night or to stop once you fall asleep, either manually (via a timer) or automatically via sleep detection, which I found to be too unreliable.

The A20 buds also include a variety of masking sounds. You can play with a multitude of sliders to mix white noise with seven other colors and two types of snore-masking tracks. It didn’t really work when I attempted to mask a variety of snoring sound effects playing on a nearby speaker. While it did diminish the snoring by layering on less annoying sounds, it certainly didn’t live up to the claim of silencing common disturbances. Nor, I came to find, did it silence barking dogs or drunken frat boys passing below my bedroom window.

The Sleep A20 buds in and out of their case. They come with multiple ear tips and wings to dial in your correct size.

In my side-by-side testing, the AirPods Pro with noise cancellation enabled and playing music did a noticeably better job of neutralizing those disturbances than the Sleep A20 buds also playing music. But I can’t sleep on my side wearing Apple’s AirPods Pro buds (they also cost more than double the A20s during Anker’s discounted launch period).

Nevertheless, I have to say that for my needs these buds are a game-changer. Although I suffered a bit of mild discomfort the first week of wearing them, sleeping with the A20 buds on my side now feels normal — as does inserting them with a push and a twist and then digging them back out each morning (they’re snug!). I do have to micro-adjust the pillow-to-ear angle occasionally for optimal comfort, and the bud facing the pillow will often just mute itself due to the pressure, which means listening to audio from just one ear. But the end result is that I’m sleeping longer and waking up less frequently. And, anecdotally, I feel better rested.

According to sleep data measured by my Apple Watch Ultra, I’m now averaging 7 hours and 14 minutes of sleep time for the two weeks I’ve been testing the A20 buds, up from 6 hours and 50 minutes for the two weeks prior (wearing AirPods Pro) with slightly improved deep sleep. Other sleep tracking data is about the same.

Screengrabs from the Soundcore app showing (left) available noise masking sounds and (right) data collected by Anker’s sleep algorithm showing me rolling over 45 times... my poor wife.

Anker also offers sleep tracking data in the Soundcore app, including novelties like Position (left or right side) and Roll Over (times I’ve switched sides). Unfortunately, the data is only available to view when my iPhone is paired with the buds in my ears. It says I’m predominantly a left-side sleeper facing away from my partner, which makes sense. But several nights measured between 40 and 50 rollovers, or up to six times an hour, which presumably means I need an exorcism.

I found the battery to be excellent when listening to a few hours of podcasts each night, waking up with between 50 and 75 percent charge remaining. (The built-in Soundcore alarms are startlingly loud and not recommended.) They did much better than my three-year-old AirPods Pro, which can’t make it through a single night.

Dropping the buds into the charging case takes some practice initially due to the buds’ amorphous shape, but it can be mastered after a few uses. The case can keep the battery charged for up to 80 hours, according to Anker, if you only listen to its collection of soothing sounds in sleep mode downloaded to the buds themselves. That comes with a side benefit of no Bluetooth audio alerts to interrupt your slumber.

Otherwise, the buds include a Find Device feature, which sounds like and is about as loud as the alarm on a vintage Timex watch (read: not very). You can also configure double and triple taps on each earbud independently to switch between sleep sounds or Bluetooth audio, volume up / down, next, previous, play / pause, or nothing at all. Anker’s app provides a lot of flexibility to dial in the A20 buds to your exact taste.

Listening to music is fine in a pinch with an adjustable EQ. But I wouldn’t buy these tiny, lightweight earbuds if music appreciation is your primary goal.

Still, as a side sleeper who listens to podcasts every night when falling asleep, I’m completely sold on Anker’s $149.99 Soundcore Sleep A20 buds, especially for the early bird price of $89.99 when they go on sale today via Kickstarter.

All photography by Thomas Ricker / The Verge

Smart string light showdown: Nanoleaf versus Lifx

Which is the best bet to bedazzle your backyard?

I’ve tried lots of different ways to light up the patio in my backyard so I can enjoy sitting outside into the wee hours. Everything from fairy lights to path lights to standard string lights has been wrapped around the myrtles or dug into the borders. But none have survived more than a couple of scorching South Carolina summers. So, I was excited to test these cafe-style smart string lights from Nanoleaf and Lifx.

The Nanoleaf Matter Smart Multicolor Outdoor String Lights ($129.99 for a 49-foot string with 20 bulbs) and Lifx Outdoor SuperColor String Lights ($129 for a 24-foot light string with 12 bulbs) both feature individually addressable full-color and tunable white LED bulbs and are capable of gradient lighting effects. This makes them super versatile. I can have a green and gold-themed St. Paddy’s Day party in March, a red, white, and blue-themed Fourth of July bash, and a lovely soft candlelight white for dinner al fresco anytime.

Both are compatible with all major smart home platforms, so I can set the lights on schedules, control them with voice commands, and have them turn on when the patio door opens using a contact sensor. Most importantly, both these brands’ string lights are seriously sturdy. After watching them survive a cracking spring storm last week, I’m hopeful that these could be a more permanent solution to illuminating my backyard.

I tested the Lifx and Nanoleaf head-to-head over two weeks. Read on to see which came out on top and which could be a good fit for your garden this summer.

Design and build quality: Lifx looks good, but Nanoleaf is so sparkly!

These are not your mother’s string lights. Nanoleaf and Lifx have gone for bold industrial design, with Nanoleaf building on its dodecahedron heritage to produce a gorgeous light bulb. The faceted surface creates a lovely effect that looks like a crystal hanging from my trees and is dazzling even when the lights are off.

Lifx has gone for an ultra-modern, Tron-style look — a tubular shape with a stick of light inside. They’re stylish but with less flair than Nanoleaf’s. I do like that the Lifx bulbs attach directly to the string and don’t dangle as far down as the Nanoleaf, creating a cleaner look. This makes the Lifx a better choice for hanging along a structure like the wall of a porch.

Both lights feel solid and durable, and the acrylic bulbs don’t break when dropped. The cables and plugs are similarly super heavy-duty, being weatherproof and holding up to rough handling during installation. Neither offers replaceable bulbs, but if a bulb goes bad, both string lights are covered under two-year warranties.

Winner: Nanoleaf

Lifx tunable white light goes down to a lovely warm glow — much softer than Nanoleaf’s.

Light quality: Lifx has serious range

The Lifx’s color rendering and tunable white light are very impressive. With a color rendering index (CRI) of 90 and white light that ranges from rich, warm candlelight at 1500K to an icy, cool blue-white at 9000K, the Lifx has better color and a broader range of white than the Nanoleaf (CRI of 80, 2700K to 6500K).

Lifx on the left, Nanoleaf on the right.

Its colors are also more saturated; red on the Lifx is really red, whereas on the Nanoleaf, it’s softer and more pink. But while brighter is usually better in a light bulb, I’d argue that accent lighting in your garden is one place where you probably don’t need the brightest option.

Winner: Lifx

Lighting effects and features: Lifx’s color blending is mind-bending

Each Lifx bulb has three addressable zones that blend together in an almost magical way. It’s hard to pinpoint which color you’re seeing; instead, it’s just a soft ambiance, a welcome change from jarring multicolor effects on most addressable lighting I’ve tested.

While the Nanoleaf bulbs can only show one color at a time per bulb, the cut-glass design does create an array of different shades. Nanoleaf’s scenes can also cycle through different colors to give a similar effect to the Lifx, but Lifx’s technology is better.

Lifx’s color blending is technically very impressive. (Yes. Photographing lights at night is hard.)

Lifx also has more options for flashier effects. Options like twinkle, color cycle, strobe, and morph created a fun ambiance on my patio, and I could adjust features like speed, colors, and direction. Lifx has a decent library of colorful lighting designs, and I really like the art series inspired by pieces such as Van Gogh’s The Starry Night.

However, Nanoleaf has many more designs to choose from, including hundreds of user-generated ones. A handful were created just for the string lights; my favorites were Sunset Sky, which cycled through warm reds and oranges, and Twilight, with crisp whites and soft grays.

I could create my own designs in both apps, with Lifx’s being the easier to use. Nanoleaf’s app is messy and crashes a lot, but its new AI scene generator makes it easier to create new designs without having to fight the interface.

Lifx’s app also has basic functions like setting schedules, which is frustratingly not an option with Nanoleaf — to set a schedule, you need to use a third-party smart home platform.

Winner: Lifx

That’s a lotta lights! The Nanoleafs max out at 147 feet with 60 bulbs (pictured: 98 feet with 40 bulbs).

Cost: Nanoleaf is cheaper and longer

While both string lights start at about $130, for that price Nanoleaf gives you 20 bulbs on almost 50 feet of string compared to just 12 bulbs over 24 feet for the Lifx (30 feet including the power cord). The Lifx bulbs are closer together, though, at 23 inches apart compared to 28 inches for the Nanoleaf.

Nanoleaf is the better deal, especially for a large area like my patio. The 98-foot string with 40 bulbs is $200, and the 147-foot string with 60 bulbs is $300. In comparison, the maximum length of the Lifx — three strings together, totaling 74 feet and 36 bulbs — costs almost $400.

Winner: Nanoleaf

I installed the Nanoleaf and Lifx the same distance from my router. The Lifx connected easily but the Nanoleaf struggled.

Connectivity and compatibility: Nanoleaf has more connection options, but Lifx is more reliable (so far)

The Nanoleaf and Lifx lights work over 2.4GHz Wi-Fi. While the Lifx connected easily, I struggled to get the Nanoleaf on the same network, even though both lights were set up in the same location. Eventually, moving the router closer to the Nanoleaf worked.

Both lights will work with Apple Home, Google Home, Amazon Alexa, and Samsung SmartThings. As part of Nanoleaf’s Matter Essentials line, the Nanoleaf string lights connect to smart home platforms via Matter-over-Wi-Fi. This means they work with any Matter-compatible platform, but you will need a Matter controller to connect.

Lifx relies on individual integrations with each platform, so it works with fewer platforms but doesn’t require any additional hardware. Lifx says a firmware upgrade will bring Matter-over-Wi-Fi compatibility later this year.

As is par for the course with Matter and me, it took multiple attempts to get the Nanoleaf lights onto a Matter platform. I wasn’t able to connect at all using my iPhone 15. Eventually, using a Samsung Galaxy S22, I connected to SmartThings and, from there, successfully shared the lights with Apple Home and Amazon Alexa using Matter’s multi-admin feature. You don’t have to use Matter with the Nanoleaf; you can connect directly to the Nanoleaf app over Bluetooth and Wi-Fi, but you will need Matter for smart home integrations.

Winner: Lifx

Both these string lights will make spring sparkle

These are both very nice string lights: expensive but built to last. While Lifx has better lighting effects, more saturated colors, and an easier-to-use app, the Nanoleaf has the edge in overall look. The bulb shape is just gorgeous and looks so nice in my backyard, and while the Nanoleaf isn’t as bright as the Lifx, its whites and colors provide more than enough richness and warmth for ambient outdoor lighting. Nanoleaf’s soft, sparkly glow won me over. Plus, it’s more affordable.

Both Lifx and Nanoleaf have other smart outdoor lighting options, so you can sync their lighting effects across your whole landscape. However, Philips Hue has the biggest outdoor selection (although, strangely, no similar cafe-style string lights, just the smaller holiday-focused light string).

There are also other options for smart string lights, including those from Govee, Twinkly, and Wiz. But these are all the traditional round bulb shapes. Nanoleaf and Lifx have added unique twists to the outdoor string light look, and both have done it very well.

Photos by Jennifer Pattison Tuohy / The Verge

Updated, Friday April 19th, 4PM: Clarified that while Philips Hue doesn’t have cafe-style string lights like these Nanoleaf and Lifx models, it does offer holiday string lights.
