If you’ve been wondering when you’ll be able to order the flame-throwing robot that Ohio-based Throwflame first announced last summer, that day has finally arrived. The Thermonator, which Throwflame bills as “the first-ever flamethrower-wielding robot dog,” is now available for purchase. The price? $9,420.
Thermonator is a quadruped robot with an ARC flamethrower mounted to its back, fueled by gasoline or napalm. It features a one-hour battery, a 30-foot flame-throwing range, and Wi-Fi and Bluetooth connectivity for remote control through a smartphone.
It also includes a Lidar sensor for mapping and obstacle avoidance, laser sighting, and first-person-view navigation through an onboard camera. The product appears to integrate a version of the Unitree Go2 robot quadruped that retails alone for $1,600 in its base configuration.
Photograph: Xmatter
The company lists possible applications of the new robot as “wildfire control and prevention,” “agricultural management,” “ecological conservation,” “snow and ice removal,” and “entertainment and SFX.” But most of all, it sets things on fire in a variety of real-world scenarios.
Back in 2018, Elon Musk made the news for offering an official Boring Company flamethrower that reportedly sold 10,000 units in 48 hours. It sparked some controversy, because flamethrowers can also double as weapons or potentially start wildfires.
Flamethrowers are not specifically regulated in 48 US states, although general product liability and criminal laws may still apply to their use and sale. They are not considered firearms by federal agencies. Specific restrictions exist in Maryland, where flamethrowers require a Federal Firearms License to own, and California, where the range of flamethrowers cannot exceed 10 feet.
Photograph: Xmatter
Even so, to state the obvious, flamethrowers can easily burn both things and people, starting fires and wreaking havoc if not used safely. Accordingly, the Thermonator might be one Christmas present you should skip for little Johnny this year.
The insect wing hinge is one of the most sophisticated skeletal structures in the animal kingdom. (Johan M. Melis et al/Nature)
A machine-learning model that can fly like a fly helped researchers to unravel the workings of the insect wing hinge. Most hypotheses about this complex biomechanical structure have been built on how it looks when it isn’t moving. An AI system, trained on video recordings of around 70,000 fruit-fly wing beats, predicted how muscle contractions would cause different wing motions. A winged robot programmed with the model’s findings then allowed the researchers to create a map linking muscle activity to flight forces.
An AI tool could help to identify the origins of cancers that have spread from a previously undetected tumour somewhere else in the body. The proof-of-concept model analyses images of cells from the metastatic cancer to spot similarities with its source — for example, breast cancer cells that migrate to the lungs still look like breast cancer cells. In dry runs, there was a 99% chance that the correct source was included in the model’s top three predictions. A top-three list could reduce the need for invasive medical tests and help clinicians tailor treatments to suit.
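That “top three” figure is an instance of top-k accuracy — the fraction of cases where the true answer appears among a model’s k highest-scoring predictions. As a minimal sketch (the class scores and labels below are invented for illustration, not taken from the study), it can be computed like this:

```python
import numpy as np

def top_k_accuracy(probs, true_labels, k=3):
    """Fraction of cases where the true label is among the k highest-scoring classes."""
    top_k = np.argsort(probs, axis=1)[:, -k:]  # indices of the k largest scores per row
    hits = [true in row for true, row in zip(true_labels, top_k)]
    return np.mean(hits)

# Invented example: model scores over 5 candidate tissue-of-origin classes for 3 cases
probs = np.array([
    [0.10, 0.50, 0.20, 0.15, 0.05],  # true class 1 has the top score -> in top 3
    [0.30, 0.05, 0.20, 0.25, 0.20],  # true class 3 has the second-highest score -> in top 3
    [0.40, 0.30, 0.15, 0.10, 0.05],  # true class 4 has the lowest score -> not in top 3
])
true_labels = [1, 3, 4]
print(top_k_accuracy(probs, true_labels))  # 2 of 3 cases hit
```

A top-3 metric is more forgiving than top-1, which is exactly why it suits a short-list workflow: the clinician narrows down the final answer from the model’s candidates.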
An initiative called ChatGPT and Artificial Intelligence Natural Large Language Models for Accountable Reporting and Use (CANGARU) is consulting with researchers and major publishers to create comprehensive guidelines for AI use in scientific papers. Some journals have introduced piecemeal AI rules, but “a standardized guideline is both necessary and urgent”, says philosopher Tanya De Villiers-Botha. CANGARU hopes to release its standards, including a list of prohibited uses and disclosure rules, by August and to update them every year.
A newly developed ‘metafluid’ can be used to build robotic grippers that can grasp objects as large and heavy as a glass bottle — or as small and fragile as an egg. Unlike a regular liquid, the metafluid can be compressed: its small gas-filled capsules collapse when pressure increases. The pressure inside the material then plateaus for some time even if outside pressure increases further. This means that when the system is used to operate a gripper, applying the same pressure will grasp objects of various sizes and fragility. (Nature | 7 min read, Nature paywall)
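The plateau behaviour can be caricatured with a toy piecewise model — a sketch for intuition only, with invented thresholds and units rather than the published physics:

```python
def internal_pressure(external_pressure, collapse_at=1.0, plateau=1.0, stiff_slope=5.0):
    """Toy piecewise response for a fluid of gas-filled capsules (all numbers invented):
    below the collapse threshold, pressure tracks the outside; while capsules are
    buckling, it plateaus; once all capsules have collapsed, it rises steeply again."""
    if external_pressure < collapse_at:
        return external_pressure  # capsules intact: ordinary liquid-like response
    elif external_pressure < 3 * collapse_at:
        return plateau  # capsules collapsing one by one: pressure plateaus
    else:
        return plateau + stiff_slope * (external_pressure - 3 * collapse_at)

# The plateau is why one applied pressure suits objects of different sizes:
print(internal_pressure(1.5), internal_pressure(2.5))  # both sit on the plateau
```

In the plateau region, squeezing harder doesn’t raise the gripping force — which is what lets the same input pressure handle both a bottle and an egg.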
AI systems could reveal hidden features in medical scans that currently require injecting dyes into the body. Contrast agents — gadolinium for magnetic resonance imaging, for example — are generally safe, but aren’t suitable for people with certain conditions. AI-assisted virtual dyes also make images taken with a fluorescence microscope appear as if they had been stained by a pathologist, a process that makes features stand out. Radiologist Kim Sandler expects to spend less time writing reports about what she sees in scans, and more time vetting AI-generated reports. “My hope is that it will make us better and more efficient, and that it’ll make patient care better,” Sandler says.
This article is part of Nature Outlook: Medical diagnostics, an editorially independent supplement produced with financial support from Seegene.
If neural networks mull over their training data for too long, they end up memorizing the information and become worse at adapting to unseen test data. But when researchers accidentally overtrained a model that specialized in certain mathematical operations, they discovered that it could suddenly master any test data. This ability, called ‘grokking’ — slang for total understanding — seems to happen when the system develops a unique way to solve problems. It’s not yet clear if this phenomenon applies to AI models beyond small, specialized ones. “These weird [artificial] brains work differently from our own,” says AI researcher Neel Nanda. “We need to learn to think how a neural network thinks.”
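Grokking itself takes long training runs on real networks, but the memorization problem described above can be illustrated with a classic curve-fitting analogy (invented toy data, not the study’s setup): a high-degree polynomial threads every training point — including the noise — yet behaves worse on unseen points than a modest fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy data: a smooth trend plus noise
x_train = np.linspace(0, 1, 10)
x_test = np.linspace(0.05, 0.95, 10)
true_fn = lambda x: np.sin(2 * np.pi * x)
y_train = true_fn(x_train) + 0.1 * rng.standard_normal(10)
y_test = true_fn(x_test)

def fit_and_errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_err, test_err

# Degree 9 with 10 points interpolates the noise exactly (memorization):
# near-zero training error, but worse behaviour between the training points.
for degree in (3, 9):
    train_err, test_err = fit_and_errors(degree)
    print(f"degree {degree}: train MSE {train_err:.2e}, test MSE {test_err:.2e}")
```

What makes grokking surprising is that it breaks this familiar picture: with continued training, the overfit model eventually escapes memorization and generalizes anyway.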
In a study that pitted 256 humans against three chatbots, the AI systems were generally more creative at coming up with uncommon uses for everyday objects. The study adds to an ongoing debate about how machines master skills traditionally considered to be exclusive to people. Passing tests designed for humans doesn’t demonstrate that machines are capable of anything approaching original thought, AI researcher Ryan Burnell points out. Chatbots are fed vast amounts of mostly unknown data and might just draw on things seen in their training data, he suggests.
Chatbots are designed to identify the quickest way to an answer, even if that means jumping to conclusions or making up things, says machine-intelligence researcher Arseny Moskvichev, who teaches AI systems to read novels like most humans do: from beginning to end. (Nautilus | 6 min read)
Today, I’m bidding farewell to Boston Dynamics’ iconic humanoid robot Atlas, which is being decommissioned after 11 years of running, jumping, flips and falls. The hydraulic machine will be replaced by a new electric Atlas model with a fully rotating ‘head’ and reversible ‘hips’ that allow it to easily get up from the ground.
Please tell me about your favourite robot (real or fictional) by sending an e-mail to [email protected]. Thanks for reading,
This week in the world of tech: Boston Dynamics unveiled a new robot that terrified us, while the barrage of negative Humane AI Pin reviews showed us that maybe the artificial intelligence uprising is further away than we initially feared.
But maybe you’ve missed these or other major tech stories from this past week. No worries, because we’re here to help with a round-up of the eight biggest tech news stories from the last seven days.
There’s a lot to catch up on, so let’s get into it.
8. The Humane AI Pin came… and flopped
(Image credit: Humane)
Reviews for the first Humane AI Pin came out this week, and they weren’t great – with the wearable being universally labeled as “unreliable.”
Marques Brownlee released a video on his YouTube channel calling the AI gadget “the worst product I’ve ever reviewed… for now,” Mrwhosetheboss said “It’s not good,” Bloomberg said “The design and interface are fatally flawed,” and The Verge’s video review featured frequent bouts of hysterical laughter because of how bad it found the Humane AI Pin to be.
There was also a very strange controversy on social media criticizing the critics, with much of the undeserved hate directed at Brownlee, who eventually issued a response. The main takeaway, though, is that while AI wearables do seem to be the future – with the Ray-Ban Meta Smart Glasses getting Meta AI and the Nothing earbuds getting ChatGPT (more below) – the current tech isn’t where it needs to be right now.
7. Boston Dynamics terrified us with its new humanoid robot
(Image credit: Boston Dynamics)
Perhaps someday we’ll ask, “Where were you when you first saw ‘New Atlas’?”
Get the hottest deals available in your inbox plus news, reviews, opinion, analysis and more from the TechRadar team.
Boston Dynamics’ all-new and all-electric Atlas is a generational leap from the more than decade-old hydraulics-based Atlas. That robot, which is being retired, was amazing in its own right, capable of numerous human-like tasks like walking, running, bending, and lifting, but also showing us how an apex human could perform through various acrobatics and parkour antics.
New Atlas, however, might be even stronger, and it’s already showing us its uncanny flexibility in a brief introduction video. Expect to see it doing even more impressive physical tricks before the bot finally makes its way to factories and, maybe someday, our homes.
6. We saw Sony’s new mini-LED TV backlight tech put OLEDs on notice
(Image credit: Future)
Sony just launched its new 2024 TVs, and the Bravia 9 mini-LED leads the lineup. Positioning mini-LED as its flagship TV tech is an extreme about-face for the company, which had previously reserved that status for OLED.
We saw the new Bravia 9 TV in action at Sony Pictures Studios in Culver City, California, and there’s good reason for Sony’s newfound mini-LED enthusiasm. The company has developed a new XR Backlight Master Drive backlight design that uses a 22-bit LED driver to deliver 50% higher brightness and 320% more local dimming zones than its previous X95L mini-LED model.
This new backlight helps enhance contrast and color brightness while reducing blooming, closing the picture quality gap between mini-LED and OLED. It’s also better able to capture the full range of highlight detail in movies with high dynamic range – an important factor as movies increasingly get mastered at higher brightness levels.
5. Samsung confirmed its AI is coming to your Galaxy S21 phone
Per a post on Samsung’s Korean community forum, the Galaxy S21, Galaxy S21 Plus, Galaxy S21 Ultra, Galaxy Z Flip 3 and Galaxy Z Fold 3 will be getting the company’s AI-packed One UI 6.1 update in “early May,” and when it does, it’ll bring Circle to Search and Magic Rewrite to these devices.
It’s always great news to hear older handsets will be getting some of the technical bells and whistles of the newer smartphone releases, though if you’re rocking a Samsung handset from 2020 it looks unlikely you’ll get any of these AI tools – so we’d suggest checking out our Samsung phones deals page if you’re thinking of upgrading to a new phone that can access Galaxy AI.
4. New Nothing Ear buds launched with ChatGPT
(Image credit: Nothing)
Nothing’s naming strategy is anything but self-explanatory, so to avoid supplementary confusion: Nothing launched two new sets of true wireless earbuds on Thursday, April 18. A model called simply Ear is the company’s new flagship offering – it arrives after the Ear (1), Ear (Stick) and Ear (2); that’s just how it is – while the also-new Ear (a) are the cheaper pair. And it’s this entry-level model that just gained a highly coveted TechRadar five-star recommendation, which you can read about to your heart’s content in our full-fat Nothing Ear (a) review.
But the fact that there are two new sets of Nothing earbuds is only part of the news here, because in addition to releasing two new earbuds models, Carl Pei’s startup has also fixed it so that your Nothing earbuds and phones can let you talk to ChatGPT for instant AI support.
Nothing says that once you’ve downloaded the ChatGPT app on your Nothing Phone (running the latest Nothing OS), you’ll be able to pinch-to-speak using the earbuds’ stems and thus summon the chatbot for answers, without having to dig out your device. And we have to admit, that’s really something, Nothing…
(Image credit: Meta)
Meta’s AI got a new and improved website, as well as some upgrades thanks to its new “state-of-the-art Llama 3 AI model,” according to CEO Mark Zuckerberg – and the best thing of all is that it’s completely free to use.
The site lets you generate text and images with a written prompt – though to make AI images you’ll need to log in, and your picture will feature a watermark, which should help a little in cutting down on misuse.
It’s still early days in the battle between AI creators, but Google and OpenAI had better watch out because Meta’s new and improved software is already looking like a major competitor to what’s currently out there – and it will only get better.
2. The iPhone got its first Nintendo emulators and alternative app stores
(Image credit: AltStore)
Following some gentle arm-twisting from the EU, Apple recently said its App Store would soon allow retro game emulators like the ones you can find on Android. This week, we saw the first one arrive with Delta – a free app that you can download now from the App Store in the EU and many countries outside of it.
Unlike rival emulators like iGBA, which quickly disappeared from the App Store due to copyright violations, this one is likely here to stay. Delta supports several consoles including the NES, SNES, Game Boy Advance and Nintendo DS, and you can play games with iPhone-compatible controllers, too. All you need to do is provide the ROM files (as long as they’re copyright-free, of course) and you’re good to go. Now all we need is a PS1 emulator…
1. The Insta360 X4 became our new favorite 360 camera
(Image credit: Insta360)
Our extensive Insta360 X4 hands-on review waxed lyrical about the 8K video-equipped 360-degree camera. With higher resolution video than the X3, much better battery life and welcome design tweaks, the X4 is the best 360-degree camera yet.
Full waterproofing and a decent single-camera mode make the X4 a compelling action camera, vlogging tool, and even a dash cam – especially for motorcyclists, who can voice-command the X4 from a Bluetooth-compatible headset inside a helmet. GoPro has a tough act to follow with its upcoming Max 2, as does Canon with its intriguing 360 / 180 3D PowerShot concept. It’s been really quiet in this category of cameras for the last couple of years, but that seems set to change in 2024.
Sunny, a mystery thriller with a darkly comic bent, will premiere on Apple TV+ this summer. The series will star Rashida Jones and a robot.
The two will work together in an attempt to locate the woman’s missing family.
Sunny will be a mystery/thriller/dark comedy
As a computer-maker, it really shouldn’t surprise anyone that Apple’s streaming video service includes plenty of sci-fi. Silo and Foundation stand out as two excellent examples. Some of the series mix sci-fi into everyday life, like Severance and The Big Door Prize.
That’s the approach that Sunny is taking. As Apple TV+ says:
Sunny stars Jones as Suzie, an American woman living in Kyoto, Japan, whose life is upended when her husband and son disappear in a mysterious plane crash. As “consolation” she’s given Sunny, one of a new class of domestic robots made by her husband’s electronics company. Though at first, Suzie resents Sunny’s attempts to fill the void in her life, gradually they develop an unexpected friendship. Together they uncover the dark truth of what really happened to Suzie’s family and become dangerously enmeshed in a world Suzie never knew existed.
Emmy-nominee Rashida Jones has been part of multiple Apple TV+ projects. She’s in Silo and Sofia Coppola’s acclaimed film On the Rocks. But she’s perhaps best known for her time on NBC’s Parks and Recreation.
Sunny was created by Katie Robbins (The Affair, The Last Tycoon), who’s also the showrunner, and Lucy Tcherniak (Station Eleven, The End of the F***ing World), who is also the director.
There’s no better place than Japan to set a TV series with a robot as a central character. Photo: Apple TV+
On Apple TV+ this summer
The 10-episode series will premiere globally on Apple’s streaming service with the first two episodes on Wednesday, July 10, 2024. Apple TV+ will release new episodes every Wednesday through September 4.
Watching Sunny requires a subscription to Apple TV+. The service is $9.99 per month with a seven-day free trial. You can also access it via any tier of the Apple One subscription bundle.
And Apple’s streaming video service also includes much more, of course. There’s a library of drama, comedies, sci-fi, musicals, children’s shows, nature documentaries, etc.
Boston Dynamics all but trademarked jaw-dropping robot videos with its hydraulics-powered Atlas robot’s dancing and parkour clips. Now it’s upped the ante, and I’m scraping my jaw off the floor again after watching the brief introduction video for its all-electric and completely redesigned Atlas robot.
The All New Atlas is Boston Dynamics’ first all-electric humanoid robot, and the robotics firm claims it’s stronger and more agile than all previous iterations. What jumps out at me in the video, though, is the robot’s far more human-like body.
Where the hydraulics-based Atlas always looked charmingly like a mash-up between a line-backer and some scaffolding, the new Atlas is much more in the vein of Tesla’s Optimus, and Figure AI’s Figure 01, quickly shifting the legendary robotics company back into a pole position in the growing humanoid robotics race. What stuns here, though, is not just the robot’s looks, it’s how the New Atlas moves.
The clip starts with the new Atlas motionless and splayed out on the floor. It lifts its two legs up and rolls them back until they’re in a position no human who is not a contortionist could easily match. With both feet planted on the ground, the new Atlas rises up from the floor but with its chest, head, and legs facing away from the camera. First, the head, which has a circular glass panel for a face, spins around, and then each leg rotates at the hip to face the camera as Atlas effortlessly walks forward. Finally, the torso spins around under the head until the entire New Atlas is facing the camera.
As Boston Dynamics notes in the release, “Atlas may resemble a human form factor, but we are equipping the robot to move in the most efficient way possible to complete a task, rather than being constrained by a human range of motion. Atlas will move in ways that exceed human capabilities.”
Giving Atlas super-human capabilities that include more strength and the ability to move in ways we can’t is all about efficiency. Humans are constrained by their physiology in ways that robots don’t have to be. We joke about people “keeping their heads on a swivel” to remain aware of their environment, but robots can literally do this.
Boston Dynamics continues to focus on bi-pedal robots because it believes it’s a useful form factor in building robots to work “in a world designed for people.” It’s unclear if the shift to all-electric augurs a similar change for its popular SPOT robot (the one that looks a bit like a dog), which currently uses battery power and hydraulic actuators.
Just the start
Even though Atlas now looks more human, it’s still a long way from commercial or consumer availability. Its initial test bed, according to Boston Dynamics, will be with company investor Hyundai. “In the months and years ahead, we’re excited to show what the world’s most dynamic humanoid robot can really do—in the lab, in the factory, and in our lives,” notes the company in the release.
On the back end, the New Atlas will be powered by, among other things, the company’s latest AI and machine learning. When it places robots in factories, Boston Dynamics ensures that the programming knows as much about the factory as possible so the robots can work independently and safely.
The dawn of a New Atlas does mean the sunsetting of the beloved hydraulic-based robot. Boston Dynamics gave the original Atlas a sweet sendoff with a video recounting its successes and numerous gaffes. For every time that Atlas successfully completed a parkour routine, it also tipped over, face-planted, and spectacularly burst a hydraulics line. The video is a funny and loving tribute to a robot that’s captured the imaginations of millions of viewers.
It’s a fitting way to end one chapter and launch this new one with the all-electric Atlas. Sure, we’ve only seen 30 seconds of movement, but I’m sure we’ll soon see this more personable robot dancing with SPOT, back-flipping off ledges, and parkouring its way into our hearts.
Almost 11 years after Boston Dynamics revealed the Atlas humanoid robot, it’s finally being retired. The DARPA-funded robot was designed for search-and-rescue missions, but it rose to fame thanks to videos showing off its dance moves and—let’s be honest—rudimentary parkour skills.
Atlas is trotting off into the sunset with one final YouTube video, thankfully including plenty of bloopers — which are the best parts. Boston Dynamics, of course, has more commercially successful robots in its lineup. It’s likely not the end of the line for the company’s humanoid robots, either.
— Mat Smith
The biggest stories you might have missed
You can get these reports delivered daily direct to your inbox. Subscribe right here!
It was part of a cargo pallet the space station dropped in 2021.
Back in March, a piece of space debris hit the roof of a house in Naples, FL, ripped through two floors and (fortunately) missed the son of homeowner Alejandro Otero. On Tuesday, NASA confirmed it was a piece of equipment dumped from the International Space Station (ISS) three years ago. NASA expected the haul of discarded nickel-hydrogen batteries to orbit Earth for two to four years “before burning up harmlessly in the atmosphere.” Not the case.
Netflix is accused of using AI-manipulated imagery in the true crime documentary What Jennifer Did. Several photos show the usual AI issues: mangled hands and fingers, strange artifacts, curved edges that should be straight and more. If accurate, the report raises serious questions about using such images in documentaries, particularly since the person depicted is currently awaiting retrial. Netflix has yet to acknowledge the report.
When the X3 landed, it was a 360-degree action cam that solved a lot of the usual problems with that camera genre. With the X4, Insta360 has just… upgraded everything. The technical improvements focus on video, with the new ability to record footage at up to 8K 30 fps or 5.7K at 60 fps. Slow-mo video has been boosted up to 4K resolution, too. In short, it captures more of everything. The X4 has a 2,290mAh battery, 67 percent bigger than the X3’s. According to the press release, it should be able to capture video for up to 135 minutes. The camera is available for $500 now.
Being a cat owner is a joy like no other, but I miss my cat so, so much when I’m away even just for a day. That’s why the Enabot Ebo SE pet robot is a must-have in my cat-crazy household. This small and sweet little robot doesn’t have an adorable little ‘face’ like the Enabot Ebo X, but operates similarly, offering features like mobile phone compatibility and the ability to take photos and videos.
It’s also not as stuffed with features as the Enabot Ebo X, which has built-in Alexa smart home functions and a 4K UHD camera. However, if you’re looking for a simple and much cheaper robot, the Enabot Ebo SE reigns supreme. This little orb is simple to set up right out of the box and is completely managed through the app.
It’s not quite got the chops to be one of the best home security cameras, but certainly gives peace of mind if you quickly want to check in at home. My testing of the Enabot Ebo SE coincided with my holiday, which was a huge blessing; this would be my first time traveling away for more than a day since I got Miso, and having the Enabot Ebo SE keeping an eye on my baby eased a lot of my anxieties.
I set everything up a few days before my six-day trip, and I was relieved to see that the Enabot Ebo SE returned to its charging station all on its own without any prompting after checking in on my sweet boy, Miso, (don’t worry, pictures soon to come!), which makes things a lot easier when you’re remotely checking in on your furry friends.
(Image credit: Enabot)
It was a complete stroke of luck that I started reviewing this robot when I did because as soon as I landed, I lost contact with friends and family in the UK – I couldn’t get a Facetime, WhatsApp, or even an IMO call to hold for more than two seconds.
Since the Ebo SE has two-way audio capability, I was able to keep in touch with Miso through the robot. It was incredibly useful to be able to open the app, turn the microphone on, and check in not just on my cat but on the people at home. It helped me stay connected, and I honestly don’t think I could travel without having this little guy set up and ready to be on guard duty.
In terms of the bot’s mobility, it’s pretty decent, but not groundbreaking. Through the app, you can steer it left, right, backward, and forward, and there are designated spin and sprint buttons. These proved useful because I had the Ebo SE set up in my bedroom, home to an obstacle course of socks and other flotsam and jetsam that Miso likes to hoard. That’s how I discovered that the Ebo SE struggles to get over smaller objects, like the corner of a shirt, and with sharper turns.
(Image credit: Muskaan Saxena via Future)
Close-ups of Miso the cat; Night Mode; Miso has a lot to say…
It would be nice if the Ebo SE had some kind of crash-detection feature that would alert me before I smack the robot into a bedpost – or, better yet, if it could reverse away from the hazard on its own. However, I suppose that level of intelligence would drive the price up from the moderately affordable $247 / £199 / AU$382 to a little on the expensive side.
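The Ebo SE offers nothing like this, but purely as a hypothetical sketch of what bump-and-reverse logic could look like (every name, unit, and threshold here is invented, not part of Enabot’s product):

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    accel_g: float  # magnitude of an acceleration spike, in g (invented units)

def drive_command(reading: ImuReading, bump_threshold_g: float = 1.5) -> str:
    """Hypothetical control step: back off briefly when an impact-sized
    acceleration spike is detected, otherwise keep driving forward."""
    if reading.accel_g >= bump_threshold_g:
        return "reverse"
    return "forward"

print(drive_command(ImuReading(accel_g=0.3)))  # forward: smooth driving
print(drive_command(ImuReading(accel_g=2.1)))  # reverse: bedpost detected
```

Even a crude threshold check like this is cheap to run on-device, which is why the feature’s absence feels more like a product decision than a hardware limit.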
In addition to its decent 1080p HD camera, the Enabot Ebo SE’s Night Mode was pretty impressive as well, and it was nice to be able to see Miso at any hour of the day or night while he was creating chaos and growing his pile of stolen artifacts. Thankfully, Night Mode is automatically enabled, so whenever you want to drive around your home and check in on your loved ones (both furry and otherwise) in the dark you can open the app and get straight to spying.
It’s worth noting that you need a pretty solid internet connection, as you may end up accidentally driving your robot off a cliff (or, more likely, down the stairs) like I did when I stepped outside of my hotel and lost connection.
(Image credit: Muskaan Saxena via Future)
Enabot Ebo SE pet robot review: price and availability
How much does it cost? $247 / £199 / AU$382
When is it available? Available now
Where can I buy it? Amazon and Enabot official site
The Enabot Ebo SE is relatively cheap and pretty budget-friendly for most people, available for $247 / £199 / AU$382. It’s well worth the splurge if you’ve been saving up for a pet robot, but of course, it’s still a luxury purchase.
It is a lot cheaper than the Enabot Ebo X, which starts at just under $1,000 – of course, you lose out on a fair few advanced features, but I didn’t find myself needing them. The Ebo SE is currently available on Amazon as well as through the Enabot website.
Two best friends taking in the sights (Image credit: Muskaan Saxena via Future)
Should I buy?
Enabot Ebo SE score card
Price: For the price, you get a pretty good product! The picture quality, however, could be better for the cost. 4/5
Design: The Enabot Ebo SE is super cute and super small, which is perfect for accessing the same nooks and crannies your pets do. 4.5/5
App and features: The app is a little basic, and it lacks Alexa compatibility, which is disappointing. However, it’s easy to get the hang of. 3/5
(Image credit: Muskaan Saxena via Future)
How I tested the Enabot Ebo SE
I used the Enabot Ebo SE while away from home and abroad for three weeks
I set up all the controls
I took photos and videos with the bot
I tested the microphone speaking both to my cat and to my housemate
I used the Enabot Ebo SE for about three weeks as my only pet and indoor camera. Once out of the box I paired it with my Ebo account and placed it in a secure room to operate in where stairs or other big obstacles wouldn’t impact it. I spent a few days familiarizing myself with the controls before I traveled, practicing the steering and controls, and trialing the app and its features.
I took several photos and videos of my cat at different parts of the day under different levels of internet connectivity, as well as using the microphone and speakers to see how reliable both components are. It is still currently my only pet camera and I use it often when I’m away from home or just want to check up on my cat.
I’ve been researching and reviewing technology for two years, and while Miso hasn’t necessarily developed the same writing skills I have, he’s a pretty good judge of pet toys and products.
Most of the best robots, ones that can walk, run, climb steps, and do parkour, do not have faces, and there may be a good reason for that. If any of them did have mugs like the one on this new research robot, we’d likely stop in our tracks in front of them, staring wordlessly as they ran right over us.
Building robots with faces and the ability to mimic human expressions is an ongoing fascination in the robotics research world, but even though it might take less battery power and fewer load-bearing motors to make it work, the bar is much, much higher for a robot smile than it is for a robot jump.
Even so, Columbia Engineering’s development of its newest robot, Emo, and its work on “Human-Robot Facial Co-Expression” are impressive and important. In a recently published scientific paper and YouTube video, researchers describe their work and demonstrate Emo’s ability to make eye contact and instantly imitate and replicate human expressions.
To say that the robot’s series of human-like expressions is eerie would be an understatement. Like so many robot faces of its generation, its head shape, eyes, and silicone skin all resemble a human face, but not enough to avoid the dreaded uncanny valley.
That’s okay, because the point of Emo is not to put a talking robot head in your home today. This is about programming, testing, and learning … and maybe getting an expressive robot in your home in the future.
Emo’s eyes are equipped with two high-resolution cameras that let it make “eye contact” and, using one of its algorithms, watch you and predict your facial expressions.
Because human interaction often involves modeling, meaning that we often unconsciously imitate the movements and expressions of those we interact with (cross your arms in a group and gradually watch everyone else cross their arms), Emo uses its second model to mimic the facial expression it predicted.
“By observing subtle changes in a human face, the robot could predict an approaching smile 839 milliseconds before the human smiled and adjust its face to smile simultaneously,” the researchers write in their paper.
In the video, Emo’s expressions change as rapidly as the researcher’s. No one would claim that its smile looks like a normal human smile, that its look of sadness isn’t cringeworthy, or that its look of surprise isn’t haunting, but its 26 under-the-skin actuators get pretty close to delivering recognizable human expressions.
(Image credit: Columbia Engineering)
“I think that predicting human facial expressions represents a big step forward in the field of human-robot interaction. Traditionally, robots have not been designed to consider humans,” said Columbia PhD candidate Yuhang Hu in the video.
How Emo learned about human expressions is even more fascinating. To understand how its own face and motors work, the researchers put Emo in front of a camera and let it make any facial expression it wanted. This taught Emo the connection between its motor movements and the resulting expressions.
They also trained the AI on real human expressions. The combination of these training methods gets Emo about as close to instantaneous human expression as we’ve seen on a robot.
The goal, note researchers in the video, is for Emo to possibly become a front end for an AI or Artificial General Intelligence (basically a thinking AI).
If you’re just looking for a cheap way to keep your floors cleaner and don’t need all the top-end features, you may want to check out this deal. Anker brand Eufy’s BoostIQ RoboVac 11S Max is now down to $140 after a 44 percent discount. The sale comes as part of a larger sale on Eufy vacs, including ones with a few more bells and whistles. The 11S Max is our current pick for an ultra budget option in our buyer’s guide to robot vacuums because it’s super affordable (especially after the discount), has good suction power and a long battery life. Probably the biggest caveat is that it’s not Wi-Fi enabled.
Instead of controlling the unit through your home’s wireless network, the 11S Max comes with a remote that handles scheduling and other smart features like cleaning mode selection. It also has a manual button up top to start a session. It has three power modes — Standard, BoostIQ and Max — and BoostIQ provides a good balance of adequate suction and noise level. In our tests, a BoostIQ session lasted about an hour and 15 minutes. The obstacle avoidance is impressive at sidestepping random objects, though it occasionally bumped into walls. The vac is also about an inch and a half thinner than many other robot vacs we tested, which lets it get beneath low-slung furniture for more complete cleaning.
Recently, I’ve had a bit of a ‘mare with my robot vacuum. I’ve been using the iRobot Roomba Combo J7 Plus since I first reviewed it a year and a half ago, and while it’s been a stellar sucker until the last few months, a litany of sudden issues has me wanting to try something new.
It’s not the first iRobot Roomba I’ve tried, and I doubt it’ll be the last, but given its lofty price, I’m pretty surprised by some of its issues. Despite my regular cleaning and maintenance, the mop function has stopped working almost entirely. Suddenly, the vacuum is rubbish at cleaning edges, and despite the really impressive navigation technology I observed during my test, my Combo J7 Plus somehow managed to gouge out a chunk of its camera lens during a cleaning job, meaning obstacle detection is permanently marred.
iRobot is facing increasingly intense competition in the robot vacuum space, and with news of its failed Amazon acquisition, there’s less and less room for Roomba to continue its decades-long market dominance.
That’s especially true now that big-name household appliance brands like Dyson and Samsung are jostling for their slice of the pie, and doing so with increasing promise (though the Dyson 360 Vis Nav failed to impress us). I had the chance to see Samsung’s new Bespoke Jet Bot Combo AI robot vacuum in the flesh at its global Bespoke AI launch event, and I’m pretty excited by the promised improvements upon its predecessor.
The Bespoke Jet Bot Combo AI’s predecessor, the Jet Bot AI+ (Image credit: Future)
A brief history
When Samsung announced its first Jet Bot AI+ robot vacuum at CES 2021, all eyes were on the tech giant, waiting in anticipation to see if it could sweep the market with its new tech.
Unfortunately, at least based on our Samsung Jet Bot AI+ review, the OG Jet Bot robot vacuum lagged behind the competition with some pretty fundamental issues. For one thing, it’s 3.9in / 13.7cm tall; compared to the best robot vacuums we’ve tested, which average around the 3.2in / 8.1cm mark, that makes the Jet Bot AI+ pretty chunky, which isn’t ideal for cleaning under furniture.
Then there was the slightly sloppy navigation and less-than-impressive edge cleaning, all of which amounted to a very expensive bot that didn’t quite warrant its lofty price tag, even with the armory of cool features Samsung built into the Jet Bot AI+.
Still, it was a decent attempt at re-entering the robotic cleaning market following a handful of earlier attempts that failed to fully take off (RIP Crubo and PowerBot), and the Jet Bot AI+ still managed to impress with its fantastic suction and ability to suck up debris many other robovacs would have simply brushed aside.
What’s new, pussy cat?
On the other hand, Samsung’s new Bespoke Jet Bot Combo AI is poised to deliver a much better cleaning experience – and not just because it can mop, too.
First announced at CES 2024, the Bespoke Jet Bot Combo AI doesn’t only include returning features like the ability to spy on your pets, but it’s also seen some serious upgrades to both hardware and software.
Most noticeably, it’s a fair bit shorter than the original Jet Bot AI+. By my eye, it looks to be around the same height as, if not slightly taller than, my iRobot Roomba Combo J7+ at 3.4in / 8.6cm tall. While I do prefer even slimmer robot vacuums, even the extra clearance compared to its predecessor makes the Bespoke Jet Bot Combo AI far superior.
Then there are the powerful spinning mop pads, which rotate at 170 RPM to offer a thorough clean on even the toughest stains. Thanks to the vacuum’s ability to recognize floor types and adjust its cleaning patterns accordingly, the Bespoke Jet Bot Combo AI will automatically lift its mop pads to avoid dampening carpets. The best part? The Clean Station will even steam, wash, and sanitize the mop pads, neutralizing 99.9% of bacteria.
Speaking of the Clean Station, the Bespoke Jet Bot Combo AI’s auto-emptying home base features 3-litre tanks for dirty and clean water in addition to a compartment for the dust bin, which Samsung says needs replacing on average every 2-3 months. That does mean it’s pretty enormous, which can be a real turn-off for some, but it’s the nature of combination robot vacuum and mops that feature mop pad cleaning.
On the software side, things are looking up, too. This clever cleaner leverages a database of 1.7 million images to bolster its deep AI neural network for features like obstacle detection.
(Image credit: Future)
Only time will tell…
I can’t speak to the performance of Samsung’s new Bespoke Jet Bot Combo AI, having only seen it leave its base, pirouette, and return, but I’m certainly ready to try something new.
Samsung’s experience in home appliances combined with its software chops in theory makes for a strong contender, though I’ve no idea if the vacuum will outlast my presently inept iRobot Roomba Combo J7+.
The Bespoke Jet Bot Combo AI is currently available for pre-order on Samsung’s US website, with a list price of $1,699, reduced by $300 as of writing; we’ve not got pricing or release date details for the UK or Australia right now. The US pricing does, however, give us an indicator for price comparison. At list price, it’s $300 more than iRobot’s most recent combination cleaner, the Roomba J9+, which retails for $1,399.99 / £1,249 / AU$1,999, and $500 more than the vacuum-only Dyson 360 Vis Nav, which costs $1,199.99 / £1,399.99 / AU$2,399.