- Video Friday: Robots Are Everywhere at CES 2026, by Evan Ackerman on January 9, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! We’re excited to announce the product version of our Atlas® robot. This enterprise-grade humanoid robot offers impressive strength and range of motion, precise manipulation, and intelligent adaptability—designed to power the new industrial revolution. [ Boston Dynamics ]I appreciate the creativity and technical innovation here, but realistically, if you’ve got more than one floor in your house? Just get a second robot. That single-step sunken living room though....[ Roborock ]Wow, SwitchBot’s CES 2026 video shows almost as many robots in their fantasy home as I have in my real home.[ SwitchBot ]What is happening in robotics right now that I can derive more satisfaction from watching robotic process automation than I can from watching yet another humanoid video?[ ABB ]Yes, this is definitely a robot I want in close proximity to my life.[ Unitree ]The video below demonstrates a MenteeBot learning, through mentoring, how to replace a battery in another MenteeBot. No teleoperation is used.[ Mentee Robotics ]Personally, I think we should encourage humanoid robots to fall much more often, just so we can see whether they can get up again.[ Agility Robotics ]Achieving long-horizon, reliable clothing manipulation in the real world remains one of the most challenging problems in robotics. This live test demonstrates a strong step forward in embodied intelligence, vision-language-action systems, and real-world robotic autonomy.[ HKU MMLab ]Millions of people around the world need assistance with feeding. Robotic feeding systems offer the potential to enhance autonomy and quality of life for individuals with impairments and reduce caregiver workload. However, their widespread adoption has been limited by technical challenges such as estimating bite timing, the appropriate moment for the robot to transfer food to a user’s mouth. In this work, we introduce WAFFLE: Wearable Approach For Feeding with LEarned Bite Timing, a system that accurately predicts bite timing by leveraging wearable sensor data to be highly reactive to natural user cues such as head movements, chewing, and talking.[ CMU RCHI ]Humanoid robots are now available as platforms, which is a great way of sidestepping the whole practicality question.[ PNDbotics ]We’re introducing Spatially Enhanced Recurrent Units (SRUs)—a simple yet powerful modification that enables robots to build implicit spatial memories for navigation. Published in the International Journal of Robotics Research (IJRR), this work demonstrates up to +105 percent improvement over baseline approaches, with robots successfully navigating 70+ meters in the real world using only a single forward-facing camera.[ ETHZ RSL ]Looking forward to the DARPA Triage Challenge this fall![ DARPA ]Here are a couple of good interviews from the Humanoids Summit 2025. [ Humanoids Summit ]
- Video Friday: Watch Scuttle Evolve, by Evan Ackerman on January 2, 2026 at 6:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.

ICRA 2026: 1–5 June 2026, VIENNA

Enjoy today’s videos!

I always love seeing robots progress from research projects to commercial products. [ Ground Control Robotics ]

Well, this has to be one of the most “watch a robot do this task entirely through the magic of jump cuts” videos I’ve ever seen. [ UBTECH ]

Very satisfying sound on this one. [ Pudu Robotics ]

Welcome to the AgileX Robotics Data Collection Facility, where real robots build the foundation for universal embodied intelligence. Our core mission? Enable large-scale data sharing and reuse across dual-arm teleoperation robots of diverse morphologies, breaking down data silos that slow down AI progress. [ AgileX ]

I’m not sure how much thought was put into this, but giving a service robot an explicit cat face could be a good way of moderating expectations about its behavior and interactivity. [ Pudu Robotics ]

UBTECH says they have built 1,000 of their Walker S2 humanoid robots, over 500 of which are “delivered & working.” I would very much like to know what “working” means in this context. [ UBTECH ]

Every story has its beginning, and ours started in 2023—a year defined by the unknown. Let technology return to passion; let trials catalyze evolution. Embracing growth, embarking on a new journey. We’ll see you at the next stop.

Please, please hire someone to do some HRI (human-robot interaction) design. [ PNDbotics ]
- Tech to Track in 2026, by Harry Goldstein on January 1, 2026 at 3:00 pm
Every September as we plan our January tech forecast issue, IEEE Spectrum’s editors survey their beats and seek out promising projects that could solve seemingly intractable problems or transform entire industries.

Often these projects fly under the radar of the popular technology press, which these days seems more interested in the personalities driving Big Tech companies than in the technology itself. We go our own way here, getting out into the field to bring you news of the hidden gems that genuinely—as the IEEE motto goes—advance technology for the benefit of humanity.

A look back at the last 20 years of January issues reveals that while we’ve certainly covered our share of huge tech projects, like the James Webb Space Telescope, many of the stories touch on subjects most people would have otherwise missed.

Last January, Senior Associate Editor Emily Waltz reported on startups that are piloting ocean-based carbon capture. This issue, she’s back with another CO2-centric story, this time focused on grid-scale storage, which is poised to blow up—literally. Waltz traveled to Sardinia to check out Milan-based Energy Dome’s “bubble battery,” which can store up to 200 megawatt-hours by compressing and decompressing pure carbon dioxide inside an inflatable dome.

This kind of modular, easy-to-deploy energy storage could be especially useful for AI data centers, says Senior Editor Samuel K. Moore, who curated this issue and wrote about gravity energy storage back in January 2021. “When we think about energy storage, our minds usually go to grid-scale batteries,” Moore says. “Yet these bubbles, which are in many ways more capable than batteries, will be sprouting up all over the place, often in association with computing infrastructure.”

For his story in this issue, Moore dove into the competition between two startups that are developing radio-based cables to replace conventional copper cables and fiber optics in data centers. These radio systems can connect processors 10 to 20 meters apart using a third of the power of optical-fiber cables and at a third of the cost. The next step is to integrate the radio connections directly with GPUs, to ease cooling burdens and help data centers and the AI models running on them continue to scale up.

Big bubbles could help with grid-scale storage; tiny bubbles can liquefy cancer tumors, as Greg Uyeno found when reporting on HistoSonics’ ultrasound treatment. Feared for its aggressive nature and extremely low survival rate, pancreatic cancer kills almost half a million people per year worldwide. HistoSonics uses noninvasive, focused ultrasound to create cavitation bubbles that destroy tumors without dangerously heating surrounding tissue. This year, the company is concluding kidney trials as well as launching pancreatic cancer trials.

Over the last two decades, Spectrum has regularly covered the rise of drones. In 2018, for instance, we reported that the startup Zipline would deploy autonomous drones to deliver blood and medical supplies in rural Rwanda. Today, Zipline has a market cap of about US $4 billion and operates in several African countries, Japan, and the United States, having completed almost 2 million drone deliveries. In this issue, journalist Robb Mandelbaum takes us inside the Wildfire XPrize competition, aimed at providing another life-saving service: dousing wildfires before they grow out of control.
Zipline succeeded because it could make deliveries to remote locations much faster than land vehicles. This year’s XPrize teams plan to detect and suppress fires faster than conventional firefighting methods.

In addition to these emerging technologies, we’ve packed this issue with a dozen others, including Porsche’s wireless home charger for EVs, the world’s first electric air taxi service, neutral-atom quantum computers, interoperable mesh networks, and robotic baseball umpires. Let’s see which of this year’s picks make it to the big leagues.
- Teams of Robots Compete to Save Lives on the Battlefield, by Evan Ackerman on December 31, 2025 at 1:00 pm
Last September, the Defense Advanced Research Projects Agency (DARPA) unleashed teams of robots on simulated mass-casualty scenarios, including an airplane crash and a night ambush. The robots’ job was to find victims and estimate the severity of their injuries, with the goal of helping human medics get to the people who need them the most.

Kimberly Elenberg is a principal project scientist with the Auton Lab of Carnegie Mellon University’s Robotics Institute. Before joining CMU, Elenberg spent 28 years as an Army and U.S. Public Health Service nurse, which included 19 deployments and serving as the principal strategist for incident response at the Pentagon.

The final event of the DARPA Triage Challenge will take place in November, and Team Chiron from Carnegie Mellon University will be competing, using a squad of quadruped robots and drones. The team is led by Kimberly Elenberg, whose 28-year career as an Army and U.S. Public Health Service nurse took her from combat surgical teams to incident response strategy at the Pentagon.

Why do we need robots for triage?

Kimberly Elenberg: We simply do not have enough responders for mass-casualty incidents. The drones and ground robots that we’re developing can give us the perspective that we need to identify where people are, assess who’s most at risk, and figure out how responders can get to them most efficiently.

When could you have used robots like these?

Elenberg: On the way to one of the challenge events, there was a four-car accident on a back road. For me on my own, that was a mass-casualty event. I could hear some people yelling and see others walking around, and so I was able to reason that those people could breathe and move.

In the fourth car, I had to crawl inside to reach a gentleman who was slumped over with an occluded airway. I was able to lift his head until I could hear him breathing. I could see that he was hemorrhaging and feel that he was going into shock because his skin was cold. A robot couldn’t have gotten inside of the car to make those assessments.

This challenge involves enabling robots to remotely collect this data—can they detect heart rate from changes in skin color or hear breathing from a distance? If I’d had these capabilities, it would have helped me identify the person at greatest risk and gotten to them first.

How do you design tech for triage?

Elenberg: The system has to be simple. For example, I can’t have a device that’s going to force a medic to take their hands away from their patient. What we came up with is a vest-mounted Android phone that flips down at chest height to display a map that has the GPS location of all of the casualties on it and their triage priority as colored dots, autonomously populated from the team of robots.

Are the robots living up to the hype?

Elenberg: From my time in service, I know the only way to understand true capability is to build it, test it, and break it. With this challenge, I’m learning through end-to-end systems integration—sensing, communications, autonomy, and field testing in real environments. This is art and science coming together, and while the technology still has limitations, the pace of progress is extraordinary.

What would be a win for you?

Elenberg: I already feel like we’ve won. Showing responders exactly where casualties are and estimating who needs attention most—that’s a huge step forward for disaster medicine. The next milestone is recognizing specific injury patterns and the likely life-saving interventions needed, but that will come.

This article appears in the January 2026 print issue as “Kimberly Elenberg.”
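The vest-mounted display Elenberg describes boils down to a stream of casualty records, each carrying a position and an estimated triage priority, pushed from the robots to a medic’s map. The Python sketch below is purely illustrative of that data flow: the field names, the START-style color coding, and the ranking function are assumptions made for this example, not details of Team Chiron’s actual software.

```python
# Illustrative sketch only: a minimal data model for the kind of casualty feed
# described in the interview (GPS position plus triage priority, streamed from
# robots to a vest-mounted map display). Field names, the START-style color
# scheme, and the ranking logic are assumptions, not Team Chiron's system.
from dataclasses import dataclass
from enum import IntEnum


class Triage(IntEnum):
    """START-style categories, ordered by urgency (lower value = more urgent)."""
    IMMEDIATE = 0   # red
    DELAYED = 1     # yellow
    MINOR = 2       # green
    EXPECTANT = 3   # black


COLOR = {Triage.IMMEDIATE: "red", Triage.DELAYED: "yellow",
         Triage.MINOR: "green", Triage.EXPECTANT: "black"}


@dataclass
class Casualty:
    casualty_id: str
    lat: float       # GPS latitude reported by the robot that found the person
    lon: float       # GPS longitude
    triage: Triage   # priority estimated from remotely sensed cues


def render_order(casualties: list[Casualty]) -> list[tuple[str, str, float, float]]:
    """Return (id, dot color, lat, lon) tuples, most urgent first, i.e. the
    order in which a medic's map display might surface them."""
    ranked = sorted(casualties, key=lambda c: c.triage)
    return [(c.casualty_id, COLOR[c.triage], c.lat, c.lon) for c in ranked]


if __name__ == "__main__":
    feed = [
        Casualty("c-01", 39.002, -76.905, Triage.MINOR),
        Casualty("c-02", 39.003, -76.907, Triage.IMMEDIATE),
    ]
    for row in render_order(feed):
        print(row)
```

Sorting by the enum value is just one simple way to make sure the most urgent dots surface first; a real system would also have to handle stale reports, duplicate detections of the same person, and lost network links.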
- First Air Taxi Service to Launch in Dubai in 2026, by Elan Head on December 28, 2025 at 1:00 pm
Summary

- Joby Aviation is realizing Uber’s original “Elevate” dream, moving electric vertical take-off and landing (eVTOL) aircraft from science fiction toward commercial reality.
- By 2026, Joby aims to inaugurate the world’s first integrated air taxi network—in Dubai—leveraging aggressive local infrastructure investment to bypass Western bureaucratic hurdles.
- The plan includes “vertiports” at strategic hubs like Dubai International Airport, creating the essential physical and digital ecosystem required for reliable point-to-point urban flight.
- While facing a cautious FAA in the U.S., Joby will use its Dubai operations to bridge the gap between experimental testing and full-scale passenger operations.

Ten years ago, ride-sharing giant Uber embraced a sci-fi future in which clean, quiet electric aircraft would shuttle passengers around crowded cities. Uber’s well-funded Elevate initiative, which included a white paper and three high-profile annual summits, effectively launched the electric vertical take-off and landing (eVTOL) industry, promising investors, regulators, and the general public that these futuristic flying taxis were “closer than you think.”

At the time, California-based Joby Aviation was still in stealth mode. But behind the scenes, this pioneering eVTOL developer—which has received more than US $3 billion in total funding, including around $900 million from Toyota—was playing a major role in shaping Uber’s vision. It later stepped in to keep that vision alive, acquiring the Elevate program in 2020 after Uber CEO Dara Khosrowshahi decided to axe it.

Now, Joby, which was founded in 2009 and has become the dominant eVTOL startup, says it is finally on the verge of making “urban air mobility” a reality. It plans to conduct its first passenger flights in 2026 in Dubai, United Arab Emirates.

This article is part of our special report Top Tech 2026.

“Dubai continues to be our global launchpad for commercial service, and our progress here is a testament to the UAE’s visionary approach to advanced air mobility,” says Anthony Khoury, Joby’s UAE general manager, in an email interview. “Dubai is on track to be the first city in the world to offer a fully integrated, premium air taxi network, and we are sprinting toward that target.”

Joby Struck a Six-Year Exclusive Deal with Dubai

The company first announced its UAE plans at the World Governments Summit in Dubai in February 2024, striking a deal with Dubai’s Roads and Transport Authority (RTA) that gives it an exclusive right to operate air taxis there for six years from the launch of commercial operations.

Joby also signed an agreement with U.K.-based Skyports to design, build, and operate four “vertiport” sites in Dubai—places for the eVTOL aircraft to load and unload passengers and charge their batteries. The first vertiport will be near Dubai International Airport, with additional ones planned for Dubai Mall, the Atlantis the Royal resort, and American University in Dubai.

Joby won’t be the first eVTOL developer to carry passengers. That distinction goes to China’s EHang, which is already conducting limited sightseeing and demonstration flights with its two-seat, autonomous electric multicopters. (Joby’s aircraft are piloted.) If Joby pulls off its goal, however, it will be the first to routinely fly passengers from point to point over urban traffic, in keeping with Uber Elevate’s original vision.
Its exclusive agreement in Dubai will help fortify its lead in the global race to commercialize electric air taxis, which includes a handful of other Western eVTOL developers, plus a growing number of Chinese players. Besides its Dubai deal, Joby also has a partnership with Delta to start an airport shuttle service in the United States.

The Joby S4 electric vertical takeoff and landing (eVTOL) aircraft has six electric motors, each weighing 28 kilograms and capable of a peak output of 236 kilowatts. [Joby Aviation]

Operating a reliable air taxi service is a demanding proposition that will require Joby’s aircraft, charging infrastructure, and scheduling software to perform safely and reliably day in and day out. Since every new and complex technology has teething problems, Joby envisions fairly limited initial operations in 2026.

“We will transition from test flights to more complex proving runs and eventually nonpaying passenger flights out of the completed vertiports, ensuring a seamless passenger experience ahead of full commercial launch,” says Khoury. He adds that Joby is currently working with Skyports to ready its initial vertiports and with government agencies in Dubai and the UAE to receive the necessary approvals for its operations.

“Dubai’s approach is deeper and more comprehensive than what you see in many of the headlines,” says Clint Harper, an aviation infrastructure and policy advisor who recently participated in an advanced air mobility workshop with Dubai’s RTA. “In our workshop,” he says, “the RTA staff had fantastic questions and concerns regarding safety, security, and system-level integration. Everyone recognized and appreciated strong government support and wanted to deliver the right system solution, not just a one-off demo. I was thoroughly impressed and inspired.”

Initial Air Operations Will Precede an Airworthiness Certificate

Notably, all of this groundwork is taking place in advance of Joby receiving an initial type certificate for its aircraft from the U.S. Federal Aviation Administration. In the United States (and elsewhere), a type certificate is typically a prerequisite for conducting commercial operations with paying passengers. Joby claims it’s making good progress toward FAA certification, but how quickly (or slowly) that process moves is largely out of its hands. In recent years, the FAA has been taking longer to certify even conventional airplanes and helicopters, which the industry blames on staffing shortages at the agency and more cautious decision-making in the wake of the Boeing 737 Max crisis.

This perception that certification delays have more to do with bureaucracy than safety may be why Dubai is willing to approve some early operations by Joby in advance of FAA type certification. Interestingly, the United States is now following the UAE’s example. In September, the FAA and U.S. Department of Transportation began soliciting proposals for an eVTOL Integration Pilot Program (eIPP), which will select at least five projects to demonstrate eVTOL operations in the national airspace starting as early as summer 2026.

The FAA has stated that the eIPP won’t allow eVTOL developers to bypass certification requirements or carry paying passengers. However, it will enable them to undertake additional testing and demonstration flights as a stepping-stone to commercial operations. Joby says it’s planning to take part in the eIPP, meaning its air taxis could also be flying over U.S. cities in 2026—even if the only person on board is the pilot.
- Video Friday: Holiday Robot Helpers Send Season’s Greetings, by Evan Ackerman on December 26, 2025 at 6:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Happy Holidays from Boston Dynamics!I would pay any amount of money for that lamp.[ Boston Dynamics ]What if evolution wasn’t carbon-based—but metal instead? This short film explores an alternative, iron-based evolution through robots, simulation, and real-world machines. Inspired by biological evolution, this Christmas lab film imagines a world where machines evolve instead of organisms.[ ETH Zurich Robotics System Lab ]Happy Holidays from FieldAI![ FieldAI ]Happy Holidays from the Institute of Robotics and Machine Intelligence at Poznan University of Technology![ Poznan University of Technology IRMI ]Happy Holidays from BruBotics![ AugmentX ]Thanks, Bram![ Humanoid ]Check out how SCUTTLE tackles the dull, dirty, and dangerous tasks of the pest control industry.[ Ground Control Robotics ]Happy Holidays from LimX Dynamics![ LimX Dynamics ]Happy (actually maybe not AI?) Holidays from Kawasaki Robotics![ Kawasaki Robotics ]Happy Holidays from AgileX Robotics[ AgileX Robotics ]Big news: Badminton just got a new training partner. Our humanoid robot can rally with a human in continuous exchanges, combining fast returns with stable movement. Peak return speed reaches 19.1 m/s.[ Phybot ]Well, here’s one way of deploying a legged robot.[ Kepler ]Today, we present the world’s first demo video of a full-size robot taking on the challenging Charleston dance.[ PNDbotics ]The DR02 humanoid robot from DEEP Robotics showcases remarkable versatility and agility. From the graceful flow of Tai Chi to the energetic moves of street dance, DR02 combines precision, strength, and artistry with ease![ Deep Robotics ]Decreasing the Cost of Morphing in Adaptive Morphogenetic Robots: By using kirigami laminar jamming flippers, the Jamming Amphibious Robotic Turtle (JART) can quickly morph its limbs to adapt to changing terrain. This pneumatic layer jamming technology enables multi-environment locomotion on land and water by changing the robot’s flipper shape and stiffness to decrease the cost of transport.[ Paper ]Super Odometry is a resilient sensor-fusion framework that delivers accurate, real-time state estimation in challenging environments by integrating external and inertial sensing. For decades, SLAM has depended on external sensors like cameras and lidar. We argue it’s time to reverse this hierarchy: True robustness begins from within. By placing inertial sensing at the core of state estimation, robots gain an inner sense of motion. We believe the systems that not only see, but also feel, learn, and adapt.[ AirLab ]
- The Top 6 Robotics Stories of 2025, by Evan Ackerman on December 24, 2025 at 2:00 pm
Usually, I start off these annual highlights posts by saying that it was the best year ever for robotics. But this year, I’m not so sure. At the end of 2024, it really seemed like AI and humanoid robots were poised to make a transformative amount of progress toward some sort of practicality. While it’s certainly true that progress has been made, it’s hard to reconcile what’s actually happened in 2025 with the amount of money and hype that has suffused robotics over the course of the year. And for better or worse, humanoids are overshadowing everything else, raising questions about what will happen if the companies building them ultimately do not succeed.

We’ll be going into 2026 with both optimism and skepticism, and we’ll keep doing what we always do: talking to the experts, asking as many hard questions as we can, and making sure to share all the cool robots, even (or especially) the ones that you won’t see anywhere else.

So thanks for reading, and to all you awesome robotics folks out there, thanks for sharing your work with us! IEEE Spectrum has a bunch of exciting new stuff planned for 2026, and as we close out 2025, here’s a quick look back at some of our best robotics stories of the year.

1. Reality Is Ruining the Humanoid Robot Hype [Eddie Guy]

Humanoid robots are hard, and they’re hard in lots of different ways. For some of those ways, we at least understand the problems and what the solutions will likely involve. But there are other problems that have no clear solutions, and most humanoid companies, especially the well-funded ones, seem quite happy to wave those problems away while continuing to raise extraordinary amounts of money. We’re going to keep calling this out whenever we see it, and expect even more skepticism in 2026.

2. Exploit Allows for Takeover of Fleets of Unitree Robots [CFOTO/Future Publishing/Getty Images]

Security is one of those pesky little things that is super important in robotics but that early-stage robotics companies typically treat as an afterthought because it doesn’t drive investment. Chinese manufacturer Unitree is really the one company with humanoid robots that are available enough and affordable enough for clever people to perform a security audit on them. And to the surprise of no one, Unitree’s robots had serious vulnerabilities, which as of yet have not all been fixed.

3. Amazon’s Vulcan Robots Now Stow Items Faster Than Humans [Amazon]

The thing I appreciate about the folks at Amazon Robotics is how relentless they are in finding creative solutions for problems at scale. Amazon simply doesn’t have time to mess around, and they’re designing robots to do what robots do best: specific repetitive tasks in structured environments. In the current climate of robotics hype, it’s refreshing, honestly.

4. Large Behavior Models Are Helping Atlas Get to Work [Boston Dynamics]

Did I mention that humanoid robots are hard? Whether or not anyone can deliver on the promises being made about them (and personally, I’m leaning more and more strongly toward not), progress is being made toward humanoids that are much more capable and versatile than they ever have been. The collaboration between Toyota Research and Boston Dynamics on large behavior models is just one example of how far we’ve come, and how far we still have to go.

5. iRobot’s Cofounder Weighs In on Company’s Bankruptcy [Lindsey Nicholson/Universal Images Group/Getty Images]

My least favorite story to write happened right at the end of the year—iRobot filed for bankruptcy. This was not a total surprise; regulators shutting down an acquisition by Amazon in 2024 essentially gutted the company, and it’s been limping along toward the inevitable since then. Right after the news was announced, we spoke with iRobot co-founder and ex-CEO Colin Angle, who had plenty to share about where things went wrong, and what we can learn from it.

6. How Dairy Robots Are Changing Work for Cows (and Farmers) [Evan Ackerman]

My favorite story of 2025 was as much about cows as it was about robots. I was astonished to learn just how many fully autonomous robots are hard at work on dairy farms around the world, and utterly delighted to also learn that these robots are actively improving the lives of both dairy farmers and the dairy cows themselves. Dairy farming is endless hard work, but thanks to these robots, small family farms are able to keep themselves sustainable (and sane). Everybody wins, thanks to the robots.
- Drones Compete to Spot and Extinguish Brushfires, by Robb Mandelbaum on December 24, 2025 at 1:00 pm
To the untrained eye, it did not look like a particularly complicated mission. A large black quadcopter drone, more than two meters spanning the propeller tips, sat parked on the grass. Nestled between the legs of its landing gear was a red balloon filled with water. Not far away, on a concrete pad, a stack of wood pallets was ablaze, the flames whipping around in a heavy wind. A student at the University of Maryland (UMD) would fly the Alta X drone all of about 25 meters to the fire. There it would drop the water balloon to extinguish the flames.

In the XPrize contest, drones must distinguish between dangerous fires—like this one—and legitimate campfires. [Jayme Thornton]

But, of course, it was complicated. The drone needed to hover at about 13.5 meters overhead, and the balloon was configured to detonate at a specific point in midair to ensure optimal water dispersal, as calculated by UMD’s Department of Fire Protection Engineering. On a signal, Andrés Felipe Rivas Bolivar, a doctoral student in aerospace engineering, launched the Alta X toward the fire. As a second, smaller drone outfitted with a thermal camera surveyed the scene from above, Rivas maneuvered the balloon-laden drone to the proper position. After about a half minute, he released the water bomb...and the balloon plummeted to the ground just wide of the platform, bursting with a thwaaaap.

This article is part of our special report Top Tech 2026.

On this warm but blustery day in mid-October, a team of about 20 UMD students and professors were gathered at a fire and rescue training center in La Plata, Md., to demonstrate the building blocks of what could be the future of wildfire fighting. They called their team Crossfire. Their guests were a handful of officials from the XPrize Foundation, which has organized a pair of competitions to vastly speed up wildfire detection and suppression. Twelve other teams are competing with Crossfire in the semifinals for the autonomous wildfire-suppression track of the competition. In the final round, to be held in June 2026, five of those teams will have to find a fire within 1,000 square kilometers of what XPrize calls “environmentally challenging” terrain and then navigate to and extinguish it, all within 10 minutes. The winner collects a US $3.5 million purse—and, hopefully, the world’s wildfire-fighting armies get a powerful new weapon for their arsenals.

The Wildfire Problem

Wildfires are growing more severe and affecting more people worldwide. The November 2018 Camp Fire that burned down 620 square kilometers of Northern California, including most of the town of Paradise, was the most deadly and destructive in the state’s recorded history, and it sent Pacific Gas and Electric, the giant utility responsible for starting the fire, into bankruptcy. XPrize had long been based in the Los Angeles area, so that catastrophe was undoubtedly on the minds of its staffers when they formulated the competition in 2019. “This was just something that was really personal and close to a lot of the individuals at the organization,” says Andrea Santy, program director for the wildfire competition. XPrize eventually organized a separate track of the competition to award $3.5 million for detecting small fires with satellites.
Andrea Santy, one of the program managers from XPrize in charge of the wildfire competition, looks on during Crossfire’s trials. [Jayme Thornton]

Santy says XPrize’s competition designers met with more than 100 experts in the field, including fire scientists, agency officials, and technologists—“all the experts that you would want at the table were at the table.” Where their views aligned, Santy says, XPrize researchers detected the “core problems.” One of the most important was response time. In the best case, an hour can often pass between when a fire is first detected and when it’s extinguished. XPrize aims to shrink that drastically. An additional $1 million will go to the teams that (per the rules) “successfully demonstrate accurate, precise, and rapid detection.”

Arnaud Trouvé, chair of UMD’s Department of Fire Protection Engineering, thinks even the 10-minute limit may not be good enough. “On a red flag day with high-wind conditions, a fire that starts is going to be taking a big size within a matter of tens of seconds,” he said as we waited for the Alta X to try again. “So even the 10 minutes you have to go do something will be too slow.” Whatever comes from the XPrize, he says, will be adopted, but more likely in developed areas, where fires spread more slowly and could be extinguished early on, when firefighters are often busy evacuating residents.

In any event, the time limit pointed most teams—and all the teams to make the semifinals—toward drones. Firefighters have worked, or tried to work, given bureaucratic and other hurdles, with drones for years, but mainly for reconnaissance, says Bob Roper, a senior wildfire advisor for the Western Fire Chiefs Association. Many of the hurdles around using drones have been cleared, but no drone exists yet that can carry enough suppressant to be useful on its own, says Roper. (The smallest helicopter bucket carries 270 liters.) Roper says government-funded fire agencies seldom “have available unrestricted dollars to be able to develop something that’s new.” By sprinkling startups and universities with research funding, the XPrize is poised to make, he says, “a quantum leap difference.”

Team Crossfire

Word of the XPrize wildfire competition reached Trouvé’s desk soon after it launched in April 2023. He joined forces with colleagues in aerospace and mechanical engineering and with xFoundry, a new organization that uses competitions to spur entrepreneurship. (xFoundry’s founder, Amir Ansari, happened to be one of the sponsors of the first XPrize in 1994; his sister-in-law Anousheh is the CEO of the XPrize Foundation.) It didn’t take long to sketch out most of what they brought to La Plata.

The University of Maryland’s Yaseen Taha [right] pilots a spotter drone while Brian Tran looks on. [Jayme Thornton]

The day began with tests of the detection drone. Its dock opened like flower petals unfolding and the drone, a much smaller quadcopter than the Alta X, shot up into the air. Using a handheld controller, undergraduate Yaseen Taha flew it to a point 35 meters above the burning pallets. Like all the technology Crossfire has deployed, the scout was an off-the-shelf model, made by the Chinese manufacturer DJI. It came with a lot of important features already programmed in, including obstacle avoidance and lidar, and cost just $25,000, according to xFoundry head of products and ventures Phillip Alvarez. “We get a really nice, well-polished system for a pretty low price here, and then we can spend the rest of development on solving the hard stuff,” he said.
In total, Crossfire has spent around $300,000, most of it raised from UMD donors, he added.

xFoundry’s Philip Alvarez stands behind the Crossfire team’s drone that’s used for detecting wildfires. [Jayme Thornton]

The hard stuff, some of it anyway, was visible on a large display monitor showing the feeds from the drone’s two cameras. On the right was the infrared feed; on it, a red square labeled “fire” bracketed the burning pallets. A smaller red fire square appeared up and to the right of this; this was a pile of glowing embers in a bin not far away. These were meant to represent a campfire—the contest rules required systems to distinguish between potentially destructive conflagrations and “decoy fires” that don’t pose a threat. Crossfire’s system made those distinctions based on the drone’s color video feed. That feed runs through an open-source deep learning model known as YOLO (“You Only Look Once”), which recognizes images.

One of Crossfire’s drones scans the terrain and distinguishes between a burning pile of pallets and a small fire in a bin. [Robb Mandelbaum]

To train it, UMD students fed 40,000 photographs of fires to the model—manually identifying the blazes in about 1,200 of these. The result was that when the program processed the color feed from the drone, it concluded that the pallets were a fire, marked on the screen in a blue box, and ignored the bin. Now both camera feeds indicated a blaze in the same place, and the monitor threw up a warning in red: “FIRE DETECTED.” (A minimal sketch of this kind of two-feed check appears at the end of this article.) As turkey vultures looked on from high above, the drone identified the fire again from a higher altitude, then again with the cameras pointed at a different angle, and finally flew a preprogrammed back-and-forth route through the air that looks like a lawnmower’s path.

An electric Ford F-150 truck serves as charger and home base for Crossfire’s system. [Jayme Thornton]

An electric Ford F-150 pickup, front trunk open, sat off to the side powering a bank of computers that operate the two drones. In the field, it will also process feeds from cameras mounted on poles throughout the forest—an early detection system that will trigger the scouting drone. This was designed by Alvarez, who happens to have a Ph.D. in biophysics, using an even newer version of image-reading AI developed just last year.

All of the teams, Santy says, have proposed something broadly similar: sensors and cameras on the ground or on one or more drones, or both, and AI interpreting the data. How teams get to the fire has been driven by regulation—the FAA has restrictions on drones weighing more than 25 kilograms (55 pounds), as well as on autonomous systems dropping payloads, which is why Rivas had to pilot the Alta X. “Some are looking at how we can address the problem within the current regulations, so they’re trying to stay within the 55 pounds,” says Santy. Others are designing systems that ultimately could be deployed only under new regulations. That primarily comes down to either using a swarm of smaller drones or one heavy-lift drone. Teams that fly heavy in the finals will have to get FAA approval for the contest, just as Crossfire would need it to operate the Alta X autonomously.

Crossfire’s fire-suppression drone flies toward a fire carrying a balloon full of water. [Jayme Thornton]

Curiously, the XPrize appears not to have spurred much innovation in actually putting out a fire. Most teams are using water, though they’re dropping it in a variety of different ways. It’s a work in progress, says Santy. “Teams have been thinking very hard about what works under challenging conditions” like wind, drone movement, and proximity to the fire.

The University of Maryland’s Dahlia Andres works on the Crossfire team’s fire-suppression drone. [Jayme Thornton]

Crossfire’s approach of detonating water balloons in midair—which has yet to be patented so the team would not describe it in detail—could eventually change the calculation about how much suppressant is needed to fight fires. Typically, aircraft flying at high altitude release a lot of water, which, says Trouvé, mostly misses the burning biomass. “Releasing the water at low elevations and directly above the burning biomass requires much less water,” he says.

With a new balloon installed on the Alta X, the team attempted to attack the fire a second time. This time, Rivas spent several minutes maneuvering the drone to get it in place before dropping the balloon, which appeared to partially detonate, spewing water as it fell. The balloon didn’t completely burst until it hit the platform, spraying water all over and creating a huge puff of steam. But when the smoke cleared, the fire still burned. Crossfire’s detonators, it turned out, were rated for warmer weather than this October day. “We’ve tested this probably 20 different times, and 20 different times it’s been successful,” Alvarez said ruefully.

Crossfire’s drone carries a water balloon skyward, finds the fire, and drops the balloon. [Jayme Thornton]

But the third attempt, several hours later, was the charm. Rivas whisked the Alta X over the fire. Taha, on the other side of the fire, checked its position and motioned for release. The balloon exploded a few meters below the drone, and a shower of water blanketed the fire. The thermal camera on the observation drone confirmed the fire had been extinguished. Muted “yays” and a smattering of applause broke out.

Crossfire’s Abdullah Shamsan, Derek Paley, Matthew Ayd, and Joshua Gaus [from left] monitor a drone flight. [Jayme Thornton]

Crossfire is already looking beyond the competition, regardless of whether it makes it to the finals in 2026. Along with Taha, aerospace engineering professor Derek Paley has talked to some 40 potential customers—mainly fire departments and government agencies—for the system Crossfire is developing. He’s currently uncertain whether there are enough organizations willing to adopt the technology to make it commercially viable. So far, he says, “it’s a little bit of an uphill battle, but we’re hoping with the visibility brought to the problem by XPrize” and the momentum of being a finalist—and, better still, some prize money in hand—“we’ll have enough to have a compelling business model.”

Roper, of the Western Fire Chiefs Association, acknowledges that “political considerations” around existing fleets of crewed aircraft will challenge the transition to drones, but he says that these can gain a foothold by operating when and where crewed aircraft can’t, at night, for example. Still, it will take multiple companies commercializing the technology to prod fire departments to purchase drones. Even then, he says, “it’s probably going to have to be adopted either at the federal or the state level first and then there’s a trickle-down effect to the local fire departments.”

If not, Paley says, “our tech is applicable to law enforcement, and other aspects of public safety. It’s just a question of, are we starting a wildfire company, or are we starting a robotics company.”
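The detection pipeline described above, a color video feed run through a YOLO object detector and cross-checked against the thermal camera before a fire is declared, can be sketched in a few dozen lines. The example below is a minimal illustration using the open-source Ultralytics YOLO package and OpenCV; the weights file name (fire_yolov8.pt), the “fire” class label, the thresholds, and the agreement logic are assumptions made for this sketch, not Crossfire’s actual code.

```python
# Illustrative sketch only: a YOLO-based fire detector cross-checked against a
# thermal hotspot mask, in the spirit of Crossfire's dual-feed approach.
# "fire_yolov8.pt" is a hypothetical custom-trained weights file; the class
# name, thresholds, and cross-check logic are assumptions, not the team's code.
import cv2
import numpy as np
from ultralytics import YOLO  # pip install ultralytics opencv-python

model = YOLO("fire_yolov8.pt")   # hypothetical weights fine-tuned on fire photos
FIRE_CLASS = "fire"              # assumed label used during training
HOT_THRESHOLD = 200              # assumed 8-bit thermal intensity cutoff


def thermal_hotspots(thermal_frame: np.ndarray) -> np.ndarray:
    """Binary mask of pixels hotter than the threshold in an 8-bit thermal image."""
    _, mask = cv2.threshold(thermal_frame, HOT_THRESHOLD, 255, cv2.THRESH_BINARY)
    return mask


def detect_fire(color_frame: np.ndarray, thermal_frame: np.ndarray) -> bool:
    """Declare a fire only when the color detector and the thermal feed agree."""
    results = model(color_frame, verbose=False)[0]
    mask = thermal_hotspots(thermal_frame)
    for box in results.boxes:
        if model.names[int(box.cls)] != FIRE_CLASS or float(box.conf) < 0.5:
            continue
        x1, y1, x2, y2 = map(int, box.xyxy[0])
        # Require at least one hot thermal pixel inside the detected box.
        if mask[y1:y2, x1:x2].any():
            return True
    return False


if __name__ == "__main__":
    color_cap = cv2.VideoCapture("color_feed.mp4")      # placeholder video sources
    thermal_cap = cv2.VideoCapture("thermal_feed.mp4")
    while True:
        ok1, color = color_cap.read()
        ok2, thermal = thermal_cap.read()
        if not (ok1 and ok2):
            break
        gray_thermal = cv2.cvtColor(thermal, cv2.COLOR_BGR2GRAY)
        if detect_fire(color, gray_thermal):
            print("FIRE DETECTED")
            break
```

As described above, Crossfire also re-confirms a detection from a higher altitude and a different camera angle before trusting it; that kind of repeated check is straightforward to layer on top of a loop like this one.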
- Video Friday: Happy Robot Holidays, by Evan Ackerman on December 19, 2025 at 4:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Happy Holidays from FZI Living Lab![ FZI ]Thanks, Georg!Happy Holidays from Norlab!I should get a poutine...[ Norlab ]Happy Holidays from Fraunhofer IOSB![ Fraunhofer ]Thanks, Janko!Happy Holidays from HEBI Robotics![ HEBI Robotics ]Thanks, Trevor!Happy Holidays from the Learning Systems and Robotics Lab![ Learning Systems and Robotics Lab ]Happy Holidays from Toyota Research Institute![ Toyota Research Institute ]Happy Holidays from Clearpath Robotics![ Clearpath Robotics ]Happy AI Holidays from Robotnik![ Robotnik ]Happy AI Holidays from ABB Robotics![ ABB Robotics ]With its unique modular configuration, TRON 2 lets you freely configure dual-arm, bipedal, or wheeled setups to fit your mission.[ LimX Dynamics ]Thanks, Jinyan!I love this robot, but can someone please explain why what happens at 2:00 makes me physically uncomfortable?[ Paper ]Thanks, Ayato!This robot, REWW-ARM, is a remote wire-driven mobile robot that separates and excludes electronics from the mobile part, so that the mobile robot can operate in harsh environments. A novel transmission mechanism enables efficient and long-distance electronics-free power transmission, closed-loop control that estimates the distal state from wire. It demonstrated locomotion and manipulation on land and underwater.[ JSK Lab ]Thanks, Takahiro!DEEP Robotics has deployed China’s first robot dog patrol team for forest fire protection in the West Lake area. Powered by embodied AI, these quadruped robots support early detection, patrol, and risk monitoring—using technology to protect nature and strengthen emergency response.[ DEEP Robotics ]In this video we show how we trained our robot to fold a towel from start to finish. Folding a towel might seem simple, but for a robot it means solving perception, planning, and dexterous manipulation all at once, especially when dealing with soft, deformable fabric. We walk through how the system sees the towel, identifies key features, and executes each fold autonomously. [ Kinisi Robotics ]This may be the first humanoid app store, but it’s far from the first app store for robots. Problem is, for an app store to gain traction, there needs to be a platform out there that people will buy for its core functionality first.[ Unitree ]You can tell that this isn’t U.S. government–funded research because it involves a robot fetching drinks.[ Flexiv ]This video shows the Perseverance Mars Rover’s point of view during a record-breaking drive that occurred June 19, 2025, the 1,540th Martian day, or sol, of the mission. The Perseverance rover was traveling northbound and covered 1,350.7 feet (411.7 meters) on that sol, over the course of about 4 hours and 24 minutes. This distance eclipsed its previous record of distance traveled in a single sol: 1,140.7 feet (347.7 meters), which was achieved on April 3, 2023 (Sol 753). [ NASA ]Automation is what’s helped keep lock maker Wilson Bohannan based in America for more than 150 years while all of its competitors relocated overseas. Using two high-speed and high-precision FANUC M-10 series robots, Acme developed a simple but highly sophisticated system that uses innovative end-of-arm tooling to accommodate 18 different styles of padlocks. 
As a result of Acme’s new system using FANUC robots, Wilson Bohannan production rocketed from 1,500-1,800 locks finished per eight-hour shift to more than 5,000. [ Fanuc ]

In this conversation, Zack Jackowski, general manager and vice president, Atlas, and Alberto Rodriguez, director of robot behavior, sit down to discuss the path to generalist humanoid robots working at scale and how we approach research & development to both push the boundaries of the industry and deliver valuable applications. [ Boston Dynamics ]
- iRobot’s Cofounder Weighs In on Company’s Bankruptcy, by Evan Ackerman on December 16, 2025 at 8:12 pm
On Sunday evening, the legendary robotics company iRobot, manufacturer of the Roomba robotic vacuum, filed for bankruptcy. The company will be handing over all of its assets to its Chinese manufacturing partner, Picea. According to iRobot’s press release, “this agreement represents a critical step toward strengthening iRobot’s financial foundation and positioning the Company for long-term growth and innovation,” which sounds like the sort of thing that you put in a press release when you’re trying your best to put a positive spin on really, really bad news.

This whole situation started back in August 2022, when iRobot announced a US $1.7 billion acquisition by Amazon. Amazon’s interest was obvious—some questionable hardware decisions had left the company struggling to enter the home robotics market. And iRobot was at a point where it needed a new strategy to keep ahead of lower-cost (and increasingly innovative) home robots from China.

Some folks were skeptical of this acquisition, and admittedly, I was one of them. My primary worry was that iRobot would get swallowed up and effectively cease to exist, which tends to happen with acquisitions like these, but regulators in the United States had much more pointed concerns: namely, that Amazon would leverage its marketplace power to restrict competition. The European Commission expressed similar objections.

By late January 2024, the deal had fallen through; iRobot laid off a third of its staff, suspended research and development, and CEO and cofounder Colin Angle left the company. Since then, iRobot has seemed resigned to its fate, coasting along on a few lackluster product announcements and not much else, and so Sunday’s announcement of bankruptcy was a surprise to no one—perhaps least of all to Angle.

iRobot’s Bankruptcy and Amazon Deal Collapse

“iRobot’s bankruptcy filing was really just a public-facing outcome of the tragedy that happened a year and a half ago,” Angle told IEEE Spectrum on Monday. “Today sucks, but I’ve already mourned. I mourned when the deal with Amazon got blocked for all the wrong reasons.” Angle points out that by the early 2020s, iRobot was no longer monopolizing the robot-vacuum market. This was especially true in Europe, where iRobot’s market share was 12 percent and decreasing. But from Angle’s perspective, regulators were more focused on making a point about Big Tech than they were on the actual merits and risks of the merger.

Cofounder Colin Angle says that iRobot’s bankruptcy filing was unsurprising after a failed acquisition by Amazon a year and a half ago. [Charles Krupa/AP]

“We were roadkilled in a larger agenda,” Angle says. “And this kind of regulation is incredibly destructive to the innovation economy. The whole concept of starting a tech company and having it acquired by a bigger tech company is far and away the most common positive outcome. For that to be taken away is not a good thing.” And for iRobot, it was fatal.

A common criticism of iRobot even before the attempted Amazon merger is that the company was simply being out-innovated in the robot-vacuum space, and Angle doesn’t necessarily disagree. “By 2020, China had become the largest market in the world for robot vacuums, and Chinese robotics companies with government support were investing two or three times as much as iRobot was in R&D. We simply didn’t have the capital to move as quickly as we wanted to. In order for iRobot to continue to innovate and lead the industry, we needed to do so as part of a larger entity, and Amazon was very aligned with our vision for the home.”

This situation is not unique to iRobot, and there is significant concern in robotics about how companies can effectively compete against the massive advantage that China has in the production of low-cost hardware. In some sense, what happened to iRobot is an early symptom of what Angle (and others) see as a fundamental problem with robotics in the United States: lack of government support. In China, long-term government support for robotics and embodied AI (in the form of both policy and direct investment) can be found across industry and academia, something that neither the United States nor the European Union has been able to match. “Robotics is in a global competition against some very fearsome competitors,” Angle says. “We have to decide whether we want to support our innovation economy. And if the answer is no, then the innovation economy goes elsewhere.”

The consequence of companies like iRobot losing this competition can be more than just bankruptcy. In iRobot’s case, a Chinese company now owns iRobot’s intellectual property and app infrastructure, which gives it access to data from millions of highly sensorized autonomous mobile robots in homes across the world. I asked Angle whether or not Roomba owners should be concerned about this. “When I was running the company, we talked a lot about this, and put a lot of effort into privacy and security,” he says. “This was fundamental to Roomba’s design. Now, I can’t speak to what they’ll prioritize.”

While Angle has moved on from iRobot, and has since cofounded a more-mysterious-than-we’d-like company called Familiar Machines and Magic, he still feels strongly that what has happened to iRobot should be a warning to both robotics companies and policymakers. “Make no mistake: China is good at robots. So we need to play this hard. There’s a lot to learn from what we did at iRobot, and a lot of ways to do it better.”

On a personal note, I’m choosing to remember the iRobot that was—not just the company that built a robot vacuum out of nothing and conquered the world with it for nearly two decades, but also the company that built the PackBot to save lives, as well as all of these other crazy robots. I’m not sure there’s ever been a company quite like iRobot, and there may never be again. It will be missed.
- Video Friday: Robot Dog Shows Off Its Muscles, by Evan Ackerman on December 12, 2025 at 5:00 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Suzumori Endo Lab, Science Tokyo has developed a musculoskeletal dog robot using thin McKibben muscles. This robot mimics the flexible “hammock-like” shoulder structure to investigate the biomechanical functions of dog musculoskeletal systems.[ Suzumori Endo Robotics Laboratory ]HOLEY SNAILBOT!!![ Freeform Robotics ]We present a system that transforms speech into physical objects using 3D generative AI and discrete robotic assembly. By leveraging natural language, the system makes design and manufacturing more accessible to people without expertise in 3D modeling or robotic programming.[ MIT ]Meet the next generation of edge AI. A fully self-contained vision system built for robotics, automation, and real-world intelligence. Watch how OAK 4 brings compute, sensing, and 3D perception together in one device.[ Luxonis ]Thanks, Max!Inspired by vines’ twisty tenacity, engineers at MIT and Stanford University have developed a robotic gripper that can snake around and lift a variety of objects, including a glass vase and a watermelon, offering a gentler approach compared to conventional gripper designs. A larger version of the robo-tendrils can also safely lift a human out of bed.[ MIT ]The paper introduces an automatic limb attachment system using soft actuated straps and a magnet-hook latch for wearable robots. It enables fast, secure, and comfortable self-donning across various arm sizes, supporting clinical-level loads and precise pressure control.[ Paper ]Thanks, Bram!Autonomous driving is the ultimate challenge for AI in the physical world. At Waymo, we’re solving it by prioritizing demonstrably safe AI, where safety is central to how we engineer our models and AI ecosystem from the ground up.[ Waymo ]Built by Texas A&M engineering students, this AI-powered robotic dog is reimagining how robots operate in disaster zones. Designed to climb through rubble, avoid hazards, and make autonomous decisions in real time, the robot uses a custom multimodal large language model (MLLM) combined with visual memory and voice commands to see, remember, and plan its next move like a first responder.[ Texas A&M ]So far, aerial microrobots have only been able to fly slowly along smooth trajectories, far from the swift, agile flight of real insects—until now. MIT researchers have demonstrated aerial microrobots that can fly with speed and agility comparable to their biological counterparts. A collaborative team designed a new AI-based controller for the robotic bug that enabled it to follow gymnastic flight paths, such as executing continuous body flips.[ MIT ]In this audio clip generated by data from the SuperCam microphone aboard NASA’s Perseverance, the sound of an electrical discharge can be heard as a Martian dust devil flies over the Mars rover. The recording was collected on Oct. 12, 2024, the 1,296th Martian day, or sol, of Perseverance’s mission on the Red Planet.[ NASA Jet Propulsion Laboratory ]In this episode, we open the archives on host Hannah Fry’s visit to our California robotics lab. Filmed earlier this year, Hannah interacts with a new set of robots—those that don’t just see, but think, plan, and do. 
Watch as the team goes behind the scenes to test the limits of generalization, challenging robots to handle unseen objects autonomously.[ Google DeepMind ]This GRASP on Robotics Seminar is by Parastoo Abtahi from Princeton University, on “When Robots Disappear–From Haptic Illusions in VR to Object-Oriented Interactions in AR.”Advances in audiovisual rendering have led to the commercialization of virtual reality (VR); however, haptic technology has not kept up with these advances. While a variety of robotic systems aim to address this gap by simulating the sensation of touch, many hardware limitations make realistic touch interactions in VR challenging. In my research, I explore how, by understanding human perception through the lens of sensorimotor control theory, we can design interactions that not only overcome the current limitations of robotic hardware for VR but also extend our abilities beyond what is possible in the physical world.In the first part of this talk, I will present my work on redirection illusions that leverage the limits of human perception to improve the perceived performance of encountered-type haptic devices in VR, such as the position accuracy of drones and the resolution of shape displays. In the second part, I will share how we apply these illusory interactions to physical spaces and use augmented reality (AR) to facilitate situated and bidirectional human-robot communication, bridging users’ mental models and robotic representations.[ University of Pennsylvania GRASP Laboratory ]
- Ghost Robotics’ Arm Brings Manipulation to Military Quadrupeds, by Evan Ackerman on December 11, 2025 at 3:00 pm
Ghost Robotics is today announcing a major upgrade for their Vision 60 quadruped: an arm. Ghost, a company that originated at the GRASP Lab at the University of Pennsylvania, specializes in exceptionally rugged quadrupeds, and while many of its customers use its robots for public safety and disaster relief, it also provides robots to the U.S. military, which has very specific needs when it comes to keeping humans out of danger.In that context, it’s not unreasonable to assume that Ghost’s robots may sometimes be used to carry weapons, and despite the proliferation of robots in many roles in the Ukraine war, the idea of a legged robot carrying a weapon is not a comfortable one for many people. IEEE Spectrum spoke with Ghost co-founder and current CEO Gavin Kenneally to learn more about the new arm, and to get his perspective on selling robots to the military. The Vision 60’s new arm has six degrees of freedom. Ghost RoboticsRobots for the MilitaryGhost Robotics initially made a name for itself with its very impressive early work with the Minitaur direct-drive quadruped in 2016. The company also made headlines in late 2021, when a now-deleted post on Twitter (now X) went viral because it included a photograph of one of Ghost’s Vision 60 quadrupeds with a rifle mounted on its back.That picture resulted in a very strong reaction, although as IEEE Spectrum reported at the time, robots with guns affixed to them wasn’t new: To mention one early example, the U.S. military had already deployed weapons on mobile robots in Iraq in 2007. And while several legged robot companies pledged in 2022 not to weaponize their general-purpose robots, the Chinese military in 2024 displayed quadrupeds from Unitree equipped with guns. (Unitree, based in China, was one of the signers of the 2022 pledge.)The issue of weaponized robots goes far beyond Ghost Robotics, and far beyond robots with legs. We’ve covered both the practical and ethical perspectives on this extensively at IEEE Spectrum, and the intensity of the debates shows that there is no easy answer. But to summarize one important point made by some ethicists, some military experts, and Ghost Robotics itself: Robots are replaceable; humans are not. “Customers use our robots to keep people out of harm’s way,” Kenneally tells Spectrum.It’s also worth pointing out that even the companies who signed the pledge not to weaponize their general-purpose robots acknowledge that military robots exist, and are accepting of that, provided that such robots are used under existing legal doctrines and operate within those safeguards—and that what constraints should or should not be imposed on these kinds of robots is best decided by policymakers rather than industry.This is essentially Ghost Robotics’ position as well, says Kenneally. “We sell our robots to U.S. and allied governments, and as part of that, the robots are used in defense applications where they will sometimes be weaponized. What’s most critical to us is that the decisions about how to use these robots are happening systematically and ethically at the government policy level.”To some extent, these decisions are already being made within the U.S. government. Department of Defense Directive 3000.09, “Autonomy in Weapon Systems,” lays out the responsibilities and limitations for how autonomous or human-directed robotics weapons systems should be developed and deployed, including requirements for human use-of-force judgments. 
At least in the United States, this directive implies that there are rules and accountability for robotic weapons. Vision 60’s Versatile Arm Capabilities: Ghost sees its Vision 60 quadruped as a system that its trusted customers can use as they see fit, and the manipulator enables many additional capabilities. “The primary purpose of the robot has been as a sensor platform,” Kenneally says, “but sometimes there are doors in the way, or objects that need to be moved, or you might want the robot to take a sample. So the ability to do all of that mobile manipulation has been hugely valuable for our customers.” As it turns out, arms are good for more than manipulation. “One thing that’s been very interesting is that our customers have been using the arm as a sensor boom, which is something that we hadn’t anticipated,” says Kenneally. Ghost’s robot has plenty of cameras, but they’re mostly at the viewpoint of a moderately sized dog. The new arm offers a more humanlike vantage and a way to peek around corners or over things without exposing the whole robot. Ghost was not particularly interested in building its own arm and tried off-the-shelf options to get the manipulation bit working. And they did get the manipulation working; what didn’t work were any of those arms after the 50-kilogram robot rolled over on them. “We wanted to make sure that we could build an arm that could stand up to the same intense rigors of our customers’ operations that the rest of the robot can,” says Kenneally. “Morphologically, we actually consider the arm to be a fifth leg, so that the robot operates as a unified system for whole-body control.” The rest of the robot is exceptionally rugged, which is what makes it appealing to customers with unique needs, like special forces teams. Enough battery life for more than three hours of walking (or more than 20 hours on standby) isn’t bad, and the Vision 60 is sealed against sand and dust, and can survive complete submergence in shallow water. It can operate in extreme temperatures ranging from -40 °C to 55 °C, which has been a particular challenge for robots. And if you do manage to put it in a situation where it physically breaks one of its legs, it’s easy to swap in a spare in just a few minutes, even out in the field. The Vision 60 can open doors with high-level direction from a human operator. [Photo: Ghost Robotics] Quadruped Robot Competition From China: Despite Ghost quietly selling over a thousand quadrupeds to date, Kenneally is cautious about the near future for legged robots, as is anyone who has seriously considered buying one, because it’s impossible to ignore the option of just buying one from a Chinese company at about a tenth the cost of a quadruped from a company based in the U.S. or Europe. “China has identified legged robotics as a linchpin technology that they are strategically funding,” Kenneally says. “I think it’s an extremely serious threat in the long term, and we have to take these competitors very seriously despite their current shortcomings.” There is a technological moat, for now, but if the market for legged robots follows the same trajectory as the market for drones did, that moat will shrink drastically over the next few years. The United States is poised to ban consumer drone sales from Chinese manufacturer DJI, and banned DJI drone use by federal agencies in 2017. But it may be too late in some sense, as DJI’s global market share is something like 90 percent.
Meanwhile, Unitree may have already cornered somewhere around 70 percent of the global market for quadrupeds, despite the recent publication of exploits that allow the robots to send unauthorized data to China. In the United States, in particular, private-sector robotics funding is unpredictable at the best of times, and Kenneally argues that to compete with Chinese-subsidized robot-makers, American companies like Ghost, which produce these robots domestically, will need sustained U.S. government support, too. That doesn’t mean the government has to pick which companies will be the winners, but that it should find a way to support the U.S. robotics industry as a whole if it still wants to have a meaningful one. “The quadruped industry isn’t a science project anymore,” says Kenneally. “It’s matured, and quadruped robots are going to become extremely important in both commercial and government applications. But it’s only through continued innovation that we’ll be able to stay ahead.”
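Kenneally’s description earlier in this piece of the arm as a “fifth leg” is, at bottom, a whole-body control idea: the arm’s joints are appended to the same optimization that coordinates the legs, so one solver produces one command for the whole machine. The sketch below is a minimal, generic illustration of that idea using a damped least-squares step; the dimensions, task definitions, and numbers are assumptions for illustration only, not Ghost Robotics’ controller.

```python
# Toy sketch: treating an arm as an extra limb in a single whole-body
# least-squares step. All structure and numbers here are illustrative
# assumptions, not Ghost Robotics' actual control stack.
import numpy as np

def whole_body_step(task_jacobians, task_errors, damping=1e-2):
    """One damped least-squares update over the full joint vector.

    task_jacobians: list of (m_i x n) Jacobians (leg tasks, arm task, ...)
    task_errors:    list of length-m_i task-space errors
    Returns a joint-velocity command of length n for the unified system.
    """
    J = np.vstack(task_jacobians)          # stack every limb's task
    e = np.concatenate(task_errors)
    JJt = J @ J.T
    return J.T @ np.linalg.solve(JJt + damping * np.eye(JJt.shape[0]), e)

# Hypothetical dimensions: 12 leg joints + 6 arm joints = 18-DOF joint vector.
rng = np.random.default_rng(0)
J_legs = rng.standard_normal((12, 18))     # stance/balance tasks
J_arm = rng.standard_normal((6, 18))       # arm end-effector task
dq = whole_body_step([J_legs, J_arm], [np.zeros(12), np.ones(6) * 0.01])
print(dq.shape)                            # (18,): one command for all joints
```

The point of the sketch is only that the legs and the arm share one joint vector and one solve, which is what lets coordinated behaviors like door opening fall out of a single controller.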
- Video Friday: Biorobotics Turns Lobster Tails Into Gripper, by Evan Ackerman on December 5, 2025 at 5:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.ICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! EPFL scientists have integrated discarded crustacean shells into robotic devices, leveraging the strength and flexibility of natural materials for robotic applications.[ EPFL ]Finally, a good humanoid robot demo! Although having said that, I never trust video demos where it works really well once, and then just pretty well every other time.[ LimX Dynamics ]Thanks, Jinyan! I understand how these structures work, I really do. But watching something rigid extrude itself from a flexible reel will always seem a little magical.[ AAAS ]Thanks, Kyujin! I’m not sure what “industrial grade” actually means, but I want robots to be “automotive grade,” where they’ll easily operate for six months or a year without any maintenance at all.[ Pudu Robotics ]Thanks, Mandy! When you start to suspect that your robotic EV charging solution costs more than your car.[ Flexiv ]Yeah, uh, if the application for this humanoid is actually making robot parts with a hammer and anvil, then I’d be impressed.[ EngineAI ]Researchers at Columbia Engineering have designed a robot that can learn a humanlike sense of neatness. The researchers taught the system by showing it millions of examples rather than giving it specific instructions. The result is a model that can look at a cluttered tabletop and rearrange scattered objects in an orderly fashion.[ Paper ]Why haven’t we seen this sort of thing in humanoid robotics videos yet?[ HUCEBOT ]While I definitely appreciate in-the-field testing, it’s also worth asking to what extent your robot is actually being challenged by the in-the-field field that you’ve chosen.[ DEEP Robotics ]Introducing HMND 01 Alpha Bipedal—autonomous, adaptive, designed for real-world impact. Built in five months, walking stably after 48 hours of training.[ Humanoid ]Unitree says that “this is to validate the overall reliability of the robot,” but I really have to wonder how useful this kind of reliability validation actually is.[ Unitree ]This University of Pennsylvania GRASP on Robotics seminar is by Jie Tan from Google DeepMind, on “Gemini Robotics: Bringing AI into the Physical World.” Recent advancements in large multimodal models have led to the emergence of remarkable generalist capabilities in digital domains, yet their translation to physical agents such as robots remains a significant challenge. In this talk, I will present Gemini Robotics, an advanced Vision-Language-Action (VLA) generalist model capable of directly controlling robots. Furthermore, I will discuss the challenges, learnings, and future research directions on robot foundation models.[ University of Pennsylvania GRASP Laboratory ]
- MIT’s AI Robotics Lab Director Is Building People-Centered Robots, by Willie D. Jones on December 3, 2025 at 7:00 pm
Daniela Rus has spent her career breaking barriers—scientific, social, and material—in her quest to build machines that amplify rather than replace human capability. She made robotics her life’s work, she says, because she understood it was a way to expand the possibilities of computing while enhancing human capabilities. “I like to think of robotics as a way to give people superpowers,” Rus says. “Machines can help us reach farther, think faster, and live fuller lives.” [Daniela Rus. Employer: MIT. Job title: Professor of electrical and computer engineering and computer science; director of the MIT Computer Science and Artificial Intelligence Laboratory. Member grade: Fellow. Alma maters: University of Iowa, in Iowa City; Cornell.] Her dual missions, she says, are to make technology humane and to make the most of the opportunities afforded by life in the United States. The two goals have fueled her journey from a childhood living under a dictatorship in Romania to the forefront of global robotics research. Rus, who is director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), is the recipient of this year’s IEEE Edison Medal, which recognizes her for “sustained leadership and pioneering contributions in modern robotics.” An IEEE Fellow, she describes the recognition as a responsibility to further her work and mentor the next generation of roboticists entering the field. The Edison Medal is the latest in a string of honors she has received. In 2017 she won an Engelberger Robotics Award from the Robotic Industries Association. The following year, she was honored with the Pioneer in Robotics and Automation Award by the IEEE Robotics and Automation Society. The society recognized her again in 2023 with its IEEE Robotics and Automation Technical Field Award. From Romania to Iowa: Rus was born in Cluj-Napoca, Romania, during the rule of dictator Nicolae Ceausescu. Her early life unfolded in a world defined by scarcity—rationed food, intermittent electricity, and a limited ability to move up or out. But she recalls that, amid the stifling insufficiencies, she was surrounded by an irrepressible warmth and intellectual curiosity—even when she was making locomotive screws in a state-run factory as part of her school’s curriculum. “Life was hard,” she says, “but we had great teachers and strong communities. As a child, you adapt to whatever is around you.” Her father, Teodor, was a computer scientist and professor, and her mother, Elena, was a physicist. In 1982, when Rus was 19, her father emigrated to the United States to join the faculty at the University of Iowa, in Iowa City. It was an act of courage and conviction. Within a year, Daniela and her mother joined him there. “He wanted the freedom to think, to publish, to explore ideas,” Rus says. “And I reaped the benefits of being free from the limitations of our homeland.” America’s open horizons were intoxicating, she says. A lecture that changed everything: Rus decided to pursue a degree at her father’s university, where her life changed direction, she says. One afternoon, John Hopcroft—a Turing Award–winning Cornell computer scientist renowned for his work on algorithms and data structures—gave a talk on campus. His message was simple but electrifying, Rus says: Classical computer science had been solved. The next frontier, Hopcroft declared, was computations that interact with the messy physical world. For Rus, the idea was a revelation. “It was as if a door had opened,” she says.
“I realized the future of computing wasn’t just about logic and code; it was about how machines can perceive, move, and help us in the real world.” After the lecture, she introduced herself to Hopcroft and told him she wanted to learn from him. Not long after earning her bachelor’s degree in computer science and mathematics in 1985, she applied to get a master’s degree at Cornell, where Hopcroft became her graduate advisor. Rus developed algorithms there for dexterous robotic manipulation—teaching machines to grasp and move objects with precision. She earned her master’s in computer science in 1990, then stayed on at Cornell to work toward a doctorate. In 1993 she earned her Ph.D. in computer science, then took a position as an assistant professor of computer science at Dartmouth College, in Hanover, N.H. She founded the college’s robotics laboratory and expanded her work into distributed robotics. She developed teams of small robots that cooperated to perform tasks such as ensuring products in warehouses are correctly gathered to fulfill orders, packaged safely, and routed to their destinations efficiently. Despite a lack of traditional machine shop facilities for fabrication on the Hanover campus, Rus found a way. She pioneered the use of 3D printing to rapidly prototype and build robots. In 2003 she left Dartmouth to become a professor in the electrical engineering and computer science department at MIT. The robotics lab she created at Dartmouth moved with her to MIT and became known as the Distributed Robotics Laboratory (DRL). In 2012 she was named director of MIT’s Computer Science and Artificial Intelligence Laboratory, the school’s largest interdisciplinary lab, with 60 research groups including the DRL. She also continues to serve as the DRL’s principal investigator. The science of physical intelligence: Rus now leads pioneering research at the intersection of AI and robotics, a field she calls physical intelligence. It’s “a new form of intelligent machine that can understand dynamic environments, cope with unpredictability, and make decisions in real time,” she says. Her lab builds soft-body robots inspired by nature that can sense, adapt, and learn. They are AI-driven systems that passively handle tasks—such as self-balancing and complex articulation similar to that done by the human hand—because their shape and materials minimize the need for heavy processing. Such machines, she says, someday will be able to navigate different environments, perform useful functions without external control, and even recover from disturbances to their route planning. Researchers also are exploring ways to make them more energy-efficient. One prototype developed by Rus’s team is designed to retrieve foreign objects from the body, including batteries swallowed by children. The ingestible robots are artfully folded, similar to origami, so they are small enough to be swallowed. Embedded magnetic materials allow doctors to steer the soft robots and control their shape. Upon arriving in the stomach, a soft bot can be programmed to wrap around a foreign object and guide it safely out of the patient’s body. CSAIL researchers also are working on small robots that can carry a medication and release it at a specific area within the digestive tract, bypassing the stomach acid known to diminish some drugs’ efficacy.
Ingestible robots also could patch up internal injuries or ulcers. And because they’re made from digestible materials such as sausage casings and biocompatible polymers, the robots can perform their assigned tasks and then get safely absorbed by the body, she says. Health care isn’t the only application on the horizon for such AI-driven technologies. Robots with physical intelligence might someday help firefighters locate people trapped in burning buildings, find miners after a cave-in, and provide valuable situational awareness information to emergency response teams in the aftermath of natural disasters, Rus says. “What excites me is the possibility of giving people new powers,” she says. “Machines that can think and move safely in the physical world will let us extend human reach—at work, at home, in medicine … everywhere.” To make such a vision a reality, she has expanded her technical interests to include several complementary lines of research. She’s working on self-reconfiguring and modular robots such as MIT’s M-Blocks and NASA’s SuperBots, which can attach, detach, and rearrange themselves to form shapes suited for different actions such as slithering, climbing, and crawling. With networked robots—including those Amazon uses in its warehouses—thousands of machines can operate as a large adaptive system. The machines communicate continuously to divide tasks, avoid collisions, and optimize package routing. Rus’s team also is making advances in human-robot interaction, such as reading brainwave activity and interpreting sign language through a smart glove. To further her plan of putting all the computerized smarts the robots need within their physical bodies instead of in the cloud, she helped found Liquid AI in 2023. The company, based in Cambridge, Mass., develops liquid neural networks, inspired by the simple brains of worms, that can learn and adapt continuously. The word liquid in this case refers to the adaptability, flexibility, and dynamic nature of the team’s model architecture. It can change shape and adapt to new data inputs, and it fits within constraints imposed by the hardware in which it’s contained, she says. Finding community in IEEE: Rus joined IEEE at one of its robotics conferences when she was a graduate student. “I think I signed up just to get the student discount,” she says with a laugh. “But IEEE turned out to be the place where my community lived.” She credits the organization’s conferences, journals, and collaborative spirit with shaping her professional growth. “The exchange of ideas, the chance to test your thinking against others—it’s invaluable,” she says. “It’s how our field moves forward.” Rus continues to serve on IEEE panels and committees, mentoring the next generation of roboticists. “IEEE gave me a platform,” Rus says. “It taught me how to communicate, how to lead, and how to dream bigger.” Living the American dream: Looking back, Rus sees her story as a testament to unforeseen possibilities. “When I was growing up in Romania, I couldn’t even imagine living in America,” she says. “Now I’m here, working with brilliant students, building robots that help people, and trying to make a difference. I feel like I’m living the American dream.” In a nod to a memorable song from the Broadway musical Hamilton, Rus echoes Alexander Hamilton’s determination to make the most of his opportunities, saying, “I don’t ever want to throw away my shot.”
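The “liquid” networks mentioned above are built around neurons whose dynamics shift with their inputs rather than staying fixed. As a very rough illustration of that idea only, here is a toy neuron whose effective time constant depends on the input it receives; this is a simplified sketch of the general concept, not Liquid AI’s architecture or any exact published model, and every value in it is an assumption.

```python
# Toy "liquid" neuron: the input changes both the drive and the effective
# time constant, so the neuron's response speed adapts as data changes.
import math

def liquid_neuron_step(x: float, u: float, dt: float = 0.01,
                       tau: float = 1.0, a: float = 1.0) -> float:
    """One Euler step of dx/dt = -(1/tau + f(u)) * x + f(u) * a."""
    f = 1.0 / (1.0 + math.exp(-u))          # input-dependent gate
    dxdt = -(1.0 / tau + f) * x + f * a
    return x + dt * dxdt

x = 0.0
for t in range(300):
    u = 1.0 if t < 150 else -2.0            # the input regime changes mid-run
    x = liquid_neuron_step(x, u)
print(round(x, 3))                           # state after adapting to the new regime
```

The design intuition, as the article describes it, is that adaptive dynamics let small models keep adjusting to new inputs within tight on-board hardware budgets.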
- Video Friday: Disney’s Robotic Olaf Makes His Debut, by Evan Ackerman on November 29, 2025 at 4:30 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.SOSV Robotics Matchup: 1–5 December 2025, ONLINEICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Step behind the scenes with Walt Disney Imagineering Research & Development and discover how Disney uses robotics, AI, and immersive technology to bring stories to life! From the brand-new self-walking Olaf in World of Frozen and BDX Droids to cutting-edge attractions like Millennium Falcon: Smugglers Run, see how magic meets innovation.[ Disney Experiences ]We just released a new demonstration of Mentee’s V3 humanoid robots completing a real-world logistics task together. Over an uninterrupted 18-minute run, the robots autonomously move 32 boxes from eight piles to storage racks of different heights. The video shows steady locomotion, dexterous manipulation, and reliable coordination throughout the entire task.And there’s an uncut 18-minute version of this at the link.[ MenteeBot ]Thanks, Yovav!This video contains graphic depictions of simulated injuries. Viewer discretion is advised.In this immersive overview, guided by the DARPA Triage Challenge program manager, retired Army Col. Jeremy C. Pamplin, M.D., you’ll experience how teams of innovators, engineers, and DARPA are redefining the future of combat casualty care. Be sure to look all around! Check out competition runs, behind-the-scenes of what it takes to put on a DARPA Challenge, and glimpses into the future of lifesaving care.Those couple of minutes starting at 6:50 with the human medic and robotic teaming were particularly cool.[ DARPA ]You don’t need to build a humanoid robot if you can just make existing humanoids a lot better.I especially love 0:45 because you know what? Humanoids should spend more time sitting down, for all kinds of reasons. And, of course, thank you for falling and getting up again, albeit on some of the squishiest grass on the planet.[ Flexion ]“Human-in-the-Loop Gaussian Splatting” wins best paper title of the week.[ Paper ] via [ IEEE Robotics and Automation Letters in IEEE Xplore ]Scratch that, “Extremum Seeking Controlled Wiggling for Tactile Insertion” wins best paper title of the week.[ University of Maryland PRG ]The battery swapping on this thing is...unfortunate.[ LimX Dynamics ]To push the boundaries of robotic capability, researchers in the Department of Mechanical Engineering at Carnegie Mellon University, in collaboration with the University of Washington and Google DeepMind, have developed a new tactile sensing system that enables four-legged robots to carry unsecured, cylindrical objects on their backs. This system, known as LocoTouch, features a network of tactile sensors that spans the robot’s entire back. As an object shifts, the sensors provide real-time feedback on its position, allowing the robot to continuously adjust its posture and movement to keep the object balanced.[ Carnegie Mellon University ]This robot is in more need of googly eyes than any other robot I’ve ever seen.[ Zarrouk Lab ]DPR Construction has deployed FieldAI’s autonomy software on a quadruped robot at the company’s job site in Santa Clara, Calif., to greatly improve its daily surveying and data-collection processes. 
By automating what has traditionally been a very labor-intensive and time-consuming process, FieldAI is helping the DPR team operate more efficiently and effectively, while increasing project quality.[ FieldAI ]In our second episode of AI in Motion, our host, Waymo AI researcher Vincent Vanhoucke, talks with robotics startup founder Sergey Levine, who left a career in academic research to build better robots for the home and workplace.[ Waymo ]
- For This Engineer, Taking Deep Dives Is Part of the Job, by Edd Gent on November 27, 2025 at 1:00 pm
Early in Levi Unema’s career as an electrical engineer, he was presented with an unusual opportunity. While working on assembly lines at an automotive parts supplier in 2015, he got a surprise call from his high-school science teacher that set him off on an entirely new path: piloting underwater robots to explore the ocean’s deepest abysses. That call came from Harlan Kredit, a nationally renowned science teacher and board member of a Rhode Island-based nonprofit called the Global Foundation for Ocean Exploration (GFOE). The organization was looking for an electrical engineer to help design, build, and pilot remotely operated vehicles (ROVs) for the U.S. National Oceanic and Atmospheric Administration. [Levi Unema. Employer: Deep Exploration Solutions. Occupation: ROV engineer. Education: Bachelor’s degree in electrical engineering, Michigan Technological University.] This was an exciting break for Unema, a Washington state native who had grown up tinkering with electronics and exploring the outdoors. Unema joined the team in early 2016 and has since helped develop and operate deep-sea robots for scientific expeditions around the globe. The GFOE’s contract with NOAA expired in July, forcing the engineering team to disband. But soon after, Unema teamed up with four former colleagues to start their own ROV consultancy, called Deep Exploration Solutions, to continue the work he’s so passionate about. “I love the exploration and just seeing new things every day,” he says. “And the engineering challenges that go along with it are really exciting, because there’s a lot of pressure down there and a lot of technical problems to solve.” Nature and Technology: Unema’s fascination with electronics started early. Growing up in Lynden, Wash., he took apart radios, modified headphones, and hacked together USB chargers from AA batteries. “I’ve always had to know how things work,” he says. He was also a Boy Scout, and much of his youth was spent hiking, camping, and snowboarding. That love of both technology and nature can be traced back, at least in part, to his parents—his father was a civil engineer, and his mother was a high-school biology teacher. But another major influence growing up was Kredit, the science teacher who went on to recruit him. (Kredit was also a colleague of Unema’s mother.) Kredit has won numerous awards for his work as an educator, including the Presidential Award for Excellence in Science Teaching in 2004. Like Unema, he loves the outdoors; he is Yellowstone National Park’s longest-serving park ranger. “He was an excellent science teacher, very inspiring,” says Unema. When Unema graduated from high school in 2010, he decided to enroll at his father’s alma mater, Michigan Technological University, to study engineering. He was initially unsure what discipline to follow and signed up for the general engineering course, but he quickly settled on electrical engineering. A summer internship at a steel mill run by the multinational corporation ArcelorMittal introduced Unema to factory automation and assembly lines. After graduating in 2014 he took a job at Gentex Corp. in Zeeland, Mich., where he worked on manufacturing systems and industrial robotics. Diving Into Underwater Robotics: In late 2015, he got the call from Kredit asking if he’d be interested in working on underwater robots for GFOE. The role involved not just engineering these systems, but also piloting them. Taking the plunge was a difficult choice, says Unema, as he’d just been promoted at Gentex.
But the promise of travel combined with the novel engineering challenges made it too good an opportunity to turn down. Building technology that can withstand the crushing pressure at the bottom of the ocean is tough, he says, and you have to make trade-offs between weight, size, and cost. Everything has to be waterproof, and electronics have to be carefully isolated to prevent them from grounding on the ocean floor. Some components are pressure-tolerant, but most must be stored in pressurized titanium flasks, so the components must be extremely small to minimize the size of the metallic housing. Unema conducts predive checks from the Okeanos Explorer’s control room. Once the ROV is launched, scientists will watch the camera feeds and advise his team where to direct the vehicle. [Photo: Art Howard] “You’re working very closely with the mechanical engineer to fit the electronics in a really small space,” he says. “The smaller the cylinder is, the cheaper it is, but also the less mass on the vehicle. Every bit of mass means more buoyancy is required, so you want to keep things small, keep things light.” Communications are another challenge. The ROVs rely on several kilometers of cable containing just three single-mode optical fibers. “All the communication needs to come together and then go up one cable,” Unema says. “And every year new instruments consume more data.” He works exclusively on ROVs that are custom made for scientific research, which require smoother control and considerably more electronics and instrumentation than the heavier-duty vehicles used by the oil and gas industry. “The science ones are all hand-built, they’re all quirky,” he says. Unema’s role spans the full life cycle of an ROV’s design, construction, and operation. He primarily spends winters upgrading and maintaining vehicles and summers piloting them on expeditions. At GFOE, he mainly worked on two ROVs for NOAA called Deep Discoverer and Seirios, which operate from the ship Okeanos Explorer. But he has also piloted ROVs for other organizations over the years, including the Schmidt Ocean Institute and the Ocean Exploration Trust. Unema’s new consultancy, Deep Exploration Solutions, has been given a contract to do the winter maintenance on the NOAA ROVs, and the firm is now on the lookout for more ROV design and upgrade work, as well as piloting jobs. An Engineer’s Life at Sea: On expeditions, Unema is responsible for driving the robot. He follows instructions from a science team that watches the ROV’s video feed to identify things like corals, sponges, or deepwater creatures that they’d like to investigate in more detail. Sometimes he will also operate hydraulic arms to sample particularly interesting finds. In general, the missions are aimed at discovering new species and mapping the range of known ones, says Unema. “There’s a lot of the bottom of the ocean where we don’t know anything about it,” he says. “Basically every expedition there’s some new species.” This involves being at sea for weeks at a time. Unema says that life aboard ships can be challenging—many new crew members get seasick, and you spend almost a month living in close quarters with people you’ve often never met before. But he enjoys the opportunity to meet colleagues from a wide variety of backgrounds who are all deeply enthusiastic about the mission. “It’s like when you go to scout camp or summer camp,” he says. “You’re all meeting new people. Everyone’s really excited to be there.
We don’t know what we’re going to find.” Unema also relishes the challenge of solving engineering problems with the limited resources available on the ship. “We’re going out to the middle of the Pacific,” he says. “Things break, and you’ve got to fix them with what you have out there.” If that sounds more exciting than daunting, and you’re interested in working with ROVs, Unema’s main advice is to talk to engineers in the field. It’s a small but friendly community, he says, so just do your research to see what opportunities are available. Some groups, such as the Ocean Exploration Trust, also operate internships for college students to help them get experience in the field. And Unema says there are very few careers quite like it. “I love it because I get to do all aspects of engineering—from idea to operations,” he says. “To be able to take something I worked on and use it in the field is really rewarding.” This article appears in the December 2025 print issue as “Levi Unema.”
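Unema’s point above about all of the instrument traffic having to “go up one cable” is, in software terms, a multiplexing problem: many streams share one physical link and must be tagged so they can be separated again at the surface. The sketch below shows the general idea with a tiny, made-up frame format (a channel ID plus a length prefix); it is an illustration only, not GFOE’s or NOAA’s actual telemetry protocol.

```python
# Minimal sketch of multiplexing several instrument streams over one link.
# The frame layout (1-byte channel ID + 4-byte length + payload) is an
# assumption made up for this illustration.
import struct

def pack_frame(channel_id: int, payload: bytes) -> bytes:
    """Tag a payload with its channel and length so streams can share a link."""
    return struct.pack("!BI", channel_id, len(payload)) + payload

def unpack_frames(buffer: bytes):
    """Split a received byte stream back into (channel_id, payload) tuples."""
    frames, offset = [], 0
    while offset + 5 <= len(buffer):
        channel_id, length = struct.unpack_from("!BI", buffer, offset)
        offset += 5
        frames.append((channel_id, buffer[offset:offset + length]))
        offset += length
    return frames

# Interleave hypothetical camera, sonar, and telemetry messages on one "cable."
stream = (pack_frame(1, b"video chunk")
          + pack_frame(2, b"sonar ping")
          + pack_frame(3, b"depth=4200m"))
for channel, data in unpack_frames(stream):
    print(channel, data)
```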
- Remote Robotics Could Widen Access to Stroke Treatment, by Greg Uyeno on November 24, 2025 at 2:15 pm
When treating strokes, every second counts. But for patients in remote areas, it may take hours to receive treatment. The standard treatment for a common type of stroke, caused by large clots interrupting blood flow to the brain, is a procedure called endovascular thrombectomy, or EVT. During the procedure, an experienced surgeon pilots catheters through blood vessels to the blockage, accessed through a major channel such as the femoral artery in the groin. This is typically aided by X-ray imaging, which shows the position of blood vessels. “Good outcomes are directly associated with early treatment,” says Cameron Williams, a neurologist at the University of Melbourne and fellow with the Australian Stroke Alliance. In fact, “time is brain” is a common refrain in stroke treatment. While blood flow is stopped, about 2 million neurons die each minute. Over an hour, that adds up to 3.6 years of typical age-related brain cell loss. But in remote places like Darwin, in the north of Australia, this treatment isn’t available. Instead, it could take 6 hours or more and an expensive aeromedical transfer to get a patient to a medical center, says Williams. There are similar geographical challenges to stroke treatment access all over the world. Sparing a rural patient hours of transfer time to a hospital with an on-site expert could save their life, prevent disability, or preserve years of their quality of life. That’s why there is a particular interest in the possibility of emergency stroke treatment performed remotely with the help of robotics. Machines placed in smaller population centers could connect patients to expert surgeons miles away, and shave hours off the time to treatment. Two companies have recently demonstrated their remote capabilities. In September, doctors in Toronto completed a series of increasingly distant brain angiograms, the X-ray imaging element of an EVT, eventually performing two angiograms between crosstown hospitals using the N1 system from Remedy Robotics. And in October, Sentante equipment facilitated a simulated EVT between a surgeon in Jacksonville, Fla., and a cadaver with artificial blood flow in Dundee, Scotland. “All those stories connected is not only proof of concept. It’s coming to realization and implementation that robotic and remote interventions can be performed, and soon will be the reality for many centers in rural areas,” says Vitor Pereira, a neurosurgeon at Unity Health who performed the Toronto procedures. Two Approaches to Remote EVT: One challenge of performing these remote procedures is maintaining strong, fast connections at large distances. “Is there a real life need to do this transatlantically? Probably not,” says Edvardas Satkauskas, CEO of Sentante. “It demonstrates the capabilities. Even this distance is feasible.” Although performing a procedure remotely introduces issues related to latency, the pace of EVT—while urgent—is not reliant on instant reactions, says Satkauskas. Redundant connections should also serve as an important safeguard against dropped connections. Remedy has taken measures, for instance, to ensure that its robot monitors connection quality, and doesn’t make any harmful movements due to a poor connection, says David Bell, the company’s CEO. Though both companies are careful about disclosing details of products and research that are still in development, there are notable differences between their approaches. “Our device leans heavily on artificial intelligence,” says Bell.
Machine learning is incorporated into how the Remedy device manipulates guide wires and creates an informational overlay atop X-ray images for remote physicians, who can control the robot with a laptop and software interface. The long-term goal is for a surgeon to be able to log on to Remedy software at short notice from a central location to interact with Remedy robots in multiple hospitals as needed. In contrast, Sentante uses a control console meant to look and feel like the catheters and guide wires that surgeons are accustomed to manipulating in manual EVT, including force feedback that mimics the resistance they would feel in person. “It’s very intuitive to use this,” says Ricardo Hanel, a neurosurgeon with Baptist Health in Jacksonville, who was on the piloting end of the Sentante demonstration. That naturalistic feel in the transatlantic procedure came with a reported latency of around 120 milliseconds. Hanel is also on Sentante’s medical advisory board. Sentante has not yet implemented AI-assisted movements of its robot, though a plan is in place to capture as much training data as possible, both from images and force measurements. “As we joke, we had to build a sophisticated piece of hardware to become a software company,” says CEO Satkauskas. The Path to Clinical Use: Hanel expressed optimism that any control system would be easily learned by surgeons. “I think the main limitation for robotics is that you are still dependent on bedside interventionists,” says Ahmet Gunkan, an interventional radiologist at the University of Arizona, who has written about robots and endovascular interventions. Depending on the system, these bedside assistants might be responsible for a variety of tasks related to preparing and communicating with the patient, sterilizing and preparing equipment, loading step-specific parts, and repositioning X-ray or robotic equipment. Both CEOs note that while proper training will be essential, there are ways to reduce the burden on health care providers at the patient site. In the case of remote operations, “it was important to us that the robot could do the entire thing,” says Bell. Remedy’s system has been designed to handle as much of the procedure as possible, and streamline moments when bedside human interaction is necessary. For example, since the older version used in Toronto, changes have been made to maintain a clean line of communication between bedside and remote clinicians, facilitated by the Remedy system, says Bell. A team at St. Michael’s Hospital in Toronto performs the world’s first robot-assisted neurovascular procedure conducted remotely over a network, on 28 August 2025. [Photo: Katie Cooper and Kevin Van Paassen/Unity Health Toronto] Though remote EVT is a high priority, systems capable of the procedure may first be approved for other endovascular procedures performed locally. The hope is that precision robotics leads to better patient outcomes, whether the surgeon is in the next room or the next county. Remedy has a clinical trial planned in 2026 for on-premise neurointerventions, and has partnered with the Australian Stroke Alliance to distribute its N1 system and conduct a future clinical trial for remote procedures. Eventually the robot could be used to treat as many as 30 different conditions, says Bell. Satkauskas views Sentante’s equipment as a flexible platform for endovascular procedures throughout the body, which could help keep bedside clinicians familiar with the device.
The system may go to market in the EU next year for peripheral vascular interventions, which restore blood flow to the limbs, and it has a breakthrough device designation from the U.S. FDA for remote stroke treatment. There are other players in the space. For example, an early telerobotic effort from a company called Corindus is still ongoing after the company’s acquisition by Siemens in 2019. And Pereira notes that Xcath has also demonstrated a long-distance simulated EVT and looks to perform local robotic EVT with live patients soon. “I think it’s an exciting time to be a neurointerventionalist,” says Hanel.
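Bell’s point above about the robot monitoring connection quality and refusing to make harmful movements over a poor link is essentially a watchdog pattern: motion is allowed only while heartbeats keep arriving and measured latency stays under a threshold. The sketch below is a minimal illustration of that pattern under assumed thresholds and synchronized clocks; the names and numbers are mine, not Remedy’s or Sentante’s design.

```python
# Illustrative teleoperation watchdog: hold position if the link degrades.
# Thresholds and structure are assumptions for this sketch only.
import time

class TeleopWatchdog:
    def __init__(self, max_latency_s: float = 0.25, max_silence_s: float = 0.5):
        self.max_latency_s = max_latency_s
        self.max_silence_s = max_silence_s
        self.last_heartbeat = time.monotonic()
        self.last_latency_s = 0.0

    def on_heartbeat(self, sent_timestamp_s: float) -> None:
        """Record a timestamped heartbeat from the remote console.

        Assumes the console's timestamp is comparable to this clock.
        """
        now = time.monotonic()
        self.last_latency_s = now - sent_timestamp_s
        self.last_heartbeat = now

    def motion_allowed(self) -> bool:
        """Allow new motion only while the link looks healthy."""
        silence = time.monotonic() - self.last_heartbeat
        return (self.last_latency_s <= self.max_latency_s
                and silence <= self.max_silence_s)

wd = TeleopWatchdog()
wd.on_heartbeat(time.monotonic() - 0.120)   # ~120 ms, like the reported demo latency
print(wd.motion_allowed())                  # True: within the assumed thresholds
```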
- Video Friday: Watch Robots Throw, Catch, and Hit a Baseball, by Evan Ackerman on November 21, 2025 at 4:20 pm
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We also post a weekly calendar of upcoming robotics events for the next few months. Please send us your events for inclusion.SOSV Robotics Matchup: 1–5 December 2025, ONLINEICRA 2026: 1–5 June 2026, VIENNAEnjoy today’s videos! Researchers at the RAI Institute have built a low-impedance platform to study dynamic robot manipulation. In this demo, robots play a game of catch and participate in batting practice, both with each other and with skilled humans. The robots are capable of throwing 70 mph [112 kph], approaching the speed of a strong high school pitcher. The robots can catch and bat at short distances (23 feet [7 m]), requiring quick reaction times to catch balls thrown at up to 41 mph [66 kph] and hit balls pitched at up to 30 mph [48 kph].That’s a nice touch with the custom “RAI” baseball gloves, but what I really want to know is how long a pair of robots can keep themselves entertained.[ RAI Institute ]This week’s best acronym winner is GIRAF: Greatly Increased Reach AnyMAL Function. And if that arm looks like magic, that’s because it is, although with some careful pausing of the video you’ll be able to see how it works.[ Stanford BDML ]DARPA concluded the second year of the DARPA Triage Challenge on 4 October, awarding top marks to DART and MSAI in Systems and Data competitions, respectively. The three-year prize competition aims to revolutionize medical triage in mass casualty incidents where medical resources are limited.[ DARPA ]We propose a robot agnostic reward function that balances the achievement of a desired end pose with impact minimization and the protection of critical robot parts during reinforcement learning. To make the policy robust to a broad range of initial falling conditions and to enable the specification of an arbitrary and unseen end pose at inference time, we introduce a simulation-based sampling strategy of initial and end poses. Through simulated and real-world experiments, our work demonstrates that even bipedal robots can perform controlled, soft falls.[ Moritz Baecher ]Oh look, more humanoid acrobatics.My prediction: Once humanoid companies run out of mocapped dance moves, we’ll start seeing some freaky stuff that leverages the degrees of freedom that robots have and humans do not. You heard it here first, folks.[ MagicLab ]I challenge the next company that makes a “lights-out” video to just cut to a totally black screen with a little “Successful Picks” counter in the corner that just goes up and up and up.[ Brightpick ]Thanks, Gilmarie!The terrain stuff is cool and all, but can we just talk about the trailer instead?[ LimX Dynamics ]Presumably very picky German birblets are getting custom nesting boxes manufactured with excessively high precision by robots.[ TUM ]All those UBTECH Walker S2 robots weren’t fake, it turns out.[ UBTECH ]This is more automation than what we’d really be thinking of as robotics at this point, but I could still watch it all day.[ Motoman ]Brad Porter (Cobot) and Alfred Lin (Sequoia Capital) discuss the future of robotics, AI, and automation at the Human[X] Conference, moderated by CNBC’s Kate Rooney. They explore why collaborative robots are accelerating now, how AI is transforming physical systems, the role of humanoids, labor market shifts, and the investment trends shaping the next decade of robotics.[ Cobot ]Humanoid robots have long captured our imagination. 
Interest has skyrocketed along with the perception that robots are getting closer to taking on a wide range of labor-intensive tasks. In this discussion, we reflect on what we’ve learned by observing factory floors, and why we’ve grown convinced that chasing generalization in manipulation—both in hardware and behavior—isn’t just interesting, but necessary. We’ll discuss AI research threads we’re exploring at Boston Dynamics to push this mission forward and highlight opportunities our field should collectively invest more in to turn the humanoid vision, and the reinvention of manufacturing, into a practical, economically viable product.[ Boston Dynamics ]On 12 November 2025, Tom Williams presented “Degrees of Freedom: On Robotics and Social Justice” as part of the Michigan Robotics Seminar Series.[ Michigan Robotics ]Ask the OSRF Board of Directors anything! Or really, listen to other people ask them anything. [ ROSCon ]
- Why Is Everyone’s Robot Folding Clothes? by Chris Paxton on November 19, 2025 at 4:00 pm
It seems like every week there’s a new video of a robot folding clothes. We’ve had some fantastic demonstrations, like this semi-autonomous video from Weave Robotics on X. It’s awesome stuff, but Weave is far from the only company producing these kinds of videos. Figure 02 is folding clothes. Figure 03 is folding clothes. Physical Intelligence launched their flagship vision-language-action model, pi0, with an amazing video of a robot folding clothes after unloading a laundry machine. You can see robots folding clothes live at robotics expos. Even before all this, Google showed clothes folding in their work, ALOHA Unleashed. 7X Tech is even planning to sell robots to fold clothes! And besides folding actual clothes, there are other clothes-folding-like tasks, like Dyna’s napkin folding—which leads to what is probably my top robot video of the year, demonstrating 18 hours of continuous napkin folding. So why are all of these robotic manipulation companies suddenly into folding? Reason 1: We basically couldn’t do this before. There’s work going back over a decade that shows some amount of robotic clothes folding. But these demonstrations were extremely brittle, extremely slow, and not even remotely production-ready. Previous solutions existed (even learning-based solutions!) but they relied on precise camera calibration, or on carefully hand-designed features, meaning that these clothes-folding demos generally worked only on one robot, in one environment, and may have only ever worked a single time—just enough for the recording of a demo video or paper submission. With a little bit of help from a creatively patterned shirt, PR2 was folding things back in 2014. [Photo: Bosch/IEEE] Take a look at this example of UC Berkeley’s PR2 folding laundry from 2014. This robot is, in fact, using a neural network policy. But that policy is very small and brittle; it picks and places objects against the same green background, moves very slowly, and can’t handle a wide range of shirts. Making this work in practice would require larger models, pretrained on web-scale data, and better, more general techniques for imitation learning. And so 10 years later, with the appropriate demonstration data, many different startups and research labs have been able to implement clothes-folding demos; it’s something we have seen from numerous hobbyists and startups, using broadly similar tools (like LeRobot from HuggingFace), without intense specialization. Reason 2: It looks great and people want it! Many of us who work in robotics have this “north star” of a robot butler that can do all the chores we don’t want to do. Mention clothes folding, and many, many people will chime in about how they don’t ever want to fold clothes again and are ready to part with basically any amount of money to make that happen. This is important for the companies involved as well. Companies like Figure and 1x have been raising large amounts of money predicated on the idea that they will be able to automate many different jobs, but increasingly these companies seem to want to start in the home. Dyna Robotics can fold an indefinite number of napkins indefinitely. [Image: Dyna Robotics] And that’s part of the magic of these demos. While they’re slow and imperfect, everyone can start to envision how this technology becomes the thing that we all want: a robot that can exist in our house and mitigate all those everyday annoyances that take up our time. Reason 3: It avoids what robots are still bad at. These robot behaviors are produced by models trained via imitation learning.
Modern imitation-learning methods like Diffusion Policy use techniques inspired by generative AI to produce complex, dexterous robot trajectories, based on examples of expert human behavior that’s been provided to them—and they often need many, many trajectories. The work ALOHA Unleashed by Google is a great example, needing about 6,000 demonstrations to learn how to, for example, tie a pair of shoelaces. For each of these demonstrations, a human piloted a pair of robot arms while performing the task; all of this data was then used to train a policy. We need to keep in mind what’s hard about these demonstrations. Human demonstrations are never perfect, nor are they perfectly consistent; for example, two human demonstrators will never grab the exact same part of an object with submillimeter precision. That’s potentially a problem if you want to screw a cover in place on top of a machine you’re building, but it’s not a problem at all for folding clothes, which is fairly forgiving. This has two knock-on effects. First, it’s easier to collect the demonstrations you need for folding clothes, as you don’t need to throw out every training demonstration that’s a millimeter out of spec. Second, you can use cheaper, less repeatable hardware to accomplish the same task, which is useful if you suddenly need a fleet of robots collecting thousands of demos, or if you’re a small team with limited funding! For similar reasons, it’s great that with cloth folding, you can fix your cameras in just the right position. When learning a new skill, you need training examples with “coverage” of the space of environments you expect to see at deployment time. So the more control you have, the more efficient the learning process will be—the less data you’ll need, and the easier it will be to get a flashy demo. Keep this in mind when you see a robot folding things on a plain tabletop or with an extremely clean background; that’s not just nice framing, it helps out the robot a lot! And since we’ve committed to collecting a ton of data—dozens of hours—to get this task working well, mistakes will be made. It’s very useful, then, if it’s easy to reset the task, i.e., restore it to a state from which you can try the task again. If something goes wrong folding clothes, it’s fine. Simply pick the cloth up, drop it, and it’s ready for you to start over. This wouldn’t work if, say, you were stacking glasses to put away in a cupboard, since if you knock over the stack or drop one on the floor, you’re in trouble. Clothes folding also avoids making forceful contact with the environment. Once you’re exerting a lot of pressure, things can break, the task can become non-resettable, and demonstrations are often much harder to collect because forces aren’t as easily observable to the policy. And every piece of variation (like the amount of force you’re exerting) will end up requiring more data so the model has “coverage” of the space it’s expected to operate in. What to Look Forward To: While we’re seeing a lot of clothes-folding demos now, I still feel, broadly, quite impressed with many of them. As mentioned above, Dyna was one of my favorite demos this year, mostly because longer-running robot policies have been so rare until now. But they were able to demonstrate zero-shot folding (meaning folding without additional training data) at a couple of different conferences, including Actuate in San Francisco and the Conference on Robot Learning (CoRL) in Seoul.
This is impressive and actually very rare in robotics, even now. In the future, we should hope to see robots that can handle more challenging and dynamic interactions with their environments: moving more quickly, moving heavier objects, and climbing or otherwise handling adverse terrain while performing manipulation tasks. But for now, remember that modern learning methods will come with their own strengths and weaknesses. It seems that, while not easy, clothes folding is the kind of task that’s just really well suited for what our models can do right now. So expect to see a lot more of it.
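The imitation-learning recipe described in this piece (collect teleoperated demonstrations, then fit a policy that maps observations to actions) can be summarized in a few lines of code. Below is a deliberately minimal behavior-cloning sketch; real systems such as Diffusion Policy or ALOHA Unleashed use camera images and a denoising model rather than a plain regression head, and every shape and hyperparameter here is an illustrative assumption.

```python
# Minimal behavior cloning: fit a policy to (observation, action) pairs
# recorded from a human demonstrator. Dimensions and data are placeholders.
import torch
import torch.nn as nn

obs_dim, act_dim = 32, 14            # e.g., proprioception features -> two-arm joint targets
policy = nn.Sequential(nn.Linear(obs_dim, 256), nn.ReLU(),
                       nn.Linear(256, 256), nn.ReLU(),
                       nn.Linear(256, act_dim))
optim = torch.optim.Adam(policy.parameters(), lr=3e-4)

# Stand-in for a dataset of teleoperated demonstrations.
demo_obs = torch.randn(4096, obs_dim)
demo_act = torch.randn(4096, act_dim)

for _ in range(200):
    idx = torch.randint(0, demo_obs.shape[0], (256,))
    pred = policy(demo_obs[idx])
    loss = nn.functional.mse_loss(pred, demo_act[idx])   # match the demonstrator's actions
    optim.zero_grad()
    loss.backward()
    optim.step()
```

The article’s arguments about forgiving tasks, resettability, and “coverage” all show up in this loop as data problems: the more consistent and easy to collect the demonstrations are, the better a simple recipe like this works.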
- Students Compete—and Cooperate—in FIRST Global Robotics Challenge, by Kohava Mendelsohn on November 15, 2025 at 2:00 pm
Aspiring engineers from 191 countries gathered in Panama City in October to compete in the FIRST Global Robotics Challenge. The annual contest aims to foster problem-solving and cooperation, and inspire the next generation of engineers through three challenges based on a different theme every year. Teams of students aged 14 to 18 from around the world compete in the three-day event, remotely operating their robots to complete the challenges. This year’s topic was “Eco-equilibrium,” emphasizing the importance of preserving ecosystems and protecting vulnerable species. Turning Robotics Into a Sport: Each team competed in a series of ranking matches at the event. Each match lasted 2 minutes and 30 seconds and consisted of several simultaneous goals. First, students guided their robots in gathering “biodiversity units” (multicolored balls) and delivering them to their human teammates. Next, the robots removed “barriers” (larger, gray balls) from containers and disposed of them in a set area. Then team members threw the biodiversity units into the now-cleared containers to score points. At the end of the match, each robot was tasked with climbing a 1.5-meter rope. The team with the most points won the match. To promote collaboration, each match had two groups, which consisted of three individual teams and their robots, competing for victory. Each team controlled its own robot but had to work with the other robots in the group to complete the tasks. If all six robots managed to climb the rope at the end of the match, each team’s scores were multiplied by 1.5. The top 24 teams were split into six “alliances” of four individual teams each to compete in the playoffs. The highest-scoring alliance was crowned the winner. This year’s winning teams were Cameroon, Mexico, Panama, and Venezuela. Each student received a gold medal. It may have been hard to tell it was a competition at first glance. When all six robots successfully climbed the rope at the end of the match, students across teams were hugging each other, clapping, and cheering. “It’s not about winning, it’s not about losing, it’s about learning from others,” says Clyde Snyders, a member of the South Africa team. His sentiment was echoed throughout the event. Making It Into the Competition: Before the main event, countries all over the world run qualifying events where thousands of students show off their robotics skills for a chance to make it to the final competition. Each country chooses its team differently. Some pick the top-scoring team to compete, while others pick students from different teams to create a new one. Even after qualifying, for some students, physically getting to the competition isn’t straightforward. This year, Team Jamaica faced challenges after Hurricane Melissa struck the country on 28 October, one day before the competition began. It was the strongest storm that has ever hit Jamaica, killing 32 people and causing billions of dollars in damage to infrastructure. Because of the damage, the Jamaican team faced repeated flight cancellations and other travel delays. They almost didn’t make it, but FIRST Global organizers covered the costs of their travel. The students arrived on the second day, just in time to participate in enough matches to avoid being disqualified. Team Jamaica arrived late due to Hurricane Melissa, but they remained positive. [Photo: Kohava Mendelsohn] “We are so happy to be here,” says Joelle Wright, the team captain.
“To be able to engage in new activities, to compete, and to be able to showcase our hard work.” Team Jamaica won a bronze medal. Working Together to Fix and Improve Robots: Throughout the competition, it was a regular occurrence to see students from different teams huddled together, debugging problems, sharing tips, and learning together. Students were constantly fixing their robots and adding new features at the event’s robot hospital. There, teams could request spare parts, get help from volunteers, and access the tools they needed. Volunteering in the robot hospital is demanding, but rewarding, says Janet Kapito, an electrical engineer and the operations manager at Robotics Foundation Malawi in Blantyre. She participated in the FIRST Global Challenge when she was a student. “[The volunteers] get to see different perspectives and understand how people think differently,” she says. It’s rewarding to watch students solve problems on their own, she adds. The hospital was home to many high-stress situations, especially on the first day of the competition. The Ecuadorian team’s robot was delayed in transit. So, using the robot hospital’s parts, the students built a new robot to compete with. Tanzanian team members were hard at work repairing their robot, which was having issues with the mechanism that allowed it to climb up the rope. Collaboration played a key role in the hospital. When the South African team’s robot was having mechanical problems, the students weren’t fixing it alone—several teams, including Venezuela, Slovenia, and India, came to help. “It was truly inspirational, and such a great effort in bringing teams from over 190 countries to come and collaborate,” says Joseph Wei, director of IEEE Region 6, who was in attendance at the event. The Importance of Mentoring Future Engineers: Behind every team were mentors and coaches who provided students with guidance and experience. Many of them were past participants who are invested in teaching the next generation of engineers. But the robots are designed and built by the students, says Rob Haake, a mentor for Team United States. He tried to stay as hands-off as possible in the engineering of the robot, he says, “so if you asked me to turn on the robot, I don’t even know how to do it.” Haake is the COO of window and door manufacturing company Weiland Inc., in Norfolk, Neb. His passion is to teach kids the skills they need to build things. It’s important to teach students how to think critically and solve problems while also developing technical skills, he says, because those students are the future tech leaders. One major issue he sees is the lack of team mentors. If you’re an engineer, he says, “the best way to help [FIRST Global] grow is to call your local schools to ask if they have a robotics team, and if not, how you can help create one.” “The answer may be a monetary donation or, more importantly, your time,” he says. The students you mentor may one day represent their country at a FIRST Global Robotics Challenge.
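For readers who want to see the match rules described above as arithmetic, here is a toy scoring helper. Only the 1.5x multiplier for all six robots climbing comes from the article; the per-task point values are placeholders of my own, since the piece does not give them.

```python
# Toy score calculator for the described match structure.
# Point values are placeholder assumptions; only the 1.5x climb bonus is from the article.
def team_score(units_scored: int, barriers_removed: int,
               robot_climbed: bool, all_six_climbed: bool) -> float:
    base = units_scored * 1 + barriers_removed * 1 + (1 if robot_climbed else 0)
    return base * 1.5 if all_six_climbed else float(base)

# Example: 20 units, 4 barriers, this robot climbed, and all six robots climbed.
print(team_score(20, 4, robot_climbed=True, all_six_climbed=True))  # 37.5
```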