Sony's Ace Robot Is Beating Professional Table Tennis Players, and That's Actually Harder Than Beating Humans at Chess

I’ve been following artificial intelligence beating humans at games for years now, and honestly, I thought we’d passed the point where these achievements felt meaningful. Deep Blue beat Kasparov in 1997. AlphaGo demolished Lee Sedol in 2016. These were incredible moments, but they were fundamentally about computation and pattern recognition in controlled, digital environments.

Sony’s Ace robot is different. This thing is actually beating professional table tennis players in real matches that follow official ITTF rules, and I think people are underestimating how wild that actually is.

The Physical World Is Messier Than Any Game Board

Here’s the thing about chess and Go that makes them (relatively) easier for AI: they’re turn-based, deterministic, and exist in perfectly defined digital spaces. Every possible board state can be represented. There’s no physics to deal with, no sensor noise, no mechanical limitations.

Table tennis? That’s a completely different beast. You’ve got a ball moving at high speeds with unpredictable spin affecting its trajectory. You need to process visual information from multiple cameras in real-time, make split-second decisions about positioning, and then execute precise physical movements through an articulated robotic arm. The margin for error is microscopic.
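To make the spin problem concrete, here’s a toy simulation of the Magnus effect, which is what curves a spinning ball’s flight. This is my own back-of-the-envelope sketch using rough textbook values for a standard 40 mm ball, not anything from Sony’s system:

```python
import numpy as np

# Rough constants for a standard 40 mm, 2.7 g table tennis ball.
# These are ballpark textbook values, not Sony's parameters.
MASS = 0.0027          # kg
RADIUS = 0.020         # m
AIR_DENSITY = 1.2      # kg/m^3
AREA = np.pi * RADIUS**2
C_DRAG = 0.40          # drag coefficient (approximate)
C_LIFT = 0.25          # Magnus lift coefficient (approximate)
G = np.array([0.0, 0.0, -9.81])

def step(pos, vel, spin, dt=0.001):
    """Advance the ball state by one Euler step.

    spin is the angular velocity vector in rad/s; the Magnus force
    acts along spin x velocity, which is what bends the trajectory.
    """
    speed = np.linalg.norm(vel)
    drag = -0.5 * AIR_DENSITY * C_DRAG * AREA * speed * vel
    magnus = 0.5 * AIR_DENSITY * C_LIFT * AREA * RADIUS * np.cross(spin, vel)
    acc = G + (drag + magnus) / MASS
    return pos + vel * dt, vel + acc * dt

# A loop drive at 20 m/s with heavy topspin (~100 rev/s about the y axis).
pos = np.array([0.0, 0.0, 0.3])
vel = np.array([20.0, 0.0, 1.0])
spin = np.array([0.0, 2 * np.pi * 100, 0.0])
for _ in range(100):
    pos, vel = step(pos, vel, spin)
print(pos)  # topspin pulls the ball down faster than gravity alone
```

Run it with backspin instead (flip the sign of the spin vector) and the ball floats. A robot that can’t estimate spin gets both of those trajectories wrong in opposite directions.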

Sony’s researchers built Ace with eight joints to handle this complexity. Two joints control paddle position, two more adjust its orientation, and three others enable powerful shots. That’s a lot of moving parts that need to coordinate perfectly, and it’s all happening faster than human reaction time in some cases.
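Sony hasn’t published the arm’s geometry in anything I’ve read, but the reason eight joints are hard to coordinate comes down to forward kinematics: every joint angle feeds into one final paddle pose, so an error anywhere in the chain moves the paddle. Here’s a deliberately simplified sketch where the link lengths and rotation axes are entirely made up:

```python
import numpy as np

def rot_z(theta):
    """Homogeneous transform: rotation about z by theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def translate(x, y, z):
    """Homogeneous transform: pure translation."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def paddle_pose(joint_angles, link_lengths):
    """Compose joint rotations and link offsets into one end-effector pose.

    A real kinematic model would use per-joint axes (e.g. DH parameters);
    this toy version rotates every joint about z just to show the chaining.
    """
    T = np.eye(4)
    for theta, length in zip(joint_angles, link_lengths):
        T = T @ rot_z(theta) @ translate(length, 0, 0)
    return T  # 4x4 pose of the paddle in the base frame

# Eight joints, per the article; angles and link lengths are invented.
angles = np.deg2rad([10, -20, 30, 5, -15, 25, 0, 40])
lengths = [0.1, 0.3, 0.3, 0.1, 0.1, 0.05, 0.05, 0.02]
print(paddle_pose(angles, lengths)[:3, 3])  # paddle position (x, y, z)
```

The point of the chain structure is that the transforms multiply: a half-degree error at the shoulder becomes centimeters of error at the paddle, which at these ball speeds is the difference between a clean hit and a whiff.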

Vision Systems That Actually Have to Work in Meatspace

The vision system alone is fascinating from an engineering perspective. Nine traditional cameras surround the court to locate the ball’s position in 3D space. Then there are three additional “gaze control systems” that measure angular velocity and spin. That’s 12 cameras total, all feeding data into a system that needs to predict trajectories and plan responses in milliseconds.
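The article doesn’t spell out how the camera data gets fused, but the textbook approach for this kind of setup is ray triangulation: each calibrated camera turns its 2D ball detection into a 3D ray, and you solve for the point closest to all the rays in a least-squares sense. A minimal sketch, with camera positions and noise levels that are purely illustrative:

```python
import numpy as np

def triangulate(origins, directions):
    """Least-squares point nearest to a set of 3D rays.

    origins: (N, 3) camera centers; directions: (N, 3) rays toward the
    ball, as produced by each camera's detection plus its calibration.
    Minimizes sum over rays of ||(I - d d^T)(p - o)||^2 for the point p.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects out the along-ray component
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Three cameras looking at a ball near mid-table height;
# all positions here are made up for illustration.
ball = np.array([1.37, 0.0, 0.9])
cams = np.array([[0, -3, 2], [3, 3, 2], [-2, 2, 1.5]], dtype=float)
rays = ball - cams + np.random.normal(scale=0.002, size=(3, 3))  # noisy detections
print(triangulate(cams, rays))  # ≈ the true ball position
```

With more cameras the least-squares solution gets more robust to any single noisy detection, which is presumably part of why you’d want nine of them rather than two.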

This isn’t computer vision in some lab environment with perfect lighting and controlled conditions. This is a real table tennis court with all the messiness that entails. The ball can have different amounts of spin, players can hit it at different angles and speeds, and the system still needs to track everything accurately enough to compete.
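I don’t know what tracker Sony actually uses, but the standard workhorse for turning noisy per-frame detections into a smooth, predictive track is a Kalman filter. A bare-bones constant-velocity version, with the frame rate and every noise value invented, looks like this:

```python
import numpy as np

DT = 1 / 200.0  # assume a 200 Hz camera loop (made-up rate)

# State: [x, y, z, vx, vy, vz]; measurement: noisy [x, y, z].
F = np.eye(6)
F[:3, 3:] = DT * np.eye(3)          # constant-velocity motion model
H = np.hstack([np.eye(3), np.zeros((3, 3))])
Q = 1e-3 * np.eye(6)                # process noise (a guess)
R = 4e-4 * np.eye(3)                # measurement noise (a guess)

def kalman_step(x, P, z):
    """One predict + update cycle for a new position measurement z."""
    # Predict where the ball should be now.
    x = F @ x
    P = F @ P @ F.T + Q
    # Correct with the measurement, weighted by relative uncertainty.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Feed a few fake measurements of a ball drifting in +x.
x, P = np.zeros(6), np.eye(6)
for t in range(5):
    z = np.array([0.1 * t, 0.0, 0.9]) + np.random.normal(scale=0.02, size=3)
    x, P = kalman_step(x, P, z)
print(x[:3], x[3:])  # estimated position and velocity
```

The part that matters for a robot is the predict step on its own: iterate `x = F @ x` forward a few tens of milliseconds and you get an estimate of where the ball will be when the paddle has to meet it.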

What impresses me most is that Ace won three out of five matches against elite players (people with more than 10 years of training) back in April 2025. It initially lost to professional league players, but went on to beat professionals in December and again last month, according to Reuters.

That progression tells me this isn’t just a one-off demo. The system is actually improving, which means Sony’s figured out how to iterate on both the machine learning components and the physical mechanics in meaningful ways.

Why This Matters Beyond Ping Pong

I know what you’re thinking. Cool robot plays table tennis, so what? But the implications here go way beyond recreational sports.

The technical challenges Sony solved for Ace are the same challenges that need to be solved for robots to work effectively in unstructured environments. Manufacturing, warehouse logistics, disaster response, surgery. All of these domains require robots that can perceive complex physical situations, make rapid decisions, and execute precise movements.

Table tennis is actually a brilliant benchmark because it requires all of these capabilities simultaneously at a high level of performance. You can’t fake it. Either your vision system is fast and accurate enough or the ball flies past you. Either your mechanical system can execute the required movements or you miss the shot.

Omron’s FORPHEUS robot, shown at CES 2017, could compete against amateur players, which was impressive at the time. But there’s a massive gap between “can rally with an amateur” and “can beat professionals in official matches.” That gap represents years of advances in sensors, actuators, control systems, and the AI that ties it all together.

The Real Test Is Adaptability

What I want to know next is how well Ace adapts to different playing styles. Professional table tennis players have wildly different approaches. Some are defensive players who rely on consistency and forcing errors. Others are aggressive attackers who go for winners. Can Ace adjust its strategy mid-match based on its opponent?

The Nature study published today probably has more details on this, but from what I’m seeing in the source material, Sony’s focused primarily on the raw technical capability to compete at a high level. That’s the necessary foundation, but true mastery in any sport requires reading your opponent and adapting your game plan.

I’d also be curious about the robot’s failure modes. When it loses points, is it because of vision system errors, mechanical limitations, or strategic mistakes? Understanding those failure patterns would tell us a lot about where the remaining technical challenges are.

The fact that Sony’s AI division is the one behind this development is interesting too. They’re not just building a cool demo robot. They’re using table tennis as a testbed for broader robotics and AI research, which means the techniques developed for Ace will likely end up in other Sony products and research areas.

Physical AI is still in its early days compared to the purely digital stuff. Watching a robot genuinely compete with professional athletes in a fast-paced physical game makes you realize just how much harder it is to build intelligence that works in the real world rather than on a screen.
