The week of February 28, 2016

Would you buy meat from a robot butcher?

By Greg Nichols

Butchers have had a rough half-century.

First, artisan butcher shops lost ground to grocery chains, which bought meat in huge quantities and began absorbing local butchers, putting them to work behind high-volume counters. In the early 2000s, desperate to compete with Walmart, which was beginning to dominate grocery shopping the way it dominates everything, the same chains began replacing their butchers with what’s called (kind of grossly) case-ready meat—pork and beef that’s cut and packaged in massive plants and then shipped around the country. Poultry had gone that way a couple decades earlier, so the transition to cellophane-sheathed, saline-plumped cuts was easy.

Easy for everyone but the butchers, who were transformed into factory workers on an industrial floor. Today, most people making their living carving shanks and flanks have low-wage jobs in arena-size processing plants. Butchering has gone industrial.

But there’s a difference between meat processing and other industrial occupations. In an era when hunks of cow and pig are packaged and distributed like Amazon Prime parcels, butchering has retained a surprising degree of its old-world craftsmanship. Workers armed with knives and hooks anachronistically slice flesh from bone the same way they have for hundreds of years. That’s because cutting meat—be it on an assembly line or in a niche shop in Santa Monica, California, or Brooklyn, New York—is a skill that requires exceptional dexterity, a good eye, and a honed tactile sense for texture and firmness. Industrial robots may be perfectly suited to welding chassis and painting cars, but they don’t have the touch to cut a succulent T-bone steak.

That’s likely to change. JBS, one of the country’s largest meat processors, recently acquired a controlling share of Scott Technology, a New Zealand-based robotics firm. Now JBS is looking at ways to automate its facilities. Robots don’t sleep, don’t collect overtime, and don’t suffer the horrific repetitive stress injuries that plague meat workers. Meat is already packed using machines, and if engineers can figure out how to make automated systems that approximate the deft hands of a butcher, there’s little question giants like JBS, Cargill, and Tyson will replace many of their line workers with robots. In the next decade, adroit robots that can see, feel, and move like humans may finally kill off the butcher.

Eagle eyes (and chicken fingers)

If there were universities for butchers, chicken deboning would be a lower-division course. It’s challenging for a newcomer, and when done quickly it looks like an indecent magic trick, but the maneuver itself is straightforward: You cut the wings at the joints, slice the bird like you’re performing an autopsy, and pry out the carcass in a tidy package with a few deft slices.

As an engineering problem, too, deboning poultry is much easier to tackle than butchering a cow or a pig. That’s because large animals like cows tend to vary significantly in size, even when bred for conformity. A femur bone, say, might be deeper or shallower from one cow to the next, which is where the art of butchering comes in. The external measurements of a bird, on the other hand, tend to correlate very closely to its internal structures. What you see on the outside is a good indicator of what you’ll find on the inside, which makes deboning with a machine much easier.

In Georgia, where poultry is the top agricultural product, accounting for about $20 billion in annual economic impact statewide, an automatic chicken deboner is a technological holy grail. Humans are inconsistent at performing repetitive tasks, which leads to waste and inefficiency, and every 1 percent loss of breast meat represents about $2.5 million to each of Georgia’s 20 poultry processing plants.

Lured by that juicy chicken money, researchers at the Georgia Tech Research Institute have developed what they’ve dubbed the Intelligent Cutting and Deboning System. “Each bird is unique in its size and shape,” says Gary McMurray, chief of GTRI’s Food Processing Technology Division, “so we have developed the sensing and actuation needed to allow an automated deboning system to adapt to the individual bird, as opposed to forcing the bird to conform to the machine.”

The deboning robot uses a 3D vision system to take careful measurements of birds. Modern machine vision owes its precision to high-definition cameras, which have fallen in price in recent years. Using the bird’s dimensions, GTRI’s custom algorithms define a proper cut by estimating the positions of bones and ligaments.
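GTRI hasn’t published the internals of those algorithms, but the core idea—inferring hidden anatomy from external measurements—can be sketched with ordinary least squares. Everything below, from the measurements to the function names, is invented for illustration:

```python
# Hypothetical sketch (not GTRI's actual algorithm): estimate an internal
# landmark, such as a joint position along the cut axis, from external
# measurements, using least squares fit on previously dissected birds.
import numpy as np

# Training data: external measurements (length, width, depth in mm) and the
# measured joint position along the cut axis (mm). All values are invented.
X = np.array([
    [210.0,  95.0, 70.0],
    [225.0, 103.0, 74.0],
    [198.0,  88.0, 66.0],
    [240.0, 109.0, 80.0],
    [232.0, 106.0, 77.0],
    [205.0,  93.0, 69.0],
])
y = np.array([42.0, 45.2, 39.6, 48.1, 46.3, 41.2])

# Fit y ~ X @ w + b with a least-squares solve (bias folded into the matrix).
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_joint_mm(length, width, depth):
    """Predict where the cut should land for a new, unseen bird."""
    return float(np.array([length, width, depth, 1.0]) @ coef)

print(round(predict_joint_mm(215.0, 97.0, 71.0), 1))
```

This only works for poultry because, as noted above, a bird’s outside closely predicts its inside; for cattle the same regression would have far more scatter.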

GTRI’s robot can snatch the bones out of a chicken in under a minute—on par with the best human workers. The system is being tested on the line, as is another deboning system being developed by Sintef, an independent research organization in Norway.

The human touch

One of the challenges when working with meat is making sure the knife doesn’t accidentally bite into a bone, which slows work and can result in dangerous shards and splinters.

To do delicate knife work, human butchers use their fingers to feel subtle variations of firmness and texture between sections of muscle. The knife becomes an extension of the body, a sensitive instrument that gives clues based on nuances of resistance and vibration. The skilled way a butcher wields a knife would be impossible without an advanced sense of touch, and that’s one of the major hurdles for robots today.

To understand why touch is so important, consider how clumsy you are with your hands after shaping a snowball or scraping ice off your car’s windshield without gloves. As your hands grow numb, fine motor tasks, such as zipping your jacket or getting a key in the ignition, become difficult. From a mechanical standpoint, your hands haven’t lost any of their ability; your muscles still work as they always did and the joints open and close freely.

What’s changed is the feedback you get through your sense of touch. Once your hands are chilled through, you can no longer feel pressure, texture, heat, or any of a number of other properties we unconsciously rely on for fine manipulation. Your beautifully capable hands are reduced to blunt instruments, which is exactly how robots perceive the world. Deprived of the full tactile sensing capabilities of the human hand, which is the most advanced manipulator in the world, even mechanically sophisticated robots are clumsy and ham-fisted.

But better sensors are starting to arrive.

“Classically, robots feel objects with what’s called a strain gauge, which is an electronic element that deforms slightly when force is applied,” explains Matt Borzage, founding partner of SynTouch, a California company that’s making sensors to allow machines to feel the world more like humans do. “If you have a digital bathroom scale, it uses a strain gauge to tell you how much you weigh.”

A simple strain gauge can only measure force in one dimension. Even the most sensitive strain gauges fail to measure properties like texture or squishiness—information butchers rely on to make their cuts.
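To see just how little information that is, here’s a toy calibration in Python using the standard quarter-bridge approximation. The gauge factor and stiffness numbers are typical textbook values, not any particular product’s:

```python
# A strain gauge reduces the whole contact to one number: a bridge voltage,
# calibrated to a force. Standard quarter-bridge small-strain approximation.

GAUGE_FACTOR = 2.0    # dimensionless; typical for metal-foil gauges
EXCITATION_V = 5.0    # Wheatstone bridge excitation voltage

def strain_from_bridge(v_out):
    """Small-strain quarter-bridge approximation: strain = 4*Vout/(GF*Vexc)."""
    return 4.0 * v_out / (GAUGE_FACTOR * EXCITATION_V)

def force_newtons(v_out, stiffness_n_per_strain=2.0e5):
    """Linear (Hooke's law) calibration from strain to force."""
    return strain_from_bridge(v_out) * stiffness_n_per_strain

# A 1 mV reading maps to exactly one force value: no texture, no squishiness.
print(force_newtons(0.001))
```

That single output number is the bathroom-scale view of the world Borzage describes; everything a butcher’s fingertips add on top of it is what SynTouch is trying to capture.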

“We’ve spent a lot of time looking at hands,” says Borzage. “Not just sensing, but combining sensing with the right form factor to allow robotic fingertips to do useful things.”

The result, according to SynTouch, is the only sensor technology in the world that endows robots with the ability to replicate and sometimes exceed the human sense of touch. SynTouch’s signature BioTac sensor even looks like a fingertip. It consists of an elastic skin over an epoxy core, where the sensor’s electronics are located. Between the skin and the epoxy core is a layer of fluid. Electrodes on the surface of the core come into contact with the fluid. When an object is grasped, the electrodes measure the dynamic pressures of the liquid and the system maps the resulting electronic signals.

Of course, touch isn’t all about pressure. Human fingertips sense texture as well. “To tackle that,” says Borzage, “we borrowed from the world of acoustic measurement.” When you rub your finger over the surface of an object, your skin will vibrate slightly. “The same thing happens in our sensor. In our case, those vibrations pass through the fluid underneath the skin. We measure those, which allows us to extract information about texture.”

The BioTac sensor also measures temperature in a novel way. A piece of metal at room temperature feels colder to us than a piece of wood at the same temperature. That’s because metal is a better thermal conductor than wood, so heat will flow out of our fingers faster when we grab a brass doorknob than when we touch, say, a mahogany table. The BioTac mimics this thermal feedback loop by using a small heating element to slightly elevate its temperature. An embedded sensor called a thermistor is able to measure any change in temperature that occurs when the BioTac comes into contact with an object. A robot might decode that data to reveal useful information about the material it’s touching.
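The principle can be mocked up with Newton’s law of cooling. The materials and rate constants below are invented, and this is not SynTouch’s firmware; the point is only that the cooling curve alone can tell a conductor from an insulator:

```python
# Hedged sketch of the thermal trick described above: a fingertip heated
# above ambient cools faster against a good thermal conductor, so the
# cooling rate hints at the material. Rate constants are invented.
import math

AMBIENT_C = 22.0
FINGERTIP_C = 32.0   # sensor heated roughly 10 C above ambient

# Rough "cooling constants" (1/s): higher means heat leaves faster.
COOLING_RATE = {"brass": 1.5, "mahogany": 0.2}

def temp_after(material, seconds):
    """Newton's-law-of-cooling model of fingertip temperature on contact."""
    k = COOLING_RATE[material]
    return AMBIENT_C + (FINGERTIP_C - AMBIENT_C) * math.exp(-k * seconds)

def guess_material(measured_temp, seconds=1.0):
    """Pick whichever material's model best matches the thermistor reading."""
    return min(COOLING_RATE,
               key=lambda m: abs(temp_after(m, seconds) - measured_temp))

print(guess_material(temp_after("brass", 1.0)))
```

For a meat-cutting robot, the same readout could in principle help distinguish bone from muscle from fat, each of which conducts heat differently.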

The BioTac sensor enables machines to grip delicate and easily bruised fruits, like tomatoes. The sensors aren’t being used for meat cutting yet, but it’s a promising application.

Robots that learn

A robot that feels the way a human does is a start, but advanced sensors are only half the solution. Ultimately, the biggest advantage human butchers have over machines is their ability to refine their technique with each cut. Even with the best sensing and manipulation equipment in the universe, a human who’s never held a knife will be slow and error-prone when asked to cut a pork chop. Give the same person a month of dedicated practice and watch the chops fly.

“If you asked me five years ago,” says Sidd Srinivasa, associate professor at Carnegie Mellon University’s Robotics Institute and founder of the school’s Personal Robotics Lab, “I would have said the hardware just isn’t there yet for really precise manipulation. I don’t think that’s true anymore. There are several robot arms using multimodal sensing that are pretty damn good.”

That’s why Srinivasa and his team are now focusing on control algorithms and machine learning. As the creators of the Home Exploring Robot Butler (HERB), considered the most advanced manipulation robot of its kind, they have some street cred.

HERB famously demonstrated its manipulation skills a couple years back in the so-called Oreo cookie challenge. The task? Separate an Oreo cookie and scrape off the cream without any breakage.

“An Oreo was by far the most delicate and sensitive object HERB had worked with,” Srinivasa remembers. “If you look at an Oreo, they all look alike, but they have different physical properties, different parameters, different squishiness, and other variations. Manipulation is interesting because the only way you can sense those things is by interacting with the object.”

HERB completed the challenge autonomously. The robot broke dozens of Oreos, but each time it learned from its mistakes. Using color and depth cameras, tactile sensors, strain gauges, and force sensors to collect data about each cookie, it figured out a repeatable strategy for twisting the cookies apart and scraping off the cream.
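HERB’s actual learning machinery is far more sophisticated, but the break-and-adjust loop described above can be caricatured in a few lines. The cookie “simulator” and the workable force window below are pure invention:

```python
# Toy illustration (not HERB's actual learner): adjust a twisting force by
# trial and error, easing off after each broken cookie, pressing harder
# after each failed separation, until the twist works.
def try_twist(force):
    """Invented simulator: the twist succeeds only in a narrow force window."""
    if force < 4.0:
        return "no_separation"
    if force > 6.0:
        return "broken"
    return "success"

def learn_force(start=10.0, step=0.5, max_trials=50):
    """Crude hill-climb: nudge the force in response to each failure."""
    force = start
    for trial in range(1, max_trials + 1):
        outcome = try_twist(force)
        if outcome == "success":
            return force, trial
        force += -step if outcome == "broken" else step
    raise RuntimeError("no workable force found")

force, trials = learn_force()
print(force, trials)
```

The real system searches a much higher-dimensional space—grip pose, twist angle, scraping trajectory—but the pattern is the same: fail, measure, adjust, repeat.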

It doesn’t take much imagination to envision the same advanced sensor technology and machine learning techniques applied to the art of butchery. When those advances migrate out of research labs, we’re likely to see the end of one of the last vestiges of old-school food processing in the industrial age.

It’ll be an important bellwether—even if you happen to be a vegetarian. The same advances that allow robots to cut meat will bring automation to other corners of the food universe, from stuffing burritos to cooking lavish meals. Fast-food restaurants, which are already run like little factories, are particularly good candidates for automation, which is why a company called Momentum Machines is developing a hamburger-making robot that may one day make line cooks a thing of the past. A recent McKinsey report analyzing the automation potential of hundreds of jobs found that 81 percent of activities performed by food preparation workers might be done by machines in the near future.

That kind of shift may fundamentally change the way we think about food. From the processing plant to the order window, human hands may never touch tomorrow’s value meal.

Which raises an important question: Do you want fries with that, human?

Illustration via Bruno Moraes