Thursday, August 10, 2017

Is Ordering Meat Well-Done a Misteak?



Somewhere, a chef is crying.

Many chefs feel a sort of kinship with the food they cook. They lovingly season it with salt and pepper, they cook it to perfection, they take the time and effort to make it just right.

And then someone goes ahead and orders his steak well-done.

Actually, it’s a lot of someones.

LongHorn Steakhouse recently shared the details of all of its steak orders for an entire year with the data geeks at FiveThirtyEight. That’s a ton of information — LongHorn boasts 491 locations around the country, including seven in the St. Louis area.

The results were shocking, or at least sad. Sad to chefs.

It turns out that 77 percent of LongHorn patrons who order steak order it cooked to medium or beyond.

Many chefs will tell you that the only way to order beef is to ask for it rare or medium-rare. That way you can taste the juice; it’s tender and delicious. The more a steak is cooked — and there has to be some scientific way of demonstrating this — the more it loses its flavor. And it indisputably becomes tougher.

And yet, according to the data, a plurality of LongHorn patrons, 37.5 percent, order their steaks medium; 25.8 percent order theirs medium-well, and 11.7 percent like theirs well-done.

With all due respect to my friends (and readers) who like their steak well-done, you’d get the same flavor profile out of a microwaved bomber jacket.

Lindsey Curtit, managing partner of the LongHorn Steakhouse in O’Fallon, Mo., is happy to have her customers ordering their steak any way they like it.

“There really is no wrong way to cook a steak. It’s really just the guest’s preference,” she said.

Curtit, who likes her steaks cooked medium, said that, based on her casual observation, her customers at the O’Fallon location order their steaks in about the same proportion as the national numbers.

Curtit also pointed out that not all steak is the same. The different types of fiber in different cuts of meat mean that some cuts can stand up to more cooking than others. The ribeye, for instance, can take the heat better than other cuts because of all of its marbled fat. The same is true of the porterhouse and its cousin, the T-bone.

The pertinent question is: Why do people like their steaks the way they like them?

I find I get maximum beefy flavor, without the meat being too chewy, out of steaks cooked on the rare side of medium-rare. Though I have no evidence to back it up, my guess is that people who like their steaks medium-well or well-done prefer them that way because they are (or were, as children) a little grossed out at seeing juice run out of their meat.

The numbers in the latest data seem to contradict the information gathered just three years ago by FiveThirtyEight. At that time, a survey of 432 steak-eating Americans indicated that 43 percent of us like our steaks cooked to rare or medium-rare.

The plurality of that group, 38 percent, said they like steak cooked medium-rare. A significant distance behind, 31 percent said they like it cooked medium.

The folks at FiveThirtyEight looked at the disparity between the survey responses and the actual orders and suggested that people have heard they are supposed to like their steaks rare or medium-rare, so that's what they say when they are asked how they like it. But in reality — when they order at a restaurant — they ask for it medium or above.

But I see a different possibility. LongHorn is a national chain, with locations often (though by no means always) on expressway interchanges. It caters to travelers and families. With a ribeye going for $20.49 and a porterhouse topping out at $28.49, it is a solid and popular restaurant.

But maybe travelers and families are more likely to want their meat cooked longer than couples, singles or people staying closer to home.

Curtit doesn’t think so. “Our vision is to become America’s favorite steakhouse, so America is our target market,” she said. “I would say our guests do represent the country as a whole.”

Thursday, August 3, 2017

Oh, Snap! Scientists Are Turning People's Food Photos Into Recipes


When someone posts a photo of food on social media, do you get cranky? Is it because you just don't care what other people are eating? Or is it because they're enjoying an herb-and-garlic crusted halibut at a seaside restaurant while you sit at your computer with a slice of two-day-old pizza?

Maybe you'd like to have what they're having, but don't know how to make it. If only there were a way to get their recipe without commenting on the photo.

Researchers at the Massachusetts Institute of Technology's Computer Science and Artificial Intelligence Laboratory (CSAIL) would like that for you, too. That's why they're creating an artificial neural network — a computer system modeled after the human brain — to examine those photos and break them down into recipes.

The growth of the Internet has made it possible to collect and publish several large-scale datasets, enabling great advances in the field of artificial intelligence (AI), says Javier Marin, a postdoctoral research associate at CSAIL and co-author of a paper presented this July at the Conference on Computer Vision and Pattern Recognition in Honolulu.

"However, when it comes to food, there was not any large-scale dataset available in the research community until now," Marin says. "There was a clear need to better understand people's eating habits and dietary preferences."

To do this, researchers have been feeding the computer pairs of photos and their corresponding recipes — about 800,000 of them, collected in a dataset called Recipe1M. The neural network chews on all of that for a while, learning patterns and connections between the ingredients in the recipes and the photos of food.
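In machine-learning terms, "learning connections" between photos and recipes is typically done by training two encoders, one for images and one for recipe text, so that matching pairs land close together in a shared embedding space. Here is a minimal Python sketch of that general idea, assuming such a cross-modal embedding setup; the encoders, dimensions and data are made-up placeholders, not the team's actual model.

import torch
import torch.nn as nn

# Stand-in encoders: a real system would use a CNN for photos and a text
# model for recipes. These tiny linear layers only illustrate the shape of
# the approach (hypothetical, not the Recipe1M model's architecture).
image_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128))
recipe_encoder = nn.Sequential(nn.Linear(300, 128))

loss_fn = nn.CosineEmbeddingLoss()  # pulls matching pairs together
optimizer = torch.optim.Adam(
    list(image_encoder.parameters()) + list(recipe_encoder.parameters()),
    lr=1e-3,
)

# One toy training step on a fake batch of matching photo/recipe pairs.
images = torch.randn(8, 3, 64, 64)   # pretend photos
recipes = torch.randn(8, 300)        # pretend recipe-text features
targets = torch.ones(8)              # 1 means "this photo and recipe belong together"

optimizer.zero_grad()
loss = loss_fn(image_encoder(images), recipe_encoder(recipes), targets)
loss.backward()
optimizer.step()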

"What we've developed is a novel machine learning model that powers an app. The demo that you see is just a pretty interface to that model," says Nicholas Hynes, an MIT graduate student at CSAIL who also co-authored the paper.

You, too, can try out this interface, called Pic2Recipe. To use it, just upload your food photo. The computer will analyze it and retrieve a recipe from a collection of test recipes that best matches your image.
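Retrieval itself can be as simple as comparing the photo's embedding with precomputed embeddings of every recipe in the collection and returning the closest one. The short Python sketch below shows that step using cosine similarity; the recipe list, vectors and function name are hypothetical illustrations, not Pic2Recipe's actual interface.

import numpy as np

# Pretend embeddings: in a real system these would come from the trained model.
recipe_titles = ["sugar cookies", "green pea soup", "beef wellington"]
recipe_vecs = np.random.randn(3, 128)   # precomputed recipe embeddings
photo_vec = np.random.randn(128)        # embedding of the uploaded photo

def best_match(photo, recipes, titles):
    # Normalize, then pick the recipe with the highest cosine similarity.
    photo = photo / np.linalg.norm(photo)
    recipes = recipes / np.linalg.norm(recipes, axis=1, keepdims=True)
    scores = recipes @ photo
    best = int(np.argmax(scores))
    return titles[best], float(scores[best])

print(best_match(photo_vec, recipe_vecs, recipe_titles))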

It usually works pretty well, although it sometimes misses an ingredient or two. Take, for example, a video in which the MIT team uploads a photo of sugar cookies.

"The app took the image, figured out what was in it and how it was prepared, and gave us the recipe that it thinks was most likely to have produced the image," says Hynes.

Pic2Recipe did correctly identify eight out of the 11 ingredients. And it did accurately find a recipe for sugar cookies. Alas, it missed the icing.

But the program doesn't need to visually recognize every ingredient in the photo to find an accurate recipe.

"Just like a human, it can infer the presence of invisible, homogenized or obscured ingredients using context. For instance, if I see a green colored soup, it probably contains peas — and most definitely salt!" says Hynes. "When the model finds the best match, it's really taking a holistic view of the entire image or the entire recipe. That's part of why the model is interesting: It learns a lot about recipes in a very unstructured way."

But as with every new technology, there are some kinks to work out.

The current model sometimes has trouble making fine distinctions between similar recipes, Hynes says. "For instance, it may detect a ham sandwich as pastrami or not recognize that brioche contains milk and egg. We're still actively improving the vision portion of the model."

Another issue, Hynes says, is that the current model has no explicit knowledge of basic concepts like flavor and texture. "Without this, it might replace one ingredient with another because they're used in similar contexts, but, doing so would significantly alter this dish," Hynes says. "For example, there are two very similar Korean fermented ingredients called gochujang and doenjang, but the former is spicy and sweet while the latter is savory and salty."

There are other refinements to be made, such as how to recognize an ingredient as diced, chopped or sliced. Or how to tell one type of mushroom or tomato from another.

And when a reporter at The Verge tried the demo, photos of ramen and potato chips turned up no matches. How could the program miss such basics?

"This is simply explained by not having recipes for those foods in the dataset," Hynes says. "For things like ramen and potato chips, people generally don't post recipes for things that come out of a bag."

In the future, the MIT researchers want to do more than just let you have what they're having. They are seeking insight into health and eating habits.

"Determining the ingredients — and therefore how healthy they are — of images posted in a specific region, we could see how health habits change through time," says Marin.

Hynes would like to take the technology a step further and is working on a way to automatically link from an image or ingredient list to nutrition information.

"Using it to improve peoples' health is definitely big; when I go to community/potluck dinners, it always astonishes me how people don't pay attention to preparation and how it relates to plausible serving sizes," he says.

Hynes also can see how aspiring cooks might appreciate a system that takes a restaurant item and tells them how to make it. "Even everyday people with dietary restrictions — gluten free, vegan, sparse pantry — would appreciate a tool that could minimally modify a complicated dish like Beef Wellington so that it fits the constraints."

And why stop there? These are MIT scientists, after all, collaborating with researchers from the Qatar Computing Research Institute and the Polytechnic University of Catalonia in Spain.

"In the far future, one might envision a robo-chef that fully understands food and does the cooking for you!" Hynes says.