A US$2 million project to be unveiled on Wednesday in the lunchroom of a Texas elementary school will use high-tech cameras to photograph what foods children pile onto their trays — and later capture what they don’t finish eating.
Digital imaging analysis of the snapshots will then calculate how many calories each student scarfed down. Local health officials said the program, funded by a US Department of Agriculture (USDA) grant, is the first of its kind in a US school, and will be so precise that the technology can identify a half-eaten pear left on a lunch tray.
Researchers hope parents will change eating habits at home once they see what their kids are choosing in schools. The data will also be used to study what foods children are likely to choose and how much of it they’re eating.
“This is very sophisticated,” said Roberto Trevino, director of the San Antonio-based Social & Health Research Center, which will oversee the program.
Parents will be required to give consent for their children to participate, and will receive regular reports showing what foods their kids are filling up on at lunch. Trevino said only the trays, and not students, will be photographed.
Here’s how it works: Students are assigned lunch trays with a unique bar code. After the children load up their plates down the line — mashed potatoes or green beans? French fries or fruit? — a camera above the cashier takes a picture of each tray.
When lunch is over and the kids return their plates to the kitchen, another camera takes a snapshot of what’s left on the tray. Software then analyzes the before and after photos to calculate calories consumed and, according to Trevino, generate a report of the nutrients in the foods.
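The before-and-after comparison described here can be sketched in a few lines. This is a hypothetical illustration only: it assumes an upstream image-analysis step has already identified each food on the tray and estimated what fraction of a standard portion is present in each photo, and the food names and calorie values are made up for the example.

```python
# Illustrative calorie values per standard serving (not from the program).
CALORIES_PER_PORTION = {
    "mashed potatoes": 210,
    "green beans": 45,
    "pear": 100,
}

def calories_consumed(before, after):
    """Estimate calories eaten from the portion fractions observed in
    the two tray photos (taken before and after lunch)."""
    total = 0.0
    for food, start_fraction in before.items():
        # If a food no longer appears in the after photo, it was fully eaten.
        end_fraction = after.get(food, 0.0)
        eaten = max(start_fraction - end_fraction, 0.0)
        total += eaten * CALORIES_PER_PORTION[food]
    return total

# A half-eaten pear left on the tray counts as half a pear's calories.
before = {"mashed potatoes": 1.0, "green beans": 1.0, "pear": 1.0}
after = {"pear": 0.5}
print(calories_consumed(before, after))  # 210 + 45 + 50 = 305.0
```

The hard part of the real system, of course, is the computer-vision step that this sketch takes for granted: recognizing foods and estimating portions from the tray photos.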
Five San Antonio elementary schools will take part in the program. Researchers selected poor, minority campuses where obesity rates and students at risk for diabetes are higher.
The grant from the USDA will fund the study for four years. Trevino said the coming school year will be very experimental, with programmers fine-tuning the cameras and imaging software to accurately identify what’s a pear and what’s an apple. He expects the “prototype” to be in place by the second year.