From: Food Quality & Safety magazine, June/July 2005

Machine Vision Sees Contaminants We Can’t

Robotic cameras may one day stand between us and the danger of drinking fresh, unpasteurized juices or eating contaminated meat.

by Don Comis

Scientists at the ARS Instrumentation and Sensing Laboratory in Beltsville, Md., are developing "machine-vision" systems that can detect contamination the human eye often can't see. These systems are quicker and more accurate than the human eye and don't require anyone to handle the fruit or cut it up.

Yud-Ren Chen, an agricultural engineer, and his colleagues Kuanglin Chao, an agricultural engineer; Moon Kim, a biophysicist; and Alan Lefcourt, a biomedical engineer, have built a prototype "multispectral imaging" apple-inspection system.

Chen leads a team that specializes in developing machine-vision systems using visible and near-infrared light. A mechanical engineer, an electrical engineer, a computer scientist and a USDA Food Safety Inspection Service (FSIS) industrial engineer are also in the group.

The team has tested machine vision on a commercial apple-sorting line, using a digital spectral camera that can take pictures at different wavelengths simultaneously, creating multiple images. This once required two or more cameras, each with its own light filter. Using a hyperspectral imager, the team can find the wavelengths best suited to spotting fecal contamination or cuts and bruises that can harbor bacteria. Some wavelengths are chosen because of their identifiable relationships to photosynthetic pigments in apples.

Hyperspectral Imaging

Biophysicist Kim came to ARS in 1999 from NASA, where he used reflectance and fluorescence for sensing vegetation remotely from airplanes to check on the planet's environmental health.

Now, to detect fecal contamination, he senses photosynthetic pigments from plants, but on a much smaller scale, working barely 2 feet from his targets rather than several thousand.

Kim and Lefcourt upgraded and modernized the lab's existing hyperspectral imaging equipment. The lab uses the hyperspectral system to design commercial inspection systems for poultry and produce.

The instrument, designed and hand-built by the Beltsville team using commercially available components, is called hyperspectral rather than multispectral because it can capture images at up to 256 different wavelengths; a multispectral system generally uses only two to four wavelengths.

"In the research stage, we use over 100 images at many different wavelengths," Kim says. "But it takes several minutes to scan objects at that many wavelengths. So hyperspectral imaging wouldn't be practical for commercial operations. But it is valuable because it lets us visualize images across a range of the spectrum. We can then choose a few optimal spectral bands that will get the job done with enough speed and accuracy when used in multispectral imaging systems."

A multispectral imaging system can scan a whole object in a fraction of a second and is more suitable for real-time use in processing plants, Kim says. The hyperspectral imaging system has a "scientific-grade" imaging spectrograph and halogen and fluorescent lamps, all packaged in one unit that sits above a motorized positioning table where the apple is placed. The imaging spectrograph is connected to a computer. For reflectance sensing, visible to near-infrared light comes from quartz halogen bulbs connected to the unit through fiber-optic lines, while fluorescence imaging uses fluorescent lamps. ARS-developed computer software analyzes the hyperspectral images.

Each Apple Scanned a Hundred Times

The imaging spectrograph scans a moving apple hundreds of times, each time sensing a line across the apple's surface. The light from each point on the line is spread out like a rainbow by the spectrograph; stacking the successive line scans creates a three-dimensional image.
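As a rough sketch of that process, the code below stacks simulated line scans into a three-dimensional "hypercube" with two spatial axes and one wavelength axis; the array sizes and the acquisition function are illustrative stand-ins, not the lab's actual instrument interface.

import numpy as np

def acquire_line_scan(pixels=300, bands=120):
    # Stand-in for one spectrograph scan: each pixel on the line is dispersed
    # into a full spectrum, giving a 2-D (pixels, bands) array.
    return np.random.rand(pixels, bands)

def scan_apple(n_lines=400):
    # Stack successive line scans of the moving apple into a 3-D hypercube
    # indexed by (scan line, pixel along line, wavelength).
    return np.stack([acquire_line_scan() for _ in range(n_lines)], axis=0)

cube = scan_apple()
print(cube.shape)  # (400, 300, 120): spatial x spatial x spectral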

The positioning table lets the researchers run hundreds of scans of the apple surface, placing the apple in many different positions, while recording the exact position of the apple so a scan can be repeated later. Mathematical algorithms interpret the multiple images.

"The hyperspectral imaging system is versatile and has many research applications besides food safety," Kim says. Chen agrees that the lab's hyperspectral imaging equipment can be used in many disciplines and with a variety of agricultural products.

For example, Stephen Delwiche, an agricultural engineer on the team, uses the equipment to test for fungal contamination of wheat kernels, and Lefcourt says a color change in leaves can signal serious nutrient deficiency.

"Machine vision can spot the problem when it's still minor and is causing slight color changes not visible to the human eye," he says. "There's no need to destroy the leaf to diagnose the condition.

The machine, he says, can find common patterns in wholesome agricultural objects, so that any anomalies, such as diseases, defects, or contamination, stand out. Similar machine-vision technology can be applied to detect tumors in chickens, fecal contamination, bruises on apples, or a fungus on a kernel of grain.

"Since natural objects are not uniform, we can't compare one spot on an object to another spot, but we can find common features among objects in the same class," he adds. "We take pictures of whole objects with spectral signatures at each spot on these objects to detect anomalies and then figure out what the anomalies are."

The Time is Right

The lab has a cooperative research and development agreement (CRADA) with Stork Gamco, Inc., of Gainesville, Ga., one of the largest manufacturers of chicken-processing plant equipment in the world, to commercialize the system and move it into use among the nation's 300-plus poultry processing plants.

Chao says that Stork Gamco will soon test the system in a chicken-processing plant under the most demanding situations: lines that move 140 birds a minute. Chen says the system can handle up to 180 birds a minute.

The new system, he explains, will be contained in a box hung over the beginning of the processing line, right after the point where chickens are killed and de-feathered. Its camera will send spectral images to a computer set up in another room.

Chao, along with Sukwon Kang, an agricultural engineer, updated the machine-vision system to its present user-friendly form, ready to leave the research bench for commercial development. The two redesigned the system to use the new camera instead of multiple cameras that required additional mathematical adjustments to join separate images. They also moved the software from DOS to Windows, where users can easily navigate by clicking on graphic images.

"We recognize that the users, in this case the chicken processing plant employees, must be considered at every design stage," Chao says.

The new system is ready to market at just the right time, when everything is in place for its success. FSIS is looking at machine vision as a way to help implement its Hazard Analysis and Critical Control Points (HACCP) system, which shifts more inspection responsibility to the processing plant.

"This would free up inspectors so that they have time to take a close, careful look at the birds the machine-vision system's judges suspect," Chao says.

Also, the processing industry is moving to high-speed lines in response to rising demand for poultry. The industry wants the highest feasible speeds for maximum efficiency, and it sees machine vision as the way to make that possible while also improving inspection efficacy. Chao says that the high-speed lines separate into two or three more lines after the birds are killed, and more inspectors are added to meet USDA's requirement of a maximum speed of 35 birds a minute for each inspector; a 140-bird-per-minute line, for example, needs at least four inspectors to stay within that limit.

Don Comis is a member of the USDA's Agricultural Research Service Information staff. Reach him at comis@ars.usda.gov.
