MXPA00008218A - Meat color imaging system for palatability and yield prediction - Google Patents

Meat color imaging system for palatability and yield prediction

Info

Publication number
MXPA00008218A
MXPA00008218A MXPA/A/2000/008218A
Authority
MX
Mexico
Prior art keywords
lean
video image
image data
meat
section
Prior art date
Application number
MXPA/A/2000/008218A
Other languages
Spanish (es)
Inventor
Keith E Belk
J Daryl Tatum
Gary C Smith
Original Assignee
Keith E Belk
Colorado State University Research Foundation
Gary C Smith
J Daryl Tatum
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Keith E Belk, Colorado State University Research Foundation, Gary C Smith, J Daryl Tatum
Publication of MXPA00008218A

Abstract

Herein is disclosed a video image analysis (VIA) system for scoring characteristics predictive of palatability and yield of a meat animal carcass or cut. The VIA system provides for a video camera, a data processing unit for processing video image data, and an output device for output of processed data to the user. Also disclosed is a method for using the VIA system in predicting palatability and yield of a meat animal carcass or cut.

Description

COLOR IMAGING SYSTEM FOR PREDICTING THE PALATABILITY AND YIELD OF MEAT. FIELD OF THE INVENTION The field of the present invention is the prediction of the palatability and yield of meat. More specifically, the present invention relates to predicting the palatability and yield of meat by the use of video image analysis (VIA) to determine the color parameters L* (psychometric lightness), a* (red vs. green), and b* (yellow vs. blue) of the lean and fat portions of a carcass or meat cut.
DESCRIPTION OF THE RELATED ART Meat consumers generally prefer, and are willing to pay for, greater tenderness in meat. It has been shown that a carcass's marbling score generally correlates with the palatability of the subsequently cooked meat across a wide range of marbling levels in beef, pork, and lamb. However, among carcasses with the same marbling level, there are substantial differences in palatability. Other carcass factors have been considered for predicting palatability, including maturity score, muscle pH, and muscle color; these factors may be more valuable in predicting the palatability of chicken, turkey, and fish. Among experts in carcass examination, for example meat scientists and graders from the U.S. Department of Agriculture (USDA), some of these factors can be scored and palatability can be predicted by assigning a USDA Quality Grade, given sufficient examination time. In practice, for example in the case of beef, USDA graders working in packing plants must commonly grade 250 to 450 carcasses per hour, which does not leave enough time for a complete examination of all the factors related to predicting palatability. The time pressure also makes the precise calculations required to assign Quality Grades difficult. In addition, USDA graders must calculate Yield Grades, which are intended to estimate the cutability and composition of a carcass. The factors used to determine Yield Grades include the hot carcass weight, the ribeye area (the cross-sectional area of the longissimus muscle at the 12th-13th rib interface), the estimated percentage of kidney, pelvic, and heart fat, and the actual and adjusted thickness of subcutaneous fat on the outside of the carcass.
The time constraints described above for the calculation of Quality Grades also apply to the calculation of Yield Grades. The parameters that serve as a basis for determining Quality Grades and Yield Grades are published by the USDA Agricultural Marketing Service, Livestock and Seed Division, for example, for beef, in the United States Standards for Grades of Carcass Beef. A device for scoring the factors that can predict the palatability of a meat carcass or cut, supplementing the examination of the carcass or cut by a USDA grader, would allow the palatability of the meat to be predicted more accurately and USDA Quality Grades to be assigned more accurately. This would result in greater consumer confidence in the Quality Grading system, as well as in any additional system for certifying compliance with product quality specifications, as might be desired in a "brand name" program. In any case, a more accurate classification of carcasses could be obtained for pricing the meat. This improved classification would provide economic benefit to all segments of the meat production system: restaurateurs, food service operators, and retailers; packers; feedlot operators; as well as ranchers and farmers raising beef and dairy cattle, pigs, lambs, chickens, turkeys, and different species of fish. This improved classification would also benefit scientists collecting data from carcasses and cuts for research, and prior owners of livestock in making genetic or management decisions.
Various attempts have been made to build such devices for use in the beef industry. One of these devices uses a "dual-scan" or "dual-component" image analysis system. Two cameras are used: a first camera on the slaughter floor scans a complete carcass, and a second camera captures the ribeye after the carcass has been chilled and ribbed. In using these systems, video data are recorded from the beef carcass and transferred to a computer. A program running on the computer determines the percentages of the carcass constituted by fat and lean from the recorded image and additional available data, for example the hot carcass weight. The amounts of cuts that can be derived from the carcass at different lean levels are then predicted. However, based on scientific evaluation, the system does not have the ability to predict the palatability of the observed carcass to augment the assignment of a USDA Quality Grade or for other purposes related to selecting carcasses on the basis of eating quality. One possible group of factors that can be examined to predict palatability is the color of muscle and fat. Wulf et al., J. Anim. Sci. (1997) 75, 684, generated both color scores in the L*a*b* color space of the fresh longissimus thoracis muscle at 27 h postmortem, and Warner-Bratzler shear force determinations of the aged and cooked longissimus lumborum muscle, from cattle carcasses derived from crosses between several breeds of Bos taurus (European genetics) and Bos indicus (heat-tolerant, tropically adapted genetics). Meat tenderness, measured by shear force, correlated with all three color measures, the highest correlation being observed with b* values. These results showed that muscle color can be used to predict the palatability of beef.
Therefore, it is desirable to have an apparatus for scoring the factors that can predict the palatability of a meat animal carcass. It is desirable that said apparatus collect and process the data and provide output of the results within the time limit in which a carcass is examined by a USDA grader under typical packing plant conditions, which is commonly 5-15 seconds. It is desirable that said apparatus return results for at least one of the following: the color of the lean tissue, the color of the fat tissue, the degree of marbling, the average number of marbling flecks per unit area and its variation, the average size of the marbling flecks and the variation in that average size, the average texture, and the firmness of the lean tissue. It is desirable that the apparatus use these measures to assign a grade or score to carcasses, so that carcasses can be classified into groups that reflect precise differences in the palatability of the cooked meat. It is also desirable to have an apparatus that measures the cross-sectional surface area of a cut and exposed muscle (e.g., the ribeye) for use in predicting the composition (fat, lean, bone) of a carcass or cut. It is desirable that the apparatus use this measure to assign a grade or score to carcasses, so that carcasses can be classified into groups that reflect precise differences in yield. It is desirable that the apparatus also measure the relative areas of the cross-sectional surface constituted by fat and/or bone. In addition, it is desirable to have an apparatus for measuring, predicting, and classifying carcasses based on both palatability and yield. Furthermore, it is desirable that said apparatus be portable, for example small and lightweight. It is desirable that the apparatus be able to withstand the packing plant environment, for example by being mountable in a protective housing.
The present invention relates to a method for predicting the palatability of meat comprising: providing video image data related to at least a portion of the meat; analyzing the video image data to distinguish at least one lean section of the meat from a section of the meat that is not lean; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating said characteristic with the palatability of the meat. The present invention also relates to an apparatus for predicting the palatability of meat comprising: a video camera adapted to provide video image data of at least a portion of the meat; a data processing unit adapted to execute program instructions; and a program storage device encoded with program instructions that, when executed, perform a method for predicting the palatability of meat, the method comprising: analyzing the video image data to distinguish at least one lean section of the meat from a section of the meat that is not lean; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating the characteristic with the palatability of the meat. A further aspect of the present invention provides an apparatus for predicting the palatability of meat, comprising: means for providing video image data of at least a portion of the meat; means for analyzing the video image data to distinguish at least one lean section of the meat from a section of the meat that is not lean; means for analyzing the video image data corresponding to the lean section; means for measuring a characteristic of the lean section based on the video image data; and means for correlating the characteristic with the palatability of the meat. Figure 1 shows a schematic view of an apparatus of the present invention. Figure 2 shows a flow chart of a method of the present invention.
Figure 3 shows a flow chart of a computer program that analyzes the video image data to distinguish at least one lean section of the meat from a non-lean section, analyzes the video image data corresponding to the lean section, and measures a characteristic of the lean section based on the video image data. The present invention provides a video image analysis (VIA) system for scoring factors that can predict the palatability of an animal carcass. The VIA system is preferably a color VIA system. As shown in Figure 1, the VIA system includes a video camera 12, preferably a 3-CCD color video camera, preferably mounted in a camera enclosure (not shown). The video camera 12 is optionally used with a lighting system 26, mounted either on the camera or on the camera enclosure rather than on the camera itself. The VIA system also includes a data processing unit 16; the data processing unit 16 is connected to a program storage device 20 via a program storage device interface 18, and to at least one output device 24 via an output device interface 22. The program storage device 20 contains the computer program or programs required to adequately process the video image data, preferably color video image data, by the data processing unit 16. The data processing unit 16 is linked to, and receives data from, the video camera 12 by either a transfer cable 14 or a wireless transmission device (not shown). The data processing unit 16 comprises a standard central processing unit (CPU), and preferably also a software module or hardware device for the conversion of analog data to digital data, and processes the digital video image data according to the instructions encoded by a computer program stored in the program storage device 20.
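As an illustration of the per-frame processing the data processing unit 16 performs, the following minimal Python/NumPy sketch segments lean tissue by a b* window and reports the mean lean color. The b* window and the returned feature set are illustrative assumptions, not values taken from the patent:

```python
import numpy as np

def analyze_frame(lab):
    """Score one frame given per-pixel (L*, a*, b*) values, shape (H, W, 3).

    The b* window used to pick out lean muscle is a placeholder; the
    patent describes tuning such tolerances per muscle, not these numbers.
    """
    b = lab[..., 2]
    lean_mask = (b > 5.0) & (b < 25.0)   # illustrative lean-tissue window
    lean = lab[lean_mask]                # (N, 3) array of lean pixels
    if lean.size == 0:
        return {"lean_area_px": 0}
    return {
        "lean_area_px": int(lean_mask.sum()),
        "mean_L": float(lean[:, 0].mean()),
        "mean_a": float(lean[:, 1].mean()),
        "mean_b": float(lean[:, 2].mean()),
    }
```

In a real system the mask would be refined with the L* and a* checks the description goes on to discuss; the point here is only the mask-then-measure structure.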
The video image data can be used in the subsequent calculation of the values of characteristics that can predict palatability, the characteristics including the color of the lean tissue, the color of the fat tissue, the degree of marbling, the average number of marbling flecks per unit area and its variation, the average size of the marbling flecks and the variation in that average size, the average texture of the marbling and lean tissue, as well as the firmness of the lean tissue. These values can then be used to classify the meat (which is herein defined as a meat animal carcass, side, or cut, or any portion of a carcass, side, or cut) into groups that vary in the predicted quality of the subsequently cooked food. The color parameters L*, a*, and b* can also be used to calculate the values of yield-predicting factors, such as the cross-sectional area of a muscle of interest and of surrounding tissues such as fat, bone, and connective tissue. These values can likewise be used to classify meat into groups that vary in predicted composition. The data processing unit 16 is linked to, and transmits data processing results to, at least one output device 24 via the output device interface 22. Optionally, the results of the data processing can also be transferred to a file in the program storage device 20 through the program storage device interface 18. An output device 24 can be a video screen, a printer, or another device. It is preferred that at least one output device 24 supply a physical or electronic label for marking the meat 10 with the results of the data processing, to facilitate the sorting of meat carcasses or cuts, or both, into groups with similar predicted palatability and/or yield. The present invention also provides a method for predicting the palatability of the meat 10 and determining the cross-sectional area of the meat 10.
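The grouping step described above can be sketched as a simple thresholding of a predicted palatability score; the cutoff values and group names below are placeholders, not values from the patent, and an actual system would calibrate them against cooked-meat sensory data:

```python
def classify_palatability(score, cutoffs=(60.0, 75.0)):
    """Sort a predicted palatability score into one of three groups.

    cutoffs = (low_cut, high_cut) are illustrative placeholders only.
    """
    low_cut, high_cut = cutoffs
    if score >= high_cut:
        return "high"
    if score >= low_cut:
        return "intermediate"
    return "low"
```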
Using the aforementioned VIA system, the video image data collected from the meat 10 are recorded by the video camera 12, processed by the data processing unit 16, and the palatability and/or cross-sectional area of the muscle are output by the output device 24 to augment the observations made by a USDA line grader, or another operator responsible for classifying or characterizing meat animal carcasses, in order to obtain a more accurate assignment of Quality Grades, Yield Grades, and/or other classifications or classification criteria based on said characteristics. An apparatus for use in the present invention comprises a video camera 12 and a data processing unit 16. The video camera 12 can be any camera known to those skilled in the art. It is important that the video camera 12 provide its output within the time limit assigned for the examination of a meat carcass, typically 5-15 seconds. Preferably the output is provided in real time. Said real-time output may use the same technology as the viewfinder of a known portable video camera, the same technology as a known portable digital video camera, a known computer-generated real-time display such as those known from various video conferencing applications, or another technology known to those skilled in the art. It is preferable that the video camera 12 be a color video camera, for the reasons discussed above. It is also preferred that the video camera 12 be small and light in weight, so that it provides the advantages of portability and placement flexibility, that is, that the user can adjust the angle of the camera to provide optimal collection of video image data from the meat 10. It is also preferred that the video camera 12 be durable, to better withstand the packing plant environment.
The power source of the video camera 12 can be either direct current, that is, a battery secured to electrical contacts from which the video camera 12 can draw power, or alternating current provided either from an electrical outlet or from the data processing unit 16. A lighting system 26 can optionally be used to illuminate the surface of the meat. The latter is desirable when ambient light is low or uneven, or when it is desired to examine regions of the meat 10 that are not illuminated by ambient light. Any known lighting system 26 can be used. The power source of the lighting system 26 can be either direct current, i.e., a battery, or alternating current supplied from any of the following: an electrical outlet, the video camera 12, or the data processing unit 16. It is preferred that the lighting system 26 be small and light in weight, for the reasons explained above with reference to the video camera 12. The lighting system 26 can be mounted on the camera, on the outside surface of a camera enclosure, or inside a camera enclosure; the camera enclosure is described in the following paragraph.
The video camera 12 and the optional lighting system 26 can be enclosed or not. Preferably, the video camera 12 is enclosed in a camera enclosure (not shown) for protection against the environment in packing and processing plants. It is important that the camera enclosure have a first aperture through which the lens of the video camera 12 can observe the meat 10. If an optional lighting system 26 is used, the lighting system 26 can be mounted either on the outer surface of the camera enclosure or inside the camera enclosure. If it is mounted inside the camera enclosure, the lighting system 26 may or may not be mounted on the camera. If the lighting system 26 is mounted in the camera enclosure, it is important that an opening be provided for illuminating the meat 10, either the first aperture used by the lens of the video camera 12 or a second aperture. In either case, the opening may or may not be enclosed by a panel of transparent material. If video image data are to be transferred from the video camera 12 to the data processing unit 16 by means of a transfer cable 14 connecting them, it is also important that the camera enclosure provide an opening for the cable to exit the enclosure. This opening can be the first aperture used by the lens of the video camera 12, the second aperture that can be used by the lighting system 26, or a third aperture. If the cable leaves the enclosure through the first or second opening, and that opening is enclosed by a panel of transparent material, it is important to provide a first cable passage opening in the panel for the cable to pass through. It is preferred that the camera enclosure be constructed of a lightweight material and that it be large enough to conveniently accommodate the video camera 12 and, optionally, the lighting system 26 described above.
If alternating current is to be used as the power source of the video camera 12, it is important that an opening be provided for the power cable to pass from the video camera 12 to the power source. Any of the first, second, or third openings can be used, or a fourth opening can be used. If the opening to be used is enclosed by a panel of transparent material, it is important to provide a second cable passage opening in the panel for the passage of the power cable. Alternatively, both the power cable and the data transfer cable can exit the camera enclosure through a single cable passage opening. Optionally, the camera enclosure can be designed with features that allow easier fastening and handling, for example handles, harness mounts, etc., and/or with features that allow it to be fixed in position without the use of such fastening and handling features, for example brackets for wall mounting, ceiling mounting, or tripod mounting, among other features. Optionally, the wall, ceiling, or tripod mount can be motorized to provide panning, with heads that adjust the camera angle and focal length. Preferably, the camera enclosure is designed to open easily to allow proper maintenance of the video camera 12 or the replacement of a battery, if direct current is used as the power source of the video camera 12. Maintenance of the lighting system 26 may also be required, and preferably in that case it is allowed by the same easy-opening design as described for the video camera 12. The easy-opening design may be effected by the use of screws, clamps, or other means widely known in the art. Ease of maintenance is desirable to minimize any downtime that may be encountered. After the video image data are captured by the video camera 12, they are transferred in real time to the data processing unit 16.
The data can be transferred by a transfer cable 14 or by a wireless data transmission device (not shown). In most situations, the transfer cable 14 is the preferred means of transmission on the basis of lower cost and lower susceptibility to interference. In situations where the video camera 12 and the data processing unit 16 are widely separated, a wireless data transmission device can be a more practical means of transmission. Any data transfer technique known to those skilled in the art can be used. The video image data can be sent from the video camera 12 to the data processing unit 16 as either analog or digital data. If they are sent as analog data, it is important to convert the analog data to digital data before processing, by sending said data to a hardware device (not shown) or a software module with the ability to convert said data. Said software module can be called a "frame grabber." If the video image data are sent as digital data, no conversion is required before processing the data. For the purposes of the present invention, a "data processing unit" is defined to include, but not be limited to, desktop computers, laptop computers, handheld computers, and dedicated electronic devices. Any data processing unit known in the art can be used in the present invention. In one embodiment of the present invention, the data processing unit 16 may be small and light in weight so as to be portable. In a second embodiment of the present invention, the data processing unit 16 may be a microcomputer, minicomputer, or mainframe that is not portable. The present invention is not limited to any specific data processing unit, computer, or operating system. An exemplary embodiment, but not one to be considered limiting, is a PC-compatible computer running an operating system such as DOS, Windows, or UNIX.
The choice of the hardware device or software module for conversion of analog data to digital data for use in the present invention depends on the video camera 12, the data processing unit 16, and the operating system used, but given these constraints, the person skilled in the art will easily make such a choice. It is also preferred that the data processing unit 16 include a software module that converts RGB color data to L*a*b* color data. An exemplary software module is provided in Hunter Color Vision Systems (Hunter Associates Laboratory, Inc.). In addition to a cable port or wireless data transmission device for receiving data from the video camera 12, it is also preferred that the data processing unit 16 include other input devices, for example a keyboard, a mouse or trackball, a light pen, a touch screen, a stylus, etc., to allow convenient operation of user options for the camera and for software operation, data processing, data storage, program output, and the like. Several software programs are important for the data processing unit 16 to store in a program storage device 20 (examples of program storage devices include a hard disk, a floppy disk drive, a tape drive, a ROM, and a CD-ROM, among others), to access from the program storage device 20 via the program storage device interface 18, and to execute. It is important that the data processing unit 16 have an operating system, and any software modules needed to adequately control and retrieve data from the video camera 12 and send output to at least one of the output devices 24. It is important that the data processing unit 16 execute a program or programs that can process the received video image data, calculate various muscle imaging parameters from the received video image data, and provide the output of the calculation results to an output device 24.
Exemplary code for such a program or programs is provided in the appendix hereto. An exemplary flow chart for said program or programs is provided in Figure 3. The video image data can be analyzed for color scale parameters. If conformance to an international standard is desired, the video image data can be analyzed for the color scale parameters L*, a*, and b*, as defined by the International Commission on Illumination (Commission Internationale de l'Eclairage, CIE). A set of L*a*b* parameters is recorded for each image. L*, a*, and b* are dimensions of a three-dimensional color space that are normalized to reflect how humans perceive color. The L* dimension corresponds to lightness (a value of 0 being black, a value of 100 being white). The a* dimension corresponds to the relative levels of green and red (a negative value being green, a positive value being red), and the b* dimension corresponds to the relative levels of blue and yellow (a negative value being blue, a positive value being yellow). In a preferred embodiment, the system can capture pixellated video images of areas from 7.74 to 278.72 square centimeters of the muscle of interest, comprising up to 350,000 pixels per measurement, and determine L*, a*, and b* for each pixel. In all embodiments, it is desirable that the determination of L*a*b* be carried out using the Hunter Associates software conversion module. Once the L*a*b* value has been determined, at least one of the components L*, a*, and b* can be used in the subsequent data processing. After the determination of L*, a*, and b* for each pixel, a program calculates various parameters of the image for each frame. First, the program delineates the muscle of interest by choosing areas having b* values within tolerances compatible with the muscle.
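The RGB-to-L*a*b* conversion mentioned above is performed in the patent by a Hunter Associates software module; as a generic stand-in, the standard sRGB (D65) formulas can be sketched as follows. This is not the patent's own conversion code, only the widely published CIE pipeline (sRGB gamma removal, linear transform to XYZ, then the L*a*b* cube-root functions):

```python
import numpy as np

# sRGB linear-RGB -> XYZ matrix and the D65 reference white.
M = np.array([[0.4124, 0.3576, 0.1805],
              [0.2126, 0.7152, 0.0722],
              [0.0193, 0.1192, 0.9505]])
WHITE = np.array([0.95047, 1.0, 1.08883])  # D65 white point (Xn, Yn, Zn)

def rgb_to_lab(rgb):
    """Convert 8-bit sRGB values, shape (..., 3), to (L*, a*, b*)."""
    c = np.asarray(rgb, dtype=float) / 255.0
    # Undo the sRGB gamma (piecewise transfer function).
    lin = np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)
    xyz = lin @ M.T / WHITE                      # white-relative XYZ
    eps = (6.0 / 29.0) ** 3
    f = np.where(xyz > eps,
                 np.cbrt(xyz),
                 xyz / (3.0 * (6.0 / 29.0) ** 2) + 4.0 / 29.0)
    L = 116.0 * f[..., 1] - 16.0
    a = 500.0 * (f[..., 0] - f[..., 1])
    b = 200.0 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)
```

For example, `rgb_to_lab([255, 255, 255])` yields L* near 100 with a* and b* near 0, and a strongly red pixel yields a large positive a*, matching the axis definitions in the text.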
A classification of at least one area of the image into one of two classes, such as muscle and non-muscle, can be referred to as a "binary mask." Areas with b* values compatible with the muscle of interest are then examined for their L* and a* values, in order to verify them and to reject surrounding tissues that invade the profile of the muscle of interest. Further examination need not be performed on areas with L*, a*, and b* values suggesting bone, connective tissue, or fat. The surface area of the cross-section of the muscle of interest is then determined. Within the portion of the image derived from the muscle of interest, the lean tissue and the intramuscular fat tissue can be distinguished, and initial L*, a*, and b* values can be determined for the lean muscle tissue. These results can be sent to the output device 24 to be displayed in numerical format and/or retained for calculating quality and yield and the characteristics that determine them, as described below. It is known that higher b* values for lean muscle tissue correlate with greater tenderness in meat (Wulf et al., 1997). In addition, the color of the intramuscular fat can also be determined. Furthermore, within the portion of the image derived from the muscle of interest, determinations can be made of the quantity, distribution, dispersion, texture, and firmness of the marbling (intramuscular fat deposited within the muscle). The amount of marbling can be determined by calculating the percentage of the surface area of the muscle with L*, a*, and b* values compatible with fat tissue. In addition to calculating the amount of marbling present, the distribution and dispersion of said marbling can also be determined. First, the portion of the image derived from the muscle of interest can be divided into subcells of equal size; a size of 68 x 48 pixels can be used.
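The percentage-of-fat calculation just described can be sketched as a mask operation over the muscle region. The L* threshold separating fat from lean below is an illustrative assumption (fat is much lighter, i.e. higher in L*, than lean tissue); the patent leaves the exact tolerances to calibration:

```python
import numpy as np

def marbling_percent(lab, muscle_mask, fat_L_min=70.0):
    """Percent of the muscle cross-section whose pixels read as fat.

    lab         : (H, W, 3) array of per-pixel (L*, a*, b*) values
    muscle_mask : (H, W) boolean binary mask of the muscle of interest
    fat_L_min   : placeholder lightness threshold for intramuscular fat
    """
    fat = muscle_mask & (lab[..., 0] >= fat_L_min)
    total = int(muscle_mask.sum())
    return 100.0 * int(fat.sum()) / total if total else 0.0
```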
Within each subcell, the number of marbling flecks can be determined as the number of discrete regions with L*, a*, and b* values corresponding to fat, and the average number of marbling flecks per subcell can be calculated. The variation in fleck number across all subcells can also be calculated. In addition, the average fleck size across the muscle of interest can be determined from the number of pixels within each discrete region with L*, a*, and b* values corresponding to fat. The variation in size across all the marbling flecks can also be calculated. The texture and fineness of the marbling can also be measured. It is well known that, in general, greater amounts of marbling that is more uniformly distributed and finely textured reflect a higher marbling score and therefore a higher eating quality of the meat. In addition, the program can use L*, a*, and b* to calculate the average texture, that is, the surface coarseness, of sections of the muscle, and also the firmness of the lean tissue of the muscle cross-section. It is well known that the surface coarseness of a muscle is inversely correlated with tenderness, and that greater firmness correlates with flavor. To summarize, the characteristics of the lean section of the meat 10 that can be measured include, but are not limited to, lean tissue color, fat tissue color, an amount of marbling, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average lean tissue texture, a lean tissue firmness, and a surface area of the lean section. Quantities of the non-lean section of the meat 10, including but not limited to the color of the fat and the relative portions of the cross-sectional area constituted by fat, bone, and/or connective tissue, can also be calculated.
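The subcell bookkeeping described above can be sketched with a small connected-component pass, written out by hand here to stay dependency-free. The default subcell size follows the 68 x 48 pixel figure mentioned in the text; how flecks spanning a subcell boundary are counted is an implementation choice left open by the description:

```python
import numpy as np

def label_regions(mask):
    """4-connected component labeling of a boolean mask (no SciPy).
    Returns (label array, number of regions found)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    H, W = mask.shape
    for i in range(H):
        for j in range(W):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                labels[i, j] = current
                while stack:                     # flood fill one fleck
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < H and 0 <= nx < W
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
    return labels, current

def marbling_stats(fat_mask, cell=(48, 68)):
    """Mean/variance of fleck count per subcell and mean fleck size (px)."""
    labels, n = label_regions(fat_mask)
    sizes = [int(np.sum(labels == k)) for k in range(1, n + 1)]
    H, W = fat_mask.shape
    counts = []
    for y0 in range(0, H, cell[0]):
        for x0 in range(0, W, cell[1]):
            sub = labels[y0:y0 + cell[0], x0:x0 + cell[1]]
            counts.append(len(np.unique(sub[sub > 0])))
    return {
        "flecks": n,
        "mean_flecks_per_subcell": float(np.mean(counts)) if counts else 0.0,
        "var_flecks_per_subcell": float(np.var(counts)) if counts else 0.0,
        "mean_fleck_size_px": float(np.mean(sizes)) if sizes else 0.0,
    }
```

The pixel-by-pixel Python loop is for clarity only; a production system would use a vectorized or hardware-assisted labeling routine to meet the 5-15 second budget.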
Other characteristics that those skilled in the art of meat science will readily appreciate as calculable from the L*, a*, and b* values and as predictive of palatability can be calculated by the program, and any such characteristics are considered within the scope of the present invention. Once the various parameters described above have been calculated, the program can output to the output device 24 the calculated values of any or all of the characteristics listed above: lean tissue color, fat tissue color, degree of marbling, average number of marbling flecks per unit area, variation in fleck number per unit area, average fleck size, variation in average fleck size, marbling texture and fineness, average lean tissue texture, and lean tissue firmness. Preferably, if the calculated values of the characteristics are output, they are displayed as alphanumeric characters that the operator can conveniently read. Alternatively, or in addition, to outputting the values of the characteristics to an output device 24, further calculations can be performed using at least one of the values, and optionally parameter values entered by the operator, to derive estimated Quality Grades or other general palatability indexes for the cooked meat, which may also be output. Additionally, because the specific muscle of interest has been isolated in a cross-sectional image, and the geometry and distance of the apparatus relative to the meat 10 can be known, the cross-sectional surface area of the muscle portion of the meat 10 can be calculated and output to the output device 24.
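As an illustration of the area and yield calculations, the sketch below converts a muscle-mask pixel count to square centimeters using a camera calibration factor (an assumed, known quantity fixed by the camera geometry and working distance), and evaluates the commonly published USDA beef Yield Grade equation. The equation is reproduced from memory and should be verified against the current USDA standard before use:

```python
def cross_section_area_cm2(muscle_pixels, pixels_per_cm):
    """Convert a pixel count to cm^2 given a spatial calibration
    (pixels per cm at the meat surface)."""
    return muscle_pixels / (pixels_per_cm ** 2)

def usda_yield_grade(adj_fat_thickness_in, kph_pct,
                     hot_carcass_wt_lb, ribeye_area_in2):
    """USDA beef Yield Grade equation, as commonly published:
    YG = 2.50 + 2.50*fat(in) + 0.20*KPH% + 0.0038*HCW(lb) - 0.32*REA(in^2)."""
    return (2.50
            + 2.50 * adj_fat_thickness_in
            + 0.20 * kph_pct
            + 0.0038 * hot_carcass_wt_lb
            - 0.32 * ribeye_area_in2)
```

For example, a carcass with 0.5 in adjusted fat thickness, 2.5% KPH fat, a 750 lb hot carcass weight, and a 13.0 sq in ribeye computes to a Yield Grade near 2.9 under this equation.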
Alternatively, or in addition to outputting the area result to an output device 24, additional calculations can be made using the cross-sectional area of the muscle, other parameters readily recognized by those skilled in the art of meat science as computable from the L*, a* and b* data, and/or parameter values entered by the operator, to derive estimated Yield Grades or other general composition ratios of the meat 10. The results reported by the program can be output to any output device 24, such as a screen, printer, speaker, etc. If operator evaluation of the results is desired, the results can preferably be displayed on a screen. Preferably, the screen is easily visible to the examiner, evaluator, grader or operator at his or her station. Alternatively, or additionally, it is preferable that the results be printed or output in such a way that they can be transferred to and affixed to the meat 10. The results may be output as text, symbols or icons readable by personnel either at the packing plant or at later points in the meat production system. Alternatively, the results may be output as a bar code or other object that can be read by appropriate equipment and decoded into a form readable by personnel at various points within the production system. The output information can be affixed to the meat 10 by methods well known in the art, including, but not limited to, clips, tags and adhesives. The power source of the data processing unit 16 may be either direct current, i.e. a battery, or alternating current drawn from an electrical outlet.
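Because the camera-to-carcass geometry is fixed and known, a pixel count for the isolated muscle region converts directly to a physical cross-sectional area. A minimal sketch, in which the pixel count and the millimeters-per-pixel scale factor are hypothetical example numbers:

```python
def cross_section_area_cm2(muscle_pixel_count, mm_per_pixel):
    """Convert a pixel count for the isolated muscle region into a
    physical cross-sectional area, given the image scale implied by
    the fixed camera-to-meat geometry."""
    area_mm2 = muscle_pixel_count * (mm_per_pixel ** 2)  # each pixel covers mm_per_pixel^2
    return area_mm2 / 100.0  # 100 mm^2 = 1 cm^2

# Hypothetical numbers: 90,000 muscle pixels imaged at 0.3 mm/pixel
print(round(cross_section_area_cm2(90_000, 0.3), 1))  # 81.0 cm^2
```

The resulting ribeye-area figure is one of the inputs used to derive the estimated Yield Grade mentioned above.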
In the embodiment wherein the data processing unit 16 is intended for use in the present apparatus, the data processing unit 16 may be mounted in a data processing unit enclosure or in the camera enclosure, or may be left unenclosed. In the embodiment in which the data processing unit 16 is a microcomputer, minicomputer, or mainframe computing resource already present in the plant or facility where the device is used, an enclosure is not required. In the embodiment wherein the data processing unit 16 is a stand-alone portable unit, the data processing unit 16 is preferably mounted in a data processing unit enclosure.
It is important that the data processing unit enclosure have an opening or openings for the transmission of data or for the display of data by the output device 24. For example, if display is to be performed using a video screen integral with the data processing unit 16, the enclosure must provide an opening for viewing the video screen therethrough. Said opening may be left open or may be covered by a panel of transparent material, such as glass, plastic, etc. If display is to be carried out by an external device, for example a remote monitor or printer, the enclosure must have an opening for the passage of the output cable therethrough. If the data processing unit 16 is powered by alternating current, the enclosure must provide an opening for the passage of a power cable therethrough. If it is desired to store the output information on an internal floppy disk drive, the enclosure must provide an opening for the insertion and removal of floppy disks into and from the internal floppy disk drive. If it is desired to store the output information on an external program storage device 20, the enclosure must provide an opening for the passage of a data transfer cable therethrough. Preferably, if the data processing unit 16 is a stand-alone unit, the data processing unit enclosure is only large enough for the data processing unit 16 to fit conveniently, and is light in weight. Optionally, the enclosure can be designed with features, for example handles, that allow the user to manipulate it more easily.
In this embodiment, it is also preferred that the data processing unit enclosure be easy to open to allow proper maintenance of the data processing unit 16. The easy-open design can be effected by the means described for the camera enclosure above. The apparatus described above can be used in methods for predicting the palatability and/or yield of carcasses or cuts of meat animals, for augmenting the assignment of USDA grades to them, or for classifying them for other purposes (e.g., brand names, product lines, etc.). The first step involves collecting video image data of the meat 10 using the video camera 12. The second step involves processing the video image data using the data processing unit 16. The third step involves using the results of the processing step to report quality-determining characteristics that can assist USDA graders in assigning USDA Quality Grades, to report cross-sectional muscle surface areas that can assist USDA graders in assigning USDA Yield Grades, and/or to classify the meat 10 based on specific requirements of, for example, a brand-name or product-line program. Using this method, the limited time available to the grader or operator to analyze the meat 10 can be focused on the parameters that are readily examined by a person, providing the grader or operator with more data for each meat sample 10 in the same period and allowing a more accurate prediction of palatability and a more accurate assignment of Quality Grade and Yield Grade than is otherwise possible. In addition, this method allows calculations to be made more quickly and accurately than those currently carried out. The following example is included to demonstrate a preferred embodiment of the invention.
Those skilled in the art will appreciate that the techniques described in the following examples represent techniques discovered by the inventors to function adequately in the practice of the invention, and therefore can be considered to constitute preferred modes for its practice. However, those skilled in the art should, in light of the present disclosure, appreciate that various changes can be made in the specific embodiments described and still obtain the same or a similar result without departing from the spirit and scope of the invention.
EXAMPLE 1 Screening beef carcasses with very low probability of tenderness problems. A population of 324 beef carcasses was examined in an effort to separate a subpopulation of carcasses with a very low probability (≤ 0.0003) of having ribeye shear force values of 4.5 kg or greater and of subsequently yielding unacceptably tough cuts. Of the 324 carcasses, 200 were certified as complying with the foregoing tenderness standard. Of the 324 carcasses, 17 were first selected for the tenderness-characterized subpopulation based on marbling scores determined by an expert (USDA scientist or supervisor) to be Modest, Moderate or Slightly Abundant, the three highest marbling grades in the United States Standards for carcass grades. In a second preselection stage, 41 of the remaining 307 carcasses were preselected based on L*, a* and b* color. These carcasses showed a second lean-color principal component of the L*, a* and b* values of less than -0.70. Such reduced values of this combined variable have been observed to consistently indicate sufficient tenderness of the subsequently cooked lean meat. Third, 19 of the remaining 266 carcasses were preselected based on marbling distribution. The marbling distribution was determined, and the variation in marbling distribution was calculated, by the apparatus of the present invention. A marbling distribution variation of less than 1.1 has been observed to consistently indicate sufficient tenderness of the subsequently cooked lean meat (i.e., a shear force value of less than 4.5 kg). In the final stage, tenderness values were predicted for each of the 247 remaining carcasses using a multiple regression equation incorporating CIE a* values for the lean and fat portions, as well as the squared percentage of machine-measured marbling. The multiple regression equation predicted that 123 of the 247 carcasses had a probability of only 0.0003 of failing to be tender.
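The final screening stage can be sketched as a linear predictor followed by a threshold test. The regression coefficients below are hypothetical placeholders, since the patent does not publish the fitted equation; only the structure (lean a*, fat a*, and squared marbling percentage as predictors, with a 4.5 kg shear cutoff) comes from the text.

```python
def predicted_shear_kg(lean_a, fat_a, marbling_pct, coef=None):
    """Hypothetical linear predictor of shear force in kg.
    coef = (intercept, b_lean_a, b_fat_a, b_marbling_sq); placeholder values."""
    b0, b1, b2, b3 = coef or (6.0, -0.08, -0.02, -0.005)
    return b0 + b1 * lean_a + b2 * fat_a + b3 * marbling_pct ** 2

def certify_tender(carcasses, threshold_kg=4.5):
    """Keep only carcasses whose predicted shear force is below the cutoff."""
    return [c for c in carcasses if predicted_shear_kg(*c) < threshold_kg]

# (lean a*, fat a*, marbling %) for three hypothetical carcasses
sample = [(28.0, 14.0, 6.0), (10.0, 8.0, 1.0), (25.0, 12.0, 4.5)]
print(len(certify_tender(sample)))  # 2 of the 3 pass under these coefficients
```

With fitted coefficients, the same structure yields the 123-of-247 separation reported below the regression step.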
These 123 carcasses were then separated and, together with the 77 that had been preselected, were certified as tender. The remaining carcasses had a probability of 0.117 of having shear force values above 4.5 kg. The results indicate that the system has the ability to separate groups of beef carcasses that have a very low probability of unacceptable toughness. Both the apparatus and the method described and claimed herein can be made and executed without undue experimentation in light of the present disclosure. Although the apparatus and method of this invention have been described in terms of preferred embodiments, it will be apparent to those skilled in the art that variations may be applied to the apparatus and method, and to the sequence of method steps described herein, without departing from the concept, spirit and scope of the invention. All such variations apparent to those skilled in the art are deemed to be within the spirit, scope and concept of the invention as defined by the appended claims.

Claims (31)

NOVELTY OF THE INVENTION CLAIMS
1. - A method for predicting the palatability of meat, comprising: providing video image data relating to at least a portion of the meat; analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating the characteristic with the palatability of the meat.
2. The method according to claim 1, further characterized in that providing video image data includes providing color video image data.
3. The method according to claim 2, further characterized in that analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat includes comparing the color of a first portion of the video image data to the color of a second portion of the video image data.
4. - The method according to claim 3, further characterized in that the comparing includes calculating at least one of the color components L*, a* and b* of the video image data.
5. The method according to claim 4, further characterized in that the video image data includes a plurality of pixels and the calculating includes calculating at least one of the color components L*, a* and b* for each pixel.
6. The method according to claim 1, further characterized in that providing the video image data includes photographing at least a portion of the meat.
7. The method according to claim 1, further characterized in that providing the video image data includes illuminating at least a portion of the meat.
8. The method according to claim 1, further characterized in that the lean section includes lean tissue and fat tissue, and measuring the characteristic of the lean section includes distinguishing the lean tissue from the fat tissue.
9. The method according to claim 8, further characterized in that the video image data includes color video image data, and distinguishing the lean tissue from the fat tissue includes comparing the color of a first portion of the video image data with the color of a second portion of the video image data.
10. - The method according to claim 9, further characterized in that the comparing includes calculating at least one of the color components L*, a* and b* of the video image data.
11. The method according to claim 10, further characterized in that the video image data includes a plurality of pixels and the calculating includes calculating at least one of the color components L*, a* and b* for each pixel.
12. The method according to claim 9, further characterized in that measuring a characteristic of the lean section includes measuring at least one of: a color of the lean tissue, a color of the fat tissue, an amount of marbling, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average texture of the lean tissue, a firmness of the lean tissue, a surface area of the lean section, and characteristics of the non-lean section.
13. The method according to claim 1, further characterized in that analyzing the video image data to distinguish the lean section from the non-lean section includes distinguishing the lean portion from at least one of a fat portion, a bone portion, and a connective tissue portion.
14. The method according to claim 1, further comprising determining a Quality Grade for the meat based on the characteristic.
15. - The method according to claim 1, further comprising determining a Yield Grade for the meat based on said characteristic.
16. An apparatus for predicting the palatability of meat, comprising: a video camera adapted to provide video image data of at least a portion of the meat; a data processing unit adapted to execute program instructions; and a program storage device encoded with program instructions that, when executed, perform a method for predicting the palatability of meat, the method comprising: analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating the characteristic with the palatability of the meat.
17. The apparatus according to claim 16, further characterized in that the video image data comprises analog data, and the method further comprises converting the analog data to digital data.
18. The apparatus according to claim 16, further characterized in that providing the video image data in the method includes providing color video image data.
19. - The apparatus according to claim 18, further characterized in that analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat in the method includes comparing the color of a first portion of the video image data to the color of a second portion of the video image data.
20. The apparatus according to claim 19, further characterized in that the comparing in the method includes calculating at least one of the color components L*, a* and b* of the video image data.
21. The apparatus according to claim 20, further characterized in that the video image data includes a plurality of pixels and the calculating in the method includes calculating at least one of the color components L*, a* and b* for each pixel.
22. The apparatus according to claim 16, further comprising a lighting system adapted to illuminate at least a portion of the meat.
23. The apparatus according to claim 16, further characterized in that the lean section includes lean tissue and fat tissue, and measuring the characteristic of the lean section in the method includes distinguishing the lean tissue from the fat tissue.
24. The apparatus according to claim 23, further characterized in that the video image data includes color video image data, and distinguishing the lean tissue from the fat tissue in the method includes comparing the color of a first portion of the video image data with the color of a second portion of the video image data.
25. The apparatus according to claim 24, further characterized in that the comparing in the method includes calculating at least one of the color components L*, a* and b* of the video image data.
26. The apparatus according to claim 25, further characterized in that the video image data includes a plurality of pixels and the calculating in the method includes calculating at least one of the color components L*, a* and b* for each pixel.
27. The apparatus according to claim 19, further characterized in that measuring a characteristic of the lean section includes measuring at least one of: a color of the lean tissue, a color of the fat tissue, an amount of marbling, a marbling distribution, a marbling dispersion, a marbling texture, a marbling fineness, an average texture of the lean tissue, a firmness of the lean tissue, a surface area of the lean section, and characteristics of the non-lean section.
28. The apparatus according to claim 16, further characterized in that analyzing the video image data to distinguish the lean section of the meat from the non-lean section of the meat includes distinguishing the lean portion from at least one of a fat portion, a bone portion, and a connective tissue portion.
29. - The apparatus according to claim 16, wherein the method further comprises determining a Quality Grade for the meat based on the characteristic.
30. The apparatus according to claim 16, wherein the method further comprises determining a Yield Grade for the meat based on said characteristic.
31. An apparatus for predicting the palatability of meat, comprising: a video camera adapted to provide video image data of at least a portion of the meat; a data processing unit adapted to execute program instructions; and a program storage device encoded with program instructions that, when executed, perform a method for predicting the palatability of meat, the method comprising: analyzing the video image data to distinguish at least one lean section of the meat from a non-lean section of the meat; analyzing the video image data corresponding to the lean section; measuring a characteristic of the lean section based on the video image data; and correlating the characteristic with the palatability of the meat.
MXPA/A/2000/008218A 1998-02-20 2000-08-21 Meat color imaging system for palatability and yield prediction MXPA00008218A (en)

Applications Claiming Priority (1)

Application Number US60/075,517, priority date 1998-02-20

Publications (1)

Publication Number MXPA00008218A, published 2001-07-09
