WO2006064635A1 - Diagnostic system - Google Patents

Diagnostic system

Info

Publication number
WO2006064635A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
subject
unit
data
light
Prior art date
Application number
PCT/JP2005/021348
Other languages
English (en)
Japanese (ja)
Inventor
Shin-Ichiroh Kitoh
Yukio Yoshida
Po-Chieh Hung
Original Assignee
Konica Minolta Holdings, Inc.
Priority date
Filing date
Publication date
Application filed by Konica Minolta Holdings, Inc. filed Critical Konica Minolta Holdings, Inc.
Priority to JP2006548732A priority Critical patent/JPWO2006064635A1/ja
Publication of WO2006064635A1 publication Critical patent/WO2006064635A1/fr


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059: Measuring for diagnostic purposes; Identification of persons, using light, e.g. diagnosis by transillumination, diascopy, fluorescence

Definitions

  • the present invention relates to a diagnostic system, and more particularly to a diagnostic system that performs medical diagnosis and the like from captured images of each part of a human body.
  • there is known a diagnostic system that detects changes in the physiological state of a human body using an imaging device such as a sensor or a CCD camera and analyzes the image data obtained thereby to make a diagnosis such as a medical diagnosis or a beauty diagnosis.
  • Patent Document 1 discloses a diagnostic system in which skin color, such as facial color in an image photographed with a digital camera, is quantified to obtain objective biometric information, and the skin color can be analyzed without being affected by the color reproduction of the display image.
  • Patent Document 2 discloses a diagnostic system that diagnoses health conditions by installing a sensor in a place that a finger or the like contacts in daily life, thereby acquiring data on pulse waves, body temperature, skin color, respiratory rate, and the like while the subject is not conscious of being observed.
  • Patent Document 3 discloses a diagnostic system in which body temperature data measured by a basal body thermometer can be managed and analyzed by referring to the information via a network.
  • Patent Document 1 JP-A-10-165375
  • Patent Document 2 JP 2000-139856
  • Patent Document 3 JP 2001-353157
  • the diagnostic system described in Patent Document 3 performs medical diagnosis based only on body temperature data. In order to perform more accurate diagnosis by comparing data in a normal state and an abnormal state, it has been desired to acquire and analyze data from more viewpoints.
  • an object of the present invention is to improve the accuracy of discrimination between a normal state and an abnormal state in a diagnostic system that analyzes an image obtained by imaging each part of a human body.
  • the invention described in claim 1 is a diagnostic system comprising: an imaging device including a subject illumination unit that irradiates a subject with a light source of specific color light at the time of photographing, and an image photographing unit that photographs the subject; and a data management device, communicatively connected to the imaging device via a network, including a data management unit that manages image data transmitted from the imaging device and an image analysis unit that analyzes the image data transmitted from the imaging device and discriminates between a normal state and an abnormal state.
  • the invention according to claim 2 is a diagnostic system comprising: a subject illumination unit that irradiates a subject with a plurality of light sources that emit light of different colors; an image photographing unit that photographs the subject illuminated by the subject illumination unit; and an image analysis unit that separates the reflected light on the surface of the subject photographed by the image photographing unit and analyzes the information of the reflected light suitable for the part of the subject to be diagnosed to discriminate between a normal state and an abnormal state.
  • the invention according to claim 3 is a diagnostic system comprising a subject illumination unit that has a plurality of light sources emitting different colored lights and irradiates the subject with the light source of the color light suitable for the subject.
  • the invention according to claim 4 is the diagnostic system according to any one of claims 1 to 3, wherein the image analysis unit converts the pixel values of the image data into color data and expresses them on coordinates, and when the coordinates are separated from the coordinates in the normal state by more than a predetermined interval, the data are judged to be abnormal.
  • according to the present invention, lesions and abnormalities are easily detected from image data by irradiation with specific color light, and it becomes easy to separate normal-state data and abnormal-state data. Therefore, it is possible to improve the accuracy of diagnosis based on this discrimination.
  • FIG. 1 is a schematic diagram showing the overall configuration of a diagnostic system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an overall configuration of a diagnostic system according to an embodiment of the present invention.
  • FIG. 3 is a front view showing an image display unit and a subject illumination unit according to an embodiment of the present invention.
  • FIG. 4 is a partially enlarged view showing an example of a light source provided in a subject illumination unit according to an embodiment of the present invention.
  • FIG. 5 is a partially enlarged view showing an example of another light source provided in the subject illumination unit according to the embodiment of the present invention.
  • FIG. 6 is a front view showing the arrangement of the color light units that are formed by the light source of the subject illumination unit according to the embodiment of the present invention.
  • FIG. 7 is a front view showing the arrangement of color light units formed by other light sources of the subject illumination section according to the embodiment of the present invention.
  • FIG. 8 is a graph showing an example of an emission spectrum of an LED.
  • FIG. 9 is a graph showing an example of a spectrum of light transmitted by a white light source and an interference filter.
  • FIG. 10 is a graph showing another spectrum example of light transmitted by the white light source and the interference filter.
  • FIG. 11 is a graph showing another spectrum example of light transmitted by the white light source and the interference filter.
  • FIG. 12 is a graph showing another spectrum example of light transmitted by the white light source and the interference filter.
  • FIG. 13 is a graph showing characteristics of a light source provided in a subject illumination unit according to an embodiment of the present invention.
  • FIG. 14 is a graph showing the spectral reflectance of a specific part (a) of a subject.
  • FIG. 15 is a graph showing the spectral reflectance of another specific part (b) of the subject.
  • FIG. 16 is a graph showing reflected light from a specific part (a) of a subject.
  • FIG. 17 is a graph showing reflected light from a specific part (b) of a subject.
  • FIG. 18 is a diagram showing an example of the subject composition.
  • FIG. 19 is a diagram showing an arrangement of image capturing units according to the embodiment of the present invention.
  • FIG. 20 is an example of a composite image by the image photographing unit according to the embodiment of the present invention.
  • FIG. 21 is a diagram showing another arrangement of the image capturing unit according to the embodiment of the present invention.
  • FIG. 22 is a diagram showing another arrangement of the image capturing unit according to the embodiment of the present invention.
  • FIG. 23 is a diagram showing an example of a cut image by the image photographing unit according to the embodiment of the present invention.
  • FIG. 24 is a graph of pixel values obtained by converting reflected light from a specific part (a) of a subject.
  • FIG. 25 is a graph of pixel values obtained by converting reflected light from a specific part (b) of a subject.
  • FIG. 26 is a diagram showing the pixel value of the specific part (a) of the subject on the coordinates as color data.
  • FIG. 27 is a diagram showing pixel values of a specific part (b) of a subject on coordinates as color data.
  • FIG. 28 is a diagram showing an example of a table provided in the system information holding unit according to the embodiment of the present invention.
  • FIG. 29 is a flowchart showing an operation of the diagnostic system according to the embodiment of the present invention.
  • the diagnostic system 1 of the present embodiment includes an imaging device 2, and a data management device 13 is connected to the imaging device 2 via a network so that they can communicate with each other. Furthermore, one or more external devices 24 are connected to the data management device 13 via a network so that they can communicate with each other.
  • the imaging device 2 can also be configured to be able to communicate with the external device 24.
  • the network in the present embodiment is not particularly limited as long as it is a communication network capable of data communication.
  • examples include the Internet, a LAN (Local Area Network), a WAN (Wide Area Network), a telephone line network, an ISDN (Integrated Services Digital Network) network, a CATV (Cable Television) line, and an optical communication line. The network may also be configured to communicate wirelessly as well as by wire.
  • the external device 24 is a personal computer or the like, and is preferably installed in a place where some kind of consulting or diagnosis can be received.
  • it may be installed in a public facility such as a hospital or health care facility.
  • the external device 24 may be configured as an Internet site from which consulting information can be obtained, or as a mobile terminal carried by a consultant, a doctor, a store clerk, or the like.
  • the imaging device 2 provided in the diagnostic system 1 of the present embodiment includes the following components.
  • the external communication unit 3 is configured to perform information communication with the data management device 13 and the external device 24 by wired or wireless communication means.
  • since the imaging device 2 of the present embodiment handles image information, it is desirable that the external communication unit 3 use a communication mode capable of transmitting data as fast as possible. The results of image analysis, image data, vital data, and the like are transmitted to the data management device 13 via the external communication unit 3.
  • the image display unit 4 includes a display such as a CRT, a liquid crystal, an organic EL, a plasma, or a projection-type display, and displays, in addition to the captured image, information given from the external device 24, information obtained by image processing, and information about the status of each component of the imaging device 2. Further, it is possible to display a warning when abnormal image data is detected, or to provide a configuration capable of notifying the detection of abnormal data by voice means.
  • as shown in FIG. 3, the subject illumination unit 5 is provided at the peripheral edge of the image display unit 4. With such a configuration, illumination light can be given to the subject from various directions, and an image with relatively little illumination unevenness can be obtained.
  • the subject illumination unit 5 may be disposed only at the upper part, or only at the left and right sides, of the peripheral edge of the image display unit 4. It should be noted that the subject illumination unit 5 can also be arranged at a location that is not close to the image display unit 4.
  • FIG. 4 is an enlarged view of the dotted line portion of the subject illumination unit 5 shown in FIG.
  • circular light sources that emit different colored lights are arranged in the subject illumination unit 5 of the present embodiment.
  • rectangular light sources can be arranged as shown in FIG.
  • a combination of light sources in a predetermined area, as shown by the dotted line in FIG. 5, is used as one color light set, and this color light set is arranged on the peripheral edge of the image display unit 4 as shown in FIG. 6.
  • with such an arrangement, the light sources can evenly irradiate the subject with each color light.
  • a rod-shaped light source can also be used instead of a circular light source.
  • as shown in FIG. 7, the color light set of rod-shaped light sources can be arranged at the peripheral edge of the image display unit 4.
  • an artificial light source such as a fluorescent lamp, an incandescent bulb, an LED, a laser, or an organic EL can be used.
  • it is desirable to use a strong light source such as a strobe or a high-brightness LED.
  • it is desirable that the light source has light emission characteristics not only in the visible light region but also in its vicinity (ultraviolet region, near-infrared region, infrared region). In order to suppress the influence of ambient light as much as possible, it is desirable to use a light source that can emit high-intensity light such as a strobe.
  • it is desirable that the light source can irradiate temporally stable light. Further, using a line light source instead of a point light source makes it possible to obtain light whose spatial intensity distribution toward the user is even; that is, unevenness of illumination on the user is suppressed and a desirable image can be obtained.
  • in the present embodiment LEDs are used as the light sources, but a white light source may be used together with an interference filter.
  • Fig. 8 shows an example of the light emission characteristics of an LED, and Figs. 9 to 12 show examples of the spectrum of light from a white light source transmitted through an interference filter.
  • FIG. 13 shows the characteristics of the light source used in this embodiment, and two types of light sources (color light H and color light I) are used among the light sources having the characteristics shown in FIGS.
  • as the color light H, for example, Toshiba Corporation's TLSH series (peak wavelength 623 nm, half width 13 nm) can be used, and as the color light I, for example, TLPGE1100B (peak wavelength 562 nm, half width 11 nm) or other commercially available LEDs can be used.
  • as an interference filter, a commercially available bandpass filter manufactured by Omega Optical Inc. can be used.
  • FIG. 14 and FIG. 15 show the spectral reflectances of the specific part (a) and another specific part (b) of the subject when healthy (normal) and unhealthy (abnormal).
  • when an abnormality occurs, the structure and composition of the body change, so the spectral reflectance on the subject surface differs from that in the normal state.
  • which wavelength region is different depends on the subject.
  • in the specific part (a), the difference between the normal state and the abnormal state is large on the short-wavelength side (overlapping the band of the colored light A), while in the other specific part (b) the difference is large in the middle-wavelength region (overlapping the band of the colored light B).
  • the spectrum of the reflected light of the subject is the product of the spectrum of the colored light A or the colored light B and the spectral reflectance of the subject, as shown in FIG. 16 for the specific part (a) and in FIG. 17 for the specific part (b).
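This product relationship can be sketched numerically. The following is a minimal illustration assuming Python with NumPy; the Gaussian LED model and every spectral value below are invented placeholders, not data taken from the figures.

```python
import numpy as np

# Sampled wavelength axis (nm) covering the visible range.
wavelengths = np.arange(400, 701, 10)

def led_spectrum(peak_nm, half_width_nm):
    """Model a narrow-band LED as a Gaussian (FWHM converted to sigma)."""
    sigma = half_width_nm / 2.355
    return np.exp(-0.5 * ((wavelengths - peak_nm) / sigma) ** 2)

def reflected_spectrum(illuminant, reflectance):
    """Reflected light = wavelength-by-wavelength product of the
    illuminant spectrum and the subject's spectral reflectance."""
    return illuminant * reflectance

# Hypothetical colored light A and a smoothly varying reflectance.
light_a = led_spectrum(peak_nm=450, half_width_nm=20)
reflectance_normal = np.linspace(0.2, 0.6, wavelengths.size)

reflected = reflected_spectrum(light_a, reflectance_normal)
```

Because the illuminant is narrow-band, the reflected spectrum is appreciable only where the LED emits, which is what makes the band-by-band comparison described above possible.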
  • the image capturing unit 6 is configured by one or a plurality of cameras capable of acquiring a still image or a moving image by an image sensor such as a CCD or a CMOS.
  • a digital camera, a video camera, or a camera module such as those built into mobile phones can be used.
  • a plurality of image photographing units 6 are arranged at equal intervals to photograph the subject, and an image synthesized into a composition as shown in FIG. 20 is output. Further, as shown in FIG. 21, a mechanism capable of three-dimensionally adjusting the position of one image photographing unit 6 may be provided so that the user's head and face can be photographed.
  • alternatively, the image capturing unit 6 can be configured by a combination of a large number of camera modules so that the user can be captured collectively and an image with a predetermined composition can be cut out as shown in FIG. 23.
  • the image capturing unit 6 of the present embodiment has three image channels, but may be a multi-channel (multi-band) unit having more than three image channels.
  • as output file formats, general formats such as JPEG or TIFF and a raw file corresponding to the raw image are provided.
  • image processing such as demosaic processing, gamma processing, color conversion processing, and enhancement processing is applied, and each parameter of the image processing can be referenced.
  • the image capturing unit 6 may be disposed on the back surface of the image display unit 4 or on the same surface. In this embodiment, the image capturing unit 6 is disposed on the back surface of the image display unit 4.
  • it is desirable that the image display unit 4 be formed of a half-mirror structure material that does not interfere with image capturing and does not make the user conscious of the camera.
  • the image photographing unit 6 captures reflected light from the surface of the subject and converts it into pixel values of each channel corresponding to R, G, B signals of the RGB image.
  • for example, the spectra of the reflected light in the normal state and the abnormal state at the specific part (a) of the subject are as shown in FIG. 16. When this reflected light is imaged by the image capturing unit 6 and the reflected light caused by the colored light A or the colored light B is separated into channel 1 and channel 2, the pixel value of each channel can be obtained as shown in FIG. 24.
  • the separation of the reflected light caused by the colored light A or the colored light B is performed based on the difference in the band of each channel: the colored light A is included in the band of channel 1 of the RGB image, and the colored light B is included in the band of channel 2. It is also possible to separate the colored light A and the colored light B by photographing with different light-emission timings. Similarly, the reflected light at the other specific part (b) of the subject is as shown in FIG. 17, and when the reflected light caused by the colored light A or the colored light B is separated, pixel values can be obtained as shown in FIG. 25.
  • as a result, for the captured image of the specific part (a) of the subject it is easy to separate the pixel values of the normal state and the abnormal state in channel 1, and for the specific part (b) in channel 2.
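The timing-based separation mentioned above can be sketched as follows. Assuming one frame captured with colored light A flashing, one with colored light B flashing, and one under ambient light only (the frame names and values are illustrative assumptions, not the patent's procedure), subtracting the ambient frame isolates each source's contribution.

```python
import numpy as np

def separate_by_timing(frame_a, frame_b, frame_ambient):
    """Isolate the reflected light due to colored lights A and B by
    subtracting an ambient-only frame from each flash frame."""
    reflected_a = np.clip(frame_a - frame_ambient, 0, None)
    reflected_b = np.clip(frame_b - frame_ambient, 0, None)
    return reflected_a, reflected_b

# Toy frames: ambient level 10, light A adds 5 units, light B adds 2.
ambient = np.full((4, 4), 10.0)
frame_a = ambient + 5.0
frame_b = ambient + 2.0

ra, rb = separate_by_timing(frame_a, frame_b, ambient)
```

The band-based alternative needs no extra frames: each narrow-band light simply falls into a different RGB channel, as the text notes.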
  • since the subject illumination unit 5 illuminates the subject when capturing an image of the face area as described above, it can also fulfill the function of a stimulus: by abruptly changing the illumination intensity and wavelength while photographing the movement of the subject's pupil, and measuring the movement of the pupil before and after the change, it is possible to judge whether the subject is healthy or abnormal.
  • the surrounding environment information acquisition unit 7 includes a measuring instrument as environment information acquisition means, acquires information on the surrounding environment where the imaging device 2 is installed, and supports the discrimination of abnormal data when the captured image is analyzed.
  • here, the surrounding environment refers to the intensity and characteristics of ambient illumination light, temperatures such as air temperature and water temperature, humidity, and atmospheric pressure.
  • measuring instruments and sensors such as an illuminance meter, a spectrometer, a thermometer, and a color-temperature meter may be used to measure these, or the information may be obtained from the external device 24 through the network.
  • the I/O unit 8 is configured to connect vital sensors (such as a thermometer, a weight scale, a body fat percentage meter, a blood pressure meter, an electrocardiograph, a skin age meter, a bone densitometer, and a spirometer) as vital information acquisition means, and devices that handle portable media such as SD cards and USB memories. Measurement data and image data can be input and output from these devices.
  • although the vital sensor can be configured as a part of the imaging device 2, connecting it externally makes it possible to take in any biological information data and to easily introduce the most advanced measurement equipment, so it is desirable to connect the necessary vital sensors to the I/O unit 8 for use.
  • the temporal change before and after a stimulus is applied to the subject is imaged, and the normal state and the abnormal state of the subject are discriminated from the images.
  • stimulation refers to applying light, pushing, applying pressure, changing temperature, adding scent, applying sound, pricking with a needle, applying a coating, injecting, and the like.
  • the normal state and the abnormal state can be distinguished using the amount of change of each measured value as feature data.
  • the memory unit 9 is configured with RAM, ROM, DIMM, and the like, and temporarily stores data necessary for processing in each component of the imaging device 2 so that the imaging device 2 can operate at high speed and stably.
  • the data processing unit 10 performs data processing on the images of each part of the subject photographed by the image photographing unit 6, the peripheral environment information obtained by the peripheral environment information acquisition unit 7, and the measurement data and image data input from the I/O unit 8.
  • the data processing unit 10 extracts the region of interest to be diagnosed from the captured image. For example, if the region of interest is the user's face, the user's standard face image is used as the basic image of a template, and template matching is performed by raster-scanning the template from the upper left of the screen; the region with the highest correlation is set as the face region.
  • alternatively, means may be provided for the user to enclose the face region with a rectangular pointer or the like, or for displaying a rectangle of a specified size centered on a point designated by the user.
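The raster-scan template matching described above can be sketched as a normalized cross-correlation search. This is a generic illustration of the technique, not the patent's exact algorithm; the toy image and "face" template are invented.

```python
import numpy as np

def match_template(image, template):
    """Raster-scan `template` over `image` and return the top-left corner
    of the window with the highest normalized cross-correlation."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = np.sqrt((wc ** 2).sum()) * t_norm
            score = (wc * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

# Toy image with a bright 3x3 "face" patch placed at row 2, column 5.
img = np.zeros((10, 12))
face = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
img[2:5, 5:8] = face

pos, score = match_template(img, face)
```

Normalizing each window makes the score insensitive to uniform brightness changes, which matters here because the illumination is deliberately varied.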
  • the data processing unit 10 transmits the image of the region of interest to the data management device 13 via the external communication unit 3.
  • the image capturing unit 6 may be configured to perform, at the time of file output, the image processing that is applied in the case of a raw file.
  • the data processing unit 10 acquires the ambient light characteristics from the surrounding environment information acquisition unit 7 so that an image with stable color reproduction for viewing can be displayed even if the ambient light characteristics change; based on the acquired color data, the chromaticity point and brightness of the display white are adjusted. These characteristics are corrected (calibrated) so that an image with stable color reproduction is displayed regardless of the characteristics of the ambient light and light sources.
  • calibration is normally performed by separately photographing a test pattern and correcting the photographed data to specified values, but it is desirable that the same processing can be performed without a test pattern, which is relatively expensive and difficult to store. Therefore, the user's teeth and background data at the time of image capture, which are considered to change little over time, can be used. If multiple users are registered on the same device, tooth data must be registered for each user.
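A minimal sketch of this test-pattern-free calibration, assuming a per-channel gain model: the "tooth" region is treated as a stable reference, and gains are chosen so that today's measurement of it maps back to the value stored at registration. The reference and measured values below are invented.

```python
import numpy as np

def calibration_gains(measured_rgb, reference_rgb):
    """Per-channel gain that maps the measured reference region back to
    its registered (assumed time-stable) value."""
    return np.asarray(reference_rgb) / np.asarray(measured_rgb)

def apply_calibration(image_rgb, gains):
    """Apply the per-channel gains to any captured RGB data."""
    return image_rgb * gains

registered_tooth = np.array([200.0, 198.0, 190.0])  # stored at registration
measured_tooth = np.array([180.0, 165.0, 150.0])    # under today's lighting

gains = calibration_gains(measured_tooth, registered_tooth)
corrected = apply_calibration(measured_tooth, gains)
```

A real implementation would average over the reference region and guard against saturation; a diagonal gain model is the simplest choice, not necessarily the patent's.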
  • the data processing unit 10 is also provided with a function to correct the brightness and size of the subject so that the display on the screen does not change depending on how the illumination is applied.
  • the distance between the imaging device 2 and the user is estimated from the image captured by the image capturing unit 6.
  • a general method such as a three-dimensional survey method or a stereo measurement method can be used.
  • a laser distance-measuring device of the kind often used in cameras may also be attached to the imaging device 2 for measurement.
  • it is possible to estimate the three-dimensional shape of the user by performing distance estimation at each pixel based on the captured image. Further, if the distance between the subject and the camera at each pixel is known, the distance between the light source and the subject can be obtained for each pixel, so that the brightness of the display image can be corrected more accurately and in detail.
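The per-pixel brightness correction can be sketched with the inverse-square law, assuming the light source is approximately a point source: intensity falls off as 1/d², so each pixel is rescaled to a common reference distance. The distances and pixel values here are synthetic.

```python
import numpy as np

def correct_brightness(pixel_values, distances, reference_distance):
    """Normalize brightness as if every point were at `reference_distance`:
    observed intensity falls off as 1/d^2, so multiply by (d/d_ref)^2."""
    return pixel_values * (distances / reference_distance) ** 2

# Two pixels of the same surface: one at 1 m, one at 2 m from the source.
values = np.array([100.0, 25.0])
dists = np.array([1.0, 2.0])

corrected = correct_brightness(values, dists, reference_distance=1.0)
```

After correction, both pixels report the same brightness, so differences that remain can be attributed to the subject rather than to geometry.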
  • the data processing unit 10 also identifies an individual from a photographed image for user authentication.
  • the system is configured so that an individual can be identified automatically by checking against the face image of the individual photographed at the time of initial registration by the image capturing unit 6, the measured value of the three-dimensional shape of the face, and other characteristic information (such as the use of glasses, moles, and spots). This identification is performed, for example, by extracting the face part and then comparing the correlation between the extracted image and a basic template image tailored to the individual.
  • the user interface unit 11 includes a conventional interface such as a keyboard, a mouse, and a trackball; it allows the user to input instructions and also communicates the status and requests of the imaging device 2 to the user.
  • through the user interface unit 11, for example, when photographing a subject, if the region of interest for diagnosis is the tongue, the user is instructed by voice or text to stick out the tongue, and the image is captured by the image capturing unit 6 when it is judged from the movement of the mouth that the tongue has been stuck out.
  • the display screen of the image display unit can be switched by voice or gesture through the user interface unit 11.
  • through the user interface unit 11, it is possible to provide means for the user to enclose the face region with a rectangular pointer or the like, or specifying means for displaying a rectangle of a predetermined size centered on a point designated by the user.
  • the user interface unit 11 is provided with data acquisition means for the purpose of assisting the determination of abnormal data when analyzing a captured image.
  • the data contents to be acquired include self-judgment of the current physical condition (heaviness, fever, chills, etc.) and personal data related to the condition (sleeping time, alcohol consumption, smoking, medication information, etc.). The data may be acquired interactively by voice, or through a user interface such as a touch panel by displaying an inquiry form on the image output device.
  • the tone of the voice is often related to the physical condition at that time, so it is possible to analyze the frequency and tone of the voice and add the analysis result to the inquiry data.
  • the user interface unit 11 is provided with personal authentication means. That is, when a plurality of users use the diagnostic system 1, it is necessary to identify the individual who is currently using it. Therefore, based on the individual identification result obtained in the data processing unit 10 by collation with the personal face-region image photographed at the time of initial registration, the person is asked to confirm whether the identification result is correct.
  • the user interface unit 11 is provided with means for prompting the user to use the diagnostic system 1 by outputting sound at regular intervals.
  • the means for prompting use may also be a display on the image display unit 4. That is, in the diagnostic system 1 according to the present embodiment, the user's healthy-state data must be checked and updated at regular intervals, so by prompting the user to use a vital sensor or the like at regular intervals, for example every month, the normal-state data can be updated. In addition, depending on the purpose of diagnosis, it may be necessary to use the diagnostic system 1 on a daily or regular basis.
  • when the system has not been used for a long period, the user interface unit 11 displays an alert or emits a sound to notify the user. However, since the absence may be due to a schedule such as a long-term business trip or a trip, the alert can be issued in consideration of the individual's schedule entered from the data management device 13 via the external communication unit 3.
  • the control unit 12 includes a CPU and a RAM, and drives and controls each component of the imaging device 2. Since the imaging device 2 of the present embodiment also handles moving images, it is desirable that the control unit 12 be configured with a chip capable of controlling high-speed operation as much as possible. In addition, the control unit 12 independently controls the On/Off of each light source in the subject illumination unit 5 based on an instruction signal transmitted from the data management device via the external communication unit 3 or an instruction signal input from the user interface unit 11. Besides the control based on the above instruction signals, it is also possible to estimate the subject by other known methods.
  • the data management device 13 provided in the diagnostic system 1 of the present embodiment is provided with the following components.
  • the external communication unit 14 is configured to perform information communication with the imaging device 2 and the external device 24 by wired or wireless communication means. Similarly to the external communication unit 3, the results of image analysis, image data, and vital data are transmitted to the imaging device 2 or the external device 24 via the external communication unit 14.
  • the external device 24 is preferably installed in a place where some kind of consulting or diagnosis can be received, such as a hospital, a health care facility, or a cosmetics-related store or facility.
  • the image analysis unit 15 includes image analysis means, performs image analysis based on the image data transmitted from the imaging device 2 to the data management device 13 via the external communication unit 3 and the external communication unit 14, and performs medical diagnosis and other diagnoses based on the analysis results.
  • lesions and abnormalities are also easier to detect from the image. That is, when the image capturing unit 6 of the imaging device 2 captures the reflected light from the subject surface and converts it into a pixel value for each channel, the pixel values of the normal state and the abnormal state are easy to separate in channel 1 for the specific part (a) of the subject, as shown in FIG. 26, and separation of the pixel values likewise becomes easier for the other specific part (b), as shown in FIG. 27.
  • the image analysis means provided in the image analysis unit 15 separates the pixel values of the normal state and the abnormal state for each specific part of the subject based on the image data of the region of interest. Tristimulus value data (RGB) are calculated from the pixel values of the normal and abnormal states, reaction color data are calculated by a primary (linear) conversion of the tristimulus value data, and the result is converted into uniform color space data. Examples of uniform color spaces include CIE L*a*b* and L*u*v*, whose lightness axes are standardized, and spaces derived from them.
  • using this uniform color space data, as shown in FIG. 26 or FIG. 27, the color data of the healthy state and the abnormal state are expressed as coordinates of two or more dimensions (two dimensions in this embodiment); if the coordinates obtained by analyzing an image are farther than a certain scale from the healthy-state coordinates, the data are judged abnormal.
  • FIG. 26 shows the coordinates of the color data in the specific part (a)
  • FIG. 27 shows the coordinates of the color data in the other specific part (b).
  • “feature value 1” is a value such as CIE a* or u*, and “feature value 2” is a value such as CIE b* or v*.
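As a concrete illustration of the distance test described above, the following sketch judges a measurement abnormal when its (feature value 1, feature value 2) coordinates lie too far from the centroid of the registered healthy-state coordinates. The Euclidean distance, the sample coordinates, and the threshold of 5.0 are illustrative assumptions, not values from the specification:

```python
import math

def is_abnormal(sample_ab, healthy_ab_points, threshold=5.0):
    """Judge a measurement abnormal when its (feature 1, feature 2)
    coordinates, e.g. CIE (a*, b*), lie farther than `threshold`
    from the centroid of the registered healthy-state coordinates.
    The threshold value is an illustrative assumption."""
    n = len(healthy_ab_points)
    cx = sum(p[0] for p in healthy_ab_points) / n
    cy = sum(p[1] for p in healthy_ab_points) / n
    d = math.hypot(sample_ab[0] - cx, sample_ab[1] - cy)
    return d > threshold

# Illustrative healthy-state coordinates for one specific part
healthy = [(10.0, 12.0), (11.0, 11.5), (9.5, 12.5)]
print(is_abnormal((10.2, 12.1), healthy))  # close to centroid -> False
print(is_abnormal((25.0, 30.0), healthy))  # far from centroid -> True
```

A real system would choose the threshold from the spread of the healthy data (for example, a multiple of its standard deviation) rather than a fixed constant.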
  • measured values other than color data (such as three-dimensional shape and texture) may also be used. It is also possible to photograph under colored light A and colored light B and then separate the resulting data.
  • the color data, three-dimensional shape, and texture of the subject are measured (estimated), and quantities related to the tissue structure under the skin, such as the amount of hemoglobin, the amount of melanin, and oxygen saturation, are estimated from those data.
  • data such as the number of blinks, the color of the eyes (particularly the whites of the eyes), and the state of the eyes may also be acquired from the eye image.
  • the diagnosis process is terminated.
  • the result of image analysis is transmitted to the external device 24 via the external communication unit 14.
  • the destination is preferably, as noted above, a public facility such as a hospital or health management facility, a cosmetics-related store or facility, an Internet site, or a mobile terminal.
  • the image analysis means included in the image analysis unit 15 changes which portions of the data can be transmitted according to the user's permission. For example, if the user wishes to avoid being identified and asks that the areas around the eyes and mouth be removed or changed before the image is transmitted, the relevant parts of the image are automatically extracted and changed to an arbitrary color, mosaicked, or blurred.
  • transmitted data and data inside the data management device 13 are encrypted by changing the encryption key for each individual.
  • known encryption methods can be used, but a biometric method that allows a key to be set for each individual is desirable.
  • the positional relationship of the constituent elements in the face such as the eyes, nose and mouth of the individual, the height of the nose, the three-dimensional shape such as the shape of the entire face, iris information, etc. are used as the encryption key.
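A minimal sketch of deriving a per-individual encryption key from facial feature values, as suggested above. The feature names, their serialization, and the use of SHA-256 are assumptions made for illustration; a production biometric key scheme would need considerably more care (tolerance to measurement noise, key revocation, and so on):

```python
import hashlib

def derive_personal_key(face_features: dict) -> bytes:
    """Derive a per-individual 256-bit key by hashing a canonical
    serialization of biometric features (eye/nose/mouth positions,
    nose height, iris information, ...). Feature names below are
    illustrative placeholders, not part of the specification."""
    canonical = "|".join(f"{k}={face_features[k]}" for k in sorted(face_features))
    return hashlib.sha256(canonical.encode("utf-8")).digest()

alice = {"eye_dist": 61.2, "nose_height": 24.8, "iris_code": "a91f"}
bob = {"eye_dist": 58.7, "nose_height": 22.1, "iris_code": "07cc"}
key_a = derive_personal_key(alice)
key_b = derive_personal_key(bob)
print(len(key_a), key_a != key_b)  # 32 True: distinct 256-bit keys
```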
  • the I/O unit 16 is configured to accept portable media such as CF cards, SD cards, and USB memories. A measuring device such as a vital sensor can also be connected to the I/O unit 16 for use.
  • the user interface unit 17 is configured to let the user input instructions and to convey the status and requests of the apparatus to the user. As with the user interface unit 11 of the imaging device 2, a configuration that places no burden on the user is desirable; however, since the device administrator may need to perform complicated work such as maintenance and backup, a keyboard and mouse alone may also be used. Alternatively, the user interface unit 11 of the imaging device 2 (which is not an indispensable component of the data management device 13) can be configured to double as the user interface unit 17 of the data management device 13. In this case, data input and output through the user interface unit 11 of the imaging device 2 are transferred to the data management device 13 via the external communication unit 3 and the external communication unit 14.
  • the display unit 18 includes a display such as a liquid crystal display, and shows the operation status of the data management device 13, instructions input through the user interface unit 17, and so on. A warning is displayed when abnormal data are detected; this alarm can also be configured to be announced by audio means. Alternatively, the image display unit 4 of the imaging device 2 (not an essential component of the data management device 13) can be used as the display unit 18. In this case, the displayed content is transferred to the imaging device 2 via the external communication unit 14 and the external communication unit 3.
  • the storage unit 19 includes a memory such as a buffer, and comprises a data holding unit 20, a work area unit 21, and a system information holding unit 22.
  • the data holding unit 20 holds various types of data, such as images of the region of interest to be diagnosed transmitted from the imaging device 2, measurement values, biological information, device status, and environmental status, as well as consulting and diagnostic data input from the external device 24.
  • the data held in the data holding unit 20 is image data or various measurement data.
  • the data holding unit 20 uses memory effectively by compressing, before storing, the data judged "healthy" by the daily health check.
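The memory-saving policy above can be sketched as follows; the JSON serialization and zlib compression are illustrative choices, not part of the specification:

```python
import json
import zlib

def store_record(record: dict, healthy: bool) -> bytes:
    """Keep abnormal records uncompressed for fast access; compress
    records judged 'healthy' by the daily check to save memory."""
    raw = json.dumps(record).encode("utf-8")
    return zlib.compress(raw, level=9) if healthy else raw

def load_record(blob: bytes, healthy: bool) -> dict:
    """Reverse of store_record: decompress only the healthy records."""
    raw = zlib.decompress(blob) if healthy else blob
    return json.loads(raw.decode("utf-8"))

rec = {"date": "2005-11-21", "pixels": [128] * 500}  # illustrative record
blob = store_record(rec, healthy=True)
print(len(blob) < len(json.dumps(rec)))  # -> True: compressed form is smaller
```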
  • the work area unit 21 temporarily stores various data used for data processing in order to perform data processing in the data management device 13 at high speed.
  • the system information holding unit 22 holds system information of the imaging device 2 and the data management device 13.
  • the system information holding unit 22 registers and manages, as shown in the figure, the management numbers of the LEDs or filters used as light sources in the subject illumination unit 5 of the imaging device 2, the subjects for which each is effective, and the feature quantities that can be extracted.
  • using this table, the control unit 12 of the imaging device 2 can independently control the On/Off of each light source in the subject illumination unit 5 to provide colored light suited to the subject.
  • the system information holding unit 22 also manages the spectral characteristics, color data, and aging characteristics of the light sources and filters provided in the subject illumination unit 5 of the imaging device 2, and the image analysis unit 15 can refer to these data when needed for image analysis.
  • the system information holding unit 22 creates and holds "healthy state data", that is, measurement values taken from image data judged "healthy", which serve as the basis for diagnosis by image analysis in the image analysis unit 15. Once healthy state data are created, abnormal state data can be expressed as their complement. The healthy state data are updated after confirming the user's healthy state at regular intervals, which improves their reliability. That is, through the image display unit 4 or the user interface unit 17, the user is prompted, for example once a month, to use a vital sensor or the like, and the healthy state data are updated accordingly. The data can also be updated daily based on data from the most recent month; in this way they are considered able to follow changes in physical condition due to season or aging. They may also be created from a specified number of days, such as 30 days, excluding days that contain abnormal data.
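The rolling 30-day update policy, excluding abnormal days, might look like the following sketch; the data layout and the single scalar measurement per day are simplifying assumptions:

```python
from datetime import date, timedelta

def healthy_baseline(daily, today, window_days=30):
    """Average the measurements of the most recent `window_days`
    days, skipping days flagged abnormal, per the update policy
    described above. `daily` maps a date to (value, is_abnormal)."""
    values = []
    for back in range(window_days):
        day = today - timedelta(days=back)
        if day in daily:
            value, is_abnormal = daily[day]
            if not is_abnormal:
                values.append(value)
    return sum(values) / len(values) if values else None

today = date(2005, 11, 21)
# Five days of illustrative data; day 3 back is flagged abnormal.
daily = {today - timedelta(days=i): (100.0 + i, i == 3) for i in range(5)}
print(healthy_baseline(daily, today))  # mean of 100, 101, 102, 104 -> 101.75
```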
  • the system information holding unit 22 holds a personal schedule transferred from a device connected to the I/O unit 8 of the imaging device 2 or the I/O unit 16 of the data management device 13. Based on this schedule, the user interface unit 11 generates an alert prompting use of the diagnostic system 1.
  • the system information holding unit 22 also holds the characteristics of each component of the imaging device 2 and the data management device 13. The characteristics of the subject illumination unit 5, the image display unit 4, the image capturing unit 6, and other components of the imaging device 2 change, and usually deteriorate, over time. Therefore, by setting in the system information holding unit 22 the timing for correcting (adjusting) this change over time, and by reading time information and the like from the tag information, the characteristics of the imaging device 2 or the data management device can be measured and corrected at the specified timing. This makes it possible to obtain image measurement values that are stable over time.
  • alternatively, the user can instruct the calibration timing, or calibration can be performed by automatically detecting a change in the surrounding conditions from the image.
  • "the surrounding situation has changed" means, for example, that the ambient lighting has changed significantly (e.g., a fluorescent lamp has been replaced, or a new window has been installed nearby) or that the surroundings themselves have changed (the room background in the image has changed, the room has been remodeled, the device has been moved, etc.). Whether to perform calibration can be decided according to the degree of difference, by comparing the current data with the past data stored in the system information holding unit 22.
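One hedged way to detect that "the surrounding situation has changed" is to compare background statistics between the last calibration and the current frame; the mean-difference test and the threshold below are illustrative assumptions rather than the specification's method:

```python
def needs_calibration(ref_background, cur_background, threshold=10.0):
    """Compare stored background pixel statistics with the current
    frame's; a large mean shift (lighting changed, room rearranged)
    triggers calibration. The threshold is an illustrative assumption."""
    ref_mean = sum(ref_background) / len(ref_background)
    cur_mean = sum(cur_background) / len(cur_background)
    return abs(cur_mean - ref_mean) > threshold

stored = [120, 122, 118, 121]  # background pixels at last calibration
print(needs_calibration(stored, [121, 119, 120, 122]))  # small drift -> False
print(needs_calibration(stored, [180, 175, 182, 178]))  # lighting change -> True
```

A real implementation would likely compare histograms or fixed background regions rather than raw means, but the trigger logic is the same.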
  • the light source in the subject illumination unit 5 and the camera or sensor in the image capturing unit 6 are also consumables, and may need to be replaced at regular intervals to keep the device stable. Therefore, information about the light source and camera (exchange date, etc.) is added to the tag information of the image data or to the system information holding unit 22; when such a device has been used beyond a certain period, or must be replaced because of damage, the user interface unit 11 may be configured to issue an alert.
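The replacement alert based on tagged exchange dates could be sketched as below; the two-year LED lifetime is an invented placeholder, not a value from the specification:

```python
from datetime import date

LED_LIFETIME_DAYS = 365 * 2  # illustrative service-life assumption

def replacement_alert(installed_on: date, today: date) -> bool:
    """Raise an alert once a consumable (light-source LED, camera)
    has been in service longer than its assumed lifetime, as read
    from the exchange-date tag information."""
    return (today - installed_on).days >= LED_LIFETIME_DAYS

print(replacement_alert(date(2004, 1, 10), date(2005, 11, 21)))  # ~22 months -> False
print(replacement_alert(date(2003, 1, 10), date(2005, 11, 21)))  # ~34 months -> True
```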
  • the data management unit 22 manages each individual's image data held in the data holding unit 20 of the storage unit 19 by attaching a tag (accompanying information). Tags can be attached to the header or footer of each data item, or associated with each data name in a separate file.
  • the contents of the tag include the image shooting date and time; the user name and attributes; information on the imaging device 2 and the data management device 13; information on the surrounding environment; measurement data values obtained from the captured image and from vital sensors connected to the I/O unit 8 or the I/O unit 16; diagnostic information from subjective diagnosis of the captured image; and the contents of separately obtained interview data.
  • tags from which time information, such as the shooting date and time, can be read are essential for tracking image data in time series and for relating data items to one another.
  • the information required about the imaging device 2 and the data management device 13 is information on the subject illumination unit 5 that is relevant to estimating measurement data from images, the characteristics of the image capturing unit 6 and of the imaging device 2, and information on the data processing method used in the data processing unit 10.
  • such hardware and software information is generally managed by version, and this embodiment follows that practice; accordingly, the version number is described in each tag.
  • software for image analysis and the like may be upgraded to improve its functions (including the accuracy of measurement values).
  • when the version is upgraded, the measured values are recalculated with the new software using the tag contents in the image data and updated to the new measured values; the tag contents are then updated as well.
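The recalculation-on-upgrade behavior can be sketched as follows; the version strings and the stand-in `analyze` routine are illustrative assumptions:

```python
CURRENT_SOFTWARE_VERSION = "2.1"  # illustrative version string

def analyze(image, version):
    """Stand-in for the image-analysis routine; the newer version is
    assumed to produce a refined measurement from the same image."""
    scale = 1.00 if version == "1.0" else 1.05
    return sum(image) / len(image) * scale

def refresh_measurement(record):
    """When stored data carries an older software version in its tag,
    recompute the measurement with the new software and update both
    the stored value and the version tag, as described above."""
    if record["tag"]["version"] != CURRENT_SOFTWARE_VERSION:
        record["value"] = analyze(record["image"], CURRENT_SOFTWARE_VERSION)
        record["tag"]["version"] = CURRENT_SOFTWARE_VERSION
    return record

old = {"image": [100, 110, 120], "value": 110.0, "tag": {"version": "1.0"}}
print(refresh_measurement(old)["tag"]["version"])  # "2.1"
```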
  • each individual user's image data are grouped into a directory of their own (one per user), which makes it easier to manage data for each individual.
  • the imaging device 2 or the data management device 13 of the diagnostic system 1 of the present embodiment may be installed near water, such as in a washroom or bathroom, so it is desirable to take measures such as anti-fog coatings to prevent fogging by moisture. Considering installation in places where dust tends to accumulate, antifouling seals and other measures against dirt and dust are also desirable.
  • the image capturing unit 6 captures a subject.
  • the data processing unit 10 analyzes input information such as the image of the subject captured by the image capturing unit 6, the ambient environment information acquired by the ambient environment information acquisition unit 7, the measurement data and image data input from the I/O unit 8, and the user instruction data input from the user interface unit 11; it then displays the image on the image display unit 4 and adjusts the light sources of the subject illumination unit 5.
  • the face-area image of the individual photographed at initial registration and the image newly acquired by the image capturing unit 6 are analyzed and automatically matched against the registered personal data to identify the individual (step S1).
  • the user interface unit 11 asks the user to confirm the result of the individual identification performed by the personal authentication means of the data processing unit 10 (step S2). If the identification is wrong, the data processing unit 10 identifies the individual again (step S3, step S1).
  • the user interface unit 11 makes an inquiry about the user's health condition (step S3, step S4). Through this inquiry, the current physical condition (sluggishness, fever, a cold, etc.) and personal data related to that condition (sleeping time, alcohol, smoking, medication information, etc.) are obtained.
  • the image capturing unit 6 captures an image of a part to be diagnosed in the subject.
  • the subject illumination unit 5 irradiates the subject with a plurality of light sources during shooting (step S5). An LED may be used as the light source, or a white light source may be used together with an interference filter. As the plurality of light sources, two of the light sources having the characteristics shown in the figures (colored light H and colored light I) are used.
  • the difference between the normal state and the abnormal state is large on the short-wavelength side (which overlaps the band of colored light A).
  • the image photographing unit 6 captures the reflected light from the subject surface and converts it into pixel values. At that time, the reflected light produced by each colored light is separated by channel, which makes the pixel values of the normal state and the abnormal state easy to separate for each specific part of the subject. The data processing unit 10 then extracts a region of interest from the captured image (step S6).
  • the captured image of the region of interest is transmitted from the imaging device 2 to the data management device 13 via the external communication unit 3 and the external communication unit 14 (step S7).
  • the captured image of the region of interest transmitted to the data management device 13 is held by the data holding unit 20 of the storage unit 19. Then, the data management unit 22 manages each individual image data held in the data holding unit 20 with a tag (accompanying information).
  • the image analysis unit 15 separates the pixel values of the normal state and the abnormal state for each specific part of the subject based on the image data of the region of interest held by the data holding unit 20 of the storage unit 19 (step S8 ).
  • tristimulus value data (RGB) are calculated, reaction color data are calculated by a primary (linear) conversion of the tristimulus value data, and uniform color space data are calculated from the reaction color data.
  • the color data of the healthy state and the abnormal state are expressed as coordinates of two or more dimensions (two dimensions in this embodiment), and the coordinates of the healthy state and the unhealthy state are compared (step S10).
  • the healthy state data is held in the system information holding unit 22 of the storage unit 19.
  • capillary blood vessels can be enhanced by irradiating narrow-band colored light centered around 550 nm. This is considered effective for detecting abnormalities at parts where capillaries gather, such as the fingertips.
  • the color light B shown in FIG. 8 or FIG. 10 may be used.
  • the subject's color data, three-dimensional shape, and texture are measured (estimated), and from those estimated data the amount of hemoglobin, the amount of melanin, oxygen saturation, and so on can also be estimated.
  • data such as the number of blinks, the color of the eyes (particularly the whites of the eyes), and the state of the eyes may also be acquired from the eye image.
  • the diagnosis process is terminated (step S12).
  • the diagnosis result is displayed on the display unit 18 or the image display unit 4 (step S14), and the image analysis result is transmitted via the external communication unit 14 to the external device 24, that is, to an external consultant or medical institution (step S15).
  • the destination is preferably a place where some kind of consulting or diagnosis can be received, as described above.
  • when sending image data to a consultant or the like through the external communication unit 3 or the external communication unit 14, the image analysis means included in the image analysis unit 15 is configured to alter, by mosaicking or blurring, those portions of the data whose transmission the user has restricted.
  • the transmitted data and the data inside the data management device 13 are encrypted by changing the encryption key for each individual.
  • a provisional measured value can be obtained by using the average data of general persons registered in advance.
  • a temporary measurement value can also be obtained by using the average of data from users of the diagnostic system 1. In this case, after a predetermined period has elapsed, the system switches to the user's actual data and updates the measured values for the temporary image data accumulated up to that point.
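The fallback from a pre-registered population average to the user's own accumulated data might be sketched as below; the baseline value and the 14-day switch-over period are illustrative assumptions:

```python
POPULATION_AVERAGE = 36.5  # illustrative pre-registered population baseline
MIN_PERSONAL_DAYS = 14     # illustrative switch-over period

def measurement_baseline(personal_history):
    """Until the user has accumulated enough of their own data, fall
    back on the pre-registered population average as a provisional
    value; afterwards, switch to the user's own average."""
    if len(personal_history) < MIN_PERSONAL_DAYS:
        return POPULATION_AVERAGE, "provisional"
    return sum(personal_history) / len(personal_history), "personal"

print(measurement_baseline([36.7, 36.6]))  # (36.5, 'provisional')
print(measurement_baseline([36.7] * 20))   # the user's own average
```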
  • an alert for prompting the use of the diagnostic system 1 is displayed on the image display unit 4, the user interface unit 11, or the display unit 18 every predetermined period.
  • the data is updated every predetermined period and the reliability of the diagnosis result is ensured.
  • to obtain image measurement values that are stable over time, the system information holding unit 22 sets and designates the timing for correcting (adjusting) the change over time in the characteristics of each device constituting the diagnostic system 1, and the characteristics of each device are measured and corrected at that timing.
  • alternatively, the user instructs the calibration timing, or calibration is performed by automatically detecting changes in the surrounding conditions from the image.
  • the user interface unit 11 issues an alert to prompt replacement.
  • a lesion or abnormality of the subject appears in the narrow-band spectral characteristics of the subject's surface; since the subject is illuminated with such colored light when photographed, abnormalities can easily be detected from the image data.
  • with the diagnostic system of the present invention, irradiating the subject with specific colored light makes lesions and abnormalities easier to detect from the image data and improves the separation of normal-state and abnormal-state data, so the accuracy of diagnosis in distinguishing a normal state from an abnormal state can be improved.
  • the "diagnostic system" in the present application may take any form as long as it has the described system configuration. Moreover, whether its components are combined into a single device or distributed over a plurality of devices, they fall within the scope of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

According to the invention, diagnostic accuracy in judging a healthy state and an abnormal state can be improved in a diagnostic system that analyzes images obtained from each part of a human body. The diagnostic system (1) comprises: an imaging device (2) having a subject illumination unit (5) for illuminating a subject with a light source of a particular color during imaging and an image capturing unit (6) for forming an image by converting the light reflected from the subject under that light source into pixel values; and a data management device (13), communicably connected to the imaging device (2) via a network, having a data management unit (22) that manages the image data transmitted from the imaging device (2) and an image analysis unit (15) that analyzes the image data transmitted from the imaging device (2) and judges whether the state is healthy or abnormal.
PCT/JP2005/021348 2004-12-17 2005-11-21 Systeme de diagnostic WO2006064635A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006548732A JPWO2006064635A1 (ja) 2004-12-17 2005-11-21 診断システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004366508 2004-12-17
JP2004-366508 2004-12-17

Publications (1)

Publication Number Publication Date
WO2006064635A1 true WO2006064635A1 (fr) 2006-06-22

Family

ID=36587696

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/021348 WO2006064635A1 (fr) 2004-12-17 2005-11-21 Systeme de diagnostic

Country Status (2)

Country Link
JP (1) JPWO2006064635A1 (fr)
WO (1) WO2006064635A1 (fr)

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008136778A (ja) * 2006-12-05 2008-06-19 Ministry Of National Defense Chung Shan Inst Of Science & Technology 医療情報と映像を結ぶ遠隔監視システム及びその方法
WO2008150343A1 (fr) * 2007-05-22 2008-12-11 Eastman Kodak Company Contrôle d'états physiologiques
JP2009273605A (ja) * 2008-05-14 2009-11-26 Konica Minolta Medical & Graphic Inc 動態画像診断支援システム
JP2013506523A (ja) * 2009-10-06 2013-02-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 第1の信号の少なくとも一つの周期的成分を特徴付けるための分析のため第1の信号を得るための方法及びシステム
WO2014030439A1 (fr) * 2012-08-20 2014-02-27 オリンパス株式会社 Système de surveillance d'un état biologique, procédé de surveillance d'un état biologique, et programme
WO2014087502A1 (fr) * 2012-12-05 2014-06-12 パイオニア株式会社 Dispositif de mesure, partie de sonde et câble de raccordement
JP2014518647A (ja) * 2011-04-12 2014-08-07 ダイムラー・アクチェンゲゼルシャフト 少なくとも1人の車両乗員を監視する方法及び装置、及び少なくとも1つのアシスタント装置を作動させるための方法
JP2014226515A (ja) * 2013-05-27 2014-12-08 セコム株式会社 診断支援システム
TWI556793B (zh) * 2015-07-09 2016-11-11 國立臺灣科技大學 非接觸式即時生理訊號及動作偵測方法
US9517029B2 (en) 2011-06-06 2016-12-13 Sony Corporation Image processing device, image processing method, image processing system, program, and recording medium
JP2017080517A (ja) * 2017-01-19 2017-05-18 パイオニア株式会社 計測装置、プローブ部及び接続ケーブル
JP2017220807A (ja) * 2016-06-08 2017-12-14 株式会社日立システムズ 音声データ収集システム
WO2018078868A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique et procédé et programme de diagnostic d'objets
WO2018078866A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique, et procédé et programme de diagnostic de plantes
WO2018078867A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique, et procédé et programme pour diagnostiquer les animaux
JP2022038639A (ja) * 2020-08-27 2022-03-10 キヤノンメディカルシステムズ株式会社 診断支援装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63173182A (ja) * 1987-01-13 1988-07-16 Olympus Optical Co Ltd 色彩画像処理方式
JPH11194051A (ja) * 1998-01-05 1999-07-21 Matsushita Electric Ind Co Ltd 肌色領域検出装置及び方法
WO2004012461A1 (fr) * 2002-07-26 2004-02-05 Olympus Optical Co., Ltd. Systeme de traitement d'image
JP2004209227A (ja) * 2002-12-27 2004-07-29 Unilever Nv 皮膚の画像診断方法および装置
JP2004344583A (ja) * 2003-05-26 2004-12-09 Minolta Co Ltd 診断支援システムおよび端末装置


Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008136778A (ja) * 2006-12-05 2008-06-19 Ministry Of National Defense Chung Shan Inst Of Science & Technology 医療情報と映像を結ぶ遠隔監視システム及びその方法
WO2008150343A1 (fr) * 2007-05-22 2008-12-11 Eastman Kodak Company Contrôle d'états physiologiques
JP2009273605A (ja) * 2008-05-14 2009-11-26 Konica Minolta Medical & Graphic Inc 動態画像診断支援システム
JP2013506523A (ja) * 2009-10-06 2013-02-28 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 第1の信号の少なくとも一つの周期的成分を特徴付けるための分析のため第1の信号を得るための方法及びシステム
US8938097B2 (en) 2009-10-06 2015-01-20 Koninklijke Philips N.V. Method and system for obtaining a first signal for analysis to characterize at least one periodic component thereof
US9524548B2 (en) 2009-10-06 2016-12-20 Koninklijke Philips N.V. Method and system for obtaining a first signal for analysis to characterize at least one periodic component thereof
US10140532B2 (en) 2011-04-12 2018-11-27 Daimler Ag Method and device for monitoring at least one vehicle occupant, and method for operating at least one assistance device
JP2014518647A (ja) * 2011-04-12 2014-08-07 ダイムラー・アクチェンゲゼルシャフト 少なくとも1人の車両乗員を監視する方法及び装置、及び少なくとも1つのアシスタント装置を作動させるための方法
US9517029B2 (en) 2011-06-06 2016-12-13 Sony Corporation Image processing device, image processing method, image processing system, program, and recording medium
WO2014030439A1 (fr) * 2012-08-20 2014-02-27 オリンパス株式会社 Système de surveillance d'un état biologique, procédé de surveillance d'un état biologique, et programme
JPWO2014087502A1 (ja) * 2012-12-05 2017-01-05 パイオニア株式会社 計測装置、プローブ部及び接続ケーブル
US10070797B2 (en) 2012-12-05 2018-09-11 Pioneer Corporation Measuring apparatus, probe portion, and connecting cable
WO2014087502A1 (fr) * 2012-12-05 2014-06-12 パイオニア株式会社 Dispositif de mesure, partie de sonde et câble de raccordement
JP2014226515A (ja) * 2013-05-27 2014-12-08 セコム株式会社 診断支援システム
TWI556793B (zh) * 2015-07-09 2016-11-11 國立臺灣科技大學 非接觸式即時生理訊號及動作偵測方法
JP2017220807A (ja) * 2016-06-08 2017-12-14 株式会社日立システムズ 音声データ収集システム
JPWO2018078868A1 (ja) * 2016-10-31 2018-10-25 株式会社オプティム コンピュータシステム、物体の診断方法及びプログラム
WO2018078867A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique, et procédé et programme pour diagnostiquer les animaux
WO2018078866A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique, et procédé et programme de diagnostic de plantes
WO2018078868A1 (fr) * 2016-10-31 2018-05-03 株式会社オプティム Système informatique et procédé et programme de diagnostic d'objets
JPWO2018078867A1 (ja) * 2016-10-31 2019-09-05 株式会社オプティム コンピュータシステム、動物の診断方法及びプログラム
JPWO2018078866A1 (ja) * 2016-10-31 2019-10-10 株式会社オプティム コンピュータシステム、植物の診断方法及びプログラム
US10643328B2 (en) 2016-10-31 2020-05-05 Optim Corporation Computer system, and method and program for diagnosing objects
US10685231B2 (en) 2016-10-31 2020-06-16 Optim Corporation Computer system, and method and program for diagnosing plants
US11100642B2 (en) 2016-10-31 2021-08-24 Optim Corporation Computer system, and method and program for diagnosing animals
JP2017080517A (ja) * 2017-01-19 2017-05-18 パイオニア株式会社 計測装置、プローブ部及び接続ケーブル
JP2022038639A (ja) * 2020-08-27 2022-03-10 キヤノンメディカルシステムズ株式会社 診断支援装置

Also Published As

Publication number Publication date
JPWO2006064635A1 (ja) 2008-06-12

Similar Documents

Publication Publication Date Title
WO2006064635A1 (fr) Systeme de diagnostic
JP2007125151A (ja) 診断システム及び診断装置
CN108289613B (zh) 用于生理监测的***、方法和计算机程序产品
JP5119921B2 (ja) 画像処理装置、画像処理システム及び画像処理プログラム
US9986913B2 (en) Method and system for analyzing physical conditions using digital images
CN105636506B (zh) 用于远程光体积描记法的自动相机调节
US7477767B2 (en) Systems and methods for analyzing skin conditions of people using digital images
AU2008223050B2 (en) Quantitative analysis of skin characteristics
US20090043210A1 (en) Data detection device and data detection method
US20100041968A1 (en) Image capture in combination with vital signs bedside monitor
US20120078113A1 (en) Convergent parameter instrument
JP2007295946A (ja) 酒気帯び検知システム及び酒気帯び検知方法
US20140221843A1 (en) Near-infrared imaging for diagnosis of sinusitis
WO2016067892A1 (fr) Dispositif de génération de degré de santé, système de génération de degré de santé, et programme
JPWO2014002255A1 (ja) 健康管理支援装置、方法およびプログラム
KR101862696B1 (ko) 실사와 컴퓨터 그래픽스 영상을 이용한 생체 정보 표시 시스템 및 그의 표시 방법
JP7051083B2 (ja) 心理状態判定方法、判定装置、判定システムおよび判定プログラム
JP2016052414A (ja) 概日リズム検査装置、概日リズム検査システムおよび概日リズム検査方法
JP2005094185A (ja) 画像処理システム、画像処理装置、および撮像制御方法
JP2007086872A (ja) 管理システム
WO2007035829A2 (fr) Systemes et procedes destines a analyser des affections cutanees chez des personnes au moyen d'images numeriques
JP2004344583A (ja) 診断支援システムおよび端末装置
TWI489306B (zh) Health status assessment methods and the use of the method of health assessment system
Andrushevich et al. Open smart glasses development platform for AAL applications
KR20140092486A (ko) 패치형 생체 신호 측정 장치

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2006548732

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05809157

Country of ref document: EP

Kind code of ref document: A1

WWW Wipo information: withdrawn in national office

Ref document number: 5809157

Country of ref document: EP