GB2609147A - Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device, - Google Patents


Info

Publication number
GB2609147A
GB2609147A GB2215730.9A GB202215730A GB2609147A GB 2609147 A GB2609147 A GB 2609147A GB 202215730 A GB202215730 A GB 202215730A GB 2609147 A GB2609147 A GB 2609147A
Authority
GB
United Kingdom
Prior art keywords
affected area
image data
image processing
information
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB2215730.9A
Other versions
GB202215730D0 (en)
GB2609147B (en)
Inventor
Goto Atsushi
Sugimoto Takashi
Kawai Yoshikazu
Hitaka Yosato
Kuroda Tomoki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2019095938A external-priority patent/JP2020123304A/en
Application filed by Canon Inc filed Critical Canon Inc
Publication of GB202215730D0 publication Critical patent/GB202215730D0/en
Publication of GB2609147A publication Critical patent/GB2609147A/en
Application granted granted Critical
Publication of GB2609147B publication Critical patent/GB2609147B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/445Evaluating skin irritation or skin trauma, e.g. rash, eczema, wound, bed sore
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • A61B5/0004Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
    • A61B5/0013Medical image data
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/107Measuring physical dimensions, e.g. size of the entire body or parts thereof
    • A61B5/1079Measuring physical dimensions, e.g. size of the entire body or parts thereof using optical or photographic means
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/44Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
    • A61B5/441Skin evaluation, e.g. for skin disorder diagnosis
    • A61B5/447Skin evaluation, e.g. for skin disorder diagnosis specially adapted for aiding the prevention of ulcer or pressure sore development, i.e. before the ulcer or sore has developed
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/20ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/117Identification of persons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30088Skin; Dermal
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for calculating health indices; for individual health risk assessment

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Dermatology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Geometry (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • General Business, Economics & Management (AREA)
  • Quality & Reliability (AREA)
  • Physiology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Endoscopes (AREA)

Abstract

To provide an image processing system improving the user-friendliness in evaluation of an affected area. The image processing apparatus comprises communication means for acquiring image data over a communication network, arithmetic means for extracting an affected area of a subject from the image data, and storage means for storing information. The communication means outputs information indicating a result of extraction of the affected area extracted by the arithmetic means to an external apparatus over the communication network and acquires information of the affected area from the external apparatus over the communication network, which is then stored in the storage means.

Description

DESCRIPTION
TITLE OF INVENTION
IMAGE PROCESSING SYSTEM, IMAGING APPARATUS, IMAGE PROCESSING APPARATUS, ELECTRONIC DEVICE, METHODS OF CONTROLLING THE SYSTEM, THE APPARATUSES, AND THE DEVICE, AND STORAGE MEDIUM STORING THE METHODS
TECHNICAL FIELD
[0001] The present invention relates to a technology to evaluate a certain area of a subject from an image.
BACKGROUND ART
[0002] In a state in which a person or an animal lies down, a contact region between the body and a floor, a mat, or a mattress below the body is compressed by the body weight.
[0003] If the same posture is continued, vascular insufficiency occurs in the contact region between the floor and the body to cause necrosis of the surrounding tissue. The state in which the tissue necrosis occurs is called pressure ulcer or bedsore. It is necessary to give pressure ulcer care, such as body pressure dispersion and skin care, to the patient developing the pressure ulcer, and to periodically evaluate and manage the pressure ulcer.
[0004] Measurement of the size of the pressure ulcer is known as one method of evaluating the pressure ulcer.
[0005] For example, DESIGN-R (registered trademark), which is an evaluation index of the pressure ulcer developed by the Academic Education Committee of the Japanese Society of Pressure Ulcers, is known as an example in which the size of the pressure ulcer is used in the evaluation, as described in NPL1.
[0006] The DESIGN-R (registered trademark) is a tool to evaluate the healing process of a wound, such as the pressure ulcer. This tool is named from the initial letters of evaluation items: Depth, Exudate, Size, Inflammation/Infection, Granulation, and Necrotic tissue. Pocket is also included in the evaluation items, in addition to the above evaluation items, although the initial letter of Pocket is not used in the name.
[0007] The DESIGN-R (registered trademark) is classified into two groups: one for classification of the severity level, which is used for common and simple evaluation, and one for process evaluation in which the flow of the healing process is indicated in detail. In the DESIGN-R (registered trademark) for classification of the severity level, the six evaluation items are classified into two: mild and severe. The mild evaluation items are represented using lowercase letters of the alphabet and the severe evaluation items are represented using capital letters of the alphabet.
[0008] The evaluation using the DESIGN-R (registered trademark) for classification of the severity level in the first treatment enables the rough state of the pressure ulcer to be figured out. Since the item having a problem is revealed, it is possible to easily determine the treatment policy.
[0009] The DESIGN-R (registered trademark) capable of comparison of the severity level between patients, in addition to the process evaluation, is also defined as the DESIGN-R (registered trademark) for process evaluation. Here, R represents Rating (evaluation and rating). Different weights are added to the respective items and the sum (0 points to 66 points) of the weights of the six items excluding the depth represents the severity level of the pressure ulcer. With the DESIGN-R (registered trademark), it is possible to objectively evaluate the course of treatment in detail after the treatment is started to enable the comparison of the severity level between the patients, in addition to the evaluation of the course of an individual.
[0010] In the evaluation of the size in the DESIGN-R (registered trademark), the major axis length (cm) and the minor axis length (the maximum diameter orthogonal to the major axis length) (cm) of a skin injury range are measured, and the size, which is the numerical value given by multiplying the major axis length by the minor axis length, is classified into seven stages. The seven stages are s0: no skin injury, s3: lower than 4, s6: not lower than 4 and lower than 16, s8: not lower than 16 and lower than 36, s9: not lower than 36 and lower than 64, s12: not lower than 64 and lower than 100, and s15: not lower than 100.
[0011] Currently, the evaluation of the size of the pressure ulcer is often based on the value resulting from manual measurement of an affected area using a measure. Specifically, the maximum straight-line distance between two points in the skin injury range is measured and the measured distance is used as the major axis length. The length orthogonal to the major axis length is used as the minor axis length, and the value given by multiplying the major axis length by the minor axis length is set as the size of the pressure ulcer.
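The seven-stage size classification above reduces to a threshold lookup on the product of the two axis lengths. The following Python sketch encodes that table; the function name and the treatment of a zero product as "no skin injury" are illustrative assumptions, not part of the patent.

```python
def design_r_size_grade(major_cm: float, minor_cm: float) -> str:
    """Map major x minor axis lengths (cm) to the DESIGN-R size grade s0-s15.

    Illustrative sketch of the seven-stage classification described above;
    the name and the zero-product handling are assumptions.
    """
    size_value = major_cm * minor_cm  # size = major axis length x minor axis length
    if size_value == 0:
        return "s0"   # no skin injury (assumed to correspond to a zero product)
    if size_value < 4:
        return "s3"
    if size_value < 16:
        return "s6"
    if size_value < 36:
        return "s8"
    if size_value < 64:
        return "s9"
    if size_value < 100:
        return "s12"
    return "s15"

# Example: a 5.5 cm x 3.0 cm skin injury gives 16.5, i.e. grade s8.
print(design_r_size_grade(5.5, 3.0))
```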
CITATION LIST
NON-PATENT LITERATURE
[0012] NPL1: Pressure ulcer Guidebook Edition 2 Pressure ulcer prevention & management Guideline (Edition 4) Compliance. Edited by Japanese Society of Pressure Ulcers, ISBN13 978-4796523608. Shourin-sha. pp. 23.
[0013] However, the pressure ulcer often has a complicated shape and it is necessary to adjust the usage of the measure in the manual evaluation of the size of the pressure ulcer. Since it is necessary to perform the above work at least two times to measure the major axis length and the minor axis length, it takes time and imposes a heavy workload. In addition, since the patient whose pressure ulcer is to be evaluated is required to keep the same posture during the work, the manual evaluation of the size of the pressure ulcer is considered to impose a heavy burden on the patient.
[0014] It is recommended to perform the rating once per week or once per two weeks in the DESIGN-R (registered trademark), and it is necessary to perform the measurement repeatedly. In addition, the position determined to be the major axis of the pressure ulcer may vary among individuals in the manual measurement, and it is difficult to ensure the accuracy of the measurement.
[0015] Although the example is described above in which the evaluation of the pressure ulcer is performed based on the DESIGN-R (registered trademark), the above description is not limited to the case of the DESIGN-R (registered trademark) and similar problems occur regardless of the method of measuring the size of the pressure ulcer. It is necessary to perform the manual measurement at multiple places to calculate the area of the pressure ulcer and, thus, a workload is incurred.
[0016] As another problem, the evaluation items of the pressure ulcer include evaluation items that are desirably determined visually, in addition to the evaluation items, such as the size, that are measured. The evaluation items that should be visually determined are subsequently input by an evaluator onto an electronic health record or a paper medium while watching the captured image data. In this case, since the input device used for the information indicating the size is different from the input device used for the other information, the input operation becomes complicated and omission is likely to occur.
[0017] These problems are not limited to the pressure ulcer and similar problems occur for an affected area, such as a burn injury or a laceration, on the body surface.
SUMMARY OF INVENTION
[0018] An image processing system of one aspect of the present invention includes an imaging apparatus and an image processing apparatus and is characterized in that the imaging apparatus includes imaging means for receiving light from a subject to generate image data, first communication means for outputting the image data to a communication network, and display means for displaying an image based on the image data generated by the imaging means, the image processing apparatus includes second communication means for acquiring the image data over the communication network and arithmetic means for extracting a certain area of the subject from the image data, the second communication means outputs information indicating a result of extraction of the certain area extracted by the arithmetic means to the communication network, the first communication means acquires the information indicating the result of extraction of the certain area over the communication network, and the display means performs display based on the information indicating the result of extraction of the certain area.
[0019] An imaging apparatus of another aspect of the present invention includes imaging means for receiving light from a subject to generate image data, communication means for outputting the image data to an external apparatus over a communication network, and display means for displaying an image based on the image data generated by the imaging means and is characterized in that the communication means acquires information indicating a result of extraction of a certain area of the subject in the image data from the external apparatus over the communication network and the display means performs display based on the information indicating the result of extraction of the certain area.
[0020] An image processing apparatus of another aspect of the present invention includes communication means for acquiring image data and distance information corresponding to a subject included in the image data from an imaging apparatus over a communication network and arithmetic means for extracting a certain area of the subject from the image data and calculating a size of the certain area based on the distance information and is characterized in that the communication means outputs information indicating a result of extraction of the certain area extracted by the arithmetic means and information indicating the size to the imaging apparatus over the communication network.
[0021] An imaging apparatus of another aspect of the present invention includes imaging means for receiving light from a subject to generate image data, control means for acquiring a result of extraction of a certain area of the subject in the image data, and interface means for causing a user to input evaluation values of multiple predetermined evaluation items in the certain area of the subject and is characterized in that the control means associates the evaluation values of the input multiple evaluation items with the image data.
[0022] An electronic device of another aspect of the present invention is characterized by including communication means for acquiring image data generated by an imaging apparatus and information indicating evaluation values of multiple evaluation items for an affected area of a subject in the image data, which is input by a user with the imaging apparatus, over a communication network and control means for causing display means to display an image based on the image data and the evaluation values of the multiple evaluation items.
BRIEF DESCRIPTION OF DRAWINGS
FIG. 1 is a diagram schematically illustrating an image processing system according to a first embodiment.
FIG. 2 is a diagram illustrating an example of the hardware configuration of an imaging apparatus included in the image processing system.
FIG. 3 is a diagram illustrating an example of the hardware configuration of an image processing apparatus included in the image processing system.
FIG. 4 is a work flow chart illustrating the operation of the image processing system according to the first embodiment.
FIG. 5 is a diagram for describing how the area of an affected area is calculated.
FIG. 6A is a diagram for describing image data including an affected area.
FIG. 6B is a diagram for describing how information indicating the result of extraction of the affected area and information indicating the size of the affected area are superimposed on the image data.
FIG. 7A is a diagram for describing a method of superimposing the information indicating the result of extraction of the affected area and information that includes a major axis length and a minor axis length of the affected area and that indicates the size of the affected area on the image data.
FIG. 7B is a diagram for describing another method of superimposing the information indicating the result of extraction of the affected area and the information that includes the major axis length and the minor axis length of the affected area and that indicates the size of the affected area on the image data.
FIG. 7C is a diagram for describing another method of superimposing the information indicating the result of extraction of the affected area and the information that includes the major axis length and the minor axis length of the affected area and that indicates the size of the affected area on the image data.
FIG. 8A is a diagram for describing a method of causing a user to input information about a region of the affected area.
FIG. 8B is a diagram for describing the method of causing the user to input the information about the region of the affected area.
FIG. 8C is a diagram for describing a method of causing the user to input information about an evaluation value of the affected area.
FIG. 8D is a diagram for describing the method of causing the user to input the information about the evaluation value of the affected area.
FIG. 8E is a diagram for describing the method of causing the user to input the information about the evaluation value of the affected area.
FIG. 8F is a diagram for describing another method of causing the user to input the information about the region of the affected area.
FIG. 8G is a diagram for describing another method of causing the user to input the information about the region of the affected area.
FIG. 9 is a work flow chart illustrating the operation of an image processing system according to a second embodiment.
FIG. 10 is a diagram schematically illustrating an image processing system according to a third embodiment.
FIG. 11 is a work flow chart illustrating the operation of the image processing system according to the third embodiment.
FIG. 12A is a diagram for describing a method of displaying the information about the region of the affected area for which the evaluation value has been acquired.
FIG. 12B is a diagram for describing a method of displaying the information about the evaluation value of the affected area, which has been acquired.
FIG. 13 is a diagram for describing an example of a data selection window displayed in a browser of a terminal apparatus.
FIG. 14 is a diagram for describing an example of a data browsing window displayed in the browser of the terminal apparatus.
FIG. 15 is a work flow chart illustrating a modification of the operation of the image processing system according to the third embodiment.
DESCRIPTION OF EMBODIMENTS
[0024] An object of the embodiments is to improve the user-friendliness in evaluation of a certain area of a subject.
[0025] Exemplary embodiments of the present invention will herein be described in detail with reference to the drawings.
[0026] (First embodiment) An image processing system according to an embodiment of the present invention will now be described with reference to FIG. 1 to FIG. 3. FIG. 1 is a diagram schematically illustrating an image processing system 1 according to a first embodiment. The image processing system 1 is composed of an imaging apparatus 200, which is a portable handheld device, and an image processing apparatus 300. In the present embodiment, an example of a clinical condition of an affected area 102 of a subject 101 is described as the pressure ulcer over the hip.
[0027] In the image processing system 1 according to the embodiment of the present invention, the imaging apparatus 200 shoots the affected area 102 of the subject 101, acquires a subject distance, and transmits the data to the image processing apparatus 300. The image processing apparatus 300 extracts the affected area from the received image data, measures the area per one pixel of the image data based on the information including the subject distance, and measures the area of the affected area 102 from the result of extraction of the affected area 102 and the area per one pixel. Although the example is described in the present embodiment in which the affected area 102 is the pressure ulcer, the affected area 102 is not limited to this and may be a burn injury or a laceration.
[0028] FIG. 2 is a diagram illustrating an example of the hardware configuration of the imaging apparatus 200 included in the image processing system 1. For example, a common single-lens camera, a compact digital camera, or a smartphone or a tablet provided with a camera having an automatic focus function may be used as the imaging apparatus 200.
[0029] An imaging unit 211 includes a lens group 212, a shutter 213, and an image sensor 214. Changing the positions of multiple lenses included in the lens group 212 enables the focus position and the zoom magnification to be varied. The lens group 212 also includes a diaphragm for adjusting the amount of exposure.
[0030] The image sensor 214 is composed of a charge-storage-type solid-state image sensor, such as a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) sensor, which converts an optical image into image data. An image is formed on the image sensor 214 from reflected light from the subject through the lens group 212 and the shutter 213. The image sensor 214 generates an electrical signal corresponding to the subject image and outputs the image data based on the electrical signal.
[0031] The shutter 213 performs exposure and light shielding to the image sensor 214 by opening and closing a shutter blade member to control the exposure time of the image sensor 214. An electronic shutter that controls the exposure time in response to driving of the image sensor 214 may be used, instead of the shutter 213. When the electronic shutter is operated using the CMOS sensor, a reset process is performed to set the accumulation of charge of the pixel to zero for each pixel or for each area (for example, for each line) composed of multiple pixels. Then, a scanning process is performed to read out a signal corresponding to the accumulation of charge after a predetermined time for each pixel or area for which the reset process is performed.
[0032] A zoom control circuit 215 controls a motor (not illustrated) for driving a zoom lens included in the lens group 212 to control the optical magnification of the lens group 212. The lens group 212 may be a single focus lens group without a zoom function. In this case, it is not necessary to provide the zoom control circuit 215.
[0033] A ranging system 216 calculates distance information to the subject. A common phase-difference-type ranging sensor installed in a single-lens reflex camera may be used as the ranging system 216, or a system using a time of flight (TOF) sensor may be used as the ranging system 216. The TOF sensor is a sensor that measures the distance to an object based on the time difference (or the phase difference) between the timing when irradiation waves are transmitted and the timing when reflected waves resulting from reflection of the irradiation waves from the object are received. In addition, for example, a position sensitive device (PSD) method using the PSD as a photo detector may be used for the ranging system 216.
[0034] Alternatively, the image sensor 214 may have a configuration which includes multiple photoelectric conversion areas for each pixel and in which the pupil positions corresponding to the multiple photoelectric conversion areas included in a common pixel are varied. With this configuration, the ranging system 216 is capable of calculating the distance information for each pixel or for each area position from the phase difference between the images which are output from the image sensor 214 and which are acquired from the photoelectric conversion areas corresponding to the respective pupil areas.
[0035] The ranging system 216 may have a configuration in which the distance information in one or multiple predetermined ranging areas in the image is calculated, or may have a configuration in which a distance map indicating the distribution of the pieces of distance information in multiple pixels or areas in the image is acquired.
[0036] Alternatively, the ranging system 216 may perform TV-auto focus (AF) or contrast AF, in which the high-frequency components of the image data are extracted for integration and the position of a focus lens having the maximum integration value is determined, to calculate the distance information from the position of the focus lens.
[0037] An image processing circuit 217 performs predetermined image processing to the image data output from the image sensor 214. The image processing circuit 217 performs a variety of image processing, such as white balance adjustment, gamma correction, color interpolation, demosaicing, and filtering, to image data output from the imaging unit 211 or image data recorded in an internal memory 221. In addition, the image processing circuit 217 performs a compression process to the image data subjected to the image processing according to, for example, the Joint Photographic Experts Group (JPEG) standard.
[0038] An AF control circuit 218 determines the position of the focus lens included in the lens group 212 based on the distance information calculated in the ranging system 216 to control a motor that drives the focus lens.
[0039] A communication unit 219 is a wireless communication module used by the imaging apparatus 200 to communicate with an external device, such as the image processing apparatus 300, over a wireless communication network (not illustrated). A specific example of the network is a network based on the Wi-Fi standard. The communication using the Wi-Fi may be realized using a router. The communication unit 219 may be realized by a wired communication interface, such as universal serial bus (USB) or local area network (LAN).
[0040] A system control circuit 220 includes a central processing unit (CPU) and controls the respective blocks in the imaging apparatus 200 in accordance with programs stored in the internal memory 221 to control the entire imaging apparatus 200. In addition, the system control circuit 220 controls the imaging unit 211, the zoom control circuit 215, the ranging system 216, the image processing circuit 217, the AF control circuit 218, and so on. The system control circuit 220 may use a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or the like, instead of the CPU.
[0041] The internal memory 221 is composed of a rewritable memory, such as a flash memory or a synchronous dynamic random access memory (SDRAM). The internal memory 221 temporarily stores a variety of setup information, including information about the point of focus and the zoom magnification in image capturing, which is necessary for the operation of the imaging apparatus 200, the image data captured by the imaging unit 211, and the image data subjected to the image processing in the image processing circuit 217. The internal memory 221 may temporarily record, for example, the image data and analysis data including information indicating the size of the subject, which are received through the communication with the image processing apparatus 300 by the communication unit 219.
[0042] An external memory interface (I/F) 222 is an interface with a non-volatile storage medium, such as a secure digital (SD) card or a compact flash (CF) card, which is capable of being loaded in the imaging apparatus 200. The external memory I/F 222 records the image data processed in the image processing circuit 217 and the image data, the analysis data, and so on received through the communication with the image processing apparatus 300 by the communication unit 219 on the storage medium, which is capable of being loaded in the imaging apparatus 200. The external memory I/F 222 may read out the image data recorded on the storage medium, which is capable of being loaded in the imaging apparatus 200, and may output the image data that is read out to the outside of the imaging apparatus in playback.
[0043] A display unit 223 is a display composed of, for example, a thin film transistor (TFT) liquid crystal display, an organic electroluminescent (EL) display, or an electronic viewfinder (EVF). The display unit 223 displays an image based on the image data temporarily stored in the internal memory 221, an image based on the image data stored in the storage medium, which is capable of being loaded in the imaging apparatus, a setup screen of the imaging apparatus 200, and so on.
[0044] An operation member 224 is composed of, for example, buttons, switches, keys, and a mode dial, which are provided on the imaging apparatus 200, or a touch panel, which is used also as the display unit 223. An instruction from a user to, for example, set a mode or instruct shooting is supplied to the system control circuit 220 through the operation member 224.
[0045] The imaging unit 211, the zoom control circuit 215, the ranging system 216, the image processing circuit 217, the AF control circuit 218, the communication unit 219, the system control circuit 220, the internal memory 221, the external memory I/F 222, the display unit 223, and the operation member 224 are connected to a common bus 225. The common bus 225 is a signal line for transmission and reception of signals between the respective blocks.
[0046] FIG. 3 is a diagram illustrating an example of the hardware configuration of the image processing apparatus 300 included in the image processing system 1. The image processing apparatus 300 includes an arithmetic unit 311 composed of a CPU, a storage unit 312, a communication unit 313, an output unit 314, and an auxiliary arithmetic unit 317. The storage unit 312 is composed of a main storage unit 315 (for example, a read only memory (ROM) or a random access memory (RAM)) and an auxiliary storage unit 316 (for example, a magnetic disk drive or a solid state drive (SSD)).
[0047] The communication unit 313 is composed as a wireless communication module for communication with an external device via the communication network. The output unit 314 outputs data processed in the arithmetic unit 311 and data stored in the storage unit 312 to a display, a printer, or an external network connected to the image processing apparatus 300.
[0048] The auxiliary arithmetic unit 317 is an integrated circuit (IC) for auxiliary arithmetic operation used under the control of the arithmetic unit 311. A graphics processing unit (GPU) may be used as an example of the auxiliary arithmetic unit. Since the GPU includes multiple product-sum operators and excels in matrix calculation, although the GPU is originally a processor for image processing, the GPU is also often used as a processor that performs machine learning processing. The GPU is generally used in a deep learning process. For example, the Jetson TX2 Module manufactured by NVIDIA Corporation may be used as the auxiliary arithmetic unit 317. The FPGA or the ASIC may be used as the auxiliary arithmetic unit 317. The auxiliary arithmetic unit 317 extracts the affected area 102 of the subject 101 from the image data.
[0049] The arithmetic unit 311 is capable of realizing various functions including arithmetic processing for calculating the size and the length of the affected area 102 extracted by the auxiliary arithmetic unit 317 by executing programs stored in the storage unit 312. In addition, the arithmetic unit 311 controls the order in which the respective functions are performed.
[0050] The image processing apparatus 300 may include one arithmetic unit 311 and one storage unit 312 or multiple arithmetic units 311 and multiple storage units 312. In other words, the image processing apparatus 300 performs the functions described below when at least one processing unit (CPU) is connected to at least one storage unit and the at least one processing unit executes a program stored in the at least one storage unit. Instead of the CPU, the FPGA, the ASIC, or the like may be used as the arithmetic unit 311.
[0051] FIG. 4 is a work flow chart illustrating the operation of the image processing system 1 according to the first embodiment. Referring to FIG. 4, a step is denoted by S. In other words, Step 401 is denoted by S401. The same applies to FIG. 9, FIG. 11, and FIG. 15 described below.
[0052] In the work flow chart in FIG. 4, Step 401 to Step 420 are performed by the imaging apparatus 200 and Step 431, Step 441 to Step 445, and Step 451 to Step 456 are performed by the image processing apparatus 300.
[0053] First, the imaging apparatus 200 and the image processing apparatus 300 are connected to a network (not illustrated) conforming to the Wi-Fi standard, which is a wireless LAN standard. In Step 431, the image processing apparatus 300 performs a search process for the imaging apparatus 200 to which the image processing apparatus 300 is to be connected. In Step 401, the imaging apparatus 200 performs a response process in response to the search process. For example, Universal Plug and Play (UPnP) is used as a technology to search for a device over the network. In the UPnP, the individual apparatuses are identified using universally unique identifiers (UUIDs).
[0054] In response to connection of the imaging apparatus 200 to the image processing apparatus 300, in Step 402, the imaging apparatus 200 starts a live view process. The imaging unit 211 generates image data and the image processing circuit 217 applies a developing process necessary for generating the image data for live view display to the image data. Repeating these processes causes a live view video of a certain frame rate to be displayed in the display unit 223.
[0055] In Step 403, the ranging system 216 calculates the distance information about the subject using any of the methods described above and the AF control circuit 218 starts an AF process to drive and control the lens group 212 so that the subject is in focus. When the point of focus is adjusted using the TV-AF or the contrast AF, the distance information from the position of the focus lens in the in-focus state to the subject 101 that is in focus is calculated. The position that is to be in focus may be the subject positioned at the center of the image data or the subject existing at the position closest to the imaging apparatus 200. When the distance map of the subject is acquired, a target area may be estimated from the distance map and the focus lens may be focused on the position. Alternatively, when the position of the pressure ulcer 102 on a live view image is identified by the image processing apparatus 300, the focus lens may be focused on the position of the pressure ulcer on the live view image. The imaging apparatus 200 repeatedly performs the display of the live view video and the AF process until depression of a release button is detected in Step 410.
[0056] In Step 404, the image processing circuit 217 performs the developing process and the compression process to any image data captured for the live view to generate, for example, the image data conforming to the JPEG standard. Then, the image processing circuit 217 performs a resizing process to the image data subjected to the compression process to reduce the size of the image data.
[0057] In Step 405, the communication unit 219 acquires the image data subjected to the resizing process in Step 404 and the distance information calculated in Step 403. In addition, the communication unit 219 acquires information about the zoom magnification and information about the size (the number of pixels) of the image data subjected to the resizing process. When the imaging unit 211 has the single focus without the zoom function, it is not necessary to acquire the information about the zoom magnification.
[0058] In Step 406, the communication unit 219 transmits the image data acquired in Step 405 and at least one piece of information including the distance information to the image processing apparatus 300 through the wireless communication.
[0059] Since it takes a longer time to perform the wireless communication with the increasing size of the image data to be transmitted, the size of the image data after the resizing process in Step 405 is determined in consideration of a permitted communication time. However, since the accuracy of extraction of the affected area, which is performed by the image processing apparatus 300 in Step 442 described below, is influenced if the image data has an excessively reduced size, it is necessary to consider the accuracy of the extraction of the affected area, in addition to the communication time.
[0060] Step 404 to Step 406 may be performed for each frame or may be performed once per several frames.
[0061] The operation goes to description of the steps performed by the image processing apparatus 300.
[0062] In Step 441, the communication unit 313 in the image processing apparatus 300 receives the image data and the at least one piece of information including the distance information, which are transmitted from the communication unit 219 in the imaging apparatus 200.
[0063] In Step 442, the arithmetic unit 311 and the auxiliary arithmetic unit 317 in the image processing apparatus 300 extract the affected area 102 of the subject 101 from the image data received in Step 441. As the method of extracting the affected area 102, semantic segmentation using the deep learning is performed. Specifically, a high-performance computer for learning (not illustrated) is caused to learn a neural network model using multiple actual pressure ulcer images as teacher data in advance to generate a learned model. The auxiliary arithmetic unit 317 receives the generated learned model from the high-performance computer and estimates the area of the pressure ulcer, which is the affected area 102, from the image data based on the learned model. A fully convolutional network (FCN), which is the segmentation model using the deep learning, is applied as an example of the neural network model. Here, inference of the deep learning is processed by the auxiliary arithmetic unit 317, which excels in parallel execution of the product-sum operation. The inference process may be performed by the FPGA or the ASIC. The area segmentation may be realized using another deep learning model. The segmentation method is not limited to the deep learning and, for example, graph cut, area growth, edge detection, divide and conquer, or the like may be used as the segmentation method. In addition, learning of the neural network model using the image of the pressure ulcer as the teacher data may be performed in the auxiliary arithmetic unit 317.
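As a concrete illustration of the inference in Step 442, the following sketch runs a two-class (background / affected area) FCN on the received image data. The patent does not name a framework; PyTorch/torchvision, the FCN-ResNet50 backbone, and the weight file name are assumptions standing in for the learned model received from the high-performance computer.

```python
import numpy as np
import torch
import torchvision
from PIL import Image

# Hypothetical learned model: an FCN trained elsewhere on pressure ulcer images.
model = torchvision.models.segmentation.fcn_resnet50(num_classes=2)
model.load_state_dict(torch.load("pressure_ulcer_fcn.pth"))  # assumed weight file
model.eval()

preprocess = torchvision.transforms.Compose([
    torchvision.transforms.ToTensor(),
    torchvision.transforms.Normalize(mean=[0.485, 0.456, 0.406],
                                     std=[0.229, 0.224, 0.225]),
])

def extract_affected_area(image: Image.Image) -> np.ndarray:
    """Return a binary mask (H x W) in which 1 marks the estimated affected area."""
    x = preprocess(image).unsqueeze(0)           # (1, 3, H, W)
    with torch.no_grad():
        logits = model(x)["out"]                 # (1, 2, H, W) class scores
    return logits.argmax(dim=1).squeeze(0).numpy().astype(np.uint8)

mask = extract_affected_area(Image.open("received_image.jpg").convert("RGB"))
print("extracted pixels:", int(mask.sum()))
```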
[0064] In Step 443, the arithmetic unit 311 calculates the area of the affected area 102 as information indicating the size of the affected area 102 extracted by the auxiliary arithmetic unit 317.
[0065] FIG. 5 is a diagram for describing how the area of the affected area 102 is calculated. The imaging apparatus 200, which is a common camera, is capable of being treated as the pin-hole model illustrated in FIG. 5. Incident light 501 passes through the principal point of a lens 212a and is received on the imaging plane of the image sensor 214. When the lens group 212 is approximated to the thin single lens 212a, the principal point at the front side is considered to coincide with the principal point at the back side. Adjusting the point of focus of the lens 212a so that an image is formed on the planar surface of the image sensor 214 enables the imaging apparatus to be focused on a subject 504. Varying a focal length 502, which is the distance from the imaging plane to the principal point of the lens, varies an angle of view 503 to vary the zoom magnification. At this time, a width 506 of the subject on the focal plane is geometrically determined from the relationship between the angle of view 503 of the imaging apparatus and a subject distance 505. The width 506 of the subject is calculated using a trigonometric function. Specifically, the width 506 of the subject is determined based on the relationship between the angle of view 503, which is varied with the focal length 502, and the subject distance 505. The value of the width 506 of the subject is divided by the number of pixels on each line of the image data to calculate the length on the focal plane corresponding to one pixel on the image data.
[0066] Accordingly, the arithmetic unit 311 calculates the area of the affected area 102 as the product of the number of pixels in the extracted area, which is acquired from the result of extraction of the affected area in Step 442, and the area of one pixel, which is acquired from the length on the focal plane corresponding to one pixel on the image. The length on the focal plane corresponding to one pixel on the image, which corresponds to the combination of the focal length 502 and the subject distance 505, may be calculated in advance to be prepared as table data. The image processing apparatus 300 may store the table data corresponding to the imaging apparatus 200 in advance.
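Written out, the relation above gives the length on the focal plane corresponding to one pixel and, from it, the area of the affected area. The sketch below assumes the sensor width is available (an input the patent does not list explicitly) and a planar subject perpendicular to the optical axis.

```python
import math

def length_per_pixel_cm(focal_length_mm: float, subject_distance_mm: float,
                        sensor_width_mm: float, pixels_per_line: int) -> float:
    """Length on the focal plane corresponding to one pixel, in cm (sketch).

    The half angle of view 503 follows from the sensor width and the focal
    length 502, and the subject width 506 follows from that angle and the
    subject distance 505.
    """
    half_angle = math.atan((sensor_width_mm / 2.0) / focal_length_mm)
    subject_width_mm = 2.0 * subject_distance_mm * math.tan(half_angle)
    return (subject_width_mm / pixels_per_line) / 10.0   # mm -> cm

def affected_area_cm2(extracted_pixel_count: int, focal_length_mm: float,
                      subject_distance_mm: float, sensor_width_mm: float,
                      pixels_per_line: int) -> float:
    """Area of the extracted region = pixel count x area of one pixel (cm^2)."""
    lpp = length_per_pixel_cm(focal_length_mm, subject_distance_mm,
                              sensor_width_mm, pixels_per_line)
    return extracted_pixel_count * lpp * lpp

# Assumed example values: 18 mm focal length, 500 mm subject distance,
# 23.5 mm sensor width, 1,440 pixels per line, 20,000 extracted pixels.
print(round(affected_area_cm2(20000, 18.0, 500.0, 23.5, 1440), 2), "cm^2")
```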
[0067] In order to accurately calculate the area of the affected area 102 using the above method, it is assumed that the subject 504 is a planar surface and that the planar surface is perpendicular to the optical axis. If the distance information received in Step 441 is the distance information or the distance map at multiple positions in the image data, the inclination or the variation in the depth direction of the subject may be detected to calculate the area based on the detected inclination or variation.
[0068] In Step 444, the arithmetic unit 311 generates image data resulting from superimposition of information indicating the result of extraction of the affected area 102 and the information indicating the size of the affected area 102 on the image data used for the extraction of the affected area 102.
[0069] FIG. 6A and FIG. 6B are diagrams illustrating how the information indicating the result of extraction of the affected area 102 and the information indicating the size of the affected area 102 are superimposed on the image data. An image 601 in FIG. 6A is an image displayed using the image data before the superimposition process and includes the subject 101 and the affected area 102.
A superimposed image 602 in FIG. 6B is an image based on the image data after the superimposition process. FIG. 6A and FIG. 6B indicate that the affected area 102 is close to the hip.
[0070] The arithmetic unit 311 superimposes a label 611 at the upper left corner of the superimposed image 602. A character string 612 indicating the area value of the affected area 102 is displayed on the label 611 with white characters on a black background as the information indicating the size of the affected area 102.
[0071] The background color and the color of the character string on the label 611 are not limited to black and white, respectively, as long as the background and the character string are easily visible. A transmittance may be set and alpha blending may be performed with the set transmittance to enable confirmation of the portion on which the label is superimposed.
[0072] In addition, an index 613 indicating an estimated area of the affected area 102, extracted in Step 442, is superimposed on the superimposed image 602. Performing the alpha blending of the index 613 indicating the estimated area and the image data on which the image 601 is based for superimposition at the position where the estimated area exists enables the user to confirm whether the estimated area on which the area of the affected area is based is appropriate. The color of the index 613 indicating the estimated area is desirably not equal to the color of the subject. The transmittance of the alpha blending is desirably within a range in which the estimated area is capable of being recognized and the original affected area 102 is also capable of being confirmed. Since the user is capable of confirming whether the estimated area is appropriate without the display of the label 611 when the index 613 indicating the estimated area of the affected area 102 is superimposed, Step 443 may be omitted.
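For illustration, the superimposition in Step 444 can be sketched as an alpha blend of the index 613 over the image plus the label 611 drawn at the upper left corner. NumPy/OpenCV, the green index colour, and the transmittance value are assumptions, not choices made by the patent.

```python
import cv2
import numpy as np

def superimpose_result(image_bgr: np.ndarray, mask: np.ndarray,
                       area_cm2: float, alpha: float = 0.4) -> np.ndarray:
    """Alpha-blend the estimated area (index 613) and draw the area label (611)."""
    out = image_bgr.copy()
    index_color = np.array([0, 255, 0], dtype=np.float32)  # chosen not to match the subject colour
    region = mask.astype(bool)
    out[region] = (alpha * index_color
                   + (1.0 - alpha) * out[region].astype(np.float32)).astype(np.uint8)
    # Label 611: white characters on a black background at the upper left corner.
    cv2.rectangle(out, (0, 0), (220, 40), color=(0, 0, 0), thickness=-1)
    cv2.putText(out, f"{area_cm2:.1f} cm2", (10, 28),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
    return out
```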
[0073] In Step 445, the communication unit 313 in the image processing apparatus 300 transmits the information indicating the result of extraction of the affected area 102 that is extracted and the information indicating the size of the affected area 102 to the imaging apparatus 200. In the present embodiment, the communication unit 313 transmits the image data including the information indicating the size of the affected area 102, which is generated in Step 444, to the imaging apparatus 200 through the wireless communication.
[0074] The operation goes back to description of the steps performed by the imaging apparatus 200.
[0075] In Step 407, the communication unit 219 in the imaging apparatus 200 receives any image data that includes the information indicating the size of the affected area 102 and that is newly generated in the image processing apparatus 300.
[0076] In Step 408, the system control circuit 220 goes to Step 409 if the image data including the information indicating the size of the affected area 102 is received in Step 407 and otherwise goes to Step 410.
[0077] In Step 409, the display unit 223 displays the image data including the information indicating the size of the affected area 102, which is received in Step 407, for a certain time period. Here, the display unit 223 displays the superimposed image 602 illustrated in FIG. 6B. Superimposing the information indicating the result of extraction of the affected area 102 on the live view image in the above manner enables the user to perform the shooting after the user confirms whether the area of the affected area and the estimated area are appropriate. Although the example is described in the present embodiment in which both the index 613 indicating the estimated area of the affected area 102 and the information about the size of the affected area 102 are displayed, either of the index 613 indicating the estimated area of the affected area 102 and the information about the size of the affected area 102 may be displayed.
[0078] In Step 410, the system control circuit 220 determines whether the release button included in the operation member 224 is depressed. If the release button is not depressed, the imaging apparatus 200 goes back to Step 404. If the release button is depressed, the imaging apparatus goes to Step 411.
[0079] In Step 411, the ranging system 216 calculates the distance information about the subject and the AF control circuit 218 performs the AF process to drive and control the lens group 212 so that the subject is in focus using the same method as in Step 403. If the affected area 102 has been extracted from the live view image, the ranging system 216 calculates the distance information about the subject at the position where the affected area 102 exists.
[0080] In Step 412, the imaging apparatus 200 captures a still image.
[0081] In Step 413, the image processing circuit 217 performs the developing process and the compression process to the image data generated in Step 412 to generate, for example, image data conforming to the JPEG standard. Then, the image processing circuit 217 performs the resizing process to the image data subjected to the compression process to reduce the size of the image data. The size of the image data subjected to the resizing process in Step 413 is equal to or greater than that of the image data subjected to the resizing process in Step 404. This is because priority is given to the accuracy of the measurement of the affected area 102. Here, the image data is resized to about 4.45 megabytes with 1,440 pixels x 1,080 pixels in 8-bit RGB color. The size of the resized image data is not limited to this. Alternatively, the operation may go to the subsequent step using the generated image data conforming to the JPEG standard without the resizing process.
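As a non-limiting sketch of the resizing described above (the Pillow library, the function name, and the JPEG quality are assumptions of this example), the 4.45-megabyte figure corresponds to 1,440 x 1,080 pixels x 3 bytes per pixel for 8-bit RGB data:

```python
from io import BytesIO
from PIL import Image

# Uncompressed buffer size: 1,440 x 1,080 pixels x 3 bytes (8-bit RGB)
# = 4,665,600 bytes, i.e. about 4.45 megabytes.
print(1440 * 1080 * 3 / (1024 ** 2))        # ~4.45

def resize_for_measurement(path, size=(1440, 1080), quality=90):
    """Resize the captured still image and re-encode it as JPEG."""
    img = Image.open(path).convert("RGB")
    buf = BytesIO()
    img.resize(size).save(buf, format="JPEG", quality=quality)
    return buf.getvalue()
```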
[0082] In Step 414, the communication unit 219 acquires the image data, which is generated in Step 413 and which is subjected to the resizing process (or which is not subjected to the resizing process), and the distance information calculated in Step 411. In addition, the communication unit 219 also acquires the information about the zoom magnification and the information about the size (the number of pixels) of the image data subjected to the resizing process. When the imaging unit 211 has a single focus without the zoom function, it is not necessary to acquire the information about the zoom magnification. When the image processing apparatus 300 has the information about the size of the image data in advance, it is not necessary to acquire the information about the size of the image data.
[0083] In Step 415, the communication unit 219 transmits the image data acquired in Step 414 and at least one piece of information including the distance information to the image processing apparatus 300 through the wireless communication.
[0084] The operation goes to description of the steps performed by the image processing apparatus 300.
[0085] In Step 451, the communication unit 313 in the image processing apparatus 300 receives the image data and the at least one piece of information including the distance information, which are transmitted from the communication unit 219 in the imaging apparatus 200.
[0086] In Step 452, the arithmetic unit 311 and the auxiliary arithmetic unit 317 in the image processing apparatus 300 extract the affected area 102 of the subject 101 from the image data received in Step 451. Since the details of the step are the same as in Step 442, the detailed description of Step 452 is omitted herein.
[0087] In Step 453, the arithmetic unit 311 calculates the area of the affected area 102 as an example of the information indicating the size of the affected area 102 extracted by the auxiliary arithmetic unit 317. Since the details of the step are the same as in Step 443, the detailed description of Step 453 is omitted herein.
[0088] In Step 454, the arithmetic unit 311 performs image analysis to calculate the major axis length and the minor axis length of the extracted affected area and the area of a rectangle circumscribed around the affected area based on the length on the focal plane corresponding to one pixel on the image, calculated in Step 453. The DESIGN-R (registered trademark), which is the evaluation index of the pressure ulcer, defines that the size of the pressure ulcer is calculated by measuring the value of the product of the major axis length and the minor axis length. In the image processing system of the present invention, the analysis of the major axis length and the minor axis length enables the compatibility with the data that has been measured in the DESIGN-R (registered trademark) to be ensured. Since a strict definition is not provided in the DESIGN-R (registered trademark), multiple mathematical methods of calculating the major axis length and the minor axis length are considered.
[0089] As one example of the method of calculating the major axis length and the minor axis length, first, the arithmetic unit 311 calculates a minimum bounding rectangle, which is a rectangle having the minimum area, among the rectangles circumscribed around the affected area 102. Then, the arithmetic unit 311 calculates the lengths of the long side and the short side of the rectangle. The length of the long side is calculated as the major axis length and the length of the short side is calculated as the minor axis length. Then, the arithmetic unit 311 calculates the area of the rectangle based on the length on the focal plane corresponding to one pixel on the image, calculated in Step 453.
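A minimal sketch of this calculation, assuming the affected area is available as a binary mask and that a length-per-pixel value has already been derived from the distance information; OpenCV is used here only as one possible implementation, and the function and parameter names are assumptions of this example.

```python
import cv2
import numpy as np

def bounding_rect_metrics(mask, cm_per_pixel):
    """Major/minor axis lengths and circumscribed-rectangle area from the
    minimum bounding rectangle of the extracted affected-area mask."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = np.vstack(contours).reshape(-1, 2)
    (_, _), (w, h), _ = cv2.minAreaRect(points)     # rotated rectangle of minimum area
    major = max(w, h) * cm_per_pixel                # long side  -> major axis length
    minor = min(w, h) * cm_per_pixel                # short side -> minor axis length
    return major, minor, major * minor              # lengths (cm) and rectangle area (cm^2)
```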
[0090] As another example of the method of calculating the major axis length and the minor axis length, a maximum Feret diameter, which is the maximum caliper length, may be selected as the major axis length and a minimum Feret diameter may be selected as the minor axis length. Alternatively, the maximum Feret diameter, which is the maximum caliper length, may be selected as the major axis length and a length measured in a direction orthogonal to the axis of the maximum Feret diameter may be selected as the minor axis length. The method of calculating the major axis length and the minor axis length may be arbitrarily selected based on the compatibility with the result of measurement in the related art.
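One possible way to obtain the maximum Feret diameter is to take the largest distance between vertices of the convex hull of the affected area; the sketch below makes that assumption and leaves out the minimum Feret diameter, which would need a rotating-calipers sweep. The function name and mask representation are illustrative only.

```python
from itertools import combinations

import cv2
import numpy as np

def max_feret_diameter(mask, cm_per_pixel):
    """Maximum caliper length of the affected area, taken as the largest
    distance between convex-hull vertices."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = np.vstack(contours).reshape(-1, 2).astype(np.float32)
    hull = cv2.convexHull(points).reshape(-1, 2)
    longest = max(np.linalg.norm(p - q) for p, q in combinations(hull, 2))
    return float(longest) * cm_per_pixel
```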
[0091] The calculation of the major axis length and the minor axis length of the affected area 102 and the area of the rectangle is not performed to the image data received in Step 441. Since the confirmation of the result of extraction of the affected area 102 by the user is intended during the live view, the step of the image analysis in Step 454 is omitted to reduce the processing time.
[0092] Step 454 may be omitted when the acquisition of the information about the actual area of the pressure ulcer is intended without the evaluation of the size based on the DESIGN-R (registered trademark). In this case, it is assumed in the subsequent steps that the information about the size, which is the evaluation item in the DESIGN-R (registered trademark), does not exist.
[0093] In Step 455, the arithmetic unit 311 generates image data resulting from superimposition of the information indicating the result of extraction of the affected area 102 and the information indicating the size of the affected area 102 on the image data used as the target of the extraction of the affected area 102. [0094] FIG. 7A to FIG. 7C are diagrams for describing the method of superimposing the information indicating the result of extraction of the affected area 102 and the information indicating the size of the affected area, which includes the major axis length and the minor axis length of the affected area 102, on the image data. Since multiple pieces of information indicating the size of the affected area 102 are considered, a superimposed image 701 in FIG. 7A, a superimposed image 702 in FIG. 7B, and a superimposed image 703 in FIG. 7C are separately described.
[0095] In the case of the superimposed image 701 in FIG. 7A, the minimum bounding rectangle is used as the method of calculating the major axis length and the minor axis length. The label 611 is superimposed at the upper left corner of the superimposed image 701. The character string 612 indicating the area value of the affected area 102 is displayed on the label 611 with white characters on the black background as the information indicating the size of the affected area 102, as in FIG. 6B. In addition, a label 712 is superimposed at the upper right corner of the superimposed image 701. The major axis length and the minor axis length calculated based on the minimum bounding rectangle are displayed on the label 712 as the information indicating the size of the affected area 102. A character string 713 indicates the major axis length (cm) and a character string 714 indicates the minor axis length (cm). A rectangular frame 715 representing the minimum bounding rectangle is displayed around the affected area 102 on the superimposed image 701. Superimposing the rectangular frame 715 with the major axis length and the minor axis length enables the user to confirm the place for which the length is being measured in the image.
[0096] In addition, a scale bar 716 is superimposed at the lower right corner of the superimposed image 701. The scale bar 716 is used for measuring the size of the affected area 102, and the size of the scale bar on the image data is varied with the distance information. Specifically, the scale bar 716 is a bar on which scale marks from 0 cm to 5 cm are indicated in units of 1 cm based on the length on the focal plane corresponding to one pixel on the image, calculated in Step 453, and is matched with the size on the focal plane of the imaging apparatus, that is, on the subject. The user is capable of knowing the approximate size of the subject or the affected area with reference to the scale bar.
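The length on the focal plane (on the subject) corresponding to one pixel can be approximated from the distance information with a simple pinhole-camera model. The sketch below is illustrative only; the pixel pitch, focal length, and example values are assumptions that would have to be replaced by the actual parameters of the imaging unit (and corrected for any resizing of the image data).

```python
def cm_per_pixel(distance_mm, focal_length_mm, pixel_pitch_um):
    """Approximate length on the subject covered by one pixel (pinhole model)."""
    length_mm = pixel_pitch_um * 1e-3 * distance_mm / focal_length_mm
    return length_mm / 10.0                     # mm -> cm

# Spacing of the 1 cm scale marks of the scale bar, in pixels:
pixels_per_cm = 1.0 / cm_per_pixel(distance_mm=500,
                                   focal_length_mm=18,
                                   pixel_pitch_um=4.0)   # roughly 90 pixels here
```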
[0097] Furthermore, an evaluation value of Size in the DESIGN-R (registered trademark) described above is superimposed at the lower left corner of the superimposed image 701. The evaluation value of Size in the DESIGN-R (registered trademark) is classified into the seven stages described above based on the value given by measuring the major axis length (cm) and the minor axis length (the maximum diameter orthogonal to the major axis length) (cm) of the skin injury range and multiplying the major axis length by the minor axis length. In the present embodiment, the evaluation value resulting from replacement of the major axis length and the minor axis length with the values that are output using the calculation methods described above is superimposed.
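For illustration, the mapping from the major-minor product to the Size grade could look like the sketch below. The threshold values are the commonly cited DESIGN-R Size boundaries and are an assumption of this example, so they should be verified against the official scale rather than taken from this sketch.

```python
def design_r_size(major_cm, minor_cm):
    """Map the product of major and minor axis lengths (cm) to a Size grade."""
    product = major_cm * minor_cm
    if product == 0:
        return "s0"                              # no skin lesion
    for grade, upper in (("s3", 4), ("s6", 16), ("s8", 36),
                         ("s9", 64), ("s12", 100)):
        if product < upper:
            return grade
    return "S15"                                 # 100 or more
```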
[0098] In the case of the superimposed image 702 in FIG. 7B, the maximum Feret diameter 521 is used as the major axis length and the minimum Feret diameter 522 is used as the minor axis length. A label 722 is superimposed at the upper right corner of the superimposed image 702. A major axis length character string 723 and a minor axis length character string 724 are displayed on the label 722. In addition, an auxiliary line 725 corresponding to the measurement position of the maximum Feret diameter 521 and an auxiliary line 726 corresponding to the minimum Feret diameter 522 are displayed in the affected area 102 on the superimposed image 702. Superimposing the auxiliary lines with the major axis length and the minor axis length enables the user to confirm the place for which the length is being measured in the image.
[0099] The superimposed image 703 in FIG. 7C is the same as the superimposed image 702 in the major axis length. However, the minor axis length is measured not as the minimum Feret diameter but as the length measured in the direction orthogonal to the axis of the maximum Feret diameter on the superimposed image 703. A label 732 is superimposed at the upper right corner of the superimposed image 703. The major axis length character string 723 and a minor axis length character string 734 are displayed on the label 732. In addition, the auxiliary line 725 corresponding to the measurement position of the maximum Feret diameter 521 and an auxiliary line 736 corresponding to the length measured in the direction orthogonal to the axis of the maximum Feret diameter are displayed in the affected area 102 on the superimposed image 703.
[0100] Any one of the pieces of information to be superimposed on the image data, illustrated in FIG. 7A to FIG. 7C, may be used or a combination of multiple pieces of the information may be used. Alternatively, the user may be capable of selecting the information to be displayed. The superimposed images illustrated in FIG. 6B and FIG. 7A to FIG. 7C are only examples, and the display mode, the display positions, the font type, the font size, the font color, the positional relationship, and so on of the affected area 102 and the information indicating the size of the affected area 102 may be varied depending on various conditions.
[0101] In Step 456, the communication unit 313 in the image processing apparatus 300 transmits the information indicating the result of extraction of the affected area 102 that is extracted and the information indicating the size of the affected area 102 to the imaging apparatus 200. In the present embodiment, the communication unit 313 transmits the image data including the information indicating the size of the affected area 102, which is generated in Step 455, to the imaging apparatus 200 through the wireless communication.
[0102] The operation goes back to description of the steps performed by the imaging apparatus 200.
[0103] In Step 416, the communication unit 219 in the imaging apparatus 200 receives the image data including the information indicating the size of the affected area 102, which is generated in the image processing apparatus 300.
[0104] In Step 417, the display unit 223 displays the image data including the information indicating the size of the affected area 102, which is received in Step 416, for a certain time period. Here, the display unit 223 displays any of the superimposed images 701 to 703 illustrated in FIG. 7A to FIG. 7C, respectively, and the operation goes to Step 418 after the certain time period has elapsed.
[0105] In Step 418, it is determined whether affected area information for which no value is input exists. The affected area information indicates information indicating the region of the affected area and the evaluation value of each evaluation item in the DESIGN-R (registered trademark) described above. The evaluation value of the evaluation item concerning Size is automatically input based on the information indicating the size, which is received in Step 416.
[0106] If affected area information for which no value is input exists in Step 418, the operation goes to Step 419. If all the affected area information is input in Step 418, the operation goes back to Step 402 to start the live view again.
[0107] In Step 419, the system control circuit 220 displays a user interface prompting the user to input the affected area information in the display unit 223.
[0108] In Step 420, upon input of the affected area information by the user, the operation goes back to Step 418.
[0109] FIG. 8A to FIG. 8G are diagrams for describing how to cause the user to input the affected area information in Step 419 and Step 420.
[0110] FIG. 8A is a screen prompting the user to input the region of the affected area in the affected area information.
[0111] Region selection items 801 for specifying the regions: Head, Shoulder, Arm, Back, Waist, Hip, and Leg of the affected area are displayed in the display unit 223. An item for completing the input of the affected area information is provided below the region selection items 801. Selecting the item enables the input of the affected area information to be terminated even if part of the affected area information is not input.
[0112] The user is capable of specifying the region in which the affected area that is shot exists with the operation member 224. The item selected by the user is displayed surrounded by a frame line 802. The state in which Hip is selected is displayed in FIG. 8A. Since two or more affected areas may exist in one region selected from the region selection items 801, selection of multiple items, such as Hip1, Hip2, and Hip3, may further be available.
[0113] FIG. 8B is a screen causing the user to confirm whether the selected region is appropriate after the region including the affected area is selected in FIG. 8A. Upon confirmation of the selected region with a user's operation, the display unit 223 displays a screen illustrated in FIG. 8C.
[0114] FIG. 8C is a screen prompting the user to input the evaluation value of each evaluation item in the DESIGN-R (registered trademark) in the affected area information.
[0115] An evaluation item selection portion 804 is displayed on the left side of the screen. The respective items: D (Depth), E (Exudate), S (Size), I (Inflammation/Infection), G (Granulation), N (Necrotic tissue), and P (Pocket) and information indicating whether each item is input are displayed with the image of the affected area. In FIG. 8C, an evaluation value "s9" is displayed for S (Size), which has been analyzed from the image, and "non" indicating that the item has not been confirmed is displayed for the remaining evaluation items. Hatching of the item S (Size) indicates that the item S (Size) has been input.
[0116] The user is capable of specifying the evaluation item with the operation member 224. The selected evaluation item (D (Depth) here) is displayed surrounded by a frame line 805.
[0117] The evaluation values of a severity level of the evaluation item selected on the left side of the screen are superimposed on the bottom of the screen as a severity level selection portion 806. In FIG. 8C, d0, d1, d2, D3, D4, D5, and DU, which are the evaluation values indicating the severity level of D (Depth), are displayed.
[0118] The user is capable of selecting any of the evaluation values with the operation member 224. The selected evaluation value is displayed surrounded by a frame line 807, and a descriptive text 808 (description of the evaluation item Depth and the severity level d2: injury to dermis) of the evaluation value is also displayed. The evaluation value may be input by the user who inputs a character string.
[0119] FIG. 8D illustrates a confirmation notification 809 for inquiring of the user whether the selected evaluation value is appropriate after the evaluation value is selected in FIG. 8C.
[0120] Upon confirmation that there is no problem about the selected evaluation value by the user with the operation member 224, the screen makes a transition to a screen illustrated in FIG. 8E.
[0121] In FIG. 8E, the display of an evaluation item 810 of D (Depth) is changed from "non" to "d2" in response to the input of the evaluation value and the evaluation item 810 is hatched.
[0122] Similarly, screens prompting the user to input the evaluation values for E (Exudate), I (Inflammation/Infection), G (Granulation), N (Necrotic tissue), and P (Pocket) are displayed until the evaluation values are input for all the evaluation items.
[0123] In response to input of the evaluation values of all the evaluation items, the user is notified of completion of the input of the affected area information. Then, the operation goes back to Step 402 to start the live view process.
[0124] As described above, in the first embodiment, the function is provided to cause the user to input the affected area information by prompting the user to input the evaluation value of the evaluation item that is not subjected to the automatic analysis and the information about the region of the affected area after the affected area is shot in Step 418 to Step 420. It is possible to input the affected area information, which is input using another medium in the related art, only with the imaging apparatus in the above manner.
[0125] In addition, it is possible to prevent input omission of the affected area information by determining whether all the pieces of affected area information are input and sequentially prompting the user to input the evaluation items that are not input before the next affected area is shot.
[0126] Voice recognition input means may be used as the operation member 224 according to the first embodiment.
[0127] In FIG. 8A, the regions are displayed using the characters, such as "Head" and "Shoulder", when the region of the affected area is input and the characters are selected. In contrast, as illustrated in FIG. 8F, a configuration may be adopted in which a human body model 811 is displayed in the display unit 223 and the user is caused to specify the region of the affected area using a touch sensor provided on the display unit 223.
[0128] In addition, as illustrated in FIG. 8G, a configuration may be adopted in which the human body model 811 is enlarged, reduced, or rotated to enable the region of the affected area to be easily selected.
[0129] Although the hatching is used as the means for indicating that the input of the evaluation value is completed for the evaluation item in FIG. 8E, the luminance of the characters may be reduced or the characters may be highlighted. Other display methods may be used as long as the fact that the evaluation value has been input for the evaluation item is explicitly indicated to the user.
[0130] Although the DESIGN-R (registered trademark) is used as the available evaluation index of the pressure ulcer in the present example, the evaluation index is not limited to this. Another evaluation index, such as Bates-Jensen Wound Assessment Tool (BWAT), Pressure Ulcer Scale for Healing (PUSH), or Pressure Sore Status Tool (PSST), may be used. Specifically, a user interface used for inputting the evaluation items in the BWAT, the PUSH, the PSST, or the like may be displayed in response to the acquisition of the result of extraction of the area of the pressure ulcer and the information about the size of the extracted area.
[0131] Although the example of the configuration intended for input of the evaluation values of the evaluation items of the pressure ulcer is described in the present example, the input of the evaluation values of the evaluation items in another skin disease may be intended as long as visual evaluation items are used. For example, Severity Scoring of Atopic Dermatitis (SCORAD) in atopic dermatitis and Body Surface Area, Psoriasis Area and Severity Index (PASI) in psoriasis are exemplified.
[0132] As described above, according to the present embodiment, the image processing system is provided in which the information indicating the size of the affected area is displayed in the display unit 223 in the imaging apparatus 200 in response to shooting of the affected area 102 by the user with the imaging apparatus 200. Accordingly, it is possible to reduce the burden on the medical personnel in the evaluation of the size of the affected area of the pressure ulcer and the burden on the patient to be evaluated. In addition, the calculation of the size of the affected area based on a program enables the individual difference to be reduced, compared with the case in which the medical personnel manually measures the size of the affected area, to improve the accuracy in the evaluation of the size of the pressure ulcer. Furthermore, it is possible to calculate the area of the affected area, which is the evaluation value, and display the calculated area of the affected area in order to indicate the size of the pressure ulcer more accurately. [0133] Since the function to confirm whether the estimated area of the affected area is appropriate by the user in the live view display is not essential, a configuration may be adopted in which Step 406, Step 407, and Step 441 to Step 445 are omitted.
[0134] The image processing apparatus 300 may store, in the storage unit 312, the information indicating the result of extraction of the affected area 102, the information indicating the size of the affected area 102, and the image data about the superimposed image on which the information indicating the result of extraction of the affected area 102 and the information indicating the size of the affected area 102 are superimposed. The output unit 314 is capable of outputting at least one piece of information stored in the storage unit 312 or the image data to an output device, such as a display connected to the image processing apparatus 300. The display of the superimposed image in the display enables another user different from the user who shoots the affected area 102 to acquire the image of the affected area 102 in real time, or to acquire the image of the affected area 102 which has been captured and the information indicating the size of the affected area 102. The arithmetic unit 311 in the image processing apparatus 300 may have a function to display a scale bar or the like, the position and the angle of which can be arbitrarily varied, on the image data to be transmitted from the output unit 314 to the display. The display of such a scale bar enables the user who watches the display to measure the length of an arbitrary place of the affected area 102. The spacing of the scale marks on the scale bar is desirably adjusted automatically based on the distance information received in Step 451, the information about the zoom magnification, the information about the size (the number of pixels) of the image data subjected to the resizing process, and so on.
[0135] Use of the image processing apparatus 300 in a state in which power is constantly supplied at the stationary side enables the image of the affected area 102 and the information indicating the size of the affected area 102 to be acquired at arbitrary timing with no risk of battery exhaustion. In addition, since the image processing apparatus 300, which is generally the stationary device, has high storage capacity, the image processing apparatus 300 is capable of storing a large amount of image data.
[0136] In addition, according to the present embodiment, the user is capable of inputting and recording information about the affected area 102, which is different from the information acquired from the image analysis of the image, when the user shoots the affected area 102 with the imaging apparatus 200. Accordingly, it is not necessary for the user to subsequently input the evaluation of the affected area on an electronic health record or a paper medium while the user is watching the image data that is captured. Furthermore, presentation of the items that are not input to the user inhibits the user from forgetting to input the information when the user shoots the affected area.
[0137] (Second embodiment)
In the image processing system according to the first embodiment, the image processing apparatus 300 performs the process to superimpose the information indicating the result of extraction of the affected area and the information indicating the size of the affected area on the image data. In contrast, in the image processing system according to a second embodiment, the image processing circuit 217 in the imaging apparatus 200 performs the process to superimpose the information indicating the result of extraction of the affected area and the information indicating the size of the affected area on the image data.
[0138] FIG. 9 is a work flow chart illustrating the operation of the image processing system 1 according to the second embodiment.
[0139] In the work flow in FIG. 9, the superimposition process in Step 444 and Step 455 by the image processing apparatus 300 in the work flow illustrated in FIG. 4 is not performed and, instead of the superimposition process in Step 444 and Step 455 by the image processing apparatus 300, the superimposition process in Step 901 and Step 902 by the imaging apparatus 200 is added. The same processing as in the corresponding steps in FIG. 4 is performed in the steps to which the same numbers as those of the steps in FIG. 4 are given, among the steps described in FIG. 9.
[0140] In the present embodiment, the data to be transmitted from the image processing apparatus 300 to the imaging apparatus 200 in Step 445 and Step 456 for the generation of the superimposed image by the imaging apparatus 200 does not have to be the image data using a color scale. Since the image processing apparatus 300 does not transmit the image data but transmits metadata indicating the size of an estimated affected area and data indicating the position of the affected area, it is possible to reduce the communication traffic and increase the communication speed. The data indicating the position of the estimated affected area is data in a vector format having a smaller size. The data indicating the position of the estimated affected area may instead be data in a binary raster format.
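A sketch of how the binary mask of the estimated affected area could be reduced to a vector (polygon) representation before transmission; the OpenCV calls, the function name, and the simplification tolerance are assumptions of this example rather than details of the embodiment.

```python
import cv2
import numpy as np

def mask_to_polygon(mask, epsilon_px=2.0):
    """Convert the affected-area mask to a simplified list of polygon vertices,
    which is far smaller than the raster mask itself."""
    contours, _ = cv2.findContours(mask.astype(np.uint8),
                                   cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    largest = max(contours, key=cv2.contourArea)
    approx = cv2.approxPolyDP(largest, epsilon_px, True)    # drop redundant vertices
    return approx.reshape(-1, 2).tolist()                   # e.g. [[x0, y0], [x1, y1], ...]
```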
[0141] Upon reception of the metadata indicating the size of the estimated affected area and the data indicating the position of the affected area from the image processing apparatus 300 in Step 407 or Step 416, the imaging apparatus 200 generates the superimposed image in Step 901 or Step 902, respectively.
[0142] Specifically, in Step 901, the image processing circuit 217 in the imaging apparatus 200 generates the superimposed image using the method described in Step 444 in FIG. 4. The image data on which the information indicating the size and the position of the estimated affected area is to be superimposed may be the image data transmitted from the imaging apparatus 200 to the image processing apparatus 300 in Step 406 or may be the image data about the last frame displayed as the live view image.
[0143] In Step 902, the image processing circuit 217 in the imaging apparatus 200 generates the superimposed image using the method described in Step 455 in FIG. 4. The image data on which the information indicating the size and the position of the estimated affected area is to be superimposed is the image data transmitted from the imaging apparatus 200 to the image processing apparatus 300 in Step 415.
[0144] As described above, according to the present embodiment, since the amount of data to be transmitted from the image processing apparatus 300 to the imaging apparatus 200 is reduced, compared with the first embodiment, it is possible to reduce the communication traffic between the imaging apparatus 200 and the image processing apparatus 300 and increase the communication speed.
[0145] (Third embodiment)
FIG. 10 is a diagram schematically illustrating an image processing system 11 according to a third embodiment. The image processing system 11 illustrated in FIG. 10 includes a terminal apparatus 1000, which is an electronic device capable of Web access, in addition to the imaging apparatus 200 and the image processing apparatus 300 described above in the first and second embodiments. The terminal apparatus 1000 is composed of, for example, a tablet terminal and has a Web browser function. The terminal apparatus 1000 is capable of accessing a Web server and displaying a Hyper Text Markup Language (HTML) file that is acquired. The terminal apparatus 1000 is not limited to the tablet terminal and may be any device having a Web browser or a device capable of displaying an image with dedicated application software. The terminal apparatus 1000 may be, for example, a smartphone or a personal computer. Although the imaging apparatus 200 and the terminal apparatus 1000 are described as separate apparatuses here, a single apparatus may be used as both the imaging apparatus 200 and the terminal apparatus 1000. When the terminal apparatus 1000 is a smartphone or a tablet terminal with a camera function, the terminal apparatus 1000 is capable of serving as the imaging apparatus 200.
[0146] The arithmetic unit 311 in the image processing apparatus 300 performs a process to identify the subject from the image data, in addition to the processes described above in the first and second embodiments. In addition, the arithmetic unit 311 performs a process to store the information about the size and the position of the estimated affected area and the image data about the affected area in the storage unit 312 for each subject that is identified. The terminal apparatus 1000 is capable of causing the user to confirm the information indicating the size of the estimated affected area, which is associated with the subject, and the image data about the affected area, which are stored in the storage unit 312 in the image processing apparatus 300, using a Web browser or dedicated application software. It is assumed here for description that the terminal apparatus 1000 causes the user to confirm the image data using the Web browser.
[0147] Although the function to identify the subject from the image data, the function to store the information about the affected area or the image data for each subject that is identified, or the function to perform a Web service is performed by the image processing apparatus 300 in the present embodiment, the functions are not limitedly performed by the image processing apparatus 300. Part or all of the functions may be realized by a computer on a network different from that of the image processing apparatus 300.
[0148] Referring to FIG. 10, the subject 101 wears a barcode tag 103 as information identifying the subject. The image data about the affected area 102 that is shot is capable of being associated with an identifier (ID) of the subject, which is indicated by the barcode tag 103. The tag identifying the subject is not limited to the barcode tag and may be a two-dimensional code, such as a QR code (registered trademark), or a numerical value. Alternatively, a tag having a text described thereon may be used as the tag identifying the subject and the tag may be read using an optical character recognition (OCR) reader function installed in the image processing apparatus 300.
[0149] The arithmetic unit 311 in the image processing apparatus 300 collates the ID resulting from analysis of the barcode tag included in the image data that is captured with a subject ID registered in the storage unit 312 in advance to acquire the name of the subject 101. A configuration may be adopted in which the imaging apparatus 200 analyzes the ID and the ID is transmitted to the image processing apparatus 300.
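As one possible, purely illustrative way to read the one-dimensional barcode on the tag, a library such as pyzbar could be used; the library choice and the assumption that the decoded payload is the subject ID itself are not part of the embodiment.

```python
from PIL import Image
from pyzbar.pyzbar import decode

def read_subject_id(image_path):
    """Decode the barcode on the tag and return the subject ID, or None."""
    results = decode(Image.open(image_path))
    return results[0].data.decode("ascii") if results else None
```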
[0150] The arithmetic unit 311 creates a record based on the image data about the affected area 102, the information indicating the size of the affected area 102 of the subject, the subject ID, the acquired name of the subject, the shooting date and time, and so on and registers the record in a database in the storage unit 312.
[0151] In addition, the arithmetic unit 311 returns the information registered in the database in the storage unit 312 in response to a request from the terminal apparatus 1000.
[0152] FIG. 11 is a work flow chart illustrating the operation of the image processing system 11 according to the third embodiment. The same processing as in the corresponding steps in FIG. 4 is performed in the steps to which the same numbers as those of the steps in FIG. 4 are given, among the steps described in FIG. 11.
[0153] Referring to FIG. 11, upon connection between the imaging apparatus 200 and the image processing apparatus 300, in Step 1101, the imaging apparatus 200 displays an instruction to cause the user to shoot the barcode tag 103 in the display unit 223 and shoots the barcode tag 103 in response to the release operation by the user. Then, the operation goes to Step 402. Information about a patient ID for identifying the patient is included in the barcode tag 103. Shooting the affected area 102 after shooting the barcode tag 103 enables the shooting order to be managed based on the shooting date and time or the like so that the images from the image of one barcode tag up to the image of the next barcode tag are identified as images of the same subject using the subject ID. An order may be adopted in which the barcode tag 103 is shot after the affected area 102 is shot.
[0154] After the system control circuit 220 detects depression of the release button in Step 410 and Step 411 to Step 414 are performed, in Step 415, the communication unit 219 transmits the image data and at least one piece of information including the distance information to the image processing apparatus 300 through the wireless communication. The image data generated by shooting the barcode tag 103 in Step 1101 is included in the image data transmitted in Step 415, in addition to the image data generated by shooting the affected area 102.
[0155] In Step 455, the image processing apparatus 300 generates the image data about the superimposed image. Then, the operation goes to Step 1111.
[0156] In Step 1111, the arithmetic unit 311 performs a process to read a one-dimensional barcode (not illustrated) included in the image data about the barcode tag 103 shot in Step 1101 to read the subject ID identifying the subject.
[0157] In Step 1112, the subject ID that is read is collated with the subject ID registered in the storage unit 312.
[0158] In Step 1113, if the collation of the subject ID succeeds, the name of the patient registered in the database in the storage unit 312 and the past affected area information are acquired. The affected area information that is stored last is acquired here.
[0159] In Step 456, the communication unit 313 in the image processing apparatus 300 transmits the information indicating the result of extraction of the affected area 102 that is extracted, the information indicating the size of the affected area 102, and the past affected area information acquired from the storage unit 312 to the imaging apparatus 200.
[0160] In Step 416, the communication unit 219 in the imaging apparatus 200 receives the image data and the affected area information, which are transmitted from the image processing apparatus 300.
[0161] In Step 417, the display unit 223 displays the image data including the information indicating the size of the affected area 102, which is received in Step 416, for a certain time period.
[0162] In Step 418, it is determined whether the affected area information for which no value is input exists.
[0163] If the affected area information for which no value is input exists in Step 418, the operation goes to Step 1102. If all the affected area information is input in Step 418, the operation goes to Step 1104.
[0164] In Step 1102, the system control circuit 220 displays a user interface prompting the user to input the affected area information in the display unit 223 using the past affected area information.
[0165] FIG. 12A and FIG. 12B are diagrams for describing how to display the affected area information that has been acquired. In FIG. 12A, the character sizes of items 1102 displayed in a region selection item 1101 on the left side of the screen are made larger for the regions for which the evaluation values of the evaluation items have been input. FIG. 12A indicates that the evaluation values of the evaluation items of the affected areas have been input for Back and Hip. [0166] Upon input of the affected area information by the user in Step 420, in Step 1103, comparison with the past evaluation value of the evaluation item is performed to display a result of determination of whether the symptom is relieved or made worse.
[0167] In FIG. 12B, an evaluation item selection portion 1103 is displayed in three columns. The evaluation item names, the past evaluation values, and the current evaluation values are sequentially displayed from the left.
[0168] Here, the past evaluation values are compared with the current evaluation values. A green evaluation value is displayed for an item the symptom of which is determined to be relieved, and a red evaluation value is displayed for an item the symptom of which is determined to be made worse.
[0169] Upon input of the evaluation values of all the evaluation items, the user is notified of completion of the input of the affected area information. Then, the operation goes to Step 1104.
[0170] In Step 1104, the affected area information in which the evaluation values of the series of evaluation items are input and the image data are transmitted to the image processing apparatus 300 through the wireless communication. Then, the operation goes back to Step 402.
[0171] In Step 1114, the image processing apparatus 300 receives the affected area information and the image data, which are transmitted from the imaging apparatus 200.
[0172] In Step 1115, the arithmetic unit 311 creates a record based on the image data resulting from shooting of the affected area, the information about the region of the affected area 102, the evaluation value of each evaluation item of the affected area 102, the subject ID, the acquired name of the subject, the shooting date and time, and so on. In addition, the arithmetic unit 311 registers the created record in the database in the storage unit 312.
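A minimal sketch of registering such a record, using SQLite purely as a stand-in for the database in the storage unit 312; the table layout and field names are assumptions of this example.

```python
import sqlite3
from datetime import datetime

def register_record(db_path, subject_id, subject_name, region,
                    evaluations, image_blob):
    """Store one shooting of the affected area as a database record."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS affected_area (
                       subject_id TEXT, subject_name TEXT, region TEXT,
                       evaluations TEXT, shot_at TEXT, image BLOB)""")
    con.execute("INSERT INTO affected_area VALUES (?, ?, ?, ?, ?, ?)",
                (subject_id, subject_name, region, repr(evaluations),
                 datetime.now().isoformat(), image_blob))
    con.commit()
    con.close()
```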
[0173] In Step 1116, the arithmetic unit 311 transmits the information registered in the database in the storage unit 312 to the terminal apparatus 1000 in response to a request from the terminal apparatus 1000.
[0174] Examples of display of a browser of the terminal apparatus 1000 are described with reference to FIG. 13 and FIG. 14.
[0175] FIG. 13 is a diagram for describing an example of a data selection window displayed in the browser of the terminal apparatus 1000. A data selection window 1301 is separated using separation lines 1303 for the respective dates 1302. Icons 1305 are displayed for the respective shooting times 1304 in the area of each date. The subject ID and the name of the subject are displayed in each icon 1305, and the icon 1305 represents a data set of the same subject shot at the same time zone. A search window 1306 is provided on the data selection window 1301. Inputting the date, the subject ID, or the name of the subject in the search window 1306 enables search for the data set. In addition, operating a scroll bar 1307 enables multiple pieces of data to be displayed in an enlarged manner in a limited display area. Upon selection and clicking of the icon 1305 by the user, the browser makes a transition to a data browsing window and the user of the browser of the terminal apparatus 1000 is capable of browsing the image of the data set and the information indicating the size of the subject. In other words, a request indicating the subject and the date and time, which are specified in the terminal apparatus 1000, is transmitted from the terminal apparatus 1000 to the image processing apparatus 300. The image processing apparatus 300 transmits the image data corresponding to the request and the information indicating the size of the subject to the terminal apparatus 1000.
[0176] FIG. 14 is a diagram for describing an example of the data browsing window displayed in the browser of the terminal apparatus 1000. A subject-ID and name-of-subject 1402 of the data set selected on the data selection window 1301 and a shooting-date-and-time 1403 are displayed on a data browsing window 1401. In addition, an image 1404 based on the image data and data 1405 based on the affected area information in the image 1404 are displayed for every shooting. Furthermore, the shooting number when the affected area of the same subject is continuously shot multiple times is displayed in a number 1406. Moving a slider 1407 on the right edge of the window enables the data based on the image data and the affected area information on another shooting date and time about the same subject ID to be displayed. In addition, changing the settings enables the data based on the affected area information on multiple shooting dates and times to be displayed to facilitate comparison of the change in the symptom of the affected area.
[0177] Although the process is performed in FIG. 14 in which, after the affected area is shot, the user is caused to perform the collation of the subject ID and the selection of the region where the affected area exists, the collation of the subject ID and the selection of the region where the affected area exists may be performed before the affected area is shot.
[0178] FIG. 15 is a work flow chart illustrating a modification of the operation of the image processing system 11 according to the third embodiment. The same processing as in the corresponding steps in FIG. 11 is performed in the steps to which the same numbers as those of the steps in FIG. 11 are given, among the steps described in FIG. 15.
[0179] Upon shooting of the barcode tag 103 in Step 1101, in Step 1501, the communication unit 219 transmits the image data generated by shooting the barcode tag 103 to the image processing apparatus 300.
[0180] In Step 1511, the communication unit 313 in the image processing apparatus receives the image data generated by shooting the barcode tag 103, which is transmitted from the imaging apparatus 200.
[0181] In Step 1512, the arithmetic unit 311 performs a process to read a one-dimensional barcode included in the image data about the barcode tag 103 that is received to read the subject ID identifying the subject.
[0182] In Step 1513, the subject ID that is read is collated with the subject ID registered in the storage unit 312.
[0183] In Step 1514, if the collation of the subject ID succeeds, the name of the patient registered in the database in the storage unit 312 is acquired. If the collation fails, information indicating that the collation failed is acquired, instead of the name of the patient.
[0184] In Step 1515, the communication unit 313 in the image processing apparatus transmits the name of the patient or the information indicating that the collation of the subject ID failed to the imaging apparatus 200.
[0185] In Step 1502, the communication unit 219 in the imaging apparatus 200 receives the name of the patient, which is transmitted from the image processing apparatus 300.
[0186] In Step 1503, the system control circuit 220 displays the name of the patient in the display unit 223.
[0187] In Step 1504, the system control circuit 220 displays the name of the patient in the display unit 223. Here, the user may be caused to input the result of confirmation of whether the name of the patient is correct. If the name of the patient is not correct or if the collation of the name of the patient failed, the operation may go back to Step 1101. Displaying the name of the patient before the image of the affected area is captured prevents wrong association between the image data about the affected area or the affected area information to be subsequently acquired and the subject ID.
[0188] In Step 1505, the system control circuit 220 displays a user interface prompting the user to input the information about the region where the affected area exists in the affected area information in the display unit 223. Specifically, as in FIG. 8A and FIG. 8B in the first embodiment, the region selection items 801 for specifying the regions: Head, Shoulder, Arm, Back, Waist, Hip, and Leg of the affected area are displayed to cause the user to select any of them.
[0189] In Step 1506, the user inputs the information about the affected area. Then, the operation goes to Step 402. Going to the step to shoot the affected area after the information about the region of the affected area to be shot is selected in the above manner prevents wrong selection of the information about the region of the affected area.
[0190] Since the collation of the subject ID is performed in Step 1513, it is not necessary for the image processing apparatus 300 to perform the collation of the subject ID after acquiring the image data including the affected area. In addition, since the information about the region of the affected area is input in Step 1506, it is not necessary for the user to input the information about the region of the affected area in Step 1507 and Step 1508 after the image data including the affected area is acquired and it is sufficient for the user to input the evaluation value of each evaluation item in Step 1507 and Step 1508.
[0191] As described above, in the image processing system 11 according to the present embodiment, it is possible to identify and store the image data about the affected area 102 and the result of analysis of the image data for each subject and to confirm whether each evaluation item is relieved or made worse only using the imaging apparatus the user has on hand. Accordingly, the user is capable of confirming management information about the affected area that has been registered immediately after shooting the affected area only using the imaging apparatus the user has on hand. In addition, displaying the severity level that is currently confirmed in comparison with the last management information enables the user to confirm at a glance whether the symptom is relieved or made worse.
[0192] The user is capable of confirming the result of analysis of the image data about the affected area 102 in association with the subject ID and the name of the subject from the terminal apparatus 1000, such as a tablet terminal, using a Web browser or a dedicated application.
[0193] In all the embodiments described above, a process to achieve the same effects as in the work flows in FIG. 4, FIG. 9, and FIG. 11 is capable of being performed only with the imaging apparatus 200 by installing a circuit corresponding to the auxiliary arithmetic unit 317 in the imaging apparatus 200. In this case, the same effects as in the image processing systems composed of the imaging apparatus 200 and the image processing apparatus 300 described above are achieved only with the imaging apparatus 200. Receiving a new learned model that is created in an external computer enables improvement of the accuracy of the inference process of the affected area and extraction of an affected area of a new type.
[0194] (Other embodiments)
The present invention can be implemented by processing of supplying a program for implementing one or more functions of the above-described embodiments to a system or apparatus via a network or storage medium, and causing one or more processors in the computer of the system or apparatus to read out and execute the program. The present invention can also be implemented by a circuit (for example, an ASIC) for implementing one or more functions.
[0195] The present invention is not limited to the above embodiments and various changes and modifications can be made within the spirit and scope of the present invention. Therefore, to apprise the public of the scope of the present invention, the following claims are made.
[0196] This application claims priority from Japanese Patent Application No. 2018-104922 filed on May 31, 2018, Japanese Patent Application No. 2019-018653 filed on February 5, 2019, and Japanese Patent Application No. 2019-095938 filed on May 22, 2019, which are hereby incorporated by reference herein.
[0197] The following numbered statements form part of the description. The claims follow and are labelled as such.
Statement 1. An image processing system comprising an imaging apparatus and an image processing apparatus, the image processing system is characterized in that the imaging apparatus includes imaging means for receiving light from a subject to generate image data, first communication means for outputting the image data to a communication network, and display means for displaying an image based on the image data generated by the imaging means, the image processing apparatus includes second communication means for acquiring the image data over the communication network, and arithmetic means for extracting a certain area of the subject from the image data, the second communication means outputs information indicating a result of extraction of the certain area extracted by the arithmetic means to the communication network, the first communication means acquires the information indicating the result of extraction of the certain area over the communication network, and the display means performs display based on the information indicating the result of extraction of the certain area.
Statement 2. The image processing system according to Statement 1, characterized in that the display means displays an image based on the image data on which the result of extraction of the certain area is superimposed and which is used to extract the certain area by the arithmetic means.
Statement 3. The image processing system according to Statement 1, characterized in that the display means displays a live view image on which the result of extraction of the certain area is superimposed and which is generated by the imaging means.
Statement 4. The image processing system according to any of Statements 1 to 3, characterized in that the arithmetic means generates information indicating a size of the certain area extracted from the image data, and the second communication means outputs the information indicating the size generated by the arithmetic means to the communication network.
Statement 5. The image processing system according to Statement 4, characterized in that the imaging apparatus includes generating means for generating distance information about a distance from the imaging apparatus to the subject, the first communication means outputs the distance information to the communication network, the second communication means acquires the distance information over the communication network, and the arithmetic means generates the information indicating the size of the certain area based on the distance information.
Statement 6. The image processing system according to Statement 4 or 5, characterized in that the display means performs display based on the information indicating the result of extraction of the certain area and the information indicating the size.
Statement 7. The image processing system according to any of Statements 4 to 6, characterized in that the information indicating the size of the certain area is at least one of lengths in at least two directions of the certain area, an area of the certain area, an area of a rectangular area circumscribed around the certain area, and a scale bar for measuring the size of the certain area.
Statement 8. The image processing system according to Statement 7, characterized in that the arithmetic means converts a size of the certain area on the image data based on information indicating an angle of view of the image data or a size of a pixel and the distance information to generate the information indicating the size of the certain area.
Statement 9. The image processing system according to any of Statements 1 to 8, characterized in that the arithmetic means identifies information indicating a size of the certain area for each subject having the certain area and stores the identified information in storage means.
Statement 10. The image processing system according to Statement 9, characterized in that the arithmetic means identifies the information indicating the size of the certain area based on the subject having the certain area and a date and time when the image data used in the extraction of the certain area is generated and stores the identified information in the storage means.
Statement 11. The image processing system according to Statement 9 or 10, characterized in that the arithmetic means transmits, in response to a request from an external terminal apparatus, the information indicating the size of the certain area, which corresponds to a subject specified in the request, to the terminal apparatus.
Statement 12. The image processing system according to any of Statements 9 to 11, characterized in that the second communication means further acquires image data including a code for identifying the subject, which is output from the first communication means, over the communication network, and the arithmetic means extracts information identifying the subject having the certain area from the image data including the code for identifying the subject.
Statement 13. The image processing system according to any of Statements 1 to 12, characterized in that the arithmetic means causes second display means different from the display means to display the information indicating the result of extraction of the certain area.
Statement 14. The image processing system according to Statement 13, characterized in that the arithmetic means causes the second display means to arrange for display an image based on the image data on which the result of extraction of the certain area is superimposed and an image based on the image data acquired by the second communication means.
Statement 15. The image processing system according to any of Statements 1 to 14, characterized in that the display means performs display to cause a user to input evaluation values of a plurality of predetermined evaluation items in the certain area.
Statement 16. The image processing system according to Statement 15, characterized in that the display means causes the user to input the evaluation values of the plurality of evaluation items in response to the acquisition of the result of extraction of the certain area.
Statement 17. The image processing system according to any of Statements 1 to 16, characterized in that the certain area is an affected area.
Statement 18. An imaging apparatus comprising:
imaging means for receiving light from a subject to generate image data; communication means for outputting the image data to an external apparatus over a communication network; and display means for displaying an image based on the image data generated by the imaging means, the imaging apparatus is characterized in that the communication means acquires information indicating a result of extraction of a certain area of the subject in the image data from the external apparatus over the communication network, and the display means performs display based on the information indicating the result of extraction of the certain area.
Statement 19. The imaging apparatus according to Statement 18, characterized in that the display means displays an image based on the image data on which the result of extraction of the certain area is superimposed and which is output to the external apparatus.
Statement 20. The imaging apparatus according to Statement 18, characterized in that the display means displays a live view image on which the result of extraction of the certain area is superimposed and which is generated by the imaging means.
Statement 21. The imaging apparatus according to any of Statements 18 to 20, characterized in that the communication means acquires information indicating a size of the certain area in the image data from the external apparatus over the communication network, and the display means performs display based on the information indicating the result of extraction of the certain area and the information indicating the size.
Statement 22. The imaging apparatus according to Statement 21, further comprising: generating means for generating distance information about a distance from the imaging apparatus to the subject, the imaging apparatus is characterized in that the communication means outputs the distance information to the external apparatus over the communication network.
Statement 23. The imaging apparatus according to Statement 21 or 22, characterized in that the information indicating the size of the certain area is at least one of lengths in at least two directions of the certain area, an area of the certain area, an area of a rectangular area circumscribed around the certain area, and a scale bar for measuring the size of the certain area.
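As one concrete reading of the size information listed in Statements 21 to 23, the sketch below computes lengths in two directions, the area, and the area of the circumscribed rectangle from a binary extraction mask, given the physical pitch of one pixel on the subject plane; the dictionary keys and the numpy-based approach are illustrative assumptions only.

import numpy as np

def size_information(mask, pixel_pitch_cm):
    # mask: 2-D boolean array marking the extracted certain area (e.g. an affected area).
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    height_px = ys.max() - ys.min() + 1       # extent in the vertical direction
    width_px = xs.max() - xs.min() + 1        # extent in the horizontal direction
    return {
        "length_vertical_cm": height_px * pixel_pitch_cm,
        "length_horizontal_cm": width_px * pixel_pitch_cm,
        "area_cm2": int(mask.sum()) * pixel_pitch_cm ** 2,
        "circumscribed_rect_area_cm2": height_px * width_px * pixel_pitch_cm ** 2,
    }

A scale bar, the remaining option in Statement 23, would simply be drawn with a length of a round number of centimetres divided by pixel_pitch_cm pixels.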
Statement 24. The imaging apparatus according to any of Statements 18 to 23, characterized in that the communication means outputs information for identifying the subject having the certain area to the external apparatus over the communication network.
Statement 25. The imaging apparatus according to any of Statements 18 to 24, characterized in that the display means performs display to cause a user to input evaluation values of a plurality of predetermined evaluation items in the certain area.
Statement 26. The imaging apparatus according to Statement 25, characterized in that the display means causes the user to input the evaluation values of the plurality of evaluation items in response to the acquisition of the result of extraction of the certain area.
Statement 27. The imaging apparatus according to any of Statements 18 to 26, characterized in that the certain area is an affected area.
Statement 28. An image processing apparatus comprising: communication means for acquiring image data and distance information corresponding to a subject included in the image data from an imaging apparatus over a communication network; and arithmetic means for extracting a certain area of the subject from the image data and calculating a size of the certain area based on the distance information, the image processing apparatus is characterized in that the communication means outputs information indicating a result of extraction of the certain area extracted by the arithmetic means and information indicating the size to the imaging apparatus over the communication network.
Statement 29. The image processing apparatus according to Statement 28, characterized in that the distance information is information about a distance from the imaging apparatus to the subject.
Statement 30. The image processing apparatus according to Statement 28 or 29, characterized in that the arithmetic means causes display means to arrange for display an image based on the image data on which at least one of the information indicating the result of extraction of the certain area and the information indicating the size of the certain area is superimposed and an image based on the image data acquired by acquiring means.
Statement 31. The image processing apparatus according to any of Statements 28 to 30, characterized in that the arithmetic means converts a size of the certain area on the image data based on an angle of view of the image data or information indicating a size of a pixel and the distance information to calculate the size of the certain area.
Statement 32. The image processing apparatus according to any of Statements 28 to 31, characterized in that the arithmetic means identifies the information indicating the size of the certain area for each subject having the certain area and stores the identified information in storage means.
Statement 33. The image processing apparatus according to Statement 32, characterized in that the arithmetic means identifies the information indicating the size of the certain area based on the subject having the certain area and a date and time when the image data used in the extraction of the certain area is generated and stores the identified information in the storage means.
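A minimal sketch of the per-subject storage in Statements 32 and 33, assuming a local SQLite table keyed by the subject and the capture date and time; the table and column names are invented for illustration, and any storage means with equivalent keys would serve.

import sqlite3

conn = sqlite3.connect("affected_area.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS area_size ("
    " subject_id TEXT,"
    " captured_at TEXT,"                      # date and time the image data was generated
    " area_cm2 REAL,"
    " length_vertical_cm REAL,"
    " length_horizontal_cm REAL,"
    " PRIMARY KEY (subject_id, captured_at))"
)

def store_size(subject_id, captured_at, info):
    # Identify the size information by subject and capture date/time, then persist it.
    conn.execute(
        "INSERT OR REPLACE INTO area_size VALUES (?, ?, ?, ?, ?)",
        (subject_id, captured_at, info["area_cm2"],
         info["length_vertical_cm"], info["length_horizontal_cm"]),
    )
    conn.commit()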
Statement 34. The image processing apparatus according to Statement 32 or 33, characterized in that the arithmetic means transmits, in response to a request from an external terminal apparatus, the information indicating the size of the certain area, which corresponds to a subject specified in the request, to the terminal apparatus.
Statement 35. The image processing apparatus according to any of Statements 32 to 34, characterized in that the communication means further acquires image data including a code for identifying the subject over the communication network, and the arithmetic means extracts information identifying the subject having the certain area from the image data including the code for identifying the subject.
Statement 36. The image processing apparatus according to any of Statements 28 to 35, characterized in that the certain area is an affected area.
Statement 37. A method of controlling an image processing system including an imaging apparatus that includes imaging means, display means, and first communication means and an image processing apparatus that includes arithmetic means and second communication means, the method is characterized by comprising: receiving light from a subject to generate image data by the imaging means; outputting the image data to a communication network by the first communication means; acquiring the image data over the communication network by the second communication means; extracting a certain area of the subject from the image data by the arithmetic means; outputting information indicating a result of extraction of the certain area to the communication network by the second communication means; acquiring the information indicating the result of extraction of the certain area over the communication network by the first communication means; and performing display based on the information indicating the result of extraction of the certain area by the display means.
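The control method of Statement 37 is, in essence, a round trip between the two apparatuses. The rough sketch below uses HTTP as a stand-in for the unspecified communication network; the endpoint, the JSON field names, and the camera and display objects are hypothetical and chosen only to make the sequence concrete.

import requests

PROCESSING_URL = "http://image-processing.example/extract"   # hypothetical endpoint

def capture_extract_and_display(camera, display):
    # Imaging means: receive light from the subject and generate image data.
    image_bytes = camera.capture()
    # First communication means: output the image data to the communication network;
    # the image processing apparatus extracts the certain area and returns the result.
    reply = requests.post(PROCESSING_URL, files={"image": image_bytes}).json()
    # Display means: perform display based on the information indicating the result.
    display.show(image_bytes, extraction=reply["extraction"])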
Statement 38. A method of controlling an imaging apparatus, the method is characterized by comprising: receiving light from a subject to generate image data; outputting the image data to an external apparatus over a communication network; acquiring information indicating a result of extraction of a certain area of the subject in the image data from the external apparatus over the communication network; and causing display means to perform display based on the information indicating the result of extraction of the certain area.
Statement 39. A method of controlling an image processing apparatus, the method is characterized by comprising: acquiring image data and distance information corresponding to a subject included in the image data from an imaging apparatus over a communication network; extracting a certain area of the subject from the image data and calculating a size of the certain area based on the distance information; and outputting information indicating a result of extraction of the certain area and information indicating the size to the imaging apparatus over the communication network.
Statement 40. A computer-readable non-volatile storage medium storing an instruction causing a computer to perform steps of a method of controlling an imaging apparatus, the method of controlling the imaging apparatus is characterized by comprising: receiving light from a subject to generate image data; outputting the image data to an external apparatus over a communication network; acquiring information indicating a result of extraction of a certain area of the subject in the image data from the external apparatus over the communication network; and causing display means to perform display based on the information indicating the result of extraction of the certain area.
Statement 41. A computer-readable non-volatile storage medium storing an instruction causing a computer to perform steps of a method of controlling an image processing apparatus, the method of controlling the image processing apparatus is characterized by comprising: acquiring image data and distance information corresponding to a subject included in the image data from an imaging apparatus over a communication network; extracting a certain area of the subject from the image data and calculating a size of the certain area based on the distance information; and outputting information indicating a result of extraction of the certain area and information indicating the size to the imaging apparatus over the communication network.
Statement 42. An imaging apparatus comprising:
imaging means for receiving light from a subject to generate image data; control means for acquiring a result of extraction of a certain area of the subject in the image data; and interface means for causing a user to input evaluation values of a plurality of predetermined evaluation items in the certain area of the subject, the imaging apparatus is characterized in that the control means associates the evaluation values of the input plurality of evaluation items with the image data.
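A minimal sketch of the association required by Statement 42, assuming the evaluation values are written to a sidecar JSON file next to the image data; the item names in the example (depth, exudate, size) echo common wound-assessment scales but are not dictated by the statement.

import json

def attach_evaluation(image_path, evaluation_values):
    # Associate the user-entered evaluation values with the image data they describe.
    record = {"image": image_path, "evaluation": evaluation_values}
    with open(image_path + ".eval.json", "w") as f:
        json.dump(record, f, indent=2)

attach_evaluation("affected_area_0001.jpg", {"depth": 2, "exudate": 1, "size": 3})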
Statement 43. The imaging apparatus according to Statement 42, characterized in that the interface means causes the user to input the evaluation values of the plurality of evaluation items in response to the acquisition of the result of extraction of the certain area of the subject.
Statement 44. The imaging apparatus according to Statement 42 or 43, characterized in that the result of extraction of the certain area of the subject includes information indicating a size of the certain area.
Statement 45. The imaging apparatus according to any of Statements 42 to 44, characterized in that the interface means causes the user to input information about a region of the subject in which the certain area exists.
Statement 46. The imaging apparatus according to Statement 45, characterized in that the interface means causes the user to input the information about the region in which the certain area exists before the result of extraction of the certain area is acquired.
Statement 47. The imaging apparatus according to any of Statements 42 to 46, characterized in that the interface means displays the evaluation items for which the evaluation values are input and the evaluation items for which the evaluation values are not input, among the plurality of evaluation items, in different modes.
Statement 48. The imaging apparatus according to any of Statements 42 to 46, characterized in that the control means acquires information for identifying the subject having the certain area and associates the image data from which the certain area is extracted with the evaluation values of the plurality of evaluation items for each subject.
Statement 49. The imaging apparatus according to Statement 48, characterized in that the control means acquires the evaluation values of the plurality of evaluation items that have been associated with the same subject.
Statement 50. The imaging apparatus according to Statement 49, characterized in that the interface means displays the evaluation values of the plurality of evaluation items that are newly acquired and the evaluation values of the plurality of evaluation items that have been acquired.
Statement 51. The imaging apparatus according to any of Statements 48 to 50, characterized in that the interface means displays a result of identification of the subject having the certain area.
Statement 52. The imaging apparatus according to any of Statements 42 to 51, characterized by further comprising: communication means for transmitting the image data generated by the imaging means to an image processing apparatus, which is an external apparatus, over a communication network and receiving information about a result of extraction of the certain area from the image data from the image processing apparatus over the communication network.
Statement 53. The imaging apparatus according to Statement 52, characterized in that the communication means transmits the image data and distance information from the imaging apparatus to the subject to the image processing apparatus over the communication network and receives the information about the result of extraction of the certain area, which includes information indicating a size of the certain area, from the image processing apparatus over the communication network.
Statement 54. The imaging apparatus according to any of Statements 42 to 53, characterized in that the certain area is an affected area.
Statement 55. A method of controlling an imaging apparatus, the method is characterized by comprising: receiving light from a subject to generate image data; acquiring a result of extraction of a certain area of the subject in the image data; causing a user to input evaluation values of a plurality of predetermined evaluation items in the certain area of the subject; and associating the evaluation values of the input plurality of evaluation items with the image data.
Statement 56. A computer-readable non-volatile storage medium storing an instruction causing a computer to perform steps of a method of controlling an imaging apparatus, the method of controlling the imaging apparatus is characterized by comprising: receiving light from a subject to generate image data; acquiring a result of extraction of a certain area of the subject in the image data; causing a user to input evaluation values of a plurality of predetermined evaluation items in the certain area of the subject; and associating the evaluation values of the input plurality of evaluation items with the image data.
Statement 57. An electronic device characterized by comprising: communication means for acquiring image data generated by an imaging apparatus and information indicating evaluation values of a plurality of evaluation items for an affected area of a subject in the image data, which is input by a user with the imaging apparatus, over a communication network; and control means for causing display means to display an image based on the image data and the evaluation values of the plurality of evaluation items.
Statement 58. The electronic device according to Statement 57, characterized in that the control means causes the display means to identify for display an image based on the image data and the evaluation values of the plurality of evaluation items based on the subject having the certain area and a date and time when the image data used in extraction of the certain area is generated.
Statement 59. A method of controlling an electronic device, the method is characterized by comprising: acquiring image data generated by an imaging apparatus and information indicating evaluation values of a plurality of evaluation items for an affected area of a subject in the image data, which is input by a user with the imaging apparatus, over a communication network; and causing display means to display an image based on the image data and the evaluation values of the plurality of evaluation items.
Statement 60. A computer-readable non-volatile storage medium storing an instruction causing a computer to perform steps of a method of controlling an electronic device, the method of controlling the electronic device is characterized by comprising: acquiring image data generated by an imaging apparatus and information indicating evaluation values of a plurality of evaluation items for an affected area of a subject in the image data, which is input by a user with the imaging apparatus, over a communication network; and causing display means to display an image based on the image data and the evaluation values of the plurality of evaluation items.

Claims (7)

CLAIMS
1. An image processing apparatus comprising: communication means for acquiring image data over a communication network; arithmetic means for extracting an affected area of a subject from the image data; and storing means for storing information in storage means, wherein the communication means outputs information indicating a result of extraction of the affected area extracted by the arithmetic means to an external apparatus over the communication network and acquires information of the affected area from the external apparatus over the communication network, and wherein the storing means stores the information of the affected area from the external apparatus in the storage means.
2. The image processing apparatus according to Claim 1, wherein the storing means stores the information of the affected area from the external apparatus in a database in the storage means.
3. The image processing apparatus according to Claim 2, wherein the storing means stores the information of the affected area from the external apparatus and the information indicating the result of extraction of the affected area in the database in the storage means.
4. The image processing apparatus according to Claim 2 or 3, wherein the storing means stores the information of the affected area from the external apparatus and information for identifying the subject in the database in the storage means.
5. The image processing apparatus according to any of Claims 2 to 4, wherein the storing means stores the information of the affected area from the external apparatus and information of date and time in the database in the storage means.
6. The image processing apparatus according to any of Claims 2 to 5, wherein the communication means outputs information in the database in the storage means to a terminal apparatus over the communication network in response to a request from the terminal apparatus.
7. The image processing apparatus according to any of Claims 1 to 6, wherein the information of the affected area is the information inputted in the external apparatus.
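Claims 1 to 6 describe an image processing apparatus that stores, in a database, the affected-area information received back from the external apparatus together with the extraction result, the subject identifier, and the date and time, and that serves that database to a terminal apparatus on request. The sketch below is one possible shape of such an apparatus, assuming Flask for the communication means and SQLite for the storage means; the routes, table, and column names are illustrative only.

import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
db = sqlite3.connect("affected_area.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS affected_area ("
           "subject_id TEXT, captured_at TEXT, extraction TEXT, info TEXT)")

@app.route("/affected_area", methods=["POST"])
def store_affected_area():
    # Store the affected-area information received from the external apparatus in the
    # database, alongside the extraction result, subject identifier, and date and time.
    r = request.get_json()
    db.execute("INSERT INTO affected_area VALUES (?, ?, ?, ?)",
               (r["subject_id"], r["captured_at"], r["extraction"], r["info"]))
    db.commit()
    return "", 204

@app.route("/affected_area/<subject_id>")
def fetch_affected_area(subject_id):
    # Claim 6: output stored information to a terminal apparatus in response to a request.
    rows = db.execute("SELECT captured_at, extraction, info FROM affected_area "
                      "WHERE subject_id = ?", (subject_id,)).fetchall()
    return jsonify(rows)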
GB2215730.9A 2018-05-31 2019-05-28 Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device, Active GB2609147B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018104922 2018-05-31
JP2019018653 2019-02-05
JP2019095938A JP2020123304A (en) 2018-05-31 2019-05-22 Image processing system, imaging device, image processing device, electronic apparatus, control method thereof, and program
GB2206461.2A GB2606859B (en) 2018-05-31 2019-05-28 An image processing system and apparatus, imaging apparatus, device, system/control methods, means for storage

Publications (3)

Publication Number Publication Date
GB202215730D0 GB202215730D0 (en) 2022-12-07
GB2609147A true GB2609147A (en) 2023-01-25
GB2609147B GB2609147B (en) 2023-05-17

Family

ID=68698198

Family Applications (3)

Application Number Title Priority Date Filing Date
GB2206461.2A Active GB2606859B (en) 2018-05-31 2019-05-28 An image processing system and apparatus, imaging apparatus, device, system/control methods, means for storage
GB2017734.1A Active GB2588306B (en) 2018-05-31 2019-05-28 Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device
GB2215730.9A Active GB2609147B (en) 2018-05-31 2019-05-28 Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device,

Family Applications Before (2)

Application Number Title Priority Date Filing Date
GB2206461.2A Active GB2606859B (en) 2018-05-31 2019-05-28 An image processing system and apparatus, imaging apparatus, device, system/control methods, means for storage
GB2017734.1A Active GB2588306B (en) 2018-05-31 2019-05-28 Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device

Country Status (2)

Country Link
GB (3) GB2606859B (en)
WO (1) WO2019230724A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6821852B1 (en) * 2020-07-15 2021-01-27 株式会社アイム Support methods, support programs and trained model generation methods to support skin condition evaluation
CN112150496B (en) * 2020-09-24 2023-06-02 平安科技(深圳)有限公司 Photo processing method, device, electronic equipment and readable storage medium
GB2617985A (en) * 2021-02-01 2023-10-25 Skinopathy Inc Machine learning enabled system for skin abnormality interventions

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015019573A1 (en) * 2013-08-08 2015-02-12 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Control method for information processing device, and image processing method
WO2017203913A1 (en) * 2016-05-25 2017-11-30 パナソニックIpマネジメント株式会社 Skin diagnostic device and skin diagnostic method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8407065B2 (en) * 2002-05-07 2013-03-26 Polyremedy, Inc. Wound care treatment service using automatic wound dressing fabricator
WO2014179594A2 (en) * 2013-05-01 2014-11-06 Francis Nathania Alexandra System and method for monitoring administration of nutrition
JP6451350B2 (en) * 2015-01-28 2019-01-16 カシオ計算機株式会社 Medical image processing apparatus, medical image processing method and program
US20180028108A1 (en) * 2015-03-18 2018-02-01 Bio1 Systems, Llc Digital wound assessment device and method
EP3383253A1 (en) * 2015-11-30 2018-10-10 Galderma Research & Development Skin inspection tool and method of improved assessment of skin lesions using a skin inspection tool
WO2018013321A1 (en) * 2016-06-28 2018-01-18 Kci Licensing, Inc. Semi-automated mobile system for wound image segmentation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015019573A1 (en) * 2013-08-08 2015-02-12 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ Control method for information processing device, and image processing method
WO2017203913A1 (en) * 2016-05-25 2017-11-30 パナソニックIpマネジメント株式会社 Skin diagnostic device and skin diagnostic method

Also Published As

Publication number Publication date
GB202215730D0 (en) 2022-12-07
GB2606859B (en) 2023-09-06
GB2588306B (en) 2022-06-22
GB2588306A9 (en) 2022-03-02
GB2606859A (en) 2022-11-23
GB2588306A (en) 2021-04-21
GB202017734D0 (en) 2020-12-23
WO2019230724A1 (en) 2019-12-05
GB2609147B (en) 2023-05-17
GB202206461D0 (en) 2022-06-15

Similar Documents

Publication Publication Date Title
US20210068742A1 (en) Image processing system, imaging apparatus, electronic device, methods of controlling the system, and the apparatuses, and storage medium
US10956715B2 (en) Decreasing lighting-induced false facial recognition
US11600003B2 (en) Image processing apparatus and control method for an image processing apparatus that extract a region of interest based on a calculated confidence of unit regions and a modified reference value
KR101998595B1 (en) Method and Apparatus for jaundice diagnosis based on an image
GB2609147A (en) Image processing system, imaging apparatus, image processing apparatus, electronic device, methods of controlling the system, the apparatuses, and the device,
KR20190048340A (en) Electronic device and method for determining hyperemia grade of eye using the same
US9569838B2 (en) Image processing apparatus, method of controlling image processing apparatus and storage medium
US11599993B2 (en) Image processing apparatus, method of processing image, and program
US20210401327A1 (en) Imaging apparatus, information processing apparatus, image processing system, and control method
US11373312B2 (en) Processing system, processing apparatus, terminal apparatus, processing method, and program
US11475571B2 (en) Apparatus, image processing apparatus, and control method
JP2016092430A (en) Imaging system, information processing device, imaging method, program and storage medium
JP2021049262A (en) Image processing system and method for controlling the same
JP2021049248A (en) Image processing system and method for controlling the same
JP7317528B2 (en) Image processing device, image processing system and control method
US20240000307A1 (en) Photography support device, image-capturing device, and control method of image-capturing device
JP2020156082A (en) Imaging apparatus, image processing system, and control method
US20230131704A1 (en) Information processing apparatus, learning device, imaging apparatus, control method of information processing apparatus, and program
WO2024143176A1 (en) Biological information acquisition assistance device and biological information acquisition assistance method
EP4270926A1 (en) Image processing apparatus, image capture apparatus, image processing method, and program
US20220300350A1 (en) Information processing apparatus, control method of information processing apparatus, and recording medium
JP2020151461A (en) Imaging apparatus, information processing apparatus, and information processing system
JP2016139957A (en) Imaging apparatus, imaging method, and imaging program
JP2024095079A (en) Biometric information acquisition support device and method