WO2020044510A1 - Computer system, object detection method, and program - Google Patents


Info

Publication number
WO2020044510A1
Authority
WO
WIPO (PCT)
Prior art keywords
weight
image
module
size
computer
Prior art date
Application number
PCT/JP2018/032207
Other languages
English (en)
Japanese (ja)
Inventor
菅谷 俊二 (Shunji Sugaya)
Original Assignee
OPTiM Corporation (株式会社オプティム)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by OPTiM Corporation (株式会社オプティム)
Priority to PCT/JP2018/032207 priority Critical patent/WO2020044510A1/fr
Priority to JP2020539962A priority patent/JP7068746B2/ja
Publication of WO2020044510A1 publication Critical patent/WO2020044510A1/fr

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01G WEIGHING
    • G01G 9/00 Methods of, or apparatus for, the determination of weight, not provided for in groups G01G1/00 - G01G7/00
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes

Definitions

  • the present invention relates to a computer system for estimating the weight of an object, an object detection method, and a program.
  • as a technique for estimating the weight of such an object, there has been disclosed a technique in which an elevator door is photographed and the weight of an object carried into the elevator is estimated (see Patent Document 1).
  • in this method of estimating the weight of the object, an image of the object carried into the car is photographed by a photographing device or the like, and the image is analyzed to estimate the contour of the object. Further, the image is divided into predetermined blocks, and whether or not the weight is equal to or greater than a predetermined value is determined based on the number of blocks, the variation of the block positions, and the estimated contour of the object.
  • in this technique, each of a plurality of images is divided into a plurality of blocks, and the similarity of the motion vector in each block of each image is calculated to estimate the contour of the object.
  • the object of the present invention is to provide a computer system, an object detection method, and a program that can easily and accurately estimate the weight of an object.
  • the present invention provides the following solutions.
  • the present invention provides a computer system comprising: an acquisition unit that acquires a captured image; a detection means that extracts a feature amount from the image and detects an object; and an estimation means that estimates the weight of the detected object from the size of the object shown in the image.
  • according to the present invention, the computer system acquires a captured image, extracts a feature amount from the image, detects an object, and estimates the weight of the detected object from the size of the object shown in the image.
  • the present invention is in the category of computer systems.
  • in other categories, such as a method or a program, the invention exhibits the same functions and effects according to each category.
  • FIG. 1 is a diagram illustrating an outline of the object detection system 1.
  • FIG. 2 is an overall configuration diagram of the object detection system 1.
  • FIG. 3 is a flowchart illustrating a first object detection process executed by the computer 10.
  • FIG. 4 is a flowchart illustrating a second object detection process executed by the computer 10.
  • FIG. 5 is a flowchart illustrating a third object detection process executed by the computer 10.
  • FIG. 6 is a flowchart illustrating a learning process performed by the computer 10.
  • FIG. 7 is a diagram illustrating an example of the object table.
  • FIG. 8 is a diagram illustrating an example of an image.
  • FIG. 9 is a diagram illustrating an example of an image.
  • FIG. 10 is a diagram illustrating an example of an image.
  • FIG. 11 is a diagram illustrating an example of the notification screen.
  • FIG. 12 is a diagram illustrating an example of a state where a predetermined area is superimposed on an image.
  • FIG. 13 is a diagram illustrating an example of a state where a predetermined area is superimposed on an image.
  • FIG. 14 is a diagram illustrating an example of the notification screen.
  • FIG. 15 is a diagram illustrating an example of an image.
  • FIG. 16 is a diagram illustrating an example of a complemented image obtained by complementing an image.
  • FIG. 17 is a diagram illustrating an example of the notification screen.
  • FIG. 1 is a diagram for describing an outline of an object detection system 1 according to a preferred embodiment of the present invention.
  • the object detection system 1 is a computer system that includes a computer 10 and estimates the weight of an object.
  • the object detection system 1 includes, in addition to the computer 10, other devices such as a photographing device that photographs an object, a terminal device that displays an estimated weight, and a user terminal that receives a predetermined input from a user. You may.
  • the computer 10 is connected to other devices (not shown) so as to be able to perform data communication via a public line network or the like, and transmits and receives necessary data.
  • the computer 10 acquires, as image data, an image of an object (for example, a heavy machine such as a shovel car, a crop such as a vegetable, or a person) photographed by a photographing device (not shown).
  • the image data includes the position information of the shooting point.
  • the position information of the photographing point is obtained by the photographing apparatus acquiring its own current position from a GPS (Global Positioning System) or the like, and using the acquired current position as the position information of the photographing point.
  • the computer 10 analyzes the image included in the image data, and extracts its feature amount (for example, statistical values such as the average, variance, and histogram of pixel values, and the shape and contour of an object).
  • the computer 10 detects an object appearing in the image based on the extracted feature amount.
  • the computer 10 estimates the weight of the detected object. For example, the computer 10 estimates the distance from the shooting point to the object based on the position information of the shooting point, and estimates the size (volume) of the object based on that distance and the size (area) of the object in the image. The computer 10 then estimates the weight of the object based on the size of the object and the density of the detected object.
  • the computer 10 also estimates the weight of the object by learning the correlation between the actual weight of the detected object and the acquired image of the object. For example, when newly acquiring image data, the computer 10 estimates the weight of the object reflected in the newly acquired image data while taking the learning result into account. At this time, the computer 10 learns, as the correlation, at least one of the density, the name, and the size of the object, and the distance to the object.
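The learning step described above can be sketched as a simple calibration: given pairs of estimated volumes and actual weights, fit an effective density and reuse it for later estimates. The patent does not specify the learning algorithm; the function name and the least-squares fit through the origin below are illustrative assumptions.

```python
def fit_density(volumes, weights):
    """Least-squares fit of weight = density * volume through the origin.

    A hypothetical stand-in for the patent's unspecified learning step:
    the correlation between estimated volumes and measured weights is
    summarized as one calibrated density.
    """
    numerator = sum(v * w for v, w in zip(volumes, weights))
    denominator = sum(v * v for v in volumes)
    return numerator / denominator

# Calibrate from observed (estimated volume, actual weight) pairs,
# then reuse the density for a newly acquired image.
density = fit_density([1.0, 2.0, 4.0], [2.1, 3.9, 8.0])
estimated_weight = density * 3.0
```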
  • the computer 10 acquires image data (Step S01).
  • the computer 10 acquires an image photographed by a photographing device (not shown) and position information of the photographing device (position information of a photographing point) as image data.
  • the imaging device acquires its own position information from GPS or the like, and the computer 10 acquires this position information as the position information of the imaging point.
  • the computer 10 performs image analysis on the basis of the image data, and extracts an image feature amount in the image data (step S02).
  • the computer 10 extracts statistical numerical values such as the average, variance, and histogram of pixel values, and the shape, contour, and the like of an object as the feature amount of an image.
  • the computer 10 detects an object appearing in the image based on the extracted feature amount (Step S03).
  • the computer 10 refers to an object table in which identifiers (names, model numbers, product numbers, etc.) of various objects are associated with feature amounts of the respective objects, and specifies identifiers of the objects corresponding to the feature amounts extracted this time. Detect an object.
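The object-table lookup can be sketched as follows, under the assumption that feature amounts are reduced to fixed-length numeric vectors and that matching is nearest-neighbor; the table contents, vector form, and distance rule are illustrative, not taken from the patent.

```python
import math

# Hypothetical object table: identifier -> reference feature vector.
OBJECT_TABLE = {
    "shovel car (small)": [0.8, 0.1, 0.3],
    "cabbage":            [0.2, 0.9, 0.5],
    "person (male, 20s)": [0.4, 0.4, 0.9],
}

def detect_object(features, table=OBJECT_TABLE):
    """Return the identifier whose stored feature vector is closest
    (Euclidean distance) to the features extracted from the image."""
    return min(table, key=lambda name: math.dist(features, table[name]))
```

In practice a real system would also apply a distance threshold so that images containing no known object yield no detection (the step S12 "NO" branch).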
  • the computer 10 estimates the weight of the detected object from the size shown in the image of the object (step S04).
  • the computer 10 estimates the weight of the detected object based on, for example, the size shown in the image of the object (that is, the area of the object in the image).
  • the computer 10 estimates the distance from the shooting point of the image to the object by a method such as triangulation, and estimates the weight of the object based on the distance and the size of the object shown in the image.
  • the computer 10 refers to a weight table in which identifiers (names, model numbers, product numbers, etc.) of objects, the sizes (volumes) of the objects, and the weight densities of the objects are associated with one another, and estimates the weight of the object detected this time.
  • the computer 10 refers to the weight table to specify the size and the weight density associated with the identifier of the object detected this time, and estimates the weight of the object based on the specified size, the specified weight density, and the estimated size.
  • the computer 10 learns the correlation between the actual weight of the detected object and the image acquired this time (at least one of the density, the name, and the size of the object, or the distance to the object).
  • the computer 10 estimates the weight of the object in consideration of the learning result.
  • FIG. 2 is a diagram illustrating a system configuration of an object detection system 1 according to a preferred embodiment of the present invention.
  • an object detection system 1 is a computer system that includes a computer 10 and estimates the weight of an object.
  • the computer 10 is connected to other devices (not shown) such as the above-described photographing device, terminal device, and user terminal via a public line network or the like so as to be able to perform data communication.
  • the computer 10 includes, as a control unit, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and, as a communication unit, a device that enables communication with other terminals and devices, for example a device compliant with IEEE 802.11 such as a Wi-Fi (Wireless Fidelity) device.
  • the computer 10 includes a data storage unit such as a hard disk, a semiconductor memory, a storage medium, and a memory card as a storage unit.
  • the computer 10 includes, as a processing unit, various devices that execute various processes.
  • in the computer 10, when the control unit reads a predetermined program, the image data acquisition module 20, the notification module 21, the object designation data acquisition module 22, the area designation data acquisition module 23, the object data acquisition module 24, and the actual weight data acquisition module 25 are realized. Further, in the computer 10, the control unit reads a predetermined program, thereby realizing the storage module 30 in cooperation with the storage unit. Further, in the computer 10, the control unit reads a predetermined program and, in cooperation with the processing unit, realizes the feature amount extraction module 40, the object detection module 41, the distance estimation module 42, the size estimation module 43, the size estimation module 44, the weight estimation module 45, the object complementing module 46, and the learning module 47.
  • FIG. 3 is a diagram illustrating a flowchart of the first object detection process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the image data acquisition module 20 acquires an image photographed by a photographing device (not shown) and position information of the photographing device as image data (step S10).
  • the image capturing apparatus transmits the image captured by the image capturing apparatus and its own position information acquired from GPS or the like to the computer 10 as image data.
  • the position information of the image capturing apparatus itself is the position information of the image capturing point.
  • the image data acquisition module 20 acquires the image photographed by the photographing device and the position information of the photographing point of the image.
  • the feature amount extraction module 40 analyzes the image included in the image data based on the acquired image data, and extracts the feature amount of the image (step S11).
  • the feature value extraction module 40 extracts, as feature values, statistical values such as the average, variance, and histogram of pixel values, and the shape and contour of an object.
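The statistical feature amounts named here (mean, variance, histogram of pixel values) can be sketched in a few lines; the bin count and the flat grayscale pixel list are illustrative assumptions, and shape/contour extraction is omitted.

```python
def extract_features(pixels, bins=4, max_value=256):
    """Simple image feature amounts from a flat list of grayscale pixel
    values in [0, max_value): mean, variance, and a coarse histogram."""
    n = len(pixels)
    mean = sum(pixels) / n
    variance = sum((p - mean) ** 2 for p in pixels) / n
    width = max_value // bins
    histogram = [0] * bins
    for p in pixels:
        histogram[min(p // width, bins - 1)] += 1
    return mean, variance, histogram

mean, variance, histogram = extract_features([0, 64, 128, 255])
```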
  • the object detection module 41 determines whether an object is included in the image based on the extracted feature amount (Step S12).
  • the object detection module 41 refers to an object table, stored in advance in the storage module 30, in which identifiers (names, model numbers, product numbers, and the like) of various objects are associated with feature amounts (one or more) of those objects, and determines whether an object corresponding to the extracted feature amount is reflected in this image.
  • the storage module 30 stores the identifier of the heavy equipment (the name of the heavy equipment, the name of the heavy equipment classified by sales maker, the model number, the product number, etc.) and the feature amount of the heavy equipment in association with each other as an object table.
  • the storage module 30 also associates crop identifiers (crop names, crop names classified by producers, identification numbers that can identify individual crops, etc.) with the feature amounts of the crops, and creates an object table.
  • the storage module 30 stores an identifier of a person (age, gender, race, height, weight, and the like) in association with a feature amount of the person as an object table.
  • the object detection module 41 compares the feature amount extracted this time with the object table, and determines whether or not an object corresponding to this feature amount is registered in the object table.
  • in step S12, when the object detection module 41 determines that no object is reflected in the image (step S12: NO), the computer 10 ends the processing.
  • in step S12, when the object detection module 41 determines that an object is reflected in the image (step S12: YES), the object detection module 41 detects the object reflected in the image (step S13).
  • in step S13, the object detection module 41 compares the extracted feature amount with the object table, and detects, as the object reflected in this image, the identifier of the object corresponding to the extracted feature amount.
  • the distance estimation module 42 estimates the distance between the imaging device and the object based on the position information of the imaging point included in the image data (step S14). In step S14, the distance estimation module 42 estimates the distance from the shooting position to the object by, for example, triangulation. At this time, the distance estimation module 42 knows in advance the points at both ends of the base line passing through the imaging point, and estimates the distance from the imaging point to the object.
  • the method by which the distance estimating module 42 estimates the distance between the imaging device and the object is not limited to the above-described example, and can be appropriately changed.
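One textbook form of the triangulation mentioned in step S14 sights the object from the two known ends of a baseline and derives the perpendicular distance from the sighting angles. The patent leaves the exact survey method open, so this specific formula is an assumption.

```python
import math

def triangulate_distance(baseline, angle_a, angle_b):
    """Perpendicular distance from a baseline to the object, given the
    baseline length and the sighting angles (radians) measured at the
    two ends of the baseline toward the object."""
    return baseline / (1 / math.tan(angle_a) + 1 / math.tan(angle_b))
```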
  • the size estimation module 43 estimates the size (area) of the detected object in the image (step S15). In step S15, the size estimation module 43 estimates an area where the object exists as the size of the object based on, for example, the detected contour or shape of the object. The size estimation module 43 estimates, for example, the sum of the number of pixels in this area as the size.
  • the method of estimating the size of the object by the size estimating module 43 is not limited to the example described above, and can be changed as appropriate.
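Step S15's "sum of the number of pixels in this area" can be sketched directly; the binary mask below is an illustrative stand-in for the region inside the detected contour.

```python
def object_area(mask):
    """Size (area) of the object in the image: the number of pixels
    inside the detected contour, given here as a binary mask."""
    return sum(sum(row) for row in mask)

# Hypothetical 3x4 mask for a detected object.
mask = [
    [0, 1, 1, 0],
    [0, 1, 1, 1],
    [0, 0, 1, 0],
]
area = object_area(mask)
```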
  • the size estimation module 44 estimates the size (volume) of the object based on the estimated distance from the shooting point to the object and the estimated size of the object (step S16). In step S16, the size estimation module 44 estimates the size of the object based on the ratio of the size of the object to the image and the distance.
  • the method of estimating the size of the object by the size estimating module 44 is not limited to the example described above, and can be appropriately changed.
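Step S16 only states that the volume is estimated from the object's share of the image and the distance; one concrete (assumed) model is a pinhole camera, scaling pixels to real-world units by the distance and treating the object as a cube of the resulting cross-section.

```python
def estimate_volume(pixel_area, distance, focal_length):
    """Rough size (volume) guess under a pinhole-camera assumption.
    All parameters and the cubic-object model are illustrative."""
    scale = distance / focal_length        # real-world units per pixel
    real_area = pixel_area * scale ** 2    # cross-sectional area
    return real_area ** 1.5               # volume of an equivalent cube
```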
  • the weight estimation module 45 refers to the weight table in which the identifier (name, model number, product number, etc.) of the object stored in advance in the storage module 30, the size of the object, and the weight density are associated with one another, and estimates the weight of the object (step S17).
  • the weight estimation module 45 refers to the weight table based on the identifier of the object detected this time and the size of the object, and specifies the weight density corresponding to the detected object.
  • the weight estimation module 45 estimates the weight of the object based on the specified weight density and the estimated size of the object.
  • FIG. 7 is a diagram schematically illustrating a weight table stored in the storage module 30.
  • the storage module 30 associates the name of the object, which is the identifier of the object, the size (volume) of the object, and the weight density of the object, registers them in the weight table, and stores the registered weight table.
  • the storage module 30 associates a shovel car (small), which is the name of the object, V1, which is the size of the object, and D1, which is the weight density of the object, and registers it as a weight table.
  • similarly, the storage module 30 registers the shovel car (large), V2, and D2 in association with each other in the weight table. Similarly, the storage module 30 registers the cabbage, V3, and D3 in association with each other in the weight table. Similarly, the storage module 30 associates a person (male, 20s), V4, and D4 and registers them in the weight table. These data are obtained from an input from a user terminal or through an external computer or the like, and the storage module 30 registers these data in the weight table and stores the weight table.
  • the identifier of the object registered as the weight table is not limited to the name, but may be another one.
  • the identifier may be associated with one or more combinations of age, gender, race, height, weight, and the like, and with the size and the weight density.
  • the weight estimation module 45 may estimate the weight of the object by a method other than the method of referring to the weight table when estimating the weight of the object. For example, the weight estimation module 45 may estimate the weight of the object based on a function of the identifier, the size, and the density of the object.
  • the notification module 21 notifies the estimated weight to the user terminal or the like (step S18).
  • the notification module 21 generates a weight notification in which the identifier of the estimated object and the weight are superimposed on the obtained image, and notifies the generated weight notification to a user terminal or the like.
  • the user terminal or the like acquires this notification and displays it on its own display unit or the like, thereby notifying the user of the weight of the object.
  • the computer 10 displays the estimated weight of the object on the user terminal or the like, thereby notifying the user of the weight of the object.
  • the above is the first object detection processing.
  • FIGS. 8, 9, and 10 are diagrams illustrating examples of image data acquired by the image data acquisition module 20. Each image data includes position information of each shooting point in addition to the image.
  • FIG. 8 is an image of a shovel car as a heavy machine.
  • FIG. 9 is an image of cabbage as a crop.
  • FIG. 10 is an image of a person.
  • the image data obtaining module 20 obtains the image data shown in FIGS. 8, 9 and 10 by the processing in step S10 described above.
  • the feature value extraction module 40 extracts a feature value from each image data by the process of step S11 described above.
  • the object detection module 41 detects an object appearing in this image by the processing in steps S12 and S13 described above.
  • the object detection module 41 detects the shovel car (small) 100 as an object based on the feature amount extracted from the image shown in FIG.
  • the object detection module 41 detects the cabbage 110 as an object based on the feature amount extracted from the image illustrated in FIG.
  • the object detection module 41 detects a person (male, 20's) 120 as an object based on the feature amount extracted from the image shown in FIG.
  • the distance estimation module 42 estimates the distance between the imaging device and the object by the processing in step S14 described above.
  • the size estimation module 43 estimates the size (area) in the image of the object by the processing in step S15 described above. That is, the size estimation module 43 estimates the size of the image of the shovel car (small) 100, estimates the size of the image of the cabbage 110, and estimates the size of the image of the person (male, 20's) 120.
  • the size estimation module 44 estimates the size (volume) of the object by the processing in step S16 described above. That is, the size estimation module 44 estimates the size of the shovel car (small) 100, estimates the size of the cabbage 110, and estimates the size of the person (male, 20's) 120.
  • the weight estimation module 45 estimates the weight of the object by the processing in step S17 described above. Since the identifier of the object is “shovel car (small)”, the weight estimation module 45 refers to the weight table and specifies the weight density D1 associated with “shovel car (small)”. The weight estimation module 45 estimates the weight W1 of the shovel car (small) 100 based on the specified weight density D1 and the estimated size. Similarly, since the identifier of the object is “cabbage”, the weight estimation module 45 refers to the weight table and specifies the weight density D3 associated with “cabbage”. The weight estimation module 45 estimates the weight W3 of the cabbage 110 based on the specified weight density D3 and the estimated size.
  • similarly, since the identifier of the object is “person (male, 20s)”, the weight estimation module 45 refers to the weight table and specifies the weight density D4 associated with “person (male, 20s)”. The weight estimation module 45 estimates the weight W4 of the person (male, 20's) 120 based on the specified weight density D4 and the estimated size.
  • the notification module 21 notifies the estimated weight by the processing in step S18 described above.
  • a notification screen that the notification module 21 notifies the user terminal or the like will be described with reference to FIG.
  • FIG. 11 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies a user terminal or the like of the weight of an object.
  • the shovel car (small) among the three objects described above will be described as an example.
  • the notification module 21 similarly notifies the cabbage and the person.
  • the notification module 21 displays, on a user terminal or the like, a notification screen in which the identifier of the object (here, the name “shovel car (small)”) and the estimated weight W1 are superimposed on the acquired image.
  • on the notification screen, the notification module 21 performs processing such as enclosing the specified object, highlighting it, or changing its color, to clarify which object has been specified.
  • the notification module 21 displays, on the user terminal or the like, a notification screen in which an enclosing line surrounding the shovel car (small) 100, the identifier of the object, and the weight are superimposed on the acquired image.
  • similarly, for the cabbage 110 and the person 120, the notification module 21 displays, on the user terminal or the like, a notification screen in which an enclosing line surrounding the cabbage 110, the identifier, and the weight are superimposed on the acquired image, and a notification screen in which an enclosing line surrounding the person 120, the identifier, and the weight are superimposed on the acquired image.
  • FIG. 4 is a diagram illustrating a flowchart of the second object detection process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the object designation data acquisition module 22 determines whether or not object designation data for designating an object to be detected has been acquired (Step S20).
  • the object designation data acquisition module 22 acquires, as object designation data, data that designates an object existing over a predetermined range, such as soil, earth and sand (caused by a disaster, excavation during construction, or the like), water, or a whole crop.
  • the user terminal or the like receives an input for designating an object to be detected, and transmits the accepted object to the computer 10 as object designation data.
  • the object designation data acquisition module 22 acquires the object designation data by receiving the object designation data transmitted by the user terminal.
  • the source from which the object designation data acquisition module 22 acquires the object designation data can be changed as appropriate. This process does not necessarily need to be executed; in that case, the computer 10 may execute the processing from step S21 described later.
  • in step S20, if the object designation data acquisition module 22 determines that the object designation data has not been acquired (step S20: NO), the process ends.
  • the computer 10 may execute the above-described first object detection processing.
  • in step S20, when the object designation data acquisition module 22 determines that the object designation data has been acquired (step S20: YES), the image data acquisition module 20 acquires image data (step S21).
  • the processing in step S21 is the same as the processing in step S10 described above.
  • the imaging device transmits image data to a user terminal (not shown) in addition to the computer 10.
  • the user terminal receives the image data, and uses an image included in the image data for a process described below.
  • the area designation data acquisition module 23 acquires area designation data for designating a predetermined area for the acquired image (Step S22).
  • the user terminal and the computer 10 receive the same image data.
  • the user terminal displays an image based on the image data on its own display unit, receives an input such as a tap operation from the user, and receives an input specifying a predetermined area for the image from the user.
  • the user terminal specifies the coordinates of the predetermined area in the image. For example, when the region is rectangular, the user terminal specifies the coordinates of each vertex and specifies the region surrounded by the rectangle as the predetermined region.
  • when the region is circular, the user terminal specifies the coordinates of the center and the radius from the center to the circumference, and specifies the region enclosed by the circle as the predetermined region.
  • the user terminal transmits the specified predetermined area to the computer 10 as area specification data.
  • the area designation data acquisition module 23 receives this area designation data to acquire area designation data for designating a predetermined area for an image.
  • the feature amount extraction module 40 analyzes the image based on the acquired image data, and extracts the feature amount of the image (step S23).
  • the processing in step S23 is the same as the processing in step S11 described above.
  • the object detection module 41 detects an object reflected in a predetermined area based on the extracted feature amount (Step S24).
  • the object detection module 41 specifies an area corresponding to the predetermined area based on the acquired area designation data. For example, when the predetermined area is a rectangle, the object detection module 41 specifies the coordinates of each vertex of the rectangle based on the area designation data, and identifies the rectangular area connecting these vertices as the designated predetermined area.
  • when the predetermined area is a circle, the object detection module 41 specifies the center coordinates and the radius of the circle based on the area designation data, and identifies the area enclosed by the circle as the designated predetermined area. The object detection module 41 detects an object appearing within the predetermined area.
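The two region tests described here (vertices of a rectangle; center and radius of a circle) reduce to simple containment checks. The function names and the detection records are hypothetical; the patent does not define this interface.

```python
def in_rectangle(point, top_left, bottom_right):
    """True if the point lies inside the rectangle given by two vertices."""
    (x, y), (x1, y1), (x2, y2) = point, top_left, bottom_right
    return x1 <= x <= x2 and y1 <= y <= y2

def in_circle(point, center, radius):
    """True if the point lies inside the circle given by center and radius."""
    (x, y), (cx, cy) = point, center
    return (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2

def detections_in_region(detections, contains):
    """Keep only the detections whose position satisfies the region test."""
    return [d for d in detections if contains(d["position"])]
```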
  • the method by which the object detection module 41 detects an object is the same as the processing in steps S12 and S13 described above.
  • the distance estimation module 42 estimates the distance between the imaging device and the object based on the location information of the imaging point included in the image data (Step S25).
  • the processing in step S25 is the same as the processing in step S14 described above.
  • the size estimation module 43 estimates the size (area) of the image of the object reflected in the detected predetermined area (step S26).
  • the process in step S26 is the same as the process in step S15 described above.
  • the size estimation module 44 estimates the size (volume) of the object based on the estimated distance and the estimated size (step S27).
  • the processing in step S27 is the same as the processing in step S16 described above.
  • the weight estimation module 45 refers to the weight table in which the identifier (name, model number, product number, etc.) of the object stored in the storage module 30 in advance, the size of the object, and the weight density are associated with each other, and The weight is estimated (step S28).
  • the process in step S28 is the same as the process in step S17 described above.
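The weight-table lookup of step S28 amounts to multiplying an estimated volume by the weight density associated with the object's identifier. A minimal sketch, with entirely hypothetical density values standing in for the D1, D3, and D5 of the description:

```python
# Hypothetical weight table: identifier -> weight density (kg per m^3).
# The numeric values are illustrative assumptions, not from the publication.
WEIGHT_TABLE = {
    "shovel car (small)": 900.0,   # stands in for D1
    "cabbage": 380.0,              # stands in for D3
    "earth and sand": 1600.0,      # stands in for D5
}

def estimate_weight(identifier, volume_m3):
    """Estimate weight (kg) as estimated volume times the density
    looked up by the object's identifier."""
    density = WEIGHT_TABLE[identifier]
    return volume_m3 * density
```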
  • the notification module 21 notifies the user terminal of the estimated weight (step S29).
  • the processing in step S29 is the same as the processing in step S18 described above.
  • the above is the second object detection processing.
  • FIGS. 12 and 13 are diagrams illustrating examples of a state in which a predetermined area designated based on the area designation data acquired by the area designation data acquisition module 23 is superimposed on the image acquired by the image data acquisition module 20.
  • Each image data includes position information of each shooting point in addition to the image.
  • FIG. 12 shows an image including earth and sand and a designated predetermined area superimposed on the image.
  • FIG. 13 shows an image of a crop and a specified predetermined area superimposed on the image.
  • the object designation data acquisition module 22 acquires the object designation data by the processing in step S20 described above.
  • the object designation data acquisition module 22 acquires “earth and sand” as object designation data in FIG. 12, and acquires “cabbage” as object designation data in FIG.
  • the image data acquisition module 20 acquires the image data by the processing in step S21 described above.
  • the area designation data acquisition module 23 acquires the area designation data by the processing in step S22 described above.
  • the feature amount extraction module 40 extracts the feature amount of the image by the process of step S23 described above.
  • the object detection module 41 detects an object reflected in the designated predetermined area based on the above-described feature amounts by the above-described processing in step S24.
  • in FIG. 12, the object detection module 41 detects “earth and sand” as the object and, because the designated predetermined area 200 exists, detects the earth and sand present within the predetermined area 200.
  • in FIG. 13, the object detection module 41 detects “cabbage” as the object and, because the designated predetermined area 210 exists, detects the cabbages present within the predetermined area 210.
  • the distance estimation module 42 estimates the distance between the imaging device and the object by the processing in step S25 described above.
  • the size estimation module 43 estimates the size (area) of the image of the object reflected in the predetermined area by the processing in step S26 described above.
  • the size estimation module 43 estimates the size of the earth and sand existing in the predetermined area 200 and the size of the cabbage existing in the predetermined area 210.
  • when heavy equipment is present in the predetermined area 200, the size estimation module 43 estimates the size of the earth and sand by adding, to the size of the visible earth and sand, the size of the portion occupied by the heavy equipment. This is effective when a large object such as a heavy machine hides part of the target object: the hidden portion is estimated as if the obstructing object were absent and the target object continued behind it.
  • of the cabbages present in the predetermined area 210, the size estimation module 43 excludes any cabbage whose head is not completely included in the predetermined area 210, and estimates the size of the cabbages in the predetermined area 210 based only on the cabbages entirely included in it. This is effective when only part of an object such as a crop lies within the predetermined area.
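The exclusion rule (count a cabbage only when it lies entirely inside the designated area) can be sketched as a filter over detections. The corner-point representation of each detection and the `inside_region` predicate are assumptions for illustration:

```python
def total_region_size(detections, inside_region):
    """Sum the image sizes of the objects lying entirely within the region.

    `detections` is a list of (corners, size) pairs, where `corners` holds
    the corner points of the object's bounding box; `inside_region` is a
    predicate on a single point. An object whose corners are not all inside
    the designated area is excluded, mirroring the rule that a cabbage only
    partially inside the area is not counted.
    """
    return sum(size for corners, size in detections
               if all(inside_region(p) for p in corners))
```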
  • the size estimation module 44 estimates the size (volume) of this object by the processing in step S27 described above. That is, the size estimation module 44 estimates the size of the earth and sand in the predetermined region 200 and estimates the size of the cabbage in the predetermined region 210.
  • the weight estimation module 45 estimates the weight of the object by the processing in step S28 described above. Since the identifier of the object is “earth and sand”, the weight estimation module 45 refers to the weight table and specifies the weight density D5 associated with “earth and sand”. The weight estimation module 45 estimates the weight W5 of the “earth and sand” existing in the predetermined area 200 based on the specified weight density D5 and the estimated size. Similarly, since the identifier of the object is “cabbage”, the weight estimation module 45 refers to the weight table and specifies the weight density D3 associated with “cabbage”. The weight estimation module 45 estimates the weight W6 of the “cabbage” existing in the predetermined area 210 based on the specified weight density D3 and the estimated size.
  • the notification module 21 notifies the estimated weight by the processing in step S29 described above.
  • the notification screen that the notification module 21 notifies the user terminal or the like will be described with reference to FIG.
  • FIG. 14 is a diagram illustrating an example of a notification screen by which the notification module 21 notifies a user terminal or the like of the weight of an object reflected in a predetermined area. Of the two objects described above, the earth and sand will be described as an example.
  • the notification module 21 notifies the weight of the cabbage in the same manner.
  • the notification module 21 causes the user terminal or the like to display, as a notification screen, a screen in which the identifier of the object (here, “earth and sand”), the estimated weight W5 kg, and the predetermined area 200 are superimposed on the acquired image.
  • similarly, for the cabbage, the notification module 21 causes the user terminal to display, as a notification screen, the predetermined area 210, the identifier of the object, and the weight superimposed on the acquired image.
  • on the notification screen, the notification module 21 applies processing such as highlighting or changing the color of the predetermined area, making clear where in the image the detected object was specified. The notification module 21 then causes the user terminal to display, as the notification screen, the predetermined area, the identifier of the object, and the weight superimposed on the acquired image.
  • FIG. 5 is a diagram illustrating a flowchart of the third object detection process executed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the image data acquisition module 20 acquires, as image data, an image photographed by a photographing device (not shown) and position information of the photographing device (step S30).
  • the processing in step S30 is the same as the processing in step S10 described above.
  • the feature amount extraction module 40 analyzes the image included in the image data based on the acquired image data, and extracts the feature amount of the image (step S31).
  • the process in step S31 is the same as the process in step S11 described above.
  • the object detection module 41 determines whether an object is included in this image based on the extracted feature amount (Step S32).
  • the processing in step S32 is the same as the processing in step S12 described above.
  • step S32 when the object detection module 41 determines that the object is not reflected in the image (step S32: NO), the computer 10 ends this processing.
  • when the object detection module 41 determines that the object is reflected in the image (step S32: YES), the object detection module 41 detects the object reflected in this image (step S33).
  • the processing in step S33 is the same as the processing in step S13 described above.
  • the object detection module 41 determines whether the entire object has been detected (step S34). In step S34, the object detection module 41 determines whether a part of the detected object lies at an edge of the image (for example, if the image is rectangular, whether the object touches any of its four sides). Further, the object detection module 41 determines whether the outline or shape of the detected object is interrupted partway.
  • the object detection module 41 may determine whether the entire object has been detected by a method other than the method described above.
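One simple way to realize the edge test of step S34 is to check whether the detected bounding box comes within a small margin of the image border. This is a hedged sketch of such a heuristic, not the publication's exact method; the bounding-box representation and the `margin` parameter are assumptions:

```python
def whole_object_detected(bbox, image_size, margin=1):
    """Heuristic for step S34: assume the object is truncated when its
    bounding box touches (comes within `margin` pixels of) any image edge.

    `bbox` is (x_min, y_min, x_max, y_max); `image_size` is (width, height).
    """
    x_min, y_min, x_max, y_max = bbox
    width, height = image_size
    return (x_min >= margin and y_min >= margin
            and x_max <= width - margin and y_max <= height - margin)
```

A production system would likely combine this with the contour-interruption check mentioned above rather than rely on the border test alone.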
  • step S34 if the object detection module 41 determines that the entire object has been detected (step S34: YES), the computer 10 ends this processing. In this case, the computer 10 may execute the above-described first object detection processing.
  • when the object detection module 41 determines that the entire object has not been detected (step S34: NO), the object data acquisition module 24 acquires object data, which is data relating to the image, size, and the like of the object corresponding to the identifier of the object stored in advance in an external computer or in the storage module 30 (step S35).
  • in step S35, the object data acquisition module 24 acquires the object data corresponding to the identifier of the detected object by referring to the external computer or to various tables stored in the storage module 30.
  • the object complementing module 46 complements a missing part in the detected object based on the acquired object data (step S36).
  • the object complementing module 46 compares the image in the acquired object data with the image of the object detected this time, and estimates the ratio.
  • the object complementing module 46 corrects by reducing or enlarging the image in the acquired object data based on the estimated ratio.
  • the object complementing module 46 compares the corrected image with the detected object.
  • the object complementing module 46 specifies a portion missing from the detected object in the corrected image.
  • the object complementing module 46 joins the specified portion to the detected image of the object, thereby complementing the missing part and reconstructing the entire object as a pseudo image.
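The complementing steps of step S36 (estimate a ratio, scale the reference image, find the missing portion, join it to the detection) can be sketched with pixel sets standing in for images. This representation and the width-based ratio estimate are simplifying assumptions, not the publication's implementation:

```python
def estimate_ratio(detected, reference):
    """Rough scale ratio between the detected object and the reference
    image, using horizontal extent as a simple proxy for the comparison."""
    width = lambda pts: max(x for x, _ in pts) - min(x for x, _ in pts)
    return width(detected) / width(reference)

def complement_object(detected, reference, ratio):
    """Scale the reference pixel set by `ratio`, then join the parts missing
    from the detection onto it, yielding a pseudo-complete object image."""
    scaled = {(round(x * ratio), round(y * ratio)) for x, y in reference}
    missing = scaled - detected   # the portion absent from the detection
    return detected | missing     # pseudo-complete object
```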
  • the distance estimation module 42 estimates the distance between the imaging device and the object based on the position information of the imaging point included in the image data (step S37).
  • the processing in step S37 is the same as the processing in step S14 described above.
  • the size estimation module 43 estimates the size (area) of the object shown in the complemented image based on the complemented object image and the acquired object data (step S38).
  • the processing in step S38 is the same as the processing in step S15 described above.
  • the size estimation module 44 estimates the size (volume) of the object based on the estimated distance from the shooting point to the object and the estimated size of the object (step S39).
  • the processing in step S39 is the same as the processing in step S16 described above.
  • the weight estimation module 45 estimates the weight of this object by referring to the weight table (step S40).
  • the processing in step S40 is the same as the processing in step S17 described above.
  • the notification module 21 notifies the estimated weight to the user terminal (step S41).
  • the processing in step S41 is the same as the processing in step S18 described above.
  • the above is the third object detection processing.
  • a method for estimating the weight of heavy equipment in the third object detection processing executed by the computer 10 will be described. It should be noted that the weight can be estimated for agricultural products and humans in the same manner.
  • FIG. 15 is a diagram illustrating an example of image data acquired by the image data acquisition module 20.
  • the image data includes, in addition to the image, positional information of the shooting location.
  • FIG. 15 is an image of a shovel car as a heavy machine.
  • the image data acquisition module 20 acquires the image data shown in FIG. 15 by the processing in step S30 described above.
  • the feature extraction module 40 extracts the feature from the image data by the above-described process of step S31.
  • the object detection module 41 detects an object appearing in this image by the processes in steps S32 and S33 described above.
  • the object detection module 41 detects the shovel car (small) 300 as an object based on the feature amount extracted from the image shown in FIG.
  • the object detection module 41 determines that part of the detected object is missing by the processing in step S34 described above, and the object data acquisition module 24 acquires the object data relating to the detected object by the processing in step S35 described above.
  • the object complementing module 46 complements a missing part in the detected object by the processing in step S36 described above.
  • FIG. 16 is a diagram showing the shovel car (small) 310 obtained when the object complementing module 46 complements the part missing from the shovel car (small) 300.
  • the object complementing module 46 produces the shovel car (small) 310 by adding the complemented portion 320 based on the object data.
  • the distance estimation module 42 estimates the distance between the imaging device and the object by the processing in step S37 described above.
  • the size estimation module 43 estimates the size (area) of the complemented object by the processing in step S38 described above. In other words, the size estimation module 43 estimates the size of the image of the shovel car (small) 310 after the complementation.
  • the size estimation module 44 estimates the size (volume) of the object by the processing in step S39 described above. That is, the size estimating module 44 estimates the size of the shovel car (small) 310 after the complementation.
  • the weight estimation module 45 estimates the weight of the object by the processing in step S40 described above. Since the identifier of the object is “shovel car (small)”, the weight estimation module 45 refers to the weight table and specifies the weight density D1 associated with “shovel car (small)”. The weight estimation module 45 estimates the weight W7 of the shovel car (small) 310 based on the specified weight density D1 and the estimated size.
  • the notification module 21 notifies the estimated weight by the processing in step S41 described above.
  • a notification screen that the notification module 21 notifies the user terminal or the like will be described with reference to FIG.
  • FIG. 17 is a diagram illustrating an example of a notification screen in which the notification module 21 notifies a user terminal or the like of the weight of an object.
  • the notification module 21 superimposes the identifier of the object (here, “shovel car (small)”, with the name serving as the identifier) and the estimated weight on the acquired image (the image before complementation), and causes the user terminal or the like to display the result as a notification screen.
  • the notification module 21 performs processing such as surrounding the specified object, highlighting, and changing the color as the notification screen to clarify which object has been specified.
  • the notification module 21 causes the user terminal or the like to display, as a notification screen, the enclosing line surrounding the shovel car (small) 300, the identifier of the object, and the weight superimposed on the acquired image.
  • the notification module 21 displays a similar notification screen on the user terminal for other objects such as crops and people.
  • FIG. 6 is a diagram illustrating a flowchart of the learning process performed by the computer 10. The processing executed by each module described above will be described together with this processing.
  • the actual weight data acquisition module 25 acquires actual weight data indicating the actual weight of an object whose weight was estimated by the first, second, or third object detection process described above (step S50).
  • the terminal device receives input of, or otherwise obtains, the result of actually measuring the weight of the object, and transmits data relating to the identifier, the image, and the actual weight of the object to the computer 10 as actual weight data.
  • the actual weight data acquisition module 25 acquires the actual weight of the object whose weight was estimated by receiving this actual weight data.
  • the learning module 47 learns the correlation between the acquired actual weight of the object and the detected image of the object (step S51). In step S51, the learning module 47 learns, as the correlation between the actual weight and the image, the correlation with at least one of the object's density, name, size, and distance to the object.
  • the storage module 30 stores the learning result (Step S52).
  • in the processing of steps S17, S28, and S40 described above, the weight estimation module 45 estimates the weight of the object in consideration of this learning result. That is, when estimating the weight of the object, the weight estimation module 45 refers to the weight table and applies a correction based on the learned correlation.
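One plausible reading of how the learning result corrects the weight table is to replace each stored density with the mean measured density (actual weight divided by estimated volume) over the collected records. This is an illustrative assumption, not the publication's stated algorithm:

```python
from collections import defaultdict

def learn_density_correction(records, table):
    """Update the weight table from measured weights.

    `records` holds (identifier, estimated_volume, actual_weight) triples.
    For each identifier the stored density is replaced by the mean measured
    density (actual weight / estimated volume), so that later estimates
    from the table reflect the learned correlation.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for ident, volume, actual in records:
        sums[ident][0] += actual / volume   # accumulate measured densities
        sums[ident][1] += 1
    for ident, (total, count) in sums.items():
        table[ident] = total / count        # mean measured density
    return table
```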
  • the means and functions described above are implemented when a computer (including a CPU, an information processing device, and various terminals) reads and executes a predetermined program.
  • the program is provided, for example, from a computer via a network (SaaS: Software as a Service).
  • the program is provided in a form stored in a computer-readable storage medium such as a flexible disk, a CD (eg, a CD-ROM), and a DVD (eg, a DVD-ROM, a DVD-RAM).
  • the computer reads the program from the storage medium, transfers the program to an internal storage device or an external storage device, stores and executes the program.
  • the program may be stored in a storage device (storage medium) such as a magnetic disk, an optical disk, or a magneto-optical disk in advance, and may be provided to the computer from the storage device via a communication line.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The problem addressed by the present invention is to provide a computer system, an object detection method, and a program that facilitate accurate estimation of the weight of an object. The solution is a computer system that: acquires a captured image; extracts a feature value of the image; detects an object; and estimates the weight of the detected object from the size shown in the image of the object. The computer system estimates the size of the object from the size shown in the image of the object and, using the estimated size and referring to the weight density of the detected object, estimates the weight of the object. The computer system estimates the weight of the object by learning a correlation between the estimated actual weight of the object and the detected object image (density, name, size, and/or distance to the object).
PCT/JP2018/032207 2018-08-30 2018-08-30 Système d'ordinateur, procédé de détection d'objet et programme WO2020044510A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2018/032207 WO2020044510A1 (fr) 2018-08-30 2018-08-30 Système d'ordinateur, procédé de détection d'objet et programme
JP2020539962A JP7068746B2 (ja) 2018-08-30 2018-08-30 コンピュータシステム、物体検知方法及びプログラム

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/032207 WO2020044510A1 (fr) 2018-08-30 2018-08-30 Système d'ordinateur, procédé de détection d'objet et programme

Publications (1)

Publication Number Publication Date
WO2020044510A1 true WO2020044510A1 (fr) 2020-03-05

Family

ID=69644014

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/032207 WO2020044510A1 (fr) 2018-08-30 2018-08-30 Système d'ordinateur, procédé de détection d'objet et programme

Country Status (2)

Country Link
JP (1) JP7068746B2 (fr)
WO (1) WO2020044510A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022180864A1 (fr) * 2021-02-26 2022-09-01 日本電気株式会社 Procédé, dispositif et système d'estimation de poids
WO2023074818A1 (fr) * 2021-10-27 2023-05-04 株式会社安川電機 Système de pesage, système de commande de support, procédé de pesage et programme de pesage
WO2023189216A1 (fr) * 2022-03-31 2023-10-05 日立建機株式会社 Système d'aide au travail

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018027581A (ja) * 2016-08-17 2018-02-22 株式会社安川電機 ピッキングシステム
JP2018124962A (ja) * 2017-01-27 2018-08-09 パナソニックIpマネジメント株式会社 情報処理装置および情報処理方法

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6724499B2 (ja) * 2016-04-05 2020-07-15 株式会社リコー 物体把持装置及び把持制御プログラム
JP2017220876A (ja) * 2016-06-10 2017-12-14 アイシン精機株式会社 周辺監視装置
JP2018036770A (ja) * 2016-08-30 2018-03-08 富士通株式会社 位置姿勢推定装置、位置姿勢推定方法、及び位置姿勢推定プログラム

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018027581A (ja) * 2016-08-17 2018-02-22 株式会社安川電機 ピッキングシステム
JP2018124962A (ja) * 2017-01-27 2018-08-09 パナソニックIpマネジメント株式会社 情報処理装置および情報処理方法

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022180864A1 (fr) * 2021-02-26 2022-09-01 日本電気株式会社 Procédé, dispositif et système d'estimation de poids
WO2023074818A1 (fr) * 2021-10-27 2023-05-04 株式会社安川電機 Système de pesage, système de commande de support, procédé de pesage et programme de pesage
WO2023189216A1 (fr) * 2022-03-31 2023-10-05 日立建機株式会社 Système d'aide au travail

Also Published As

Publication number Publication date
JPWO2020044510A1 (ja) 2021-08-26
JP7068746B2 (ja) 2022-05-17

Similar Documents

Publication Publication Date Title
WO2020044510A1 (fr) Système d'ordinateur, procédé de détection d'objet et programme
US8872851B2 (en) Augmenting image data based on related 3D point cloud data
US20170068840A1 (en) Predicting accuracy of object recognition in a stitched image
CN112700552A (zh) 三维物体检测方法、装置、电子设备及介质
KR101510206B1 (ko) 항공 하이퍼스펙트럴 영상을 이용한 수치지도 수정 도화용 도시변화지역의 탐지 방법
GB2554111A (en) Image processing apparatus, imaging apparatus, and image processing method
CN113052907B (zh) 一种动态环境移动机器人的定位方法
EP3214604A1 (fr) Procédé d'estimation d'orientation et dispositif d'estimation d'orientation
CN112949375A (zh) 计算***、计算方法及存储介质
JP2012226645A (ja) 画像処理装置および方法、記録媒体並びにプログラム
US20240104769A1 (en) Information processing apparatus, control method, and non-transitory storage medium
JP2014048131A (ja) 画像処理装置、方法及びプログラム
Shi et al. A method for detecting pedestrian height and distance based on monocular vision technology
EP2791865A1 (fr) Système et procédé d'évaluation des dimensions d'une cible
CN109035686B (zh) 一种预防丢失的报警方法及装置
CN108805004B (zh) 功能区域检测方法和装置、电子设备、存储介质
JP6831396B2 (ja) 映像監視装置
WO2020157879A1 (fr) Système informatique, procédé d'aide à la croissance de plantes et programme
US11967108B2 (en) Computer-readable recording medium storing position identification program, position identification method, and information processing apparatus
US20230137094A1 (en) Measurement device, measurement system, measurement method, and computer program product
CN113658313B (zh) 人脸模型的渲染方法、装置及电子设备
Diamantatos et al. Android based electronic travel aid system for blind people
KR20120138459A (ko) 모션인식 기능을 가지는 이미지프로세싱에 의한 자동화재인식 시스템
JP2009295062A (ja) 画像処理装置及び画像処理方法及び画像処理プログラム
WO2022130849A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et support non transitoire lisible par ordinateur

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18932047

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020539962

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18932047

Country of ref document: EP

Kind code of ref document: A1