US20190139245A1 - Image processing system and image processing method

Info

Publication number
US20190139245A1
Authority
US
United States
Prior art keywords
article
processor
basic shape
camera
input
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/182,468
Other languages
English (en)
Inventor
Masaaki Yasunaga
Norimasa Ariga
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba TEC Corp
Original Assignee
Toshiba TEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba TEC Corp filed Critical Toshiba TEC Corp
Assigned to TOSHIBA TEC KABUSHIKI KAISHA (assignment of assignors interest). Assignors: ARIGA, NORIMASA; YASUNAGA, MASAAKI
Publication of US20190139245A1
Priority to US17/081,531 (published as US11328438B2)
Legal status: Abandoned

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/66Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N5/2256
    • H04N5/23203
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • G06T2207/10012Stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums

Definitions

  • Embodiments described herein relate generally to an image processing system and an image processing method.
  • An image processing system for executing such a technique acquires an image of an article and generates dictionary information in advance.
  • However, the number of images necessary for generating the dictionary information and the imaging angle differ for each article. A system that ignores these differences therefore captures unnecessary images.
  • FIG. 1 is a block diagram illustrating a configuration example of an image processing system according to a first embodiment.
  • FIG. 2 is a block diagram illustrating a configuration example of a control device according to the first embodiment.
  • FIG. 3 is a diagram illustrating an example of a parameter table according to the first embodiment.
  • FIG. 4 is a diagram illustrating a display example of an input/output device according to the first embodiment.
  • FIG. 5 is a flowchart illustrating an operation example of a control device according to the first embodiment.
  • FIG. 6 is a block diagram illustrating a configuration example of an image processing system according to a second embodiment.
  • FIG. 7 is a block diagram illustrating a configuration example of a control device according to a second embodiment.
  • FIG. 8 is a flowchart illustrating an operation example of the control device according to the second embodiment.
  • an image processing system and an image processing method capable of properly capturing an image for generating dictionary information are provided.
  • an image processing system includes a camera interface and a processor.
  • the camera interface acquires a captured image from a camera that images an article.
  • the processor acquires a basic shape of the article and acquires an image captured with imaging parameters corresponding to the basic shape from the camera through the camera interface.
  • the image processing system images an article in order to generate dictionary information.
  • an article is set at a predetermined position by an operator or the like.
  • the image processing system images the set article and acquires an image of the article.
  • the dictionary information is information used for object recognition. That is, the dictionary information is information for specifying an object by matching the image of the object therewith.
  • the dictionary information may be an image of an article or information indicating a feature amount of an image of an article.
  • FIG. 1 illustrates a configuration example of an image processing system 1 .
  • the image processing system 1 includes a control device 10 , an input/output device 20 , an imaging device 30 , and the like.
  • the control device 10 is communicably connected to the input/output device 20 and the imaging device 30 .
  • the image processing system 1 images an article A.
  • the control device 10 controls the entire image processing system 1 .
  • the control device 10 images the article A by using the imaging device 30 based on an instruction from the operator or the like. For example, the control device 10 receives an instruction input from the operator through the input/output device 20 . Further, the control device 10 displays various information to the operator through the input/output device 20 .
  • the input/output device 20 is an interface for receiving an instruction input from the operator and displaying various information to the operator.
  • the input/output device 20 is constituted with an operation unit that receives an instruction input and a display unit that displays information.
  • the input/output device 20 transmits a signal indicating an operation received from the operator to the control device 10 as the operation of the operation unit.
  • the operation unit has a touch panel.
  • the input/output device 20 displays various information as the operation of the display unit under the control of the control device 10 .
  • the display unit is constituted with a liquid crystal monitor.
  • the display unit is integrally formed with a touch panel as the operation unit.
  • the operation unit may be constituted with a keyboard or a numeric keypad.
  • the imaging device 30 images the article A under the control from the control device 10 .
  • the imaging device 30 images the article A from various angles.
  • the imaging device 30 is constituted with a housing 31 , a turntable 32 , cameras 41 to 43 , lighting units 51 to 53 , and the like.
  • the imaging device 30 may further have a configuration according to necessity, or a specific configuration may be excluded from the imaging device 30 .
  • the housing 31 is a frame that forms the outer shape of the imaging device 30 .
  • the housing 31 is made of resin or the like, for example.
  • the housing 31 is formed in a rectangular parallelepiped shape.
  • the housing 31 is provided with a partition 31 a in the middle part in a vertical direction.
  • the partition 31 a is formed horizontally.
  • the partition 31 a is formed in a rectangular shape according to the shape of the housing 31 .
  • the partition 31 a is formed of a transparent material.
  • the partition 31 a is an acrylic plate or the like.
  • the turntable 32 , the cameras 41 to 43 , and the lighting units 51 to 53 are formed inside the housing 31 .
  • the turntable 32 is formed on the partition 31 a.
  • the turntable 32 rotates the article A placed on the top side under the control from the control device 10 .
  • the turntable 32 is constituted by a disk 32 a on which the article A is placed and a drive unit that rotates the disk.
  • the disk 32 a is formed of a transparent material.
  • the disk 32 a is an acrylic plate or the like.
  • the cameras 41 to 43 image the article A under the control from the control device 10 .
  • the cameras 41 to 43 transmit captured images obtained by imaging the article A to the control device 10 .
  • the cameras 41 to 43 are charge coupled device (CCD) cameras and the like.
  • the camera 41 (a first camera) images the article A from a predetermined direction.
  • the camera 41 images the top side of the article A.
  • the camera 41 is installed downward on the top side of the housing 31 .
  • the camera 41 images the article A from above.
  • the camera 42 (a second camera) images the article A from a direction orthogonal to the predetermined direction.
  • the camera 42 images the side of the article A.
  • the camera 42 is installed horizontally on the side of the housing 31 .
  • the camera 42 images the article A from the side part.
  • the camera 43 (a third camera) images the article A in a direction opposite to the predetermined direction.
  • the camera 43 images the bottom side of the article A.
  • the camera 43 is installed upward on the bottom side of the housing 31 .
  • the camera 43 images the article A from the bottom part.
  • the lighting units 51 to 53 illuminate the article A under the control from the control device 10 .
  • the lighting units 51 to 53 are constituted with light emitting diodes (LED) or the like.
  • the lighting unit 52 illuminates an area to be imaged by the camera 42 . That is, the lighting unit 52 illuminates the side of the article A.
  • the lighting unit 52 is installed laterally on the side of the housing 31 .
  • the lighting unit 52 illuminates the article A from the side part.
  • the lighting unit 53 illuminates an area to be imaged by the camera 43 . That is, the lighting unit 53 illuminates the bottom side of the article A.
  • the lighting unit 53 is installed upward on the bottom side of the housing 31 .
  • the lighting unit 53 illuminates the article A from the bottom.
  • Next, a configuration example of the control device 10 will be described.
  • FIG. 2 is a block diagram showing a configuration example of the control device 10 .
  • the control device 10 includes a processor 11 , a ROM 12 , a RAM 13 , an NVM 14 , a table interface 15 , a camera interface 16 , a lighting interface 17 , an input/output interface 18 , and the like.
  • the processor 11 , the ROM 12 , the RAM 13 , the NVM 14 , the table interface 15 , the camera interface 16 , the lighting interface 17 , and the input/output interface 18 are connected to each other via a data bus or the like.
  • the control device 10 may further have a configuration according to the particular application, or a specific configuration may be excluded from the control device 10 .
  • the processor 11 has a function of controlling the overall operation of the control device 10 .
  • the processor 11 may include an internal cache and various interfaces, and the like.
  • the processor 11 realizes various processes by executing programs stored in the internal memory, the ROM 12 or the NVM 14 in advance.
  • Some of the various functions realized by the processor 11 executing the programs may be realized by a hardware circuit.
  • the processor 11 controls functions executed by the hardware circuit.
  • the ROM 12 is a nonvolatile memory in which a control program, control data, and the like are stored in advance.
  • the control program and the control data stored in the ROM 12 are incorporated in advance according to the specification of the control device 10 .
  • the ROM 12 stores, for example, a program (for example, BIOS) for controlling the circuit board of the control device 10 .
  • the RAM 13 is a volatile memory.
  • the RAM 13 temporarily stores data under processing of the processor 11 and the like.
  • the RAM 13 stores various application programs based on instructions from the processor 11 . Further, the RAM 13 may store data necessary for executing the application programs, execution results of the application programs, and the like.
  • the NVM 14 is a nonvolatile memory capable of writing and rewriting data.
  • the NVM 14 is constituted with, for example, a hard disk drive (HDD), a solid state drive (SSD), an EEPROM (registered trademark) or a flash memory.
  • the NVM 14 stores control programs, applications, various data, and the like according to the operational application of the control device 10 .
  • the table interface 15 is an interface for transmitting and receiving data to and from the turntable 32 .
  • the table interface 15 transmits a signal for rotating the disk 32 a under the control of the processor 11 .
  • the table interface 15 may receive a signal indicating the angle of the disk 32 a from the turntable 32 .
  • the table interface 15 may support a USB connection.
  • the camera interface 16 is an interface for transmitting and receiving data to and from the cameras 41 to 43 .
  • the camera interface 16 transmits a signal for instructing imaging to the cameras 41 to 43 under the control of the processor 11 .
  • the camera interface 16 acquires captured images obtained by imaging from the cameras 41 to 43 .
  • the camera interface 16 may support a USB connection.
  • the lighting interface 17 is an interface for transmitting and receiving data to and from the lighting units 51 to 53 .
  • the lighting interface 17 transmits a signal for instructing lighting to the lighting units 51 to 53 under the control of the processor 11 .
  • the lighting interface 17 may support a USB connection.
  • the parameter table shows imaging parameters related to the imaging of the article for each basic shape of the article.
  • the basic shape is a category of the outline of the article. That is, the basic shape is an approximate shape of the outer shape of the article.
  • the basic shape is a cube, a rectangular parallelepiped, a polygonal prism, a cylinder, a cone, a sphere, a polygonal cone or a plane.
  • the content of the basic shape is not limited to a specific configuration.
  • the imaging parameters are parameters for capturing an image necessary for generating dictionary information of the article.
  • FIG. 3 shows an example of a parameter table. As shown in FIG. 3 , the parameter table stores “Basic Shape” and “Imaging Parameter” in association with each other.
  • “Number of Captured Images” indicates the number of images to be captured.
  • Lighting Position indicates a lighting unit to be turned on when the article is imaged.
  • the “Lighting Position” may indicate a lighting unit to be turned on at each imaging angle.
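The parameter table described above can be sketched as a simple lookup keyed on the basic shape, with a predetermined fallback for the unknown case. Note that the shape names and all numeric values below are illustrative assumptions; the patent does not publish concrete parameter values.

```python
# Sketch of the parameter table of FIG. 3: each basic shape maps to the
# imaging parameters used when capturing dictionary images. All concrete
# values here are illustrative assumptions, not taken from the patent.
PARAMETER_TABLE = {
    "cube":     {"num_images": 6, "angles_deg": [0, 90, 180, 270], "lighting": ["top", "side", "bottom"]},
    "cylinder": {"num_images": 4, "angles_deg": [0, 90],           "lighting": ["top", "side"]},
}

# Predetermined fallback parameters used when the basic shape is unknown
# (the case entered via icon 63 in FIG. 4).
DEFAULT_PARAMETERS = {"num_images": 12,
                      "angles_deg": list(range(0, 360, 30)),
                      "lighting": ["top", "side", "bottom"]}

def get_imaging_parameters(basic_shape):
    """Return the imaging parameters for a basic shape, or the
    predetermined defaults when the shape is unknown or not in the table."""
    return PARAMETER_TABLE.get(basic_shape, DEFAULT_PARAMETERS)
```

The unknown-shape fallback mirrors ACT 14 of the flowchart: rather than failing, the system images the article with parameters that cover every angle.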
  • Next, the functions realized by the control device 10 will be described.
  • the functions realized by the control device 10 are realized by the processor 11 executing a program stored in the NVM 14 or the like.
  • the processor 11 has a function of receiving an input of the basic shape of the article A set on the turntable 32 .
  • the processor 11 receives an input of the basic shape of the article A through the input/output device 20 .
  • the processor 11 sets the input basic shape.
  • FIG. 4 shows an example of a screen to be displayed when the input/output device 20 receives an input of a basic shape. As shown in FIG. 4 , the input/output device 20 displays icons 61 to 63 .
  • the icon 61 receives an input of “cube” as a basic shape.
  • the icon 62 receives an input of “cylinder” as a basic shape.
  • the icon 63 receives an input that the basic shape is unknown.
  • the input/output device 20 may display an icon for receiving an input of another basic shape.
  • When detecting the tap on the icon 61 or 62 , the processor 11 acquires the basic shape corresponding to the tapped icon 61 or 62 . Further, when detecting the tap on the icon 63 , the processor 11 recognizes that the basic shape is unknown.
  • the processor 11 has a function of acquiring imaging parameters corresponding to the basic shape of the article A.
  • the processor 11 refers to the parameter table and acquires imaging parameters corresponding to the basic shape.
  • the processor 11 sets the acquired imaging parameters.
  • the processor 11 may acquire predetermined imaging parameters.
  • the processor 11 has a function of imaging the article A according to the acquired imaging parameters.
  • the processor 11 uses the cameras 41 to 43 , the lighting units 51 to 53 , and the turntable 32 to image the article A.
  • the processor 11 images the “front side/back side/top side/bottom side/left side/right side” of the article A.
  • the processor 11 turns on the lighting units 51 to 53 corresponding to the respective cameras 41 to 43 .
  • the processor 11 turns on the lighting unit 51 .
  • the processor 11 images the “top side” of the article A by using the camera 41 .
  • the processor 11 turns on the lighting unit 53 .
  • the processor 11 uses the camera 43 to image the “bottom side” of the article A.
  • When imaging the “bottom side” of the article A, the processor 11 directs the article A in a predetermined direction by using the turntable 32 . When the article A is directed in the predetermined direction, the processor 11 turns on the lighting unit 52 . When the lighting unit 52 is turned on, the processor 11 images a predetermined side (front side, back side, left side or right side) of the article A by using the camera 42 .
  • the processor 11 uses the turntable 32 to direct the article A in the predetermined direction.
  • the processor 11 uses the camera 42 to image the other side (front side, back side, left side or right side) of the article A.
  • the processor 11 repeats the above operation to image the front side, back side, left side, and right side of the article A.
  • the processor 11 may simultaneously image the top side, the bottom side, and one side of the article A.
  • the processor 11 may store the captured image in the NVM 14 . Further, the processor 11 may store the captured image in an external memory. In addition, the processor 11 may transmit the captured image to the external device.
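The six-side capture sequence above can be sketched as follows. The patent does not define a device API, so the `StubDevice` class and its `.on()` / `.rotate_to()` / `.capture()` methods are purely illustrative stand-ins for the cameras, lighting units, and turntable.

```python
class StubDevice:
    """Minimal stand-in for a camera, lighting unit, or turntable.
    The real hardware interface is not specified in the patent."""
    def on(self):
        self.lit = True
    def rotate_to(self, angle_deg):
        self.angle = angle_deg
    def capture(self):
        return "image@%d" % getattr(self, "angle", 0)

def capture_all_sides(cam_top, cam_side, cam_bottom,
                      light_top, light_side, light_bottom, turntable):
    """Capture the six sides in the order described above: top
    (camera 41 / lighting 51), bottom (camera 43 / lighting 53), then the
    four lateral sides (camera 42 / lighting 52), rotating the turntable
    90 degrees between lateral shots."""
    images = {}
    light_top.on()
    images["top"] = cam_top.capture()
    light_bottom.on()
    images["bottom"] = cam_bottom.capture()
    light_side.on()
    for angle, side in [(0, "front"), (90, "right"), (180, "back"), (270, "left")]:
        turntable.rotate_to(angle)        # direct the article toward camera 42
        images[side] = cam_side.capture()
    return images
```

Because each fixed camera covers one direction, the operator never has to flip or rotate the article by hand; only the turntable moves.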
  • Next, an operation example of the control device 10 will be described.
  • FIG. 5 is a flowchart for describing an operation example of the control device 10 .
  • the article A is set on the turntable 32 .
  • the processor 11 of the control device 10 receives the input of the basic shape of the article A through the input/output device 20 (ACT 11 ).
  • the processor 11 determines whether the basic shape of the article A has been acquired (ACT 12 ).
  • When the basic shape has been acquired (ACT 12 , YES), the processor 11 refers to the parameter table and acquires imaging parameters corresponding to the acquired basic shape (ACT 13 ).
  • When the basic shape has not been acquired (ACT 12 , NO), the processor 11 acquires the predetermined imaging parameters (ACT 14 ).
  • the processor 11 images the article A based on the acquired imaging parameters (ACT 15 ).
  • the processor 11 ends the operation.
  • the processor 11 may end the operation.
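The ACT 11 to ACT 15 flow of FIG. 5 reduces to a short branch. The callables below are hypothetical stand-ins for the input/output and imaging routines; only the control structure comes from the flowchart.

```python
def run_capture(read_basic_shape, parameter_table, default_parameters, image_article):
    """Sketch of the FIG. 5 flow: receive the basic shape (ACT 11),
    branch on whether it was acquired (ACT 12), select parameters from
    the table (ACT 13) or the predetermined ones (ACT 14), then image
    the article (ACT 15)."""
    shape = read_basic_shape()              # ACT 11: input via input/output device 20
    if shape in parameter_table:            # ACT 12: basic shape acquired?
        params = parameter_table[shape]     # ACT 13: table lookup
    else:
        params = default_parameters         # ACT 14: predetermined parameters
    return image_article(params)            # ACT 15: image the article
```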
  • the imaging device 30 may be provided with a movable camera.
  • the processor 11 moves the camera according to the imaging parameters to image the article A at a predetermined angle.
  • the processor 11 may acquire imaging parameters from an external device.
  • the processor 11 may set a plurality of basic shapes for the article. For example, the processor 11 may set the basic shape according to the part of the article. Further, the processor 11 may set imaging parameters corresponding to each part in order to image each part of the article. That is, the processor 11 captures images with different imaging parameters for each part of the article.
  • the image processing system configured as described above acquires the basic shape of an article for which dictionary information is to be generated.
  • the image processing system sets imaging parameters necessary for generating dictionary information based on the basic shape.
  • the image processing system images the article according to the set imaging parameters.
  • the image processing system may avoid capturing of unnecessary images for generating dictionary information. Therefore, the image processing system may properly image the article.
  • the image processing system includes cameras that image an article from the upper part, the side part, and the lower part.
  • the image processing system may capture necessary images without requiring the operator to rotate the article or the like.
  • the image processing system according to the second embodiment is different from the image processing system 1 according to the first embodiment in that the processor 11 determines the basic shape of an article. The same reference numerals are given to the common components, and their detailed description is omitted.
  • FIG. 6 shows a configuration example of an image processing system 1 ′ according to the second embodiment.
  • the image processing system 1 ′ includes a control device 10 ′, an input/output device 20 , and an imaging device 30 ′.
  • the imaging device 30 ′ includes a distance sensor 71 .
  • the distance sensor 71 measures the distance based on the reflected light of the light (visible light or invisible light) radiated from a light source.
  • the distance sensor 71 may perform a time-of-flight (ToF) method for measuring a distance to a measurement target based on the time until the radiated light is reflected by the measurement target and reaches the distance sensor 71 .
  • the configuration of the distance sensor 71 is not limited to a specific configuration.
  • Next, a configuration example of the control device 10 ′ will be described.
  • FIG. 7 is a block diagram showing a configuration example of the control device 10 ′.
  • the control device 10 ′ includes a distance sensor interface 19 .
  • the processor 11 and the distance sensor interface 19 are connected to each other via a data bus or the like.
  • the distance sensor interface 19 is an interface for transmitting and receiving data to and from the distance sensor 71 .
  • the distance sensor interface 19 transmits a signal for instructing the distance sensor 71 to measure the distance, under the control of the processor 11 .
  • the distance sensor interface 19 acquires a signal indicating the measured distance from the distance sensor 71 .
  • the distance sensor interface 19 may support a USB connection.
  • the processor 11 images the article A by using the cameras 41 to 43 .
  • the processor 11 determines the basic shape based on the captured image. For example, the processor 11 extracts the edges of each image and determines the shape of the top side, the side, the bottom side, and the like of the article A.
  • the processor 11 determines the basic shape of the article A based on each determined shape.
  • the processor 11 determines the basic shape of the article A based on the distance measured by using the distance sensor 71 and the images captured by using the cameras 41 to 43 .
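Combining the per-view outlines into a basic shape can be sketched as below. The patent only states that edges are extracted per view and the resulting shapes combined; this particular outline-to-shape mapping (circle plus rectangle gives cylinder, and so on) is an illustrative assumption, not the patented determination method.

```python
def classify_basic_shape(top_outline, side_outline):
    """Very rough sketch of combining the top-view and side-view outline
    shapes into one basic shape. The mapping below is an illustrative
    assumption; the patent does not specify the combination rule."""
    if top_outline == "circle" and side_outline == "circle":
        return "sphere"
    if top_outline == "circle" and side_outline == "rectangle":
        return "cylinder"
    if top_outline == "square" and side_outline == "square":
        return "cube"
    if "rectangle" in (top_outline, side_outline):
        return "rectangular parallelepiped"
    return None  # determination failed; fall back to predetermined parameters
```

Returning `None` on failure feeds naturally into the ACT 24 branch of FIG. 8, where an undetermined shape leads to the predetermined imaging parameters.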
  • the processor 11 has a function of acquiring imaging parameters corresponding to the basic shape of the article A.
  • the processor 11 may acquire predetermined imaging parameters.
  • FIG. 8 is a flowchart for describing an operation example of the control device 10 ′. Here, it is assumed that the article A is set on the turntable 32 .
  • the processor 11 of the control device 10 ′ uses the cameras 41 to 43 to image the article A (ACT 21 ).
  • the processor 11 measures the distance by using the distance sensor 71 (ACT 22 ).
  • the processor 11 determines the basic shape of the article A based on the captured image and the measured distance (ACT 23 ).
  • When the determination of the basic shape of the article A is successful (ACT 24 , YES), the processor 11 refers to the parameter table and acquires imaging parameters corresponding to the determined basic shape (ACT 25 ).
  • the processor 11 images the article A based on the acquired imaging parameters (ACT 27 ).
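The FIG. 8 flow differs from FIG. 5 only in how the basic shape is obtained, which the following sketch makes explicit. All callables are hypothetical stand-ins, and the unsuccessful-determination branch (falling back to predetermined parameters, paralleling ACT 14 of the first embodiment) is inferred from the first embodiment rather than stated for FIG. 8.

```python
def run_capture_auto(capture_views, measure_distance, determine_shape,
                     parameter_table, default_parameters, image_article):
    """Sketch of the FIG. 8 flow: image the article (ACT 21), measure the
    distance (ACT 22), determine the basic shape (ACT 23), branch on
    success (ACT 24), select the corresponding parameters (ACT 25) or
    fall back to predetermined ones, then image (ACT 27)."""
    views = capture_views()                             # ACT 21
    distance = measure_distance()                       # ACT 22
    shape = determine_shape(views, distance)            # ACT 23
    if shape is not None and shape in parameter_table:  # ACT 24
        params = parameter_table[shape]                 # ACT 25
    else:
        params = default_parameters                     # NO branch (assumed fallback)
    return image_article(params)                        # ACT 27
```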
  • the imaging device 30 ′ may include a plurality of distance sensors. Each distance sensor measures distances from different reference points or reference planes.
  • the processor 11 may determine the basic shape of the article based on the distance measured by each distance sensor.
  • the image processing system configured as described above determines the basic shape of the set article.
  • the image processing system sets imaging parameters according to the determined basic shape.
  • the image processing system images the article according to the set imaging parameters.
  • the image processing system may set appropriate imaging parameters even if the operator does not input the basic shape.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Image Input (AREA)
US16/182,468 2017-11-07 2018-11-06 Image processing system and image processing method Abandoned US20190139245A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/081,531 US11328438B2 (en) 2017-11-07 2020-10-27 Image processing system and image processing method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-214553 2017-11-07
JP2017214553A JP2019087008A (ja) 2017-11-07 2017-11-07 画像処理システム及び画像処理方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/081,531 Continuation US11328438B2 (en) 2017-11-07 2020-10-27 Image processing system and image processing method

Publications (1)

Publication Number Publication Date
US20190139245A1 true US20190139245A1 (en) 2019-05-09

Family

ID=64172407

Family Applications (2)

Application Number Title Priority Date Filing Date
US16/182,468 Abandoned US20190139245A1 (en) 2017-11-07 2018-11-06 Image processing system and image processing method
US17/081,531 Active US11328438B2 (en) 2017-11-07 2020-10-27 Image processing system and image processing method

Family Applications After (1)

Application Number Title Priority Date Filing Date
US17/081,531 Active US11328438B2 (en) 2017-11-07 2020-10-27 Image processing system and image processing method

Country Status (4)

Country Link
US (2) US20190139245A1 (ja)
EP (1) EP3480733A1 (ja)
JP (2) JP2019087008A (ja)
CN (1) CN109756670A (ja)

Family Cites Families (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01224651A (ja) * 1988-03-04 1989-09-07 Mitsubishi Heavy Ind Ltd びん内容液中の異物検出装置
JPH04106672A (ja) * 1990-08-28 1992-04-08 Hitachi Ltd 物体の照明条件推定方法,物体の3次元情報生成方法,物体の形状と質感の生成方法,物体の形状と質感の生成装置
JP3311830B2 (ja) * 1993-09-20 2002-08-05 株式会社東芝 3次元動画作成装置
US6407817B1 (en) 1993-12-20 2002-06-18 Minolta Co., Ltd. Measuring system with improved method of reading image data of an object
JP4066488B2 (ja) 1998-01-22 2008-03-26 ソニー株式会社 画像データ生成装置及び画像データ生成方法
JP3052926B2 (ja) 1998-02-27 2000-06-19 日本電気株式会社 三次元座標計測装置
KR100382439B1 (ko) 1998-05-25 2003-05-09 마쯔시다덴기산교 가부시키가이샤 레인지파인더 장치와 카메라
JP2002077707A (ja) * 2000-08-30 2002-03-15 Minolta Co Ltd 撮像装置および撮像方法
JP2002150315A (ja) 2000-11-09 2002-05-24 Minolta Co Ltd 画像処理装置および記録媒体
JP2003042732A (ja) * 2001-08-02 2003-02-13 Topcon Corp 表面形状測定装置及びその方法、表面形状測定プログラム、並びに表面状態図化装置
JP4877891B2 (ja) * 2001-08-03 2012-02-15 株式会社トプコン 校正用被写体
JP4573085B2 (ja) * 2001-08-10 2010-11-04 日本電気株式会社 位置姿勢認識装置とその位置姿勢認識方法、及び位置姿勢認識プログラム
JP2003167927A (ja) * 2001-12-03 2003-06-13 Sharp Corp 要素分割装置、要素分割方法、要素分割プログラムおよび要素分割プログラムを記録したコンピュータ読取可能な記録媒体
JP2005241668A (ja) * 2004-02-24 2005-09-08 Wista:Kk 被写体スタンド
US7711179B2 (en) * 2004-04-21 2010-05-04 Nextengine, Inc. Hand held portable three dimensional scanner
JP4446383B2 (ja) * 2004-08-12 2010-04-07 Kddi株式会社 画像処理装置および画像認識装置
US7756325B2 (en) 2005-06-20 2010-07-13 University Of Basel Estimating 3D shape and texture of a 3D object based on a 2D image of the 3D object
JP4631760B2 (ja) * 2006-03-17 2011-02-16 カシオ計算機株式会社 デジタルカメラ、画像処理方法及びプログラム
JP4111231B2 (ja) 2006-07-14 2008-07-02 富士ゼロックス株式会社 立体表示システム
US8090194B2 (en) 2006-11-21 2012-01-03 Mantis Vision Ltd. 3D geometric modeling and motion capture using both single and dual imaging
JP2008154027A (ja) * 2006-12-19 2008-07-03 Seiko Epson Corp 撮影装置、撮影方法、およびプログラム
JP4784555B2 (ja) * 2007-05-25 2011-10-05 トヨタ自動車株式会社 形状評価方法、形状評価装置および三次元検査装置
WO2009110589A1 (ja) 2008-03-07 2009-09-11 株式会社ニコン 形状測定装置および方法、並びにプログラム
KR101251372B1 (ko) 2008-10-13 2013-04-05 주식회사 고영테크놀러지 3차원형상 측정방법
JP2011174896A (ja) * 2010-02-25 2011-09-08 Mitsubishi Heavy Ind Ltd 撮像装置及び撮像方法
US8964189B2 (en) 2010-08-19 2015-02-24 Canon Kabushiki Kaisha Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program
CN102375309A (zh) * 2010-08-26 2012-03-14 Hongfujin Precision Industry (Shenzhen) Co., Ltd. Projector light adjustment system and method
US8600192B2 (en) 2010-12-08 2013-12-03 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
GB201110156D0 (en) 2011-06-16 2011-07-27 Light Blue Optics Ltd Touch-sensitive display devices
JP5132832B1 (ja) 2011-07-11 2013-01-30 Canon Kabushiki Kaisha Measuring apparatus and information processing apparatus
JP2013101045A (ja) 2011-11-08 2013-05-23 Fanuc Ltd Apparatus and method for recognizing the three-dimensional position and orientation of an article
CA2899272A1 (en) * 2012-01-26 2013-08-01 Alexander Brunner Device and methods for fabricating a two-dimensional image of a three-dimensional object
KR101614061B1 (ko) 2012-03-29 2016-04-20 Koh Young Technology Inc. Joint inspection apparatus
TWI465825B (zh) 2012-06-27 2014-12-21 Acer Inc Image capture device and light-source-assisted photographing method thereof
CN103167232B (zh) * 2012-10-26 2016-04-20 苏州比特速浪电子科技有限公司 Imaging apparatus, image synthesis apparatus, and image processing method
CN103902241A (zh) * 2012-12-25 2014-07-02 ASUSTeK Computer Inc. Image display system and image display method
JP2014126494A (ja) * 2012-12-27 2014-07-07 Seiko Epson Corporation Inspection support apparatus, inspection support method, robot system, control apparatus, robot, and program
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
JP6371044B2 (ja) * 2013-08-31 2018-08-08 Toyohashi University of Technology Surface defect inspection apparatus and surface defect inspection method
CN104243843B (zh) * 2014-09-30 2017-11-03 Beijing Zhigu Rui Tuo Tech Co., Ltd. Photographing illumination compensation method, compensation apparatus, and user equipment
US9256775B1 (en) * 2014-12-23 2016-02-09 Toshiba Tec Kabushiki Kaisha Image recognition apparatus and commodity information processing apparatus
WO2016137899A1 (en) * 2015-02-23 2016-09-01 Li-Cor, Inc. Fluorescence biopsy specimen imager and methods
JP6529314B2 (ja) * 2015-04-09 2019-06-12 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and program
JP6480824B2 (ja) * 2015-07-27 2019-03-13 Hitachi, Ltd. Parameter adjustment method and parameter adjustment apparatus for a range image sensor, and elevator system
KR101630154B1 (ko) * 2015-12-31 2016-06-15 주식회사 디앤에스테크놀로지 Vehicle underside image search system and method
WO2017119088A1 (ja) * 2016-01-06 2017-07-13 Hitachi, Ltd. Robot system and control method
JP6602690B2 (ja) * 2016-02-23 2019-11-06 Fuji Corporation Component mounting machine
JP6800597B2 (ja) 2016-03-30 2020-12-16 Canon Kabushiki Kaisha Control apparatus, control method, and program
US11176382B2 (en) * 2017-03-06 2021-11-16 Conduent Business Services, Llc System and method for person re-identification using overhead view images

Also Published As

Publication number Publication date
JP2022121671A (ja) 2022-08-19
US20210042953A1 (en) 2021-02-11
JP2019087008A (ja) 2019-06-06
EP3480733A1 (en) 2019-05-08
US11328438B2 (en) 2022-05-10
CN109756670A (zh) 2019-05-14

Similar Documents

Publication Publication Date Title
US10896521B2 (en) Coordinate calibration between two dimensional coordinate system and three dimensional coordinate system
US9443311B2 (en) Method and system to identify a position of a measurement pole
CN105700736B (zh) Input operation detection device, projection apparatus, interactive whiteboard, and digital signage device
US20170249054A1 (en) Displaying an object indicator
US10325377B2 (en) Image depth sensing method and image depth sensing apparatus
US11328438B2 (en) Image processing system and image processing method
US8908084B2 (en) Electronic device and method for focusing and measuring points of objects
WO2015183232A1 (en) Method and apparatus for interacting with display screen
US9013404B2 (en) Method and locating device for locating a pointing device
CN110213407B (zh) Operation method of an electronic device, electronic device, and computer storage medium
US20150213309A1 (en) Measurement method, measurement device, projection apparatus, and computer-readable recording medium
US11386573B2 (en) Article recognition apparatus
CN111656778B (zh) Image acquisition apparatus, image acquisition method, and acquisition chip
JP6999493B2 (ja) Article recognition apparatus
US11481996B2 (en) Calculation device, information processing method, and storage medium
KR20210115819A (ko) 레이더 성능 분석 장치 및 방법
US10977512B2 (en) Article recognition device
US10963726B2 (en) Article recognition device
TWI543047B (zh) 光學式觸控面板
JP2023128843A (ja) Display control system for drawing
JP6333618B2 (ja) Grain discrimination system combining an imaging device and an operation-panel-type information terminal
JP2022106808A (ja) Article recognition apparatus
JP2021124905A (ja) Meter reading device, meter reading system, meter reading method, and computer program
WO2018061664A1 (ja) Work position detection apparatus and work position detection method
JP2017037428A (ja) Information processing method, information processing apparatus, and program

Legal Events

Date Code Title Description
AS Assignment

Owner name: TOSHIBA TEC KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:YASUNAGA, MASAAKI;ARIGA, NORIMASA;REEL/FRAME:047427/0380

Effective date: 20181101

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: EXPONENTIAL AUDIO, LLC, MASSACHUSETTS

Free format text: TERMINATION AND RELEASE OF GRANT OF SECURITY INTEREST IN UNITED STATES PATENTS;ASSIGNOR:CAMBRIDGE TRUST COMPANY;REEL/FRAME:055627/0958

Effective date: 20210310

Owner name: IZOTOPE, INC., MASSACHUSETTS

Free format text: TERMINATION AND RELEASE OF GRANT OF SECURITY INTEREST IN UNITED STATES PATENTS;ASSIGNOR:CAMBRIDGE TRUST COMPANY;REEL/FRAME:055627/0958

Effective date: 20210310