WO2017175492A1 - Image processing device, image processing method, computer program and electronic device - Google Patents


Info

Publication number
WO2017175492A1
WO2017175492A1 (PCT/JP2017/006042; JP 2017006042 W)
Authority
WO
WIPO (PCT)
Prior art keywords
processing
unit
signal processing
signal
detection
Prior art date
Application number
PCT/JP2017/006042
Other languages
English (en)
Japanese (ja)
Inventor
真生 全
Original Assignee
Sony Corporation (ソニー株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corporation
Publication of WO2017175492A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith
    • H04N25/76: Addressed sensors, e.g. MOS or CMOS sensors
    • H04N25/78: Readout circuits for addressed sensors, e.g. output amplifiers or A/D converters
    • H04N17/00: Diagnosis, testing or measuring for television systems or their details
    • H04N17/02: Diagnosis, testing or measuring for colour television signals
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10: Cameras or camera modules for generating image signals from different wavelengths
    • H04N23/12: Cameras or camera modules for generating image signals from different wavelengths with one sensor only

Definitions

  • the present disclosure relates to an image processing device, an image processing method, a computer program, and an electronic device.
  • The present disclosure proposes a new and improved image processing apparatus, image processing method, computer program, and electronic device capable of executing detection processing at high speed on pixel signals output from an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • According to the present disclosure, there is provided an image processing apparatus comprising: a storage unit that stores a pixel signal output from an image sensor; a signal processing unit that performs signal processing on the pixel signal stored in the storage unit; and a detection unit that completes detection processing on the pixel signal of the same frame before the signal processing performed by the signal processing unit is completed.
  • According to the present disclosure, there is also provided an image processing method comprising: performing detection processing on the pixel signal output from an image sensor; and performing signal processing such that the signal processing on the pixel signal of the same frame is completed after the detection processing is completed.
  • According to the present disclosure, there is further provided a computer program causing a computer to execute detection processing on the pixel signal output from an image sensor and to perform signal processing such that the signal processing on the pixel signal of the same frame is completed after the detection processing is completed.
  • an electronic apparatus including the image processing apparatus is provided.
  • As described above, according to the present disclosure, a new and improved image processing apparatus, image processing method, computer program, and electronic apparatus capable of executing detection processing at high speed can be provided.
  • FIG. 2 is an explanatory diagram illustrating a configuration example of a sensor module 100 included in an imaging unit 11.
  • FIG. An explanatory diagram illustrating a functional configuration example of the sensor module 100 according to the same embodiment.
  • Embodiment of the present disclosure [1.1. Overview] Before describing the embodiments of the present disclosure in detail, an outline of the embodiments of the present disclosure will be described first.
  • As described above, there is a technology that detects a pixel signal output from an image sensor, such as a CMOS image sensor or a CCD image sensor, and applies feedback to signal processing on the pixel signal output from the image sensor.
  • Signal processing for pixel signals includes automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range synthesis processing.
  • the detection processing for pixel signals includes, for example, calculation of statistical information such as peaks and average values, calculation of motion amount, flicker detection, face detection, and the like.
  • Conventionally, a processor (application processor) provided downstream of the image sensor acquires the signal after signal processing and the detection result from the image sensor. The processor then analyzes the detection result and applies feedback to the signal processing on the pixel signal in the image sensor or in the processor. For example, the processor can perform feedback control such as changing the content of the signal processing based on statistical information, or changing the content of the signal processing depending on the presence or absence of face detection.
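The detect-analyze-feedback loop described above can be sketched as follows. This is a minimal illustration only: the function names and the simple gain-based exposure model are assumptions for the sketch, not part of the patent.

```python
# Hypothetical sketch of the detect -> analyze -> feedback loop.

def detect(frame):
    """Detection processing: compute simple statistics over the pixel signal."""
    return {"peak": max(frame), "average": sum(frame) / len(frame)}

def analyze(stats, target_average=128):
    """Analysis: derive a gain correction from the detected statistics."""
    return target_average / stats["average"] if stats["average"] else 1.0

def signal_processing(frame, gain):
    """Signal processing with feedback applied (an exposure-like gain)."""
    return [min(255, int(p * gain)) for p in frame]

frame = [60, 70, 50, 80]          # a tiny 4-pixel "frame"
stats = detect(frame)             # detection runs on the raw pixel signal
gain = analyze(stats)             # feedback derived from the detection result
out = signal_processing(frame, gain)
```

The point of the loop is that the analysis of the detection result, not the raw pixels, decides how the signal processing is parameterized.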
  • With such a configuration, the rate at which a detection result is generated for each frame depends on the signal processing bandwidth inside the image sensor, the bandwidth of the interface to the processor, and the signal processing bandwidth of the processor, which limits the speed of the internal processing of the image sensor. That is, when the signal processing and the detection processing are executed in parallel inside the image sensor, the two processes must proceed in lockstep, and the detection data for an image is not complete until readout of the entire image is complete, so it takes time for detection to finish.
  • FIG. 1 is an explanatory diagram for explaining processing times of signal processing and detection processing when signal processing and detection processing are executed in parallel inside the image sensor.
  • FIG. 1 shows processing times for signal processing and detection processing in two frames f1 and f2.
  • In this case, the signal processing and the detection processing complete simultaneously. In other words, the detection processing cannot complete until the signal processing for all the vertical lines has completed. Therefore, as shown in FIG. 1, the detection information becomes available only after the signal processing for the vertical lines is completed and the image data is generated. This means that the detection information of frame f1 can be used only for image data from frame f2 onward, not for the image data of frame f1 itself.
  • FIG. 2 is an explanatory diagram illustrating signal processing and detection processing when signal processing and detection processing are executed in parallel inside the image sensor.
  • a case is considered in which one piece of image data is divided in the horizontal direction and the vertical direction as indicated by broken lines, and the detection processing is executed in units of areas surrounded by the broken lines.
  • The detection processing starts from the upper-left area of the screen and proceeds area by area in the horizontal and vertical directions, and only when the detection processing of the lower-right area completes does the detection processing for the entire piece of image data end. That is, unless all the image data of one frame is available, the detection data for that image is not available.
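The raster-order area scan described above can be illustrated with a short sketch. The function name and grid size are hypothetical; only the scan order (upper-left first, lower-right last) reflects the text.

```python
# Illustrative sketch of the area-by-area detection scan: detection proceeds
# from the upper-left area to the lower-right area, and the detection data
# for the image is only complete after the last area has been processed.

def detection_scan(rows, cols):
    order = []
    for r in range(rows):          # vertical direction
        for c in range(cols):      # horizontal direction
            order.append((r, c))   # detect one area surrounded by broken lines
    return order

areas = detection_scan(2, 3)
first = areas[0]    # detection starts at the upper-left area ...
last = areas[-1]    # ... and is complete only after the lower-right area
```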
  • In view of the above, the present inventors intensively studied technology that allows detection processing to be executed at high speed when signal processing on the pixel signal output from the image sensor and detection of that pixel signal are both performed inside the image sensor. As a result, as described below, the present inventors devised technology that executes the detection processing at high speed in such a configuration.
  • FIG. 3 is an explanatory diagram illustrating a functional configuration example of the electronic device 10 according to the embodiment of the present disclosure.
  • a functional configuration example of the electronic apparatus 10 according to the embodiment of the present disclosure will be described with reference to FIG.
  • The electronic device 10 includes an imaging unit 11, an image processing unit 12, a display unit 13, a control unit 14, a storage unit 15, and an operation unit 16.
  • the imaging unit 11 includes a lens, a sensor module, and the like, and accumulates electrons for a predetermined period according to an image formed on the light receiving surface of the sensor module through the lens.
  • the imaging unit 11 performs predetermined signal processing on a signal corresponding to the accumulated electrons. Then, the imaging unit 11 outputs the signal after the signal processing to the image processing unit 12.
  • the configuration of the sensor module included in the imaging unit 11 will be described in detail later.
  • As the predetermined signal processing, the imaging unit 11 may perform signal processing such as camera shake correction processing using an electronic camera shake correction method, automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range synthesis processing.
  • the image processing unit 12 is configured by an application processor (AP), for example, and executes image processing using a signal output from the imaging unit 11.
  • The image processing executed by the image processing unit 12 includes, for example, demosaic processing using the signal output from the imaging unit 11, processing for displaying the demosaiced image on the display unit 13, and processing for storing it in the storage unit 15.
  • the display unit 13 is a display device configured by, for example, a liquid crystal display, an organic EL display, or the like. Display contents of the display unit 13 are controlled by the control unit 14. For example, the display unit 13 displays an image captured by the imaging unit 11 and subjected to image processing by the image processing unit 12 based on the control of the control unit 14.
  • the control unit 14 includes, for example, a processor such as a CPU (Central Processing Unit), a ROM, a RAM, and the like, and controls the operation of each unit of the electronic device 10.
  • the storage unit 15 is configured by a storage medium such as a flash memory or other nonvolatile memory.
  • the storage unit 15 stores an image captured by the imaging unit 11 and subjected to image processing by the image processing unit 12.
  • the image stored in the storage unit 15 can be displayed on the display unit 13 in accordance with the operation of the user of the electronic device 10.
  • the operation unit 16 is a device for operating the electronic device 10 and includes, for example, buttons and a touch panel.
  • the touch panel is provided on the display surface of the display unit 13.
  • When the user of the electronic device 10 wants to record an image captured by the imaging unit 11 in the electronic device 10, the user generates a shutter trigger by operating a predetermined button of the operation unit 16.
  • When the imaging unit 11 and the image processing unit 12 detect the occurrence of the shutter trigger, they execute processing for recording the image on the electronic device 10 in response to the shutter trigger.
  • the electronic device 10 shown in FIG. 3 is not limited to a specific device, and can take various forms such as a digital camera, a smartphone, a tablet portable terminal, a portable music player, and a game machine.
  • FIG. 4 is an explanatory diagram illustrating a configuration example of the sensor module 100 included in the imaging unit 11.
  • the sensor module 100 according to the embodiment of the present disclosure is an example of the image processing apparatus of the present disclosure, and is configured by stacking three substrates as illustrated in FIG.
  • the sensor module 100 according to the embodiment of the present disclosure has a configuration in which a pixel substrate 110, a memory substrate 120, and a signal processing substrate 130 are stacked in this order.
  • the pixel substrate 110 is a substrate having an image sensor composed of pixel regions in which unit pixels are formed in an array. Each unit pixel receives light from a subject, photoelectrically converts the incident light, accumulates charges, and outputs the charges as pixel signals at a predetermined timing. Pixel signals output from the pixel substrate 110 are stored in the memory substrate 120, and signal processing is performed in the signal processing substrate 130.
  • the pixel substrate 110 includes an AD converter that converts an analog signal into a digital signal. That is, the pixel signal output from the pixel substrate 110 is a digital signal.
  • the memory substrate 120 is a substrate having a memory such as a DRAM (Dynamic Random Access Memory) that temporarily stores pixel signals output from the pixel substrate 110.
  • The memory substrate 120 has a capacity capable of temporarily storing pixel signals of a plurality of frames, for example, enough frames to execute the camera shake correction processing by the electronic camera shake correction method in the signal processing board 130.
  • the pixel signal stored in the memory substrate 120 is read based on a read command from the signal processing substrate 130.
  • the signal processing board 130 performs various signal processing on the pixel signals stored in the memory board 120.
  • The signal processing executed by the signal processing board 130 is signal processing related to image quality performed on the pixel signals stored in the memory board 120. For example, the signal processing board 130 can execute signal processing such as camera shake correction processing by an electronic camera shake correction method, automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range synthesis processing.
  • the present disclosure is not limited to such an example.
  • the sensor module 100 may have a configuration in which a pixel substrate 110, a signal processing substrate 130, and a memory substrate 120 are stacked in this order.
  • the configuration example of the sensor module 100 has been described above with reference to FIG. Subsequently, a functional configuration example of the sensor module 100 will be described.
  • FIG. 5 is an explanatory diagram illustrating a functional configuration example of the sensor module 100 according to the embodiment of the present disclosure.
  • a functional configuration example of the sensor module 100 according to the embodiment of the present disclosure will be described with reference to FIG.
  • the pixel substrate 110 includes an image sensor 111 having a pixel region in which unit pixels are formed in an array, and a control unit 112 that supplies a predetermined clock signal and timing signal to the image sensor 111.
  • a pixel signal output from the image sensor 111 in response to a signal from the control unit 112 is once sent to the signal processing board 130 and then sent to the memory board 120.
  • the memory substrate 120 includes an image storage unit 121 configured by DRAM (Dynamic Random Access Memory) or the like.
  • the image storage unit 121 temporarily stores pixel signals output from the image sensor 111.
  • the image storage unit 121 has a capacity capable of temporarily storing pixel signals of a plurality of frames.
  • the pixel signal stored in the image storage unit 121 is read based on a read command from the signal processing board 130.
  • the signal processing board 130 includes a detection unit 131, an analysis unit 132, and a signal processing unit 133.
  • the detection unit 131 executes detection processing for the pixel signal output from the image sensor 111 for each frame.
  • the detection unit 131 executes, for example, calculation of statistical information such as a peak and an average value, calculation of motion amount, flicker detection, face detection, and the like as detection processing.
  • When the detection unit 131 executes the detection processing on the pixel signal, it outputs detection information for each frame to the subsequent image processing unit 12 and also to the analysis unit 132.
  • The analysis unit 132 acquires the detection information from the detection unit 131 every frame, every several frames, or irregularly, and analyzes the detection information. For example, the analysis unit 132 analyzes the detection information and controls the signal processing unit 133 so as to control the signal processing for the pixel signal of that frame or of subsequent frames.
  • For example, the analysis unit 132 may detect the amount of motion of the image from the detection information and, based on the amount of motion, control the image cutout position used by the signal processing unit 133 in the camera shake correction processing by the electronic camera shake correction method or in the noise reduction processing that superimposes a plurality of images.
  • If flicker is detected from the detection information, the analysis unit 132 may instruct the signal processing unit 133 to perform processing for reducing the flicker.
  • The analysis unit 132 may also instruct the signal processing unit 133 not to output an image signal until the presence of a predetermined object, such as a face, a human body, a vehicle, or another moving object, is detected from the detection information, and, when the presence of the predetermined object is detected, to output an image signal from that frame or from a predetermined time before that frame.
  • The analysis unit 132 may control the operation of the control unit 112 based on the detection information. For example, the analysis unit 132 may detect the exposure of the image from the detection information and control the control unit 112 so as to adjust the exposure time. The analysis unit 132 may also perform control for adjusting the position of the focus lens based on the detection information, for example, by detecting phase difference information from the detection information.
  • the signal processing unit 133 performs signal processing on the pixel signal stored in the image storage unit 121.
  • As the signal processing for the pixel signal, the signal processing unit 133 performs automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, high dynamic range synthesis processing, camera shake correction processing using an electronic camera shake correction method, and flicker reduction processing.
  • the signal processing unit 133 can execute the above-described signal processing under the control of the analysis unit 132 when performing signal processing on the pixel signal.
  • the signal processing unit 133 outputs the processed signal to the image processing unit 12 as an image signal.
  • With this configuration, the sensor module 100 executes, inside the image sensor, both the signal processing on the pixel signal output from the image sensor and the detection of the pixel signal output from the image sensor, and can execute the detection processing at high speed. That is, by completing the detection processing first, the sensor module 100 according to the embodiment of the present disclosure can apply the detection result of the pixel signal of a frame to the signal processing of that same frame.
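The same-frame feedback enabled by storing the pixel signal can be sketched as follows. The class and method names, and the brightness-offset processing, are hypothetical; the structure mirrors the flow in which detection completes on the stored frame first and its result then steers the signal processing of that same frame.

```python
# Minimal sketch of same-frame feedback: the pixel signal is held in the
# image storage unit (121), detection runs first, and signal processing of
# frame N uses the detection result of frame N itself.

class SensorModule:
    def __init__(self):
        self.image_storage = {}            # stand-in for image storage unit 121

    def capture(self, frame_id, pixels):
        self.image_storage[frame_id] = pixels

    def detect(self, frame_id):
        pixels = self.image_storage[frame_id]
        return {"average": sum(pixels) / len(pixels)}

    def process(self, frame_id, detection):
        # signal processing of frame N steered by detection info of frame N
        offset = 100 - detection["average"]
        return [p + offset for p in self.image_storage[frame_id]]

m = SensorModule()
m.capture(1, [80, 90, 100])
info = m.detect(1)         # detection completes before signal processing
out = m.process(1, info)   # same-frame feedback applied
```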
  • FIG. 6 is an explanatory diagram illustrating a functional configuration example of the image processing unit 12 according to the embodiment of the present disclosure.
  • a functional configuration example of the image processing unit 12 according to the embodiment of the present disclosure will be described with reference to FIG.
  • the image processing unit 12 includes an analysis unit 201, a signal processing unit 202, and a buffer unit 203.
  • The analysis unit 201 acquires the detection information for each frame from the sensor module 100 and executes analysis processing on the detection information. For example, the analysis unit 201 analyzes the detection information and controls the signal processing unit 133 so as to control the signal processing for the pixel signal of that frame or of subsequent frames.
  • the signal processing unit 202 acquires an image signal for each frame from the sensor module 100, and executes signal processing on the acquired image signal.
  • As the signal processing for the pixel signal, the signal processing unit 202 performs automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, high dynamic range synthesis processing, camera shake correction processing using an electronic camera shake correction method, and flicker reduction processing.
  • the signal processing unit 202 can execute the signal processing described above under the control of the analysis unit 201 when performing signal processing on the pixel signal.
  • the signal processing unit 202 temporarily stores the processed signal in the buffer unit 203 or displays the signal on the display unit 13.
  • the buffer unit 203 is composed of a RAM or the like, and temporarily stores the image signal subjected to signal processing by the signal processing unit 202.
  • FIG. 7 is a flowchart illustrating an operation example of the sensor module 100 according to the embodiment of the present disclosure.
  • FIG. 7 shows an operation example of the sensor module 100 when the detection processing and the signal processing are executed internally on the pixel signal output from the image sensor 111.
  • an operation example of the sensor module 100 according to the embodiment of the present disclosure will be described with reference to FIG.
  • the sensor module 100 performs detection processing for each frame on the pixel signal output from the image sensor 111 (step S101). For example, the detection unit 131 executes the detection processing in step S101. The sensor module 100 completes the detection processing in step S101 before the signal processing for the pixel signal for each frame is completed.
  • step S101 when the detection process is executed on the pixel signal for each frame, the sensor module 100 subsequently analyzes the detection result (step S102).
  • the analysis processing in step S102 is executed by the analysis unit 132, for example.
  • The sensor module 100 may execute the analysis processing of step S102 every frame or every few frames, or may execute it collectively for several frames at once.
  • When the detection result has been analyzed in step S102, the sensor module 100 performs signal processing on the pixel signal based on the analysis result (step S103). For example, the signal processing unit 133 performs the signal processing of step S103 on the pixel signal stored in the image storage unit 121.
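The flow of steps S101 to S103 above can be sketched with placeholder functions. The function names and the clipping decision are illustrative assumptions; only the ordering (detect, then analyze, then process) comes from the flowchart.

```python
# Sketch of the FIG. 7 flow: S101 detection -> S102 analysis -> S103 signal
# processing driven by the analysis result.

def step_s101_detect(frame):
    return {"peak": max(frame)}                  # detection processing

def step_s102_analyze(detection):
    return {"clip": detection["peak"] > 200}     # analyze the detection result

def step_s103_process(frame, analysis):
    if analysis["clip"]:
        return [min(p, 200) for p in frame]      # signal processing per analysis
    return list(frame)

frame = [50, 120, 230]
detection = step_s101_detect(frame)      # S101: detection completes first
analysis = step_s102_analyze(detection)  # S102: analyze detection result
out = step_s103_process(frame, analysis) # S103: process stored pixel signal
```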
  • By executing such an operation, the sensor module 100 performs, inside the image sensor, both the signal processing on the pixel signal output from the image sensor and the detection of the pixel signal output from the image sensor, and can execute the detection processing at high speed. That is, by completing the detection processing first, the sensor module 100 according to the embodiment of the present disclosure can apply the detection result of the pixel signal of a frame to the signal processing of that same frame.
  • FIG. 8 is an explanatory diagram for describing the effects of the sensor module 100 according to the embodiment of the present disclosure, showing the processing times of the signal processing and the detection processing when the sensor module 100 executes the signal processing and the detection processing independently inside the module.
  • FIG. 8 shows processing times of signal processing and detection processing in two frames f1 and f2.
  • As described above, the sensor module 100 according to the embodiment of the present disclosure completes the detection processing prior to the signal processing, and can therefore perform image processing using detection information for the same frame. The sensor module 100 according to the embodiment of the present disclosure can thus realize real-time image processing. In addition, the sensor module 100 can easily perform feedback control internally without involving the downstream image processing unit 12, so the processing load on the image processing unit 12 can be reduced.
  • FIG. 9 is an explanatory diagram for describing the effects of the sensor module 100 according to the embodiment of the present disclosure, showing the processing times of the signal processing and the detection processing when the sensor module 100 executes the signal processing and the detection processing independently inside the module.
  • FIG. 9 shows processing times of signal processing and detection processing in two frames f1 and f2.
  • The sensor module 100 temporarily stores the pixel signal in the image storage unit 121. Therefore, the sensor module 100 and the image processing unit 12 can determine, based on the detection information, whether or not the accumulated pixel signal is needed.
  • the sensor module 100 can repeatedly execute detection processing in the signal processing period of each frame.
  • the output frame rate of the pixel signal from the image sensor 111 and the output frame rate of the image signal from the sensor module 100 may be different.
  • For example, the output frame rate of the image signal from the sensor module 100 may be 30 fps while the output frame rate of the pixel signal from the image sensor 111 is 120 fps or 240 fps. If detection is performed at 120 fps or 240 fps instead of at 30 fps, the amount of motion between consecutive frames becomes smaller, so, for example, the amount of detection processing can be reduced.
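The frame-rate relationship above can be made concrete with a small arithmetic sketch. The motion figure (120 pixels of motion per second) is an assumed example value; the point is that running detection at the sensor rate of 120 fps gives four detections per 30 fps output frame, each seeing a quarter of the inter-frame motion.

```python
# Assumed example: sensor outputs pixel signals at 120 fps, image signal is
# output at 30 fps, and the scene moves 120 pixels per second.

sensor_fps = 120
output_fps = 30
detections_per_output_frame = sensor_fps // output_fps   # detections per output frame

motion_per_second = 120.0                                # assumed scene motion (px/s)
motion_per_detection_120 = motion_per_second / sensor_fps  # motion seen at 120 fps
motion_per_detection_30 = motion_per_second / output_fps   # motion seen at 30 fps
```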
  • Based on the detection information, the sensor module 100 and the image processing unit 12 downstream of the sensor module 100 can instruct the signal processing unit 133 whether or not to perform signal processing on the pixel signals accumulated in the image storage unit 121.
  • For example, the analysis unit 132 or the analysis unit 201 can analyze the detection information and, if a predetermined condition is satisfied, instruct the signal processing unit 133 to perform signal processing on the pixel signal stored in the image storage unit 121.
  • The analysis unit 132 or the analysis unit 201 may also instruct the signal processing unit 133 to perform signal processing starting from a point several frames before the frame that satisfies the predetermined condition, rather than from that frame itself.
  • The sensor module 100 according to the embodiment of the present disclosure can also run only the detection processing on pixel signals, without performing image processing. That is, the sensor module 100 according to the embodiment of the present disclosure enables power-saving image sensing.
  • The sensor module 100, which operates the detection processing and the signal processing independently, can speed up automatic exposure processing, autofocus processing, and auto white balance processing by using the detection information.
  • the sensor module 100 may control the exposure time based on detection information.
  • the analysis unit 132 may detect the exposure of the image from the detection information, and may control the control unit 112 to adjust the exposure time.
  • the sensor module 100 according to the embodiment of the present disclosure may perform control to adjust the position of the focus lens based on detection information.
  • For example, the analysis unit 132 may perform control for adjusting the position of the focus lens by detecting phase difference information from the detection information.
  • the sensor module 100 may adjust the gamma curve based on the detection information.
  • The sensor module 100 according to the embodiment of the present disclosure may analyze the detection information internally to detect color temperature information, and adjust the gamma curve during image processing based on the color temperature information.
  • the sensor module 100 that operates the detection processing and the signal processing independently can control the camera shake correction processing by the electronic camera shake correction method based on the detection information.
  • The sensor module 100 according to the embodiment of the present disclosure analyzes the detection information and detects the amount of motion between frames. Based on the amount of motion between frames obtained by the analysis, the sensor module 100 can quickly determine the range over which to cut out the pixel signal for each frame.
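The cutout-range determination for electronic camera shake correction can be sketched in one dimension. The function name, the 1-D model, and the clamping behavior are assumptions for illustration; the idea is simply that the crop window shifts to cancel the detected motion.

```python
# Illustrative 1-D sketch: shift the cutout window opposite to the detected
# motion amount, clamped so it stays inside the captured frame.

def cutout_range(frame_width, out_width, motion):
    """Return (start, end) of the stabilized crop for this frame."""
    center = (frame_width - out_width) // 2
    start = max(0, min(frame_width - out_width, center - motion))
    return (start, start + out_width)

# 100-pixel-wide frame, 80-pixel-wide stabilized output, 5 px detected motion
rng = cutout_range(100, 80, 5)
```

When the detected motion exceeds the available margin, the window simply saturates at the frame edge rather than leaving the frame.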
  • the sensor module 100 that operates detection processing and signal processing independently can control noise reduction processing by combining a plurality of images based on detection information.
  • the sensor module 100 according to the embodiment of the present disclosure analyzes detection information and detects a motion amount between frames.
  • The sensor module 100 according to the embodiment of the present disclosure synthesizes a plurality of images based on the amount of motion between frames obtained by analysis of the detection information, and can thereby shorten the processing time from the shutter timing to the completion of the synthesis processing.
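A minimal 1-D sketch of combining a plurality of images using the detected motion: each frame is shifted by its motion amount so that the frames align, and the aligned samples are averaged, which attenuates uncorrelated noise. The function name and the integer-pixel 1-D model are assumptions.

```python
# Align frames by their detected per-frame motion (in pixels) and average
# them; averaging N aligned frames reduces uncorrelated noise.

def align_and_average(frames, motions):
    """Average frames after compensating each by its detected motion."""
    width = len(frames[0]) - max(motions)
    out = []
    for i in range(width):
        out.append(sum(f[i + m] for f, m in zip(frames, motions)) / len(frames))
    return out

base = [10, 20, 30, 40, 50]
shifted = [0, 10, 20, 30, 40]   # same scene, moved right by one pixel
merged = align_and_average([base, shifted], [0, 1])
```

Because the motion amounts are already known from detection, alignment needs no extra search, which is what shortens the time from shutter to completed synthesis.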
  • the sensor module 100 that operates detection processing and signal processing independently, and when flicker is included in a captured moving image, processing for quickly reducing flicker based on detection information Can be executed.
  • the sensor module 100 according to the embodiment of the present disclosure analyzes detection information and detects the presence of flicker.
  • The sensor module 100 according to the embodiment of the present disclosure performs signal processing on the pixel signal so as to cancel the flicker, based on the presence of flicker obtained from analysis of the detection information. The sensor module 100 according to the embodiment of the present disclosure can thereby reduce flicker quickly by analyzing the detection information internally.
  • The sensor module 100, which operates the detection processing and the signal processing independently, may refrain from outputting an image signal to the image processing unit 12 unless a moving object appears in the captured image, and may output an image signal to the image processing unit 12 once analysis of the detection information shows that a moving object appears in the captured image. Note that the sensor module 100 according to the embodiment of the present disclosure may output the image signal to the image processing unit 12 from the time the moving object is recognized, or from a time point several frames before the moving object is recognized.
  • The sensor module 100 may also output an image signal to the image processing unit 12 only when the captured image satisfies some other predetermined condition. For example, the sensor module 100 may withhold the image signal from the image processing unit 12 if the captured image does not contain a smiling human face, and output the image signal to the image processing unit 12 if it does. In this case as well, the sensor module 100 may output the image signal to the image processing unit 12 from the time the smile is recognized, or from a time point several frames before the smile is recognized.
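The "output from a few frames back" behavior described above can be modeled with a ring buffer of recent frames that is flushed when the trigger condition fires. This is an illustrative sketch with assumed names, not the module's actual implementation:

```python
from collections import deque

def conditional_stream(frames, triggers, backlog=3):
    """Yield frames to the downstream image processing unit only once a
    trigger (e.g. 'moving object recognized') fires, including up to
    backlog - 1 frames from before the trigger via a ring buffer."""
    buf = deque(maxlen=backlog)   # holds the most recent `backlog` frames
    firing = False
    for frame, hit in zip(frames, triggers):
        if not firing:
            buf.append(frame)
            if hit:
                firing = True
                yield from buf    # flush the buffered history + this frame
        else:
            yield frame

out = list(conditional_stream(range(10), [i == 5 for i in range(10)]))
```

With the trigger firing at frame 5 and a backlog of 3, output starts two frames early, at frame 3.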
  • the technology according to the present disclosure can be applied to various products.
  • The technology according to the present disclosure may be realized as a device mounted on any type of mobile body, such as an automobile, an electric vehicle, a hybrid electric vehicle, a motorcycle, a bicycle, a personal mobility device, an airplane, a drone, a ship, or a robot.
  • FIG. 10 is a block diagram illustrating a schematic configuration example of a vehicle control system that is an example of a mobile control system to which the technology according to the present disclosure can be applied.
  • the vehicle control system 12000 includes a plurality of electronic control units connected via a communication network 12001.
  • the vehicle control system 12000 includes a drive system control unit 12010, a body system control unit 12020, a vehicle exterior information detection unit 12030, a vehicle interior information detection unit 12040, and an integrated control unit 12050.
  • As the functional configuration of the integrated control unit 12050, a microcomputer 12051, an audio image output unit 12052, and an in-vehicle network I/F (interface) 12053 are illustrated.
  • the drive system control unit 12010 controls the operation of the device related to the drive system of the vehicle according to various programs.
  • The drive system control unit 12010 functions as a control device for a driving force generation device for generating the driving force of the vehicle, such as an internal combustion engine or a drive motor, a driving force transmission mechanism for transmitting the driving force to the wheels, a steering mechanism that adjusts the steering angle of the vehicle, a braking device that generates the braking force of the vehicle, and the like.
  • the body system control unit 12020 controls the operation of various devices mounted on the vehicle body according to various programs.
  • the body system control unit 12020 functions as a keyless entry system, a smart key system, a power window device, or a control device for various lamps such as a headlamp, a back lamp, a brake lamp, a blinker, or a fog lamp.
  • Radio waves transmitted from a portable device that substitutes for a key, or signals from various switches, can be input to the body system control unit 12020.
  • the body system control unit 12020 receives input of these radio waves or signals, and controls a door lock device, a power window device, a lamp, and the like of the vehicle.
  • the vehicle outside information detection unit 12030 detects information outside the vehicle on which the vehicle control system 12000 is mounted.
  • the imaging unit 12031 is connected to the vehicle exterior information detection unit 12030.
  • the vehicle exterior information detection unit 12030 causes the imaging unit 12031 to capture an image outside the vehicle and receives the captured image.
  • Based on the received image, the vehicle exterior information detection unit 12030 may perform object detection processing or distance detection processing for persons, vehicles, obstacles, signs, characters on the road surface, and the like.
  • the imaging unit 12031 is an optical sensor that receives light and outputs an electrical signal corresponding to the amount of received light.
  • the imaging unit 12031 can output an electrical signal as an image, or can output it as distance measurement information. Further, the light received by the imaging unit 12031 may be visible light or invisible light such as infrared rays.
  • the vehicle interior information detection unit 12040 detects vehicle interior information.
  • a driver state detection unit 12041 that detects a driver's state is connected to the in-vehicle information detection unit 12040.
  • The driver state detection unit 12041 includes, for example, a camera that images the driver, and based on the detection information input from the driver state detection unit 12041, the vehicle interior information detection unit 12040 may calculate the driver's degree of fatigue or concentration, or may determine whether the driver is dozing off.
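As a toy illustration of such a determination (the patent does not specify the method), a PERCLOS-style score — the fraction of recent frames in which the driver's eyes are closed — is a common drowsiness proxy. The names and the 30% threshold are assumptions:

```python
def perclos(eye_closed_flags):
    """Fraction of frames in a window with eyes closed (PERCLOS-style)."""
    return sum(eye_closed_flags) / len(eye_closed_flags)

def driver_state(eye_closed_flags, threshold=0.3):
    """Classify the driver from the per-frame eye-closure flags."""
    return "drowsy" if perclos(eye_closed_flags) >= threshold else "alert"

alert = driver_state([True] + [False] * 9)        # eyes closed 10% of frames
drowsy = driver_state([True] * 4 + [False] * 6)   # eyes closed 40% of frames
```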
  • The microcomputer 12051 can calculate a control target value for the driving force generation device, the steering mechanism, or the braking device based on the information inside and outside the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040, and can output a control command to the drive system control unit 12010.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of realizing ADAS (Advanced Driver Assistance System) functions, including vehicle collision avoidance or impact mitigation, follow-up traveling based on the inter-vehicle distance, constant-speed traveling, vehicle collision warning, vehicle lane departure warning, and the like.
  • The microcomputer 12051 can also perform cooperative control for the purpose of automated driving, in which the vehicle travels autonomously without depending on the driver's operation, by controlling the driving force generation device, the steering mechanism, the braking device, and the like based on the information around the vehicle acquired by the vehicle exterior information detection unit 12030 or the vehicle interior information detection unit 12040.
  • the microcomputer 12051 can output a control command to the body system control unit 12020 based on information outside the vehicle acquired by the vehicle outside information detection unit 12030.
  • For example, the microcomputer 12051 can perform cooperative control for the purpose of anti-glare, such as controlling the headlamp according to the position of a preceding vehicle or an oncoming vehicle detected by the vehicle exterior information detection unit 12030 and switching from high beam to low beam.
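The anti-glare decision above reduces to a simple rule; this minimal sketch uses assumed names and data shapes, and the patent does not prescribe this logic:

```python
def beam_command(oncoming_vehicles, preceding_vehicles):
    """Return the headlamp mode for anti-glare cooperative control.

    Each argument is a list of detected vehicle positions; the decision
    only needs to know whether either list is non-empty.
    """
    if oncoming_vehicles or preceding_vehicles:
        return "low"   # avoid dazzling detected vehicles
    return "high"

mode_with_traffic = beam_command([(12.0, -1.5)], [])  # one oncoming vehicle
mode_empty_road = beam_command([], [])
```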
  • the sound image output unit 12052 transmits an output signal of at least one of sound and image to an output device capable of visually or audibly notifying information to a vehicle occupant or the outside of the vehicle.
  • an audio speaker 12061, a display unit 12062, and an instrument panel 12063 are illustrated as output devices.
  • the display unit 12062 may include at least one of an on-board display and a head-up display, for example.
  • FIG. 11 is a diagram illustrating an example of an installation position of the imaging unit 12031.
  • the vehicle 12100 includes imaging units 12101, 12102, 12103, 12104, and 12105 as the imaging unit 12031.
  • the imaging units 12101, 12102, 12103, 12104, and 12105 are provided, for example, at positions such as a front nose, a side mirror, a rear bumper, a back door, and an upper part of a windshield in the vehicle interior of the vehicle 12100.
  • the imaging unit 12101 provided in the front nose and the imaging unit 12105 provided in the upper part of the windshield in the vehicle interior mainly acquire an image in front of the vehicle 12100.
  • the imaging units 12102 and 12103 provided in the side mirror mainly acquire an image of the side of the vehicle 12100.
  • the imaging unit 12104 provided in the rear bumper or the back door mainly acquires an image behind the vehicle 12100.
  • the forward images acquired by the imaging units 12101 and 12105 are mainly used for detecting a preceding vehicle or a pedestrian, an obstacle, a traffic light, a traffic sign, a lane, or the like.
  • FIG. 11 shows an example of the shooting range of the imaging units 12101 to 12104.
  • The imaging range 12111 indicates the imaging range of the imaging unit 12101 provided on the front nose, the imaging ranges 12112 and 12113 indicate the imaging ranges of the imaging units 12102 and 12103 provided on the side mirrors, respectively, and the imaging range 12114 indicates the imaging range of the imaging unit 12104 provided on the rear bumper or the back door. For example, by superimposing the image data captured by the imaging units 12101 to 12104, an overhead image of the vehicle 12100 as viewed from above is obtained.
  • At least one of the imaging units 12101 to 12104 may have a function of acquiring distance information.
  • at least one of the imaging units 12101 to 12104 may be a stereo camera including a plurality of imaging elements, or may be an imaging element having pixels for phase difference detection.
  • Based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 obtains the distance to each three-dimensional object within the imaging ranges 12111 to 12114 and the temporal change of this distance (the relative speed with respect to the vehicle 12100), and can thereby extract, as a preceding vehicle, the nearest three-dimensional object that is on the traveling path of the vehicle 12100 and traveling in substantially the same direction as the vehicle 12100 at a predetermined speed (for example, 0 km/h or higher).
  • Furthermore, the microcomputer 12051 can set in advance the inter-vehicle distance to be maintained from the preceding vehicle, and can perform automatic brake control (including follow-up stop control), automatic acceleration control (including follow-up start control), and the like.
  • In this way, cooperative control can be performed for the purpose of automated driving or the like, in which the vehicle travels autonomously without depending on the driver's operation.
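The distance-keeping behavior above can be sketched as a toy decision rule over the distance to the preceding vehicle and the relative speed. The threshold, command names, and signatures are illustrative assumptions, not the patent's control law:

```python
def acc_command(distance_m, rel_speed_mps, target_gap_m=30.0):
    """Toy inter-vehicle distance keeping: decide from the distance to the
    preceding vehicle and the relative speed (negative = gap shrinking)."""
    closing = rel_speed_mps < 0
    if distance_m < target_gap_m and closing:
        return "brake"       # automatic brake control
    if distance_m > target_gap_m and not closing:
        return "accelerate"  # automatic acceleration control
    return "hold"

commands = [acc_command(20.0, -2.0),   # too close and closing
            acc_command(50.0, 1.0),    # far and opening
            acc_command(30.0, 0.0)]    # at the target gap
```

A real controller would output continuous acceleration targets rather than discrete commands, but the inputs — distance and its temporal change — are exactly those named in the text.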
  • For example, based on the distance information obtained from the imaging units 12101 to 12104, the microcomputer 12051 can classify and extract three-dimensional object data concerning three-dimensional objects into two-wheeled vehicles, ordinary vehicles, large vehicles, pedestrians, utility poles, and other three-dimensional objects, and use the data for automatic avoidance of obstacles.
  • The microcomputer 12051 distinguishes obstacles around the vehicle 12100 into obstacles that are visible to the driver of the vehicle 12100 and obstacles that are difficult for the driver to see.
  • The microcomputer 12051 then determines a collision risk indicating the degree of danger of collision with each obstacle, and when the collision risk is equal to or higher than a set value and there is a possibility of collision, it can provide driving assistance for collision avoidance by outputting a warning to the driver via the audio speaker 12061 or the display unit 12062, and by performing forced deceleration or avoidance steering via the drive system control unit 12010.
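The patent does not define the risk metric; time-to-collision (TTC) is one common choice and serves as a minimal sketch here. The 2 s threshold and the assumption of constant speeds are illustrative:

```python
def collision_risk_high(distance_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Flag a possible collision using time-to-collision (TTC).

    TTC = distance / closing speed, assuming both speeds stay constant.
    """
    if closing_speed_mps <= 0:   # not closing: no imminent collision
        return False
    return distance_m / closing_speed_mps < ttc_threshold_s

risks = [collision_risk_high(10.0, 10.0),   # 1 s to impact -> warn
         collision_risk_high(50.0, 10.0)]   # 5 s to impact -> no warning
```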
  • At least one of the imaging units 12101 to 12104 may be an infrared camera that detects infrared rays.
  • The microcomputer 12051 can recognize a pedestrian by determining whether a pedestrian is present in the captured images of the imaging units 12101 to 12104. Such pedestrian recognition is carried out, for example, by a procedure for extracting feature points in the captured images of the imaging units 12101 to 12104 as infrared cameras, and by a procedure for performing pattern matching processing on a series of feature points indicating the contour of an object to determine whether or not the object is a pedestrian.
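The two-step procedure — feature extraction, then pattern matching against a contour template — can be caricatured as a point-set overlap test. Everything here (names, point format, the 0.8 threshold) is an illustrative assumption; real matchers use far richer descriptors:

```python
def match_score(candidate_points, template_points):
    """Fraction of template contour points found among the extracted
    feature points (a deliberately crude stand-in for pattern matching)."""
    hits = sum(1 for p in template_points if p in candidate_points)
    return hits / len(template_points)

def is_pedestrian(feature_points, template_points, threshold=0.8):
    return match_score(set(feature_points), template_points) >= threshold

template = [(0, 0), (0, 1), (0, 2), (1, 3), (1, 4)]   # toy contour template
detected = [(0, 0), (0, 1), (0, 2), (1, 3), (9, 9)]   # 4 of 5 points match
```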
  • When the microcomputer 12051 determines that a pedestrian is present in the captured images of the imaging units 12101 to 12104 and recognizes the pedestrian, the audio image output unit 12052 controls the display unit 12062 so that a rectangular contour line for emphasis is superimposed on the recognized pedestrian.
  • The audio image output unit 12052 may also control the display unit 12062 so that an icon or the like indicating a pedestrian is displayed at a desired position.
  • the technology according to the present disclosure can be applied to the imaging unit 12031 and the like among the configurations described above.
  • As described above, according to the embodiment of the present disclosure, a sensor module 100 is provided that, when signal processing on the signal output by the imaging element and detection on that signal are executed inside the image sensor, can execute the detection processing at high speed.
  • When the sensor module 100 according to the embodiment of the present disclosure internally performs signal processing on the pixel signal output from the imaging element and detection on that pixel signal, it operates so that the detection processing is completed before the signal processing is started. By completing the detection processing before the start of the signal processing, the sensor module 100 can use the detection information to control the signal processing on the pixel signal of the same frame as that detection information.
  • The sensor module 100 operates the signal processing on the pixel signal output from the imaging element and the detection on that pixel signal independently of each other.
  • By operating the signal processing and the detection processing independently, the sensor module 100 according to the embodiment of the present disclosure can run only the detection processing until the detection information satisfies a predetermined condition, and then run the signal processing in addition to the detection processing once the condition is satisfied. In other words, the sensor module 100 according to the embodiment of the present disclosure can run the detection processing alone, in a power-saving manner.
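The power-saving gating described above amounts to a small state machine: cheap detection runs on every frame, and the expensive signal-processing chain is enabled only once the detection information meets the condition. The class, threshold, and the doubling stand-in "DSP" below are illustrative assumptions, not the module's actual design:

```python
class SensorModule:
    """Toy model of running only detection until a condition is met."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.signal_processing_on = False
        self.processed = []

    def on_frame(self, detection_value, pixels):
        # Detection processing: cheap, always on.
        if not self.signal_processing_on and detection_value >= self.threshold:
            self.signal_processing_on = True
        # Signal processing: runs only after the condition was satisfied.
        if self.signal_processing_on:
            self.processed.append([p * 2 for p in pixels])  # stand-in DSP

module = SensorModule(threshold=10)
for value, pixels in [(3, [1]), (5, [2]), (12, [3]), (4, [4])]:
    module.on_frame(value, pixels)
```

Only the third and fourth frames reach the signal-processing stage; the first two cost only the detection pass.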
  • Each step in the processing executed by each device in this specification does not necessarily have to be performed in chronological order in the sequence described in the sequence diagrams or flowcharts. Each step in the processing executed by each device may be performed in an order different from that described in the flowcharts, or in parallel.
  • (1) An image processing apparatus comprising: a storage unit that stores a pixel signal output from an imaging element; a signal processing unit that performs signal processing on the pixel signal stored in the storage unit; and a detection unit that completes detection processing on the pixel signal of the same frame before completion of the signal processing by the signal processing unit.
  • (2) The image processing apparatus according to (1), wherein the signal processing unit starts signal processing on the pixel signal of the same frame after the detection processing in the detection unit is completed.
  • (3) The image processing apparatus according to (1) or (2), further comprising an analysis unit that analyzes a result of the detection processing in the detection unit.
  • (7) The image processing apparatus according to any one of (1) to (6), wherein the detection processing in the detection unit and the signal processing in the signal processing unit are performed in a time-division manner within one frame.
  • (8) The image processing apparatus according to any one of (1) to (7), wherein the signal processing performed on the pixel signal by the signal processing unit is at least one of automatic white balance processing, automatic exposure processing, distortion correction processing, defect correction processing, noise reduction processing, and high dynamic range synthesis processing.
  • (9) The image processing apparatus according to any one of (1) to (8), configured by stacking three semiconductor substrates: a first semiconductor substrate, a second semiconductor substrate, and a third semiconductor substrate, wherein at least the imaging element is formed on the first semiconductor substrate, at least the storage unit is formed on the second semiconductor substrate, and at least the signal processing unit and the detection unit are formed on the third semiconductor substrate.
  • (10) The image processing apparatus according to (9), wherein the second semiconductor substrate is provided between the first semiconductor substrate and the third semiconductor substrate.
  • (11) The image processing apparatus according to (9), wherein the third semiconductor substrate is provided between the first semiconductor substrate and the second semiconductor substrate.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)
  • Studio Devices (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

[Problem] To provide an image processing device that, when signal processing of a signal output by an imaging element and detection of that signal are executed inside an image sensor, executes the detection processing at high speed. [Solution] Provided is an image processing device including: a storage unit that stores a pixel signal output from an imaging element; a signal processing unit that performs signal processing on the pixel signal stored in the storage unit; and a detection unit that completes detection processing on the pixel signal of the same frame before completion of the signal processing by the signal processing unit.
PCT/JP2017/006042 2016-04-05 2017-02-20 Image processing device, image processing method, computer program and electronic device WO2017175492A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016075565A JP2017188760A (ja) 2016-04-05 Image processing device, image processing method, computer program, and electronic device
JP2016-075565 2016-04-05

Publications (1)

Publication Number Publication Date
WO2017175492A1 true WO2017175492A1 (fr) 2017-10-12

Family

ID=60000746

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/006042 WO2017175492A1 (fr) 2016-04-05 2017-02-20 Image processing device, image processing method, computer program and electronic device

Country Status (2)

Country Link
JP (1) JP2017188760A (fr)
WO (1) WO2017175492A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113287295A (zh) * 2018-12-26 2021-08-20 FUJIFILM Corporation Imaging element, imaging device, operating method of imaging element, and program
CN113661700A (zh) * 2019-05-10 2021-11-16 Sony Semiconductor Solutions Corporation Imaging device and imaging method

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6913830B2 (ja) 2018-08-31 2021-08-04 FUJIFILM Corporation Imaging element, imaging device, image data processing method, and program
CN112740662A (zh) 2018-09-27 2021-04-30 Imaging element, imaging device, image data output method, and program
WO2020066187A1 (fr) 2018-09-27 2020-04-02 Imaging element, imaging device, image data processing method, and program
WO2020137664A1 (fr) * 2018-12-26 2020-07-02 Imaging element, imaging device, operating method of imaging element, and program
JP7023412B2 (ja) 2019-04-26 2022-02-21 FUJIFILM Corporation Imaging element, imaging device, operating method of imaging element, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004120205A (ja) * 2002-09-25 2004-04-15 Sony Corp Imaging device, image output method for imaging device, and computer program
JP2006208626A (ja) * 2005-01-27 2006-08-10 Sony Corp Autofocus control device, autofocus control method, and imaging device
JP2015126043A (ja) * 2013-12-26 2015-07-06 Sony Corporation Electronic device


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113287295A (zh) * 2018-12-26 2021-08-20 FUJIFILM Corporation Imaging element, imaging device, operating method of imaging element, and program
CN113287295B (zh) * 2018-12-26 2023-07-25 FUJIFILM Corporation Imaging element, imaging device, operating method of imaging element, and storage medium
CN113661700A (zh) * 2019-05-10 2021-11-16 Sony Semiconductor Solutions Corporation Imaging device and imaging method

Also Published As

Publication number Publication date
JP2017188760A (ja) 2017-10-12

Similar Documents

Publication Publication Date Title
JP7105754B2 (ja) Imaging device and control method for imaging device
WO2019150786A1 (fr) Solid-state imaging element, imaging device, and control method for solid-state imaging element
WO2017175492A1 (fr) Image processing device, image processing method, computer program, and electronic device
US10880509B2 (en) Solid-state imaging element, electronic device, and method for correcting uneven luminance
US11553117B2 (en) Image pickup control apparatus, image pickup apparatus, control method for image pickup control apparatus, and non-transitory computer readable medium
WO2017169233A1 (fr) Image processing device, image processing method, computer program, and electronic device
WO2018110002A1 (fr) Imaging device and control method for imaging device
KR102493027B1 (ko) Imaging element, driving method thereof, and electronic device
WO2017169274A1 (fr) Imaging control device, imaging control method, computer program, and electronic equipment
WO2017149964A1 (fr) Image processing device, image processing method, computer program, and electronic device
WO2021060118A1 (fr) Imaging device
WO2020246264A1 (fr) Distance measurement sensor, signal processing method, and distance measurement module
WO2017212722A1 (fr) Control apparatus and control method
WO2021065500A1 (fr) Distance measurement sensor, signal processing method, and distance measurement module
WO2020021826A1 (fr) Solid-state imaging element, imaging device, and control method for solid-state imaging element
CN113661700A (zh) Imaging device and imaging method
WO2018220993A1 (fr) Signal processing device, signal processing method, and computer program
TWI842952B (zh) Imaging device
WO2020166284A1 (fr) Image capturing device
WO2020137503A1 (fr) Image processing device
WO2018207665A1 (fr) Solid-state imaging device, control method, and electronic device
WO2020100399A1 (fr) Solid-state imaging element, imaging device, and control method for solid-state imaging element

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17778871

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17778871

Country of ref document: EP

Kind code of ref document: A1