WO2019163385A1 - Image processing device, image processing method, and program - Google Patents

Image processing device, image processing method, and program

Info

Publication number
WO2019163385A1
WO2019163385A1 (PCT/JP2019/002199)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
image processing
display unit
processing apparatus
Prior art date
Application number
PCT/JP2019/002199
Other languages
English (en)
Japanese (ja)
Inventor
正俊 石井
Original Assignee
キヤノン株式会社
Priority date
Filing date
Publication date
Priority claimed from JP2018210844A (JP7391502B2)
Application filed by キヤノン株式会社 (Canon Inc.)
Publication of WO2019163385A1
Priority to US16/940,655 (US20200358964A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to an image processing technique for assisting imaging by an imaging apparatus.
  • Patent Document 1 discloses a technique of superimposing, at the time of imaging, a marker indicating a region that is supposed to be displayed on a display device onto an image obtained by the imaging.
  • In Patent Document 1, however, the region that is supposed to be displayed on the display device is merely indicated by a marker.
  • In some cases, such a marker may not indicate the display area correctly.
  • That is, a marker indicating the region where an image is displayed on the display device cannot always be generated appropriately.
  • An object of the present invention is therefore to provide information for generating a marker suited to a display device that displays part of an image obtained by imaging.
  • According to the present invention, there is provided an image processing apparatus for notifying a range to be displayed in an input image by an apparatus or system including a display unit that displays an image, the image processing apparatus comprising: first acquisition means for acquiring information representing the display form of the apparatus or system including the display unit; second acquisition means for acquiring input image data representing the input image; specifying means for specifying, based on the input image data and the information, the range to be displayed by the display unit in the input image; and output means for outputting information representing the specified range.
  • The shape of the specified range is a shape corresponding to the display unit, which is at least one of a curved screen and a plurality of flat screens.
  • Brief description of the drawings: a diagram showing a configuration example of a display system; a flowchart showing the flow of processing for generating an image with a marker; diagrams showing a three-dimensional spatial coordinate system; a diagram showing a two-dimensional UV coordinate system; diagrams showing examples of images with markers; a flowchart showing the flow of processing performed in the image processing apparatus; a flowchart showing the flow of processing for generating an image with a marker; further diagrams showing examples of images with markers; and a diagram showing a configuration example of a display system.
  • FIG. 15A shows a captured image 1501 including a subject 1502 to be displayed on the screen of the display system described above.
  • Although the display ranges 1503 to 1505 might be expected to be rectangles of the same size, the display range 1503 is rectangular whereas the display ranges 1504 and 1505 are trapezoidal, reflecting the arrangement of the screens.
  • The shapes of the display ranges 1503 to 1505 also change depending on the projection method of the lens used when the captured image 1501 was captured.
  • The display ranges shown in FIG. 15B correspond to a typical central-projection lens; when an equidistant-projection fisheye lens is used, the shape of each display range is slightly rounded compared with the central-projection case, as shown for example in FIG. 8B.
  • FIG. 15C shows an example in which the screen display ranges 1503 to 1505 shown in FIG. 15B are superimposed on the captured image 1501 shown in FIG. 15A.
  • In FIG. 15C, the display ranges of the screens do not include the entire subject 1502 that should be displayed.
  • The situation in which the entire subject does not fit within the display range of the screens and is partially cut off is referred to here as being "out of sight".
  • Generally, a photographer takes an image while checking a display unit, such as an electronic viewfinder (EVF) or a monitor, of the imaging apparatus, and can therefore confirm the range currently being captured.
  • However, the photographer does not know which range of the captured image 1501 will be displayed on the screen.
  • Such cutting off occurs because the photographer does not recognize, at the time of imaging, which range of the captured image 1501 will be displayed on the screen. Therefore, in the present embodiment, the photographer is notified at the time of imaging of which range of the image obtained by the imaging will be displayed on the display device. Specifically, a marker corresponding to the display form of the display device is generated at the time of imaging, and the generated marker is displayed superimposed on the image. The photographer can thereby know the range that the display device can display in the image obtained by the imaging.
  • In the following, a display device or display system that displays part of a captured image after imaging is referred to as the first display device or the first display system.
  • A device that displays the marker-superimposed image at the time of imaging is referred to as the second display device.
  • When the display unit included in the imaging device is used as the second display device, that display unit is referred to as the second display unit.
  • The image processing apparatus 1 is, for example, a computer, and includes a CPU 101, a RAM 102, a ROM 103, an HDD interface (I/F) 104, an input I/F 106, an output I/F 108, and an imaging device I/F 110.
  • The CPU 101 uses the RAM 102 as a work memory, executes programs stored in the ROM 103 and the hard disk drive (HDD) 105, and controls each component via the system bus 100.
  • The HDD I/F 104 is an interface such as Serial ATA (SATA).
  • A secondary storage device such as the HDD 105 or an optical disc drive is connected to the HDD I/F 104.
  • The CPU 101 can read data from and write data to the HDD 105 via the HDD I/F 104. The CPU 101 can also load data stored in the HDD 105 into the RAM 102, store data loaded in the RAM 102 into the HDD 105, and execute data loaded in the RAM 102 as a program.
  • The input I/F 106 is a serial bus interface such as USB or IEEE 1394.
  • An input device 107 such as a keyboard or a mouse is connected to the input I/F 106. The CPU 101 can read data from the input device 107 via the input I/F 106.
  • The output I/F 108 is a video output interface such as DVI or HDMI (registered trademark).
  • An output device 109 such as a liquid crystal display is connected to the output I/F 108.
  • The output device 109 corresponds to the second display device or the second display unit described above.
  • The CPU 101 can send data to the output device 109 via the output I/F 108 to execute processing such as display.
  • The imaging device I/F 110 is a serial bus interface such as USB.
  • An imaging device 111 such as a video camera is connected to the imaging device I/F 110.
  • The CPU 101 can acquire imaging data, such as moving-image frame data, from the imaging device 111 via the imaging device I/F 110.
  • The image processing apparatus 1 does not have to include the imaging device I/F 110; in that case, the imaging device is connected to the input I/F 106 instead.
  • An imaging device in which the imaging device 111 and the output device 109 are integrated may also be connected to the imaging device I/F 110.
  • For example, a video camera having a display unit such as an EVF or a monitor can be used as the imaging device 111; the CPU 101 can then send data to that display unit via the imaging device I/F 110 to execute display.
  • The image processing apparatus 1 may be included in the output device 109 or the imaging device 111.
  • A block diagram showing a configuration example of an imaging system 1600 is shown in FIG. 16.
  • The imaging system 1600 is, for example, a digital camera, and includes a CPU 101, a RAM 102, a ROM 103, an HDD interface (I/F) 104, an input unit 1601, a display unit 1602, and an imaging unit 1603.
  • The input unit 1601 is an input unit such as buttons.
  • The display unit 1602 is a display unit such as an EVF or a monitor.
  • The imaging unit 1603 includes an optical system such as a lens and generates an image via the optical system.
  • The imaging system 1600 does not have to include the input unit 1601 and the display unit 1602 separately, and may instead include a touch-panel display in which the two are integrated.
  • The imaging system is not limited to a digital camera, and may be a portable information terminal such as a smartphone.
  • FIG. 2 is a block diagram illustrating a functional configuration of the image processing apparatus 1.
  • The CPU 101 implements the functional configuration shown in FIG. 2 by reading a program stored in the ROM 103 or the HDD 105 and executing it with the RAM 102 as a work area. Note that the CPU 101 need not execute all of the processing described below; the image processing apparatus 1 may be configured so that part or all of the processing is performed by one or more processing circuits other than the CPU 101.
  • The image processing apparatus 1 includes a display form acquisition unit 201, an input image acquisition unit 202, a viewpoint information acquisition unit 203, an imaging condition acquisition unit 204, an output image generation unit 205, and an output unit 206.
  • The display form acquisition unit 201 acquires display form information representing the display form of the first display system that displays an image.
  • The display form information of the first display system consists of the number of screens included in the first display system, the size of each screen, the resolution of each screen, and arrangement information indicating the position and orientation of each screen.
  • The input image acquisition unit 202 acquires input image data representing the input image.
  • The input image is the image on which the marker is to be superimposed and displayed.
  • The viewpoint information acquisition unit 203 acquires viewpoint information representing the position of the viewpoint from which an image displayed by the first display system is observed.
  • The imaging condition acquisition unit 204 acquires imaging information representing the imaging conditions.
  • The imaging information includes the sensor size of the imaging device 111, the focal length of the lens, the angle of view, the projection method, and the resolution of the input image.
  • The output image generation unit 205 generates marker-added image data representing a marker-added image in which a marker is superimposed on the input image.
  • An image with a marker is also called an output image, and image data with a marker is also called output image data.
  • The output unit 206 outputs the marker-added image data to the output device 109.
  • FIG. 3 is a flowchart showing a flow of processing performed in the image processing apparatus 1.
  • In the following description, each step (process) is denoted by prefixing its numeral with S.
  • In S301, the display form acquisition unit 201 acquires display form information representing the display form of the first display system that displays an image.
  • The processing in S301 is performed based on a user instruction given via the input device 107.
  • Specifically, the display form information is acquired by selecting, based on the user instruction, one of a plurality of pieces of display form information stored in advance in the HDD 105.
  • The first display system in the present embodiment is the display system shown in FIG. 4.
  • The first display system includes three screens and three projectors; a left-side screen 402 and a right-side screen 403 are arranged at an opening angle with respect to a center screen 401. Each screen has width W_mm and height H_mm, and the three screens are the same size.
  • The screen arrangement information represents the position (x, y, z) of the center of each screen in a three-dimensional XYZ coordinate system and a normal vector N representing the direction of the normal to the screen surface.
  • This normal is the normal of the surface on the side where the viewpoint for observing the screen is located.
  • The origin of the XYZ coordinate system is the viewpoint position represented by the viewpoint information.
  • The screen resolution mentioned above may be the resolution of the projector that projects an image onto each screen, instead of the resolution of the image on the screen.
  • In S302, the imaging condition acquisition unit 204 acquires imaging information representing the imaging conditions.
  • The processing in S302 is performed based on a user instruction given via the input device 107.
  • Specifically, the imaging information is acquired by selecting, for each item, one of a plurality of imaging conditions stored in advance in the HDD 105 based on the user instruction.
  • In the present embodiment, the sensor size of the imaging device 111 is width SW_mm and height SH_mm, the focal length of the lens is f, the angle of view is θ_max, and the resolution of the input image is SW_pix × SH_pix.
  • The lens of the imaging device 111 in this embodiment is an equidistant-projection fisheye lens, so the projection method is equidistant projection.
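  • As a concrete illustration (not part of the application), the display form information and imaging information described above can be held in simple data structures. The following Python sketch is illustrative only; all field names are assumptions.

      from dataclasses import dataclass
      from typing import List, Tuple

      @dataclass
      class ScreenInfo:
          center_mm: Tuple[float, float, float]  # screen-center position (x, y, z), viewpoint-origin XYZ system
          normal: Tuple[float, float, float]     # normal vector N of the surface facing the viewpoint
          width_mm: float                        # W_mm
          height_mm: float                       # H_mm
          resolution: Tuple[int, int]            # (W_pix, H_pix)

      @dataclass
      class DisplayFormInfo:
          screens: List[ScreenInfo]              # three flat screens in this embodiment

      @dataclass
      class ImagingInfo:
          sensor_mm: Tuple[float, float]         # (SW_mm, SH_mm)
          focal_length_mm: float                 # f
          theta_max: float                       # maximum incident angle of the lens, in radians
          projection: str                        # "equidistant" for the fisheye lens assumed here
          resolution: Tuple[int, int]            # (SW_pix, SH_pix)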
  • In S303, the viewpoint information acquisition unit 203 acquires viewpoint information representing the position of the viewpoint from which an image displayed by the first display system is observed.
  • The processing in S303 is performed based on a user instruction given via the input device 107.
  • Specifically, the viewpoint information is acquired by selecting, based on the user instruction, one of a plurality of pieces of viewpoint information representing viewpoint positions stored in advance in the HDD 105.
  • In the present embodiment, the viewpoint information represents the position (0, 0, 0) of the viewpoint in the XYZ coordinate system described above.
  • The viewpoint position is not limited to this example.
  • In S304, the input image acquisition unit 202 acquires input image data obtained via the imaging device 111. Specifically, the input image data is loaded into the RAM 102 via the imaging device I/F 110. In the present embodiment, since the imaging device 111 is a video camera, the subsequent processing is performed using the image data corresponding to each frame of the moving image as the input image data.
  • In S305, the output image generation unit 205 generates marker-added image data representing a marker-added image in which a marker representing the range cut out by the cutout processing, that is, the range displayed by the first display system, is superimposed on the input image.
  • The marker-added image data is generated by superimposing marker image data generated in this step on the input image data acquired in S304. Details of the processing in this step are described later.
  • In S306, the output unit 206 outputs the marker-added image data generated in S305 to the output device 109 via the output I/F 108.
  • FIG. 5 is a flowchart showing the details of the processing for generating the marker-added image data in S305.
  • In S305, in order to display images on the three screens constituting the first display system shown in FIG. 4, the cutout range corresponding to each screen is calculated in the input image in turn.
  • The marker-added image data is generated by superimposing the calculated cutout ranges, as marker image data, on the input image data.
  • A display image cut out from the input image is displayed on each screen.
  • In S501, the output image generation unit 205 sets, as the processing target, a screen that has not yet undergone the processing of S502 to S509.
  • In the present embodiment, the three screens shown in FIG. 4 are set as the processing target one after another in this step.
  • In S502, the output image generation unit 205 generates, in the RAM 102, marker image data having the same resolution as the input image data based on the resolution of the input image, and initializes all pixel values to white.
  • The marker image data in this embodiment is binary data in which each pixel value takes either 0 (white) or 1 (black).
  • Alternatively, multi-valued data in which each pixel value is expressed by 8 bits or 16 bits may be used.
  • In S503, the output image generation unit 205 specifies, at predetermined intervals, positions corresponding to the edge of the image display area on the screen based on the screen size and the screen arrangement information, and calculates the three-dimensional coordinates of each such position P.
  • The output image generation unit 205 generates the three-dimensional coordinates (x, y, z) of each position P as point cloud data.
  • The three-dimensional coordinates used here take the viewpoint position from which the screen is observed as their origin.
  • The more points there are, the longer the processing takes; the interval between the points may therefore be set in advance according to the required marker shape accuracy and the allowable processing time.
  • The interval between the points P in this embodiment is determined based on the screen resolution.
  • Specifically, given that the screen size is width W_mm and height H_mm and the screen resolution is W_pix × H_pix, the three-dimensional coordinates of the center point of each pixel on the screen are calculated from these values.
  • All the three-dimensional coordinates of the pixels corresponding to the edge of the image display area are then set as the point cloud data to be processed.
  • FIG. 17 is a diagram for explaining the position of the edge of the image display area on the screen.
  • FIG. 17A is a view of the three screens shown in FIG. 4 observed from the viewpoint position, with the center screen 401 in front.
  • When the entire screen is used as the image display area, the position of the edge of the image display area on the screen is as indicated by a thick line 1701 in FIG. 17B.
  • However, the entire screen does not necessarily have to be the image display area.
  • As indicated by a thick line 1702 in FIG. 17C, the edge of the image display area may lie inside the edge of the screen.
  • When an image is displayed as shown in FIG. 17C, the output image generation unit 205 generates the point cloud data based on the screen arrangement information and information from which the size of the image display area on the screen can be calculated; as such information, for example, the size of the image display area may additionally be acquired as display form information. A sketch of this point cloud generation follows.
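  • The following sketch illustrates S503 for one flat screen under the assumption that the image display area covers the entire screen. The in-plane axes derived from the normal vector are an assumption for illustration, since the arrangement information given here does not specify the screen's rotation about its normal.

      import numpy as np

      def edge_point_cloud(screen: ScreenInfo) -> np.ndarray:
          # Return the 3D coordinates (viewpoint-origin system) of the pixel
          # centers along the edge of the screen's image display area.
          w_pix, h_pix = screen.resolution
          n = np.asarray(screen.normal, dtype=float)
          n /= np.linalg.norm(n)
          # Orthonormal in-plane axes; assumes the screen normal is not vertical.
          right = np.cross((0.0, 1.0, 0.0), n)
          right /= np.linalg.norm(right)
          up = np.cross(n, right)
          center = np.asarray(screen.center_mm, dtype=float)
          # Pixel indices on the border of the display area (here the whole screen).
          border = [(i, j) for i in range(w_pix) for j in (0, h_pix - 1)]
          border += [(i, j) for i in (0, w_pix - 1) for j in range(1, h_pix - 1)]
          points = []
          for i, j in border:
              du = ((i + 0.5) / w_pix - 0.5) * screen.width_mm   # offset from screen center, mm
              dv = ((j + 0.5) / h_pix - 0.5) * screen.height_mm
              points.append(center + du * right + dv * up)
          return np.array(points)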
  • In S504, the output image generation unit 205 extracts one point P(x, y, z) from the point cloud data and, taking the viewpoint position as the origin O, calculates the angle θ between the vector OP and the Z axis for the extracted point P using equation (1).
  • FIG. 6A is a diagram showing the positional relationship between the three screens in three-dimensional coordinates with the viewpoint position as the origin.
  • The point P(x, y, z) indicates the three-dimensional coordinates of the point on the screen being processed.
  • FIG. 6B is a view of FIG. 6A from another angle.
  • The angle between the vector OP and the Z axis is defined as θ.
  • The foot of the perpendicular dropped from the point P onto the XY plane is defined as the point Q(x, y, 0), and the angle formed by the vector OQ and the X axis is defined as φ.
  • In S505, the output image generation unit 205 sets the point in the input image corresponding to the point P(x, y, z) as I(u, v), and calculates the image height r at the point I on the input image using equation (2).
  • Under equidistant projection, the image height r can be expressed as the ratio of θ to θ_max.
  • FIG. 7 is a diagram showing the point I(u, v) of the input image in the two-dimensional UV coordinate system. The coordinates are normalized such that the center of the input image is the origin, the lower-left coordinates of the image are (−1, −1), and the upper-right coordinates are (1, 1).
  • The angle φ formed by the vector OI and the U axis is equal to the angle φ shown in FIG. 6B. Since the lens used in the present embodiment is a fisheye lens, the range in which the image actually appears is the area inside the image circle 601 shown in FIG. 7.
  • In S506, the output image generation unit 205 calculates the coordinates (u, v) of the point I on the input image using equations (3) and (4).
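  • The images of equations (1) to (4) are not reproduced in this text. The following sketch reconstructs S504 to S506 from the geometry described above; it is a reconstruction consistent with that description, not the application's verbatim formulas.

      import math

      def project_equidistant(x: float, y: float, z: float, theta_max: float):
          # Map a screen point P(x, y, z), viewed from the origin O, to the
          # normalized UV coordinates I(u, v) of an equidistant fisheye image.
          # S504 / eq. (1): angle theta between the vector OP and the Z axis.
          theta = math.acos(z / math.sqrt(x * x + y * y + z * z))
          # S505 / eq. (2): image height r as the ratio of theta to theta_max.
          r = theta / theta_max
          # Angle phi between the vector OQ and the X axis, with Q = (x, y, 0).
          phi = math.atan2(y, x)
          # S506 / eqs. (3) and (4): image center at the UV origin, image
          # circle of radius 1.
          u = r * math.cos(phi)
          v = r * math.sin(phi)
          return u, v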
  • In S507, the output image generation unit 205 changes, in the marker image data, the pixel value corresponding to the coordinates (u, v) calculated in S506 to black; that is, the pixel value is changed from 0 (white) to 1 (black). Specifically, since u and v are decimal values between −1.0 and 1.0, 1 is added to each of them and the results are divided by 2, normalizing (u, v) to values from 0 to 1.0. The pixel position on the marker image is then obtained by multiplying the normalized u by the width SW_pix of the marker image and the normalized v by its height SH_pix.
  • The pixel value is changed to black for the pixel with the shortest distance among the four pixels neighboring (u, v). If the marker image data is multi-valued, the color of each pixel may instead be determined by weighting the four neighboring pixels according to distance.
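  • A sketch of S507 under the same assumptions; rounding to the nearest integer pixel selects the pixel with the shortest distance among the four neighbors.

      def mark_black(marker, u: float, v: float) -> None:
          # Set to 1 (black) the marker-image pixel nearest to normalized (u, v).
          # marker is an SH_pix x SW_pix array of 0/1 values initialized to 0.
          h_pix, w_pix = len(marker), len(marker[0])
          un = (u + 1.0) / 2.0              # normalize from [-1, 1] to [0, 1]
          vn = (v + 1.0) / 2.0
          px = min(int(round(un * w_pix)), w_pix - 1)   # scale to pixel positions
          py = min(int(round(vn * h_pix)), h_pix - 1)
          marker[py][px] = 1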
  • In S508, the output image generation unit 205 determines whether all points P of the point cloud data corresponding to the screen being processed have been processed. If all points P have been processed, the processing proceeds to S510; otherwise it proceeds to S509.
  • In S509, the output image generation unit 205 updates the point P to a point of the point cloud data that has not yet been processed, and the processing returns to S504.
  • In S510, the output image generation unit 205 determines whether all the screens constituting the first display system have been set as the processing target. In the present embodiment, it is determined whether the three screens, namely the center screen 401, the left-side screen 402, and the right-side screen 403, have been set as the processing target. If all the screens have been processed, the processing proceeds to S511; otherwise it returns to S501.
  • In S511, the output image generation unit 205 superimposes the marker image data on the input image data. Specifically, in the input image, the pixel value at each pixel position corresponding to a black pixel of the marker image is converted into a pixel value representing a preset marker color.
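  • A minimal sketch of S511, assuming NumPy arrays: an H × W × 3 uint8 input image and an H × W binary marker of the same size.

      import numpy as np

      def superimpose(input_image: np.ndarray, marker: np.ndarray,
                      marker_color=(255, 0, 0)) -> np.ndarray:
          # Replace input-image pixels at the marker's black (value 1) pixels
          # with the preset marker color, yielding the image with a marker.
          output = input_image.copy()
          output[marker == 1] = marker_color
          return output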
  • FIG. 8A shows an input image, FIG. 8B shows a marker image, and FIG. 8C shows an image with a marker.
  • In FIG. 8C, a frame indicating the cutout range corresponding to each screen is added as the marker.
  • However, the marker does not have to be a frame indicating the cutout range corresponding to each screen.
  • For example, an area in which the color, brightness, or the like within the cutout range is changed may be used as the marker.
  • In the following example, the color or brightness within the cutout range is changed.
  • FIG. 9A shows an input image, FIG. 9B shows a marker image, and FIG. 9C shows an image with a marker.
  • Alternatively, the cutout range may be indicated by processing each pixel outside the region indicated by the marker, for example by lowering the luminance of each pixel outside the rectangle indicated by the marker.
  • In this case, the color or luminance outside the cutout range is changed by subtracting a predetermined pixel value from each pixel value outside the cutout range.
  • Alternatively, the color or brightness outside the cutout range may be changed by converting each pixel value outside the cutout range into a predetermined pixel value.
  • As described above, the image processing apparatus in the present embodiment acquires information representing the display form of an apparatus or system including a display unit, together with input image data representing an input image, specifies, based on the input image data and the display form information, the range displayed by the display unit in the input image, and outputs information representing the specified range.
  • The shape of the specified range is a shape corresponding to the display unit, which is at least one of a curved screen and a plurality of flat screens.
  • Specifically, a marker image is generated, and an image with a marker is generated by superimposing the generated marker image on the input image.
  • In the present embodiment, a method is described in which a table associating the display form information of the first display system one-to-one with marker images is generated and held in advance, and an image with a marker is generated by referring to this table. Note that the hardware configuration of the system including the image processing apparatus 1 and the functional configuration of the image processing apparatus 1 in the present embodiment are the same as those in the first embodiment, and their description is omitted. The following mainly describes the differences between the present embodiment and the first embodiment.
  • FIG. 10 is a flowchart showing the flow of processing performed in the image processing apparatus 1.
  • In S1001, the display form acquisition unit 201 acquires the display form information. Details of the processing are the same as in S301 of FIG. 3.
  • In S1002, the input image acquisition unit 202 acquires the input image data. Details of the processing are the same as in S304 of FIG. 3.
  • In S1003, the output image generation unit 205 generates marker-added image data representing a marker-added image in which a marker representing the range cut out by the cutout processing, that is, the range displayed by the first display system, is superimposed on the input image. Details of the processing in S1003 are described later.
  • In S1004, the output unit 206 outputs the marker-added image data generated in S1003 to the output device 109 via the output I/F 108.
  • In S1101, the output image generation unit 205 acquires the corresponding marker image data from the marker image data stored in advance in the HDD 105, based on the display form information acquired by the display form acquisition unit 201, and loads it into the RAM 102.
  • Specifically, the marker image data is acquired by referring to a table that holds the correspondence between the display form information of the first display system and marker images. This table holds the correspondence under specific conditions, including the viewpoint position and the angle of view at the time of imaging. To cope with situations where the viewpoint position is switched frequently, a table in which marker images are associated with combinations of the display form information of the first display system and viewpoint positions may be used.
  • Similarly, the table may be switched for each angle of view to cope with changes in the angle of view.
  • The marker image data is generated in advance by the method described in the first embodiment and held in the HDD 105. A sketch of such a table follows.
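  • A minimal sketch of the lookup in S1101, assuming the table is keyed by a display-form identifier combined with a viewpoint position; the keys and file paths are hypothetical.

      # Table built in advance; each entry points to marker image data that was
      # generated beforehand by the first embodiment's method and stored on the HDD.
      marker_table = {
          ("three_flat_screens", (0.0, 0.0, 0.0)): "markers/three_flat.png",
          ("curved_screen", (0.0, 0.0, 0.0)): "markers/curved.png",
      }

      def lookup_marker_image(display_form_id: str, viewpoint: tuple) -> str:
          # Return the path of the pre-generated marker image for this display
          # form and viewpoint; raises KeyError if no matching entry exists.
          return marker_table[(display_form_id, viewpoint)]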
  • In S1102, the output image generation unit 205 superimposes the marker image data on the input image data. Details of the processing are the same as in S511 of FIG. 5.
  • As described above, the image processing apparatus in the present embodiment acquires the marker image data by referring to the table that holds the correspondence between the display form information of the display device and marker images.
  • The marker-added image data is generated by superimposing the acquired marker image data on the input image data.
  • A marker generated by the method described above allows the photographer to recognize the display range of the display device at the time of imaging.
  • Moreover, the marker-added image data corresponding to the display form of the display device can be generated by the relatively fast processing of referring to a table.
  • In the embodiments described above, marker image data is generated, but it does not always have to be. For example, only the coordinates of the frame indicating the cutout range corresponding to each screen may be calculated and held as the marker, and the marker-added image data may be generated by drawing the marker on the input image based on those coordinate values.
  • The timing of generating and displaying the marker is not limited to the time of imaging.
  • For example, the marker may be generated and displayed before imaging to notify the photographer of the display range of the first display system.
  • The generation and display of the marker may be continued during and after imaging, or may be omitted during and after imaging.
  • Alternatively, a marker may be generated using a captured image obtained by imaging as the input image described above and displayed on the second display device. In this case, the photographer can confirm whether the subject to be displayed in the input image is included in the display range of the first display system.
  • In the embodiments described above, the marker is generated at the time of imaging and the image with the marker is displayed on the second display device, but the usage of the image with the marker is not limited to this example.
  • For example, the marker image generated at the time of capture may be associated with the captured image and stored together with it as files. Any method may be used for the association; for example, it can be performed by assigning file names that uniquely identify the correspondence between the captured image and the marker image. With this association, the captured image can be edited after imaging while referring to the marker image; that is, the editing operation can be performed while confirming which area of the captured image is included in the display range of the first display system.
  • Alternatively, coordinate information of the frame indicating the cutout range corresponding to each screen may be stored as the marker in the HDD 105 in association with the captured image. This association can be performed, for example, by storing the coordinate information in a metadata area attached to the captured image, or the coordinate information may be stored in a file separate from the captured image, under a file name that uniquely identifies the correspondence between the captured image and the coordinate information.
  • The coordinate information is not limited to information covering the entire cutout range; only the coordinates of the corners of the frame indicating the cutout range may be stored as the coordinate information.
  • When the cutout range is a simple figure such as a rectangle, the center coordinates of the rectangle and its width and height may be stored as the coordinate information. One way to realize such an association is sketched below.
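  • A sketch of the file-based association described above, using a sidecar file whose name uniquely matches the captured image; the JSON layout is an assumption for illustration, not a format defined by this application.

      import json

      def save_cutout_coordinates(capture_path: str, frames: dict) -> str:
          # Store the corner coordinates of each screen's cutout-range frame in
          # a JSON file named after the captured image, e.g. "clip0001_marker.json".
          # frames maps a screen name to its corner list, e.g.
          # {"center": [[u0, v0], [u1, v1], [u2, v2], [u3, v3]], ...}.
          sidecar = capture_path.rsplit(".", 1)[0] + "_marker.json"
          with open(sidecar, "w", encoding="utf-8") as f:
              json.dump({"capture": capture_path, "frames": frames}, f, indent=2)
          return sidecar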
  • In the embodiments described above, the output image generation unit 205 generates marker image data having the same resolution as the input image data based on the resolution of the input image.
  • However, the generation of the marker image data is not limited to this example.
  • For example, marker image data having a predetermined resolution may be generated, and the resolution may be adjusted by resolution conversion when the marker image data is superimposed on the input image data in S511.
  • In the embodiments described above, the output image generation unit 205 determines the interval between the points P based on the resolution of the screen, but the interval may instead be determined based on the resolution of the input image. It may also be determined whether the interval between the points P is sufficient to form the marker; if it is judged insufficient, the interval may be further reduced. Alternatively, after the marker image data or the marker-added image data has been generated, points may be interpolated in between so that the marker is not interrupted.
  • In the embodiments described above, the imaging device 111 is a video camera, but it may be a still camera.
  • Whether a still image is to be obtained with a video camera or a still camera, the photographer can know the display range of the first display system before taking the image.
  • In the embodiments described above, the three projectors are arranged directly above the viewpoint position.
  • However, the arrangement is not limited to this example, as long as an image can be projected onto the screens.
  • For example, an image may be projected from behind a screen toward the viewpoint position by using a transmissive screen.
  • Alternatively, an image may be projected onto the three screens using a single projector.
  • In the embodiments described above, the first display system is configured by screens and projectors, but it may instead be a display.
  • The first display system may also be configured by a printer and a recording medium.
  • In that case, a marker indicating the print range in the input image may be generated according to the arrangement of the recording medium and the like.
  • In the embodiments described above, the display form acquisition unit 201 acquires, as the display form information, the number of screens included in the first display system, the size of each screen, the resolution of each screen, and the position and orientation of each screen.
  • However, as long as the display form information allows the positions corresponding to the edge of the image display area on each screen to be specified, not all of the above information needs to be acquired. For example, if all the coordinates of the positions corresponding to the screens in three-dimensional space are acquired, the number of screens, the size of each screen, and the resolution of each screen need not be acquired.
  • As the position information indicating the positions corresponding to the edge of the image display area on each screen, the point cloud data described above may be generated and acquired in advance.
  • In the embodiments described above, the imaging condition acquisition unit 204 acquires the sensor size, the focal length of the lens, the angle of view, the projection method, and the resolution of the input image as the imaging information.
  • Since the angle of view can be calculated from the sensor size of the imaging device 111 and the focal length of the lens, it suffices to acquire either the combination of the sensor size and the focal length or the angle of view.
  • The marker is not necessarily a frame indicating a range.
  • For example, linear markers indicating the boundary positions between the screens may be used.
  • FIG. 12A shows an input image, FIG. 12B shows a marker image, and FIG. 12C shows an image with a marker.
  • The markers in this example are two linear markers: a line segment 1201 indicating the boundary position between the center screen 401 and the left-side screen 402, and a line segment 1202 indicating the boundary position between the center screen 401 and the right-side screen 403. By checking the image in which these markers are superimposed on the input image at the time of imaging, it is possible to know which subject is displayed on which screen. The photographer can therefore take images while preventing the main subject from overlapping a screen boundary.
  • FIG. 14 shows examples of images with markers in the case where display on a curved screen as shown in FIG. 13 is considered in addition to the first display system with the three screens shown in FIG. 4.
  • FIG. 14A shows, in addition to the solid-line marker corresponding to the three screens shown in FIG. 8C, a marker corresponding to the curved screen as a broken line 1401.
  • In this case, the display form information of the curved screen is acquired in addition to the display form information of the three screens as the display form information of the first display system.
  • The marker corresponding to the curved screen can then be generated by setting the curved screen as the processing target.
  • FIG. 14B shows an example of marker display in which the intersection (AND) of the cutout range corresponding to the three screens and the cutout range corresponding to the curved screen is colored. Alternatively, at least one of the markers corresponding to the plurality of display forms may be selected and displayed. This makes it possible to display markers that take the display forms of a plurality of display devices into account, and to assist imaging based on a plurality of display environments.
  • In the examples described above, the first display system uses three flat screens or a curved screen arranged so as to surround the viewer.
  • However, the shape of the screen is not limited to these.
  • For example, the same processing can be applied to a spherical screen.
  • The same processing can also be applied to a display or screen that is convex toward the viewer.
  • Likewise, the same processing can be applied to a curved screen having the shape of part of the side surface of a cylinder, as shown in FIG. 19, and to a spherical screen having the shape of part of a sphere.
  • The marker color may be a single color such as black, or may be set, by referring to the pixel values of the input image, to a color different from that of the underlying pixels.
  • The color of the marker may also be determined according to the ambient brightness: for example, a bright color to improve the visibility of the marker in clear weather, a highly conspicuous color in cloudy weather, and a dark color that is not too bright at night or in the dark.
  • Processing that emphasizes the marker by thickening its lines may also be added.
  • The cutout range may also be emphasized by reducing the luminance outside the cutout range.
  • The processing performed to display an image is not limited to cutout processing.
  • The image may be displayed on the first display system after geometric transformation; in this case, the display range of the image depends on the geometric transformation parameters used.
  • The present invention can also be realized by supplying a program that implements one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and having one or more processors in a computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to an image processing device for communicating the range of an input image to be displayed by an apparatus or system including a display unit that displays an image, the image processing device comprising: first acquisition means for acquiring information representing the display form of the apparatus or system including the display unit; second acquisition means for acquiring input image data representing the input image; specifying means for specifying, based on the input image data and the information, the range of the input image to be displayed by the display unit; and output means for outputting information representing the specified range, wherein the shape of the specified range is a shape corresponding to the display unit, which is a curved screen and/or a plurality of flat screens.
PCT/JP2019/002199 2018-02-20 2019-01-24 Image processing device, image processing method, and program WO2019163385A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/940,655 US20200358964A1 (en) 2018-02-20 2020-07-28 Image-processing apparatus, imaging system, image processing method, and medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018-028112 2018-02-20
JP2018028112 2018-02-20
JP2018-210844 2018-11-08
JP2018210844A JP7391502B2 (ja) 2018-02-20 2018-11-08 画像処理装置、画像処理方法及びプログラム (Image processing device, image processing method, and program)

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/940,655 Continuation US20200358964A1 (en) 2018-02-20 2020-07-28 Image-processing apparatus, imaging system, image processing method, and medium

Publications (1)

Publication Number Publication Date
WO2019163385A1 (fr) 2019-08-29

Family

ID=67686805

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/002199 WO2019163385A1 (fr) 2019-08-29 Image processing device, image processing method, and program

Country Status (1)

Country Link
WO (1) WO2019163385A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001184040A * 1999-12-24 2001-07-06 Hitachi Ltd Image data display system and image data generation method
JP2004032076A * 2002-06-21 2004-01-29 Canon Inc Imaging device for a print system
JP2005234698A * 2004-02-17 2005-09-02 Mitsubishi Precision Co Ltd Distortion parameter generation method, video generation method, distortion parameter generation device, and video generation device
JP2012508406A * 2009-09-03 2012-04-05 クゥアルコム・インコーポレイテッド Mobile device with inclinometer
JP2013020063A * 2011-07-11 2013-01-31 Canon Inc Imaging device, image display system, and image display device

Legal Events

Date Code Title Description

121 EP: The EPO has been informed by WIPO that EP was designated in this application (ref document number: 19757824; country of ref document: EP; kind code of ref document: A1)

NENP: Non-entry into the national phase (ref country code: DE)

122 EP: PCT application non-entry in European phase (ref document number: 19757824; country of ref document: EP; kind code of ref document: A1)