WO2024070618A1 - Imaging device, program, and imaging system - Google Patents

Imaging device, program, and imaging system

Info

Publication number
WO2024070618A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2023/032982
Other languages
French (fr)
Japanese (ja)
Inventor
フォレスト マシュー
恭男 湯山
英二 新谷
桐郎 増井
知嗣 南川
Original Assignee
Sony Group Corporation
Application filed by Sony Group Corporation
Publication of WO2024070618A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • H04N23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment

Definitions

  • This technology relates to imaging devices, programs, and imaging systems, and in particular to technology for synthesizing captured images with CG (Computer Graphics) images.
  • a system has been proposed that can synthesize virtual space images created using 3D CG with real space images captured by a camera, and display the images in real time.
  • the aim of this technology is to improve work efficiency in video production, including CG compositing.
  • An imaging device includes an imaging unit that outputs an imaged image, a display control unit that causes a display unit to display a composite image generated by combining a CG image with the imaged image based on spatial information of an imaging space and CG parameters related to CG synthesis, and an association unit that associates the imaged image with the CG parameters.
  • Since the imaging device associates the captured image with the CG parameters, it is possible to generate a CG image based on the CG parameters even at a later time.
  • FIG. 1 is a diagram illustrating an imaging system according to an embodiment.
  • FIGS. 2A to 2C are diagrams illustrating a captured image, a CG image, and a composite image.
  • FIG. 3 is a diagram illustrating the external configuration of the imaging device.
  • FIG. 4 is a diagram illustrating the functional configuration of the imaging device.
  • FIG. 5 is a diagram illustrating the configuration of a computer.
  • FIG. 6 is a diagram illustrating the processing steps of the imaging system.
  • FIG. 7 is a diagram showing a sequence chart of the preparation process.
  • FIG. 8 is a diagram illustrating distance information.
  • FIG. 9 is a diagram illustrating the CG parameter correction process.
  • FIG. 10 is a diagram illustrating an example of CG parameters.
  • FIG. 11 is a diagram illustrating a CG image based on a CG file and CG parameters.
  • FIG. 12 is a diagram illustrating meta information.
  • FIG. 13 is a diagram showing a sequence chart of the imaging process.
  • FIGS. 14A to 14D are diagrams illustrating imaging settings of the imaging device.
  • FIG. 15 is a diagram illustrating a composite image.
  • FIGS. 16 to 18 are diagrams illustrating adjustment screens.
  • FIGS. 19A to 19C are diagrams illustrating screens displayed on a display unit in the main imaging process.
  • FIG. 20 is a diagram illustrating an editing screen.
  • FIG. 21 is a diagram illustrating the functional configuration of a camera control unit and a computer control unit in Modification 1.
  • FIG. 22 is a block diagram illustrating an imaging process in Modification 1.
  • FIG. 23 is a diagram illustrating the functional configuration of a camera control unit and a computer control unit in Modification 2.
  • FIG. 24 is a block diagram illustrating an imaging process in Modification 2.
  • 1. Imaging system 2. Imaging device 3. Configuration of computer 4. Processing steps of imaging system 5. Modifications 6. Summary
  • The term "imaging" includes not only imaging that involves recording of image data, but also imaging for displaying an image on a display unit without recording of image data, such as a so-called through image or live view image.
  • The term "image" refers not only to an image displayed on a display unit, but may also refer to image data that is not displayed on a display unit.
  • FIG. 1 is a diagram illustrating an imaging system 1 according to an embodiment.
  • Fig. 2A is a diagram illustrating a captured image 101.
  • Fig. 2B is a diagram illustrating a CG image 102.
  • Fig. 2C is a diagram illustrating a composite image 103.
  • the imaging system 1 includes an imaging device 2 and a computer 3.
  • the imaging device 2 is placed, for example, at an imaging site, and captures an image of a subject at the imaging site to generate an image (moving image) 101 as shown in FIG. 2A.
  • one imaging device 2 will be referred to as imaging device 2a and the other imaging device 2 will be referred to as imaging device 2b.
  • the imaging devices 2 are connected to each other, for example, wirelessly, and can transmit and receive images and various information to and from each other.
  • the imaging devices 2 may be connected to each other by wires, or may be connected to each other via a cloud server or the like.
  • the imaging device 2a is held by the cameraman 8 and captures an image of a subject in response to operations by the cameraman 8.
  • the imaging device 2b is fixed at a predetermined position and in a predetermined direction by, for example, a tripod, and captures an image of a subject in synchronization with the imaging device 2a using a known synchronization method.
  • the imaging devices 2a and 2b are capable of capturing images of a subject from different positions and directions. In the following, the imaging devices 2a and 2b will be described as capturing images without moving from a predetermined position, but they may be configured to capture images while moving.
  • the subject can be anything.
  • the computer 3 is assumed to be placed at a location different from the imaging site, but may be placed at the imaging site.
  • the computer 3 is connected to the imaging device 2, for example, wirelessly, and receives the captured image 101 from the imaging device 2 and transmits the composite image 103 to the imaging device 2.
  • When the computer 3 receives a captured image 101 from the imaging device 2 (one or both of the imaging devices 2a and 2b), and a CG image 102 needs to be synthesized with the captured image 101, the computer 3 generates a CG image 102 as shown in FIG. 2B based on spatial information and meta information described later. The computer 3 then composites the captured image 101 with the CG image 102 to generate a composite image 103 as shown in FIG. 2C. Note that the composite image 103 may also include frames in which no CG image 102 is composited. After that, the computer 3 transmits the generated composite image 103 to, for example, the imaging device 2a.
  • When the imaging device 2a receives the composite image 103 from the computer 3, it displays the received composite image 103 on the display unit 13 (see FIG. 3). This makes it possible for the cameraman 8 to check the composite image 103 as a moving image in almost real time while the imaging device 2 is capturing an image. Details of the imaging system 1 are described below.
  • Imaging device: Fig. 3 is a diagram illustrating the external configuration of the imaging device 2.
  • Fig. 4 is a diagram illustrating the functional configuration of the imaging device 2. As shown in FIG. 3, the imaging device 2 includes an imaging unit 11, a ToF (Time of Flight) sensor 12, a display unit 13, and an operation unit 14.
  • the imaging unit 11 includes, for example, a lens system 21, an imaging element unit 22, a camera processing unit 23, a recording unit 24, a communication unit 25, a camera control unit 26, a memory unit 27, and a driver unit 28.
  • the lens system 21 includes lenses such as a zoom lens and a focus lens, an aperture mechanism, etc.
  • the lens system 21 collects light (incident light) from a subject onto the image sensor unit 22.
  • the lens system 21 may be provided integrally with the imaging device 2, or may be configured as an interchangeable lens separate from the imaging device 2.
  • the imaging element unit 22 is configured to have an image sensor (imaging element), such as a complementary metal oxide semiconductor (CMOS) type or a charge coupled device (CCD) type.
  • light received by the image sensor is photoelectrically converted to obtain an electric signal, and the electric signal is subjected to, for example, CDS (Correlated Double Sampling) processing, AGC (Automatic Gain Control) processing, and A/D (Analog/Digital) conversion processing.
  • the camera processing unit 23 is configured as an image processor using, for example, a DSP (Digital Signal Processor).
  • the camera processing unit 23 generates image data in a predetermined format by performing various signal processes on the digital data (image signal) from the imaging element unit 22. For example, the camera processing unit 23 performs lens correction, noise reduction, synchronization processing, YC generation processing, color reproduction/sharpness processing, file formation processing, etc.
  • In the synchronization processing, a color separation process is performed so that the image data for each pixel has all of the R, G, and B color components. For example, a demosaic process is performed as the color separation process.
  • In the YC generation processing, a luminance (Y) signal and a chrominance (C) signal are generated (separated) from the R, G, and B image data.
  • In the color reproduction/sharpness processing, gradation, saturation, tone, contrast, and the like are adjusted as so-called image creation.
  • In the file formation processing, the image data is subjected to, for example, compression encoding for recording or communication (such as MPEG-4 compliant moving-image compression or XAVC formatting), formatting, and generation and addition of meta information to generate a file for recording or communication.
  • the recording unit 24 is a memory card (such as a portable flash memory) which is a recording medium that can be attached to and detached from the imaging device 2, or a flash memory or HDD (Hard Disk Drive) built into the imaging device 2.
  • the recording unit 24 records image data output from the imaging element unit 22 or image data output from the camera processing unit 23.
  • the recording unit 24 also records meta information for generating a CG image 102 to be combined with the captured image 101 in association with the image data. The meta information will be described in detail later.
  • the image data output from the imaging element unit 22 is RAW image data that has not been subjected to image processing by the camera processing unit 23. Therefore, the recording unit 24 may record RAW image data, or may record image data that has been subjected to image processing by the camera processing unit 23. In the following, it is assumed that RAW image data is recorded in the recording unit 24. Note that the RAW image data may be either uncompressed or compressed.
  • the display unit 13 is composed of a display device such as a liquid crystal display (LCD) panel or an organic electroluminescence (EL) display, and the number of such display units 13 may be any number. In the example of Fig. 3, two display units 13 are provided, but the display area of one display unit 13 may be divided into two, and the two divided display areas may be used as display areas for images from two imaging devices.
  • the display unit 13 may be provided integrally with the imaging unit 11 or may be provided separately.
  • a display unit of a smartphone may be used as the display unit 13. In this case, it is sufficient that the imaging device 2 and the smartphone are connected wirelessly or by wire and are capable of transmitting and receiving images and information to each other.
  • the display control unit 33 described later transmits the image via the communication unit 25, and the communication unit 25 receives a control signal in response to an operation from the smartphone.
  • the display unit 13 displays various images on a display screen. For example, the display unit 13 displays a reproduced image of image data read from the recording unit 24.
  • the display unit 13 also displays various operation menus, icons, messages, etc., that is, GUI (Graphical User Interface) on the screen.
  • the communication unit 25 communicates wirelessly or wired with external devices (other imaging devices 2, computer 3). For example, the communication unit 25 transmits the captured image 101 to the computer 3 and receives the composite image 103.
  • the operation unit 14 collectively refers to input devices for the user to input various operations. Specifically, the operation unit 14 includes various operators (keys, dials, touch panel) provided in the imaging unit 11, a touch panel 14a provided on the front surface of the display unit 13, and the like. When the operation unit 14 detects a user operation, a signal corresponding to the input operation is sent to the camera control unit 26.
  • the camera control unit 26 is configured by a microcomputer (arithmetic processing device) equipped with a CPU (Central Processing Unit).
  • the camera control unit 26 includes functional units as a spatial information acquisition unit 31, a meta information acquisition unit 32, a display control unit 33, a recording control unit 34, and an adjustment unit 35. Details of these functional units will be described later.
  • the memory unit 27 stores information and the like used for processing by the camera control unit 26.
  • the illustrated memory unit 27 collectively represents, for example, a read only memory (ROM), a random access memory (RAM), a flash memory, and the like.
  • the memory unit 27 may be a memory area built into the microcomputer chip serving as the camera control unit 26, or may be configured by a separate memory chip.
  • the camera control unit 26 executes programs stored in the ROM of the memory unit 27, the recording unit 24, and the like, thereby controlling the entire imaging device 2. For example, the camera control unit 26 controls the operation of each necessary unit with respect to controlling the shutter speed of the image sensor unit 22, instructing various signal processing in the camera processing unit 23, image capturing and recording operations in response to user operations, playback operations of recorded image data, operations of the lens system 21 such as zoom, focus, and aperture adjustment, user interface operations, etc.
  • the RAM in the memory unit 27 is used as a working area for various data processing by the CPU of the camera control unit 26, and is used for temporarily storing data, programs, and the like.
  • the ROM and flash memory (non-volatile memory) in the memory unit 27 are used to store the OS (Operating System) that the CPU uses to control each unit, application programs for various operations, firmware, various setting information, etc.
  • the driver unit 28 includes, for example, a motor driver for a zoom lens drive motor, a motor driver for a focus lens drive motor, a motor driver for a diaphragm mechanism motor, and the like. These motor drivers apply drive currents to the corresponding motors in response to instructions from the camera control unit 26, thereby moving the focus lens and zoom lens, opening and closing the aperture blades of the aperture mechanism, and so on.
  • the ToF sensor 12 includes, for example, an irradiation unit that irradiates infrared laser light (irradiation light), and a sensor, for example of a CMOS type, that receives the light that is irradiated by the irradiation unit and reflected by the subject.
  • the ToF sensor 12 calculates the distance from the imaging device 2 to the subject using the ToF (Time of Flight) method.
  • the ToF sensor 12 can use a direct ToF method that calculates distance by directly determining the time it takes for irradiated light to return as reflected light after being reflected by the subject, or an indirect ToF method that calculates distance to the subject based on the phase difference between irradiated light and reflected light.
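  • As a hedged illustration (not part of the patent text), the two ToF distance calculations described above reduce to simple formulas; the modulation frequency f_mod used by the indirect method is an assumed parameter:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def direct_tof_distance(round_trip_time_s: float) -> float:
    # Direct ToF: light travels to the subject and back, so halve the path.
    return C * round_trip_time_s / 2.0

def indirect_tof_distance(phase_shift_rad: float, f_mod_hz: float) -> float:
    # Indirect ToF: the phase difference between irradiated and reflected
    # light encodes the round trip as a fraction of the modulation period.
    return C * phase_shift_rad / (4.0 * math.pi * f_mod_hz)

print(direct_tof_distance(10e-9))                # 10 ns round trip -> ~1.5 m
print(indirect_tof_distance(math.pi / 2, 20e6))  # pi/2 at 20 MHz -> ~1.87 m
```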
  • FIG. 5 is a diagram illustrating the configuration of the computer 3.
  • the computer 3 is assumed to be a personal computer (PC), a mobile terminal device such as a smartphone or a tablet, a mobile phone, a video editing device, a video playback device, etc.
  • the computer 3 may also be configured as a server device or a computing device in cloud computing. Therefore, the computer 3 may be a smartphone used as the display unit 13, or may be an imaging device 2.
  • the computer 3 may be a different device for each process or for one or more functional units, which will be described later.
  • the computer 3 includes a computer control unit 41, a recording unit 42, a display unit 43, a communication unit 44, an operation unit 45, and a memory unit 46.
  • the computer control unit 41 is composed of a microcomputer (arithmetic processing unit) equipped with a CPU.
  • the computer control unit 41 has functional units as a CG correction unit 51, an image acquisition unit 52, a CG image generation unit 53, a CG synthesis unit 54, and an editing unit 55. Details of these functional units will be described later.
  • the recording unit 42 is a memory card (such as a portable flash memory) which is a recording medium that can be attached to and detached from the computer 3, or a flash memory or HDD (hard disk drive) built into the computer 3, or the like.
  • the recording unit 42 records the captured image 101 (RAW image data) received from the imaging device 2, and records a CG file or the like that includes various information for generating a CG image 102 to be combined with the captured image 101.
  • the captured image 101 received from the imaging device 2 may be image data that has been subjected to various signal processes by the camera processing unit 23.
  • the display unit 43 is a display unit that displays various information to the user, and is configured to include, for example, a liquid crystal panel or an organic EL display.
  • the communication unit 44 communicates with an external device (imaging device 2) wirelessly or via a wired connection.
  • the communication unit 44 receives the captured image 101 from the imaging device 2 and transmits the composite image 103.
  • the operation unit 45 collectively represents input devices for the user to input various operations, specifically, the operation unit 45 includes various operators (keyboard, mouse, keys, dial, touch panel, etc.).
  • When the operation unit 45 detects a user operation, a signal corresponding to the input operation is sent to the computer control unit 41.
  • the memory unit 46 collectively indicates, for example, a ROM, a RAM, a flash memory, etc.
  • the memory unit 46 may be a memory area built into the microcomputer chip serving as the computer control unit 41, or may be configured by a separate memory chip.
  • the computer control unit 41 controls the entire computer 3 by executing programs stored in the ROM of the memory unit 46, the recording unit 42, etc.
  • the RAM in the memory unit 46 is used as a working area for various data processing by the CPU of the computer control unit 41, for temporarily storing data, programs, etc.
  • the ROM and flash memory in the memory unit 46 are used to store the OS for the CPU to control each unit, application programs for various operations, various setting information, and the like.
  • FIG. 6 is a diagram for explaining the processing steps of the imaging system 1. Here, the steps performed by the imaging system 1 will be explained. As shown in Fig. 6, the processing steps of the imaging system 1 are roughly divided into three stages: a preparation step ST1, an imaging step ST2, and a post-processing step ST3.
  • the preparation process ST1 is a process carried out prior to the imaging process ST2.
  • In the preparation process ST1, the computer 3 acquires spatial information (a spatial map) from the imaging device 2.
  • the acquired spatial information is used by the computer 3 to determine and modify CG parameters for generating the CG image 102.
  • the imaging process ST2 is a process in which imaging is actually performed at an imaging site such as that shown in FIG. 1.
  • This imaging process ST2 includes a rehearsal imaging process in which rehearsal imaging is performed prior to actual imaging, and a main imaging process in which actual imaging is performed.
  • In the imaging process ST2, a CG image 102 is synthesized in almost real time with the captured image 101 obtained by the imaging device 2, and is displayed on the display unit 13 as a composite image 103.
  • a process of adjusting CG parameters is performed in response to the operation of the operation unit 14.
  • the post-processing step ST3 indicates various processes that are carried out after shooting. For example, CG parameter adjustment, image adjustment, clip editing, video effects, etc. are carried out.
  • Image adjustments may include color adjustments, brightness adjustments, contrast adjustments, and the like.
  • Clip editing may involve cutting clips, adjusting the order, adjusting the length of time, and the like.
  • As the video effects, special-effect images may be synthesized.
  • Fig. 7 is a diagram showing a sequence chart of the preparation process.
  • Fig. 8 is a diagram for explaining distance information.
  • the ToF sensor 12 of one of the imaging devices 2 (e.g., imaging device 2a) is used to sense the imaging space that will be the background in the actual imaging.
  • the cameraman 8 may sense the imaging space while moving the imaging device 2a so that the ToF sensor 12 can sense the distance to the subject in the imaging space that will be captured as the background in the actual imaging. At this time, it is preferable that there are no moving subjects, such as the person 6 shown in Figure 1.
  • the imaging unit 11 may acquire a captured image 101 in addition to the distance information acquired by the ToF sensor 12. In this case, it is preferable to match the imaging frame rate and imaging timing of the imaging unit 11 with the sensing rate and sensing timing of the ToF sensor 12. However, it is not essential that the imaging frame rate and imaging timing of the imaging unit 11 match the sensing rate and sensing timing of the ToF sensor 12 (i.e., that they be synchronized); distance information corresponding to the imaging timing of each frame of the captured image may instead be calculated by interpolation processing or the like, as sketched below.
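  • A minimal sketch of that interpolation, assuming simple linear interpolation and illustrative 30 Hz sensing / 24 fps imaging rates (neither rate comes from the patent):

```python
import numpy as np

# Hypothetical rates: ToF samples at 30 Hz, captured frames at 24 fps.
tof_times = np.arange(0.0, 1.0, 1 / 30)               # sensing timestamps [s]
tof_dist = 2.0 + 0.1 * np.sin(2 * np.pi * tof_times)  # sampled distances [m]
frame_times = np.arange(0.0, 1.0, 1 / 24)             # imaging timestamps [s]

# A distance value aligned to each frame's imaging timing.
frame_dist = np.interp(frame_times, tof_times, tof_dist)
```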
  • the ToF sensor 12 detects the distance to the subject, for example, according to the sensing rate.
  • the spatial information acquisition unit 31 of the imaging device 2a acquires distance information indicating the distance to the subject from the ToF sensor 12 (step S1). Then, the spatial information acquisition unit 31 acquires spatial information (spatial map) of the imaging space from the acquired distance information (step S2). Note that in FIG. 8, the imaging space for which spatial information has been acquired is shown by a mesh. Also, areas without a mesh are areas for which spatial information has not yet been acquired.
  • the spatial information may include not only the position (coordinates) of the imaging space relative to a predetermined reference position (here, the position of the imaging device 2a), but also the shape of structures in the imaging space and their relative positions relative to the reference position when the imaging device 2 captures images.
  • the reference position may be, for example, the position where the imaging device 2 is placed during actual imaging, or may be any other position.
  • the spatial information acquisition unit 31 may calculate spatial information using the captured image 101 acquired by the imaging unit 11 in addition to or instead of the distance information acquired from the ToF sensor 12.
  • the distance information may be calculated by image processing from the captured image 101 acquired by the imaging unit 11, and further, continuous areas of the same color may be determined to be one structure based on color information of the captured image 101. This makes it possible to calculate the spatial information with high accuracy.
  • Although the spatial information is described here as three-dimensional spatial information, it may be two-dimensional spatial information (a depth map). Furthermore, by also acquiring the captured image 101, it is possible to obtain spatial information including texture.
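  • For illustration only, a common way to lift a two-dimensional depth map into three-dimensional spatial information is pinhole back-projection; the camera intrinsics fx, fy, cx, cy below are assumed values, not taken from the patent:

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    # Back-project an (H, W) depth map into an (H, W, 3) array of
    # camera-space coordinates with a pinhole camera model.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

# Assumed VGA depth map and intrinsics, for illustration only.
points = depth_to_points(np.full((480, 640), 2.0),
                         fx=525.0, fy=525.0, cx=319.5, cy=239.5)
```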
  • the communication unit 25 transmits the acquired spatial information to the computer 3 (step S3).
  • the computer 3 receives the spatial information transmitted from the imaging device 2a (step S11). This enables the computer 3 to modify the CG parameters based on the received spatial information and CG file.
  • FIG. 9 is a diagram explaining the CG parameter correction process.
  • FIG. 10 is a diagram explaining an example of CG parameters.
  • a CG correction screen 62 as shown in FIG. 9 is displayed on the display unit 43.
  • the user can determine and modify desired CG parameters by operating the CG correction screen 62 via the operation unit 45.
  • the CG correction screen 62 is merely one example, and any configuration is acceptable as long as it is possible to determine and modify CG parameters using spatial information.
  • the CG correction unit 51 displays a CG image 102 based on the three-dimensional imaging space indicated in the spatial information transmitted from the imaging device 2 in the CG display area 63 of the CG correction screen 62.
  • the CG correction unit 51 regenerates the CG image 102 while appropriately applying corrections based on user operations on the CG correction screen 62, displays it in the CG display area 63, and corrects the CG parameters in accordance with the user operations (step S12).
  • On the CG correction screen 62, it is possible to determine and correct CG parameters such as the placement area, type, color, size, number (density), target subject, start timing, drawing speed, and shading (light source direction, intensity, color temperature) of CG objects such as plants and stones. These CG parameters can be said to be parameters for generating the CG image 102.
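  • The CG parameters enumerated above might be carried in a record like the following sketch; every field name and type here is illustrative, not the patent's format:

```python
from dataclasses import dataclass, field

@dataclass
class CGParameters:
    # All field names and types are illustrative, not the patent's format.
    placement_area: tuple       # composite position (address) on the CG image 102
    object_type: str            # e.g. "plant" or "stone"
    color: tuple                # hue selected on the color wheel
    size: float                 # size of each CG object
    count: int                  # number (density) of CG objects
    target_subject: str         # subject whose position triggers composition
    start_timing: float         # start trigger, e.g. distance to the subject [m]
    drawing_speed: float        # time from start to end of composition [s]
    shading: dict = field(default_factory=lambda: {
        "light_direction": (0.0, -1.0, 0.0),  # light source direction
        "intensity": 1.0,                     # light source intensity
        "color_temperature_k": 5600,          # color temperature [K]
    })
```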
  • the CG correction unit 51 determines and modifies a placement area 63a in which a CG object is placed in the three-dimensional imaging space shown in the spatial information in response to a user operation on the CG display area 63.
  • the placement area 63a determined and modified here is indicated, for example, by an address (composite position) on the CG image 102 in which the CG object is drawn.
  • the placement area 63a may also be an address in the imaging space in which the CG object is placed.
  • the CG correction screen 62 displays a type selection area 64 for selecting the type of CG object, a number change dial 65 for changing the number of CG objects, a color wheel 66 for selecting the hue of the CG image 102, and a seek bar 67 for changing the drawing speed of the CG image 102.
  • the CG correction unit 51 determines the CG object selected by the user in response to operation of the operation unit 45 from the multiple CG object types displayed in the type selection area 64 as the type of CG object to be rendered as the CG image 102. Furthermore, the CG correction unit 51 corrects the number of CG objects to be drawn as CG images 102 in the placement area 63a to a number (density) corresponding to the value selected by the number-changing dial 65. Note that the number-changing dial 65 is rotated and displayed in response to a user operation via the operation unit 45, so the CG correction unit 51 variably determines the number of CG objects to be drawn in the placement area 63a in response to the operation of the displayed dial. Furthermore, the CG correction unit 51 corrects the color tone for rendering the CG image 102 to the color tone selected on the color wheel 66.
  • the start timing of the CG parameters here indicates the timing at which the composition of the CG image 102 starts with the captured image 101, i.e., the start trigger.
  • the drawing speed indicates the time from the start to the end of composition of the CG image 102. Note that the drawing speed may be specified by the number of frames rather than by time.
  • the CG correction unit 51 determines and corrects the target subject and start timing in response to the user's operation of the operation unit 45. Specifically, the CG correction unit 51 determines the target subject in response to the user's operation, and can specify, as the start timing, the distance to a specific subject, or a synthesis start position (an address in the imaging space) at which synthesis starts when the target subject reaches that position.
  • the CG correction unit 51 can also change the drawing speed of the CG image 102 in response to the operation of the seek bar 67. For example, moving the seek bar 67 to the left increases the time from when the synthesis of the CG image 102 begins to when it ends. Moving the seek bar 67 to the right decreases the time from when the synthesis of the CG image 102 begins to when it ends.
  • For example, assume that two seconds is specified as the drawing speed and three CG objects are specified as the number of objects.
  • the specific person 6 may be identified by image processing of the captured image 101.
  • the position (address) of the specific person 6 in the imaging space is calculated based on distance information (position information in space) obtained from the ToF sensor 12. Also, a CG image 102 is generated for each frame in which, for example, three CG objects gradually appear forward at predetermined intervals over a period of two seconds from the start of composition.
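  • For the example above (three CG objects appearing at predetermined intervals over two seconds), the number of objects visible in a given frame could be computed as in this sketch; the equal-interval spacing is our assumption:

```python
def visible_objects(t_since_start_s: float, n_objects: int = 3,
                    drawing_speed_s: float = 2.0) -> int:
    # Objects appear one by one at equal intervals across the drawing
    # period, so object i becomes visible at i * interval seconds.
    if t_since_start_s < 0:
        return 0  # composition has not been triggered yet
    interval = drawing_speed_s / n_objects
    return min(n_objects, int(t_since_start_s / interval) + 1)

# 0.0 s -> 1 object, 0.7 s -> 2 objects, 1.4 s and later -> all 3 objects
```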
  • the CG correction unit 51 associates the modified CG parameters and spatial information with the CG file and records them in the recording unit 42 (step S13).
  • FIG. 11 is a diagram illustrating a CG image 102 based on CG parameters.
  • the CG parameters are modified in response to user operations.
  • the CG image generation unit 53 generates the CG image 102 based on the modified CG parameters.
  • As shown in FIG. 11 from top to bottom, it is therefore possible to generate a CG image 102 based on CG parameters in which, for example, plants and rocks appear gradually from the rear to the front.
  • Fig. 12 is a diagram for explaining meta information.
  • the meta information includes a CG file ID, CG parameters, in-space position information, and imaging setting information.
  • the CG file ID is an identifier for identifying the determined CG file.
  • the in-space position information is information indicating the distance to the subject sensed by the ToF sensor 12, that is, the position (address) of the subject in the imaging space.
  • the imaging setting information is information regarding imaging settings of the imaging device 2 that are set in an imaging step ST2 described later.
  • the meta information includes the start timing, which is necessary to determine when to display the CG image 102; however, as long as the CG image 102 can be composited at the appropriate timing and position, the meta information may also include information other than the CG file ID, CG parameters, distance information, and imaging setting information, or some of this information may be omitted.
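  • Gathering the items above, the meta information associated with a captured image might look like the following sketch; the patent does not specify a serialization format, and all keys and values are illustrative placeholders:

```python
# Illustrative meta information record; the keys mirror the items listed
# above, and every value is a placeholder, not a setting from the patent.
meta_info = {
    "cg_file_id": "CG-0001",            # identifies the determined CG file
    "cg_parameters": None,              # e.g. a CGParameters record (earlier sketch)
    "in_space_position": [],            # per-frame distance (address) of the subject
    "imaging_settings": {               # imaging setting information
        "imager_scan": "full",          # sensor resolution / readout mode
        "video_format": "progressive",  # interlaced or progressive
        "project_fps": 24,              # frame rate of the output image
        "color_space": "S-Gamut3",      # placeholder color space name
    },
}
```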
  • the imaging process is mainly divided into a rehearsal imaging process and a main imaging process.
  • the imaging device 2 is caused to image the subject in the same manner as in the main imaging process. Note that the rehearsal imaging process does not necessarily have to be provided.
  • FIG. 13 is a diagram showing a sequence chart in the imaging process.
  • FIGS. 14A to 14D are diagrams explaining the imaging settings of the imaging device 2.
  • an imaging setting screen (camera setup screen) 71 for determining the imaging settings of the imaging device 2 is displayed on the display unit 13 of the imaging device 2a. Note that the display order of the screens shown in FIGS. 14A to 14D does not have to be the order described below, and may be any other order.
  • the imaging setting screen 71 displays operation items for selecting and setting the resolution (Imager Scan) of the image sensor of the imaging element section 22, the number of pixels vertically and horizontally, i.e., the image size, the video format (Video Format) indicating either interlaced or progressive, the frame rate of the output image (Project FPS), the 3D LUT (Monitor LUT) for color correction of the so-called live view image (monitor image) that is displayed on the display section 13 for monitoring during imaging, and the color space (Color Space) (step S21).
  • a list of options that can be selected for the operated selection item is displayed. For example, when the operation item for frame rate (Project FPS) is operated, a selection screen 72 is displayed that lists the selectable frame rates, as shown in FIG. 14B. When any value on the selection screen 72 is selected via the touch panel 14a, the frame rate is changed to the selected value, and the imaging setting screen 71 shown in FIG. 14A is displayed again.
  • the meta information acquisition unit 32 acquires imaging setting information indicating the determined imaging settings as meta information, and the recording control unit 34 records them in the recording unit 24.
  • a CG selection screen 73 for selecting a CG file is displayed, as shown in FIG. 14C.
  • the names and IDs of the selectable CG files are displayed. Then, when any CG file is selected on the CG selection screen 73 via operation of the touch panel 14a, the selected CG file is confirmed.
  • the meta information acquisition unit 32 acquires the CG file ID of the determined CG file as meta information, and the recording control unit 34 records it in the recording unit 24.
  • an imaging device selection screen 74 for selecting the imaging device 2 to be used in the rehearsal imaging process and the main imaging process is displayed on the display unit 13.
  • the communication unit 25 transmits imaging setting information to the determined imaging device 2 directly or via the computer 3. This makes it possible to cause multiple imaging devices 2 to perform imaging with the same imaging settings. Note that here, for example, an imaging device 2 that can be linked to a smartphone as the display unit 13 is selected.
  • the communication unit 25 also transmits the imaging setting information and CG file ID of the determined imaging device 2 to the computer 3 as meta information (step S21).
  • the computer 3 records the received imaging setting information and CG file ID of the imaging device 2 in the recording unit 42 as meta information.
  • the imaging device 2 starts rehearsal imaging in response to an operation by the cameraman 8 or the like (step S22).
  • the two imaging devices 2 start rehearsal imaging in synchronization with each other.
  • the imaging device 2 obtains an image 101.
  • the camera processing unit 23 sequentially performs image processing on the captured image 101 (RAW image data).
  • the communication unit 25 sequentially transmits the image 101 (image data) after image processing to the computer 3. Note that the communication unit 25 may transmit the RAW image data to the computer 3.
  • the meta information acquisition unit 32 acquires distance information that indicates the distance to the subject calculated by the ToF sensor 12, and the communication unit 25 transmits the distance information to the computer 3 as spatial position information.
  • the computer 3 first receives the meta information, prior to the captured image 101. After that, the computer 3 receives the in-space position information almost in real time.
  • the CG image generating unit 53 then reads out the CG file and CG parameters corresponding to the CG file ID included in the meta information from the recording unit 42, and uses the CG file to generate a CG image 102 to be composited with each frame of the captured image 101 based on the CG parameters and the spatial position information (step S31). Thereafter, the CG synthesis unit 54 synthesizes each frame of the captured image 101 received from the imaging device 2 with the CG image 102 generated by the CG image generation unit 53 to generate a composite image 103. Note that the captured image 101 may be recorded in the recording unit 42.
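  • Per-frame composition of the CG image 102 with the captured image 101 can, in the simplest case, be alpha blending; a sketch assuming the CG renderer outputs an RGBA image (the patent does not state the blending method):

```python
import numpy as np

def composite_frame(captured: np.ndarray, cg_rgba: np.ndarray) -> np.ndarray:
    # Blend an RGBA CG image 102 over an RGB captured image 101.
    # Both arrays are float in [0, 1]; heights and widths must match.
    alpha = cg_rgba[..., 3:4]  # per-pixel CG opacity
    return cg_rgba[..., :3] * alpha + captured * (1.0 - alpha)

# e.g. a fully transparent CG image leaves the captured frame unchanged
frame = np.zeros((1080, 1920, 3))
cg = np.zeros((1080, 1920, 4))
assert np.allclose(composite_frame(frame, cg), frame)
```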
  • the display control unit 33 displays the received composite image 103 on the display unit 13 (step S23).
  • a composite image 103 in which plants, stones, etc. appear in time with the running of a specific person 6 is displayed on the display unit 13.
  • FIG. 16 is a diagram illustrating the adjustment screen.
  • the display unit 13 of the imaging device 2a displays a composite image display area 81 in which the composite image 103 is displayed, a number change dial 82 for changing the number (density) of CG objects, and a color wheel 83 for changing the color tone of the CG image 102, as shown in FIG. 16.
  • the CG parameters can be adjusted by operating the number change dial 82 and color wheel 83.
  • the imaging device 2a obtains the CG parameters from the computer 3 and records them in the recording unit 24.
  • the adjustment unit 35 changes (adjusts) the number (density) of CG objects in response to the operation.
  • the number-changing dial 82 is displayed and rotated in response to a user operation via the touch panel 14a. As shown in FIG. 17, when the number-changing dial 82 is rotated counterclockwise via the touch panel 14a, the number (density) of CG objects is decreased, whereas when it is rotated clockwise, the number (density) of CG objects is increased.
  • the recording control unit 34 updates the number of CG objects among the CG parameters in accordance with the operation of the number-changing dial 82, and records the updated CG parameters in the recording unit 24.
  • the communication unit 25 transmits the updated CG parameters to the computer 3 (step S24).
  • When the computer 3 receives the CG parameters, it regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2 (step S33).
  • the composite image 103 may be regenerated using the captured image recorded in the recording unit 42.
  • the imaging device 2 can display on the display unit 13 a composite image 103 that is composed of a CG image 102 in which the number of CG objects has been reduced in response to operation of the number-changing dial 82, as shown in the lower part of Figure 17, for example.
  • the adjustment unit 35 can display a composite image 103 in which the CG image 102 is adjusted to the hue selected by the color wheel 83.
  • the recording control unit 34 updates the color tone, which is one of the CG parameters, in accordance with the operation of the color wheel 83 and records the updated color tone in the recording unit 24.
  • the communication unit 25 transmits the updated CG parameters to the computer 3.
  • the computer 3 regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2. This enables the imaging device 2 to display on the display unit 13 the composite image 103 obtained by combining the CG image 102 with a color tone corresponding to the operation of the color wheel 83.
  • the adjustment unit 35 can display a composite image 103 in which the drawing speed of the CG image 102 is updated by swiping the composite image display area 81 left or right. As shown in FIG. 18, for example, by swiping the composite image display area 81 to the left, a composite image 103 in which the drawing speed of the CG image 102 is slowed down is displayed on the display unit 13. By swiping the composite image display area 81 to the right, a composite image 103 in which the drawing speed of the CG image 102 is increased is displayed on the display unit 13.
  • the recording control unit 34 updates the drawing speed, which is one of the CG parameters, in accordance with the amount of the swipe and records the updated CG parameters in the recording unit 24.
  • the communication unit 25 transmits the updated CG parameters to the computer 3.
  • When the computer 3 receives the CG parameters, it regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2.
  • the imaging device 2 can generate a composite image 103 in which the drawing speed is slowed down in accordance with the amount of swiping leftward on the composite image display area 81, and the timing of compositing the front tree among the trees that are CG objects is delayed, as shown in the lower part of Fig. 18, for example. That is, a composite image 103 can be generated in which the time (number of frames) from the composition start timing to the composition end timing is extended.
  • the adjustment unit 35 allows the cameraman 8 or the like to check the composite image 103 displayed in the rehearsal imaging process and allows the cameraman 8 or the like to adjust the CG parameters.
  • When the CG parameters are adjusted (updated) in the rehearsal imaging process, the adjustment unit 35 records the adjusted CG parameters in the recording unit 24 as meta information, and the communication unit 25 transmits the CG parameters to the computer 3.
  • When the computer 3 receives the adjusted CG parameters, it records the received CG parameters in the recording unit 42 as meta information.
  • the CG parameters are adjusted using the captured image 101 obtained by the imaging device 2a, but the CG parameters may also be adjusted using the captured image 101 obtained by the imaging device 2b in addition to the imaging device 2a.
  • By operating the touch panel 14a, the cameraman 8 or the like may be able to select whether to generate and display the composite image 103 from the captured image 101 obtained by the imaging device 2a or from the captured image 101 obtained by the imaging device 2b. Then, the captured image 101 obtained by the imaging device 2a or 2b selected by the cameraman 8 or the like is transmitted to the computer 3, and the computer 3 generates a CG image 102 and a composite image 103.
  • the generated composite image 103 is transmitted to the imaging device 2a, whereby the imaging device 2a displays on the display unit 13 a composite image 103 based on the captured image 101 obtained by the imaging device 2a or the imaging device 2b selected by the cameraman 8 or the like.
  • This allows the composite image 103 based on the captured image 101 obtained by the imaging device 2b to be confirmed by the imaging device 2a, and the CG parameters can be adjusted by the imaging device 2a using the composite image 103 based on the captured image 101 obtained by the imaging device 2b.
  • Fig. 19 is a diagram for explaining a screen displayed on the display unit 13 in the actual imaging step.
  • imaging is started in the imaging device 2 as shown in FIG. 1, and a captured image 101 (RAW image data) is obtained (step S25).
  • the imaging device 2 sequentially transmits the captured images 101 to the computer 3.
  • the meta information acquisition unit 32 acquires distance information indicating the distance to the subject calculated by the ToF sensor 12 as spatial position information and records it as a CG parameter. Also, the communication unit 25 transmits the spatial position information to the computer 3.
  • the CG image generation unit 53 generates a CG image 102 to be composited with each frame of the captured image 101 based on the selected CG file and meta information.
  • the CG composition unit 54 then composites the CG image 102 with each frame of the received captured image 101 to generate a composite image 103 (step S34).
  • the communication unit 44 transmits the composite image 103 to the imaging device 2a (step S35).
  • the recording control unit 34 records the captured image 101 in the recording unit 24, and associates the CG file ID, CG parameters, imaging setting information of the imaging device 2, etc., with the image data as meta information.
  • these CG parameters can also be considered parameters for performing post-processing steps.
  • the term "associate” means, for example, that when processing one piece of information (data, command, program, etc.), the other piece of information can be used (linked). That is, pieces of information associated with each other may be collected into a single file or the like, or may be individual pieces of information.
  • information B associated with information A may be transmitted on a transmission path different from that of information A.
  • information B associated with information A may be recorded on a recording medium different from that of information A (or on a different recording area of the same recording medium). Note that this "association" may be a part of information, not the whole information.
  • an image and information corresponding to the image may be associated with each other in any unit, such as multiple frames, one frame, or a part of a frame.
  • "associating" includes, for example, acts such as assigning the same ID (identification information) to multiple pieces of information, recording multiple pieces of information on the same recording medium, storing multiple pieces of information in the same folder, storing multiple pieces of information in the same file (assigning one to the other as metadata), embedding multiple pieces of information in the same stream, linking multiple pieces of information to the same project, and embedding metadata in an image like a digital watermark.
  • the recording control unit 34 associates the captured image 101 with meta information including the CG parameters and records them in the recording unit 24 .
  • the communication unit 25 also transmits the captured image 101 and meta information recorded in the recording unit 24 to the computer 3.
  • the computer 3 associates the received captured image 101 and meta information and records them in the recording unit 42. Note that the recording of the captured image 101 and meta information may be performed by only one of the imaging device 2 and the computer 3. This enables a post-processing process.
  • the composite image 103 is discarded without being recorded.
  • the imaging device 2 and the computer 3 may each be able to select a mode in which the composite image 103 is discarded and a mode in which the composite image 103 is recorded without being discarded.
  • In this case, the CG image generation unit 53 generates a CG image 102 to be composited with each of the captured images 101, and the CG synthesis unit 54 composites each captured image 101 with the corresponding CG image 102 to generate two composite images 103.
  • the composite image 103 may be generated independently for the captured images 101 obtained by the two imaging devices 2, for example, by determining the timing of synthesis based on distance information (space position information) acquired by the ToF sensor 12 of each imaging device 2.
  • the composite image 103 may be generated so as to be related to the captured images 101 obtained by the two imaging devices 2, for example, by determining the timing of simultaneous synthesis based on distance information (space position information) acquired by the ToF sensor 12 of one of the imaging devices 2.
  • the display control unit 33 may display two of the received composite images 103 side by side on the display unit 13, as shown in FIG. 19. This allows the cameraman 8 or the like to simultaneously check, in almost real time, the composite images 103 in which the CG image 102 is composited with the captured images 101 obtained by the imaging devices 2a and 2b.
  • Fig. 20 is a diagram for explaining an editing screen 91.
  • an editing screen 91 as shown in Fig. 20 is displayed on the display unit 43 in the post-processing step.
  • the editing screen 91 includes a composite image display area 92 in which multiple, for example two, composite images 103 can be displayed side-by-side, an adjustment operation area 93 in which multiple icons for adjusting CG parameters are displayed, and a timeline display area 94 in which a timeline of a moving image is displayed.
  • the editing unit 55 performs image processing on the RAW image data in the same manner as the camera processing unit 23.
  • image processing is performed on each of the captured images 101 obtained by the two imaging devices 2.
  • Based on the CG files and meta information recorded in the recording unit 42, the editing unit 55 generates CG images 102 to be composited with each captured image 101, in the same manner as the CG image generation unit 53 generated the CG image 102 in the imaging process. Then, the editing unit 55 composites the captured image 101 and the CG image 102 to generate a composite image 103, in the same manner as the CG synthesis unit 54 generated the composite image 103 in the imaging process. Therefore, two composite images 103 are generated here.
  • the editing unit 55 displays the two generated composite images 103 side by side in the composite image display area 92 of the editing screen 91.
  • the composite image 103 displayed in the composite image display area 92 can be stopped, fast-forwarded, rewound, etc. by operating the operation icons provided at the bottom of the composite image display area 92. This allows the user to confirm a composite image 103 in which the CG image 102 is composited with the captured image 101 obtained by the imaging device 2. In addition, the user can simultaneously confirm a composite image 103 in which the CG image 102 is composited with the captured images 101 obtained by two imaging devices 2.
  • the editing unit 55 displays the two captured images 101 and the CG image 102 recorded in the recording unit 42, for example, in the timeline display area 94, one above the other, with their time axes aligned. This allows the two captured images 101 and the CG image 102 to be simultaneously checked and then edited.
  • the editing unit 55 can also update the CG parameters in response to operations on icons displayed in the adjustment operation area 93.
  • the editing unit 55 records the updated CG parameters in association with the captured image 101, regenerates the CG image 102 based on the updated CG parameters, and generates a composite image 103 by combining the captured image 101 and the CG image 102, and displays it in the composite image display area 92.
  • a ToF sensor 12 is provided as a distance measuring unit that measures the distance to the subject, but the distance measuring unit that measures the distance to the subject is not limited to this, and may be an ultrasonic sensor or the like.
  • detection may be performed by image processing from the output of an image sensor, or multiple distance measuring sensors and image sensors may be used in combination.
  • the CG file and the CG parameters are recorded separately.
  • the CG parameters may be recorded within the CG file.
  • the camera control unit 26 of the imaging device 2 functions as a spatial information acquisition unit 31, a meta information acquisition unit 32, a display control unit 33, a recording control unit 34, and an adjustment unit 35.
  • these functional units may be made to function in the computer control unit 41 of the computer 3.
  • the computer control unit 41 also functions as a CG correction unit 51, an image acquisition unit 52, a CG image generation unit 53, a CG synthesis unit 54, and an editing unit 55. However, some or all of these functional units may be controlled by the camera control unit 26 of the imaging device 2.
  • Fig. 21 is a diagram for explaining the functional configuration of a camera control unit 26A and a computer control unit 41A in the modified example 1.
  • Fig. 22 is a block diagram illustrating an imaging process in the modified example 1.
  • the camera control unit 26A in the first modification includes a spatial information acquisition unit 31, a meta information acquisition unit 32, a display control unit 33, a recording control unit 34, and an adjustment unit 35, as well as a functional unit serving as a CG synthesis unit 54A.
  • the computer control unit 41A in the first modification also includes functional units serving as a CG correction unit 51, an image acquisition unit 52, a CG image generation unit 53, and an editing unit 55.
  • In the first modification, a CG image 102 is generated in the computer 3, but a composite image 103 is not generated there; instead, the CG image 102 is transmitted to the imaging device 2. When the imaging device 2 receives the CG image 102, it composites the captured image 101 with the CG image 102 to generate and display the composite image 103.
  • imaging is started by the imaging element unit 22 (indicated as “imaging” in the figure), and various signal processes are performed by the camera processing unit 23 on the image signal from the imaging element unit 22 (indicated as “camera process” in the figure).
  • the communication unit 25 transmits the captured image 101 that has been subjected to various signal processes to the computer 3.
  • the CG image generating unit 53 of the computer 3 reads out from the recording unit 42 the CG file corresponding to the CG file ID included in the meta information, and uses the CG file to generate a CG image 102 based on the CG parameters and spatial information (indicated as "CG generation" in the figure).
  • the communication unit 44 then transmits the generated CG image 102 to the imaging device 2a.
  • the CG synthesis unit 54A synthesizes the CG image 102 with the captured image 101 to generate a composite image 103 (indicated as "CG synthesis" in the figure).
  • the display control unit 33 displays the generated composite image 103 on the display unit 13 (indicated as "display" in the figure).
  • the communication unit 25 transmits the adjusted CG parameters to the computer 3.
  • the CG image generation unit 53 regenerates the CG image 102 based on the received CG parameters, and the communication unit 44 transmits the regenerated CG image 102 to the imaging device 2a.
  • the CG synthesis unit 54A synthesizes the captured image 101 with the CG image 102 to regenerate a composite image 103, and the display control unit 33 displays the regenerated composite image 103 on the display unit 13.
  • the imaging process starts, imaging and camera processing are performed in the imaging device 2, and the captured images 101 are sequentially sent to the computer 3.
  • the CG image generation unit 53 of the computer 3 receives the captured images 101, it generates CG images 102 to be composited with each captured image 101 based on the CG file and meta information.
  • the communication unit 44 transmits the generated CG images 102 to the imaging device 2a.
  • the CG synthesis unit 54A synthesizes the captured image 101 with the CG image 102 to generate a composite image 103.
  • the display control unit 33 displays the generated composite image 103 on the display unit 13.
  • the recording control unit 34 records the captured image 101 in the recording unit 24, and also associates the CG file ID, CG parameters, imaging setting information of the imaging device 2, etc. as meta information with the image data and records them in the recording unit 24 (indicated as "record" in the figure).
  • the captured image 101 recorded in the recording unit 24 may be RAW image data, image data after image processing by the camera processing unit 23, or both.
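To make the association concrete, here is a minimal Python sketch of recording the CG file ID, CG parameters, and imaging setting information as meta information tied to a recorded clip. Writing a sidecar JSON file is only one possible realization; the disclosure leaves the storage format open, and every name below is illustrative.

```python
import json
from pathlib import Path

def record_with_meta(clip_path: Path, cg_file_id: str,
                     cg_params: dict, imaging_settings: dict) -> Path:
    """Associate meta information (CG file ID, CG parameters, imaging
    settings) with the image data, here as a sidecar JSON file."""
    meta = {
        "clip": clip_path.name,
        "cg_file_id": cg_file_id,
        "cg_params": cg_params,
        "imaging_settings": imaging_settings,
    }
    sidecar = clip_path.with_suffix(".meta.json")
    sidecar.write_text(json.dumps(meta, indent=2))
    return sidecar

# Example: associate a clip with the parameters in effect during capture.
record_with_meta(Path("scene01_take03.mp4"), "cg-001",
                 {"object_count": 25, "draw_speed": 1.5},
                 {"shutter": "1/120", "iso": 800, "white_balance": "5600K"})
```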
  • Fig. 23 is a diagram for explaining the functional configuration of the camera control unit 26B and the computer control unit 41B in Modification 2.
  • Fig. 24 is a block diagram illustrating the imaging process in Modification 2.
  • the camera control unit 26B in the second modification includes functional units such as a spatial information acquisition unit 31, a meta information acquisition unit 32, a display control unit 33, a recording control unit 34, and an adjustment unit 35, as well as a CG image generation unit 53B and a CG synthesis unit 54B.
  • the computer control unit 41B in the second modification includes functional units such as a CG correction unit 51, an image acquisition unit 52, and an editing unit 55.
  • in Modification 2, the CG image 102 and the composite image 103 are not generated in the computer 3; instead, the CG file is recorded in the recording unit 24 of the imaging device 2, and the imaging device 2 generates the CG image 102 and the composite image 103 based on the CG file, the CG parameters, and the spatial information.
  • when rehearsal imaging starts, the imaging element unit 22 starts imaging (indicated as "shooting" in the figure), and various signal processing is performed by the camera process unit 23 on the image signal from the imaging element unit 22 (indicated as "camera process" in the figure).
  • the CG image generation unit 53B uses the CG file to generate a CG image 102 based on the CG parameters and spatial information (indicated as "CG generation" in the figure).
  • the CG synthesis unit 54B synthesizes the CG image 102 with the captured image 101 to generate a composite image 103 (indicated as "CG synthesis" in the figure).
  • the display control unit 33 displays the generated composite image 103 on the display unit 13 (indicated as "display" in the figure).
  • when the CG parameters are adjusted, the CG image generation unit 53B regenerates the CG image 102 based on the adjusted CG parameters, and the CG synthesis unit 54B synthesizes the CG image 102 with the captured image 101 to regenerate the composite image 103.
  • the display control unit 33 displays the regenerated composite image 103 on the display unit 13.
  • in the main imaging process, the CG image generation unit 53B likewise uses the CG file to generate a CG image 102 based on the CG parameters and spatial information.
  • the CG synthesis unit 54B synthesizes the CG image 102 with the captured image 101 to generate a composite image 103.
  • the display control unit 33 displays the generated composite image 103 on the display unit 13.
  • the recording control unit 34 records the captured image 101 in the recording unit 24, and also records the CG file ID, CG parameters, imaging setting information of the imaging device 2, etc. as meta information in association with the image data in the recording unit 24 (indicated as "record" in the figure).
  • the captured image 101 recorded in the recording unit 24 may be RAW image data, image data after image processing by the camera processing unit 23, or both.
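The per-frame flow of Modification 2 (generate, composite, and display, all on the camera) can be pictured with the short Python sketch below. The alpha blend is one standard way to composite a CG image over a captured frame; the function names and the RGBA layout are assumptions, not part of the disclosure.

```python
import numpy as np

def composite(frame: np.ndarray, cg_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend a CG image (RGBA, same size) over a captured frame (RGB)."""
    alpha = cg_rgba[..., 3:4].astype(np.float32) / 255.0
    out = frame.astype(np.float32) * (1.0 - alpha) \
        + cg_rgba[..., :3].astype(np.float32) * alpha
    return out.astype(np.uint8)

def run(frames, generate_cg, display):
    """Per-frame loop: the camera itself generates the CG image 102 from the
    CG file and parameters, composites it, and shows the composite image 103."""
    for frame in frames:
        cg = generate_cg(frame.shape[0], frame.shape[1])
        display(composite(frame, cg))

# Tiny demonstration with synthetic data.
run([np.zeros((4, 4, 3), np.uint8)],
    lambda h, w: np.full((h, w, 4), 128, np.uint8),
    lambda img: print(img.shape))
```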
  • the imaging device 2 of the embodiment includes an imaging unit 11 that outputs an imaged image 101, a display control unit 33 that causes a display unit 13 to display a composite image 103 generated by combining the imaged image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and an association unit (recording control unit 34) that associates the imaged image 101 with the CG parameters.
  • This makes it possible for the imaging device 2 to display a composite image 103, in which the captured image 101 is composited with the CG image 102, on the display unit 13 in almost real time while the imaging unit 11 is capturing an image.
  • the computer 3 can generate and edit the CG image 102 using the CG parameters in the post-processing step ST3. Therefore, the imaging device 2 allows the CG image 102 to be easily adjusted with respect to the captured image. Thus, the imaging device 2 can improve the work efficiency in video production including CG compositing.
  • the spatial information is associated with the CG parameters. This allows the image capture device 2 to later correct the CG parameters based on the spatial information, or allows the correction to be performed by another device.
  • the associating section is the recording control section 34, which associates the captured image 101 with the CG parameters and records them. This enables the image capturing device 2 to easily manage the captured image 101 in association with the CG parameters.
  • the associating section records the captured image 101 and the CG parameters in association with each other without recording the composite image 103.
  • the imaging device 2 can reduce the amount of data stored in the recording unit 24 by the amount of the composite image 103 not recorded, and can also reduce the processing load.
  • in the embodiment, the composite image 103 is generated in another device (the computer 3) by combining the captured image 101 with the CG image 102.
  • This allows the imaging device 2 to reduce the processing load by having the computer 3 generate the CG image 102 and the composite image 103.
  • the processing capacity of the camera control unit 26 of the imaging device 2 is generally lower than that of the computer control unit 41 of the computer 3. If an imaging device 2 with low processing power generated the CG image 102 and the composite image 103, image generation might not keep up with real-time requirements. By having the computer 3, with its higher processing power, generate the CG image 102 and the composite image 103, it becomes easier to ensure that the composite image 103 is displayed in real time.
  • in Modification 1, the imaging device 2 includes a CG synthesis unit 54A that generates the composite image 103 by combining the CG image 102 generated in another device (the computer 3) with the captured image 101. This allows the imaging device 2 to have the computer 3 generate the CG image 102, thereby reducing the processing load.
  • in Modification 2, the imaging device 2 includes a CG image generation unit 53B that generates the CG image 102 to be composited with the captured image 101, and a CG synthesis unit 54B that generates the composite image 103 by compositing the captured image 101 with the CG image 102.
  • the imaging device 2 can generate a composite image 103 by itself and display it on the display unit 13, even in an environment where the imaging device 2 and the computer 3 cannot be connected either wired or wirelessly.
  • the display control unit 33 causes the display unit 13 to display a composite image 103 in which the CG image 102 is composited by real-time processing in accordance with the captured image 101 being captured. This enables the imaging device 2 to allow the cameraman 8 or the like to check the composite image 103 as a moving image in almost real time.
  • the imaging device 2 includes an adjustment unit 35 that adjusts the CG parameters in response to user operations.
  • This allows the imaging device 2 to adjust CG parameters while checking a composite image 103 obtained by combining a captured image 101 captured by the imaging device 2 in a rehearsal imaging process with a CG image 102, for example. Therefore, it is no longer necessary to strictly determine CG parameters when generating a CG file, and the effort and time required for determining CG parameters can be reduced. Furthermore, the effort required for adjusting CG parameters in the post-processing step ST3 can be reduced. Thus, in the imaging device 2, the working efficiency of the imaging system 1 as a whole can be improved.
  • the adjustment unit 35 associates the CG parameters adjusted in response to the user operation with the captured image 101.
  • the CG image 102 is generated based on the adjusted CG parameters, making it possible to reduce the need for CG parameter adjustment in the post-processing step ST3.
  • a plurality of composite images 103, each obtained by combining a CG image 102 with a captured image 101 from one of the plurality of imaging devices 2, are displayed on the display unit 13 in a switched manner. This allows the cameraman 8 or the like who operates the imaging device 2a to check the CG image 102 relative to the captured image 101 obtained by another imaging device 2, such as the imaging device 2b.
  • a plurality of composite images 103, each obtained by combining a CG image 102 with a captured image 101 from one of a plurality of imaging devices 2, are simultaneously displayed on the display unit 13.
  • the imaging settings determined in any one of the plurality of imaging devices 2 are used in common by the plurality of imaging devices 2. This makes it easy to set up imaging for a plurality of imaging devices 2.
  • the CG parameters include information regarding the start timing for starting rendering of the CG image 102, and the start timing is set based on the distance to a specific subject. This makes it possible to start combining the CG image 102 with the captured image 101, for example, based on the position of a particular person 6 (the distance from the image capture device 2).
  • the CG parameters include start timing information (start timing) for starting synthesis of the CG image 102 with the captured image 101, and the start timing information is the position of a specific subject in the three-dimensional space indicated by the spatial information. This makes it possible to start combining the CG image 102 with the captured image 101, for example, based on the position of a particular person 6 (the distance from the image capture device 2).
  • a CG image 102 is synthesized with each of the captured images 101 obtained by the multiple imaging devices 2 based on start timing information. This makes it possible to synthesize a CG image 102 synchronized with the captured images 101 obtained by the multiple imaging devices 2.
  • the CG parameters include information related to the rendering speed of the CG image 102, and the rendering speed is set according to the distance to a specific subject. This makes it possible to generate a CG image 102 in which, for example, plants and the like appear in accordance with the movement of a particular person 6, and to synthesize the CG image 102 with the captured image 101.
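One way to read the two points above is as a mapping from the measured distance to a specific subject onto (a) whether synthesis has started and (b) the current drawing speed. The thresholds and the linear ramp in this Python sketch are invented for illustration; the disclosure only states that both values are set based on the distance.

```python
def synthesis_state(distance_m: float, start_m: float = 10.0,
                    near_m: float = 2.0, max_speed: float = 2.0):
    """Start compositing once the subject comes within start_m, and ramp the
    drawing speed up as the subject approaches near_m (illustrative values)."""
    if distance_m > start_m:
        return False, 0.0          # synthesis not started yet
    t = (start_m - distance_m) / (start_m - near_m)
    return True, max_speed * min(max(t, 0.0), 1.0)

for d in (12.0, 9.0, 5.0, 2.0):      # person 6 approaching the camera
    print(d, synthesis_state(d))      # e.g. 5.0 -> (True, 1.25)
```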
  • the camera includes a distance measuring unit (ToF sensor 12) that measures the distance to the subject, and the spatial information acquiring unit 31 acquires spatial information based on the distance to the subject measured by the distance measuring unit. This makes it possible to easily obtain spatial information about the imaging space.
  • the information processing method of the embodiment outputs a captured image, displays on the display unit 13 a composite image 103 generated by combining the captured image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and associates the captured image 101 with the CG parameters.
  • These programs can be pre-recorded on an HDD as a recording medium built into a device such as a computer device, or in a ROM in a microcomputer having a CPU. Alternatively, they can be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card.
  • Such a removable recording medium can be provided as so-called package software.
  • Such a program can be installed in a personal computer or the like from a removable recording medium, or can be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
  • an imaging system 1 is an imaging system 1 that includes an imaging device 2 and an information processing device (computer 3), in which the imaging device 2 includes an imaging unit 11 that outputs an imaged image 101, a display control unit 33 that causes the display unit 13 to display a composite image 103 generated by combining the imaged image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and an association unit (recording control unit 34) that associates the imaged image 101 with the CG parameters, and the information processing device includes an editing unit 55 that adjusts the CG parameters in response to a specified user operation.
  • the editing unit 55 can align and display the timeline of the captured image 101 and the timeline of the CG image 102 based on the CG parameters. This makes it possible to check the timelines of the captured image 101 and the CG image 102 separately. It is also possible to separately move the timelines of the captured image 101 and the CG image 102. In other words, it is also possible to shift the synthesis timing of the CG image 102 relative to the captured image 101.
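The timeline alignment described here can be reduced to a frame-index mapping, sketched below in Python under the assumption that both timelines run at the same frame rate and that the offset is expressed in frames; none of this is mandated by the disclosure.

```python
from typing import Optional

def cg_frame_for(capture_frame: int, offset_frames: int) -> Optional[int]:
    """Map a captured-image frame index to the CG-image frame composited with
    it; a nonzero offset shifts the synthesis timing of the CG image."""
    cg_frame = capture_frame - offset_frames
    return cg_frame if cg_frame >= 0 else None  # None: no CG composited yet

# Shifting the CG timeline two frames later than the captured timeline:
print([cg_frame_for(i, offset_frames=2) for i in range(5)])
# [None, None, 0, 1, 2]
```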
  • the editing unit 55 causes the display unit to display a plurality of composite images 103 in which the CG images 102 are respectively composited with the captured images 101 obtained by the plurality of imaging devices 2. This allows the user to check the CG images 102 corresponding to the captured images 101 obtained by the plurality of imaging devices 2 at once.
  • the present technology can also be configured as follows.
  • (1) An imaging device comprising: an imaging unit that outputs a captured image; a display control unit that causes a display unit to display a composite image generated by combining the captured image with a CG image based on spatial information of the imaging space and CG parameters related to CG synthesis; and an association unit that associates the captured image with the CG parameters.
  • (2) The imaging device according to (1), wherein the spatial information is associated with the CG parameters.
  • (3) The imaging device according to (1) or (2), wherein the association unit is a recording control unit that associates the captured image with the CG parameters and records them.
  • (4) The imaging device according to (3), wherein the association unit associates the captured image with the CG parameters and records them without recording the composite image.
  • (5) The imaging device according to any one of (1) to (4), wherein the composite image is generated by combining the captured image with the CG image in another device.
  • (6) The imaging device according to any one of (1) to (4), further comprising a CG synthesis unit that generates the composite image by synthesizing the CG image generated in another device with the captured image.
  • (7) The imaging device according to any one of (1) to (4), further comprising: a CG image generation unit that generates the CG image to be composited with the captured image; and a CG synthesis unit that generates the composite image by synthesizing the CG image with the captured image.
  • (8) The imaging device according to any one of (1) to (7), wherein the display control unit causes the display unit to display the composite image in which the CG image is composited by real-time processing corresponding to the captured image being captured.
  • (9) The imaging device according to any one of (1) to (8), further comprising an adjustment unit that adjusts the CG parameters in response to a predetermined user operation.
  • (10) The imaging device according to (9), wherein the adjustment unit associates the CG parameters adjusted in response to the predetermined user operation with the captured image.
  • (11) The imaging device according to any one of (1) to (10), wherein the display unit switches between and displays a plurality of composite images in which the CG image is composited with the captured images obtained by a plurality of imaging devices.
  • (12) The imaging device according to (11), wherein the display unit simultaneously displays a plurality of composite images, each of which is obtained by combining the CG image with the captured images obtained by the plurality of imaging devices.
  • (13) The imaging device according to any one of (1) to (12), wherein imaging settings determined in any one of a plurality of imaging devices are used in common in the plurality of imaging devices.
  • (14) The imaging device according to any one of (1) to (13), wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is set based on a distance to a specific subject.
  • (15) The imaging device according to any one of (1) to (14), wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is a position of a specific subject in a three-dimensional space indicated by the spatial information.
  • (16) The imaging device according to (14) or (15), wherein the CG parameters include information regarding a rendering speed of the CG image, and the rendering speed is set based on a distance to a specific subject.
  • (18) The imaging device according to any one of (1) to (16), further comprising a distance measuring unit that measures the distance to the subject, wherein the spatial information is acquired based on the distance to the subject measured by the distance measuring unit.
  • (19) A program that causes an imaging device to execute processing of: outputting a captured image; displaying, on a display unit, a composite image generated by combining the captured image with a CG image based on spatial information of the imaging space and CG parameters related to CG synthesis; and associating the captured image with the CG parameters.
  • (20) An imaging system including an imaging device and an information processing device, wherein the imaging device includes: an imaging unit that outputs a captured image; a display control unit that causes a display unit to display a composite image generated by combining the captured image with a CG image based on spatial information of the imaging space and CG parameters related to CG synthesis; and an association unit that associates the captured image with the CG parameters, and the information processing device includes an editing unit that adjusts the CG parameters in response to a predetermined user operation.
  • Reference Signs List 1 Imaging system 2 Imaging device 3 Computer 31 Spatial information acquisition unit 32 Meta information acquisition unit 33 Display control unit 34 Recording control unit 35 Adjustment unit 51 CG correction unit 52 Image acquisition unit 53 CG image generation unit 54 CG synthesis unit 55 Editing unit


Abstract

This imaging device comprises: an imaging unit which outputs a captured image; a display control unit which causes a display unit to display a synthesized image generated by synthesizing a CG image on the captured image on the basis of spatial information about an imaging space and CG parameters pertaining to CG synthesis; and an association unit which associates the captured image and the CG parameters.

Description

Imaging device, program, and imaging system
This technology relates to imaging devices, programs, and imaging systems, and in particular to technology for synthesizing captured images with CG (Computer Graphics) images.
A system has been proposed that can synthesize virtual space images created using 3D CG with real space images captured by a camera, and display the images in real time.
JP 2011-35638 A
Incidentally, there is a demand for further improvements in operational efficiency in video production, including CG compositing.
The aim of this technology is to improve work efficiency in video production, including CG compositing.
An imaging device according to the present technology includes an imaging unit that outputs a captured image, a display control unit that causes a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis, and an association unit that associates the captured image with the CG parameters.
This allows the imaging device to display a composite image, in which the captured image is combined with the CG image, on the display unit in almost real time while the imaging unit is capturing an image. In addition, since the imaging device associates the captured image with the CG parameters, it is possible to generate a CG image based on the CG parameters even later.
FIG. 1 is a diagram illustrating an imaging system according to an embodiment.
FIGS. 2A to 2C are diagrams illustrating a captured image, a CG image, and a composite image.
FIG. 3 is a diagram illustrating the external configuration of the imaging device.
FIG. 4 is a diagram illustrating the functional configuration of the imaging device.
FIG. 5 is a diagram illustrating the configuration of a computer.
FIG. 6 is a diagram illustrating the processing steps of the imaging system.
FIG. 7 is a sequence chart of the preparation process.
FIG. 8 is a diagram illustrating distance information.
FIG. 9 is a diagram illustrating a process for generating a CG file.
FIG. 10 is a diagram illustrating an example of CG parameters.
FIG. 11 is a diagram illustrating a CG image based on a CG file and CG parameters.
FIG. 12 is a diagram illustrating meta information.
FIG. 13 is a sequence chart of the imaging process.
FIG. 14 is a diagram illustrating imaging settings of the imaging device.
FIG. 15 is a diagram illustrating a composite image.
FIGS. 16 to 18 are diagrams illustrating adjustment screens.
FIG. 19 is a diagram illustrating screens displayed on the display unit in the main imaging process.
FIG. 20 is a diagram illustrating an editing screen.
FIG. 21 is a diagram illustrating the functional configuration of a camera control unit and a computer control unit in Modification 1.
FIG. 22 is a block diagram illustrating the imaging process in Modification 1.
FIG. 23 is a diagram illustrating the functional configuration of a camera control unit and a computer control unit in Modification 2.
FIG. 24 is a block diagram illustrating the imaging process in Modification 2.
The embodiments will be described below in the following order.
<1. Overview of the imaging system>
<2. Imaging device>
<3. Configuration of Computer 3>
<4. Processing steps of imaging system 1>
<5. Modifications>
<6. Summary>
In this technology, "imaging" includes not only imaging that involves recording of image data, but also imaging for displaying an image on a display unit without recording of image data, such as a so-called through image or live view image.
The term "image" does not only refer to an image displayed on a display unit; image data that is not displayed on a display unit may also be referred to as an "image".
<1. Overview of the imaging system>
FIG. 1 is a diagram illustrating an imaging system 1 according to an embodiment. FIG. 2A is a diagram illustrating a captured image 101. FIG. 2B is a diagram illustrating a CG image 102. FIG. 2C is a diagram illustrating a composite image 103. As shown in FIG. 1, the imaging system 1 includes an imaging device 2 and a computer 3.
The imaging device 2 is placed, for example, at an imaging site, and captures an image of a subject at the imaging site to generate a captured image (moving image) 101 as shown in FIG. 2A. There may be only one imaging device 2, but here a case where two imaging devices 2 are provided will be described. When the two imaging devices 2 are described separately, one is referred to as imaging device 2a and the other as imaging device 2b.
The imaging devices 2 are connected to each other, for example, wirelessly, and can transmit and receive images and various information to and from each other. Note that the imaging devices 2 may be connected to each other by wire, or may be connected to each other via a cloud server or the like.
In FIG. 1, the imaging device 2a is held by a cameraman 8 and captures an image of a subject in response to the operation of the cameraman 8.
In FIG. 1, the imaging device 2b is fixed at a predetermined position and in a predetermined direction by, for example, a tripod, and captures an image of a subject in synchronization with the imaging device 2a using a known synchronization method.
The imaging devices 2a and 2b are capable of capturing images of a subject from different positions and directions. In the following, the imaging devices 2a and 2b are described as capturing images without moving from their predetermined positions, but they may also capture images while being moved.
The subject can be anything. Here, as an example, a case will be described in which three people 6 are running from the back (farther from the imaging device 2) to the front (closer to the imaging device 2) on a sidewalk 5 adjacent to a river 4, and a bridge 7 spans the river 4 and the sidewalk 5.
The computer 3 is assumed to be placed at a location different from the imaging site, but may also be placed at the imaging site. The computer 3 is connected to the imaging device 2, for example, wirelessly, and receives the captured image 101 from the imaging device 2 and transmits the composite image 103 to the imaging device 2.
When the computer 3 receives a captured image 101 from the imaging device 2 (one or both of the imaging devices 2a and 2b), and it is necessary to synthesize a CG image 102 with the captured image 101, the computer 3 generates a CG image 102 as shown in FIG. 2B based on spatial information and meta information described later.
The computer 3 then composites the captured image 101 with the CG image 102 to generate a composite image 103 as shown in FIG. 2C. Note that the composite image 103 may also include frames in which no CG image 102 is composited. The computer 3 then transmits the generated composite image 103 to, for example, the imaging device 2a.
When the imaging device 2a receives the composite image 103 from the computer 3, it displays the received composite image 103 on the display unit 13 (see FIG. 3). This makes it possible for the imaging system 1 to allow the cameraman 8 to check the composite image 103 as a moving image in almost real time while the imaging device 2 is capturing an image. Details of the imaging system 1 are described below.
<2. Imaging device>
FIG. 3 is a diagram illustrating the external configuration of the imaging device 2. FIG. 4 is a diagram illustrating the functional configuration of the imaging device 2.
As shown in FIG. 3, the imaging device 2 includes an imaging unit 11, a ToF (Time of Flight) sensor 12, a display unit 13, and an operation unit 14.
As shown in FIG. 4, the imaging unit 11 includes, for example, a lens system 21, an imaging element unit 22, a camera process unit 23, a recording unit 24, a communication unit 25, a camera control unit 26, a memory unit 27, and a driver unit 28.
The lens system 21 includes lenses such as a zoom lens and a focus lens, an aperture mechanism, and the like. The lens system 21 focuses light (incident light) from the subject onto the imaging element unit 22.
The lens system 21 may be provided integrally with the imaging device 2, or may be configured as an interchangeable lens separate from the imaging device 2.
The imaging element unit 22 includes an image sensor (imaging element), such as a CMOS (Complementary Metal Oxide Semiconductor) type or a CCD (Charge Coupled Device) type.
In the imaging element unit 22, light received by the image sensor is photoelectrically converted to obtain an electric signal, and the electric signal is subjected to, for example, CDS (Correlated Double Sampling) processing, AGC (Automatic Gain Control) processing, and A/D (Analog/Digital) conversion processing. The imaging element unit 22 can output the image data as digital data after A/D conversion to the camera process unit 23, the recording unit 24, and the communication unit 25 in the subsequent stages.
The camera process unit 23 is configured as an image processing processor using, for example, a DSP (Digital Signal Processor).
The camera process unit 23 performs various kinds of signal processing on the digital data (image signal) from the imaging element unit 22 to generate image data in a predetermined format. For example, the camera process unit 23 performs lens correction, noise reduction, synchronization processing, YC generation processing, color reproduction/sharpness processing, file formation processing, and the like.
In the synchronization processing, color separation processing is performed so that the image data for each pixel has all of the R, G, and B color components. For example, in the case of an image sensor using a mosaic color filter such as a Bayer array, demosaic processing is performed as the color separation processing.
In the YC generation processing, a luminance (Y) signal and a color (C) signal are generated (separated) from the R, G, and B image data.
In the color reproduction/sharpness processing, gradation, saturation, tone, contrast, and the like are adjusted as so-called picture making.
In the file formation processing, the image data is subjected to, for example, compression encoding for recording or communication, formatting, and generation and addition of meta information, to generate a file for recording or communication.
For example, an image file may be generated in the MP4 format, which is used for compressing moving images conforming to MPEG-4, or in the XAVC format.
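As a concrete illustration of the YC generation processing, the following Python sketch separates a luminance signal and two color-difference signals from R, G, B data using BT.601 weights, which is one common choice; the actual coefficients used by the camera process unit 23 are not specified in this disclosure.

```python
import numpy as np

def yc_generation(rgb: np.ndarray):
    """Generate (separate) a luminance (Y) signal and color-difference (C)
    signals from R, G, B image data, here with BT.601 weights."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 0.564 * (b - y)   # blue-difference chroma
    cr = 0.713 * (r - y)   # red-difference chroma
    return y, cb, cr

y, cb, cr = yc_generation(np.random.rand(4, 4, 3))
print(y.shape, cb.shape, cr.shape)  # (4, 4) each
```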
The recording unit 24 is, for example, a memory card (portable flash memory or the like) that is a recording medium attachable to and detachable from the imaging device 2, or a flash memory or HDD (Hard Disk Drive) built into the imaging device 2. The recording unit 24 records the image data output from the imaging element unit 22 or the image data output from the camera process unit 23. The recording unit 24 also records meta information for generating a CG image 102 to be combined with the captured image 101 in association with the image data. The meta information is described in detail later.
The image data output from the imaging element unit 22 is RAW image data that has not been subjected to image processing by the camera process unit 23. The recording unit 24 may therefore record RAW image data, or may record image data after image processing by the camera process unit 23.
In the following, it is assumed that RAW image data is recorded in the recording unit 24. Note that the RAW image data may be either uncompressed RAW image data or compressed RAW image data.
The display unit 13 is composed of a display device such as a liquid crystal display (LCD) panel or an organic EL (Electro-Luminescence) display, and any number of display units may be provided. In the example of FIG. 3, two display units 13 are provided, but the display area of one display unit 13 may be divided into two, and the two divided display areas may be used as display areas for the images of the two imaging devices.
The display unit 13 may be provided integrally with the imaging unit 11 or may be provided separately. As an example of the display unit 13 being provided separately from the imaging unit 11, the display unit of a smartphone may be used as the display unit 13. In this case, it is sufficient that the imaging device 2 and the smartphone are connected wirelessly or by wire and can transmit and receive images and information to and from each other. The display control unit 33 described later then transmits images via the communication unit 25, and the communication unit 25 receives control signals corresponding to operations on the smartphone.
The display unit 13 displays various images on its display screen. For example, the display unit 13 displays a reproduced image of image data read from the recording unit 24.
The display unit 13 also displays various operation menus, icons, messages, and the like, that is, a GUI (Graphical User Interface), on the screen.
The communication unit 25 performs wireless or wired communication with external devices (the other imaging device 2 and the computer 3). For example, the communication unit 25 transmits the captured image 101 to the computer 3 and receives the composite image 103.
The operation unit 14 collectively refers to input devices with which the user performs various operation inputs. Specifically, the operation unit 14 includes various operators (keys, dials, a touch panel) provided on the imaging unit 11, a touch panel 14a provided on the front surface of the display unit 13, and the like.
When a user operation is detected by the operation unit 14, a signal corresponding to the input operation is sent to the camera control unit 26.
The camera control unit 26 is configured by a microcomputer (arithmetic processing device) including a CPU (Central Processing Unit). The camera control unit 26 includes functional units serving as a spatial information acquisition unit 31, a meta information acquisition unit 32, a display control unit 33, a recording control unit 34, and an adjustment unit 35. Details of these functional units are described later.
The memory unit 27 stores information and the like used by the camera control unit 26 for processing. The illustrated memory unit 27 comprehensively represents, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
The memory unit 27 may be a memory area built into the microcomputer chip serving as the camera control unit 26, or may be configured by a separate memory chip.
The camera control unit 26 controls the entire imaging device 2 by executing programs stored in the ROM of the memory unit 27, the recording unit 24, and the like.
For example, the camera control unit 26 controls the operation of each necessary unit with respect to control of the shutter speed of the imaging element unit 22, instructions for various kinds of signal processing in the camera process unit 23, imaging and recording operations in response to user operations, reproduction of recorded image data, operations of the lens system 21 such as zoom, focus, and aperture adjustment, user interface operations, and the like.
The RAM in the memory unit 27 is used as a work area for various data processing by the CPU of the camera control unit 26, and is used for temporarily storing data, programs, and the like.
The ROM and flash memory (non-volatile memory) in the memory unit 27 are used to store the OS (Operating System) with which the CPU controls each unit, application programs for various operations, firmware, various setting information, and the like.
The driver unit 28 includes, for example, a motor driver for the zoom lens drive motor, a motor driver for the focus lens drive motor, and a motor driver for the motor of the aperture mechanism.
These motor drivers apply drive currents to the corresponding drivers in response to instructions from the camera control unit 26 to move the focus lens and zoom lens, open and close the aperture blades of the aperture mechanism, and so on.
The ToF sensor 12 includes, for example, an irradiation unit that emits infrared laser light (irradiation light), and a sensor, for example of a CMOS type, that receives the light emitted by the irradiation unit and reflected by the subject. The ToF sensor 12 calculates the distance from the imaging device 2 to the subject using the ToF (Time of Flight) method.
The ToF sensor 12 can use a direct ToF method, which calculates the distance by directly determining the time it takes for the irradiated light to return as reflected light after being reflected by the subject, or an indirect ToF method, which calculates the distance to the subject based on the phase difference between the irradiated light and the reflected light.
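Both measurement principles reduce to short formulas. The Python sketch below shows the direct method (distance from round-trip time) and the indirect method (distance from the phase difference at modulation frequency f); the example numbers are illustrative only.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def direct_tof_distance(round_trip_s: float) -> float:
    """Direct ToF: half the distance light travels in the round-trip time."""
    return C * round_trip_s / 2.0

def indirect_tof_distance(phase_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance from the phase difference between irradiated
    and reflected light (unambiguous up to c / (2 f))."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(direct_tof_distance(33.4e-9))              # ~5.0 m
print(indirect_tof_distance(math.pi / 2, 15e6))  # ~2.5 m
```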
<3. Configuration of Computer 3>
FIG. 5 is a diagram illustrating the configuration of the computer 3.
The computer 3 is assumed to be a personal computer (PC), a mobile terminal device such as a smartphone or tablet, a mobile phone, a video editing device, video playback equipment, or the like. The computer 3 may also be configured as a server device or a computing device in cloud computing.
The computer 3 may therefore be a smartphone used as the display unit 13, or may be the imaging device 2.
Furthermore, the computer 3 may be a different device for each process described later, or for each one or more functional units.
The computer 3 includes a computer control unit 41, a recording unit 42, a display unit 43, a communication unit 44, an operation unit 45, and a memory unit 46.
The computer control unit 41 is configured by a microcomputer (arithmetic processing device) including a CPU. The computer control unit 41 includes functional units serving as a CG correction unit 51, an image acquisition unit 52, a CG image generation unit 53, a CG synthesis unit 54, and an editing unit 55. Details of these functional units are described later.
The recording unit 42 is, for example, a memory card (portable flash memory or the like) that is a recording medium attachable to and detachable from the computer 3, or a flash memory or HDD (Hard Disk Drive) built into the computer 3.
The recording unit 42 records the captured image 101 (RAW image data) received from the imaging device 2, and records a CG file or the like that includes various information for generating a CG image 102 to be combined with the captured image 101. Note that the captured image 101 received from the imaging device 2 may be image data that has been subjected to various kinds of signal processing by the camera process unit 23.
The display unit 43 displays various information to the user, and includes, for example, a liquid crystal panel or an organic EL display.
The communication unit 44 performs wireless or wired communication with an external device (the imaging device 2). For example, the communication unit 44 receives the captured image 101 from the imaging device 2 and transmits the composite image 103.
The operation unit 45 collectively refers to input devices with which the user performs various operation inputs. Specifically, the operation unit 45 includes various operators (keyboard, mouse, keys, dials, touch panel, etc.).
When a user operation is detected by the operation unit 45, a signal corresponding to the input operation is sent to the computer control unit 41.
The memory unit 46 comprehensively represents, for example, a ROM, a RAM, a flash memory, and the like. The memory unit 46 may be a memory area built into the microcomputer chip serving as the computer control unit 41, or may be configured by a separate memory chip.
The computer control unit 41 controls the entire computer 3 by executing programs stored in the ROM of the memory unit 46, the recording unit 42, and the like.
The RAM in the memory unit 46 is used as a work area for various data processing by the CPU of the computer control unit 41, for temporarily storing data, programs, and the like.
The ROM and flash memory in the memory unit 46 are used to store the OS with which the CPU controls each unit, application programs for various operations, various setting information, and the like.
<4. Processing steps of imaging system 1>
FIG. 6 is a diagram illustrating the processing steps of the imaging system 1. The steps performed in the imaging system 1 are explained here. As shown in FIG. 6, the processing steps of the imaging system 1 are roughly divided into three stages: a preparation process ST1, an imaging process ST2, and a post-processing step ST3.
The preparation process ST1 is a process carried out prior to the imaging process ST2. In the preparation process ST1, spatial information (a spatial map) of the imaging space is acquired by the imaging device 2, and CG parameters for generating the CG image 102 are determined and corrected by the computer 3 using the acquired spatial information.
The imaging process ST2 is a process in which imaging is actually performed at an imaging site such as that shown in FIG. 1. The imaging process ST2 includes a rehearsal imaging process in which rehearsal imaging is performed prior to the main imaging, and a main imaging process in which the main imaging is performed. In the rehearsal imaging process and the main imaging process, a CG image 102 is composited in almost real time with the captured image 101 obtained by the imaging device 2 and displayed on the display unit 13 as a composite image 103. In the rehearsal imaging process, processing for adjusting the CG parameters in response to operations on the operation unit 14 is also performed.
The post-processing step ST3 refers to various kinds of processing performed after imaging. For example, CG parameter adjustment, image adjustment, clip editing, and video effects are performed.
Image adjustment may include color adjustment, brightness adjustment, contrast adjustment, and the like.
Clip editing may include cutting clips, adjusting their order, adjusting their durations, and the like.
As image effects, special effect images may be composited, and so on.
These steps are described in detail below.
[4.1. Preparation process]
FIG. 7 is a sequence chart of the preparation process. FIG. 8 is a diagram illustrating distance information.
In the preparation process ST1, the ToF sensor 12 of one imaging device 2 (for example, imaging device 2a) is used to sense the imaging space that will be the background in the main imaging. The cameraman 8 may sense the imaging space while moving the imaging device 2a so that the ToF sensor 12 can sense the distance to the subject in the imaging space that will appear as the background in the main imaging. At this time, it is preferable that there are no moving subjects, such as the person 6 shown in FIG. 1.
In the imaging device 2a, a captured image 101 may be acquired by the imaging unit 11 together with the acquisition of distance information by the ToF sensor 12. In this case, it is preferable to match the imaging frame rate and imaging timing of the imaging unit 11 with the sensing rate and sensing timing of the ToF sensor 12.
However, it is not essential that the imaging frame rate and imaging timing of the imaging unit 11 match, that is, are synchronized with, the sensing rate and sensing timing of the ToF sensor 12; distance information corresponding to the imaging timing of each frame of the captured image may be calculated by interpolation processing or the like.
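The interpolation mentioned above can be as simple as the linear scheme sketched below in Python: given ToF samples tagged with sensing timestamps, a distance value is estimated at the capture timestamp of each image frame. The linear model and the names are assumptions; the disclosure only says "interpolation processing or the like".

```python
def interpolate_depth(t_image: float, tof_samples):
    """Linearly interpolate ToF samples [(t, distance), ...], sorted by t,
    to the capture timestamp of an image frame."""
    for (t0, d0), (t1, d1) in zip(tof_samples, tof_samples[1:]):
        if t0 <= t_image <= t1:
            w = (t_image - t0) / (t1 - t0)
            return d0 + w * (d1 - d0)
    return None  # timestamp outside the sensed interval

samples = [(0.000, 4.00), (0.040, 3.80), (0.080, 3.50)]
print(interpolate_depth(0.033, samples))  # distance at a 30 fps frame time
```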
The ToF sensor 12 detects the distance to the subject, for example, at the sensing rate. The spatial information acquisition unit 31 of the imaging device 2a acquires distance information indicating the distance to the subject from the ToF sensor 12 (step S1). The spatial information acquisition unit 31 then acquires spatial information (a spatial map) of the imaging space from the acquired distance information (step S2). In FIG. 8, the imaging space for which spatial information has been acquired is shown as a mesh; areas without a mesh are areas for which spatial information has not yet been acquired.
The spatial information may include not only the position (coordinates) of the imaging space relative to a predetermined reference position (here, the position of the imaging device 2a), but also the shapes of structures in the imaging space and their positions relative to the reference position when the imaging device 2 captures images.
The reference position may be, for example, the position where the imaging device 2 is placed during the main imaging, or may be any other position.
Here, the spatial information acquisition unit 31 may calculate the spatial information using the captured image 101 acquired by the imaging unit 11 in addition to, or instead of, the distance information acquired from the ToF sensor 12. For example, distance information may be calculated by image processing from the captured image 101 acquired by the imaging unit 11, and continuous areas of the same color may be determined to be one structure based on the color information of the captured image 101. This makes it possible to calculate the spatial information with high accuracy.
The spatial information is described here as three-dimensional spatial information, but it may also be two-dimensional spatial information (a depth map). Furthermore, by also acquiring the captured image 101, it is possible to obtain spatial information that includes texture.
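To suggest how distance information becomes a spatial map, the following Python sketch back-projects a two-dimensional depth map into three-dimensional points relative to the reference position, assuming a pinhole-camera model with known intrinsics (fx, fy, cx, cy). The intrinsics and the pinhole assumption are illustrative; the disclosure does not prescribe a reconstruction method.

```python
import numpy as np

def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a depth map into 3D points (a simple spatial map)."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1)  # shape (h, w, 3)

cloud = depth_to_points(np.full((480, 640), 3.0),
                        fx=525.0, fy=525.0, cx=320.0, cy=240.0)
print(cloud.shape)  # (480, 640, 3)
```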
The communication unit 25 transmits the acquired spatial information to the computer 3 (step S3). The computer 3 receives the spatial information transmitted from the imaging device 2a (step S11). This enables the computer 3 to correct the CG parameters based on the received spatial information and the CG file.
FIG. 9 is a diagram illustrating the CG parameter correction process. FIG. 10 is a diagram illustrating an example of CG parameters.
On the computer 3, a CG correction screen 62 as shown in FIG. 9 is displayed on the display unit 43. The user can determine and correct desired CG parameters by operating the CG correction screen 62 via the operation unit 45. Note that the CG correction screen 62 is merely an example, and any configuration may be used as long as CG parameters can be determined and corrected using the spatial information.
The CG correction unit 51 displays a CG image 102 based on the three-dimensional imaging space indicated by the spatial information transmitted from the imaging device 2 in the CG display area 63 of the CG correction screen 62.
The CG correction unit 51 regenerates the CG image 102 while applying corrections based on user operations on the CG correction screen 62 as appropriate, displays it in the CG display area 63, and corrects the CG parameters in accordance with those user operations (step S12).
On the CG correction screen 62, it is possible to determine and correct CG parameters such as the placement area, type, color tone, size, number (density), target subject, start timing, drawing speed, and shading (light source direction, intensity, color temperature) of CG objects such as plants and stones. These CG parameters can be said to be parameters for generating the CG image 102.
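Viewed as data, these CG parameters amount to a simple record. A minimal sketch follows (Python; the field names are illustrative assumptions mirroring the items listed above, not terminology defined by this disclosure):

from dataclasses import dataclass, field

@dataclass
class CGParameters:
    # Parameters for generating the CG image 102, mirroring the items
    # adjustable on the CG correction screen 62.
    placement_area: tuple        # address (composite position) for the CG object
    object_type: str             # e.g. "plant" or "stone"
    hue: tuple                   # color tone chosen on the color wheel 66
    size: float
    count: int                   # number (density) of CG objects
    target_subject: str          # subject that triggers composition
    start_position: tuple        # in-space address (x, y, z) used as start timing
    drawing_speed_s: float       # time from composition start to composition end
    shading: dict = field(default_factory=dict)   # light direction, intensity, color temperature

params = CGParameters(placement_area=(10, 50), object_type="plant",
                      hue=(0.3, 0.8, 0.4), size=1.0, count=3,
                      target_subject="person6", start_position=(10, 50, 30),
                      drawing_speed_s=2.0)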
The CG correction unit 51 determines and modifies a placement area 63a in which a CG object is placed in the three-dimensional imaging space shown in the spatial information, in response to a user operation on the CG display area 63. The placement area 63a determined and modified here is indicated, for example, by an address (composite position) on the CG image 102 at which the CG object is drawn. However, the placement area 63a may also be an address in the imaging space at which the CG object is placed.
In addition to the CG display area 63, the CG correction screen 62 displays a type selection area 64 for selecting the type of CG object, a number-changing dial 65 for changing the number of CG objects, a color wheel 66 for selecting the color tone of the CG image 102, and a seek bar 67 for changing the drawing speed of the CG image 102.
From among the multiple CG object types displayed in the type selection area 64, the CG correction unit 51 sets the CG object selected by the user via the operation unit 45 as the type of CG object to be rendered as the CG image 102.
Furthermore, the CG correction unit 51 corrects the number of CG objects to be drawn as the CG image 102 in the placement area 63a to the number (density) corresponding to the value selected by the number-changing dial 65. Note that the number-changing dial 65 is displayed so as to rotate in response to a user operation via the operation unit 45. Therefore, the CG correction unit 51 variably determines the number of CG objects to be drawn as the CG image 102 in the placement area 63a in response to the operation of the rotating number-changing dial 65.
Furthermore, the CG correction unit 51 sets the color tone for rendering the CG image 102 to the tone selected on the color wheel 66.
The start timing among the CG parameters here indicates the timing at which composition of the CG image 102 with the captured image 101 starts, i.e., the start trigger. The drawing speed indicates the time from the start to the end of composition of the CG image 102. Note that the drawing speed may be specified by the number of frames rather than by time.
The CG correction unit 51 determines and corrects the target subject and the start timing in response to the user's operation of the operation unit 45. Specifically, the CG correction unit 51 determines the target subject in accordance with the user operation, and, as the start timing, can specify the distance to a specific subject or a composition start position, designated by an in-space address of the imaging space, that indicates where the target subject must be for composition to begin.
The CG correction unit 51 can also change the drawing speed of the CG image 102 in response to the operation of the seek bar 67. For example, moving the seek bar 67 to the left increases the time from when the synthesis of the CG image 102 begins to when it ends. Moving the seek bar 67 to the right decreases the time from when the synthesis of the CG image 102 begins to when it ends.
For example, as shown in FIG. 10, suppose that a specific person 6 is specified as the target subject, the in-space address (x, y, z) = (10, 50, 30) of the imaging space is specified as the start timing, two seconds is specified as the drawing speed, (X, Y) = (10, 50) is specified as the placement area (composite position), and three is specified as the number of CG objects.
In this case, synthesis of the CG image 102 starts when the specific person 6 reaches (x, y, z) = (10, 50, 30) in the imaging space. At this time, the CG image 102 is generated in which a CG object is drawn at (X, Y) = (10, 50). Note that the specific person 6 may be identified by image processing of the captured image 101. The position (address) of the specific person 6 in the imaging space is calculated based on the distance information (in-space position information) obtained from the ToF sensor 12.
Also, a CG image 102 is generated for each frame in which, for example, the CG objects gradually appear toward the front at predetermined intervals over the two seconds from the start of composition.
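Under the assumptions of this example, the trigger check and the gradual appearance can be sketched as follows (Python; the tolerance, the linear appearance schedule, and the helper names are illustrative assumptions, not a prescribed implementation):

def composition_triggered(subject_pos, start_position=(10, 50, 30), tol=0.5):
    # Start trigger: has the specific person 6 reached the in-space
    # address given as the start timing (within a tolerance)?
    return all(abs(a - b) <= tol for a, b in zip(subject_pos, start_position))

def objects_visible(t_since_trigger, drawing_speed_s=2.0, total=3):
    # Number of CG objects drawn t seconds after the trigger; they
    # appear gradually over the drawing speed (here, two seconds).
    if t_since_trigger < 0:
        return 0
    progress = min(t_since_trigger / drawing_speed_s, 1.0)
    return round(progress * total)

assert composition_triggered((10.2, 49.8, 30.1))
assert objects_visible(1.0) == 2   # one second in, two of the three objects are drawn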
The CG correction unit 51 associates the CG parameters and spatial information corrected in this way with the CG file and records them in the recording unit 42 (step S13).
FIG. 11 is a diagram illustrating a CG image 102 based on CG parameters. As described above, the CG parameters are modified in response to user operations. As a result, the CG image generation unit 53 generates the CG image 102 based on the modified CG parameters.
As shown in FIG. 11 from top to bottom, it is therefore possible to generate, based on the CG parameters, a CG image 102 in which, for example, plants and rocks appear gradually from the rear toward the front.
FIG. 12 is a diagram for explaining the meta information. As shown in FIG. 12, the meta information includes a CG file ID, CG parameters, in-space position information, and imaging setting information.
As described above, the CG file ID is an identifier for identifying the determined CG file.
The in-space position information is information indicating the distance to the subject sensed by the ToF sensor 12, that is, the position (address) of the subject in the imaging space.
The imaging setting information is information regarding imaging settings of the imaging device 2 that are set in an imaging step ST2 described later.
Note that the meta information also includes items, such as the start timing, that are needed to determine at what timing the CG image 102 is displayed. As long as the CG image 102 can be composited at the appropriate timing and position, the meta information may include information other than the CG file ID, CG parameters, distance information, and imaging setting information, or some of these items may be omitted.
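For illustration only, such a meta information record might be held as follows (Python; the key names and values simply mirror FIG. 12 and are illustrative assumptions, not a format defined by this disclosure):

meta_info = {
    "cg_file_id": "CG-0001",            # identifies the decided CG file
    "cg_parameters": {                   # parameters for generating the CG image 102
        "placement_area": (10, 50),
        "count": 3,
        "drawing_speed_s": 2.0,
    },
    "in_space_position": (10, 50, 30),   # subject position sensed by the ToF sensor 12
    "imaging_settings": {                # settings decided in the imaging step ST2
        "project_fps": 59.94,
        "video_format": "progressive",
        "color_space": "S-Gamut3",
    },
}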
[4.2. Imaging process]
Next, the imaging process will be described. The imaging process is mainly divided into a rehearsal imaging process and a main imaging process. In the rehearsal imaging process, the imaging device 2 is caused to image the subject in the same manner as in the main imaging process. Note that the rehearsal imaging process does not necessarily have to be provided.
FIG. 13 is a diagram showing a sequence chart in the imaging process. FIGS. 14A to 14D are diagrams explaining the imaging settings of the imaging device 2. When the rehearsal imaging process is started, an imaging setting screen (camera setup screen) 71 for determining the imaging settings of the imaging device 2 is displayed on the display unit 13 of the imaging device 2a. Note that the display order of the screens shown in FIGS. 14A to 14D does not have to be the order described below, and may be any other order.
As shown in FIG. 14A, the imaging setting screen 71 displays operation items for selecting and setting the resolution of the image sensor of the imaging element section 22 (Imager Scan); the numbers of pixels in the vertical and horizontal directions, i.e., the image size, together with the video format indicating either interlaced or progressive (Video Format); the frame rate of the output image (Project FPS); the 3D LUT used for color correction of the so-called live view image (monitor image) displayed on the display unit 13 for monitoring during imaging (Monitor LUT); and the color space (Color Space) (step S21).
When each operation item is operated via the touch panel 14a, a list of options that can be selected for the operated selection item is displayed. For example, when the operation item for the frame rate (Project FPS) is operated, a selection screen 72 listing the selectable frame rates is displayed, as shown in FIG. 14B. When any value on the selection screen 72 is selected via the touch panel 14a, the frame rate is changed to the selected value, and the imaging setting screen 71 shown in FIG. 14A is displayed again.
When the imaging settings for all operation items have been determined, the meta information acquisition unit 32 acquires imaging setting information indicating the determined imaging settings as meta information, and the recording control unit 34 records them in the recording unit 24.
Furthermore, once the imaging settings have been determined, a CG selection screen 73 for selecting a CG file is displayed, as shown in FIG. 14C. Here, the names, IDs, and the like of the selectable CG files are displayed. Then, when any CG file is selected on the CG selection screen 73 via operation of the touch panel 14a, the selected CG file is confirmed.
Once the CG file is determined, the meta information acquisition unit 32 acquires the CG file ID of the determined CG file as meta information, and the recording control unit 34 records it in the recording unit 24.
After that, as shown in FIG. 14D, an imaging device selection screen 74 for selecting the imaging device 2 to be used in the rehearsal imaging process and the main imaging process is displayed on the display unit 13. Then, when the imaging device 2 to be used in the rehearsal imaging process and the main imaging process is determined by operating the touch panel 14a, the communication unit 25 transmits imaging setting information to the determined imaging device 2 directly or via the computer 3. This makes it possible to cause multiple imaging devices 2 to perform imaging with the same imaging settings. Note that here, for example, an imaging device 2 that can link with a smartphone as the display unit 13 is selected.
The communication unit 25 also transmits, as meta information, the imaging setting information of the determined imaging device 2, the CG file ID, and the identification of the imaging device 2 to the computer 3 (step S21). The computer 3 records the received imaging setting information of the imaging device 2 and the CG file ID in the recording unit 42 as meta information.
When the various settings in the rehearsal imaging process are thus completed, the imaging device 2 starts rehearsal imaging in response to an operation by the cameraman 8 or the like (step S22). Note that, when there are two imaging devices 2, the two imaging devices 2 start rehearsal imaging in synchronization with each other.
When the rehearsal imaging is started, the imaging device 2 obtains a captured image 101. The camera processing unit 23 sequentially performs image processing on the captured image 101 (RAW image data). The communication unit 25 sequentially transmits the captured image 101 (image data) after image processing to the computer 3. Note that the communication unit 25 may transmit the RAW image data to the computer 3.
In addition, the meta information acquisition unit 32 acquires distance information that indicates the distance to the subject calculated by the ToF sensor 12, and the communication unit 25 transmits the distance information to the computer 3 as in-space position information.
FIG. 15 is a diagram illustrating the composite image 103. The computer 3 first receives the meta information for the captured image 101. Thereafter, the computer 3 receives the in-space position information almost in real time. The CG image generation unit 53 then reads out the CG file and CG parameters corresponding to the CG file ID included in the meta information from the recording unit 42, and uses the CG file to generate a CG image 102 to be composited with each frame of the captured image 101 based on the CG parameters and the in-space position information (step S31).
Thereafter, the CG synthesis unit 54 synthesizes each frame of the captured image 101 received from the imaging device 2 with the CG image 102 generated by the CG image generation unit 53 to generate a composite image 103. Note that the captured image 101 may be recorded in the recording unit 42.
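A minimal sketch of this per-frame composition, assuming the CG image 102 carries an alpha channel (Python with NumPy; generate_cg_image is a hypothetical stand-in for the processing of the CG image generation unit 53, not an API defined by this disclosure):

import numpy as np

def composite_frame(captured_rgb, cg_rgba):
    # Alpha-over composite of the CG image onto one captured frame.
    # Both arrays are floats in [0, 1]; cg_rgba has four channels.
    alpha = cg_rgba[..., 3:4]
    return cg_rgba[..., :3] * alpha + captured_rgb * (1.0 - alpha)

def rehearsal_loop(frames, cg_params, positions, generate_cg_image):
    # For each received frame, generate the matching CG image from the
    # CG parameters and in-space position, composite, and yield it.
    for frame, subject_pos in zip(frames, positions):
        cg = generate_cg_image(cg_params, subject_pos)   # hypothetical generator
        yield composite_frame(frame, cg)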
Then, the communication unit 44 transmits the generated composite image 103 to the imaging device 2a (step S32).
Here, the composite image 103 is transmitted only to the imaging device 2a that has been preset, while the composite image 103 is not transmitted to the imaging device 2b that has not been preset.
When the imaging device 2a receives the composite image 103, the display control unit 33 displays the received composite image 103 on the display unit 13 (step S23).
In this way, in the imaging device 2a, as shown from the top to the bottom of FIG. 15, it is possible to display a composite image 103 on the display unit 13 almost in real time while the subject is being imaged.
In other words, in the example of FIG. 15, a composite image 103 in which plants, stones, and the like appear in time with the running of the specific person 6 is displayed on the display unit 13.
FIG. 16 is a diagram illustrating the adjustment screen. In response to user operations during the rehearsal imaging process, the display unit 13 of the imaging device 2a displays, as shown in FIG. 16, a composite image display area 81 in which the composite image 103 is displayed, a number-changing dial 82 for changing the number (density) of CG objects, and a color wheel 83 for changing the color tone of the CG image 102. The CG parameters can be adjusted by operating the number-changing dial 82 and the color wheel 83. At this time, the imaging device 2a may obtain the CG parameters from the computer 3 and record them in the recording unit 24.
When the number-changing dial 82 is operated, the adjustment unit 35 changes (adjusts) the number (density) of CG objects in response to the operation. Note that the number-changing dial 82 is displayed and rotated in response to a user operation via the touch panel 14a.
For example, as shown in FIG. 17, when the number-changing dial 82 is rotated counterclockwise via the touch panel 14a, the number (density) of CG objects is decreased, whereas when the number-changing dial 82 is rotated clockwise, the number (density) of CG objects is increased.
Specifically, when the number-changing dial 82 is operated, the recording control unit 34 updates the number of CG objects among the CG parameters in accordance with the operation of the number-changing dial 82, and records the updated CG parameters in the recording unit 24. In addition, the communication unit 25 transmits the updated CG parameters to the computer 3 (step S24).
When the computer 3 receives the CG parameters, it regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2 (step S33). Note that, in this case, the composite image 103 may be regenerated using the captured image recorded in the recording unit 42.
As a result, the imaging device 2 can display on the display unit 13 a composite image 103 into which a CG image 102 with a reduced number of CG objects has been composited in response to the operation of the number-changing dial 82, as shown in the lower part of FIG. 17, for example.
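A minimal sketch of how the dial operation might map onto the count parameter before being recorded and retransmitted (Python; the rotation-to-count step size, the bounds, and the function name are illustrative assumptions):

def adjust_count_by_dial(cg_params, rotation_deg, step_deg=30,
                         min_count=0, max_count=50):
    # One count step per step_deg of rotation: clockwise (positive)
    # increases, counterclockwise (negative) decreases, the density.
    delta = int(rotation_deg / step_deg)
    new_count = cg_params["count"] + delta
    cg_params["count"] = max(min_count, min(max_count, new_count))
    return cg_params

cg_params = {"count": 10}
adjust_count_by_dial(cg_params, rotation_deg=-60)   # two steps counterclockwise
assert cg_params["count"] == 8
# The updated parameters would then be recorded in the recording unit 24
# and sent to the computer 3 for regeneration (step S24).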
Furthermore, when the color wheel 83 is operated, the adjustment unit 35 can display a composite image 103 in which the CG image 102 is adjusted to the hue selected by the color wheel 83 .
Specifically, when the color wheel 83 is operated, the recording control unit 34 updates the color tone, which is one of the CG parameters, in accordance with the operation of the color wheel 83 and records the updated color tone in the recording unit 24 .
Furthermore, the communication unit 25 transmits the updated CG parameters to the computer 3. Upon receiving the CG parameters, the computer 3 regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2. This enables the imaging device 2 to display on the display unit 13 the composite image 103 obtained by combining the CG image 102 with a color tone corresponding to the operation of the color wheel 83.
Furthermore, the adjustment unit 35 can display a composite image 103 in which the drawing speed of the CG image 102 is updated by swiping the composite image display area 81 left or right.
As shown in FIG. 18, for example, by swiping the composite image display area 81 to the left, a composite image 103 in which the drawing speed of the CG image 102 is slowed down is displayed on the display unit 13. By swiping the composite image display area 81 to the right, a composite image 103 in which the drawing speed of the CG image 102 is increased is displayed on the display unit 13.
Specifically, when the composite image display area 81 is swiped left or right, the recording control unit 34 updates the drawing speed, which is one of the CG parameters, in accordance with the amount of the swipe and records the updated CG parameters in the recording unit 24. In addition, the communication unit 25 transmits the updated CG parameters to the computer 3.
When the computer 3 receives the CG parameters, it regenerates the CG image 102 and the composite image 103 based on the received CG parameters, and transmits the composite image 103 to the imaging device 2. As a result, the imaging device 2 can generate a composite image 103 in which the drawing speed is slowed in accordance with the amount of leftward swiping on the composite image display area 81, so that the composition timing of the front tree among the trees that are CG objects is delayed, as shown in the lower part of FIG. 18, for example.
That is, a composite image 103 can be generated in which the time (number of frames) from the composition start timing to the composition end timing is extended.
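Similarly, the swipe operation can be sketched as a mapping onto the drawing speed (Python; the sensitivity, the clamping bounds, and the frame conversion are illustrative assumptions):

def adjust_drawing_speed(cg_params, swipe_dx_px, s_per_px=0.01,
                         min_s=0.5, max_s=10.0):
    # A leftward swipe (negative dx) lengthens, and a rightward swipe
    # shortens, the time from composition start to composition end.
    speed = cg_params["drawing_speed_s"] - swipe_dx_px * s_per_px
    cg_params["drawing_speed_s"] = max(min_s, min(max_s, speed))
    return cg_params

cg_params = {"drawing_speed_s": 2.0}
adjust_drawing_speed(cg_params, swipe_dx_px=-100)   # swipe left: slower drawing
assert abs(cg_params["drawing_speed_s"] - 3.0) < 1e-9
frames = round(cg_params["drawing_speed_s"] * 60)   # 180 frames at 60 fps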
In this way, the adjustment unit 35 allows the cameraman 8 or the like to check the composite image 103 displayed in the rehearsal imaging process and to adjust the CG parameters.
When the CG parameters are adjusted (updated) in the rehearsal imaging process, the adjustment unit 35 records the adjusted CG parameters in the recording unit 24 as meta information, and the communication unit 25 transmits the CG parameters to the computer 3. When the computer 3 receives the adjusted CG parameters, it records the received CG parameters in the recording unit 42 as meta information.
This allows the CG parameters determined and modified in the preparation process ST1 to be adjusted to match the actual captured image 101.
Note that, here, the CG parameters are adjusted using the captured image 101 obtained by the imaging device 2a, but the CG parameters may also be adjusted using the captured image 101 obtained by the imaging device 2b in addition to the imaging device 2a.
In this case, for example, depending on the operation of the touch panel 14a by the cameraman 8 or the like, it may be possible to select whether to generate and display the composite image 103 from the captured image 101 obtained by the imaging device 2a or from the captured image 101 obtained by the imaging device 2b.
Then, a captured image 101 obtained by the imaging device 2 a or 2 b selected by the cameraman 8 or the like is transmitted to the computer 3 , and the computer 3 generates a CG image 102 and a composite image 103 .
Then, the generated composite image 103 is transmitted to the imaging device 2a, whereby the imaging device 2a displays on the display unit 13 a composite image 103 based on the captured image 101 obtained by the imaging device 2a or the imaging device 2b selected by the cameraman 8 or the like.
This allows the composite image 103 based on the captured image 101 obtained by the imaging device 2b to be checked on the imaging device 2a, and the CG parameters can be adjusted on the imaging device 2a using the composite image 103 based on the captured image 101 obtained by the imaging device 2b.
When the rehearsal imaging process is completed, the process moves to the main imaging process. FIG. 19 is a diagram explaining a screen displayed on the display unit 13 in the main imaging process.
In the main imaging process, imaging is started in the imaging device 2 as shown in FIG. 1, and a captured image 101 (RAW image data) is obtained (step S25).
The imaging device 2 sequentially transmits the captured images 101 to the computer 3.
Also, as in the rehearsal imaging process, the meta information acquisition unit 32 acquires distance information indicating the distance to the subject calculated by the ToF sensor 12 as in-space position information and records it as a CG parameter. The communication unit 25 also transmits the in-space position information to the computer 3.
The CG image generation unit 53 generates a CG image 102 to be composited with each frame of the captured image 101 based on the selected CG file and meta information. The CG composition unit 54 then composites the CG image 102 with each frame of the received captured image 101 to generate a composite image 103 (step S34).
Once the composite image 103 is generated, the communication unit 44 transmits the composite image 103 to the imaging device 2a (step S35).
By doing this, in the imaging device 2a, it becomes possible to display the composite image 103 on the display unit 13 almost in real time while the subject is being imaged, similar to the rehearsal imaging process (step S26).
At this time, the recording control unit 34 records the captured image 101 in the recording unit 24, and associates the CG file ID, CG parameters, imaging setting information of the imaging device 2, and the like with the image data as meta information. Note that these CG parameters can also be considered parameters for performing the post-processing step.
Here, the term "associate" means, for example, making one piece of information (data, a command, a program, or the like) usable (linkable) when processing another piece of information. That is, pieces of information associated with each other may be collected into a single file or the like, or may remain individual pieces of information. For example, information B associated with information A may be transmitted on a transmission path different from that of information A. Also, for example, information B associated with information A may be recorded on a recording medium different from that of information A (or in a different recording area of the same recording medium). Note that this "association" may apply to only a part of the information rather than the whole. For example, an image and information corresponding to that image may be associated with each other in any unit, such as multiple frames, one frame, or a portion of a frame.
More specifically, "associating" includes, for example, acts such as assigning the same ID (identification information) to multiple pieces of information, recording multiple pieces of information on the same recording medium, storing multiple pieces of information in the same folder, storing multiple pieces of information in the same file (assigning one to the other as metadata), embedding multiple pieces of information in the same stream, linking multiple pieces of information to the same project, and embedding metadata in an image like a digital watermark.
Specifically, the recording control unit 34 associates the captured image 101 with meta information including the CG parameters and records them in the recording unit 24 .
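As one concrete illustration of the same-ID, sidecar-file style of association among the acts listed above (Python; the file layout and names are assumptions, not a format prescribed by this disclosure):

import json
from pathlib import Path

def record_with_meta(clip_path, meta_info, out_dir):
    # Associate a captured clip with its meta information by writing a
    # sidecar JSON file that shares the clip's stem (the same ID).
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    sidecar = out_dir / (Path(clip_path).stem + ".json")
    sidecar.write_text(json.dumps(meta_info, indent=2))
    return sidecar

# e.g. clip_0001.mp4 becomes associated with clip_0001.json holding the
# CG file ID, CG parameters, and imaging setting information.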
The communication unit 25 also transmits the captured image 101 and meta information recorded in the recording unit 24 to the computer 3. The computer 3 associates the received captured image 101 and meta information and records them in the recording unit 42. Note that the recording of the captured image 101 and meta information may be performed by only one of the imaging device 2 and the computer 3. This enables the post-processing step.
On the other hand, in both the imaging device 2 and the computer 3, the composite image 103 is discarded without being recorded. Note that the imaging device 2 and the computer 3 may each be able to select between a mode in which the composite image 103 is discarded and a mode in which the composite image 103 is recorded without being discarded.
In addition, when there are two imaging devices 2, imaging is started synchronously in the two imaging devices 2. Therefore, when there are two imaging devices 2, the captured images 101 are transmitted to the computer 3 from both the imaging device 2a and the imaging device 2b.
In the computer 3, the CG image generation unit 53 generates a CG image 102 to be composited with each captured image 101, and the CG composition unit 54 composites each captured image 101 with its CG image 102, thereby generating two composite images 103.
Here, the composite images 103 may be generated independently for the captured images 101 obtained by the two imaging devices 2, for example, by determining the timing of composition based on the distance information (in-space position information) acquired by the ToF sensor 12 of each imaging device 2. Alternatively, composite images 103 that are related to each other may be generated for the captured images 101 obtained by the two imaging devices 2, for example, by determining a simultaneous composition timing based on the distance information (in-space position information) acquired by the ToF sensor 12 of one of the imaging devices 2.
Furthermore, when receiving the composite images 103 from the respective imaging devices 2, the display control unit 33 may display the two received composite images 103 side by side on the display unit 13, as shown in FIG. 19.
This allows the cameraman 8 or the like to simultaneously check in almost real time the composite image 103 in which the CG image 102 is composited with the captured images 101 obtained by the image capture devices 2a and 2b.
[4.3. Post-processing step]
Next, the post-processing step will be described. The post-processing step is mainly performed by the computer 3.
FIG. 20 is a diagram for explaining an editing screen 91. In the computer 3, an editing screen 91 as shown in FIG. 20 is displayed on the display unit 43 in the post-processing step.
The editing screen 91 includes a composite image display area 92 in which multiple, for example two, composite images 103 can be displayed side by side, an adjustment operation area 93 in which multiple icons for adjusting CG parameters are displayed, and a timeline display area 94 in which a timeline of a moving image is displayed.
If the captured image 101 recorded in the recording unit 42 is RAW image data, the editing unit 55 performs image processing on the RAW image data in the same manner as the camera processing unit 23. Here, image processing is performed on each of the captured images 101 obtained by the two imaging devices 2.
In addition, based on the CG file and meta information recorded in the recording unit 42, the editing unit 55 generates a CG image 102 to be composited with each captured image 101, in the same manner as the CG image generation unit 53 generated the CG image 102 in the main imaging process. Then, the editing unit 55 composites the captured image 101 and the CG image 102 to generate a composite image 103, in the same manner as the CG composition unit 54 generated the composite image 103 in the main imaging process. Therefore, two composite images 103 are generated here.
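A minimal sketch of this offline regeneration, reusing the per-frame composition idea sketched earlier (Python; load_clip, load_meta, generate_cg_image, and composite_frame are hypothetical helpers, not functions defined by this disclosure):

def regenerate_composites(clip_path, meta_path, load_clip, load_meta,
                          generate_cg_image, composite_frame):
    # Rebuild the composite images 103 after the fact from only the
    # recorded captured image 101 and its associated meta information;
    # no composite image needs to have been recorded at capture time.
    frames = load_clip(clip_path)
    meta = load_meta(meta_path)
    for frame, pos in zip(frames, meta["in_space_positions"]):
        cg = generate_cg_image(meta["cg_parameters"], pos)
        yield composite_frame(frame, cg)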
The editing unit 55 displays the two generated composite images 103 side by side in the composite image display area 92 of the editing screen 91.
The composite image 103 displayed in the composite image display area 92 can be stopped, fast-forwarded, rewound, etc. by operating the operation icons provided at the bottom of the composite image display area 92.
This allows the user to confirm a composite image 103 in which the CG image 102 is composited with the captured image 101 obtained by the imaging device 2. In addition, the user can simultaneously confirm a composite image 103 in which the CG image 102 is composited with the captured images 101 obtained by two imaging devices 2.
Furthermore, the editing unit 55 displays the two captured images 101 and the CG image 102 recorded in the recording unit 42, for example, in the timeline display area 94, one above the other, with their time axes aligned.
This allows the two captured images 101 and the CG image 102 to be simultaneously checked and then edited.
The editing unit 55 can also update the CG parameters in response to operations on the icons displayed in the adjustment operation area 93. When the CG parameters are updated, the editing unit 55 records the updated CG parameters in association with the captured image 101, regenerates the CG image 102 based on the updated CG parameters, generates a composite image 103 by combining the captured image 101 and the CG image 102, and displays it in the composite image display area 92.
At this time, since the CG parameters were adjusted while checking the composite image 103 in the rehearsal imaging process, little CG parameter adjustment is needed in the post-processing step.
<5. Modifications>
It should be noted that the embodiment is not limited to the specific example described above, and various modified configurations may be adopted.
For example, in the above embodiment, the ToF sensor 12 is provided as a distance measuring unit that measures the distance to the subject, but the distance measuring unit that measures the distance to the subject is not limited to this, and may be an ultrasonic sensor or the like. In addition, detection may be performed by image processing from the output of an image sensor, or multiple distance measuring sensors and image sensors may be used in combination.
In addition, in the above embodiment, the CG file and the CG parameters are recorded separately. However, the CG parameters may be recorded within the CG file.
In addition, in the above embodiment, the camera control unit 26 of the imaging device 2 functions as the spatial information acquisition unit 31, the meta information acquisition unit 32, the display control unit 33, the recording control unit 34, and the adjustment unit 35. However, some or all of these functional units may be made to function in the computer control unit 41 of the computer 3.
Likewise, the computer control unit 41 functions as the CG correction unit 51, the image acquisition unit 52, the CG image generation unit 53, the CG synthesis unit 54, and the editing unit 55. However, some or all of these functional units may instead be implemented in the camera control unit 26 of the imaging device 2.
[5.1. Modification 1]
FIG. 21 is a diagram for explaining the functional configuration of the camera control unit 26A and the computer control unit 41A in Modification 1. FIG. 22 is a block diagram illustrating the imaging process in Modification 1.
The camera control unit 26A in Modification 1 includes, in addition to the spatial information acquisition unit 31, the meta information acquisition unit 32, the display control unit 33, the recording control unit 34, and the adjustment unit 35, a functional unit serving as a CG synthesis unit 54A. The computer control unit 41A in Modification 1 includes functional units serving as the CG correction unit 51, the image acquisition unit 52, the CG image generation unit 53, and the editing unit 55.
In Modification 1, as shown in FIG. 22, a CG image 102 is generated in the computer 3, but a composite image 103 is not generated there, and the CG image 102 is transmitted to the imaging device 2. Then, when the imaging device 2 receives the CG image 102, it composites the captured image 101 with the CG image 102 to generate and display the composite image 103.
For example, in the rehearsal process, imaging is started by the imaging element unit 22 (indicated as "imaging" in the figure), and various signal processes are performed by the camera processing unit 23 on the image signal from the imaging element unit 22 (indicated as "camera process" in the figure). Then, the communication unit 25 transmits the captured image 101 that has been subjected to the various signal processes to the computer 3.
The CG image generation unit 53 of the computer 3 reads out from the recording unit 42 the CG file corresponding to the CG file ID included in the meta information, and uses the CG file to generate a CG image 102 based on the CG parameters and spatial information (indicated as "CG generation" in the figure). The communication unit 44 then transmits the generated CG image 102 to the imaging device 2a.
When the imaging device 2a receives the CG image 102, the CG synthesis unit 54A synthesizes the CG image 102 with the captured image 101 to generate a composite image 103 (indicated as "CG synthesis" in the figure). The display control unit 33 displays the generated composite image 103 on the display unit 13 (indicated as "display" in the figure).
Furthermore, when the CG parameters are adjusted in the imaging device 2a (indicated as "CG parameter adjustment" in the figure), the communication unit 25 transmits the adjusted CG parameters to the computer 3. In the computer 3, the CG image generation unit 53 regenerates the CG image 102 based on the received CG parameters, and the communication unit 44 transmits the regenerated CG image 102 to the imaging device 2a.
When the imaging device 2a receives the CG image 102, the CG synthesis unit 54A synthesizes the captured image 101 with the CG image 102 to regenerate a composite image 103, and the display control unit 33 displays the regenerated composite image 103 on the display unit 13.
When the main imaging process starts, imaging and camera processing are performed in the imaging device 2, and the captured images 101 are sequentially transmitted to the computer 3. When the CG image generation unit 53 of the computer 3 receives the captured images 101, it generates CG images 102 to be composited with the respective captured images 101 based on the CG file and meta information. The communication unit 44 transmits the generated CG images 102 to the imaging device 2a.
When the imaging device 2a receives the CG image 102, the CG synthesis unit 54A composites the captured image 101 with the CG image 102 to generate a composite image 103. The display control unit 33 displays the generated composite image 103 on the display unit 13.
Then, the recording control unit 34 records the captured image 101 in the recording unit 24, and also associates the CG file ID, CG parameters, imaging setting information of the imaging device 2, and the like, as meta information, with the image data and records them in the recording unit 24 (indicated as "record" in the figure). Note that the captured image 101 recorded in the recording unit 24 may be RAW image data, image data after image processing by the camera processing unit 23, or both.
[5.2. Modification 2]
FIG. 23 is a diagram for explaining the functional configuration of the camera control unit 26B and the computer control unit 41B in Modification 2. FIG. 24 is a block diagram illustrating the imaging process in Modification 2.
The camera control unit 26B in Modification 2 includes, in addition to the spatial information acquisition unit 31, the meta information acquisition unit 32, the display control unit 33, the recording control unit 34, and the adjustment unit 35, functional units serving as a CG image generation unit 53B and a CG synthesis unit 54B. The computer control unit 41B in Modification 2 includes functional units serving as the CG correction unit 51, the image acquisition unit 52, and the editing unit 55.
In Modification 2, as shown in FIG. 24, the CG image 102 and the composite image 103 are not generated in the computer 3, and the CG file is recorded in the recording unit 24 of the imaging device 2. The imaging device 2 then generates the CG image 102 and the composite image 103 based on the CG file, the CG parameters, and the spatial information.
Specifically, rehearsal imaging is started and imaging is started by the imaging element unit 22 (indicated as "imaging" in the figure), and various signal processing is performed by the camera processing unit 23 on the image signal from the imaging element unit 22 (indicated as "camera process" in the figure). After that, the CG image generation unit 53B uses the CG file to generate a CG image 102 based on the CG parameters and spatial information (indicated as "CG generation" in the figure). In addition, the CG synthesis unit 54B synthesizes the CG image 102 with the captured image 101 to generate a composite image 103 (indicated as "CG synthesis" in the figure). The display control unit 33 displays the generated composite image 103 on the display unit 13 (indicated as "display" in the figure).
After that, when the CG parameters are adjusted (indicated as "CG parameter adjustment" in the figure), the CG image generation unit 53B regenerates the CG image 102 based on the adjusted CG parameters. Furthermore, the CG synthesis unit 54B synthesizes the CG image 102 with the captured image 101 to regenerate the composite image 103. The display control unit 33 displays the regenerated composite image 103 on the display unit 13.
When the main imaging process starts, imaging and camera processing are performed in the imaging device 2, and a captured image 101 is obtained. The CG image generation unit 53B uses the CG file to generate a CG image 102 based on the CG parameters and spatial information. The CG synthesis unit 54B composites the captured image 101 with the CG image 102 to generate a composite image 103. The display control unit 33 displays the generated composite image 103 on the display unit 13.
Then, the recording control unit 34 records the captured image 101 in the recording unit 24, and also records the CG file ID, CG parameters, imaging setting information of the imaging device 2, etc. as meta information in association with the image data in the recording unit 24 (indicated as "record" in the drawing). Note that the captured image 101 recorded in the recording unit 24 may be RAW image data, image data after image processing by the camera processing unit 23, or both.
<6. Summary>
According to the above embodiment, the following effects can be obtained.
The imaging device 2 of the embodiment includes an imaging unit 11 that outputs an imaged image 101, a display control unit 33 that causes a display unit 13 to display a composite image 103 generated by combining the imaged image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and an association unit (recording control unit 34) that associates the imaged image 101 with the CG parameters.
This makes it possible for the imaging device 2 to display a composite image 103, in which the captured image 101 is composited with the CG image 102, on the display unit 13 in almost real time while the imaging unit 11 is capturing an image.
Furthermore, since the captured image 101 and the CG parameters are associated with each other without recording the composite image 103, the computer 3 can generate and edit the CG image 102 using the CG parameters in the post-processing step ST3. Therefore, the imaging device 2 allows the CG image 102 to be easily adjusted with respect to the captured image. Thus, the imaging device 2 can improve the work efficiency in video production including CG compositing.
The spatial information is associated with the CG parameters.
This allows the image capture device 2 to later correct the CG parameters based on the spatial information, or allows the correction to be performed by another device.
The associating section is the recording control section 34, which associates the captured image 101 with the CG parameters and records them.
This enables the image capturing device 2 to easily manage the captured image 101 in association with the CG parameters.
The associating section records the captured image 101 and the CG parameters in association with each other without recording the composite image 103 .
As a result, the imaging device 2 can reduce the amount of data stored in the recording unit 24 by the amount of the composite image 103 not recorded, and can also reduce the processing load.
In another device (computer 3), a composite image 103 is generated by combining a captured image 101 with a CG image 102.
This allows the imaging device 2 to reduce its processing load by having the computer 3 generate the CG image 102 and the composite image 103. Here, the processing capacity of the camera control unit 26 of the imaging device 2 is generally lower than that of the computer control unit 41 of the computer 3.
If the imaging device 2, with its lower processing power, were to generate the CG image 102 and the composite image 103, real-time image generation might not keep up.
By having the computer 3, with its higher processing power, generate the CG image 102 and the composite image 103, it becomes easier to ensure that the composite image 103 is displayed in real time.
The imaging device 2 includes a CG synthesis unit 54A that generates the composite image 103 by combining a CG image 102 generated in another device (the computer 3) with the captured image 101.
This allows the imaging device 2 to have the computer 3 generate the CG image 102, thereby reducing the processing load.
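As a non-limiting sketch of the compositing step itself, straight alpha blending is one common way a CG image carrying an alpha channel could be laid over a captured frame; the publication does not specify the blending method, so this is an assumption.

```python
# Minimal sketch: alpha-blend an RGBA CG image over an RGB captured frame.
import numpy as np

def composite_over(captured_rgb: np.ndarray, cg_rgba: np.ndarray) -> np.ndarray:
    """Blend the CG image over the captured frame using its alpha channel."""
    alpha = cg_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = cg_rgba[..., :3] * alpha + captured_rgb * (1.0 - alpha)
    return blended.astype(np.uint8)

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # captured image 101
cg = np.zeros((1080, 1920, 4), dtype=np.uint8)     # CG image 102 with alpha
preview = composite_over(frame, cg)                # composite image 103
```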
The imaging device 2 includes a CG image generation unit 53B that generates the CG image 102 to be combined with the captured image 101, and a CG synthesis unit 54B that generates the composite image 103 by combining the CG image 102 with the captured image 101.
As a result, the imaging device 2 can generate the composite image 103 by itself and display it on the display unit 13, even in an environment where the imaging device 2 and the computer 3 cannot be connected by either wired or wireless means.
The display control unit 33 causes the display unit 13 to display the composite image 103 in which the CG image 102 is combined, by real-time processing, with the captured image 101 being captured.
This enables the imaging device 2 to allow the cameraman 8 or the like to check the composite image 103 as a moving image in almost real time.
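The per-frame flow behind such near-real-time monitoring might look like the loop below. `camera`, `cg_source`, and `viewfinder` are hypothetical stand-ins for camera-internal components, and `composite_over` is the blending sketch shown earlier.

```python
# Minimal sketch of a fixed-rate preview loop: capture, composite, display.
import time

def preview_loop(camera, cg_source, viewfinder, fps: float = 30.0) -> None:
    period = 1.0 / fps
    while camera.is_recording():
        t0 = time.monotonic()
        frame = camera.capture_frame()              # captured image 101
        cg = cg_source.fetch_cg_frame()             # CG image 102 for this instant
        viewfinder.show(composite_over(frame, cg))  # composite image 103
        # sleep off whatever is left of the frame period
        time.sleep(max(0.0, period - (time.monotonic() - t0)))
```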
The imaging device 2 includes an adjustment unit 35 that adjusts the CG parameters in response to user operations.
This allows the CG parameters to be adjusted while checking the composite image 103 in which a captured image 101, captured by the imaging device 2 in, for example, a rehearsal imaging process, is combined with the CG image 102.
Therefore, it is no longer necessary to strictly determine CG parameters when generating a CG file, and the effort and time required for determining CG parameters can be reduced.
Furthermore, the effort required for adjusting CG parameters in the post-processing step ST3 can be reduced.
Thus, in the imaging device 2, the working efficiency of the imaging system 1 as a whole can be improved.
The adjustment unit 35 associates the CG parameters adjusted in response to the user operation with the captured image 101 .
As a result, the CG image 102 is generated based on the adjusted CG parameters, making it possible to reduce the need for CG parameter adjustment in the post-processing step ST3.
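Continuing the sidecar-file illustration from above, such an adjustment could simply rewrite the stored CG parameters in place, so that the post-processing step ST3 renders from the adjusted values; all names remain hypothetical.

```python
# Minimal sketch: apply a user adjustment and re-associate it with the clip.
import json
from pathlib import Path

def adjust_and_reassociate(clip_path: str, changes: dict) -> dict:
    sidecar = Path(clip_path).with_suffix(".cgparams.json")
    params = json.loads(sidecar.read_text())
    params.update(changes)                # e.g. the user nudged the draw speed
    sidecar.write_text(json.dumps(params, indent=2))
    return params

adjust_and_reassociate("scene001_cam_a.mp4", {"draw_speed": 0.8})
```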
A plurality of composite images 103, each obtained by combining a CG image 102 with one of the captured images 101 obtained by a plurality of imaging devices 2, can be displayed on the display unit 13 in a switched manner.
This allows the cameraman 8 or the like who operates the imaging device 2a to check the CG image 102 relative to the captured image 101 obtained by another imaging device 2, such as the imaging device 2b.
A plurality of composite images 103, each obtained by combining a CG image 102 with one of the captured images 101 obtained by a plurality of imaging devices 2, can also be displayed on the display unit 13 simultaneously.
This allows the composite images 103 for the captured images 101 obtained by the multiple imaging devices 2 to be checked at once on a single imaging device 2a.
Since the multiple imaging devices 2 differ in imaging environment, such as imaging position and distance to the subject, a CG image 102 that looks right for the captured image 101 obtained by one imaging device 2 may not look right for the captured image 101 obtained by another imaging device 2.
It is therefore useful to allow the user to check the composite images 103 for the captured images 101 obtained by the multiple imaging devices 2 at once.
The imaging settings determined in any one of the plurality of imaging devices 2 are used in common by the plurality of imaging devices 2 .
This makes it easy to set up imaging for a plurality of imaging devices 2 .
The CG parameters include information regarding the start timing for starting rendering of the CG image 102, and the start timing is set based on the distance to a specific subject.
This makes it possible to start combining the CG image 102 with the captured image 101, for example, based on the position of a particular person 6 (the distance from the image capture device 2).
The CG parameters include start timing information (start timing) for starting synthesis of the CG image 102 with the captured image 101, and the start timing information is the position of a specific subject in the three-dimensional space indicated by the spatial information.
This makes it possible to start combining the CG image 102 with the captured image 101, for example, based on the position of a particular person 6 (the distance from the image capture device 2).
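One possible realization of such a trigger, assuming the subject position is available from the spatial information as camera-centered coordinates in meters, is a simple threshold test; the tolerance value and coordinate convention are assumptions of this sketch.

```python
# Minimal sketch: decide when to start combining the CG image, either from
# a distance threshold or from a target position in the imaging space.
import math

def should_start(subject_xyz, start_position_xyz=None,
                 start_distance_m=None, tolerance_m=0.2) -> bool:
    if start_distance_m is not None:
        # the subject has come within the configured distance of the camera
        if math.dist((0.0, 0.0, 0.0), subject_xyz) <= start_distance_m:
            return True
    if start_position_xyz is not None:
        # the subject has reached a specific position in the 3D space
        if math.dist(subject_xyz, start_position_xyz) <= tolerance_m:
            return True
    return False

should_start((0.3, 0.0, 2.8), start_distance_m=3.0)  # True: subject within 3 m
```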
A CG image 102 is synthesized with each of the captured images 101 obtained by the multiple image capture devices 2 based on start timing information.
This makes it possible to synthesize a CG image 102 synchronized with the captured images 101 obtained by the multiple image capture devices 2 .
The CG parameters include information related to the rendering speed of the CG image 102, and the rendering speed is set according to the distance to a specific subject.
This makes it possible to generate a CG image 102 in which, for example, plants and the like appear in accordance with the movement of a particular person 6 and to synthesize the CG image 102 with the captured image 101 .
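For example, the drawing speed could be derived from the subject distance with a clamped linear mapping, so that the CG animation plays faster as the person approaches. The publication only states that the speed is set by the distance; the particular mapping below is an assumption.

```python
# Minimal sketch: map distance to a specific subject onto a drawing-speed factor.
def draw_speed_from_distance(distance_m: float,
                             near_m: float = 1.0, far_m: float = 5.0,
                             min_speed: float = 0.25, max_speed: float = 2.0) -> float:
    # clamp into [near_m, far_m], then interpolate linearly:
    # near subject -> fast drawing, far subject -> slow drawing
    d = min(max(distance_m, near_m), far_m)
    t = (far_m - d) / (far_m - near_m)
    return min_speed + t * (max_speed - min_speed)

draw_speed_from_distance(2.0)  # 1.5625: closer than mid-range, so faster drawing
```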
The imaging device 2 includes a distance measuring unit (ToF sensor 12) that measures the distance to the subject, and the spatial information acquisition unit 31 acquires the spatial information based on the distance to the subject measured by the distance measuring unit.
This makes it possible to easily obtain spatial information about the imaging space.
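As a sketch of how a ToF depth map could yield such spatial information, the standard pinhole back-projection turns per-pixel depth into 3D points; the intrinsic parameters fx, fy, cx, cy below are example values, not values from the publication.

```python
# Minimal sketch: back-project an HxW depth map (meters) into an HxWx3 point map.
import numpy as np

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.stack([x, y, depth_m], axis=-1)

depth = np.full((480, 640), 3.0)  # e.g. a flat surface 3 m from the sensor
points = depth_to_points(depth, 525.0, 525.0, 320.0, 240.0)
```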
The information processing method of the embodiment outputs a captured image, causes the display unit 13 to display a composite image 103 generated by combining the captured image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and associates the captured image 101 with the CG parameters.
These programs can be pre-recorded on an HDD as a recording medium built into a device such as a computer device, or in a ROM in a microcomputer having a CPU. Alternatively, they can be temporarily or permanently stored (recorded) on a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disk, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card. Such removable recording media can be provided as so-called package software.
Such a program can be installed in a personal computer or the like from a removable recording medium, or can be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
Furthermore, such an imaging system 1 includes an imaging device 2 and an information processing device (computer 3), in which the imaging device 2 includes an imaging unit 11 that outputs a captured image 101, a display control unit 33 that causes the display unit 13 to display a composite image 103 generated by combining the captured image 101 with a CG image 102 based on spatial information of the imaging space and CG parameters related to CG synthesis, and an association unit (recording control unit 34) that associates the captured image 101 with the CG parameters, and the information processing device includes an editing unit 55 that adjusts the CG parameters in response to a predetermined user operation.
As a result, even after capturing an image, if the user wishes to modify the CG image 102, the CG image 102 can be easily edited by adjusting the CG parameters.
The editing unit 55 can align and display the timeline of the captured image 101 and the timeline of the CG image 102 based on the CG parameters.
This makes it possible to check the timelines of the captured image 101 and the CG image 102 separately.
It is also possible to separately move the timelines of the captured image 101 and the CG image 102. In other words, it is also possible to shift the synthesis timing of the CG image 102 relative to the captured image 101.
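A minimal sketch of such a shift, assuming the offset is kept as a frame count alongside the CG parameters (a hypothetical field), maps each captured-clip frame to the CG frame that should be composited with it:

```python
# Minimal sketch: shift the CG timeline against the captured timeline.
from typing import Optional

def cg_frame_for(captured_frame: int, cg_offset_frames: int) -> Optional[int]:
    """Return the CG clip frame for a captured frame, or None before CG starts."""
    cg_frame = captured_frame - cg_offset_frames
    return cg_frame if cg_frame >= 0 else None

cg_frame_for(100, cg_offset_frames=24)  # 76: CG starts one second later at 24 fps
```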
The editing unit 55 causes the display unit to display a plurality of composite images 103 in which the CG images 102 are respectively composited with the captured images 101 obtained by the plurality of image capturing devices 2 .
This allows the user to check CG images 102 corresponding to the images 101 obtained by a plurality of image capture devices 2 at once.
Note that the effects described in this specification are merely examples and are not limiting, and other effects may also be present.
The present technology can also be configured as follows.
(1)
An imaging device comprising:
an imaging unit that outputs a captured image;
a display control unit that causes a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
an association unit that associates the captured image with the CG parameters.
(2)
The imaging device according to (1), wherein the spatial information is associated with the CG parameters.
(3)
The imaging device according to (1) or (2), wherein the association unit is a recording control unit that records the captured image and the CG parameters in association with each other.
(4)
The imaging device according to (3), wherein the association unit records the captured image and the CG parameters in association with each other without recording the composite image.
(5)
The imaging device according to any one of (1) to (4), wherein the composite image is generated by combining the captured image with the CG image in another device.
(6)
The imaging device according to any one of (1) to (4), further comprising a CG synthesis unit that generates the composite image by combining the CG image generated in another device with the captured image.
(7)
The imaging device according to any one of (1) to (4), further comprising:
a CG image generation unit that generates the CG image to be combined with the captured image; and
a CG synthesis unit that generates the composite image by combining the CG image with the captured image.
(8)
The imaging device according to any one of (1) to (7), wherein the display control unit causes the display unit to display the composite image in which the CG image is combined by real-time processing with the captured image being captured.
(9)
The imaging device according to any one of (1) to (8), further comprising an adjustment unit that adjusts the CG parameters in response to a user operation.
(10)
The imaging device according to (9), wherein the adjustment unit associates the CG parameters adjusted in response to a predetermined user operation with the captured image.
(11)
The imaging device according to any one of (1) to (10), wherein a plurality of the composite images, each obtained by combining the CG image with the captured images obtained by a plurality of the imaging devices, are displayed on the display unit in a switched manner.
(12)
The imaging device according to (11), wherein the plurality of composite images, each obtained by combining the CG image with the captured images obtained by the plurality of imaging devices, are displayed on the display unit simultaneously.
(13)
The imaging device according to any one of (1) to (12), wherein imaging settings determined in any one of the plurality of imaging devices are used in common in the plurality of imaging devices.
(14)
The imaging device according to any one of (1) to (13), wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is set based on a distance to a specific subject.
(15)
The imaging device according to any one of (1) to (14), wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is a position of a specific subject in a three-dimensional space indicated by the spatial information.
(16)
The imaging device according to (14) or (15), wherein the CG image is synthesized with each of the captured images obtained by the plurality of imaging devices based on the start timing information.
(17)
The imaging device according to (14) or (15), wherein the CG parameters include information regarding a drawing speed of the CG image, and the drawing speed is set based on a distance to a specific subject.
(18)
The imaging device according to any one of (1) to (16), further comprising a distance measuring unit that measures a distance to a subject, wherein the spatial information is acquired based on the distance to the subject measured by the distance measuring unit.
(19)
A program that causes an imaging device to execute processing of:
outputting a captured image;
causing a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
associating the captured image with the CG parameters.
(20)
An imaging system comprising an imaging device and an information processing device, wherein
the imaging device includes:
an imaging unit that outputs a captured image;
a display control unit that causes a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
an association unit that associates the captured image with the CG parameters, and
the information processing device includes
an editing unit that adjusts the CG parameters in response to a predetermined user operation.
Reference Signs List
1 Imaging system
2 Imaging device
3 Computer
31 Information acquisition unit
32 Meta information generation unit
33 Display control unit
34 Recording control unit
35 Adjustment unit
51 CG generation unit
52 Image acquisition unit
53 CG synthesis unit
54 Editing unit

Claims (20)

  1.  An imaging device comprising:
      an imaging unit that outputs a captured image;
      a display control unit that causes a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
      an association unit that associates the captured image with the CG parameters.
  2.  The imaging device according to claim 1, wherein the spatial information and the CG parameters are associated with each other.
  3.  The imaging device according to claim 1, wherein the association unit is a recording control unit that records the captured image and the CG parameters in association with each other.
  4.  The imaging device according to claim 3, wherein the association unit records the captured image and the CG parameters in association with each other without recording the composite image.
  5.  The imaging device according to claim 1, wherein the composite image is generated by combining the CG image with the captured image in another device.
  6.  The imaging device according to claim 1, further comprising a CG synthesis unit that generates the composite image by combining the CG image generated in another device with the captured image.
  7.  The imaging device according to claim 1, further comprising:
      a CG image generation unit that generates the CG image to be combined with the captured image; and
      a CG synthesis unit that generates the composite image by combining the CG image with the captured image.
  8.  The imaging device according to claim 1, wherein the display control unit causes the display unit to display the composite image in which the CG image is combined by real-time processing with the captured image being captured.
  9.  The imaging device according to claim 1, further comprising an adjustment unit that adjusts the CG parameters in response to a user operation.
  10.  The imaging device according to claim 9, wherein the adjustment unit associates the CG parameters adjusted in response to a user operation with the captured image.
  11.  The imaging device according to claim 1, wherein a plurality of composite images, each obtained by combining the CG image with the captured images obtained by a plurality of the imaging devices, are displayed on the display unit in a switched manner.
  12.  The imaging device according to claim 11, wherein the plurality of composite images, each obtained by combining the CG image with the captured images obtained by the plurality of imaging devices, are displayed on the display unit simultaneously.
  13.  The imaging device according to claim 1, wherein imaging settings determined in any one of a plurality of the imaging devices are used in common by the plurality of imaging devices.
  14.  The imaging device according to claim 1, wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is a distance to a specific subject.
  15.  The imaging device according to claim 1, wherein the CG parameters include start timing information for starting synthesis of the CG image with the captured image, and the start timing information is a position of a specific subject in a three-dimensional space indicated by the spatial information.
  16.  The imaging device according to claim 14, wherein the CG image is combined with each of the captured images obtained by a plurality of the imaging devices based on the start timing information.
  17.  The imaging device according to claim 14, wherein the CG parameters include information regarding a drawing speed of the CG image, and the drawing speed is set based on a distance to a specific subject.
  18.  The imaging device according to claim 1, further comprising a distance measuring unit that measures a distance to a subject, wherein the spatial information is acquired based on the distance to the subject measured by the distance measuring unit.
  19.  A program that causes an imaging device to execute processing of:
      outputting a captured image;
      causing a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
      associating the captured image with the CG parameters.
  20.  An imaging system comprising an imaging device and an information processing device, wherein
      the imaging device includes:
      an imaging unit that outputs a captured image;
      a display control unit that causes a display unit to display a composite image generated by combining a CG image with the captured image based on spatial information of an imaging space and CG parameters related to CG synthesis; and
      an association unit that associates the captured image with the CG parameters, and
      the information processing device includes
      an editing unit that adjusts the CG parameters in response to a predetermined user operation.
PCT/JP2023/032982 2022-09-26 2023-09-11 Imaging device, program, and imaging system WO2024070618A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022152989 2022-09-26
JP2022-152989 2022-09-26

Publications (1)

Publication Number Publication Date
WO2024070618A1 true WO2024070618A1 (en) 2024-04-04

Family

ID=90477521

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/032982 WO2024070618A1 (en) 2022-09-26 2023-09-11 Imaging device, program, and imaging system

Country Status (1)

Country Link
WO (1) WO2024070618A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008287410A (en) * 2007-05-16 2008-11-27 Canon Inc Image processing method and device
JP2011035638A (en) * 2009-07-31 2011-02-17 Toppan Printing Co Ltd Virtual reality space video production system
JP2011223218A (en) * 2010-04-07 2011-11-04 Sony Corp Image processing device, image processing method, and program
JP2021009557A (en) * 2019-07-01 2021-01-28 キヤノン株式会社 Information processing device, information processing method, and program

Similar Documents

Publication Publication Date Title
JP4931845B2 (en) Imaging apparatus and captured image display control method
CN109155815A (en) Photographic device and its setting screen
JP5806623B2 (en) Imaging apparatus, imaging method, and program
US8750674B2 (en) Remotely controllable digital video camera system
WO2012138620A2 (en) Digital camera having variable duration burst mode
JP2011147067A (en) Image processing apparatus and method, and program
JP6304293B2 (en) Image processing apparatus, image processing method, and program
JP5253725B2 (en) Mobile communication terminal with video shooting function and operation method thereof
US20130077932A1 (en) Digital video camera system having two microphones
KR20120083085A (en) Digital photographing apparatus and control method thereof
JP2006211324A (en) Digital camera apparatus, method and program for reproducing image, and data structure
US20130063621A1 (en) Imaging device
WO2024070618A1 (en) Imaging device, program, and imaging system
JP6355333B2 (en) Imaging apparatus, image processing apparatus, image processing method, and program
JP2008310187A (en) Image processing device and image processing method
JP2017158021A (en) Information terminal device, imaging device, image information processing system, and image information processing method
JPH1042307A (en) Key system and synthetic image forming method
US20220264008A1 (en) Image processing device, image processing method, and program
WO2020189510A1 (en) Image processing device, image processing method, computer program, and storage medium
JP2006319524A (en) Image sensing device
JP4902365B2 (en) Image composition apparatus and program
JP2018074523A (en) Imaging device, control method thereof, program, and recording medium
JP6736289B2 (en) Information terminal device, imaging device, image information processing system, and image information processing method
JP2020188417A (en) Image processing apparatus, image processing method, and computer program
JP5332668B2 (en) Imaging apparatus and subject detection program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23871854

Country of ref document: EP

Kind code of ref document: A1