WO2017146224A1 - Photography system and data generation method - Google Patents


Info

Publication number
WO2017146224A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
subject
camera
processing unit
background
Prior art date
Application number
PCT/JP2017/007171
Other languages
English (en)
Japanese (ja)
Inventor
淳 高嶋
圭昭 鍛冶屋敷
將史 大橋
Original Assignee
株式会社Tbwa Hakuhodo
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社Tbwa Hakuhodo filed Critical 株式会社Tbwa Hakuhodo
Publication of WO2017146224A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00 Details of cameras or camera bodies; Accessories therefor
    • G03B17/38 Releasing-devices separate from shutter
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95 Computational photography systems, e.g. light-field imaging systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/222 Studio circuitry; Studio devices; Studio equipment
    • H04N5/262 Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272 Means for inserting a foreground image in a background image, i.e. inlay, outlay

Definitions

  • This disclosure relates to a photographing system.
  • An imaging system is also known in which a camera for photographing guests outside an aquarium tank is arranged inside the tank (see, for example, Patent Document 1).
  • This photographing system remotely controls the camera to photograph guests outside the tank.
  • It can thus capture photographs that guests cannot take with their own cameras.
  • One aspect of the present disclosure provides an imaging system including a group of cameras and an image processing unit.
  • The group of cameras operates according to commands from a remote control device.
  • The group of cameras includes a narrow-area camera and one or more wide-area cameras.
  • The narrow-area camera is configured to generate first image data representing a captured image of the subject by photographing the subject together with the background.
  • The one or more wide-area cameras are configured to generate second image data representing a captured image of a wide-area background by photographing a background that is wider than, and surrounds, the background captured by the narrow-area camera.
  • The image processing unit is configured to generate processed image data using the first image data and the second image data generated by the group of cameras.
  • The image processing unit may be configured to generate, as the processed image data, at least one of: moving image data that shows the subject and includes at least one of a moving image zooming in on the subject and a moving image zooming out from the subject; and a still image data set, that is, a collection of still image data showing the subject at different angles of view.
  • the processed image data may be provided to a user corresponding to the subject.
  • the imaging system may include a communication unit.
  • the communication unit may be connected to a group of cameras.
  • the communication unit may be configured to transmit the first image data and the second image data generated by the group of cameras to the image processing unit through the wide area network.
  • the image processing unit may be arranged away from the group of cameras.
  • the image processing unit may be configured to acquire the first image data and the second image data through a wide area network.
  • the imaging system may include a notification unit that notifies the user corresponding to the subject of the position of the camera.
  • Another aspect of the present disclosure provides an imaging system including: a camera configured to generate first image data representing a captured image of a subject by photographing the subject together with a background according to a command from a remote control device; a storage unit configured to store second image data representing a captured image of a wide background that is wider than, and surrounds, the background photographed by the camera; and an image processing unit configured to generate composite image data by combining the captured image of the subject represented by the first image data generated by the camera with the captured image of the wide background represented by the second image data.
  • With this configuration, the target subject can be photographed with high resolution by the camera, and composite image data including a beautiful background around the subject can be generated using the captured image data. Therefore, according to this photographing system, it is possible to generate image data in which the subject can be clearly recognized against a beautiful background.
  • Using the composite image data, the image processing unit may generate, as the processed image data, at least one of: moving image data that shows the subject and includes at least one of a moving image zooming in on the subject and a moving image zooming out from the subject; and a still image data set, that is, a collection of still image data showing the subject at different angles of view.
  • the imaging system may include a communication unit configured to transmit the first image data generated by the camera to the image processing unit through the wide area network.
  • the communication unit may be connected to the camera.
  • the image processing unit may be arranged apart from the camera together with the storage unit.
  • the image processing unit may be configured to acquire first image data generated by the camera through a wide area network.
  • a common image processing unit may be provided for cameras installed in a plurality of shooting locations.
  • the image processing unit may be configured to acquire image data from a camera at each shooting location through a wide area network.
  • the image processing unit may be configured to distribute the processed image data to at least one of the user terminal associated with the subject and the remote operation device.
  • The remote control device may be a communication terminal carried and/or owned by a user corresponding to the subject.
  • the image processing unit may be configured to distribute the generated processed image data to the communication terminal.
  • the image processing unit may be configured to store the processed image data in a server device accessible by a user corresponding to the subject.
  • the photographing system may be configured to issue an authentication code to a user corresponding to the subject.
  • The imaging system may include an activation control unit configured to issue an authentication code to the user corresponding to the subject, and to activate the group of cameras, or the camera, by accepting a command from the remote control device on condition that the authentication code has been input.
  • the functions as the operation control unit and the image processing unit may be realized by a common server device.
  • a program for causing a computer to realize the function as the image processing unit included in the imaging system may be provided.
  • a program for causing a computer to realize the function as the operation control unit included in the imaging system may be provided.
  • A computer-readable non-transitory tangible recording medium storing one or more of these programs may be provided.
  • Another aspect of the present disclosure provides a data generation method comprising: acquiring first image data and second image data from a group of cameras including a narrow-area camera configured to generate the first image data, which represents a captured image of a subject, by photographing the subject together with the background, and one or more wide-area cameras configured to generate the second image data, which represents a captured image of a wide-area background, by photographing a background that is wider than, and surrounds, the background captured by the narrow-area camera; and generating processed image data using the acquired first image data and second image data.
  • The generating may include generating, as the processed image data, at least one of: moving image data that shows the subject and includes at least one of a moving image zooming in on the subject and a moving image zooming out from the subject; and a still image data set, that is, a collection of still image data showing the subject at different angles of view.
  • Another aspect of the present disclosure provides a data generation method comprising: acquiring first image data from a camera configured to generate the first image data, which represents a captured image of a subject, by photographing the subject together with a background; acquiring second image data from a storage device configured to store the second image data, which represents a captured image of a wide background that is wider than, and surrounds, the background photographed by the camera; and generating composite image data, in which the captured image of the subject is arranged on the captured image of the wide background, by combining the captured image of the subject represented by the acquired first image data with the captured image of the wide background represented by the acquired second image data.
  • A program for causing a computer to execute the data generation method may be provided, as may a computer-readable non-transitory tangible recording medium in which the program is recorded.
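The compositing step of the data generation method above can be sketched as follows. This is a minimal illustration that models images as plain 2-D lists of pixel values; the function name and the `(top, left)` placement parameters are illustrative assumptions (in the described system the placement would follow from the cameras' known geometry), not part of the disclosure.

```python
def composite(wide_bg, subject, top, left):
    """Place a captured subject image onto a wider background image.

    Images are modeled as 2-D lists of pixel values; (top, left) is the
    position of the subject crop within the wide background.
    """
    out = [row[:] for row in wide_bg]          # copy the wide background
    for r, row in enumerate(subject):
        for c, px in enumerate(row):
            out[top + r][left + c] = px        # overlay the subject pixels
    return out

# Toy example: a 2x2 "subject" placed inside a 4x6 "wide background"
bg = [[0] * 6 for _ in range(4)]
subj = [[1, 2], [3, 4]]
result = composite(bg, subj, top=1, left=2)
```

The original background is left unmodified, matching the idea that the stored wide-background image data can be reused for every subsequent subject.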
  • The photographing system 1 of the present embodiment is configured to generate, at places such as sightseeing spots and scenic spots, captured image data in which the user who is the subject TR can be clearly recognized against a beautiful landscape, and to provide the captured image data to the user.
  • the captured image data includes at least one of moving image data and still image data.
  • the photographing system 1 includes a guide device 10 around the photographing spot Z as shown in FIG.
  • the guidance device 10 is provided to guide usage of the photographing system 1 to the user.
  • the photographing system 1 includes a plurality of cameras 21, 22, and 23 at locations away from the photographing spot Z in order to photograph a user.
  • the guide device 10 is installed so as not to stand out at a position slightly away from the shooting point, which is the user's standing position at the time of shooting.
  • the plurality of cameras 21, 22, and 23 are installed so that the background area size photographed together with the subject TR is different between the cameras 21, 22, and 23.
  • the plurality of cameras 21, 22, and 23 are fixedly arranged at different distances from the subject TR as shown in FIGS. 1 and 2.
  • The first camera 21 is disposed at the position closest to the subject TR, the second camera 22 is disposed at a position farther from the subject TR than the first camera 21, and the third camera 23 is disposed at a position farther from the subject TR than the second camera 22.
  • the second camera 22 is arranged so that a background wider than the background including the background photographed by the first camera 21 can be photographed.
  • the third camera 23 is arranged so that a background wider than the background including the background photographed by the second camera 22 can be photographed.
  • the arrows shown in FIG. 2 conceptually show the relationship between the shooting areas of the first camera 21, the second camera 22, and the third camera 23.
  • the cameras 21, 22, and 23 are fixed to a tripod.
  • Each of the cameras 21, 22, and 23 is installed with its lens height, direction, angle of view, magnification, and the like adjusted so that the cameras do not appear in one another's captured images.
  • Each of the cameras 21, 22, and 23 is connected to an individual control device 30.
  • The control device 30 controls the corresponding camera 21, 22, or 23 in accordance with a remotely transmitted shooting command, causes that camera to execute a shooting operation, and is configured to transmit still image data representing the captured image to the service providing system 60 (see FIG. 3).
  • the service providing system 60 manages the photographing system 1 and provides various services to the user.
  • In the following, still image data representing an image captured by the first camera 21 is also expressed as first camera image data, still image data representing an image captured by the second camera 22 as second camera image data, and still image data representing an image captured by the third camera 23 as third camera image data.
  • When the first camera image data, the second camera image data, and the third camera image data are not distinguished, they are simply expressed as camera image data.
  • the control device 30 includes a processing unit 31, a storage unit 32, a wireless communication unit 33, and a connection unit 34.
  • the processing unit 31 includes a CPU and a RAM (not shown), and executes processing according to a program stored in the storage unit 32.
  • the storage unit 32 is configured by an auxiliary storage device such as a hard disk device, and stores a program executed by the processing unit 31 and data used for execution of the program.
  • the wireless communication unit 33 is a communication interface capable of bidirectional communication with the service providing system 60 through the wide area network NT.
  • the wide area network NT includes a cellular network.
  • the connection unit 34 is an interface for connecting one of the cameras 21, 22 and 23 to the control device 30.
  • the connection unit 34 is configured by a USB interface, for example.
  • By connecting one of the cameras 21, 22, and 23 to a control device 30 through the connection unit 34, the processing unit 31 can control that camera and acquire camera image data from it.
  • The cameras 21, 22, and 23 and their control devices 30, positioned at the remote shooting location, are also collectively called the camera-side system 40.
  • The guide device 10 described above is also configured to be capable of bidirectional communication with the service providing system 60 through the wide area network NT. As shown in FIG. 3, the guide device 10 includes a processing unit 11, a storage unit 12, a wireless communication unit 13, a display 15, and an input unit 17.
  • the processing unit 11 includes a CPU and a RAM (not shown), and executes processing according to a program stored in the storage unit 12.
  • the storage unit 12 stores a program executed by the processing unit 11 and data used for execution of the program.
  • the wireless communication unit 13 is configured to be capable of bidirectional communication with the service providing system 60 through the wide area network NT.
  • the display 15 is controlled by the processing unit 11 and configured to display a guidance screen to the user.
  • the display 15 is configured by, for example, a liquid crystal display.
  • the input unit 17 is configured to accept a user operation on the guidance device 10.
  • the input unit 17 is configured by a touch panel on the display 15, for example.
  • When the processing unit 11 receives an operation from the user through the input unit 17, it executes the processing shown in FIG.
  • the processing unit 11 displays a guidance screen on the display 15 in accordance with an operation from the user (S110).
  • the processing unit 11 executes a process of issuing a one-time password to the user in order to perform exclusive control related to the shooting operation (S130).
  • the processing unit 11 controls the display 15 to display a password issuance screen in which the one-time password is written.
  • the processing unit 11 requests the service providing system 60 for a one-time password to be issued to the user through the wireless communication unit 13, and causes the display 15 to display the one-time password provided from the service providing system 60.
  • In this way, a one-time password can be issued to the user.
  • the service providing system 60 can manage the one-time password together with the issue date and time.
  • The password issuance screen can include a message guiding the user to the website that provides the shutter button, and a further message prompting the user to enter the one-time password on that website.
  • This website is provided by the service providing system 60.
  • the service providing system 60 includes a front-end server 70 and a back-end server 80 as shown in FIG.
  • the front end server 70 and the back end server 80 are composed of one or more server machines.
  • the front end server 70 includes a processing unit 71 and a storage unit 72.
  • the processing unit 71 includes a CPU and a RAM (not shown), and executes processing according to a program stored in the storage unit 72.
  • the storage unit 72 stores a program executed by the processing unit 71 and data used for executing the program.
  • the front-end server 70 functions as the website described above.
  • the back-end server 80 controls the cameras 21, 22, and 23 according to instructions from the user through the website, processes the still image data received from the cameras 21, 22, and 23, and processes the processed image data Are provided to users through a website.
  • the back end server 80 includes a processing unit 81 and a storage unit 82.
  • the processing unit 81 includes a CPU and a RAM (not shown), and is configured to execute processing according to a program stored in the storage unit 82.
  • the storage unit 82 stores a program executed by the processing unit 81 and data used for executing the program.
  • The back-end server 80 is installed so as to be able to communicate with the control devices 30 and the guide device 10 of the camera-side system 40 through the wide area network NT; the front-end server 70 is installed so as to be able to communicate with the user terminal 90 through the wide area network NT and with the back-end server 80 through an internal network.
  • Examples of the user terminal 90 include portable communication terminals such as smartphones, tablets, mobile computers, and wearable devices. However, the user terminal 90 need not be portable.
  • the user terminal 90 may be an in-vehicle device.
  • the user terminal 90 may be configured to be able to access the wide area network NT alone, or may be a communication terminal that can access the wide area network NT through a relay such as a mobile router and a wireless access point.
  • the user terminal 90 includes a processing unit 91 including a CPU and a RAM, a storage unit 92 that stores programs, a wireless communication unit 93, a display 95, and an input unit 97 such as a touch panel.
  • When the user terminal 90 accesses the URL indicated by the guidance device 10, the processing unit 71 of the front-end server 70 executes the shooting control process shown in FIG. 5 in order to provide the user terminal 90 with a software shutter button.
  • the function of the website providing the shutter button is realized by this shooting control process.
  • When the processing unit 71 starts the shooting control process, it transmits a top page (not shown) to the accessing user terminal 90 and displays it on the display 95 of the user terminal 90. The processing unit 71 then transmits the one-time password input page G1 to the accessing user terminal 90 through the wide area network NT and displays the input page G1 on the display 95 of the user terminal 90 (S210). An example of the input page G1 is shown in the upper left of FIG.
  • the processing unit 71 receives a one-time password input operation for the input page G1.
  • When the processing unit 71 receives the one-time password from the user terminal 90, it determines whether the password is valid, that is, whether it was issued by the guidance device 10 and has never been used (S220). If the processing unit 71 determines that the password is valid, it proceeds to S230; if it determines that the password is not valid, it proceeds to S210. In S210, the processing unit 71 displays the one-time password input page G1 on the user terminal 90 again and requests the user to re-enter the one-time password. The processing unit 71 may operate so as to cut off the connection from the access source when an invalid password is entered multiple times.
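The exclusive-control flow around S130 and S220 (issue a one-time password, then accept it only if it was issued, is unexpired, and has never been used) can be sketched as follows. The 6-digit format and the 15-minute validity window are illustrative assumptions; the disclosure only states that the password is managed together with its issue date and time.

```python
import secrets
import time

VALIDITY_SECONDS = 15 * 60      # assumed validity window (not specified)

_issued = {}                    # password -> [issue_time, used_flag]

def issue_one_time_password():
    """Issue a password and record its issue time (guidance-device side, S130)."""
    pw = f"{secrets.randbelow(10**6):06d}"   # e.g. a 6-digit code (assumption)
    _issued[pw] = [time.time(), False]
    return pw

def validate(pw):
    """Accept only a password that was issued, is unexpired, and unused (S220)."""
    entry = _issued.get(pw)
    if entry is None or entry[1]:            # never issued, or already used
        return False
    if time.time() - entry[0] > VALIDITY_SECONDS:
        return False                         # past the assumed validity window
    entry[1] = True                          # mark used: single-shot exclusivity
    return True
```

Marking the password as used on the first successful validation gives the single-user exclusivity the description aims at: a second attempt with the same code is rejected.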
  • the processing unit 71 transmits the shutter operation reception page G2 to the user terminal 90, and displays the reception page G2 on the display 95 of the user terminal 90.
  • An example of the shutter operation acceptance page G2 is shown in the upper right of FIG.
  • When shooting by a preceding user is in progress, the processing unit 71 may operate so as to display a wait page instead of the shutter operation reception page G2 until that shooting is completed.
  • the issuance of the one-time password in the guidance device 10 may be suspended until the shooting of the preceding user is completed.
  • The shutter operation acceptance page G2 shown in FIG. 6 includes a shutter button R1 as an operation object. The reception page G2 further includes an image R2 that indicates the camera positions.
  • The image R2 is a composite image in which a pointer R3 indicating a camera position is combined with an image captured from the shooting point.
  • the image R2 may be an image that is dynamically displayed according to the orientation of the user terminal 90 based on the orientation information from the electronic compass 99 provided in the user terminal 90.
  • The front-end server 70 can store, in the storage unit 72, panoramic image data obtained by photographing the installation locations of the cameras 21, 22, and 23 from the shooting point and combining the pointer R3 into it.
  • the processing unit 71 regards the orientation of the user terminal 90 specified from the orientation information as the orientation of the user, and can extract an image corresponding to the field of view visible to the user from the panoramic image data.
  • the processing unit 71 can communicate with the user terminal 90 so that the extracted image is displayed on the shutter operation acceptance page G2 as the image R2.
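The orientation-dependent display of the image R2 can be sketched as a mapping from the compass heading reported by the electronic compass 99 to a column range of the stored 360° panoramic image data. The 60° horizontal field of view and the zero-heading calibration (heading 0 at the panorama's left edge) are assumptions for illustration only.

```python
def view_columns(heading_deg, pano_width, fov_deg=60):
    """Return (start, end) pixel columns of a 360-degree panorama that
    correspond to the field of view centred on the terminal's heading.

    heading_deg: azimuth from the electronic compass (0 = the panorama's
    left edge; a real system would calibrate this offset). fov_deg is an
    assumed horizontal field of view.
    """
    px_per_deg = pano_width / 360.0
    center = heading_deg % 360 * px_per_deg
    half = fov_deg / 2.0 * px_per_deg
    start = int(center - half) % pano_width
    end = int(center + half) % pano_width
    return start, end   # if start > end, the view wraps around the seam
```

The processing unit 71 would then extract this column range (handling the wrap-around case) and send it to the user terminal 90 as the image R2.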
  • The processing unit 71 waits until an operation signal indicating that the shutter button R1 has been pressed is received from the user terminal 90 (S240). When the operation signal is received (Yes in S240), the processing unit 71 executes a shooting process (S250).
  • the processing unit 71 communicates with the user terminal 90 so that the countdown display using the shutter operation reception page G2 is performed.
  • the configuration of the shutter operation reception page G2 during countdown display is illustrated in the lower right of FIG.
  • the processing unit 71 requests the back-end server 80 to execute a process for causing the camera-side system 40 to execute a photographing operation at the end of the countdown.
  • the processing unit 71 requests the back-end server 80 to execute processing for transmitting a shooting command to the control device 30 of the camera-side system 40. Thereafter, the processing unit 71 proceeds to S260.
  • the processing unit 71 transmits to the user terminal 90 an input page G4 for requesting input of the user's mail address corresponding to the subject TR, and displays the input page G4 on the display 95 of the user terminal 90 (S260).
  • An example of the input page G4 is shown in the lower left of FIG.
  • When the processing unit 71 receives from the user terminal 90 the mail address entered on the input page G4 (Yes in S270), it notifies the back-end server 80 of this mail address as the address to which the captured image data is to be provided (S280). Thereafter, the processing unit 71 ends the shooting control process shown in FIG. 5.
  • When the processing unit 81 of the back-end server 80 receives from the front-end server 70 a request to transmit a shooting command, issued by execution of the shooting process (S250) in the front-end server 70, it starts the distribution preparation process shown in FIG.
  • Following the request from the front-end server 70, the processing unit 81 transmits a shooting command to each control device 30 so that the camera-side system 40 executes a shooting operation in accordance with the end of the countdown (S310).
  • When the control device 30 of the first camera 21, the control device 30 of the second camera 22, and the control device 30 of the third camera 23 each receive this shooting command from the back-end server 80, they cause their connected cameras to execute a shooting operation.
  • This camera control according to the shooting command is realized by the processing unit 31 of the control device 30 executing the processing shown in FIG.
  • When the processing unit 31 of the control device 30 receives a shooting command from the back-end server 80 (S410), it controls the camera connected to its own control device 30, causes the camera to execute a shooting operation, and acquires the still image data that the camera generated (S420).
  • The processing unit 31 transmits the still image data acquired in this way, as camera image data, to the back-end server 80 that is the transmission source of the shooting command (S430).
  • the processing unit 31 executes the above-described processes of S420 and S430 every time it receives a shooting command.
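The control device's per-command behaviour (S410–S430) can be sketched with stand-in callables for the camera trigger and the network transport. The class and parameter names here are illustrative, not from the disclosure; a real control device 30 would drive the camera over its USB connection unit 34 and reply over the wireless communication unit 33.

```python
class ControlDevice:
    """Sketch of control device 30: on each shooting command it triggers its
    single connected camera and returns the captured still image data to the
    command's sender. Camera and transport are stand-in callables."""

    def __init__(self, camera_shoot, send):
        self.camera_shoot = camera_shoot   # S420: triggers the camera, returns data
        self.send = send                   # S430: transmits data to the back end

    def on_shooting_command(self, command_id):
        image_data = self.camera_shoot()   # execute the shooting operation (S420)
        self.send(command_id, image_data)  # reply to the command's sender (S430)

# Toy wiring: a fake camera and an in-memory "back-end server" inbox
inbox = []
dev = ControlDevice(camera_shoot=lambda: b"jpeg-bytes",
                    send=lambda cid, data: inbox.append((cid, data)))
dev.on_shooting_command("cmd-1")
dev.on_shooting_command("cmd-2")   # S420/S430 repeat for every received command
```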
  • After transmitting the shooting command in S310, the processing unit 81 of the back-end server 80, in the subsequent S320, receives the first camera image data, the second camera image data, and the third camera image data generated in response to the shooting command from the respective control devices 30 of the first camera 21, the second camera 22, and the third camera 23 through the wide area network NT.
  • Using the received first, second, and third camera image data, the processing unit 81 executes a process of generating moving image data that shows the user who is the subject TR and zooms out from the user (S330).
  • the moving image data generation method in S330 will be described in detail with reference to FIG.
  • an example of generating moving image data of 1920 ⁇ 720 pixels will be described.
  • The shooting area E1 illustrated in FIG. 9 corresponds to the shooting area of the first camera 21, the shooting area E2 corresponds to the shooting area of the second camera 22, and the shooting area E3 corresponds to the shooting area of the third camera 23.
  • The first camera image data generated by the shooting operation of the first camera 21 is still image data in which the subject TR and the background BG in the shooting area E1 are captured, the second camera image data is still image data in which the subject TR and the background BG in the shooting area E2 are captured, and the third camera image data generated by the shooting operation of the third camera 23 is still image data in which the subject TR and the background BG in the shooting area E3 are captured.
  • the first camera image data, the second camera image data, and the third camera image data are each generated as image data having a number of pixels larger than 1920 ⁇ 720 pixels.
  • The back-end server 80 stores, in the storage unit 82, the pixel position P1 on the second camera image data that corresponds to the end points of the first camera image data, and the pixel position P2 on the third camera image data that corresponds to the end points of the second camera image data.
  • From these, the processing unit 81 can specify the image area within the second camera image data that corresponds to the shooting area of the first camera image data, and the image area within the third camera image data that corresponds to the shooting area of the second camera image data.
  • The processing unit 81 takes in, from the first camera image data, the image data of a region F in which a bust-up of the subject appears, and uses the image data of this region F as the image data of the first frame of the moving image.
  • the area F is an area of 1920 ⁇ 720 pixels.
  • the area F can be fixedly defined with respect to the imaging area E1, for example.
  • the region F may be dynamically determined based on the result of detecting a person by analyzing the first camera image data.
  • Alternatively, image data of a region F larger than 1920 × 720 pixels may be taken from the first camera image data, reduced to 1920 × 720 pixels, and set as the first frame image data.
  • The processing unit 81 divides the area from the region F to the outer edge of the shooting area E3 into a number of steps one less than the number N of frames of the moving image data, using rectangular frames similar to the region F, thereby setting the image capture areas for the second through Nth frames within the area extending to the outer edge of E3.
  • For each frame up to the frame at which the image capture area reaches the edge of the first camera's shooting area, the image data of the capture area corresponding to that frame is taken in from the first camera image data, and reduced image data of 1920 × 720 pixels corresponding to this image data is set as the image data of the corresponding frame.
  • For each subsequent frame up to the second-camera end frame, the processing unit 81 takes in the image data of the corresponding image capture area from the second camera image data, and sets reduced image data of 1920 × 720 pixels corresponding to this image data as the image data of the corresponding frame.
  • for each frame from the frame following the second-camera end frame up to the Nth frame, which is the final frame, the processing unit 81 takes in, from the third camera image data, the image data of the image capture area corresponding to that frame, and sets reduced image data of 1920 × 720 pixels corresponding to this image data as the image data of the corresponding frame.
  • the processing unit 81 arranges the image data of the first through Nth frames generated in this manner in time series, thereby generating moving image data that shows the user who is the subject TR and zooms out from the user.
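A minimal sketch of the frame-sequence generation described above (illustrative only: it linearly interpolates a single crop rectangle from region F to the full image instead of reproducing the exact similar-rectangle division across three camera images, and it assumes the image aspect ratio roughly matches that of region F):

```python
import numpy as np

def zoom_out_frames(image, box_f, n_frames, out_w=1920, out_h=720):
    """Build n_frames crop boxes growing from region F (frame 1) to the full
    image (frame N), then crop and reduce each to out_w x out_h."""
    H, W = image.shape[:2]
    lf, tf, wf, hf = box_f
    frames = []
    for i in range(n_frames):
        t = i / (n_frames - 1)                 # 0 -> region F, 1 -> full area
        w = round(wf + (W - wf) * t)
        h = round(hf + (H - hf) * t)
        left = round(lf * (1 - t))             # drift the box toward the origin
        top = round(tf * (1 - t))
        crop = image[top:top + h, left:left + w]
        ys = np.arange(out_h) * h // out_h     # nearest-neighbour reduction
        xs = np.arange(out_w) * w // out_w
        frames.append(crop[ys][:, xs])         # reduced 1920 x 720 frame
    return frames

# Hypothetical widest-camera image with region F at an arbitrary position.
frames = zoom_out_frames(np.zeros((2160, 5760, 3), np.uint8), (1920, 720, 1920, 720), 5)
```

Arranging the returned list in order yields the zoom-out clip; a production implementation would instead switch the source image per frame as described above.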
  • FIG. 10 illustrates a moving image generated from the still image data illustrated in FIG.
  • this moving image data generation method is characterized in that, for frames in which the subject TR appears large, a frame is generated using the first camera image data rather than the second and third camera image data. According to this generation method, moving image data in which a subject located in a beautiful background can be clearly recognized can be generated.
  • the processing unit 81 stores the generated moving image data in the storage unit 82, and registers this moving image data as distribution data (S340). Specifically, a URL for distribution is assigned to the moving image data, and association information between the URL and the moving image data is stored (S340).
  • the processing unit 81 transmits an e-mail describing the distribution URL assigned to the moving image data in S340 to the e-mail address destination acquired by the front-end server 70 in S260 and S270. After the transmission, the processing unit 81 ends the process shown in FIG. This e-mail address is obtained from the user terminal 90 that has pressed the shutter button R1.
  • a user who has received the e-mail can receive delivery of the moving image data associated with the URL by accessing the URL described in the e-mail through the user terminal 90 or another electronic device. Examples of other electronic devices include a personal computer owned by the user.
  • the processing unit 71 of the front-end server 70 repeatedly executes the process shown in FIG. 11, and when a distribution URL is accessed (Yes in S510), transmits the moving image data associated with that URL to the access-source terminal (S520). As another example, the processing unit 71 may provide a moving-image reproduction service to the access-source terminal by transmitting a reproduction signal of the moving image data without transmitting the moving image data itself.
  • the imaging system 1 of the first embodiment described above may be modified as follows.
  • the imaging system 1 may be configured to transmit an e-mail attached with moving image data instead of an e-mail describing a URL for distributing moving image data.
  • the service providing system 60 may be configured to provide a video playback service free of charge to a user and a video data download service for a fee.
  • the moving image data may be generated or reproduced in a form in which an advertisement is inserted at the beginning or end.
  • Distribution of moving image data in the present disclosure includes a mode of transmitting a reproduction signal of moving image data in addition to a mode of transmitting moving image data itself.
  • the service providing system 60 may be configured to perform user authentication when distributing moving image data in order to avoid erroneous distribution to a person who is not a user corresponding to the subject.
  • User authentication can be performed using a password, for example.
  • the password for authentication can be acquired from the user terminal 90 in S260 and S270 together with the mail address.
  • the service providing system 60 may be configured to generate moving image data that zooms in on the user who is the subject.
  • moving image data that zooms in on the user can be generated by reversing the order of the frames constituting the moving image data that zooms out from the user.
  • the service providing system 60 may be configured to generate moving image data including both a moving image to be zoomed in and a moving image to be zoomed out.
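The zoom-in and combined variants above reduce to simple list operations on the zoom-out frame sequence; the following is an illustrative sketch (dropping the duplicated middle frame in the combined clip is an assumption, not something the disclosure specifies):

```python
def make_clips(zoom_out):
    """From a zoom-out frame list (closest view first), derive a zoom-in
    clip (reversed order) and a combined clip that zooms in, then back out."""
    zoom_in = list(reversed(zoom_out))
    combined = zoom_in + zoom_out[1:]   # skip the repeated closest frame
    return zoom_in, combined

# Stand-in frame labels instead of real image arrays.
zoom_in, combined = make_clips(["close", "mid", "wide"])
```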
  • the service providing system 60 may be configured to generate a still image data set in S330 instead of the moving image data or in addition to the moving image data.
  • the service providing system 60 may be configured to generate the still image data set by extracting, from the moving image data, several pieces of still image data in which the subject appears at different zoomed-in/zoomed-out angles of view.
  • the still image data set in this case may be generated as a set including one or both of still image data having an angle of view intermediate between the first camera image data and the second camera image data, and still image data having an angle of view intermediate between the second camera image data and the third camera image data. According to the present embodiment, it is possible to generate a still image data set having various angles of view that are not limited to the angles of view of the cameras 21, 22, and 23.
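Extracting stills at intermediate angles of view can be sketched as picking evenly spaced frames from the zoom moving image; this is illustrative only, and the index-rounding scheme is an assumption:

```python
def still_image_set(frames, count):
    """Pick `count` stills spread evenly across the zoom frames, yielding
    angles of view intermediate between the cameras' native angles."""
    idx = [round(i * (len(frames) - 1) / (count - 1)) for i in range(count)]
    return [frames[i] for i in idx]

# Stand-in: frame indices 0..10 in place of image data.
stills = still_image_set(list(range(11)), 3)
```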
  • the function of the guidance device 10 may be realized using the hardware resources of the user terminal 90. That is, the function of the guidance device 10 may be realized by the service providing system 60 transmitting a web page to the user terminal 90 in response to a command input by the user through the input unit 97 of the user terminal 90, and by the processing unit 91 of the user terminal 90 displaying the web page on the display 95. In this case, the service providing system 60 can acquire position information from the user terminal 90 and, based on this position information, issue a one-time password only to a user terminal 90 located at the shooting spot Z.
  • a wireless access point may be provided at the shooting spot Z.
  • the imaging system 1 may be configured to issue a one-time password only to a user terminal 90 that has accessed through this wireless access point. Issuing a one-time password in this way can suppress use of the cameras by users who are not present at the shooting spot Z. Further, the display restriction on the shutter operation reception page G2 using the one-time password (S220, S230) can suppress collisions between camera use by a plurality of users, allowing an exclusive shooting operation to be executed appropriately for each user.
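A minimal sketch of one-time-password issuance and consumption in the spirit described above (illustrative only; the token format, expiry time, and in-memory store are all assumptions):

```python
import secrets
import time

_issued = {}   # token -> expiry timestamp (hypothetical in-memory store)

def issue_otp(ttl_seconds=300):
    """Issue a short random token, e.g. for display by the guidance device 10."""
    token = secrets.token_hex(4)
    _issued[token] = time.time() + ttl_seconds
    return token

def check_otp(token):
    """Consume the token on first use; reject unknown, reused, or expired ones."""
    expiry = _issued.pop(token, None)
    return expiry is not None and time.time() <= expiry

token = issue_otp()
```

Consuming the token on first check is what makes the password "one-time": a second user presenting the same token is rejected, which is how collisions between users could be avoided.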
  • the correspondence between terms is as follows.
  • the user terminal 90 corresponds to an example of a remote operation device, and an operation signal transmitted from the user terminal 90 to the service providing system 60 when the shutter button R1 is pressed corresponds to an example of a command from the remote operation device.
  • the restriction of use based on the one-time password by the guide device 10 and the service providing system 60 corresponds to an example of a function realized by the operation control unit.
  • the moving image data generation process (S330) and the moving image distribution process (S340, S350, S520) executed by the service providing system 60 correspond to an example of functions realized by the image processing unit.
  • the control device 30 that transmits the camera image data generated by the cameras 21, 22, and 23 to the back-end server 80 through the wide area network NT corresponds to an example of a communication unit.
  • the first camera 21 corresponds to an example of a narrow area camera
  • the second camera 22 and the third camera 23 correspond to an example of a wide area camera.
  • in the second embodiment, a control device 302 that replaces the control device 30 is provided in common for the plurality of cameras 21, 22, and 23, and the control device 302 generates the moving image data.
  • the imaging system 1 of the second embodiment basically matches the first embodiment in other respects.
  • in describing the configuration of the imaging system 1 of the second embodiment, configurations that differ from the first embodiment are described selectively, and description of configurations common to the first embodiment is omitted.
  • the camera-side system 40 of the present embodiment includes a control device 302 that is common to the first camera 21, the second camera 22, and the third camera 23.
  • the control device 302 includes a connection unit 342 that replaces the connection unit 34 in addition to the processing unit 31, the storage unit 32, and the wireless communication unit 33 described in the first embodiment.
  • the connection unit 342 has a plurality of connection ports. All the cameras 21, 22 and 23 in the camera side system 40 are connected to the connection unit 342.
  • the processing unit 31 in the control device 302 executes the process shown in FIG. 13 instead of the process shown in FIG. That is, when the processing unit 31 receives a shooting command from the back-end server 80 (S610), the processing unit 31 controls the cameras 21, 22, and 23 to cause the cameras 21, 22, and 23 to simultaneously execute shooting operations. Then, the first camera image data, the second camera image data, and the third camera image data generated by the cameras 21, 22, and 23 by the shooting operation are acquired from the cameras 21, 22, and 23 (S620).
  • the processing unit 31 uses the first camera image data, the second camera image data, and the third camera image data acquired from the cameras 21, 22, and 23 to generate moving image data that shows the user who is the subject TR and zooms out from the user (S630).
  • the method for generating moving image data is the same as in the first embodiment.
  • the moving image data may be moving image data that zooms in on the user, or moving image data that includes both a moving image that zooms out from the user and a moving image that zooms in on the user.
  • the processing unit 31 transmits the generated moving image data to the back-end server 80 that is the transmission source of the shooting command (S640).
  • the processing unit 31 executes the above-described processing of S620 to S640 every time it receives a shooting command.
  • the processing unit 81 executes the distribution preparation process shown in FIG. 14 instead of the distribution preparation process shown in FIG. Similar to S310 in the first embodiment, the processing unit 81 transmits a shooting command to the control device 302 of the camera side system 40 in accordance with a request from the front end server 70 (S710).
  • the processing unit 81 receives, through the wide area network NT, the moving image data generated by the camera-side system 40 based on the shooting command from the control device 302 (S720), and executes, on the received moving image data, processes corresponding to S340 and S350 of the first embodiment (S730, S740). That is, the processing unit 81 stores the received moving image data in the storage unit 82 and registers it as distribution data (S730). Further, the processing unit 81 transmits an e-mail describing the URL for moving image data distribution to the e-mail address of the user corresponding to the subject TR (S740), and ends the process shown in FIG. 14.
  • the imaging system 1 of the second embodiment described above is meaningful when the first camera 21, the second camera 22, and the third camera 23 are arranged at close positions.
  • moving image data is generated by the control device 302.
  • the moving image data may be generated by the service providing system 60 as in the first embodiment.
  • the imaging system 1 of the second embodiment may have the same configuration as that of the first embodiment, except that one control device 302 is provided in common for the plurality of cameras 21, 22, and 23.
  • the control device 302 may be configured to generate the above-described still image data set and transmit it to the service providing system 60 in S630 and S640 instead of or in addition to the moving image data.
  • the third embodiment differs from the first embodiment in that the camera-side system 40 includes a single camera 20 and a control device 303, in that still image data D1 representing a photographed image of the camera 20 is combined with panoramic background image data D2 prepared in advance, and in that moving image data is generated based on the combined image data D3.
  • the imaging system 1 of the third embodiment basically matches the first embodiment in other points.
  • in describing the configuration of the imaging system 1 of the third embodiment, configurations that differ from the first embodiment are described selectively, and description of configurations common to the first embodiment is omitted.
  • the camera-side system 40 of this embodiment includes a camera 20 corresponding to the first camera 21 and a control device 303 connected to the camera 20.
  • the control device 303 includes a connection unit 343 in addition to the processing unit 31, the storage unit 32, and the wireless communication unit 33 described in the first embodiment.
  • the connection unit 343 is connected to the camera 20.
  • the processing unit 31 in the control device 303 executes the process shown in FIG. 16 instead of the process shown in FIG. That is, when the processing unit 31 receives a shooting command from the back-end server 80 (S810), the processing unit 31 controls the camera 20 to cause the camera 20 to execute a shooting operation.
  • the processing unit 31 acquires the still image data D1 representing the photographed image of the camera 20 by this photographing operation from the camera 20 as the camera image data D1 (S820).
  • the processing unit 31 combines the camera image data D1 with the panorama background image data D2 stored in the storage unit 32 to generate the combined image data D3 illustrated in FIG. 19 (S830).
  • the camera 20 photographs the user, as the subject TR located at the shooting point, at a large size within a range in which the user's whole body is captured. Accordingly, the subject TR appears large in the camera image data D1 generated by the camera 20.
  • the panoramic background image data D2 is image data generated by photographing a wide-area background BG, including the shooting point, section by section as indicated by the broken lines in FIG. 17, and connecting the background image data representing the photographed image of each section.
  • in other words, the panoramic background image data D2 representing the photographed image of the wide-area background BG is composed of a combination of background image data for the sections divided by the broken lines shown in FIG. 17.
  • depending on the number of sections, the combined panorama background image data D2 can be gigapixel image data.
  • FIG. 17 simply shows an example in which the panorama background image data D2 has a width corresponding to nine sections, but the actual panorama background image data D2 may have a larger number of sections.
  • although the panorama background image data D2 is image data representing a photographed image of the wide-area background BG, it is high-resolution image data.
  • the storage unit 32 of the control device 303 stores data that defines the synthesis position of the camera image data D1 with respect to the panorama background image data D2.
  • the processing unit 31 executes a composition process for pasting the photographed image represented by the camera image data D1 in a specific area of the photographed image represented by the panorama background image data D2 based on the data defining the composition position. Thereby, the processing unit 31 generates composite image data D3 in which the captured image of the subject TR is arranged on the captured image of the wide background BG.
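The pasting step of the composition process can be sketched as follows; this is illustrative only, and the synthesis position is modeled as a hypothetical (left, top) offset rather than the stored position-definition data of the embodiment:

```python
import numpy as np

def composite(panorama, subject, offset):
    """Paste the camera image `subject` into a copy of `panorama` at
    offset = (left, top), in the manner of generating D3 from D1 and D2."""
    out = panorama.copy()
    left, top = offset
    h, w = subject.shape[:2]
    out[top:top + h, left:left + w] = subject
    return out

pano = np.zeros((100, 300, 3), dtype=np.uint8)       # stand-in for panorama D2
subj = np.full((40, 20, 3), 255, dtype=np.uint8)     # stand-in for camera image D1
d3 = composite(pano, subj, (140, 30))
```

Copying the panorama before pasting keeps the prepared background data D2 intact for reuse across shooting requests.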
  • when executing the composition process, the processing unit 31 may adjust the luminance of the panoramic background image data D2 to suppress discontinuities in brightness caused by the synthesis from appearing in the composite image data D3. Owing to differences in weather at the time of shooting, the brightness of the camera image data D1 may differ significantly from that of the panoramic background image data D2 prepared in advance.
  • specifically, the processing unit 31 can calculate a statistical representative value of the per-pixel luminance differences, in the region where the background BG overlaps, between the photographed image represented by the camera image data D1 and the photographed image represented by the panorama background image data D2, and adjust the luminance of the panorama background image data D2 in the direction that makes this statistical representative value zero.
  • the statistical representative value may be, for example, an average value or a median value.
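The luminance adjustment can be sketched with the median as the statistical representative value; the luminance proxy (a plain per-pixel channel mean) and the uniform brightness offset are assumptions made for this illustration:

```python
import numpy as np

def match_luminance(panorama, overlap_pano, overlap_cam):
    """Shift the panorama's brightness so that the median per-pixel luminance
    difference over the overlapping background region becomes zero."""
    lum = lambda a: a.astype(np.float64).mean(axis=2)       # crude luminance proxy
    diff = np.median(lum(overlap_cam) - lum(overlap_pano))  # representative value
    shifted = np.clip(panorama.astype(np.float64) + diff, 0, 255)
    return shifted.astype(np.uint8)

pano = np.full((4, 6, 3), 100, dtype=np.uint8)
cam_overlap = np.full((2, 2, 3), 120, dtype=np.uint8)       # same background, brighter
adjusted = match_luminance(pano, pano[:2, :2], cam_overlap)
```

Using the median rather than the mean makes the adjustment robust to the subject's pixels accidentally falling inside the overlap region.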
  • when executing the composition process, the processing unit 31 may compare each pixel value of the panorama background image data D2 with each pixel value of the camera image data D1 to specify, for pixels that capture the same part of the background BG, the position in the camera image data D1 and the position in the panorama background image data D2.
  • based on the specified positions, the processing unit 31 may generate the composite image data D3 by combining the camera image data D1 with the panorama background image data D2 so that pixels capturing the same part of the background BG overlap.
  • the processing unit 31 transmits the generated composite image data D3 to the back-end server 80 that is the imaging command source through the wide area network NT (S840).
  • the processing unit 31 executes the above-described processing of S820, S830, and S840 every time it receives a shooting command.
  • the processing unit 81 executes the distribution preparation process shown in FIG. 18 instead of the distribution preparation process shown in FIG. Similar to S310 in the first embodiment, the processing unit 81 transmits a shooting command to the control device 303 of the camera side system 40 in accordance with a request from the front end server 70 (S910).
  • the processing unit 81 receives the composite image data D3 generated by the camera-side system 40 based on the shooting command from the control device 303 through the wide area network NT (S920). Furthermore, based on the received composite image data D3, moving image data that shows the user who is the subject TR and that is zoomed out from the user is generated (S930).
  • the moving image data generation method in S930 will be described in detail with reference to FIG.
  • here, an example of generating moving image data of 1920 × 720 pixels will be described.
  • the processing unit 81 takes in, from the composite image data D3, the image data of a region F in which the bust-up of the subject TR is captured, and sets this image data as the first frame image data of the moving image data.
  • the region F is a region of 1920 × 720 pixels.
  • the area F may be an area smaller than the image area of the camera image data D1 synthesized with the panorama background image data D2.
  • the area F is fixedly defined with respect to the image area of the composite image data D3, for example.
  • the region F may be dynamically determined based on a person detection result obtained by analyzing the composite image data D3.
  • alternatively, image data of a region F larger than 1920 × 720 pixels may be taken in from the composite image data D3 and reduced to 1920 × 720 pixels, and the reduced image data may be set as the first frame image data.
  • the processing unit 81 divides the area from the region F to the outer edge of the shooting area of the composite image data D3 by rectangular frames similar to the region F into a number one less than the frame number N of the moving image data, thereby setting, in the area up to the outer edge, the image capture areas for the second through Nth frames defined by these rectangular frames. Then, for each frame from the second frame to the Nth frame, the processing unit 81 takes in, from the composite image data D3, the image data of the image capture area corresponding to that frame, and sets reduced image data of 1920 × 720 pixels corresponding to this image data as the image data of the corresponding frame.
  • the processing unit 81 arranges the image data of the first through Nth frames generated in this manner in time series, thereby generating moving image data that shows the user who is the subject TR and zooms out from the user.
  • the moving image reproduced from this moving image data is the same as in the first embodiment (see FIG. 10). That is, this moving image is configured so that the subject TR located in a beautiful background can be clearly recognized and the beautiful scenery can also be enjoyed.
  • the processing unit 81 stores the generated moving image data in the storage unit 82 and registers it as distribution data (S940).
  • the processing unit 81 transmits an e-mail describing the distribution URL assigned to the moving image data to the e-mail address obtained from the user terminal 90 operated by pressing the shutter button (S950), and ends the process shown in FIG. 18.
  • the moving image data of the present embodiment may be moving image data that zooms in on the user, or moving image data that includes a moving image that zooms in on the user and a moving image that zooms out from the user.
  • the service providing system 60 may be configured to deliver a still image data set instead of or in addition to the moving image data.
  • the service providing system 60 may be configured to acquire the camera image data D1 from the camera side system 40 and generate the composite image data D3 and the moving image data.
  • in this case, the processing unit 31 of the control device 303 may be configured to execute a process (S850) of transmitting the camera image data D1 instead of the processes of S830 and S840.
  • the processing unit 81 of the back-end server 80 may be configured to execute, instead of the process of S920, a process (S960) of generating the composite image data D3 based on the camera image data D1 received from the control device 303.
  • the storage unit 82 of the service providing system 60 can store the panoramic background image data D2.
  • the control device 303 of the camera side system 40 may be configured to generate moving image data based on the composite image data D3 and transmit the generated moving image data to the service providing system 60.
  • the service providing system 60 may be configured to deliver moving image data to a user corresponding to the subject TR in a form attached to an e-mail.
  • the camera 20 may be configured to generate moving image data instead of still image data as the camera image data D1.
  • in this case, the control device 303 may combine the moving image data from the camera 20 with the panoramic background image data D2, which is still image data, to generate moving image data in which the subject moves within a beautiful background.
  • the process of generating the composite image data D3 and generating the moving image data based on the composite image data D3, realized in at least one of the control device 303 and the back-end server 80, corresponds to an example of a process realized by the image processing unit.
  • the imaging system of the present disclosure is not limited to the above embodiment, and can take various forms.
  • the user terminal 90 may be configured to activate a built-in camera (not shown), display an image captured by the camera on the display 95, and display a pointer R3 indicating the camera position in the image.
  • the user terminal 90 can display the pointer R3 based on GPS position information and direction information obtained from its built-in devices and camera position information obtained from the service providing system 60.
  • the photographing system 1 may be configured such that a user who has completed photographing can check a moving image through the guidance device 10.
  • the service providing system 60 may be configured to distribute moving image data to the guidance device 10.
  • a password that can be used multiple times or for a long time may be used.
  • for example, the guidance device 10 may issue the same password to a plurality of users by displaying a password that changes each day. Even issuing such a password can suppress indiscriminate use of the cameras by users who are not present at the shooting spot Z.
  • the functions of one component in the above embodiment can be distributed among a plurality of components. Functions of a plurality of components may be integrated into one component. A part of the configuration of the above embodiment may be omitted. At least a part of the configuration of the embodiment may be added to or replaced with the configuration of the other embodiment. Any aspect included in the technical idea specified from the wording of the claims is an embodiment of the present disclosure.

Abstract

The present invention relates to a photography system comprising one or more cameras for photographing a subject and generating image data representing the photographed image of the subject. Based on the image data from the camera(s), the photography system generates at least one of: moving image data in which the subject is photographed, comprising a moving image that zooms in on the subject and/or a moving image that zooms out from the subject; and a still image data set, i.e., a set of still image data in which the subject is photographed at different angles of view. Alternatively, the photography system combines image data representing the photographed image of the subject generated by a camera with previously recorded image data representing the photographed image of a wide-area background, thereby generating image data in which the subject is clearly recognizable against a large background.
PCT/JP2017/007171 2016-02-24 2017-02-24 Système de photographie et procédé de génération de données WO2017146224A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016033634A JP2017152931A (ja) 2016-02-24 2016-02-24 撮影システム及びプログラム
JP2016-033634 2016-02-24

Publications (1)

Publication Number Publication Date
WO2017146224A1 true WO2017146224A1 (fr) 2017-08-31

Family

ID=59686332

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/007171 WO2017146224A1 (fr) 2016-02-24 2017-02-24 Système de photographie et procédé de génération de données

Country Status (3)

Country Link
JP (1) JP2017152931A (fr)
TW (1) TW201736933A (fr)
WO (1) WO2017146224A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112631043A (zh) * 2020-12-09 2021-04-09 郑可 一种新型影楼摄影技术方法
KR102440759B1 (ko) * 2021-12-14 2022-09-07 에스큐아이소프트(주) 관광명소의 줌아웃 영상 촬영 서비스 제공 시스템 및 이를 이용한 관광명소의 줌아웃 영상 촬영 서비스 제공 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004048648A (ja) * 2002-05-13 2004-02-12 Fuji Photo Film Co Ltd 特殊効果画像の作成方法及びカメラ並びに画像サーバ
JP2004297143A (ja) * 2003-03-25 2004-10-21 Fuji Photo Film Co Ltd 撮影システム
JP2006229467A (ja) * 2005-02-16 2006-08-31 Fuji Photo Film Co Ltd フォトムービー作成装置及びフォトムービー作成プログラム、並びに被写体認識方法
JP2007166352A (ja) * 2005-12-15 2007-06-28 Sony Corp カメラシステム
JP2013254302A (ja) * 2012-06-06 2013-12-19 Sony Corp 画像処理装置、画像処理方法、及びプログラム

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003207841A (ja) * 2002-01-16 2003-07-25 Omron Corp 写真シール自動販売方法とその装置、シール紙ユニット及び写真シールシート
JP2004007296A (ja) * 2002-06-03 2004-01-08 Toshiba Eng Co Ltd カメラ撮影システム及びカメラ撮影装置
JP2003241293A (ja) * 2002-12-16 2003-08-27 Fuji Photo Film Co Ltd リモコン装置付きカメラ
JP2005099164A (ja) * 2003-09-22 2005-04-14 Fuji Photo Film Co Ltd 自動撮影システム
JP4635675B2 (ja) * 2005-03-24 2011-02-23 カシオ計算機株式会社 撮影装置及びプログラム
JP4935264B2 (ja) * 2006-09-13 2012-05-23 辰巳電子工業株式会社 自動写真作成装置および自動写真作成方法
JP4607992B2 (ja) * 2008-08-20 2011-01-05 株式会社メイクソフトウェア 写真プリント提供装置および方法ならびにプログラム
JP5133816B2 (ja) * 2008-08-22 2013-01-30 オリンパスイメージング株式会社 カメラ、画像合成方法、およびプログラム
JP2011160354A (ja) * 2010-02-03 2011-08-18 Canon Inc 画像処理装置、画像処理方法及びプログラム
JP2014183425A (ja) * 2013-03-19 2014-09-29 Sony Corp 画像処理方法、画像処理装置および画像処理プログラム
JP2014217010A (ja) * 2013-04-30 2014-11-17 株式会社ニコン 撮像装置および制御プログラム
JP5928627B2 (ja) * 2014-03-20 2016-06-01 フリュー株式会社 管理装置および管理装置の制御方法

Also Published As

Publication number Publication date
TW201736933A (zh) 2017-10-16
JP2017152931A (ja) 2017-08-31

Legal Events

Code Title Description
NENP: Non-entry into the national phase (Ref country code: DE)
121: Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17756658; Country of ref document: EP; Kind code of ref document: A1)
122: Ep: PCT application non-entry in European phase (Ref document number: 17756658; Country of ref document: EP; Kind code of ref document: A1)