CN114650351A - Image pickup apparatus, image pickup method, storage medium, and computer apparatus - Google Patents



Publication number
CN114650351A
Authority
CN
China
Prior art keywords
image
pixels
imaging
composite
synthesized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111545392.6A
Other languages
Chinese (zh)
Inventor
谷幸一
Current Assignee
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Publication of CN114650351A

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/54Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing


Abstract

The present invention relates to an imaging apparatus, an imaging method, a storage medium, and a computer apparatus, and has an object of accepting an imaging operation for generating an image to be synthesized with a first image while a synthesized image, in which the first image is synthesized with a captured image, is displayed. The image pickup apparatus of the present invention includes an image pickup unit (11) for picking up an image; a synthesized image display unit (13) for displaying a first synthesized image obtained by synthesizing the first image and a second image captured by the imaging unit; and an image pickup operation accepting unit (14) for accepting, in a state where the first synthesized image is displayed, an image pickup operation for a third image that is synthesized with the first image to generate a second synthesized image.

Description

Image pickup apparatus, image pickup method, storage medium, and computer apparatus
Technical Field
The invention relates to an imaging apparatus, an imaging method, a storage medium, and a computer apparatus.
Background
Techniques for combining two-dimensional code images with other images are well known. For example, Patent Document 1 (JP 2006-304247 A) discloses an image generating apparatus that, when a user synthesizes a two-dimensional code image into an arbitrary image, accepts an operation of displaying the two-dimensional code image and the other image superimposed on each other and designating a synthesis position, and synthesizes the images at the designated position.
However, in the conventional technique, a synthesized image in which the first image and the captured image are combined is not displayed while the image capturing operation for generating the image to be synthesized with the first image is being accepted. When the user performs the image capturing operation without seeing the composite image, the user may be burdened with repeating image capture and composite-image generation until a satisfactory composite image is obtained.
Disclosure of Invention
An object of the present invention is to accept an image capturing operation for an image to be synthesized with a first image, in a state where a synthesized image in which the first image and a captured image are synthesized is displayed.
In order to achieve the above object and solve the above problem in the prior art, the present invention provides an image pickup apparatus including an image pickup unit for picking up an image; a composite image display unit for displaying a first composite image obtained by compositing a first image and a second image captured by the imaging unit; and an imaging operation reception unit configured to receive, in a state where the first composite image is displayed, an imaging operation of a third image to be combined with the first image to generate a second composite image.
The present invention has an effect that it is possible to accept an image capturing operation for an image synthesized with a first image to generate a synthesized image in a state where the synthesized image in which the first image and a captured image are synthesized is displayed.
Drawings
Fig. 1 is a schematic diagram of a system configuration of an image forming system.
Fig. 2 is a block diagram of a hardware configuration of the image pickup apparatus.
Fig. 3 is a block diagram of a hardware configuration of the image forming apparatus.
Fig. 4 is a functional configuration block diagram of the imaging apparatus according to the first embodiment.
Fig. 5 is a flowchart of the image pickup process according to the first embodiment.
Fig. 6 is a diagram for explaining the imaging of the first image according to the first embodiment.
Fig. 7 is a diagram for explaining the imaging of the second image and the third image according to the first embodiment.
Fig. 8 is a flowchart of an image pickup process according to the second embodiment.
Fig. 9 is a diagram for explaining the imaging of the first image according to the second embodiment.
Fig. 10 is a diagram for explaining the imaging of the second image and the third image according to the second embodiment.
Fig. 11 is a schematic diagram of functions of an imaging apparatus according to a third embodiment.
Fig. 12 is a diagram for explaining the imaging of the second image and the third image according to the third embodiment.
Detailed Description
< first embodiment >
An embodiment of an image forming system including an image pickup apparatus according to the present invention is described below with reference to the drawings.
Fig. 1 is a schematic diagram of a system configuration of an image forming system.
The image forming system 1 includes an imaging device 10 and an image forming device 20. The image pickup apparatus 10 and the image forming apparatus 20 are communicably connected via a communication network 30.
The imaging device 10 is a device having an imaging function, such as a digital camera or a smartphone. The imaging apparatus 10 generates a composite image in which a plurality of captured images are combined, and sends data representing the generated composite image to the image forming apparatus 20. For example, the image pickup device 10 captures a landscape image to be used on a train ticket, synthesizes it with a QR code (registered trademark) image, and sends data representing the generated synthesized image to the image forming device 20.
The image forming apparatus 20 forms an image based on the data received from the image pickup apparatus 10. For example, in order to print a train ticket, the image forming apparatus 20 prints a synthesized image of a landscape captured by the image capturing apparatus 10 and a QR code (registered trademark) image on a printing paper.
Fig. 2 is a block diagram showing an example of the hardware configuration of the imaging apparatus.
The imaging device 10 is configured by a computer, and includes a CPU101, a ROM102, a RAM103, an EEPROM104, a CMOS sensor 105, an imaging element I/F106, an acceleration and orientation sensor 107, a media I/F109, and a GPS receiving unit 111.
The CPU101 controls the overall operation of the imaging apparatus 10. The ROM102 stores programs for driving the CPU101 such as IPL. The RAM103 is used as a work area of the CPU 101. EEPROM104 reads and writes various data such as imaging device programs under the control of CPU 101.
A CMOS (Complementary Metal Oxide Semiconductor) sensor 105 is a built-in image pickup device that, under the control of the CPU101, picks up an image of an object (mainly the user's own image) to obtain image data. Instead of the CMOS sensor, an imaging device such as a CCD (Charge Coupled Device) sensor may be used.
The image pickup element I/F106 is a circuit that controls driving of the CMOS sensor 105. The acceleration and orientation sensor 107 is various sensors such as an electromagnetic compass, a gyro compass, and an acceleration sensor that detect a geomagnetic field. The media I/F109 controls reading and writing of data in the recording medium 108 such as a flash memory. The GPS receiving unit 111 receives GPS signals from GPS satellites.
The imaging apparatus 10 includes a long-distance communication circuit 112, a CMOS sensor 113, an imaging device I/F114, a microphone 115, a speaker 116, an audio input/output I/F117, a display 118, a peripheral connection I/F119, a short-distance communication circuit 120, an antenna 120a of the short-distance communication circuit 120, and a touch panel 121.
Here, the long-distance communication circuit 112 communicates with the image forming apparatus 20 through wireless communication and via the communication network 30. The CMOS sensor 113 is a built-in type image pickup device that picks up an image of a subject to obtain image data in accordance with the control of the CPU 101. The image pickup element I/F114 is a circuit that controls driving of the CMOS sensor 113.
The microphone 115 is a built-in circuit that converts sound into an electric signal. The speaker 116 is a built-in circuit that generates sounds such as music and sounds by converting electrical signals into physical vibrations. The audio input/output I/F117 is a circuit that processes input and output of audio signals to and from the microphone 115 and the speaker 116 in accordance with control of the CPU 101.
The display 118 is a display means such as a liquid crystal or organic EL (Electro Luminescence) display for displaying an image of a subject, various icons, and the like. The peripheral connection I/F119 is an interface for connecting various external devices. The short-distance communication circuit 120 is a communication circuit such as NFC (Near Field Communication) or Bluetooth (registered trademark). The touch panel 121 is an input device with which the user operates the image pickup device 10 by pressing the display 118.
The imaging device 10 includes a bus 110. The bus 110 is an address bus, a data bus, or the like for electrically connecting the respective components such as the CPU101 shown in fig. 2.
Fig. 3 is a block diagram showing an example of a hardware configuration of the image forming apparatus.
The image forming apparatus 20 includes a controller 210, a near field communication circuit 220, an engine control unit 230, an operation panel 240, and a network I/F250.
The controller 210 has a CPU (Central Processing Unit) 201 as the main part of the computer, a system memory (MEM-P)202, a North Bridge (NB)203, a South Bridge (SB)204, an ASIC (Application Specific Integrated Circuit) 206, a local memory (MEM-C)207 as a storage unit, an HDD (Hard Disk Drive) controller 208, and an HD209 as a storage unit.
NB203 and ASIC206 are connected via an AGP (Accelerated Graphics Port) bus 221.
Among these components, the CPU201 is a control unit that performs overall control of the image forming apparatus 20. The NB203 is a bridge for connecting the CPU201, MEM-P202, SB204, and AGP bus 221. NB203 includes a memory controller for controlling reading and writing of MEM-P202, a PCI (Peripheral Component Interconnect) host, and an AGP target.
The MEM-P202 includes a ROM202a and a RAM202b, the ROM202a being a memory that holds programs and data for implementing the functions of the controller 210; the RAM202b is used as a drawing memory for developing programs and data and for memory printing. The program stored in the RAM202b can be provided by being recorded in a computer-readable recording medium such as a CD-ROM, CD-R, or DVD as a file in an installable or executable format.
SB204 is a bridge connecting NB203 with PCI devices and peripherals. The ASIC206 is an image processing IC (Integrated Circuit) having hardware elements for image processing, and has a bridge function connecting the AGP bus 221, the PCI bus 222, the HDD controller 208, and the MEM-C207.
The ASIC206 includes a PCI target and an AGP host controller, an Arbiter (ARB) constituting the hub of the ASIC206, a memory controller controlling the MEM-C207, a plurality of DMACs (Direct Memory Access Controllers) performing image rotation and the like by hardware logic, and a PCI unit performing data transfer between the scanner section 231 and the printer section 232 via the PCI bus 222. The ASIC206 may be connected to an interface such as USB (Universal Serial Bus) or IEEE1394 (Institute of Electrical and Electronics Engineers 1394).
The MEM-C207 is a local memory used as an image buffer for copying and as a symbol buffer. The HD209 is storage for accumulating image data, font data used in printing, forms, and the like. The HDD controller 208 controls reading and writing of data to and from the HD209 under the control of the CPU201.
The AGP bus 221 is a bus interface of a graphics accelerator card proposed for increasing the graphics processing speed. The AGP bus 221 directly accesses the MEM-P202 at high throughput, which may increase the speed of the graphics accelerator card.
The near field communication circuit 220 includes a near field communication antenna 220 a. The near field communication circuit 220 is a communication circuit of NFC, bluetooth (registered trademark), or the like.
The engine control unit 230 includes a scanner unit 231 and a printer unit 232. The operation panel 240 includes a panel display unit 240a and operation keys 240b. The panel display unit 240a includes a touch panel or the like that displays current setting values, selection screens, and the like, and receives input from the operator. The operation keys 240b include a numeric keypad for receiving setting values of conditions related to image formation, such as density settings, an execution key for receiving a copy start instruction, and the like.
The controller 210 performs overall control of the image forming apparatus 20, for example, drawing, communication, input from the operation panel 240, and the like. The scanner unit 231 or the printer unit 232 performs image processing such as error diffusion and gamma conversion.
The network I/F250 is an interface for data communication using the communication network 30. The close range communication circuit 220 and the network I/F250 are electrically connected to the ASIC206 via the PCI bus 222.
The functions of the respective devices are explained next with reference to the drawings.
Fig. 4 is a functional block diagram of an example of the imaging apparatus according to the first embodiment.
The imaging device 10 includes an imaging unit 11, an image binarizing unit 12, a composite image display unit 13, an imaging operation receiving unit 14, a composite image generating unit 15, a composite image output unit 16, and a storage unit 17.
The imaging unit 11 captures an image. Specifically, the imaging unit 11 captures a first image, a second image, and a third image. The first image is an image of a landscape or the like; the imaging unit 11 captures the first image and stores data representing it in the storage unit 17. The second image and the third image are images of a two-dimensional code or the like. The second image is used during framing of the third image, and the third image is the material of the image formed by the image forming apparatus 20.
The imaging unit 11 repeatedly captures the second image until an imaging operation for the third image is received. The imaging unit 11 also captures the third image when the imaging operation reception unit 14 receives the imaging operation.
The imaging unit 11 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the imaging element I/F106, the imaging element I/F114, or the like.
The image binarizing unit 12 binarizes the image captured by the imaging unit 11. Specifically, the image binarizing unit 12 binarizes the second image and the third image captured by the imaging unit 11. The binarized image is composed of, for example, pixels having a value of 0 and pixels having a value of 1.
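As a rough illustration of this binarization step, the following sketch maps each grayscale pixel to 0 or 1. The fixed threshold of 128 and the function name are assumptions for illustration; the patent does not specify how the binarization is performed, and a real implementation might use adaptive thresholding instead.

```python
import numpy as np

def binarize(gray, threshold=128):
    """Assign 1 to dark pixels (e.g. the modules of a two-dimensional
    code) and 0 to light background pixels. The fixed threshold is an
    assumption; the patent leaves the method unspecified."""
    return (gray < threshold).astype(np.uint8)

# A tiny 2x2 grayscale patch: dark pixels become 1, light pixels become 0.
patch = np.array([[10, 200],
                  [250, 30]], dtype=np.uint8)
print(binarize(patch).tolist())  # [[1, 0], [0, 1]]
```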
The image binarizing unit 12 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like.
The composite image display unit 13 is configured to display a composite image (first composite image) in which the first image and the second image captured by the imaging unit 11 are combined. Specifically, the synthesized image display unit 13 synthesizes and displays the first image stored in the storage unit 17 and the second image repeatedly captured by the imaging unit 11. The composite image display unit 13 repeatedly executes this process until it receives the next imaging operation for the third image.
The composite image display unit 13 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the display 118 or the like.
The image pickup operation accepting unit 14 is configured to accept, while the first composite image is displayed, an image pickup operation for a third image that is synthesized with the first image to generate a second composite image. Thus, the user can perform the image pickup operation while looking at the first composite image, adjusting the shooting position, the angle of view, and the like for capturing the third image.
The image pickup operation reception unit 14 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the touch panel 121 or the like.
The synthetic image generating unit 15 is configured to generate a synthetic image (second synthetic image) in which the third image and the first image captured by the imaging unit 11 when the imaging operation is accepted are synthesized. The storage unit 17 may store data indicating the generated second composite image.
The composite image generating unit 15 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like.
The composite image output unit 16 sends data representing the second composite image to the image forming apparatus 20.
The composite image output unit 16 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the remote communication circuit 112 or the like.
The storage unit 17 stores various data used in various processes described later. For example, the storage unit 17 stores data representing the first image captured by the imaging unit 11. The storage unit 17 may store data indicating the second synthetic image generated by the synthetic image generating unit 15.
The storage unit 17 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the ROM102, the EEPROM104, or the like.
Next, the operation of the image forming system 1 will be described with reference to the drawings.
Fig. 5 is a flowchart of an example of the image pickup process according to the first embodiment.
When the imaging apparatus 10 according to the present embodiment starts up or calls a dedicated application program, the imaging process is started. The imaging unit 11 receives an operation by the user and captures a first image (still image) of a landscape or the like (step S101). Then, the storage unit 17 stores the first image (step S102).
Fig. 6 is a schematic diagram of imaging of a first image according to the first embodiment.
The first image 901 is an image in which the object 801, such as a landscape, is captured. Upon receiving the user's image pickup operation, the image pickup unit 11 picks up an image of the object 801 in step S101 of the image pickup process. In step S102 of the imaging process, the storage unit 17 stores data representing the first image 901. The first image 901 is a color image in a format such as RGB or CMYK. The imaging device 10 may display the first image 901 as shown in fig. 6.
Returning to fig. 5, after the first image (still image) is captured, the imaging unit 11 captures a second image such as a two-dimensional code (step S103). The image binarizing unit 12 performs binarization processing on the second image (step S104). The synthesized image display unit 13 displays a first synthesized image in which the first image and the second image are synthesized (step S105).
Next, the image pickup operation reception unit 14 determines whether or not the image pickup operation for the third image is received (step S106). When the imaging operation reception unit 14 determines that the imaging operation of the third image is not received (no in step S106), the process returns to step S103, and the imaging unit 11 captures a second image such as a two-dimensional code.
The imaging apparatus 10 may immediately return to step S103 to execute the processing, or may execute the processing of step S103 at predetermined intervals, for example, every 1 second.
For example, the imaging unit 11 repeatedly captures a second image as a so-called live view mode. At this time, the image binarizing unit 12 and the composite image display unit 13 execute the binarizing process and the display process in real time. Thus, the imaging apparatus 10 repeatedly performs imaging of the second image and display of the first composite image in real time without accepting a user operation.
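The live-view repetition of steps S103 to S105 can be sketched as a loop. All callables here (`capture`, `binarize`, `compose`, `display`, `shutter_pressed`) are hypothetical stand-ins for the units of fig. 4, not names used by the patent:

```python
import time

def live_view_loop(capture, binarize, compose, display,
                   shutter_pressed, interval=0.0):
    """Repeat capture (S103) -> binarize (S104) -> display the first
    composite image (S105) until the imaging operation is received
    (S106), then capture and return the third image (S107)."""
    while not shutter_pressed():
        frame = capture()                 # second image
        display(compose(binarize(frame)))
        time.sleep(interval)              # e.g. a fixed 1-second interval

    return capture()                      # third image

# Simulated run: two preview frames are shown, the third capture is returned.
frames = iter(range(10))
presses = iter([False, False, True])
shown = []
third = live_view_loop(lambda: next(frames), lambda f: f, lambda m: m,
                       shown.append, lambda: next(presses))
print(shown, third)  # [0, 1] 2
```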
When the image pickup operation reception unit 14 determines that the image pickup operation for the third image is received (yes in step S106), the image pickup unit 11 picks up the third image (step S107).
Fig. 7 is a schematic diagram of imaging of the second image and the third image according to the first embodiment.
The second image 902 is an image in which the object 802 such as a two-dimensional code is captured. Upon receiving the image pickup operation by the user, the image pickup unit 11 picks up an image of the object 802 such as a two-dimensional code in step S103 of the image pickup process. In step S104 of the image capturing process, the image binarizing unit 12 performs a binarization process on the second image 902. In step S105 of the imaging process, the synthetic image display unit 13 displays a first synthetic image in which the first image 901 and the second image 902 are synthesized.
Specifically, for example, the image binarizing unit 12 assigns either the value 0 (an example of a first value) or the value 1 (an example of a second value) to each pixel as data representing the second image 902. Then, the combined image display unit 13 displays each pixel aligned with a value-0 (first value) pixel of the binarized second image 902 with the pixel value of the first image 901, and displays each pixel aligned with a value-1 (second value) pixel as black.
Black is, for example, the color represented by (R, G, B) = (0, 0, 0) when image data having red (R), green (G), and blue (B) gradations in RGB format is represented by numerical values of 0 to 255. Thereby, the composite image display unit 13 displays a two-dimensional code such as a barcode or QR code (registered trademark) in black, superimposed on the first image 901.
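The display rule just described can be sketched in NumPy (a hypothetical illustration, not the patent's actual implementation): pixels aligned with value-0 pixels of the binarized second image keep the first image's color, and pixels aligned with value-1 pixels are drawn black.

```python
import numpy as np

def compose_preview(first_rgb, second_bin):
    """Overlay the binarized second image on the first image: where
    second_bin is 0, keep the first image's pixel; where it is 1,
    draw black, (R, G, B) = (0, 0, 0)."""
    out = first_rgb.copy()
    out[second_bin == 1] = (0, 0, 0)
    return out

first = np.full((2, 2, 3), 200, dtype=np.uint8)   # a light gray "landscape"
code = np.array([[1, 0],
                 [0, 1]], dtype=np.uint8)          # binarized code pattern
preview = compose_preview(first, code)
print(preview[0, 0].tolist(), preview[0, 1].tolist())  # [0, 0, 0] [200, 200, 200]
```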
As described above, before the imaging operation reception unit 14 receives the imaging operation, the imaging unit 11, the image binarization unit 12, and the composite image display unit 13 repeatedly execute the processes of step S103 to step S105 of the imaging process. Thus, the user can perform an imaging operation to synthesize the image of the object 802 captured at an appropriate position and size by moving the imaging apparatus 10 itself while checking the displayed first synthesized image or by adjusting the magnification by a zoom function or the like.
Returning to fig. 5, the image binarizing unit 12 binarizes the third image (step S108). The synthetic image generating unit 15 generates a second synthetic image in which the first image and the third image are synthesized (step S109). Here, the third image is almost the same as the second image, but differs in that it is captured by the user's imaging operation while the first composite image based on the second image is displayed. That is, the second image is an image to be displayed as part of the first composite image, whereas the third image is used to generate the second composite image to be transmitted to the image forming apparatus 20 in the processing described later.
The combined image generating unit 15 combines the binarized third image as a spot color. A spot color is a color printed using a special ink, such as a transparent, metallic, or fluorescent ink, or a special toner. By using a spot color, it is possible to realize special expressions that cannot be achieved with ordinary process colors such as RGB or CMYK alone.
The synthesized image generating unit 15 generates a second synthesized image by setting pixels aligned with value-0 (first value) pixels of the binarized third image to the pixel values of the first image 901, and setting pixels aligned with value-1 (second value) pixels to the designated spot black. Here, the image forming apparatus 20 prints pixels designated as the spot black with IR ink, IR toner, or the like. IR ink and IR toner have the property of transmitting infrared rays and can be read by an infrared sensor or the like.
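A minimal sketch of this spot-color composition follows. Representing the spot-black plane as a separate boolean mask alongside the RGB data is an assumption for illustration; the patent does not specify how the spot plane is encoded in the data sent to the image forming apparatus.

```python
import numpy as np

def generate_second_composite(first_rgb, third_bin):
    """Return (rgb, spot_mask): rgb keeps first_rgb where the binarized
    third image is 0; spot_mask flags value-1 pixels for the spot black
    (printed with IR ink or toner by the image forming apparatus)."""
    spot_mask = third_bin == 1
    rgb = first_rgb.copy()
    rgb[spot_mask] = (0, 0, 0)   # preview value; the printer substitutes IR black
    return rgb, spot_mask

first = np.full((2, 2, 3), 120, dtype=np.uint8)
code = np.array([[0, 1],
                 [1, 0]], dtype=np.uint8)
rgb, mask = generate_second_composite(first, code)
print(mask.tolist())  # [[False, True], [True, False]]
```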
Next, the composite image output unit 16 sends data indicating the second composite image to the image forming apparatus 20 (step S110). In this way, the image forming apparatus 20 prints the second composite image on a predetermined printing sheet or the like based on the received data. Here, the image forming apparatus 20 prints the pixels designated as the spot black color with IR ink, IR toner, or the like.
The image pickup apparatus 10 of the present embodiment can accept the image pickup operation for the third image, which is combined with the first image to generate the second composite image, in a state where the first composite image is displayed. If the positions of the imaging device 10 and the object 802 are held unchanged from when the second image was captured, the second image and the third image are substantially the same image. That is, the displayed first composite image serves as a reference for the position, size, and the like of the image to be captured (the third image) and combined with the first image.
In this way, when capturing the third image to be combined, the user can visualize the position, size, and the like at which the captured image will be combined with the other image (the first image), and does not need to repeat capturing and combining until a satisfactory composite image is obtained, thereby reducing the workload.
The imaging device 10 according to the present embodiment binarizes and displays the second image, and synthesizes the third image as a spot color. Therefore, even when restrictions on the usage method or the like prevent the position of the two-dimensional code from being fixed in advance, the user can determine where the two-dimensional code appears within the landscape image by first capturing the image of the landscape or the like and then capturing the image of the two-dimensional code or the like.
< second embodiment >
The second embodiment is explained below with reference to the drawings. The second embodiment is different from the first embodiment in that the first image is the object of binarization and the object of spot color composition. Therefore, the following description of the second embodiment will be focused on the differences from the first embodiment, and the same reference numerals as used in the description of the first embodiment will be given to members having the same functional configurations as those of the first embodiment, and the description thereof will be omitted.
Fig. 8 is a flowchart of an example of image pickup processing according to the second embodiment.
When the imaging device 10 according to the present embodiment starts the imaging process, the imaging unit 11 receives an operation of the user and captures a first image such as a two-dimensional code (step S201). Then, the image binarizing unit 12 binarizes the first image (step S202). The storage section 17 stores the first image (step S203).
Fig. 9 is a schematic diagram of imaging of a first image according to the second embodiment.
The first image 903 is an image in which an object 803 such as a two-dimensional code is captured. Upon receiving the image pickup operation by the user, the image pickup unit 11 picks up the object 803 such as a two-dimensional code in step S201 of the image pickup process. In step S202 of the imaging process, the image binarizing unit 12 binarizes the first image 903. For example, the image binarizing unit 12 assigns either 0 (first value) or 1 (second value) to each pixel as data representing the first image 903.
In step S203 of the imaging process, the storage unit 17 stores data representing the first image 903. The imaging device 10 may also display the first image 903 as shown in fig. 9.
Returning to fig. 8, the image pickup unit 11 picks up a second image of a landscape or the like (step S204). The synthesized image display unit 13 displays a first synthesized image in which the first image and the second image are synthesized (step S205).
Next, the image pickup operation reception unit 14 determines whether or not the image pickup operation for the third image is received (step S206). If the imaging operation reception unit 14 determines that the imaging operation of the third image is not received (no in step S206), the process returns to step S204, and the imaging unit 11 captures a second image such as a landscape.
The imaging apparatus 10 may immediately return to the execution of the process of step S204, or may execute the process of step S204 at predetermined intervals, for example, every 1 second, as in the first embodiment.
For example, the imaging unit 11 repeatedly captures the second image in a so-called live-view mode, as in the first embodiment. In this case, the composite image display unit 13 executes the display processing in real time. Thus, the imaging apparatus 10 repeatedly performs the imaging of the second image and the display of the first combined image in real time without requiring a user operation.
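The live-view loop just described — repeated capture of the second image and real-time display of the first composite image until the user's operation arrives — can be sketched as follows. The function names and the callback structure are illustrative assumptions, not the patent's implementation.

```python
import time

def live_view_loop(capture, display, operation_received, interval=1.0):
    """Repeat: capture a second image and display the composite,
    until the image pickup operation for the third image is received.
    `interval` mirrors the optional fixed period (e.g. every 1 second)."""
    while not operation_received():
        frame = capture()
        display(frame)
        time.sleep(interval)

# Toy run with stubbed callbacks (interval=0 so the demo finishes instantly).
frames = []
ticks = iter([False, False, True])  # operation arrives on the third poll
live_view_loop(capture=lambda: len(frames), display=frames.append,
               operation_received=lambda: next(ticks), interval=0)
```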
When the imaging operation reception unit 14 determines that the imaging operation for the third image is received (yes at step S206), the imaging unit 11 images the third image (step S207).
Fig. 10 is a schematic diagram of imaging of the second image and the third image according to the second embodiment.
The second image 904 is an image in which the object 804 such as a landscape is captured. Upon receiving the image capturing operation by the user, the image capturing unit 11 captures an object 804 such as a landscape in step S204 of the image capturing process. In step S205 of the imaging process, the synthetic image display unit 13 displays a first synthetic image in which the first image 903 and the second image 904 are synthesized.
Specifically, for example, the combined image display unit 13 displays the pixels of the first combined image that correspond to pixels having a value of 0 (first value) in the binarized first image 903 using the pixel values of the second image 904, and displays the pixels that correspond to pixels having a value of 1 (second value) in black. In this way, the composite image display unit 13 displays a two-dimensional code such as a barcode or a QR code (registered trademark) in black, superimposed on the second image 904.
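The display rule just described — code pixels drawn in black over the live frame — can be sketched as follows. This is a hypothetical NumPy sketch; the names `compose_preview`, `binary_code`, and `live_frame` are illustrative assumptions.

```python
import numpy as np

def compose_preview(binary_code: np.ndarray, live_frame: np.ndarray) -> np.ndarray:
    """Where the binarized code is 0, keep the live frame's pixel values;
    where it is 1, draw black (the display rule described for step S205)."""
    out = live_frame.copy()
    out[binary_code == 1] = 0  # black
    return out

binary_code = np.array([[1, 0],
                        [0, 1]], dtype=np.uint8)
live_frame = np.full((2, 2, 3), 200, dtype=np.uint8)  # uniform gray "landscape"
preview = compose_preview(binary_code, live_frame)
```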
As described above, before the imaging operation reception unit 14 receives the imaging operation, the imaging unit 11 and the synthesized image display unit 13 repeatedly execute the processes of steps S204 to S205 of the imaging process. Thus, while checking the displayed first combined image, the user can move the imaging apparatus 10 itself or adjust the magnification with a zoom function or the like, and then perform the imaging operation once the object 804 is framed at an appropriate position and size.
Returning to fig. 8, the synthetic image generating unit 15 generates a second synthetic image in which the first image and the third image are synthesized (step S208). Here, whereas the first embodiment synthesizes the binarized third image as a spot color, the present embodiment synthesizes the binarized first image as a spot color. Specifically, the synthetic image generating unit 15 generates the second synthetic image by assigning, to the pixels corresponding to pixels having a value of 0 (first value) in the binarized first image, the pixel values of the third image, and assigning, to the pixels corresponding to pixels having a value of 1 (second value), the designated spot-color black.
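The generation of the second composite image in step S208 can be sketched as routing the code pixels to a separate spot-color plane (printed later with IR ink or IR toner) rather than to the ordinary color planes. This data layout and the function name are assumptions for illustration; the patent does not prescribe a representation.

```python
import numpy as np

def make_print_composite(binary_code: np.ndarray, third_image: np.ndarray):
    """Return (color_plane, spot_plane): pixels where the binarized first
    image is 0 take the third image's values; pixels where it is 1 are
    knocked out of the color plane and flagged on the spot-color plane."""
    color = third_image.copy()
    color[binary_code == 1] = 0                 # remove from ordinary inks
    spot = (binary_code == 1).astype(np.uint8)  # 1 = print with IR ink/toner
    return color, spot

binary_code = np.array([[1, 0]], dtype=np.uint8)
third_image = np.full((1, 2, 3), 90, dtype=np.uint8)
color, spot = make_print_composite(binary_code, third_image)
```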
Next, the synthetic image output unit 16 sends data indicating the second synthetic image to the image forming apparatus 20 (step S209). In this way, the image forming apparatus 20 prints the second composite image on a predetermined printing paper or the like based on the received data. Here, the image forming apparatus 20 prints the pixels designated as the spot color of black using IR ink, IR toner, or the like, as in the first embodiment.
As in the first embodiment, the imaging apparatus 10 according to the present embodiment can receive, while the first composite image is displayed, an imaging operation for the third image that is to be composited with the first image to generate the second composite image.
The imaging device 10 according to the present embodiment binarizes and displays the first image captured first, and synthesizes that first image as a spot color. Therefore, when the position of the two-dimensional code or the like within the image is fixed in advance by usage constraints or the like, the user can first capture the two-dimensional code or the like and then decide the position of the object such as a landscape.
< third embodiment >
The third embodiment is explained below with reference to the drawings. The third embodiment is different from the first embodiment in that it includes a warning display unit that displays a warning and a setting acceptance unit that accepts an operation to set a threshold. Therefore, in the following third embodiment, only the differences from the first embodiment will be mainly described, and the same reference numerals as used in the description of the first embodiment will be given to members having the same functional configurations as those of the first embodiment, and the description thereof will be omitted.
Fig. 11 is a functional block diagram of an example of the imaging apparatus according to the third embodiment.
The imaging apparatus 10A according to the present embodiment is configured by adding a setting acceptance unit 18 and a warning display unit 19 to the configuration of the first embodiment.
The setting receiving unit 18 receives a setting operation for the threshold value used in the processing of the warning display unit 19 described later. For example, the setting acceptance unit 18 displays the currently set threshold value on the display 118 and accepts input of a changed threshold value via the touch panel 121. The initial threshold value may be preset, or may be entered by the user on first use. The setting acceptance unit 18 may also display a plurality of threshold candidates as options and accept selection of one as the set threshold.
The setting acceptance unit 18 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the display 118, the touch panel 121, or the like.
The warning display unit 19 displays a warning when, in the first image, the pixels overlapping with, or adjacent to, the second-value (for example, 1) pixels of the binarized second image include a pixel whose color density is equal to or above a predetermined threshold. The threshold is the value set by the setting receiving unit 18.
A color density equal to or above the threshold means, for example, that when RGB image data expresses the red, green, and blue gradations as values from 0 to 255, the average of a pixel's three gradations is equal to or less than a threshold (for example, 30). The smaller the gradation value, the darker the color. The threshold of 30 is the value set by the setting receiving unit 18.
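Under that definition, the darkness test reduces to a per-pixel average. A minimal sketch, assuming RGB data in a NumPy array; the function name is an assumption, and the threshold of 30 mirrors the example above.

```python
import numpy as np

THRESHOLD = 30  # example value; settable via the setting acceptance unit 18

def is_dark(rgb_image: np.ndarray) -> np.ndarray:
    """True where the average of the R, G, B gradations (0-255) is at or
    below THRESHOLD, i.e. the pixel counts as 'color density >= threshold'."""
    return rgb_image.mean(axis=-1) <= THRESHOLD

img = np.array([[[10, 20, 30], [200, 200, 200]]], dtype=np.uint8)
dark = is_dark(img)  # mean 20.0 -> dark; mean 200.0 -> not dark
```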
The warning display unit 19 is realized by the CPU101 executing processing specified by a program stored in the ROM102, the EEPROM104, or the like, and controlling the display 118 or the like.
In step S105 of the image capturing process shown in fig. 5, the warning display unit 19 displays a warning when, in the first image, the pixels overlapping with, or adjacent to, the second-value pixels of the binarized second image include a pixel whose color density is equal to or above the predetermined threshold.
Fig. 12 is a schematic diagram of imaging of the second image and the third image according to the third embodiment.
The warning display unit 19 displays a warning when, in the first image 901, the pixels overlapping with, or adjacent to, the black pixels of the binarized second image 902 include a pixel whose color density is equal to or above the threshold. Specifically, the warning display unit 19 may display a message on the screen of the display 118, or may highlight the relevant portion of the displayed first composite image in a conspicuous red.
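The overlap-or-adjacency condition that triggers the warning can be sketched by dilating the code mask by one pixel and intersecting it with the dark-pixel mask. This is an illustrative NumPy sketch with assumed names; the patent does not specify the neighborhood shape, so 8-connectivity is assumed here.

```python
import numpy as np

def needs_warning(code_mask: np.ndarray, dark_mask: np.ndarray) -> bool:
    """True if any dark pixel overlaps a code pixel or one of its
    8 neighbours (a one-pixel dilation of the code mask)."""
    padded = np.pad(code_mask, 1)      # zero border handles edge pixels
    h, w = code_mask.shape
    dilated = np.zeros((h, w), dtype=bool)
    for dy in (0, 1, 2):               # shifts of -1, 0, +1 in each axis
        for dx in (0, 1, 2):
            dilated |= padded[dy:dy + h, dx:dx + w].astype(bool)
    return bool((dilated & dark_mask).any())

code_mask = np.zeros((4, 4), dtype=np.uint8)
code_mask[0, 0] = 1                    # one code pixel
near_dark = np.zeros((4, 4), dtype=bool)
near_dark[1, 1] = True                 # diagonally adjacent to the code pixel
far_dark = np.zeros((4, 4), dtype=bool)
far_dark[3, 3] = True                  # well away from the code
```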
Black toner and black ink can, like IR toner and IR ink, react to infrared rays, making the two indistinguishable. In contrast, with the warning, the imaging apparatus 10A according to the present embodiment allows the user to capture the third image so that, in the first image 901, no pixel with a color density equal to or above the threshold overlaps with, or is adjacent to, the black pixels of the binarized second image 902.
Since the imaging device 10A according to the present embodiment can set the threshold value, it is possible to flexibly cope with the difference in the degree of infrared ray reaction of the black toner or the black ink.
This embodiment can be combined with the second embodiment. Specifically, the warning display unit 19 may execute the same processing as shown in the present embodiment as part of step S205 of the image pickup process of the second embodiment shown in fig. 8. That is, the warning display unit 19 may display a warning when, in the second image, the pixels overlapping with, or adjacent to, the second-value pixels of the binarized first image include a pixel whose color density is equal to or above the predetermined threshold.
In this way, in the imaging shown in fig. 10, the third image can be captured so that, in the second image 904, no pixel with a color density equal to or above the threshold overlaps with, or is adjacent to, the black pixels of the binarized first image 903.
The above embodiments show examples in which the first image is an image captured by the imaging unit 11. However, the scope of the present invention is not limited thereto; the first image may be another image, for example, an image received from another device or the like.
Each of the above embodiments shows an example in which the synthesized image output unit 16 transmits data representing the second synthesized image to the image forming apparatus 20. However, the composite image output unit 16 may instead output the data representing the second composite image to an external storage medium such as a flash memory or an SD card. In this case, the user or the like inserts the external storage medium into the image forming apparatus 20 or another apparatus, and the image forming apparatus 20 performs the printing.
The imaging apparatus 10 in some embodiments comprises a plurality of computing devices, such as a server cluster. The plurality of computing devices are configured to communicate with one another over any type of communication link, including a network and shared memory, and may cooperate to perform the processes disclosed herein.
The various functions of the above-described embodiments may be implemented by one or more processing circuits. The term "processing circuit" here includes a processor programmed by software to perform each function, such as a processor implemented with electronic circuits, as well as devices such as an ASIC (Application Specific Integrated Circuit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), and conventional circuit modules designed to perform each function.
The imaging device 10 according to the above-described embodiments may be any device having an imaging function, for example, a projector, an output device such as digital signage, a teleconference device, a HUD (Head Up Display) device, an industrial machine, a medical instrument, a network home appliance, an automobile (connected car), a notebook computer (personal computer), a mobile phone, a tablet terminal, a game machine, a PDA (Personal Digital Assistant), a digital camera, a spherical panoramic imaging device, a wearable PC, a desktop PC, or the like.
The present invention has been described above with reference to embodiments, but the present invention is not limited to the specific features shown in those embodiments. Modifications can be made within a range that does not depart from the gist of the present invention, and specific forms may be chosen appropriately according to the application.
Description of the symbols
1 image forming system, 10 image pickup device, 11 image pickup section, 12 image binarization section, 13 composite image display section, 14 image pickup operation receiving section, 15 composite image generating section, 16 composite image output section, 17 storage section, 18 setting receiving section, 19 warning display section, 20 image forming device, 30 communication network.

Claims (12)

1. An image pickup apparatus, comprising:
An image pickup section for picking up an image;
a composite image display unit for displaying a first composite image obtained by compositing a first image and a second image captured by the imaging unit; and
an image pickup operation receiving unit configured to receive, in a state where the first combined image is displayed, an image pickup operation of a third image to be combined with the first image to generate a second combined image.
2. The imaging apparatus according to claim 1, further comprising a composite image generating unit configured to generate the second composite image in which the third image is synthesized as a spot color with the first image.
3. The image pickup apparatus according to claim 2,
further comprises an image binarization section for performing binarization processing on the second image and the third image,
the composite image display unit displays, of the binarized second image, the pixels combined with pixels of a first value using the pixel values of the first image, and displays the pixels combined with pixels of a second value in black,
the synthesized image generating unit generates the second synthesized image by synthesizing, of the binarized third image, the pixels combined with the pixels of the first value into the pixel values of the first image, and the pixels combined with the pixels of the second value into pixels of the designated spot-color black.
4. The imaging apparatus according to claim 3, further comprising a warning display unit configured to display a warning when, in the first image, a pixel having a color density equal to or greater than a predetermined threshold is included among the pixels overlapping with, or adjacent to, the pixels of the second value of the binarized second image.
5. The imaging apparatus according to claim 1, further comprising a composite image generating unit configured to generate the second composite image in which the first image is synthesized as a spot color with the third image.
6. The image pickup apparatus according to claim 5,
further comprises an image binarization section for performing binarization processing on the first image,
the composite image display unit displays, of the binarized first image, the pixels combined with pixels of a first value using the pixel values of the second image, and displays the pixels combined with pixels of a second value in black,
the composite image generating unit generates the second composite image by synthesizing, of the binarized first image, the pixels combined with the pixels of the first value into the pixel values of the third image, and the pixels combined with the pixels of the second value into pixels of the designated spot-color black.
7. The imaging apparatus according to claim 6, further comprising a warning display unit configured to display a warning when, in the second image, a pixel having a color density equal to or greater than a predetermined threshold is included among the pixels overlapping with, or adjacent to, the pixels of the second value of the binarized first image.
8. The imaging apparatus according to claim 4 or 7, further comprising a setting acceptance unit configured to accept an operation of setting the threshold value.
9. The imaging apparatus according to any one of claims 1 to 8, wherein the imaging unit and the composite image display unit repeatedly execute the imaging of the second image and the display of the first composite image, respectively.
10. An image pickup method for an image pickup apparatus, comprising
an image capturing step of capturing an image;
a composite image display step of displaying a first composite image in which a first image and a second image captured by the imaging unit are combined; and
an image pickup operation reception step of receiving, in a state where the first combined image is displayed, an image pickup operation of a third image to be combined with the first image to generate a second combined image.
11. A computer-readable storage medium in which a program is stored for execution by a computer, the program comprising the steps of,
an image capturing step of capturing an image;
a composite image display step of displaying a first composite image in which a first image and a second image captured by the imaging unit are combined; and
an image pickup operation reception step of receiving, in a state where the first composite image is displayed, an image pickup operation of a third image to be synthesized with the first image to generate a second composite image.
12. A computer device comprising a storage device and a processor, wherein the storage device stores a program, and the program is executed by the processor to realize the following functions,
an image pickup section for picking up an image;
a composite image display unit for displaying a first composite image obtained by compositing a first image and a second image captured by the imaging unit; and
an image pickup operation receiving unit configured to receive, in a state where the first combined image is displayed, an image pickup operation of a third image to be combined with the first image to generate a second combined image.
CN202111545392.6A 2020-12-21 2021-12-16 Image pickup apparatus, image pickup method, storage medium, and computer apparatus Pending CN114650351A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020211185A JP2022097928A (en) 2020-12-21 2020-12-21 Imaging apparatus, imaging method, and program
JP2020-211185 2020-12-21

Publications (1)

Publication Number Publication Date
CN114650351A true CN114650351A (en) 2022-06-21

Family

ID=81992122

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111545392.6A Pending CN114650351A (en) 2020-12-21 2021-12-16 Image pickup apparatus, image pickup method, storage medium, and computer apparatus

Country Status (2)

Country Link
JP (1) JP2022097928A (en)
CN (1) CN114650351A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1565119A (en) * 2002-07-08 2005-01-12 卡西欧计算机株式会社 Camera apparatus, photographing method and a storage medium that records method of photographing
US20060215931A1 (en) * 2005-03-25 2006-09-28 Kabushiki Kaisha Image forming apparatus and computer readable medium
CN106599965A (en) * 2016-11-25 2017-04-26 北京矩石科技有限公司 Method and device for making image cartoony and fusing image with 2D code
CN106791367A (en) * 2016-11-25 2017-05-31 努比亚技术有限公司 A kind of filming apparatus and method, mobile terminal
CN107025480A (en) * 2017-02-13 2017-08-08 阿里巴巴集团控股有限公司 Image generating method and its equipment
CN109729274A (en) * 2019-01-30 2019-05-07 Oppo广东移动通信有限公司 Image processing method, device, electronic equipment and storage medium


Also Published As

Publication number Publication date
JP2022097928A (en) 2022-07-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination