WO2008050620A1 - Serveur de caméra - Google Patents

Serveur de caméra

Info

Publication number
WO2008050620A1
WO2008050620A1 · PCT/JP2007/070007
Authority
WO
WIPO (PCT)
Prior art keywords
data
image data
sound
image
predetermined
Prior art date
Application number
PCT/JP2007/070007
Other languages
English (en)
Japanese (ja)
Inventor
Katsura Obikawa
Jong-Sun Hur
Original Assignee
Opt Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opt Corporation filed Critical Opt Corporation
Publication of WO2008050620A1

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223Cameras

Definitions

  • the present invention relates to a camera server and a data processing method.
  • a camera server for distributing image data of an image captured by a camera device including an image sensor to a client via a network is known (for example, see Patent Document 1).
  • Such a camera server is configured to distribute all the data signals output from the camera device to the client side.
  • Patent Document 1 Japanese Patent Application Laid-Open No. 2005-318412
  • However, a data signal output from the camera device may include a noise data signal that is not a data signal of image data. The client is therefore sent not only the image data but also the noise data signal, so when video is displayed on the client side based on the data transmitted from the camera server, there is a problem that the video is disturbed by the noise data signal.
  • an object of the present invention is to provide a camera server and a data processing method for distributing image data with less noise to a client and reducing disturbance of an image displayed on the client side.
  • According to the present invention, a camera server that distributes image data of an image captured by a camera device including an image sensor to a client via a network is provided with data selection means that determines whether the image data is predetermined image data, outputs the image data to the network side if it is determined to be the predetermined image data, and does not output it to the network side if it is determined not to be. With this configuration, image data with less noise can be delivered to the client, so the client can obtain image data with less noise.
  • Preferably, the data selection means determines whether or not the image data is the predetermined image data based on information in a header added to the image data. This makes it easy to determine whether the image data is the predetermined image data.
  • Preferably, when sound data is input in addition to the image data, the camera server includes data synchronization means for synchronizing the image data and the sound data based on the header information of each, and the data synchronization means also synchronizes the corresponding predetermined sound data with the data portions that the data selection means has determined not to be the predetermined image data.
  • image data with less noise can be delivered to the client, so that the client can obtain image data with less noise.
  • In addition, the sound data synchronized with image data that was not distributed is still distributed, so the client can listen to the collected sound without interruption.
  • In the data processing method of the present invention, it is determined whether the image data is predetermined image data; image data determined to be the predetermined image data is output to the network side, and image data determined not to be the predetermined image data is not output to the network side.
  • whether or not the image data is predetermined image data is determined based on header information added to the image data.
  • This makes it easy to determine whether the image data is the predetermined image data.
  • In a data processing method for distributing, to a client via a network, image data of an image captured by a camera device including an image sensor and sound data of sound collected by a microphone, it is determined for the image data whether or not it is predetermined image data. Image data determined to be the predetermined image data is output to the network side, and data determined not to be the predetermined image data is not output to the network side. The sound data and the image data output to the network side are synchronized with each other based on the header information added to each, and the corresponding predetermined sound data is also synchronized with the data portions determined not to be the predetermined image data.
  • image data with less noise can be delivered to the client, so that the client can obtain image data with less noise.
  • In addition, the sound data synchronized with image data that was not distributed is still distributed, so the client can listen to the collected sound without interruption.
  • FIG. 1 is a diagram showing a configuration of a camera server system according to an embodiment of the present invention.
  • FIG. 2 is a block diagram showing an electric circuit of a camera device in the camera server system shown in FIG.
  • FIG. 3 is a block diagram showing an electric circuit of a camera server in the camera server system shown in FIG.
  • FIG. 1 is a system configuration diagram showing a camera server system 1 according to an embodiment of the present invention.
  • the camera server system 1 includes a camera device 2 and a camera server 3.
  • the camera device 2 and the camera server 3 are connected by a USB (Universal Serial Bus) cable 4.
  • the camera server system 1 and the client 5 are connected via the network 6, and the camera server system 1 receives a command from the personal computer 7 on the client 5 side.
  • the camera server 3 distributes image data of an image captured by the camera device 2 to the personal computer 7 (client 5) based on a command received from the personal computer 7.
  • One or a plurality of clients 5 exist, and the camera server system 1 distributes the image data to each client 5 in accordance with a request from each client 5.
  • the camera device 2 has a cubic-shaped housing 8.
  • The housing 8 is provided with the fisheye lens 9, a USB connector 10, an audio/video output connector 11, a sound output connector 12, and the like.
  • the fisheye lens 9 is disposed on the upper surface of the housing 8.
  • a ventilation hole 14 for the microphone 13 is formed on the upper surface of the housing 8.
  • the USB connector 10, the audio / video output connector 11 and the sound output connector 12 are arranged on the side surface of the housing 8.
  • the camera server 3 has a rectangular parallelepiped housing 15.
  • The housing 15 is provided with a USB connector 16 as an input unit that is an input terminal for inputting data from the camera device 2, a microphone connector 18 to which an extension microphone 17 used as needed is connected, a general-purpose input/output port (GPIO) 19 for connecting external equipment such as an infrared sensor, an audio output connector 20 for outputting audio signals, a video output connector 21 for outputting video signals, a speaker connector 22 to which a speaker is connected, a communication interface 23 connected to the network, a power switch 24 that also serves as a reset switch, and the like.
  • FIG. 2 is a block diagram showing an electric circuit of the camera device 2.
  • The camera device 2 includes the fisheye lens 9, the USB connector 10, the audio/video output connector 11, the sound output connector 12, the microphone 13, an image sensor 25, a color conversion processing unit 26 composed of an FPGA (Field Programmable Gate Array), a sound processing unit 27 composed of a sound IC (Integrated Circuit), an audio/video encoder 28, a streaming generation unit 29, a CPU (Central Processing Unit) 30 that controls the operation of each of the above components, and the like.
  • The microphone 13 collects voices and other sounds around the camera device 2.
  • the microphone 13 generates a sound signal that is an electrical signal corresponding to the collected sound and outputs the sound signal to the sound processing unit 27.
  • the sound processing unit 27 generates digital sound data based on the electrical signal from the microphone 13.
  • the sound processing unit 27 outputs an electrical signal corresponding to the sound from the microphone 13 to the sound output connector 12 without converting it into sound data (digital format). Therefore, the sound collected by the microphone 13 can be heard by connecting, for example, a speaker, headphones or the like to the sound output connector 12.
  • As the image sensor 25, for example, a CMOS (Complementary Metal-Oxide Semiconductor) sensor or the like is used. An image from the fisheye lens 9 is formed on the image sensor 25.
  • the image sensor 25 outputs luminance distribution data, which is an imaging signal, to the color conversion processing unit 26 as digital image data.
  • The image sensor 25 performs imaging at, for example, 15 frames per second. Therefore, one frame of image data is output from the image sensor 25 every 1/15 second.
  • the color conversion processing unit 26 replaces the data of each pixel of the luminance distribution data using a color conversion table (not shown) to generate digital image data.
  • the generated image data is added with a header for each frame by a header adding unit 31 formed by the CPU 30.
  • The CPU 30 is configured with a JPEG encoder unit 32 as an encoder that converts image data into JPEG (Joint Photographic Experts Group) format compressed image data.
  • There are cases where noise data other than image data is output from the image sensor 25, or noise data is mixed in between the image sensor 25 and the JPEG encoder unit 32.
  • When the data input to the JPEG encoder unit 32 is image data, the JPEG encoder unit 32 converts the data into JPEG format image data and adds to the header information indicating that it is JPEG format image data.
  • When the input data is not image data, the JPEG encoder unit 32 does not convert it into JPEG format image data. That is, of the data output from the image sensor 25 and input to the JPEG encoder unit 32 as image data, data that is not image data is not converted to JPEG image data, and information indicating JPEG format image data is not added to its header.
  • The JPEG encoder unit 32 may also be configured not to convert image data into JPEG format data if the image data includes noise data exceeding a predetermined level.
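As a rough illustration, the encoder-side behavior described above can be sketched as follows. All names here (`Frame`, `is_valid_image`, `jpeg_compress`) are hypothetical stand-ins; the actual header format and the noise criterion are left open by the text.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    number: int                       # per-frame header number added by unit 31
    payload: bytes
    header: dict = field(default_factory=dict)

def jpeg_compress(payload: bytes) -> bytes:
    """Placeholder for the actual JPEG encoding step."""
    return b"JPEG:" + payload

def is_valid_image(payload: bytes) -> bool:
    """Placeholder noise check; the document leaves the criterion open."""
    return not payload.startswith(b"NOISE")

def encode_frame(frame: Frame) -> Frame:
    # Valid frames are JPEG-converted and their header marked as JPEG
    # image data; noise frames are neither converted nor tagged.
    if is_valid_image(frame.payload):
        frame.payload = jpeg_compress(frame.payload)
        frame.header["jpeg"] = True
    return frame
```

The untagged header is what later lets the server side recognize and discard noise frames without inspecting the payload itself.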
  • A header is added by the header adding unit 31 for each predetermined amount of data. For example, in the present embodiment, a header is added to the sound data for each data length of 1/30 second.
  • The image data amounts to 15 frames per second, while each unit of sound data has a length of 1/30 second. Therefore, one unit of image data corresponds to two units (1/30 second × 2) of sound data.
  • The image data and the sound data are output from the CPU 30 to the streaming generation unit 29 in a state in which the imaging timing and the sound collection timing are synchronized with each other based on the header numbers.
  • Assume that the header numbers M of the image data are 1, 2, 3, 4, 5, ... and that the header numbers N of the sound data are 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, ...
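Assuming the 1-based numbering above, the 1:2 correspondence between image frames and sound units can be written down directly (the function name is illustrative; the patent only states the correspondence):

```python
def sound_units_for_frame(m: int) -> tuple[int, int]:
    """Image frame with header number M (1/15 s) spans the two
    1/30 s sound units with header numbers 2M-1 and 2M."""
    return (2 * m - 1, 2 * m)
```

For example, frame 1 pairs with sound units 1 and 2, and frame 4 with sound units 7 and 8, matching the pairing used later in the synchronization example.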
  • the streaming generation unit 29 reads the image data and sound data output from the CPU 30, and generates data having these content data as streaming data.
  • the streaming generation unit 29 supplies the generated digital format data signal to the communication processing unit 33, and transmits it from the USB connector 10 to the camera server 3 via the USB cable 4.
  • The audio/video encoder 28 reads the image data output from the color conversion processing unit 26, without it being converted to JPEG format image data, and converts it into an NTSC (National Television System Committee) or PAL (Phase Alternating Line) video signal.
  • the audio / video encoder 28 converts the sound data in the digital format into a sound signal that is an electrical signal.
  • The audio/video output connector 11 outputs the video signal and the sound signal together.
  • FIG. 3 is a block diagram showing an electric circuit of the camera server 3.
  • The camera server 3 includes the USB connector 16, the microphone connector 18 to which the extension microphone 17 is connected, the general-purpose input/output port 19 to which an external device such as an infrared sensor is connected, the audio output connector 20, the video output connector 21, the speaker connector 22, the communication interface 23, the power switch 24, an audio/video input connector 34, an audio encoder 35, a memory 36, and an MCU (Microcontroller Unit) 37 as a control means that manages control of the above components.
  • The MCU 37 includes a data selection unit 38 serving as data selection means, a data synchronization unit 39 serving as data synchronization means, a data distribution unit 40 serving as data distribution means, and the like.
  • The data selection unit 38, the data synchronization unit 39, and the data distribution unit 40 are functionally realized by the MCU 37 reading a control program stored in the memory 36.
  • The data selection unit 38 determines the content of the image data transmitted from the camera device 2 based on the header information of the image data of each frame. As described above, when the data under a header is image data that has undergone JPEG conversion, information to that effect is given to the header, and when the data under a header is noise data, no such information is given to the header. That is, the data selection unit 38 determines, based on the header information, whether the content of the data for each header is the predetermined image data or data other than the predetermined image data.
  • The data selection unit 38 selects the data to be output to the data synchronization unit 39 based on the determination result. That is, when the data selection unit 38 determines that the content of the data is the predetermined image data, the image data is transmitted from the communication interface 23 via the data synchronization unit 39 and the data distribution unit 40, and distributed to the personal computer 7 of the client 5 through the network 6. On the other hand, data that is not the predetermined image data is not output to the data synchronization unit 39 side.
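The selection step just described amounts to a header-based filter. A minimal sketch, assuming a simple dict per frame (the real header layout is not given in the text):

```python
def select_frames(frames: list[dict]) -> list[dict]:
    """Sketch of the data selection unit 38: forward only frames whose
    header marks them as JPEG image data; anything else (noise) is
    dropped and never reaches the network side."""
    return [f for f in frames if f.get("jpeg") is True]
```

Untagged (noise) frames are simply absent from the output, which is why the image header numbers downstream can have gaps while the sound numbers remain continuous.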
  • By providing the data selection unit 38 and not distributing data that is not the predetermined image data to the client 5 side, the image data distributed to the client 5 side does not include noise data.
  • the video displayed on the monitor of the personal computer 7 of the client 5 is a beautiful video with little noise. Since noise data is not transmitted, the amount of data transmitted from the camera server 3 to the client 5 is reduced, and the transmission speed of image data from the camera server 3 to the client 5 can be increased.
  • Since the sound data is not subjected to selection as the image data is, all of the sound data is transmitted from the communication interface 23 via the data synchronization unit 39 and the data distribution unit 40, and distributed to the personal computer 7 of the client 5 through the network 6. On the client 5 side, sound such as voice collected by the microphone 13 can be heard from a speaker built into the personal computer 7 or the like.
  • Therefore, the header numbers N of the sound data input from the data selection unit 38 to the data distribution unit 40 form a continuous sequence without any loss. On the other hand, because the data selection unit 38 selects the data to be output, the header numbers M of the image data input from the data selection unit 38 to the data distribution unit 40 may have missing numbers.
  • If the sound data were distributed to the client 5 side without any of it being dropped while image data is missing, the video based on the image data would get ahead of the corresponding sound content, and the video and sound would not be synchronized. Conversely, if the sound data that synchronizes with the missing image data were deleted so that video and sound stay synchronized, there would be a problem in that the sound content skips.
  • Therefore, the data synchronization unit 39 processes the sound data and the image data output from the data selection unit 38 so that their sound collection timing and imaging timing are synchronized with each other and the sound content does not skip.
  • For example, suppose the header numbers of the image data output from the data selection unit 38 are 1, 2, (missing), 4, 5, ... and the header numbers of the sound data are 1, 2, 3, 4, 5, 6, ... Then the sound data with header numbers 1 and 2 is synchronized with the image data with header number 1, and the sound data with header numbers 3 and 4 is synchronized with the image data with header number 2. Although the image data with header number 3 is missing, the sound data with header numbers 5 and 6 is synchronized with this portion. The sound data with header numbers 7 and 8 is synchronized with the image data with header number 4. In this way, the sound data and the image data are synchronized.
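The pairing in this example can be sketched as a small scheduling routine. The data structures and names are illustrative assumptions, not the actual implementation:

```python
def synchronize(received_frames: set[int], total_frames: int) -> list[dict]:
    """Sketch of the data synchronization unit 39: every frame slot keeps
    its two sound units, whether or not the image frame survived
    selection, so sound playback is never skipped."""
    schedule = []
    for m in range(1, total_frames + 1):
        schedule.append({
            "frame": m if m in received_frames else None,  # None = dropped (noise) frame
            "sound": (2 * m - 1, 2 * m),                   # sound header numbers
        })
    return schedule
```

With frames {1, 2, 4, 5} received, slot 3 carries no image but still carries sound units 5 and 6, matching the example in the text.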
  • Data is distributed from the data distribution unit 40 to the network 6 side through the communication interface 23.
  • The viewer software on the personal computer 7 of the client 5 is configured to keep displaying the video based on the previous image data on the monitor until new image data is distributed, so that when image data is missing, the monitor of the personal computer 7 does not display nothing, which would appear unnatural.
  • Because the viewer software is configured in this way, the video based on the image data with header number 2 continues to be displayed from the time the image data with header number 2 is distributed until the image data with header number 4 is distributed, and during this time the sound based on the sound data with header numbers 5 and 6 is reproduced.
  • the data distribution unit 40 simultaneously distributes data signals to these clients 5 when a plurality of clients 5 are accessing the camera server 3.
  • The USB connector 16 is supplied, from the camera device 2 via the USB cable 4, with a data signal containing digital image data and digital sound data, which is input to the MCU 37.
  • the digital data signal input to the MCU 37 is configured to be output from the communication interface 23 to the network 6 as a digital data signal.
  • the camera server 3 outputs the digital data signal output from the camera device 2 to the network side as a digital signal, there is no signal deterioration. Further, it is not necessary to provide an AD conversion function for converting an analog signal into a digital signal in the camera server 3.
  • image data is transmitted using the TCP or UDP protocol in a half-duplex communication system. Sound data is transmitted by the TCP protocol or UDP protocol in a duplex communication system.
  • The image data is converted into JPEG format compressed image data in the camera device 2 and then transmitted to the camera server 3, from whose data distribution unit 40 it is delivered to the client 5 through the communication interface 23. By converting the image data to the JPEG format in the camera device 2 in this way and sending JPEG format image data to the camera server 3, the volume of data transmitted from the camera device 2 to the camera server 3 can be reduced. Therefore, the transmission speed of image data in the camera server system 1 can be improved, and as a result, the speed at which image data is sent to the client 5 can be increased.
  • If the conversion of the image data into the JPEG format were instead performed in the MCU 37 of the camera server 3, carried out each time in response to a request from the client 5 for transmission of image data, the speed of distributing the image data from the camera server 3 to the client 5 would become slow.
  • the MCU 37 performs many processes related to the control of the camera device 2 and the communication with the client 5. Therefore, in addition to these processes, if an encoding function for JPEG conversion is provided, the load on the MCU 37 increases, the processing speed of the MCU 37 decreases, and the encoding processing speed for JPEG conversion also slows down. As a result, the transmission speed of the image data from the camera server 3 to the client 5 becomes slow.
  • The JPEG conversion speed can be increased by performing the JPEG conversion in the CPU 30 on the camera device 2 side, which has fewer processing items than the MCU 37 on the camera server 3 side. Since the processing speed of the MCU 37 does not decrease, the processing speed of the data selection unit 38, the data distribution unit 40, and the like can be increased, and the speed at which image data is distributed to the client 5 can be increased.
  • Consequently, a client 5 does not have to wait for the distribution to the other clients 5 to finish before receiving image data, and the distribution speed can be increased.
  • The JPEG encoder unit 32 is provided in the CPU 30 of the camera device 2, but as shown in FIG., a JPEG encoder 32A may instead be provided on the camera server 3 side separately from the MCU 37, between the USB connector 16 and the MCU 37, and the image data JPEG-converted by the JPEG encoder 32A may be input to the MCU 37.
  • Since the JPEG encoder unit 32A is provided as a processing device separate from the MCU 37, which contains the data selection unit 38, the data synchronization unit 39, the data distribution unit 40, and the like, the processing speed of the MCU 37 is not reduced.
  • Alternatively, a dedicated JPEG encoder may be provided on the camera device 2 side separately from the CPU 30, and the JPEG conversion may be performed by that JPEG encoder.
  • The audio encoder 35 reads from the MCU 37 the digital sound data input to the camera server 3 from the communication interface 23, converts the sound data into a sound signal that is an electrical signal, and outputs the sound signal to the speaker connector 22. Therefore, by connecting a speaker or the like to the speaker connector 22, the sound data input from the communication interface 23 can be reproduced.
  • a camera device control signal for controlling the camera device 2 is transmitted from the camera server 3 to the camera device 2 via the USB cable 4.
  • This camera device control signal is generated by a device control unit 41 configured in the MCU 37, and controls, for example, starting and stopping the operation of the camera device 2, or requests the camera device 2 to transmit image data. When a command is transmitted to the camera server 3 from the personal computer 7 on the client 5 side connected to the network, the device control unit 41 interprets the command and transmits a camera device control signal to the camera device 2.
  • In addition to the camera device 2, if an external device is connected to the GPIO 19, the device control unit 41 also controls this external device.
  • The size of the microphone 13 incorporated in the camera device 2 is limited by the image sensor 25 and the circuit configuration disposed in the housing 8, so sufficient sound collecting capability may not be obtained. In such a case, the extension microphone 17 is connected to the microphone connector 18.
  • When the MCU 37 recognizes that the extension microphone 17 is connected to the microphone connector 18, it transmits to the camera device 2 a camera device control signal for stopping the sound collection by the microphone 13. In response to this camera device control signal, the CPU 30 of the camera device 2 controls the sound processing unit 27 to stop generating sound data. Therefore, the camera device 2 outputs data that does not contain sound data.
  • Instead, the sound signal is read from the microphone connector 18, and a sound processing unit 42 configured in the MCU 37 of the camera server 3 generates digital sound data.
  • the sound data and the image data transmitted from the camera device 2 are supplied from the data distribution unit 40 to the communication interface 23 and distributed to the network 6.
  • In this way, when the extension microphone 17 is connected, sound is collected by the extension microphone 17 instead of the microphone 13 built into the camera device 2, so sound data suited to the environment, such as its noise conditions, can be obtained and placed on the data for distribution.
  • When started, the application of the camera server 3 sets the sound data of the microphone 13 sent over the USB cable 4 as the default sound input. Thereafter, the application of the camera server 3 checks for the connection of the extension microphone 17 at the microphone connector 18.
  • When the connection of the extension microphone 17 is recognized, the device control unit 41 transmits to the camera device 2 a camera device control signal for stopping the sound collection by the microphone 13.
  • When the application of the camera server 3 does not recognize the connection of the extension microphone 17, the sound data of the microphone 13 sent over the USB cable 4 is used as it is.
  • The determination by the data selection unit 38 may be performed based on other information instead of the header information added to the image data. For example, the image data may be inspected for each header, and when noise data exceeding a predetermined level is detected, the image data of that header may not be transmitted to the network 6 side.
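This alternative, inspection-based selection can be sketched as follows. The noise metric and threshold are illustrative assumptions; the text only says "noise data exceeding a predetermined level":

```python
from typing import Callable

def select_by_noise(frames: list[bytes],
                    measure: Callable[[bytes], float],
                    threshold: float) -> list[bytes]:
    """Alternative to header-based selection: inspect each frame with a
    caller-supplied noise metric and drop frames whose measured noise
    exceeds the predetermined threshold."""
    return [f for f in frames if measure(f) <= threshold]
```

Any per-frame metric could be plugged in as `measure`; the trade-off versus the header-based approach is that every frame's payload must be inspected on the server side.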

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention makes it possible to distribute image data with little noise to a client and to reduce disturbance of the video displayed on the client side. A camera server (3) distributes image data of an image captured by an imaging element (image sensor) (25) of a camera device (2) to a client (5) via a network (6). The camera server (3) comprises data selection means (38) that outputs image data to the network (6) side when the image data is judged to be predetermined image data, and does not output image data to the network (6) side when the image data is judged not to be the predetermined image data.
PCT/JP2007/070007 2006-10-25 2007-10-12 Serveur de caméra WO2008050620A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-289887 2006-10-25
JP2006289887A JP2008109362A (ja) 2006-10-25 2006-10-25 カメラサーバおよびデータの処理方法

Publications (1)

Publication Number Publication Date
WO2008050620A1 true WO2008050620A1 (fr) 2008-05-02

Family

ID=39324420

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/070007 WO2008050620A1 (fr) 2006-10-25 2007-10-12 Serveur de caméra

Country Status (2)

Country Link
JP (1) JP2008109362A (fr)
WO (1) WO2008050620A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001094954A (ja) * 1999-09-17 2001-04-06 Fujitsu Ltd 画像配信システムおよびその方法
JP2001136503A (ja) * 1999-11-04 2001-05-18 Nec Corp テレビ会議端末及びそれに用いる画像・音声再生方法
JP2004158919A (ja) * 2002-11-01 2004-06-03 Matsushita Electric Ind Co Ltd ネットワークカメラシステムとそのネットワークカメラ、及びデータ送信方法
JP2004320513A (ja) * 2003-04-17 2004-11-11 Casio Comput Co Ltd 情報管理システムおよびプログラム
JP2004343421A (ja) * 2003-05-15 2004-12-02 Olympus Corp 通信機能付きカメラ
JP2006222720A (ja) * 2005-02-10 2006-08-24 Canon Inc 映像通信システム、情報処理装置、映像通信方法及びそのプログラム
JP2007258957A (ja) * 2006-03-22 2007-10-04 Oki Electric Ind Co Ltd 映像監視システム及びその映像表示方法


Also Published As

Publication number Publication date
JP2008109362A (ja) 2008-05-08

Similar Documents

Publication Publication Date Title
US5969750A (en) Moving picture camera with universal serial bus interface
WO2013132828A1 (fr) Système de communication et appareil de relais
EP1696396A2 (fr) Dispositif de prise d'images et mèthode de distribution d'images
WO2008050621A1 (fr) Système de serveur de caméra, procédé de traitement de données et serveur de caméra
JPH11250235A (ja) 画像入力装置及び画像入力システム及び画像送受信システム及び画像入力方法及び記憶媒体
US7593580B2 (en) Video encoding using parallel processors
WO2022143212A1 (fr) Système et procédé d'extraction d'un flux spécifique à partir de multiples flux transmis en combinaison pour la lecture
TWI655865B (zh) 用於組態自數位視訊攝影機輸出之視訊串流之方法
JP2004135968A (ja) 遠隔操作可能な内視鏡制御システム
JP2007251779A (ja) デジタルカメラシステムおよびデジタルカメラ
WO2008050620A1 (fr) Serveur de caméra
JP2008131264A (ja) 監視カメラ、画像記録表示装置及び監視カメラシステム
JP2009182372A (ja) 画像圧縮配信装置、及び画像圧縮配信方法
CN102104739B (zh) 传输***、摄像装置和传输方法
JP4232397B2 (ja) 音声付情報端末と情報端末システム
JP2000184261A (ja) 撮像装置
US20090213220A1 (en) Active monitoring system with multi-spot image display and a method thereof
KR100223590B1 (ko) 다기능 텔레비전
WO2008018351A1 (fr) système de caméra, dispositif de caméra et serveur de caméra
KR100563141B1 (ko) 고화소 줌 카메라 시스템
KR20060121506A (ko) 이동 단말에서 영상 표시 장치 및 방법
JP2000152069A (ja) 撮影装置、映像伝送システム、映像受信装置、映像送信装置、映像符号化装置および映像再生装置
JPH0759070A (ja) 表示制御装置
JP2004253910A (ja) 映像配信システム
JPH10173983A (ja) 撮像装置及び画像処理装置並びに撮像システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07829743

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07829743

Country of ref document: EP

Kind code of ref document: A1