WO2008059859A1 - Device drive control server device, device drive control system, and device drive control method - Google Patents


Info

Publication number
WO2008059859A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
drive control
client device
server
command
Prior art date
Application number
PCT/JP2007/072072
Other languages
French (fr)
Japanese (ja)
Inventor
Katsura Obikawa
Jong-Sun Hur
Original Assignee
Opt Corporation
Priority date
Filing date
Publication date
Application filed by Opt Corporation filed Critical Opt Corporation
Publication of WO2008059859A1 publication Critical patent/WO2008059859A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a single remote source
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N 23/661 Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • Device drive control server device, device drive control system, and device drive control method
  • The present invention relates to a device drive control server device, a device drive control system, and a device drive control method.
  • A server device is known that is installed between a monitoring camera, placed at a position where a monitored object can be observed, and a client device, and that returns monitoring image data obtained by driving the monitoring camera according to a command received from the client device.
  • The server device disclosed in Patent Document 1 is equipped with a microphone and a speaker. This server device performs control such as transmitting sound data collected by the microphone to the client device, or emitting sound from the speaker, according to a command from the client device.
  • Incidentally, this type of server device transmits the video data obtained by driving an imaging device such as a surveillance camera and the other data obtained by driving a device such as a microphone to the client device via separate data communication paths. Therefore, in order to acquire both the video data and the other data from the server device in real time, a plurality of data communication paths must be secured between the client device and the server device. In addition, when the client device itself performs remote control of a device, one more data communication path must be secured for transmitting the device-driving commands addressed to the server device.
  • The present invention has been devised under this background. It is an object of the present invention to provide a mechanism that enables a client device to smoothly perform remote control of devices via a server device, and to acquire the data obtained as a result of that control, without securing a plurality of data communication paths between the server device and the client device. Means for solving the problem
  • In order to solve this problem, a device drive control server device according to the present invention includes: communication means; first device drive control means for driving a first device including a wide-angle lens according to a command received by the communication means from a client device, and transmitting the first data obtained as a result of the drive from the communication means to the client device; and second device drive control means for driving a second device not including a wide-angle lens according to a command received by the communication means from the client device, and transmitting the second data obtained as a result of the drive from the communication means to the client device.
  • The server device is characterized in that the communication means exchanges the commands for driving both the first and second devices, the first data, and the second data with the client device via one data communication path.
  • The server device may further include event notification means for transmitting, when a predetermined event occurs, event data indicating the event from the communication means to the client device, wherein the event notification means transmits the event data to the client device via the same data communication path as the command, the first data, and the second data.
  • The command, the first data, the second data, and the event data may be exchanged according to a specific application-layer protocol.
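  The claim above can be illustrated with a minimal framing scheme: each message on the single data communication path carries a type tag, so commands, compressed image data, sound data, and event data can share one connection. The type values and message layout below are assumptions for illustration only; the patent does not specify a wire format.

```python
import struct

# Illustrative message types; the patent does not define these values.
MSG_COMMAND, MSG_VIDEO, MSG_AUDIO, MSG_EVENT = 1, 2, 3, 4

def frame_message(msg_type: int, payload: bytes) -> bytes:
    """Prefix the payload with a 1-byte type tag and a 4-byte length."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def parse_messages(stream: bytes):
    """Split a byte stream back into (type, payload) pairs."""
    messages, offset = [], 0
    while offset < len(stream):
        msg_type, length = struct.unpack_from(">BI", stream, offset)
        offset += 5
        messages.append((msg_type, stream[offset:offset + length]))
        offset += length
    return messages

# All four kinds of data share one byte stream (one data communication path).
stream = (frame_message(MSG_COMMAND, b"JPEG_ENGINE_DRIVE")
          + frame_message(MSG_VIDEO, b"\xff\xd8...jpeg bytes...")
          + frame_message(MSG_EVENT, b"sensor94:object"))
print(parse_messages(stream))
```

Because each payload is self-describing, the receiver can demultiplex the four data types from one connection without any additional sockets.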
  • A device drive control system according to the present invention includes: a first device including a wide-angle lens; a second device not including a wide-angle lens; and a device drive control server device having communication means, first device drive control means for driving the first device according to a command received by the communication means from a client device and transmitting the first data obtained as a result of the drive from the communication means to the client device, and second device drive control means for driving the second device according to a command received by the communication means from the client device and transmitting the second data obtained as a result of the drive from the communication means to the client device.
  • The communication means may exchange the commands for driving both the first and second devices, the first data, and the second data with the client device via one data communication path.
  • The first device may include: a light receiving surface in which a plurality of light receiving elements are arranged; imaging means for generating image data representing an image formed on the light receiving surface through the wide-angle lens; and image processing means for acquiring compressed image data as the first data by performing processing according to a predetermined data compression method on the image data.
  • The second device may include: a microphone that collects sound; and sound processing means that acquires sound data representing the sound collected by the microphone as the second data.
  • Input/output control means may be further provided, to which a drive control signal for controlling driving of the first device and the second device is input via a cable, and which outputs the compressed image data acquired by the image processing means and the sound data acquired by the sound processing means via the cable.
  • A device drive control method according to the present invention includes: a first step in which a client device transmits, to a server apparatus, a command for driving a first device including a wide-angle lens and a second device not including a wide-angle lens; a second step in which the server apparatus transmits drive control signals to the first device and the second device; a third step in which the first device is driven according to the control signal received from the server device, and the first data obtained as a result of the drive is transmitted to the server device; a fourth step in which the second device is driven according to the control signal received from the server device, and the second data obtained as a result of the drive is transmitted to the server device; and a fifth step in which the server device transmits the first data received from the first device and the second data received from the second device to the client device via the same data communication path as that used to send the command in the first step.
  • FIG. 1 is an overall configuration diagram of a device drive control system according to an embodiment.
  • FIG. 2 is a schematic hardware configuration diagram of a camera device.
  • FIG. 3 is a diagram showing a positional relationship between a wide-angle lens and an image sensor.
  • FIG. 4 is a schematic hardware configuration diagram of a server device.
  • FIG. 5 is a flowchart showing server device activation processing.
  • FIG. 6 is a schematic hardware configuration diagram of a client device.
  • 36 … flash memory, 37 … USB connector, 38 … UART, 39 … audio codec, 40 … Ethernet controller (part of the communication means), 41 … camera device drive control unit (first device drive control means and second device drive control means), 42 … server device drive control unit, 43 … client connection control unit (part of the communication means), 44 … audio data transmission control unit, 45 … video data transmission control unit, 46 … server event notification control unit (event notification means), 47 … USB communication control unit (input/output control means), 48 … general-purpose input/output port, 50 … client device, 51 … CPU, 52 … RAM, 53 … ROM, 54 … hard disk, 55 … computer display, 56 … mouse, 57 … keyboard, 58 … network interface card
  • FIG. 1 is a diagram showing an overall configuration of a device drive control system according to the present embodiment.
  • In this system, a camera device 10 and a server device 30 are connected by a USB (Universal Serial Bus) cable 80, and the server device 30 and each of a plurality of client devices 50 are connected via a network 90.
  • the network 90 is an aggregate of a plurality of nodes that perform data communication according to TCP / IP (Transmission Control Protocol / Internet Protocol).
  • FIG. 2 is a diagram showing a schematic hardware configuration of the camera apparatus 10.
  • The camera device 10 includes a wide-angle lens 11, an image sensor 12, an SDRAM (Synchronous DRAM) 13, an SRAM (Static Random Access Memory) 14, a microphone 15, an audio IC (Integrated Circuit) 16, a video output connector 17, an audio output connector 18, an ASIC (Application Specific Integrated Circuit) 19, a USB connector 20, and a video encoder 21.
  • the wide-angle lens 11 (corresponding to a part of the first device) is a so-called fish-eye lens having a wide viewing angle of 180 degrees or more.
  • Fisheye lens projection methods include the stereographic projection, the equidistant projection, the equisolid angle projection, and the orthographic projection; a projection method in which the image formed through the lens is less distorted at the periphery of the image is desirable. As shown in FIG. 1, the wide-angle lens 11 is exposed on the upper surface of the housing 26 of the camera device 10.
  • The image sensor 12 (corresponding to a part of the first device) is a photoelectric conversion sensor using CMOS (Complementary Metal Oxide Semiconductor) technology, and has a light receiving surface in which a plurality of light receiving elements are arranged in a matrix with an aspect ratio of, for example, 3:4.
  • FIG. 3 is a diagram showing an optical positional relationship between the wide-angle lens 11 and the light receiving surface of the image sensor 12.
  • the image sensor 12 is disposed in a posture in which the wide-angle lens 11 is positioned in a direction substantially perpendicular to the light receiving surface.
  • The wide-angle lens 11 has a wide viewing angle of 180 degrees or more, and the image of the monitored region condensed by the wide-angle lens 11 from within that viewing angle is formed on the light receiving surface of the image sensor 12.
  • The image sensor 12 reads the light intensity value indicating the amount of light received by each of the plurality of light receiving elements on the light receiving surface, and generates brightness distribution data in which the read values are arranged as a rectangular image having the same aspect ratio as the light receiving surface.
  • a microphone 15 (corresponding to a part of the second device) generates a waveform signal of the collected sound and supplies it to the audio IC 16.
  • A vent hole 22 is formed on the upper surface of the camera device 10, and the microphone 15 is disposed below the vent hole 22.
  • The audio IC 16 (a part of the second device, corresponding to the sound processing means) outputs the waveform signal supplied from the microphone 15 to the audio output connector 18. The audio IC 16 also samples the waveform signal supplied from the microphone 15 every predetermined time length, and supplies the sound data (corresponding to the second data) obtained by the sampling to the ASIC 19.
  • The ASIC 19 stores the sound data supplied from the audio IC 16 in the SDRAM 13.
  • the audio output connector 18 is exposed on the side surface of the housing 26 of the camera device 10. Therefore, if a speaker or headphones are connected to the audio output connector 18, the sound collected by the microphone 15 can be emitted.
  • the ASIC 19 is customized so as to operate as the color conversion processing unit 23, the JPEG engine 24, and the USB communication control unit 25.
  • The SRAM 14 is used as a work area when the ASIC 19 operates as the color conversion processing unit 23, the JPEG engine 24, and the USB communication control unit 25.
  • The color conversion processing unit 23 obtains image data by replacing the light intensity values of the light receiving elements sequentially input from the image sensor 12 with gradation values of each color of the RGB (Red-Green-Blue) color system.
  • The image data acquired by the color conversion processing unit 23 is bitmap data constituting one image, representing the density of each pixel by, for example, 256 RGB gradations.
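  As a sketch of this color conversion step, the function below maps raw light-intensity readings to 256-gradation RGB triples. A real sensor pipeline would involve per-element color filters and demosaicing, so the grayscale mapping here is only an illustrative stand-in, and the value range (0-1023) is an assumption.

```python
def to_rgb_bitmap(light_values, max_value):
    """Map each light-intensity reading to an (R, G, B) triple with
    256 gradations per channel.  A real sensor uses per-element color
    filters; an equal-channel (grayscale) mapping stands in for that here."""
    bitmap = []
    for row in light_values:
        bitmap.append([(min(255, v * 255 // max_value),) * 3 for v in row])
    return bitmap

# A 2x3 block of raw 10-bit readings (layout mirrors the light receiving surface).
raw = [[0, 512, 1023], [256, 768, 1023]]
print(to_rgb_bitmap(raw, 1023))
```

The output is the bitmap form described above: one RGB triple per light receiving element, with the same aspect ratio as the input.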
  • the acquired image data is stored in the SDRAM 13.
  • The JPEG engine 24 performs compression processing on the image data according to the JPEG (Joint Photographic Experts Group) method.
  • the compressed image data (corresponding to the first data) obtained by the compression processing is stored in the SDRAM 13.
  • The outline of the compression procedure is as follows. First, the image is subjected to discrete cosine transform (DCT) processing and quantization processing in units of blocks of a predetermined number of pixels, to obtain spatial frequency components in units of blocks.
  • the frequency component for each block of the image is composed of a DC component for each block and a plurality of AC components for each block.
  • Next, entropy coding is performed on each frequency component of the image to reduce the amount of image data.
  • The DC component of each block is encoded by predictive encoding combined with entropy coding such as Huffman encoding, and the AC components of each block are encoded by run-length encoding combined with entropy coding such as arithmetic encoding.
  • compressed image data is acquired by adding header data to the bit stream obtained by the encoding.
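  The first two stages of this procedure (DCT and quantization) can be sketched as follows. This is a minimal illustration, not the JPEG engine's actual implementation: real JPEG level-shifts the samples, uses a per-frequency quantization table rather than one uniform step, and reorders coefficients in zig-zag order before entropy coding.

```python
import math

N = 8  # JPEG processes the image in 8x8 pixel blocks

def dct_2d(block):
    """Orthonormal 2-D DCT of an NxN block: spatial frequency components."""
    def c(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            out[u][v] = c(u) * c(v) * s
    return out

def quantize(coeffs, step=16):
    """Uniform quantization; real JPEG uses a per-frequency table."""
    return [[round(value / step) for value in row] for row in coeffs]

flat = [[100] * N for _ in range(N)]  # a uniform 8x8 block
coeffs = dct_2d(flat)
# For a uniform block, only the DC component coeffs[0][0] is non-zero;
# all AC components quantize to zero, which is what entropy coding exploits.
print(quantize(coeffs)[0][0])
```

After quantization, most AC components of natural images become zero or near zero, which is why the subsequent run-length and entropy coding reduce the data amount so effectively.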
  • the compressed image data is read from the SDRAM 13 and supplied to the video encoder 21.
  • The video encoder 21 supplied with the compressed image data outputs, from the video output connector 17, an image signal obtained by expanding the compressed image data. As shown in FIG. 1, the video output connector 17 is exposed on the side surface of the housing 26 of the camera device 10. Therefore, if a display is connected to the video output connector 17, the image represented by the output image signal can be displayed.
  • the USB communication control unit 25 transmits data according to a procedure in accordance with the USB standard.
  • The USB communication control unit 25 receives a control signal instructing generation of sound data or compressed image data via the USB connector 20, and supplies the control signal to the audio IC 16 or the JPEG engine 24. Further, when the sound data or compressed image data obtained by driving the audio IC 16 or the JPEG engine 24 according to the supplied control signal is stored in the SDRAM 13, the USB communication control unit 25 transmits the stored data from the USB connector 20 to the server device 30.
  • In USB data transmission, storage areas called "endpoints" are provided in both devices to individually buffer the various data to be transmitted. Data is then transmitted through a plurality of logical communication lines called "pipes" that connect the endpoints of the two devices.
  • the USB communication control unit 25 performs a setting process called “configuration” with the server device 30, and secures the number of endpoints and pipes necessary for communication between itself and the server device 30.
  • In this system, three types of data are transmitted through the USB cable 80: the control signals for driving the devices of the camera device 10, and the sound data and compressed image data returned from the camera device 10 to the server device 30 according to those control signals. Therefore, at least three pipes must be secured for data transmission in the configuration.
  • The sound data and the compressed image data buffered at their respective endpoints from the audio IC 16 or the JPEG engine 24 are read out in units called "transactions", and are transmitted over the USB cable 80 in units called "frames", each of which is a collection of transactions. This transmission takes place approximately once every 1 millisecond per frame.
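  The endpoint/transaction/frame relationship can be sketched in a few lines. The endpoint names, payloads, and the per-frame transaction limit below are illustrative assumptions, not USB API calls; the point is only that buffered transactions from several endpoints are drained into periodic frames.

```python
from collections import deque

# Illustrative endpoint buffers: sound data and compressed image data
# wait at their endpoints until the next frame is assembled.
endpoints = {
    "audio": deque([b"pcm0", b"pcm1", b"pcm2"]),
    "video": deque([b"jpg0", b"jpg1"]),
}

def build_frame(endpoints, max_transactions=4):
    """Collect up to max_transactions buffered transactions into one frame,
    mimicking USB's roughly 1 ms frame cadence."""
    frame = []
    for name, queue in endpoints.items():
        while queue and len(frame) < max_transactions:
            frame.append((name, queue.popleft()))
    return frame

first = build_frame(endpoints)   # drains up to 4 buffered transactions
second = build_frame(endpoints)  # the remainder goes into the next frame
print(first)
print(second)
```

Anything that does not fit in the current frame simply stays buffered at its endpoint until the next frame, which is why both data streams can share the one cable without interfering.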
  • A control signal transmitted from the server device 30 in the same procedure and stored in an endpoint is immediately supplied to the audio IC 16 or the JPEG engine 24.
  • FIG. 4 is a diagram showing a hardware configuration of the server device 30.
  • The server device 30 includes a power input unit 31, a power switch 32, a reset switch 33, an MCU (Micro Controller Unit) 34, an SDRAM (Synchronous DRAM) 35, a flash memory 36, a USB connector 37, a UART (Universal Asynchronous Receiver Transmitter) 38, an audio codec 39, and an Ethernet controller 40, which are connected by, for example, a PCI (Peripheral Component Interconnect) bus.
  • the power input unit 31 supplies power into the apparatus.
  • the power switch 32 instructs the activation of the device.
  • the reset switch 33 instructs to reset various settings of the device.
  • The MCU 34 executes various programs stored in the flash memory 36 while appropriately using the SDRAM 35 as a work area, thereby logically realizing the camera device drive control unit (Camera Device Control Module) 41, the server device drive control unit (Server Control Module) 42, the client connection control unit 43, the audio data transmission control unit 44, the video data transmission control unit (Video Network Transfer Module) 45, and the server event notification control unit (Server Event Notify Module) 46. The functions of these units will be described in detail later.
  • Some of the hardware resources of the MCU 34 function as a USB communication control unit 47 and a general purpose input / output port (GPIO) 48.
  • the USB communication control unit 47 (corresponding to the input / output control means) transmits a control signal instructing generation of sound data or compressed image data to the camera device 10 via the USB connector 37. Generation of this control signal is performed by the camera device drive control unit 41. Further, the USB communication control unit 47 receives the sound data or the compressed image data generated by the camera device 10 according to the control signal via the USB connector 37.
  • the USB cable 80 that connects the USB connectors of the server device 30 and the camera device 10 transmits data as a serial signal, while the PCI bus that connects each element such as the MCU 34 in the server device 30 transmits data as a parallel signal. Is transmitted.
  • the data generated in the server device 30 and sent to the camera device 10 is converted into a serial signal by the UART 38 and then transmitted through the USB connector 37.
  • the data input from the USB connector 37 is converted into a parallel signal by the UART 38 and then used for subsequent processing by the MCU 34 or the like.
  • the general-purpose input / output port 48 has three signal output terminals and two signal input terminals.
  • a lamp 91, an alarm 92, and a fan 93 are connected to the signal output terminal.
  • Each of the lamp 91, the alarm 92, and the fan 93 is provided in the vicinity of the camera device 10. For example, when a control signal having a peak voltage of 5 V or higher is output from the MCU 34 via the signal output terminal, the lamp 91 emits light of a predetermined wavelength.
  • The alarm 92 emits a sound of a predetermined frequency when a control signal is output via the signal output terminal.
  • the fan 93 rotates when a control signal is output via the signal output terminal.
  • Two optical sensors 94 and 95 are connected to the respective signal input terminals of the general-purpose input/output port 48. Both sensors 94 and 95 have their fields of view directed at the monitored area.
  • When the sensors 94 and 95 detect an object, an electric signal having a peak voltage of 5 V or more is input to the MCU 34 via the signal input terminal.
  • The audio codec 39 is connected to a microphone 96 and a speaker 97 through connectors (not shown).
  • The audio codec 39 supplies sound data obtained by encoding the sound waveform signal input from the microphone 96 to the MCU 34. Further, the audio codec 39 supplies a waveform signal obtained by decoding the sound data supplied from the MCU 34 to the speaker 97.
  • The Ethernet (registered trademark) controller 40 (corresponding to a part of the communication means) performs data communication with other nodes in accordance with a protocol belonging to the network interface layer of TCP/IP.
  • the network interface layer includes a plurality of protocols such as Ethernet (registered trademark) protocol.
  • The processing for data communication belonging to the Internet layer, the transport layer, and the application layer above the network interface layer is controlled by the client connection control unit 43 described later.
  • The Ethernet (registered trademark) controller 40 acquires, from the client connection control unit 43, packets to which a TCP header and an IP header have been added through the processing of each higher layer.
  • The client connection control unit 43 (corresponding to a part of the communication means) receives, from the client device 50, a command instructing driving of a device built into the server device 30 itself or into the camera device 10, and supplies it to the camera device drive control unit 41 or the server device drive control unit 42. The server device drive control unit 42 supplies a control signal corresponding to the command to the devices (the lamp 91, the alarm 92, and the fan 93) connected to the general-purpose input/output port 48.
  • The camera device drive control unit 41 (corresponding to the first device drive control means and the second device drive control means) supplies the camera device 10 with a control signal corresponding to the command supplied to it.
  • The corresponding device (the audio IC 16 or the JPEG engine 24) is driven according to the control signal, and the sound data or compressed image data obtained as a result of the drive is transmitted from the camera device 10 to the server device 30.
  • the audio data transmission control unit 44 and the video data transmission control unit 45 of the server device 30 transmit the transmitted data to the client device 50 via the client connection control unit 43.
  • The server event notification control unit 46 transmits event data indicating that the sensors 94 and 95 have detected an object to the client device 50 via the client connection control unit 43.
  • The client connection control unit 43 establishes a connection with a client device 50 that has accessed it. This connection is established according to the following procedure in accordance with TCP, one of the protocols belonging to the transport layer. First, a packet whose TCP header includes a SYN (Synchronize) code bit requesting establishment of a connection is transmitted from the client device 50 to the client connection control unit 43. Upon receiving the packet, the client connection control unit 43 returns a packet including ACK (Acknowledgment) and SYN code bits to the client device 50.
  • Finally, a packet including an ACK code bit is transmitted from the client device 50 to the client connection control unit 43, thereby forming a data communication path used for the subsequent data transmissions between the two.
  • This formed data communication channel is maintained until a packet including an ACK and FIN (Finish) code bit is transmitted from one of the client device 50 or the client connection control unit 43 to the other.
  • the client connection control unit 43 can form and maintain individual data communication paths with a plurality of client devices 50 at the same time by performing time division processing by multitasking.
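  A minimal sketch of maintaining individual data communication paths with several clients at once is shown below. It uses one thread per connection rather than the time-division multitasking the text describes, and the echo reply, addresses, and message contents are placeholders; the TCP handshake and teardown (SYN/ACK/FIN) happen inside the socket library.

```python
import socket
import threading

def serve_client(conn):
    """Handle one client's data communication path until it closes (FIN)."""
    with conn:
        while True:
            data = conn.recv(1024)
            if not data:          # empty read: the client sent FIN
                break
            conn.sendall(b"ack:" + data)  # echo stands in for real replies

def run_server(ready):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 0))    # the OS picks a free port
    srv.listen()
    ready["port"] = srv.getsockname()[1]
    ready["event"].set()
    for _ in range(2):            # accept two clients, one thread each
        conn, _ = srv.accept()
        threading.Thread(target=serve_client, args=(conn,), daemon=True).start()

ready = {"event": threading.Event()}
threading.Thread(target=run_server, args=(ready,), daemon=True).start()
ready["event"].wait()

replies = []
for name in (b"client-A", b"client-B"):   # two independent connections
    c = socket.create_connection(("127.0.0.1", ready["port"]))
    c.sendall(name)
    replies.append(c.recv(1024))
    c.close()                             # triggers the FIN teardown
print(replies)
```

Each accepted connection is its own data communication path with its own socket, so replies to one client never interleave with another client's stream.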
  • When the data communication path has been formed, a command instructing driving of a device built into the server device 30 or the camera device 10 is transmitted from the client device 50.
  • There are five types of commands: a lamp drive command for instructing driving of the lamp 91, an alarm drive command for instructing driving of the alarm 92, a fan drive command for instructing driving of the fan 93, an audio IC drive command for instructing driving of the audio IC 16 of the camera device 10, and a JPEG engine drive command for instructing driving of the JPEG engine 24. Which of these five types is transmitted depends on the operation performed at the client device 50.
  • Upon receiving a command, the client connection control unit 43 of the server device 30 determines whether the command is a lamp drive command, an alarm drive command, a fan drive command, an audio IC drive command, or a JPEG engine drive command. If it is a lamp drive command, an alarm drive command, or a fan drive command, it is supplied to the server device drive control unit 42; if it is an audio IC drive command or a JPEG engine drive command, it is supplied to the camera device drive control unit 41.
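  The branching just described amounts to a small dispatch table. The command strings and unit names below are illustrative labels keyed to the reference numerals in the text, not identifiers from the patent.

```python
# Commands handled inside the server device vs. forwarded to the camera device.
SERVER_COMMANDS = {"LAMP_DRIVE", "ALARM_DRIVE", "FAN_DRIVE"}
CAMERA_COMMANDS = {"AUDIO_IC_DRIVE", "JPEG_ENGINE_DRIVE"}

def route_command(command: str) -> str:
    """Return which control unit a received command is supplied to."""
    if command in SERVER_COMMANDS:
        return "server_device_drive_control_unit_42"
    if command in CAMERA_COMMANDS:
        return "camera_device_drive_control_unit_41"
    raise ValueError(f"unknown command: {command}")

print(route_command("FAN_DRIVE"))          # handled inside the server device
print(route_command("JPEG_ENGINE_DRIVE"))  # forwarded to the camera device
```

Because all five command types arrive over the same data communication path, this one routing step is the only place where their destinations diverge.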
  • Upon receiving the command, the server device drive control unit 42 outputs a control signal corresponding to the command from the general-purpose input/output port 48. Thereby, the lamp 91 is turned on, the alarm 92 emits sound, or the fan 93 is rotated.
  • the camera device drive control unit 41 that has received the command supplies the USB communication control unit 47 with a control signal generated in response to the command.
  • the USB communication control unit 47 transmits the control signal from the USB connector 37 to the camera device 10.
  • the audio IC 16 or the JPEG engine 24 is driven according to the control signal, and the sound data or the compressed image data obtained as a result of the driving is stored in the SDRAM 13.
  • The USB communication control unit 25 transmits the sound data or the compressed image data stored in the SDRAM 13 from the USB connector 20 to the server device 30.
  • the sound data transmitted from the camera device 10 to the server device 30 is supplied to the audio data transmission control unit 44 via the USB communication control unit 47.
  • the compressed image data transmitted from the camera device 10 to the server device 30 is supplied to the video data transmission control unit 45 via the USB communication control unit 47.
  • The audio data transmission control unit 44 supplied with the sound data and the video data transmission control unit 45 supplied with the compressed image data each generate a series of packets by enclosing the data in units of a predetermined byte length, and transmit them from the Ethernet (registered trademark) controller 40 to the client device 50.
  • When an electric signal from the sensors 94 and 95 is input via the general-purpose input/output port 48, the server event notification control unit 46 generates event data, and a series of packets obtained by enclosing the event data in units of a predetermined byte length is transmitted from the Ethernet (registered trademark) controller 40 to the client device 50.
  • The command, sound data, compressed image data, and event data are each packetized according to an application-layer protocol specific to each of them.
  • However, the protocols below the application layer used for exchanging the command, sound data, compressed image data, and event data are common.
  • Accordingly, these four types of data with different properties can be exchanged via a single data communication path.
  • When the camera device drive control unit 41 has been realized through the execution of step S180, a program that realizes the client connection control unit 43 is activated (S190). Then, programs for realizing the video data transmission control unit 45, the server device drive control unit 42, the server event notification control unit 46, and the audio data transmission control unit 44 are started (S200 to S230).
  • FIG. 6 is a diagram showing a schematic hardware configuration of the client device 50.
  • The client device 50 includes a central processing unit (CPU) 51, a random access memory (RAM) 52, a read only memory (ROM) 53, a hard disk 54, a computer display 55, a mouse 56, a keyboard 57, and a network interface card 58, which are connected by, for example, a PCI system bus.
  • the CPU 51 controls the entire apparatus by executing various programs stored in the ROM 53 and the hard disk 54 while using the RAM 52 as a work area.
  • The network interface card 58 performs data communication with other nodes in accordance with a protocol belonging to the TCP/IP network interface layer. Processing for data communication belonging to layers higher than the network interface layer is realized in software by the CPU 51 executing various programs.
  • the ROM 53 stores a simple program such as an IPL (Initial Program Loader).
  • the hard disk 54 stores a communication control program 54a, a command control program 54b, an event presentation program 54c, and a moving image data generation program 54d.
  • the communication control program 54a manages processing belonging to a layer higher than the network interface layer.
  • the command control program 54b performs processing to send various commands (lamp drive command, alarm drive command, fan drive command, audio IC drive command, JPEG engine drive command) specified through the operation of the mouse 56 and keyboard 57.
  • the event presentation program 54c is responsible for processing for displaying a message to the effect that the event data has been received from the server device 30 on the computer display 55.
  • The moving image data generation program 54d manages the process of converting the series of compressed image data transmitted from the server device 30 in response to the JPEG engine drive command into moving image data in the MPEG (Moving Picture Experts Group) format.
  • As described above, remote operation of the camera device 10 via the server device 30, and acquisition of the data obtained by that operation, can be realized without individually securing a plurality of data communication paths between the server device 30 and the client device 50. Moreover, only one port needs to be secured as the transfer destination for these types of data. Furthermore, since only one port number needs to be designated for the port used for data exchange between the server device 30 and the client device 50, it can easily be changed later. Therefore, even if the port number is leaked to a third party, data can continue to be exchanged after changing to a new port number.
  • the server device 30 is equipped with the video output connector 17 and the audio output connector 18, and if a display or headphones are connected to these connectors, the captured image and the collected sound can be confirmed on the spot. Therefore, dedicated monitoring personnel can be stationed at the installation location of the server device 30, and direct monitoring by those personnel can be performed via the display and headphones connected to the server device 30.
  • the server event notification control unit 46 is realized in the MCU 34 of the server device 30.
  • the server event notification control unit 46 includes sensors 94 and 95.
  • when the sensors 94 and 95 detect the entry of an object, event data is generated and transmitted to the client device 50.
  • the event presentation program 54c is installed in the client device 50, and when event data is received, a message to that effect is immediately displayed on the computer display 55. Therefore, the user only has to transmit a command instructing generation of sound data and compressed image data to the server device 30 after the sensors 94 and 95 have confirmed that an object has entered their sensing field of view.
  • the camera device 10 is equipped with the first device, which includes the wide-angle lens 11, the image sensor 12, the color conversion processing unit 23, and the JPEG engine 24, and the second device, which includes the microphone 15 and the audio IC 16.
  • the compressed image data obtained by driving the first device and the sound data obtained by driving the second device are transmitted from the server device 30 to the client device 50 through one data communication path.
  • an infrared camera may be mounted instead of the microphone 15 and the audio IC 16.
  • in short, it suffices that the camera device 10 is equipped with a device including the wide-angle lens 11 and a device not including the wide-angle lens 11, and that the two kinds of data obtained by driving these two devices under the control of the server device 30 are sent to the client device 50 via one data communication path.
  • the color conversion processing unit 23, the JPEG engine 24, and the USB communication control unit 25 of the camera device 10 are formed by the ASIC 19.
  • these units may be formed by other hardware such as a DSP (Digital Signal Processor).
  • each unit may be realized in software by a program loaded into memory and a general-purpose arithmetic unit, such as a CPU (Central Processing Unit), that executes the program.
  • the first device, including the wide-angle lens 11, the image sensor 12, the color conversion processing unit 23, and the JPEG engine 24, and the second device, including the microphone 15 and the audio IC 16, are built into the same housing 21.
  • Only one of the devices may be built in the housing 21 of the camera apparatus 10 and the other device may be separated as another housing. Further, the other device may be provided in the server device 30 itself.
  • when the server device 30 receives the audio IC drive command or the JPEG engine drive command from the client device 50, it immediately starts generation of sound data or compressed image data by the camera device 10. On the other hand, in addition to such generation of data by manual drive, a schedule drive, in which generation of sound data or compressed image data is started when a preset time arrives, may also be enabled.
  • the video encoder 21 of the camera device 10 is mounted as a module different from the ASIC 19.
  • alternatively, the ASIC 19 may be equipped with a function for decompressing compressed image data, and the video encoder 21 may be omitted.
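The single-path behavior summarized in the notes above — drive commands, compressed image data, sound data, and event data all sharing one port between the server device 30 and the client device 50 — can be sketched as type-and-length framing over one byte stream. The one-byte message tags and the length-prefixed layout below are illustrative assumptions, not details taken from this patent.

```python
import struct

# Hypothetical one-byte tags for each kind of traffic that shares the
# single data communication path (illustrative only).
MSG_COMMAND = 0x01  # e.g. JPEG engine drive command, audio IC drive command
MSG_IMAGE   = 0x02  # compressed image data (the first data)
MSG_SOUND   = 0x03  # sound data (the second data)
MSG_EVENT   = 0x04  # event data triggered by the sensors

def frame(msg_type: int, payload: bytes) -> bytes:
    """Prefix a payload with its type and length so that several kinds of
    data can be interleaved on one TCP connection (one port)."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def deframe(buffer: bytes):
    """Split a received byte stream back into (type, payload) messages."""
    messages, offset = [], 0
    while offset + 5 <= len(buffer):
        msg_type, length = struct.unpack_from(">BI", buffer, offset)
        offset += 5
        messages.append((msg_type, buffer[offset:offset + length]))
        offset += length
    return messages

stream = frame(MSG_COMMAND, b"JPEG_DRIVE") + frame(MSG_IMAGE, b"\xff\xd8...")
print(deframe(stream))
```

Because every message carries its own tag, the client can demultiplex images, sound, and events arriving on the same connection, which is why only one port number ever has to be secured or, if leaked, changed.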

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

A client device can smoothly control a device at a distance via a server device, and can smoothly acquire the data obtained as a result of that control, without using a plurality of data communication paths between the server device controlling the device and the client device. The server device (30) includes: communication means; first device drive control means which drives a first device including a wide-angle lens (11) according to a command received by the communication means from the client device (50), and transmits first data obtained by the drive from the communication means; and second device drive control means which drives a second device including no wide-angle lens (11) according to a command received by the communication means from the client device (50), and transmits second data obtained by the drive from the communication means. The communication means passes the commands for driving both devices, the first data, and the second data via one data communication path.

Description

Specification

Device drive control server device, device drive control system, and device drive control method

Technical field

[0001] The present invention relates to a device drive control server device, a device drive control system, and a device drive control method.

Background art
[0002] A server device is known that is installed between a client device and a monitoring camera placed at a position where a monitored object can be observed, and that returns monitoring image data obtained by driving the monitoring camera in accordance with a command received from the client device. For example, the server device disclosed in Patent Document 1 is equipped with a microphone and a speaker. This server device performs control such as transmitting sound data collected by the microphone to the client device, or emitting sound from the speaker in accordance with a command from the client device. Patent Document 1: JP 2005-318412 A
Disclosure of the invention

Problems to be solved by the invention
[0003] This type of server device must transmit video data, obtained by driving an imaging device such as a surveillance camera, and other data, obtained by driving a device such as a microphone, to the client device via separate data communication paths. Therefore, in order to acquire both the video data and the other data from the server device in real time, the client device must secure a plurality of data communication paths with the server device. In addition, when the client device itself remotely controls a device, yet another data communication path must be secured for transmitting commands, addressed to the server device, for driving the device.

[0004] The present invention has been devised against this background, and its object is to provide a mechanism by which a client device can smoothly perform remote control of a device via a server device, and smoothly acquire the data obtained as a result of that control, without securing a plurality of data communication paths between the server device that controls the device and the client device.

Means for solving the problem
[0005] In order to solve the above problem, a device drive control server device according to a preferred aspect of the present invention comprises: communication means; first device drive control means for driving a first device including a wide-angle lens in accordance with a command received by the communication means from a client device, and causing the communication means to transmit first data obtained as a result of the drive to the client device; and second device drive control means for driving a second device not including a wide-angle lens in accordance with a command received by the communication means from the client device, and causing the communication means to transmit second data obtained as a result of the drive to the client device, wherein the communication means exchanges the commands for driving the first and second devices, the first data, and the second data with the client device via one data communication path.

[0006] In this aspect, the server device may further comprise: a sensor that detects that an object has entered its sensing field of view; and event notification means that, when the sensor detects the entry of an object, causes the communication means to transmit event data indicating the detection to the client device, wherein the event notification means transmits the event data to the client device via the same data communication path as the commands, the first data, and the second data.

[0007] Further, the commands, the first data, the second data, and the event data may each be exchanged in accordance with an application-layer protocol specific to each of them.

[0008] A device drive control system according to another preferred aspect of the present invention comprises: a first device including a wide-angle lens; a second device not including a wide-angle lens; and a device drive control server device comprising communication means, first device drive control means for driving the first device in accordance with a command received by the communication means from a client device and causing the communication means to transmit first data obtained as a result of the drive to the client device, and second device drive control means for driving the second device in accordance with a command received by the communication means from the client device and causing the communication means to transmit second data obtained as a result of the drive to the client device, wherein the communication means of the device drive control server device exchanges the commands for driving the first and second devices, the first data, and the second data with the client device via one data communication path.

[0009] In this aspect, the first device may comprise: a light receiving surface on which a plurality of light receiving elements are arranged; and image processing means for generating image data representing an image formed on the light receiving surface through the wide-angle lens, and for acquiring compressed image data as the first data by subjecting the generated image data to processing in accordance with a predetermined data compression method.

[0010] In this aspect, the second device may comprise: a microphone that collects sound; and sound processing means for acquiring, as the second data, sound data representing the sound collected by the microphone. The system may further comprise input/output control means for inputting, via a cable, drive control signals for controlling the driving of the first device and the second device, and for outputting, via the cable, the compressed image data acquired by the image processing means and the sound data acquired by the sound processing means.

[0011] A device drive control method according to another preferred aspect of the present invention comprises: a first step in which a client device transmits a command, addressed to a server device, for driving a first device including a wide-angle lens and a second device not including a wide-angle lens, via a data communication path established with the server device; a second step in which the server device transmits drive control signals for the first device and the second device, corresponding to the command received from the client device, to those devices; a third step in which the first device is driven in accordance with the control signal received from the server device and transmits first data obtained as a result of the drive to the server device; a fourth step in which the second device is driven in accordance with the control signal received from the server device and transmits second data obtained as a result of the drive to the server device; and a fifth step in which the server device transmits the first data received from the first device and the second data received from the second device to the client device via the same data communication path as that used for transmitting the command in the first step.
Effect of the invention

[0012] According to the present invention, a client device can remotely control a device via a server device, and can smoothly acquire the data obtained as a result of that control, without securing a plurality of data communication paths between the server device that controls the device and the client device.
Brief description of the drawings

[0013] [Fig. 1] An overall configuration diagram of the device drive control system according to the embodiment.
[Fig. 2] A schematic hardware configuration diagram of the camera device.
[Fig. 3] A diagram showing the positional relationship between the wide-angle lens and the image sensor.
[Fig. 4] A schematic hardware configuration diagram of the server device.
[Fig. 5] A flowchart showing the start-up processing of the server device.
[Fig. 6] A schematic hardware configuration diagram of the client device.
Explanation of symbols

[0014] 10: camera device; 11: wide-angle lens (part of the first device); 12: image sensor (part of the first device); 13: SDRAM; 14: SRAM; 15: microphone (part of the second device); 16: audio IC (part of the second device); 17: video output connector; 18: audio output connector; 19: ASIC; 20: USB connector; 21: video encoder; 23: color conversion processing unit (part of the first device); 30: server device; 31: power input unit; 32: power switch; 33: reset switch; 34: MCU; 35: SDRAM; 36: flash memory; 37: USB connector; 38: UART; 39: audio codec; 40: Ethernet (registered trademark) controller (part of the communication means); 41: camera device drive control unit (first device drive control means and second device drive control means); 42: server device drive control unit; 43: client connection control unit (part of the communication means); 44: audio data transmission control unit; 45: video data transmission control unit; 46: server event notification control unit (event notification means); 47: USB communication control unit (input/output control means); 48: general-purpose input/output port; 50: client device; 51: CPU; 52: RAM; 53: ROM; 54: hard disk; 55: computer display; 56: mouse; 57: keyboard; 58: network interface card
BEST MODE FOR CARRYING OUT THE INVENTION

[0015] (Embodiment of the invention)

An embodiment of the present invention will be described below with reference to the drawings.
[0016] Fig. 1 shows the overall configuration of the device drive control system according to the present embodiment. As shown in Fig. 1, in this device drive control system, the camera device 10 and the server device 30 are connected by a USB (Universal Serial Bus) cable 80, and the server device 30 and each of a plurality of client devices 50 are connected via a network 90. The network 90 is an aggregate of a plurality of nodes that perform data communication in accordance with TCP/IP (Transmission Control Protocol/Internet Protocol). In this system, the camera device 10 is installed at a location requiring monitoring, such as a parking lot or a store, and image data obtained by causing the camera device 10 to image the monitored area under the control of the server device 30 is delivered to the client devices 50.
[0017] Fig. 2 shows the schematic hardware configuration of the camera device 10. As shown in Fig. 2, the camera device 10 comprises a wide-angle lens 11, an image sensor 12, an SDRAM (Synchronous DRAM) 13, an SRAM (Static Random Access Memory) 14, a microphone 15, an audio IC (Integrated Circuit) 16, a video output connector 17, an audio output connector 18, an ASIC (Application Specific Integrated Circuit) 19, a USB connector 20, and a video encoder 21.
[0018] Each of these units will now be described in detail. First, the wide-angle lens 11 (corresponding to part of the first device) is a so-called fish-eye lens having a wide viewing angle of 180 degrees or more. Projection methods for fish-eye lenses include the stereographic, equidistant, equisolid-angle, and orthographic projection methods; a stereographic projection lens, in which the distortion at the periphery of the image formed by the lens is smaller, is preferable. As shown in Fig. 1, the wide-angle lens 11 is exposed on the upper surface of the housing 26 of the camera device 10. The image sensor 12 (corresponding to part of the first device) is a photoelectric conversion sensor using a CMOS (Complementary Metal Oxide Semiconductor), and has a light receiving surface on which a plurality of light receiving elements are arranged in a matrix at a ratio of, for example, 3:4.
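The projection methods named in [0018] differ in how an incident half-angle θ is mapped to an image height r on the sensor. The sketch below uses the standard textbook formulas for each method (the focal length f = 1.0 is an arbitrary assumption, not a value from this patent), and illustrates why the stereographic projection keeps the most peripheral separation near the 90-degree edge of a 180-degree field of view:

```python
import math

def image_height(theta_deg: float, f: float = 1.0) -> dict:
    """Image height r for an incident half-angle theta under the four
    fish-eye projection methods named in the text."""
    t = math.radians(theta_deg)
    return {
        "stereographic": 2 * f * math.tan(t / 2),  # smaller peripheral distortion
        "equidistant":   f * t,
        "equisolid":     2 * f * math.sin(t / 2),
        "orthographic":  f * math.sin(t),
    }

# Compare the projections at the extreme edge of a 180-degree field of view.
for name, r in image_height(90.0).items():
    print(f"{name:13s} r = {r:.3f}")
```

At θ = 90°, the stereographic image height (2f) clearly exceeds the other three, which is consistent with the text's preference for stereographic projection at the image periphery.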
[0019] Fig. 3 shows the optical positional relationship between the wide-angle lens 11 and the light receiving surface of the image sensor 12. As shown in Fig. 3, the image sensor 12 is disposed in a posture in which the wide-angle lens 11 is positioned in a direction substantially perpendicular to its light receiving surface. As described above, the wide-angle lens 11 has a wide viewing angle of 180 degrees or more, and the image of the monitored area condensed onto the wide-angle lens 11 from within that viewing angle is formed on the light receiving surface of the image sensor 12. The image sensor 12 then reads light amount values indicating the amount of light received by each of the plurality of light receiving elements on the light receiving surface, and generates luminance distribution data in which the read light amount values are arranged as a rectangular image having the same aspect ratio as the light receiving surface.
[0020] In Fig. 2, the microphone 15 (corresponding to part of the second device) generates a waveform signal of the collected sound and supplies it to the audio IC 16. As shown in Fig. 1, a vent hole 22 is formed in the upper surface of the server device 30, and the microphone 15 is disposed below this vent hole 22. The audio IC 16 (part of the second device, corresponding to the sound processing means) outputs the waveform signal supplied from the microphone 15 to the audio output connector 18. In addition, when a control signal instructing generation of sound data is supplied to the audio IC 16, it samples the waveform signal supplied from the microphone 15 at predetermined time intervals and supplies the sound data obtained by that sampling (corresponding to the second data) to the ASIC 19. The ASIC 19 then stores the sound data supplied from the audio IC 16 in the SDRAM 13. As shown in Fig. 1, the audio output connector 18 is exposed on the side surface of the housing 26 of the camera device 10. Therefore, if a speaker or headphones are connected to the audio output connector 18, the sound collected by the microphone 15 can be emitted.
[0021] The ASIC 19 is customized to operate as a color conversion processing unit 23, a JPEG engine 24, and a USB communication control unit 25. The SRAM 14 is used as a work area when the ASIC 19 operates as the color conversion processing unit 23, the JPEG engine 24, and the USB communication control unit 25.
[0022] The color conversion processing unit 23 acquires image data by replacing the light amount values of the light receiving elements, sequentially input from the image sensor 12, with values of each color of the RGB (Red-Green-Blue) color system. The image data acquired by the color conversion processing unit 23 is bitmap data representing the density of each pixel constituting one image in, for example, 256 RGB gradations. The acquired image data is stored in the SDRAM 13.
[0023] When a control signal instructing generation of compressed image data is supplied to the JPEG engine 24 (corresponding to part of the first device), it subjects the image data stored in the SDRAM 13 to compression processing in accordance with the JPEG (Joint Photographic Experts Group) method, and stores the compressed image data obtained by that compression processing (corresponding to the first data) in the SDRAM 13. The outline of the compression procedure is as follows. First, the image is subjected to discrete cosine transform (DCT) processing and quantization processing in blocks of a predetermined number of pixels, yielding spatial frequency components for each block. The frequency components of each block consist of a DC component and a plurality of AC components. Next, entropy coding is performed on each frequency component to reduce the amount of image data. The DC component of the image is encoded by predictive coding such as Huffman coding, and each AC component of the image is encoded by run-length coding such as arithmetic coding. Compressed image data is then obtained by adding header data to the bit stream obtained by these encodings. The compressed image data is also read from the SDRAM 13 and supplied to the video encoder 21. The video encoder 21, having received the compressed image data, outputs an image signal obtained by decompressing it from the video output connector 17. As shown in Fig. 1, the video output connector 17 is exposed on the side surface of the housing 21 of the camera device 10. Therefore, if a display is connected to the video output connector 17, the image of the output image data can be displayed.
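The per-block sequence described in [0023] — DCT, then quantization, then entropy coding of the resulting DC and AC components — can be illustrated in miniature. The 8×8 block size and the single flat quantization step below are simplifying assumptions for illustration; the patent does not describe the JPEG engine 24 at this level of detail.

```python
import math

def dct_8x8(block):
    """Naive 2-D DCT-II of one 8x8 block (the per-block transform step)."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = sum(block[x][y]
                    * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                    * math.cos((2 * y + 1) * v * math.pi / (2 * N))
                    for x in range(N) for y in range(N))
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * s
    return out

def quantize(coeffs, step=16):
    """Quantization: high-frequency AC coefficients of smooth blocks
    collapse to zero, which makes the later entropy coding effective."""
    return [[round(c / step) for c in row] for row in coeffs]

flat = [[128] * 8 for _ in range(8)]  # a uniform (perfectly smooth) block
q = quantize(dct_8x8(flat))
# For a uniform block, only the DC coefficient q[0][0] survives quantization;
# every AC coefficient quantizes to zero.
print(q[0][0], sum(abs(c) for row in q for c in row))
```

The long runs of zero AC coefficients produced by quantization are exactly what the run-length and predictive entropy coding stages then compress.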
[0024] The USB communication control unit 25 transmits data in accordance with procedures conforming to the USB standard. The USB communication control unit 25 receives a control signal instructing generation of sound data or compressed image data via the USB connector 20, and supplies that control signal to the audio IC 16 or the JPEG engine 24. In addition, when sound data or compressed image data obtained by driving the audio IC 16 or the JPEG engine 24 in response to the supplied control signal is stored in the SDRAM 13, the USB communication control unit 25 transmits the stored data from the USB connector 20 to the server device 30. Under the USB standard, both devices secure a plurality of buffers, called "endpoints", which individually store the various data to be transmitted. Data is then transmitted via a plurality of logical communication lines, called "pipes", each connecting an endpoint of one device with an endpoint of the other. The USB communication control unit 25 first carries out a setting process called "configuration" with the server device 30, and secures the number of endpoints and pipes necessary for communication between itself and the server device 30.

[0025] As described above, in the present embodiment, three kinds of data are transmitted over the USB cable 80: the control signals for driving the devices of the camera device 10, and the sound data and compressed image data returned from the camera device 10 to the server device 30 in response to those control signals. Therefore, three or more pipes for data transmission must be secured in the configuration. After the configuration is completed, the sound data and compressed image data buffered from the audio IC 16 or the JPEG engine 24 into their respective dedicated endpoints are read out in units called "transactions". A plurality of such transactions are then bundled into units called "frames" and transmitted over the USB cable 80. This transmission is performed at a rate of roughly one frame per millisecond. Meanwhile, a control signal transmitted from the server device 30 by the same procedure and stored in an endpoint is immediately supplied to the audio IC 16 or the JPEG engine 24.
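The endpoint/pipe/frame mechanism outlined in [0024] and [0025] can be mimicked with a toy scheduler. The three named pipes and the one-transaction-per-pipe-per-frame policy are illustrative assumptions; real USB scheduling is considerably more elaborate.

```python
from collections import deque

# One buffer ("endpoint") per kind of traffic that the configuration step
# is said to secure: control signals, sound data, compressed image data.
endpoints = {
    "control": deque(),
    "sound": deque(),
    "image": deque(),
}

def submit(pipe: str, transaction: bytes) -> None:
    """Buffer one transaction at the endpoint that backs the given pipe."""
    endpoints[pipe].append(transaction)

def next_frame() -> list:
    """Collect at most one pending transaction per pipe into one frame,
    roughly modelling the ~1 ms per-frame bus schedule."""
    frame = []
    for pipe, queue in endpoints.items():
        if queue:
            frame.append((pipe, queue.popleft()))
    return frame

submit("control", b"JPEG_DRIVE")
submit("image", b"\xff\xd8 jpeg...")
submit("sound", b"pcm...")
print(next_frame())  # one transaction from each pipe, bundled into a frame
print(next_frame())  # empty: everything was drained in the first frame
```

The point of the sketch is that the three traffic types keep separate buffers but share one physical cable, just as the three pipes share the single USB cable 80.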
[0026] Fig. 4 shows the hardware configuration of the server device 30. As shown in Fig. 4, the server device 30 connects a power input unit 31, a power switch 32, a reset switch 33, an MCU (Micro Controller Unit) 34, an SDRAM (Synchronous DRAM) 35, a flash memory 36, a USB connector 37, a UART (Universal Asynchronous Receiver Transmitter) 38, an audio codec 39, and an Ethernet (registered trademark) controller 40 via, for example, a PCI (Peripheral Component Interconnect) bus.
[0027] The power input unit 31 supplies power to the apparatus. The power switch 32 instructs the apparatus to start. The reset switch 33 instructs a reset of the apparatus's various settings. By executing the various programs stored in the flash memory 36 while using the SDRAM 35 as a work area as needed, the MCU 34 logically realizes a camera device drive control unit (Camera Device Control Module) 41, a server device drive control unit (Server Control Module) 42, a client connection control unit (Client Connection Module) 43, an audio data transmission control unit (Audio Communication Module) 44, a video data transmission control unit (Video Network Transfer Module) 45, and a server event notification control unit (Server Event Notify Module) 46. The functions of these units are described in detail later. In addition, part of the hardware resources of the MCU 34 functions as a USB communication control unit 47 and a general-purpose input/output port (GPIO: General Purpose Input Output) 48.
[0028] The USB communication control unit 47 (corresponding to the input/output control means) transmits a control signal instructing the generation of sound data or compressed image data to the camera device 10 via the USB connector 37. This control signal is generated by the camera device drive control unit 41. The USB communication control unit 47 also receives, via the USB connector 37, the sound data or compressed image data that the camera device 10 generated in response to that control signal. Note that the USB cable 80 connecting the USB connectors of the server device 30 and the camera device 10 transmits data as a serial signal, whereas the PCI bus connecting the elements such as the MCU 34 inside the server device 30 transmits data as a parallel signal. Therefore, data generated inside the server device 30 and destined for the camera device 10 is serialized by the UART 38 before being transmitted via the USB connector 37. Conversely, data input from the USB connector 37 is converted into a parallel signal by the UART 38 before being handed to subsequent processing by the MCU 34 and other components.
[0029] The general-purpose input/output port 48 has three signal output terminals and two signal input terminals. A lamp 91, an alarm 92, and a fan 93 are connected to the signal output terminals. The lamp 91, the alarm 92, and the fan 93 are each installed in the vicinity of the camera device 10. When a control signal with a peak voltage of, for example, AC 5 V or higher is output from the MCU 34 via a signal output terminal, the lamp 91 emits light of a predetermined wavelength. The alarm 92 emits a sound of a predetermined wavelength when a control signal arrives via its signal output terminal, and the fan 93 rotates when a control signal arrives via its signal output terminal. Meanwhile, two optical sensors 94 and 95 are connected to the signal input terminals of the general-purpose input/output port 48. Both sensors are installed with their respective sensing fields of view directed into the monitored area. When either sensor detects that an object has entered its sensing field of view, it inputs an electric signal with a peak voltage of, for example, AC 5 V or higher to the MCU 34 via its signal input terminal.
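The port's three output terminals and two input terminals can be modeled as a small mapping. All names and the exact 5 V threshold handling below are illustrative assumptions, not the MCU's real firmware interface.

```python
# Illustrative sketch: three output terminals drive the lamp, alarm, and fan;
# two input terminals receive detection signals from the optical sensors.
# A peak voltage at or above the assumed ~5 V threshold counts as a detection.

DETECT_THRESHOLD_V = 5.0  # assumed peak-voltage threshold from the description

OUTPUT_DEVICES = {0: "lamp", 1: "alarm", 2: "fan"}   # 3 signal output terminals
INPUT_SENSORS = {0: "sensor94", 1: "sensor95"}       # 2 signal input terminals


def drive(terminal: int) -> str:
    """Return which device a control signal on this output terminal drives."""
    return OUTPUT_DEVICES[terminal]


def sensed(terminal: int, peak_voltage: float) -> bool:
    """True when the sensor on this input terminal reports a detection."""
    return terminal in INPUT_SENSORS and peak_voltage >= DETECT_THRESHOLD_V


print(drive(0))
print(sensed(1, 5.0))
```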
[0030] The audio codec 39 is connected to a microphone 96 and a speaker 97 via connectors (not shown). The audio codec 39 supplies the MCU 34 with sound data obtained by encoding the sound waveform signal input from the microphone 96, and supplies the speaker 97 with the waveform signal obtained by decoding the sound data supplied from the MCU 34.
[0031] The Ethernet (registered trademark) controller 40 (corresponding to part of the communication means) performs data communication with other nodes according to a protocol belonging to the network interface layer of TCP/IP. The network interface layer includes several protocols, such as the Ethernet (registered trademark) protocol. Processing for data communication belonging to the Internet, transport, and application layers above the network interface layer is handled by the client connection control unit 43 described next. The Ethernet (registered trademark) controller 40 obtains from the client connection control unit 43 a packet to which a TCP header and an IP header have been added through the processing of those upper layers. By checking its own routing table against the destination IP address contained in the packet's IP header, it determines the nearest node on the route to that destination IP address. It then sends out onto the network 90 a frame obtained by adding to the packet an Ethernet (registered trademark) header containing that node's MAC (Media Access Control) address, and the frame is routed through multiple nodes until it reaches the destination IP address. Conversely, when the destination IP address of a packet contained in a frame forwarded from another node is that of the server device itself, the Ethernet (registered trademark) controller 40 supplies the packet to the client connection control unit 43.
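The routing and framing steps above can be sketched as follows. The routing and MAC tables, the simplified prefix match, and the frame representation are all illustrative assumptions for the example.

```python
# Illustrative sketch: look up the next hop for a packet's destination IP in
# a routing table, prepend an Ethernet header carrying that node's MAC
# address, and hand off the resulting frame.

ROUTING_TABLE = {"192.168.1.": "gateway-a", "10.": "gateway-b"}  # hypothetical
MAC_TABLE = {"gateway-a": "00:11:22:33:44:55",
             "gateway-b": "66:77:88:99:aa:bb"}


def next_hop(dest_ip: str) -> str:
    """Very simplified prefix match standing in for a real routing table."""
    for prefix, node in ROUTING_TABLE.items():
        if dest_ip.startswith(prefix):
            return node
    raise ValueError("no route to " + dest_ip)


def build_ethernet_frame(dest_ip: str, packet: bytes):
    """Pair the next hop's MAC address (the Ethernet header) with the packet."""
    mac = MAC_TABLE[next_hop(dest_ip)]
    return (mac, packet)


print(build_ethernet_frame("10.1.2.3", b"packet"))
```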
[0032] Next, the units realized by the MCU 34 are described. The client connection control unit 43 (corresponding to part of the communication means) receives from the client device 50 a command instructing the driving of a device built into the server device itself or into the camera device 10, and supplies it to the camera device drive control unit 41 or the server device drive control unit 42. The server device drive control unit 42 then supplies a control signal corresponding to the command to the devices connected to the general-purpose input/output port 48 (the lamp 91, the alarm 92, and the fan 93).
[0033] The camera device drive control unit 41 (corresponding to the first device drive control means and the second device drive control means), for its part, supplies the camera device 10 with a control signal corresponding to the command supplied to it. When a device built into the camera device 10 (the audio IC 16 or the JPEG engine 24) is driven according to that control signal, the sound data or compressed image data obtained as a result of the driving is transmitted from the camera device 10 to the server device 30. The audio data transmission control unit 44 and the video data transmission control unit 45 of the server device 30 transmit that data to the client device 50 via the client connection control unit 43.
[0034] Furthermore, when an electric signal is input via the general-purpose input/output port 48 from a sensor 94 or 95 that has detected an object entering its sensing field of view, the server event notification control unit 46 (corresponding to the event notification means) transmits event data indicating that the sensor has detected an object to the client device 50 via the client connection control unit 43.
[0035] The processing of each of these units is now described in more detail. The client connection control unit 43 establishes a connection with a client device 50 that accesses the server device. This connection is established by the following procedure according to TCP, one of the protocols belonging to the transport layer. First, a packet whose TCP header contains the SYN (Synchronize) code bit, requesting the establishment of a connection, is transmitted from the client device 50 to the client connection control unit 43. On receiving that packet, the client connection control unit 43 returns to the client device 50 a packet containing the ACK (Acknowledgement) and SYN code bits. Then, a packet containing the ACK code bit is transmitted from the client device 50 to the client connection control unit 43, whereby a data communication path used for both parties' subsequent data transmissions is formed. This data communication path is maintained until either the client device 50 or the client connection control unit 43 sends the other a packet containing the ACK and FIN (Finish) code bits. Note that, by performing time-division processing through multitasking, the client connection control unit 43 can form and maintain separate data communication paths with a plurality of client devices 50 simultaneously.
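The connection procedure just described can be sketched as a tiny state machine. This models only the code-bit exchange named in the text (SYN, then ACK and SYN, then ACK, and finally ACK and FIN); it is not real TCP networking code.

```python
# Illustrative sketch: the server side of the three-way handshake and the
# teardown, as seen by the client connection control unit.

class Connection:
    def __init__(self):
        self.state = "CLOSED"

    def receive(self, *code_bits: str) -> str:
        """Process code bits from the client; return the code bits replied."""
        bits = set(code_bits)
        if self.state == "CLOSED" and bits == {"SYN"}:
            self.state = "SYN_RECEIVED"
            return "SYN+ACK"            # server answers with ACK and SYN bits
        if self.state == "SYN_RECEIVED" and bits == {"ACK"}:
            self.state = "ESTABLISHED"  # data communication path is formed
            return ""
        if self.state == "ESTABLISHED" and bits == {"ACK", "FIN"}:
            self.state = "CLOSED"       # path maintained until ACK+FIN arrives
            return "ACK"
        return ""


conn = Connection()
conn.receive("SYN")
conn.receive("ACK")
print(conn.state)
```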
[0036] When a data communication path has been formed between the server device 30 and the client device 50, a command instructing the driving of a device built into the server device 30 or the camera device 10 is transmitted from that client device 50. There are five kinds of command: a lamp drive command instructing the driving of the lamp 91, an alarm drive command instructing the driving of the alarm 92, a fan drive command instructing the driving of the fan 93, an audio IC drive command instructing the driving of the audio IC 16 of the camera device 10, and a JPEG engine drive command instructing the driving of the JPEG engine 24. Which of these five kinds is transmitted is determined by the operation of the client device 50.
[0037] When a command is transmitted from the client device 50 to the server device 30, the client connection control unit 43 of the server device 30 determines which of the lamp drive command, the alarm drive command, the fan drive command, the audio IC drive command, and the JPEG engine drive command it is. A lamp drive command, alarm drive command, or fan drive command is supplied to the server device drive control unit 42, while an audio IC drive command or JPEG engine drive command is supplied to the camera device drive control unit 41.
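The classification just described amounts to a dispatch over the five command kinds. The string command names below are hypothetical stand-ins for the actual command encoding, which the text does not specify.

```python
# Illustrative sketch: route each of the five command kinds to the drive
# control unit responsible for it.

SERVER_COMMANDS = {"lamp_drive", "alarm_drive", "fan_drive"}
CAMERA_COMMANDS = {"audio_ic_drive", "jpeg_engine_drive"}


def dispatch(command: str) -> str:
    """Return which drive control unit should receive this command."""
    if command in SERVER_COMMANDS:
        return "server_device_drive_control"   # unit 42 -> GPIO devices
    if command in CAMERA_COMMANDS:
        return "camera_device_drive_control"   # unit 41 -> USB to camera
    raise ValueError("unknown command: " + command)


print(dispatch("fan_drive"))
print(dispatch("jpeg_engine_drive"))
```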
[0038] On receiving a command, the server device drive control unit 42 outputs a control signal corresponding to the command from the general-purpose input/output port 48, thereby turning on the lamp 91, sounding the alarm 92, or rotating the fan 93. The camera device drive control unit 41, on receiving a command, supplies a control signal generated in response to the command to the USB communication control unit 47, which transmits it from the USB connector 37 to the camera device 10. When the control signal reaches the camera device 10, its audio IC 16 or JPEG engine 24 is driven according to the control signal, and the sound data or compressed image data obtained as a result of the driving is stored in the SDRAM 13. The USB communication control unit 25 transmits the sound data or compressed image data stored in the SDRAM 13 from the USB connector 20 to the server device 30.
[0039] The sound data transmitted from the camera device 10 to the server device 30 is supplied to the audio data transmission control unit 44 via the USB communication control unit 47, and the compressed image data transmitted from the camera device 10 to the server device 30 is supplied to the video data transmission control unit 45 via the USB communication control unit 47. The audio data transmission control unit 44, on receiving sound data, and the video data transmission control unit 45, on receiving compressed image data, transmit to the client device 50, from the Ethernet (registered trademark) controller 40, a series of packets obtained by enclosing that data in segments of a predetermined byte length. Likewise, when an electric signal is input from a sensor 94 or 95 via the general-purpose input/output port 48, the server event notification control unit 46 generates event data and transmits to the client device 50, from the Ethernet (registered trademark) controller 40, a series of packets obtained by enclosing that event data in segments of a predetermined byte length.
[0040] Note that, in this embodiment, commands, sound data, compressed image data, and event data are each packetized according to a protocol specific to that data type. In other words, in this embodiment, the network-layer protocols used for exchanging commands, sound data, compressed image data, and event data differ from one another. This allows four kinds of data with different properties to be exchanged via a single data communication path.
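One hypothetical way to realize per-type packetization over a single data communication path is a type tag plus a length prefix on each packet. The wire format below is an illustrative assumption; the patent does not specify the actual protocol headers.

```python
# Illustrative sketch: tag each payload with its data type and length so that
# commands, sound data, compressed image data, and event data can share one
# data communication path and be separated again on arrival.

import struct

TYPE_TAGS = {"command": 0, "sound": 1, "image": 2, "event": 3}


def pack(kind: str, payload: bytes) -> bytes:
    """Frame one payload: 1-byte type tag + 4-byte big-endian length + body."""
    return struct.pack(">BI", TYPE_TAGS[kind], len(payload)) + payload


def unpack_all(stream: bytes):
    """Split a byte stream of concatenated frames back into (kind, payload)."""
    names = {v: k for k, v in TYPE_TAGS.items()}
    out, i = [], 0
    while i < len(stream):
        tag, length = struct.unpack_from(">BI", stream, i)
        i += 5
        out.append((names[tag], stream[i:i + length]))
        i += length
    return out


stream = pack("command", b"jpeg_engine_drive") + pack("sound", b"\x01\x02")
print(unpack_all(stream))
```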
[0041] Here, the startup processing that takes place after a start operation of the server device 30, up to the point where the camera device drive control unit 41, the server device drive control unit 42, the client connection control unit 43, the audio data transmission control unit 44, the video data transmission control unit 45, and the server event notification control unit 46 have been formed in the MCU 34, is described with reference to FIG. 5.
[0042] When the power switch 32 of the server device 30 is turned on (S100), the boot loader is started (S110), and the operating system kernel is started (S120). Subsequently, the OS file system is started (S130), and a command that begins launching the application (Start Server Application) is executed (S140).
[0043] Furthermore, after a command for initializing the system (Server System Initialize) is executed (S150), the flash memory 36 is searched for the program modules that realize each of the control units described above (S160). Next, a command for arbitrating the functions of the control units (Configuration) is executed (S170), and the program realizing the camera device drive control unit 41 is started (S180).
[0044] Once the camera device drive control unit 41 is running as a result of step S180, the program realizing the client connection control unit 43 is started (S190). The programs realizing the video data transmission control unit 45, the server device drive control unit 42, the server event notification control unit 46, and the audio data transmission control unit 44 are then started in turn (S200 to S230).
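The startup order of steps S100 to S230 can be written out as an ordered table. Assigning S200 through S230 to the individual modules in the order the text lists them is an assumption; the text gives only the range.

```python
# Illustrative sketch: the boot steps as an ordered list, showing that the
# camera device drive control program (41) starts before the client
# connection control program (43), which precedes the remaining modules.

BOOT_SEQUENCE = [
    ("S100", "power switch on"),
    ("S110", "boot loader"),
    ("S120", "OS kernel"),
    ("S130", "OS file system"),
    ("S140", "Start Server Application"),
    ("S150", "Server System Initialize"),
    ("S160", "search program modules in flash memory"),
    ("S170", "Configuration"),
    ("S180", "camera device drive control (41)"),
    ("S190", "client connection control (43)"),
    ("S200", "video data transmission control (45)"),   # order S200-S230 is
    ("S210", "server device drive control (42)"),       # an assumption based
    ("S220", "server event notification control (46)"), # on the listing order
    ("S230", "audio data transmission control (44)"),
]


def step_index(step_id: str) -> int:
    """Position of a step in the startup order."""
    return [s for s, _ in BOOT_SEQUENCE].index(step_id)


print(step_index("S180") < step_index("S190"))
```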
[0045] FIG. 6 is a diagram showing the schematic hardware configuration of the client device 50. As shown in FIG. 6, the client device 50 interconnects a CPU (Central Processing Unit) 51, a RAM (Random Access Memory) 52, a ROM (Read Only Memory) 53, a hard disk 54, a computer display 55, a mouse 56, a keyboard 57, and a network interface card 58 via, for example, a PCI system bus.
[0046] The CPU 51 controls the entire apparatus by executing the various programs stored in the ROM 53 and on the hard disk 54 while using the RAM 52 as a work area. The network interface card 58 performs data communication with other nodes according to a protocol belonging to the network interface layer of TCP/IP. Processing for data communication belonging to the layers above the network interface layer is realized in software by the CPU 51 executing the various programs.
[0047] The ROM 53 stores simple programs such as an IPL (Initial Program Loader). The hard disk 54 stores a communication control program 54a, a command control program 54b, an event presentation program 54c, and a moving image data generation program 54d.
[0048] The communication control program 54a handles the processing belonging to the layers above the network interface layer. The command control program 54b handles the processing of transmitting the various commands (lamp drive command, alarm drive command, fan drive command, audio IC drive command, and JPEG engine drive command) designated through operation of the mouse 56 and the keyboard 57. The event presentation program 54c handles the processing of displaying on the computer display 55 a message indicating that event data has been received from the server device 30. The moving image data generation program 54d handles the processing of converting the series of compressed image data transmitted from the server device 30 in response to a JPEG engine drive command into MPEG (Moving Picture Experts Group) moving image data.
[0049] According to the embodiment described above, a user of a client device 50 can access the server device 30 from his or her client device 50, form a data communication path with the server device 30, and, by transmitting a command instructing the driving of the audio IC 16 or the JPEG engine 24 of the camera device 10, receive delivery of sound data of sound collected within the monitored area of the camera device 10, or of compressed image data of images captured of that monitored area. The commands instructing the driving of the audio IC 16 or the JPEG engine 24, the sound data and compressed image data generated in response to those commands, and the event data are all exchanged via the single data communication path formed at the outset. Therefore, remote operation of the camera device 10 via the server device 30, together with acquisition of the data obtained by that operation, can be realized without individually securing multiple data communication paths between the server device 30 and the client device 50. Moreover, only one port needs to be secured as the transfer destination for these several kinds of data. Furthermore, since only one port number need be designated for the port used in exchanging data between the server device 30 and the client device 50, it can easily be changed later. Thus, even if the port number leaks to a malicious party, subsequent data exchange can continue without interruption after changing to a new port number.
[0050] In the above embodiment, the ASIC 19 built into the camera device 10 is customized to function as the JPEG engine 24. The camera device 10 compresses image data, which is bitmap data of the image within the viewing angle of its wide-angle lens 11, with the JPEG engine 24 before transmitting it to the server device 30. This eliminates the need to equip the server device 30 with a module for image data compression, reducing its processing load. In addition, because compressed image data is smaller than uncompressed bitmap data, the traffic on the USB cable 80 is reduced.
[0051] In the above embodiment, the server device 30 is equipped with the video output connector 17 and the audio output connector 18, and if a display and headphones are connected to these connectors, the captured images and collected sound can also be checked immediately. It is therefore possible to station dedicated monitoring personnel at the installation site of the server device 30 and have them perform direct monitoring via a display and headphones connected to the server device 30.
[0052] In the above embodiment, the server event notification control unit 46 is realized in the MCU 34 of the server device 30. When a sensor 94 or 95 detects that an object has entered its sensing field of view, the server event notification control unit 46 generates event data and transmits it to the client device 50. The client device 50, for its part, is equipped with the event presentation program 54c, so that when event data is received, a message indicating this is immediately displayed on the computer display 55. The user can therefore confirm that a sensor 94 or 95 has detected an object entering its sensing field of view before transmitting to the server device 30 a command instructing the generation of sound data or compressed image data.
[0053] (Other Embodiments)
The present invention can be modified in various ways.
In the above embodiment, the camera device 10 carries, in addition to a first device consisting of the wide-angle lens 11, the image sensor 12, the color conversion processing unit 23, and the JPEG engine 24, a second device consisting of the microphone 15 and the audio IC 16. The compressed image data obtained by driving the first device and the sound data obtained by operating the second device are transmitted from the server device 30 to the client device 50 via a single data communication path. Alternatively, an infrared camera may be mounted in place of the microphone 15 and the audio IC 16. In short, it suffices that the camera device 10 carry a device including the wide-angle lens 11 and a device not including the wide-angle lens 11, and that both sets of data obtained by operating these two devices under the control of the server device 30 be transmitted to the client device 50 via a single data communication path.
[0054] In the above embodiment, the color conversion processing unit 23, the JPEG engine 24, and the USB communication control unit 25 of the camera device 10 are formed by the ASIC 19. Alternatively, these units may be formed by other hardware, such as a DSP (Digital Signal Processor), or they may be realized in software by a program loaded into memory and a general-purpose arithmetic device, such as a CPU (Central Processing Unit), that executes the program.
[0055] In the above embodiment, the first device, consisting of the wide-angle lens 11, the image sensor 12, the color conversion processing unit 23, and the JPEG engine 24, and the second device, consisting of the microphone 15 and the audio IC 16, are both built into the same single housing 21. Alternatively, only one of the devices may be built into the housing 21 of the camera device 10, with the other device separated into another housing. The other device may also be provided in the server device 30 itself.
[0056] In the above embodiment, on receiving an audio IC drive command or a JPEG engine drive command from the client device 50, the server device 30 immediately causes the camera device 10 to begin generating sound data or compressed image data. In addition to such manually driven data generation, a schedule drive may be provided that begins generating sound data or compressed image data when a preset time is reached.
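Such a schedule drive could be sketched as a check of preset times against the current time. The representation of times and the command names below are purely illustrative assumptions.

```python
# Illustrative sketch of the proposed schedule drive: in addition to manual
# commands, reaching a preset time triggers generation of sound data or
# compressed image data. Times are simplified to integer seconds.

def due_commands(schedule, now: int):
    """Return the drive commands whose preset time has been reached."""
    return [cmd for t, cmd in schedule if t <= now]


schedule = [(100, "audio_ic_drive"), (200, "jpeg_engine_drive")]
print(due_commands(schedule, 150))
```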
[0057] In the above embodiment, the video encoder 21 of the camera device 10 is mounted as a module separate from the ASIC 19. Alternatively, a configuration may be adopted in which the ASIC 19 is given the function of decompressing compressed image data and the video encoder 21 is not mounted.

Claims

[1] A device drive control server device comprising:
communication means;
first device drive control means for driving a first device including a wide-angle lens in accordance with a command received by the communication means from a client device, and for causing the communication means to transmit first data obtained as a result of the driving to the client device; and
second device drive control means for driving a second device not including a wide-angle lens in accordance with a command received by the communication means from the client device, and for causing the communication means to transmit second data obtained as a result of the driving to the client device,
wherein the communication means exchanges the commands for driving both the first and second devices, the first data, and the second data with the client device via a single data communication path.
[2] The device drive control server device according to claim 1, further comprising:
a sensor that detects that an object has entered its sensing field of view; and
event notification means for, when the sensor detects the entry of an object, causing the communication means to transmit event data indicating the detection to the client device,
wherein the event notification means transmits the event data to the client device via the same data communication path as the command, the first data, and the second data.
[3] The device drive control server device according to claim 2, wherein the command, the first data, the second data, and the event data are each exchanged in accordance with its own application-layer protocol.
[4] A device drive control system comprising:
a first device including a wide-angle lens;
a second device not including a wide-angle lens; and
a device drive control server device comprising communication means, first device drive control means for driving the first device in accordance with a command received by the communication means from a client device and for causing the communication means to transmit first data obtained as a result of the driving to the client device, and second device drive control means for driving the second device in accordance with a command received by the communication means from the client device and for causing the communication means to transmit second data obtained as a result of the driving to the client device,
wherein the communication means of the device drive control server device exchanges the commands for driving both the first and second devices, the first data, and the second data with the client device via a single data communication path.
[5] The device drive control system according to claim 4, wherein the first device comprises:
a light receiving surface on which a plurality of light receiving elements are arranged; and
image processing means for generating image data representing an image formed on the light receiving surface through the wide-angle lens, and for acquiring compressed image data as the first data by subjecting the generated image data to processing in accordance with a predetermined data compression scheme.
[6] The device drive control system according to claim 5, wherein the second device comprises:
a microphone that picks up sound; and
sound processing means for acquiring sound data representing the sound picked up by the microphone as the second data,
the system further comprising input/output control means for receiving, via a cable, drive control signals that control the driving of the first device and the second device, and for outputting the compressed image data acquired by the image processing means and the sound data acquired by the sound processing means via the cable.
[7] A device drive control method comprising:
a first step in which a client device transmits, via a data communication path established with a server device that drives a first device including a wide-angle lens and a second device not including a wide-angle lens, a command addressed to the server device;
a second step in which the server device transmits, to the first device and the second device, drive control signals corresponding to the command received from the client device;
a third step in which the first device drives in accordance with the control signal received from the server device and transmits first data obtained as a result of the driving to the server device;
a fourth step in which the second device drives in accordance with the control signal received from the server device and transmits second data obtained as a result of the driving to the server device; and
a fifth step in which the server device transmits the first data received from the first device and the second data received from the second device to the client device via the same data communication path as that used for transmitting the command in the first step.
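The five-step exchange in the method claim — one command in, two device data streams back over the same path — can be sketched in Python. This is an illustrative simulation only: the names `Channel`, `Device`, and `server_step` are hypothetical, and an in-process queue pair stands in for the single data communication path.

```python
from queue import Queue

class Channel:
    """A single bidirectional data communication path (simulated)."""
    def __init__(self):
        self.to_server = Queue()
        self.to_client = Queue()

class Device:
    """A driven device that produces data in response to a control signal."""
    def __init__(self, name):
        self.name = name
    def drive(self, signal):
        return f"{self.name}-data({signal})"

def server_step(channel, dev1, dev2):
    # Step 2: receive the command and issue drive control signals;
    # Steps 3-4: each device drives and returns its data;
    # Step 5: both data items go back over the same path as the command.
    command = channel.to_server.get()
    data1 = dev1.drive(command)
    data2 = dev2.drive(command)
    channel.to_client.put(("first", data1))
    channel.to_client.put(("second", data2))

# Step 1: the client sends the command over the established path.
ch = Channel()
ch.to_server.put("drive-both")
server_step(ch, Device("camera"), Device("mic"))
first = ch.to_client.get()
second = ch.to_client.get()
print(first)   # ('first', 'camera-data(drive-both)')
print(second)  # ('second', 'mic-data(drive-both)')
```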
PCT/JP2007/072072 2006-11-16 2007-11-14 Device drive control server device, device drive control system, and device drive control method WO2008059859A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-309803 2006-11-16
JP2006309803A JP2008125032A (en) 2006-11-16 2006-11-16 Device drive control server apparatus, device drive control system, and device drive control method

Publications (1)

Publication Number Publication Date
WO2008059859A1 true WO2008059859A1 (en) 2008-05-22

Family

ID=39401665

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/072072 WO2008059859A1 (en) 2006-11-16 2007-11-14 Device drive control server device, device drive control system, and device drive control method

Country Status (2)

Country Link
JP (1) JP2008125032A (en)
WO (1) WO2008059859A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09261522A (en) * 1996-03-27 1997-10-03 Nippon Telegr & Teleph Corp <Ntt> Video image distribution method and system to obtain variable area
JPH10164394A (en) * 1996-12-02 1998-06-19 Hitachi Ltd Information transmission method, information recording method and device for executing the method
JPH11215143A (en) * 1998-01-27 1999-08-06 Canon Inc Device and method for data communication
JP2000115753A (en) * 1998-10-06 2000-04-21 Sony Corp Communication system and method, photographing device to be controlled and terminal
JP2003256958A (en) * 2002-03-01 2003-09-12 Sony Corp Device and system of remote monitoring
JP2003283677A (en) * 2002-03-20 2003-10-03 Fuji Photo Film Co Ltd Remote monitoring system


Also Published As

Publication number Publication date
JP2008125032A (en) 2008-05-29

Similar Documents

Publication Publication Date Title
US8605151B2 (en) Methods and systems for operating a video surveillance system
CN102209232A (en) Remote audio and video monitor system and method thereof
US20140082052A1 (en) Data redirection system and method for providing data redirection service
WO2014161402A2 (en) Distributed video conference method, system, terminal, and audio-video integrated device
JP6179179B2 (en) Information processing apparatus, information processing method, and program
JP2003338971A (en) Adapter unit for camera
WO2006129631A1 (en) Relay apparatus and communication system
TW201401808A (en) System and method for remotely monitoring video of IP cameras
JP2007053717A (en) Security monitoring system capable of outputting still image
JP6493236B2 (en) Communication method, communication program, and server
JP2004254031A (en) Control method for image processing system
JP2004135968A (en) Remote controllable endoscope controlling system
JP2005051590A (en) Camera device
JP2008109364A (en) Camera server system, processing method for data, and camera server
TW201210343A (en) Control device, camera system and program
WO2008059859A1 (en) Device drive control server device, device drive control system, and device drive control method
JP2008125034A (en) Motion picture file and image data processor
JP2005328280A (en) Data processor
US20150373073A1 (en) Image pickup apparatus, control method and recording medium
TWI543603B (en) Ip camera, communication method and communication system
US20020129154A1 (en) Router and control method of audio/video apparatus using the router
JP5328875B2 (en) Communication device and method for restoring power of communication device
US20090110053A1 (en) Embedded system and remote-control servo apparatus thereof
JP3162152U (en) Network camera
JPH11164035A (en) Remote monitoring system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07831802

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07831802

Country of ref document: EP

Kind code of ref document: A1