US20020126208A1 - Apparatus and method for camera control - Google Patents
- Publication number
- US20020126208A1 (application US 10/108,397)
- Authority
- US
- United States
- Prior art keywords
- camera
- window
- information
- control
- control information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/661—Transmitting camera control signals through networks, e.g. control via the Internet
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/14—Systems for two-way working
- H04N7/141—Systems for two-way working between two video terminals, e.g. videophone
- H04N7/142—Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/631—Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/65—Control of camera operation in relation to power supply
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
Definitions
- This invention relates to an apparatus and a method for managing a multi-window system operating with a workstation and a camera controlling device, and more specifically to an apparatus and method for a coordination control between a window and a camera, for displaying dynamic and static images captured by the camera on the window by interrelating a camera operation control with a window operation control.
- In a conventional apparatus for displaying images captured by a camera on a window of a display unit of a workstation, the window simply displays an image without interlocking information on an image displayed on the window with information on the environment in which the window exists, thereby giving an unnatural impression to a user.
- This invention has been developed to solve the problems of the prior art, and aims to provide an apparatus and method for a coordination control between a window and a camera by which a user can control a camera quite naturally and efficiently and can effectively see an image captured by the camera on a display unit of a computer.
- This invention also aims to provide an apparatus and method for a coordination control between each window of multi-windows and cameras capable of showing relative image positions of objects and their inter-relations without giving an unnatural impression.
- This invention also aims to provide an apparatus and method for a coordination control between a window and a camera which allow a user to freely select the range of an image displayed on the window by the user directly controlling the camera.
- An apparatus for a coordination control between a window and a camera comprises: a window control unit for controlling the window according to a window operation command issued by a user; an information conversion unit for converting information on a position of the window in a display and/or information on a size of the window into camera control information; and a camera control unit for controlling an on/off status (switching between on-state and off-state) of the camera, a position of the camera, a shooting distance of the camera, a direction (a shooting direction) of the camera, and/or a zoom setting of the camera, based on the camera control information.
- Another apparatus for a coordination control between a window and a camera comprises a camera control unit for controlling the camera according to a camera operation command issued by a user; an information conversion unit for converting information on an on/off status (switching between on-state and off-state) of the camera, a position of the camera, a shooting distance of the camera, a direction (a shooting direction) of the camera, and/or a zoom setting of the camera, into window control information; and a window control unit for controlling a position of the window and/or a size of the window based on the window control information.
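The conversion the information conversion unit performs can be sketched in code. The following is a minimal illustration assuming a linear mapping between a window's normalized on-screen position/size and the camera's pan, tilt, and zoom; every class, parameter name, and range here is a hypothetical choice for illustration, not the patent's actual conversion.

```python
from dataclasses import dataclass

@dataclass
class WindowInfo:
    x: float       # window-center position, normalized to 0..1 across the display
    y: float
    width: float   # window size, normalized to 0..1 of the display
    height: float

@dataclass
class CameraControl:
    pan_deg: float   # horizontal shooting direction
    tilt_deg: float  # vertical shooting direction
    zoom: float      # zoom factor

def window_to_camera(win: WindowInfo,
                     pan_range_deg: float = 60.0,
                     tilt_range_deg: float = 40.0,
                     max_zoom: float = 4.0) -> CameraControl:
    """Map window position to camera direction and window size to zoom."""
    # The display center (0.5, 0.5) maps to a pan/tilt of 0 degrees.
    pan = (win.x - 0.5) * pan_range_deg
    tilt = (0.5 - win.y) * tilt_range_deg
    # A larger window asks the camera for a closer (more zoomed-in) shot;
    # here zoom grows linearly with the window's relative side length.
    zoom = 1.0 + (max_zoom - 1.0) * (win.width * win.height) ** 0.5
    return CameraControl(pan, tilt, zoom)
```

A window dragged to the right edge of the display, for example, would steer the camera toward the positive end of the assumed pan range.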
- the window control unit may also control the window according to a window operation command issued by a user
- the information conversion unit may also convert information on the window supplied from the window control unit into camera control information
- the camera control unit may also control the camera based on the camera control information.
- the information conversion unit may convert information on a position of the window into the camera control information and the camera control unit may control at least one of a change of a position of the camera and a change of a direction of the camera based on the camera control information.
- the information conversion unit may convert information on a size of the window into the camera control information and the camera control unit may control a zoom of the camera based on the camera control information.
- the information conversion unit may also convert a relative position of the window to another window on the display into the camera control information, and the on/off status, the position, the direction, and/or the zoom setting of the camera, may be controlled based on the camera control information. Further, an on/off status, a position, a direction, and/or a zoom setting of another camera, which provides images to another window on the display, may be controlled based on the camera control information.
- the information conversion unit may also convert sound information supplied by a user into the camera control information, and the camera may be controlled based on the sound information.
- the information conversion unit may convert information on a zoom setting of the camera into the window control information, and the window control unit may control a size of the window based on the window control information.
- the information conversion unit may convert information on a direction of the camera into the window control information, and the window control unit may control a position of the window based on the window control information.
- the information conversion unit may convert information on an on/off status of the camera into the window control information, and the window control unit may control a position of the window based on the window control information.
- the information conversion unit may convert sound information provided by a user into the window control information, and a position of the window may be controlled based on the window control information.
- a method for a coordination control between a window screen and a camera includes a step of controlling the window according to a window operation command issued by a user; a step of converting information on the control of the window into camera control information; and a step of controlling the camera based on the camera control information.
- Another method for a coordination control between a window screen and a camera includes a step of controlling the camera according to a camera operation command issued by a user, a step of converting information on the control of the camera into window control information; and a step of controlling the window based on the window control information.
- Another method for a coordination control between a window screen and a camera includes a step of controlling the window according to a window operation command issued by a user; a step of converting information on the control of the window into camera control information; a step of controlling the camera based on the camera control information; a step of controlling the camera according to a camera operation command issued by a user; a step of converting information on the control of the camera into window control information; and a step of controlling the window based on the window control information.
- the window and/or the camera may be also controlled by sound information supplied by a user.
- FIG. 1 is a block diagram illustrating the global configuration of a first embodiment of this invention
- FIG. 2 is a flowchart illustrating processes of the first embodiment
- FIGS. 3A and 3B show a control of a camera according to a position of a window
- FIGS. 4A and 4B show a control of a camera according to a size of a window
- FIG. 5 is a block diagram illustrating the global configuration of a second embodiment of this invention.
- FIG. 6 is a flowchart illustrating processes of the second embodiment
- FIG. 7 is a block diagram illustrating the global configuration of a third embodiment of this invention.
- FIG. 8 is a flowchart illustrating processes of the third embodiment
- FIGS. 9A through 9G show a control of positions of a camera according to positions of a window
- FIG. 10 is a block diagram illustrating the global configuration of a fourth embodiment of this invention.
- FIG. 11 is a flowchart illustrating processes of the fourth embodiment
- FIG. 12 is an explanatory diagram illustrating user images captured by a plurality of cameras
- FIG. 13 is a block diagram illustrating the global configuration of a fifth embodiment of this invention.
- FIG. 14 is a flowchart illustrating processes of the fifth embodiment
- FIG. 15 is an explanatory diagram illustrating the configuration of a sixth embodiment of this invention.
- FIG. 16 is an explanatory diagram illustrating operations of the sixth embodiment
- FIG. 17 is a flowchart illustrating processes of the sixth embodiment
- FIGS. 18A and 18B are explanatory diagrams illustrating a seventh embodiment of this invention.
- FIGS. 19A and 19B are explanatory diagrams illustrating an eighth embodiment of this invention.
- FIG. 20 is an explanatory diagram illustrating a ninth embodiment of this invention.
- FIG. 21 is a flowchart illustrating processes of the ninth embodiment.
- FIG. 1 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a first embodiment of this invention.
- the apparatus displays images captured by the camera 13 in a window 19 of a multi-window screen on the display device 14 .
- information (data) on a position on a display device 14 and information on a size of the window 19 are used for determining an on/off status (on-state or off-state), a position, a direction, and a zoom of the camera.
- numerals 11 , 12 , and 13 denote a viewed object such as a person, a user operating the terminal or the window, and the camera, respectively.
- the camera 13 comprises a camera control device 16 , which controls the on/off status, the position, the direction, and the zoom of the camera 13 .
- the display device 14 displays the viewed object 11 captured by the camera 13 in the window 19 .
- A display control device 15 of this embodiment displays images captured by the camera 13 in the window 19 using information (or data) on the position and the size of the window provided from a window control unit 18 .
- An information conversion unit 17 of this embodiment converts information on the position and the size of the window provided from a window control unit 18 into camera control information, including information on the on/off status, the position, the direction, and the zoom setting of the camera.
- the window control unit 18 controls the window 19 or the multi-windows on the display device 14 according to a window operation command issued by the user 12 .
- FIG. 2 is a flowchart illustrating processes of the first embodiment.
- a computer waits for an input into the window control unit 18 from the user 12 (Step S 1 ), determines whether or not the input is the window operation command (Step S 2 ), and performs another process according to the input when the input is not the window operation command (Step S 3 ).
- the window control unit 18 sends the information on the position of the window 19 and the information on the size of the window 19 to the display control device 15 , revises (moves, resizes, etc.) the window 19 , and provides the information on the position and the size of the window to the information conversion unit 17 .
- the information conversion unit 17 converts the information on the position and the size of the window 19 into camera control information including the information on the position, the direction, and the zoom setting of the camera 13 , and supplies the camera control information to the camera control device 16 (Step S 4 ).
- the camera control device 16 controls the camera 13 based on the camera control information (Step S 5 ), and in parallel with the above processes, the display control device 15 continues displaying the image captured by the camera 13 on the display device 14 , based on the information on the position and the size of the window 19 sent from the window control unit 18 .
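Steps S1 through S5 above can be sketched as a simple event loop; the controllers are passed in as plain callables and driven by a finite event list so the flow is testable, and all names here are illustrative assumptions rather than the patent's implementation.

```python
def coordination_loop(events, window_ctrl, info_converter, camera_ctrl):
    """Steps S1-S5 of the first embodiment, driven by a finite event list.

    Each event is a dict; only events whose kind is 'window' are treated
    as window operation commands (Step S2), and any other input falls
    through to the 'another process' branch (Step S3).
    """
    log = []
    for event in events:                          # Step S1: wait for an input
        if event.get("kind") != "window":         # Step S2: window command?
            log.append(("other", event))          # Step S3: other processing
            continue
        win_info = window_ctrl(event)             # revise (move/resize) the window
        cam_info = info_converter(win_info)       # Step S4: convert to camera info
        camera_ctrl(cam_info)                     # Step S5: drive the camera
        log.append(("camera", cam_info))
    return log
```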
- FIG. 3A is a first one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera direction is unrelated to an on-screen window position.
- FIG. 3B is a second one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera direction is interrelated with an on-screen window position.
- FIG. 4A is a first one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera zoom setting is unrelated to an on-screen window size.
- FIG. 4B is a second one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera zoom setting is interrelated with an on-screen window size.
- This embodiment may have not only the functions shown in FIG. 3B and FIG. 4B but also the functions shown in FIG. 3A and FIG. 4A.
- FIG. 5 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a second embodiment of this invention. As shown in FIG. 5, the position and the size on the display device 14 of the window 19 are controlled using the camera information, including information on the on/off status, the position, the direction, and the zoom setting of the camera 13 .
- the camera 13 comprises a camera control device 26 which controls the on/off status, the position, the direction, and the zoom setting of the camera 13 based on a camera operation control command provided by the user 12 .
- the camera 13 captures the image of the object 11 and the display control device 15 displays the image in the window 19 of the display device 14 using the window information, including information on the position and the size of the window 19 .
- An information conversion unit 27 converts camera information including information on the on/off status, the position, the direction, and the zoom setting of the camera 13 supplied from the camera control device 26 into window control information, including information on the position and size of the window 19 on the display device 14 .
- a window control unit 28 controls window 19 on the display device 14 based on the window control information supplied from the information conversion unit 27 .
- FIG. 6 is a flowchart illustrating processes of the second embodiment.
- a computer (workstation) waits for an input to the camera control device 26 from the user 12 (Step S 11 ). If the input is the camera operation command, the camera operation command is sent to the camera control device 26 and the camera control device 26 controls the camera 13 according to the camera operation command. Then, the camera control device outputs camera information on the on/off status, position, direction, and zoom setting of the camera 13 to the information conversion unit 27 , and the information conversion unit 27 converts the camera information into window control information, which is sent to the window control unit 28 (Step S 12 ). The window control unit 28 controls the window 19 based on the window control information (Step S 13 ).
- the display control device 15 controls the display device 14 and displays the image captured by the camera 13 on a position designated by the window control unit 28 with a specified window size.
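The reverse conversion of the second embodiment, turning camera information into window control information, can be sketched the same way; the ranges and the linear mapping are hypothetical choices for illustration and are not taken from the patent.

```python
def camera_to_window(pan_deg, tilt_deg, zoom,
                     pan_range_deg=60.0, tilt_range_deg=40.0, max_zoom=4.0):
    """Map a camera direction to a normalized window position, and the
    camera's zoom setting to a window size, by inverting a linear
    window-to-camera mapping."""
    x = 0.5 + pan_deg / pan_range_deg   # pan 0 puts the window at mid-screen
    y = 0.5 - tilt_deg / tilt_range_deg
    side = (zoom - 1.0) / (max_zoom - 1.0)  # more zoom -> larger window
    return {"x": x, "y": y, "width": side, "height": side}
```

Zooming the camera in would then enlarge the corresponding window, matching the interlocked behavior the embodiment describes.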
- Since the user 12 can directly operate the camera 13 and the corresponding window 19 is controlled according to the camera information, the user 12 can select the viewed object by observing the window 19 .
- the user 12 can easily get the forward view, side view and rear view of the object 11 , and can easily move the camera 13 from the present position to a position far from the present position to view a new object.
- FIG. 7 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a third embodiment of this invention.
- the third embodiment is a combination of the first embodiment and the second embodiment, and operates in both directions between a camera operation control and a window operation control.
- a camera control device 36 controls the on/off status, the position, the direction, and the zoom setting of the camera 13 according to the camera operation command provided by the user 12 , and the camera control information from an information conversion unit 37 .
- a window control unit 38 controls the position and the size of the window 19 according to the window operation command provided by the user 12 and the window control information from the information conversion unit 37 .
- the information conversion unit 37 converts the window information, including information on the position and the size of the window 19 , into the camera control information for sending to the camera control device 36 , and also converts the camera information including information on the on/off status, the position, the direction, and the zoom setting of the camera 13 into the window control information for sending to the window control unit 38 .
- FIG. 8 is a flowchart illustrating processes of the third embodiment.
- a computer waits for an input from the user 12 or the camera control device 36 (Step S 1 ), and determines the source of the input (Step S 1 ′).
- when the input is from the user 12 , the computer performs Steps S 2 through S 5 as described in the first embodiment.
- when the input is from the camera control device 36 , the computer performs Steps S 12 and S 13 as described in the second embodiment.
- the display control device 15 displays the image captured by the camera 13 on a portion of the display device 14 designated by the window control unit 38 with the specified window size.
- the camera 13 is controlled based on the window information, which can be controlled by the user 12 based on the window operation command, and/or the camera operation command input by the user 12 . Therefore, the window information and the camera information are interrelated for displaying the image in the window 19 , and the user 12 can get an effective image from the display device 14 . Further, the user can visualize the interrelations and relative positions between objects shown in different windows on the same display device 14 and can get a natural impression. Especially, when the system is used in a teleconference, the participant can face another member who the participant wants to talk to on the display device, and their faces can be naturally directed toward each other on the display device.
- the window 19 can be regarded as a user interface for controlling the camera 13 , and the user 12 can control the camera 13 quite naturally and effectively for presentation of an image. Further, the user can visualize the interrelations and relative positions between objects shown in different windows on the same display device 14 , and can get a natural impression.
- the user 12 can directly control the camera 13 and freely select the scope of the image displayed in the window 19 .
- the apparatus is effective in maintaining the relative on-screen sizes or on-screen positions of the objects presented in different windows on the same display device.
- FIGS. 9A through 9G show a control of a camera position according to the relative on-screen positions of windows (a control of the camera based on window information).
- the teleconference system presents only front views of the participants so that the user faces the participants, as shown in FIG. 9A.
- the relative on-screen window positions between the windows showing the user and a participant to be talked to, and the directions of the faces of the user and the participant, are important. For example, it gives an unnatural impression when the participants do not face each other on the display device, even though the windows are placed side by side on the display device, as shown in FIG. 9C and FIG. 9D.
- the faces of the participants should be captured so that the eyes of the participants meet on the display device, as shown in FIG. 9E.
- FIG. 9F shows an example presenting an unnatural impression, in which the eyes of the participants do not meet on the display device because the windows are not placed side by side on the display device.
- the faces of the participants should be captured and displayed so as to face each other, as shown in FIG. 9G.
- In a fourth embodiment of this invention, a system according to this invention for a coordination control between a window and a camera is used in a teleconference in which two people in different locations have a dialog.
- FIG. 10 is a block diagram illustrating a global configuration of the system. This system uses two computers (workstations) shown in FIG. 1 and each of the cameras of the workstations captures an image of the corresponding user.
- camera 13 A captures an image of user 12 A of workstation WS-A
- camera 13 B captures an image of user 12 B of workstation WS-B.
- the workstations WS-A and WS-B are connected to each other via a transmission path.
- the workstations WS-A and WS-B respectively comprise display devices 14 A and 14 B
- the display devices 14 A and 14 B respectively comprise a pair of windows 19 A-A and 19 A-B, and 19 B-A and 19 B-B.
- the window 19 A-A and the window 19 B-A display the image captured by the camera 13 A
- the window 19 A-B and the window 19 B-B display the image captured by the camera 13 B.
- windows 19 A-A and 19 B-A show the same image of user 12 A
- windows 19 A-B and 19 B-B show the same image of user 12 B.
- both of the windows 19 A-A and 19 A-B in the display device 14 A and windows 19 B-A and 19 B-B in the display device 14 B must be positioned in the same relationship to each other.
- the corresponding one of camera control devices 16 A or 16 B changes the camera position of the corresponding one of the cameras 13 A or 13 B, and the other one of the window control units 18 A or 18 B also changes the window position of the other one of the pairs of windows.
- FIG. 11 is a flowchart illustrating processes of the fourth embodiment, and described below with reference to FIG. 11 are processes of the fourth embodiment.
- Steps with a large letter “A” after a numerical step number indicate those for workstation WS-A, and steps with a large letter “B” after a numerical step number indicate those for workstation WS-B, where the same numerical step numbers indicate basically identical processes.
- Each of the workstations WS-A and WS-B stands by for an input to the corresponding window control units from the corresponding users or from the other workstation (Steps S 1 A and S 1 B), and determines whether or not the input is the window operation command (Steps S 2 A and S 2 B).
- the workstations WS-A and WS-B perform another process according to the input (Steps S 3 A and S 3 B).
- the corresponding one of the window control units 18 A and 18 B transfers the window operation command to the other one of the window control units 18 A and 18 B, thereby sending window control information on an on-screen window position and an on-screen window size to both of the display control devices 15 A and 15 B, for a strictly interlocked control of the workstations WS-A and WS-B.
- Each of the window control units 18 A and 18 B revises (moves, resizes, etc.) the corresponding pairs of the windows 19 A-A and 19 A-B and windows 19 B-A and 19 B-B, and provides to the corresponding one of the information conversion units 17 A and 17 B the window control information on an on-screen window position and an on-screen window size, respectively.
- Each of the information conversion units 17 A and 17 B converts the window control information into camera control information on a camera activation and deactivation (on/off status of the camera), a camera position, a camera direction, a camera zoom setting and so forth, and supplies them to the corresponding one of the camera control devices 16 A and 16 B (Steps S 4 A and S 4 B).
- Each of the camera control devices 16 A and 16 B controls the corresponding one of the cameras 13 A and 13 B by moving its position by the corresponding one of angles α and β, based on the camera control information calculated in Steps S 4 ′A and S 4 ′B, respectively (Steps S 5 A and S 5 B).
- each of the display control devices 15 A and 15 B continues having the corresponding one of the display devices 14 A and 14 B display each image captured by either one of the cameras 13 A and 13 B, based on the window control information.
- information conversion unit 17 A calculates an angle α, which is defined as the angle between a straight line drawn from the center of window 19 A-A to the center of window 19 A-B and a horizontal line through the center-line of displays 14 , and camera control device 16 A moves the camera position of camera 13 A by α.
- window control unit 18 B moves the window position of window 19 B-A
- information conversion unit 17 B calculates an angle β, which is defined as the angle between a straight line drawn from the center of window 19 B-A to the center of window 19 B-B and a horizontal line through the center-line of displays 14 , and camera control device 16 B moves the camera position of camera 13 B by β.
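The angle between the two window centers and the display's horizontal, which each workstation uses above to reposition its camera, can be computed with atan2; the screen-coordinate convention (y grows downward) is an assumption for illustration.

```python
import math

def window_angle_deg(center_a, center_b):
    """Angle, in degrees, between the line drawn from window A's center
    to window B's center and the horizontal axis of the display.

    Screen y typically grows downward, so dy is negated to obtain the
    conventional counter-clockwise-positive angle.
    """
    dx = center_b[0] - center_a[0]
    dy = center_a[1] - center_b[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0
```

With windows side by side the angle is near 0 or 180 degrees; stacking one window above the other pushes it toward 90 or 270 degrees, changing the requested camera position accordingly.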
- FIG. 12 is an explanatory diagram illustrating user images captured by a plurality of cameras.
- a plurality of cameras capture images of the faces of the users from various directions. In this case, although the cameras cannot always catch the ideal direction of the faces of the users, it is not necessary to physically move cameras or to control manipulators.
- FIG. 13 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a fifth embodiment of this invention.
- FIG. 13 shows a configuration allowing two people to have a dialog, each person having a computer (workstation) as shown in FIG. 1, and a plurality of cameras to capture his images.
- workstation WS-A has eight cameras connected and workstation WS-B has six cameras connected for capturing respective user's images.
- Group of cameras 13 A 0 through 13 A 7 and group of cameras 13 B 0 through 13 B 5 capture images of users 12 A and 12 B, respectively, and the respective workstations WS-A and WS-B are connected to each other via a transmission path.
- the workstations WS-A and WS-B respectively comprise display devices 14 A and 14 B, and have a pair of windows 19 A-A and 19 A-B and a pair of windows 19 B-A and 19 B-B displaying images respectively captured by the group of cameras 13 A 0 through 13 A 7 and the group of cameras 13 B 0 through 13 B 5 .
- FIG. 14 is a flowchart illustrating processes of this embodiment. Steps with a large letter “A” after a numerical step number indicate those for workstation WS-A and steps with a large letter “B” after a numerical step number indicate those for workstation WS-B, where the same numerical step numbers indicate basically identical processes.
- Each of workstations WS-A and WS-B waits for an input from the users 12 A and 12 B or from the other workstations WS-A and WS-B (Steps S 1 A and S 1 B), and determines whether or not the input is a window operation command (Steps S 2 A and S 2 B).
- the workstations WS-A and WS-B perform another process according to the input (Steps S 3 A and S 3 B).
- the corresponding window control unit 18 changes the position of the corresponding window on the display device according to the window operation command (Steps S 4 A and S 4 B).
- the information conversion units 17 of workstations WS-A and WS-B send information on the switching of cameras to the camera control devices 16 , respectively.
- the camera control devices 16 A and 16 B activate (or select) one at f(θ) and one at f(φ) of the group of cameras 13 A (comprising 13 A 0 through 13 A 7 ) as the cameras capturing images of user 12 A to be displayed on windows 19 A-A and 19 B-A, respectively (Steps S 6 A and S 6 B), and all other cameras around the user of workstation WS-A are deactivated.
- the camera control devices 16 A and 16 B activate (or select) one at g(θ) and one at g(φ) of the group of cameras 13 B (comprising 13 B 0 through 13 B 5 ) as the cameras capturing images of user 12 B to be displayed on windows 19 A-B and 19 B-B, respectively (Steps S 7 A and S 7 B), and all other cameras around the user of workstation WS-B are deactivated.
- the selection of cameras is performed based on the functions “f” and “g”, which are determined according to the window angles θ and φ.
- the functions “f” and “g” provide the number (index) of the camera which will be selected as the camera sending images to the window.
- m is a constant indicating the number of cameras in the group of cameras 13 A.
- the constant m is eight in workstation WS-A and is six in workstation WS-B. It is assumed here that cameras 13 A 0 through 13 A 7 in the group of cameras 13 A are positioned radially and equidistant around a circle centered on the user of the workstation WS-A. It is also assumed that cameras 13 B 0 through 13 B 5 in the group of cameras 13 B are positioned radially and equidistant around a circle centered on the user of the workstation WS-B.
- the cameras 13 A 0 and 13 B 0 are positioned on the positive direction of the X axis, which is a horizontal line through the center-line of displays 14 .
- cameras do not have to be positioned radially and equidistant around a circle centered on the user, but other positioning of the cameras can be used in this invention.
- the camera control devices of workstations WS-A and WS-B respectively select a camera closest to a ⁇ direction and a camera closest to a ⁇ direction for capturing the image of the user of the workstation WS-A using the function “f”, and also select a camera closest to a ⁇ direction and a camera closest to a ⁇ direction for capturing the image of the user of the workstation WS-B using the function “g”.
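Expression (1) itself is not reproduced in this excerpt, but for m cameras positioned radially and equidistantly around the user with camera 0 on the positive X axis (as stated for the groups of cameras 13 A and 13 B), the nearest-camera selection that the functions “f” and “g” perform can be sketched as follows. The function name and the rounding formula are assumptions, not the patent's literal expression (1):

```python
import math

def nearest_camera(angle, m):
    """Index of the camera whose direction is closest to the given
    window angle, for m cameras placed radially and equidistantly
    around the user, camera 0 lying on the positive X axis.
    An assumed reading of expression (1), not the literal formula."""
    step = 2 * math.pi / m        # angular spacing between adjacent cameras
    return round(angle / step) % m

# With m = 8 (the group of cameras 13A), an angle of -45 degrees
# selects camera 7, consistent with the selection of camera 13A7
# from a position to the user's lower-right in this embodiment.
```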
- the camera control device 16 A switches a camera activation as follows:
- the information conversion unit 17 A calculates an angle θ, which is defined as the angle between a straight line drawn from the center of window 19 A-A to the center of window 19 A-B and the horizontal line through the center-line of displays 14 , and also calculates f(θ) and g(θ) according to expression (1).
- camera control device 16 A of workstation WS-A switches a camera activation of the group of cameras 13 A and selects a camera at f( ⁇ ).
- the camera control device 16 A of workstation WS-A selects the camera 13 A 7 , which captures an image of user 12 A from a position to his lower-right.
- the camera control device 16 A of workstation WS-A also switches a camera activation of the group of cameras 13 B and selects a camera at g( ⁇ ). Accordingly, the camera control device 16 A of workstation WS-A selects the camera 13 B 2 , which captures an image of user 12 B from a position to his upper-left.
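The angle that drives this camera selection can be computed from the two window centers with a standard arctangent. A minimal sketch follows; the coordinate convention (y increasing upward) is an assumption, since the patent defines the angle only geometrically:

```python
import math

def window_angle(center_a, center_b):
    """Angle between the straight line drawn from the center of the
    window showing the local user (center_a) to the center of the
    window showing the remote user (center_b) and the horizontal
    center-line of the display. Centers are (x, y) pairs with y
    increasing upward (an assumed convention)."""
    dx = center_b[0] - center_a[0]
    dy = center_b[1] - center_a[1]
    return math.atan2(dy, dx)

# A remote window directly to the right of the local window gives an
# angle of 0; one diagonally up-right gives pi/4.
```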
- camera control device 16 B is not required to switch a camera activation of either group of cameras 13 A or group of cameras 13 B.
- the selection of cameras allows users 12 A and 12 B to face each other regardless of the window positions in the display devices 14 A and 14 B, thereby presenting a natural impression of the images.
- FIG. 15 is an explanatory diagram illustrating the relative positions of a display device and cameras according to a sixth embodiment of this invention.
- Each workstation of the participants includes the third embodiment shown in FIG. 7, and all workstations are connected by a transmission path.
- An observation camera 13 C, a right speech camera 13 R, a left speech camera 13 L, a right speech switch 113 R and a left speech switch 113 L are provided around the workstation of each participant (user).
- FIG. 16 is an explanatory diagram illustrating operations of the sixth embodiment.
- the observation camera 13 C captures an image of the user when the user is one of the observers.
- the right speech camera 13 R captures an image of the user from the right direction and the left speech camera 13 L captures an image of the user from the left direction when the user is one of the speakers.
- users A and B are the speakers and users C, D, and E are the observers of the discussion, as shown at the top-center of FIG. 16.
- One of the observers C, D, and E can become a speaker in two ways. He may switch his camera from the observation camera 13 C to either the right speech camera 13 R or the left speech camera 13 L by selecting a corresponding one of the right speech switch 113 R or the left speech switch 113 L. Alternatively, he may move the window showing his face into the speaker area, for example, by using a mouse device.
- these operations cause each window control unit 38 to swap an image of user A in the left speaker's window a with an image of user C in one of the observer's windows c in the display device 14 , as shown at the bottom-left of FIG. 16.
- an information conversion unit 37 converts camera control information on a camera activation and a camera deactivation into window control information on window positions, and the window control unit 38 swaps the window showing the speaker with the window showing the observer.
- the information conversion unit 37 converts window control information on window positions into camera control information for a camera activation and a camera deactivation, and the camera control device 36 switches the cameras accordingly.
- FIG. 17 is a flowchart illustrating processes of the sixth embodiment.
- the computer waits for an input from the user 12 , an input from the camera control device 36 , or an input from another computer (Step S 1 ), and determines whether or not the input is from the user 12 or from another computer (Step S 1 ′).
- the computer determines whether or not the input is the window operation command (Step S 2 ). If the input is not the window operation command, the computer performs another process corresponding to the input (Step S 3 ), and the process then goes back to step S 1 .
- the window control unit 38 sends the window operation data to the display control device 15 for revising (moving, resizing, etc.) the windows 19 .
- the window control unit 38 also sends the window information to the information conversion unit 37 , and the information conversion unit 37 converts the window information into the camera control information and sends it to the camera control device 36 .
- the camera control device 36 switches the cameras so that a camera corresponding to the position of the window showing the user is selected (Step S 5 ).
- at step S 1 ′, when the input is from the camera control device 36 , the information conversion unit 37 converts the data into window control information, and sends this information to the window control unit 38 .
- the window control unit 38 swaps the window on the opposite side from the speech switch selected by the user with the speaker's window (Step S 16 ). In this operation, when the user pushes the right speech switch 113 R, the right speech camera 13 R is selected and the left speaker's window a becomes his window, as shown in the left part of FIG. 16.
- the workstation transfers the window information, and the windows in the display devices of the other workstations are re-arranged so as to have the same arrangement as in the display device of the user who pushed the speech switch. Then, the process continues by returning to step S 1 .
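The two input paths of this embodiment (window operations arriving from the user or from another computer, and speech-switch presses arriving via the camera control device 36 ) can be summarized as a small dispatch function. The step structure follows FIG. 17; the source names and return labels are illustrative, not from the patent:

```python
def dispatch(source, is_window_command=False):
    """Route an input according to the flowchart of FIG. 17.
    source: "user", "remote" (another computer), or "camera"
    (the camera control device 36). Labels are illustrative."""
    if source in ("user", "remote"):          # Step S1'
        if is_window_command:                 # Step S2
            # convert window info to camera control info, switch cameras (S5)
            return "window->camera"
        return "other-process"                # Step S3
    # input from the camera control device: a speech switch was pressed,
    # so convert to window control info and swap windows (S16)
    return "camera->window"
```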
- In a seventh embodiment, a workstation controls an on-screen window size in relation to a camera zoom setting.
- the seventh embodiment is for keeping relative image sizes between objects by controlling a window size in relation to the zoom setting of a camera.
- This embodiment uses the apparatus of the second embodiment shown in FIG. 5, and the window control unit 28 enlarges or reduces the window size according to a scaling of the image set by the camera zoom.
- FIG. 18A is a first one of explanatory diagrams illustrating a pair of images displayed when the window size is unrelated to the camera zoom setting.
- FIG. 18B is a second one of explanatory diagrams illustrating a pair of images displayed when the window size is related to the camera zoom setting.
- the information conversion unit 27 converts camera information on respective camera zoom settings of cameras 13 A and 13 B supplied from the camera control device 26 , into window control information on window sizes. Then, the window control information is sent to the window control unit 28 to control the sizes of the windows on the display device 14 .
- the window size changes corresponding to the camera zoom setting, and the relative image sizes between objects can be correctly represented in the display device.
- when the window control unit 28 controls the window size independently of the camera zoom setting, even if the zoom setting of the camera is changed, the size of the corresponding window does not change and the image inside of the window is simply scaled up or down, thereby failing to provide the relative image sizes between images in the display device.
- as shown in FIG. 18A, when the camera 13 A captures a bigger object 11 A with a wide angle view (with low magnification), the bigger object 11 A and a smaller object 11 B captured by a camera 13 B are shown to be the same size in the display device 14 .
- when the window control unit 28 controls the window size corresponding to the camera zoom setting according to this invention, the relative sizes of the objects can be represented in the display device 14 .
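The coupling described for the seventh embodiment amounts to scaling the on-screen window in proportion to the camera's magnification. A minimal sketch, assuming a linear relation (the patent states the principle but gives no explicit formula):

```python
def window_size_for_zoom(base_size, magnification):
    """Scale an on-screen window with the camera's zoom magnification
    so that relative image sizes between objects are preserved.
    base_size is the (width, height) shown at magnification 1.0."""
    w, h = base_size
    return (round(w * magnification), round(h * magnification))

# Doubling the magnification doubles the window, so a small object
# shot close-up is not mistaken for a large object shot wide.
```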
- the eighth embodiment is for keeping a relative position between objects by controlling the window position in relation to the position and the direction of the camera.
- This embodiment uses the apparatus of the second embodiment shown in FIG. 5, and the window control unit 28 changes the positions of the windows according to the camera information on the position and the direction of the camera.
- FIG. 19A is a first one of explanatory diagrams illustrating a pair of images displayed when the window position is unrelated to the position and the direction of the camera.
- FIG. 19B is a second one of explanatory diagrams illustrating a pair of images displayed when the window position is related to the position and the direction of the camera.
- the information conversion unit 27 converts camera information on the camera position and the camera direction of the respective cameras 13 A and 13 B supplied from the camera control device 26 , into window control information on window positions. Then, the window control information is sent to the window control unit 28 to control the positions of the windows on the display device 14 .
- when the window control unit 28 controls the window position independently of the camera position and the camera direction, even if the direction or the position of the camera is changed, the relative position of the windows in the display device does not change, thereby failing to provide the relative positions between objects in the display device.
- when the window control unit 28 controls the window position corresponding to the camera position and the camera direction according to this invention, the relative position between the objects can be shown in the display device 14 .
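One way to realize this mapping is to offset the window from a reference point on the display in the direction the camera is pointing. The sketch below assumes a simple pan-angle-to-screen-offset relation that the patent does not spell out; the function name and the distance parameter are illustrative:

```python
import math

def window_position_for_pan(reference, pan_angle, distance=200):
    """Place a window so that its offset from a reference point on the
    display mirrors the camera's shooting direction (pan_angle in
    radians, 0 = straight ahead). Screen y grows downward, so an
    upward pan moves the window up (smaller y)."""
    rx, ry = reference
    return (rx + distance * math.cos(pan_angle),
            ry - distance * math.sin(pan_angle))
```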
- A ninth embodiment of this invention will be described below using FIGS. 20 and 21.
- This embodiment basically uses the apparatus of the sixth embodiment, whose configuration, operations and processes are shown in FIG. 15, FIG. 16 and FIG. 17, and the circumstances of the teleconference are the same.
- the user switches the observation camera 13 C, the right speech camera 13 R and the left speech camera 13 L, and swaps windows in a speaker area and an observer area manually, by either depressing one of the left speech switch 113 L and the right speech switch 113 R, or by dragging a window over another window.
- the ninth embodiment comprises a left microphone 113 L′ placed to the left of the user and a right microphone 113 R′ placed to the right of the user as substitutes for the left speech switch 113 L and the right speech switch 113 R.
- switching of cameras 13 L, 13 R, and 13 C and swapping the windows are performed by sound information from the right microphone 113 R′ and the left microphone 113 L′.
- Each of the participants in the teleconference has a workstation of the third embodiment, and cameras 13 L, 13 R, and 13 C and the microphones 113 L′ and 113 R′ are attached to the workstation.
- the right microphone 113 R′ and the left microphone 113 L′ are connected to a microphone operation control unit (not shown), and all workstations are connected by a transmission path.
- Each user listens to a discussion between two speakers using a stereo headphone set that emits the voice of the right speaker displayed in the speaker area from its right side and the voice of the left speaker displayed in the speaker area from its left side.
- a headphone operation control unit (not shown) controls sounds emitted from the stereo headphone set using the window information.
- the sound source for the stereo headphone set is changed by the headphone operation control unit according to the swapping of the windows to match the sound with the voice of a new speaker, when windows are swapped between the speaker area and the observer area.
- An observer can become a speaker in two ways. He may switch his camera from the observation camera 13 C to either one of the right speech camera 13 R and the left speech camera 13 L by speaking toward either the right microphone 113 R′ or the left microphone 113 L′. Alternatively, he may drag his window into a speaker area, for example, by using a mouse device.
- FIG. 21 is a flowchart illustrating processes of the ninth embodiment.
- a computer waits for an input from the user, an input from the microphone operation control unit, or an input from another computer (Step S 1 ), then determines whether or not the input is from the user 12 or another computer (Step S 1 ′).
- the computer determines whether the input is a window operation command (Step S 2 ). If the input is not the window operation command, the computer performs another process corresponding to the input (Step S 3 ), and the process continues by returning to step S 1 .
- the window control unit 38 sends the window control information to the display control device 15 to swap images of users in the display device. Then, the window control unit 38 sends the information on the position of the windows to the information conversion unit 37 to provide the camera control information to the camera control device 36 .
- the camera control device 36 controls the cameras so that one of the right speech camera 13 R and the left speech camera 13 L captures the user's image based on the camera control information (Step S 5 ).
- the headphone operation control unit swaps the sound emitted from either the right or left side of a headphone set with the sound from the user who wants to be a speaker (Step S 5 ′).
- the computer transfers window control operation data on an on-screen window position to the other workstations (Step S 14 ). The process continues by returning to step S 1 .
- in Step S 15 , the window control unit 38 swaps a window, which is on the opposite side in the speaker area from the microphone to which a sound is input first, for this user's window (Step S 15 ). After Step S 15 , the process goes to Step S 5 .
- since the camera is controlled based on the window information, the user can control the window to obtain an image that he wishes to see without directly controlling the camera.
- since the window is controlled based on the camera control, the original relationship between objects can be shown in windows on the display device.
- this invention produces the following advantages:
- since a camera can be controlled according to information on the position of the window, which can be directly controlled by a user, the window can be regarded as an interface between the user and the camera, and the user can control the position and the direction of the camera by changing the window position while watching the image of the object.
- since a camera can be controlled according to information on the size of the window, which can be directly controlled by a user, the user can control the zoom setting of the camera by changing the window size and can get a desired field of view while watching the image of the object.
Abstract
An apparatus according to this invention for a coordination control between a window and a camera, for use in displaying an image captured by the camera in the window of a display means, comprises: a window control means for controlling the window according to a window operation command issued by a user; an information conversion means for converting window information supplied from the window control means into camera control information; and a camera control means for controlling the camera based on the camera control information supplied from the information conversion means. In this apparatus, the window information and the camera control information are interrelated for displaying the image.
Description
- This application is a divisional of application Ser. No. 09/008,567, filed Jan. 16, 1998, now pending, which is a continuation of Ser. No. 08/438,061, filed May 8, 1995, now abandoned.
- 1. Field of the Invention
- This invention relates to an apparatus and a method for managing a multi-window system operating with a workstation and a camera controlling device, and more specifically to an apparatus and method for a coordination control between a window and a camera for displaying dynamic and static images captured by the camera on the window by interrelating a camera operation control with a window operation control.
- 2. Background of the Related Arts
- In a conventional apparatus for displaying images captured by a camera on a window of a display unit of a workstation, the window simply displays an image without interlocking information on an image displayed on the window with information of the environment in which the window exists, thereby giving an unnatural impression to a user.
- Thus, techniques have been proposed for displaying appropriate image data input from a camera on a window in connection with a scrolling of an image inside of the window and a scaling-up of a window size by controlling an image capturing apparatus according to an instruction for changing the image in the window. (Refer to the Japanese Laid-open Patent Publication No. 3-217978.)
- However, such prior art simply controls an image capturing apparatus corresponding to an instruction for changing the image in a single window, and cannot control an interrelation between multiple windows (multi-windows) or the camera itself by an instruction from a user.
- Thus, the prior art has the following disadvantages.
- (1) Since the relative positions of images cannot be obtained, users receive an unnatural impression.
- (2) When participants are displayed in multi-windows as in a teleconference, since images of faces of the participants are controlled individually in the conventional apparatus, each of the participants cannot control the images so that they can face each other on the display device, and an appropriate display of relationships of the participants cannot be obtained.
- (3) Since the relative position of the windows does not change in connection with the change of positions of cameras, the users cannot grasp the relative positions of the objects seen in the windows.
- (4) Since the user cannot control the camera directly, the user cannot easily control the camera in a three-dimensional manner so as to display a forward view, a side view, or a rear view of the object, and cannot easily move the camera from the present position to a different position.
- This invention has been developed to solve the problems of prior arts, and aims to provide an apparatus and method for a coordination control between a window and a camera by which a user can control a camera quite naturally and efficiently and can effectively see an image captured by the camera on a display unit of a computer.
- This invention also aims to provide an apparatus and method for a coordination control between each window of multi-windows and cameras capable of showing relative image positions of objects and their inter-relations without giving an unnatural impression.
- This invention also aims to provide an apparatus and method for a coordination control between a window and a camera which allow a user to freely select the range of an image displayed on the window by the user directly controlling the camera.
- An apparatus for a coordination control between a window and a camera according to this invention comprises: a window control unit for controlling the window according to a window operation command issued by a user; an information conversion unit for converting information on a position of the window in a display and/or information on a size of the window into camera control information; and a camera control unit for controlling an on/off status (switching between on-state and off-state) of the camera, a position of the camera, a shooting distance of the camera, a direction (a shooting direction) of the camera, and/or a zoom setting of the camera, based on the camera control information.
- Another apparatus for a coordination control between a window and a camera according to this invention comprises a camera control unit for controlling the camera according to a camera operation command issued by a user; an information conversion unit for converting information on an on/off status (switching between on-state and off-state) of the camera, a position of the camera, a shooting distance of the camera, a direction (a shooting direction) of the camera, and/or a zoom setting of the camera, into window control information; and a window control unit for controlling a position of the window and/or a size of the window based on the window control information.
- The window control unit may also control the window according to a window operation command issued by a user, the information conversion unit may also convert information on the window supplied from the window control unit into camera control information, and the camera control unit may also control the camera based on the camera control information.
- The information conversion unit may convert information on a position of the window into the camera control information and the camera control unit may control at least one of a change of a position of the camera and a change of a direction of the camera based on the camera control information.
- The information conversion unit may convert information on a size of the window into the camera control information and the camera control unit may control a zoom of the camera based on the camera control information.
- The information conversion unit may also convert a relative position of the window to another window on the display into the camera control information, and the on/off status, the position, the direction, and/or the zoom setting of the camera, may be controlled based on the camera control information. Further, an on/off status, a position, a direction, and/or a zoom setting of another camera, which provides images to another window on the display, may be controlled based on the camera control information.
- The information conversion unit may also convert sound information supplied by a user into the camera control information, and the camera may be controlled based on the sound information.
- The information conversion unit may convert information on a zoom setting of the camera into the window control information, and the window control unit may control a size of the window based on the window control information.
- The information conversion unit may convert information on a direction of the camera into the window control information, and the window control unit may control a position of the window based on the window control information.
- The information conversion unit may convert information on an on/off status of the camera into the window control information, and the window control unit may control a position of the window based on the window control information.
- The information conversion unit may convert sound information provided by a user into the window control information, and a position of the window may be controlled based on the window control information.
- A method for a coordination control between a window screen and a camera according to this invention includes a step of controlling the window according to a window operation command issued by a user; a step of converting information on the control of the window into camera control information; and a step of controlling the camera based on the camera control information.
- Another method for a coordination control between a window screen and a camera according to this invention includes a step of controlling the camera according to a camera operation command issued by a user, a step of converting information on the control of the camera into window control information; and a step of controlling the window based on the window control information.
- Another method for a coordination control between a window screen and a camera according to this invention includes a step of controlling the window according to a window operation command issued by a user; a step of converting information on the control of the window into camera control information; a step of controlling the camera based on the camera control information; a step of controlling the camera according to a camera operation command issued by a user; a step of converting information on the control of the camera into window control information; and a step of controlling the window based on the window control information.
- In the methods, the window and/or the camera may be also controlled by sound information supplied by a user.
- One of ordinary skill in the art may easily understand additional features and objects of this invention from the description of the preferred embodiments and the attached drawings.
- In the drawings:
- FIG. 1 is a block diagram illustrating the global configuration of a first embodiment of this invention;
- FIG. 2 is a flowchart illustrating processes of the first embodiment;
- FIGS. 3A and 3B show a control of a camera according to a position of a window;
- FIGS. 4A and 4B show a control of a camera according to a size of a window;
- FIG. 5 is a block diagram illustrating the global configuration of a second embodiment of this invention;
- FIG. 6 is a flowchart illustrating processes of the second embodiment;
- FIG. 7 is a block diagram illustrating the global configuration of a third embodiment of this invention;
- FIG. 8 is a flowchart illustrating processes of the third embodiment;
- FIGS. 9A through 9G show a control of positions of a camera according to positions of a window;
- FIG. 10 is a block diagram illustrating the global configuration of a fourth embodiment of this invention;
- FIG. 11 is a flowchart illustrating processes of the fourth embodiment;
- FIG. 12 is an explanatory diagram illustrating user images captured by a plurality of cameras;
- FIG. 13 is a block diagram illustrating the global configuration of a fifth embodiment of this invention;
- FIG. 14 is a flowchart illustrating processes of the fifth embodiment;
- FIG. 15 is an explanatory diagram illustrating the configuration of a sixth embodiment of this invention;
- FIG. 16 is an explanatory diagram illustrating operations of the sixth embodiment;
- FIG. 17 is a flowchart illustrating processes of the sixth embodiment;
- FIGS. 18A and 18B are explanatory diagrams illustrating a seventh embodiment of this invention;
- FIGS. 19A and 19B are explanatory diagrams illustrating an eighth embodiment of this invention;
- FIG. 20 is an explanatory diagram illustrating a ninth embodiment of this invention; and
- FIG. 21 is a flowchart illustrating processes of the ninth embodiment.
- Described below with reference to drawings are concrete embodiments of this invention. Parts shown in a drawing which are the same as those shown in another drawing have the same reference numbers, thereby saving the repetition of their explanations.
- First Embodiment
- FIG. 1 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a first embodiment of this invention. As shown in FIG. 1, the apparatus displays images captured by the
camera 13 in a window 19 of multiple window screens on the display device 14. In the apparatus, information (data) on a position on the display device 14 and information on a size of the window 19 are used for determining an on/off status (on-state or off-state), a position, a direction, and a zoom of the camera. In FIG. 1, numerals 11 and 12 denote the viewed object and the user, respectively. The camera 13 comprises a camera control device 16, which controls the on/off status, the position, the direction, and the zoom of the camera 13.
- The display device 14 displays the viewed object 11 captured by the camera 13 in the window 19. A display control device 15 of this embodiment displays images captured by the camera 13 in the window 19 using information (or data) on the position and the size of the window provided from a window control unit 18. An information conversion unit 17 of this embodiment converts information on the position and the size of the window provided from the window control unit 18 into camera control information, including information on the on/off status, the position, the direction, and the zoom setting of the camera.
- The window control unit 18 controls the window 19 or the multi-windows on the display device 14 according to a window operation command issued by the user 12.
- A computer (workstation) waits for an input into the
window control unit 18 from the user 12 (Step S1), determines whether or not the input is the window operation command (Step S2), and performs another process according to the input when the input is not the window operation command (Step S3). - When the input is the window operation command, the
window control unit 18 sends the information on the position of the window 19 and the information on the size of the window 19 to the display control device 15, revises (moves, resizes, etc.) the windows 19, and provides the information on the position and the size of the window to the information conversion unit 17.
- The information conversion unit 17 converts the information on the position and the size of the window 19 into camera control information, including the information on the position, the direction, and the zoom setting of the camera 13, and supplies the camera control information to the camera control device 16 (Step S4). The camera control device 16 controls the camera 13 based on the camera control information (Step S5), and in parallel with the above processes, the display control device 15 continues displaying the image captured by the camera 13 on the display device 14, based on the information on the position and the size of the window 19 sent from the window control unit 18.
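The conversion performed in Step S4 can be pictured as mapping the window's center to a camera direction and its size to a zoom setting. The mapping below is only a sketch: the display dimensions, the angle ranges, and the linear relations are all assumptions, since the patent describes the conversion without a concrete formula:

```python
def window_to_camera(win_x, win_y, win_w, win_h,
                     disp_w=1920, disp_h=1080,
                     max_pan=0.5, max_tilt=0.3):
    """Convert window position/size into camera control information.
    Returns (pan, tilt, zoom): pan and tilt in radians spanning
    [-max_pan, +max_pan] and [-max_tilt, +max_tilt] across the
    display, zoom as a magnification factor (wider window -> wider
    view). All ranges and relations are illustrative assumptions."""
    cx = win_x + win_w / 2.0
    cy = win_y + win_h / 2.0
    pan = max_pan * (2.0 * cx / disp_w - 1.0)
    tilt = max_tilt * (1.0 - 2.0 * cy / disp_h)  # screen y grows downward
    zoom = disp_w / win_w                         # quarter-width window -> 4x
    return pan, tilt, zoom
```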
- When the camera direction is controlled independently of the on-screen window position, since an image in the window moves as the window displaying the image captured by the camera is moved, the image shown on the display device does not change.
- FIG. 3B is a second one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera direction is interrelated with an on-screen window position.
- As shown in FIG. 3B, when the
camera control device 16 controls the camera direction corresponding to the on-screen window position, since the image in thewindow 19 changes corresponding to the on-screen window position, the user can chose the view on thedisplay device 14 and can get a natural impression as if the total image is fixed in the display device and the user is watching part of the image by moving a viewing window. - FIG. 4A is a first one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera zoom setting is unrelated to an on-screen window size.
- As shown in FIG. 4A, when the camera zoom setting is controlled independently of an on-screen window size, since the image in the window is enlarged or reduced as the window is enlarged or reduced, the boundary of the image does not change.
- FIG. 4B is a second one of explanatory diagrams illustrating in a two-part series a displayed image, where a camera zoom setting is interrelated with an on-screen window size.
- As shown in FIG. 4B, when the
camera control device 16 controls the camera zoom setting corresponding to an on-screen window size, since the boundary of the image in thewindow 19 changes corresponding to the on-screen window size, the user can chose the boundary of the view on thedisplay device 14 and can get the natural impression as if the total image is fixed in thedisplay device 14 and the user is observing part of the image by resizing a viewing window. - This embodiment may have not only the functions shown in FIG. 3B and FIG. 4B but also the functions shown in FIG. 3A and FIG. 4A.
- Second Embodiment
- FIG. 5 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a second embodiment of this invention. As shown in FIG. 5, the position and the size on the display device 14 of the window 19 are controlled using the camera information, including information on the on/off status, the position, the direction, and the zoom setting of the camera 13. - The
camera 13 comprises a camera control device 26 which controls the on/off status, the position, the direction, and the zoom setting of the camera 13 based on a camera operation control command provided by the user 12. The camera 13 captures the image of the object 11, and the display control device 15 displays the image in the window 19 of the display device 14 using the window information, including information on the position and the size of the window 19. - An
information conversion unit 27 converts camera information, including information on the on/off status, the position, the direction, and the zoom setting of the camera 13 supplied from the camera control device 26, into window control information, including information on the position and size of the window 19 on the display device 14. A window control unit 28 controls the window 19 on the display device 14 based on the window control information supplied from the information conversion unit 27. - FIG. 6 is a flowchart illustrating processes of the second embodiment. A computer (workstation) waits for an input to the
camera control device 26 from the user 12 (Step S11). If the input is the camera operation command, the camera operation command is sent to the camera control device 26, and the camera control device 26 controls the camera 13 according to the camera operation command. Then, the camera control device 26 outputs camera information on the on/off status, position, direction, and zoom setting of the camera 13 to the information conversion unit 27, and the information conversion unit 27 converts the camera information into window control information, which is sent to the window control unit 28 (Step S12). The window control unit 28 controls the window 19 based on the window control information (Step S13). - In the meantime, in parallel with the above processes, the
display control device 15 controls the display device 14 and displays the image captured by the camera 13 at a position designated by the window control unit 28 with a specified window size. - In this embodiment, since the
user 12 can directly operate the camera 13, and the corresponding window 19 is controlled according to the camera information, the user 12 can select the viewed object by observing the window 19. For example, the user 12 can easily get the forward view, side view and rear view of the object 11, and can easily move the camera 13 to a position far from the present position to view a new object. - Third Embodiment
- FIG. 7 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a third embodiment of this invention.
- The third embodiment is a combination of the first embodiment and the second embodiment, and operates in both directions between a camera operation control and a window operation control.
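- This two-way coordination might be sketched as below; the event dictionaries and all function names are hypothetical stand-ins for the camera control device 36, the window control unit 38, and the information conversion unit 37.

```python
# Sketch of the third embodiment's bidirectional control: window operations
# drive the camera, and camera operations drive the window.

def convert_window_to_camera(window):
    # Illustrative conversion: pan proportional to the window's x offset.
    return {"pan": (window["x"] - 480) / 10.0}

def convert_camera_to_window(camera):
    # Illustrative inverse conversion for the opposite direction.
    return {"x": int(480 + camera["pan"] * 10)}

def handle_event(event, state):
    """Dispatch one input, mirroring the two conversion directions."""
    if event["source"] == "user-window":
        # Window information -> camera control information.
        state["window"] = event["window"]
        state["camera"] = convert_window_to_camera(event["window"])
    elif event["source"] == "camera":
        # Camera information -> window control information.
        state["camera"] = event["camera"]
        state["window"] = convert_camera_to_window(event["camera"])
    return state
```
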
- As shown in FIG. 7, in this embodiment, a
camera control device 36 controls the on/off status, the position, the direction, and the zoom setting of the camera 13 according to the camera operation command provided by the user 12, and the camera control information from an information conversion unit 37. A window control unit 38 controls the position and the size of the window 19 according to the window operation command provided by the user 12 and the window control information from the information conversion unit 37. - The
information conversion unit 37 converts the window information, including information on the position and the size of the window 19, into the camera control information for sending to the camera control device 36, and also converts the camera information, including information on the on/off status, the position, the direction, and the zoom setting of the camera 13, into the window control information for sending to the window control unit 38. - FIG. 8 is a flowchart illustrating processes of the third embodiment.
- A computer (workstation) waits for an input from the
user 12 or the camera control device 36 (Step S1), and determines the source of the input (Step S1&prime;). When the input is from the user, the computer performs Steps S2 through S5 as described in the first embodiment. When the input is from the camera control device 36, the computer performs Steps S12 and S13 as described in the second embodiment. - In the meantime, in parallel with the above processes, the
display control device 15 displays the image captured by the camera 13 on a portion of the display device 14 designated by the window control unit 38 with the specified window size. - According to a method of this invention, the
camera 13 is controlled based on the window information, which can be controlled by the user 12 based on the window operation command, and/or the camera operation command input by the user 12. Therefore, the window information and the camera information are interrelated for displaying the image in the window 19, and the user 12 can get an effective image from the display device 14. Further, the user can visualize the interrelations and relative positions between objects shown in different windows on the same display device 14 and can get a natural impression. In particular, when the system is used in a teleconference, a participant can face another member whom the participant wants to talk to on the display device, and their faces can be naturally directed toward each other on the display device. - According to the apparatus of this invention, the
window 19 can be regarded as a user interface for controlling the camera 13, and the user 12 can control the camera 13 quite naturally and effectively for presentation of an image. Further, the user can visualize the interrelations and relative positions between objects shown in different windows on the same display device 14, and can get a natural impression. - In addition, the
user 12 can directly control the camera 13 and freely select the scope of the image displayed in the window 19. The apparatus is effective in maintaining the relative on-screen sizes or on-screen positions of the objects presented in different windows on the same display device. - The above-described embodiments are essential configurations of this invention, and applications of these configurations will be described below.
- Explained first using FIG. 9A through FIG. 9G is a control of a camera position by the relative on-screen window positions of windows (a control of the camera based on window information).
- It is important for a teleconference system in a remote distribution system to appropriately present the faces of participants in windows.
- Usually, the teleconference system presents only front views of the participants so that the user faces the participants, as shown in FIG. 9A.
- However, since the camera cannot be placed directly in front of the face of the user due to the existence of a display unit, a device for obtaining a strict front view of the face, such as a half-mirror, is needed.
- Window screens displaying the faces of both a user and another participant present a natural impression even if the participant in the image does not actually look at the user on the display device, as shown in FIG. 9B. In this method, the camera need not be placed directly in front of the face for capturing a front view, instead the camera can be placed beside the display device.
- For presenting a better natural impression, the relative on-screen window positions between the windows showing the user and a participant to be talked to, and the directions of the faces of the user and the participant, are important. For example, it gives an unnatural impression when the participants do not face each other on the display device, even though the windows are placed side by side on the display device, as shown in FIG. 9C and FIG. 9D. The faces of the participants should be captured so that the eyes of the participants meet on the display device, as shown in FIG. 9E.
- FIG. 9F shows an example presenting an unnatural impression, in which the eyes of the participants do not meet on the display device because the windows are not placed side by side on the display device. The faces of the participants should be captured and displayed so as to face each other, as shown in FIG. 9G.
- It can generally be assumed that the user or another participant is facing the display device while operating their computers. However, a change of camera positions allows images to be captured from various directions, even if the user constantly faces the display device. The camera operation control of either the camera position or the camera direction, according to the relative on-screen window positions of the window showing the face of the user and the window showing the face of another participant, enables the camera to constantly capture an appropriate image, thereby presenting a natural impression.
- Fourth Embodiment
- Described below is a fourth embodiment of this invention. In the description of the fourth embodiment, a system according to this invention for a coordination control between a window and a camera is used in a teleconference in which two people in different locations have a dialog.
- FIG. 10 is a block diagram illustrating a global configuration of the system. This system uses two computers (workstations) shown in FIG. 1, and each of the cameras of the workstations captures an image of the corresponding user.
- Camera 13A captures an image of user 12A of a workstation WS-A, and camera 13B captures an image of user 12B of workstation WS-B. The workstations WS-A and WS-B are connected to each other via a transmission path. The workstations WS-A and WS-B respectively comprise display devices 14A and 14B, which respectively display pairs of windows 19A-A and 19A-B, and 19B-A and 19B-B. The window 19A-A and the window 19B-A display the image captured by the camera 13A, and the window 19A-B and the window 19B-B display the image captured by the camera 13B. - In the fourth embodiment, because each of the
users 12A and 12B is captured by a single camera 13A or 13B, windows 19A-A and 19B-A show the same image of user 12A, and windows 19A-B and 19B-B show the same image of user 12B. Thus, both of the windows 19A-A and 19A-B in the display device 14A and windows 19B-A and 19B-B in the display device 14B must be positioned in the same relationship to each other. - In this embodiment, when one of the
users 12A and 12B moves a window, the camera control devices 16A and 16B control the positions of the cameras 13A and 13B according to the relative on-screen window positions managed by the window control units 18A and 18B. - FIG. 11 is a flowchart illustrating processes of the fourth embodiment, and described below with reference to FIG. 11 are processes of the fourth embodiment.
- Steps with a large letter “A” after a numerical step number indicate those for workstation WS-A, and steps with a large letter “B” after a numerical step number indicate those for workstation WS-B, where the same numerical step numbers indicate basically identical processes.
- Each of the workstations WS-A and WS-B (computers A and B) stands by for an input to the corresponding window control units from the corresponding users or from the other workstation (Steps S1A and S1B), and determines whether or not the input is the window operation command (Steps S2A and S2B). When the input is not the window operation command, the workstations WS-A and WS-B perform another process according to the input (Steps S3A and S3B).
- When the input is the window operation command, the corresponding one of the window control units 18A and 18B sends the window operation command to the window control unit of the other workstation.
- Each of the window control units 18A and 18B changes the positions of windows 19A-A and 19A-B and windows 19B-A and 19B-B according to the window operation command, and provides the window information to the corresponding one of the information conversion units 17A and 17B (Steps S4A and S4B).
- Each of the information conversion units 17A and 17B converts the window information into camera control information and sends it to the corresponding one of the camera control devices 16A and 16B.
- Further, each of the information conversion units 17A and 17B calculates a relative on-screen window angle θ of window 19A-B against window 19A-A and a relative on-screen window angle δ of window 19B-B against window 19B-A, respectively, where δ=θ±π (Steps S4′A and S4′B).
- Each of the camera control devices 16A and 16B moves the corresponding one of the cameras 13A and 13B according to the camera control information (Steps S5A and S5B).
- In the meantime, in parallel with the above processes, each of the display control devices 15A and 15B continues having the corresponding one of the display devices 14A and 14B display the images captured by the cameras 13A and 13B.
- Returning to FIG. 10, the processes shown in FIG. 11 are explained by using a specific example.
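- The relative on-screen window angle computation used in the following example (θ between the line joining two window centers and the horizontal, with δ = θ ± π on the peer workstation) might be sketched as below; the coordinate conventions are assumptions.

```python
import math

def window_angle(center_a, center_b):
    """Angle theta of the line from window A's center to window B's center
    against the horizontal, in radians; screen y grows downward, so it is
    negated to obtain the conventional orientation."""
    dx = center_b[0] - center_a[0]
    dy = center_a[1] - center_b[1]
    return math.atan2(dy, dx)

def peer_angle(theta):
    """delta = theta +/- pi, normalized into (-pi, pi]."""
    delta = theta + math.pi
    if delta > math.pi:
        delta -= 2 * math.pi
    return delta
```

Each workstation then moves its camera by the negative of its own angle, −θ or −δ.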
- When user 12A of workstation WS-A instructs window control unit 18A to move the window position of window 19A-A, information conversion unit 17A calculates an angle θ, which is defined as an angle between a straight line drawn from the center of window 19A-A to the center of window 19A-B, and a horizontal line through the center-line of displays 14, and camera control device 16A moves the camera position of camera 13A by −θ.
- Then, upon receipt of window control information from window control unit 18A, window control unit 18B moves the window position of window 19B-A, information conversion unit 17B calculates an angle δ, which is defined as an angle between a straight line drawn from the center of window 19B-A to the center of window 19B-B, and a horizontal line through the center-line of displays 14, and camera control device 16B moves the camera position of camera 13B by −δ.
- This equalizes the relative on-screen window positions between windows 19A-A and 19A-B with those between windows 19B-A and 19B-B, and the users 12A and 12B can have a dialog facing each other naturally on the display devices.
- Fifth Embodiment
- Explained below is a fifth embodiment of this invention in which workstations control a plurality of fixed cameras on the basis of window control information on relative on-screen window positions.
- Under circumstances similar to those of the fourth embodiment, it is possible to display an appropriate image by switching an activation and a deactivation (on/off status) of the cameras positioned at several locations according to the relative on-screen window positions, instead of physically moving one camera.
- FIG. 12 is an explanatory diagram illustrating user images captured by a plurality of cameras. A plurality of cameras capture images of the faces of the users from various directions. In this case, although the cameras cannot always catch the ideal direction of the faces of the users, it is not necessary to physically move cameras or to control manipulators.
- FIG. 13 is a block diagram illustrating the global configuration of an apparatus for a coordination control between a window and a camera according to a fifth embodiment of this invention.
- More specifically, FIG. 13 shows a configuration allowing two people to have a dialog, each person having a computer (workstation) as shown in FIG. 1, and a plurality of cameras to capture his images. In this case, workstation WS-A has eight cameras connected and workstation WS-B has six cameras connected for capturing respective user's images.
- Group of cameras 13A0 through 13A7 and group of cameras 13B0 through 13B5 capture images of users 12A and 12B, respectively. The display devices 14A and 14B respectively display a pair of windows 19A-A and 19A-B and a pair of windows 19B-A and 19B-B displaying images respectively captured by the group of cameras 13A0 through 13A7 and the group of cameras 13B0 through 13B5.
- In the fifth embodiment, because each of the faces of the users 12A and 12B is captured from one direction at a time, windows 19A-A and 19B-A of the respective display devices 14A and 14B show the same image of user 12A, and windows 19A-B and 19B-B of the respective display devices 14A and 14B show the same image of user 12B. Thus, the display devices 14A and 14B require windows 19A-A and 19A-B and windows 19B-A and 19B-B to be positioned in the same relationship.
- When one of the users 12A and 12B moves a window, the corresponding one of the camera control devices switches the activation of the cameras according to the relative on-screen window positions.
- FIG. 14 is a flowchart illustrating processes of this embodiment. Steps with a large letter “A” after a numerical step number indicate those for workstation WS-A and steps with a large letter “B” after a numerical step number indicate those for workstation WS-B, where the same numerical step numbers indicate basically identical processes.
- Each of workstations WS-A and WS-B (computers A and B) waits for an input from the
users 12A and 12B or from the other workstation (Steps S1A and S1B), and determines whether or not the input is the window operation command (Steps S2A and S2B). When the input is not the window operation command, the workstations WS-A and WS-B perform another process according to the input (Steps S3A and S3B). - When the input is the window operation command, the corresponding
window control unit 18 changes the position of the corresponding window on the display device according to the window operation command (Steps S4A and S4B). - Then, the
information conversion units 17 of the workstations WS-A and WS-B calculate a relative on-screen window angle θ of window 19A-B against window 19A-A and a relative on-screen window angle δ of window 19B-B against window 19B-A, respectively, where δ=θ±π (Steps S4′A and S4′B). The information conversion units 17 of workstations WS-A and WS-B send information on the switching of cameras to the camera control devices 16, respectively. - Then, the
camera control devices 16A and 16B respectively select one of the cameras 13A (comprising 13A0 through 13A7) as the camera capturing images of user 12A to be displayed on windows 19A-A and 19B-A, respectively (Steps S6A and S6B), and all other cameras around the user of workstation WS-A are deactivated. - Further, the
camera control devices 16A and 16B respectively select one of the cameras 13B (comprising 13B0 through 13B5) as the camera capturing images of user 12B to be displayed on windows 19A-B and 19B-B, respectively (Steps S7A and S7B), and all other cameras around the user of workstation WS-B are deactivated.
-
- f(x)=i and g(x)=i, where βi≦x&lt;βi+1 . . . (1)
- where
- α=π/m
- βi=(2i−1)α±2nπ (n=0, 1, . . . )
- In the definition of functions “f” and “g”, m is a constant indicating the number of cameras in the group of cameras. The constant m is eight in workstation WS-A and is six in workstation WS-B. It is assumed here that cameras 13A0 through 13A7 in the group of cameras 13A are positioned radially and equidistantly around a circle centered on the user of the workstation WS-A. It is also assumed that cameras 13B0 through 13B5 in the group of cameras 13B are positioned radially and equidistantly around a circle centered on the user of the workstation WS-B. The cameras 13A0 and 13B0 are positioned on the positive direction of the X axis, which is a horizontal line through the center-line of displays 14.
- In the fifth embodiment, the camera control devices of workstations WS-A and WS-B respectively select a camera closest to a −θ direction and a camera closest to a −δ direction for capturing the image of the user of the workstation WS-A using the function “f”, and also select a camera closest to a −θ−π direction and a camera closest to a−δ−π direction for capturing the image of the user of the workstation WS-B using the function “g”.
- Returning to FIG. 13, a function of the fifth embodiment will be described in detail below.
- When the
user 12A moves the window 19A-A so that the relative angle of the window 19A-A and the window 19A-B becomes θ, the camera control device 16A switches a camera activation as follows: - The
information conversion unit 17A calculates an angle θ, which is defined as an angle between a straight line drawn from the center of window 19A-A to the center of window 19A-B, and the horizontal line through the center-line of displays 14, and also calculates f(−θ) and g(−θ−π) according to expression (1). - Then,
camera control device 16A of workstation WS-A switches a camera activation of the group of cameras 13A and selects a camera at f(−θ). When the window 19A-A is positioned at the lower-left of the window 19A-B, as shown in the left figure of FIG. 13, the camera control device 16A of workstation WS-A selects the camera 13A7, which captures an image of user 12A from a position to his lower-right. - The
camera control device 16A of workstation WS-A also switches a camera activation of the group of cameras 13B and selects a camera at g(−θ−π). Accordingly, the camera control device 16A of workstation WS-A selects the camera 13B2, which captures an image of user 12B from a position to his upper-left. - In this case, since the
user 12B moved neither the window 19B-A nor the window 19B-B, camera control device 16B is not required to switch a camera activation of either the group of cameras 13A or the group of cameras 13B. - The selection of cameras allows
users 12A and 12B to have a dialog facing each other naturally on the display devices 14A and 14B. - Sixth Embodiment
- Explained below is a sixth embodiment of this invention in which workstations control a plurality of fixed cameras according to relative on-screen window positions. This embodiment performs a bi-directional camera operation control.
- In the following description of a sixth embodiment, it is assumed that five people participate in a teleconference. Each person has a workstation having three cameras, two people (speakers) speak at a time, and the other three people (observers) observe their discussion.
- FIG. 15 is an explanatory diagram illustrating the relative positions of a display device and cameras according to a sixth embodiment of this invention.
- Each workstation of the participants includes the apparatus of the third embodiment shown in FIG. 7, and all workstations are connected by a transmission path. An
observation camera 13C, a right speech camera 13R, a left speech camera 13L, a right speech switch 113R and a left speech switch 113L are provided around the workstation of each participant (user).
- A speaker area including left speaker's window a and right speaker's window b, which present images of two speakers, is provided on the display device of each user, and an observer area including three observer's windows c which present images of three observers, is also provided on the display device.
- The
observation camera 13C captures an image of the user when the user is one of the observers. The right speech camera 13R captures an image of the user from the right direction, and the left speech camera 13L captures an image of the user from the left direction, when the user is one of the speakers.
observation camera 13C to either theright speech camera 13R or theleft speech camera 13L by selecting a corresponding one of theright speech switch 113R or theleft speech switch 113L. Alternatively, he may move the window showing his face into the speaker area, for example, by using a mouse device. - Assume second that user C who is an observer wants to speak with user B who is a speaker. The user C pushes a
right speech switch 113R provided on his workstation WS-C. This causes aright speech camera 13R to capture an image of user 12C in lieu of theobservation camera 13C. This also causes eachwindow control unit 38 to swap an image of user A in left speaker's window a with an image of user C in one of the observer's windows c in thedisplay device 14, as shown at the bottom-left of FIG. 16. - Assume third that user D who is an observer wants to speak with user A who is a speaker. User D drags the observer's window c displaying his image over to the right speaker's window b displaying the image of user B. This causes a
left speech camera 13L to capture an image of the user D in lieu of anobservation camera 13C. This also causes eachwindow control unit 38 to swap an image of the user B in the right speaker's window b with an image of the user D in one of the observer's windows c, as shown at the upper-right of FIG. 16. - When a user switches a camera activation and a camera deactivation by depressing either his
right speech switch 113R or hisleft speech switch 113L, ainformation conversion unit 37 converts camera control information on a camera activation and a camera deactivation into window control information on window positions, and thewindow control unit 38 swaps the window showing the speaker with the window showing the observer. - When a user drags a window over to another, the
information conversion unit 37 converts window control information on window positions into camera control information for a camera activation and a camera deactivation, and thecamera control device 36 switches the cameras accordingly. - FIG. 17 is a flowchart illustrating processes of the sixth embodiment.
- The computer (workstation) waits for an input from the
user 12, an input from thecamera control device 36, or an input from another computer (Step S1), and determines whether or not the input is from theuser 12 or from another computer (Step S1′). When the input is from theuser 12 or from another computer, the computer determines whether or not the input is the window operation command (Step S2), and if the input is not the window operation command (Step S3), performs another process corresponding to the input and the process then goes back to step S1. - If the input is the window operation command, the
window control unit 38 sends the window operation data to thedisplay control device 15 for revising (moving, resizing, etc.) thewindows 19. Thewindow control unit 38 also sends the window information to theinformation conversion unit 37, and theinformation conversion unit 37 converts the window information into the camera control information and sends them to thecamera control device 36. Then, thecamera control device 36 switches the cameras so that a camera corresponding to the position of the window showing the user is selected (Step S5). S17. - In step S1′, when the input is from the
camera control device 36, theinformation conversion unit 37 converts the data into window control information, and sends this information to thewindow control unit 38. Thewindow control unit 38 swaps to the window on the opposite side to the speech switch selected by the user with speaker's window (Step S16). In this operation, when the user pushed theright speech switch 113R, theright speech camera 13R is selected and the left speaker's window a becomes his window, as shown in the left part of FIG. 16. - Next, the workstation transfers the window and the windows in the display device of other workstations are re-arranged so as to have the same arrangement as in the display device of the user who pushed the speech switch. Then, the process continues by returning to step S1.
- Seventh Embodiment
- Explained below is a seventh embodiment of this invention in which a workstation controls an on-screen window size in relation to a camera zoom setting.
- The seventh embodiment is for keeping relative image sizes between objects by controlling a window size in relation to the zoom setting of a camera. This embodiment uses the apparatus of the second embodiment shown in FIG. 5, and the
window control unit 28 enlarges or reduces the window size according to a scaling of the image set by the camera zoom. - FIG. 18A is a first one of explanatory diagrams illustrating a pair of images displayed when the window size is unrelated to the camera zoom setting.
- FIG. 18B is a second one of explanatory diagrams illustrating a pair of images displayed when the window size is related to the camera zoom setting. setting.
- The
information conversion unit 27 converts camera information on the respective camera zoom settings of cameras 13A and 13B, supplied from the camera control device 26, into window control information on window sizes. Then, the window control information is sent to the window control unit 28 to control the sizes of the windows on the display device 14.
- When the
window control unit 28 controls the window size independently of the camera zoom setting, even if the zoom setting of the camera is changed, the size of the corresponding window does not change and the image inside the window is scaled up or down proportionally, thereby failing to provide the relative image sizes between objects in the display device. - As shown in FIG. 18A, when the
camera 13A captures a bigger object 11A with a wide angle view (with low magnification), the bigger object 11A and a smaller object 11B captured by a camera 13B are shown to be the same size in the display device 14. - When the
window control unit 28 controls the window size corresponding to the camera zoom setting according to this invention, the relative sizes of the objects can be represented in the display device 14. - As shown in FIG. 18B, when the
camera 13A captures the bigger object 11A with a wide angle view (with low magnification), the image of the bigger object 11A is shown bigger than the image of the smaller object 11B captured by the camera 13B in the display device 14 due to the window size control.
- Explained below is an eighth embodiment of this invention in which a workstation controls an on-screen window position in relation to a position and a direction of the camera.
- The eighth embodiment is for keeping a relative position between objects by controlling the window position in relation to the position and the direction Of the camera. This embodiment uses the apparatus of the second embodiment shown in FIG. 5, and the
window control unit 28 changes the positions of the windows according to the camera information on the position and the direction of the camera. - FIG. 19A is a first one of explanatory diagrams illustrating a pair of images displayed when the window position is unrelated to the position and the direction of the camera.
- FIG. 19B is a second one of explanatory diagrams illustrating a pair of images displayed when the window position is related to the position and the direction of the camera.
- The
information conversion unit 27 converts camera information on the camera position and camera direction of respective cameras 13A and 13B, supplied from the camera control device 26, into window control information on window positions. Then, the window control information is sent to the window control unit 28 to control the positions of the windows on the display device 14.
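- A sketch of mapping the camera pan direction to a horizontal window position; the linear mapping, field of view, and screen dimensions are assumptions.

```python
import math

def window_x(pan_rad, fov_rad=math.radians(60), screen_w=1280, win_w=320):
    """Horizontal window position proportional to the camera pan angle,
    clamped to the screen, so a camera turned toward a new object drags
    its window across the display in the same direction."""
    frac = 0.5 + pan_rad / fov_rad        # screen center at pan 0
    frac = min(max(frac, 0.0), 1.0)
    return int(frac * (screen_w - win_w))
```
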
- When the
window control unit 28 controls the window position independently of the camera position and the camera direction, even if the direction or the position of the camera is changed, the relative position of the windows in the display device does not change, thereby failing to provide the relative positions between objects in the display device. - As shown in FIG. 19A, even if the
camera 13B is moved or changes its direction to capture another object 11C, the position of the window showing the image of the object 11C does not change, and the actual relationship in position between the object 11A and the object 11C is not represented in the display device. - When the
window control unit 28 controls the window position corresponding to the camera position and the camera direction according to this invention, the relative position between the objects can be shown in the display device 14. - As shown in FIG. 19B, when the
camera 13B is moved or changes its direction to capture the object 11C, the position of the window showing the image of the object 11C is moved according to the change of the camera position or the camera direction, thereby representing the relative position between the two objects 11A and 11C.
- A ninth embodiment of this invention will be described below using FIGS. 20 and 21. This embodiment basically uses apparatus of the sixth embodiment whose configuration, operations and processes are shown in FIG. 15, FIG. 16 and FIG. 17, and the circumstances of the teleconference are the same.
- In the sixth embodiment, the user switches among the observation camera 13C, the right speech camera 13R and the left speech camera 13L, and swaps windows between a speaker area and an observer area manually, either by depressing one of the left speech switch 113L and the right speech switch 113R, or by dragging a window over another window. - The ninth embodiment comprises a
left microphone 113L′ placed to the left of the user and a right microphone 113R′ placed to the right of the user as substitutes for the left speech switch 113L and the right speech switch 113R. In the ninth embodiment, the switching of cameras and the swapping of windows are performed automatically according to speech input to the right microphone 113R′ and the left microphone 113L′. - Assuming that five people participate in the teleconference, that two people speak at a time, and that the other three people observe their discussion, this embodiment operates as described below.
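The microphone-driven switching can be sketched as follows. The data shapes and function name are hypothetical; the side-opposition rule follows the FIG. 20 scenarios described below, where speaking into one microphone addresses the speaker shown on the opposite side of the display:

```python
def on_speech(mic_side: str, user: str, speaker_area: dict) -> tuple:
    """Handle speech input from the left or right microphone.

    The speaking user takes over the speaker-area window on the side
    opposite the microphone, and is captured by the speech camera on the
    microphone's own side (13R for the right microphone, 13L for the
    left).  Returns (camera_id, displaced_user).
    """
    slot = 'left' if mic_side == 'right' else 'right'
    camera = '13R' if mic_side == 'right' else '13L'
    displaced = speaker_area[slot]   # previous speaker returns to the observers
    speaker_area[slot] = user
    return camera, displaced
```

For example, if observer C speaks into the right microphone while A and B are talking, C replaces the left-side speaker and is captured by camera 13R.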
- Each of the participants in the teleconference has a workstation of the third embodiment, and cameras and the microphones 113L′ and 113R′ are attached to each workstation. In each workstation, the right microphone 113R′ and the left microphone 113L′ are connected to a microphone operation control unit (not shown), and all workstations are connected by a transmission path. - Each user listens to a discussion between two speakers using a stereo headphone set, which emits the voice of the speaker displayed on the right of the speaker area from its right side and the voice of the speaker displayed on the left of the speaker area from its left side. A headphone operation control unit (not shown) controls the sounds emitted from the stereo headphone set using the window information. When windows are swapped between the speaker area and the observer area, the headphone operation control unit changes the sound source for the stereo headphone set according to the swap, so that the sound matches the voice of the new speaker.
- Assume first that users A and B are talking and that users C, D and E are observing the discussion, as shown at the top-center of FIG. 20.
- An observer can become a speaker in two ways. He may switch his camera from the observation camera 13C to either the right speech camera 13R or the left speech camera 13L by speaking toward either the right microphone 113R′ or the left microphone 113L′. Alternatively, he may drag his window into a speaker area, for example, by using a mouse device. - Assume second that user C, who has been an observer, wants to speak with user B, who has been a speaker. User C first speaks toward the
right microphone 113R′. The sound information is sent to the window control unit 38 through the microphone operation control unit. Then, based on the information on the user's voice from the microphone operation control unit, the window control unit 38 swaps the image in window a, which is on the opposite side of the display from the microphone 113R′, with the image of user C, as shown at the bottom-left of FIG. 20. This also causes the right speech camera 13R, in lieu of the observation camera 13C, to capture an image of user 12C, and the headphone operation control unit controls the headphone set so that the voice of user C is emitted from the left side of the headphone set. - Assume third that user D, who has been an observer, wants to speak with user A, who has been a speaker. User D drags a window displaying his image over window b, which displays an image of user B, as shown at the upper-middle right of FIG. 20. The window information on the changed window position is sent to the camera control device 36 through the window control unit 38 and the information conversion unit 37; the camera control device 36 then controls the cameras so that the left speech camera 13L, in lieu of the observation camera 13C, captures an image of user 12D. As a result, user D faces user A in the speaker area. In addition, the headphone operation control unit controls the sound of the headphone set so that the voice of user D is emitted from the right side of each headphone set. - FIG. 21 is a flowchart illustrating the processes of the ninth embodiment.
- A computer (workstation) waits for an input from the user, an input from the microphone operation control unit, or an input from another computer (Step S1), then determines whether the input is from the user 12 or from another computer (Step S1′). When the input is from the user or from another computer, the computer determines whether the input is a window operation command (Step S2). If the input is not a window operation command, the computer performs another process corresponding to the input (Step S3), and the process returns to Step S1. - If the input is a window operation command, the
window control unit 38 sends the window control information to the display control device 15 to swap the images of users on the display device. Then, the window control unit 38 sends the information on the positions of the windows to the information conversion unit 37, which provides the camera control information to the camera control device 36. - After receiving the camera control information, the
camera control device 36 controls the cameras so that one of the right speech camera 13R and the left speech camera 13L captures the user's image, based on the camera control information (Step S5). - Next, the headphone operation control unit swaps the sound emitted from either the right or left side of the headphone set for the sound from the user who wants to become a speaker (Step S5′). The computer transfers the window control operation data on the on-screen window positions to the other workstations (Step S14). The process returns to Step S1.
- When the computer has determined in Step S1′ that the input is from the microphone operation control unit, the window control unit 38 swaps the window in the speaker area on the opposite side from the microphone into which a sound was first input for this user's window (Step S15). After Step S15, the process goes to Step S5. - As described above, since the camera is controlled based on the window information, the user can control the window to obtain the image he wishes to see without directly controlling the camera. Further, since the window is controlled based on the camera information, the original relationship between objects can be shown in the windows on the display device.
- More specifically, this invention produces the following advantages:
- [1] Because images captured by cameras are displayed in windows using an interrelation between the camera information and the window information, the user can obtain the relative positions and relative sizes between objects from the display device, thereby being presented with natural images of the objects.
- It is especially effective in a teleconference system, because the participants can face each other on the display device when they talk to each other.
- [2] Since a camera can be controlled according to information on the position of the window, which can be directly controlled by a user, the window can be regarded as an interface between the user and the camera, and the user can control the position and the direction of the camera by changing the window position while watching the image of the object.
- [3] Since a camera can be controlled according to information on the size of the window, which can be directly controlled by a user, the user can control the zoom setting of the camera by changing the window size and can get a desired field of view while watching the image of the object.
- [4] When the viewing range of the camera is changed by switching among a plurality of cameras according to this invention, no manipulator for the camera is needed, and the system can be simplified.
- [5] Even if the zoom, the direction, or the position of the camera is changed, the proper relationship in size or position between viewed objects can be obtained on the display device.
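For advantage [3], the reverse conversion from window size to camera zoom can be sketched as follows. The linear mapping and its constants are illustrative assumptions; the patent does not give a formula:

```python
def window_to_zoom(window_w: int, base_w: int = 320,
                   min_zoom: float = 1.0, max_zoom: float = 8.0) -> float:
    """Convert a window width into a camera zoom factor: enlarging the
    window zooms the camera in proportionally, clamped to the range the
    camera supports."""
    return max(min_zoom, min(max_zoom, window_w / base_w))
```

The user thus adjusts the field of view simply by resizing the window, without issuing any direct zoom command to the camera.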
Claims (20)
1. An apparatus for coordination control between a window on a display and a camera for displaying an image captured by the camera in the window, the apparatus comprising:
camera control means for controlling the camera based on a camera operation command issued by a user and for producing camera information;
information conversion means for converting the camera information into interrelated window control information for displaying the image; and
window control means for controlling the window according to the window control information.
2. An apparatus for coordination control between a window and a camera for use in displaying an image captured by the camera in the window on a display means, comprising:
camera control means for controlling the camera based on a camera operation command issued by a user;
information conversion means for converting camera information supplied from the camera control means into window control information; and
window control means for controlling the window according to the window control information, the camera information and the window control information being interrelated for displaying the image, wherein
the window control means controls the window according to a window operation command issued by a user,
the information conversion means converts window information supplied from the window control means into camera control information, and
the camera control means controls the camera based on the camera control information.
3. The apparatus according to claim 1 , wherein
the information conversion means converts at least one of an on/off status of the camera, a position of the camera, a direction of the camera, and a zoom of the camera into at least one of a position of the window and a size of the window on the display.
4. The apparatus according to claim 1 , wherein
the information conversion means converts a zoom of the camera into window control information, and the window control means controls a size of the window based on the window control information.
5. The apparatus according to claim 1 , wherein
the information conversion means converts a direction of the camera into window control information, and the window control means controls a position of the window on the display based on the window control information.
6. The apparatus according to claim 1 , wherein
the information conversion means converts an on/off status of the camera into window control information, and the window control means controls a position of the window on the display based on the window control information.
7. The apparatus according to claim 1 , wherein
the information conversion means converts sound information provided by a user into window control information, and a position of the window on the display is controlled based on the window control information.
8. The apparatus according to claim 2 , wherein
the information conversion means converts at least one of a position of the window and a size of the window on the display into at least one of an on/off status of the camera, a position of the camera, a direction of the camera, and a zoom of the camera.
9. The apparatus according to claim 2 , wherein
the information conversion means converts a position of the window into camera control information, and the camera control means controls at least one of a change of a position of the camera and a change of a direction of the camera based on the camera control information.
10. The apparatus according to claim 2 , wherein
the information conversion means converts a size of the window into camera control information, and the camera control means controls a zoom of the camera based on the camera control information.
11. The apparatus according to claim 2 , wherein
the information conversion means converts a relative position of the window to another window on the display into camera control information, and the camera control means controls at least one of a position of the camera and a direction of the camera based on the camera control information.
12. The apparatus according to claim 2 , wherein
the information conversion means converts a relative position of the window to another window on the display into camera control information, and the camera control means controls an on/off status of the camera based on the camera control information.
13. A method for coordination control between a window on a display and a camera so as to display an image captured by the camera in the window, the method comprising:
controlling the camera according to a camera operation command issued by a user;
converting information about the camera into interrelated window control information for displaying the image; and
controlling the window based on the window control information.
14. A method for coordination control between a window on a display and a camera so as to display an image captured by the camera in the window, the method comprising:
controlling the window according to a window operation command issued by a user;
converting information about the window into camera control information;
controlling the camera based on the camera control information;
controlling the camera according to a camera operation command issued by the user;
converting information about the camera into window control information; and
controlling the window based on the window control information.
15. A method for coordination control between a window on a display and a camera to display an image captured by the camera in the window, the method comprising:
converting sound information issued by a user into control information on the window;
controlling a position of the window on the display based on the control information; and
controlling the camera according to the position of the window.
16. A method for coordination control between a window on a display and a camera to display an image captured by the camera in the window, the method comprising:
converting sound information issued by a user into control information on the camera;
turning the camera OFF or ON based on the control information; and
controlling the window according to the on/off status of the camera.
17. A system for coordination control between a window and a camera comprising:
a camera for capturing an image;
a display for displaying the window and an image captured by the camera in the window;
information conversion means for performing at least one of converting information on the camera into window control information and converting information on the window into camera control information; and
at least one of a window control means for controlling the window based on the window control information and a camera control means for controlling the camera based on the camera control information.
18. An apparatus for coordination control between a window on a display and a camera for displaying an image captured by the camera in the window, the apparatus comprising:
window control means for controlling the window according to a command issued by a user and for producing window information including at least a relative position of the window to another window on the display;
information conversion means for converting the window information into interrelated camera control information for controlling at least one of a position of the camera and a direction of the camera; and
camera control means for controlling at least one of a position and a direction of the camera based on the camera control information.
19. A method for coordination control between a window on a display and a camera so as to display an image captured by the camera in the window, the method comprising:
controlling the window according to a command issued by a user;
providing window information including at least a relative position of the window to another window on the display;
converting the window information into interrelated camera control information including at least one of a position of the camera and a direction of the camera; and
controlling at least one of a position and a direction of the camera based on the camera control information.
20. A method for controlling a camera comprising:
converting sound information issued by a subject into control information for the camera;
controlling whether the camera captures images based on the control information; and
controlling the window according to whether the camera is capturing images.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/108,397 US20020126208A1 (en) | 1994-05-26 | 2002-03-29 | Apparatus and method for camera control |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP11314494A JP3797678B2 (en) | 1994-05-26 | 1994-05-26 | Window and camera cooperative control method and apparatus |
JP06-113144 | 1994-05-26 | ||
US43806195A | 1995-05-08 | 1995-05-08 | |
US09/008,567 US6424373B1 (en) | 1994-05-26 | 1998-01-16 | Apparatus and method for camera control |
US10/108,397 US20020126208A1 (en) | 1994-05-26 | 2002-03-29 | Apparatus and method for camera control |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/008,567 Division US6424373B1 (en) | 1994-05-26 | 1998-01-16 | Apparatus and method for camera control |
Publications (1)
Publication Number | Publication Date |
---|---|
US20020126208A1 true US20020126208A1 (en) | 2002-09-12 |
Family
ID=14604691
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/008,567 Expired - Fee Related US6424373B1 (en) | 1994-05-26 | 1998-01-16 | Apparatus and method for camera control |
US10/108,397 Abandoned US20020126208A1 (en) | 1994-05-26 | 2002-03-29 | Apparatus and method for camera control |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US09/008,567 Expired - Fee Related US6424373B1 (en) | 1994-05-26 | 1998-01-16 | Apparatus and method for camera control |
Country Status (2)
Country | Link |
---|---|
US (2) | US6424373B1 (en) |
JP (1) | JP3797678B2 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050223333A1 (en) * | 2004-03-31 | 2005-10-06 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US20060203008A1 (en) * | 2005-03-10 | 2006-09-14 | Seiko Epson Corporation | Apparatus for displaying an image |
US20080158340A1 (en) * | 2006-12-04 | 2008-07-03 | Kabushiki Kaisha Toshiba | Video chat apparatus and method |
US20110069206A1 (en) * | 2004-12-10 | 2011-03-24 | Legall Didier | High resolution zoom: a novel digital zoom for digital video camera |
US20110199484A1 (en) * | 2007-01-10 | 2011-08-18 | Canon Kabushiki Kaisha | Camera control apparatus and method, and camera control system |
US20120038679A1 (en) * | 2010-08-13 | 2012-02-16 | Lg Electronics Inc. | Mobile terminal, display device and controlling method thereof |
CN105683867A (en) * | 2013-09-20 | 2016-06-15 | 微软技术许可有限责任公司 | Configuration of a touch screen display with conferencing |
US20170068417A1 (en) * | 2013-12-16 | 2017-03-09 | Sony Corporation | Information processing apparatus, program, information processing method, and information processing system |
CN113395479A (en) * | 2021-06-16 | 2021-09-14 | 随锐科技集团股份有限公司 | Video conference picture processing method and system |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020171742A1 (en) * | 2001-03-30 | 2002-11-21 | Wataru Ito | Method and apparatus for controlling a view field of an image picking-up apparatus and computer program therefor |
US20030169350A1 (en) * | 2002-03-07 | 2003-09-11 | Avi Wiezel | Camera assisted method and apparatus for improving composition of photography |
JP3870124B2 (en) * | 2002-06-14 | 2007-01-17 | キヤノン株式会社 | Image processing apparatus and method, computer program, and computer-readable storage medium |
KR20040034803A (en) * | 2002-10-17 | 2004-04-29 | 주식회사씨엠티 | Module and Method for Resizing Time-varying Images in the Internet Image Communications |
JP4101043B2 (en) * | 2002-12-11 | 2008-06-11 | キヤノン株式会社 | Image data display system, image data display method, program, storage medium, and imaging apparatus |
JP4355512B2 (en) * | 2003-04-17 | 2009-11-04 | 任天堂株式会社 | Image processing apparatus and image processing program |
TWI228378B (en) * | 2003-06-19 | 2005-02-21 | Primax Electronics Ltd | Auxiliary method and device for finding view |
US7693308B2 (en) * | 2004-03-24 | 2010-04-06 | Fujifilm Corporation | Authentication system, authentication method, machine readable medium storing thereon authentication program, certificate photograph taking apparatus, and certificate photograph taking method |
US7865834B1 (en) * | 2004-06-25 | 2011-01-04 | Apple Inc. | Multi-way video conferencing user interface |
US7400357B2 (en) * | 2004-12-10 | 2008-07-15 | The United States Of America As Represented By The Department Of The Army | Remotely delivered, self-deployed multi-function sensor |
JP2011097447A (en) * | 2009-10-30 | 2011-05-12 | Sharp Corp | Communication system |
US20150085060A1 (en) | 2013-09-20 | 2015-03-26 | Microsoft Corporation | User experience for conferencing with a touch screen display |
CN104754216B (en) * | 2015-03-06 | 2018-03-27 | 广东欧珀移动通信有限公司 | A kind of photographic method and device |
JP6534120B2 (en) * | 2015-07-13 | 2019-06-26 | 国立大学法人静岡大学 | Image communication device |
US9491374B1 (en) * | 2015-12-11 | 2016-11-08 | Fuji Xerox Co., Ltd. | Systems and methods for videoconferencing input and display management based on activity |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4264928A (en) * | 1979-11-05 | 1981-04-28 | Schober Gary W | Conference video system |
US5206721A (en) * | 1990-03-08 | 1993-04-27 | Fujitsu Limited | Television conference system |
US5568183A (en) * | 1993-10-20 | 1996-10-22 | Videoconferencing Systems, Inc. | Network videoconferencing system |
US5936668A (en) * | 1995-10-02 | 1999-08-10 | Asahi Kogaku Kogyo Kabushiki Kaisha | Color image display device |
US5999214A (en) * | 1992-10-26 | 1999-12-07 | Canon Kabushiki Kaisha | Image pickup system and communication system for use in video conference or the like |
US6008844A (en) * | 1995-04-07 | 1999-12-28 | Canon Kabushiki Kaisha | Display device having index movement direction in correspondence with aspect ratio |
US6008837A (en) * | 1995-10-05 | 1999-12-28 | Canon Kabushiki Kaisha | Camera control apparatus and method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3034891B2 (en) | 1990-01-24 | 2000-04-17 | 株式会社東芝 | Image display device |
- 1994-05-26: JP application JP11314494A (patent JP3797678B2), status: Expired - Fee Related
- 1998-01-16: US application 09/008,567 (patent US6424373B1), status: Expired - Fee Related
- 2002-03-29: US application 10/108,397 (publication US20020126208A1), status: Abandoned
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9086790B2 (en) | 2004-03-31 | 2015-07-21 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US7810039B2 (en) * | 2004-03-31 | 2010-10-05 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US20100309224A1 (en) * | 2004-03-31 | 2010-12-09 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US20050223333A1 (en) * | 2004-03-31 | 2005-10-06 | Canon Kabushiki Kaisha | Image displaying method, image displaying program, and display |
US20110069206A1 (en) * | 2004-12-10 | 2011-03-24 | Legall Didier | High resolution zoom: a novel digital zoom for digital video camera |
US8243171B2 (en) * | 2004-12-10 | 2012-08-14 | Ambarella, Inc. | High resolution zoom: a novel digital zoom for digital video camera |
US20060203008A1 (en) * | 2005-03-10 | 2006-09-14 | Seiko Epson Corporation | Apparatus for displaying an image |
US7554561B2 (en) * | 2005-03-10 | 2009-06-30 | Seiko Epson Corporation | Apparatus for displaying an image |
US20080158340A1 (en) * | 2006-12-04 | 2008-07-03 | Kabushiki Kaisha Toshiba | Video chat apparatus and method |
US20110199484A1 (en) * | 2007-01-10 | 2011-08-18 | Canon Kabushiki Kaisha | Camera control apparatus and method, and camera control system |
US8427539B2 (en) * | 2007-01-10 | 2013-04-23 | Canon Kabushiki Kaisha | Camera control apparatus and method, and camera control system |
US20120038679A1 (en) * | 2010-08-13 | 2012-02-16 | Lg Electronics Inc. | Mobile terminal, display device and controlling method thereof |
CN105683867A (en) * | 2013-09-20 | 2016-06-15 | 微软技术许可有限责任公司 | Configuration of a touch screen display with conferencing |
US20170068417A1 (en) * | 2013-12-16 | 2017-03-09 | Sony Corporation | Information processing apparatus, program, information processing method, and information processing system |
US10802663B2 (en) * | 2013-12-16 | 2020-10-13 | Sony Corporation | Information processing apparatus, information processing method, and information processing system |
CN113395479A (en) * | 2021-06-16 | 2021-09-14 | 随锐科技集团股份有限公司 | Video conference picture processing method and system |
Also Published As
Publication number | Publication date |
---|---|
JP3797678B2 (en) | 2006-07-19 |
US6424373B1 (en) | 2002-07-23 |
JPH07320031A (en) | 1995-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US6424373B1 (en) | Apparatus and method for camera control | |
US6346962B1 (en) | Control of video conferencing system with pointing device | |
US6208373B1 (en) | Method and apparatus for enabling a videoconferencing participant to appear focused on camera to corresponding users | |
JP3862315B2 (en) | Image display apparatus and control method thereof | |
JP4228010B2 (en) | Video conferencing equipment | |
US9270936B2 (en) | Functionality for indicating direction of attention | |
EP1536645A1 (en) | Video conferencing system with physical cues | |
EP2338277A1 (en) | A control system for a local telepresence videoconferencing system and a method for establishing a video conference call | |
WO2001059749A1 (en) | Multiple-screen simultaneous displaying apparatus, multiple-screen simultaneous displaying method, video signal generating device, and recorded medium | |
JP3036088B2 (en) | Sound signal output method for displaying multiple image windows | |
WO1994007327A1 (en) | Method and apparatus for on-screen camera control in video-conference equipment | |
JP2003186593A (en) | Multiwindow display method and system thereof | |
JP3674993B2 (en) | Image display method for virtual conference system and terminal device for virtual conference | |
JPH07162532A (en) | Inter-multi-point communication conference support equipment | |
JPH06311510A (en) | Conference supporting system for remote location | |
JP3625549B2 (en) | Multipoint video conferencing system | |
JP2007221437A (en) | Remote conference system | |
JPH04238475A (en) | Handset type television device and video telephone system using the same | |
JPH09247638A (en) | Video conference system | |
KR102619761B1 (en) | Server for TelePresentation video Conference System | |
JPH08181960A (en) | Remote conference video image display system | |
JP2947108B2 (en) | Cooperative work interface controller | |
KR100493292B1 (en) | apparatus for controlling split zoom of display device | |
JP2004147105A (en) | Terminal equipment for video phone | |
EP4297399A1 (en) | Distribution system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |