GB2323238A - Selecting lens and television camera combination for designated conditions - Google Patents
- Publication number
- GB2323238A (application GB9805140A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- camera
- lens
- photographing
- lenses
- selection method
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
- H04N23/663—Remote control of cameras or camera parts, e.g. by remote control devices for controlling interchangeable camera parts based on electronic image sensor signals
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Cameras In General (AREA)
- Structure And Mechanism Of Cameras (AREA)
- Studio Devices (AREA)
Abstract
A lens and TV camera selection apparatus in which specifications of a plurality of lenses and TV cameras are stored in a memory 13, so that an optimal combination of a lens and a TV camera to meet predetermined conditions can be selected and indicated; including an input unit 15 for inputting photographing conditions, a retrieval means for retrieving lenses which meet the input photographing conditions, and a calculation unit 11 for calculating photographing conditions for a lens selected by the retrieval means and for displaying the calculation result via a monitor 17. The invention also discloses a lens and TV camera selection method including inputting photographing conditions, retrieving and indicating the lenses which meet the input photographing conditions, selecting one lens from among the indicated lenses, calculating photographing conditions for the selected lens, and indicating the photographing conditions thus calculated.
Description
METHOD AND APPARATUS FOR SELECTING LENS AND TV CAMERA
The present invention relates to a method and an apparatus for selecting a combination of a lens and camera, for example, a combination of a TV camera for industrial use (CCTV camera) and an appropriate lens to be mounted thereto, in accordance with the conditions of usage.
A conventional TV camera for industrial use or a CCTV camera usually comprises a camera body having an image pickup device and a photographing lens mounted to the camera body. In general, various kinds of camera bodies and photographing lenses are available, and they are selected and combined to meet photographing conditions, such as the size of an object to be photographed or an object distance.
However, it can be difficult for a user to determine an appropriate combination of the camera body and the photographing lens. The camera body will be referred to as a TV camera.
It is an object of the present invention to provide a camera body - photographing lens selection method and apparatus in which an appropriate combination of camera body and photographing lens can be easily selected to meet the demand of the user.
According to an aspect of the present invention, there is provided a lens and TV camera selection apparatus in which specifications of a plurality of lenses and TV cameras are stored in a memory storage device, so that an optimal combination of a lens and a TV camera which meets predetermined conditions can be selected and displayed via a monitor, the apparatus comprising: an input unit for inputting a plurality of photographing conditions; a retrieval means for retrieving lenses which meet the photographing conditions input by the input unit and for displaying the same via the monitor; and a calculation unit for calculating photographing conditions for a lens selected from among those indicated and for displaying the calculation result via the monitor.
According to another aspect of the present invention, there is provided a lens and TV camera selection method in which specifications of a plurality of lenses and TV cameras are stored in a memory, so that an optimal combination of a lens and TV camera to meet predetermined conditions can be selected, comprising the steps of: inputting a plurality of photographing conditions, retrieving and indicating which lenses meet the input photographing conditions so as to select the same, selecting one lens from among the indicated lenses, calculating photographing conditions for the selected lens, and displaying the photographing conditions thus calculated.
An example of the present invention will be described below in detail with reference to the accompanying drawings, in which:
Figure 1 is a block diagram of an apparatus to which a selection method of the present invention is applied;
Figures 2A, 2B, 3A, 3B, 4A and 4B are flow charts of the main operation of the present invention;
Figure 5 is a flow chart of an interruption operation of the present invention;
Figure 6 is a flow chart of a printing operation of the present invention;
Figure 7 shows an example display of a specification of a photographing lens and a content of printed information obtained by a selection method of the present invention;
Figure 8 shows an example display of photographing conditions for a photographing lens and a camera and the content of printed information obtained by a selection method of the present invention; and
Figure 9 shows an example display of photographing conditions for a photographing lens and a camera and a content of printed information obtained by a selection method of the present invention at an object distance priority mode.
In the illustrated embodiment, a control routine is carried out by a personal computer. Fig. 1 shows the utilization of a personal computer for a camera and lens selection method of the present invention. It is also possible to store the program by which the lens selection method is executed on a medium such as a floppy disk.
An apparatus of the present invention is composed of a calculation/control circuit 11; a memory 13 in which predetermined data or a program(s) are stored; a data input unit 15 adapted to input commands or data; a monitor 17 on which the data input/output display, input data, calculation data, and graphic data are displayed; and a printer 19. The calculation/control circuit 11 can be in the form of a personal computer body; the data input unit 15 can be in the form of a keyboard and a mouse; the memory 13 can be in the form of a ROM and RAM in the personal computer body and/or a removable memory such as a hard disk drive, floppy disk drive, CD-ROM, or an opto-magnetic disk drive; the monitor 17 can be in the form of a liquid crystal display or CRT display; and the printer 19 can be in the form of a laser printer, ink-jet printer or sublimation type printer.
In the illustrated embodiment, when predetermined data is input by an operator through the data input unit 15, the control circuit 11 reads programs written in the memory 13 to thereby perform the lens selection operation.
The operations in the present invention will be discussed below with reference to the flow charts shown in
Figs. 2 through 6. The operations are carried out by the calculation/control circuit 11 which reads the programs stored in the memory 13.
In the illustrated embodiment, data to be input by a provider of the camera or photographing lens, and data to be input or selected by an operator who selects the camera or photographing lens using the selection apparatus are as follows:
< Data to be input by Provider >
The camera is a CCTV camera using a CCD image sensor as an image pickup device. Data for the CCTV camera includes the type of camera, the format size for each kind, for example, the type of camera mount; the diagonal, horizontal and vertical dimensions of the image surface of the CCD image pickup device; and the price.
The photographing lens is for CCD TV camera use, of which the lens data includes: the type of lens, the focal length, the maximum aperture (iris) ratio, the diaphragm range, the diaphragm system (iris), the type of the lens mount, the size of filter, the size of lens, the lens system, the real focal length, the distance between the principal points, the total length of the lens barrel, the physical amount of feed (the displacement of the focusing lens based on the unit number of revolutions of, for example, an automatic focusing motor), the near point, far point and the price.
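The provider data listed above can be modeled as simple records. The following Python sketch uses hypothetical field names covering a reduced subset of the listed items; these identifiers are illustrative and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class CameraSpec:
    """Provider data for one CCTV camera (subset; field names assumed)."""
    model: str
    mount: str        # type of camera mount
    diag_mm: float    # diagonal of the CCD image surface
    width_mm: float   # horizontal dimension of the image surface
    height_mm: float  # vertical dimension of the image surface
    price: float

@dataclass
class LensSpec:
    """Provider data for one photographing lens (subset; field names assumed)."""
    model: str
    focal_length_mm: float
    max_aperture: float  # maximum aperture (iris) ratio, e.g. 1.4
    f_min: float         # diaphragm range, wide end
    f_max: float         # diaphragm range, narrow end
    iris: str            # "manual" or "automatic" diaphragm system
    mount: str           # type of lens mount
    price: float
```

Records of this shape would be stored in the memory 13 and scanned by the retrieval step.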
< Data to be input by User >
1. Desired Object Distance (Working Distance)
2. Desired Focal Length
3. Size of Object (e.g., height or width)
4. Format Size of the Camera (the size of the image surface, i.e., the size of the image pickup surface of the CCD image pickup device [e.g., 1 inch or 1/2 inch, etc.], or the length of the diagonal, etc.)
5. Direction in which the object is to be photographed over the entire picture frame (whether the object is to be photographed vertically or horizontally with respect to the picture frame)
6. Diaphragm System (automatic diaphragm or manual diaphragm)
If three of the items of data 1, 2, 3 and 4 (together with item 5) are determined, the remaining single item of data can be determined. The data to be input by the user is displayed via the monitor 17 so that the user can input the data using the input unit 15, or the data to be input by the user can be selected by the user from that shown in a so-called pop-up menu.
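The dependency among items 1 through 4 can be illustrated with a thin-lens approximation. This is an assumption of the sketch below; the patent does not state the formulas it uses. The required magnification is the format size divided by the object size, and the focal length then follows from the object distance.

```python
def focal_from_conditions(distance_mm: float,
                          object_mm: float,
                          format_mm: float) -> float:
    """Estimate the focal length that makes an object of size
    object_mm fill a format of size format_mm at the given object
    distance. Thin-lens approximation (an assumption): with
    magnification m, 1/f = 1/do + 1/(m*do), so f = do*m/(1+m)."""
    m = format_mm / object_mm          # required magnification
    return distance_mm * m / (1.0 + m)
```

For example, an object 100 mm tall at 1000 mm, imaged onto a 10 mm format, needs a magnification of 0.1 and hence a focal length of about 91 mm under this approximation.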
For example, the desired object distance and the desired focal length are input by the user using the input unit 15 while the numerical values thereof are displayed via the monitor 17.
The object size is input by the user using the input unit 15 while the numerical value thereof is displayed together with one or both of the height and width indication windows in the display (in the monitor 17). If both the height and width indication windows are indicated, a setting can be made which determines whether it is necessary to input at least one, or both, of the height and width data.
The format size of the camera and the diaphragm system are indicated in a check portion of the display (in the monitor 17), so that the user can check the format size and the diaphragm system using the input device 15.
With respect to the format size of the camera, the size of the picture frame which can be provided by the provider of the CCTV camera is input in advance as provider data and is stored in the memory 13. The camera size data is indicated in the format size input window in the form of a pop-up menu, so that the user can select the data from the pop-up menu via the input unit 15.
The diaphragm system (the manual diaphragm or automatic diaphragm) is indicated together with the check windows in the display (in the monitor 17), so that the user can select the desired check window via the input unit 15.
Concerning the direction in which the object is to be photographed within the entire picture frame, separate check windows are displayed to determine whether the object is vertical or horizontal, and the user can select the desired check window using the input unit.
On commencement of the control routine (when the program is activated) shown in the flow chart of Figures 2A and 2B, the initial screen is displayed. The control routine proceeds no further until the unit of input is selected (S101). The unit of input can be selected, for example, from meters (m), centimeters (cm), millimeters (mm), feet (ft), or inches (in).
Once the unit of input has been selected, the control waits until any one of menu-1, menu-2 or menu-3 is selected (S101, S103, S131, S133). Menu-1 is for determining the focal length of the photographing lens; menu-2 is for determining the size of the object to be viewed in the picture frame; and menu-3 is for determining the object distance.
Menu-1 corresponds to a mode in which the user inputs a desired object distance, an object size, a format size of the camera, a diaphragm system, and a direction in which the object is to be photographed within the picture frame. A list of the photographing lenses (appropriate photographing lenses) being considered for selection is indicated in accordance with the input data; the user selects a desired photographing lens from the list, and the photographing conditions, namely, the object distance, the object size, the format size of the camera, the diaphragm system, and the direction in which the object is to be photographed within the picture frame, are re-calculated for the selected photographing lens and displayed.
Menu-2 corresponds to a mode in which the user inputs a desired object distance, a desired focal length, a format size of the camera, and a diaphragm system. A list of the photographing lenses (appropriate photographing lenses) being considered for selection is indicated in accordance with the input data; the user selects a desired photographing lens from the list, and the size of the object to be viewed in the picture frame is calculated and indicated in accordance with the input data and the selected photographing lens.
Menu-3 corresponds to a mode in which the user inputs a desired focal length, an object size, a format size of the camera, a diaphragm system, and the direction in which the object is photographed within the picture frame. A list of the photographing lenses (appropriate photographing lenses) being considered for selection is indicated in accordance with the input data; the user selects a desired photographing lens from the list and an appropriate object distance is calculated and indicated in accordance with the input data and the selected photographing lens.
If menu-1 is selected, photographing-condition-input-screen-1 corresponding to menu-1 is displayed. The data used by the user in photographing-condition-input-screen-1 may comprise a desired object distance, an object size, a format size of the camera, a diaphragm system, and a direction in which the object image is to be viewed within the picture frame.
If menu-2 is selected, photographing-condition-input-screen-2 corresponding to menu-2 is displayed. The data used by the user in photographing-condition-input-screen-2 may comprise a desired object distance, a desired focal length, a format size of the camera, and a diaphragm system.
If menu-3 is selected, photographing-condition-input-screen-3 corresponding to menu-3 is displayed. The data used by the user in photographing-condition-input-screen-3 may comprise an object size, a desired focal length, a format size of the camera, a diaphragm system, and an overall direction in which the object image is to be viewed within the picture frame.
The operation when menu-1 is selected will be discussed below. When menu-1 is selected, photographing-condition-input-screen-1 is indicated (S103: Yes, S105). In photographing-condition-input-screen-1, the data input windows for a desired object distance, an object size, a format size of the camera, a diaphragm system, and a direction in which the object image is indicated over the entire picture surface are displayed. The data input windows can be indicated on the same screen or can be separately displayed, so that each time data is input, the indication is switched from one input window to the next.
Once the data has been entered, a recommended focal length is calculated in accordance with the desired object distance, the object size, the format size of the camera, and the direction in which the object image is to be viewed within the picture frame. Thereafter, the stored photographing-lens-data is retrieved to select and indicate the photographing lenses corresponding to the selected diaphragm system and having focal lengths equal to or close to the recommended focal length, in order starting from the closest focal length (S107, S109). The indicated data includes at least the model numbers and focal lengths of the photographing lenses.
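The retrieval step just described, filtering by diaphragm system and ranking by closeness of focal length to the recommended value, might be sketched as follows. The tuple layout is a hypothetical stand-in for the stored photographing-lens data.

```python
def retrieve_lenses(lenses, recommended_f_mm, iris):
    """Return the lenses matching the selected diaphragm system,
    sorted so the focal length closest to the recommended value
    comes first. Each lens is a (model, focal_length_mm, iris)
    tuple -- a simplified, assumed record layout."""
    candidates = [l for l in lenses if l[2] == iris]
    return sorted(candidates, key=lambda l: abs(l[1] - recommended_f_mm))
```

The indicated list would then be built from the model number and focal length of each returned record.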
The user decides whether or not to select a photographing lens from the indicated list of the photographing lenses. If no photographing lens is selected, the control routine is returned to photographing-condition-input-screen-1 (S111: No, S105). Consequently, the previously input data is indicated in photographing-condition-input-screen-1.
If a photographing lens is selected by the user, the calculation is carried out again in accordance with the specification of the selected photographing lens and the data input at step S105 (S113).
Thereafter, the detailed-lens-data is indicated in accordance with the calculation result (S115). As a result of the calculation, a check is made to determine whether or not a close-up ring is necessary. If the close-up ring is needed, the type of the close-up ring which can be used is indicated. Furthermore, the general angle (the field angle in the horizontal and vertical directions) is also indicated. The photographing conditions when the desired object distance has a priority are calculated and indicated.
The detailed-lens-data indication screen displays a diaphragm value F input window, a print option-key, a re-operation option-key which is actuatable after the input conditions are changed, a re-menu option-key which is actuatable after the menu is changed, and a completion option-key. The user actuates the data input unit 15 to input the diaphragm value F and to actuate the above-mentioned option-keys.
If the diaphragm value F is input or the input diaphragm value is changed, the calculation operation of the depth of field is carried out to obtain and indicate the near point and the far point (S116; Yes, S124). If the print option-key is actuated, the printing operation is carried out (S117; Yes, S125). If the re-operation option-key is turned ON, the control routine is returned to photographing-condition-input-screen-1 (S119; Yes, S105). If the re-menu option-key is turned ON after the menu is changed, the control routine is returned to the menu selection screen (initial screen) (S121; Yes, S101). If the completion option-key is actuated, the application ends (S123; Yes). If no option-key is actuated, the detailed-lens-data indication is maintained (S123; No, S115).
The focal depth calculation operation and the printing operation will be discussed below with reference to Figs. 5 and 6.
In the focal depth calculation operation, a check is made to determine whether or not the diaphragm value F input by the user through the data input device 15 is within the predetermined diaphragm value range of the selected photographing lens, and the control routine does not proceed until the input diaphragm value F is within the predetermined diaphragm value range (S401, S403; No, S401). If the input diaphragm value F is within the predetermined range, the near point and the far point are calculated in accordance with the diaphragm value F, the focal length of the selected photographing lens, and the object distance. The near point and the far point thus obtained are graphically indicated together with the predetermined diaphragm value range. Thereafter, the control routine is returned (S403; Yes, S405).
The user can find out the depth of field by inputting an arbitrary diaphragm value F.
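The near-point and far-point computation can be sketched with the standard hyperfocal-distance formulas. The patent does not state the formulas it uses, so this is an assumption, and the circle-of-confusion default below is likewise an assumed value.

```python
def depth_of_field(f_mm: float, F: float, distance_mm: float,
                   coc_mm: float = 0.03):
    """Near and far points of the depth of field from the focal
    length, diaphragm value F and object distance, using the
    standard hyperfocal-distance formulas (assumed here):
      H    = f^2 / (F * coc) + f
      near = H*d / (H + (d - f))
      far  = H*d / (H - (d - f)), or infinity when d >= H.
    coc_mm is the circle of confusion (assumed default)."""
    H = f_mm * f_mm / (F * coc_mm) + f_mm  # hyperfocal distance
    d = distance_mm
    near = H * d / (H + (d - f_mm))
    far = float("inf") if d >= H else H * d / (H - (d - f_mm))
    return near, far
```

A graphical indication such as the one described could then plot `near` and `far` for each F value in the lens's diaphragm range.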
In the printing operation, a check is made to determine whether or not the calculation of the depth of field (the calculation of the near point and the far point) is completed. If the calculation is completed, the detailed-lens-data including the near point and the far point calculations is printed (S501; Yes, S503). If no calculation is completed, only the detailed-lens-data is printed (S501; No, S505). Example print-outs are shown in Figs. 7 and 8.
Upon completion of the printing operation, a check is made to determine whether or not menu-1 has been selected. If menu-1 has not been selected, the printing operation ends (S507; No). If menu-1 has been selected, a check is made to determine whether there is a priority indication for the object distance. If there is such an indication, the indication is printed and thereafter the control routine is returned (S507; Yes, S509). An example of such an object distance priority indication is shown in Fig. 9. If there is no such indication, the control routine is returned without printing.
Although the drawings shown are examples of typical print-outs that would be produced, the content of these print-outs is the same as that displayed by the monitor 17. Since the monitor 17 and the printer 19 have different resolutions, different effective areas, and differences in color, the layout thereof may not be identical.
< Menu-2 >
The operation when menu-2 is selected in the initial screen will be discussed below with reference to Figures 3A and 3B. When menu-2 is selected, photographing-condition-input-screen-2 is displayed (S101, S103; No, S131; Yes, S201). Photographing-condition-input-screen-2 allows a desired object distance, a desired focal length, a format size of the camera, and a diaphragm system to be entered.
Once the data has been entered, the stored photographing-lens-data is retrieved in order to select and display the photographing lenses whose focal lengths are equal to or close to the desired focal length, in order starting from the closest focal length (S209). The indicated list includes an option-key which allows the user to avoid selecting a photographing lens. If this option-key is actuated, the application ends (S211; No); alternatively, it is possible to return the control routine to step S101, as in the case where menu-1 is selected.
If an appropriate photographing lens is selected from the indicated list of the photographing lenses, the object size for the selected photographing lens is calculated (S211; Yes, S213). Namely, the size (in the longitudinal and lateral directions) of the largest object which can be photographed is calculated in accordance with the focal length of the selected photographing lens, the input desired object distance and the format size of the camera.
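The largest-object calculation can be illustrated under a thin-lens approximation; this is an assumption, since the patent gives no formulas. With magnification m = f/(d − f), the object-side size corresponding to a frame edge is the format size divided by m.

```python
def max_object_size(f_mm: float, distance_mm: float,
                    format_mm: float) -> float:
    """Largest object dimension that fits within a frame edge of
    size format_mm at the given object distance. Thin-lens
    approximation (an assumption): m = f / (d - f), and the
    object-side size is format / m."""
    m = f_mm / (distance_mm - f_mm)  # image magnification
    return format_mm / m
```

Applying this separately to the horizontal and vertical dimensions of the format would give the longitudinal and lateral sizes mentioned above.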
Thereafter, the detailed-lens-data is displayed in accordance with the calculation result (S215). As a result of the calculation, a check is made to determine whether or not the close-up ring is necessary. If the close-up ring is needed, the type of close-up ring which can be used is indicated. Furthermore, the general angle (the field angle in the horizontal and vertical directions), the near point and the far point are indicated via a display device. Unlike the case where menu-1 is selected, the photographing conditions with priority given to the desired photographing distance are not indicated on the display device.
The detailed-lens-data screen indicates a diaphragm value F input window, a print option-key, a re-operation option-key which is actuatable after the input conditions are changed, a re-menu option-key which is actuatable after the menu is changed, and a completion option-key. The user can actuate the above-mentioned option-keys and input the diaphragm value F via the input unit 15.
If the diaphragm value F is input or the input diaphragm value F is changed, the calculation operation of the depth of field is carried out to obtain and indicate the near point and the far point (S216; Yes, S224). If the print option-key is actuated, the printing operation is carried out (S217; Yes, S225). If the re-operation option-key is actuated after the input condition is changed, the control routine is returned to photographing-condition-input-screen-2 (S219; Yes, S205). If the re-menu option-key is actuated after the menu is changed, the control routine is returned to the menu selection screen (initial screen) (S221; Yes, S101). If the completion option-key is actuated, the application ends (S223; Yes). If no option-key is actuated, the detailed-lens-data indication is maintained (S223; No, S215).
< Menu-3 >
The operation when menu-3 is selected in the initial screen will be discussed below. When menu-3 is selected, photographing-condition-input-screen-3 is displayed (S101, S103; No, S131; No, S133; Yes, S301, S305). Photographing-condition-input-screen-3 is adapted for entry of a desired focal length, object size, format size of the camera, diaphragm system, and direction (vertical or horizontal) in which the object image is to be viewed in the picture frame.
Once the data has been entered (S305), the stored photographing-lens-data is retrieved in order to select and display a list of the photographing lenses whose focal lengths are equal to or close to the desired focal length, in order starting from the closest focal length value (S309). The indicated list includes an option-key which allows the user to avoid selecting a photographing lens. If this option-key is actuated, the application ends (S311; No); alternatively, it is possible to return the control to step S101, as in the case where menu-1 is selected.
If an appropriate photographing lens is selected from the indicated list of the photographing lenses, the object distance for the selected photographing lens is calculated (S311; Yes, S313). Namely, the object distance at which the object to be photographed fills the picture frame of the camera is calculated in accordance with the focal length of the selected photographing lens, the input object size, the format size of the camera, and the direction in which the object image is to be viewed in the picture frame. Thereafter, the detailed-lens-data including the object distance is displayed (S315). The content of the display is basically the same as that in menu-1 or menu-2, but unlike menu-1, the photographing conditions with priority given to the desired photographing distance are not indicated.
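The fill-the-frame distance can likewise be sketched under a thin-lens assumption (not a formula stated in the patent): solving the magnification relation m = format/object = f/(d − f) for the object distance d.

```python
def fill_frame_distance(f_mm: float, object_mm: float,
                        format_mm: float) -> float:
    """Object distance at which an object of size object_mm exactly
    fills a frame edge of size format_mm, for focal length f_mm.
    Thin-lens assumption: m = format/object = f/(d - f), hence
    d = f * (object/format + 1)."""
    return f_mm * (object_mm / format_mm + 1.0)
```

The direction input would decide whether the horizontal or the vertical dimension of the format is used as `format_mm`.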
The detailed-lens-data screen displays a diaphragm value F input window, a print option-key, a re-operation option-key which is actuatable after the input conditions are changed, a re-menu option-key which is actuatable after the menu is changed, and a completion option-key. The user can actuate the above-mentioned option-keys and input the diaphragm value F via the input unit 15.
If the diaphragm value F is input or the input diaphragm value F is changed, the calculation operation to obtain the depth of field is carried out to thereby obtain and indicate the near point and the far point (S316; Yes, S324). If the print option-key is actuated, the printing operation is carried out (S317; Yes, S325). If the re-operation option-key is actuated after the input condition is changed, the control routine is returned to photographing-condition-input-screen-3 (S319; Yes, S305). If the re-menu option-key is actuated after the menu is changed, the control routine is returned to the menu selection screen (initial screen) (S321; Yes, S101). If the completion option-key is actuated, the application ends (S323; Yes). If no option-key is actuated, the detailed-lens-data indication is maintained (S323; No, S315).
In the selection method and apparatus of the present invention, as mentioned above, the user inputs and selects predetermined data in accordance with the display on the screen. Consequently, the user can learn an appropriate combination of the CCTV camera and the photographing lens to meet the predetermined conditions. The user also can learn the photographing conditions when the appropriate combination is determined.
Claims (23)
1. A selection method for a lens and TV camera, the method comprising the steps of:
storing the specifications of a plurality of lenses and TV cameras in a memory, so that an optimal combination of a lens and a TV camera to meet predetermined conditions can be selected,
designating a plurality of photographing conditions;
retrieving from said memory lenses which meet said photographing conditions;
arranging said retrieved lenses for allowing selection of one lens from among said lenses which meets said photographing conditions.
2. A selection method according to claim 1, further comprising the steps of:
calculating specific photographing conditions for said selected one lens; and
displaying the specific photographing conditions thus calculated.
3. A selection method according to claim 2 in which said retrieved lenses are displayed with said specific photographing conditions.
4. A selection method according to any preceding claim, wherein said plurality of photographing conditions include a size of the picture frame of the camera, an object distance, a size of the object to be photographed, and a direction in which the object is to be photographed within the picture frame.
5. A selection method according to any preceding claim, wherein said plurality of photographing conditions include a diaphragm system.
6. A selection method according to any preceding claim, wherein the specification of the lenses includes a type of the lens, a focal length of the lens, a diaphragm range, a lens mount, and a diaphragm system.
7. A selection method according to any preceding claim, wherein the specification of the camera includes a type of the camera, a size of the picture frame, and the type of a camera mount.
8. A selection method according to any preceding claim, further comprising a step of selecting groups of photographing conditions to be designated prior to designating the photographing conditions.
9. A selection method according to claim 8, wherein a first group of photographing conditions comprises a size of the picture frame of the camera, an object distance, a size of the object to be photographed, a diaphragm system, and a direction in which the object is photographed within the picture frame; and wherein the step of retrieving and arranging the lenses comprises calculating an appropriate focal length and retrieving and displaying the lenses whose focal lengths are equal to or close to the appropriate focal length so as to select the lenses.
10. A selection method according to claim 8, wherein a second group of said photographing conditions comprises a size of the picture frame of the camera, an object distance, a focal length, and a diaphragm system, and wherein the step of retrieving and arranging the lenses comprises retrieving and displaying the lenses whose focal lengths are equal to or close to said focal length so as to select the lenses.
11. A selection method according to claim 8, wherein a third group of photographing conditions comprises a size of said picture frame of said camera, a focal length, an object size, a diaphragm system, and a direction in which the object is to be photographed within the picture frame, and wherein the step of retrieving and arranging the lenses comprises retrieving and displaying the lenses whose focal lengths are equal to or close to said appropriate focal length so as to select the lenses.
12. A selection method according to claim 11, further comprising the step of calculating said photographing conditions in accordance with at least one of said focal lengths of said selected photographing lens, the size of the picture surface of the selected camera, the object distance, the size of the object to be photographed, and the direction in which the object is to be photographed within the picture frame.
13. A selection method according to claim 12, further comprising the step of calculating the size of a largest object which can be photographed in the longitudinal and lateral directions, in accordance with the focal length of the selected photographing lens, the desired object distance, the format size of the camera, and the direction in which the object is to be photographed within the picture frame.
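The calculation of claim 13 follows directly from the thin-lens magnification at the designated distance. A minimal sketch, assuming distances measured from the lens and hypothetical names:

```python
def largest_object(focal_mm, distance_mm, format_mm):
    """Largest object dimension (mm) that fits the given format
    dimension when focused at distance_mm (thin-lens model)."""
    m = focal_mm / (distance_mm - focal_mm)  # magnification at this distance
    return format_mm / m

# 25 mm lens, object 2 m away, assumed 1/2-inch format (6.4 x 4.8 mm):
width  = largest_object(25.0, 2000.0, 6.4)  # lateral direction
height = largest_object(25.0, 2000.0, 4.8)  # longitudinal direction
```

Under these assumptions the largest photographable object is roughly 506 mm by 379 mm.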
14. A selection method according to claim 11, further comprising a step of calculating the object distance when the object is photographed within the picture frame of the camera, in accordance with the focal length of the selected photographing lens, the object size, the format size of the camera, and the direction in which the object is to be photographed within the picture frame.
15. A selection method according to any preceding claim, further comprising a step of calculating and displaying the depth of field in accordance with the selected photographing lens, the designated photographing conditions, and a newly designated diaphragm value after one of the retrieved photographing lenses has been selected.
16. A selection method according to claim 3, further comprising the step of indicating the type of the selected photographing lens, the type of the selected camera, the designated photographing conditions, and the calculation result via a monitor.
17. A selection method according to claim 3, further comprising the step of printing said type of said selected photographing lens, said type of said selected camera, said designated photographing conditions, and said calculation result via a printer.
18. A selection method according to claim 3, wherein the step of displaying data includes indicating the photographing range of the object for said picture frame, the relative distance between the camera and the object, the object size, the necessity and the type of said close-up ring, and the calculated depth of field.
19. A selection method according to claim 3, wherein the step of displaying data includes indicating the photographing range of the object for the picture surface, the relative distance between the camera and the object, the object size, and the calculated depth of field.
20. A selection apparatus in which specifications of a plurality of lenses and TV cameras are stored in a memory, so that an optimal combination of a lens and a TV camera to meet predetermined conditions can be selected and displayed on a monitor, comprising:
an input unit for inputting a plurality of photographing conditions;
a retrieval means for retrieving the lenses which meet the photographing conditions input by said input unit and for displaying them on said monitor; and
a calculation unit for calculating photographing conditions for a lens selected from among those displayed on said monitor and for displaying the calculation result on said monitor.
21. A selection method for a lens and camera body, the method comprising the steps of:
storing data of the specifications of a plurality of lenses and camera bodies in a memory;
designating a plurality of photographing conditions;
comparing said photographing conditions with said stored data; and
determining and displaying a plurality of best matches of said conditions and data for allowing a selection from said best matches to be made.
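The steps of claim 21 can be sketched end to end: store specifications, designate conditions, compare, and rank the best matches. Everything below (record fields, mount codes, the scoring rule) is a hypothetical illustration, not the patented implementation:

```python
# Hypothetical specification records standing in for the stored memory data.
LENSES = [
    {"name": "8mm f/1.4",  "focal": 8.0,  "mount": "C"},
    {"name": "25mm f/1.4", "focal": 25.0, "mount": "C"},
    {"name": "50mm f/1.8", "focal": 50.0, "mount": "CS"},
]
CAMERAS = [
    {"name": "1/2in camera", "frame_w": 6.4, "mount": "C"},
    {"name": "1/3in camera", "frame_w": 4.8, "mount": "CS"},
]

def best_matches(target_focal, n=3):
    """Rank mount-compatible lens/camera pairs by how closely the
    lens focal length matches the designated target focal length."""
    pairs = [(lens, cam) for lens in LENSES for cam in CAMERAS
             if lens["mount"] == cam["mount"]]
    pairs.sort(key=lambda p: abs(p[0]["focal"] - target_focal))
    return pairs[:n]
```

The ranked list would then be displayed so that the operator can select one combination from among the best matches.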
22. A selection method substantially as hereinbefore described with reference to the drawings.
23. A selection apparatus substantially as hereinbefore described with reference to the drawings.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP05517897A JP3371279B2 (en) | 1997-03-10 | 1997-03-10 | Method and apparatus for selecting lens for TV camera |
Publications (2)
Publication Number | Publication Date |
---|---|
GB9805140D0 GB9805140D0 (en) | 1998-05-06 |
GB2323238A true GB2323238A (en) | 1998-09-16 |
Family
ID=12991479
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9805140A Withdrawn GB2323238A (en) | 1997-03-10 | 1998-03-10 | Selecting lens and television camera combination for designated conditions |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP3371279B2 (en) |
GB (1) | GB2323238A (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005070635A (en) * | 2003-08-27 | 2005-03-17 | Fujinon Corp | Lens information display device |
JP2010268031A (en) * | 2009-05-12 | 2010-11-25 | Olympus Corp | Digital camera |
JP2010268029A (en) * | 2009-05-12 | 2010-11-25 | Olympus Corp | Digital camera |
JP5586071B2 (en) * | 2012-02-17 | 2014-09-10 | Necソリューションイノベータ株式会社 | Imaging support apparatus, imaging support method, and program |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0497376A1 (en) * | 1991-02-01 | 1992-08-05 | Canon Kabushiki Kaisha | Interchangeable lens type camera apparatus |
1997
- 1997-03-10: JP application JP05517897A filed; patent JP3371279B2 granted (not active: Expired - Fee Related)
1998
- 1998-03-10: GB application GB9805140A filed; published as GB2323238A (not active: Withdrawn)
Cited By (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20160112628A1 (en) * | 2014-10-20 | 2016-04-21 | Symbol Technologies, Inc. | Apparatus and method for specifying and aiming cameras at shelves |
US9706105B2 (en) * | 2014-10-20 | 2017-07-11 | Symbol Technologies, Llc | Apparatus and method for specifying and aiming cameras at shelves |
US9961256B2 (en) * | 2014-10-20 | 2018-05-01 | Symbol Technologies, Llc | Apparatus and method for specifying and aiming cameras at shelves |
US10352689B2 (en) | 2016-01-28 | 2019-07-16 | Symbol Technologies, Llc | Methods and systems for high precision locationing with depth values |
US11042161B2 (en) | 2016-11-16 | 2021-06-22 | Symbol Technologies, Llc | Navigation control method and apparatus in a mobile automation system |
US11093896B2 (en) | 2017-05-01 | 2021-08-17 | Symbol Technologies, Llc | Product status detection system |
US10949798B2 (en) | 2017-05-01 | 2021-03-16 | Symbol Technologies, Llc | Multimodal localization and mapping for a mobile automation apparatus |
US10591918B2 (en) | 2017-05-01 | 2020-03-17 | Symbol Technologies, Llc | Fixed segmented lattice planning for a mobile automation apparatus |
US10663590B2 (en) | 2017-05-01 | 2020-05-26 | Symbol Technologies, Llc | Device and method for merging lidar data |
US10726273B2 (en) | 2017-05-01 | 2020-07-28 | Symbol Technologies, Llc | Method and apparatus for shelf feature and object placement detection from shelf images |
US11978011B2 (en) | 2017-05-01 | 2024-05-07 | Symbol Technologies, Llc | Method and apparatus for object status detection |
US11367092B2 (en) | 2017-05-01 | 2022-06-21 | Symbol Technologies, Llc | Method and apparatus for extracting and processing price text from an image set |
US11449059B2 (en) | 2017-05-01 | 2022-09-20 | Symbol Technologies, Llc | Obstacle detection for a mobile automation apparatus |
US10505057B2 (en) | 2017-05-01 | 2019-12-10 | Symbol Technologies, Llc | Device and method for operating cameras and light sources wherein parasitic reflections from a paired light source are not reflected into the paired camera |
US11600084B2 (en) | 2017-05-05 | 2023-03-07 | Symbol Technologies, Llc | Method and apparatus for detecting and interpreting price label text |
US10572763B2 (en) | 2017-09-07 | 2020-02-25 | Symbol Technologies, Llc | Method and apparatus for support surface edge detection |
US10521914B2 (en) | 2017-09-07 | 2019-12-31 | Symbol Technologies, Llc | Multi-sensor object recognition system and method |
US11327504B2 (en) | 2018-04-05 | 2022-05-10 | Symbol Technologies, Llc | Method, system and apparatus for mobile automation apparatus localization |
US10823572B2 (en) | 2018-04-05 | 2020-11-03 | Symbol Technologies, Llc | Method, system and apparatus for generating navigational data |
US10809078B2 (en) | 2018-04-05 | 2020-10-20 | Symbol Technologies, Llc | Method, system and apparatus for dynamic path generation |
US10740911B2 (en) | 2018-04-05 | 2020-08-11 | Symbol Technologies, Llc | Method, system and apparatus for correcting translucency artifacts in data representing a support structure |
US11010920B2 (en) | 2018-10-05 | 2021-05-18 | Zebra Technologies Corporation | Method, system and apparatus for object detection in point clouds |
US11506483B2 (en) | 2018-10-05 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for support structure depth determination |
US11090811B2 (en) | 2018-11-13 | 2021-08-17 | Zebra Technologies Corporation | Method and apparatus for labeling of support structures |
US11003188B2 (en) | 2018-11-13 | 2021-05-11 | Zebra Technologies Corporation | Method, system and apparatus for obstacle handling in navigational path generation |
US11079240B2 (en) | 2018-12-07 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for adaptive particle filter localization |
US11416000B2 (en) | 2018-12-07 | 2022-08-16 | Zebra Technologies Corporation | Method and apparatus for navigational ray tracing |
US11100303B2 (en) | 2018-12-10 | 2021-08-24 | Zebra Technologies Corporation | Method, system and apparatus for auxiliary label detection and association |
US11015938B2 (en) | 2018-12-12 | 2021-05-25 | Zebra Technologies Corporation | Method, system and apparatus for navigational assistance |
US10731970B2 (en) | 2018-12-13 | 2020-08-04 | Zebra Technologies Corporation | Method, system and apparatus for support structure detection |
US11592826B2 (en) | 2018-12-28 | 2023-02-28 | Zebra Technologies Corporation | Method, system and apparatus for dynamic loop closure in mapping trajectories |
US11080566B2 (en) | 2019-06-03 | 2021-08-03 | Zebra Technologies Corporation | Method, system and apparatus for gap detection in support structures with peg regions |
US11960286B2 (en) | 2019-06-03 | 2024-04-16 | Zebra Technologies Corporation | Method, system and apparatus for dynamic task sequencing |
US11200677B2 (en) | 2019-06-03 | 2021-12-14 | Zebra Technologies Corporation | Method, system and apparatus for shelf edge detection |
US11341663B2 (en) | 2019-06-03 | 2022-05-24 | Zebra Technologies Corporation | Method, system and apparatus for detecting support structure obstructions |
US11402846B2 (en) | 2019-06-03 | 2022-08-02 | Zebra Technologies Corporation | Method, system and apparatus for mitigating data capture light leakage |
US11151743B2 (en) | 2019-06-03 | 2021-10-19 | Zebra Technologies Corporation | Method, system and apparatus for end of aisle detection |
US11662739B2 (en) | 2019-06-03 | 2023-05-30 | Zebra Technologies Corporation | Method, system and apparatus for adaptive ceiling-based localization |
US11507103B2 (en) | 2019-12-04 | 2022-11-22 | Zebra Technologies Corporation | Method, system and apparatus for localization-based historical obstacle handling |
US11107238B2 (en) | 2019-12-13 | 2021-08-31 | Zebra Technologies Corporation | Method, system and apparatus for detecting item facings |
US11822333B2 (en) | 2020-03-30 | 2023-11-21 | Zebra Technologies Corporation | Method, system and apparatus for data capture illumination control |
US11450024B2 (en) | 2020-07-17 | 2022-09-20 | Zebra Technologies Corporation | Mixed depth object detection |
US11593915B2 (en) | 2020-10-21 | 2023-02-28 | Zebra Technologies Corporation | Parallax-tolerant panoramic image generation |
US11392891B2 (en) | 2020-11-03 | 2022-07-19 | Zebra Technologies Corporation | Item placement detection and optimization in material handling systems |
US11847832B2 (en) | 2020-11-11 | 2023-12-19 | Zebra Technologies Corporation | Object classification for autonomous navigation systems |
US11954882B2 (en) | 2021-06-17 | 2024-04-09 | Zebra Technologies Corporation | Feature-based georegistration for mobile computing devices |
Also Published As
Publication number | Publication date |
---|---|
JP3371279B2 (en) | 2003-01-27 |
GB9805140D0 (en) | 1998-05-06 |
JPH10257361A (en) | 1998-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2323238A (en) | Selecting lens and television camera combination for designated conditions | |
US5500700A (en) | Method of creating a composite print including the user's image | |
US6879342B1 (en) | Electronic camera with image review | |
EP0172368B1 (en) | Method for displaying an image | |
US4845634A (en) | Product information network system | |
US5777753A (en) | Method and apparatus for maximizing the number of radiological images printed on sheet of film | |
CN100394762C (en) | Image processing apparatus | |
US6937321B2 (en) | Distance measuring method and image input device with distance measuring function | |
DE60206320T2 (en) | Apparatus and method for perspective projection imaging | |
US5685002A (en) | Image processing system capable of generating a multi-picture image | |
JP2011238283A (en) | Interactive processing system | |
CN100367756C (en) | Apparatus for processing photographic image | |
EP0804031A3 (en) | Image display apparatus, camera control apparatus and method | |
EP1844605A1 (en) | Digital imaging system with digital zoom warning | |
EP1403758A2 (en) | Image printing preview device and image forming apparatus | |
US5319403A (en) | Camera capable of providing printing information | |
US5666471A (en) | Image processing apparatus for dividing images for printing | |
EP1185073B1 (en) | Image printing apparatus | |
EP0531030B1 (en) | Method and device for image makeup | |
EP2092743A2 (en) | Distance camera having a memory module | |
US5410639A (en) | Automatic installation for the composition and the continuous printing of small texts | |
JP2007241793A (en) | Id photo photographing device | |
US5572656A (en) | Portrait drawing apparatus having image data input function | |
JP4406100B2 (en) | Interactive processing system | |
JP2005111945A (en) | Photographic method and photographic printer |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |