WO2012008856A1 - A camera arrangement - Google Patents

A camera arrangement

Info

Publication number
WO2012008856A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
sensing devices
camera
image sensing
view
Prior art date
Application number
PCT/NZ2011/000132
Other languages
French (fr)
Inventor
Therese Bruce Glass
Michael Walter Glass
Walter Michael Glass
Darren Grant Lucinsky
Original Assignee
Therese Bruce Glass
Walter Michael Glass
Darren Grant Lucinsky
Priority date
Filing date
Publication date
Application filed by Therese Bruce Glass, Walter Michael Glass, Darren Grant Lucinsky
Publication of WO2012008856A1 publication Critical patent/WO2012008856A1/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

A display system has a mounting structure with a plurality of spatially aligned cameras, each capable of outputting an image, and a control system configured to receive the output images from the image sensing devices, combine them and produce a final image for displaying on a display device.

Description

A CAMERA ARRANGEMENT
FIELD OF THE INVENTION
The present invention broadly relates to an interactive electronics camera display system for use in a retail environment and in particular to an electronic mirror apparatus and method of operation.
BACKGROUND
In a conventional commercial fitting room environment, one of the primary concerns with the purchase of clothing is a full view from the back, i.e. the rear view. Many high end fitting rooms have used multiple mirrors to obtain this view; the simplest solution being a second mirror behind the subject. A more sophisticated solution is an adjustable mirror or mirrors on hinges. However, these systems have several drawbacks: the subject is often in the way of their own view, leading to some extremely distorted poses by the subject. This defeats obtaining a "normal" rear view. Hinged mirrors also only capture one side and require the subject to move around to get the view they require.
In addition, mirrors are extremely expensive and a potential safety hazard - especially moveable mirrors suspended on hinges. Typically a fitting room also has spatial constraints that limit the use (and quantity) of floating or angled mirrors. Consequently, there has never been an accurate, undistorted, easy to view image from the back that is viewable by the subject.
Use of an electronic camera with interactive display technology is becoming a prevalent replacement for the conventional mirror arrangement. For example, a camera may be placed in front of a person and connected to a display that the person also faces. When a full portrait view of a person is required, this method has the disadvantage that it requires a wide angle lens, or a substantial distance between the person and the camera. A wide-angle lens can lead to optical distortion commonly known as the "fisheye" effect, and conventional fitting rooms do not allow the space for such an arrangement.
Other known camera systems for capturing wide images have sought to overcome the fisheye effect by providing several cameras aimed at different portions of a person's body. The edges of the captured images are then adjoined to produce a larger image. Inevitably this method creates optical distortion in the final image, known as parallax error.
SUMMARY OF THE INVENTION
It is therefore an objective of the present invention to provide an electronic camera system that allows a close-up and wide image and goes some way towards overcoming or improving upon the abovementioned disadvantages, or to at least provide the public with a useful choice.
In a first aspect the invention may broadly be said to consist in a display system comprising a mounting structure, a plurality of spatially aligned image sensing devices mounted on the mounting structure, each said image sensing device capable of outputting a digital output image, and a control system configured to receive the output images from the plurality of spatially separated image sensing devices, combine the output images to produce a final image, and send the output image to a display device.
Preferably the system further comprises a second mounting structure, each structure having a plurality of spatially aligned image sensing devices mounted thereon.
Preferably the mounting structures are located at the front and rear of a fitting room.
Preferably the control system further comprises a control input.
Preferably the control input is used to select one or more of the desired output images to be sent to the display device.
Preferably the controller combines the output images by truncating the output images and abutting the truncated images together.
Preferably the display system is used in a fitting room.
Preferably the image sensing devices are a camera capable of outputting a digital representation of an image.
Preferably the image sensing devices are located a distance apart that substantially minimises overlapping field of view.
Preferably the system further comprises a lighting system.
Preferably the control input is used to selectively control the lighting system.
Preferably the lighting system is used to selectively control the ambient colour balance of the images captured by the image sensing devices.
Preferably the control system is further configured to manipulate the output images of the image sensing devices to affect the colour balance of each image.
Preferably the image sensing devices are vertically aligned.
Preferably the control system is configured to facilitate storage or transmission of the output images, or final image or both.
In another aspect the invention may broadly be said to consist in a method of producing a close- up and wide angle view of a subject comprising providing a control system, the control system configured to perform the steps of: acquiring at least a first image and a second image, wherein a portion of the first and second images have a common field of view, adjusting at least the first image to reduce the common field of view, combining at least the first image and the second image to produce an output image and displaying the output image on a screen.
Preferably the step of adjusting includes truncating the common field of view of at least the first image.
Preferably the step of adjusting includes truncating at least part of a common field of view of the first and second image.
Preferably images are recorded by image sensing devices.
Preferably the first and the second images are recorded by a first and a second image sensing device, respectively.
Preferably the image sensing devices are a camera capable of outputting a digital representation of an image.
Preferably the image sensing devices are located a distance apart to minimise the overlapping field of view.
Preferably the image sensing devices are arranged on a mounting structure.
Preferably the image sensing devices are located in a fitting room.
Preferably the image sensing devices are spatially arranged to minimise overlapping field of view when capturing an object located a distance of less than 0.5 metres away.
Preferably the field of view is at least 57 degrees.
Preferably an arrangement of image sensing devices captures a combined height of approximately 2 metres at a distance of approximately 500mm.
In another aspect the invention may broadly be said to consist in a camera mounting apparatus comprising an elongate mounting structure, a plurality of image sensing devices mounted to the mounting structure, wherein each camera device is spatially arranged to minimise overlapping fields of view at a close range.
Preferably the image sensing devices are spatially arranged to minimise overlapping fields of view when capturing an object located a distance of less than two metres away.
Preferably the field of view is at least 57 degrees.
Preferably the plurality of image sensing devices is three image sensing devices, the devices arranged to capture a combined height of approximately 2 metres at a distance of approximately 0.5 metres.
In another aspect the invention may broadly be said to consist in an image processing method for a multiple camera system comprising configuring a control system to perform the steps of:
receiving a plurality of images from a plurality of spatially aligned image sensing devices, wherein each image sensing device has a field of view that overlaps the field of view of an adjacent image sensing device, truncating at least part of the overlapping field of view from each image and adjoining the truncated images to form a final image.
In another aspect the invention may broadly be said to consist in a method of calibrating output data from a plurality of image sensing devices, comprising aligning a plurality of spatially separated image sensing devices providing a reference image in front of the image sensing devices, configuring a controller to: receive an image from each of the image sensing devices, and assemble each of said images to produce an output image, wherein the output image is substantially representative of said reference image.
Preferably the step of assembling the image is performed by image truncation and image offset manipulation.
Preferably the step of image offset manipulation includes vertical and horizontal movement of at least one received image relative to an adjacent received image.
Preferably the reference image has a plurality of lines extending in at least vertical and diagonal directions.
In another aspect the invention may broadly be said to consist in a method of calculating the quantity of cameras required in a multi-camera display device comprising: determining a field of view angle for each camera, determining a desired maximum height of an object to be resolved, determining a desired minimum distance between the object and the cameras, determining a percentage of the output image that can be overlapped, determining the quantity of cameras.
Preferably the maximum desired height is approximately two metres.
Preferably the minimum distance between the object and the cameras is approximately 400mm.
Preferably the percentage of the output image that can be overlapped is up to 25%.
The term "comprising" as used in this specification means "consisting at least in part of. When interpreting each statement in this specification that includes the term "comprising", features other than that or those prefaced by the term may also be present. Related terms such as "comprise" and "comprises" are to be interpreted in the same manner.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 shows a schematic view of the components of a preferred embodiment of the invention.
Figure 2 shows an arrangement of three cameras and the resulting field of view.
Figure 3 illustrates an example of a person in a fitting room having two camera arrays and a display.
Figure 4 shows a detailed view of a camera stick including lighting and range sensors.
Figure 5 shows an example of a screen having a calibration datum.
Figure 6 illustrates an example of the images received by each camera.
Figure 7 illustrates an example of received misaligned images from three cameras.
Appendix A provides equations for calculating physical requirements used by the camera stick.
Equation 1: Distance to subject using tangent. Equation 2: Distance to subject using cotangent.
Equation 3: Distance to subject including overlap percentage.
Equation 4a: Maximum camera capture height.
Equation 4: Camera capture height including overlap percentage.
Equation 5a: Cameras required.
Equation 5: Cameras required including overlap percentage. Equation 6: Minimum distance between cameras. Equation 7a: Overlap percentage.
Equation 7: Overlap percentage using distance to subject.
Equation 8: Total camera capture height including overlap percentage.
Appendix B provides an example calculation using the equations of Appendix A.
Appendix C provides a calculation with four cameras vertically spaced 525mm apart.
Appendix D provides a calculation with a five camera system vertically spaced 420mm apart.
DETAILED DESCRIPTION
Preferred embodiments of the invention will now be described. With reference to Figure 1, a schematic of the components of the preferred system is shown. A plurality of image sensing devices 10 are mounted on a mounting structure 40 in a substantially planar alignment. Each image sensing device is preferably a camera capable of outputting a digital representation of an image. Each of the cameras 10 is spatially positioned on the structure 40 and aimed in a direction substantially tangential to the plane of the cameras. That is, the cameras are spaced apart such that there is substantially no overlap in field of view at close proximity. In this context, close proximity is considered to be within 500mm. However, the field of view of each camera 10 is most preferably arranged so that there is a slight overlap or abutment in fields of view with the wall to subject distance dictated by fitting room dimensions. The camera outputs provide a plurality of images that can be merged together to produce a wide angle view of a close range object without requiring the use of special lenses or introducing image distortion. The arrangement of cameras 10 on the mounting structure 40 is hereinafter referred to as a 'camera stick' 16.
The camera stick 16 has a series of vertically arranged cameras to enable a wide angle image to be obtained. Preferably each camera is an electronic camera capable of outputting a digital signal representative of the image captured in real time. Each camera captures a segment of the total subject that is desired to be resolved. Preferably each camera is centrally located relative to its particular segment. A control system 20 is connected to each of the cameras and combines the images from each camera to produce a single image for communication to a display 30.
Each camera output is connected to the control system 20 via an appropriate connection 11. The control system 20 is programmed to receive image information from each camera in the camera stick, process the image information, and output a single image. Preferably the control system 20 provides the necessary processing such that an image can be shown on the display 30 in real time. That is, the processing occurs in a short enough period of time such that a person viewing the image from the cameras interprets the image to be a live broadcast from the cameras.
The control system may optionally include a communications interface 21 to relay the image data to a remote location or memory device. Suitable data communications protocols include RS232 Serial Link, Centronics Parallel Link, USB, Bluetooth, Wi-Fi and the like.
Preferably the output image is provided to a viewing device 30 via an appropriate connection 12. The viewing device 30 is preferably a screen suitable for viewing wide angle images in a portrait arrangement. Preferably the control system 20 is a microprocessor or personal computer, but may also include other processing devices where appropriate. Such devices include display drivers and image processors. Such devices may be used to ease processor load, reduce costs where dedicated processing functions are readily performed by widely available hardware devices, or where localised processing would ease the amount of bulk data required to be transmitted. The control system 20 may be located separately from the camera stick 40 and display 30, or the components may be integrated into a standalone device.
Figure 2 illustrates an arrangement of three cameras in the camera stick and the resulting field of view. Each camera has a predetermined field of view, usually an angle of approximately 60 degrees. However, the field of view is preferably at least 57 degrees. Each camera is located to provide a small overlap region between adjacent fields of view. Therefore, for a desired total capture height a particular number of cameras are specified based on particular design criteria. Design criteria include the camera field of view angle, the minimum distance to an object to be resolved, and the height of an object to be resolved.
For a mirror application in a retail fitting room, a subject is typically in close proximity to the camera stick. Therefore the distance between the camera stick and the subject is a critical parameter in determining the requirements to ensure a full portrait view is captured. The distance between the camera stick and subject and the maximum subject height will determine the minimum room dimensions and/or the number of cameras required on the camera stick.
Appendix A provides equations that are computed by the control system 20 to calculate the span of the cameras' fields of view based on specific design constraints. Appendix A, Equation 7 refers to the image overlap percentage (OverLap%) which is a measure of how much overlap there is between adjacent camera images based on the distance to subject. For example, a 10% overlap indicates that 10% of an image is shared with the adjacent image. The overlap percentage is calculated by the control system 20 and used to determine the number of common pixels from each camera image. As the camera stick only requires a single full image without overlap, the control system 20 may truncate half the overlapping pixels from each image before the images are joined together. This method eliminates image distortion from including the same part of the subject twice, and also substantially eliminates parallax effects that occur from viewing one subject from two adjacent viewing positions. It should be appreciated that the control system may implement several other image processing techniques to produce the same or a similar image stitching effect.
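By way of illustration only (this sketch is not part of the original specification), the truncate-and-abut step described above can be expressed in Python as follows. The function name, the use of NumPy arrays and the treatment of OverLap% as a fraction of the frame height are assumptions.

```python
import numpy as np

def stitch_vertical(images, overlap_fraction):
    """Join vertically adjacent portrait frames by cropping half of the shared
    overlap from each side of every seam and abutting the remainders.

    images           -- list of (H, W, 3) arrays ordered top to bottom
    overlap_fraction -- OverLap% expressed as a fraction, e.g. 0.034
    """
    height = images[0].shape[0]
    overlap_px = int(round(height * overlap_fraction))  # shared rows at each seam
    half = overlap_px // 2                              # rows trimmed per image at a seam
    parts = []
    for i, img in enumerate(images):
        top = 0 if i == 0 else half                     # keep the outermost edges intact
        bottom = height if i == len(images) - 1 else height - half
        parts.append(img[top:bottom])
    return np.vstack(parts)
```

With the Appendix B figures (640-pixel-tall frames and an OverLap% of about 3.4%) this trims roughly 11 rows from each image at every seam, consistent with the 21.9 pixels of shared overlap calculated there.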
Appendix B provides an example calculation using the equations of Appendix A. It is shown in this example that with three cameras spaced vertically 630mm apart it is possible to capture an object of height 1911.5mm. However, the minimum distance to subject is 579.5mm with a camera field of view of 57 degrees. If a subject is closer than the minimum distance to subject, gaps appear in the final image as there is a negative image overlap. Appendix C provides an example of a calculation having four cameras vertically spaced 525mm apart. The result is a minimum distance to subject of 482.9mm and the maximum capture height increases to 2118.0mm. Appendix D provides an example of a calculation with a five camera system vertically spaced 420mm apart. The minimum distance to subject becomes 386.3mm with a maximum capture height of 2114.4mm.
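To make the appendix calculations concrete, the following Python sketch (illustrative only; the function and parameter names are not from the specification) evaluates Equations 4, 5a and 6 of Appendix A for the Appendix B inputs.

```python
import math

def camera_stick_geometry(total_height_mm, fov_deg, distance_mm, overlap=0.04):
    """Size a camera stick from the Appendix A relationships.

    total_height_mm -- subject height to be resolved (Total_Capture_Height)
    fov_deg         -- per-camera field of view (Field_of_View)
    distance_mm     -- distance from the camera stick to the subject
    overlap         -- allowed image overlap (OverLap%) as a fraction
    """
    half_fov = math.radians(fov_deg) / 2.0
    # Equation 4: per-camera capture height allowing for overlap
    capture_height = 2.0 * distance_mm * math.tan(half_fov) / (1.0 + overlap)
    # Equation 5a: cameras needed to span the subject
    cameras_required = total_height_mm / capture_height
    # Equation 6: minimum vertical spacing between cameras
    min_spacing = capture_height
    return capture_height, cameras_required, min_spacing

# Appendix B inputs: 1800mm subject, 57 degree lenses, 600mm to subject, 4% overlap
height, cameras, spacing = camera_stick_geometry(1800, 57, 600, 0.04)
print(round(height, 1), round(cameras, 1), round(spacing, 1))  # approx. 626.5, 2.9, 626.5
```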
Camera lighting, or more specifically the lack of lighting, and the lighting colour temperature are important considerations for a person using a fitting room. Commercial or retail environments often use fluorescent lighting which can result in colour variations when using cameras, as economic electronic capture systems are only an approximation of human eyesight. Consequently the subject can be "washed out", "yellowed" or have some other type of colour variation. These effects may not produce a complimentary image and may result in the subject rejecting a potential purchase if the actual colour is not truly represented. Similarly, a colour that looks acceptable in the changing room may not look acceptable in the outside world. The ideal solution is to have a variety of lighting options that the subject may choose - e.g. sunlight, or other options such as fluorescent, incandescent, or true white.
Figure 3 shows a detailed view of a camera stick. The stick has three cameras 10 equally interspersed along a backing structure 40. A series of lights 63 are also located on backing structure 40. Preferably the lights 63 are bright LEDs distributed evenly along the length of the backing structure 40; however, they may also be located remotely from the camera stick 16, such as on the ceiling of the fitting room. In addition to providing a primary lighting source, the LED lighting may optionally include white, natural and/or red, green or blue colour sources. A variation in the relative intensity of the various coloured light sources allows for colour correction and lighting effects to be implemented. Such correction and effects are often useful for simulating outdoor colour tones in contrast to the often used fluorescent lighting of retail stores. The control system may optionally implement a process whereby the images from each of the cameras are analysed for colour content which is subsequently used to control the intensity of various coloured light sources to thereby provide or compensate colour correction.
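The specification does not prescribe a particular colour-analysis algorithm; the Python sketch below assumes a simple grey-world estimate as one possible way of deriving relative red, green and blue LED gains from a captured frame. The function name and the gain limit are illustrative assumptions.

```python
import numpy as np

def led_correction(image, gain_limit=0.25):
    """Estimate relative R, G, B LED intensity gains from the average colour of
    a captured frame (an (H, W, 3) array with values 0-255).

    Gains are clamped so that a strongly coloured garment cannot drive the
    lighting to an extreme.
    """
    means = image.reshape(-1, 3).mean(axis=0)   # average red, green and blue levels
    grey = means.mean()                         # neutral reference level
    gains = grey / np.maximum(means, 1e-6)      # boost the channels the scene lacks
    return np.clip(gains, 1.0 - gain_limit, 1.0 + gain_limit)
```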
A range and/or motion sensor 64 is located on the backing 40 to provide a signal representative of the presence of a person, and the distance of that person from the camera stick. Preferably the signal from the motion sensor is provided to the control system where the control system calculates the desired camera focus, field of view and image overlap ratios to be shown on the display 30. The control system may also use the sensor 64 to detect the absence of a person and initiate a power saving mode. A secondary controller 65 is optionally located on the backing structure 40 to allow local processing of image and control information. A communication interface 66 sends information from any local processor 65, cameras 10 and motion sensors 64 to be received by the control system 20. A front cover 60 is adapted to shield the camera stick 16 to thereby provide protective and aesthetic benefits.
An intended, but not sole use of the present invention is to provide a versatile 'virtual mirror' device for use in a commercial fitting type environment. That is, to display the image of a person in the fitting room that is captured by the camera stick. Advantageously, a rear view of the person can be displayed by placing the camera stick behind the person, or by having an additional camera stick. A further adaptation includes a moveable mount for the camera stick to allow the placement to vary as desired.
Figure 4 illustrates an example of a person 57 in a fitting room 50. A display 53 is located in front of the person 57. A first camera stick 51 is located in front of the person, and a second camera stick 52 is located behind them. A control interface 54 is provided to allow the person 57 to switch the display between the images captured by the first and second camera sticks. The field of view 56 of each camera in the camera stick is shown for reference. The control interface may also be used to control a number of functions, such as lighting, image processing, image storage or transmission and/or the number of views shown simultaneously on the display. The control interface may also be implemented by a touch screen or SixthSense device or other input mechanism. In one embodiment, a display controller in the form of a small industrial style PC or embedded controller implements the functions of the control system 20 by managing the functionality of multiple camera sticks 16, the visual display device 53, the control interface 54, the light sources 63, data transmission, system communication, and any image manipulation or storage.
A commercial/retail establishment is usually noisy (background music, other customers, traffic) and therefore voice activated systems are not currently ideal as identification of discrete voice commands is very difficult. External noise, combined with the processing power required to identify variations in accent, dialect and language, makes voice control currently impractical in most commercial situations. Consequently, any electronic mirror adjustments must be fully automatic, remotely controlled or require a control panel. A fully automatic solution may not give the subject the view they want immediately - they may have to wait, or repeat an action e.g. turn around. Remotes can be lost, stolen, damaged and require battery replacements. The preferred economic solution for a low maintenance commercial environment is to implement a control panel to provide a user interface to the control system. Preferably, vandal resistant and tough sealed switches are used on the control interface, as are switches that promote easy cleaning. A preferred interface to allow a user to direct the control system would be through the use of a touch screen, either integrated into the main viewer, or as a separate device. A further implementation of a functional interface is a SixthSense interface. These systems use hand and finger gestures to control object manipulation in the real world. A SixthSense interface would function well in a commercial (and noisy) environment and provide similar advantages to a touch screen.
The control system 20 may implement the following functional steps to manage the display (a code sketch of this loop follows the list):
1. Initialise a. Test Internal Systems and Communications b. Check Display Device, Control Panel and Camera Sticks present c. Initialise Camera Sticks
2. Idle Mode a. Do Idle Mode Activity (Low Power or Display Image or Display Advertising) b. Poll Camera Sticks until motion detected on Range and Motion Sensors
3. Display Introduction and any Instructions
4. Read Control Panel Settings a. Determine View requirements b. Determine Lighting options c. Poll Camera Sticks if Subject still present, if not go to Step 2 (Idle) d. Determine if Shutdown is selected, go to Step 8
5. Activate Camera Sticks a. Send "Lighting" command to Camera Sticks b. Send "Take Picture" command to Camera Sticks c. Send "Get Picture" command to Camera Sticks d. Download Images from Camera Sticks
6. Display Images on Display Device a. Send Images to External Device if Selected by Control Panel Settings
7. Repeat Process, go to Step 4
8. Shutdown a. Power down Camera Sticks b. Power down Display Device c. Power down Display Controller
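A minimal Python sketch of the display-controller loop above; the display, control panel, camera-stick and external objects and their methods are hypothetical, since the specification does not define these interfaces.

```python
import time

def run_display_controller(display, control_panel, camera_sticks, external=None):
    """Minimal sketch of display-controller Steps 1 to 8 (hypothetical interfaces)."""
    for stick in camera_sticks:                              # Step 1: initialise
        stick.initialise()
    while True:
        while not any(s.motion_detected() for s in camera_sticks):
            display.show_idle()                              # Step 2: idle / advertising
            time.sleep(0.5)
        display.show_instructions()                          # Step 3: introduction
        while True:
            settings = control_panel.read()                  # Step 4: view and lighting
            if settings.shutdown:                            # Step 8: shutdown
                for s in camera_sticks:
                    s.power_down()
                display.power_down()
                return
            if not any(s.motion_detected() for s in camera_sticks):
                break                                        # subject gone, back to idle
            images = [s.capture(settings.lighting) for s in camera_sticks]  # Step 5
            display.show(images[settings.view])              # Step 6: show selected view
            if settings.send_external and external is not None:
                external.send(images)                        # Step 6a: optional export
            # Step 7: repeat from Step 4
```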
The control system 20 may implement the following functional steps to control the camera stick device (the overlap calculation in Step 3 is sketched in code after the list):
1. Initialise a. Initialise Cameras b. Initialise Software Communications
2. Idle Mode a. Power down Cameras b. Power down LEDs c. Wait for Motion Sense on Range and Motion Sensors d. Wait for any Command from Display Controller e. If Shutdown Command go to Step 5
3. Activate Camera Stick a. Activate Range Sensors b. Determine range to Subject at each Sensor c. Calculate Image Overlap at each Sensor d. Deactivate Range Sensors e. Activate LED Lighting from Display Controller Settings f. Take pictures with Cameras g. Download Camera Pictures h. Truncate image overlaps i. Join Images j. Send Image to Display Controller
4. Go to Step 2 (Idle)
5. Shutdown Camera Stick a. Very Low Power Mode - all devices off. b. Wait for Hardware Wake Up Signal from Display Controller, go to Step 1.
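Steps 3b and 3c above derive the image overlap from the measured range to the subject. The following Python sketch applies Equation 7 of Appendix A and, for illustration, assumes the Appendix B camera spacing and a 640-pixel portrait frame; the function names are not from the specification.

```python
import math

def overlap_from_range(range_mm, camera_spacing_mm, fov_deg=57.0):
    """Equation 7: image overlap fraction for a measured distance to the subject."""
    half_fov = math.radians(fov_deg) / 2.0
    return (2.0 * range_mm * math.tan(half_fov)) / camera_spacing_mm - 1.0

def rows_to_trim(range_mm, camera_spacing_mm, frame_height_px=640, fov_deg=57.0):
    """Rows to remove from each image at a seam (Steps 3c and 3h)."""
    overlap = overlap_from_range(range_mm, camera_spacing_mm, fov_deg)
    if overlap <= 0:
        return 0   # subject closer than the minimum distance: gaps appear, nothing to trim
    return int(round(frame_height_px * overlap / 2.0))

# Appendix B figures: 600mm range and 630mm spacing give about 3.4% overlap
print(rows_to_trim(600, 630))   # approx. 11 rows per image at each seam
```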
To calibrate the system, a calibration datum, shown generally in Figure 5, is used. The calibration datum has several vertical lines and several horizontal lines. To align the images received from each camera, the control system matches the lines between adjacent cameras. Figure 6 illustrates an example of the images received by each camera and relayed to the control system. Figure 7 illustrates an example of received misaligned images from three cameras. The disparity between any combination of horizontal, diagonal and vertical datum lines is used to shift the images relative to one another to thereby bring them into both horizontal and vertical alignment. Preferably the camera stick output is calibrated once it is installed into a fitting room or once the cameras are assembled into the backing structure. The controller stores horizontal and vertical offset information for each received image to enable a consistently aligned image to be reproduced on the display when each image has been joined by an image processing technique such as stitching.
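The specification aligns adjacent images by matching the vertical, horizontal and diagonal lines of the calibration datum; as one simple substitute, the Python sketch below recovers the stored horizontal and vertical offsets with a brute-force correlation search over the nominal overlap strip. The function name, the search window and the use of greyscale NumPy arrays are assumptions.

```python
import numpy as np

def seam_offset(upper, lower, overlap_px, search=20):
    """Estimate the (dx, dy) misalignment between two vertically adjacent
    calibration images.

    upper, lower -- greyscale (H, W) arrays from adjacent cameras
    overlap_px   -- nominal number of shared rows at the seam
    search       -- maximum offset, in pixels, tested in each direction
    """
    ref = upper[-overlap_px:, :].astype(float)      # bottom strip of the upper image
    ref = ref - ref.mean()                          # zero-mean for a correlation score
    best, best_score = (0, 0), -np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(lower, (dy, dx), axis=(0, 1))  # crude shift (wraps at edges)
            strip = shifted[:overlap_px, :].astype(float)
            strip = strip - strip.mean()
            score = float((ref * strip).sum())      # higher score means better line-up
            if score > best_score:
                best_score, best = score, (dx, dy)
    return best   # stored per camera and applied before the images are stitched
```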
This invention may also be said broadly to consist in the parts, elements and features referred to or indicated in the specification of the application, individually or collectively, and any or all combinations of any two or more of said parts, elements or features, and where specific integers are mentioned herein which have known equivalents in the art to which this invention relates, such known equivalents are deemed to be incorporated herein as if individually set forth.
APPENDIX A: Equations for Calculating Parameters used by the "Camera Stick"

Trigonometric definition from Diagram 1:
[Definition 1] tan(θ) := opposite / adjacent

Camera definitions from Diagram 2:
[Definition 2] Camera_Capture_Height := 2 · opposite
[Definition 3] Field_of_View := 2 · θ

From Definitions 1, 2 and 3:
[Equation 1] Distance_to_Subject := Camera_Capture_Height / (2 · tan(Field_of_View / 2))

Re-writing Equation 1 using cotangent:
[Equation 2] Distance_to_Subject := (Camera_Capture_Height / 2) · cot(Field_of_View / 2)

Modification of Equation 2 allowing for image overlap (OverLap%) in camera capture height:
[Equation 3] Distance_to_Subject := (Camera_Capture_Height · (1 - OverLap%) / 2) · cot(Field_of_View / 2)

Re-writing Equation 3 in terms of camera capture height and adjustment for overlap:
[Equation 4a] Max_Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2)
[Equation 4] Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2) / (1 + OverLap%)

Expanding Equation 4 to calculate the number of cameras required:
[Equation 5a] Cameras_Required := Total_Capture_Height / Camera_Capture_Height
[Equation 5] Cameras_Required := (Total_Capture_Height · (1 + OverLap%) / (2 · Distance_to_Subject)) · cot(Field_of_View / 2)

Minimum distance between cameras, from Equation 4:
[Equation 6] Min_Distance_between_Cameras := Camera_Capture_Height

Calculation of the OverLap%:
[Equation 7a] OverLap% := (Max_Camera_Capture_Height / Actual_Distance_between_Cameras) - 1
[Equation 7] OverLap% := (2 · Distance_to_Subject · tan(Field_of_View / 2) / Actual_Distance_between_Cameras) - 1

Calculation of the total camera capture height:
Total_Camera_Capture_Height = (Cameras - 1) · Camera_Capture_Height + Max_Camera_Capture_Height
[Equation 8] Total_Camera_Capture_Height = (Cameras + OverLap%) · Camera_Capture_Height

APPENDIX B: Calculations - Distance to Subject 600mm, Subject Height 1800mm
Using a COMedia C328-7640 camera.

Obtain the number of cameras required using Equation 5:
Total_Capture_Height := 1.8 m    Field_of_View := 57 deg
Distance_to_Subject := 600 mm    OverLap% := 4%
Cameras_Required := (Total_Capture_Height · (1 + OverLap%) / (2 · Distance_to_Subject)) · cot(Field_of_View / 2)
Cameras_Required = 2.9

Verify this calculation using Equation 4 and Equation 5a:
Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2) / (1 + OverLap%)
Camera_Capture_Height = 626.5 mm
Cameras_Required := Total_Capture_Height / Camera_Capture_Height
Cameras_Required = 2.9 (Verified)

Obtain the distance between cameras using Equation 6:
Min_Distance_between_Cameras := Camera_Capture_Height
Min_Distance_between_Cameras = 626.5 mm

Recalculate OverLap% for cameras at a spacing of 630mm using Equation 4a and Equation 7a:
Cameras := 3    Actual_Distance_between_Cameras := 630 mm
Max_Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2)
Max_Camera_Capture_Height = 651.5 mm
OverLap% := (Max_Camera_Capture_Height / Actual_Distance_between_Cameras) - 1
OverLap% = 3.4 %

Using a VGA image (640 x 480) rotated so the 640 pixels are vertical:
640 · OverLap% = 21.9 pixels of overlap

Recalculate Camera Capture Height with OverLap% of 3.4% using Equation 4:
Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2) / (1 + OverLap%)
Camera_Capture_Height = 630.0 mm

Recalculate Minimum Distance to Subject with OverLap% of 3.4% using Equation 3:
Distance_to_Subject := (Max_Camera_Capture_Height · (1 - OverLap%) / 2) · cot(Field_of_View / 2)
Distance_to_Subject = 579.5 mm (Within Specification)

Recalculate Total Capture Height with 3 Cameras at Camera Capture Height of 630mm using Equation 8:
(Cameras + OverLap%) · Camera_Capture_Height = 1911.5 mm (Within Specification)
APPENDIX C: Calculations - Distance to Subject 500mm. Subject Height 1800mm
Using a COMedia C328-7640 camera
Obtain the number of cameras required using Equation 5.
Total_Capture_Height := 1.8 m        Field_of_View := 57 deg
Distance_to_Subject := 500 mm        OverLap% := 4%

Cameras_Required := (Total_Capture_Height · (1 + OverLap%) / (2 · Distance_to_Subject)) · cot(Field_of_View / 2)

Cameras_Required = 3.4
Verify this calculation using Equation 4 and Equation 5a
Camera_Capture_Height := (2 · Distance_to_Subject · tan(Field_of_View / 2)) / (1 + OverLap%)

Camera_Capture_Height = 522.1 mm

Cameras_Required := Total_Capture_Height / Camera_Capture_Height

Cameras_Required = 3.4   Verified
Obtain the distance between cameras using Equation 6
Min_Distance_between_Cameras := Camera_Capture_Height

Min_Distance_between_Cameras = 522.1 mm
Recalculate OverLap% for cameras at a spacing of 525mm using Equation 4a and Equation 7a
Cameras := 4        Actual_Distance_between_Cameras := 525 mm

Max_Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2)

Max_Camera_Capture_Height = 543.0 mm

OverLap% := (Max_Camera_Capture_Height / Actual_Distance_between_Cameras) - 1

OverLap% = 3.4 %
Using a VGA image (640 x 480) rotated so the 640 pixels are vertical
640 · OverLap% = 21.9 pixels of overlap
Recalculate Camera Capture Height with OverLap% of 3.4% using Equation 4
Camera_Capture_Height := (2 · Distance_to_Subject · tan(Field_of_View / 2)) / (1 + OverLap%)

Camera_Capture_Height = 525.0 mm

Recalculate Minimum Distance to Subject with OverLap% of 3.4% using Equation 3

Distance_to_Subject := (Max_Camera_Capture_Height · (1 - OverLap%) / 2) · cot(Field_of_View / 2)

Distance_to_Subject = 482.9 mm   (Within Specification)
Recalculate Total Capture Height with 4 Cameras at Camera Capture Height of 525mm using Equation 8
(Cameras + OverLap%) · Camera_Capture_Height = 2118.0 mm   (Within Specification)
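The same helper functions reproduce the Appendix C figures (again an illustration that assumes the sketch given after Appendix A is in scope):

print(round(cameras_required(1800, 500, 57, 0.04), 1))    # 3.4, so 4 cameras are used
print(round(overlap_for_spacing(500, 57, 525) * 100, 1))  # 3.4 % overlap at 525 mm spacing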
APPENDIX D: Calculations - Distance to Subject 400mm. Subject Height 1800mm
Using a COMedia C328-7640 camera
Obtain the number of cameras required using Equation 5.
Total_Capture_Height := 1.8 m        Field_of_View := 57 deg
Distance_to_Subject := 400 mm        OverLap% := 4%

Cameras_Required := (Total_Capture_Height · (1 + OverLap%) / (2 · Distance_to_Subject)) · cot(Field_of_View / 2)

Cameras_Required = 4.3
Verify this calculation using Equation 4 and Equation 5a
Camera_Capture_Height := (2 · Distance_to_Subject · tan(Field_of_View / 2)) / (1 + OverLap%)

Camera_Capture_Height = 417.7 mm

Cameras_Required := Total_Capture_Height / Camera_Capture_Height

Cameras_Required = 4.3   Verified
Obtain the distance between cameras using Equation 6
Min_Distance_between_Cameras := Camera_Capture_Height

Min_Distance_between_Cameras = 417.7 mm
Recalculate OverLap% for cameras at a spacing of 420mm using Equation 4a and Equation 7a
Cameras := 5        Actual_Distance_between_Cameras := 420 mm

Max_Camera_Capture_Height := 2 · Distance_to_Subject · tan(Field_of_View / 2)

Max_Camera_Capture_Height = 434.4 mm

OverLap% := (Max_Camera_Capture_Height / Actual_Distance_between_Cameras) - 1

OverLap% = 3.4 %
Using a VGA image (640 x 480) rotated so the 640 pixels are vertical
640 · OverLap% = 21.9 pixels of overlap
Recalculate Camera Capture Height with OverLap% of 3.4% using Equation 4
Camera_Capture_Height := (2 · Distance_to_Subject · tan(Field_of_View / 2)) / (1 + OverLap%)

Camera_Capture_Height = 420.0 mm

Recalculate Minimum Distance to Subject with OverLap% of 3.4% using Equation 3

Distance_to_Subject := (Max_Camera_Capture_Height · (1 - OverLap%) / 2) · cot(Field_of_View / 2)

Distance_to_Subject = 386.3 mm   (Within Specification)
Recalculate Total Capture Height with 5 Cameras at Camera Capture Height of 420mm using Equation 8
(Cameras + OverLap%) · Camera_Capture_Height = 2114.4 mm   (Within Specification)
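Appendices B to D can be summarised in a single pass. The short loop below is illustrative only; it assumes the helper functions sketched after Appendix A are in scope, a 4% initial overlap allowance and an 1800 mm subject height:

import math

for d in (600, 500, 400):                                  # distance to subject, mm
    n = math.ceil(cameras_required(1800, d, 57, 0.04))     # round up to whole cameras
    spacing = camera_capture_height(d, 57, 0.04)           # minimum camera spacing (Equation 6)
    print(d, n, round(spacing, 1))                         # 600: 3 @ 626.5, 500: 4 @ 522.1, 400: 5 @ 417.7

The appendices then round the spacing up to 630 mm, 525 mm and 420 mm respectively before recalculating the overlap.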

Claims

1. A display system comprising: a mounting structure, a plurality of spatially aligned image sensing devices mounted on the mounting structure, each said image sensing device capable of outputting a digital output image, and a control system configured to: receive the output images from the plurality of spatially separated image sensing devices, combine the output images to produce a final image, and send the output image to a display device.
2. The system as claimed in claim 1, the system further comprising a second mounting structure, each structure having a plurality of spatially aligned image sensing devices mounted thereon.
3. The system as claimed in claim 2 wherein the mounting structures are located at the front and rear of a fitting room.
4. The system as claimed in claims 1 to 3 wherein the control system further comprises a control input.
5. The system as claimed in claim 4 wherein the control input is used to select one or more of the desired output images to be sent to the display device.
6. The system as claimed in claims 1 to 5 wherein the controller combines the output images by truncating the output images and abutting the truncated images together.
7. The system as claimed in claims 1 to 6 wherein the display system is used in a fitting room.
8. The system as claimed in claims 1 to 7 wherein the image sensing devices are cameras capable of outputting a digital representation of an image.
9. The system as claimed in claims 1 to 8 wherein the image sensing devices are located a distance apart that substantially minimises overlapping field of view.
10. The system as claimed in claims 1 to 9 wherein the system further comprises a lighting system.
11. The system as claimed in claim 10 wherein the control input is used to selectively control the lighting system.
12. The system as claimed in claims 1 to 11 wherein the lighting system is used to selectively control the ambient colour balance of the images captured by the image sensing devices.
13. The system as claimed in claims 1 to 12 wherein the control system is further configured to manipulate the output images of the image sensing devices to affect the colour balance of each image.
14. The system as claimed in claims 1 to 13 wherein the image sensing devices are vertically aligned.
15. The system as claimed in claims 1 to 14 wherein the control system is configured to facilitate storage or transmission of the output images, or final image or both.
16. A method of producing a close-up and wide angle view of a subject comprising: providing a control system, the control system configured to perform the steps of: acquiring at least a first image and a second image, wherein a portion of the first and second images have a common field of view, adjusting at least the first image to reduce the common field of view, combining at least the first image and the second image to produce an output image, displaying the output image on a screen.
17. The method as claimed in claim 16 wherein the step of adjusting includes truncating the common field of view of at least the first image.
18. The method as claimed in claims 16 or 17 wherein the step of adjusting includes truncating at least part of a common field of view of the first and second image.
19. The method as claimed in claims 16 to 18 wherein images are recorded by image sensing devices.
20. The method as claimed in claims 16 to 19 wherein the first and the second images are recorded by a first and a second image sensing device, respectively.
21. The method as claimed in claim 20 wherein the image sensing devices are cameras capable of outputting a digital representation of an image.
22. The method as claimed in claims 19 to 21 wherein the image sensing devices are located a distance apart to minimise the overlapping field of view.
23. The method as claimed in claims 19 to 22 wherein the image sensing devices are arranged on a mounting structure.
24. The method as claimed in claims 19 to 23 wherein the image sensing devices are located in a fitting room.
25. The method as claimed in claims 19 to 24 wherein the image sensing devices are spatially arranged to minimise overlapping field of view when capturing an object located a distance of less than 0.5 metres away.
26. The method as claimed in claim 25 wherein the field of view is at least 57 degrees.
27. The method as claimed in claims 19 to 26 wherein an arrangement of image sensing devices captures a combined height of approximately 2 metres at a distance of approximately 500mm.
28. A camera mounting apparatus comprising: an elongate mounting structure, a plurality of image sensing devices mounted to the mounting structure, wherein each image sensing device is spatially arranged to minimise overlapping fields of view at a close range.
29. The apparatus as claimed in claim 28 wherein the image sensing devices are spatially arranged to minimise overlapping fields of view when capturing an object located a distance of less than two metres away.
30. The apparatus as claimed in claims 28 or 29 wherein the field of view is at least 57 degrees.
31. The apparatus as claimed in claims 28 to 30 wherein the plurality of image sensing devices is three image sensing devices, the devices arranged to capture a combined height of approximately 2 metres at a distance of approximately 0.5 metres.
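Purely to illustrate the truncate-and-abut combination recited in claims 6 and 16 to 18, the sketch below trims the shared field of view and stacks the remaining strips; the use of NumPy, the stacking direction and the fixed 22-pixel overlap (cf. Appendix B) are assumptions made for the example and are not part of the claims.

import numpy as np

def combine_vertical(images, overlap_rows):
    # Keep the first image whole, trim the shared rows from each subsequent image,
    # then abut the truncated strips into one tall output image.
    strips = [images[0]] + [img[overlap_rows:] for img in images[1:]]
    return np.vstack(strips)

# Three synthetic 640 x 480 frames (rotated so the 640 pixels run vertically), ~22 rows of overlap
frames = [np.random.randint(0, 256, (640, 480, 3), dtype=np.uint8) for _ in range(3)]
combined = combine_vertical(frames, 22)
print(combined.shape)    # (1876, 480, 3)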
PCT/NZ2011/000132 2010-07-16 2011-07-15 A camera arrangement WO2012008856A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
NZ586841 2010-07-16
NZ58684110 2010-07-16

Publications (1)

Publication Number Publication Date
WO2012008856A1 true WO2012008856A1 (en) 2012-01-19

Family

ID=45469664

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NZ2011/000132 WO2012008856A1 (en) 2010-07-16 2011-07-15 A camera arrangement

Country Status (1)

Country Link
WO (1) WO2012008856A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5657073A (en) * 1995-06-01 1997-08-12 Panoramic Viewing Systems, Inc. Seamless multi-camera panoramic imaging with distortion correction and selectable field of view
US20020122113A1 (en) * 1999-08-09 2002-09-05 Foote Jonathan T. Method and system for compensating for parallax in multiple camera systems
US20040201754A1 (en) * 2001-10-12 2004-10-14 Mcalister Micheal J. Dual camera mounting arrangement for a wide screen imaging system
GB2403617A (en) * 2001-10-11 2005-01-05 Hewlett Packard Co Multiple Camera Arrangement
US20060139475A1 (en) * 2004-12-23 2006-06-29 Esch John W Multiple field of view camera arrays
US20080143842A1 (en) * 2006-12-06 2008-06-19 Sony United Kingdom Limited Camera arrangement and method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018136098A1 (en) * 2017-01-23 2018-07-26 Huami Inc. System and Method for Generating Digital Content for a Composite Camera
CN108347567A (en) * 2017-01-23 2018-07-31 北京顺源开华科技有限公司 Method, equipment and system for generating digital content
US10692260B2 (en) 2017-01-23 2020-06-23 Anhu Huami Information Technology Co., Ltd. System and method for generating digital content for a composite camera
CN108347567B (en) * 2017-01-23 2020-10-30 北京顺源开华科技有限公司 Method, device and system for generating digital content

Similar Documents

Publication Publication Date Title
US8957913B2 (en) Display apparatus, display control method, and storage medium storing program
US7961157B2 (en) Configurable imaging system
US8248454B2 (en) Video display calibration system and method
US7626600B2 (en) Projection-type image display system, projector, information storage medium, and image projection method
US8953094B2 (en) Illumination system
CN104144353B (en) Multizone environment light regime control method based on smart television
US11494960B2 (en) Display that uses a light sensor to generate environmentally matched artificial reality content
US9753580B2 (en) Position detecting device, position detecting system, and controlling method of position detecting device
US10088919B2 (en) Position detecting device, position detecting system, and controlling method of position detecting device
EP2508931A1 (en) Micro mirror array screen
JP2007086545A (en) Information presenting system
WO2019107060A1 (en) Illumination control system and illumination control method
KR20120007094A (en) Apparatus, method for measuring 3 dimensional position of a viewer and display device having the apparatus
JP2009010728A (en) Camera setting support device
US10321107B2 (en) Methods, systems, and computer readable media for improved illumination of spatial augmented reality objects
US20160212396A1 (en) Display apparatus capable of seamlessly displaying a plurality of projection images on screen
JP2022529417A (en) Image acquisition projection system, use of the system and image acquisition projection insertion method
EP3336601A1 (en) A photo terminal stand system
WO2012008856A1 (en) A camera arrangement
CN104345772B (en) A kind of information processing method and device
KR101392591B1 (en) Mirror combined display system of capable viewing rear
US20160119614A1 (en) Display apparatus, display control method and computer readable recording medium recording program thereon
US20110069079A1 (en) System and method for adjusting view angle of display
JP2016171040A (en) Luminaire and illumination system having the same
CN108366248A (en) A kind of 3D display device and 3D calibrating installations and method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11807117

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11807117

Country of ref document: EP

Kind code of ref document: A1