CA2447817A1 - Method and apparatus for refractive and reflective polystereoscopic imaging

Method and apparatus for refractive and reflective polystereoscopic imaging

Info

Publication number
CA2447817A1
CA2447817A1
Authority
CA
Canada
Prior art keywords
image
images
screen
refractive
projector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA 2447817
Other languages
French (fr)
Inventor
Ovid Stavrica
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WAMEDES Inc
Original Assignee
WAMEDES Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WAMEDES Inc filed Critical WAMEDES Inc
Publication of CA2447817A1 publication Critical patent/CA2447817A1/en
Abandoned legal-status Critical Current

Abstract

A polystereoscopic image acquisition and display arrangement is disclosed. A plurality of image acquiring devices, such as video cameras, are positioned in an array and simultaneously acquire a sequence of images, which are then uniquely tagged and recorded. Alternatively, a plurality of optical elements can be used to direct a plurality of images to a smaller number of image capturing devices. The unique images are then displayed from an array of complementarily positioned projectors onto either a refractive display screen or a reflective display screen. The refractive display screen is characterized as having a plurality of micro-lens elements that condition the incoming beams to emerge parallel to each other, while the reflective display screen is characterized as having a plurality of retroreflective elements that reflect the incoming beams back along their angles of incidence. A partial mirror is also disclosed for use with the retroreflective display screen.

Description

METHOD AND APPARATUS FOR REFRACTIVE AND REFLECTIVE
POLYSTEREOSCOPIC IMAGING
Background of the Invention
Field of the Invention
The invention relates to polystereoscopic image capture and display technology, and more particularly to methods and apparatus for capturing three-dimensional images and displaying the same to emulate motion.
Description of the Prior Art
Recent stereoscopic solutions have employed LCD panel displays with lenticular lens sheets. Examples of this technology can be found in United States patent numbers 4,717,949; 4,829,365; and 6,157,424. However, the physical size constraints of LCD pixels and sub-pixels impose a number of limitations, including the number of views (angular resolution), the planar resolution per view, and the viewable observer distance from the screen. The invention disclosed in United States patent number 6,224,214 identifies a plurality of projectors that cast images onto an active optical viewing screen, which employs shutters to control the projected angular views. In fact, any embodiment employing any kind of active light control mechanism within the screen itself is limited in angular resolution as well as planar resolution due to the physical limitations of how small the LCD or physical screen "shutters" can be manufactured.
Other embodiments of this shutter technology such as United States patent number 6,304,288 require head tracking to direct the appropriate stereoscopic image to the viewer's eyes, as determined by the viewer's location and orientation.
Multiple viewers introduce substantial complexity to this approach, making multi-viewer capability unfeasible or extremely cost prohibitive. The same restrictions apply to this approach as to the above patent, i.e., the 6,224,214 patent. Active shutter size minimization constraints in turn restrict angular resolution and planar resolution.

Summary of the Invention
The purpose of the invention is to display, either simultaneously or approximately simultaneously, a plurality of images onto a single surface from a plurality of discrete locations, thereby providing one or more appropriately positioned viewers with a polystereoscopic image (one that has stereoscopic properties from a plurality of viewing locations). Thus, the invention comprises several related components, in particular an image capturing component, a data translation component, and an image display component.
Turning first to the image capturing component, a plurality (at least two) of image capturing devices, such as still or motion video cameras, are positioned so that each device captures substantially the same target field but at unique angles with respect to such field. Preferably, the devices are located along a plane that is generally orthogonal to any primary objects located in the field and are positioned in an array. Each captured image In originates from a discrete image recording device (R) and comprises a plurality of discrete elements Ex,y, which may be considered pixels, where R is unique (for convenience, R = an integer), n = the image number, 0 < x < (maximum horizontal element count), and 0 < y < (maximum vertical element count). Each image (In) captured by each recording device (R) (referred to as image RIn) is stored at least temporarily for further use as will be described below.
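The addressing scheme above lends itself to a simple data structure. The following Python sketch is illustrative only and not taken from the patent: the names CapturedImage, ImageStore, record, and lookup are hypothetical, and an RGB triple is assumed for each element Ex,y.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

Pixel = Tuple[int, int, int]       # assumed RGB triple standing in for element Ex,y


@dataclass
class CapturedImage:
    device: int                    # R, the unique recording device number
    frame: int                     # n, the image (frame) number
    elements: List[List[Pixel]]    # elements[y][x] holds Ex,y


class ImageStore:
    """Temporary store of tagged images, keyed by (R, n), i.e. image RIn."""

    def __init__(self) -> None:
        self._images: Dict[Tuple[int, int], CapturedImage] = {}

    def record(self, image: CapturedImage) -> None:
        self._images[(image.device, image.frame)] = image

    def lookup(self, device: int, frame: int) -> CapturedImage:
        return self._images[(device, frame)]
```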
The data translation component of the invention comprises alternative means for providing a plurality of image projection apparatus (P) with display data.
In part, the selection of a suitable data translation component depends upon the means available for displaying the captured images. It is well known that the human eye cannot easily discriminate between related images if the display rate is above about 28 frames or images per second. Exploiting this deficiency, the objective of the data translation component is to present each image RIn within about 0.035 seconds, where each (R) is unique within the domain and (n) is constant. For example, if Rmax = 20, then a total of 560 images must be displayed in one second if n = 1 (20 images x 28 images/second). The following second, another 560 images must be displayed when n = n + 1. Depending upon design considerations, several alternatives might be considered, e.g., a single-projector, many-mirrors solution or a many-projectors solution.
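As a quick check of the figures quoted in this example, the short sketch below reproduces the arithmetic, assuming only the stated values of 28 images per second per vantage point and Rmax = 20.

```python
IMAGES_PER_SECOND_PER_VIEW = 28   # display threshold cited in the text
R_MAX = 20                        # number of recording devices / vantage points

time_per_image = 1 / IMAGES_PER_SECOND_PER_VIEW          # ~0.0357 s available per image RIn
images_per_second = R_MAX * IMAGES_PER_SECOND_PER_VIEW   # 560 images displayed each second

print(round(time_per_image, 4), images_per_second)       # 0.0357 560
```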
A first alternative is deemed a serial image display process and a second alternative is deemed a serial element display process. In the first alternative, each image RIn is sequentially written to a memory such as a display buffer, e.g., each element Ex,y from one unique image RIn is sequentially written to a display buffer where (R) and (n) remain constant for the duration of the data writing. Once the display buffer has been loaded, the unique image RIn is displayed, and the buffer is cleared or overwritten by data for the unique image (R+1)In.
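A minimal sketch of this serial image display alternative is shown below; it reuses the hypothetical ImageStore from the earlier sketch and a placeholder show() callback, and is not the patent's own implementation.

```python
def serial_image_display(store: ImageStore, r_max: int, frame: int, show) -> None:
    """Write each whole image RIn into the display buffer and show it in turn."""
    for device in range(1, r_max + 1):                 # R = 1 .. Rmax, n held constant
        image = store.lookup(device, frame)
        buffer = [row[:] for row in image.elements]    # load the display buffer element by element
        show(buffer)                                   # display the unique image RIn
        # on the next pass the buffer is overwritten with image (R+1)In
```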
In the second alternative, one element Ex,y from each image RIn is sequentially written to a display buffer, where (R) = 1 -> Rmax, n = 1, and x,y remain constant. Once the value (R) has reached its limit, the resultant image composition is displayed in a manner discussed in more detail below, and the buffer is cleared or overwritten by data for the next element Ex+1,y+1 from each image RIn where (R) = 1 -> Rmax and n = n + 1. Selection of one projection arrangement over another may depend upon external factors such as image compression algorithms, e.g., the change between all pixel elements "1" is less than the change of all pixels within a given image.
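A corresponding sketch of the serial element display alternative follows, again using the hypothetical ImageStore and a placeholder show() callback; for clarity the frame number n is held constant within a pass, whereas the text also allows n to advance between element passes.

```python
def serial_element_display(store: ImageStore, r_max: int, frame: int,
                           width: int, height: int, show) -> None:
    """Gather one element Ex,y from every image RIn (R = 1 .. Rmax), then show the composition."""
    for y in range(height):
        for x in range(width):                         # x, y held constant within each pass
            composition = [store.lookup(device, frame).elements[y][x]
                           for device in range(1, r_max + 1)]
            show(composition)                          # display the resulting element composition
            # the buffer is then overwritten with the data for the next element
```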
With respect to the foregoing, those persons skilled in the art will appreciate that the incrementing presented herein does not limit the invention, but serves to exemplify the operations of the invention. Thus, the incrementing can be via multiples of integers other than 1 or the product of an applied optimization algorithm.
Once an appropriate degree of image translation or conversion has taken place, it is necessary to project the resolved stereoscopic images. Regardless of the mode of image translation, there are several means for projecting images to the selected screen. A first projection means involves a plurality of image projectors that
create a projection matrix. Each projector is preferably positioned relative to a target surface in a manner similar to the location and orientation of the image capturing devices. A second projection means involves a plurality of mirrors that create a matrix and one or more projectors precisely aimed at the plurality of mirrors. In this second embodiment, each projector may project one or more images towards the mirrors, and/or a rotatable mirror or refraction element may be positioned intermediate one projector and the mirror matrix to actively redirect a projected image.
Regardless of the projection means chosen, the projected images should be precisely aligned relative to each other so as to cause superposition of each projected image portion or pixel. It is to be noted that while each superposed pixel represents the same image object (if not interfered with by another object in the light path between that object and the image capturing device), the specific attributes of that pixel vary depending upon the angle of capture of that image pixel.
According to the invention, there are two fundamental approaches to building a parallax display that produces a stereoscopic image of remotely recorded subject matter. Refractive parallax display technology uses lenses to transmit an image to a "specific" viewing spot (that is, in a specific viewing direction) from a corresponding location as emitted by a specific projector. Reflective parallax display technology uses directional reflection technologies to reflect light back to its source of origin. Both of these approaches employ related projection solutions to resolve commonly identified issues with existing parallax display technologies but differ in the "screen" technologies, insofar as one utilizes refraction while the other uses reflection. Thus, once an image RIn has been projected, whether by the serial image or serial element process, the target surface must be capable of either reflecting or refracting the projected image.
In a first embodiment, a refracting surface is used. The refracting surface should be capable of receiving projected light from a plurality of incident angles and redirecting the incident light to emit it in a substantially parallel manner. In this first embodiment, a plurality of thin lenses formed in or on a planar surface comprise the refracting surface. The refracting surface serves two primary functions. The first function is to refocus the projected light onto a corresponding location in the plane of image convergence. Thus, the refracting surface functions as a large convex lens towards each projection passed through it. The second function relates to the conditioning of the individual pixels, which varies according to design considerations and the portion of the surface in which the beams of light are incident. Thus, the refracting surface can make one or more beams collimated (neither diverging nor converging); it can make one or more beams converge, either onto the plane of convergence or another location as best deemed by empirical results; or it can otherwise alter but maintain the diverging nature of the beams.
To achieve these primary functions, the refracting surface should have qualities of a Fresnel lens employing individually unique conventional thin lenses.
The refracting surface is preferably manufactured to include an integrated lens structure employing both lens types.
In the second embodiment, a directionally reflective (retroreflective) surface is used to reflect projected light generally back to the source or other target location.
The directionally reflective surface comprises, in a preferred embodiment, a plurality of reflective elements formed in or embedded into a surface. The reflective elements are characterized as spherical or partially spherical bodies that reflect incident light rays towards the source of such rays. A schematic representation of such behavior is shown in Fig. 9.
Because the screen elements are retroreflective, the issue of projector opacity must be addressed. The problem of projector opacity is that in a truly retroreflective environment, the incident and reflected beams are coincident. Thus, in order to observe the reflected beam, the observer must be in the incident beam path. If the observer is opaque, then the incident beam cannot transit to the reflector; if the projector is opaque, it will prevent observation of the reflected beam. To overcome this difficulty, in one embodiment a partial mirror or beam splitter is used to redirect a portion of the projected image towards the screen, and permit a portion of the reflected image to pass through the mirror and be observed by the viewer(s).
The partial mirror is positioned oblique to the viewing screen so as to receive off angle projection light and redirect the same to the viewing screen. Reflected light is then permitted to pass partially through the partial mirror back towards the viewer(s).
Brief Description of the Drawings
Fig. 1 is a schematic diagram of a plurality of image capturing devices exemplifying the stereoscopic recordation of an object by a pair of such devices;
Fig. 2 illustrates a matrix conversion scheme showing the rearrangement of pixel projections from multiple projectors (top) to multi-angular refractive or reflective screen pixel emissions (bottom);
Fig. 3 is a schematic diagram illustrating sample beams of collimated light emitted from two closely spaced projectors where the beams carry the same information as was acquired by the cameras in Fig. 1;
Fig. 4A is an illustrative schematic view of a single convex lens, which is approximated by the plurality of lenses in Fig.3, illustrating the desired functionality of a passive refractive display screen;
Fig. 4B is the same illustrative schematic view shown in Fig. 4A, but the projectors are arranged in an arc (in three dimensions, a spherical cap) in order to remove the requirement from a macrolens that it provide a variable focal length to projectors located further from the center of the projector array;
Fig. 5 is a schematic diagram illustrating the placement and projection range of a plurality of linear projectors and convergence of beams of light emanating from each projector;
Fig. 6A shows a schematically isolated far lateral microlens receiving a plurality of incident light beams originating from discrete angles of incidence, and the subsequent refraction of the incident light beams;
Fig. 6B shows a schematically isolated central microlens receiving a plurality of incident light beams originating from discrete angles of incidence, and the subsequent refraction of the incident light beams;
Fig. 6C is a schematic diagram illustrating a single light beam, representing a single pixel from a source, entering into and exiting from the microlens shown in Fig. 6A;
Fig. 6D is an illustrative schematic diagram of the discrete refractive properties of a single microlens wherein a shadow mask is used to limit incident beam entrance properties;
Fig. 6E is an illustrative schematic diagram of the discrete refractive properties of a single microlens using a compound lens structure to address unique incident beam angle refractions, thereby maintaining constant angular refraction regardless of incident angles;
Fig. 7A is a schematic diagram of an alternative projection arrangement wherein a single projector and rotating mirror is used to sequentially target a plurality of discrete mirrors mounted to a surface;
Fig. 7B is a schematic diagram of an alternative projection arrangement wherein a single projector and rotating mirror is used to sequentially target a continuous mirror;
Fig. 7C is a schematic diagram illustrating possible projector locations necessary to achieve a desired refractive convergence when the refractive screen does not address suitable angular refraction;
Fig. 8 is a schematic diagram of a reflective display arrangement using a partial mirror to permit a viewer to see a retroreflection;
Fig. 9 is a schematic diagram of a microsphere used to achieve retroreflection;
Fig. 10 is a perspective view of a reflective display screen;
Fig. 11 is a detailed perspective view of a plurality of microspheres that comprise the screen shown in Fig. 10; and
Fig. 12 is a side elevation taken substantially along the line 12-12, exemplifying the association of the microspheres and the screen backing.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
The following discussion is presented to enable a person skilled in the art to make and use the invention. Various modifications to the preferred embodiment will be readily apparent to those skilled in the art, and the generic principles herein may be applied to other embodiments and applications without departing from the spirit and scope of the present invention as defined by the appended claims. Thus, the present invention is not intended to be limited to the embodiment shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
Turning then to the several Figures wherein like numerals indicate like components, and more particularly to Figs. 1-2 and 3-7, the image acquisition and passive refractive display features are illustrated, respectively. It should be noted that for illustration purposes, all drawings and descriptions herein (unless otherwise noted) are directed to a single line or horizontal sweep (i.e., the "x" coordinate axis). It is to be understood that for "y" coordinates to be captured and/or displayed, repetition of the illustrated embodiments along the "y" axis must be carried out, as is contemplated in a preferred embodiment.
Image Acquisition
In order for the invention to operate, appropriate image information must be obtained. Turning then to Fig. 1, a one-dimensional array of image recording devices in the form of digital cameras 22, 24 mounted to plane 20 is shown. It is to be understood that while the functioning of only two cameras is described, a plurality of cameras is used as is illustrated in this Figure. Camera 22 captures an image (I) of an object bounded by the left/right image plane 30 coordinates 26, 28 to define its field of view and is preferably aimed directly at (and presumably focused on) the object from its unique vantage point. Thus, the captured image (I) from camera 22 can be identified as 22I. In the same manner, camera 24 captures an image (I) of the object bounded by the left/right coordinates 26, 28 to define its field of view and is also preferably aimed directly at (and presumably focused on) the object from its unique vantage point. Thus, the captured image (I) from camera 24 can be identified as 24I. The distance between cameras 22 and 24 is represented by the distance "x". The value "x" represents the horizontal stereoscopic resolution of the composite image (I) being captured. This camera positioning scheme is repeated for all cameras mounted to plane 20.
An alternative to this "many camera" scheme is to use a discrete mirror or contiguous mirror arrangement to replace the plurality of required cameras.
Instead, a passive mirror array or mechanism covering the necessary vantage points could reflect the object from each specific vantage point onto a rotating prism or mirror mechanism, which then redirects the reflection into a high-scan rate image capturing device. This arrangement is complementary to the projection arrangement shown in Figs. 7A and 7B.
Depending on the artistic intentions of the image acquisition setup, captured scenes may require that all information in the scene be in focus. Such a focus requirement presents a problem, as each camera lens typically has one focal surface which is usually set to pass through the object being recorded. The image acquisition requires that all foreground and background objects be in focus as captured by each camera. One way to achieve this task is to use wide-field lenses on the camera array. Wide-field lenses tend to keep all imaged objects in focus, but also disproportionately distort the size of objects that are closer to the camera.
Another way is to minimize the size of the camera aperture so that a sufficiently large depth of field is achieved. Again, there are out-of-focus distortions that result from this solution, which occur on the fringes of the image. Both means require additional computer processing to electronically correct distortions by reparameterizing the light field as captured by the multiple camera vantage points to provide multiple focal surfaces and ensure that all objects are in focus, as discussed in (Isaksen, Aaron, Leonard McMillan, and Steven J. Gortler. Dynamically Reparameterized Light Fields.
SIGGRAPH 2000), which is incorporated herein by reference. Yet another solution is to ensure that a minimum operating distance between the subject matter and surrounding scenery is maintained in order to empirically avoid distortion issues introduced by wide-field lenses or high depth of field camera implementations.
Image Data Manipulation
In a preferred embodiment, each camera captures a plurality of single frames (representing a plurality of single images (In) where (n) represents a unique frame or image number), which are stored on a time-indexed video tape, magnetic storage media, optical disc, or similar medium. If not stored in a digital format, the data comprising the plurality of single frames is converted into digital format, or it is stored on standard high-speed photographic film.
For each frame of data, each pixel therein is assigned a unique address.
Thus, an image (In) originating from camera 22 is labeled 22I, and has a plurality of pixel elements Ex,y (the x,y naming convention is useful for a two-dimensional array, but serves to exemplify the naming conventions for the invention; for the purposes of illustration herein, only the "x" coordinate designation is used); this results in a unique address for each pixel data of 22InEx,y. Similarly, pixel data for an image originating from camera 24 would be represented as 24InEx,y. For purposes of this disclosure, each image (I) comprises only three pixel elements, namely Ea, Eb, and Ec. A database of all image data is then created for subsequent use as will now be described.
Returning to Fig. 1, it becomes apparent that camera 22 captured three discrete pixels of data: 22I1Ea, 22I1Eb, and 22I1Ec. Similarly, camera 24 captured three discrete pixels of data: 24I1Ea, 24I1Eb, and 24I1Ec. As is illustrated in Fig. 3, this data is converted by projectors 42 and 44 into corresponding beams of light: beams 42a corresponding to 22I1Ea, 42b corresponding to 22I1Eb, and 42c corresponding to 22I1Ec from projector 42; and beams 44a corresponding to 24I1Ea, 44b corresponding to 24I1Eb, and 44c corresponding to 24I1Ec from projector 44. It should be noted that projectors 42 and 44 are positioned identically with respect to refractive screen 50 as were cameras 22 and 24 with respect to image plane 30 so as to maintain the fidelity of image reproduction.

As will be discussed in more detail below, the labeling of each pixel permits various display options, including generating composite images using pixels derived from various image capturing devices.
Implicit to the invention is the reorganization of the data from each projector (or projector vantage point) to the screen (reflective and refractive). Fig. 2 shows 8 projectors 10-18, each projecting an image comprising 3 pixels, a, b, c. Through the crisscrossing of the pixel light beams, the data is rearranged so as to be re-emitted by 3 pixels on the reflective/refractive screen. Each screen pixel emits light in 18 different directions so that the pixel is perceived as having a quality that depends on the viewer's vantage point. Substantial cost reduction is achieved in that the signals as emitted by common, off-the-shelf projectors are standard image projections, which are converted by the invention arrangement for re-emission by the refractive or reflective screen. This matrix is special in that it describes a process native to the technology that would otherwise involve substantial processing requirements or active screen components such as LCD shutters. As well, in utilizing a passive optical screen in both the refractive and reflective modes, this matrix is a logical-pixel-rearrangement example of how a multitude of angular views for each passive optical "pixel" is effectively achieved.
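One way to express this logical pixel rearrangement is as a simple transpose of the projector-organised data, sketched below under the assumption of eight projectors each sending three elements; the function name rearrange is hypothetical and not drawn from the patent.

```python
def rearrange(projector_frames):
    """Transpose projector-organised pixel data into screen-pixel-organised emissions.

    projector_frames[p][e] is element e as sent by projector p; the returned
    screen[e][p] is what screen pixel e emits toward the direction of projector p.
    """
    num_projectors = len(projector_frames)
    num_elements = len(projector_frames[0])
    return [[projector_frames[p][e] for p in range(num_projectors)]
            for e in range(num_elements)]


# Example with 8 projectors each sending elements "a", "b", "c":
frames = [[f"P{p}{e}" for e in "abc"] for p in range(8)]
screen = rearrange(frames)   # screen[0] lists the directional emissions of screen pixel "a"
```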
Refractive Display
In order to achieve the beam redirection shown in Fig. 3, refractive screen 50 comprises a plurality of microlenses 52, each laterally unique one from the other.
The purpose of refractive screen 50 is to approximate a single macrolens with a plurality of unique optical properties. Each unique property is only evident to a specific beam path such that different light beams from different projectors are affected by the macrolens as if there were a different macrolens with different properties for each projector. For example, projections coming from projector 42 along path 42a are refracted by microlens 54, which provides a longer focal length than projections coming from projector 46. An enlarged functional illustration of a pair of macrolenses 50' and 50" is shown in Fig. 4. As shown therein, the same beams of light 42a, 42b, 42c, 46a, 46b, and 46c are redirected to focal plane 30' as was the case in Fig. 3. Consequently, refractive screen 50 in Fig. 3 has a functionality equivalent to macrolenses 50' and 50" for these beam paths. A potential simplification of the varying focal length requirements placed upon macrolens 50' is to arrange the projectors in a spherical cap instead of a plane as in Fig. 4A. In this arrangement, the focal lengths of macrolens 50' and 50" are identical.
Projectors may be set in other arrangements to further minimize other optical requirements of the refractive screen.
Those persons skilled in the art will appreciate that the focal length between beam 42a and 42c is different. Thus, unless corrective optics are employed, the entire projected image on the back of the refractive screen 50 will not be in focus with respect to distally lateral projectors. This correction may or may not be required, depending on the specific refractive screen implementation. The correction can take place anywhere along the incident beam path, such as in the projector optics, beam splitter, or rotating reflector (discussed below). Alternatively, the geometry of surface 40 and/or refractive screen 50 can be altered to maintain a constant focal length between the plurality of projectors and the screen.
Those persons skilled in the art will also appreciate that a laterally distal projector such as projector 42 will cause a skewed image to appear on refractive screen 50 as compared to a central projector such as projector 46. To overcome this optical deficiency, the displayed image can either be digitally corrected via keystone adjustment, or can be optically corrected by mounting a unique lens assembly to the projector. In either solution, the objective is to acquire a true image (i.e., one that has the same optical properties as a centrally projected image).
Returning then to refractive screen 50 in Fig. 5, each microlens 52 is laterally unique, one from another. In this manner, the intended directional orientation of refracted light is maintained regardless of its source. Figures 6A-B schematically illustrate the beam redirection properties of a distally lateral microlens (for illustrative purposes, a thin lens is shown while a preferred embodiment would use a Fresnel lens having the same or highly similar beam redirection properties) in 6A, and a central microlens in 6B. As illustrated in Fig. 6A, each incident beam is refracted by a constant amount. As illustrated in Fig. 6B, each incident beam is nominally refracted.
Moreover and as functionally illustrated in Fig. 6C, beam divergence, which is an artifact of the projected image (as the distance between the image plane and the projector increases, so does the image size), is addressed insofar as the refracted beam is either conditioned to converge or maintain a parallel profile. The beam may be set to converge, maintain a parallel profile, or even diverge to an extent in order to help blend one vantage point into another as the observer moves within the viewing space. If all beams emitted by all microlenses as projected by all projectors converge onto a single point, the refractive screen will appear black for viewing locations not specifically targeted by a camera vantage point. A parallel profile or divergent profile on the beam from each microlens mitigates this by "diffusing" the image so that it is still visible from viewing locations in the immediate vicinity of the specifically targeted vantage point. The functionally flat input surface on each microlens gives the refracting screen surface the properties of a convex macrolens (and is illustrated as such) by redirecting the beam in a new direction. The exit surface of the microlens controls the convergence/divergence properties of the light beam.
The exiting beam in Fig. 6C is shown as perfectly collimated. Furthermore, the incident surface does not have to be flat, but may have a specifically calculated functional curvature as required by the location and/or other properties of the projector array.
See Fig. 6E. The relation between the angle of the microlens entry surface and the screen will vary, depending on the location of the microlens on the screen, e.g., at the center of the screen, the angle (if a flat entry surface is present) would be 0 degrees.
While for simplicity there is an implied 1:1 correlation between projected pixel beams and corresponding microlenses, the correlation does not have to be the same, given that the lenses can be sufficiently miniaturized such that one pixel beam hits 2, 3 or more lenses simultaneously. By eliminating the 1:1 correlation, certain manufacturing and alignment costs can be substantially reduced.
Figure 6D is a functional schematic diagram of one of the possible microlens structures that would address the "angle constant requirement" that each microlens, as described in Fig. 6A and Fig. 6C, should have. Specifically, Fig. 6D
illustrates a grouping of hypothetical sub-microlenses 52a, 52b, and 52c, which together make up one of the primary colors (see below) of one pixel on the refracting screen.
Each sub-microlens 52a-c is exposed to light from only a single direction, as determined by slit 62 in shadow mask 60. Shadow mask 60 preferably covers the incident side of the refractive screen. In this manner, each projected incident beam 48a-c is conditioned by a slit 62 and uniquely (or approximately uniquely) refracted by a microlens 52 to produce refracted beams 49a-c wherein the total angle of refraction (indicated by arrows) is the same or substantially the same.
The logical extension of the discrete sub-pixel microlenses shown illustratively in Fig. 6D is a single microlens 52' with a specifically calculated entry surface, as Fig. 6E demonstrates, that functions as the individual sub-pixel microlenses in Fig. 6D do. The important feature of Fig. 6E is that the angle between the incoming light and the tangent at its point of entry into the microlens is the same for all incident beams 48a, 48b, and 48c. In like manner, all refracted beams exiting microlens 52' have a perpendicular exit path relative to microlens 52'.
While not illustrated herein, any differences in angle of refraction between the three primary colors, i.e., red, green and blue, are rectified by having three hypothetical microlenses as defined in Fig. 6E for each pixel coordinate targeted by all projectors, and three color-filtered slits in the shadow mask at that location. All projectors in the array will, for example, send pixel (1,1) to the same coordinate on the refractive screen. At that location, there are 3 color-filtered slits in the shadow mask allowing each of the RGB colors to pass through onto the respective hypothetical microlens.
Each hypothetical microlens will have a specific entry surface curvature to properly refract light coming from the specific directions as determined by the slit.
Again, the properties of each hypothetical microlens are preferably incorporated into a fewer number of functional composite lenses such as the type illustrated in Fig. 6E.
Heretofore, this embodiment of the invention has been described in terms of multiple projectors, e.g., one projector for each image capturing device.
However, it is contemplated to use a single projector for a plurality of image capturing devices.
Turning attention to Figs. 7A and 7B, it can be seen that the output of a single high-scan rate projector 70 can be directed to rotating reflector/refractor 80, which in turn distributes a projected image to reflective member 90, which may be mirror array 92 having a plurality of discrete reflective surfaces 94 as shown in Fig. 7A, or may be continuous reflective member 96 as shown in Fig. 7B. By rapidly sequencing a plurality of discrete projector images during operation of projector 70 and synchronizing rotation of reflector/refractor 80 to deliver each image to a unique position on reflective member 90, a fewer number of projectors, or even a single projector, can be used.
It is well known that in order to emulate motion, it is necessary to have a frame rate in excess of about 25 FPS. Consequently, if a projector has a vertical refresh rate of 150 Hz, it is capable of displaying six (6) unique image (frame) sequences per second and still maintain motion emulation. In the illustrated embodiment, if projector 70 has a vertical refresh rate of 150 Hz, then reflector/refractor 80 may have 6 facets and will therefore redirect 6 images to screen 50 during a single rotation.
Naturally, the relative rotation rate and optical characteristics of reflector/refractor 80 may vary depending upon overall design considerations, and a rate of 50 rpm has been chosen for illustration purposes only. A benefit of this approach is that for similarly grouped projectors (ones that have substantially the same focal plane and keystone correction factor) this expedient can be applied without notably degrading the image quality. Moreover, consistency issues relating to minor differences in projectors can be eliminated by using a single projector for providing the imaging to multiple, discrete projector locations.
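The arithmetic behind the six-facet example can be checked with the short sketch below, which assumes only the 150 Hz refresh rate and the roughly 25 FPS motion threshold stated above.

```python
REFRESH_HZ = 150      # vertical refresh rate of projector 70 in the example
MOTION_FPS = 25       # approximate minimum frame rate needed to emulate motion

views_per_projector = REFRESH_HZ // MOTION_FPS   # 6 unique image (frame) sequences
facets = views_per_projector                     # one reflector/refractor facet per redirected view
print(views_per_projector, facets)               # 6 6
```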

Numerous variations on this approach can be employed. For example, it is contemplated to modify the spatial geometry of reflective member 90, utilize refraction principles in rotating reflector/refractor 80 so as to address optical issues, and distribute projection images in the "y" axis.
Of the current digital projector technologies, the Digital Light Processing ("DLP") technology developed and marketed by Texas Instruments offers vertical refresh capabilities to potentially enable a single projector to provide all necessary frames from all required vantage points via rotating reflector/refractor 80.
However, the DLP technology Digital Micromirror Device introduces time as a variable in composing the projected image. A minimum time period is required to carry out the binary pulse-width modulation for each projected frame in order to construct the varying shades of the frame's RGB components. Three mechanisms may be employed individually or in tandem to mitigate time synchronization incompatibilities between the rotating reflector/refractor 80 and the Digital Micromirror Device. The rotating reflector/refractor can be driven by a stepper motor, able to stop in specific positions, so that the rotation is comprised of discrete reflector/refractor movements instead of a continuous rotational sweep. If a continuous rotational sweep is desired, a plurality of rotating reflectors/refractors can be set to work in unison to keep the image specifically aligned on a specific reflector of reflective member 90 without halting the rotational momentum of the reflectors/refractors. The third method is to increase the vertical refresh rate of the DMD chip in order to decrease the time period required by the DLP projector to construct all necessary shades of each frame's RGB components.
Heretofore, features of the invention addressed the issue of incident beam angles and constant angle refraction so that the refracted beams would properly converge on the focal plane. Special optical properties were employed to address extreme lateral projections, variable focal lengths and the like. However, an alternative scheme is proposed that would significantly reduce the requirement for optical solutions to these issues. Figure 7C illustrates a situation wherein comparatively homogenous microlenses 52' are used throughout refractive screen 52". Since these microlenses are not of the "angle constant" type (a rather unique microlens for each screen location), incident beams 48a-c do not emanate from a single location on projector array plane 40 in order to converge on focal plane 30'.
Because of the nature of image processing as described above and illustrated in Fig. 2, it is possible to produce a composite image for each projection location. Thus, an image projected from projector 42 in Fig. 3 may comprise pixel data from a plurality of cameras. Returning to Fig. 7C, incident beam 48a would emanate from a projector located at point "a" on plane 40, which includes corresponding image data acquired from a similarly positioned camera; incident beams 48b and 48c would emanate from a projector located at point "b-c", which includes corresponding image data acquired from a similarly positioned camera. However, even though the projected beams emanate from discrete locations, they converge at a common location after refraction by screen 52". Thus, deficiencies or expedients regarding screen 50 or 52" may be addressed by modifying the pixel projections for any given image (I) being projected. Since the process in Fig. 2 illustrates that each pixel for any given image may be projected in any selected manner, the unique projection locations for each image (I) can be determined and implemented via a suitable algorithm.
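A hedged sketch of such a composite projection follows; the helper composite_for_projector and the per-pixel source_map are placeholder assumptions rather than the patent's algorithm, and only the single horizontal sweep discussed above is modelled, again reusing the hypothetical ImageStore from the earlier sketch.

```python
def composite_for_projector(store: ImageStore, frame: int, width: int,
                            source_map: dict) -> list:
    """Build one projected line whose pixel x is drawn from camera source_map[x]."""
    # Only the single horizontal sweep (y = 0) is used, matching the discussion above.
    return [store.lookup(source_map[x], frame).elements[0][x] for x in range(width)]


# Illustrative mapping only: pixel 0 taken from camera 22, pixels 1-2 from camera 24.
# line = composite_for_projector(store, frame=1, width=3, source_map={0: 22, 1: 24, 2: 24})
```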
Reflective Display
Previously in this disclosure, the means for viewing the acquired image data was by way of projection onto a refractive screen. However, as illustrated in Figs. 8-12, the invention also includes a passive reflective display implementation.
In this implementation, retroreflection is used to provide a viewer with stereographic imaging. Through experimentation, it has been verified that microspheres provide the desired means for achieving retroreflection. Tests have successfully been conducted using the following products: 3M Scotchlite black retroreflective film and Swarco Megalux-Beads.
A highly desirable property of microspheres is their ability to directionally reflect emitted light back to the light origin. This retroreflectivity is a desirable component of the implementation of this feature of the invention. Fig. 9 illustrates the retroreflectivity of a single microsphere 100. This arrangement advantageously addresses the issue of beam divergence that would otherwise occur with a non-retroreflective surface, such as a planar polished surface, which maintains the beam divergence property. As shown in this Figure, exiting light is either collimated or converging; if diverging beams of light enter the sphere, they are reflected so as to generally mimic the emission path of the light beam source. Thus, if a projector is located at an optical distance "z" from the screen, a viewer similarly located would perceive a sharp and accurate image as projected on the screen. In addition, lateral resolution is maintained as reflection bleed (one pixel to the next) is minimized.
To construct passive reflective screen 150, a plurality of microspheres 100 are mounted to a suitable planar surface as is illustrated in Figs. 10-12. To minimize reflection of incident or ambient light, adhesive 152, which serves to bind microspheres 100 to backing member 154, should have very low reflective properties; alternatively, a separate coating can be applied over screen 150, whereafter the coating is removed from spheres 100 but not adhesive 152. If using a coating solution, the solution should not significantly bind to the spheres so that the coating can be removed from the spheres but not from the adhesive.
Also, when utilizing a screen as described herein and illustrated in Figs. 10-12, true retroreflection will result in the reflected light being redirected to the projectors. A viewer must be positioned so as to intercept the reflected light in order to perceive the projected image; however, such a viewer would then also interfere with the projected light. Consequently, provisions must be made to permit simultaneous projection and observation. Figure 8 illustrates a means for accomplishing this objective. In particular, a single projector 146 is functionally shown directing light beams 148 towards angled half mirror or beam splitter 160. A portion of the projected light beams 148 reflects from beam splitter 160 as light beams 248 and impinges on screen 150. Screen 150 in turn retroreflects light beams back to beam splitter 160. A portion of this retroreflected light 249 passes through beam splitter 160, whereafter it enters the observer space. While the light observed by a viewer has an illumination of 25% of the projected source light, this solution satisfactorily addresses the issue of projection beam obstruction.
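The 25% figure follows from the half mirror acting on the light twice, as the short calculation below illustrates under the assumption of an ideal 50/50 beam splitter with no other losses.

```python
# Once reflecting the projection toward the screen, once transmitting the
# retroreflection toward the viewer (assumed ideal 50/50 split, no other losses).
reflectance = 0.5
transmittance = 0.5
observed_fraction = reflectance * transmittance   # 0.25 of the projected source light
print(observed_fraction)                          # 0.25
```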
As was the case with the refractive screen discussed above, it is important to align the multiple projected images so that each image is superposed upon each other, i.e., each image pixel from each projector is substantially projected to the same location of the screen. In this manner, any given area on the screen will represent the same object, but from a plurality of viewing angles.

Claims (2)

What is claimed:
1. A polystereoscopic system for acquiring and presenting images simultaneously to a plurality of viewers comprising:
image capturing means for acquiring a plurality of discrete images of at least one object from a plurality of discrete angles;
data translation means for compiling and sequencing the images acquired by the image capturing means; and
image display means for projecting the sequenced images so that discrete stereoscopic images of the at least one object are simultaneously viewable from a plurality of locations.
2. A method for presenting polystereoscopic images simultaneously to a plurality of viewers comprising:
acquiring a plurality of discrete images of at least one object from a plurality of discrete angles;
compiling and sequencing the acquired images; and
projecting the sequenced images using projection means to a screen selected from the group consisting of a retro-reflective screen and a refractive screen so that discrete stereoscopic images of the at least one object are simultaneously viewable from a plurality of locations.
CA 2447817 2002-10-15 2003-10-15 Method and apparatus for refractive and reflective polystereoscopic imaging Abandoned CA2447817A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US27166602A 2002-10-15 2002-10-15
US10/271,666 2002-10-15

Publications (1)

Publication Number Publication Date
CA2447817A1 true CA2447817A1 (en) 2004-04-15

Family

ID=32467703

Family Applications (1)

Application Number Title Priority Date Filing Date
CA 2447817 Abandoned CA2447817A1 (en) 2002-10-15 2003-10-15 Method and apparatus for refractive and reflective polystereoscopic imaging

Country Status (1)

Country Link
CA (1) CA2447817A1 (en)

Similar Documents

Publication Publication Date Title
TW571120B (en) Three-dimensional display method and its device
KR100864139B1 (en) Method and apparatus for displaying 3d images
US6813085B2 (en) Virtual reality display device
US7703924B2 (en) Systems and methods for displaying three-dimensional images
US8432436B2 (en) Rendering for an interactive 360 degree light field display
JP3151347B2 (en) Automatic stereo directional display device
US20190007677A1 (en) Systems and Methods for Convergent Angular Slice True-3D Display
CN100595669C (en) Two-sided display screen and its three-dimensional display apparatus
JP2007503020A (en) Wide-angle scanner for panoramic display
US7111943B2 (en) Wide field display using a scanned linear light modulator array
US20050093713A1 (en) Devices for three dimensional imaging and recording
JP2018533062A (en) Wide-field head-mounted display
JP2007519958A (en) 3D display
JP5999662B2 (en) Image display device
CN111856775A (en) Display device
US11284053B2 (en) Head-mounted display and projection screen
WO2021139204A1 (en) Three-dimensional display device and system
JP2002523790A (en) Projection system
JP2006276292A (en) Image display system
CA2447817A1 (en) Method and apparatus for refractive and reflective polystereoscopic imaging
US4676613A (en) Stereoscopic pictures using astigmatic low f-number projection lenses-method and apparatus
CA2180188A1 (en) Method and apparatus for viewing with a virtual optical center
US5971547A (en) Astigmatic lenticular projector system
WO2019225233A1 (en) Video display device
JP2005208308A (en) Stereoscopic projector

Legal Events

Date Code Title Description
FZDE Dead