WO2014209355A1 - Apparatus and method of communicating between portable projection devices - Google Patents


Info

Publication number
WO2014209355A1
Authority
WO
WIPO (PCT)
Prior art keywords
display image
projection
control information
image
secondary display
Application number
PCT/US2013/048538
Other languages
French (fr)
Inventor
Mark Alan Schultz
Michael Scott Deiss
Original Assignee
Thomson Licensing
Application filed by Thomson Licensing filed Critical Thomson Licensing
Priority to PCT/US2013/048538 priority Critical patent/WO2014209355A1/en
Publication of WO2014209355A1 publication Critical patent/WO2014209355A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • H04N 9/3147: Multi-projection systems
    • H04N 9/3173: Constructional details thereof wherein the projection device is specially adapted for enhanced portability
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3191: Testing thereof
    • H04N 9/3194: Testing thereof including sensor feedback

Definitions

  • the present arrangement provides a system and method for controlling the operation of a portable projection device.
  • projection devices were (and are) designed as non-mobile devices that are positioned in a room and project a series of audio-visual images on a screen that is viewable by individuals within the room and in the line of sight of the projected image.
  • these projection devices are precisely configured to minimize errors in the audio-visual images being displayed. Examples of these systems include but are not limited to movie theaters, professional meeting rooms, lecture halls and the like.
  • A pico projector may be included in any handheld device that can selectively project at least one of an image or a series of images on a surface. Moreover, it is important for the pico projector to be able to generate a clear image of sufficient quality on any type of surface. This may include, for example, a conventional display screen or a wall in a room. It is, therefore, necessary for the pico projector to compensate for any surface impurities when generating and projecting a display image.
  • a further drawback associated with pico projection relates to the nature of the device itself. Because the pico projector is naturally handheld and/or portable, the pico projector suffers from increased visual display errors as compared to a traditional projection device. The increased visual errors (e.g. noise, distortion, etc) in images projected by pico projectors result from the often sub-optimal positioning of the pico projector with respect to the surface on which the images are being displayed as well as the orientation of individuals viewing the image to the surface on which the image is displayed.
  • an apparatus for projecting received images onto a display surface includes a projection unit that projects a primary display image and a secondary display image on a display surface and a controller coupled to the projection unit that generates the primary display image from the received images and the secondary display image including control information associated with the apparatus for receipt by at least one other projection device.
  • the control information controls an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
  • a method of projecting received images onto a display surface includes generating a primary display image from the received images, the primary display image being within the visible spectrum, and generating a secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the secondary display image being one of within the visible spectrum and outside of the visible spectrum.
  • the method further includes projecting, using a projection unit, the primary display image and the secondary display image on a display surface and controlling an operation of the at least one other projection device using the control information displayed in the secondary display image received by the at least one other projection device.
  • an apparatus for projecting received images onto a display surface includes means, such as a projection unit, for projecting a primary display image and a secondary display image on a display surface, means for generating the primary display image from the received images, and means for generating the secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the control information controlling an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
  • FIG. 1 is a block diagram of the portable projection device according to aspects of the present invention.
  • FIGS. 2A - 2D are exemplary light engines for use in the portable projection device according to aspects of the present invention.
  • FIG. 3 is a block diagram of components used in communicating with other devices according to aspects of the present invention.
  • FIG. 4 is an illustrative view of images projected by the portable projection device according to aspects of the present invention.
  • FIG. 5 is an illustrative view of a plurality of projection devices projecting images according to aspects of the present invention.
  • FIG. 6 is an illustrative view of a plurality of projection devices communicating control information via secondary display images according to aspects of the present invention.
  • FIGS. 7A - 7C are exemplary image alignments according to aspects of the present invention.
  • FIG. 8 is an exemplary alignment image used in discovering a display position among a plurality of projection devices according to aspects of the present invention.
  • FIG. 9 is a flow diagram detailing the operation of the portable projection device according to aspects of the present invention.
  • The elements shown in the FIGS. may be implemented in various forms of hardware, software or combinations thereof.
  • these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
  • The terms "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read only memory ("ROM") for storing software, random access memory ("RAM"), and nonvolatile storage.
  • As used herein, a "component" is intended to refer to hardware, or a combination of hardware and software in execution.
  • a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, and/or a microchip and the like.
  • Both an application running on a processor and the processor itself can be a component.
  • One or more components can reside within a process and a component can be localized on one system and/or distributed between two or more systems. Functions of the various components shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
  • any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function.
  • the disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein.
  • the present invention is directed towards a multifunction portable electronic device (hereinafter, the "device") that includes audiovisual image projection capabilities (e.g. a pico projector) and method of operating the same.
  • An exemplary block diagram of the device 10 is provided in Figure 1.
  • the device 10 includes a controller 12.
  • the controller 12 is a component that executes various operational algorithms that control the various functions of the device 10. In one embodiment, the controller 12 executes algorithms that enable audio and video processing of a source input signal.
  • the controller 12 may also include a memory in which various machine executable instructions controlling various device functionality may be stored and accessed as needed in response to various control signals generated by one of (a) a user and (b) other components of the device 10 as will be discussed below.
  • the memory of the controller 12 may also store data associated with any input signal received by the controller 12.
  • the memory of controller 12 may also store user- specific information that is associated with a user of the device 10.
  • user specific information may include user preferences for configuring the device for a particular type of operation.
  • the user specific information may include global preference information that configures aspects of device operation that are common between the various functions as well as function specific preference information that configures the device to operate in a particular manner when executing a particular function.
  • While the controller 12 is described as including a memory, one skilled in the art should understand that the memory (or other storage medium) within the device may be a separately embodied component that is read/write accessible by the controller 12 as needed.
  • the device 10 also includes a power converter 14 and battery 16 connected to the power converter 14.
  • the power converter 14 is selectively connectable to an input power source (either AC or DC) for receiving power therefrom. Power received by the power converter 14 is provided to the battery 16 and selectively charges the battery 16 as needed. It should be understood that the operation of charging is meant to include an initial charging of the battery 16 as well as recharging the battery 16 after the power level has been depleted. Power is also simultaneously provided by the power converter 14 to the controller 12 for powering operation thereof.
  • the controller 12 may selectively detect when input power is being provided to the power converter 14, causing the device 10 to operate in a first power mode when a connection to an input power source is detected and in a second power mode when no connection to an input power source is detected.
  • the controller 12 may execute a battery monitoring algorithm that enables the controller 12 to selectively detect a power level in the battery 16 and control the power converter 14 to direct power thereto. The controller 12 can also control charging of the battery 16 when the detected power level in the battery 16 is below a predetermined threshold. In another embodiment of the first power mode, the controller 12 may automatically direct power from the power converter 14 to be provided to the battery 16 in response to connection of the power converter with the input power source. In the second mode of operation, the controller 12 is powered by the battery 16 until such time that the battery power is depleted below a predetermined threshold.
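  • For illustration only, the two power modes described above can be sketched as a small control loop. The following Python sketch is not taken from the patent; the class name and the 20% recharge threshold are assumptions.

```python
# Minimal sketch of the two power modes; names and thresholds are
# illustrative assumptions, not specified by the patent.
class PowerManager:
    RECHARGE_THRESHOLD = 0.20   # assumed level that triggers a recharge

    def __init__(self, battery_level=1.0):
        self.battery_level = battery_level
        self.external_power = False

    def tick(self):
        """Select the power mode on each control cycle."""
        if self.external_power:
            # First mode: run from the converter and top up the battery
            # whenever its level falls below the threshold.
            if self.battery_level < self.RECHARGE_THRESHOLD:
                self.charge_battery()
            return "converter"
        # Second mode: run from the battery until it is depleted.
        return "battery"

    def charge_battery(self):
        self.battery_level = min(1.0, self.battery_level + 0.05)

pm = PowerManager(battery_level=0.15)
pm.external_power = True
print(pm.tick())  # -> "converter" (and the battery is charged a step)
```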
  • the controller 12 may receive an input audiovisual signal from one of a plurality of device inputs collectively referred to using reference numeral 15.
  • the controller 12 can control selective projection of the audiovisual input signal using projection unit/microdisplay 30.
  • the input audiovisual signal may include one of (a) a still image; (b) a series of images; (c) a video signal; and (d) an audio signal.
  • the input audiovisual signal may also include an audio component that is intended to be audibly reproduced by speaker 29 in conjunction with the projection, by the projection unit 30, of the one still image or series of images as will be discussed below.
  • the plurality of inputs may include any combination of, but is not limited to, (a) a card reader 18; (b) a USB port 20; (c) a digital video input port (HDMI) 22; (d) a VGA/Component video input port 24; and (e) a composite/S-Video input port 26.
  • the depiction of the plurality of input ports 15 is for purposes of example only and the device 10 may include any combination of the described input ports or other known input ports.
  • the card reader selectively receives a storage card that may include data representative of the input audiovisual signal that is accessed by the controller 12 and provided to the projection unit 30 and/or speaker 29 for output thereof.
  • the card reader 18 may be a MicroSD card reader. This is described for purposes of example only and any card reading device able to read any standardized storage card may be included in device 10.
  • the USB port 20 enables the device 10 to be selectively connected to one of (a) a portable storage device (e.g. flash drive); or (b) a secondary device, that stores data representative of the audiovisual input signal.
  • Any of the digital video input 22, VGA/component input 24 and/or composite video input 26 may enable connection with a secondary device that includes the source audiovisual input signal, and each is coupled to the controller 12 via an input selector 28.
  • the input selector 28 selectively couples a respective one of the digital video input 22, VGA/component input 24 and/or composite video input 26 with the controller 12 such that the controller 12 may provide the audiovisual input signal to the projection unit 30 and speaker 29 for output thereof.
  • the device 10 further includes a plurality of user controls, collectively referred to using reference numeral 31, enabling the user to selectively control various device functions.
  • An input/output (IO) interface 32 may include at least one user selectable button associated with at least one device function such that selection thereof initiates a control signal received by the controller 12 that is used to control the particular device function.
  • the IO interface 32 may be a touch screen and the at least one button may be a user selectable image element displayed on the touch screen enabling selection thereof by a user.
  • the number and types of user selectable image elements may be generated by the controller 12 depending on the particular operational mode of the device. For example, during projection mode, the user selectable image elements may enable activation of image projection functionality and, if the device 10 is operating in a communication mode, the user selectable image elements displayed on the I/O interface 32 may relate to communication functions.
  • the IO interface 32 may include at least one dedicated button on a housing of the device 10 that may be manually activated by a user.
  • a further user control 31 that may be provided is a remote infrared (IR) sensor 36.
  • Remote IR sensor 36 selectively receives an IR input signal that is generated by a remote control.
  • the IR input signal received by the remote IR sensor 36 is communicated to the controller 12 which interprets the received IR input signal and initiates operation of a particular function of the device corresponding to user input.
  • Any of the user controls 32, 34 and/or 36 may be used to generate control signals for selecting an input audiovisual signal from a respective input source of the plurality of input sources 15.
  • the control signals input via the user are received by the controller 12 which processes the user input signal and selects the source of the input audiovisual signal.
  • Input received from any of the user controls 31 may also condition the controller 12 to selectively output the audiovisual signal using projection unit 30 and speaker 29.
  • the projection unit 30 may be embodied as a microdisplay/pico projection unit.
  • the projection unit 30 includes a panel driver 38, a light engine 39 and a projection lens 48.
  • the panel driver 38 receives the audiovisual input signal from the controller 12 and controls the light engine to emit light representative of the audiovisual input signal that may be projected via a projection lens 48 coupled thereto.
  • the light engine 39 may include a light source and light processing circuitry that is selectively controlled by the panel driver 38 to generate light and project an image representing the audiovisual signal onto a surface. Exemplary types of light engines 39 will be discussed in greater detail with respect to Figures 2A - 2D.
  • any light engine used in any type of projection device may be incorporated in the projection unit 30 of the device 10.
  • the light generated by the light engine 39 is provided to the projection lens 48 which projects the full color image onto a display surface (e.g. screen, wall, etc).
  • the projection lens 48 may be focused in response to user input received by the controller 12 as needed.
  • the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
  • the projection unit 30 of the device may also include an infrared light emitting diode (IR LED) 50 that is coupled to the panel driver 38.
  • the controller 12 may generate an IR audiovisual input signal based on the audiovisual input signal received from one of the plurality of inputs 15 or the user controls 31.
  • the IR audiovisual signal may be provided to the panel driver 38 which conditions the IR LED 50 to project an IR version of the audiovisual input signal.
  • the IR signal is imperceptible to the human eye but may be used by other components as an input control signal in the manner discussed below.
  • the device 10 may also include a camera module 52.
  • the camera module 52 may include a lens 54 coupled to an image sensor 56. Image data received via the lens 54 and sensed by image sensor 56 may be processed by image processor 58.
  • the camera module 52 may operate as a conventional digital camera able to capture one of still images and video images.
  • the camera module 52 may also operate as a sensor that senses at least one type of image being displayed and uses the sensed image as a control signal for controlling at least one function of the device 10 as will be discussed below.
  • the lens 54 of the camera module 52, shown in conjunction with the projection lens 48 of the projection unit, is described for purposes of example only; the device may instead include a single lens that is shared between the projection unit 30 and the camera module 52.
  • a motion sensor 60 is also provided.
  • the motion sensor 60 is coupled to the controller 12 and selectively senses data representing movement of the device 10.
  • the motion sensor 60 may sense the position of the device and generate an input control signal used by the controller 12 for controlling device operation.
  • the motion sensor 60 may include any type of motion sensor including but not limited to a gyroscope and/or an accelerometer.
  • the device 10 may include at least three accelerometers positioned on the X, Y and Z axes such that the accelerometers may sense the position of the device 10 with respect to gravity.
  • the motion sensor 60 may refer to a plurality of different sensors that are able to sense various types of data which may be provided to the controller 12 for analysis and processing thereof.
  • the device 10 also includes a communications processor 62 that enables bidirectional communication between the device 10 and a remote device.
  • the communication processor 62 is described generally and is intended to include all electronic circuitry and algorithms that enable bidirectional communication between devices.
  • the communication processor 62 enables the device to operate as a cellular phone.
  • the communication processor 62 includes all components and instructions for connecting the device 10 to the internet.
  • the communication processor 62 includes all components associated with a smartphone to enable a plurality of different types of bidirectional communication (e.g. telephone, email, messaging, internet, etc) between the device and a communications network.
  • Figures 2A - 2D are block diagrams representing different types of light engines 39 that may be employed within the projection unit 30 described in Figure 1. It should be understood that the portable projection device 10 as discussed herein may utilize any of the different light engines 39a - 39d described in Figures 2A - 2D. It should also be appreciated that the description of the light engines 39a - 39d is not limited to those described herein and any type of light engine able to generate and process light into a full color image for display on a surface may be used by the device 10.
  • Figure 2A represents a three-color LED light engine 39a. The light engine 39a is controlled via the panel driver 38 (Fig. 1).
  • the panel driver 38 receives the audiovisual input signal from the controller 12 and controls the operation of light emitting diodes (LED) 40a, 40b, and 40c.
  • the LEDs 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c.
  • the audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output.
  • Light generated by the LEDs 40a-c is focused into a full color image by a focusing element 42.
  • the focusing element 42 may be an x-cube.
  • the focusing element 42 may be a dichroic mirror.
  • the focused image is projected onto a liquid crystal on silicon (LCOS) chip 44, which receives the light emitted from each of the LEDs 40a - c, and the received light is optically combined via a polarizing beam splitter 46.
  • the combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • the projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
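  • As an illustration of the colour-separation step performed by the controller 12 before the panel driver 38 drives the LEDs 40a - c, the following minimal Python sketch splits an input frame into its component colour planes. The frame layout and function name are assumptions, not taken from the patent.

```python
# Minimal sketch of colour separation: each pixel of the input frame is
# split into the red, green and blue components that would drive
# LEDs 40c, 40b and 40a respectively. The data layout is assumed.
def separate_components(frame):
    """frame: rows of (r, g, b) tuples -> three single-channel planes."""
    red   = [[px[0] for px in row] for row in frame]
    green = [[px[1] for px in row] for row in frame]
    blue  = [[px[2] for px in row] for row in frame]
    return red, green, blue

frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (128, 128, 128)]]
r, g, b = separate_components(frame)
print(r)  # [[255, 0], [0, 128]] -> drive levels for the red LED
```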
  • Figure 2B depicts a white-light LED light engine 39b that may be used in the projection unit of the device 10.
  • Light engine 39b may include a white light LED 41.
  • the panel driver 38 (in Fig. 1) receives the audiovisual input signal from the controller 12 and controls the operation of the white light LED 41.
  • the LED 41 is controlled to emit a pattern of light to generate the desired audiovisual image for output.
  • Light generated by the LED 41 is provided to a LCOS chip 44b.
  • the LCOS chip 44b has a predetermined pattern of primary color dots thereon.
  • the panel driver 38 controls the LCOS chip 44b to have certain of the dots illuminated by the light emitted by LED 41 to provide colored light to the polarizing beam splitter 46b, which optically combines the colored light reflected off of the LCOS chip 44b.
  • the combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • FIG. 2C depicts a digital light processing (DLP) engine 39c.
  • the DLP engine 39c includes three colored light sources 40a, 40b, and 40c.
  • the light sources 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. While these are described as LED light sources, this is done for purposes of example only and the light sources may be any type of light sources including, but not limited to lasers as are known to be implemented in a DLP light engine.
  • the light sources 40a-c are not on simultaneously. Rather, the panel driver 38 controls the individual light sources in sequence and the emitted light is provided to the focusing element for producing the full color image.
  • a color wheel may be positioned between a light source and the focusing element 42.
  • the panel driver 38 selectively controls the color wheel to rotate to one of the three primary colors based on the data in the audiovisual input signal to illuminate a respective light color at a given time.
  • the audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output.
  • Light generated by the LEDs 40a-c is projected and focused into a full color image by a focusing element 42.
  • the focusing element 42 may include a mirror unit 45 formed from at least one mirror which reflects the emitted light through prisms 47.
  • the focused image is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
  • FIG. 2D depicts a laser-based light engine 39d.
  • the laser light engine 39d includes light sources 43a - c that each emit a respective color light based on an audiovisual input signal.
  • the light sources 43a - c are lasers that emit light in three distinct wavelengths.
  • light source 43a may be a laser that emits light at a wavelength associated with the color red whereas light source 43b may emit light at a wavelength associated with the color green and light source 43c may emit light at a wavelength associated with the color blue.
  • the panel driver 38 controls the light sources 43a-c to emit respective colored light based on the audiovisual input signal received from the controller 12.
  • the emitted light (either concurrently or sequentially, depending on the panel driver being used) is provided to a focusing element 42.
  • the focusing element 42 includes a set of combiner optics 49 that receives and combines the emitted laser light and provides the light to the mirror unit 45 including a plurality of individual mirrors.
  • the mirror unit 45 is controlled by the panel driver 38 to rotate the plurality of mirrors based on the audiovisual input signal and reflects light to the projection lens 48 for projection onto a display surface (e.g. screen, wall, etc).
  • the projection device 10 advantageously emits at least one secondary image in a spectrum that is not visible to a user viewing the primary audiovisual image derived from the AV input signal that is projected through the lens 48 of the projection unit 30.
  • the secondary image emitted by the projection unit may include control information that may be selectively captured by a sensor located on another projection device.
  • the projection device 10 may also include the same sensor for sensing secondary images including control information projected by other projection devices thereby enabling communication between projection devices.
  • the control information included in the secondary image may be any information associated with the particular projection device emitting the secondary image.
  • Control information may include but is not limited to (a) device configuration information; (b) information associated with the primary image being projected; (c) position information associated with the projection position in a projection array; and (d) synchronization information for synchronizing display of primary images between projection devices.
  • control information contained in the secondary image may deliver patterns and other information that enhance the performance of the device projecting the secondary image as well as other devices capturing and processing the secondary images projected by the device. Because the secondary image is not visible to users viewing the primary image, real-time automated configuration and communication may occur. Thus, the viewers will not see the active light references associated with the secondary image but will have their viewing experience enhanced because the projection device(s) may use the control signal in these active light references to modify or otherwise optimize the viewing experience.
  • the secondary image projected by the projection unit is an infrared (IR) image that is projected from an IR emitter or other IR light source.
  • the sensor included in the device and used to capture and process the secondary IR image may be a camera having sensitivity within the IR spectra.
  • the following description will reference the secondary image projected by the projection device as an IR image.
  • the projection unit may include any type of light source able to project an image in the light spectrum that is not viewable by the human eye but which may be captured by an image capturing device such as a sensor or camera.
  • the secondary IR images can also be used to communicate information either to the same projection device, or in the case of multiple projection devices, adjacent projectors.
  • the secondary IR image may be pulse modulated over time and include control information (e.g. a digital message) for receipt by an IR sensor on another different projection device.
  • One exemplary operation includes providing a secondary IR image that can be processed between projectors, where one projection device can discover the existence of an adjacent projector in order to share or split the image processing between two projectors, as sketched below.
  • Another exemplary operation allows for the secondary IR images to be used to help align multiple projectors individually by communication between the projectors to share the display time for some images and keep an individual display time for other images.
  • each projector could optimize its display for the enhancement of the multi-projection display.
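  • A minimal sketch of how the projection load might be split once an adjacent projector has been discovered follows. The equal vertical-strip policy and all names are illustrative assumptions; the patent does not specify a partitioning scheme.

```python
# Minimal sketch of splitting the image processing between projectors
# once one device has discovered adjacent ones. The half-frame split is
# an assumed policy for illustration only.
def assign_regions(frame_width, frame_height, devices):
    """Divide the frame into equal vertical strips, one per projector."""
    strip = frame_width // len(devices)
    return {dev: (i * strip, 0, strip, frame_height)
            for i, dev in enumerate(devices)}

# Device 10a discovers device 10b and shares a 1920x1080 frame:
print(assign_regions(1920, 1080, ["10a", "10b"]))
# {'10a': (0, 0, 960, 1080), '10b': (960, 0, 960, 1080)}
```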
  • the secondary image may be generated by the light engine that generates the primary display image.
  • the communication between the devices may occur during a communication session that occurs when all projection devices are being configured for displaying the primary image.
  • the light engine may modulate the light to communicate information associated with device operation and/or parameters to other projection devices that are being simultaneously configured.
  • An exemplary block diagram of the components of the device 10 shown in Figure 1 used in generating, projecting and sensing a secondary image is shown in Figure 3.
  • the device 10 shown in Figure 3 includes all components shown in Figure 1.
  • the components shown in Figure 3 having the same reference numerals operate in a similar manner as described above in Figure 1 and include the further operational features described in Figure 3.
  • the controller 12 includes a microprocessor 302 for controlling all logic operations able to be executed by the controller 12.
  • a memory 304 is also provided for storing at least one type of data therein.
  • the at least one type of data stored in memory 304 and which may be communicated to at least one other projection device may be any data related to a particular operation of the controller 12.
  • the data stored in memory 304 may also include data representing at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array; (g) a focal distance enabling other projection devices to use the same focus distance value to ensure that the image is at substantially the same distance and has substantially a same picture size; (h) a type of light engine; (i) aspect ratio and/or resolution of the image being projected; (j) a brightness value identifying a percent of present setting, which may be used to communicate whether the adjacent projection device has some margin to increase its brightness to match the brightness of the other projection devices; (k) an IR overscan value with respect to the IR camera; (l) a keystone indication whereby, if a trapezoidal picture is seen by the IR camera, other projection devices identify device 10 as being positioned at an angle with respect to the display surface; (m) an image center designator identifying a center of the picture of the imager (e.g. cross-hairs); and (n) at least one type of icon representing a particular parameter, wherein the position at which the icon is displayed may represent the value associated with the particular parameter. A record combining several of these fields is sketched after this list.
  • The above types of data stored in the memory and used in communicating with other devices are described for purposes of example only.
  • the memory may include at least a unique identifier able to identify the device to any other device.
  • the data stored in memory 304 may also include any type of data that describe a state of operation of the device 10.
  • the memory 304 may also store unique identifiers and data describing a state of operation associated with other projection devices and that has been communicated to device 10.
  • each projection device advantageously may know the identity of every other projection device along with the control state of each device.
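  • The per-device record suggested above, a unique identifier plus operating state kept for the device itself and for every peer it has heard from, might be sketched as follows. All field names are hypothetical.

```python
# Minimal sketch of a per-projector state record and a registry of peers;
# the fields shown are a small, illustrative subset of the data listed above.
from dataclasses import dataclass, field

@dataclass
class ProjectorState:
    device_id: str                 # unique identifier
    row: int = 0                   # position in the projection array
    position: int = 0
    brightness_pct: int = 100      # percent of present setting
    focal_distance_mm: float = 0.0
    light_engine: str = "unknown"

@dataclass
class DeviceRegistry:
    self_state: ProjectorState
    peers: dict = field(default_factory=dict)

    def update_peer(self, state: ProjectorState):
        """Record the identity and control state of another projector."""
        self.peers[state.device_id] = state

registry = DeviceRegistry(ProjectorState("dev-1", row=1, position=1))
registry.update_peer(ProjectorState("dev-2", row=1, position=2))
print(sorted(registry.peers))  # ['dev-2']
```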
  • a video processor 306 and audio processor 310 are also provided.
  • the video processor 306 processes the video component of any audiovisual (A/V) input signal 301 received from a respective A/V input 15 in a known manner.
  • the video processor 306 selectively decodes the video component of the A/V input signal 301 and provides the decoded video data to the panel driver 38 of the projection unit 30 for projection thereof.
  • the decoded video component being projected by the projection unit is termed the primary display image.
  • the primary display image is the image that is visible to any person within the viewing range of the projection device 10.
  • the audio processor 310 processes any audio component of the A/V input signal 301 for output to the audio driver 312 coupled to the speaker 29 (Fig. 1).
  • a communication bus 303 connects each of the microprocessor 302, memory 304, video processor 306 and audio processor 310 and enables each of the components to communicate data therebetween.
  • In conjunction with decoding and processing the video component of the A/V input signal 301, the video processor 306 generates data representing a secondary IR display image which may be provided to the panel driver 38.
  • the video processor 306 includes control information in the secondary IR display image for use in communicating with other projection devices.
  • the control information included within the secondary IR display image includes a projector identifier enabling other projection devices to identify the source from which a respective secondary IR display image is being projected.
  • the control information may also include at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; and (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array.
  • the data included in the control information may be derived from one of (a) the memory 304; (b) user input via a respective user input control 31 ; and (c) the A/V input signal 301.
  • the control information in the secondary IR display image may include a predetermined pattern which may be selectively captured by a sensor for processing.
  • the secondary display image may include any number of unique predetermined patterns that represent different types (classes) of information that may be used by other projection devices.
  • IR signaling can be transmitted as pulses which vary position or duration in time, or as a carrier modulated signal which varies in frequency, phase, or amplitude, or in some combination thereof.
  • The signaling syntax has associated semantics which define the commands plus data that can be interpreted and executed in each projection device. In this manner, any type of command, parameter or other information can be directly communicated to and understood by other projection devices.
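  • As one hypothetical instance of the pulse-based signaling described above, the following sketch encodes each byte of a control message as a pulse whose offset within a fixed time slot carries its value (pulse-position modulation). The timing constants are assumptions for illustration only.

```python
# Minimal sketch of pulse-position signalling: each byte of a control
# message becomes one pulse whose offset within its slot encodes the value.
SLOT_US = 2560          # assumed slot length in microseconds
STEP_US = 10            # assumed resolution: 256 distinct offsets per slot

def encode(message: bytes):
    """Return pulse times (us) relative to the start of the burst."""
    return [i * SLOT_US + b * STEP_US for i, b in enumerate(message)]

def decode(pulse_times):
    """Recover the message bytes from the pulse offsets."""
    return bytes((t % SLOT_US) // STEP_US for t in pulse_times)

msg = b"\x02\x01"       # e.g. device 2, position 1
pulses = encode(msg)
assert decode(pulses) == msg
print(pulses)           # [20, 2570]
```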
  • the video processor 306 pulses the secondary IR display image and controls the panel driver 38 to project the secondary IR display image onto the display surface via the IR LED 50. Based on the pulse generated by the video processor 306, the panel driver 38 causes the IR LED to project the secondary IR image at the predetermined pulse time.
  • the secondary IR display image may be overlaid on top of the primary display image and may have a larger display area as compared to the primary display image. Since the secondary IR display image is in a spectrum that is not visible to the human eye, the overlay of this image on top of the primary image will not affect the viewing experience.
  • the larger display area of the secondary IR display image creates a border region extending around the periphery of the display area of the primary display image.
  • the video processor 306 may control the panel driver 38 to simultaneously drive the light engine 39 projecting the primary image and the IR LED 50 projecting the secondary IR display image.
  • the video processor 306, when decoding the video component of the A/V input signal 301 may reserve a frame time for each frame in which the secondary IR display image will be displayed. Additionally, or instead, the video processor 306 may assign a time within a current image during which the secondary IR display image is to be displayed.
  • the light engine 39 is turned off (e.g. inhibited) while the IR LED 50 is displaying the secondary IR display image.
  • the microprocessor 302 may control the sensor for sensing the secondary IR display image to be synchronized to the display pattern of the secondary IR display image thereby enabling the device 10 to know when to expect an appearance of secondary IR display image from another projection device in order to capture the secondary IR display image for processing thereof.
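  • The frame-time reservation and sensor synchronization described above can be illustrated with a simple slot schedule. The cycle length and slot index below are assumed values, not taken from the patent.

```python
# Minimal sketch of the time-slot scheme: every Nth frame time is reserved
# for the IR LED, the light engine is inhibited during that slot, and the
# camera is synchronized to the same slot so the device knows when to look
# for another projector's secondary image.
FRAMES_PER_CYCLE = 60     # assumed: one IR slot per 60 video frame times
IR_SLOT = 59              # assumed index of the reserved frame time

def frame_plan(frame_index):
    if frame_index % FRAMES_PER_CYCLE == IR_SLOT:
        return {"light_engine": "off", "ir_led": "on", "camera": "capture"}
    return {"light_engine": "on", "ir_led": "off", "camera": "idle"}

print(frame_plan(58))  # normal video frame
print(frame_plan(59))  # reserved IR frame: engine inhibited, camera synced
```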
  • the sensor is a camera module that includes the lens 54, the image sensor 56 and the image processor 58. While the secondary IR display image is invisible to the human eye, any camera module, even one with an IR filter, will still be able to capture the secondary IR display image because the contrast ratio of the IR pattern versus an image having no pattern will still register on the image sensor 56 of the camera module and enable the devices to communicate with one another.
  • the control information included within the secondary IR display image by the video processor 306 may include data derived from other sources.
  • the video processor 306 may generate control information representative of (a) environmental data around the device 10 and (b) output data that is output by any component of the device 10.
  • Environmental data may include, but is not limited to, (a) position of the device with respect to a display surface; (b) position of the device with respect to a projection surface (e.g. the surface on which the device is positioned); (c) ambient light; and (d) ambient sound.
  • Output data may include, but is not limited to, (a) size of images being projected; (b) brightness of images being projected; (c) geometry of images being projected; (d) focus of images being projected and (e) alignment of images being projected.
  • the environmental data may be sensed by the motion sensor 60 whereas the output data may be sensed by the camera module 52.
  • the motion sensor 60 senses the position of the device 10 with respect to the display surface and the projection surface and a camera module 52 can capture and record the primary display images being projected onto the display surface by the projection unit 30.
  • control information generated and inserted into the secondary IR display image by the video processor 306 may be derived from a remote system and received by the communication processor 62 of the device 10. Since the communication processor 62 enables bidirectional communication with other devices and systems over a communications network such as a cellular network or the internet, the communication processor 62 may receive a control message including instructions for generating the secondary IR display image. In this embodiment, the control message may provide the video processor 306 with a set of information to be included in the secondary IR display image as the control information. Moreover, the control message received from the remote system may include a set of information usable by the controller 12 in interpreting control information received from a different projection device.
  • a remote user and system can, in real time, send control messages to the various projection devices allowing for dynamic control and configuration of the primary images being projected by the respective projection units based on the secondary IR display images being pulsed by the panel driver 38 via the IR LED 50.
  • control information generated by and included within the secondary IR display image may be derived from a user input received from a respective user input control 31.
  • the projection device may continually re-configure other projection devices and simultaneously be re-configured by other projection devices in response to the successive secondary IR display images captured and interpreted by the camera module 52.
  • the microprocessor 302 executes an inter-device communication algorithm that controls the video processor 306 to generate data representative of the secondary IR display image including the control information.
  • the generated secondary IR display image is provided to the panel driver 38 which drives the IR LED 50 to project the secondary IR display image at predetermined pulsed intervals as discussed above.
  • the inter-device communication algorithm also simultaneously configures the camera module 52 to automatically sense any secondary IR display images projected from other devices while the IR LED 50 is projecting its own secondary IR display image.
  • the image sensor 56 detects the captured secondary IR display image and the image processor 58 processes the detected image to discern at least one pattern representative of the control information contained in the captured secondary IR display image. Any patterns detected by the image processor 58 are provided to the microprocessor 302 which compares detected pattern data with control pattern data stored in memory 304 and generates an action associated with the pattern data when a match is detected.
  • control information may include a type of video setting information having a unique pattern associated therewith.
  • the microprocessor 302 may compare the detected pattern with the control pattern information and cause the respective video setting in the projection device 10 to be modified in response to receipt of the control information from another projection device.
  • the control information including video setting data and the response to receipt thereof by the device is described for purposes of example only.
  • the control information may include any type of data that may cause the device 10 receiving the control information to operate in any desired manner and the microprocessor 302 may initiate a corresponding action associated with the operation information received.
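  • The match-and-act step described above, comparing a detected pattern against control pattern data held in memory 304 and initiating a corresponding action, might look like the following sketch. The pattern identifier and handler are hypothetical.

```python
# Minimal sketch of pattern dispatch: detected patterns are compared with
# control patterns held in memory and a handler runs on a match.
def set_brightness(device, value):
    device["brightness"] = value

CONTROL_PATTERNS = {
    "pattern_brightness": set_brightness,   # video-setting class of pattern
}

def on_pattern_detected(device, pattern_id, payload):
    handler = CONTROL_PATTERNS.get(pattern_id)
    if handler is None:
        return False            # unknown pattern: no action is taken
    handler(device, payload)
    return True

device = {"brightness": 100}
on_pattern_detected(device, "pattern_brightness", 80)
print(device)                   # {'brightness': 80}
```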
  • Figure 4 illustrates a single projection device 10 including the projection unit 30 for projecting images therefrom onto a display surface 402.
  • the device also includes the sensor (e.g. camera module 52) for sensing any secondary IR display images from other projection devices.
  • the projection device may project a composite image 401 including the primary display image 404 and the secondary display image 406 such that the display area of the secondary IR display image 406 exceeds, on all sides (406a - 406d), the display area 404 associated with the primary display image.
  • the device 10 advantageously provides an ability to communicate with any other projection device that is configured to display a primary image adjacent thereto.
  • the projection device 10 in Figure 4 can simultaneously provide control information in its secondary display image to, and sense control information from the secondary display images projected by, up to four separate projection devices.
  • each device can directly communicate and/or interact with up to four other projection devices. Additionally, in an instance where there are more than four projection devices, the total number of projection devices in the array may be discovered and provided to every other projection device to enable complete communication between and control over all projection devices in the array.
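  • A sketch of the composite-image geometry follows: because the secondary IR display image exceeds the primary image on all four sides, each side yields one border strip through which one adjacent projector can be reached, giving up to four direct neighbours. Dimensions are illustrative assumptions.

```python
# Minimal sketch of the border regions 406a - 406d around the primary
# display area: one strip per edge of the oversized secondary IR image.
def border_regions(primary, overscan):
    """primary: (x, y, w, h) of the primary image; overscan in pixels."""
    x, y, w, h = primary
    return {
        "left":   (x - overscan, y, overscan, h),
        "right":  (x + w,        y, overscan, h),
        "top":    (x, y - overscan, w, overscan),
        "bottom": (x, y + h,        w, overscan),
    }

# One communication edge per side -> up to four direct neighbours.
print(border_regions((100, 100, 1280, 720), overscan=40)["left"])
```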
  • Figure 5 illustrates an example of inter-device communication in a projection array comprised of two projection devices 10a and 10b.
  • the description contained herein is for ease of understanding and to illustrate the principles of operation. It should not be construed to be limiting and while only two devices 10a and 10b are shown, any number of projection devices may be used according to the same inventive operation and principles.
  • Figure 5 depicts a first device 10a having a first projection unit 30a and first sensor unit 52a and a second device 10b having a second projection unit 30b and second sensor unit 52b.
  • the first device 10a projects a first composite display image 501a, including the first primary display image 504a and the secondary IR display image 506a of the first device 10a, via the first projection unit 30a onto a display surface 502.
  • the display area associated with the first composite display image 501a is shown via the dashed lines emanating from the first projection unit 30a.
  • the second device 10b projects a second composite display image 501b including the second primary display image 504b and secondary IR display image 506b of the second device 10b via the second projection unit 30b onto a display surface 502.
  • the display area associated with the second composite display image 501b is shown via the dotted lines emanating from the second projection unit 30b.
  • the devices 10a and 10b are configured to project images adjacent each other and, in doing so, an overlapped region 503 is formed that includes both the secondary IR display image 506a of the first device 10a and the secondary IR display image 506b of the second device 10b.
  • any control information 510a and/or 510b contained in either of the secondary display images 506a and/or 506b is present in the overlapped region and may be used by the adjacent device 10a or 10b.
  • the control information 510a and 510b are shown here as differently shaped polygons merely to illustrate that control information may be formed as a unique pattern within the secondary display image 506a and/or 506b and any pattern may be used to represent different types of control information.
  • control information 510a identifying the device as the first projection device 10a may be generated and included in the secondary IR display image 506a of the first device 10a, whereas control information 510b identifying the device as the second projection device 10b may be generated and included in the secondary IR display image 506b of the second device 10b.
  • Because the control information items 510a and 510b are present in the overlapped region 503, they are discoverable and able to be used by both the first device 10a and the second device 10b.
  • Figure 6 illustrates how a respective device captures a secondary display image and uses the control information contained therein.
  • the image and image components shown in Figure 6 mirror those shown and described above in Figure 5.
  • the difference in Figure 6 relates to the particular operation of the second device 10b.
  • the controller 12 controls the projection device to simultaneously pulse the secondary display image and initiate an image capture function using the sensor (e.g. camera) 52.
  • the first device is shown projecting the first composite display image 501a including the secondary IR display image 506a of the first device 10a.
  • the sensor 52b on the second device 10b is activated to capture any secondary display images from any other devices (in this case, the first device 10a).
  • the area able to be captured by the sensor 52b of the second device 10b is shown by the dotted lines having a directional arrow towards the second device 10b.
  • the capturable area includes the overlapped region 503 and thus any control information 510a and 510b contained therein.
  • the sensor 52b captures the available secondary display images 506a and 506b and processes the captured image in the manner discussed above to derive any control information contained therein. Once the control information has been derived, the microprocessor 302 of the device 10b can determine what action, if any, should be taken based on the derived control information.
  • In an example where the control information 510a and 510b represents respective projector identifiers associated with the first device 10a and the second device 10b, upon capturing and processing the secondary display image via sensor 52b, the device 10b will determine that first control information 510a and second control information 510b are present.
  • the microprocessor 302 may compare these control information items to a pre-stored list of control information items to determine the action, if any, to be taken. In this instance, the device may determine that first control information 510a identifies the first device 10a and second control information 510b identifies itself (the second device).
  • the microprocessor 302 may then identify the edge at which the control information 510a was captured by sensor 52b and update a projection array map to indicate that the device adjacent to the second device 10b on the left side is the first device 10a.
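  • The projection-array-map update described above might be sketched as follows, with the map keyed by (row, column) positions. The data structure is an assumption for illustration.

```python
# Minimal sketch of the array-map update: the edge at which a peer's
# identifier was sensed fixes that peer's position relative to this device.
EDGE_OFFSETS = {"left": (0, -1), "right": (0, 1),
                "top": (-1, 0), "bottom": (1, 0)}

def update_array_map(array_map, my_pos, sensed_edge, peer_id):
    dr, dc = EDGE_OFFSETS[sensed_edge]
    array_map[(my_pos[0] + dr, my_pos[1] + dc)] = peer_id
    return array_map

# Device 10b at row 0, col 1 senses device 10a's identifier on its left:
print(update_array_map({(0, 1): "10b"}, (0, 1), "left", "10a"))
# {(0, 1): '10b', (0, 0): '10a'}
```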
  • control information projected onto a surface and sensed by a device may be used to specifically control an operation of the device sensing the control information.
  • control information may include information associated with configuration parameters of one device and cause the device sensing that control information to automatically modify a configuration parameter identified in the control information.
  • Figures 7A - 7C and 8 illustrate another aspect of the inter-device communication algorithm executed by the controller 12 of the respective projection devices. This exemplary operation relates to the automatic discovery of adjacent projection devices and configuration of images to be projected from the various available adjacent projection arrays.
  • a single primary display image may be formed from a subset of individual partial primary display images each projected by a different projection device.
  • the A/V input signal may include information indicating that the images contained therein are to be displayed in an array, and the video processor 306 may use the array information to process the video component of the A/V input signal that it is charged with projecting.
  • the device 10 may automatically determine that a plurality of projection devices exist and automatically share the projection load by portioning the primary display image into partial image components based on the discovery and position of additional projection devices.
  • the microprocessor 302 may use the control information exchanged with adjacent devices, which may include information identifying the position of each device's respective partial image within the composite image of the array.
  • the control information may also include configuration settings of the adjacent device (e.g. color, brightness, focus, etc) so as to minimize visual errors when the images are being displayed.
  • the devices may then use the various control information items as feedback to further re-configure themselves in order to maintain visual display consistency across the array.
  • the projection array may be used by a plurality of projection devices to display a plurality of different primary display images at various locations on the display surface 502. These images may be displayed simultaneously or at predetermined time intervals depending on preset array configuration information.
  • the display of one image may trigger an adjacent projection device to display the following image.
  • the display of one image by a respective projection device may cause another different adjacent projection device to be turned off.
  • the control information items would include display ordering information along with any parameter configuration information that may be used to keep the settings of each projection device substantially uniform.
  • each device in the array needs to discover and identify a position of all other projection devices in the array.
  • An example of how this is accomplished by the inter-device communication algorithm is shown in Figure 8 which provides a projection array formed from nine projection devices that are each configured to display their composite image in the positions identified therein.
  • all devices are aware of a current projection array pattern based on the pre-stored projection array patterns.
  • a user may use a respective user input control to select which of the array patterns is active.
  • the A/V signal may include array pattern data encoded therein such that the video processor 306, upon decoding the A/V input signal, sets a particular array pattern based thereon.
  • the array pattern information identifies a number of rows and number of positions within each row.
  • the array pattern information also sets a first device (1) as a master against which all future positions are to be compared.
  • the first device (1) knows that it is in the first position in an upper left corner of the array.
  • the first device (1) emits the secondary display image identifying itself.
  • Each of the other devices (2) - (9) simultaneously senses any secondary display images in its viewing field.
  • each of the other devices (2) - (9) looks to a predetermined edge for the region of overlap with adjacent devices.
  • the discovery algorithm instructs each device (2) - (9) to look at each edge of the display area to sense secondary display images and identify at which edges secondary display images are sensed.
  • the devices use the presence and/or absence of secondary display images to determine their position relative to other devices.
  • the control information included in the secondary display image projected by the first device includes that it is the master (e.g. first device) and positioned in the first position of the first row.
  • the control information will also include a total number of rows in the array and number of positions within each row.
  • the camera of device (2) captures the secondary display image of device (1) and, from the control information contained therein, determines that the device to its left is the master device. Device (2) thereby identifies itself as the second device, positioned in the second position of the first row, and projects its own secondary display image including its device number and position.
  • the control information will also include the number of each preceding device and its respective position along with the total number of rows and positions within each row for the particular array.
  • the control information for device (2) will indicate that device (2) is in the second position of the first row and that device (1) is positioned adjacent left of device (2).
  • Device (3) will be able to sense the secondary display image from device (2) and know that devices (1) and (2) are positioned to its left and that the array includes three rows, each with three positions. Knowing that positions one and two of the first row are taken, device (3) automatically identifies itself as device (3), and its control information will indicate it as such.
  • With respect to device (4), since it is not adjacent to device (3), device (4) senses its upper edge for secondary display information. By sensing the upper edge, device (4) will see that it is adjacent to device (1). Moreover, device (4) will understand that it is in the fourth position because the control information projected by device (1) provides the array information. Additionally, the control information projected by device (1) is continually updated as each successive device (2) and (3) is identified. Device (4) will use the edge at which the secondary display image was sensed, along with the control information identifying all other devices and positions in the adjacent row, to identify itself as device (4). Device identification for devices (5) - (8) is performed similarly. By way of all prior control information, all devices have knowledge of the positions of all other devices. Once this has occurred, the devices can be controlled to project images in any manner while continually providing information to all other devices such that individual configuration and/or operation thereof may be modified to maintain an optimal display of the images being projected onto the display surface 502. A condensed sketch of this discovery pass follows this item.
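The following condensed Python sketch illustrates the position derivation at the heart of this discovery pass, under the simplifying assumptions that positions are numbered row-major as in Figure 8 and that each device senses a single neighbor at its left or top edge; the function name is hypothetical.

    # Minimal sketch (simplified): deriving this device's array position from the
    # position claimed in a sensed neighbor's control information.

    def my_position(neighbor_pos, edge, cols):
        """neighbor_pos is the 1-based position claimed by the device whose
        secondary image was sensed; edge is where it was sensed from this
        device's point of view ("left" = neighbor adjacent left, "top" = above)."""
        if edge == "left":
            return neighbor_pos + 1        # next position in the same row
        if edge == "top":
            return neighbor_pos + cols     # same column, one row down
        raise ValueError("unsupported edge in this sketch")

    # Device (2) senses device (1) at its left edge in a 3-column array:
    print(my_position(1, "left", 3))   # 2
    # Device (4) senses device (1) at its top edge:
    print(my_position(1, "top", 3))    # 4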
  • Figure 9 is a flow diagram detailing another exemplary operation based on inter-device communication between projection devices.
  • the communication occurring between adjacent projection devices relates to video characteristic parameters: the control information in the secondary IR display image represents at least one video characteristic parameter associated with the primary display image being displayed by the respective projector.
  • Each projection device can use any video characteristic parameter from any other device as a form of error correction and/or image modification in an attempt to improve the image being projected thereby.
  • the following flow will be discussed in terms of a second projection device sensing control information from a secondary IR display image projected by a first projection device, similar to the arrangement shown in Figure 5. However, this is described for purposes of example only, and each projection device may use information from any other adjacent projection device to selectively modify the primary display image it projects.
  • in step 902, the sensor 52b on the second device 10b captures the secondary IR display image 506a projected by the projection unit 30a of the first device 10a.
  • the captured image is parsed to identify any patterns indicative of control information contained therein.
  • the control information may represent an image transform parameter.
  • the second device 10b determines a first error level by comparing the image transform parameter associated with the primary image 504a from the first device 10a with a current image transform parameter associated with the primary image 504b being projected by the second device 10b.
  • in step 906, the controller increments the current image transform parameter and, in step 908, determines whether the error level is increased in response to the incrementing of the transform parameter.
  • if not, the method continues in step 910, which further increments the transform parameter and compares it to the transform parameter of the primary display image 504b being output by projection unit 30b.
  • the method, in step 914, queries whether or not the error level is increased by the further incrementing of the parameter. If not, the method returns to step 910 and further incrementing and comparison occur. If the result of the query is positive, the adjustment of the transform parameter is ended and the video processor 306 may implement the updated transform parameter when decoding the A/V input signal 301 for projection by projection unit 30b.
  • if the result of the query in step 908 is positive (the error level increased), the method proceeds to step 909 and decrements the value of the transform parameter.
  • in step 911, the decremented transform parameter is compared to the current transform parameter and, in step 913, it is determined whether or not the error level has increased. If the result of the query of step 913 is negative, the method reverts to step 909 for further decrementing and comparison. If the result of the query is positive, the adjustment of the transform parameter is ended and the video processor 306 may implement the updated transform parameter when decoding the A/V input signal 301 for projection by projection unit 30b.
  • each projection device can include its own parameters in the control information for use by the other projection devices, enabling dynamic configuration of video parameters for each projection device to further improve the viewing experience of those in the viewing area. A sketch of the adjustment loop of steps 906 - 914 follows this item.
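A compact sketch of the adjustment loop of steps 906 - 914 follows. The error function is assumed to be any scalar measure comparing this device's transform parameter with the parameter reported in a neighbor's control information; the names are hypothetical and the iteration bound is a safety guard, not part of the disclosure.

    # Minimal sketch (assumed error metric): step the transform parameter in one
    # direction while the error level decreases; reverse direction if the first
    # increment raises the error; stop when the error increases.

    def tune_parameter(param, error, step=1, max_iters=100):
        best = error(param)
        direction = step
        # Steps 906/908: try one increment; if the error rises, reverse (step 909).
        if error(param + direction) > best:
            direction = -direction
        # Steps 910/914 (or 909/913): keep stepping while the error decreases.
        for _ in range(max_iters):
            candidate = param + direction
            e = error(candidate)
            if e > best:                 # error increased: end the adjustment
                break
            param, best = candidate, e
        return param

    # Toy usage: drive toward a neighbor's reported parameter value of 7.
    print(tune_parameter(3, lambda p: abs(p - 7)))   # 7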
  • the implementations described herein may be implemented in, for example, a method or process, an apparatus, or a combination of hardware and software. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method), the features discussed may also be implemented in other forms (for example, a hardware apparatus, a hardware and software apparatus, or a computer-readable medium). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware.
  • the methods may be implemented by instructions being performed by a processor, and such instructions may be stored on a processor-readable or computer-readable medium such as, for example, an integrated circuit, a software carrier, or another storage device such as, for example, a hard disk, a compact diskette, a random access memory ("RAM"), a read-only memory ("ROM"), or any other magnetic, optical, or solid state medium.
  • the instructions may form an application program tangibly embodied on a computer-readable medium such as any of the media listed above.
  • a processor may include, as part of the processor unit, a computer-readable medium having, for example, instructions for carrying out a process.
  • the instructions corresponding to the method of the present invention, when executed, can transform a general purpose computer into a specific machine that performs the methods of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

An apparatus and method for projecting received images onto a display surface is provided. The apparatus and method include a projection unit (30) that projects a primary display image and a secondary display image on a display surface, and a controller (12) coupled to the projection unit that generates the primary display image from the received images and the secondary display image including control information associated with the apparatus for receipt by at least one other projection device. The control information controls an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.

Description

Apparatus and Method of Communicating Between
Portable Projection Devices
FIELD
[001] The present arrangement provides a system and method for controlling the operation of a portable projection device.
BACKGROUND
[002] Conventionally, projection devices were (and are) designed as non-mobile devices that are positioned in a room and project a series of audio-visual images on a screen that is viewable by individuals within the room and in the line of sight of the projected image. To ensure projection quality and an optimal viewing experience for the individuals, these projection devices are precisely configured to minimize errors in the audio-visual images being displayed. Examples of these systems include but are not limited to movie theaters, professional meeting rooms, lecture halls and the like.
[003] However, the rapid miniaturization of electronic devices has also extended to projection devices. Currently, there exists a portable electronic projection device that may be easily transported and able to turn virtually any room into a projection room. These portable electronic projection devices are termed pico projectors. A pico projector may be included in any handheld device that can selectively project at least one of an image or series of images on a surface. Moreover, it is important for the pico projector to be able to generate a clear image of sufficient quality on any type of surface. This may include, for example, a conventional display screen or a wall in a room. It is, therefore, necessary for the pico projector to compensate for any surface impurities when generating and projecting a display image.
[004] Moreover, a further drawback associated with pico projection relates to the nature of the device itself. Because the pico projector is naturally handheld and/or portable, the pico projector suffers from increased visual display errors as compared to a traditional projection device. The increased visual errors (e.g. noise, distortion, etc) in images projected by pico projectors result from the often sub-optimal positioning of the pico projector with respect to the surface on which the images are being displayed as well as the orientation of individuals viewing the image to the surface on which the image is displayed. [005] Furthermore, as pico projectors are increasingly being embodied in multifunction devices, activities associated with functions other than the projection of images may interrupt, distort and/or otherwise affect the image being projected by the pico projector and/or the experience of the individuals viewing the projected images. An example of these drawbacks is present in a multi-function portable electronic device that, in addition to being a pico projector, is also a portable communication device (e.g. smartphone). Various call and message functionality associated with the portable communication device may interfere with the functionality of the pico projector embodied in the multifunction portable electronic device.
[006] It would, therefore, be desirable to correct any of the above identified drawbacks associated with pico projectors. A system and method according to the present invention address these deficiencies.
SUMMARY
[007] In one embodiment, an apparatus for projecting received images onto a display surface is provided. The apparatus includes a projection unit that projects a primary display image and a secondary display image on a display surface and a controller coupled to the projection unit that generates the primary display image from the received images and the secondary display image including control information associated with the apparatus for receipt by at least one other projection device. The control information controls an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
[008] In another embodiment, a method of projecting received images onto a display surface is provided. The method includes generating a primary display image from the received images, the primary display image being within the visible spectrum, and generating a secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the secondary display image being one of within the visible spectrum and outside of the visible spectrum. The method further includes projecting, using a projection unit, the primary display image and the secondary display image on a display surface and controlling an operation of the at least one other projection device using the control information displayed in the secondary display image received by the at least one other projection device.
[009] In a further embodiment, an apparatus for projecting received images onto a display surface is provided. The apparatus includes means, such as a projection unit, for projecting a primary display image and a secondary display image on a display surface, means for generating the primary display image from the received images, and means for generating the secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the control information controlling an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
[0010] The above presents a simplified summary of the subject matter in order to provide a basic understanding of some aspects of subject matter embodiments. This summary is not an extensive overview of the subject matter. It is not intended to identify key/critical elements of the embodiments or to delineate the scope of the subject matter. Its sole purpose is to present some concepts of the subject matter in a simplified form as a prelude to the more detailed description that is presented later.
[0011 ] To the accomplishment of the foregoing and related ends, certain illustrative aspects of embodiments are described herein in connection with the following description and the annexed drawings. These aspects are indicative, however, of but a few of the various ways in which the principles of the subject matter can be employed, and the subject matter is intended to include all such aspects and their equivalents. Other advantages and novel features of the subject matter can become apparent from the following detailed description when considered in conjunction with the drawings.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 is a block diagram of the portable projection device according to aspects of the present invention;
[0013] FIGS. 2A - 2D are exemplary light engines for use in the portable projection device according to aspects of the present invention;
[0014] FIG. 3 is a block diagram of components used in communicating with other devices according to aspects of the present invention;
[0015] FIG. 4 is an illustrative view of images projected by the portable projection device according to aspects of the present invention;
[0016] FIG. 5 is an illustrative view of a plurality of projection devices projecting images according to aspects of the present invention;
[0017] FIG. 6 is an illustrative view of a plurality of projection devices communicating with each other according to aspects of the present invention;
[0018] FIGS. 7A - 7C are exemplary image alignments according to aspects of the present invention;
[0019] FIG. 8 is an exemplary alignment image used in discovering a display position among a plurality of projection devices according to aspects of the present invention; and
[0020] FIG. 9 is a flow diagram detailing the operation of the portable projection device according to aspects of the present invention.
DETAILED DESCRIPTION
[0021] It should be understood that the elements shown in the FIGS. may be implemented in various forms of hardware, software or combinations thereof. Preferably, these elements are implemented in a combination of hardware and software on one or more appropriately programmed general-purpose devices, which may include a processor, memory and input/output interfaces.
[0022] The present description illustrates the principles of the present disclosure. It will thus be appreciated that those skilled in the art will be able to devise various arrangements that, although not explicitly described or shown herein, embody the principles of the disclosure and are included within its spirit and scope.
[0023] All examples and conditional language recited herein are intended for pedagogical purposes to aid the reader in understanding the principles of the disclosure and the concepts contributed by the inventor to furthering the art, and are to be construed as being without limitation to such specifically recited examples and conditions.
[0024] Moreover, all statements herein reciting principles, aspects, and embodiments of the disclosure, as well as specific examples thereof, are intended to encompass both structural and functional equivalents thereof. Additionally, it is intended that such equivalents include both currently known equivalents as well as equivalents developed in the future, i.e., any elements developed that perform the same function, regardless of structure.
[0025] Thus, for example, it will be appreciated by those skilled in the art that the block diagrams presented herein represent conceptual views of illustrative circuitry embodying the principles of the disclosure. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudocode, and the like represent various processes which may be substantially represented in computer readable media and so executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0026] The functions of the various elements shown in the figures may be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, by a single shared processor, or by a plurality of individual processors, some of which may be shared. Moreover, explicit use of the term "processor" or "controller" should not be construed to refer exclusively to hardware capable of executing software, and may implicitly include, without limitation, digital signal processor ("DSP") hardware, read only memory ("ROM") for storing software, random access memory ("RAM"), and nonvolatile storage.
[0027] If used herein, the term "component" is intended to refer to hardware, or a combination of hardware and software in execution. For example, a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, and/or a microchip and the like. By way of illustration, both an application running on a processor and the processor can be a component. One or more components can reside within a process and a component can be localized on one system and/or distributed between two or more systems. Functions of the various components shown in the figures can be provided through the use of dedicated hardware as well as hardware capable of executing software in association with appropriate software.
[0028] Other hardware, conventional and/or custom, may also be included. Similarly, any switches shown in the figures are conceptual only. Their function may be carried out through the operation of program logic, through dedicated logic, through the interaction of program control and dedicated logic, or even manually, the particular technique being selectable by the implementer as more specifically understood from the context.
[0029] In the claims hereof, any element expressed as a means for performing a specified function is intended to encompass any way of performing that function including, for example, a) a combination of circuit elements that performs that function or b) software in any form, including, therefore, firmware, microcode or the like, combined with appropriate circuitry for executing that software to perform the function. The disclosure as defined by such claims resides in the fact that the functionalities provided by the various recited means are combined and brought together in the manner which the claims call for. It is thus regarded that any means that can provide those functionalities are equivalent to those shown herein. The subject matter is now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the subject matter. It can be evident, however, that subject matter embodiments can be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing the embodiments.
[0030] The present invention is directed towards a multifunction portable electronic device (hereinafter, the "device") that includes audiovisual image projection capabilities (e.g. a pico projector) and method of operating the same. An exemplary block diagram of the device 10 is provided in Figure 1. The device 10 includes a controller 12. The controller 12 is a component that executes various operational algorithms that control the various functions of the device 10. In one embodiment, the controller 12 executes algorithms that enable audio and video processing of a source input signal. The controller 12 may also include a memory in which various machine executable instructions controlling various device functionality may be stored and accessed as needed in response to various control signals generated by one of (a) a user and (b) other components of the device 10 as will be discussed below. The memory of the controller 12 may also store data associated with any input signal received by the controller 12. The memory of controller 12 may also store user- specific information that is associated with a user of the device 10. In one embodiment, user specific information may include user preferences for configuring the device for a particular type of operation. The user specific information may include global preference information that configures aspects of device operation that are common between the various functions as well as function specific preference information that configures the device to operate in a particular manner when executing a particular function. While the controller 12 is described as including a memory, one skilled in the art should understand that the memory (or other storage medium) within the device may be a separately embodied component that is read/write accessible by the controller 12 as needed.
[0031] The device 10 also includes a power converter 14 and a battery 16 connected to the power converter 14. The power converter 14 is selectively connectable to an input power source (either AC or DC) for receiving power therefrom. Power received by the power converter 14 is provided to the battery 16 and selectively charges the battery 16 as needed. It should be understood that the operation of charging is meant to include an initial charging of the battery 16 as well as recharging the battery 16 after the power level has been depleted. Power is also simultaneously provided by the power converter 14 to the controller 12 for powering operation thereof. The controller 12 may selectively detect when input power is being provided to the power converter 14, causing the device 10 to operate in a first power mode when a connection to an input power source is detected and a second mode when no connection to an input power source is detected. In one embodiment of the first power mode, the controller 12 may execute a battery monitoring algorithm that enables the controller 12 to selectively detect a power level in the battery 16 and control the power converter 14 to direct power thereto. The controller 12 can also control charging of the battery 16 when the detected power level in the battery 16 is below a predetermined threshold. In another embodiment of the first power mode, the controller 12 may automatically direct power from the power converter 14 to be provided to the battery 16 in response to connection of the power converter with the input power source. In the second mode of operation, the controller 12 is powered by the battery 16 until such time that the battery power is depleted below a predetermined operational threshold representing a minimum amount of power needed to operate the device.
[0032] The controller 12 may receive an input audiovisual signal from one of a plurality of device inputs collectively referred to using reference numeral 15. The controller 12 can control selective projection of the audiovisual input signal using projection unit/microdisplay 30. The input audiovisual signal may include one of (a) a still image; (b) a series of images; (c) a video signal; and (d) an audio signal. The input audiovisual signal may also include an audio component that is intended to be audibly reproduced by speaker 29 in conjunction with the projection, by the projection unit 30, of the one still image or series of images as will be discussed below.
[0033] The plurality of inputs may include any combination of, but is not limited to, (a) a card reader 18; (b) a USB port 20; (c) a digital video input port (HDMI) 22; (d) a VGA/Component video input port 24; and (e) a composite/S-Video input port 26. The depiction of the plurality of input ports 15 is for purposes of example only and the device 10 may include any combination of the described input ports or other known input ports.
[0034] The card reader 18 selectively receives a storage card that may include data representative of the input audiovisual signal that is accessed by the controller 12 and provided to the projection unit 30 and/or speaker 29 for output thereof. In one embodiment, the card reader 18 may be a MicroSD card reader. This is described for purposes of example only and any card reading device able to read any standardized storage card may be included in device 10. The USB port 20 enables the device 10 to be selectively connected to one of (a) a portable storage device (e.g. flash drive) or (b) a secondary device that stores data representative of the audiovisual input signal. Any of the digital video input 22, VGA/component input 24 and/or composite video input 26 may enable connection with a secondary device that includes the source audiovisual input signal; these inputs are coupled to the controller 12 via an input selector 28. The input selector 28 selectively couples a respective one of the digital video input 22, VGA/component input 24 and/or composite video input 26 with the controller 12 such that the controller 12 may provide the audiovisual input signal to the projection unit 30 and speaker 29 for output thereof.
[0035] The device 10 further includes a plurality of user controls, collectively referred to using reference numeral 31, enabling the user to selectively control various device functions. An input/output (IO) interface 32 may include at least one user selectable button associated with at least one device function such that selection thereof initiates a control signal received by the controller 12 that is used to control the particular device function. In one embodiment, the IO interface 32 may be a touch screen and the at least one button may be a user selectable image element displayed on the touch screen enabling selection thereof by a user. In this embodiment, the number and types of user selectable image elements may be generated by the controller 12 depending on the particular operational mode of the device. For example, during projection mode, the user selectable image elements may enable activation of image projection functionality and, if the device 10 is operating in a communication mode, the user selectable image elements displayed on the IO interface 32 may relate to communication functionality. In another embodiment, the IO interface 32 may include at least one dedicated button on a housing of the device 10 that may be manually activated by a user.
[0036] Another user control 31 included with the device 10 is a keyboard 34. The keyboard 34 enables a user to enter alphanumeric text-based input commands for controlling the operation of the device. In one embodiment, the keyboard is positioned on the housing of the device. In another embodiment, there is no dedicated keyboard and the keyboard may be generated by the controller 12 and provided for display by the IO interface 32.
[0037] A further user control 31 that may be provided is a remote infrared (IR) sensor 36. Remote IR sensor 36 selectively receives an IR input signal that is generated by a remote control. The IR input signal received by the remote IR sensor 36 is communicated to the controller 12 which interprets the received IR input signal and initiates operation of a particular function of the device corresponding to user input.
[0038] Any of the user controls 32, 34 and/or 36 may be used to generate control signals for selecting an input audiovisual signal from a respective input source of the plurality of input sources 15. The control signals input by the user are received by the controller 12, which processes the user input signal and selects the source of the input audiovisual signal. Input received from any of the user controls 31 may also condition the controller 12 to selectively output the audiovisual signal using the projection unit 30 and speaker 29.
[0039] Operation of the projection unit 30 will now be discussed. The projection unit 30 may include a microdisplay/pico projection unit 30. The projection unit 30 includes a panel driver 38, a light engine 39 and a projection lens 48. The panel driver 38 receives the audiovisual input signal from the controller 12 and controls the light engine to emit light representative of the audiovisual input signal that may be projected via a projection lens 48 coupled thereto. The light engine 39 may include a light source and light processing circuitry that is selectively controlled by the panel driver 38 to generate light and project an image representing the audiovisual signal onto a surface. Exemplary types of light engines 39 will be discussed in greater detail with respect to Figures 2 A - 2D. However, persons skilled in the art will understand that any light engine used in any type of projection device (portable or otherwise) may be incorporated in the projection unit 30 of the device 10. In operation, the light generated by the light engine 39 is provided to the projection lens 48 which projects the full color image onto a display surface (e.g. screen, wall, etc). The projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
[0040] The projection unit 30 of the device may also include an infrared light emitting diode (IR LED) 50 that is coupled to the panel driver 38. In certain exemplary operations, the controller 12 may generate an IR audiovisual input signal based on the audiovisual input signal received from one of the plurality of inputs 31 or user controls. The IR audiovisual signal may be provided to the panel driver 38 which conditions the IR LED 50 to project an IR version of the audiovisual input signal. The IR signal is imperceptible to the human eye but may be used by other components as an input control signal in the manner discussed below.
[0041] The device 10 may also include a camera module 52. The camera module 52 may include a lens 54 coupled to an image sensor 56. Image data received via the lens 54 and sensed by the image sensor 56 may be processed by the image processor 58. The camera module 52 may operate as a conventional digital camera able to capture one of still images and video images. The camera module 52 may also operate as a sensor that senses at least one type of image being displayed and uses the sensed image as a control signal for controlling at least one function of the device 10 as will be discussed below. The lens 54 of the camera module 52, shown in conjunction with the projection lens 48 of the projection unit, is described for purposes of example only and the device may include a single lens that is shared between the projection unit 30 and the camera module 52.
[0042] A motion sensor 60 is also provided. The motion sensor 60 is coupled to the controller 12 and selectively senses data representing movement of the device 10. The motion sensor 60 may sense the position of the device and generate an input control signal used by the controller 12 for controlling device operation. The motion sensor 60 may include any type of motion sensor including but not limited to a gyroscope and/or an accelerometer. For example, in an embodiment, where the motion sensor 60 includes an accelerometer, the device 10 may include at least three accelerometers positioned on the X, Y and Z axis such that accelerometers may sense the position of the device 10 with respect to gravity. The motion sensor 60 may refer to a plurality of different sensors that are able to sense various types of data which may be provided to the controller 12 for analysis and processing thereof.
[0043] The device 10 also includes a communications processor 62 that enables bidirectional communication between the device 10 and a remote device. The communication processor 62 is described generally and is intended to include all electronic circuitry and algorithms that enable bidirectional communication between devices. In one embodiment, the communication processor 62 enables the device to operate as a cellular phone. In another embodiment, the communication processor 62 includes all components and instructions for connecting the device 10 to the internet. In a further embodiment, the communication processor 62 includes all components associated with a smartphone to enable a plurality of different types of bidirectional communication (e.g. telephone, email, messaging, internet, etc) between the device and a communications network.
[0044] Figures 2A - 2D are block diagrams representing different types of light engines 39 that may be employed within the projection unit 30 described in Figure 1. It should be understood that the portable projection device 10 as discussed herein may utilize any of the different light engines 39a - 39d described in Figures 2A - 2D. It should also be appreciated that the description of the light engines 39a - 39d is not limited to those described herein and any type of light engine able to generate and process light into a full color image for display on a surface may be used by the device 10. [0045] Figure 2A represents a three-color LED light engine 39a. The light engine 39a is controlled via the panel driver 38 (Fig. 1). The panel driver 38 receives the audiovisual input signal from the controller 12 and controls the operation of light emitting diodes (LED) 40a, 40b, and 40c. The LEDs 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. The audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output. Light generated by the LEDs 40a-c is focused into a full color image by a focusing element 42. In one embodiment, the focusing element 42 may be an x-cube. In another embodiment, the focusing element 42 may be a dichroic mirror. These focusing elements are described for purposes of example only and any focusing element 42 able to combine light from a plurality of LEDs into a single full color image may be used in the projection unit 30.
[0046] The focused image is projected on a liquid crystal on silicon (LCOS) chip 44 for receiving light emitted from each of the LEDs 40a - c and optically combines the received light via a polarizing beam splitter 46. The combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc). The projection lens 48 may be focused in response to user input received by the controller 12 as needed. Additionally, the operation and position of the various components of the projection unit 30 may be controlled via a control signal that is generated by either the user or another component of device 10.
[0047] Figure 2B depicts a white-light LED light engine 39b that may be used in the projection unit of the device 10. Light engine 39b may include a white light LED 41. The panel driver 38 (in Fig. 1) receives the audiovisual input signal from the controller 12 and controls the operation of the white light LED 41. The LED 41 is controlled to emit a pattern of light to generate the desired audiovisual image for output. Light generated by the LED 41 is provided to an LCOS chip 44b. The LCOS chip 44b has a predetermined pattern of primary color dots thereon. The panel driver 38 controls the LCOS chip 44b to have certain of the dots illuminated by the light emitted by LED 41 to provide colored light to the polarizing beam splitter 46b, which optically combines the colored light reflected off of the LCOS chip 44b. The combined light is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
[0048] Figure 2C depicts a digital light processing (DLP) engine 39c. The DLP engine 39c includes three colored light sources 40a, 40b, and 40c. In one embodiment, the light sources 40a - c represent three color LEDs including a blue LED 40a, a green LED 40b and a red LED 40c. While these are described as LED light sources, this is done for purposes of example only and the light sources may be any type of light sources including, but not limited to lasers as are known to be implemented in a DLP light engine. In operation, the light sources 40a-c are not on simultaneously. Rather, the panel driver 38 controls the individual light sources in sequence and the emitted light is provided to the focusing element for producing the full color image. In another embodiment of a DLP engine, a color wheel may be positioned between a light source and the focusing element 42. The panel driver 38 selectively controls the color wheel to rotate to one of the three primary colors based on the data in the audiovisual input signal to illuminate a respective light color at a given time.
[0049] The audiovisual input signal provided to the panel driver 38 has been separated into its component colors by the controller 12 and the panel driver 38 selectively controls the LEDs 40a-c to emit the necessary light to generate the desired audiovisual image for output. Light generated by the LEDs 40a-c are projected and focused into a full color image by a focusing element 42. The focusing element 42 may include a mirror unit 45 formed from at least one mirror which reflects the emitted light through prisms 47. The focused image is provided to the projection lens 48 which projects the combined full color image onto a display surface (e.g. screen, wall, etc).
[0050] Figure 2D depicts a laser-based light engine 39d. The laser light engine 39d includes light sources 43a - c that each emit a respective color light based on an audiovisual input signal. The light sources 43a - c are lasers that emit light in three distinct wavelengths. For example, light source 43a may be a laser that emits light at a wavelength associated with the color red whereas light source 43b may emit light at a wavelength associated with the color green and light source 43c may emit light at a wavelength associated with the color blue. The panel driver 38 controls the light sources 43a-c to emit respective colored light based on the audiovisual input signal received from the controller 12. The emitted light (either, concurrently or sequentially - depending on the panel driver being used) is provided to a focusing element 42. The focusing element 42 includes a set of combiner optics 49 that receives and combines the emitted laser light and provides the light to the mirror unit 45 including a plurality of individual mirrors. The mirror unit 45 is controlled by the panel driver 38 to rotate the plurality of mirrors based on the audiovisual input signal and reflects light to the projection lens 48 for projection onto a display surface (e.g. screen, wall, etc).
[0051] The projection device 10 advantageously emits at least one secondary image in a spectrum that is not visible to a user viewing the primary audiovisual image derived from the AV input signal that is projected through the lens 48 of the projection unit 30. The secondary image emitted by the projection unit may include control information that may be selectively captured by a sensor located on another projection device. Moreover, the projection device 10 may also include the same sensor for sensing secondary images including control information projected by other projection devices, thereby enabling communication between projection devices. The control information included in the secondary image may be any information associated with the particular projection device emitting the secondary image. Control information may include but is not limited to (a) device configuration information; (b) information associated with the primary image being projected; (c) position information associated with the projection position in a projection array; and (d) synchronization information for synchronizing display of primary images between projection devices. Thus, the projection of secondary images that can be captured and processed by other projection devices advantageously enables real-time communication between devices to facilitate at least one of (a) projector identification; (b) image and/or projector alignment; and (c) projector configuration. Furthermore, the control information contained in the secondary image may deliver patterns and other information that enhance the performance of the device projecting the secondary image as well as other devices capturing and processing the secondary images projected by the device. Because the secondary image is not visible to users viewing the primary image, real-time automated configuration and communication may occur. Thus, the viewers will not see the active light references associated with the secondary image but will have their viewing experience enhanced because the projection device(s) may use the control signal in these active light references to modify or otherwise optimize the viewing experience.
[0052] In one embodiment, the secondary image projected by the projection unit is an infrared (IR) image that is projected from an IR emitter or other IR light source. In this embodiment, the sensor included in the device and used to capture and process the secondary IR image may be a camera having sensitivity within the IR spectra. The following description will reference the secondary image projected by the projection device as an IR image. However, this is merely a single contemplated embodiment. It should be understood that the projection unit may include any type of light source able to project an image in the light spectrum that is not viewable by the human eye but which may be captured by an image capturing device such as a sensor or camera.
[0053] In this embodiment, the secondary IR images can also be used to communicate information either to the same projection device, or in the case of multiple projection devices, adjacent projectors. The secondary IR image may be pulse modulated over time and include control information (e.g. a digital message) for receipt by an IR sensor on another different projection device. One exemplary operation includes providing a secondary IR image that can be processed between projectors where one projection device can discover the existence of an adjacent projection to share or split the image processing between two projectors. Another exemplary operation allows for the secondary IR images to be used to help align multiple projectors individually by communication between the projectors to share the display time for some images and keep an individual display time for other images. Moreover, each projector could optimize its display for the enhancement of the multi-projection display.
[0054] In another embodiment, the secondary image may be generated by the light engine that generates the primary display image. In this embodiment, the communication between the devices may occur during a communication session that occurs when all projection devices are being configured for displaying the primary image. In this embodiment, the light engine may modulate the light to communicate information associated with device operation and/or parameters to other projection devices that are being simultaneously configured.
[0055] An exemplary block diagram of the components of the device 10 shown in Figure 1 used in generating, projecting and sensing a secondary image is shown in Figure 3. However, it should be understood that the device 10 shown in Figure 3 includes all components shown in Figure 1. Moreover, the components shown in Figure 3 having the same reference numerals operate in a similar manner as described above in Figure 1 and include the further operational features described in Figure 3.
[0056] The controller 12 includes a microprocessor 302 for controlling all logic operations able to be executed by the controller 12. A memory 304 is also provided for storing at least one type of data therein. The at least one type of data stored in memory 304, and which may be communicated to at least one other projection device, may be any data related to a particular operation of the controller 12. The data stored in memory 304 may also include data representing at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array; (g) a focal distance enabling other projection devices to use the same focus distance value to ensure that the image is at substantially the same distance and has substantially the same picture size; (h) a type of light engine; (i) an aspect ratio and/or resolution of the image being projected; (j) a brightness value identifying a percent of the present setting, which may be used to communicate whether the adjacent projection device has some margin to increase its brightness to match the brightness of the other projection devices; (k) an IR overscan value with respect to the main picture enabling other projection devices to calculate the actual picture from the IR image; (l) a pattern being projected on a wall to identify a position of the device projecting the image (e.g. if a trapezoidal picture is seen by an IR camera, other projection devices identify device 10 as being positioned at an angle with respect to the display surface); (m) an image center designator identifying a center of the picture of the imager (e.g. cross-hairs); and (n) at least one type of icon representing a particular parameter, wherein the position of display of the icon may represent the value associated with the particular parameter.
[0057] The above types of data stored in the memory and used in communicating with other devices are described for purposes of example only. The memory may include at least a unique identifier able to identify the device to any other device. The data stored in memory 304 may also include any type of data that describes a state of operation of the device 10. The memory 304 may also store unique identifiers and data describing a state of operation associated with other projection devices that has been communicated to device 10. Thus, each projection device advantageously may know the identity of every other projection device along with the control state of each device.
[0058] A video processor 306 and audio processor 310 are also provided. The video processor 306 processes the video component of any audiovisual (A/V) input signal 301 received from a respective A/V input 15 in a known manner. The video processor 306 selectively decodes the video component of the A/V input signal 301 and provides the decoded video data to the panel driver 38 of the projection unit 30 for projection thereof. The decoded video component being projected by the projection unit is termed the primary display image. The primary display image is the image that is visible to any person within the viewing range of the projection device 10. The audio processor 310 processes any audio component of the A/V input signal 301 for output to the audio driver 312 coupled to the speaker 29 (Fig. 1). A communication bus 303 connects each of the microprocessor 302, memory 304, video processor 306 and audio processor 310 and enables each of the components to communicate data therebetween.
[0059] In conjunction with decoding and processing the video component of the A/V input signal 301, the video processor 306 generates data representing a secondary IR display image which may be provided to the panel driver 38. The video processor 306 includes control information in the secondary IR display image for use in communicating with other projection devices. The control information included within the secondary IR display image includes a projector identifier enabling other projection devices to identify the source from which a respective secondary IR display image is being projected. The control information may also include at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; and (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array. The data included in the control information may be derived from one of (a) the memory 304; (b) user input via a respective user input control 31; and (c) the A/V input signal 301.
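For illustration, the control information items enumerated in paragraph [0059] can be pictured as a simple record. The following dataclass is a hypothetical rendering, with invented field names, of items (a) - (f) plus the projector identifier; it is not a format recited in the disclosure.

    # Minimal sketch (hypothetical field names): the control information items of
    # paragraph [0059] carried in the secondary IR display image.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class SecondaryImageControlInfo:
        projector_id: str                          # required source identifier
        alignment_pattern: Optional[bytes] = None  # (a) multi-device alignment
        relative_position: Optional[int] = None    # (b) position vs. other devices
        configuration: dict = field(default_factory=dict)  # (c) projection settings
        video_characteristic: Optional[str] = None # (d) from the AV input signal
        audio_characteristic: Optional[str] = None # (e) from the AV input signal
        ordering: Optional[list] = None            # (f) multi-projection display order

    info = SecondaryImageControlInfo(projector_id="device-1", relative_position=1)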
[0060] The control information in the secondary IR display image may include a predetermined pattern which may be selectively captured by a sensor for processing. Thus, the secondary display image may include any number of unique predetermined patterns that represent different types (classes) of information that may be used by other projection devices. In one embodiment, IR signaling can be transmitted as pulses which vary position or duration in time, or as a carrier modulated signal which varies in frequency, phase, or amplitude, or in some combination thereof. This advantageously enables ones and zeroes to be communicated to other projection devices, and the time progression of these bits, according to predetermined rules, may represent a pre-defined syntax (e.g., bits can compose bytes, and bytes can compose words which are interpreted as a sequence of commands and data). Thus, the syntax has associated semantics which define the commands plus data that can be interpreted and executed in each projection device. In this manner, any type of command, parameter or other information can be directly communicated to and understood by other projection devices.
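As a toy illustration of bits composing bytes and bytes composing command words, the sketch below packs a hypothetical one-byte command plus payload into the bit sequence an IR pulse train could carry, and parses it back on the sensing device. The framing and the command code are invented for illustration; the disclosure does not specify a particular syntax.

    # Minimal sketch (invented framing): a command byte plus data bytes converted
    # to the MSB-first bit sequence an IR pulse train would carry, and back.

    def to_bits(data: bytes):
        return [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

    def from_bits(bits):
        out = bytearray()
        for i in range(0, len(bits) - 7, 8):
            byte = 0
            for b in bits[i:i + 8]:
                byte = (byte << 1) | b
            out.append(byte)
        return bytes(out)

    CMD_SET_BRIGHTNESS = 0x01                  # hypothetical command code
    message = bytes([CMD_SET_BRIGHTNESS, 60])  # command word plus one data byte
    bits = to_bits(message)                    # what the IR LED would pulse out
    assert from_bits(bits) == message          # recovered by the sensing device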
[0061 ] The video processor 306 pulses the secondary IR display image and controls the panel driver 38 to project the secondary IR display image onto the display surface via the IR LED 50. Based on the pulse generated by the video processor 306, the panel driver 38 causes the IR LED to project the secondary IR image at the predetermined pulse time. The secondary IR display image may be overlaid on top of the primary display image and may have a larger display area as compared to the primary display image. Since the secondary IR display image is in a spectrum that is not visible to the human eye, the overlay of this image on top of the primary image will not affect the viewing experience. The larger display area of the secondary IR display image creates a border region extending around the periphery of the display area of the primary display image. Thus, border regions of respective adjacent primary display areas overlap thereby enabling adjacent projectors to project data to and sense data from adjacent projectors as will be discussed below. The video processor 306 may control the panel driver 38 to simultaneously drive the light engine 39 projecting the primary image and the IR LED 50 projecting the secondary IR display image. Alternatively, the video processor 306, when decoding the video component of the A/V input signal 301 may reserve a frame time for each frame in which the secondary IR display image will be displayed. Additionally, or instead, the video processor 306 may assign a time within a current image during which the secondary IR display image is to be displayed. In one embodiment, the light engine 39 is turned off (e.g. inhibited) while the IR LED 50 is displaying the secondary IR display image.
[0062] The microprocessor 302 may control the sensor for sensing the secondary IR display image to be synchronized to the display pattern of the secondary IR display image, thereby enabling the device 10 to know when to expect an appearance of a secondary IR display image from another projection device in order to capture the secondary IR display image for processing thereof. In the embodiment shown herein, the sensor is a camera module that includes the lens 54, the image sensor 56 and the image processor 58. While the secondary IR display image is invisible to the human eye, any camera module, even one with an IR filter, will still be able to capture the secondary IR display image because the contrast ratio of the IR pattern versus an image having no pattern will still register on the image sensor 56 of the camera module and enable the devices to communicate with one another.
[0063] The control information included within the secondary IR display image by the video processor 306 may include data derived from other sources. In one embodiment, the video processor 306 may generate control information representative of (a) environmental data around the device 10 and (b) output data that is output by any component of the device 10. Environmental data may include, but is not limited to, (a) position of the device with respect to a display surface; (b) position of the device with respect to a projection surface (e.g. the surface on which the device is positioned); (c) ambient light; and (d) ambient sound. Output data may include, but is not limited to, (a) size of images being projected; (b) brightness of images being projected; (c) geometry of images being projected; (d) focus of images being projected; and (e) alignment of images being projected. The environmental data may be sensed by the motion sensor 60 whereas the output data may be sensed by the camera module 52. The motion sensor 60 senses the position of the device 10 with respect to the display surface and the projection surface, and the camera module 52 can capture and record the primary display images being projected onto the display surface by the projection unit 30. One possible shape for a payload combining both classes of data is sketched below.
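The patent does not define a payload format, so every field name in this sketch is a hypothetical placeholder for the data classes listed above.

```python
# Illustrative assembly of the two control-information classes into one payload.
from dataclasses import dataclass, asdict

@dataclass
class EnvironmentalData:
    display_surface_pose: tuple      # position w.r.t. the display surface
    projection_surface_pose: tuple   # position w.r.t. the surface it sits on
    ambient_light: float
    ambient_sound: float

@dataclass
class OutputData:
    image_size: tuple
    brightness: float
    geometry: str
    focus: float
    alignment: str

def build_control_info(env: EnvironmentalData, out: OutputData) -> dict:
    """Bundle environmental and output data into one control payload."""
    return {"environment": asdict(env), "output": asdict(out)}
```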
[0064] Additionally, the control information generated and inserted into the secondary IR display image by the video processor 306 may be derived from a remote system and received by the communication processor 62 of the device 10. Since the communication processor 62 enables bidirectional communication with other devices and systems over a communications network, such as a cellular network or the Internet, it may receive a control message including instructions for generating the secondary IR display image. In this embodiment, the control message may provide the video processor 306 with a set of information to be included in the secondary IR display image as the control information. Moreover, the control message received from the remote system may include a set of information usable by the controller 12 in interpreting control information received from a different projection device. In this instance, a remote user or system can, in real time, send control messages to the various projection devices, allowing for dynamic control and configuration of the primary images being projected by the respective projection units based on the secondary IR display images being pulsed by the panel driver 38 via the IR LED 50.
[0065] In a further embodiment, the control information generated and included within the secondary IR display image may be derived from a user input received via a respective user input control 31.
[0066] Because the secondary IR display images including control information are continually pulsed at predetermined intervals, the projection device may continually re-configure other projection devices and simultaneously be re-configured by other projection devices in response to the successive secondary IR display images captured and interpreted by the camera module 52.
[0067] In operation, the microprocessor 302 executes an inter-device communication algorithm that controls the video processor 306 to generate data representative of the secondary IR display image including the control information. The generated secondary IR display image is provided to the panel driver 38 which drives the IR LED 50 to project the secondary IR display image at predetermined pulsed intervals as discussed above. The inter-device communication algorithm also simultaneously configures the camera module 52 to automatically sense any secondary IR display images projected from other devices while the IR LED 50 is projecting its own secondary IR display image. Upon receipt of any secondary IR display images by the camera module 52, the image sensor 56 detects the captured secondary IR display image and the image processor 58 processes the detected image to discern at least one pattern representative of the control information contained in the captured secondary IR display image. Any patterns detected by the image processor 58 are provided to the microprocessor 302 which compares detected pattern data with control pattern data stored in memory 304 and generates an action associated with the pattern data when a match is detected.
[0068] For example, the control information may include a type of video setting information having a unique pattern associated therewith. The microprocessor 302 may compare the detected pattern with the control pattern information and cause the respective video setting in the projection device 10 to be modified in response to receipt of the control information from another projection device. The control information including video setting data and the response to receipt thereof by the device is described for purposes of example only. The control information may include any type of data that may cause the device 10 receiving the control information to operate in any desired manner and the microprocessor 302 may initiate a corresponding action associated with the operation information received.
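A minimal sketch of this compare-and-act step follows; the pattern identifiers, the stored table, and the example action are all hypothetical.

```python
# Match detected patterns against control patterns stored in memory and run
# the action bound to each match. Keys and actions are illustrative only.

def apply_brightness(params: dict) -> None:
    """Example action: adjust a video setting named in the control info."""
    print("setting brightness to", params.get("value"))

CONTROL_PATTERNS = {
    "PATTERN_BRIGHTNESS": apply_brightness,
    # ... one entry per unique predetermined pattern ...
}

def handle_detected_patterns(detected: list[tuple[str, dict]]) -> None:
    for pattern_id, params in detected:
        action = CONTROL_PATTERNS.get(pattern_id)
        if action is not None:        # a match against the stored patterns
            action(params)
```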
[0069] Further operation of the device 10 alone and in combination with other projection devices will be described with respect to Figures 4 - 8.
[0070] Figure 4 illustrates a single projection device 10 including the projection unit 30 for projecting images therefrom onto a display surface 402. The device also includes the sensor (e.g. camera module 52) for sensing any secondary IR display images from other projection devices. As shown herein, the projection device may project a composite image 401 including the primary display image 404 and the secondary display image 406 such that the display area of the secondary IR display image 406 exceeds, on all sides (406a - 406d), the display area 404 associated with the primary display image. By exceeding the display area 404 associated with the primary display image (e.g. the one visible to any viewers) on all sides, the device 10 advantageously provides an ability to communicate with any other projection device that is configured to display a primary image adjacent thereto. Thus, the projection device 10 in Figure 4 can simultaneously provide control information in a secondary display image to, and sense control information from, other secondary display images projected by up to four separate projection devices. As the secondary display images from each projector are pulse modulated, each device can directly communicate and/or interact with up to four other projection devices. Additionally, in an instance where there are more than four projection devices, the total number of projection devices in the array may be discovered and provided to every other projection device to enable complete communication between, and control over, all projection devices in the array.
[0071] The manner in which communication occurs between projection devices will be discussed with respect to Figures 5 and 6. Figure 5 illustrates an example of inter-device communication in a projection array comprised of two projection devices 10a and 10b. The description contained herein is for ease of understanding and to illustrate the principles of operation. It should not be construed to be limiting and, while only two devices 10a and 10b are shown, any number of projection devices may be used according to the same inventive operation and principles.
[0072] Figure 5 depicts a first device 10a having a first projection unit 30a and first sensor unit 52a and a second device 10b having a second projection unit 30b and second sensor unit 52b. The first device 10a projects a first composite display image 501a including the first primary display image 504a and the secondary IR display image 506a of the first device 10a via the first projection unit 30a onto a display surface 502. The display area associated with the first composite display image 501a is shown via the dashed lines emanating from the first projection unit 30a. The second device 10b projects a second composite display image 501b including the second primary display image 504b and the secondary IR display image 506b of the second device 10b via the second projection unit 30b onto the display surface 502. The display area associated with the second composite display image 501b is shown via the dotted lines emanating from the second projection unit 30b. As can be seen herein, the devices 10a and 10b are configured to project images adjacent each other and, in doing so, an overlapped region 503 that includes both the secondary IR display image 506a of the first device 10a and the secondary IR display image 506b of the second device 10b is formed. Moreover, in the overlapped region 503, any control information 510a and/or 510b contained in either of the secondary display images 506a and/or 506b is included and may be used by the adjacent device 10a or 10b. The control information 510a and 510b are shown here as differently shaped polygons merely to illustrate that control information may be formed as a unique pattern within the secondary display image 506a and/or 506b and any pattern may be used to represent different types of control information.
[0073] For example, control information 510a identifying the device as the first projection device 10a may be generated and included in the secondary IR display image 506a of the first device 10a, whereas control information 510b identifying the device as the second projection device 10b may be generated and included in the secondary IR display image 506b of the second device 10b. However, because the control information items 510a and 510b are present in the overlapped region 503, they are discoverable and able to be used by both the first device 10a and the second device 10b.
[0074] It should be noted that the depiction of a single overlapped region 503 containing control information items from the first and second devices 10a and 10b is shown for purposes of example only. In operation, for each composite display image 501 projected by a respective device, up to four overlapped regions, one along each edge of the primary display image, may be generated depending on the position of the device and where on the display surface 502 the respective device is to project its composite display image.
[0075] Figure 6 illustrates how a respective device captures a secondary display image and uses the control information contained therein. The image and image components shown in Figure 6 mirror those shown and described above in Figure 5. The difference in Figure 6 relates to the particular operation of the second device 10b. As discussed above, the controller 12 controls the projection device to simultaneously pulse the secondary display image and initiate an image capture function using the sensor (e.g. camera) 52. To illustrate this, the first device is shown projecting the first composite display image 501a including the secondary IR display image 506a of the first device 10a. At substantially the same time, the sensor 52b on the second device 10b is activated to capture any secondary display images from any other devices (in this case, the first device 10a). The area able to be captured by the sensor 52b of the second device 10b is shown by the dotted lines having a directional arrow towards the second device 10b. The capturable area includes the overlapped region 503 and thus any control information 510a and 510b contained therein. The sensor 52b captures the available secondary display images 506a and 506b and processes the captured image in the manner discussed above to derive any control information contained therein. Once the control information has been derived, the microprocessor 302 of the device 10b can determine what action, if any, should be taken based on the derived control information.

[0076] To continue with the example described above where the control information 510a and 510b represents respective projector identifiers associated with the first device 10a and the second device 10b, upon capturing and processing the secondary display image via sensor 52b, the device 10b will determine that first control information 510a and second control information 510b are present. The microprocessor 302 may compare these control information items to a pre-stored list of control information items to determine the action, if any, to be taken. In this instance, the device may determine that the first control information 510a identifies the first device 10a and the second control information 510b identifies itself (the second device). The microprocessor 302 may then identify the edge at which the control information 510a was captured by sensor 52b and update a projection array map to indicate that the device adjacent to the second device 10b on the left side is the first device 10a.
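The mapping step might be sketched as follows; the edge labels, identifier strings and map structure are assumed representations rather than the disclosed format.

```python
# Update a projection array map from control information sensed at each edge.

def update_array_map(array_map: dict, own_id: str,
                     sensed: list[tuple[str, str]]) -> dict:
    """`sensed` holds (edge, device_id) pairs, e.g. ("left", "device-1")."""
    for edge, device_id in sensed:
        if device_id != own_id:       # ignore our own identifier pattern
            array_map[edge] = device_id
    return array_map

# Example: device 10b senses both identifiers in the left overlap region and
# records that the first device sits to its left.
# update_array_map({}, "device-2", [("left", "device-1"), ("left", "device-2")])
# -> {"left": "device-1"}
```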
[0077] While identification and mapping are described to illustrate the inter-device communication functionality, it is also understood that the control information projected onto a surface and sensed by a device may be used to specifically control an operation of the device sensing the control information. For example, the control information may include information associated with configuration parameters of one device and cause the device sensing that control information to automatically modify a configuration parameter identified in the control information.
[0078] Figures 7A - 7C and 8 illustrate another aspect of the inter-device communication algorithm executed by the controller 12 of the respective projection devices. This exemplary operation relates to the automatic discovery of adjacent projection devices and the configuration of the images to be projected by the various devices in the projection array.
[0079] By using a projection array comprised of a plurality of projection devices, a single primary display image may be formed from a set of individual partial primary display images, each projected by a different projection device. In this embodiment, the A/V input signal may include information indicating that the images contained therein are to be displayed in an array, and the video processor 306 may use the array information to process the video component of the A/V input signal that it is responsible for projecting.
However, the device 10 may automatically determine that a plurality of projection devices exist and automatically share the projection load by partitioning the primary display image into partial image components based on the discovery and position of additional projection devices. To accomplish this, the microprocessor 302 may use the
predetermined array patterns stored in memory 304 to partition the image contained in the A/V input signal 301 into respective components based on the number of available projection devices. In this embodiment, the control information may include information identifying a position of the device's respective partial image within the composite image of the array. The control information may also include configuration settings of the adjacent device (e.g. color, brightness, focus, etc.) so as to minimize visual errors when the images are being displayed. The devices may then use the various control information items as feedback to further re-configure themselves in order to maintain visual display consistency across the array. A simplified partitioning step is sketched below.
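The sketch assumes an even grid and uses plain numpy slicing; the function is illustrative, not the disclosed method.

```python
import numpy as np

def partition_tile(frame: np.ndarray, rows: int, cols: int,
                   row: int, col: int) -> np.ndarray:
    """Return the partial image a device at (row, col) should project."""
    h, w = frame.shape[0] // rows, frame.shape[1] // cols
    return frame[row * h:(row + 1) * h, col * w:(col + 1) * w]

# Example: in a 3x3 array, the device in row 0, column 1 projects this tile.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
tile = partition_tile(frame, rows=3, cols=3, row=0, col=1)
```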
[0080] In another embodiment, the projection array may be used by a plurality of projection devices to display a plurality of different primary display images at various locations on the display surface 502. These images may be displayed simultaneously or at predetermined time intervals depending on preset array configuration information. In this embodiment, where the projection devices display a plurality of different display images, the display of one image may trigger an adjacent projection device to display the following image. Additionally, the display of one image by a respective projection device may cause another, different adjacent projection device to be turned off. In this embodiment, the control information items would include display ordering information along with any parameter configuration information that may be used to keep the settings of each projection device substantially uniform.
[0081] Each device 10 may include a predetermined projection array pattern stored in memory 304 that may be used to automatically identify and configure the positions of each projection device in the projection array. Examples of projection arrays, and where respective projection devices will project respective images within the array, are shown in Figures 7A - 7C. Each of the arrays shown in Figures 7A - 7C is formed from nine (9) projection devices. However, this is shown for purposes of example only and any number of projection devices may be used to create any projection pattern for any type of projection array. Figure 7A represents a clockwise spiral array pattern, Figure 7B represents a horizontal zigzag pattern and Figure 7C represents a diagonal zigzag pattern.
[0082] In order to participate in a projection array and to automatically configure the various projection devices, each device in the array needs to discover and identify the position of all other projection devices in the array. An example of how this is accomplished by the inter-device communication algorithm is shown in Figure 8, which provides a projection array formed from nine projection devices that are each configured to display their composite image in the positions identified therein.

[0083] In determining the position of the primary display image for each projection device, all devices are aware of a current projection array pattern based on the pre-stored projection array patterns. In one embodiment, a user may use a respective user input control to select which of the array patterns is active. In another embodiment, the A/V signal may include array pattern data encoded therein such that the video processor 306, upon decoding the A/V input signal, sets a particular array pattern based thereon.
[0084] The array pattern information identifies a number of rows and a number of positions within each row. The array pattern information also sets a first device (1) as a master against which all future positions are to be compared. In this embodiment, the first device (1) knows that it is in the first position in the upper left corner of the array. The first device (1) emits the secondary display image identifying itself. Each of the other devices (2) - (9) simultaneously senses any secondary display images in its viewing field. Depending on the array pattern designated, each of the other devices (2) - (9) looks to a predetermined edge for the overlapped region with adjacent devices. Based on this pattern, the discovery algorithm instructs each device (2) - (9) to look at each edge of the display area to sense secondary display images and to identify at which edges secondary display images are sensed. The devices use the presence and/or absence of secondary display images to determine their position relative to other devices.
[0085] Thus, in exemplary operation, once device (1) has been set as the master projection device, the control information included in the secondary display image projected by the first device indicates that it is the master (e.g. first device) and positioned in the first position of the first row. The control information will also include the total number of rows in the array and the number of positions within each row. Before any other projection device can announce its position, it needs to sense the immediately preceding, already-positioned device. Thus, the camera of device (2) captures the secondary display image of device (1) and, from the control information contained therein, determines that the device to its left is the master. Device (2) thereby identifies itself as the second device, positioned in the second position of the first row, and will now project its own secondary display image that includes its device number and position. The control information will also include the number of each preceding device and its respective position, along with the total number of rows and positions within each row for the particular array. Thus, the control information for device (2) will indicate that device (2) is in the second position of the first row and that device (1) is positioned adjacent left of device (2). Device (3) will be able to sense the secondary display image from device (2) and know that devices (1) and (2) are positioned to its left and that the array includes three rows each with three positions. Knowing that positions one and two of the first row are taken, device (3) automatically identifies itself as device (3) and its control information will indicate it as such.
[0086] With respect to device (4), since it is not adjacent device (3), device (4) senses its upper edge for secondary display information. By sensing the upper edge, device (4) will see that it is adjacent device (1). Moreover, device (4) will understand that it is in the fourth position because the control information projected by device (1) provides the array information. Additionally, the control information projected by device (1) is continually updated as each successive device (2) and (3) is identified. Device (4) will use the edge at which the secondary display image was sensed, along with the control information identifying all other devices and positions in the adjacent row, to identify itself as device (4). Device identification for devices (5) - (9) is performed similarly. By way of all prior control information, all devices have knowledge of the position of all other devices. Once this has occurred, the devices can be controlled to project images in any manner while continually providing information to all other devices such that individual configuration and/or operation thereof may be modified to maintain an optimal display of the images being projected onto the display surface 502.
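Condensed into a sketch, the handshake amounts to each device claiming the first position not yet claimed by a sensed neighbor; the message fields and row-major claiming order are assumptions consistent with the Figure 8 example.

```python
# Claim the next free (row, col) position based on neighbors' control info.

def claim_position(sensed_msgs: list[dict], rows: int, cols: int):
    """`sensed_msgs` holds control info from visible neighbors; each message
    is assumed to carry the set of positions already claimed in the array."""
    claimed = set()
    for msg in sensed_msgs:
        claimed.update(msg["claimed_positions"])
    for r in range(rows):             # row-major scan: first free slot is ours
        for c in range(cols):
            if (r, c) not in claimed:
                return (r, c)
    return None                       # array already fully populated
```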
[0087] Figure 9 is a flow diagram detailing another exemplary operation based on inter-device communication between projection devices. In this embodiment, the communication occurring between adjacent projection devices relates to characteristics of the images being displayed by each of the projection devices. Thus, the control information in the secondary IR display image represents at least one video characteristic parameter associated with the primary display image being displayed by the respective projector. Each projection device can use any video characteristic parameter from any other device as a form of error correction and/or image modification in an attempt to improve the image being projected thereby. The following flow will be discussed in terms of a second projector sensing control information from a secondary IR display image being projected from a first projection device, similar to the arrangement shown in Figure 5. However, this is described merely for purposes of example and each projection device may use information from any other adjacent projection device to selectively modify the primary display image being projected thereby.

[0088] In step 902, the sensor 52b on the second device 10b captures the secondary IR display image 506a projected by the projection unit 30a of the first device 10a. The captured image is parsed to identify any patterns indicative of control information contained therein. In this embodiment, the control information may represent an image transform parameter. In step 904, the second device 10b determines a first error level by comparing the image transform parameter associated with the primary image 504a from the first device 10a with a current image transform parameter associated with the primary image 504b being projected by the second device 10b. In step 906, the second device 10b increments the current image transform parameter and, in step 908, the controller determines whether the error level is increased in response to the incrementing of the transform parameter. If the error level is not increased, the method continues in step 910 and further increments the transform parameter and compares it with the transform parameter of the primary display image 504b being output by projection unit 30b. The method in step 914 queries whether or not the error level is increased by the further incrementing of the parameter. If not, the method returns to step 910 and further incrementing and comparison occur. If the result of the query is positive, the adjustment of the transform parameter is ended and the video processor 306 may implement the updated transform parameter when decoding the A/V input signal 301 for projection by projection unit 30b.
[0089] Referring back to step 908, if the incrementing of the transform parameter increases the error level, the method proceeds to step 909 to decrement the value of the transform parameter. In step 911, the decremented transform parameter is compared to the current transform parameter and, in step 913, it is determined whether or not the error level has increased. If the result of the query of block 913 is negative, the method reverts back to step 909 for further decrementing and comparison. If the result of the query is positive, the adjustment of the transform parameter is ended and the video processor 306 may implement the updated transform parameter when decoding the A/V input signal 301 for projection by projection unit 30b.
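The increment/decrement search of steps 902-913 is effectively a one-dimensional descent on the error level. A sketch, assuming a scalar transform parameter and an absolute-difference error metric (the patent specifies neither):

```python
# Step the local transform parameter in whichever direction lowers the error
# against the neighbor's parameter; stop as soon as the error rises again.

def adjust_transform(local: float, neighbor: float, step: float = 0.01) -> float:
    error = abs(neighbor - local)
    direction = 1.0
    if abs(neighbor - (local + step)) > error:
        direction = -1.0              # incrementing raised the error: decrement
    while True:
        candidate = local + direction * step
        new_error = abs(neighbor - candidate)
        if new_error > error:         # error increased: keep the last value
            return local
        local, error = candidate, new_error
```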
[0090] Thus, each projection device can include its own parameters in the control information for use by the other projection devices, enabling dynamic configuration of video parameters for each projection device to further improve the viewing experience of those in the viewing area.

[0091] The implementations described herein may be implemented in, for example, a method or process, an apparatus, or a combination of hardware and software. Even if only discussed in the context of a single form of implementation (for example, discussed only as a method), the implementation of features discussed may also be implemented in other forms (for example, a hardware apparatus, a hardware and software apparatus, or a computer-readable medium). An apparatus may be implemented in, for example, appropriate hardware, software, and firmware. The methods may be implemented in, for example, an apparatus such as, for example, a processor, which refers to any processing device, including, for example, a computer, a microprocessor, an integrated circuit, or a programmable logic device. Processing devices also include communication devices, such as, for example, computers, cell phones, tablets, portable/personal digital assistants ("PDAs"), and other devices that facilitate communication of information between end-users.
[0092] Additionally, the methods may be implemented by instructions being performed by a processor, and such instructions may be stored on a processor-readable or computer-readable medium such as, for example, an integrated circuit, a software carrier or other storage device such as, for example, a hard disk, a compact diskette, a random access memory ("RAM"), a read-only memory ("ROM") or any other magnetic, optical, or solid state medium. The instructions may form an application program tangibly embodied on a computer-readable medium such as any of the media listed above. As should be clear, a processor may include, as part of the processor unit, a computer-readable medium having, for example, instructions for carrying out a process. The instructions, corresponding to the method of the present invention, when executed, can transform a general purpose computer into a specific machine that performs the methods of the present invention.
[0093] What has been described above includes examples of the embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the embodiments, but one of ordinary skill in the art can recognize that many further combinations and permutations of the embodiments are possible. Accordingly, the subject matter is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.

Claims

1. An apparatus for projecting received images onto a display surface comprising
a projection unit (30) that projects a primary display image and a secondary display image on a display surface; and
a controller (12) coupled to the projection unit that generates the primary display image from the received images and the secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the control information controlling an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
2. The apparatus according to claim 1, wherein
said controller controls the projection unit to project the secondary display image onto the display surface at predetermined intervals.
3. The apparatus according to claim 1, further comprising
at least one sensor (52) coupled to the controller that senses the display surface to detect a secondary display image projected by at least one of said projection unit and the at least one other projection device.
4. The apparatus according to claim 3, wherein
said sensor detects control information contained in the secondary display image projected by the at least one other projection device; and
said controller uses the detected control information to control at least one operation of the apparatus.
5. The apparatus according to claim 3, wherein
said controller synchronizes activation of said sensor with projection of said secondary display image from one of said projection unit and the at least one other projection device.
6. The apparatus according to claim 1, wherein
said controller reserves a frame time within each image frame for projection of said secondary display image and selectively controls the projection unit to inhibit projection of the primary display image during the reserved frame time.
7. The apparatus according to claim 1, wherein
said controller causes the projection unit to simultaneously project the secondary display image with the primary display image.
8. The apparatus according to claim 1, wherein
the secondary display image has a display area larger than a display area associated with the primary display image enabling an adjacent projection device to sense the secondary display image.
9. The apparatus according to claim 1, wherein
the control information includes at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array; and (g) a unique apparatus identifier.
10. The apparatus according to claim 1, further comprising
a communication processor (62) coupled to said controller for receiving at least one control message from a remote device, and
said controller generating control information based on the at least one received control message for controlling a parameter of one of the apparatus and the at least one other projection device.
11. The apparatus according to claim 1, further comprising
a motion sensor (60) coupled to said controller for sensing a position of the apparatus with respect to a surface on which the apparatus is positioned and the surface on which the primary and secondary display images are being displayed, wherein said controller generates the control information based on the sensed position of the apparatus for inclusion into said secondary display image.
12. The apparatus according to claim 3, wherein
said sensor (52) is a camera module including
an image sensor (56) for sensing the secondary display image from one of the apparatus and the at least one other projection device; and
an image processor (58) for detecting the control information within the sensed secondary display image.
13. The apparatus according to claim 1, wherein
said controller generates the control information by creating unique image patterns associated with respective operations of the apparatus and the at least one other projection device able to be controlled using said control information.
14. A method of projecting received images onto a display surface comprising generating (12) a primary display image from the received images, the primary display image being within the visible spectrum;
generating (12) a secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the secondary display image being one of within the visible spectrum and outside of the visible spectrum;
projecting, using a projection unit (30), the primary display image and the secondary display image on a display surface;
controlling an operation of the at least one other projection device using the control information displayed in the secondary display image.
15. The method according to claim 14, further comprising
controlling the projection unit to project the secondary display image onto the display surface at predetermined intervals.
16. The method according to claim 14, further comprising
sensing, via at least one sensor (52), the display surface to detect a secondary display image projected by at least one of said projection unit and the at least one other projection device.
17. The method according to claim 16, further comprising
detecting control information contained in the secondary display image projected by the at least one other projection device; and
using the detected control information to control at least one operation of the apparatus.
18. The method according to claim 16, further comprising
synchronizing activation of said at least one sensor with projection of said secondary display image from one of said projection unit and the at least one other projection device.
19. The method according to claim 14, further comprising
reserving a frame time within each image frame for projection of said secondary display image;
inhibiting projection of the primary display image during the reserved frame time.
20. The method according to claim 14, further comprising
simultaneously projecting, by the projection unit, the secondary display image with the primary display image.
21. The method according to claim 14, wherein
the secondary display image has a display area larger than a display area associated with the primary display image enabling an adjacent projection device to sense the secondary display image.
22. The method according to claim 14, wherein
the control information includes at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array; and (g) a unique apparatus identifier.
23. The method according to claim 14, further comprising
receiving, at a communication processor (62), at least one control message from a remote device, and
generating control information based on the at least one received control message for controlling a parameter of one of the apparatus and the at least one other projection device.
24. The method according to claim 14, further comprising
sensing, using a motion sensor (60), a position of the apparatus with respect to a surface on which the apparatus is positioned and the surface on which the primary and secondary display images are being displayed; and
generating the control information based on the sensed position of the apparatus for inclusion into said secondary display image.
25. The method according to claim 14, wherein
said activity of sensing is performed by a camera module and further includes
sensing, at an image sensor (56), the secondary display image from one of the apparatus and the at least one other projection device; and
detecting, at an image processor (58), the control information within the sensed secondary display image.
26. The method according to claim 14, further comprising
generating the control information by creating unique image patterns associated with respective operations of the apparatus and the at least one other projection device able to be controlled using said control information.
27. An apparatus for projecting received images onto a display surface comprising
means for projecting (30) a primary display image and a secondary display image on a display surface; and
means for generating (12) the primary display image from the received images;
means for generating (12) the secondary display image including control information associated with the apparatus for receipt by at least one other projection device, the control information controlling an operation of the at least one other projection device, wherein the primary display image is within the visible spectrum and the secondary display image is one of within the visible spectrum and outside of the visible spectrum.
28. The apparatus according to claim 27, further comprising
means for controlling (12) said means for projecting to project the secondary display image onto the display surface at predetermined intervals.
29. The apparatus according to claim 27, further comprising
means for sensing (52) the display surface to detect a secondary display image projected by at least one of said means for projecting and the at least one other projection device.
30. The apparatus according to claim 29, wherein
said means for sensing detects control information contained in the secondary display image projected by the at least one other projection device and further comprises
means for using (12) the detected control information to control at least one operation of the apparatus.
31. The apparatus according to claim 29, further comprising
means for synchronizing (12) activation of said means for sensing with projection of said secondary display image from one of said means for projecting the secondary display image and the at least one other projection device.
32. The apparatus according to claim 27, further comprising
means for reserving a frame time within each image frame for projection of said secondary display image and for selectively inhibiting said means for projecting the primary display image during the reserved frame time.
33. The apparatus according to claim 27, wherein
said means for projecting the primary display image and said means for projecting the secondary display image are caused to simultaneously project the secondary display image with the primary display image.
34. The apparatus according to claim 27, wherein
the secondary display image has a display area larger than a display area associated with the primary display image enabling an adjacent projection device to sense the secondary display image.
35. The apparatus according to claim 27, wherein
the control information includes at least one of (a) an alignment pattern for aligning the display of images projected from a plurality of projection devices onto a display surface; (b) a relative position of one projection device with respect to at least one other projection device; (c) projection configuration information; (d) a video characteristic associated with the AV input signal; (e) an audio characteristic associated with the AV input signal; (f) ordering information for controlling an order in which images in the AV input signal are to be displayed in a multi-projection array; and (g) a unique apparatus identifier.
36. The apparatus according to claim 27, further comprising
means for receiving (62) at least one control message from a remote device, and
said means for generating said secondary display image generates control information for controlling a parameter of one of the apparatus and the at least one other projection device based on the at least one received control message.
37. The apparatus according to claim 27, further comprising
means for sensing a position (60) of the apparatus with respect to a surface on which the apparatus is positioned and the surface on which the primary and secondary display images are being displayed, wherein said means for generating the secondary display image generates the control information based on the sensed position of the apparatus for inclusion into said secondary display image.
38. The apparatus according to claim 27, wherein
said means for generating the secondary display image generates the control information by creating unique image patterns associated with respective operations of the apparatus and the at least one other projection device able to be controlled using said control information.
PCT/US2013/048538 2013-06-28 2013-06-28 Apparatus and method of communicating between portable projection devices WO2014209355A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/048538 WO2014209355A1 (en) 2013-06-28 2013-06-28 Apparatus and method of communicating between portable projection devices

Publications (1)

Publication Number Publication Date
WO2014209355A1 (en)

Family ID=48783376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/048538 WO2014209355A1 (en) 2013-06-28 2013-06-28 Apparatus and method of communicating between portable projection devices

Country Status (1)

Country Link
WO (1) WO2014209355A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106162128A (en) * 2016-08-30 2016-11-23 郑崧 The method and system of projection image co-registration

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110075101A1 (en) * 2009-09-28 2011-03-31 Seiko Epson Corporation Projector, Projection System, and Method for Controlling Projection System
EP2584403A2 (en) * 2011-10-21 2013-04-24 Disney Enterprises, Inc. Multi-user interaction with handheld projectors

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13736728; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13736728; Country of ref document: EP; Kind code of ref document: A1)