GB2477787A - Data Overlay Generation Using Portable Electronic Device With Head-Mounted Display - Google Patents

Info

Publication number
GB2477787A
Authority
GB
United Kingdom
Prior art keywords
signal
portable device
orientation
head
mounted display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1002485A
Other versions
GB2477787B (en)
GB201002485D0 (en)
Inventor
Marcus Alexander Mawson Cavalier
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to GB1002485.9A
Publication of GB201002485D0
Publication of GB2477787A
Application granted
Publication of GB2477787B
Status: Expired - Fee Related

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/398Synchronisation thereof; Control thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/172Processing image signals image signals comprising non-image signal components, e.g. headers or format information
    • H04N13/183On-screen display [OSD] information, e.g. subtitles or menus
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention provides a portable electronic device 200 comprising a first input for receiving a first image signal from a camera 110 of a head-mounted display (HMD) device 100; a second input for receiving a position signal from a position sensor 230, for example a GPS sensor, in proximity to the portable electronic device; and a third input for receiving a first orientation signal from an orientation sensor 130 of the HMD device. These signals are used by a processor running an augmented reality (AR) application 30 to generate a data overlay 32 of the first image signal based on the position signal and the first orientation signal. The data overlay 32 is transmitted to a display 120 of said head-mounted display device.

Description

Use of Portable Electronic Devices with Head-Mounted Display Devices

The present invention concerns portable electronic devices. Examples of such portable devices include, without limitation, mobile telephones, personal digital organizers, pocket, tablet and laptop computers, portable games machines and portable navigational devices. More particularly, the present invention concerns portable devices usable with head-mounted display (HMD) devices. Such HMD devices are well known and are typically used in virtual reality (VR) applications, such as immersive computer gaming. They may be either monocular or binocular and may provide either flat, two-dimensional images or stereoscopic, three-dimensional images.
More sophisticated varieties of HMD can also be used in augmented reality (AR) applications, wherein data is overlaid over a real world subject that is viewed directly by a user of the HMD. An example of a known HMD device usable with an AR application is a pilot's helmet fitted with a visor which gives the pilot a head-up display of flight data overlaid on the pilot's view of the real world. HMD devices can operate on a variety of different display technologies, such as retinal projection or OLED.
It is known to run an AR application on the processor of a portable electronic device.
An example of such an AR application for portable devices is the Layar™ application for smart phones such as the Apple™ iPhone™ or mobile telephones running the Google™ Android™ operating system. Layar™ works by receiving an image signal from a camera mounted on the portable device, a position signal from a position sensor in the portable device, and an orientation signal from an orientation sensor also in the portable device, generating a data overlay of the image signal based on the position signal and the orientation signal, and displaying the data overlay on a display of the portable device over an image derived from the image signal. Thus, when a user of such a mobile telephone points the phone's camera at a real world subject while the phone's display screen is in real-time display mode, data about the subject at which the phone's camera is pointed can be displayed on the phone's display screen, overlaid on an image of the real world subject. Although potentially very useful, this technology has the disadvantage that, for constant viewing of changing data overlaid on changing real world subjects, a user must look constantly at the phone's display screen whilst moving the phone around and pointing it in different directions. For example, if the data overlay gives directional information on how to reach a desired destination (for example, a map), a user must walk holding the phone out in front of them, pointing the phone's camera in their direction of travel, whilst continuing to look at the phone's display screen. This runs the risk of colliding with another person or object, or tripping over something outside the user's field of vision or focus of attention, as well as making it difficult to focus on the phone's display screen as it moves.
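The pipeline just described — camera image in, position and orientation in, overlay out — can be sketched in a few lines. This is a hypothetical illustration, not code from the Layar™ application: the names (`PointOfInterest`, `generate_overlay`) and the bearing-to-screen mapping are assumptions.

```python
from dataclasses import dataclass

@dataclass
class PointOfInterest:
    name: str
    bearing_deg: float  # compass bearing of the subject from the user's position

def generate_overlay(heading_deg, fov_deg, points):
    """Label the points of interest that fall inside the camera's field of view."""
    overlay = []
    for poi in points:
        # Signed angular offset of the subject from the centre of the view,
        # normalised to [-180, 180).
        offset = (poi.bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
        if abs(offset) <= fov_deg / 2.0:
            # Map the offset to a horizontal screen fraction in [0, 1].
            overlay.append((poi.name, round(0.5 + offset / fov_deg, 2)))
    return overlay

# Facing almost due east (85 degrees) with a 60-degree field of view: a station at
# bearing 90 degrees is labelled slightly right of centre; a museum behind the
# user is excluded.
pois = [PointOfInterest("Station", 90.0), PointOfInterest("Museum", 270.0)]
print(generate_overlay(heading_deg=85.0, fov_deg=60.0, points=pois))  # [('Station', 0.58)]
```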
It is also known to use a portable device with an HMD. For example, international patent application no. WO 2009/054619 describes providing a portable device with a recognisable mark, such as the shape of the display screen of the portable device.
During operation, the portable device receives an image signal from a camera, which is preferably mounted on an HMD, where the image signal includes the mark captured by the camera. The portable device then uses the apparent location and distance of the mark captured by the camera to calculate the relative position of the camera and the mark, transforms data displayed on the screen of the portable device and overlays the transformed data on the received image signal to form a synthesized image, where both the transformation and the position of the overlay are determined by the relative position of the camera and the mark, and then transmits the synthesized image to a display of the HMD. Thus, when a user of such a portable device points the camera at the display screen of the portable device, an enlarged image of data displayed on the screen is seen on the display of the HMD as a data overlay synthesized with an image of the real world captured by the camera of the HMD. This technology allows a user to view very detailed or large amounts of data which otherwise could not be viewed on the small display screen of the portable device, whilst still being able to view an image of the real world, but only works if the portable device remains part of the user's real world view. In other words, the user must keep the portable device visible to the camera at all times for the technology to work. This has a similar disadvantage to the Layar™ technology described above, in that a user must keep the camera looking constantly at the display screen of the portable device to see the data overlay.
If the camera is mounted on an HMD worn by the user, this again entails the risk that the user will collide with another person or object as he focuses on the portable device, or will trip over something outside his field of vision or focus of attention.
The technology described in WO 2009/054619 also has the disadvantage that the data overlay can only be displayed on the user's view of the portable device, and cannot be overlaid on any subject in the user's view of the real world.
The present invention aims to address these and other problems with the known art.
The present invention also aims to provide an improved portable electronic device usable with an HMD device, an improved method of operating a portable electronic device, and an improved method of operating a HMD device.
Accordingly, in a first aspect, the present invention provides a portable electronic device comprising: a first input for receiving a first image signal from a camera of a head-mounted display device; a second input for receiving a position signal from a position sensor in proximity to the portable electronic device; a third input for receiving a first orientation signal from an orientation sensor of said head-mounted display device; a processor for running an augmented reality application to generate a data overlay of the first image signal based on the position signal and the first orientation signal; and an output for transmitting the data overlay to a display of said head-mounted display device.
At least some of the first, second and third inputs and the output of the portable device can be embodied by a single, multipurpose transceiver.
The present invention differs from the Layar™ application for smart phones such as the Apple™ iPhone™ or mobile telephones running the Google™ Android™ operating system in that the first image signal and the first orientation signal are respectively derived from a camera and an orientation sensor of a head-mounted display device, rather than from a camera and an orientation sensor of the portable device itself. The present invention also differs from the aforementioned Layar™ application in that the data overlay is transmitted to a display of the head-mounted display device, rather than being displayed on a display screen of the portable device itself. Thus the point of view which is displayed on the display of the HMD is the point of view of the HMD's camera, rather than the point of view of the camera of the portable device, and the data overlay displayed on the display of the HMD is correlated with the point of view of the HMD's camera. Moreover, since the data overlay is transmitted to the display of the HMD rather than to a display screen of the portable device, a user does not have to look constantly at the display of the portable device in order to see the data overlay. This has the significant advantage that a user can look in whatever direction he chooses and does not have to look at the portable device at all. Instead, the portable device may be carried, for example, in a pocket or under an arm of the user.
Nevertheless, since the camera and the orientation sensor are both mounted on the HMD, the data overlay which the user views on the display of the HMD remains correlated with the user's own view of real world subjects as the user looks around.
The present invention also differs from the disclosure of WO 2009/054619 because it generates a data overlay of the first image signal from the camera of an HMD device based on the position signal from a position sensor and the first orientation signal from an orientation sensor of the HMD device, instead of using a mark on the portable device detected by the camera on the HMD to calculate the relative position between the camera and the mark, transforming data displayed on the screen of the portable device and overlaying the transformed data on the received image signal to form a synthesized image, where both the transformation and the position of the overlay are determined by the relative position of the camera and the mark. In other words, the present invention generates a data overlay of the image signal from the camera using a position and the orientation of the HMD detected by sensors, whereas in WO 2009/054619, a data overlay of the image signal from the camera is generated by identifying the location of the HMD relative to the portable device using pattern recognition to locate the portable device. Thus the portable device of the invention does not have to be visible to the camera of the HMD at all for the portable device still to be able to generate a data overlay of the first image signal from the camera, and a user does not have to look constantly at the portable device either in order to be able to see the data overlay which it generates. This has the advantage that a user does not have to look at the portable device at all and can instead look in whatever direction he chooses, thereby allowing the portable device to be carried, for example, in a pocket or under an arm of the user. Nevertheless, since the camera and the orientation sensor are both mounted on the HMD device, the data overlay which the user views on the display of the HMD remains correlated with the user's own real world view as the user looks around.
Moreover, the present invention also has the advantage over the disclosure of WO 2009/054619 that data can be displayed anywhere on the display of the HMD device, and not just in a region thereof which includes the user's view of the portable device.
In a second aspect, the present invention also provides a kit comprising: a portable device according to the first aspect of the invention; a head-mounted display device comprising an input for receiving the data overlay, a display for displaying the data overlay, a camera, an orientation sensor, a first output for transmitting the first image signal from the camera to the portable device, and a second output for transmitting the first orientation signal from the orientation sensor to the portable device; and a position sensor for supplying the position signal to the portable device, the position sensor being mounted on one of the portable device and the head-mounted display device.
In a third aspect, the present invention also provides a method of operating a portable electronic device, the method comprising: receiving a first image signal from a camera of a head-mounted display device; receiving a position signal from a position sensor in proximity to the portable device; receiving a first orientation signal from an orientation sensor of said head-mounted display device; running an augmented reality application on a processor of said portable device to generate a data overlay of the first image signal based on the position signal and the first orientation signal; and transmitting the data overlay to a display of said head-mounted display device.
In a fourth aspect, the present invention also provides a method of operating a head-mounted display device comprising a camera, a display and an orientation sensor, the method comprising: transmitting a first image signal from the camera and a first orientation signal from the orientation sensor to a portable electronic device, the portable device comprising a processor for running an augmented reality application to generate a data overlay of the first image signal based on the first orientation signal and a position signal received from a position sensor in proximity to the portable device; receiving the data overlay; and displaying the data overlay on the display.
At least some of the first and second outputs and the input of the head-mounted display device can be embodied by a single, multipurpose transceiver.
The data overlay may comprise words, static or moving images, symbols, diagrams, hyperlinks and anything else which can normally be displayed on such a display.
If the display of the head-mounted display device is at least partially transparent, the data overlay can be overlaid on a user's view of real world subjects seen directly through the display. Alternatively, if the display of the head-mounted display device is opaque, the data overlay can instead be synthesized by the processor with the first image signal from the camera of the HMD device to create a synthesized image of the data overlay and the view of real world subjects captured by the camera. Preferably, however, the display is transparent and the data overlay is overlaid on the user's direct view of the real world, since this gives a more realistic view than a synthesized image, uses less processing power, and therefore also reduces the power consumption and weight of the HMD.
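The transparent/opaque distinction above amounts to a simple branch in what the processor sends to the display. A minimal sketch, with assumed function and field names:

```python
def frame_for_display(overlay, camera_frame, display_transparent):
    """Choose what to send to the HMD's display.

    Transparent display: transmit only the overlay, since the user sees real
    world subjects directly through the display. Opaque display: synthesize
    the overlay with the camera's image of the real world into one frame.
    """
    if display_transparent:
        return overlay
    return {"background": camera_frame, "overlay": overlay}
```

The transparent branch does strictly less work per frame, which is the processing-power and power-consumption saving the text refers to.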
Preferably, the processor can also run a virtual reality application and the display of the head-mounted display device is switchable between a transparent and an opaque mode of operation. Thus, a user may block out the view of real world subjects seen directly through the transparent display by switching the display to opaque mode and use the HMD with the portable device for virtual reality applications as well as for augmented reality applications. Such a switching capability can be provided by using crossed polarisers or known liquid crystal technology, or simply by providing the HMD device with an opaque visor which is slidable to cover an outer surface of the display.
The camera can be any sort of detector for capturing an image, such as a CCD or CMOS detector with suitable optics.
The position sensor may be a global positioning system (GPS) receiver, or equivalent sensor, such as a receiver for detecting the European Galileo™ satellite positioning system, Russia's Glonass™ system or China's Compass™ system. It may also be a position sensor for detecting position in a local system, such as within a building or room. Since during operation, the portable device and the HMD remain in close proximity to each other, the position sensor can be mounted on either the portable device or the HMD. The position sensor may even be mounted on another device kept in proximity to the portable device during use. However, the position sensor is preferably mounted on the portable device, as this has the advantages of minimising the number of devices for the user to carry, as well as minimising the power consumption and weight of the HMD.
The orientation sensor is for detecting the orientation of the HMD in x, y and z directions. Conveniently, it may be an arrangement of magnetometers, commonly also known as a digital compass, for detecting the orientation of the HMD relative to an external magnetic field, such as the earth's, although other orientation sensors, for example using accelerometers, are known and can be used as well or instead.
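For a level-held digital compass, the heading can be recovered from the two horizontal magnetometer components. The sketch below assumes one common axis convention (x along the device's forward axis, y to its right) and omits the tilt compensation, via accelerometer data, that a real HMD would need:

```python
import math

def heading_degrees(mx, my):
    # Heading measured clockwise from magnetic north, in [0, 360).
    # Assumes the sensor is held level; a tilted device must first be
    # tilt-compensated before this formula applies.
    return math.degrees(math.atan2(my, mx)) % 360.0
```

For example, a field reading entirely along the forward axis gives a heading of 0 degrees, while an equal reading to the right gives 45 degrees.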
If the portable device comprises its own camera which generates a second image signal, then the first image signal from the camera of the HMD should override the second image signal from the camera of the portable device. In this way, the processor will receive the correct point of view of a user of the HMD from the camera of the HMD for the AR application to generate a data overlay.
If the portable device comprises its own orientation sensor which generates a second orientation signal, then the first orientation signal from the orientation sensor of the HMD should override the second orientation signal from the orientation sensor of the portable device. In this way, the processor will receive the same orientation of the HMD from the orientation sensor of the HMD as a user of the HMD perceives for the AR application to generate the correct data overlay.
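The override behaviour described in the last two paragraphs is, in effect, a per-signal source selection. A minimal sketch, assuming (as an illustration) that the HMD's signal is absent (`None`) whenever the HMD is not supplying data:

```python
def select_signal(hmd_signal, local_signal):
    # The HMD's camera or orientation signal overrides the portable device's
    # own sensor whenever the HMD is supplying data; otherwise fall back to
    # the portable device's sensor.
    return hmd_signal if hmd_signal is not None else local_signal

# With the HMD connected its orientation wins; without it, the phone's own
# sensor is used.
assert select_signal("hmd-orientation", "phone-orientation") == "hmd-orientation"
assert select_signal(None, "phone-orientation") == "phone-orientation"
```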
The HMD may also comprise an eyeball tracking sensor for supplying an eyeball tracking signal to the portable device. Such eyeball tracking sensors are well known, and detect the direction of gaze of a user of the HMD. In this case, the portable device may also comprise a fourth input for receiving the eyeball tracking signal. The fourth input can be embodied by the same multipurpose transceiver as at least some of the first, second and third inputs and the output of the portable device. The processor can then use the eyeball tracking signal to adapt the data overlay on the basis of said signal. For example, the data overlay may provide more data for real world subjects on which a user's gaze lingers for more than a first pre-determined period of time, for example 2 seconds. The user may then study the data provided until this extra data is removed by the processor, for example when the eyeball tracking sensor detects that the user blinks for more than a second pre-determined period of time, for example 1 second.
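The dwell-and-blink behaviour above is a small state machine. Here is a sketch using the example timings from the text (gaze over 2 seconds expands the overlay, a blink over 1 second collapses it); the class and method names are illustrative assumptions:

```python
class GazeExpander:
    DWELL_S = 2.0  # gaze lingering longer than this expands the overlay
    BLINK_S = 1.0  # a blink longer than this removes the extra data

    def __init__(self):
        self.expanded = False

    def on_gaze(self, dwell_seconds):
        # Expand the overlay once the user's gaze has lingered long enough.
        if dwell_seconds > self.DWELL_S:
            self.expanded = True
        return self.expanded

    def on_blink(self, blink_seconds):
        # A deliberate long blink collapses the extra data; ordinary short
        # blinks leave the overlay untouched.
        if blink_seconds > self.BLINK_S:
            self.expanded = False
        return self.expanded
```

So a 2.5-second gaze expands the overlay, a 0.3-second blink leaves it expanded, and a 1.2-second blink collapses it again.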
Preferably, either the HMD or the portable device also comprises means for selecting a subject identified by the eyeball tracking signal. This selection means may take the form of a push button, which for convenience as well as to reduce the power consumption and weight of the HMD, is preferably mounted on the portable device, rather than on the HMD. In this way, a user may use the eyeball tracking sensor and the selection means together with each other in a similar way to using a computer mouse, for example by gazing at a real world subject to cause the eyeball tracking signal to identify the subject and then clicking on the push button to select the real world subject in a similar way to clicking on a hyperlink on a conventional computer display screen.
The HMD device may be connected to the portable device either via a cable or via a wireless connection. Preferably however, the HMD device is connected to the portable device via a wireless connection. This has advantages of comfort and convenience for a user. Such a wireless connection may be of a known type, such as a Bluetooth™ wireless connection.
It is also preferable that the portable device comprises a memory for buffering and/or recording, as well as for playback, of the first image signal from the camera of the head-mounted display device, the position signal from the position sensor and the first orientation signal from the orientation sensor of the head-mounted display device. In this way, the processor can generate a data overlay of an earlier first image signal stored in memory, based on a historical position signal and a historical first orientation signal which were both also stored in memory and which are correlated with the stored first image signal, and transmit both that data overlay and the stored first image signal itself to the display of the HMD, as well as transmitting a real-time data overlay of a current first image signal received directly from the camera of the HMD.
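One way to keep the three signals correlated for later playback is to store them as timestamped tuples and look up the most recent sample at or before a requested time. A sketch with illustrative names, not drawn from the patent:

```python
import bisect

class SignalBuffer:
    """Buffer correlated (image, position, orientation) samples by timestamp."""

    def __init__(self):
        self.times = []    # monotonically increasing timestamps
        self.samples = []  # (image, position, orientation) tuples

    def record(self, t, image, position, orientation):
        # Store the three signals together so they stay correlated.
        self.times.append(t)
        self.samples.append((image, position, orientation))

    def at(self, t):
        # Most recent correlated sample at or before time t, or None if the
        # buffer holds nothing that early.
        i = bisect.bisect_right(self.times, t) - 1
        return self.samples[i] if i >= 0 else None
```

A historical overlay can then be generated from `buf.at(t)` exactly as a real-time overlay is generated from the live signals.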
The portable device may also comprise a second output for transmitting an image synthesized from the first image signal and the data overlay to a display of a second head-mounted display device. In this way, two users may both view the same scene at the same time as each other, including the contents of the same data overlay, one user as an AR scene and the second user as a VR scene representing the first user's point of view. The portable device may also comprise at least one of a telephone, a coder/decoder (codec) and a modulator/demodulator (modem). If so, the synthesized image may also be sent via a telecommunications link to another electronic device connected therewith but located remotely therefrom, for viewing and/or recording at the remote location. Moreover, if the synthesized image thus transmitted is viewed in real-time by a second user at the remote location, the two users may discuss both the local user's view of the real world and the contents of the data overlay, since they remain connected to each other via the telecommunications link at the same time as the synthesized image is being transmitted.
The processor of the portable device may be adapted to run an augmented reality application which is autonomous. Alternatively, it may be adapted to run a client application which transmits at least one of the first image signal, the position signal and the first orientation signal via a telecommunications link to a server running a server application, receives data therefrom, the data thus received being derived by the server application from the at least one of the first image signal, the position signal and the first orientation signal, and generates the data overlay of the first image signal on the basis of the data thus received. This allows generation of the data overlay to be distributed between the processor of the portable device and the server, giving flexibility in the assignment of processing tasks between the client application and the server application.
Further features and advantages of the present invention will become apparent from the following detailed description, which is given by way of example and in association with the accompanying drawings, in which:
Fig. 1 is a schematic block diagram of a kit according to a first embodiment of the invention, which comprises a portable device and a head-mounted display device;
Fig. 2 is a schematic block diagram of a kit according to a second embodiment of the invention, which comprises a portable device and a head-mounted display device;
Fig. 3 is a schematic block diagram of a portable device according to a third embodiment of the invention;
Fig. 4 is a schematic block diagram of a method of using a portable device according to a fourth embodiment of the invention;
Fig. 5 is a schematic block diagram of a method of using a portable device according to a fifth embodiment of the invention; and
Fig. 6 is a schematic block diagram of a method of using a portable device according to a sixth embodiment of the invention.
Fig. 1 schematically shows a best mode for carrying out the invention, wherein a head-mounted display device 100 comprises a camera 110, a display 120 and an orientation sensor 130, and a portable device 200 comprises a camera 210, a display 220 and a position sensor 230. In this case, the portable device 200 also comprises a mobile telephone. The camera 110 of the HMD device 100 and the camera 210 of the portable device 200 both capture images of a real world subject 50 as respective first and second image signals. The display 120 of the HMD device 100 is transparent, so that the real world subject 50 is also displayed directly on display 120.
The orientation sensor 130 detects the orientation of the HMD device 100 in x, y and z directions and generates a first orientation signal on the basis thereof. In other words, it detects in which direction the HMD device 100 is pointing and therefore in which direction the real world subject 50 lies relative to the HMD. The position sensor 230 of the portable device 200 detects the position of the portable device 200 by receiving signals broadcast from global positioning system (GPS) satellites and generates a position signal on the basis thereof. Since, during use, the portable device remains in close proximity to the HMD device 100, the position sensor 230 therefore also detects the approximate position of the HMD device 100, to the level of accuracy allowed by GPS signals broadcast for general commercial use.
HMD device 100 and portable device 200 are linked via a wireless two-way link 40.
Wireless link 40 transmits the first image signal from an output of camera 110 of the HMD device 100 and the first orientation signal from an output of orientation sensor 130 to an input of the portable device 200. A processor in the portable device 200 receives the first image signal and the first orientation signal from the input of wireless link 40, as well as the position signal from the position sensor 230 mounted therein via an internal input of the portable device, and runs an augmented reality application 30 to generate a data overlay 32 of the first image signal based on the position signal and the first orientation signal.
When the portable device 200 is not being used in conjunction with a head-mounted display device like HMD device 100, the display 220 of portable device 200 normally displays the second image signal captured by the camera 210 of the portable device.
However, when being used with HMD device 100, the second image signal is overridden by the first image signal from the camera 110 of the HMD device 100 to generate the data overlay 32. The display 220 of the portable device 200 therefore switches to displaying the first image signal captured by the camera 110 of the HMD device 100, as well as the data overlay 32 laid thereon, instead of the second image signal captured by the camera 210 of the portable device.
The portable device 200 further comprises a second orientation sensor (not shown in Fig 1) mounted therein which generates a second orientation signal representing an orientation of the portable device, but this second orientation signal is also overridden by the first orientation signal received from the orientation sensor 130 of the HMD device 100 to generate the data overlay 32. Thus the data overlay 32 is based on the orientation of the HMD device 100 in x, y and z directions, rather than on the orientation of the portable device 200.
After the data overlay 32 has been generated by the processor of the portable device, it is transmitted from an output of the portable device 200 via wireless link 40 to an input of the HMD device 100, where the data overlay 32 is also displayed on the display 120 thereof. Since the display 120 of the HMD device is transparent, the data overlay 32 is displayed directly over the real world subject 50, which is viewed directly through the display. Thus, a user's view of the real world subject 50 is supplied with a data overlay 32 that is correlated therewith, since it is based on both the orientation of the HMD device 100 and its approximate position. As the user looks around at new real world subjects through the display 120 of the HMD device, images captured by the camera 110 of the HMD device 100 continue to be used by the processor in the portable device 200 to generate an updated data overlay 32, which therefore remains correlated with the real world subjects being viewed by the user.
In the first embodiment of Fig. 1, the display 120 of the head-mounted display device is also switchable between a transparent mode and an opaque mode of operation.
This may be achieved by a user entering a command to switch between these two modes via a keypad of the portable device 200 (not shown in Fig. 1) which is transmitted to HMD device 100 via wireless connection 40. Alternatively, HMD device 100 can just comprise a mode change switch thereon. When the display 120 is switched to the opaque mode of operation, the user's view of the real world is obscured by the display 120 and only image data generated by the portable device 200 can be seen on the display 120. When the processor of the portable device 200 is running AR application 30, this means that the user can only see the data overlay 32.
However, if the processor can also run a VR application, then the user may use the HMD device 100 as a display for the VR application as well. Thus, the HMD device can be used in the opaque mode of operation for immersive computer gaming for example, as well as in the transparent mode of operation for displaying a data overlay 32 of a real world subject 50.
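The transparent/opaque mode switching can be sketched as a small state machine. A minimal illustration only; the `DisplayMode` and `HmdDisplay` names are hypothetical, not taken from the patent:

```python
from enum import Enum

class DisplayMode(Enum):
    TRANSPARENT = "transparent"  # real world visible through the display (AR)
    OPAQUE = "opaque"            # only device-generated imagery visible (VR)

class HmdDisplay:
    def __init__(self):
        self.mode = DisplayMode.TRANSPARENT

    def toggle_mode(self):
        """Invoked by the keypad command relayed over the wireless link,
        or by a mode change switch on the HMD itself."""
        self.mode = (DisplayMode.OPAQUE
                     if self.mode is DisplayMode.TRANSPARENT
                     else DisplayMode.TRANSPARENT)

    def visible_layers(self, overlay, vr_frame=None):
        """In transparent mode the user sees the real world plus the overlay;
        in opaque mode only imagery generated by the portable device is shown."""
        if self.mode is DisplayMode.TRANSPARENT:
            return ["real_world", overlay]
        return [vr_frame] if vr_frame is not None else [overlay]
```

In opaque mode, passing a VR frame instead of the AR overlay corresponds to the immersive-gaming use described above.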
Fig. 2 schematically shows a head-mounted display device 700 and a portable device 800 according to a second embodiment of the invention. The HMD device 700 again comprises a camera 110, a display 120 and an orientation sensor 130, and the portable device 800 again comprises a camera 210, a display 220 and a position sensor 230, where like reference numerals have been used to refer to the same components as in the first embodiment. As in the first embodiment, the HMD device 700 and the portable device 800 are again connected to each other via a wireless link 40, and the portable device 800 also comprises a processor which runs an augmented reality application 30 to generate a data overlay 32 of the first image signal from the camera of the HMD device 700 based on the position signal from position sensor 230 and the first orientation signal from orientation sensor 130. Since the operation of all these components has already been described above in relation to the first embodiment, it will not be explained again here.
In this case, however, the HMD device 700 further comprises an eyeball tracking sensor 140 for detecting the direction of gaze of a user of the HMD and generating an eyeball tracking signal on the basis thereof. In other words, it detects the direction in which a user of the HMD is looking by monitoring the position of one or both of the user's eyeballs and outputs a signal encoding that information, for example as the co-ordinates of a cursor which follows the user's focus of attention.
During use, this eyeball tracking signal is transmitted via wireless link 40 from an output of eyeball tracking sensor 140 to a fourth input of the portable device 800. The processor in the portable device 800 receives the eyeball tracking signal and uses it to generate a cursor which is added to the data overlay 32 before it is transmitted via the wireless link 40 from the output of portable device 800 to the input of HMD device 700, for display on the display 120 thereof. A user has the option to switch this cursor on or off during set-up of the portable device 800, but in both cases, the processor maintains a record of the cursor's co-ordinates. Thus, if the user chooses to display the cursor on display 120, it will be seen hovering over whatever the user is looking at. Also in both cases, if the user's gaze lingers for more than a first pre-determined period of time, for example 2 seconds, over a real world subject 50, so that the cursor would also hover over one location for more than this first pre-determined period of time, then this information is included in the eyeball tracking signal and is relayed to the portable device 800, where the processor adapts the data overlay 32 on the basis thereof by adding more data to the area of the display 120 in the user's focus of attention. The user may then study this extra data until it is removed by the processor, when the eyeball tracking sensor 140 detects that the user blinks for more than a second pre-determined period of time, for example 1 second.
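The dwell-and-blink behaviour can be sketched as follows, using the example thresholds from the text (2 s dwell, 1 s blink). The class and method names are hypothetical illustrations, not the patent's implementation:

```python
DWELL_THRESHOLD_S = 2.0  # first pre-determined period of time (example value)
BLINK_THRESHOLD_S = 1.0  # second pre-determined period of time (example value)

class GazeDetailController:
    """Adds extra data to the overlay when the gaze lingers on one location,
    and removes it again when a sufficiently long blink is detected."""

    def __init__(self, dwell_s=DWELL_THRESHOLD_S, blink_s=BLINK_THRESHOLD_S):
        self.dwell_s = dwell_s
        self.blink_s = blink_s
        self.detail_shown = False

    def on_dwell(self, seconds_on_same_location):
        # Gaze lingering beyond the first threshold triggers extra data.
        if seconds_on_same_location > self.dwell_s:
            self.detail_shown = True
        return self.detail_shown

    def on_blink(self, blink_duration_s):
        # A blink longer than the second threshold dismisses the extra data.
        if blink_duration_s > self.blink_s:
            self.detail_shown = False
        return self.detail_shown
```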
The portable device 800 further comprises means 240 for selecting a subject identified by the eyeball tracking signal. In this embodiment, this selection means 240 takes the form of a push-button on the portable device 800, but it may also take another form, such as a microphone responsive to a voice command of the user. In the latter case, the microphone does not have to be mounted on the portable device itself, but can instead be connected to it via a wireless link, and may even be incorporated into the HMD device. The selection means 240 can be used in co-operation with the eyeball tracking signal in a "point-and-click" operation similar to that performed with a traditional computer mouse. By using the cursor generated from the eyeball tracking signal to identify a subject in the user's field of view and then pressing the push-button of selection means 240, the user can "point-and-click" on that subject. The subject can be either a real world subject 50 or part of the data overlay 32. In the latter case, the data overlay 32 can include hyperlinks, so that the user may click through to more data in a similar fashion to using a traditional computer mouse.
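The "point-and-click" co-operation between the gaze cursor and the selection means can be sketched as below. `GazeSelector` and `hit_test` are hypothetical names introduced for illustration only:

```python
class GazeSelector:
    """'Point-and-click' with the eyes: the cursor generated from the eyeball
    tracking signal points; the push-button (or a voice command) clicks."""

    def __init__(self):
        self.cursor_xy = (0, 0)

    def on_gaze(self, x, y):
        # The cursor follows the user's focus of attention.
        self.cursor_xy = (x, y)

    def on_click(self, hit_test):
        """hit_test maps the cursor co-ordinates to whatever lies under them:
        a real world subject, or part of the data overlay such as a hyperlink."""
        return hit_test(self.cursor_xy)
```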
Fig. 3 schematically shows a portable device 900 according to a third embodiment of the invention. Portable device 900 comprises a camera 210, a display 220 and a position sensor 230, which function as previously described and where like reference numerals have been used to refer to the same components as before. As in the first and second embodiments, portable device 900 also comprises a processor which can run an augmented reality application 30 to generate a data overlay 32 of the first image signal from a camera of an HMD device based on the position signal from position sensor 230 and a first orientation signal from an orientation sensor of the HMD device, where the first image signal and the first orientation signal are received via a wireless connection 40. However, in this case, portable device 900 also comprises a memory 250 for storing the first image signal from the camera of the HMD device, the position signal from position sensor 230 and the first orientation signal from the orientation sensor of the HMD device. Thus, as well as generating a data overlay 32 in real-time, the processor of portable device 900 can also generate a data overlay 32 of a past version of the first image signal which was previously stored in the memory 250, based on the position signal from position sensor 230 and the first orientation signal from the orientation sensor of the HMD device which were both also stored in memory 250 at the same time as the past first image signal. Effectively, therefore, this allows a user to record the data overlay 32 for a real world subject 50 and play it back later. This may be done after a short delay by buffering the first image signal, the position signal and the first orientation signal, or much later, for example when the real world subject 50 is no longer in view. 
However, since the first image signal is stored in the memory 250 instead of just a past version of data overlay 32, this allows the stored version of the first image signal to be replayed as well as the data overlay generated from this past version. Thus a whole scene, comprising both a recorded image of real world subject 50 and the data overlay 32 which is appropriate to real world subject 50, can be synthesized and viewed, rather than just a past version of data overlay 32 overlaid on a current view of the real world which is not correlated therewith. Therefore, according to this embodiment, during playback, both the stored version of the first image signal and the data overlay generated from this past version are displayed on display 220 as a synthesized image and can also be transmitted via wireless link 40 to the display of an HMD device. In this case, the HMD device has an opaque mode of operation, as already described above, to allow the recorded image of real world subject 50 and the data overlay 32 which is appropriate thereto to be viewed together, without interference from a current view of the real world.
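The record-and-replay behaviour of memory 250 can be sketched as a bounded buffer of synchronised samples. A minimal illustration under stated assumptions; `SignalRecorder`, its capacity, and the callback interface are hypothetical:

```python
from collections import deque

class SignalRecorder:
    """Stores synchronised (image, position, orientation) samples, standing in
    for memory 250, so a data overlay can later be regenerated from the past
    signals rather than from the live ones."""

    def __init__(self, capacity=300):          # e.g. ~10 s at 30 fps (assumed)
        self.samples = deque(maxlen=capacity)  # short delay: acts as a buffer

    def record(self, image_frame, position, orientation):
        # All three signals are stored together, at the same time.
        self.samples.append((image_frame, position, orientation))

    def replay(self, overlay_fn):
        """Regenerate an overlay for every stored sample and return
        (stored image, overlay) pairs, ready to be synthesised for playback."""
        return [(img, overlay_fn(img, pos, ori))
                for img, pos, ori in self.samples]
```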
Fig. 4 schematically shows a method of using a portable device 200A according to a fourth embodiment of the invention. Portable device 200A again comprises a camera 210, a display 220 and a position sensor 230, which function as previously described and where like reference numerals have been used to refer to the same components as before. Portable device 200A also again comprises a processor which can run an augmented reality application 30 to generate a data overlay 32 of the first image signal from camera 110 of HMD device 100 based on the position signal from position sensor 230 and a first orientation signal from orientation sensor 130 of HMD device 100, where the first image signal and the first orientation signal are received from HMD device 100 via two-way wireless connection 40. The data overlay 32 is also transmitted via wireless link 40 to an input of HMD device 100 and displayed on the display 120 thereof as before.
However, in this case, the processor of portable device 200A can also generate a synthesized image 34 from the first image signal and the data overlay 32, and the portable device 200A further comprises a second output for transmitting this synthesized image 34 to a second display 120 of a second head-mounted display device 100A. Thus, a second user of the second HMD device 100A can view the synthesized image 34 at the same time as a user of the HMD device 100 views the data overlay 32. This allows the second user to view the contents of data overlay 32 at the same time as the user of HMD device 100, even if the second HMD device 100A is simpler than HMD device 100 and lacks a camera or an orientation sensor mounted thereon to provide input signals for data overlay 32. Thus, data overlay 32 can still be viewed by the second user as part of a VR scene on an opaque display 120 of the simpler HMD device 100A at the same time as the user of HMD device 100 views data overlay 32 laid on his direct view of the real world as an AR scene. This allows friends to share and comment on the same data overlay for example, even if only one of them has an AR-capable HMD device and even if the display screen 220 of portable device 200A is too small for the second user to be able to view the data overlay 32 thereon in detail.
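The synthesis of the first image signal with the data overlay can be sketched as simple per-pixel composition. Purely illustrative: the patent does not specify the compositing rule, so this sketch assumes grey-level pixel rows and uses `None` to mark transparent overlay pixels:

```python
def synthesize(first_image, overlay):
    """Combine the camera frame with the data overlay: wherever the overlay
    has content it replaces the camera pixel, elsewhere the camera pixel
    shows through. The result can be sent to an opaque second HMD display."""
    return [
        [p if o is None else o for p, o in zip(img_row, ov_row)]
        for img_row, ov_row in zip(first_image, overlay)
    ]
```

Because the camera image is baked into the result, the second HMD needs no camera or orientation sensor of its own to show a coherent scene.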
Fig. 5 schematically shows a method of using a portable device 200L according to a fifth embodiment of the invention, which is similar to that shown in Fig. 4, in that portable device 200L operates in the same fashion as portable device 200A, and can also generate a synthesized image 34 from the first image signal of HMD device 100 and data overlay 32. However, in this case, since portable device 200L, which is located locally, comprises a mobile phone, portable device 200L transmits the synthesized image 34 thus generated via a telecommunications link 60 to another, remotely located electronic device 600R connected therewith. Telecommunications link 60 can be provided in one of several ways, including a telephone connection or an internet connection. The remote device 600R does not have to be a portable device, but may instead be, for example, a desktop computer. At the remote device 600R, the synthesized image 34 can be recorded, displayed on a display 620 of the remote device, transmitted to the display 120 of another HMD device 100R which is also remote from portable device 200L via a local data link 40R, such as a BluetoothTM wireless connection, or any combination thereof. Thus, a user of remotely located device 600R can view the transmitted synthesized image 34 at the same time as a user of the HMD device 100 views data overlay 32. This allows the remotely located user to view the same scene as the user of HMD device 100, including the contents of data overlay 32, even if the remotely located device 600R does not comprise a processor for running an augmented reality application. Since the local user of portable device 200L and the user of remotely located device 600R are connected via telecommunications link 60, they may also discuss both the local user's view of the real world and the contents of data overlay 32 at the same time.
This is particularly useful for remote inspection of real world subjects, where the local user can be directed by the remote user, not only to focus his attention on a particular real world subject, but also to interact with the data overlay for that real world subject in a particular way.
Finally, Fig. 6 shows a method of using a portable device 200C according to a sixth embodiment of the invention. Portable device 200C again comprises a camera 210, a display 220 and a position sensor 230, which function as previously described, and where like reference numerals have been used to refer to the same components as before. Portable device 200C again comprises a mobile phone and a processor which can run an augmented reality application. However, in this case, the augmented reality application is a client application 30C, which transmits at least the position signal from position sensor 230 as data 36 via a telecommunications link 60 to a server 300 running a server application 30S. Telecommunications link 60 can be provided in one of several ways, including a telephone connection or an internet connection. For example, telecommunications link 60 may be an "always-on" connection, and the data 36 may be encoded by a codec of client application 30C and then transmitted over telecommunications link 60 using push technology. Client application 30C is also able to transmit at least one of the first image signal from camera 110 and the first orientation signal from orientation sensor 130 to server 300 as part of data 36, in addition to transmitting the position signal from position sensor 230.
At server 300, the server application 30S running thereon generates location-based services (for example, directions of travel) based on the position signal received from portable device 200C, and transmits these location-based services back to portable device 200C, again via telecommunications link 60. Client application 30C incorporates these location-based services received from server 300 into a data overlay 32 of the first image signal, which is therefore based on the position signal, as well as on the first orientation signal. Data overlay 32 is then transmitted by client application 30C via wireless link 40 to an input of HMD device 100 to be displayed on the display 120 thereof as before.
If portable device 200C also sends the first image signal and the first orientation signal over telecommunications link 60 as part of data 36, then server application 30S can also be an augmented reality application, which generates a data overlay 32 of the first image signal based on the position signal and the first orientation signal received from client application 30C. In this case, the data overlay 32 thus generated is transmitted back to portable device 200C, again via telecommunications link 60, where it is received by client application 30C, processed and transmitted via wireless link 40 to an input of HMD device 100 for display on the display 120 thereof, as before.
Generating a data overlay 32 in the server application 30S of server 300 has the advantages that server 300 can draw on greater processing power and more frequent updates to data overlay 32 than portable device 200C, whereas generating the data overlay 32 in the client application 30C of portable device 200C has the advantage of reducing the volume of data 36 transmitted over telecommunications link 60. This may be preferred if the bandwidth of telecommunications link 60 is restricted, since uplinks typically have less bandwidth than downlinks.
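The client-versus-server trade-off can be sketched as a simple bandwidth check: server-side generation requires uploading the first image signal and first orientation signal, whereas client-side generation needs only the small position signal. The function name and the bitrate figures are illustrative assumptions, not values from the patent:

```python
def choose_overlay_generator(uplink_kbps,
                             image_bitrate_kbps=1200,   # assumed video uplink cost
                             sensor_bitrate_kbps=8):    # assumed orientation cost
    """Pick where the data overlay is generated. A restricted uplink (uplinks
    typically have less bandwidth than downlinks) favours generating the
    overlay in the client application, which only uploads the position signal;
    an ample uplink allows the server's greater processing power to be used."""
    needed_for_server = image_bitrate_kbps + sensor_bitrate_kbps
    return "server" if uplink_kbps >= needed_for_server else "client"
```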
If data 36 includes the first image signal and the first orientation signal, server 300 can also synthesize the data overlay 32 generated therefrom with the first image signal to create a synthesized image. This can be recorded, displayed or transmitted to the display of another device remote from portable device 200C, in a manner similar to that already described above in relation to Fig. 5.

Claims (21)

  1. 1. A portable electronic device (200, 800, 900, 200A, 200L, 200C) comprising: a first input for receiving a first image signal from a camera (110) of a head-mounted display device (100, 700); a second input for receiving a position signal from a position sensor (230) in proximity to the portable electronic device; a third input for receiving a first orientation signal from an orientation sensor (130) of said head-mounted display device (100, 700); a processor for running an augmented reality application (30, 30C) to generate a data overlay (32) of the first image signal based on the position signal and the first orientation signal; and an output for transmitting the data overlay (32) to a display (120) of said head-mounted display device (100, 700).
  2. 2. A portable device according to claim 1, wherein the position sensor (230) is mounted on said portable device.
  3. 3. A portable device according to claim 1 or claim 2, further comprising a camera (210) for generating a second image signal, wherein the second image signal from the camera (210) of the portable device (200, 800, 900, 200A, 200L, 200C) can be overridden by the first image signal from the camera (110) of the head-mounted display device (100, 700).
  4. 4. A portable device according to any one of the preceding claims, further comprising an orientation sensor for generating a second orientation signal, wherein the second orientation signal from the orientation sensor of the portable device (200, 800, 900, 200A, 200L, 200C) can be overridden by the first orientation signal from the orientation sensor (130) of the head-mounted display device (100, 700).
  5. 5. A portable device (800) according to any one of the preceding claims, further comprising a fourth input for receiving an eyeball tracking signal from an eyeball tracking sensor (140) of said head-mounted display device (700), and wherein the processor can adapt said data overlay (32) on the basis of said eyeball tracking signal.
  6. 6. A portable device (800) according to claim 5, further comprising means (240) for selecting a subject identified by said eyeball tracking signal.
  7. 7. A portable device (900) according to any one of the preceding claims, further comprising a memory for performing at least one of buffering and recording, as well as for playback, of: the first image signal from the camera (110) of the head-mounted display device (100, 700), the position signal from the position sensor (230), and the first orientation signal from the orientation sensor (130) of the head-mounted display device (100, 700).
  8. 8. A portable device (200A, 200L) according to any one of the preceding claims, wherein the processor can generate a synthesized image (34) from the first image signal and the data overlay (32), the portable device further comprising a second output for transmitting the synthesized image (34) to a display (120) of a second head-mounted display device (100A, 100R).
  9. 9. A portable device according to any one of the preceding claims, wherein the portable device comprises at least one of a telephone, a codec and a modem.
  10. 10. A portable device according to claim 9, wherein the processor is for running a client application (30C) for transmitting (36) at least one of the first image signal, the position signal and the first orientation signal via a telecommunications link (60) to a server (300) running a server application (30S), and for receiving data derived by the server application (30S) from the at least one of the first image signal, the position signal and the first orientation signal from the server (300) via the telecommunications link (60), to generate the data overlay (32) of the first image signal on the basis of the data thus received.
  11. 11. A kit comprising: a portable device (200, 800, 900, 200A, 200L, 200C) according to any one of the preceding claims; a head-mounted display device (100, 700) comprising: an input for receiving the data overlay (32), a display (120) for displaying the data overlay, a camera (110), an orientation sensor (130), a first output for transmitting the first image signal from the camera (110) to the portable device (200, 800, 900, 200A, 200L, 200C), and a second output for transmitting the first orientation signal from the orientation sensor (130) to the portable device (200, 800, 900, 200A, 200L, 200C); and a position sensor (230) for supplying the position signal to the portable device (200, 800, 900, 200A, 200L, 200C), the position sensor being mounted on one of the portable device (200, 800, 900, 200A, 200L, 200C) and the head-mounted display device (100, 700).
  12. 12. A kit according to claim 11, wherein the processor can also run a virtual reality application, and the display (120) of the head-mounted display device (100, 700) is switchable between a transparent mode and an opaque mode of operation.
  13. 13. A method of operating a portable electronic device (200, 800, 900, 200A, 200L, 200C), comprising: receiving a first image signal from a camera (110) of a head-mounted display device (100, 700); receiving a position signal from a position sensor (230) in proximity to the portable device (200, 800, 900, 200A, 200L, 200C); receiving a first orientation signal from an orientation sensor (130) of said head-mounted display device (100, 700); running an augmented reality application (30, 30C) on a processor of said portable device (200, 800, 900, 200A, 200L, 200C) to generate a data overlay (32) of the first image signal based on the position signal and the first orientation signal; and transmitting the data overlay (32) to a display (120) of said head-mounted display device (100, 700).
  14. 14. A method of operating a portable device (200A, 200L) according to claim 13, further comprising: generating a synthesized image (34) from the first image signal and the data overlay (32); and transmitting the synthesized image (34) to a display (120) of a second head-mounted display device (100A, 100R).
  15. 15. A method of operating a portable device (200L) according to claim 14, wherein the synthesized image (34) is transmitted via a telecommunications link (60).
  16. 16. A method of operating a portable device (200C) according to any one of claims 13 to 15, further comprising: transmitting (36) at least one of the first image signal, the position signal and the first orientation signal to a server (300) via a telecommunications link (60); receiving data derived from the at least one of the first image signal, the position signal and the first orientation signal from the server (300) via the telecommunications link (60); and generating the data overlay (32) of the first image signal on the basis of the data thus received.
  17. 17. A method of operating a head-mounted display device (100, 700) comprising a camera (110), a display (120) and an orientation sensor (130), the method comprising: transmitting a first image signal from the camera (110) and a first orientation signal from the orientation sensor (130) to a portable electronic device (200, 800, 900, 200A, 200L, 200C), the portable device comprising a processor for running an augmented reality application (30, 30C) to generate a data overlay (32) of the first image signal based on the first orientation signal and a position signal received from a position sensor (230) in proximity to the portable device (200, 800, 900, 200A, 200L, 200C); receiving the data overlay (32); and displaying the data overlay (32) on the display (120).
  18. 18. A portable electronic device substantially as hereinbefore described with reference to any one of Figs. 1 to 6.
  19. 19. A kit comprising a portable electronic device, a head-mounted display device and a position sensor mounted on one of the portable device and the head-mounted display device, substantially as hereinbefore described with reference to any one of Figs. 1, 2 and 4 to 6.
  20. 20. A method of operating a portable electronic device substantially as hereinbefore described with reference to any one of Figs. 1 to 6.
  21. 21. A method of operating a head mounted display device substantially as hereinbefore described with reference to any one of Figs. 1, 2 and 4 to 6.
GB1002485.9A 2010-02-15 2010-02-15 Use of portable electonic devices with head-mounted display devices Expired - Fee Related GB2477787B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1002485.9A GB2477787B (en) 2010-02-15 2010-02-15 Use of portable electonic devices with head-mounted display devices


Publications (3)

Publication Number Publication Date
GB201002485D0 GB201002485D0 (en) 2010-03-31
GB2477787A true GB2477787A (en) 2011-08-17
GB2477787B GB2477787B (en) 2014-09-24

Family

ID=42110696

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1002485.9A Expired - Fee Related GB2477787B (en) 2010-02-15 2010-02-15 Use of portable electonic devices with head-mounted display devices

Country Status (1)

Country Link
GB (1) GB2477787B (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110286754B (en) * 2019-06-11 2022-06-24 Oppo广东移动通信有限公司 Projection method based on eyeball tracking and related equipment

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5742264A (en) * 1995-01-24 1998-04-21 Matsushita Electric Industrial Co., Ltd. Head-mounted display
US6299308B1 (en) * 1999-04-02 2001-10-09 Cybernet Systems Corporation Low-cost non-imaging eye tracker system for computer control
US20020196202A1 (en) * 2000-08-09 2002-12-26 Bastian Mark Stanley Method for displaying emergency first responder command, control, and safety information using augmented reality
US20060019614A1 (en) * 2004-07-20 2006-01-26 Olympus Corporation Mobile information apparatus
WO2006087709A1 (en) * 2005-02-17 2006-08-24 Lumus Ltd. Personal navigation system
US20070273610A1 (en) * 2006-05-26 2007-11-29 Itt Manufacturing Enterprises, Inc. System and method to display maintenance and operational instructions of an apparatus using augmented reality
US20090128449A1 (en) * 2007-11-15 2009-05-21 International Business Machines Corporation Augmenting Reality For A User

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050289590A1 (en) * 2004-05-28 2005-12-29 Cheok Adrian D Marketing platform
US8180396B2 (en) * 2007-10-18 2012-05-15 Yahoo! Inc. User augmented reality for camera-enabled mobile devices


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11663789B2 (en) 2013-03-11 2023-05-30 Magic Leap, Inc. Recognizing objects in a passable world model in augmented or virtual reality systems
US12039680B2 (en) 2013-03-11 2024-07-16 Magic Leap, Inc. Method of rendering using a display device
US11854150B2 (en) 2013-03-15 2023-12-26 Magic Leap, Inc. Frame-by-frame rendering for augmented or virtual reality systems
EP3093743A4 (en) * 2014-01-06 2017-09-13 Samsung Electronics Co., Ltd. Electronic device and method for displaying event in virtual reality mode
US10431004B2 (en) 2014-01-06 2019-10-01 Samsung Electronics Co., Ltd. Electronic device and method for displaying event in virtual reality mode
NO342793B1 (en) * 2017-06-20 2018-08-06 Augmenti As Augmented reality system and method of displaying an augmented reality image
NO20171008A1 (en) * 2017-06-20 2018-08-06 Augmenti As Augmented reality system and method of displaying an augmented reality image
WO2018233881A1 (en) 2017-06-20 2018-12-27 Augmenti As Augmented reality system and method of displaying an augmented reality image
US10970883B2 (en) 2017-06-20 2021-04-06 Augmenti As Augmented reality system and method of displaying an augmented reality image

Also Published As

Publication number Publication date
GB2477787B (en) 2014-09-24
GB201002485D0 (en) 2010-03-31


Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20200215