WO2021210028A2 - The present invention is about a folding headset with its electronic accessories, in particular, a folding


Info

Publication number
WO2021210028A2
Authority
WO
WIPO (PCT)
Prior art keywords
folding
headset
controller
handles
hmd
Prior art date
Application number
PCT/IR2020/050008
Other languages
French (fr)
Inventor
Arman TAJMIRRIAHI
Original Assignee
Tajmirriahi Arman
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tajmirriahi Arman filed Critical Tajmirriahi Arman
Priority to PCT/IR2020/050008 priority Critical patent/WO2021210028A2/en
Publication of WO2021210028A2 publication Critical patent/WO2021210028A2/en

Definitions

  • the present invention is about a folding headset with its electronic accessories, in particular, a folding headset with curved frame, a folding screen, its electronic accessories and their control methods.
  • the electronic accessories include: a curved frame with a folding screen, a lightweight electric motor with a gearbox, and a pair of smart HMD glasses.
  • a mobile terminal can perform various functions. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, playing music files and outputting music via a speaker system, and displaying images and video on a display. Some terminals include additional functionality which supports game playing, while other terminals are also configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals which permit viewing of contents, such as videos and television programs.
  • terminals can be classified into mobile terminals and stationary terminals according to a presence or non-presence of mobility.
  • the mobile terminals can be further classified into handheld terminals and vehicle mounted terminals.
  • Such efforts include software and hardware improvements, as well as changes and improvements in the structural components which form the mobile terminal.
  • Mixed reality is a technology that allows virtual imagery to be mixed with a real world physical environment in a display.
  • Systems for mixed reality may include, for example, see through head mounted display (HMD) devices or smart phones with built in cameras. Such systems typically include processing units which provide the imagery under the control of one or more applications. Full virtual reality environments in which no real world objects are viewable can also be supported using HMD and other devices.
  • Such systems may also include one or more wireless hand-held inertial controllers that the user of the system can manipulate to interact with the HMD and provide user input to the HMD, including, but not limited to, controlling and moving a virtual cursor, selection, movement and rotation of objects, scrolling, etc.
  • This Background is provided to introduce a brief context for the Summary and Detailed Description that follow. This Background is not intended to be an aid in determining the scope of the claimed subject matter nor be viewed as limiting the claimed subject matter to implementations that solve any or all of the disadvantages or problems presented above. Furthermore, the subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
  • Another challenge in achieving targeted design has been to position the electronic components relative to the position of the user, while also providing the device with an open space suitable for the electronic components and considering anti-impact, shatterproof, and waterproof properties of the system.
  • the computing system has a U-shaped appearance so as to convert applied pressure into kinetic and spring energy as far as possible, with good resistance to accidental impacts.
  • the embodiments of the invention are directed to a folding headset, a folding screen with a rotating moveable frame, and its control device; since a pair of HMD glasses is used and configured behind the curved display frame, space is provided to utilize one or two terminal blocks and modules to keep the folding headset organized, which substantially eliminates one or more problems due to limitations and disadvantages of the related art.
  • a flexible display device, corresponding to a sort of mobile terminal, means a device having a display deformable like paper.
  • Flexible devices include a foldable display, a rollable display, a bendable display and the like.
  • for a foldable display that folds like paper, various user interfaces are needed, in consideration of the fact that the foldable display is repeatedly opened and closed.
  • the curved frame of the folding screen has two handles, defined together with the handles of the folding headset.
  • the arrangement is such that the curved frame and the folding rotating screen assembly can rotate and move upward and downward around the headset device by means of a lightweight electric motor with a gearbox.
  • the electric motor has a gearbox, empowering the frame on the folding headset to move vertically in a circular motion around the folding headset device. This movement can be a manual movement or a preset motion.
  • An embodiment of the electric motor includes a lightweight AC or DC electric motor, which can be combined with gearwheels serving as the gearbox.
  • the motor power, speed and acceleration may be supported at the folding headset device in one embodiment of the invention; in another embodiment the lightweight electric motor with the gearbox may be supported in the curved frame section.
  • the rotational angle between the curved frame and the headset may be altered by the lightweight electric motor having a gearbox.
  • the folding angle between the folding headset device and the folding frame handles may also change.
  • One of the aims of this innovation is to provide a folding headset with a curved frame for the rotating screen, and a control method that allows the angle between the folding headset and its curved frame to be changed based on the user's input or a preset condition; likewise, a suitable angle between the curved frame handles and the handles of the folding headset can be set according to the user's input or the preset mode.
  • Another aim of the present invention is to provide a folding headset, the curved frame and its controlling means, in which the open/closed moving angle relative to the curved frame is defined by adjusting its speed with a lightweight electric motor and a gearbox, together with an electromagnet or a memory alloy, and with the folding headset and curved frame handles. No failure can occur when the speed of opening/closing the handles of the folding headset and the curved display frame is adjusted by an electromagnet or the memory alloy.
  • Another aim of this innovation, in providing a folding headset and a curved frame, is to switch the state of the moving open/closed angle easily, by a lightweight electric motor with a gearbox, between the headset and the curved frame. Moreover, the open/closed states of the handles of the folding headset and the curved frame can easily be switched to each other (i.e. operated with one hand).
  • Another aim of this innovation is to provide a folding headset and its curved frame, using the HMD glasses system located on the curved frame, which includes:
  • this system can include the following items: a display connected to a processor, and a hand-held input device configured to communicate with the processor to provide one or more user inputs.
  • This hand-held input device comprises a primary sensor to determine the orientation of the folding headset device and of the hand-held input relative to the predefined reference frame, and to transmit the related position to the processor.
  • the second sensor is located at a specific location relative to the screen to determine the position of one or more users according to the screen and to provide the position data to the processor, where the processor tracks the position data of one or more of the user's hands in a 3D field of view with six degrees of freedom.
  • this method includes the following: detecting, by an optical sensor of the folding display headset located on the curved frame of the folding display, one hand of the user in the field of view of the optical sensor; if a wireless hand-held inertial controller is active and paired with the head mounted display device, determining the location and orientation of the user's hand relative to the display on the folding display curved frame by the optical sensor; tracking the user's hand relative to the display located on the curved frame of the folding screen by the optical sensor of the corresponding display device over a period of time to extract the data related to the user's hand trajectory; getting the acceleration data for a specified period of time from the wireless hand-held inertial controller by a display unit on the folding display curved frame and measuring the inertial state of the wireless hand-held controller; comparing the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand; and fusing the location data derived from the optical sensor of the head mounted display device with the orientation data from the wireless hand-held inertial controller.
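As an illustrative sketch only, not the claimed method, the trajectory/acceleration comparison and the final fusion step just described could look roughly like the following Python, assuming NumPy, time-aligned samples, and hypothetical array inputs:

```python
import numpy as np

def trajectory_confidence(hand_positions, timestamps, controller_accel):
    """Estimate how likely the wireless hand-held controller is in the
    optically tracked hand, by comparing the acceleration implied by the
    hand trajectory with the controller's own accelerometer readings.
    Assumes both streams are resampled onto the same timestamps."""
    # Differentiate tracked positions twice to get acceleration from optics.
    velocity = np.gradient(hand_positions, timestamps, axis=0)
    optical_accel = np.gradient(velocity, timestamps, axis=0)
    # Compare magnitudes: the optical and inertial frames may be rotated
    # relative to each other, so magnitude correlation is frame-independent.
    a = np.linalg.norm(optical_accel, axis=1)
    b = np.linalg.norm(controller_accel, axis=1)
    if a.std() < 1e-9 or b.std() < 1e-9:
        return 0.0
    return max(0.0, float(np.corrcoef(a, b)[0, 1]))  # confidence in [0, 1]

def fuse_pose(optical_position, controller_orientation, confidence, threshold=0.7):
    """Fuse optical location with inertial orientation into one 6-DoF pose,
    but only when the confidence level supports it."""
    if confidence < threshold:
        return None  # controller probably not held in the tracked hand
    return {"position": optical_position, "orientation": controller_orientation}
```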
  • the folding headset and the rotating screen frame with a touchscreen and a transparent curve, as well as the HMD glasses system, define the body of the present folding headset disclosure in an embodiment, including a set of frames: a primary curved frame and a secondary curved frame.
  • the first curved frame defines the receiving space for placing the electronic elements of the electronic device.
  • the second curved frame as a cover and holder is configured as a complement to the first frame.
  • the curved frame consists of a set of frames: a first curved frame and a second curved frame.
  • the first curved frame defines the receiving space for the positioning of the electronic elements of the electronic device.
  • the second curved frame is configured with a clear, flexible touchscreen with a screen protector.
  • the headset body supports the frame display area, and in another embodiment, the headset body can support the area of the HMD smart glasses system.
  • in the present invention, one area of the curved frame may support the part and area of the frame itself, i.e. the display screen supports the curved frame; in another embodiment, it is the second area of the curved frame that supports the area of the HMD smart glasses system.
  • the folding device may comprise the following items in accordance with the embodiment of the present invention:
  • the display unit comprises a folding curved frame display and folding curved frame handles, and according to the present invention, electronic elements such as sensors, controllers, etc. can be located between the headset handles for the purpose of controlling the display unit.
  • the electronic elements can be positioned between the folding headset and the curved frame for the required angular motion and rotational angle, and include:
  • the controller may be configured, by means of a lightweight electric motor with a gearbox, to switch the rotational angle of the circular motion between the folding headset and the curved frame into the open/closed mode with regard to the determined change information.
  • the open mode may indicate a case where the rotational angle of the circular motion between the folding headset and the curved frame exceeds the first rotational angle.
  • the closed mode may indicate a case where the rotational angle of the circular motion between the headset and the curved frame is equal to or smaller than the first rotational angle.
  • the controller can preferably be configured to enable switching the angle between the two handles of the folding headset and the angle between the two handles of the curved frame, based on the determined change information specifying the open or closed mode.
  • the open mode may indicate the case where the angle between the two handles of the folding headset and the angle between the two handles of the curved display frame exceed the first angle.
  • the closed mode may indicate the case where the angle between the two handles of the headset and the angle between the two handles of the curved display frame are equal to or smaller than the first angle.
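A minimal sketch of the open/closed classification implied by these definitions, assuming angles measured in degrees and a placeholder reference angle (the disclosure does not fix a value):

```python
def fold_mode(angle_deg, first_angle_deg):
    """Classify the fold state against the first (reference) angle:
    'open' if the measured angle exceeds it, otherwise 'closed'."""
    return "open" if angle_deg > first_angle_deg else "closed"

# With an assumed reference angle of 30 degrees:
print(fold_mode(45.0, 30.0))  # -> open
print(fold_mode(20.0, 30.0))  # -> closed
```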
  • the controller can be configured to control the actuator in such a way as to switch the two handles of the headset and the handles of the curved frame of the folding display to the open mode.
  • the controller can be configured to control the actuator in such a way as to switch the headset and the curved frame to the open mode by the lightweight electric motor with a gearbox.
  • the user authentication may be performed based on at least one of a fingerprint input, a pattern input, an iris input, a touch input and a speech input.
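A hedged sketch of how such authentication-gated actuation might be wired together; the `Actuator` class and its method name are hypothetical placeholders, not components named in the disclosure:

```python
ACCEPTED_METHODS = {"fingerprint", "pattern", "iris", "touch", "speech"}

class Actuator:
    """Hypothetical stand-in for the motor/electromagnet actuator unit."""
    def set_mode(self, mode):
        print(f"actuator -> {mode}")

def try_auto_open(auth_method, auth_ok, actuator):
    """Switch the headset and frame to the open mode only after a
    successful user authentication by one of the accepted input types."""
    if auth_method in ACCEPTED_METHODS and auth_ok:
        actuator.set_mode("open")
        return True
    return False

try_auto_open("fingerprint", True, Actuator())  # prints: actuator -> open
```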
  • the controller may be further configured to control the actuator unit to switch the folding headset and the curved frame handles to the closed mode.
  • the controller for controlling the actuating system can be configured in such a way as to switch the folding headset and the curved frame from the open state to the closed mode by a lightweight electric motor with a gearbox.
  • the controller can be configured to control the actuating system in such a way as to switch the handles of the folding headset and the handles of the curved frame of the folding display to the closed mode.
  • the controller for the actuating system can be configured in such a way as to switch the headset and curved frame to the closed mode by a lightweight electric motor with a gearbox.
  • the actuator unit may include at least one of an electromagnet and a shape memory alloy.
  • the controller can be configured to change the speed of switching to the open mode based on the conditions occurring at the handles of the folding headset and the handles of the curved frame.
  • the controller may change the speed of switching the folding headset and the curved frame to the open state, based on the occurring conditions, by a lightweight electric motor with a gearbox.
  • the controller can be configured to activate the actuating system so as to adjust the conversion speed with the relevant magnet modifier for at least one preset part between the folding headset handles and the handles of the curved frame.
  • the controller can be configured for system control so as to adjust the conversion speed with the relevant magnet modifier for at least one preset part between the folding headset handles and the handles of the curved frame.
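One plausible reading of the magnet-based speed adjustment is an electromagnet that brakes harder as the handles approach the end of travel. The following sketch assumes angles in degrees; the window, current limit, and target value are invented for illustration and do not come from the disclosure:

```python
def braking_current(angle_deg, target_deg, window_deg=20.0, max_current_a=0.5):
    """Return the electromagnet current that slows the handles as they
    approach the end of travel, so opening/closing cannot slam shut."""
    remaining = abs(target_deg - angle_deg)
    if remaining >= window_deg:
        return 0.0  # far from the mechanical stop: no braking needed
    return max_current_a * (1.0 - remaining / window_deg)

# Approaching an assumed 170-degree open position:
for angle in (140.0, 155.0, 165.0, 169.0):
    print(angle, round(braking_current(angle, 170.0), 3))
```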
  • the folding headset and the curved frame can include a wireless communication unit transmitting/receiving data to/from a recorder (stylus pen).
  • the controller may be further configured to control the actuator unit to change the folding angle between the two handles of the folding headset and that of the folding display curved frame, based on a control signal received from the stylus pen.
  • the controller can also be configured for the actuator unit to change the rotational angle of the circular motion based on the received signal from the recorder and by the lightweight electric motor with a gearbox.
  • the altered angles between the two handles of the folding headset and the handles of the folding display curved frame, or between the folding headset and the folding display curved frame, can be determined at least according to the sensed duration and length of the user's input with the stylus recorder.
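A small sketch of mapping the sensed stylus input duration and stroke length onto a target folding angle; the scaling constants and the equal weighting of the two inputs are assumptions for illustration:

```python
def angle_from_stylus(duration_s, stroke_mm,
                      min_angle=0.0, max_angle=180.0,
                      full_duration_s=2.0, full_stroke_mm=60.0):
    """Map the sensed duration and length of a stylus input onto a target
    folding angle, clamped to the mechanical range."""
    d = min(duration_s / full_duration_s, 1.0)
    s = min(stroke_mm / full_stroke_mm, 1.0)
    fraction = 0.5 * (d + s)  # weight duration and stroke length equally
    return min_angle + fraction * (max_angle - min_angle)

print(angle_from_stylus(1.0, 30.0))  # -> 90.0 (half duration, half stroke)
```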
  • the controller can be configured to control the actuating system in such a way that the magnets of the primary and secondary bodies match that of the stylus.
  • the controller can be configured to control the actuating system by a lightweight electric motor with the gearbox, in such a way that the magnet of the primary and secondary bodies matches that of the stylus.
  • the controller can be configured in such a way that the angle between the two handles of the folding headset and the two handles of the folding display curved frame is switched to a third angle.
  • the controller can be configured by a lightweight electric motor with a gearbox in such a way that the rotational angle of the circular motion between the folding headset and the curved frame is switched to a third angle.
  • the controller can be configured in such a way as to transmit a preset content to the curved frame screen system in response to the input signal.
  • in this case, the controller can be configured to transmit a preset content, in response to the input signal, to the lightweight electric motor system with a gearbox to switch the rotational angle of the circular motion between the folding headset and the curved frame.
  • a method of controlling the two handles of the folding headset together with the two handles of the curved frame of the folding display may include determining a change information on a folding angle between a first body and a second body and controlling an actuator to change the folding angle according to the determined change information, wherein the first body supports a first display region, wherein the second body supports a second display region, and wherein the folding angle is an angle between the first body and the second body.
  • the method for controlling the rotational angle of the circular motion of this folding headset together with the curved frame can include determining change information on the folding angle between the folding headset and the frame by the lightweight electric motor with a gearbox.
  • the body and the control of an actuator are considered to change the folding angle according to the designated change information, where the folding headset and the curved frame each support the lightweight electric motor with a gearbox, and the rotational angle of the circular motion is the angle between the folding headset and the curved frame.
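The determine-change-information-then-actuate pattern described in the two items above could be sketched as a simple proportional control loop; `read_angle` and `set_motor_speed` are hypothetical hardware callbacks, and the gain, tolerance, and update rate are illustrative:

```python
import time

def drive_to_angle(read_angle, set_motor_speed, target_deg,
                   tolerance_deg=1.0, gain=0.05, max_speed=1.0):
    """Proportional loop: measure the folding angle, compute the change
    still required (the 'change information'), and command the geared
    motor until the target angle is reached."""
    while True:
        error = target_deg - read_angle()
        if abs(error) <= tolerance_deg:
            set_motor_speed(0.0)  # target reached: stop the motor
            return
        speed = max(-max_speed, min(max_speed, gain * error))
        set_motor_speed(speed)
        time.sleep(0.01)  # ~100 Hz control rate
```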
  • FIG. 1A is a block diagram of a mobile terminal according to an embodiment of the present disclosure
  • FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions;
  • FIG. 2 is a block diagram of a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 3 is a perspective diagram of a folding headset with curved frame according to one embodiment of the present invention.
  • FIG. 4 is a diagram illustrating one example of an actuator unit of a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 5 is a diagram illustrating one example of an auto-opening through fingerprint recognition in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 6 is a diagram illustrating another example of an auto-opening through fingerprint recognition in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 7 is a diagram illustrating one example of an auto-opening using an angle and pressure in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 8 is a diagram illustrating one example of an auto-opening based on various user inputs in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 9 is a diagram illustrating one example of an auto-opening in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 10 is a diagram illustrating one example of an auto-closing in accordance with a voice input in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 11 is a diagram illustrating one example of an auto-closing in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 12 is a diagram illustrating another example of an auto-closing in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 13 is a diagram illustrating one example of an auto-closing in accordance with time expiration in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 14 is a diagram illustrating one example of an opening/closing speed of a folding headset with curved frame according to one embodiment of the present invention.
  • FIG. 15 is a diagram illustrating one example of controlling an opening/closing speed of a folding headset with curved frame using a variable magnetic field according to one embodiment of the present invention
  • FIG. 16 is a diagram illustrating another example of controlling an opening/closing speed of a folding headset with curved frame using a variable magnetic field according to one embodiment of the present invention
  • FIG. 17 is a diagram illustrating one example of controlling a folding angle in accordance with user's posture in a folding headset with curved frame according to one embodiment of the present invention.
  • FIG. 18 is a diagram illustrating one example of controlling a folding angle in accordance with a scroll input in a folding headset with curved frame according to one embodiment of the present invention
  • FIG. 19 is a diagram illustrating one example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention.
  • FIG. 20 is a diagram illustrating another example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention
  • FIG. 21 is a diagram illustrating further example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention
  • FIG. 22 is a diagram illustrating one example of controlling a folding headset with curved frame in case of a stylus pen located within the foldable device according to one embodiment of the present invention.
  • FIG. 23 is a flowchart for a method of controlling a folding headset with curved frame according to one embodiment of the present invention.
  • FIG. 24 is a schematic representation of one embodiment of a folding headset with curved frame, mounted virtual or augmented reality display.
  • FIG. 25 is a general perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display.
  • FIG. 26 is an exploded perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display, further illustrating one embodiment of a stereoscopic display system.
  • FIG. 27 is a general perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display, further illustrating one embodiment of an optical sensor system.
  • FIG. 28 is a perspective rendering of one embodiment of wireless hand-held stylus pen controller.
  • FIG. 29 is a functional block diagram illustrating the basic components of one embodiment of a wireless hand-held inertial controller.
  • FIG. 30 is a graphical representation of one example of a possible field of view of one embodiment of a folding headset with curved frame, mounted virtual or augmented reality display.
  • FIG. 31 is a flowchart of one embodiment of a method for determining the location and orientation of a hand-held inertial controller with six degrees of freedom.
  • first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another. When an element is referred to as being “connected with” another element, the element can be connected with the other element or intervening elements may also be present. In contrast, when an element is referred to as being “directly connected with” another element, there are no intervening elements present.
  • it must be clearly understood and appreciated that the descriptions of the disclosed systems and methods are merely examples provided for ease of understanding.
  • the cell phone terminal block and the HMD glasses terminal block are supported, explained and understood in the headset body and the curved frame.
  • one terminal block can be conceptualized and applied as two terminal blocks; e.g. the terminal block and modules of a cell phone and of the HMD smart glasses can be integrated.
  • the block of the cell phone module terminals and the terminal block of the HMD smart glasses modules are each considered separately and used. In this case, the space and possible position within the folding headset itself and the curved frame are considered.
  • Mobile terminals presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultra books, and the like.
  • FIGS. 1A-1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present disclosure, and FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions.
  • the mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190.
  • FIG. 1A the mobile terminal 100 is shown having wireless communication unit 110 configured with several commonly implemented components.
  • the wireless communication unit 110 typically includes one or more modules which permit communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, communications between the mobile terminal 100 and an external server. Further, the wireless communication unit 110 typically includes one or more modules which connect the mobile terminal 100 to one or more networks.
  • the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short- range communication module 114, and a location information module 115.
  • the input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is one type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for allowing a user to input information.
  • Data (for example, audio, video, image, and the like) may be analyzed and processed by controller 180 according to device parameters, user commands, and combinations thereof.
  • the sensing unit 140 is typically implemented using one or more sensors configured to sense internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like.
  • the sensing unit 140 is shown having a proximity sensor 141 and an illumination sensor 142.
  • the sensing unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor, and the like), to name a few.
  • the mobile terminal 100 may be configured to utilize information obtained from sensing unit 140, and in particular, information obtained from one or more sensors of the sensing unit 140, and combinations thereof.
  • the output unit 150 is typically configured to output various types of information, such as audio, video, tactile output, and the like.
  • the output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154.
  • the display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen.
  • the touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user.
  • a special module 98 is provided for the lightweight electric motor with a gearbox to raise and bring down the curved frame.
  • a special cooling fan module 99 is provided to keep the terminal block and modules in cool condition.
  • the interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100.
  • the interface unit 160 may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like.
  • the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.
  • the memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100.
  • the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
  • the controller 180 typically functions to control overall operation of the mobile terminal 100, in addition to the operations associated with the application programs.
  • the controller 180 processes signals, data, and information input or output through the components mentioned in the foregoing description or runs an application program saved in the memory 170, thereby providing or processing an information or function appropriate for a user.
  • the controller 180 can provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components depicted in FIG. 1A, or activating application programs stored in the memory 170. As one example, the controller 180 controls some or all of the components illustrated in FIG. 1A according to the execution of an application program that has been stored in the memory 170.
  • the power supply unit 190 can be configured to receive external power or provide internal power in order to supply appropriate power required for operating elements and components included in the mobile terminal 100.
  • the power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body.
  • At least one portion of the respective components can cooperatively operate to implement operations, controls or controlling methods of a mobile terminal according to various embodiments of the present invention mentioned in the following description.
  • the operations, controls or controlling methods of the mobile terminal can be implemented on the mobile terminal by running at least one application program saved in the memory 170.
  • the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel.
  • the broadcast channel may include a satellite channel, a terrestrial channel, or both.
  • two or more broadcast receiving modules 111 may be utilized to facilitate simultaneous receiving of two or more broadcast channels, or to support switching among broadcast channels.
  • the broadcast managing entity may be implemented using a server or system which generates and transmits a broadcast signal and/or broadcast associated information, or a server which receives a pre-generated broadcast signal and/or broadcast associated information, and sends such items to the mobile terminal.
  • the broadcast signal may be implemented using any of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others.
  • the broadcast signal in some cases may further include a data broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast signal may be encoded according to any of a variety of technical standards or broadcasting methods (for example, International Organization for Standardization (ISO), International Electrotechnical Commission (IEC), Digital Video Broadcast (DVB), Advanced Television Systems Committee (ATSC), and the like) for transmission and reception of digital broadcast signals.
  • the broadcast receiving module 111 can receive the digital broadcast signals using a method appropriate for the transmission method utilized.
  • broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast event, a broadcast service provider, or the like.
  • the broadcast associated information may also be provided via a mobile communication network, and in this instance, received by the mobile communication module 112.
  • broadcast associated information may be implemented in various formats.
  • broadcast associated information may include an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H), and the like.
  • Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a suitable device, such as a memory 170.
  • the mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities.
  • a network entity include a base station, an external mobile terminal, a server, and the like.
  • Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multi Access (CDMA), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like).
  • wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support communication of text and multimedia messages.
  • the wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies.
  • wireless Internet access examples include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like.
  • the wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well.
  • when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or function as, the mobile communication module 112.
  • the short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTHTM, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like.
  • the short-range communication module 114 in general supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless area networks.
  • One example of such wireless area networks is a wireless personal area network.
  • another mobile terminal (which may be configured similarly to mobile terminal 100) may be a wearable device, for example, a smart watch, a smart glass or a head mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100).
  • the short- range communication module 114 may sense or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100.
  • the controller 180 when the sensed wearable device is a device which is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause transmission of data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114.
  • a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user may answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device.
  • the location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal.
  • the location information module 115 includes a Global Position System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal.
  • a position of the mobile terminal may be acquired using a signal sent from a GPS satellite.
  • Wi-Fi module a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module.
  • the input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.
  • the microphone 122 is generally implemented to permit audio input to the mobile terminal 100.
  • the audio input can be processed in various manners according to a function being executed in the mobile terminal 100.
  • the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving the external audio.
  • the user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control operation of the mobile terminal 100.
  • the user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others.
  • the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location that is other than the touch screen.
  • the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof.
  • the sensing unit 140 is generally configured to sense one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like.
  • the controller 180 generally cooperates with the sensing unit 140 to control operation of the mobile terminal 100 or execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the sensing provided by the sensing unit 140.
  • the sensing unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail.
  • the proximity sensor 141 may include a sensor to sense presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like without a mechanical contact.
  • the proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen.
  • the proximity sensor 141 may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like.
  • the proximity sensor 141 can sense proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity.
  • the touch screen may also be categorized as a proximity sensor.
  • the term "proximity touch" will often be referred to herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting the touch screen.
  • the term "contact touch" will often be referred to herein to denote the scenario in which a pointer makes physical contact with the touch screen.
  • the position corresponding to a proximity touch of the pointer relative to the touch screen is the position at which the pointer is perpendicular to the touch screen.
  • the proximity sensor 141 may sense proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like).
  • the controller 180 processes data corresponding to proximity touches and proximity touch patterns sensed by the proximity sensor 141, and causes output of visual information on the touch screen.
  • the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch.
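A minimal sketch of dispatching proximity versus contact touches to different operations, as described above; the event shape and callback names are hypothetical:

```python
def handle_touch(event, on_hover, on_select):
    """Dispatch a sensed touch to different operations depending on
    whether it was a proximity (hover) touch or a contact touch."""
    x, y = event["x"], event["y"]
    if event["kind"] == "proximity":
        on_hover(x, y)   # e.g. highlight or preview under the pointer
    elif event["kind"] == "contact":
        on_select(x, y)  # e.g. commit a selection at the touch point

handle_touch({"kind": "proximity", "x": 10, "y": 20},
             on_hover=lambda x, y: print("hover at", x, y),
             on_select=lambda x, y: print("select at", x, y))
```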
  • a touch sensor can sense a touch applied to the touch screen, such as display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or convert capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to sense not only a touched position and a touched area, but also touch pressure and/or touch capacitance.
  • a touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, or the like.
  • a touch input When a touch input is sensed by a touch sensor, corresponding signals may be transmitted to a touch controller.
  • the touch controller may process the received signals, and then transmit corresponding data to the controller 180.
  • the controller 180 can sense which region of the display unit 151 has been touched.
  • the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof.
  • the controller 180 can execute the same or different controls according to a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example.
  • the touch sensor and the proximity sensor may be implemented individually, or in combination, to sense various types of touches.
  • Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.
  • an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves.
  • the controller 180 may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for which the light reaches the optical sensor is much shorter than the time for which the ultrasonic wave reaches the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference from the time that the ultrasonic wave reaches the sensor based on the light as a reference signal.
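The light-as-reference-signal calculation above reduces to: distance = speed of sound × (ultrasound arrival time − light arrival time); with several ultrasonic sensors the source position then follows by trilateration. A sketch under those assumptions (NumPy, two dimensions, speed of sound at room temperature):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def source_distance(t_light_s, t_ultrasound_s):
    """Light arrives effectively instantly, so the gap between the two
    arrival times is the acoustic travel time from the source."""
    return SPEED_OF_SOUND * (t_ultrasound_s - t_light_s)

def trilaterate(sensor_xy, distances):
    """Least-squares 2-D position of the wave source from at least three
    ultrasonic sensor positions and their measured distances."""
    (x1, y1), d1 = sensor_xy[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(sensor_xy[1:], distances[1:]):
        rows.append([2.0 * (xi - x1), 2.0 * (yi - y1)])
        rhs.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    xy, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return xy  # (x, y) of the touch/wave source
```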
  • the camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor.
  • a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image.
  • the photo sensor may be laminated on, or overlapped with, the display device.
  • the photo sensor may be configured to scan movement of the physical object in proximity to the touch screen.
  • the photo sensor may include photo diodes and transistors at rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to variation of light to thus obtain position information of the physical object.
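A sketch of the coordinate calculation from light variation described above: take two scans of the photodiode grid and compute the intensity-weighted centroid of the change. The array shapes and the centroid approach are my assumptions for illustration:

```python
import numpy as np

def object_coordinates(scan_prev, scan_curr):
    """Locate an object over the photo sensor from the change in received
    light between two scans of the photodiode rows and columns."""
    delta = np.abs(scan_curr.astype(float) - scan_prev.astype(float))
    total = delta.sum()
    if total == 0:
        return None  # no light variation: nothing moved over the sensor
    rows, cols = np.indices(delta.shape)
    y = (rows * delta).sum() / total  # intensity-weighted centroid row
    x = (cols * delta).sum() / total  # intensity-weighted centroid column
    return x, y
```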
  • the display unit 151 is generally configured to output information processed in the mobile terminal 100.
  • the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100, or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
  • the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images.
  • a typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glass scheme), an auto-stereoscopic scheme (glassless scheme), a projection scheme (holographic scheme), or the like.
  • a 3D stereoscopic image may include the left image (e.g., the left eye image) and the right image (e.g., the right eye image).
  • a 3D stereoscopic imaging method can be divided into a top-down method in which left and right images are located up and down in a frame, an L-to-R (left-to-right or side by side) method in which left and right images are located left and right in a frame, a checker board method in which fragments of left and right images are located in a tile form, an interlaced method in which left and right images are alternately located by columns or rows, and a time sequential (or frame by frame) method in which left and right images are alternately displayed on a time basis.
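For illustration, recovering the left and right eye images from one packed frame for a few of the listed methods might look like this (NumPy arrays; the packing names are mine, not the patent's):

```python
import numpy as np

def split_stereo(frame, packing="side_by_side"):
    """Recover the left/right eye images from one packed 3-D frame."""
    h, w = frame.shape[:2]
    if packing == "side_by_side":     # L-to-R: eyes sit left and right
        return frame[:, : w // 2], frame[:, w // 2 :]
    if packing == "top_down":         # eyes sit above and below
        return frame[: h // 2], frame[h // 2 :]
    if packing == "interlaced_rows":  # alternate rows per eye
        return frame[0::2], frame[1::2]
    raise ValueError(f"unsupported packing: {packing}")

left, right = split_stereo(np.zeros((1080, 3840, 3)), "side_by_side")
print(left.shape, right.shape)  # (1080, 1920, 3) (1080, 1920, 3)
```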
  • the left image thumbnail and the right image thumbnail can be generated from the left image and the right image of an original image frame, respectively, and then combined to generate a single 3D thumbnail image.
  • thumbnail may be used to refer to a reduced image or a reduced still image.
  • a generated left image thumbnail and right image thumbnail may be displayed with a horizontal distance difference there between by a depth corresponding to the disparity between the left image and the right image on the screen, thereby providing a stereoscopic space sense.
  • the left image and the right image required for implementing a 3D stereoscopic image may be displayed on the stereoscopic display unit using a stereoscopic processing unit.
  • the stereoscopic processing unit can receive the 3D image and extract the left image and the right image, or can receive the 2D image and change it into the left image and the right image.
  • the audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.
  • a haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences.
  • a typical example of a tactile effect generated by the haptic module 153 is vibration.
  • the strength, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combining manner or a sequential manner.
  • the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving to contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like.
  • the haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation such as the user's fingers or arm, as well as transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100.
  • An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like.
  • a signal output by the optical output module 154 may be implemented so the mobile terminal emits monochromatic light or light with a plurality of colors.
  • the signal output may be terminated as the mobile terminal senses that a user has checked the generated event, for example.
  • the interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100.
  • the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such external device.
  • the interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
  • the identification module may be a chip that stores various information for authenticating authority to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like.
  • the device having the identification module (also referred to herein as an "identifying device") may take the form of a smart card. Accordingly, the identifying device can be connected with the terminal 100 via the interface unit 160.
  • when the mobile terminal 100 is connected with an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100 or may serve as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal therethrough.
  • Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
  • the memory 170 can store programs to support operations of the controller 180 and store input/output data (for example, phonebook, messages, still images, videos, etc.).
  • the memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen.
  • the memory 170 may include one or more types of storage mediums including a Flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet.
  • the controller 180 can typically control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition.
  • the controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively.
  • the controller 180 can control one or a combination of those components in order to implement various exemplary embodiments disclosed herein.
  • the power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating respective elements and components included in the mobile terminal 100.
  • the power supply unit 190 may include a battery, which is typically rechargeable and may be detachably coupled to the terminal body for charging.
  • the power supply unit 190 may include a connection port.
  • the connection port may be configured as one example of the interface unit 160 to which an external charger for supplying power to recharge the battery is electrically connected.
  • the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port.
  • the power supply unit 190 can receive power, transferred from an external wireless power transmitter, using at least one of an inductive coupling method which is based on magnetic induction or a magnetic resonance coupling method which is based on electromagnetic resonance.
  • Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or similar medium using, for example, software, hardware, or any combination thereof.
  • a folding headset with curved frame HMD takes the form of wearable glasses or goggles, but it will be appreciated that other forms are possible.
  • the folding headset with curved frame HMD may be configured in an augmented reality configuration to present an augmented reality environment, and thus may include an at least partially see-through stereoscopic display 5012 that may be configured to visually augment an appearance of a physical environment being viewed by the user through the at least partially see-through stereoscopic display 5012 .
  • the at least partially see-through stereoscopic display 5012 may include one or more regions that are transparent (e.g., optically clear) and may include one or more regions that are opaque or semi-transparent.
  • the at least partially see-through stereoscopic display 5012 may be transparent (e.g., optically clear) across an entire usable display surface of the stereoscopic display 5012.
  • the folding headset with curved frame HMD may be configured in a virtual reality configuration to present a full virtual reality environment, and thus the stereoscopic display 5012 may be a non-see-through stereoscopic display.
  • the folding headset with curved frame HMD may be configured to display virtual three dimensional environments to the user via the non-see-through stereoscopic display.
  • the folding headset with curved frame HMD may be configured to display a virtual representation such as a three dimensional graphical rendering of the physical environment in front of the user that may include additional virtual objects or may be configured to display camera-captured images of the physical environment along with additional virtual objects including the virtual cursor overlaid on the camera-captured images.
  • the folding headset with curved frame HMD may include an image production system 5014 that is configured to display virtual objects to the user with the stereoscopic display 5012 .
  • the virtual objects are visually superimposed onto the physical environment that is visible through the display so as to be perceived at various depths and locations.
  • the image production system 5014 may be configured to display virtual objects to the user with the non-see-through stereoscopic display, such that the virtual objects are perceived to be at various depths and locations relative to one another.
  • the folding headset with curved frame HMD may use stereoscopy to visually place a virtual object at a desired depth by displaying separate images of the virtual object to both of the user's eyes. Using this stereoscopy technique, the folding headset with curved frame HMD may control the displayed images of the virtual objects, such that the user will perceive that the virtual objects exist at a desired depth and location in the viewed physical environment.
  • the virtual object may be a virtual cursor that is displayed to the user, such that the virtual cursor appears to the user to be located at a desired location in the virtual three dimensional environment.
  • the virtual object may be a holographic cursor that is displayed to the user, such that the holographic cursor appears to the user to be located at a desired location in the real world physical environment.
  • the folding headset with curved frame HMD includes an optical sensor system 5016 that may include one or more optical sensors.
  • the optical sensor system 5016 includes an outward facing optical sensor 5018 that may be configured to detect the real-world background from a similar vantage point (e.g., line of sight) as observed by the user through the at least partially see-through stereoscopic display 5012 .
  • the optical sensor system 5016 may additionally include an inward facing optical sensor 5020 that may be configured to detect a gaze direction of the user's eye.
  • the outward facing optical sensor 5018 may include one or more component sensors, including an RGB camera and a depth camera.
  • the RGB camera may be a high definition camera or have another resolution.
  • the depth camera may be configured to project non-visible light, such as infrared (IR) radiation, and capture reflections of the projected light, and based thereon, generate an image comprised of measured depth data for each pixel in the image.
  • This depth data may be combined with color information from the image captured by the RGB camera, into a single image representation including both color data and depth data, if desired.
  • the color and depth data captured by the optical sensor system 5016 may be used to perform surface reconstruction and generate a virtual model of the real world background that may be displayed to the user via the display 5012 .
  • the image data captured by the optical sensor system 5016 may be directly presented as image data to the user on the display 5012 .
  • the folding headset with curved frame HMD may further include a position sensor system 5022 that may include one or more position sensors, such as one or more inertial measurement units (IMUs) incorporating a 3-axis accelerometer, a 3-axis gyroscope and/or a 3-axis magnetometer, global positioning system(s), multilateration tracker(s), and/or other sensors that output position sensor information usable to determine a position, orientation, and/or movement of the relevant sensor.
  • Optical sensor information received from the optical sensor system 5016 and/or position sensor information received from position sensor system 5022 may be used to assess a position and orientation of the vantage point of folding headset with curved frame HMD relative to other environmental objects.
  • the position and orientation of the vantage point may be characterized with six degrees of freedom (e.g., world-space X, Y, Z, θpitch, θyaw and θroll).
  • the vantage point may be characterized globally or independent of the real-world background.
  • the position and/or orientation may be determined with an on-board computing system (e.g., on-board computing system 5024 ) and/or an off-board computing system.
  • frames of reference of all sensors located on board the folding headset with curved frame HMD are factory aligned and calibrated to resolve six degrees of freedom relative to world-space.
  • the optical sensor information and the position sensor information may be used by a computing system to perform analysis of the real-world background, such as depth analysis, surface reconstruction, environmental color and lighting analysis, or other suitable operations.
  • the optical and positional sensor information may be used to create a virtual model of the real-world background.
  • the position and orientation of the vantage point may be characterized relative to this virtual space.
  • the virtual model may be used to determine positions of virtual objects in the virtual space and add additional virtual objects to be displayed to the user.
  • the optical sensor information received from the optical sensor system 5016 may be used to identify and track objects in the field of view of optical sensor system 5016 .
  • depth data captured by optical sensor system 5016 may be used to identify and track motion of a user's hand.
  • the tracked motion may include movement of the user's hand in three-dimensional space, and may be characterized with six degrees of freedom (e.g., world-space X, Y, Z, θpitch, θyaw and θroll).
  • the tracked motion may also be used to identify and track a hand gesture made by the user's hand. For example, one identifiable hand gesture may be moving a forefinger upwards or downwards.
  • optical tags may be placed at known locations on the user's hand or a glove worn by the user, and the optical tags may be tracked through the image data captured by optical sensor system 5016 .
  • the following examples and methods may be applied to both a virtual reality and an augmented reality configuration of the folding headset with curved frame HMD .
  • where the display 5012 of the folding headset with curved frame HMD is a non-see-through display, the three dimensional environment is a virtual environment displayed to the user.
  • the virtual environment may be a virtual model generated based on image data captured of the real-world background by optical sensor system 5016 of the folding headset with curved frame HMD .
  • the folding headset with curved frame HMD is a pair of mixed reality head-mounted smartglasses.
  • folding headset with curved frame has see-through holographic lenses that use an advanced optical projection system to generate multi-dimensional full-color holograms with very low latency so a user can see holographic objects in a real world setting.
  • Located at the front of the curved frame are sensors and related hardware, including cameras and processors.
  • the curved frame also incorporates an inertial measurement unit (IMU), which includes an accelerometer, gyroscope, and a magnetometer, four "environment understanding" sensors, an energy-efficient depth camera with a 120°×120° angle of view, a forward-facing 2.4-megapixel photographic video camera, a four-microphone array, and an ambient light sensor. The curved frame contains advanced sensors to capture information about what the user is doing and the environment the user is in.
  • the built in cameras also enable a user to record (mixed reality capture (MRC)) HD pictures and video of the holograms in the surrounding world to share with others.
  • the folding headset with curved frame includes a pair of transparent combiner lenses, in which the projected images are displayed in the lower half.
  • the folding headset with curved frame must be calibrated to the interpupillary distance (IPD), or accustomed vision of the user.
  • the curved frame features a custom-made holographic processing unit (HPU), a coprocessor manufactured specifically for the folding headset with curved frame.
  • the main purpose of the HPU is processing and integrating data from the sensors, as well as handling tasks such as spatial mapping, gesture recognition, and voice and speech recognition.
  • the HPU processes terabytes of information from the folding headset with curved frame's sensors in real time.
  • the lenses of the curved frame use optical waveguides to carry blue, green, and red light across three different layers, each with diffractive features.
  • a light engine above each combiner lens projects light into the lens; the light then hits a diffractive element and is reflected repeatedly along the waveguide until it is output to the eye.
  • the display projection for the curved frame occupies a limited portion of the user's field of view (FOV), particularly in comparison to virtual reality head- mounted displays, which typically cover a much greater field of view.
  • the folding headset with curved frame contains an internal rechargeable battery, but can be operated while charging. The folding headset with curved frame also features IEEE 802.11ac Wi-Fi and Bluetooth 4.1 Low Energy (LE) wireless connectivity.
  • With the folding headset with curved frame, a user can create and shape holograms with gestures, communicate with apps using voice commands, and navigate with a glance, hand gestures, Controllers and/or other pointing devices. The folding headset with curved frame understands gestures, gaze, and voice, enabling the user to interact in the most natural way possible. With spatial sound, the folding headset with curved frame synthesizes sound so the user can hear holograms from anywhere in the room, even if they are behind the user. [0128] As mentioned above, the folding headset with curved frame includes a depth camera, which is capable of detecting the 3D location of objects located within the depth camera's FOV.
  • the depth camera is able to accurately detect, on a pixel-by-pixel basis, the exact 3D location of each point on a physical object within the camera's field of view.
  • while the folding headset with curved frame uses a depth camera, stereoscopic optics can also be used to detect the distance of objects from the HMD and the locations of such objects in 3D space via triangulation. In either event, such sensors can detect the 3D location (x, y and z coordinates) of real objects located within the FOV relative to the HMD.
  • the depth camera of the HMD can be used to detect the 3D location of the Controller relative to the HMD.
  • the folding headset with curved frame has the ability to track the movement of a user's hands through space and to identify and interpret a variety of hand poses, gestures and movements to manipulate virtual objects in the AR space. Additional details regarding hand tracking, hand gesture identification, classification and recognition and/or hand pose identification.
  • the manual wireless controllers of the HMD glasses are pen-like.
  • Controller 5040 can include an on-board microcontroller 5042 , its own IMU 5044 , a communications radio 5046 , a rechargeable battery (not shown), and one or more status LEDs 5048 .
  • the IMU typically includes a 3-axis accelerometer and a 3-axis gyroscope, and may also include a magnetometer.
  • User inputs and orientation data (pitch, yaw and roll) derived from the IMU can be wirelessly communicated by the microcontroller 5042 to the CPU of the folding headset with curved frame HMD via wireless radio 5046 .
  • Controller 5040 can also include one or more momentary switch(es) 5050 for selective activation by the user to control a virtual cursor and/or to manipulate virtual objects in various ways (such as, for example, select, move, rotate, scroll, etc.). Controller 5040 can also include an elastic finger loop (for holding the device) and a USB 2.0 micro-B receptacle for charging the internal battery.
  • the IMU 5044 can detect the orientation of the Controller 5040 , but only with three degrees of freedom, namely, pitch (elevation angle), yaw (azimuth angle) and roll (rotation). Because the accelerometer can detect the gravity vector, the vertical axis of the frame of reference of the Controller 5040 is easily identified and aligned. Similarly, the gyroscopes of the IMU 5044 can readily detect the horizontal plane and, therefore, the horizontal plane is readily identified and aligned. If the IMU 5044 also includes a magnetometer, then magnetic north can readily be identified and the frame of reference of the Controller 5040 can be north aligned.
  • if both the IMU of the folding headset with curved frame HMD and the IMU 5044 of the Controller 5040 include a magnetometer, then the frame of reference of the Controller 5040 will automatically be aligned with the folding headset with curved frame HMD's frame of reference.
  • if the IMU 5044 of the Controller 5040 does not include a magnetometer, then the IMU 5044 arbitrarily assigns an x-axis when it powers up and then continuously tracks azimuth changes (angular rotation in the horizontal plane) from that initial frame of reference. In that case, the frame of reference of the Controller 5040 will need to be aligned with or calibrated to the folding headset with curved frame HMD's frame of reference, as discussed in more detail below.
  • FIG. 30 illustrates an augmented reality configuration of a folding headset with curved frame HMD worn by a user 5026 , displaying a virtual cursor, which is a holographic cursor 5028 in this example, on the at least partially see-through stereoscopic display 5012 so as to appear to be located at a location 5030 in a three dimensional environment 5032 .
  • the three dimensional environment 5032 is a room in the real world
  • the holographic cursor 5028 is displayed on the at least partially see-through stereoscopic display such that the holographic cursor 5028 appears to the user 5026 to be hovering in the middle of the room at the location 5030 .
  • the location 5030 for the holographic cursor 5028 may be calculated based on a variety of suitable methods.
  • the location 5030 may be calculated based on a predetermined distance and orientation relative to the user 5026 , such as being two feet in front of the user 5026 as one specific example.
  • the location 5030 may be calculated based on a detected gaze direction 5034 and a recognized object that intersects with the detected gaze direction.
  • the recognized object may be a real object in the three dimensional environment. This example is illustrated in FIG. 30, with the recognized object being the wall 5036 that is a part of the room that serves as the three dimensional environment 5032 . Accordingly, the intersection between the wall 5036 and the detected gaze direction 5034 of the user 5026 may be used to calculate the location 5030 for the holographic cursor 5028 . It may be advantageous to further ensure that the holographic cursor 5028 is displayed to the user 5026 , such that the holographic cursor 5028 is easily visible to the user 5026 .
  • the location 5030 of the holographic cursor 5028 may be placed a threshold distance away from the recognized object to prevent the holographic cursor 5028 from being occluded by any protrusions of the recognized object. Additionally, it may be advantageous to further calculate the location 5030 of the holographic cursor 5028 based on a plane that is orthogonal to the detected gaze direction 5034 of the user 5026 . By placing the location 5030 of the holographic cursor 5028 on such a plane, a consistent view of the holographic cursor 5028 may be maintained even as the user changes gaze direction.
  • the folding headset with curved frame HMD worn by the user 5026 may be configured to detect motion of the user's hand. Based on a series of images captured by the optical sensor system 5016 , the folding headset with curved frame HMD may determine whether motion of hand 5038 of the user 5026 is trackable. For example, the user's hand at positions 5038 and 5038 A are within the field of view of the optical sensor system 5016 . Accordingly, motion of the user's hand moving from position 5038 to position 5038 A over time T 1 is trackable by the folding headset with curved frame HMD .
  • because position 5038 B may be outside of the field of view of the optical sensor system 5016 , motion of the user's hand moving from position 5038 A to position 5038 B over time T 2 may not be trackable by the folding headset with curved frame HMD .
  • the user's hand is determined to be trackable by the HMD when the HMD can monitor the hand for gesture input.
  • the user's hand is deemed to be trackable, for example, when computer algorithms implemented in software executed on the processor of the folding headset with curved frame HMD identify the hand in images captured by the onboard camera and begin tracking the hand, until a point in time at which those algorithms lose track of the hand.
  • Techniques that may be used to track the hand include searching for regions of similar color values and segmenting a portion of the image from the rest of the image based on those color values, as well as searching for regions of pixels that have changed, indicating foreground movement by a hand or other object.
  • the hand may be located using skeletal tracking techniques in addition or as an alternative to the above.
  • a hand may be determined to be trackable when a confidence degree output by the algorithm indicates that the hand is being tracked with above a predetermined threshold level of confidence.
  • the folding headset with curved frame HMD communicates to the user whether motion of the user's hand is trackable.
  • the folding headset with curved frame HMD modifies the visual appearance of the holographic cursor to indicate that motion of the hand is trackable.
  • the visual appearance of the holographic cursor is modified to appear as holographic cursor 5028 , which is an unfilled circle.
  • as the user moves the hand from position 5038 to position 5038 A over time T 1 , the user is shown the holographic cursor having visual appearance 5028 and is thus provided with the feedback that motion of the user's hand is currently trackable, and any hand gestures or hand movements will be tracked by the folding headset with curved frame HMD .
  • the folding headset with curved frame HMD modifies the visual appearance of the holographic cursor to indicate that motion of the hand is not trackable.
  • the visual appearance of the holographic cursor may be modified to appear as holographic cursor 5028 A, which has a different visual appearance than holographic cursor 5028 .
  • the visual appearance of holographic cursor 5028 A is a filled circle. Accordingly, as the user moves the hand from position 5038 A to position 5038 B over time T 2 , the user is shown holographic cursor having visual appearance 5028 A and is thus provided with the feedback that motion of the user's hand is not currently trackable.
  • while the example illustrated in FIG. 30 modifies the visual appearance of the holographic cursor to appear as a filled or unfilled circle, any suitable visual modification is possible.
  • the visual appearance of the holographic cursor may be modified by changing a color, changing a shape, adding or removing an icon, or changing a size of the holographic cursor.
  • while wireless Controllers found in the prior art may provide orientation information with 3DOF, they do not provide location information. 6DOF can be recovered, however, in accordance with the systems and methods described below. For example, and as set forth in more detail below, one embodiment of the invention is directed to a system for 6DOF mixed reality input that fuses an inertial handheld controller with hand tracking.
  • the system can include: a display with an onboard processor; a hand-held input device configured to communicate with the processor to selectively provide one or more user inputs, the hand-held input device also including a first sensor for determining the orientation of the hand-held input device relative to a predetermined frame of reference and providing orientation data to the processor; and a second sensor located at a known location relative to the display for determining the position of one or more hands of a user relative to the display and for providing position data to the processor, wherein the processor uses the orientation data and the position data to track the one or more hands of the user within a three dimensional field of view with six degrees of freedom.
  • the hand-tracking feature of the HMD can be used to accurately and precisely determine the 3D position of a Controller relative to the HMD by detecting the location of a user's hand in which the Controller is located. Then, the location information derived from the optical system of the HMD can be combined with the orientation data derived from the orientation sensors (e.g., IMU) incorporated in the Controller. In this manner, the system provides a Controller that operates with 6DOF.
  • the image processor analyzes the video to determine the presence of one or more of the user's hands within the field of view of the optical sensor. If a user's hand is detected by the image processor, then the image processor can also determine whether the orientation and shape of the hand indicates the presence of a Controller, based on known geometrical constraints of the Controller and the position and orientation of the hand relative to the Controller. To determine which hand is holding the Controller, a classifier forming part of the environment tracking components of the HMD is trained to determine if a segmented hand is positioned in a hand pose consistent with holding a controller, using training examples of hands interacting with the controller. When using two controllers, one in each hand, it is possible to further differentiate which hand holds which controller by matching the hand trajectory as observed by the hand tracking sensor of the HMD with the acceleration data from the IMU of each controller over a period of time.
  • the depth camera of the HMD determines the exact position (x, y and z coordinates) of the Controller in 3D space relative to a known frame of reference.
  • orientation data (θpitch, θyaw and θroll) for time T 1 is also obtained from the IMU of the Controller.
  • By combining the location data, derived from the depth camera, with the orientation data, derived from the IMU of the Controller, 6DOF are recovered, thereby allowing the HMD to track and interact with the Controller with 6DOF. This process can be repeated for each successive frame, or some other predetermined sampling of video captured by the optical sensor, to track and interact with the Controller with 6DOF.
  • the image processor can detect orientation of the user's hand by segmenting various parts of the user's hands and arms, determining the relative positions of each part and, from that information, deriving the orientation of the user's hand(s). Information concerning the orientation of the user's hand can also be compared to the orientation of a Controller (based on orientation data derived from the IMU) to determine if the hand orientation data is consistent with the controller orientation data. This information, along with other positional data, helps to determine whether the controller should be associated with a particular hand. Once a certain level of confidence is reached that a controller should be associated with a particular hand, such association is made for future identification and tracking.
  • the location data (x, y and z coordinates) derived from the depth camera can be combined with the orientation data (θpitch, θyaw and θroll) derived from the IMU of the Controller to achieve a Controller that can be accurately detected with a relatively high degree of reliability and resolution in 6DOF.
  • the system comprises an inertial handheld controller and an HMD with a hand tracking sensor and an environment tracking sensor.
  • the Controller can include an IMU that can include a combination of accelerometers and gyroscopes.
  • the IMU may also contain magnetometers. IMU data is fused to compute, with high frequency and low latency, the orientation (θpitch, θyaw and θroll) of the Controller relative to some initial reference frame that is gravity aligned. The presence of magnetometers ensures there is little drift in maintaining the north pole alignment.
  • the hand tracking sensor consists of a depth camera that observes the hands moving through space.
  • the depth image can be used to segment the hand from the background and the rest of the body, classify pixels as belonging to different hand parts using decision trees/jungles, and compute centroids for them (palm, fingertips, etc.) in 3D space.
  • the hand tracking sensor is factory calibrated relative to the environment tracking components on board of the HMD, allowing for the hand position to be transformed to a gravity aligned world frame of reference.
  • the hand(s) can also be classified into several hand poses (open, closed, pointing, bloom etc.).
  • the location data (x, y and z coordinates) of the hand and the orientation data (θpitch, θyaw and θroll) of the IMU are combined to determine the 6DOF transform of the Controller in the world frame of reference.
  • the HMD and the Controller frames of reference are both gravity aligned (z axis is shared). In the embodiment where the HMD and the Controller are both gravity aligned and north aligned, they are rotationally invariant. If the two frames of reference are not north-aligned, then there is an azimuth offset between the two frames of reference that needs to be resolved in one of several ways. For example, in a scenario where there is a 3D cursor (such as gaze targeting against 3D content), the cursor has a 3D location. For a manipulation gesture, the azimuth offset is calculated at the time of the button press by aligning the IMU forward vector with the vector between the hand and the cursor, and is maintained constant throughout the manipulation gesture until the button is released.
  • one way to determine the azimuth offset and calibrate the Controller to the HMD's frame of reference is to have the user point at a virtual object and calculate the azimuth delta between the HMD's frame of reference and the Controller's frame of reference.
  • a coarse estimate of the hand orientation could also be used to initially estimate the azimuth offset and update it gradually over time using a moving average approach. Such a coarse estimate could be based on the segment between lower arm centroid and palm centroid provided by a hand tracking pipeline.
  • As discussed in greater detail below, the invention is also directed to methods for recovering six degrees of freedom (6DOF) relative to a wireless hand-held inertial controller when used in combination with a head mounted display.
  • the method can include one or more of the following acts: detecting, by an optical sensor of the head mounted display device, the presence of a user's hand within the field of view of the optical sensor; determining by the head mounted display device if a wireless hand-held inertial controller is active and paired with the head mounted display device; tracking by the optical sensor of the head mounted display device movement of the user's hand relative to the head mounted display over a period of time to derive trajectory data representative of the trajectory of the user's hand during the period of time; receiving by the head mounted display device acceleration data for the period of time from the wireless hand-held inertial controller as derived by the inertial measurement unit of the wireless hand-held inertial controller; comparing the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand; and, if the confidence level meets or exceeds a predetermined minimum threshold, fusing the location data derived from the optical sensor of the head mounted display device with the orientation data derived from the inertial measurement unit of the wireless hand-held inertial controller to track the user's hand within a three dimensional field of view with six degrees of freedom.
  • a method 500 for recovering six degrees of freedom (6DOF) relative to a wireless hand-held inertial controller when used in combination with a head mounted display is illustrated.
  • the process starts at block 50102 .
  • the hand tracking component of the HMD analyzes the video data to determine if a user's hand is located within the field of view of the HMD's optical sensor as indicated at step 50104 . If so, the process continues to step 50106 . If not, the process returns to step 50102 .
  • at step 50106 , the processor of the HMD device checks to see if it is paired with any active hand-held Controller. If so, the process continues to step 50108 . If not, the process returns to step 50102 .
  • the optical sensor of the HMD tracks movement of the user's hand relative to the head mounted display over a period of time to derive trajectory data representative of the trajectory of the user's hand during the period of time.
  • the HMD receives acceleration data for the same period of time from the Controller as derived by the IMU of the wireless hand-held inertial controller.
  • the HMD compares the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand. Then, as indicated at step 50110 , if the confidence level meets or exceeds a predetermined threshold, then the process continues to step 50116 . If not, the process continues with step 50112 .
  • step 50108 can be performed as follows.
  • the processor of the HMD retrieves orientation data from the IMU of the Controller and compares it to the orientation data the HMD derives from its optical sensors.
  • the processor of the HMD then computes a confidence level based on the correlation or lack thereof between the orientation data from the Controller and the orientation data from the HMD. As indicated at step 50110 , if the confidence level meets or exceeds a predetermined threshold, then the process continues to step 50116 . If not, the process continues with step 50112 .
  • the hand pose component of the HMD compares the pose detected in the video frame against a pose classifier and calculates a confidence level based on the correlation or lack thereof between the hand pose as detected in the video frame and hand poses consistent with the Controller being held in the user's hand. As indicated at step 50114 , if the confidence level meets or exceeds a predetermined threshold, then the process continues to step 50116 . If not, the process returns to step 50102 .
  • if the process reaches step 50116 , that means there is a sufficient confidence level to create an association between the detected user hand and the Controller, and such association is created. Such association is persisted unless and until further analysis demonstrates that the association is no longer valid based on subsequent confidence level calculations.
  • the process then continues to step 50118 , where the location data derived from the optical sensors of the HMD and the orientation data derived from the IMU of the Controller are fused, thereby recovering 6DOF in relation to the Controller. Then the process continues by returning to block 50102 for continued processing of subsequent frames of captured video.
  • the systems and methods described above may be practiced by a computer system including one or more processors and computer-readable media such as computer memory.
  • the computer memory may store computer-executable instructions that when executed by one or more processors cause various functions to be performed, such as the acts recited in the embodiments.
  • Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below.
  • Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer- executable instructions and/or data structures.
  • Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
  • Computer-readable media that store computer-executable instructions are physical storage media.
  • Computer-readable media that carry computer- executable instructions are transmission media.
  • embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer- readable storage media and transmission computer-readable media.
  • Physical computer-readable storage media include RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
  • a "network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
  • a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.
  • program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer-readable media to physical computer-readable storage media (or vice versa).
  • program code means in the form of computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a "NIC"), and then eventually transferred to computer system RAM and/or to less volatile computer-readable physical storage media at a computer system.
  • computer-readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
  • Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
  • the computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
  • the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor- based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
  • the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
  • program modules may be located in both local and remote memory storage devices.
  • the functionality described herein can be performed, at least in part, by one or more hardware logic components.
  • illustrative types of hardware logic components include Field-programmable Gate Arrays (FPGAs), Program-specific Integrated Circuits (ASICs), Program-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.

Abstract

The present invention is about a folding headset with its electronic accessories, in particular, a folding headset with curved frame, a folding screen, its electronic accessories and their control methods. Regarding the embodiment of the present disclosure, the electronic accessories include: a curved frame with a folding screen, a lightweight electric motor with a gearbox, and a pair of smart HMD glasses. It is intended in particular to design and use one or more terminal blocks and modules on a separate basis in a folding headset considering the type of input signals.

Description

[ 0001 ] The present invention is about a folding headset with its electronic accessories, in particular, a folding headset with curved frame, a folding screen, its electronic accessories and their control methods. Regarding the embodiment of the present disclosure, the electronic accessories include: a curved frame with a folding screen, a lightweight electric motor with a gearbox, and a pair of smart HMD glasses. Although this invention is suitable for a wide range of applications, it is intended in particular to design and use one or more terminal blocks and modules on a separate basis in a folding headset considering the type of input signals, and it is especially suitable for changing the folding angle and changing the angle of movement sensed by the folding headset and the curved folding frame.
Discussion of the Related Art
[0002] The design and weight, the type of use and the interaction techniques are important factors in how the user communicates with a mobile computing system. Also, organizing the technology to facilitate interaction with users is one of the design challenges associated with building mobile computing systems, since users differ in their learning potentials and imaginations, with some interacting more easily through audiovisual and digital means than others. One of the aims pursued by this device is to help organize the methods and technologies that provide more convenient interaction between users and their mobile computing systems, especially as the moving device can be used on the head, on the table, or held by hand.
[0003] A mobile terminal can perform various functions. Examples of such functions include data and voice communications, capturing images and video via a camera, recording audio, playing music files and outputting music via a speaker system, and displaying images and video on a display. Some terminals include additional functionality which supports game playing, while other terminals are also configured as multimedia players. More recently, mobile terminals have been configured to receive broadcast and multicast signals which permit viewing of contents, such as videos and television programs.
[0004] Generally, terminals can be classified into mobile terminals and stationary terminals according to a presence or non-presence of mobility. In addition, the mobile terminals can be further classified into handheld terminals and vehicle mounted terminals. There are ongoing efforts to support and increase the functionality of mobile terminals. Such efforts include software and hardware improvements, as well as changes and improvements in the structural components which form the mobile terminal.
[0005] Mixed reality is a technology that allows virtual imagery to be mixed with a real world physical environment in a display. Systems for mixed reality may include, for example, see through head mounted display (HMD) devices or smart phones with built in cameras. Such systems typically include processing units which provide the imagery under the control of one or more applications. Full virtual reality environments in which no real world objects are viewable can also be supported using HMD and other devices.
[0006] Such systems may also include one or more wireless hand-held inertial controllers that the user of the system can manipulate to interact with the HMD and provide user input to the HMD, including, but not limited to, controlling and moving a virtual cursor, selection, movement and rotation of objects, scrolling, etc.
[0007] This Background is provided to introduce a brief context for the Summary and Detailed Description that follow. This Background is not intended to be an aid in determining the scope of the claimed subject matter nor be viewed as limiting the claimed subject matter to implementations that solve any or all of the disadvantages or problems presented above. Furthermore, the subject matter claimed herein is not limited to embodiments that solve any disadvantages or that operate only in environments such as those described above. Rather, this background is only provided to illustrate one exemplary technology area where some embodiments described herein may be practiced.
Summary of the invention
[0008] The design and weight, the type of use and the techniques are important for the user to communicate with the moving computing system. Also, organizing the technology for facilitating interaction with users is one of the design challenges associated with building moving computing systems, especially as the moving device can be used on the head, on the table, or held by hand.
Another challenge in achieving targeted design has been to position the electronic components relative to the position of the user, while also providing the device with an open space suitable for the electronic components and considering anti-impact, shatterproof, and waterproof properties of the system. Hence, the appearance of the computing system is in U-shape to convert the applied pressure to kinetic and spring energy as much as possible, and with good resistance in accidental impacts.
[0009] Accordingly, the embodiments of the invention are led to a folding headset, a folding screen with a rotating moveable frame and its control device, and since a pair of HMD glasses is used and configured behind the curved display frame, it has provided the space and situation to utilize one or two terminal blocks and modules in order to make the folding headset more organized, which substantially eliminate one or more problems due to limitations and disadvantages related to the art.
[0010] Meanwhile, a flexible display device, corresponding to a sort of mobile terminal, means a device having a display deformable like paper. Flexible devices include a foldable display, a rollable display, a bendable display and the like. For a foldable display foldable like paper, various user interfaces are needed in consideration of the fact that the foldable display is repeatedly opened and closed.
[0011] Based on the aspect of the present disclosure in another embodiment, the curved frame of the folding screen has two handles, defined together with the handles of the folding headset.
Moreover, according to the aspect in another embodiment of the current disclosure, the condition is as such that the curved frame and the set of folding rotating screen will be able to rotate and move upward and downward around the headset device by a lightweight electric motor with a gearbox.
The electric motor has a gearbox, empowering the frame on the folding headset to move vertically in a circular motion around the folding headset device. This movement can be a manual movement or a preset motion.
An embodiment of the electric motor includes a lightweight AC electric motor or a DC motor, which can be combined with gearwheels as the gearbox.
In one embodiment of the invention, the power, motor speed and acceleration may be supported at the folding headset device, and in another embodiment the lightweight electric motor with the gearbox may be supported in the curved frame section.
And, it is possible in one embodiment to replace the electric motor having a gearbox with a spring or lever system or a gear system for changing the angular motion between the folding headset and the curved frame in order to move on the headset.
In another embodiment of the present invention, the rotational angle between the curved frame and the headset may be altered by the lightweight electric motor having a gearbox. Furthermore, in another embodiment of the present invention, the folding angle may change between the folding headset device and the folding frame handles.
[0012] One of the aims of this innovation is to provide a folding headset with a curved frame of the rotating screen and a control method that allows the angle between the folding headset and its curved frame to be changed based on the user's input or a preset condition; a suitable angle can likewise be set, relative to the user's input and the preset mode, between the curved frame handles and the handles of the folding headset.
[0013] Another aim of the present invention is to provide a folding headset, the curved frame and its controlling means, in which the open/closed moving angle relative to the curved frame is defined by adjusting its speed with a lightweight electric motor and a gearbox, together with an electromagnet or a memory alloy, and the folding headset and curved frame handles. No failure can occur when the speed of opening/closing the handles of the folding headset and the curved display frame is adjusted by an electromagnet or the memory alloy.
[0014] Another aim of this innovation in providing a folding headset and a curved frame is to switch a state of the moving open/closed angle easily by a lightweight electric motor with a gearbox between the headset and the curved frame. Moreover, the open/closed states of the handles of the folding headset and the curve frame can easily be switched to each other (i.e. functioning by one hand).
Another aim of this innovation is to provide a folding headset and its curved frame, using the HMD glasses system located on the curved frame, which include:
[0015] According to an embodiment, this system can include the following items: a display connected to a processor, and a hand-held input device configured to communicate with the processor to provide one or more user inputs. The hand-held input device comprises a primary sensor to determine the orientation of the hand-held input device relative to a predefined reference frame and to transmit the related orientation data to the processor. A second sensor is located at a known location relative to the screen to determine the position of one or more hands of the user relative to the screen and to provide the position data to the processor, whereupon the processor uses the orientation data and the position data to track one or more hands of the user in a 3D field of view with six degrees of freedom.
[0016] In another embodiment, this method includes the following: detecting, by an optical sensor of the folding display headset located on the curved frame of the folding display, one hand of the user in the field of view of the optical sensor; determining whether a wireless hand-held inertial controller is active and paired with the head mounted display device; determining the location and orientation of the user's hand relative to the display on the folding display curved frame by the optical sensor; tracking the user's hand relative to the display located on the curved frame of the folding screen by an optical sensor of the corresponding display device over a period of time to extract the data related to the user's hand trajectory; getting the acceleration data for a specified period of time from a wireless hand-held inertial controller, by a display unit on the folding display curved frame, as measured by the inertial measurement unit of the wireless hand-held controller; comparing the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand; and, in case the confidence level meets or exceeds a predetermined minimum threshold, fusing the location data derived from the optical sensor of the head mounted display device with the orientation data derived from the inertial measurement unit of the wireless hand-held inertial controller to track the user's hand within three dimensional space with six degrees of freedom.
[0017] Technical tasks obtainable from the present invention are not limited to the above-mentioned technical tasks. And, other unmentioned technical tasks can be clearly understood from the following description by those having ordinary skill in the technical field to which the present invention pertains. [0018] Additional advantages, objects, and features of the invention will be set forth in the disclosure herein as well as the accompanying drawings. Such aspects may also be appreciated by those skilled in the art based on the disclosure herein.
[0019] In order to achieve the objectives and the required advantages, and in accordance with the aim of the invention as herein embodied, the folding headset and the rotating screen frame with a touchscreen and a transparent curve, as well as the HMD glasses system, define the body of the present folding headset disclosure in an embodiment, including a set of frames: a primary curved frame and a secondary curved frame. The first curved frame defines the space provided for placing the electronic elements of the electronic device. The second curved frame, as a cover and holder, is configured as a complement to the first frame.
In one embodiment of the present disclosure, the curved frame consists of a set of frames: a first curved frame and a second curved frame. The first curved frame defines the space provided for the positioning of the electronic elements of the electronic device. The second curved frame is configured with a clear, flexible touchscreen with a screen protector.
As disclosed in the invention, the created space can accommodate one or a few terminal blocks of the modules and can support systems, thus:
In one embodiment, the headset body supports the frame display area, and in the other embodiment, the headset body can support the area of the system of HMD smart glasses.
According to one embodiment of the present invention, the curved frame area may support the part and area of the frame itself; i.e. the display screen supports the curved frame. In another embodiment, it is the second area of the curved frame that supports the area of the system of HMD smart glasses.
[0020] For achieving these aims and other advantages and in accordance with the purpose of this innovation, as broadly considered and described herein, the folding device may comprise the following items in accordance with the embodiment of the present invention:
In one embodiment, the display unit comprises a folding curved frame display and folding curved frame handles and according to the present invention, the electronic elements such as sensors, controllers, etc. can be located in between the headset handles for the aim of controlling the display unit.
Regarding the present innovation, and in another suggestion, the electronic elements can be positioned between the folding headset and the curved frame to provide the required angular motion and rotational angle, and include the following:
A primary body that holds the first display unit; a secondary body that holds the second display unit; a sensor sensing the folding angle between the primary and secondary bodies; an actuator unit providing a change of the folding angle; and a controller configured to determine change information on the folding angle between the two handles of the headset and the two handles of the folding display curved frame, and to control the actuator to change the folding angle according to the determined change information. Also included is a controller to determine change information regarding the rotational angle of the circular motion between the headset and the curved frame, and to control the actuator system, through the lightweight electric motor with the gearbox, to change the rotational angle of the circular motion based on that change information.
[0021] Preferably, the controller may be configured, by means of a lightweight electric motor with a gearbox, to switch the rotational angle of the circular motion between the folding headset and the curved frame to the open or closed mode according to the determined change information. The open mode may indicate a case where the rotational angle of the circular motion between the folding headset and the curved frame exceeds the first rotational angle, and the closed mode may indicate a case where the rotational angle of the circular motion between the headset and the curved frame is equal to or smaller than the first rotational angle.
[0022] The controller can preferably be configured to switch the angle between the two handles of the folding headset and the angle between the two handles of the curved frame to the open or closed mode based on the determined change information. The open mode may indicate a case where the angle between the two handles of the folding headset and the angle between the two handles of the curved display frame exceed the first angle, and the closed mode may indicate a case where the angle between the two handles of the headset and the angle between the two handles of the curved display frame are equal to or smaller than the first angle.
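As a non-limiting sketch of the open/closed logic of paragraphs [0021] and [0022], the controller may compare a sensed folding angle against the first reference angle and drive the actuator across that threshold. The class and method names, the actuator interface, and the 10-degree margin below are hypothetical, not part of the disclosure.

```python
OPEN, CLOSED = "open", "closed"

class FoldController:
    """Illustrative open/closed-mode logic for one folding joint
    (headset handles, frame handles, or the headset-frame hinge)."""

    def __init__(self, first_angle_deg, actuator):
        self.first_angle = first_angle_deg   # threshold separating the two modes
        self.actuator = actuator             # e.g. the geared lightweight motor

    def mode(self, sensed_angle_deg):
        # Open mode: the angle exceeds the first reference angle; otherwise closed.
        return OPEN if sensed_angle_deg > self.first_angle else CLOSED

    def switch_to(self, target_mode, sensed_angle_deg):
        if self.mode(sensed_angle_deg) != target_mode:
            # Drive the actuator to the opposite side of the threshold;
            # the 10-degree margin is an invented illustrative value.
            goal = self.first_angle + (10 if target_mode == OPEN else -10)
            self.actuator.rotate_to(goal)
```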
[0023] If user authentication succeeds in the closed mode between the two handles of the folding headset device and between the two handles of the folding display curved frame, the controller can be configured to control the actuator so as to switch the two handles of the headset and the handles of the curved frame of the folding display to the open mode.
[0024] Likewise, if user authentication succeeds while the rotational angle of the circular motion between the folding headset device and the curved frame is in the closed mode, the controller can be configured to control the actuator so as to switch the headset and the curved frame to the open mode by the lightweight electric motor with a gearbox.
[0025] And, the user authentication may be performed based on at least one of a fingerprint input, a pattern input, an iris input, a touch input and a speech input.
[0026] More preferably, if an input signal is sensed in the open mode between the two handles of the folding device and between that of the curved frame of the folding display, the controller may be further configured to control the actuator unit to switch the folding headset and the curved frame handles to the closed mode.
[0027] Also, preferably, if an input signal is sensed between the folding headset and the curved frame in the angular motion mode in the open state, the controller for controlling the actuating system can be configured in such a way to switch over the open state of the folding headset and the curved frame to the closed mode by a lightweight electric motor with a gearbox.
[0028] Preferably, if an input signal is not sensed for the preset duration in the open state between the two handles of the folding headset and between the two handles of the curved frame, the controller can be configured for controlling the actuating system in such a way to switch over the handles of the folding headset and the handles of the curved frame of the folding display to the closed mode.
[0029] More preferably, if an input signal in the open state is not sensed for the preset duration in the rotational angle of the circular motion for the open state of the headset device and the curved frame, the controller for the actuating system can be configured in such a way to switch over the headset and curved frame to the closed mode by a lightweight electric motor with a gearbox.
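The auto-closing behaviour of paragraphs [0028] and [0029] amounts to an idle timer: any sensed input resets it, and expiry while in the open state triggers a switch to the closed mode. A minimal sketch follows, reusing the hypothetical FoldController above; the 30-second duration is an assumed preset, not from the disclosure.

```python
import time

class AutoCloser:
    """Sketch of the 'no input for a preset duration -> close' behaviour."""

    def __init__(self, controller, timeout_s=30.0):
        self.controller = controller          # a FoldController-like object
        self.timeout = timeout_s              # assumed preset duration
        self.last_input = time.monotonic()

    def on_input_signal(self):
        # Any sensed input signal resets the idle timer.
        self.last_input = time.monotonic()

    def tick(self, sensed_angle_deg):
        # Called periodically; closes the joint once the idle timer expires.
        idle = time.monotonic() - self.last_input
        if idle >= self.timeout and self.controller.mode(sensed_angle_deg) == "open":
            self.controller.switch_to("closed", sensed_angle_deg)
```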
[0030] More preferably, the actuator unit may include at least one of an electromagnet and a shape memory alloy.
[0031] In this case, if an event occurs in the closed mode between the two handles of the folding headset device and the two handles of the curved frame of the folding display, the controller can be configured to change the speed of switching the handles of the folding headset and the handles of the curved frame to the open mode based on the occurring conditions.
[0032] Also in this case, if an event occurs while the rotational angle of the circular motion between the folding headset device and the curved frame is in the closed mode, the controller may change the speed of switching the folding headset and the curved frame to the open state based on the occurring conditions, by means of the lightweight electric motor with a gearbox.
[0033] Moreover, when the two handles of the folding headset and the two handles of the folding curved frame are switched to the closed mode, if the folding angle is equal to or smaller than the second angle, the controller can be configured to control the actuating system so as to adjust the conversion speed with the relevant magnet modifier for at least one preset part between the folding headset handles and the handles of the curved frame.
[0034] Also, when the rotational angle of the circular motion between the folding headset and the curved frame is changed to the closed mode, if this angle is equal to or smaller than the second angle, the controller can be configured to control the system so as to adjust the conversion speed with the relevant magnet modifier for at least one preset part between the folding headset handles and the handles of the curved frame.
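Paragraphs [0033] and [0034] describe slowing the final travel of the closing motion once the angle reaches the second, smaller reference angle, using the magnet modifier. The sketch below is illustrative only; the speed values and the electromagnet interface are assumptions.

```python
def closing_speed(angle_deg, second_angle_deg, fast=90.0, slow=15.0):
    """Return an angular speed (deg/s) for the closing motion.

    Above the second reference angle the joint closes at full speed;
    at or below it, the variable-magnet brake slows the final travel.
    The 90 and 15 deg/s values are illustrative, not from the disclosure.
    """
    return fast if angle_deg > second_angle_deg else slow

def set_magnet_brake(electromagnet, angle_deg, second_angle_deg):
    # Energize the preset electromagnet only in the final closing range;
    # `electromagnet` is a hypothetical driver object with set_current().
    electromagnet.set_current(1.0 if angle_deg <= second_angle_deg else 0.0)
```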
[0035] Preferably, the folding headset and the curved frame can include a wireless communication unit transmitting/receiving data to/from a recorder (stylus pen).
[0036] In this case, the controller may be further configured to control the actuator unit to change the folding angle between the two handles of the folding headset and that of the folding display curved frame, based on a control signal received from the stylus pen.
[0037] In this case, the controller can also be configured for the actuator unit to change the rotational angle of the circular motion based on the received signal from the recorder and by the lightweight electric motor with a gearbox.
[0038] The altered angles between the two handles of the folding headset and the handles of the folding display curved frame or between the folding headset and the folding display curved frame can at least be determined according to the sensed duration and the input length of the user by the stylus recorder.
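Paragraph [0038] determines the altered angle from the sensed duration and the length of the stylus input. One possible mapping, with purely illustrative gains and limits:

```python
def angle_from_stylus(press_duration_s, input_length_mm,
                      min_angle=0.0, max_angle=120.0,
                      deg_per_second=30.0, deg_per_mm=2.0):
    """Map a stylus input to a target folding angle.

    Per the description, the altered angle is determined from the sensed
    duration and the input length of the stylus; the gains and angle
    limits used here are illustrative assumptions.
    """
    target = press_duration_s * deg_per_second + input_length_mm * deg_per_mm
    return max(min_angle, min(max_angle, target))   # clamp to the joint's range
```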
[0039] Also, when the space between the two handles of the folding headset and between the two handles of the folding display curved frame is changed to the closed form, if the stylus is located in the folding headset or the curved frame, the controller can be configured to control the actuating system in such a way that the magnets of the primary and secondary bodies are matched to that of the stylus.
[0040] If a change has occurred between the folding headset and the curved frame on the basis of the closed rotational angle of the circular motion, and the stylus is located in the folding headset or the curved frame, the controller can be configured to control the actuating system, by the lightweight electric motor with the gearbox, in such a way that the magnet of the primary and secondary bodies is matched to that of the stylus.
[0041] Preferably, if an input signal is sensed in the open state, the controller can be configured in such a way that the angle between the two handles of the folding headset and the two handles of the folding display curved frame is switched to a third angle.
[0042] Also, if an input signal is sensed in the open mode, the controller can be configured by a lightweight electric motor with a gearbox in such a way that the rotational angle of the circular motion between the folding headset and the curved frame is switched to a third angle.
[0043] In this case, the controller can be configured in such a way to transmit a preset content to the curved frame screen system for responding to the input signal.
[0044] Also, the controller, in this case, can be configured to transmit a preset content for responding to the input signal to the lightweight electric motor system with a gearbox to switch the rotational angle of the circular motion between the folding headset and the curved frame. [0045] In another aspect of the invention, as herein embodied and broadly described, and in accordance with another embodiment of the present invention, a method of controlling the two handles of the folding headset together with the two handles of the curved frame of the folding display may include determining change information on a folding angle between a first body and a second body, and controlling an actuator to change the folding angle according to the determined change information, wherein the first body supports a first display region, the second body supports a second display region, and the folding angle is an angle between the first body and the second body.
[0046] Also, in another aspect of this invention, as herein embodied and broadly described, the method for controlling the rotational angle of the circular motion of this folding headset together with the curved frame can include determining change information on the folding angle between the folding headset and the frame, and controlling an actuator, by the lightweight electric motor with a gearbox, to change the folding angle according to the determined change information, where the folding headset and the curved frame each support the lightweight electric motor with a gearbox, and where the rotational angle of the circular motion is the angle between the folding headset and the curved frame.
[0047] It is to be understood that both the foregoing general description and the following detailed description of the preferred embodiments of the present invention are exemplary and explanatory and are intended to provide further explanation of the invention as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1A is a block diagram of a mobile terminal according to an embodiment of the present disclosure;
FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions;
FIG. 2 is a block diagram of a folding headset with curved frame according to one embodiment of the present invention;
FIG. 3 is a perspective diagram of a folding headset with curved frame according to one embodiment of the present invention;
FIG. 4 is a diagram illustrating one example of an actuator unit of a folding headset with curved frame according to one embodiment of the present invention;
FIG. 5 is a diagram illustrating one example of an auto-opening through fingerprint recognition in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 6 is a diagram illustrating another example of an auto-opening through fingerprint recognition in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 7 is a diagram illustrating one example of an auto-opening using an angle and pressure in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 8 is a diagram illustrating one example of an auto-opening based on various user inputs in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 9 is a diagram illustrating one example of an auto-opening in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 10 is a diagram illustrating one example of an auto-closing in accordance with a voice input in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 11 is a diagram illustrating one example of an auto-closing in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 12 is a diagram illustrating another example of an auto-closing in accordance with a gesture input in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 13 is a diagram illustrating one example of an auto-closing in accordance with time expiration in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 14 is a diagram illustrating one example of an opening/closing speed of a folding headset with curved frame according to one embodiment of the present invention;
FIG. 15 is a diagram illustrating one example of controlling an opening/closing speed of a folding headset with curved frame using a variable magnetic field according to one embodiment of the present invention;
FIG. 16 is a diagram illustrating one example of controlling an opening/closing speed of a folding headset with curved frame using a variable magnetic field according to one embodiment of the present invention;
FIG. 17 is a diagram illustrating one example of controlling a folding angle in accordance with user's posture in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 18 is a diagram illustrating one example of controlling a folding angle in accordance with a scroll input in a folding headset with curved frame according to one embodiment of the present invention;
FIG. 19 is a diagram illustrating one example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention;
FIG. 20 is a diagram illustrating another example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention;
FIG. 21 is a diagram illustrating further example of controlling a folding angle of a folding headset with curved frame using a stylus pen according to one embodiment of the present invention;
FIG. 22 is a diagram illustrating one example of controlling a folding headset with curved frame in case of a stylus pen located within the foldable device according to one embodiment of the present invention; and
FIG. 23 is a flowchart for a method of controlling a folding headset with curved frame according to one embodiment of the present invention.
FIG. 24 is a schematic representation of one embodiment of a folding headset with curved frame, mounted virtual or augmented reality display.
FIG. 25 is a general perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display.
FIG. 26 is an exploded perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display, further illustrating one embodiment of a stereoscopic display system.
FIG. 27 is a general perspective rendering of one embodiment of the folding headset with curved frame, mounted virtual or augmented reality display, further illustrating one embodiment of an optical sensor system.
FIG. 28 is a perspective rendering of one embodiment of a wireless hand-held stylus pen controller.
FIG. 29 is a functional block diagram illustrating the basic components of one embodiment of a wireless hand-held inertial controller.
FIG. 30 is a graphical representation of one example of a possible field of view of one embodiment of a folding headset with curved frame, mounted virtual or augmented reality display.
FIG. 31 is a flowchart of one embodiment of a method for determining the location and orientation of a hand-held inertial controller with six degrees of freedom.
[0048] The present invention will become more fully understood from the detailed description given herein below and the accompanying drawings, which are given by illustration only, and thus are not limitative of the present invention, and wherein:
DETAILED DESCRIPTION OF THE INVENTION
[0049] Description will now be given in detail according to exemplary embodiments disclosed herein, with reference to the accompanying drawings. For the sake of brief description with reference to the drawings, the same or equivalent components may be provided with the same reference numbers, and description thereof will not be repeated. In general, a suffix such as "module" and "unit" may be used to refer to elements or components. Use of such a suffix herein is merely intended to facilitate description of the specification, and the suffix itself is not intended to give any special meaning or function. The accompanying drawings are used to help easily understand various technical features and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed to extend to any alterations, equivalents and substitutes in addition to those which are particularly set out in the accompanying drawings.
[0050] Although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are generally only used to distinguish one element from another. When an element is referred to as being "connected with" another element, the element can be connected with the other element or intervening elements may also be present. In contrast, when an element is referred to as being "directly connected with" another element, there are no intervening elements present.
[0051] The disclosed systems and methods must be clearly understood and appreciated as such descriptions of an invention are merely examples for easy understanding. The cell phone terminal block and the HMD glasses terminal block are supported, explained and understood within the headset body and the curved frame. In an embodiment, one terminal block can be conceptualized and applied as two terminal blocks; e.g. the terminal block modules of a cell phone and the HMD smart glasses can be integrated. In another embodiment, the cell phone module terminal block and the HMD smart glasses module terminal block are each considered and used separately. In this case, the available space and position within the folding headset itself and the curved frame are considered.
[0052] Mobile terminals presented herein may be implemented using a variety of different types of terminals. Examples of such terminals include cellular phones, smart phones, user equipment, laptop computers, digital broadcast terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigators, portable computers (PCs), slate PCs, tablet PCs, ultra books, and the like.
[0053] Further explanation is provided with reference to certain types of cell phone terminals and the HMD smart-glasses system only through the required non-limiting example, although the above-mentioned examples can be applied. [0054] By way of non-limiting example only, further description will be made with reference to particular types of mobile terminals. However, such teachings apply equally to other types of terminals, such as those types noted above. In addition, such teachings may also be applied to stationary terminals such as desktop computers, and the like.
[0055] Reference is now made to FIGS. 1A-1C, where FIG. 1A is a block diagram of a mobile terminal in accordance with the present disclosure, and FIGS. 1B and 1C are conceptual views of one example of the mobile terminal, viewed from different directions. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a sensing unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180, and a power supply unit 190. Implementing all of the illustrated components is not a requirement; greater or fewer components may alternatively be implemented. Referring now to FIG. 1A, the mobile terminal 100 is shown having wireless communication unit 110 configured with several commonly implemented components.
[0056] The wireless communication unit 110 typically includes one or more modules which permit communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, communications between the mobile terminal 100 and an external server. Further, the wireless communication unit 110 typically includes one or more modules which connect the mobile terminal 100 to one or more networks.
[0057] To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short- range communication module 114, and a location information module 115. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is one type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push key, a mechanical key, a soft key, and the like) for allowing a user to input information. Data (for example, audio, video, image, and the like) is obtained by the input unit 120 and may be analyzed and processed by controller 180 according to device parameters, user commands, and combinations thereof.
[0058] The sensing unit 140 is typically implemented using one or more sensors configured to sense internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information, and the like. For example, in FIG. 1A, the sensing unit 140 is shown having a proximity sensor 141 and an illumination sensor 142. If desired, the sensing unit 140 may alternatively or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger scan sensor, an ultrasonic sensor, an optical sensor (for example, camera 121), a microphone 122, a battery gauge, an environment sensor (for example, a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor, and a gas sensor, among others), and a chemical sensor (for example, an electronic nose, a health care sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to utilize information obtained from sensing unit 140, and in particular, information obtained from one or more sensors of the sensing unit 140, and combinations thereof.
[0059] The output unit 150 is typically configured to output various types of information, such as audio, video, tactile output, and the like. The output unit 150 is shown having a display unit 151, an audio output module 152, a haptic module 153, and an optical output module 154. The display unit 151 may have an inter-layered structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as function as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. A special module 98 is provided for the lightweight electric motor with a gearbox to raise and lower the curved frame, and a special cooling fan module 99 is provided to keep the terminal block and modules cool.
[0060] The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like. In some cases, the mobile terminal 100 may perform assorted control functions associated with a connected external device, in response to the external device being connected to the interface unit 160.
[0061] The memory 170 is typically implemented to store data to support various functions or features of the mobile terminal 100. For instance, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at time of manufacturing or shipping, which is typically the case for basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100.
[0062] The controller 180 typically functions to control overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 processes signals, data, and information input or output through the components mentioned in the foregoing description or runs an application program saved in the memory 170, thereby providing or processing an information or function appropriate for a user.
[0063] The controller 180 can provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components depicted in FIG. 1A, or activating application programs stored in the memory 170. As one example, the controller 180 controls some or all of the components illustrated in FIG. 1A according to the execution of an application program that has been stored in the memory 170.
[0064] The power supply unit 190 can be configured to receive external power or provide internal power in order to supply appropriate power required for operating elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. [0065] At least one portion of the respective components can cooperatively operate to implement operations, controls or controlling methods of a mobile terminal according to various embodiments of the present invention mentioned in the following description. The operations, controls or controlling methods of the mobile terminal can be implemented on the mobile terminal by running at least one application program saved in the memory 170.
[0066] Referring still to FIG. 1A , various components depicted in this figure will now be described in more detail. Regarding the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast associated information from an external broadcast managing entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be utilized to facilitate simultaneously receiving of two or more broadcast channels, or to support switching among broadcast channels.
[0067] The broadcast managing entity may be implemented using a server or system which generates and transmits a broadcast signal and/or broadcast associated information, or a server which receives a pre-generated broadcast signal and/or broadcast associated information, and sends such items to the mobile terminal. The broadcast signal may be implemented using any of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others. The broadcast signal in some cases may further include a data broadcast signal combined with a TV or radio broadcast signal.
[0068] The broadcast signal may be encoded according to any of a variety of technical standards or broadcasting methods (for example, International Organization for Standardization (ISO), International Electrotechnical Commission (IEC), Digital Video Broadcast (DVB), Advanced Television Systems Committee (ATSC), and the like) for transmission and reception of digital broadcast signals. The broadcast receiving module 111 can receive the digital broadcast signals using a method appropriate for the transmission method utilized.
[0069] Examples of broadcast associated information may include information associated with a broadcast channel, a broadcast program, a broadcast event, a broadcast service provider, or the like. The broadcast associated information may also be provided via a mobile communication network, and in this instance, received by the mobile communication module 112.
[0070] The broadcast associated information may be implemented in various formats. For instance, broadcast associated information may include an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H), and the like. Broadcast signals and/or broadcast associated information received via the broadcast receiving module 111 may be stored in a suitable device, such as a memory 170.
[0071] The mobile communication module 112 can transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like. Such network entities form part of a mobile communication network, which is constructed according to technical standards or communication methods for mobile communications (for example, Global System for Mobile Communication (GSM), Code Division Multi Access (CDMA), CDMA2000 (Code Division Multi Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like).
[0072] Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, or various formats of data to support communication of text and multimedia messages. The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be internally or externally coupled to the mobile terminal 100. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies.
[0073] Examples of such wireless Internet access include Wireless LAN (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), HSUPA (High Speed Uplink Packet Access), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of such wireless Internet technologies, and other Internet technologies as well.
[0074] In some embodiments, when the wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 performs such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or function as, the mobile communication module 112.
[0075] The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency IDentification (RFID), Infrared Data Association (IrDA), Ultra-WideBand (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like. The short-range communication module 114 in general supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal and a network where another mobile terminal 100 (or an external server) is located, via wireless area networks. One example of the wireless area networks is a wireless personal area network.
[0076] In some embodiments, another mobile terminal (which may be configured similarly to mobile terminal 100) may be a wearable device, for example, a smart watch, a smart glass or a head mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short- range communication module 114 may sense or recognize the wearable device, and permit communication between the wearable device and the mobile terminal 100. In addition, when the sensed wearable device is a device which is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause transmission of data processed in the mobile terminal 100 to the wearable device via the short-range communication module 114. Hence, a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user may answer the call using the wearable device. Also, when a message is received in the mobile terminal 100, the user can check the received message using the wearable device.
[0077] The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. As an example, the location information module 115 includes a Global Position System (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally function with any of the other modules of the wireless communication unit 110 to obtain data related to the position of the mobile terminal. As one example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal can be acquired based on information related to a wireless access point (AP) which transmits or receives a wireless signal to or from the Wi-Fi module.
[0078] The input unit 120 may be configured to permit various types of input to the mobile terminal 100. Examples of such input include audio, image, video, data, and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process image frames of still pictures or video obtained by image sensors in a video or image capture mode. The processed image frames can be displayed on the display unit 151 or stored in memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to permit a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement to acquire left and right images for implementing a stereoscopic image.
[0079] The microphone 122 is generally implemented to permit audio input to the mobile terminal 100. The audio input can be processed in various manners according to a function being executed in the mobile terminal 100. If desired, the microphone 122 may include assorted noise removing algorithms to remove unwanted noise generated in the course of receiving the external audio.
[0080] The user input unit 123 is a component that permits input by a user. Such user input may enable the controller 180 to control operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch, and the like), or a touch-sensitive input, among others. As one example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen through software processing, or a touch key which is located on the mobile terminal at a location that is other than the touch screen. Further, the virtual key or the visual key may be displayed on the touch screen in various shapes, for example, graphic, text, icon, video, or a combination thereof.
[0081] The sensing unit 140 is generally configured to sense one or more of internal information of the mobile terminal, surrounding environment information of the mobile terminal, user information, or the like. The controller 180 generally cooperates with the sensing unit 140 to control operation of the mobile terminal 100 or execute data processing, a function or an operation associated with an application program installed in the mobile terminal based on the sensing provided by the sensing unit 140. The sensing unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail. [0082] The proximity sensor 141 may include a sensor to sense presence or absence of an object approaching a surface, or an object located near a surface, by using an electromagnetic field, infrared rays, or the like without a mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen.
[0083] The proximity sensor 141, for example, may include any of a transmissive type photoelectric sensor, a direct reflective type photoelectric sensor, a mirror reflective type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance type proximity sensor, a magnetic type proximity sensor, an infrared rays proximity sensor, and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can sense proximity of a pointer relative to the touch screen by changes of an electromagnetic field, which is responsive to an approach of an object with conductivity. In this instance, the touch screen (touch sensor) may also be categorized as a proximity sensor.
[0084] The term "proximity touch" will often be referred to herein to denote the scenario in which a pointer is positioned to be proximate to the touch screen without contacting the touch screen. The term "contact touch" will often be referred to herein to denote the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such position will correspond to a position where the pointer is perpendicular to the touch screen. The proximity sensor 141 may sense proximity touch, and proximity touch patterns (for example, distance, direction, speed, time, position, moving status, and the like).
[0085] In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns sensed by the proximity sensor 141, and causes output of visual information on the touch screen. In addition, the controller 180 can control the mobile terminal 100 to execute different operations or process different data according to whether a touch with respect to a point on the touch screen is either a proximity touch or a contact touch.
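To illustrate paragraph [0085], a controller might dispatch proximity touches and contact touches to different operations. The event fields and handler names below are hypothetical, not part of the disclosure.

```python
def handle_touch_event(event, controller):
    """Dispatch proximity vs. contact touches to different operations,
    as the description allows the controller to process them differently.

    `event` is assumed to be a dict with a 'kind' field and coordinates;
    the handler methods on `controller` are illustrative names.
    """
    if event["kind"] == "proximity":
        # Pointer hovers near the screen: e.g. show hover feedback only.
        controller.show_hover_feedback(event["x"], event["y"], event["distance"])
    elif event["kind"] == "contact":
        # Physical contact: commit the action at the touched position.
        controller.activate_at(event["x"], event["y"], event["pressure"])
```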
[0086] A touch sensor can sense a touch applied to the touch screen, such as display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As one example, the touch sensor may be configured to convert changes of pressure applied to a specific part of the display unit 151, or convert capacitance occurring at a specific part of the display unit 151, into electric input signals. The touch sensor may also be configured to sense not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor. Examples of typical touch objects include a finger, a touch pen, a stylus pen, a pointer, or the like.
[0087] When a touch input is sensed by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller may process the received signals, and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can sense which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or combinations thereof. [0088] In some embodiments, the controller 180 can execute the same or different controls according to a type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same or different control according to the object which provides a touch input may be decided based on a current operating state of the mobile terminal 100 or a currently executed application program, for example.
[0089] The touch sensor and the proximity sensor may be implemented individually, or in combination, to sense various types of touches. Such touches include a short (or tap) touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch, and the like.
[0090] If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180, for example, may calculate a position of a wave generation source based on information sensed by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time for which the light reaches the optical sensor is much shorter than the time for which the ultrasonic wave reaches the ultrasonic sensor. The position of the wave generation source may be calculated using this fact. For instance, the position of the wave generation source may be calculated using the time difference from the time that the ultrasonic wave reaches the sensor based on the light as a reference signal.
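The computation in paragraph [0090] can be made concrete: light reaches the illumination sensor effectively instantly, so it serves as the time-zero reference, and the ultrasonic wave's extra travel time multiplied by the speed of sound gives the distance to the wave source. A minimal sketch, assuming a speed of sound of 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C (assumption)

def distance_from_wave_source(t_light_s, t_ultrasound_s):
    """Distance to the wave generation source from one sensor pair.

    The light arrival serves as the reference signal (time zero); the
    ultrasonic wave's extra travel time, times the speed of sound,
    gives the distance from the ultrasonic sensor to the source.
    """
    return SPEED_OF_SOUND * (t_ultrasound_s - t_light_s)

# Example: the ultrasonic wave arrives 2.5 ms after the light flash.
print(distance_from_wave_source(0.0, 0.0025))  # ~0.86 m
```

With distances to two or more ultrasonic sensors at known positions, the position of the wave generation source then follows by trilateration.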
[0091] The camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photo sensor (or image sensor), and a laser sensor. Implementing the camera 121 with a laser sensor may allow detection of a touch of a physical object with respect to a 3D stereoscopic image. The photo sensor may be laminated on, or overlapped with, the display device. The photo sensor may be configured to scan movement of the physical object in proximity to the touch screen. In more detail, the photo sensor may include photo diodes and transistors at rows and columns to scan content received at the photo sensor using an electrical signal which changes according to the quantity of applied light. Namely, the photo sensor may calculate the coordinates of the physical object according to variation of light to thus obtain position information of the physical object.
[0092] The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100 or user interface (UI) and graphic user interface (GUI) information in response to the execution screen information.
[0093] In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glass scheme), an auto-stereoscopic scheme (glassless scheme), a projection scheme (holographic scheme), or the like.
[0094] In general, a 3D stereoscopic image may include the left image (e.g., the left eye image) and the right image (e.g., the right eye image). According to how left and right images are combined into a 3D stereoscopic image, a 3D stereoscopic imaging method can be divided into a top-down method in which left and right images are located up and down in a frame, an L-to-R (left-to-right or side by side) method in which left and right images are located left and right in a frame, a checker board method in which fragments of left and right images are located in a tile form, an interlaced method in which left and right images are alternately located by columns or rows, and a time sequential (or frame by frame) method in which left and right images are alternately displayed on a time basis.
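Several of the packing methods listed in paragraph [0094] can be expressed directly as array operations. In the sketch below the checkerboard tiles are reduced to single pixels for brevity, and the time-sequential method is omitted since it alternates whole frames over time rather than within one frame; all names are illustrative.

```python
import numpy as np

def pack_stereo(left, right, method="side_by_side"):
    """Combine left- and right-eye images per the packing methods above.

    `left` and `right` are HxWx3 uint8 arrays of equal shape.
    """
    if method == "top_down":
        return np.vstack([left, right])      # left image above the right
    if method == "side_by_side":
        return np.hstack([left, right])      # L-to-R within one frame
    if method == "interlaced_rows":
        out = left.copy()
        out[1::2] = right[1::2]              # alternate rows per eye
        return out
    if method == "checkerboard":
        out = left.copy()
        mask = (np.add.outer(np.arange(left.shape[0]),
                             np.arange(left.shape[1])) % 2).astype(bool)
        out[mask] = right[mask]              # alternate tiles (here, pixels)
        return out
    raise ValueError(f"unknown packing method: {method}")
```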
[0095] Also, as for a 3D thumbnail image, the left image thumbnail and the right image thumbnail can be generated from the left image and the right image of an original image frame, respectively, and then combined to generate a single 3D thumbnail image. In general, the term "thumbnail" may be used to refer to a reduced image or a reduced still image. A generated left image thumbnail and right image thumbnail may be displayed with a horizontal distance difference there between by a depth corresponding to the disparity between the left image and the right image on the screen, thereby providing a stereoscopic space sense.
[0096] The left image and the right image required for implementing a 3D stereoscopic image may be displayed on the stereoscopic display unit using a stereoscopic processing unit. The stereoscopic processing unit can receive the 3D image and extract the left image and the right image, or can receive the 2D image and change it into the left image and the right image.
[0097] The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any of a number of different sources, such that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a record mode, a voice recognition mode, a broadcast reception mode, and the like. The audio output module 152 can provide audible output related to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer, or the like.
[0098] A haptic module 153 can be configured to generate various tactile effects that a user feels, perceives, or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration. The strength, pattern and the like of the vibration generated by the haptic module 153 can be controlled by user selection or setting by the controller. For example, the haptic module 153 may output different vibrations in a combining manner or a sequential manner.
[0099] Besides vibration, the haptic module 153 can generate various other tactile effects, including an effect by stimulation such as a pin arrangement vertically moving to contact skin, a spray force or suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, electrostatic force, an effect by reproducing the sense of cold and warmth using an element that can absorb or generate heat, and the like.
[0100] The haptic module 153 can also be implemented to allow the user to feel a tactile effect through a muscle sensation such as the user's fingers or arm, as well as transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. [0101] An optical output module 154 can output a signal for indicating an event generation using light of a light source. Examples of events generated in the mobile terminal 100 may include message reception, call signal reception, a missed call, an alarm, a schedule notice, an email reception, information reception through an application, and the like.
[0102] A signal output by the optical output module 154 may be implemented so the mobile terminal emits monochromatic light or light with a plurality of colors. The signal output may be terminated as the mobile terminal senses that a user has checked the generated event, for example.
[0103] The interface unit 160 serves as an interface for external devices to be connected with the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such external device. The interface unit 160 may include wired or wireless headset ports, external power supply ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like.
[0104] The identification module may be a chip that stores various information for authenticating authority of using the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, the device having the identification module (also referred to herein as an "identifying device") may take the form of a smart card. Accordingly, the identifying device can be connected with the terminal 100 via the interface unit 160.
[0105] When the mobile terminal 100 is connected with an external cradle, the interface unit 160 can serve as a passage to allow power from the cradle to be supplied to the mobile terminal 100 or may serve as a passage to allow various command signals input by the user from the cradle to be transferred to the mobile terminal there through. Various command signals or power input from the cradle may operate as signals for recognizing that the mobile terminal is properly mounted on the cradle.
[0106] The memory 170 can store programs to support operations of the controller 180 and store input/output data (for example, phonebook, messages, still images, videos, etc.). The memory 170 may store data related to various patterns of vibrations and audio which are output in response to touch inputs on the touch screen.
[0107] The memory 170 may include one or more types of storage mediums including a Flash memory, a hard disk, a solid state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. The mobile terminal 100 may also be operated in relation to a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. [0108] The controller 180 can typically control the general operations of the mobile terminal 100. For example, the controller 180 can set or release a lock state for restricting a user from inputting a control command with respect to applications when a status of the mobile terminal meets a preset condition.
[0109] The controller 180 can also perform the controlling and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwriting input or a picture drawing input performed on the touch screen as characters or images, respectively. In addition, the controller 180 can control one or a combination of those components in order to implement various exemplary embodiments disclosed herein.
[0110] The power supply unit 190 receives external power or provides internal power and supplies the appropriate power required for operating respective elements and components included in the mobile terminal 100. The power supply unit 190 may include a battery, which is typically rechargeable or detachably coupled to the terminal body for charging. The power supply unit 190 may include a connection port. The connection port may be configured as one example of the interface unit 160 to which an external charger for supplying power to recharge the battery is electrically connected.
[0111] As another example, the power supply unit 190 may be configured to recharge the battery in a wireless manner without use of the connection port. In this example, the power supply unit 190 can receive power, transferred from an external wireless power transmitter, using at least one of an inductive coupling method which is based on magnetic induction or a magnetic resonance coupling method which is based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium, or similar medium using, for example, software, hardware, or any combination thereof.
[0112] Referring to FIGS. 24-28 generally, a folding headset with curved frame HMD takes the form of wearable glasses or goggles, but it will be appreciated that other forms are possible. The folding headset with curved frame HMD may be configured in an augmented reality configuration to present an augmented reality environment, and thus may include an at least partially see-through stereoscopic display 5012 that may be configured to visually augment an appearance of a physical environment being viewed by the user through the at least partially see-through stereoscopic display 5012. In some examples, the at least partially see-through stereoscopic display 5012 may include one or more regions that are transparent (e.g., optically clear) and may include one or more regions that are opaque or semi-transparent. In other examples, the at least partially see-through stereoscopic display 5012 may be transparent (e.g., optically clear) across an entire usable display surface of the stereoscopic display 5012. Alternatively, the folding headset with curved frame HMD may be configured in a virtual reality configuration to present a full virtual reality environment, and thus the stereoscopic display 5012 may be a non-see-through stereoscopic display. The folding headset with curved frame HMD may be configured to display virtual three dimensional environments to the user via the non-see-through stereoscopic display. The folding headset with curved frame HMD may be configured to display a virtual representation such as a three dimensional graphical rendering of the physical environment in front of the user that may include additional virtual objects or may be configured to display camera-captured images of the physical environment along with additional virtual objects including the virtual cursor overlaid on the camera-captured images. [0113] For example, the folding headset with curved frame HMD may include an image production system 5014 that is configured to display virtual objects to the user with the stereoscopic display 5012. In the augmented reality configuration with an at least partially see-through display, the virtual objects are visually superimposed onto the physical environment that is visible through the display so as to be perceived at various depths and locations. In the virtual reality configuration, the image production system 5014 may be configured to display virtual objects to the user with the non-see-through stereoscopic display, such that the virtual objects are perceived to be at various depths and locations relative to one another. In one embodiment, the folding headset with curved frame HMD may use stereoscopy to visually place a virtual object at a desired depth by displaying separate images of the virtual object to both of the user's eyes. Using this stereoscopy technique, the folding headset with curved frame HMD may control the displayed images of the virtual objects, such that the user will perceive that the virtual objects exist at a desired depth and location in the viewed physical environment. In one example, the virtual object may be a virtual cursor that is displayed to the user, such that the virtual cursor appears to the user to be located at a desired location in the virtual three dimensional environment. In the augmented reality configuration, the virtual object may be a holographic cursor that is displayed to the user, such that the holographic cursor appears to the user to be located at a desired location in the real world physical environment.
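The stereoscopy technique of paragraph [0113] rests on the classic relation disparity = focal length × interpupillary distance / depth: the horizontal offset between the left-eye and right-eye images of an object determines its perceived depth. A sketch with assumed interpupillary-distance and focal-length values, not taken from the disclosure:

```python
def pixel_disparity(depth_m, ipd_m=0.063, focal_px=1200.0):
    """Horizontal offset (in pixels) between the left- and right-eye
    renderings that makes a virtual object appear at `depth_m`.

    Classic stereoscopy relation: disparity = f * IPD / depth.
    The IPD (63 mm) and focal length are illustrative assumptions.
    """
    return focal_px * ipd_m / depth_m

# A holographic cursor placed at 2 m needs ~38 px of disparity;
# moving it to 0.5 m raises that to ~151 px, so it appears nearer.
print(pixel_disparity(2.0), pixel_disparity(0.5))
```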
[0114] The folding headset with curved frame HMD includes an optical sensor system 5016 that may include one or more optical sensors. In one example, the optical sensor system 5016 includes an outward facing optical sensor 5018 that may be configured to detect the real-world background from a similar vantage point (e.g., line of sight) as observed by the user through the at least partially see-through stereoscopic display 5012. The optical sensor system 5016 may additionally include an inward facing optical sensor 5020 that may be configured to detect a gaze direction of the user's eye. It will be appreciated that the outward facing optical sensor 5018 may include one or more component sensors, including an RGB camera and a depth camera. The RGB camera may be a high definition camera or have another resolution. The depth camera may be configured to project non-visible light, such as infrared (IR) radiation, and capture reflections of the projected light, and based thereon, generate an image comprised of measured depth data for each pixel in the image. This depth data may be combined with color information from the image captured by the RGB camera, into a single image representation including both color data and depth data, if desired. In a virtual reality configuration, the color and depth data captured by the optical sensor system 5016 may be used to perform surface reconstruction and generate a virtual model of the real world background that may be displayed to the user via the display 5012. Alternatively, the image data captured by the optical sensor system 5016 may be directly presented as image data to the user on the display 5012.
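As a minimal illustration of combining the RGB image with the depth image into a single representation, the sketch below assumes the two cameras are already registered pixel-for-pixel, an assumption the source does not state:

```python
import numpy as np

def fuse_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:
    """Stack an HxWx3 color image and an HxW depth map into one HxWx4
    RGB-D array (color data plus measured depth per pixel)."""
    assert rgb.shape[:2] == depth.shape, "images must be registered"
    return np.dstack([rgb.astype(np.float32), depth.astype(np.float32)])
```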
[0115] The folding headset with curved frame HMD may further include a position sensor system 5022 that may include one or more position sensors, such as one or more inertial measurement units (IMUs) that incorporate a 3-axis accelerometer, a 3-axis gyroscope and/or a 3-axis magnetometer, global positioning system(s), multilateration tracker(s), and/or other sensors that output position sensor information usable to determine a position, orientation, and/or movement of the relevant sensor.
[0116] Optical sensor information received from the optical sensor system 5016 and/or position sensor information received from the position sensor system 5022 may be used to assess a position and orientation of the vantage point of the folding headset with curved frame HMD relative to other environmental objects. In some embodiments, the position and orientation of the vantage point may be characterized with six degrees of freedom (e.g., world-space X, Y, Z, θpitch, θyaw and θroll). The vantage point may be characterized globally or independent of the real-world background. The position and/or orientation may be determined with an on-board computing system (e.g., on-board computing system 5024) and/or an off-board computing system. Typically, frames of reference of all sensors located on board the folding headset with curved frame HMD are factory aligned and calibrated to resolve six degrees of freedom relative to world-space.
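The six-degree-of-freedom characterization above can be summarized by a simple container; this is an illustrative data structure, not the patent's internal representation:

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    # World-space position (meters)
    x: float
    y: float
    z: float
    # Orientation (radians): pitch (elevation), yaw (azimuth), roll (rotation)
    pitch: float
    yaw: float
    roll: float
```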
[0117] Furthermore, the optical sensor information and the position sensor information may be used by a computing system to perform analysis of the real-world background, such as depth analysis, surface reconstruction, environmental color and lighting analysis, or other suitable operations. In particular, the optical and positional sensor information may be used to create a virtual model of the real-world background. In some embodiments, the position and orientation of the vantage point may be characterized relative to this virtual space. Moreover, the virtual model may be used to determine positions of virtual objects in the virtual space and add additional virtual objects to be displayed to the user.
[0118] Additionally, the optical sensor information received from the optical sensor system 5016 may be used to identify and track objects in the field of view of optical sensor system 5016. For example, depth data captured by optical sensor system 5016 may be used to identify and track motion of a user's hand. The tracked motion may include movement of the user's hand in three-dimensional space, and may be characterized with six degrees of freedom (e.g., world-space X, Y, Z, θpitch, θyaw and θroll). The tracked motion may also be used to identify and track a hand gesture made by the user's hand. For example, one identifiable hand gesture may be moving a forefinger upwards or downwards. It will be appreciated that other methods may be used to identify and track motion of the user's hand. For example, optical tags may be placed at known locations on the user's hand or a glove worn by the user, and the optical tags may be tracked through the image data captured by optical sensor system 5016.
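For the forefinger-up/down gesture mentioned above, a hedged sketch of the classification logic might look as follows; the window length and threshold are assumptions:

```python
from typing import Optional

def forefinger_gesture(fingertip_heights: list, thresh: float = 0.03) -> Optional[str]:
    """Classify a tracked fingertip's vertical motion (meters) over a short
    window as an upward or downward forefinger gesture."""
    if len(fingertip_heights) < 2:
        return None
    dy = fingertip_heights[-1] - fingertip_heights[0]
    if dy > thresh:
        return "forefinger_up"
    if dy < -thresh:
        return "forefinger_down"
    return None
```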
[0119] It will be appreciated that the following examples and methods may be applied to both a virtual reality and an augmented reality configuration of the folding headset with curved frame HMD . In a virtual reality configuration, the display 5012 of the folding headset with curved frame HMD is a non-see-through display, and the three dimensional environment is a virtual environment displayed to the user. The virtual environment may be a virtual model generated based on image data captured of the real-world background by optical sensor system 5016 of the folding headset with curved frame HMD .
[0120] One example of a folding headset with curved frame HMD is a pair of mixed reality head-mounted smartglasses. The folding headset with curved frame has see-through holographic lenses that use an advanced optical projection system to generate multi-dimensional full-color holograms with very low latency, so a user can see holographic objects in a real world setting.
[0121] Located at the front of the curved frame are sensors and related hardware, including cameras and processors. The curved frame also incorporates an inertial measurement unit (IMU), which includes an accelerometer, gyroscope, and a magnetometer, four "environment understanding" sensors, an energy-efficient depth camera with a 120°×120° angle of view, a forward-facing 2.4-megapixel photographic video camera, a four-microphone array, and an ambient light sensor. The curved frame contains advanced sensors to capture information about what the user is doing and the environment the user is in. The built-in cameras also enable a user to record (mixed reality capture (MRC)) HD pictures and video of the holograms in the surrounding world to share with others.
[0122] Enclosed within the visor is a pair of transparent combiner lenses, in which the projected images are displayed in the lower half. The folding headset with curved frame must be calibrated to the interpupillary distance (IPD), or accustomed vision of the user.
[0123] There is a pair of 3D speakers, and external sounds can be heard through the speakers after being received by a microphone, allowing the user to hear virtual sounds together with the ambient sound. In other words, head-related functions are applied by the folding headset to create binaural sound, which can simulate spatial effects for the user, so that a sound can be perceived as if coming from a specific real or virtual location.
[0124] In addition to a central processing unit (CPU) and graphics processing unit (GPU), the curved frame features a custom-made holographic processing unit (HPU), a coprocessor manufactured specifically for the folding headset with curved frame. The main purpose of the HPU is processing and integrating data from the sensors, as well as handling tasks such as spatial mapping, gesture recognition, and voice and speech recognition. The HPU processes terabytes of information from the folding headset with curved frame's sensors in real time.
[0125] The lenses of the curved frame use optical waveguides to color blue, green, and red across three different layers, each with diffractive features. A light engine above each combiner lens projects light into the lens; the light then hits a diffractive element and is reflected repeatedly along a waveguide until it is output to the eye. Similar to that of many other optical head-mounted displays, the display projection for the curved frame occupies a limited portion of the user's field of view (FOV), particularly in comparison to virtual reality head-mounted displays, which typically cover a much greater field of view.
[0126] The folding headset with curved frame contains an internal rechargeable battery, but can be operated while charging. The folding headset with curved frame also features IEEE 802.11ac Wi-Fi and Bluetooth 4.1 Low Energy (LE) wireless connectivity.
[0127] With the folding headset with curved frame, a user can create and shape holograms with gestures, communicate with apps using voice commands, and navigate with a glance, hand gestures, Controllers and/or other pointing devices. The folding headset with curved frame understands gestures, gaze, and voice, enabling the user to interact in the most natural way possible. With spatial sound, the folding headset with curved frame synthesizes sound so the user can hear holograms from anywhere in the room, even if they are behind the user.

[0128] As mentioned above, the folding headset with curved frame includes a depth camera, which is capable of detecting the 3D location of objects located within the depth camera's FOV. Technical details of exactly how the depth camera accomplishes such detection are known to those skilled in the art, but are not necessary for the present disclosure. Suffice it to say that the depth camera is able to accurately detect, on a pixel-by-pixel basis, the exact 3D location of each point on a physical object within the camera's field of view. While the folding headset with curved frame uses a depth camera, stereoscopic optics can also be used to detect the distance of objects from the HMD and the locations of such objects in 3D space via triangulation. In either event, such sensors can detect the 3D location (x, y and z coordinates) of real objects located within the FOV relative to the HMD. In the case of a Controller, the depth camera of the HMD can be used to detect the 3D location of the Controller relative to the HMD.
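The pixel-by-pixel 3D detection described above is conventionally a pinhole back-projection; the sketch below assumes calibrated intrinsics (fx, fy, cx, cy), which are not given in the source:

```python
def pixel_to_3d(u: int, v: int, depth_m: float,
                fx: float, fy: float, cx: float, cy: float) -> tuple:
    """Back-project a depth pixel (u, v) with measured depth into camera-space
    (x, y, z) coordinates using assumed pinhole intrinsics."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```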
[0129] Wireless Hand-Held Inertial Controllers.
[0130] As previously mentioned, the folding headset with curved frame has the ability to track the movement of a user's hands through space and to identify and interpret a variety of hand poses, gestures and movements to manipulate virtual objects in the AR space. Additional details regarding hand tracking, hand gesture identification, classification and recognition, and hand pose identification are known to those skilled in the art.
[0131] The manual wireless controllers of the HMD glasses are pen-like.
[0132] One of the challenges with hand tracking and gesture recognition, however, is that they can require a relatively high level of processing overhead. To reduce such overhead, it can be useful to provide a Controller that can communicate with the folding headset with curved frame HMD and allow manipulation of objects in the AR space. For example, in the case of the folding headset with curved frame, the headset uses Bluetooth LE to pair with a Controller.
[0133] Referring to FIGS. 28 and 29, Controller 5040 can include an on-board microcontroller 5042, its own IMU 5044, a communications radio 5046, a rechargeable battery (not shown), and one or more status LEDs 5048. The IMU typically includes a 3-axis accelerometer and a 3-axis gyroscope, and may also include a magnetometer. User inputs and orientation data (pitch, yaw and roll) derived from the IMU can be wirelessly communicated by the microcontroller 5042 to the CPU of the folding headset with curved frame HMD via wireless radio 5046. Controller 5040 can also include one or more momentary switches 5050 for selective activation by the user to control a virtual cursor and/or to manipulate virtual objects in various ways (such as, for example, select, move, rotate, scroll, etc.). Controller 5040 can also include an elastic finger loop (for holding the device) and a USB 2.0 micro-B receptacle for charging the internal battery.
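As an illustration of the Controller-to-HMD report described above, the sketch below assumes a simple wire format (three orientation angles plus a button bitmask); this layout is hypothetical, not the actual radio protocol:

```python
import struct

def pack_controller_report(pitch: float, yaw: float, roll: float, buttons: int) -> bytes:
    """Serialize one IMU orientation sample and the momentary-switch states
    (assumed little-endian layout: three floats plus one byte)."""
    return struct.pack("<fffB", pitch, yaw, roll, buttons)

def unpack_controller_report(payload: bytes):
    """Inverse of pack_controller_report, for the HMD side."""
    return struct.unpack("<fffB", payload)
```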
[0134] From the accelerometer and gyroscope, the IMU 5044 can detect the orientation of the Controller 5040 , but only with three degrees of freedom, namely, pitch (elevation angle), yaw (azimuth angle) and roll (rotation). Because the accelerometer can detect the gravity vector, the vertical axis of the frame of reference of the Controller 5040 is easily identified and aligned. Similarly, the gyroscopes of the IMU 5044 can readily detect the horizontal plane and, therefore, the horizontal plane is readily identified and aligned. If the IMU 5044 also includes a magnetometer, then magnetic north can readily be identified and the frame of reference of the Controller 5040 can be north aligned. If both the IMU of the folding headset with curved frame HMD and the IMU 5044 of the Controller 5040 include a magnetometer, then the frame of reference of the Controller 5040 will automatically be aligned with the folding headset with curved frame HMD's frame of reference.
[0135] If the IMU 5044 of the Controller 5040 does not include a magnetometer, then the IMU 5044 arbitrarily assigns an x-axis when it powers up and then continuously tracks azimuth changes (angular rotation in the horizontal plane) from that initial frame of reference. In that case, the frame of reference of the Controller 5040 will need to be aligned with or calibrated to the folding headset with curved frame HMD's frame of reference, as discussed in more detail below.
[0136] FIG. 30 illustrates an augmented reality configuration of a folding headset with curved frame HMD worn by a user 5026, displaying a virtual cursor, which is a holographic cursor 5028 in this example, on the at least partially see-through stereoscopic display 5012 so as to appear at a location 5030 in a three dimensional environment 5032. In the specific example shown in FIG. 30, the three dimensional environment 5032 is a room in the real world, and the holographic cursor 5028 is displayed on the at least partially see-through stereoscopic display such that the holographic cursor 5028 appears to the user 5026 to be hovering in the middle of the room at the location 5030. It will be appreciated that the location 5030 for the holographic cursor 5028 may be calculated based on a variety of suitable methods. For example, the location 5030 may be calculated based on a predetermined distance and orientation relative to the user 5026, such as being two feet in front of the user 5026 as one specific example.
[0137] As another non-limiting example, the location 5030 may be calculated based on a detected gaze direction 5034 and a recognized object that intersects with the detected gaze direction. In this example, the recognized object may be a real object in the three dimensional environment. This example is illustrated in FIG. 30, with the recognized object being the wall 5036 that is a part of the room that serves as the three dimensional environment 5032. Accordingly, the intersection between the wall 5036 and the detected gaze direction 5034 of the user 5026 may be used to calculate the location 5030 for the holographic cursor 5028. It may be advantageous to further ensure that the holographic cursor 5028 is displayed to the user 5026 such that the holographic cursor 5028 is easily visible to the user 5026. For example, to increase visibility, the location 5030 of the holographic cursor 5028 may be placed a threshold distance away from the recognized object to prevent the holographic cursor 5028 from being occluded by any protrusions of the recognized object. Additionally, it may be advantageous to further calculate the location 5030 of the holographic cursor 5028 based on a plane that is orthogonal to the detected gaze direction 5034 of the user 5026. By placing the location 5030 of the holographic cursor 5028 on such a plane, a consistent view of the holographic cursor 5028 may be maintained even as the user changes gaze direction.
[0138] Additionally, in the example illustrated in FIG. 30, the folding headset with curved frame HMD worn by the user 5026 may be configured to detect motion of the user's hand. Based on a series of images captured by the optical sensor system 5016, the folding headset with curved frame HMD may determine whether motion of hand 5038 of the user 5026 is trackable. For example, the user's hand at positions 5038 and 5038A is within the field of view of the optical sensor system 5016. Accordingly, motion of the user's hand moving from position 5038 to position 5038A over time T1 is trackable by the folding headset with curved frame HMD. However, as position 5038B may be outside of the field of view of the optical sensor system 5016, motion of the user's hand moving from position 5038A to position 5038B over time T2 may not be trackable by the folding headset with curved frame HMD. It will be appreciated that the user's hand is determined to be trackable by the HMD when the HMD can monitor the hand for gesture input. Thus, the user's hand is deemed to be trackable, for example, when computer algorithms implemented in software executed on the processor of the folding headset with curved frame HMD identify the hand in images captured by the onboard camera and begin tracking the hand, until a point in time at which those algorithms lose track of the hand. Techniques that may be used to track the hand include searching for regions of similar color values and segmenting a portion of the image based on the color values from the rest of the image, as well as searching for regions of pixels that have changed, indicating foreground movement by a hand or other object. When depth information is available, the hand may be located using skeletal tracking techniques in addition or as an alternative to the above. A hand may be determined to be trackable when a confidence degree output by the algorithm indicates that the hand is being tracked with above a predetermined threshold level of confidence.
[0139] In the above embodiment, the folding headset with curved frame HMD communicates to the user whether motion of the user's hand is trackable. In this embodiment, in response to at least determining that motion of the hand is trackable, the folding headset with curved frame HMD modifies the visual appearance of the holographic cursor to indicate that motion of the hand is trackable. In the example illustrated in FIG. 30, the visual appearance of the holographic cursor is modified to appear as holographic cursor 5028, which is an unfilled circle. Accordingly, as the user moves the hand from position 5038 to position 5038A over time T1, the user is shown the holographic cursor having visual appearance 5028 and is thus provided with the feedback that motion of the user's hand is currently trackable, and any hand gestures or hand movements will be tracked by the folding headset with curved frame HMD.
[0140] Further in this embodiment, in response to at least determining that motion of the hand is not trackable, the folding headset with curved frame HMD modifies the visual appearance of the holographic cursor to indicate that motion of the hand is not trackable. As illustrated in FIG. 30, the visual appearance of the holographic cursor may be modified to appear as holographic cursor 5028 A, which has a different visual appearance than holographic cursor 5028 . In this example, the visual appearance of holographic cursor 5028 A is a filled circle. Accordingly, as the user moves the hand from position 5038 A to position 5038 B over time T 2 , the user is shown holographic cursor having visual appearance 5028 A and is thus provided with the feedback that motion of the user's hand is not currently trackable. It will be appreciated that while the example illustrated in FIG. 30 modifies the visual appearance of the holographic cursor to appear as a filled or unfilled circle, any suitable visual modification is possible. As a few other non-limiting examples, the visual appearance of the holographic cursor may be modified by changing a color, changing a shape, adding or removing an icon, or changing a size of the holographic cursor.
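The trackability feedback described in the last two paragraphs reduces to a threshold test plus a cursor-style switch; a minimal sketch follows, with an assumed confidence threshold:

```python
TRACK_CONFIDENCE_THRESHOLD = 0.8   # assumed value, not specified in the source

def cursor_appearance(track_confidence: float) -> str:
    """Unfilled circle (cursor 5028) while the hand is trackable,
    filled circle (cursor 5028A) once tracking is lost."""
    trackable = track_confidence >= TRACK_CONFIDENCE_THRESHOLD
    return "unfilled_circle" if trackable else "filled_circle"
```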
[0141] Mixed Reality Tracking and Input with Six Degrees of Freedom.
[0142] While the wireless Controllers found in the prior art may provide orientation information with 3DOF, they do not provide location information. 6DOF can be recovered, however, in accordance with the systems and methods described below. For example, and as set forth in more detail below, one embodiment of the invention is directed to a system for 6DOF mixed reality input that fuses an inertial handheld controller with hand tracking. The system can include: a display with an onboard processor; a hand-held input device configured to communicate with the processor to selectively provide one or more user inputs, the hand-held input device also including a first sensor for determining the orientation of the hand-held input device relative to a predetermined frame of reference and providing orientation data to the processor; and a second sensor located at a known location relative to the display for determining the position of one or more hands of a user relative to the display and for providing position data to the processor, wherein the processor uses the orientation data and the position data to track the one or more hands of the user within a three dimensional field of view with six degrees of freedom.
[0143] In one embodiment herein, the hand-tracking feature of the HMD can be used to accurately and precisely determine the 3D position of a Controller relative to the HMD by detecting the location of a user's hand in which the Controller is located. Then, the location information derived from the optical system of the HMD can be combined with the orientation data derived from the orientation sensors (e.g., IMU) incorporated in the Controller. In this manner, the system provides a Controller that operates with 6DOF.
[0144] Referring again to FIG. 30, for each frame of video captured by the optical sensor, for example at time T 1 , the image processor analyzes the video to determine the presence of one or more of the user's hands within the field of view of the optical sensor. If a user's hand is detected by the image processor, then the image processor can also determine whether the orientation and shape of the hand indicates the presence of a Controller, based on known geometrical constraints of the Controller and the position and orientation of the hand relative to the Controller. To determine which hand is holding the Controller, a classifier forming part of the environment tracking components of the HMD is trained to determine if a segmented hand is positioned in a hand pose consistent with holding a controller, using training examples of hands interacting with the controller. When using two controllers, one in each hand, it is possible to further differentiate which hand holds which controller by matching the hand trajectory as observed by the hand tracking sensor of the HMD with the acceleration data from the IMU of each controller over a period of time.
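A hedged sketch of the trajectory-versus-IMU matching described above: numerically differentiate the tracked hand positions twice and correlate against the controller's time-aligned, gravity-compensated IMU accelerations. The correlation threshold is an assumption:

```python
import numpy as np

def holds_controller(hand_positions: np.ndarray, imu_accel: np.ndarray,
                     dt: float, threshold: float = 0.7) -> bool:
    """hand_positions: (N, 3) hand-tracker samples; imu_accel: (N-2, 3)
    IMU accelerations for the same period. Returns True if the observed
    hand trajectory matches the controller's measured accelerations."""
    traj_accel = np.diff(hand_positions, n=2, axis=0) / dt ** 2
    corr = np.corrcoef(traj_accel.ravel(), imu_accel.ravel())[0, 1]
    return corr >= threshold
```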
[0145] If the image processor detects the presence of a Controller, then the depth camera of the HMD determines the exact position (x, y and z coordinates) of the Controller in 3D space relative to a known frame of reference. In addition to the location data derived from the depth camera, orientation data (θpitch, θyaw and θroll) for time T1 is also obtained from the IMU of the Controller. By combining the location data, derived from the depth camera, with the orientation data, derived from the IMU of the Controller, 6DOF are recovered, thereby allowing the HMD to track and interact with the Controller with 6DOF. This process can be repeated for each successive frame, or some other predetermined sampling of video captured by the optical sensor, to track and interact with the Controller with 6DOF.
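The fusion step itself is simple once both data sources are available; the return format below is an illustrative choice, not the patent's data model:

```python
def fuse_6dof(xyz, angles) -> dict:
    """Combine depth-camera location (x, y, z) with Controller-IMU
    orientation (pitch, yaw, roll) into one 6DOF pose."""
    x, y, z = xyz                 # location data from the HMD's depth camera
    pitch, yaw, roll = angles     # orientation data from the Controller's IMU
    return {"x": x, "y": y, "z": z, "pitch": pitch, "yaw": yaw, "roll": roll}
```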
[0146] In addition, once a particular hand of the user has been identified (e.g., right vs. left), a unique hand identifier is associated with that hand for future identification and tracking. Similarly, once a particular Controller has been identified, a unique controller identifier is associated with that Controller for future identification and tracking. Finally, once the system determines to a desired confidence level that one particular Controller is located within a particular user hand, then an association is created between that particular Controller and that particular user hand, and that association is persisted unless and until subsequent sampling indicates that the association is no longer valid.
[0147] In addition to identifying the presence of a user's hand within the field of view of the optical sensor, the image processor can detect the orientation of the user's hand by segmenting various parts of the user's hands and arms, determining the relative positions of each part and, from that information, deriving the orientation of the user's hand(s). Information concerning the orientation of the user's hand can also be compared to the orientation of a Controller (based on orientation data derived from the IMU) to determine if the hand orientation data is consistent with the controller orientation data. This information, along with other positional data, helps to determine whether the controller should be associated with a particular hand. Once a certain level of confidence is reached that a controller should be associated with a particular hand, such association is made for future identification and tracking. Of course, it is possible that a user may transfer the Controller from one hand to the other. Therefore, such association(s) can be continually tested and updated based on successive video frames.

[0148] When a Controller is detected by the depth camera of the HMD, the location data (x, y and z coordinates) derived from the depth camera can be combined with the orientation data (θpitch, θyaw and θroll) derived from the IMU of the Controller to achieve a Controller that can be accurately detected with a relatively high degree of reliability and resolution in 6DOF.
[0149] The system comprises an inertial handheld controller and an HMD with a hand tracking sensor and an environment tracking sensor.
[0150] As discussed above, the Controller can include an IMU that can include a combination of accelerometers and gyroscopes. In addition, the IMU may also contain magnetometers. IMU data is fused to compute, with high frequency and low latency, the orientation (θpitch, θyaw and θroll) of the Controller relative to some initial reference frame that is gravity aligned. The presence of magnetometers ensures there is little drift in maintaining north alignment.
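As one common way to realize the high-frequency, low-latency IMU fusion described above, a complementary filter blends integrated gyro rates with the accelerometer's gravity reference; this one-dimensional (pitch-only) version and its gain are assumptions, not the patent's algorithm:

```python
def fuse_pitch(prev_pitch: float, gyro_rate: float, accel_pitch: float,
               dt: float, alpha: float = 0.98) -> float:
    """Blend the integrated gyro estimate (smooth but drifting) with the
    accelerometer-derived pitch (noisy but drift-free). alpha is an
    assumed filter gain."""
    return alpha * (prev_pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch
```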
[0151] For the folding headset with curved frame, the hand tracking sensor consists of a depth camera that observes the hands moving through space. The depth image can be used to segment the hand from the background and the rest of the body, classify pixels as belonging to different hand parts using decision trees/jungles, and compute centroids for them (palm, fingertips, etc.) in 3D space.
[0152] The hand tracking sensor is factory calibrated relative to the environment tracking components on board the HMD, allowing the hand position to be transformed to a gravity-aligned world frame of reference. The hand(s) can also be classified into several hand poses (open, closed, pointing, bloom, etc.).
[0153] Once a specific controller is matched with a specific hand, the location data (x, y and z coordinates) of the hand and the orientation data (θpitch, θyaw and θroll) of the IMU are combined to determine the 6DOF transform of the Controller in the world frame of reference.
[0154] The HMD and the Controller frames of reference are both gravity aligned (the z axis is shared). In the embodiment where the HMD and the Controller are both gravity aligned and north aligned, they are rotationally invariant. If the two frames of reference are not north aligned, then there is an azimuth offset between the two frames of reference that needs to be resolved in one of several ways. For example, in a scenario where there is a 3D cursor (such as gaze targeting against 3D content), the cursor has a 3D location. For a manipulation gesture, the azimuth offset is calculated at the time of the button press by aligning the IMU forward vector with the vector between the hand and the cursor, and is maintained constant throughout the manipulation gesture until the button is released. For example, one way to determine the azimuth offset and calibrate the Controller to the HMD's frame of reference is to have the user point at a virtual object and calculate the azimuth delta between the HMD's frame of reference and the Controller's frame of reference. Alternatively, a coarse estimate of the hand orientation could also be used to initially estimate the azimuth offset and update it gradually over time using a moving average approach. Such a coarse estimate could be based on the segment between the lower arm centroid and the palm centroid provided by a hand tracking pipeline.
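A minimal sketch of the button-press azimuth calibration described above, assuming both frames are gravity aligned so only a yaw offset remains; the 2D vector inputs are an illustrative simplification:

```python
import math

def azimuth_offset(imu_forward_xy, hand_xy, cursor_xy) -> float:
    """Angle (radians) to add to the Controller's yaw so its forward vector
    matches the hand-to-cursor direction in the HMD's world frame."""
    tx = cursor_xy[0] - hand_xy[0]
    ty = cursor_xy[1] - hand_xy[1]
    az_target = math.atan2(ty, tx)                             # hand-to-cursor azimuth
    az_imu = math.atan2(imu_forward_xy[1], imu_forward_xy[0])  # IMU forward azimuth
    return az_target - az_imu
```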
[0155] The following discussion now refers to a number of methods and method acts that may be performed. Although the method acts may be discussed in a certain order or illustrated in a flow chart as occurring in a particular order, no particular ordering is required unless specifically stated, or required because an act is dependent on another act being completed prior to the act being performed.

[0156] As discussed in greater detail below, the invention is also directed to methods for recovering six degrees of freedom (6DOF) relative to a wireless hand-held inertial controller when used in combination with a head mounted display. In one embodiment, the method can include one or more of the following acts: detecting, by an optical sensor of the head mounted display device, the presence of a user's hand within the field of view of the optical sensor; determining, by the head mounted display device, if a wireless hand-held inertial controller is active and paired with the head mounted display device; tracking, by the optical sensor of the head mounted display device, movement of the user's hand relative to the head mounted display over a period of time to derive trajectory data representative of the trajectory of the user's hand during the period of time; receiving, by the head mounted display device, acceleration data for the period of time from the wireless hand-held inertial controller as derived by the inertial measurement unit of the wireless hand-held inertial controller; comparing the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand; and, if the confidence level meets or exceeds a predetermined minimum threshold, fusing the location data derived from the optical sensor of the head mounted display device with the orientation data derived from the inertial measurement unit of the wireless hand-held inertial controller to track the user's hand within three dimensional space with six degrees of freedom.
[0157] Referring now to FIG. 31, a method 500 for recovering six degrees of freedom (6DOF) relative to a wireless hand-held inertial controller when used in combination with a head mounted display is illustrated. The process starts at block 50102. For each frame of video captured by the optical sensor of the folding headset with curved frame HMD device, the hand tracking component of the HMD analyzes the video data to determine if a user's hand is located within the field of view of the HMD's optical sensor, as indicated at step 50104. If so, the process continues to step 50106. If not, the process returns to step 50102.
[0158] At step 50106, the processor of the HMD device checks to see if it is paired with any active hand-held Controller. If so, the process continues to step 50108. If not, the process returns to step 50102.
[0159] At step 50108, for several video frames (i.e., over some period of time), the optical sensor of the HMD tracks movement of the user's hand relative to the head mounted display over a period of time to derive trajectory data representative of the trajectory of the user's hand during the period of time. In addition, the HMD receives acceleration data for the same period of time from the Controller as derived by the IMU of the wireless hand-held inertial controller. The HMD then compares the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held device is located in the user's hand. Then, as indicated at step 50110, if the confidence level meets or exceeds a predetermined threshold, the process continues to step 50116. If not, the process continues with step 50112.
[0160] Alternately, step 50108 can be performed as follows. The processor of the HMD retrieves orientation data from the IMU of the Controller and compares it to the orientation data the HMD derives from its optical sensors. The processor of the HMD then computes a confidence level based on the correlation, or lack thereof, between the orientation data from the Controller and the orientation data from the HMD. As indicated at step 50110, if the confidence level meets or exceeds a predetermined threshold, the process continues to step 50116. If not, the process continues with step 50112.
[0161] At step 50112, the hand pose component of the HMD compares the pose detected in the video frame against a pose classifier and calculates a confidence level based on the correlation, or lack thereof, between the hand pose as detected in the video frame and hand poses consistent with the Controller being held in the user's hand. As indicated at step 50114, if the confidence level meets or exceeds a predetermined threshold, the process continues to step 50116. If not, the process returns to step 50102.
[0162] If the process reaches step 50116 , that means that there is a sufficient confidence level to create an association between the detected user hand and the Controller, and such association is created. Such association is persisted unless and until further analysis demonstrates that the association is no longer valid based on subsequent confidence level calculations.
[0163] Once the association is established, the process continues to step 50118, and the location data derived from the optical sensors of the HMD and the orientation data derived from the IMU of the Controller are fused, thereby recovering 6DOF in relation to the Controller. The process then continues by returning to block 50102 for continued processing of subsequent frames of captured video.
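Putting steps 50102 through 50118 together, one frame of method 500 might be sketched as below; every method name on the hmd and controller objects is a hypothetical stand-in for the components the text describes, not an actual API:

```python
CONFIDENCE_MIN = 0.8   # assumed predetermined threshold

def method_500_step(frame, hmd, controller):
    hand = hmd.detect_hand(frame)                        # step 50104
    if hand is None or not hmd.is_paired(controller):    # step 50106
        return None
    trajectory = hmd.track_hand_trajectory(hand)         # step 50108
    acceleration = controller.imu_acceleration_window()
    confidence = hmd.compare_motion(trajectory, acceleration)
    if confidence < CONFIDENCE_MIN:                      # steps 50110-50114
        confidence = hmd.compare_hand_pose(hand, controller)
        if confidence < CONFIDENCE_MIN:
            return None
    hmd.associate(hand, controller)                      # step 50116
    return hmd.fuse_6dof(hmd.locate(hand),               # step 50118
                         controller.imu_orientation())
```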
[0164] Further, the systems and methods described above may be practiced by a computer system including one or more processors and computer-readable media such as computer memory. In particular, the computer memory may store computer-executable instructions that when executed by one or more processors cause various functions to be performed, such as the acts recited in the embodiments.
[0165] Embodiments of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: physical computer-readable storage media and transmission computer-readable media.
[0166] Physical computer-readable storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage (such as CDs, DVDs, etc.), magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
[0167] A "network" is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above are also included within the scope of computer-readable media.

[0168] Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission computer-readable media to physical computer-readable storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a "NIC"), and then eventually transferred to computer system RAM and/or to less volatile computer-readable physical storage media at a computer system. Thus, computer-readable physical storage media can be included in computer system components that also (or even primarily) utilize transmission media.
[0169] Computer-executable instructions comprise, for example, instructions and data which cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer-executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[0170] Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[0171] Alternatively, or in addition, the functionality described herein can be performed, at least in part, by one or more hardware logic components. For example, and without limitation, illustrative types of hardware logic components that can be used include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), etc.


CLAIMS:
1. A folding headset with a curved frame, each with two folding handles, wherein the curved frame consists of a curved folding screen, a folding frame, and the HMD smart glasses system.
The folding headset body supports a curved frame with a curved, folding screen.
A folding-screen curved frame supports an HMD smart glasses display configured inside the curved frame.
The curved folding display frame is rotated around the folding headset by a lightweight electric motor with a gearbox.
A cooling fan is located in the body of the folding headset to keep the module terminal block cool.
The curved folding display frame and the folding headset both contain terminal blocks and modules.
A sensor is arranged in the handles of the folding headset device and the curved frame of the display to sense a folding angle between the two handles of the folding headset and the folding frame of the display.
2. A sensor is set to sense a rotating angle between the folding headset and the curved frame, which is driven by an electric motor with a gearbox.
An actuator system is arranged to change the folding angle of the handles of the folding headset and the folding frame of the display.
The actuator system is also arranged to change the angular motion between the folding headset and the curved frame by means of a lightweight electric motor with a gearbox.
3. A controller is configured to control the actuating system for the following purposes:
Control the actuating system to increase the folding angle between the two handles of the folding headset and the two handles of the folding frame of the folding display, without the user's physical pressure, in response to a first preset input.
Control the actuator to increase the rotating angle between the folding headset and the curved frame by means of a lightweight electric motor with a gearbox, without the user's physical pressure, in response to the first preset input.
Control the actuating system to reduce the folding angle between the two handles of the folding headset and the two handles of the folding frame of the folding display, without the user's physical pressure, in response to a second preset input. Control the actuator to reduce the rotational angle between the two handles of the folding headset and the two handles of the folding frame of the folding display by means of a lightweight electric motor with a gearbox, without the user's physical pressure, in response to the second preset input.
Claim 1: Moveable system, in which the controller is further configured to:
Control the actuator to increase the folding angle until the two headset handles and the two curved frame handles of the folding display are open, where the folding angle between the two headset handles and the two curved frame handles of the folding display is more than a first angle, in response to the first preset input.
Control the actuator to increase the rotating angle between the two handles of the folding headset and the two handles of the curved frame until it is open, where the rotating angle is more than a first angle of motion, in response to the first preset input.
Control the actuator to reduce the rotating angle between the two handles of the folding headset and the two handles of the curved frame until it is closed, where the folding angle between the two headset handles and the two curved frame handles of the folding display is equal to or smaller than the first angle.
Claim 2: The controller system is set as follows: user authentication is received while the two handles of the folding headset and the two handles of the curved frame are confirmed in the closed mode.
User authentication is likewise received while the folding headset and the curved frame are confirmed at the rotating angle in the closed mode.
Control the actuator so that, when the user's authentication is received, the confirmed information about the user is saved and the two handles of the folding headset and the two handles of the curved frame are changed to the open mode.
Control the actuator so that, when the user's authentication is received, the confirmed information about the user is saved and the rotating angle of the headset curved frame is changed to the open mode.
4. The folding headset of claim 3, wherein the user authentication is performed based on at least one of a fingerprint input, a pattern input, an iris input, a touch input and a speech input.
The folding headset of claim 2, wherein the controller and the device control are further configured in the open mode.
An input signal is sensed when the two handles of the folding headset and the two handles of the curved frame are in the open mode.
5. An input signal is likewise sensed when the folding headset device and the curved frame are in the open mode of the rotating angle.
Control the actuator to switch the two handles of the folding headset and the two handles of the curved frame to the closed mode in response to the input signal.
Control the actuating system to switch the folding headset and the curved frame to the closed mode of the rotating angle in response to the input signal.
6. The folding headset of claim 2, wherein the controller is further configured to control the actuator unit to switch the two handles of the folding headset and the two handles of the curved frame of the folding display to the closed mode when the two handles of the folding headset and the two handles of the curved frame of the folding display are in the open mode and an input signal has not been sensed for a preset time.
Also, the controller is further configured to control the actuator to change the rotating angle of the folding headset and the curved frame to the closed mode when the folding headset and the curved frame are in the open mode of the rotating angle and an input signal has not been sensed for a preset time.
7. The folding headset of any one of claims 1 to 6, in which the actuator comprises at least one of an electromagnet and a shape memory alloy.
8. The folding headset of claim 7, wherein, if an event occurs in the closed mode of the two handles of the folding headset and the two handles of the curved frame of the folding display, the controller is further configured to determine a switching speed to the open mode of the folding headset based on the type of the occurring event.
Also, if an event occurs at the rotating angle in the closed mode relative to the folding headset and the curved frame, the controller is configured to determine a switching speed to the open mode of the rotating angle between the folding headset and the curved frame based on the type of the occurring event.
9. The foldable device of claim 8, in which the displaceable two handles of the folding headset and the handles of the curved frame of the folding display are switched to the closed mode, and, if the folding angle is equal to or smaller than a second angle, the controller is further configured to control the actuator to change the switching speed by switching the magnet of at least one preset region of the first body and the second body.
Moreover, the displaceable folding headset and curved frame are switched to the closed mode of the rotating angle, and, if the rotating angle between the folding headset and the curved frame is equal to or less than the second rotating angle, the controller is further configured to control the actuator to change the switching speed by switching the magnet of at least one preset region of the headset and the curved frame.
10. The foldable device of claim 7, further comprising: a wireless communication processor configured to wirelessly communicate with a stylus pen.
The controller is hereby further configured to control the actuator to change the folding angle of the two handles of the folding headset and the handles of the curved frame of the folding display based on a control signal received from the stylus pen.
Wherein the controller is configured to control the actuator to change the angle of the two handles of the folding headset and the two handles of the folding display curved frame based on the control signal received from the stylus.
Furthermore, the controller is configured to control the actuating system to change the rotating angle between the headset and the curved frame in accordance with the control signal received from the stylus pen.
11. The folding headset of claim 10, wherein the controller is further configured to determine the changed folding angle based on at least one of a duration time and a length of a user input sensed by the stylus pen.
Furthermore, the controller is further configured to determine the changed rotating angle between the folding headset and the curved frame based on at least one of a duration time and a length of a user input sensed by the stylus pen.
12. The folding headset of claim 10, wherein the two displaceable handles of the folding headset and those of the curved frame are switched to the closed mode, and, if the stylus pen is located in the headset and the curved frame, the controller is further configured to control the actuator to activate the first magnet.
The handles of the headset and the curved frame of the folding display are at the same level as the stylus pen.
Moreover, the displaceable folding headset and curved frame rotating angle is displaced in the closed mode, and, if the stylus pen is located in the headset and the curved frame, the controller is configured to adjust the actuating system in order to activate the first magnet. The handles of the headset and the curved frame of the folding display are at the same level as the stylus pen.
13. The folding headset of claim 2, wherein, if an input signal is sensed in the open mode, the controller is further configured to change the folding angle of the two handles of the folding headset and the two handles of the curved frame of the folding display to a third angle.
Also, if an input signal is sensed in the open mode of the folding headset and the curved frame, the controller is configured to adjust the rotating angle between the folding headset and the curved frame to a third angle.
14. The foldable device of claim 14, wherein the controller is further configured to transfer the preset content to the display unit in response to the input signal.
Furthermore, the controller is further configured to switch the preset rotating angle between the folding headset and the curved frame to the electric motor section with the gearbox in response to the input signal, to change the rotating angle.
A method of controlling a folding headset, in which the two handles of the folding headset and the two handles of the folding display curved frame support the folding headset from a section of the folding curved display screen that is located in the curved frame and the frame handles, the curved frame supports a smart HMD glasses system, and the rotating angle between the folding headset and the curved frame can be changed, the method comprising:
Sensing, via a sensor, a folding angle between the two handles of the folding headset and the two handles of the folding display curved frame.
Sensing, via a sensor, a rotating angle, driven by an electric motor with a gearbox, between the two handles of the folding headset and the two handles of the folding display curved frame.
15. A system comprising: a head-mounted display device (HMD) comprising: a processor; a wearable stereoscopic display adapted for displaying one of an augmented reality (AR) environment or a full virtual reality (VR) environment; and an optical sensor system that produces location data at a particular instant of time for a wireless hand-held input controller, wherein the location data is obtained from a depth camera mounted on the HMD, and wherein the location data is defined by x, y and z coordinates; the wireless hand-held input controller comprising: an inertial measurement unit (IMU) comprising one or more sensors for determining orientation data of the wireless hand-held input controller, wherein the orientation data is defined by pitch (elevation angle), yaw (azimuth angle) and roll (rotation) relative to a predetermined frame of reference; and a microcontroller that communicates to the processor of the HMD one or more user inputs relative to the orientation data; and wherein the processor of the HMD uses the orientation data at the particular instant of time as provided by the IMU of the wireless hand-held input controller and the location data at the particular instant of time as provided by the optical sensor system of the HMD to determine location and orientation of the wireless hand-held input controller in reference to the HMD with six degrees of freedom as derived from the x, y and z coordinates determined at the HMD and the yaw, pitch and roll coordinates determined at the IMU of the wireless hand-held input controller.
16. The system of claim 15, wherein the HMD further comprises an on-board image production system mounted on the HMD.
17. The system of claim 15, wherein the HMD comprises a virtual reality display.
18. The system of claim 15, wherein the HMD comprises a three dimensional, augmented reality display.
19. The system of claim 15, wherein the wireless hand-held input controller comprises one or more momentary switches that are operatively connected to, and selectively provide inputs to, the microcontroller, one or more status LEDs operatively connected to the microcontroller, and a wireless radio operatively connected to the microcontroller for transmitting user inputs and orientation data to the processor of the HMD.
20. The system of claim 19, wherein the IMU comprises one or more of a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer.
21. The system of claim 20, wherein the orientation data comprises θpitch, θyaw and θroll coordinates.
22. The system of claim 21, wherein the depth camera is an infrared camera.
23. The system of claim 15, wherein the HMD further comprises an inward facing optical sensor that detects a gaze direction of a user's eyes.
24. The system of claim 15, wherein the HMD further comprises an on-board position sensor system mounted to the HMD.
25. The system of claim 22, wherein the position sensor system mounted to the HMD comprises an inertial measurement unit (IMU) comprising one or more of: a 3-axis accelerometer; a 3-axis gyroscope; a 3-axis magnetometer; a global positioning system; and a multilateration tracker.
26. The system of claim 15, wherein the determined location and orientation of the wireless hand-held input controller is determined in reference to x, y, z, θpitch, θyaw and θroll coordinates within a real world frame of reference.
27. In a system comprising a head mounted display device (HMD) configured to display a three-dimensional space for an augmented reality (AR) environment or a full virtual reality (VR) environment, wherein the HMD comprises a forward facing optical sensor having a field of view, and wherein the HMD interfaces with a wireless hand-held input controller that provides user input to the HMD, a computer-implemented method for determining and tracking location and orientation of the wireless hand-held input controller in reference to the HMD, and wherein the determined location and orientation are determined with six degrees of freedom, the computer-implemented method comprising: detecting with the forward facing optical sensor the presence of a user's hand within the field of view; determining if the wireless hand-held input controller is active and paired with the HMD; tracking with the forward facing optical sensor movement of the user's hand relative to the HMD over a period of time; generating trajectory data representative of the trajectory of the user's hand during the period of time; generating acceleration data for the period of time from the wireless hand-held input controller, wherein the acceleration data is derived by an inertial measurement unit (IMU) of the wireless hand-held input controller; receiving at a processor of the HMD the trajectory data and the acceleration data and comparing the trajectory data with the acceleration data to compute a confidence level that the wireless hand-held input controller is located in the user's hand; and if the confidence level meets or exceeds a predetermined minimum threshold, combining location data representative of a location of the user's hand as derived from the forward facing optical sensor of the HMD with orientation data representative of an orientation of the user's hand as derived from the IMU of the wireless hand-held input controller in order to track the user's hand within the three dimensional space of the AR or VR environment with six degrees of freedom.
28. The method of claim 27, wherein detecting the presence of the user's hand within the field of view comprises: capturing, by the forward facing optical sensor of the HMD, a plurality of successive frames of video; and for each frame of video captured by the forward facing optical sensor of the HMD, analyzing video data captured by the HMD to determine if a user's hand is located within the field of view of the HMD's forward facing optical sensor.
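Claim 28's per-frame analysis reduces to a capture loop that runs a hand detector over successive frames. The sketch assumes OpenCV for capture and leaves the detector itself as a caller-supplied placeholder, since the claim does not name a particular detection algorithm.

    import cv2

    def hand_in_view(detect_hand, camera_index=0, max_frames=30):
        """Scan successive frames for a hand.

        detect_hand is a placeholder callable taking a BGR frame and
        returning True/False; camera_index and max_frames are assumed.
        """
        capture = cv2.VideoCapture(camera_index)
        try:
            for _ in range(max_frames):
                ok, frame = capture.read()
                if not ok:
                    break
                if detect_hand(frame):  # analyze each captured frame
                    return True
            return False
        finally:
            capture.release()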
29. The method of claim 27, further comprising: receiving from the wireless hand-held input controller first orientation data from the IMU of the wireless hand-held input controller, wherein the first orientation data is received at the processor of the HMD and is representative of the orientation of the wireless hand-held input controller derived from the IMU of the wireless hand-held input controller; computing at the processor of the HMD second orientation data representative of the orientation of the wireless hand-held input controller, wherein the second orientation data is computed based on pose data detected by the forward facing optical sensor of the HMD; and computing at the processor of the HMD a confidence level by comparing the first orientation data with the second orientation data.
30. The method of claim 29, further comprising creating an association between the user's hand and the wireless hand-held input controller if the confidence level meets or exceeds a predetermined minimum threshold.
31. The method of claim 30, further comprising: periodically monitoring the confidence level of the association between the user's hand and the associated wireless hand-held input controller; if the confidence level remains above the predetermined minimum threshold, persisting the association; and if the confidence level drops below the predetermined minimum threshold, removing the association.
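Claims 29 through 31 describe an association lifecycle: compute a confidence by comparing IMU-derived orientation with optically derived pose, create the association when the confidence meets a minimum threshold, then periodically re-check and drop the association when the confidence falls below that threshold. A minimal sketch, with an assumed threshold value and an assumed angle-error confidence metric:

    import math

    class HandControllerAssociation:
        """Lifecycle of a hand/controller association (claims 29-31 sketch)."""

        def __init__(self, threshold=0.8):   # threshold value is assumed
            self.threshold = threshold
            self.associated = False

        @staticmethod
        def confidence(imu_orientation, optical_orientation):
            # Map per-axis angular disagreement (radians) onto [0, 1].
            error = math.sqrt(sum((a - b) ** 2 for a, b in
                                  zip(imu_orientation, optical_orientation)))
            return max(0.0, 1.0 - error / math.pi)

        def update(self, imu_orientation, optical_orientation):
            c = self.confidence(imu_orientation, optical_orientation)
            if c >= self.threshold:
                self.associated = True   # create or persist the association
            else:
                self.associated = False  # confidence dropped: remove it
            return self.associated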
32. A system used for augmented reality (AR) or full virtual reality (VR) in which a head-mounted display device (HMD) is aligned with a wireless hand-held controller in a manner so that the HMD and wireless hand-held controller are rotationally invariant relative to one another, the system comprising: a head-mounted display device (HMD) comprising: an on-board processor mounted to the HMD; a wearable stereoscopic display adapted for displaying one of an augmented reality (AR) environment or a full virtual reality (VR) environment; an optical sensor system that produces location data at a particular instant of time for the wireless hand-held controller, wherein the optical sensor system comprises: an outward facing optical sensor that senses a field of view of the HMD within the environment such that location data is obtained from the outward facing optical sensor, and wherein the location data is defined by x, y and z coordinates; and a position sensor system comprising one or more sensors for determining orientation data of the HMD and wherein the orientation data for the HMD is defined by pitch (elevation angle), yaw (azimuth angle) and roll (rotation) relative to a predetermined frame of reference; a wireless hand-held controller comprising: an inertial measurement unit (IMU) comprising one or more sensors for determining orientation data of the wireless hand-held controller and wherein the orientation data of the wireless hand-held controller is defined by pitch (elevation angle), yaw (azimuth angle) and roll (rotation) relative to the predetermined frame of reference; and a microcontroller that communicates to the on-board processor of the HMD one or more user inputs relative to the orientation data of the wireless hand-held controller; and wherein the on-board processor of the HMD performs at least the following: processes the orientation data at the particular instant of time as provided by the IMU of the wireless hand-held controller and the location data at the particular instant of time as provided by the optical sensor system of the HMD to determine location and orientation of the wireless hand-held controller in reference to the HMD with six degrees of freedom as derived from the x, y and z coordinates determined at the HMD and the yaw, pitch and roll determined at the IMU of the wireless hand-held controller; and processes the orientation data at the particular instant of time as provided by the position sensor system of the HMD and the orientation data at the particular instant of time as provided by the IMU of the wireless hand-held controller such that the HMD and the wireless hand-held controller are gravity aligned and north aligned so as to be rotationally invariant.
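The gravity and north alignment at the end of claim 32 amounts to each device building the same world frame from its own sensors: gravity (accelerometer) fixes "up", the horizontal component of the magnetic field (magnetometer) fixes "north", and their cross product fixes "east". Expressing both devices' orientation in that shared frame makes the pair rotationally invariant. A sketch under those assumptions; the function name and the ENU axis convention are illustrative choices:

    import numpy as np

    def world_frame_from_sensors(accel, mag):
        """Return a 3x3 rotation matrix with rows (east, north, up).

        accel: specific-force reading of a stationary accelerometer, which
        points opposite to gravity (i.e. up); mag: magnetic field vector.
        Running this on both the HMD and the controller puts their
        orientation data in one gravity/north-aligned frame.
        """
        up = np.asarray(accel, dtype=float)
        up /= np.linalg.norm(up)
        mag = np.asarray(mag, dtype=float)
        # Project out the vertical component to get horizontal north.
        north = mag - np.dot(mag, up) * up
        north /= np.linalg.norm(north)
        east = np.cross(north, up)   # north x up = east (right-handed ENU)
        return np.vstack([east, north, up])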
PCT/IR2020/050008 2020-04-12 2020-04-12 The present invention is about a folding headset with its electronic accessories, in particular, a folding WO2021210028A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IR2020/050008 WO2021210028A2 (en) 2020-04-12 2020-04-12 The present invention is about a folding headset with its electronic accessories, in particular, a folding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IR2020/050008 WO2021210028A2 (en) 2020-04-12 2020-04-12 The present invention is about a folding headset with its electronic accessories, in particular, a folding

Publications (1)

Publication Number Publication Date
WO2021210028A2 true WO2021210028A2 (en) 2021-10-21

Family

ID=78084374

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IR2020/050008 WO2021210028A2 (en) 2020-04-12 2020-04-12 The present invention is about a folding headset with its electronic accessories, in particular, a folding

Country Status (1)

Country Link
WO (1) WO2021210028A2 (en)

Similar Documents

Publication Publication Date Title
EP3469458B1 (en) Six dof mixed reality input by fusing inertial handheld controller with hand tracking
US10521026B2 (en) Passive optical and inertial tracking in slim form-factor
EP3469457B1 (en) Modular extension of inertial controller for six dof mixed reality input
EP3179290B1 (en) Mobile terminal and method for controlling the same
US10249090B2 (en) Robust optical disambiguation and tracking of two or more hand-held controllers with passive optical and inertial tracking
US10776618B2 (en) Mobile terminal and control method therefor
US10495878B2 (en) Mobile terminal and controlling method thereof
US10649173B2 (en) Head mounted display and method for controlling the same
US20170115728A1 (en) System and method of controlling the same
KR102099834B1 (en) Electric device and operation method thereof
US9367128B2 (en) Glass-type device and control method thereof
US11869156B2 (en) Augmented reality eyewear with speech bubbles and translation
CN106067833B (en) Mobile terminal and control method thereof
CN108474950A (en) HMD device and its control method
CN111630478A (en) High-speed staggered binocular tracking system
KR102039948B1 (en) Mobile terminal for rendering virtual organs in augmented/virtual reality and system thereof
US20220375172A1 (en) Contextual visual and voice search from electronic eyewear device
WO2021210028A2 (en) The present invention is about a folding headset with its electronic accessories, in particular, a folding
KR20190061825A (en) Tethering type head mounted display and method for controlling the same
WO2021229266A1 (en) A folding headset with curved frame hmd, electronic accessories and their control methods
US20240070299A1 (en) Revealing collaborative object using countdown timer
US20240069643A1 (en) Physical gesture interaction with objects based on intuitive design
US20240070302A1 (en) Collaborative object associated with a geographical location
US20240070243A1 (en) Authenticating a selective collaborative object
US20240071020A1 (en) Real-world responsiveness of a collaborative object

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020930971

Country of ref document: EP

Effective date: 20221114