CN115826734A - Information display method, near-to-eye display device and electronic device - Google Patents


Info

Publication number
CN115826734A
CN115826734A (publication) · CN202111088751.XA (application)
Authority
CN
China
Prior art keywords
image
information
display device
eye display
parameter
Prior art date
Legal status: Pending (an assumption, not a legal conclusion; no legal analysis has been performed)
Application number
CN202111088751.XA
Other languages
Chinese (zh)
Inventor
林鼎豪
陈碧莹
张宇
Current Assignee (the listed assignees may be inaccurate)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (an assumption, not a legal conclusion)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202111088751.XA priority Critical patent/CN115826734A/en
Priority to PCT/CN2022/113110 priority patent/WO2023040562A1/en
Publication of CN115826734A publication Critical patent/CN115826734A/en

Classifications

    • G02B27/01 Head-up displays
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an information display method, a near-eye display device, and an electronic device. The method is applied to the near-eye display device and includes: displaying a first image, the first image being a part of a source image; acquiring pose information of the near-eye display device in response to an image display update instruction; and displaying a second image according to the pose information, the second image also being a part of the source image and different from the first image. Because the first image shown in advance is only a part of the source image, the second image can carry information that does not need to be viewed in real time, and that information is displayed, according to the pose of the near-eye display device, only when the user wants to view it. This makes the display content flexible and improves the display effect of the near-eye display device.

Description

Information display method, near-to-eye display device and electronic device
Technical Field
The present application relates to the field of near-eye display device technologies, and in particular, to an information display method, a near-eye display device, and an electronic device.
Background
Near-eye display devices such as smart glasses combine the latest IT technology with the functions of conventional glasses, and are increasingly popular with consumers thanks to their portability, ease of use, and rich functionality.
Because smart glasses are limited in size, the area of the display picture that a user can observe from them with the naked eye is small. Smart glasses therefore cannot show much information, and the display effect is relatively poor.
Disclosure of Invention
The embodiments of the application provide an information display method, a near-eye display device, and an electronic device that can improve the display effect of the near-eye display device.
An embodiment of the application provides an information display method applied to a near-eye display device, including the following steps:
displaying a first image, the first image being a part of a source image;
acquiring pose information of the near-eye display device in response to an image display update instruction;
and displaying a second image according to the pose information, the second image being a part of the source image and different from the first image.
The application also provides an information display method applied to an electronic device that stores a source image, including the following steps:
sending a first image to a near-eye display device, the first image being a part of the source image;
acquiring pose information of the near-eye display device in response to an image display update instruction;
determining a second image according to the pose information, the second image being a part of the source image and different from the first image;
and sending the second image to the near-eye display device.
The present application also provides a near-eye display device for performing the information display method as described above.
The present application further provides a near-eye display device, including:
a display device configured to display a first image, the first image being a part of a source image;
a touch module configured to respond to an image display update instruction;
a pose sensor configured to acquire pose information of the near-eye display device;
wherein the display device is further configured to display a second image according to the pose information, the second image being a part of the source image and different from the first image.
The present application further provides an electronic device for executing the information display method as described above.
In the embodiments of the application, a first image that is a part of the source image is displayed on the near-eye display device in advance; after an image display update instruction is responded to, the pose information of the near-eye display device is acquired, and a second image that is also a part of the source image is displayed according to that pose information. The second image can carry information that does not need to be viewed in real time, and that information is shown, according to the device pose, only when the user wants to view it. The display content is therefore flexible, and the display effect of the near-eye display device is improved.
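The split between the electronic device (which stores the source image) and the near-eye display device (which shows viewport-sized parts of it) can be sketched as follows. This is a minimal illustration only, not the patent's implementation: the names (`SourceImageServer`, `second_image`), the centred placement of the first image, and the clamping behaviour are all assumptions.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Rect:
    """Axis-aligned region of the source image, in pixels."""
    x: int
    y: int
    w: int
    h: int


class SourceImageServer:
    """Electronic-device side: holds the full source image and serves
    viewport-sized regions of it to the near-eye display device."""

    def __init__(self, source_w: int, source_h: int, view_w: int, view_h: int):
        self.source_w, self.source_h = source_w, source_h
        self.view_w, self.view_h = view_w, view_h
        # Assume the first image sits at the centre of the source image.
        self.first = Rect((source_w - view_w) // 2,
                          (source_h - view_h) // 2, view_w, view_h)

    def first_image(self) -> Rect:
        """Region sent to the device before any update instruction."""
        return self.first

    def second_image(self, dx: int, dy: int) -> Rect:
        """Region sent after an image display update instruction: the
        first image's region shifted by a pose-derived offset, clamped
        so it always stays inside the source image."""
        x = min(max(self.first.x + dx, 0), self.source_w - self.view_w)
        y = min(max(self.first.y + dy, 0), self.source_h - self.view_h)
        return Rect(x, y, self.view_w, self.view_h)


server = SourceImageServer(1920, 1080, 640, 360)
```

With a 1920x1080 source image and a 640x360 viewport, `server.second_image(700, 0)` returns a region clamped to the right edge of the source image, so the second image always remains a part of the source image.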
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a schematic view of a first structure of a near-eye display device according to an embodiment of the present disclosure.
Fig. 2 is a second schematic structural diagram of a near-eye display device according to an embodiment of the present application.
Fig. 3 is a first flowchart of an information display method according to an embodiment of the present disclosure.
Fig. 4 is a first schematic diagram of a source image provided by an embodiment of the present application.
Fig. 5 is a second schematic diagram of a source image provided by an embodiment of the present application.
Fig. 6 is a third schematic diagram of a source image provided by an embodiment of the present application.
Fig. 7 is a first view of a rotated scene of a near-eye display device according to an embodiment of the present application.
Fig. 8 is a second view of a rotating scene of a near-eye display device according to an embodiment of the present application.
Fig. 9 is a first display schematic diagram of a near-eye display device according to an embodiment of the present application.
Fig. 10 is a second display schematic diagram of a near-eye display device according to an embodiment of the present application.
Fig. 11 is a third display schematic diagram of a near-eye display device according to an embodiment of the present application.
Fig. 12 is a second flowchart of the information display method according to the embodiment of the present application.
Fig. 13 is a schematic view of a first application scenario of a near-eye display device according to an embodiment of the present application.
Fig. 14 is a schematic view of a second application scenario of a near-eye display device according to an embodiment of the present application.
Fig. 15 is a schematic view of a third application scenario of a near-eye display device according to an embodiment of the present application.
Fig. 16 is a schematic view of a fourth application scenario of a near-eye display device according to an embodiment of the present application.
Fig. 17 is a schematic view of a fifth application scenario of a near-eye display device according to an embodiment of the present application.
Fig. 18 is a third flow chart of the information display method according to the embodiment of the present application.
Fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
An embodiment of the application provides a near-eye display device, which may be an intelligent near-eye display device such as smart glasses, a smart headset, or a smart helmet; the smart glasses may be AR (Augmented Reality) glasses, VR (Virtual Reality) glasses, and so on. A near-eye display device generally includes electronic components such as a power supply, a display device, a wearing component, and sensors. According to the user's operations, the near-eye display device can implement preset functions, such as displaying a picture through the display device.
Taking AR smart glasses as an example of a near-eye display device, as shown in fig. 1, fig. 1 is a first structural schematic diagram of the near-eye display device provided in the embodiment of the present application.
The near-eye display device 10 may include a wearing component 11 and a display device 12. The display device 12 may include a display 121 and a projector 124. The display 121 may be a lens; the lens may be a waveguide lens with a grating structure, or another form of lens such as a plano lens, a sunglasses lens, a prescription lens, or a prescription sunglasses lens, where a prescription lens or prescription sunglasses lens is one fitted according to an optometric prescription. The projector 124 may enable the display 121 to display image information through different display modes, such as Fiber Scanning Display (FSD), Digital Light Processing (DLP), or Laser Beam Scanning (LBS). The projector 124 may be fixedly disposed on the wearing component 11 or detachably connected to it. In some embodiments, the display 121 and the projector 124 may be integrated into a whole; the integrated display device 12 may be fixedly disposed on the wearing component 11 or detachably connected to it, and the user may view image information directly through the integrated display device 12 without a lens.
The wearing component 11 may serve as the frame structure of the near-eye display device 10 and may include a frame 111 and temples 112. The frame 111 may carry the display 121 (lens) described above to display image information. In some embodiments, no lens, or only a common lens, is provided; image information is then displayed through an integrated display device, which the user wears on the head via the temples 112 and views directly, without displaying through a lens.
The temples 112 are connected to two opposite sides of the frame 111, and the user can wear the near-eye display device 10 on the head through the temples 112. In some embodiments, the device can also be worn on the head through other wearing components, such as a connecting band or a connecting buckle with elasticity, magnetism, or adhesive force.
Referring to fig. 2, fig. 2 is a second schematic structural diagram of a near-eye display device according to an embodiment of the present application. The visible area 122 is the region in which the user can see the displayed image information through the glasses. It is related to the field of view (FOV) of the near-eye display device: the larger the FOV of the display device, the larger the display range the user can see. The FOV of the display devices of current near-eye display devices, such as AR glasses, may be 20°, 40°, 60°, 80°, and so on, and depends on the structures of the display and the projector; for example, when the display is a waveguide lens, the FOV is related to the structure of the waveguide lens. At present, the area of the visible region in which the display device 12 can display image information is limited and cannot meet the user's need to view information on the near-eye display device. The information display method is applied to the near-eye display device and includes the following steps:
201, displaying a first image, the first image being a part of a source image.
The near-eye display device may store a source image containing the image information to be displayed by the display device. In some embodiments, the source image may instead be generated on a server or on another electronic device connected to the near-eye display device and sent to it by wired or wireless transmission. The other electronic device may be, for example, a smartphone, a tablet computer, a personal digital assistant (PDA), a smart watch, or a smart band.
The first image may be the image information that the user currently needs to view; for example, it may include application information, communication information, audio information, or video information of the near-eye display device or of an electronic device connected to it. The source image may include other information besides the first image, such as auxiliary display information that the user does not need to view in real time. The auxiliary display information may be parameter information of the near-eye display device and/or of the connected electronic device, such as a battery parameter, a network parameter, a time parameter, an audio playing parameter, or a display parameter. In the source image, the auxiliary display information may be placed around the first image, for example along one side of its edge or all around it.
As shown in fig. 4 to fig. 6, fig. 4 is a first schematic diagram of a source image provided by an embodiment of the present application, fig. 5 is a second schematic diagram, and fig. 6 is a third schematic diagram. In all three examples the first image includes application information related to a fitness application; they differ in the auxiliary display information. In fig. 4 it includes a battery parameter and a time parameter; in fig. 5 a battery parameter, a time parameter, and a network parameter; and in fig. 6 a battery parameter, a network parameter, a time parameter, an audio playing parameter, a display parameter, and a notification message parameter, where the audio playing parameter may include a volume parameter and the playing state of the current audio application, and the display parameter may include a display brightness parameter. The first image and the auxiliary display information in the drawings are only exemplary; both can be set according to actual requirements, with the first image carrying what the user currently needs to view and the auxiliary display information carrying what need not be viewed in real time.
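The edge placement of auxiliary display information around the first image, as in figs. 4 to 6, could be computed along these lines. The item set, anchor positions, and `margin` value are illustrative assumptions, not specified by the patent:

```python
def auxiliary_layout(first, margin=8):
    """Anchor points for auxiliary display parameters placed around the
    edges of the first image inside the source image, echoing the
    arrangement of figs. 4 to 6. `first` is (x, y, w, h) in source-image
    pixels; the item set and positions are illustrative only."""
    x, y, w, h = first
    return {
        "time": (x + margin, y - margin),               # top-left corner
        "battery": (x + w - margin, y - margin),        # top-right corner
        "volume": (x - margin, y + h // 2),             # left edge
        "brightness": (x + w + margin, y + h // 2),     # right edge
        "notifications": (x + w // 2, y + h + margin),  # below the image
    }


layout = auxiliary_layout((100, 100, 400, 300))
```

Each entry is an anchor point just outside the first image's rectangle, so these parameters stay hidden until the viewport shifts toward them.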
202, acquiring pose information of the near-eye display device in response to an image display update instruction.
The image display update instruction may be generated from a trigger signal raised by the user through a trigger module. The trigger signal may include at least one of a touch signal, a voice signal, an image signal, and a motion signal. The trigger module may be a touch sensor that acquires a touch signal, which may be a press signal and/or a slide signal; an audio acquisition sensor, such as a microphone, that acquires a voice signal meeting a preset condition; an image acquisition sensor, such as a camera, that acquires an image signal meeting a preset condition; or a motion acquisition sensor, such as a pose sensor, that acquires a motion signal meeting a preset condition. Other types of trigger modules and corresponding trigger signals can be set according to actual requirements. When a trigger signal is received, an image display update instruction is generated from it, and the pose information of the near-eye display device is acquired.
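The mapping from trigger signals to an image display update instruction might be sketched as below. Modelling the "preset condition" for voice, image, and motion signals as a confidence threshold is an assumption for illustration only:

```python
from enum import Enum, auto


class Trigger(Enum):
    TOUCH = auto()   # press or slide on a touch sensor
    VOICE = auto()   # microphone signal
    IMAGE = auto()   # camera signal
    MOTION = auto()  # pose-sensor motion signal


def maybe_update_instruction(trigger, payload, min_confidence=0.8):
    """Return an image display update instruction if the trigger signal
    qualifies, else None. Touch signals are accepted directly; the other
    signal types must meet the preset condition, modelled here as a
    recognition-confidence threshold (an assumption)."""
    if trigger is Trigger.TOUCH:
        return {"type": "image_display_update", "source": trigger.name}
    if payload.get("confidence", 0.0) >= min_confidence:
        return {"type": "image_display_update", "source": trigger.name}
    return None
```

A voice command recognised with high confidence produces an instruction; a weak motion signal is ignored, so stray head movement does not update the display.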
The near-eye display device may include a pose sensor that acquires its pose information. Please refer to fig. 1, fig. 7, and fig. 8; fig. 7 is a first view and fig. 8 a second view of a rotation scene of the near-eye display device according to embodiments of the present application. The pose sensor 13 detects the pose information of the near-eye display device 10 and may include a gyroscope, an electronic compass, an acceleration sensor, and/or a Hall sensor. The pose sensor 13 can provide 3-degree-of-freedom (3DoF) or 6-degree-of-freedom (6DoF) detection of the near-eye display device. Taking a 3DoF device as an example, the near-eye display device can detect, through the pose sensor 13, pose information for rotation about each of the three degrees of freedom.
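A 3DoF pose estimate from gyroscope samples can be sketched by plain integration of angular rates. Real devices fuse the gyroscope with the electronic compass and acceleration sensor to limit drift, so this is a deliberate simplification with assumed units (degrees per second):

```python
def integrate_gyro(samples, dt):
    """Accumulate gyroscope angular-rate samples into a 3DoF pose.
    Each sample is (yaw_rate, pitch_rate, roll_rate) in degrees per
    second; dt is the sampling interval in seconds. Plain integration
    drifts over time, which is why pose sensors typically also carry
    an electronic compass and an acceleration sensor for correction."""
    yaw = pitch = roll = 0.0
    for yaw_rate, pitch_rate, roll_rate in samples:
        yaw += yaw_rate * dt
        pitch += pitch_rate * dt
        roll += roll_rate * dt
    return yaw, pitch, roll


# One second of turning the head at 10 deg/s about the yaw axis:
pose = integrate_gyro([(10.0, 0.0, 0.0)] * 50, dt=0.02)
```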
203, displaying a second image according to the pose information, the second image being a part of the source image and different from the first image. The second image may include the auxiliary display information described above.
In some embodiments, the region of the second image in the source image intersects the region of the first image in the source image. Referring to fig. 9 and fig. 10, fig. 9 is a first display schematic diagram and fig. 10 a second display schematic diagram of a near-eye display device provided in embodiments of the present application. Before the image display update instruction is issued, the user observes the first image on the near-eye display device, as shown in fig. 9. After the instruction is responded to, the user observes the second image, which is a part of the source image and different from the first image: as shown in fig. 10, its position in the source image differs from that of the first image, and the two regions intersect. Here the first image includes information of the fitness application and the second image includes auxiliary display information. In a practical scenario, when the user needs to check the battery parameter in the auxiliary display information, the user triggers the corresponding signal through the trigger module; the pose sensor collects the pose information of the near-eye display device, and the battery parameter is displayed according to that pose information. At this point the second image may include part of the application information of the first image together with the auxiliary display information, and the intersection of the two regions is the partial region corresponding to that application information.
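One plausible way to derive the second image's region from the pose information is to map the head rotation to a viewport shift and then check that the shifted region still intersects the first image's region. The linear degrees-to-pixels mapping below is an assumption, not the patent's formula:

```python
def pose_to_offset(yaw_deg, pitch_deg, fov_deg, view_px):
    """Map a head rotation to a pixel shift of the viewport inside the
    source image: rotating through one full field of view pans the view
    by one full viewport width (an assumed linear mapping)."""
    px_per_deg = view_px / fov_deg
    return round(yaw_deg * px_per_deg), round(-pitch_deg * px_per_deg)


def regions_intersect(a, b):
    """True if two (x, y, w, h) regions of the source image overlap, as
    required of the first and second images in this embodiment."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah
```

For a 40° FOV and a 640-pixel-wide viewport, a 20° yaw pans the viewport half its width, so the first and second images share half their area.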
In some embodiments, the second image includes a first sub-image region and a second sub-image region, the second sub-image region being adjacent to the first, and the first sub-image region lying within the intersection of the second image and the first image in the source image. As shown in fig. 9 and fig. 10, the first sub-image region may contain part of the application information and the second sub-image region the auxiliary display information.
In some embodiments, the second sub-image region of the second image is adjacent to one side of the edge of the first image, as shown in fig. 10; in other embodiments it is adjacent to two sides of the first image, as shown in fig. 11.
In some embodiments, before the second image is displayed according to the pose information, the near-eye display device may not display the first image at all. For example, when the device is in a low-power-consumption mode or a low-battery state, the visible area is black; when the user generates an image display update instruction through a trigger signal, the pose information is acquired through the pose sensor and the second image is displayed according to it. In this way the user can still read the parameters of the near-eye display device in a low-power mode, which increases the readability of the auxiliary display information.
In the information display method above, a first image that is a part of the source image is displayed on the near-eye display device in advance; after an image display update instruction is responded to, the pose information of the device is acquired, and a second image that is also a part of the source image is displayed according to it. The second image can carry information that does not need to be viewed in real time and is shown, according to the device pose, only when needed, so the display content is flexible and the display effect of the near-eye display device is improved.
Referring to fig. 12, fig. 12 is a second flowchart of the information display method according to an embodiment of the present application. The information display method includes:
301, displaying a first image, the first image being a part of the source image.
The near-eye display device may store a source image containing the image information to be displayed by the display device. In some embodiments, the source image may instead be generated on a server or on another electronic device connected to the near-eye display device and sent to it by wired or wireless transmission. The other electronic device may be, for example, a smartphone, a tablet computer, a personal digital assistant (PDA), a smart watch, or a smart band.
The first image may be the image information that the user currently needs to view; for example, it may include application information, communication information, audio information, or video information of the near-eye display device or of an electronic device connected to it. The source image may include other information besides the first image, such as auxiliary display information that the user does not need to view in real time. The auxiliary display information may be parameter information of the near-eye display device and/or of the connected electronic device, such as a battery parameter, a network parameter, a time parameter, an audio playing parameter, or a display parameter. In the source image, the auxiliary display information may be placed around the first image, for example along one side of its edge or all around it.
As shown in fig. 4 to fig. 6, fig. 4 is a first schematic diagram of a source image provided by an embodiment of the present application, fig. 5 is a second schematic diagram, and fig. 6 is a third schematic diagram. In all three examples the first image includes application information related to a fitness application; they differ in the auxiliary display information. In fig. 4 it includes a battery parameter and a time parameter; in fig. 5 a battery parameter, a time parameter, and a network parameter; and in fig. 6 a battery parameter, a network parameter, a time parameter, an audio playing parameter, a display parameter, and a notification message parameter, where the audio playing parameter may include a volume parameter and the playing state of the current audio application, and the display parameter may include a display brightness parameter. The first image and the auxiliary display information in the drawings are only exemplary; both can be set according to actual requirements.
Taking the source image of fig. 6 as an example, the source image may include information related to the fitness application and auxiliary display information: a battery parameter, a time parameter, and a network parameter adjacent to the top of the fitness application; a music playing parameter and a notification message parameter adjacent to the bottom; a volume parameter adjacent to the left side; and a brightness parameter adjacent to the right side. Parameters that the user does not need to view in real time are hidden, and before the trigger signal of the image display update instruction fires, only the fitness application information that needs real-time viewing is shown in the visible area.
In some embodiments, the auxiliary display information may also be image information related to the content of the first image; for example, where the first image includes application information, the auxiliary display information may include information related to that application information. For example, the first image includes the main information of the fitness application, and the auxiliary display information may be information related to the fitness application. Illustratively, the first image includes an icon of the fitness application, the current fitness item, calories burned, heart rate information, and the like, while the auxiliary display information may include the fitness duration and specific information of the current fitness item (when the current fitness item is running, the specific information may be kilometers run, step count, and the like). It may be understood that the auxiliary display information may be generated according to the specific content of the main image.
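Deriving auxiliary display information from the main image's content, as described above, can be sketched as follows. This is a minimal illustrative sketch; the dictionary keys and function name are assumptions, not part of the patent.

```python
def auxiliary_from_main(main_info):
    """Derive auxiliary display information from the specific content of
    the main image: always include the fitness duration, and add
    item-specific details (e.g. kilometers and steps for running)."""
    aux = {"duration": main_info.get("duration")}
    if main_info.get("item") == "running":
        aux["detail"] = {
            "kilometers": main_info.get("kilometers"),
            "steps": main_info.get("steps"),
        }
    return aux
```

For instance, a running session would yield both the duration and the running-specific detail block, while other fitness items would yield only the duration.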
In some embodiments, the auxiliary display information may further include image information related to an external image signal acquired by the near-eye display device. The near-eye display device may further include a camera configured to acquire external image information so as to implement an interaction function. The first image generated after the user opens the interaction function may be the image information acquired by the camera, and the auxiliary display information may be recognition results obtained from the content of the first image. Illustratively, the user opens the camera of the near-eye display device to acquire external image information in real time; recognition such as text recognition, face recognition, object recognition, and scene recognition is performed on the image information; and the recognized information is set as the auxiliary display information around the first image. When the user needs to view this information, it may be viewed by triggering the trigger signal of the image display update instruction.
It is understood that the first image and the auxiliary display information in the illustration are only exemplary, the first image may be image information that is set according to actual requirements and currently needs to be viewed by a user, and the auxiliary display information may be information that is set according to actual requirements and does not need to be viewed by the user in real time.
302, in response to the image display update instruction, determining a target area from the source image according to the pose information.
The image display update instruction may be generated according to a trigger signal. The trigger signal may be triggered by the user through a trigger module and may include at least one of a touch signal, a voice signal, an image signal, and an action signal. The trigger module may be a touch sensor configured to acquire the touch signal, where the touch signal may be a press signal and/or a slide signal. The trigger module may also be an audio acquisition sensor, such as a microphone, configured to acquire a voice signal, where the voice signal is an audio signal acquired by the microphone that meets a preset condition. The trigger module may also be an image acquisition sensor, such as a camera, configured to acquire an image signal that meets a preset condition. The trigger module may also be a motion acquisition sensor, such as a pose sensor, where the action signal is a motion signal acquired by the pose sensor that meets a preset condition. It can be understood that other types of trigger modules and corresponding trigger signals can be set according to actual requirements. If the trigger signal is received, an image display update instruction is generated according to the trigger signal, and the pose information of the near-eye display device is acquired.
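The mapping from trigger signals to the image display update instruction can be sketched minimally as follows. The signal-type names, the preset-condition flag, and the instruction dictionary are illustrative assumptions, not part of the patent.

```python
# Supported trigger signal types, per the description above.
TRIGGER_TYPES = {"touch", "voice", "image", "action"}

def make_update_instruction(signal_type, meets_preset_condition=True):
    """Generate an image display update instruction when a supported
    trigger signal that satisfies its preset condition is received;
    otherwise generate nothing."""
    if signal_type in TRIGGER_TYPES and meets_preset_condition:
        return {"instruction": "image_display_update", "source": signal_type}
    return None
```

A voice signal that fails its preset condition (e.g. too quiet to match), or an unsupported signal type, produces no instruction.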
As shown in fig. 7, the trigger module may be a touch key 14 disposed on the wearing component 11. The touch key 14 includes a touch sensor and is configured to receive a touch signal triggered by the user; the user may trigger a corresponding trigger signal by pressing, touching, approaching, and/or sliding the touch key.
It can be understood that other types of trigger modules and corresponding trigger signals can be set according to actual requirements. If the trigger signal is received, an image display update instruction is generated, so that the near-eye display device responds to the image display update instruction and acquires its pose information.
The near-eye display device may include a pose sensor that acquires the pose information of the near-eye display device. Please refer to fig. 1, fig. 7, and fig. 8; fig. 7 is a first view of a rotation scene of the near-eye display device according to an embodiment of the present disclosure, and fig. 8 is a second view of the rotation scene. The pose sensor 13 is used to detect pose information of the near-eye display device 10 and may include a gyroscope, an electronic compass, an acceleration sensor, and/or a Hall sensor. The pose sensor 13 may implement detection with 3 degrees of freedom (3DoF) or 6 degrees of freedom (6DoF) for the near-eye display device. Taking a near-eye display device implementing 3DoF as an example, the device may detect, through the pose sensor 13, pose information of rotation about a first degree of freedom, a second degree of freedom, and a third degree of freedom.
The position change information of the near-eye display device can be obtained by converting the pose information acquired by the pose sensor within a preset time period; the position change information may describe the change of the position of the near-eye display device in space. Because the visible area of the near-eye display device changes as the position of the device changes, there is a correspondence between the position information of the near-eye display device and the position information of its visible area, and the position information of the visible area in space can be obtained from the position information of the device. When the near-eye display device moves, the position of the visible area in space changes, and the target area is determined from the source image according to the change of the visible-area position.
For example, when the user wears the near-eye display device and rotates the head, the position of the near-eye display device in space changes. This change is converted into a change of the visible-area position, and a target area is determined from the source image according to that position change; the target area may include part of the first image and the auxiliary display information. The second image corresponding to the target area can then be projected by the projector into the visible area for display, so the user can view the auxiliary information in the visible area by changing the position of the near-eye display device.
In some embodiments, the target area may be determined by:
acquiring coordinate information, on the source image, of the first image displayed in the visible area;
acquiring pose information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display device according to the pose information within the preset time period;
determining the target area from the source image according to the position change information and the coordinate information.
For example, the coordinate information on the source image of all the image information of the first image currently displayed in the visible area is obtained; this coordinate information may be the coordinate information of a target frame generated on the source image according to the size of the visible area. The position variation of the near-eye display device is calculated from the pose information, and the corresponding position variation of the visible area is obtained from it. The coordinate information of the target area is then calculated from the position variation of the visible area and the coordinate information of the target frame, and the image of the target area is determined from that coordinate information. The target area includes part of the area of the first image in the source image and at least part of the area of the auxiliary display information in the source image.
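The target-frame shift just described can be sketched as follows. The coordinate convention (origin at the top-left of the source image), the gain factor, and all names are assumptions for illustration; the patent does not specify the arithmetic.

```python
def clamp(v, lo, hi):
    """Keep a coordinate inside the source-image bounds."""
    return max(lo, min(hi, v))

def target_area(frame_xy, frame_wh, source_wh, pose_delta, gain=1.0):
    """Shift the visible-area target frame on the source image by the
    position variation derived from the pose information.

    frame_xy   -- (x, y) of the current target frame (first image) on the source
    frame_wh   -- (w, h) of the visible-area target frame
    source_wh  -- (w, h) of the full source image
    pose_delta -- (dx, dy) visible-area position variation from pose data
    """
    x, y = frame_xy
    w, h = frame_wh
    sw, sh = source_wh
    dx, dy = pose_delta
    # Clamp so the target area never leaves the source image.
    nx = clamp(x + gain * dx, 0, sw - w)
    ny = clamp(y + gain * dy, 0, sh - h)
    return (nx, ny, w, h)
```

Clamping reflects the fact that the target area is always determined from within the source image: rotating further than the auxiliary bands extend cannot scroll past the source-image edge.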
303, displaying the second image corresponding to the target area in the visible area.
Wherein the second image belongs to a part of the source image and is different from the first image. The second image may include the auxiliary display information described above, and in some embodiments the area of the second image in the source image intersects the area of the first image in the source image. Referring to fig. 9 and fig. 10, fig. 9 is a first display schematic diagram of a near-eye display device provided in an embodiment of the present application, and fig. 10 is a second display schematic diagram. Before the image display update instruction is responded to, as shown in fig. 9, the user observes the first image through the near-eye display device; after the instruction is responded to, the user observes the second image, which belongs to a part of the source image and is different from the first image. For example, as shown in fig. 10, the position of the second image in the source image differs from the position of the first image in the source image, and the area of the second image in the source image intersects the area of the first image in the source image.
The first image includes information of the fitness application, and the second image includes auxiliary display information. In an actual application scenario, when the user needs to check the power parameter in the auxiliary display information, the corresponding trigger signal can be triggered through the trigger module, the pose information of the near-eye display device is collected through the pose sensor, and the power parameter the user needs to view is displayed according to the pose information; at this time, the second image may include part of the application information of the first image together with the auxiliary display information. The intersection of the area of the first image in the source image with the area of the second image in the source image is the partial area corresponding to the application information.
In some embodiments, the second image comprises a first sub-image region and a second sub-image region, the second sub-image region of the second image being located adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image. As shown in fig. 9 and 10, the first sub-image region may be a region including partial application information, the second sub-image region may be a region including auxiliary display information, and the first sub-image region is located in an intersection region of the second image and the first image in the source image.
In some embodiments, the second sub-image region of the second image is disposed adjacent to one side of the edge of the first image, as shown in fig. 10; in other embodiments, the second sub-image region of the second image is disposed adjacent to two sides of the first image, as shown in fig. 11.
In some embodiments, the first image and the second image displayed in the visible area are the same size. For example, the first image and the second image displayed in the visible area have the same image parameters such as resolution, brightness, and transparency; as another example, the size of the area of the first image on the source image is the same as the size of the area of the second image on the source image; as a further example, the positions and/or display sizes of the first image and the second image in the visible area are the same. Of course, for display diversity, the second image may be displayed in the visible area in a manner adjusted according to the user's preference, that is, the sizes of the first image and the second image displayed in the visible area may differ.
In an actual application scenario, please continue to refer to fig. 1, fig. 7, and fig. 13 to fig. 17, fig. 13 is a schematic view of a first application scenario of the near-eye display device according to the embodiment of the present application, fig. 14 is a schematic view of a second application scenario of the near-eye display device according to the embodiment of the present application, and fig. 15 is a schematic view of a third application scenario of the near-eye display device according to the embodiment of the present application. Fig. 16 is a schematic view of a fourth application scenario of a near-eye display device according to an embodiment of the present application. Fig. 17 is a schematic view of a fifth application scenario of a near-eye display device according to an embodiment of the present application.
As shown in fig. 13, after the user wears the near-eye display device and the near-eye display device 10 is started, the device displays a first image, which may include currently viewed application information and the like. The first image is part of a source image; besides the application information corresponding to the first image, the source image may include auxiliary display information, including a power parameter, a time parameter, and a network parameter adjacent to the top of the first image, a music playing parameter and a notification message parameter adjacent to the bottom of the first image, a volume parameter adjacent to the left of the first image, and a brightness parameter adjacent to the right of the first image. The parameters that the user does not need to view in real time are hidden; before the image display update instruction is responded to, all information of the first image may be displayed in the visible area 122, and the first image may include application information that the user needs to view in real time, such as information related to a fitness application.
After the user triggers the related trigger signal through the trigger module, the near-eye display device responds to the image display update instruction corresponding to that trigger signal. The user may trigger the signal in the manner shown in fig. 7: the trigger module may be a touch key 14 disposed on the wearing component 11, where the touch key 14 includes a touch sensor and is configured to receive the touch signal triggered by the user; the user may press the touch key with a finger to trigger a press signal, thereby triggering the trigger signal corresponding to the image display update instruction.
After the user triggers the trigger signal corresponding to the image display update instruction, the user may rotate the head so that the near-eye display device 10 acquires the pose information through the pose sensor. As shown in fig. 14, after the user rotates the head to the left, the near-eye display device determines a target area from the source image according to the corresponding pose information. The target area includes the volume parameter of the auxiliary display information and part of the fitness application information of the first image; the image information of the volume parameter and of the partial fitness application information corresponding to the target area is displayed in the visible area, and part of the fitness application information originally displayed in the visible area is hidden. The user can thus view, by rotating the near-eye display device, a volume parameter that pre-exists in the source image but was not visible. The volume parameter may be a volume parameter of an audio file currently played by the near-eye display device or by another electronic device connected to it, or a volume parameter when the near-eye display device is in a call state.
The user may rotate the head to the right. As shown in fig. 15, after the user rotates the head to the right, the near-eye display device determines a target area from the source image according to the corresponding pose information. The target area includes the brightness parameter of the auxiliary display information and part of the fitness application information of the first image; the corresponding image information is displayed in the visible area, part of the fitness application information originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible brightness parameter by rotating the near-eye display device. The brightness parameter may be the brightness of the display picture in the visible area of the near-eye display device.
The user may rotate the head upward. As shown in fig. 16, after the user rotates the head upward, the near-eye display device determines a target area from the source image according to the corresponding pose information. The target area includes the power parameter, the time parameter, and the network parameter of the auxiliary display information together with part of the fitness application information of the first image; the corresponding image information is displayed in the visible area, part of the fitness application information originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible power, time, and network parameters by rotating the near-eye display device. The power parameter may be the battery level of the near-eye display device, and the network parameter may be information about the network the near-eye display device is connected to, such as signal strength and network type.
The user may rotate the head downward. As shown in fig. 17, after the user rotates the head downward, the near-eye display device determines a target area from the source image according to the corresponding pose information. The target area includes the audio playing parameter and the notification message of the auxiliary display information together with part of the fitness application information of the first image; the corresponding image information is displayed in the visible area, part of the fitness application information originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible audio playing parameter and notification message by rotating the near-eye display device. The notification message may be an application notification received by the near-eye display device, such as a notification from an instant messaging application, a sports application, or a shopping application.
It should be noted that different amounts of head rotation reveal different amounts of the pre-existing auxiliary image information in the visible area: the amount of auxiliary content the user sees is positively correlated with the amount of head rotation.
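The positive correlation between rotation amount and revealed content can be modeled, purely illustratively, as a clamped linear mapping. The 30-degree saturation threshold and the function name are assumed values for the sketch, not specified in the patent.

```python
def revealed_fraction(rotation_deg, max_rotation_deg=30.0):
    """Fraction of the hidden auxiliary information brought into the
    visible area: grows monotonically with the head-rotation amount and
    saturates once the auxiliary band is fully shown."""
    return min(abs(rotation_deg) / max_rotation_deg, 1.0)
```

A small rotation thus peeks at the edge of the auxiliary band, while any rotation at or beyond the threshold shows it completely, in either direction.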
In some embodiments, after the user finishes viewing the desired message, an exit signal may be triggered to generate an image reset instruction. After the near-eye display device responds to the image reset instruction, it switches from displaying the second image back to displaying the first image, for example by moving out and hiding the power parameter, time parameter, date parameter, network parameter, volume parameter, audio playing parameter, notification message, or brightness parameter currently displayed in the visible area, and displaying all information of the first image in the visible area so as to restore the initial state.
When the trigger signal is triggered by a long press of the touch key, the exit signal can be triggered when the touch key is released. When the user needs to view an auxiliary image, the user only needs to press and hold the touch key and then rotate the head to view the desired auxiliary display information; after the finger is released, the display is restored to the initial state. The user can thus quickly check the desired information, which improves the efficiency of viewing the auxiliary display information. Of course, the exit signal may also be another signal; for example, when the trigger signal is a touch signal triggered by double-clicking the touch key, the exit signal may be triggered by single-clicking the touch key.
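The press-and-hold interaction described above amounts to a two-state machine: holding the touch key enables viewing the auxiliary display information, and releasing it resets the display to the first image. The class and method names below are illustrative assumptions.

```python
class AuxiliaryViewState:
    """Tiny state machine for the long-press interaction: pressing the
    touch key corresponds to the image display update instruction,
    releasing it corresponds to the exit signal / image reset instruction."""

    def __init__(self):
        self.showing_auxiliary = False  # initial state: first image only

    def on_key_pressed(self):
        # Trigger signal -> image display update instruction.
        self.showing_auxiliary = True

    def on_key_released(self):
        # Exit signal -> image reset instruction, restore first image.
        self.showing_auxiliary = False
```

With a double-click trigger scheme, the same two states would instead be toggled by double-click and single-click events.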
In the description of the embodiments of the present application, it should be understood that the terms "upper", "lower", "left", "right", "top", and "bottom" indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the present application and simplifying the description; they do not indicate or imply that the displayed image information must have a specific orientation or be configured and operated in a specific orientation, and thus are not to be construed as limiting the present application.
According to the information display method provided by the embodiments of the present application, the auxiliary image information that the user does not need to view in real time is generated in advance around the main image. When the auxiliary image function is not enabled, the visible area displays only the main image the user needs to view, so that when the user wears the near-eye display device in daily life the auxiliary image is not displayed in front of the user's eyes in real time and does not interfere with the user's field of view; the display content of the main image in the visible area can also be increased, improving the display effect. In actual operation, when the user wants to view a hidden auxiliary image, the user only needs to touch the temple of the glasses and then rotate the head to see the hidden auxiliary image in the visible area. The physical touch gives the user confirmation at the operation level, and seeing hidden content by moving the head gives a corresponding spatial visual sense, which helps reduce the user's cognitive burden and the dizziness that near-eye reading may cause. When the user releases the finger, the auxiliary image viewing function exits, so that the near-eye display device restores the initially displayed main image, improving the efficiency of viewing the hidden auxiliary image.

The near-eye display device provided by the embodiments of the present application may further include a processor and a storage medium, with the processor electrically connected to the memory. The processor is the control center of the near-eye display device: it connects the various parts of the entire device through various interfaces and lines, and performs the various functions of the device and processes data by running or loading a computer program stored in the memory and calling data stored in the memory.
The memory may be used to store software programs and modules, and the processor performs various functional applications and data processing by running the computer programs and modules stored in the memory. The memory may mainly include a program storage area and a data storage area: the program storage area may store an operating system and a computer program required by at least one function (such as a sound playing function or an image playing function), and the data storage area may store data created according to the use of the electronic device, and the like.
Further, the memory may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory may also include a memory controller to provide the processor with access to the memory.
In this embodiment of the present application, the processor in the near-eye display device loads the instructions corresponding to the processes of one or more computer programs into the memory, and runs the computer programs stored in the memory, thereby implementing the following functions:
displaying a first image, the first image being part of a source image;
acquiring attitude information of the near-to-eye display device in response to an image display updating instruction;
and displaying a second image according to the attitude information, wherein the second image belongs to a part of the source image and is different from the first image.
In some embodiments, the region of the second image in the source image intersects the region of the first image in the source image.
In some embodiments, the second image comprises a first sub-image region and a second sub-image region, the second sub-image region of the second image being located adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image.
In some embodiments, the second sub-image area of the second image is disposed adjacent to the first image edge side.
In some embodiments, the near-eye display device comprises a visible area, and when displaying the second image according to the pose information, the processor is configured to perform:
determining a target area from the source image according to the attitude information;
and displaying a second image corresponding to the target area in the visual area.
In some embodiments, the first image and the second image displayed in the viewable area are the same size.
In some embodiments, the second image comprises auxiliary display information comprising one or more of parameter information of the near-eye display device and/or parameter information of an electronic device to which the near-eye display device is connected.
In some embodiments, the parameter information includes one or more of a power parameter, a network parameter, a time parameter, an audio playback parameter, a display parameter, and a notification message parameter.
In some embodiments, after displaying the first image, the processor is further configured to perform:
acquiring coordinate information of a first image displayed in the visible area on the source image;
when the pose information of the near-eye display device is acquired and the second image is displayed according to the pose information, the processor is further configured to:
acquiring pose information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display device according to the pose information of the near-eye display device within the preset time period;
determining a target area from the source image according to the position change information and the coordinate information;
and displaying a second image corresponding to the target area in the visual area.
In some embodiments, after displaying the second image according to the pose information, the processor is further configured to perform:
in response to the received image display reset instruction, the near-eye display device switches the displayed second image to display the first image.
In some embodiments, the trigger signal of the image display update instruction includes at least one of a touch signal, a voice signal, an image signal, and an action signal.
In some embodiments, the near-eye display device includes a wearing component provided with a touch module, the touch module being configured to receive the touch signal. After the touch module receives the touch signal of the image display update instruction, the near-eye display device responds to the image display update instruction and acquires the pose information of the near-eye display device.
The near-eye display device 10 provided in the embodiments of the present application may further include devices such as a camera, a microphone, a speaker, and an LED lamp to extend its functions (for example, playing songs through the speaker, or taking pictures through the camera, with the processor displaying interactive data related to the current picture in the visible area according to the pictures taken by the camera), thereby improving the functionality and playability of the near-eye display device 10. It can be understood that the near-eye display device may perform data interaction with other electronic devices (e.g., a mobile phone, a tablet computer, a smart watch, or a smart vehicle), and the near-eye display device 10 implements corresponding functions according to instructions from those electronic devices.
Referring to fig. 18 and fig. 19, fig. 18 is a third flow chart of the information display method according to the embodiment of the present application. Fig. 19 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The embodiment of the present application further provides an information display method applied to an electronic device. The electronic device may be, for example, a smart phone, a tablet computer, a personal digital assistant (PDA), a smart watch, or a smart bracelet; as shown in fig. 19, the electronic device may be a smart phone. The information display method includes:
401, sending a first image to a near-eye display device, the first image belonging to a part of a source image.
The electronic device may store a source image, and the source image may include image information to be displayed by the near-eye display device. In some embodiments, the source image may also be generated in a server and sent to the electronic device through wired or wireless data transmission. The first image may be the image information that the user currently needs to view; for example, the first image may include application information, communication information, audio information, video information, and the like of the near-eye display device or of an electronic device connected to the near-eye display device. The source image may include other information besides the first image, such as auxiliary display information that the user does not need to view in real time. The auxiliary display information may be parameter information of the near-eye display device and/or parameter information of the electronic device connected to the near-eye display device, such as a power parameter, a network parameter, a time parameter, an audio playing parameter, and a display parameter. The position of the auxiliary display information in the source image may be around the position of the first image, for example on one side of the first image's edge, or all around the first image's edge.
As shown in fig. 4 to 6, fig. 4 is a first schematic view of a source image provided by an embodiment of the present application, fig. 5 is a second schematic view of the source image, and fig. 6 is a third schematic view of the source image. In the examples of fig. 4 to 6, the first image includes application information related to a fitness application. The examples differ in their auxiliary display information: in fig. 4 it includes a power parameter and a time parameter; in fig. 5 it includes a power parameter, a time parameter, and a network parameter; and in fig. 6 it includes a power parameter, a network parameter, a time parameter, an audio playing parameter, a display parameter, and a notification message parameter. The audio playing parameter may include a volume parameter and a current audio playing state parameter, and the display parameter may include a display brightness parameter. It is to be understood that the first image and the auxiliary display information in the drawings are only exemplary; the first image may be any image information that the user currently needs to view, and the auxiliary display information may be any information that the user does not need to view in real time, both set according to actual requirements.
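The layout described above, a first image with auxiliary display information placed along its edges, can be sketched as follows. This is an illustration only: the `Rect` helper, the strip placement, and all names are assumptions, not part of the embodiment.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int  # left edge in source-image coordinates
    y: int  # top edge
    w: int  # width
    h: int  # height

def layout_source_image(src_w, src_h, first_w, first_h, margin):
    """Centre the first image in the source image and reserve strips
    around its edges for auxiliary display information (power, time,
    network parameters, and the like)."""
    fx = (src_w - first_w) // 2
    fy = (src_h - first_h) // 2
    first = Rect(fx, fy, first_w, first_h)
    # Auxiliary strips sit adjacent to the first image's edges.
    aux = {
        "top":    Rect(fx, fy - margin, first_w, margin),
        "bottom": Rect(fx, fy + first_h, first_w, margin),
        "left":   Rect(fx - margin, fy, margin, first_h),
        "right":  Rect(fx + first_w, fy, margin, first_h),
    }
    return first, aux
```

For a 1920×1080 source image with a 1280×720 first image, for example, the auxiliary information would occupy 60-pixel bands along each edge of the first image.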
And 402, acquiring the posture information of the near-eye display device in response to the image display updating instruction.
The image display update instruction may be generated according to a trigger signal, and the trigger signal may be triggered by a user through a trigger module. The trigger signal may include at least one of a touch signal, a voice signal, an image signal, and an action signal. The trigger module may be a touch sensor configured to acquire the touch signal, where the touch signal may be a press signal and/or a slide signal. The trigger module may also be an audio acquisition sensor, such as a microphone, configured to acquire a voice signal, where the voice signal is an audio signal acquired by the microphone that meets a preset condition. The trigger module may also be an image acquisition sensor, such as a camera, configured to acquire an image signal, where the image signal is acquired by the camera and meets a preset condition. The trigger module may also be a motion acquisition sensor, such as an attitude sensor, where the action signal is an action signal acquired by the attitude sensor that meets a preset condition. It can be understood that other types of trigger modules and corresponding trigger signals can be set according to actual requirements. If the trigger signal is received, an image display update instruction is generated according to the trigger signal; the electronic device receives the image display update instruction and, in response to it, acquires the posture information of the near-eye display device.
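As an illustration of the dispatch just described, a trigger signal that meets its preset condition might be mapped to an image display update instruction as follows. The signal encodings, the wake phrase, and the 15-degree motion threshold are assumptions for illustration only.

```python
def make_update_instruction(signal):
    """Return an image-display-update instruction if the trigger signal
    satisfies its preset condition, else None."""
    kind, payload = signal
    if kind == "touch":                  # press and/or slide signal
        triggered = payload in ("press", "slide")
    elif kind == "voice":                # audio meeting a preset condition
        triggered = payload == "update display"   # assumed wake phrase
    elif kind == "motion":               # attitude change above a threshold
        triggered = abs(payload) > 15.0  # degrees; assumed preset condition
    else:
        triggered = False
    return {"type": "image_display_update"} if triggered else None
```

A press on the touch module would thus produce an instruction, while a small incidental head motion below the threshold would not.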
The near-eye display device may include an attitude sensor, and the attitude sensor may acquire posture information of the near-eye display device. Please refer to fig. 1, fig. 7, and fig. 8; fig. 7 is a first view of a rotation scene of the near-eye display device according to an embodiment of the present disclosure, and fig. 8 is a second view of a rotation scene of the near-eye display device according to an embodiment of the present application. The attitude sensor 13 is used to detect the posture information of the near-eye display device 10 and may include a gyroscope, an electronic compass, an acceleration sensor, and/or a Hall sensor. The attitude sensor 13 can realize 3-degree-of-freedom detection (3 Degrees of Freedom, 3DOF) or 6-degree-of-freedom detection (6 Degrees of Freedom, 6DOF) of the near-eye display device. Taking a near-eye display device capable of 3DOF detection as an example, the near-eye display device can detect, through the attitude sensor 13, posture information of rotation in a first degree of freedom, a second degree of freedom, and a third degree of freedom. After the attitude sensor collects the posture information of the near-eye display device, it sends the posture information to the electronic device, so that the electronic device obtains the posture information of the near-eye display device.
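A minimal sketch of 3DOF posture information and of the rotation between two samples, assuming yaw, pitch, and roll in degrees as the three degrees of freedom (the field names are illustrative and not taken from the embodiment):

```python
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    yaw: float    # rotation in the first degree of freedom, degrees
    pitch: float  # rotation in the second degree of freedom
    roll: float   # rotation in the third degree of freedom

def pose_delta(start, end):
    """Rotation of the near-eye display device between two samples,
    e.g. the start and end of a preset time period."""
    return Pose3DOF(end.yaw - start.yaw,
                    end.pitch - start.pitch,
                    end.roll - start.roll)
```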
And 403, determining a second image according to the attitude information, wherein the second image belongs to a part of the source image and is different from the first image.
And 404, sending the second image to the near-eye display device.
With respect to steps 403-404:
the electronic device determines a second image according to the acquired posture information of the near-eye display device. The second image belongs to a part of the source image and is different from the first image, and the second image is sent to the near-eye display device so that the near-eye display device can display it.
In some embodiments, the region of the second image in the source image intersects the region of the first image in the source image. Referring to fig. 9 and 10, fig. 9 is a first display schematic diagram of a near-eye display device provided in an embodiment of the present application, and fig. 10 is a second display schematic diagram of the near-eye display device. Before the image display update instruction is responded to, as shown in fig. 9, the user observes the first image through the near-eye display device; after the image display update instruction is responded to, the user observes the second image, which belongs to a part of the source image and is different from the first image. For example, as shown in fig. 10, the position of the second image in the source image is not the same as the position of the first image in the source image, and their regions in the source image intersect. The first image includes information of a fitness application, and the second image includes auxiliary display information. In a practical application scene, when the user needs to check the power parameter in the auxiliary display information, the user can trigger a corresponding trigger signal through the trigger module; the posture information of the near-eye display device is then collected through the attitude sensor, and the power parameter the user needs to check is displayed according to the posture information. At this point, the second image may include part of the application information of the first image together with the auxiliary display information, and the intersection of the two regions in the source image is the partial region corresponding to that application information.
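The intersection condition between the first image's region and the second image's region in the source image amounts to a rectangle-overlap test; representing each region as an `(x, y, w, h)` tuple in source-image coordinates is an assumption made for illustration:

```python
def rects_intersect(a, b):
    """True when two regions of the source image, each given as
    (x, y, w, h), share a non-empty intersection."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    # Overlap requires each rectangle to start before the other ends,
    # on both axes; regions that merely touch at an edge do not intersect.
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah
```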
In some embodiments, the second image comprises a first sub-image region and a second sub-image region, the second sub-image region of the second image being located adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image. As shown in fig. 9 and 10, the first sub-image region may be a region including partial application information, the second sub-image region may be a region including auxiliary display information, and the first sub-image region is located in an intersection region of the second image and the first image in the source image.
In some embodiments, the second sub-image region of the second image is disposed adjacent to one side of the edge of the first image, as shown in fig. 10, where the second sub-image region is adjacent to one side of the first image. In other embodiments, the second sub-image region of the second image is disposed adjacent to two sides of the first image, as shown in fig. 11.
In some embodiments, before displaying the second image according to the posture information, the near-eye display device may not display the first image. For example, when the near-eye display device is in a low power consumption mode or a low power state, the visible area is in a black-screen state. When the user generates an image display update instruction through the trigger signal, the posture information of the near-eye display device is acquired through the attitude sensor, and the second image is displayed according to the posture information. In this way, even in the low power consumption mode or low power state, the user can still read the parameters of the near-eye display device, which increases the readability of the auxiliary display information.
According to the information display method described above, the electronic device first has the near-eye display device display a first image belonging to a part of the source image; after responding to the image display update instruction, it obtains the posture information of the near-eye display device and displays, according to the posture information, a second image that also belongs to a part of the source image. The second image can include information that does not need to be checked in real time and is displayed according to the posture information of the near-eye display device only when the user needs to check it, so the display content is highly flexible and the display effect of the near-eye display device is improved.
In some embodiments, determining the second image according to the posture information may include: determining a target area from the source image according to the posture information. Sending the second image to the near-eye display device includes: the near-eye display device includes a visible area, and the second image corresponding to the target area is sent to the near-eye display device, so that the near-eye display device displays the second image corresponding to the target area in the visible area.
In some embodiments, the first image and the second image displayed in the viewable area are the same size.
In some embodiments, the second image includes auxiliary display information including one or more of parameter information of an electronic device and/or parameter information of a near-eye display device to which the electronic device is connected.
In some embodiments, the parameter information includes one or more of a power parameter, a network parameter, a time parameter, an audio playback parameter, a display parameter, and a notification message parameter.
In some embodiments, after sending the first image to the near-eye display device, further comprising:
acquiring coordinate information of a first image displayed in the visible area on the source image;
acquiring the attitude information of the near-eye display device, determining a second image according to the attitude information, and sending the second image to the near-eye display device comprises the following steps:
acquiring posture information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display equipment according to the posture information of the near-eye display equipment in the preset time period;
determining a target area from the source image according to the position change information and the coordinate information;
and sending the second image corresponding to the target area to the near-eye display device.
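The steps above, combining the coordinate information of the first image with the position change of the device over the preset time period to determine a target area inside the source image, can be sketched as follows. The pixels-per-degree scale factor and the sign conventions are assumptions for illustration, not values from the embodiment.

```python
def target_area(first_xy, view_wh, src_wh, d_yaw, d_pitch, px_per_degree=20.0):
    """Shift the visible-area viewport across the source image according to
    the position change of the near-eye display device, clamped so the
    target area stays inside the source image bounds.

    first_xy  -- coordinate information of the first image in the source image
    view_wh   -- size of the visible area (first and second image share it)
    src_wh    -- size of the source image
    d_yaw, d_pitch -- posture change over the preset time period, degrees
    """
    x0, y0 = first_xy
    vw, vh = view_wh
    sw, sh = src_wh
    x = x0 + d_yaw * px_per_degree    # horizontal rotation pans horizontally
    y = y0 - d_pitch * px_per_degree  # upward pitch pans toward the top edge
    x = max(0, min(sw - vw, round(x)))
    y = max(0, min(sh - vh, round(y)))
    return x, y, vw, vh               # target area to crop and send
```

A large rotation simply saturates at the source image's edge, so the second image sent to the device always corresponds to a valid region of the source image.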
It can be understood that the electronic device determines the target area in a manner similar to that of the near-eye display device, and thus the description thereof is omitted here.
In some embodiments, after sending the second image to the near-eye display device, the information display method further comprises:
and sending the first image to the near-eye display device in response to the received image display reset instruction.
The electronic device responds to the image display reset instruction in a similar manner as the near-eye display device responds to the image reset instruction, and the details are not repeated here.
In this embodiment, a processor in the electronic device loads instructions corresponding to the processes of one or more computer programs into a memory, and runs the computer programs stored in the memory to implement the following functions:
sending a first image to a near-eye display device, the first image belonging to a portion of a source image;
acquiring attitude information of the near-to-eye display device in response to an image display updating instruction;
determining a second image according to the attitude information, wherein the second image belongs to a part of the source image, and the second image is different from the first image;
and sending the second image to a near-eye display device.
In some embodiments, the region of the second image in the source image intersects the region of the first image in the source image.
In some embodiments, the second image comprises a first sub-image region and a second sub-image region, the second sub-image region of the second image being located adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image.
In some embodiments, the second sub-image area of the second image is disposed adjacent to the first image edge side.
In some embodiments, when determining the second image from the pose information, the processor is further configured to:
determining a target area from the source image according to the attitude information;
sending the second image to a near-eye display device comprises:
the near-eye display equipment comprises a visual area, and the second image corresponding to the target area is sent to the near-eye display equipment, so that the near-eye display equipment displays the second image corresponding to the target area in the visual area.
In some embodiments, the first image and the second image displayed in the viewable area are the same size.
In some embodiments, the second image includes auxiliary display information including one or more of parameter information of an electronic device and/or parameter information of a near-eye display device to which the electronic device is connected.
In some embodiments, the parameter information includes one or more of a power parameter, a network parameter, a time parameter, an audio playback parameter, a display parameter, and a notification message parameter.
In some embodiments, after sending the first image to the near-eye display device, the processor is further configured to perform: acquiring coordinate information of a first image displayed in the visible area on the source image;
when acquiring the attitude information of the near-eye display device, determining a second image according to the attitude information, and sending the second image to the near-eye display device, the processor is further configured to execute:
acquiring posture information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display equipment according to the posture information of the near-eye display equipment in the preset time period;
determining a target area from the source image according to the position change information and the coordinate information;
and sending the second image corresponding to the target area to the near-to-eye display equipment.
In some embodiments, after sending the second image to the near-eye display device, the processor is further configured to perform: sending the first image to the near-eye display device in response to the received image display reset instruction.
The information display method, the near-eye display device, and the electronic device provided in the embodiments of the present application are described in detail above. The principles and implementations of the present application are described herein using specific examples, which are presented only to aid in understanding the present application. Meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (26)

1. An information display method applied to a near-eye display device, the method comprising:
displaying a first image, the first image being part of a source image;
responding to an image display updating instruction, and acquiring the posture information of the near-to-eye display equipment;
and displaying a second image according to the attitude information, wherein the second image belongs to a part of the source image and is different from the first image.
2. The information display method according to claim 1, characterized in that the region of the second image in the source image intersects the region of the first image in the source image.
3. The information display method according to claim 2, wherein the second image includes a first sub-image region and a second sub-image region, the second sub-image region of the second image being disposed adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image.
4. The information display method according to claim 3, wherein the second sub-image area of the second image is disposed adjacent to a side of an edge of the first image.
5. The information display method according to claim 1, wherein the near-eye display device includes a visible region, and the displaying the second image according to the posture information includes:
determining a target area from the source image according to the attitude information;
and displaying a second image corresponding to the target area in the visual area.
6. The information display method according to claim 5, wherein the first image and the second image displayed in the visual area are the same in size.
7. The information display method according to claim 1, wherein the second image includes auxiliary display information including one or more of parameter information of the near-eye display device and/or parameter information of an electronic device to which the near-eye display device is connected.
8. The information display method according to claim 7, wherein the parameter information includes one or more of a power parameter, a network parameter, a time parameter, an audio playing parameter, a display parameter, and a notification message parameter.
9. The information display method according to claim 1, further comprising, after displaying the first image:
acquiring coordinate information of a first image displayed in the visible area on the source image;
acquiring the posture information of the near-eye display device, and displaying the second image according to the posture information specifically comprises the following steps:
acquiring posture information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display equipment according to the posture information of the near-eye display equipment in the preset time period;
determining a target area from the source image according to the position change information and the coordinate information;
and displaying a second image corresponding to the target area in the visual area.
10. The information display method according to any one of claims 1 to 9, wherein after displaying the second image according to the posture information, the information display method further comprises:
in response to the received image display reset instruction, the near-eye display device switches from displaying the second image to displaying the first image.
11. The information display method according to any one of claims 1 to 9, wherein the trigger signal of the image display update instruction includes at least one of a touch signal, a voice signal, an image signal, and an action signal.
12. The information display method according to claim 11, wherein the near-eye display device includes a wearing component, the wearing component is provided with a touch module, the touch module is configured to receive the touch signal, and when the touch module receives the touch signal of the image display update instruction, the near-eye display device responds to the image display update instruction to obtain the posture information of the near-eye display device.
13. An information display method, applied to an electronic device that stores a source image, the method comprising:
sending a first image to a near-eye display device, the first image belonging to a portion of a source image;
acquiring attitude information of the near-to-eye display device in response to an image display updating instruction;
determining a second image according to the attitude information, wherein the second image belongs to a part of the source image, and the second image is different from the first image;
and sending the second image to a near-eye display device.
14. The information display method according to claim 13, wherein a region of the second image in the source image intersects with a region of the first image in the source image.
15. The information display method according to claim 14, wherein the second image includes a first sub-image region and a second sub-image region, the second sub-image region of the second image being disposed adjacent to the first sub-image region, the first sub-image region being located within an intersection region of the second image and the first image in the source image.
16. The information display method according to claim 15, wherein the second sub-image area of the second image is disposed adjacent to a side of an edge of the first image.
17. The information display method of claim 13, wherein said determining a second image from the pose information comprises:
determining a target area from the source image according to the attitude information;
sending the second image to a near-eye display device comprises:
the near-eye display device comprises a visual area, and the second image corresponding to the target area is sent to the near-eye display device, so that the near-eye display device displays the second image corresponding to the target area in the visual area.
18. The information display method according to claim 17, wherein the first image and the second image displayed in the visible region have the same size.
19. The information display method according to claim 13, wherein the second image includes auxiliary display information including one or more of parameter information of an electronic device and/or parameter information of a near-eye display device to which the electronic device is connected.
20. The information display method according to claim 19, wherein the parameter information includes one or more of a power parameter, a network parameter, a time parameter, an audio playing parameter, a display parameter, and a notification message parameter.
21. The information display method according to claim 13, further comprising, after transmitting the first image to the near-eye display device:
acquiring coordinate information of a first image displayed in the visible area on the source image;
acquiring the attitude information of the near-eye display device, determining a second image according to the attitude information, and sending the second image to the near-eye display device comprises the following steps:
acquiring posture information of the near-eye display device within a preset time period;
obtaining position change information of the near-eye display equipment according to the posture information of the near-eye display equipment in the preset time period;
determining a target area from the source image according to the position change information and the coordinate information;
and sending the second image corresponding to the target area to the near-to-eye display equipment.
22. The information display method according to any one of claims 13 to 21, wherein after the second image is transmitted to the near-eye display device, the information display method further comprises:
and sending the first image to the near-eye display device in response to the received image display reset instruction.
23. A near-eye display device characterized by being configured to perform the information display method of any one of claims 1 to 12.
24. A near-eye display device, characterized in that the near-eye display device comprises:
display means for displaying a first image, the first image being part of a source image;
the touch module is used for responding to the image display updating instruction;
the attitude sensor is used for acquiring attitude information of the near-eye display equipment;
wherein the display device is further configured to display a second image according to the pose information, the second image belonging to a portion of the source image, the second image being different from the first image.
25. The near-eye display device of claim 24, further comprising:
a wearing assembly for wearing of a near-eye display device;
the display device comprises a visible area, the visible area is arranged on the wearing component and used for displaying the second image, and the trigger module comprises a touch key and is arranged on the wearing component and used for receiving a trigger signal of the image display updating instruction.
26. An electronic device, characterized in that the electronic device is configured to perform the information display method of any one of claims 13-22.
CN202111088751.XA 2021-09-16 2021-09-16 Information display method, near-to-eye display device and electronic device Pending CN115826734A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111088751.XA CN115826734A (en) 2021-09-16 2021-09-16 Information display method, near-to-eye display device and electronic device
PCT/CN2022/113110 WO2023040562A1 (en) 2021-09-16 2022-08-17 Information display method, near-eye display device, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111088751.XA CN115826734A (en) 2021-09-16 2021-09-16 Information display method, near-to-eye display device and electronic device

Publications (1)

Publication Number Publication Date
CN115826734A true CN115826734A (en) 2023-03-21

Family

ID=85515116

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111088751.XA Pending CN115826734A (en) 2021-09-16 2021-09-16 Information display method, near-to-eye display device and electronic device

Country Status (2)

Country Link
CN (1) CN115826734A (en)
WO (1) WO2023040562A1 (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8416152B2 (en) * 2008-06-11 2013-04-09 Honeywell International Inc. Method and system for operating a near-to-eye display
US9690099B2 (en) * 2010-12-17 2017-06-27 Microsoft Technology Licensing, Llc Optimized focal area for augmented reality displays
CN107533375A (en) * 2015-06-29 2018-01-02 埃西勒国际通用光学公司 scene image analysis module
JP6976719B2 (en) * 2017-05-25 2021-12-08 キヤノン株式会社 Display control device, display control method and program
WO2019152619A1 (en) * 2018-02-03 2019-08-08 The Johns Hopkins University Blink-based calibration of an optical see-through head-mounted display
CN112882672B (en) * 2021-02-26 2024-01-23 京东方科技集团股份有限公司 Near-eye display control method and device and near-eye display equipment

Also Published As

Publication number Publication date
WO2023040562A1 (en) 2023-03-23

Similar Documents

Publication Publication Date Title
CN110221432B (en) Image display method and device of head-mounted display
CN108710525B (en) Map display method, device, equipment and storage medium in virtual scene
US11995285B2 (en) Methods for adjusting and/or controlling immersion associated with user interfaces
US20180288391A1 (en) Method for capturing virtual space and electronic device using the same
CN111970456B (en) Shooting control method, device, equipment and storage medium
CN112533017B (en) Live broadcast method, device, terminal and storage medium
CN112835445B (en) Interaction method, device and system in virtual reality scene
CN113223129B (en) Image rendering method, electronic equipment and system
CN112221134B (en) Virtual environment-based picture display method, device, equipment and medium
CN107924234B (en) Auxiliary item selection for see-through eyewear
CN112241199B (en) Interaction method and device in virtual reality scene
CN112367533B (en) Interactive service processing method, device, equipment and computer readable storage medium
CN112023403B (en) Battle process display method and device based on image-text information
CN110891181B (en) Live broadcast picture display method and device, storage medium and terminal
CN112612387A (en) Method, device and equipment for displaying information and storage medium
CN114201030A (en) Device interaction method, electronic device and interaction system
CN110717993A (en) Interaction method, system and medium of split type AR glasses system
CN115826734A (en) Information display method, near-to-eye display device and electronic device
CN209859042U (en) Wearable control device and virtual/augmented reality system
US20240152244A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
US20240103679A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments
US20240104819A1 (en) Representations of participants in real-time communication sessions
US20240103678A1 (en) Devices, methods, and graphical user interfaces for interacting with extended reality experiences
US20240103681A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Window Controls in Three-Dimensional Environments
US20240152245A1 (en) Devices, Methods, and Graphical User Interfaces for Interacting with Window Controls in Three-Dimensional Environments

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination