CN115032797A - Display method for wireless intelligent glasses and wireless intelligent glasses - Google Patents

Display method for wireless intelligent glasses and wireless intelligent glasses

Info

Publication number
CN115032797A
Authority
CN
China
Prior art keywords
reference signal
glasses
display screen
slave
section
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210770749.9A
Other languages
Chinese (zh)
Other versions
CN115032797B (en)
Inventor
童伟峰
方栋良
黎骅
张亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bestechnic Shanghai Co Ltd
Original Assignee
Bestechnic Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bestechnic Shanghai Co Ltd filed Critical Bestechnic Shanghai Co Ltd
Priority to CN202210770749.9A
Publication of CN115032797A
Application granted
Publication of CN115032797B
Legal status: Active

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0141 Head-up displays characterised by the informative content of the display
    • G02B2027/0178 Eyeglass type

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The application relates to a display method for wireless smart glasses and to the wireless smart glasses. The wireless smart glasses comprise a left glasses part and a right glasses part, and the display method comprises: providing a processing section, a display screen and a first wireless module in each of the left glasses part and the right glasses part; and having the processing sections, based on cooperatively set synchronization reference signals, synchronously display the received video data on the display screens of the respective glasses parts. With this display method, the displayed picture has a stronger stereoscopic impression and looks more natural, which greatly improves the experience of the user wearing the smart glasses.

Description

Display method for wireless intelligent glasses and wireless intelligent glasses
Technical Field
The application relates to the technical field of wireless intelligent glasses, in particular to a display method for wireless intelligent glasses and the wireless intelligent glasses.
Background
With social progress and rising living standards, the market for smart glasses such as AR/VR devices is growing rapidly. An Augmented Reality (AR) or Virtual Reality (VR) Head Mounted Display (HMD) can present images to a user through a display screen, giving the user an immersive experience. In the prior art, the left-eye and right-eye display screens are connected to the same main control chip and their display is controlled by the same clock. However, a video connection line is then required between the left and right sides of the head-mounted display, which constrains the structure of the glasses; an overly long connection line between the main control chip and the display screens can interfere with the wireless reception of image/video data; and the heavy load on a single main control chip increases the risk of failure. Controlling the left-eye and right-eye display screens with different main control chips can alleviate these problems to some extent, but if the two display screens cannot refresh synchronously when they each display their corresponding video data, the wearer's visual experience suffers. The prior art still fails to provide a good solution to the problem of synchronously refreshing the left-eye and right-eye display screens of smart glasses.
Disclosure of Invention
The present application is provided to solve the above-mentioned problems in the prior art.
There is a need for a display method for wireless smart glasses, and for wireless smart glasses, in which the left and right glasses parts each obtain video data and each process the received data, and in which, when the video data is displayed on the display screens of the left and right glasses parts, the two display screens can be refreshed synchronously, so that the images/videos displayed on the left and right display screens have a stronger stereoscopic impression for the wearer, look more natural, and improve the user experience.
According to a first aspect of the present application, a display method for wireless smart glasses is provided. The wireless smart glasses include a left glasses part and a right glasses part, and the display method includes providing a processing section, a display screen and a first wireless module in each of the left glasses part and the right glasses part, a frame image buffer being provided in the display screen of each glasses part. The display method further includes receiving, by the first wireless module of each glasses part, video data of the smart device; and writing, by the processing section of each glasses part, the received video data into the frame image buffer in the respective display screen based on cooperatively set synchronization reference signals, so that the display screens of the respective glasses parts synchronously display the video data in their respective frame image buffers.
According to a second aspect of the present application, wireless smart glasses are provided, which include a left glasses part and a right glasses part. Each of the left glasses part and the right glasses part includes a processing section, a display screen and a first wireless module, and a frame image buffer is provided in the display screen of each glasses part. The first wireless module of each glasses part is configured to receive video data of the smart device; the processing section of each glasses part is configured to write the received video data into the frame image buffer in the respective display screen based on cooperatively set synchronization reference signals; and the display screens of the respective glasses parts are configured to synchronously display the video data in their respective frame image buffers.
With the display method for wireless smart glasses and the wireless smart glasses according to the present application, a processing section, a display screen and a first wireless module are provided in each of the left glasses part and the right glasses part, so that each glasses part can receive the video data of the smart device through its own first wireless module and process it independently with its own processing section. The display screen of each glasses part is provided with a frame image buffer; the processing sections of the left and right glasses parts write the video data into the frame image buffers in their respective display screens based on cooperatively set synchronization reference signals, and the display screens of the respective glasses parts can then separately but synchronously display the video data in their respective frame image buffers. This reduces the connection lines between the left and right glasses parts, gives the wireless smart glasses a more flexible structure, and causes less interference with the wireless reception of video data. At the same time, because the synchronization reference signals that control the writing of the frame image buffers of the left and right display screens are set cooperatively, the left and right display screens remain well synchronized when displaying the video data, which avoids the poor experience caused by left and right pictures being out of step: the displayed picture has a stronger stereoscopic impression, looks more natural, and greatly improves the experience of the wearer of the smart glasses.
The foregoing is only an overview of the technical solutions of the present application. To make the technical means of the present application more clearly understood, so that it can be implemented according to the content of the description, and to make the above and other objects, features and advantages of the present application more readily apparent, a detailed description of the present application is given below.
Drawings
In the drawings, which are not necessarily drawn to scale, like reference numerals may describe similar components in different views. Like reference numerals having alphabetic suffixes or different alphabetic suffixes may represent different instances of similar components. The drawings illustrate various embodiments generally by way of example, and not by way of limitation, and together with the description and claims serve to explain the disclosed embodiments. Such embodiments are illustrative, and are not intended to be exhaustive or exclusive embodiments of the present apparatus or method.
Fig. 1 shows a schematic diagram of a part of the components of wireless smart glasses according to an embodiment of the present application.
Fig. 2 is a schematic diagram illustrating a synchronization reference signal coordination setting manner according to an embodiment of the present application.
Fig. 3 shows a schematic diagram of calibrating a DSI clock using a second wireless module, a first reference signal, and a second reference signal according to an embodiment of the present application.
Fig. 4(a) shows a schematic diagram of a left eyeglass portion and a right eyeglass portion synchronously displaying video data based on a first reference signal and a second reference signal, respectively, according to an embodiment of the present application.
Fig. 4(b) shows another schematic diagram of the left and right eyeglass portions synchronously displaying video data based on the first reference signal and the second reference signal, respectively, according to an embodiment of the present application.
Fig. 5(a) shows still another schematic diagram of the left and right eyeglass portions synchronously displaying video data based on the first reference signal and the second reference signal, respectively, according to an embodiment of the present application.
Fig. 5(b) is a schematic diagram showing the screen refresh timing of the display screen of the master glasses part according to an embodiment of the application.
Detailed Description
In order to make the technical solutions of the present application better understood, the present application is described in detail below with reference to the accompanying drawings and the detailed description. The embodiments of the present application will be described in further detail with reference to the drawings and specific embodiments, but the present application is not limited thereto.
As used in this application, the terms "first," "second," and the like, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The word "comprising" or "comprises", and the like, means that the element preceding the word covers the element listed after the word, and does not exclude the possibility that other elements are also covered. The order of execution of the steps in the methods described in this application in connection with the figures is not intended to be limiting. As long as the logical relationship between the steps is not affected, the steps can be integrated into a single step, the single step can be divided into a plurality of steps, and the execution order of the steps can be changed according to the specific requirements.
According to an embodiment of the present application, there is provided wireless smart glasses, which may include, for example, a left glasses part and a right glasses part, wherein each of the left and right glasses parts includes components such as a processing part, a display screen, and a first wireless module, and corresponding steps in the display method for wireless smart glasses according to the various embodiments of the present application are performed by each component of each glasses part.
The application provides wireless smart glasses. Fig. 1 shows a schematic diagram of some of the components of wireless smart glasses according to an embodiment of the present application. As shown in Fig. 1, the wireless smart glasses 1 may include a left glasses part 10 and a right glasses part 11. The left glasses part 10 includes at least a processing section 101, a display screen 102 and a first wireless module 103, the display screen 102 being provided with a frame image buffer 1021; correspondingly, the right glasses part 11 includes at least a processing section 111, a display screen 112 and a first wireless module 113, the display screen 112 being provided with a frame image buffer 1121.
In some embodiments, the first wireless module of each glasses part is configured to receive video data of the smart device; for example, the first wireless module 103 of the left glasses part 10 and the first wireless module 113 of the right glasses part 11 respectively receive video data from the smart device 2 shown in Fig. 1. The smart device 2 may be, for example, a smart phone, a tablet, a computer, or a cloud server; the application is not limited in this respect.
In some embodiments, the first wireless module 103 and the first wireless module 113 may be a bluetooth module, a WiFi module, or any other communication module capable of supporting wireless transmission of video data, which is not limited herein.
In some embodiments, the processing section of each glasses part may be configured to write the received video data into the frame image buffer in the respective display screen based on a cooperatively set synchronization reference signal. For example, the processing section 101 of the left glasses part 10 writes the received video data into the frame image buffer 1021 based on a synchronization reference signal 1 (not shown), and correspondingly the processing section 111 of the right glasses part 11 writes the received video data into the frame image buffer 1121 based on a synchronization reference signal 2 (not shown). The processing section of each glasses part may decompress the video data after receiving it and then write the decompressed video data into the frame image buffer of the respective display screen. Because the synchronization reference signal 1 and the synchronization reference signal 2 are set cooperatively, i.e., the synchronization reference signals of the left and right glasses parts are coordinated, the timings at which the video data are written into the frame image buffer 1021 and the frame image buffer 1121 can be synchronized.
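To make this data path concrete, the following minimal sketch (in C, firmware style) shows one glasses part receiving a compressed frame over its first wireless module, decompressing it in the processing section, and gating the write into the display screen's frame image buffer on the synchronization reference signal. All function and type names (wireless_recv_frame, codec_decompress, sync_ref_wait, dsi_write_framebuffer) are hypothetical placeholders, not the API of any particular chipset.

```c
/* Illustrative sketch of the per-glasses-part data path.
 * All names are hypothetical stand-ins for the chipset's wireless,
 * codec and MIPI-DSI driver calls; memory management is omitted.
 */
#include <stdint.h>
#include <stddef.h>
#include <stdbool.h>

typedef struct {
    uint8_t *data;
    size_t   len;
} frame_t;

extern bool wireless_recv_frame(frame_t *compressed);   /* first wireless module          */
extern void codec_decompress(const frame_t *in, frame_t *out);
extern void sync_ref_wait(void);                         /* blocks until the sync reference signal fires */
extern void dsi_write_framebuffer(const frame_t *frame); /* command-mode write into the panel's buffer   */

void glasses_display_task(void)
{
    frame_t compressed, decoded;

    for (;;) {
        /* 1. Each glasses part receives its own video data wirelessly. */
        if (!wireless_recv_frame(&compressed))
            continue;

        /* 2. Decompress locally in the processing section. */
        codec_decompress(&compressed, &decoded);

        /* 3. Gate the buffer write on the cooperatively set synchronization
         *    reference signal, so the left and right frame image buffers
         *    are written in step. */
        sync_ref_wait();
        dsi_write_framebuffer(&decoded);
    }
}
```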
In some embodiments, the processing portion 101 and the processing portion 111 may be, for example, processing components including one or more general-purpose processors, such as microprocessors, Central Processing Units (CPUs), Graphics Processing Units (GPUs), and so on. More specifically, the processing element may be a Complex Instruction Set Computing (CISC) microprocessor, Reduced Instruction Set Computing (RISC) microprocessor, Very Long Instruction Word (VLIW) microprocessor, processor running other instruction sets, or processors running a combination of instruction sets. The processing element may also be one or more special-purpose processing devices such as an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), system on chip (SoC), or the like.
In some embodiments, the display screen of each glasses part is configured to synchronously display the video data in the respective frame image buffer. As mentioned above, since the timings at which the video data are written into the frame image buffer 1021 and the frame image buffer 1121 can be synchronized, the display screen 102 of the left glasses part 10 and the display screen 112 of the right glasses part 11 have a synchronized data basis when they refresh using the video data in the frame image buffer 1021 and the frame image buffer 1121, respectively. How the synchronicity of the display screen 102 of the left glasses part 10 and the display screen 112 of the right glasses part 11 during refreshing is ensured will be described in detail later with reference to other embodiments.
In some embodiments, the Display screen 102 and the Display screen 112 may be, for example, a Liquid Crystal Display (LCD), a Liquid Crystal On Silicon (LCOS), a Digital Light Processing (DLP) Display, or other types or forms of microdisplays that are not self-illuminating, which may be flexible screens or rigid screens (i.e., non-flexible screens), and the application is not limited herein. In other embodiments, the display screen 102 and the display screen 112 may also be other types or forms of self-Emitting micro-displays such as a Light Emitting Diode (LED) display screen, an Organic Light Emitting Diode (OLED) display screen, and the like, and may be a flexible screen or a rigid screen (i.e., a non-flexible screen), which is not limited herein.
According to the smart glasses provided by the embodiments of the present application, video data from the smart device or the cloud can be received through the wireless modules, which reduces the constraints of wiring and greatly improves the user experience. In addition, the display screen of each glasses part includes a frame image buffer and therefore operates as a command-mode (Command Mode) screen. For a display screen in command mode, the MIPI (Mobile Industry Processor Interface; DSI is one of its interfaces) bus controller between the processing section and the display screen transmits the pixel data stream to the display screen using display command messages, and the display screen stores all pixel data in the full-frame-length frame image buffer it contains. Once the data are placed in the frame image buffer of the display screen, the timing controller can read the data out of the frame image buffer based on a timing (clock) signal and display them on the display screen automatically. In command mode, the MIPI bus controller does not need to refresh the display screen periodically: only when the displayed content changes does the processing section of each glasses part send data to the display screen over MIPI (for example, DSI); otherwise the display screen (or its driver chip) reads data from its internal frame image buffer and displays them itself, without periodic refresh over the MIPI bus. Power consumption is therefore lower than in video mode (Video Mode), in which no frame image buffer is provided and the display screen is refreshed periodically over the MIPI bus.
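The power-saving behavior of command mode can be sketched as follows: the host only pushes pixels over the DSI link when the frame content has changed, and otherwise lets the panel scan out its internal frame image buffer on its own. This is an illustrative sketch only; the driver calls (new_frame_available, dsi_send_write_memory_start, dsi_send_pixels) are assumed placeholder names, not a specific vendor API.

```c
/* Command-mode update policy (illustrative): transmit over MIPI-DSI only
 * when the frame changes; the panel refreshes itself from its own buffer.
 * All driver calls below are hypothetical placeholders.
 */
#include <stdint.h>
#include <stdbool.h>

extern bool new_frame_available(void);              /* set when decoded content changed */
extern void dsi_send_write_memory_start(void);      /* start filling the panel's buffer */
extern void dsi_send_pixels(const uint8_t *px, uint32_t len);

void command_mode_update(const uint8_t *pixels, uint32_t len)
{
    if (!new_frame_available())
        return;                     /* no MIPI traffic: panel self-refreshes, power stays low */

    dsi_send_write_memory_start();  /* begin writing the panel's full-frame image buffer */
    dsi_send_pixels(pixels, len);   /* timing controller later scans this buffer out on its own clock */
}
```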
Further, for a command-mode display screen, the refresh frequency of the display screen may be F1 Hz while the frequency at which the processing section of the glasses sends image data to the display screen may be F2 Hz. F1 and F2 are often different, and in many cases F1 > F2. A command-mode display screen can therefore have lower power consumption than a video-mode one. In addition, in command mode the data transmission between the processing section and the display screen of each glasses part is greatly reduced, which also reduces interference with wireless transmission.
F2 tends to be constant when the two glasses parts play the same video file, and may differ when different video files are played. On the other hand, when certain interfaces or relatively still images are displayed on the glasses, F2 may be low and may change at any time or with the user's operations; the processing section sends image data to the frame image buffer of the display screen only when needed, for example when the image content changes. The present application does not distinguish among these application scenarios.
Fig. 2 is a schematic diagram illustrating a manner of cooperatively setting the synchronization reference signals according to an embodiment of the present application. In some embodiments, one of the left and right glasses parts may serve as a master glasses part and the other as a slave glasses part; for example, in Fig. 2 one of the left and right glasses parts is set as the master glasses part 20 and the other as the slave glasses part 21, and the synchronization reference signals according to the embodiments of the present application are cooperatively set as follows.
First, a first reference signal is triggered by the display screen 202 of the primary eyewear section 20 and is output to the processing section 201 of the primary eyewear section 20. That is, the display screen 202 of the primary eyewear portion 20 may be configured to trigger the first reference signal and output the first reference signal to the processing portion 201 of the primary eyewear portion 20.
Then, the first reference signal, or a third reference signal triggered based on the first reference signal, is transmitted to the processing section 211 of the slave eyeglass section 21 by the processing section 201 of the master eyeglass section 20. That is, the processing section 201 of the master eyeglass portion 20 may be further configured to transmit the first reference signal, or a third reference signal triggered based on the first reference signal, to the processing section 211 of the slave eyeglass portion 21.
After the processing section 211 of the slave glasses part 21 receives the first reference signal or the third reference signal, it triggers a second reference signal based on the first reference signal or the third reference signal and outputs the second reference signal to the display screen 212 of the slave glasses part 21. The processing section 211 of the slave glasses part 21 may also trigger a fourth reference signal based on the first reference signal or the third reference signal and output the fourth reference signal, instead of the second reference signal, to the display screen 212 of the slave glasses part 21. That is, the processing section 211 of the slave glasses part 21 may be further configured to trigger a second reference signal based on the first reference signal or the third reference signal, and to output the second reference signal, or a fourth reference signal triggered based on the first reference signal or the third reference signal, to the display screen 212 of the slave glasses part 21. In some embodiments, the fourth reference signal may also be generated based on the second reference signal, for example as a signal with a fixed delay relative to the second reference signal, and so on, which are not enumerated here.
The second reference signal and the fourth reference signal, which are generated in a coordinated manner with the first reference signal or the third reference signal as described above, will be used for display synchronization of the left and right eyeglass portions in the next step.
On the one hand, the above reference signals may be used for synchronous writing of the frame image buffers of the left and right glasses parts. For example, the received video data may be written into the frame image buffer 2021 in the display screen 202 of the master glasses part 20 by the processing section 201 of the master glasses part 20 based on the first reference signal; that is, the processing section 201 of the master glasses part 20 may be further configured to write the received video data into the frame image buffer 2021 in the display screen 202 of the master glasses part 20 based on the first reference signal. Correspondingly, the received video data may be written into the frame image buffer 2121 in the display screen 212 of the slave glasses part 21 by the processing section 211 of the slave glasses part 21, based on the second reference signal, in synchronization with the processing section 201 of the master glasses part 20; that is, the processing section 211 of the slave glasses part 21 may be further configured to write the received video data into the frame image buffer 2121 in the display screen 212 of the slave glasses part 21 in synchronization with the processing section 201 of the master glasses part 20 based on the second reference signal.
On the other hand, the above reference signals may also be used for synchronous refreshing of the display screens of the left and right glasses parts. For example, the display screen 212 of the slave glasses part 21 may, based on the second reference signal or the fourth reference signal, display the video data in the frame image buffer 2121 in synchronization with the display screen 202 of the master glasses part 20 displaying the video data in the frame image buffer 2021. That is, the display screen 212 of the slave glasses part 21 may be further configured to display the video data in the frame image buffer 2121, based on the second reference signal or the fourth reference signal, in synchronization with the display screen 202 of the master glasses part 20 displaying the video data in the frame image buffer 2021.
As described above, each of the reference signals, such as the first reference signal and the second reference signal, may be a suitable rising-edge, falling-edge or narrow-pulse trigger signal. For example, a TE pin may be provided between the display screen (or the display screen driver chip) of each glasses part and the processing section, and the display screen driver chip may use the TE pin to send a TE (Tearing Effect) signal to the processing section to inform it that the display screen has finished refreshing and new frame data can be transmitted. In some embodiments, the display screen of the master glasses part may use a TE signal (for example, a rising or falling edge of the TE signal, or a signal generated from the TE pulse signal) as the first reference signal. The TE signal serving as the first reference signal is transmitted from the display screen (or the display screen driver chip) of the master glasses part to the processing section of the master glasses part; the transmission direction of the second reference signal, which is cooperatively set based on the first reference signal, is the opposite: it is output from the processing section of the slave glasses part to the display screen of the slave glasses part, so that the display screen of the slave glasses part can update its data and refresh its screen in synchronization with the display screen of the master glasses part.
Through the above steps, the display screen 202 of the master glasses part 20 and the display screen 212 of the slave glasses part 21 can write video data into their respective frame image buffers in synchronization. Moreover, when the first reference signal is triggered based on the TE signal, the pace at which the processing sections of the respective glasses parts send data to the frame image buffers can be kept consistent with the refresh pace of the corresponding display screens. As long as the time at which writing into the frame image buffer starts is earlier than the time at which the display screen starts refreshing, and the data are written faster than the display screen refreshes, the data used to refresh the display screen are guaranteed to be the updated data in the frame image buffer, which avoids the tearing phenomenon in which old and new data appear in the same screen.
In some other embodiments, the first reference signal, or the third reference signal triggered based on the first reference signal, may be transmitted by the processing section of the master glasses part to the processing section of the slave glasses part over a wired connection. That is, the processing section of the master glasses part may be further configured to transmit the first reference signal, or a third reference signal triggered based on the first reference signal, to the processing section of the slave glasses part using a wired connection. In some embodiments, the wired connection for transmitting the first reference signal or the third reference signal may be a signal line provided exclusively between the processing section of the master glasses part and the processing section of the slave glasses part, or an existing idle signal line configured to transmit the first reference signal or the third reference signal. Transmitting the reference signals required for synchronization between the master and slave glasses parts over a wired connection gives a smaller delay, makes the timing configuration of the subsequent signals more convenient, makes consistency and synchronization easier to achieve, and keeps the overall delay small.
In some embodiments, the processing section of each of the left and right glasses parts is connected to the corresponding display screen or display screen driver chip (or a driver module integrated with the display screen) via a GPIO (General-Purpose Input/Output) interface, and the processing sections of the two glasses parts are also connected to each other via a GPIO interface. As shown in Fig. 2, the processing section 201 of the master glasses part 20 is connected to the display screen 202 or the display screen driver chip 202' of the master glasses part 20 via a GPIO interface, and the processing section 211 of the slave glasses part 21 is connected to the display screen 212 or the display screen driver chip 212' of the slave glasses part 21 via a GPIO interface. The display screen 202 or the display screen driver chip 202' of the master glasses part 20 triggers the first reference signal and outputs it to the processing section 201 of the master glasses part 20 via the GPIO interface. Then a third reference signal is triggered by the processing section 201 of the master glasses part 20 based on the first reference signal and is output to the processing section 211 of the slave glasses part 21 via the GPIO interface. Further, a second reference signal may be triggered by the processing section 211 of the slave glasses part 21 based on the third reference signal, and the second reference signal, or a fourth reference signal triggered based on the first reference signal or the third reference signal, may be output to the display screen 212 or the display screen driver chip 212' of the slave glasses part 21 via a GPIO interface.
That is, the display screen 202 of the primary eyewear section 20 may be further configured to trigger the first reference signal by the display screen 202 or the display screen driving chip 202' of the primary eyewear section 20 and output the first reference signal to the processing section 201 of the primary eyewear section 20 via the GPIO interface; the processing section 201 of the master glasses section 20 is further configured to trigger a third reference signal based on the first reference signal and output the third reference signal to the processing section 211 of the slave glasses section 21 via the GPIO interface; the processing section 211 of the slave glasses section 21 may be further configured to trigger a second reference signal based on the third reference signal and output the second reference signal or the fourth reference signal to the display screen 212 or the display screen driving chip 212' of the slave glasses section 21 via the GPIO interface.
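A minimal interrupt-driven sketch of this GPIO cascade is given below, assuming one GPIO line from the master processing section to the slave processing section and another from the slave processing section to its display driver chip. The pin numbers and all function names are hypothetical placeholders, not actual hardware assignments.

```c
/* Sketch of the reference-signal cascade over GPIO (names hypothetical).
 * Master: display TE pin -> first reference signal -> processing section,
 * which re-triggers a third reference signal on a GPIO line to the slave.
 * Slave: GPIO edge -> second reference signal forwarded to its own
 * display screen / driver chip.
 */
extern void gpio_pulse(int pin);            /* drive a short pulse on a GPIO line */
extern void start_framebuffer_write(void);  /* begin writing the local frame image buffer */

#define GPIO_TO_SLAVE_PROC   7   /* master proc -> slave proc (third reference signal)   */
#define GPIO_TO_SLAVE_PANEL  8   /* slave proc -> slave display (second/fourth signal)   */

/* Master glasses part: ISR on the TE pin from its display driver chip. */
void master_te_isr(void)
{
    /* First reference signal received from the display screen:
     * forward it as the third reference signal to the slave... */
    gpio_pulse(GPIO_TO_SLAVE_PROC);
    /* ...and start this side's own frame image buffer write. */
    start_framebuffer_write();
}

/* Slave glasses part: ISR on the GPIO line from the master processing section. */
void slave_third_ref_isr(void)
{
    /* Trigger the second reference signal: output it to the slave display
     * (or driver chip) and start the synchronized buffer write. */
    gpio_pulse(GPIO_TO_SLAVE_PANEL);
    start_framebuffer_write();
}
```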
Therefore, by using a GPIO to transmit the third reference signal, synchronous display of the video signals wirelessly received by the left and right glasses parts can be achieved, the implementation is simplified, and the software and hardware cost that achieving synchronization over a wireless link between the left and right glasses parts would require is avoided. It also reduces the interference that a wireless link between the left and right glasses parts would cause to their reception of the wireless video, and avoids such a link occupying time slots needed for receiving the wireless video. In some embodiments, the third reference signal may also be a rising-edge, falling-edge or narrow-pulse trigger signal. In some embodiments, the second reference signal and the third reference signal may be triggered by a rising or falling edge of the wireless receiving clock or of the display clock. In some embodiments, the second reference signal may be the third reference signal, may be obtained from the third reference signal after a fixed delay, or may be triggered by the third reference signal.
In the display method for smart glasses according to the embodiments of the present application, when the display screens of the master glasses part and the slave glasses part refresh based on the cooperatively set first reference signal and second reference signal respectively, the cooperative setting ensures that the two display screens refresh synchronously, so that the videos or images on the display screens of the left and right glasses parts have a stronger stereoscopic impression, look more natural and real, and give the user a better wearing experience.
The subsequent method and steps for triggering the second reference signal by the processing portion of the slave glasses portion based on the first reference signal or the third reference signal and outputting the second reference signal to the display screen of the slave glasses portion, and writing the received video data into the frame image buffer in the corresponding display screen by the processing portion of the master glasses portion and the processing portion of the slave glasses portion based on the first reference signal and the second reference signal respectively are similar to the foregoing, and are not repeated herein.
Fig. 3 shows a schematic diagram of calibrating a DSI clock using a second wireless module, a first reference signal, and a second reference signal according to an embodiment of the present application.
In some embodiments, the processing portion of each of the left and right eyeglass portions is connected to the corresponding display screen through a DSI serial port, and each DSI serial port has a corresponding DSI clock, and the received video data may be written into the frame image buffer in the respective display screen by the processing portion of each eyeglass portion based on the respective DSI clock. As shown in fig. 3, either one of the left and right eyeglass portions serves as a master eyeglass portion 30, the other serves as a slave eyeglass portion 31, the processing portion 301 of the master eyeglass portion 30 may be connected to the display screen 302 through a DSI serial port, and the processing portion 301 of the master eyeglass portion 30 is further configured to write received video data to a frame image buffer 3021 in the display screen 302 of the master eyeglass portion 30 based on a DSI clock of the DSI serial port; the processing section 311 of the slave eyeglass portion 31 may be connected to the display screen 312 through a DSI serial port, and the processing section 311 of the slave eyeglass portion 31 may be further configured to write the received video data to the frame image buffer 3121 in the display screen 312 of the slave eyeglass portion 31 in synchronization with the processing section 301 of the master eyeglass portion 30 based on a DSI clock of the DSI serial port.
In some embodiments, the first reference signal may be selected to be associated with the DSI clock of the master eyewear portion 30, and the second reference signal may be selected to be associated with the DSI clock of the slave eyewear portion 31 in the same manner as the first reference signal. The following will describe in detail the selection manner and transmission manner of the first reference signal and the second reference signal, and the manner of calibrating the DSI clock of the left and right eyeglass portions by using the first reference signal and the second reference signal in a wireless transmission manner.
In some embodiments, a second wireless module may be provided in each of the left and right eyeglass portions, wherein the second wireless module is a bluetooth module or a WiFi module. The second wireless module may be used to transmit audio data, video data, image data or other control commands, etc. from the smart device. In some embodiments, the second wireless module may be the same wireless module as the first wireless module, or may be a wireless module equipped additionally. In some embodiments, the clock signal of the second wireless module of the left eyeglass portion is synchronized with the clock signal of the second wireless module of the right eyeglass portion. As shown in fig. 3, the second wireless module 304 is disposed in the master glasses part 30, the second wireless module 314 is disposed in the slave glasses part 31, both the second wireless module 304 and the second wireless module 314 may be bluetooth modules, or both WiFi modules, and the clock signal of the second wireless module 314 and the clock signal of the second wireless module 304 are always synchronized. Specifically, taking the second wireless module 304 and the second wireless module 314 as bluetooth modules as an example, the two bluetooth modules may receive bluetooth data of the same intelligent device (not shown), and may keep bluetooth clocks of the two bluetooth modules synchronized with a bluetooth clock of the intelligent device through a bluetooth Access Code (Access Code) of a physical layer or a correlation process of a part of the Access Code, so that clocks of the two bluetooth modules are also synchronized. In the case where the second wireless module 304 and the second wireless module 314 are both WiFi modules, similarly, both WiFi modules can receive WiFi data of the same smart device (not shown), and by receiving a periodic WiFi Beacon signal (i.e., Beacon), the WiFi clocks of both WiFi modules can be synchronized with the WiFi clock of the smart device, and thus the clocks of both WiFi modules are also synchronized.
However, the DSI clocks of the left and right glasses parts are not normally synchronized with the clocks of the second wireless modules of the same glasses parts. Therefore, on the premise that the clocks of the two second wireless modules are synchronized as described above, the DSI clocks of the left and right glasses parts may be calibrated by means of the clocks of the second wireless modules. Specifically, for example, a field synchronization signal of the DSI serial port, or an SOT (start of transmission) signal of the DSI serial port, between the processing section of the master glasses part and the display screen of the master glasses part may be generated with the first reference signal generated by the display screen 302 of the master glasses part 30 as a reference. The first reference signal, the field synchronization signal, the SOT signal, or a signal obtained by applying a fixed delay to any of these, may then be used to trigger (or latch) the clock signal (Bluetooth clock or WiFi clock) of the second wireless module 304 at the corresponding time, so as to obtain a first clock value of the second wireless module 304 at that time, which is transmitted to the processing section 301 of the master glasses part 30. Since these signals are all generated based on the first reference signal, i.e., are all associated with it, for convenience of description only the clock value of the second wireless module 304 at the time the first reference signal is triggered is taken as an example. The processing section 301 of the master glasses part 30 may transmit the received first clock value to the processing section 311 of the slave glasses part 31 wirelessly, using the second wireless module 304 of the master glasses part 30. That is, the processing section 301 of the master glasses part 30 may be further configured to transmit, using the second wireless module 304 of the master glasses part 30, the first clock value of the clock signal of the second wireless module 304 at the time the first reference signal sent by the display screen 302 is received, to the processing section 311 of the slave glasses part 31.
Next, the DSI clock of the DSI serial port of the slave glasses part 31 may be calibrated by the processing section 311 of the slave glasses part 31 according to the first clock value received via the second wireless module 314, so that the DSI clock of the DSI serial port of the slave glasses part 31 stays synchronized with the DSI clock of the DSI serial port of the master glasses part 30, as follows. First, the processing section 311 of the slave glasses part 31 may generate, in cooperation with the first reference signal, the second reference signal associated with the DSI clock of the slave glasses part; for example, when the first reference signal is chosen as the SOT signal of the display screen 302, the second reference signal may be chosen as the SOT signal of the display screen 312, and so on, which are not enumerated here. The second reference signal is then used to trigger (or latch) the clock signal of the second wireless module 314 at the time of the second reference signal, so as to obtain a second clock value of the second wireless module 314 at that time. The second clock value is then compared with the first clock value received via the second wireless module of the slave glasses part. Since the clock signals of the second wireless module 304 and the second wireless module 314 can be regarded as synchronized, the difference between the first clock value and the second clock value can be regarded as the difference between the DSI clocks of the master glasses part 30 and the slave glasses part 31. The DSI clock of the DSI serial port of the slave glasses part 31 can therefore be calibrated based on the difference between the first clock value and the second clock value, so that it stays synchronized with the DSI clock of the DSI serial port of the master glasses part 30. In some embodiments, the DSI clock of the display screen 312 of the slave glasses part 31 may be adjusted to be consistent with the DSI clock of the master glasses part 30, for example by adjusting the phase-locked loop that generates the DSI clock, or by adjusting an integer number of DSI clock cycles according to the clock value difference.
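The calibration flow described above can be summarized in the following sketch: both sides latch the value of their already-synchronized second-wireless-module clock at their respective reference signals, the master reports its value to the slave, and the slave uses the difference to trim its DSI clock. All functions (bt_clock_latch_on_ref, wireless_send_clock, wireless_recv_clock, dsi_pll_trim) are assumed placeholders for the corresponding driver facilities, not a real API.

```c
/* Sketch of the DSI clock calibration (names hypothetical). */
#include <stdint.h>

extern uint32_t bt_clock_latch_on_ref(void);     /* wireless-module clock value latched at the reference signal */
extern void     wireless_send_clock(uint32_t v); /* master -> slave, via the second wireless module             */
extern uint32_t wireless_recv_clock(void);
extern void     dsi_pll_trim(int32_t offset);    /* adjust the PLL, or step whole DSI clock cycles              */

/* Master glasses part: runs when the first reference signal fires. */
void master_on_first_ref(void)
{
    uint32_t first_clock_value = bt_clock_latch_on_ref();
    wireless_send_clock(first_clock_value);      /* report the first clock value to the slave */
}

/* Slave glasses part: runs when the second reference signal fires. */
void slave_on_second_ref(void)
{
    uint32_t second_clock_value = bt_clock_latch_on_ref();
    uint32_t first_clock_value  = wireless_recv_clock();

    /* The wireless clocks are synchronized, so the difference between the
     * latched values reflects the offset between the two DSI clocks. */
    int32_t offset = (int32_t)(second_clock_value - first_clock_value);
    dsi_pll_trim(offset);                        /* pull the slave DSI clock back into step */
}
```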
According to the wireless smart glasses of the present application, by calibrating the DSI clocks of the master and slave glasses parts in the above manner, the second reference signal, associated with the DSI clock and triggered by the processing section of the slave glasses part, can be aligned in time with the first reference signal, so that the processing sections of the master and slave glasses parts write into the frame image buffers of the corresponding display screens synchronously, and the display screens of the master and slave glasses parts refresh synchronously.
Fig. 4(a) shows a schematic diagram of a left glasses part and a right glasses part synchronously displaying video data based on a first reference signal and a second reference signal, respectively, according to an embodiment of the present application.
In some embodiments, it often takes a certain time for the processing section of the master glasses part to transmit the first reference signal to the processing section of the slave glasses part; especially when the first reference signal is transmitted wirelessly, the required time may be several hundred microseconds, several milliseconds, or even ten-odd milliseconds. The processing section of the slave glasses part therefore often receives the first reference signal only after a certain delay relative to the time at which the processing section of the master glasses part receives the first reference signal triggered by its display screen, so the trigger time of the second reference signal, triggered based on the first reference signal, may lag the trigger time of the first reference signal by a certain interval. As shown in Fig. 4(a), assume that the time t1 at which the first reference signal of the master glasses part is triggered and the time t2 at which the slave glasses part receives the first reference signal and triggers the second reference signal differ by a duration T3. The processing section of the master glasses part may then write the received video data into the frame image buffer in the display screen of the master glasses part at time t2; correspondingly, the processing section of the slave glasses part, also triggered by the second reference signal at t2, inputs the second reference signal to the display screen of the slave glasses part and writes the video data into the frame image buffer based on the second reference signal, so that the display screen of the slave glasses part has video data synchronized with the master glasses part when it refreshes with the data in the frame image buffer. That is, when the trigger time of the second reference signal differs from the trigger time of the first reference signal by a duration T3, the processing section of the master glasses part may be further configured to write the received video data into the frame image buffer in the display screen of the master glasses part at a time T3 after the first reference signal is triggered. In some embodiments, T3 may be a fixed value, may be configurable, or may be determined through testing during the software and hardware implementation. For example, the first reference signal may trigger a count of the wireless receiving clock of the master glasses part, and the second reference signal may be triggered after T3 has elapsed according to the count of the wireless receiving clock of the slave glasses part and output to the display screen of the slave glasses part.
Fig. 4(b) shows another schematic diagram of the left and right glasses parts synchronously displaying video data based on the first reference signal and the second reference signal, respectively, according to an embodiment of the present application. Similar to Fig. 4(a), assume again that the time t1 at which the first reference signal of the master glasses part is triggered and the time t2 at which the slave glasses part triggers the second reference signal based on the first reference signal differ by a duration T3. In other embodiments, the processing section of the slave glasses part may write the received video data into the frame image buffer in the display screen of the slave glasses part at time t3, which is T4 after t2, and the processing section of the master glasses part may write the received video data into the frame image buffer in the display screen of the master glasses part T2 after the first reference signal is triggered at t1. As can be seen from Fig. 4(b), when T2, T3 and T4 are set so that T2 = T3 + T4, the master and slave glasses parts are synchronized when writing the video data into the frame image buffers of their respective display screens, and the display screens of the master and slave glasses parts can synchronously display the video data in their respective frame image buffers. That is, when the trigger time of the second reference signal differs from the trigger time of the first reference signal by T3, the processing section of the master glasses part may be further configured to write the received video data into the frame image buffer in the display screen of the master glasses part T2 after the first reference signal is triggered, and the processing section of the slave glasses part may be further configured to write the received video data into the frame image buffer in the display screen of the slave glasses part T4 after the second reference signal is triggered, where T2 is the sum of T4 and T3, so that the master and slave glasses parts write the video data into their respective frame image buffers separately but synchronously.
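The relationship T2 = T3 + T4 simply states that the master delays its buffer write by the same total amount as the propagation delay plus the slave-side delay. The short, self-contained check below illustrates this with made-up example numbers; they are not measured values from the application.

```c
/* Tiny timing check for the relationship T2 = T3 + T4
 * (all values in microseconds; the numbers are illustrative only). */
#include <stdio.h>

int main(void)
{
    unsigned t1 = 0;        /* master: first reference signal triggered    */
    unsigned T3 = 1500;     /* propagation delay until the slave triggers  */
    unsigned T4 = 500;      /* slave-side delay before its buffer write    */
    unsigned T2 = T3 + T4;  /* master-side delay chosen so writes coincide */

    unsigned master_write = t1 + T2;       /* t1 + T2         */
    unsigned slave_write  = t1 + T3 + T4;  /* (t1 + T3) + T4  */

    printf("master writes at %u us, slave writes at %u us -> %s\n",
           master_write, slave_write,
           master_write == slave_write ? "synchronized" : "skewed");
    return 0;
}
```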
Fig. 5(a) shows still another schematic diagram of the left and right glasses parts synchronously displaying video data based on the first reference signal and the second reference signal, respectively, according to an embodiment of the present application. It should be noted that Fig. 5(a) is based on Fig. 4(b), and t1, t2, t3 and T2, T3, T4 in Fig. 5(a) have the same meanings as the same reference numerals in Fig. 4(b).
As shown in Fig. 5(a), in some embodiments the display screen of the master glasses part may trigger the first reference signal at time t1, which is a duration T1 before the time t0 at which it finishes refreshing the current screen with the video data in the frame image buffer, and output the first reference signal to the processing section of the master glasses part. That is, the display screen of the master glasses part may be further configured to trigger the first reference signal at a time T1 before the current screen refresh is finished and output the first reference signal to the processing section of the master glasses part. By setting the time at which the display screen of the master glasses part triggers the first reference signal in this way, the undesirable tearing that can occur when the display screens refresh with the video data in their respective frame image buffers can be avoided. The screen refresh timing of the display screen of the master glasses part according to the embodiment of the present application, shown in Fig. 5(b), is described in detail below.
By way of example only, a rising edge of the TE signal may be used as the first reference signal according to embodiments of the present application. As mentioned above, the TE signal informs the processing section, according to the timing at which the display screen reads and displays (i.e., refreshes with) the data in the frame image buffer, of when data may be written into the frame image buffer. By properly arranging the TE signal relative to the display screen's own refresh timing, the conflict that arises when the processing section writes to the same location that the display screen is currently reading for display can be avoided (such a conflict may cause tearing), and the tearing caused by replacing the data in the frame image buffer while the display screen is in the middle of a refresh can likewise be avoided.
Fig. 5(b) takes the display screen of the master glasses part as an example. In Fig. 5(b), it is assumed that the scanning display period of the display screen starts at the falling edge of the TE signal, and that the low-level period of the TE signal is the scanning display period of the display screen: during this period the display screen reads and displays the data in the frame image buffer (i.e., refreshes the screen), and when the refresh is finished the TE signal level is pulled high. The refresh process is shown by the solid line 502 in the figure; the high-level period of the TE signal is the non-scanning display period of the display screen, i.e., the display screen is not refreshed during this period. The first reference signal may be set in association with the refresh signal of the display screen (the falling edge of the TE signal), and the processing section writes data into the frame image buffer based on the first reference signal; the writing process of the data is shown by the dotted line 501 in the figure. In Fig. 5(b), the time at which the display screen of the master glasses part finishes refreshing (i.e., the rising edge of the TE signal) is marked as t0, the time corresponding to the first reference signal is marked as t1, and the time at which the processing section of the master glasses part starts writing data into the frame image buffer is marked as t3.
Taking scanning display period 1 as an example, the first reference signal is triggered on the rising edge of the TE signal, and at the same time the processing section starts writing a new frame of data into the frame image buffer, i.e., t3, t0 and t1 coincide. Under normal conditions, as long as the writing speed shown by the dotted line 501 is faster than the reading speed shown by the solid line 502, the dotted line 501 and the solid line 502 do not intersect, i.e., no screen corruption or tearing occurs. However, consider the case shown in Fig. 4(b): the processing section of the master glasses part writes the received video data into the frame image buffer in the display screen of the master glasses part at time t3, which is T2 after the time t1 at which the first reference signal is triggered; the timing relationship among t3, t0 and t1 is as illustrated in scanning display period 1 in Fig. 5(b). In this case the first reference signal is still triggered on the rising edge of TE (time t0), but because of the duration T2 the buffer write is delayed, and the dotted line 501 and the solid line 502 may intersect at some point, for example at point X in Fig. 5(b). This not only can corrupt the picture; because the screen is refreshed with the previous frame data before the intersection point X and with the updated new data after it, the tearing caused by switching between old and new data also degrades the experience of the user wearing the smart glasses.
In some embodiments, the display screen of the master glasses part may trigger the first reference signal at a time a duration T1 before the current screen refresh is completed, and output it to the processing section of the master glasses part. That is, the display screen of the master glasses part may be further configured to trigger the first reference signal a duration T1 before the current screen refresh is completed and to output the first reference signal to the processing section of the master glasses part. As shown in fig. 5(b), in scanning display period 2 the display screen of the master glasses part finishes refreshing the current screen at time t0, and the time t1 at which the first reference signal is triggered is set a duration T1 before t0. By arranging T1 and T2 cooperatively, the time t3 at which the processing section of the master glasses part starts writing data can always be made no later than the start (i.e., the falling edge of the TE signal) of the buffer read shown by the solid line 502 in scanning display period 3; that is, the dashed line 501 and the solid line 502 can always be kept from intersecting, and the data in the buffer is updated faster than the display screen refreshes in the current scanning display period, so that neither screen corruption nor screen tearing occurs. Assuming that the master and slave glasses parts write data into their frame image buffers with the timing shown in fig. 4(a), the dashed line 501 and the solid line 502 are guaranteed to be disjoint as long as T1 is greater than T3. Therefore, for the master glasses part, by setting the duration T1 appropriately in relation to T2, screen corruption or screen tearing on the display screen of the master glasses part can be effectively avoided.
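The timing condition described above can be summarized in a short sketch. The following Python snippet is a minimal illustration only; the function name, parameter names and all numeric values are hypothetical and are not taken from the embodiments or the figures. It merely checks that, for a given lead time T1 and processing delay T2, the buffer write starts no later than the next scan-out and completes faster than one screen refresh, which is the condition under which the dashed line 501 and the solid line 502 do not intersect.

```python
# Minimal timing sketch (hypothetical names and values; illustration only).
# t0 marks the TE rising edge (end of the current refresh); the first reference
# signal fires T1 before t0; the processing part starts writing T2 after that
# signal; the next scan-out (TE falling edge) starts after the blanking period.

def tearing_free(t0_ms: float, T1_ms: float, T2_ms: float,
                 blank_ms: float, write_ms: float, scan_ms: float) -> bool:
    """Return True if the frame-buffer write stays ahead of the next scan-out."""
    t_write_start = (t0_ms - T1_ms) + T2_ms   # t3: when the buffer write begins
    t_read_start = t0_ms + blank_ms           # next TE falling edge: scan-out begins
    starts_early_enough = t_write_start <= t_read_start
    writes_faster = write_ms < scan_ms        # dashed line 501 steeper than solid line 502
    return starts_early_enough and writes_faster

# Made-up example: 1 ms blanking, 4 ms to fill the buffer, 12 ms to scan a frame.
print(tearing_free(t0_ms=0.0, T1_ms=3.0, T2_ms=2.0,
                   blank_ms=1.0, write_ms=4.0, scan_ms=12.0))   # True
```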
Further, as mentioned above, the processing section of the master glasses part transmits the first reference signal, or a third reference signal triggered based on the first reference signal, to the processing section of the slave glasses part. The processing section of the slave glasses part may then generate a fourth reference signal based on the first or third reference signal and transmit it to the display screen of the slave glasses part, causing that display screen to refresh based on the fourth reference signal; the fourth reference signal may be the same signal as the second reference signal, or a signal having a cooperatively arranged delay relationship with the second reference signal. In this process, by adjusting the delay or advance between the reference signals, the screen refresh of the display screen of the master glasses part and that of the slave glasses part can be kept synchronous, i.e., the left and right display screens refresh synchronously. Furthermore, by adjusting the timing relationships among the signals, the screen refresh timing of the display screen of the slave glasses part and the write timing of its frame image buffer can be made to keep the same relative relationships as the corresponding signals of the master glasses part, so that whenever the master glasses part is guaranteed to be free of screen corruption or screen tearing, the slave glasses part necessarily is as well. Moreover, in some embodiments, if there is a fixed delay between the times at which the display screens of the left and right glasses parts start refreshing, for example an interval of one line, the timing between the signals (for example, the values of T2 and T4 in fig. 5(a)) can be adjusted so that synchronous refreshing of the two display screens is still ensured.
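As a rough, non-limiting illustration of the delay arrangement between the two glasses parts, the sketch below computes the slave-side write delay from the master-side values. The helper name, the sign convention for the panel offset, and the numeric values are assumptions introduced here for illustration only; without an offset the relation reduces to T2 = T3 + T4 as described in the embodiments.

```python
# Hypothetical helper (illustration only): choose the slave write delay T4 so
# that the master and slave buffer writes start at the same absolute time.

def slave_write_delay(T2_ms: float, T3_ms: float, panel_offset_ms: float = 0.0) -> float:
    """T2_ms: master write delay after the first reference signal.
    T3_ms: delay of the second reference signal relative to the first
           (e.g. master-to-slave transfer time).
    panel_offset_ms: optional fixed offset by which the slave panel starts its
           refresh later than the master panel (sign convention of this sketch).
    """
    T4_ms = T2_ms - T3_ms + panel_offset_ms
    if T4_ms < 0:
        raise ValueError("T2 (plus any panel offset) must cover the transfer delay T3")
    return T4_ms

# Master reference signal at t = 0: the master writes at T2; the slave's signal
# arrives at T3 and it writes T4 later, so both writes start at T2 = T3 + T4.
T2, T3 = 5.0, 1.5
T4 = slave_write_delay(T2, T3)
assert T3 + T4 == T2
print(f"T4 = {T4} ms")
```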
Through the above process, on the basis of the master and slave glasses parts synchronously writing video data into their respective frame image buffers (data synchronization), synchronous refreshing of the display screens of the two glasses parts (display synchronization) is further achieved, so that the video or images on the left and right display screens have a stronger stereoscopic effect and appear more natural and realistic, greatly improving the wearing experience of the user.
In addition, the display method for the wireless smart glasses according to embodiments of the present application may further include providing a DLNA module in each of the left and right glasses parts. The DLNA protocol adopted by the DLNA module, whose full name is Digital Living Network Alliance and which was first proposed by Sony, Intel, Microsoft and others, is intended to enable interconnection over wireless and wired networks among computers and other electronic products such as mobile phones and tablets.
According to embodiments of the present application, with either one of the left and right glasses parts of the wireless smart glasses serving as the master glasses part and the other serving as the slave glasses part, the following steps may be executed:
In step S601, a first DLNA connection is established between the DLNA module of the master glasses part and the smart device, first control information of the smart device, including at least identification information of the target DMS terminal, is received via the first DLNA connection, and first video data is acquired from the target DMS terminal based on the first control information. A DMS (Digital Media Server) provides the capability of acquiring, recording and storing media files and of serving as their source.
In step S602, a second DLNA connection is established between the DLNA module of the master eyeglass portion and the DLNA module of the slave eyeglass portion, and second control information including at least identification information of the target DMS terminal is transmitted to the slave eyeglass portion via the second DLNA connection.
In some embodiments, the second control information may comprise, for example, information associated with a synchronization reference signal.
In step S603, the second control information is received by the DLNA module of the slave eyeglass portion via the second DLNA connection, and second video data identical to or different from the first video data is acquired from the target DMS terminal based on the second control information.
In step S604, the processing section of the master glasses part writes the acquired first video data into the frame image buffer in the display screen of the master glasses part based on the first reference signal.

In step S605, the processing section of the slave glasses part writes the acquired second video data into the frame image buffer in the display screen of the slave glasses part, in synchronization with the processing section of the master glasses part, based on the second reference signal set in cooperation with the first reference signal.
Through steps S601-S605, the left and right glasses parts of the wireless smart glasses according to embodiments of the present application may each acquire the video data to be displayed from the DMS terminal using the DLNA protocol, and the control information of the DLNA connection may carry information associated with the synchronization reference signal, such as the first clock value described above, thereby enabling the display screens of the master and slave glasses parts to synchronously display the data in their respective frame image buffers.
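By way of a non-limiting illustration, the self-contained sketch below mirrors the order of operations in steps S601-S605. Every name in it (ControlInfo, DlnaModule, Processor, the in-memory DMS_LIBRARY, and their methods) is a hypothetical stand-in introduced only for this sketch; it is not a real DLNA API and not part of the claimed method.

```python
# Self-contained flow sketch of steps S601-S605 (all names are hypothetical
# stand-ins; the "DMS" here is just an in-memory dictionary).

from dataclasses import dataclass

# Stand-in "DMS terminal" keyed by its identification information.
DMS_LIBRARY = {"dms-01": {"left": b"L-frames", "right": b"R-frames"}}


@dataclass
class ControlInfo:
    dms_id: str        # identification information of the target DMS terminal
    sync_clock: float  # information associated with the synchronization reference signal


class DlnaModule:
    def fetch_from_dms(self, dms_id: str, channel: str) -> bytes:
        # S601 / S603: acquire video data from the target DMS terminal.
        return DMS_LIBRARY[dms_id][channel]


class Processor:
    def __init__(self, name: str) -> None:
        self.name = name
        self.frame_buffer = b""

    def write_on_reference(self, video: bytes) -> None:
        # S604 / S605: write into the frame image buffer on the reference signal.
        self.frame_buffer = video


# S601: the master receives first control information over the first DLNA connection.
ctrl1 = ControlInfo(dms_id="dms-01", sync_clock=123.456)
master_dlna, slave_dlna = DlnaModule(), DlnaModule()
master_proc, slave_proc = Processor("master"), Processor("slave")
video1 = master_dlna.fetch_from_dms(ctrl1.dms_id, "left")

# S602: the master forwards second control information over the second DLNA connection.
ctrl2 = ControlInfo(dms_id=ctrl1.dms_id, sync_clock=ctrl1.sync_clock)

# S603: the slave acquires the same or different video data from the same DMS terminal.
video2 = slave_dlna.fetch_from_dms(ctrl2.dms_id, "right")

# S604 / S605: both processing parts write into their frame image buffers based on
# the cooperatively set first and second reference signals.
master_proc.write_on_reference(video1)
slave_proc.write_on_reference(video2)
print(master_proc.frame_buffer, slave_proc.frame_buffer)
```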
In some embodiments, the first DLNA connection and the second DLNA connection may be established based on a WIFI channel, or may be established based on other standard or proprietary wireless transmission protocols, which is not limited in this application.
In some embodiments, the target DMS may be located locally on the smart device, or may be located in a cloud server outside the smart device, which is not limited in this application.
Accordingly, each of the left and right glasses parts of the wireless smart glasses according to embodiments of the present application may be provided with a DLNA module. With either one of the left and right glasses parts serving as the master glasses part and the other serving as the slave glasses part, the DLNA module of the master glasses part may be configured to establish a first DLNA connection with the smart device, receive via the first DLNA connection first control information of the smart device including at least identification information of the target DMS terminal, and acquire first video data from the target DMS terminal based on the first control information. The DLNA module of the master glasses part may be further configured to establish a second DLNA connection with the DLNA module of the slave glasses part, and transmit second control information including at least the identification information of the target DMS terminal to the slave glasses part via the second DLNA connection. In some embodiments, the second control information may include, for example, the first reference signal or a third reference signal triggered based on the first reference signal. The DLNA module of the slave glasses part may be configured to receive the second control information via the second DLNA connection and acquire, from the target DMS terminal, second video data identical to or different from the first video data based on the second control information. In some embodiments, the processing section of the master glasses part may be configured to write the acquired first video data into the frame image buffer in the display screen of the master glasses part based on the first reference signal, and the processing section of the slave glasses part may be configured to write the acquired second video data into the frame image buffer in the display screen of the slave glasses part, in synchronization with the processing section of the master glasses part, based on the second reference signal set in cooperation with the first reference signal.
Moreover, although exemplary embodiments have been described herein, the scope of this application includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations or alterations based on the present application. The elements of the claims are to be interpreted broadly based on the language employed in the claims and are not limited to the examples described in the present specification or during the prosecution of the application, which examples are to be construed as non-exclusive. It is intended, therefore, that the specification and examples be considered as exemplary only, with the true scope and spirit being indicated by the following claims and their full scope of equivalents.

The above description is intended to be illustrative, and not restrictive. For example, the above-described examples (or one or more versions thereof) may be used in combination with each other. Other embodiments may be devised by those of ordinary skill in the art upon reading the above description. In addition, in the above detailed description, various features may be grouped together to streamline the application. This should not be interpreted as intending that an unclaimed disclosed feature is essential to any claim; rather, inventive subject matter may lie in less than all features of a particular disclosed embodiment. Thus, the following claims are hereby incorporated into the detailed description as examples or embodiments, with each claim standing on its own as a separate embodiment, and it is contemplated that these embodiments may be combined with each other in various combinations or permutations. The scope of the invention should be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present invention, the scope of which is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered as falling within the scope of the present invention.

Claims (19)

1. A display method for wireless smart glasses, the wireless smart glasses comprising a left glass portion and a right glass portion, the method comprising:
the method comprises the following steps that a processing part, a display screen and a first wireless module are respectively arranged in each of a left glasses part and a right glasses part, wherein a frame image buffer is arranged in the display screen of each glasses part;
receiving video data of the intelligent device by the first wireless module of each glasses part;
and writing the received video data into the frame image buffers in the respective display screens by the processing parts of the respective glasses parts based on the cooperatively arranged synchronous reference signals, so that the display screens of the respective glasses parts synchronously display the video data in the respective frame image buffers respectively.
2. The display method according to claim 1, further comprising: one of the left and right eyeglass portions is used as a master eyeglass portion, the other is used as a slave eyeglass portion,
triggering a first reference signal by a display screen of the main glasses part, and outputting the first reference signal to a processing part of the main glasses part;
transmitting, by the processing section of the master eyeglass portion, the first reference signal or a third reference signal triggered based on the first reference signal to the processing section of the slave eyeglass portion;
triggering, by a processing section of the slave glasses section, a second reference signal based on the first reference signal or the third reference signal, and outputting the second reference signal or a fourth reference signal triggered based on the first reference signal or the third reference signal to a display screen of the slave glasses section;
writing, by the processing portion of each of the eyeglass portions, the received video data into the frame image buffers in the respective display screens based on the cooperatively set synchronization reference signals, such that the display screens of the respective eyeglass portions respectively synchronously display the video data in the respective frame image buffers further comprise:
writing, by a processing section of the primary eyewear section, the received video data into a frame image buffer in a display screen of the primary eyewear section based on the first reference signal;
writing, by the processing section of the slave eyeglass section, the received video data into a frame image buffer in a display screen of the slave eyeglass section in synchronization with the processing section of the master eyeglass section based on the second reference signal;
and displaying, by the display screen of the slave eyeglass portion, the video data in the frame image buffer in the display screen of the slave eyeglass portion in synchronization with the display screen of the master eyeglass portion based on the second reference signal or the fourth reference signal.
3. The display method according to claim 2, wherein in the case that the trigger time of the second reference signal is different from the trigger time of the first reference signal by a time length of T3,
writing, by the processing section of the primary eyewear section, the received video data to a frame image buffer in a display screen of the primary eyewear section based on the first reference signal, further comprising: writing, by the processing part of the main glasses part, the received video data into the frame image buffer in the display screen of the main glasses part a duration T3 after the first reference signal is triggered.
4. The display method according to claim 2, wherein in the case that the trigger time of the second reference signal is different from the trigger time of the first reference signal by a time length of T3,
writing, by the processing section of the primary eyewear section, the received video data to a frame image buffer in a display screen of the primary eyewear section based on the first reference signal, further comprising: writing, by the processing part of the main glasses part, the received video data into the frame image buffer in the display screen of the main glasses part a duration T2 after the first reference signal is triggered;

writing, by the processing section of the slave eyeglass section, the received video data in the frame image buffer in the display screen of the slave eyeglass section in synchronization with the processing section of the master eyeglass section based on the second reference signal further includes: writing, by the processing part of the slave glasses part, the received video data into the frame image buffer in the display screen of the slave glasses part a duration T4 after the second reference signal is triggered;
where T2 is the sum of T4 and T3, so that the display screen of the master glasses section and the display screen of the slave glasses section respectively display the video data in the respective frame image buffers in synchronization.
5. The display method according to any one of claims 2 to 4, wherein triggering the first reference signal by the display screen of the main glasses section and outputting the first reference signal to the processing section of the main glasses section further comprises: triggering, by the display screen of the main glasses part, the first reference signal a duration T1 before the current screen refresh is completed, and outputting the first reference signal to the processing part of the main glasses part.
6. The display method according to any one of claims 2 to 4, further comprising:
the first reference signal, or a third reference signal triggered based on the first reference signal, is transmitted by the processing section of the master eyeglass portion to the processing section of the slave eyeglass portion using a wired connection.
7. The display method according to claim 6, wherein the display method further comprises connecting the processing section of each of the left and right glasses sections with a corresponding display screen or a display screen driving chip via a GPIO interface, and connecting the processing sections of the respective glasses sections with each other via the GPIO interface,
the triggering of the first reference signal by the display screen of the primary eyewear portion and the outputting of the first reference signal to the processing portion of the primary eyewear portion specifically include: triggering a first reference signal by a display screen or a display screen driving chip of the main glasses part, and outputting the first reference signal to a processing part of the main glasses part by the display screen or the display screen driving chip of the main glasses part through a GPIO (general purpose input/output) interface;
triggering, by the processing section of the slave glasses section, a second reference signal based on the first reference signal or the third reference signal, and outputting the second reference signal or a fourth reference signal triggered based on the first reference signal or the third reference signal to the display screen of the slave glasses section further includes:
triggering, by the processing section of the master eyewear section, a third reference signal based on the first reference signal and outputting the third reference signal to the processing section of the slave eyewear section via a GPIO interface;
triggering a second reference signal by a processing part of the slave glasses part based on the third reference signal, and outputting the second reference signal or the fourth reference signal to a display screen or a display screen driving chip of the slave glasses part via a GPIO interface.
8. The display method according to any one of claims 2 to 4, characterized in that the display method further comprises:
the processing parts of the left glasses part and the right glasses part are connected with the corresponding display screens through DSI serial ports, the DSI serial ports have DSI clocks, the first reference signal is associated with the DSI clock of the master glasses part, and the second reference signal is associated with the DSI clock of the slave glasses part;
a second wireless module is arranged in each of the left glasses part and the right glasses part, wherein the second wireless module is a Bluetooth module or a WiFi module, and a clock signal of the second wireless module of the left glasses part and a clock signal of the second wireless module of the right glasses part are kept synchronous;
and calibrating the DSI clock of the left glasses part and the DSI clock of the right glasses part by using the second wireless module and the first reference signal in the left glasses part and the second wireless module and the second reference signal in the right glasses part.
9. The display method according to claim 8, wherein calibrating the DSI clock of the left glasses part and the DSI clock of the right glasses part using the second wireless module and the first reference signal in the left glasses part and the second wireless module and the second reference signal in the right glasses part specifically comprises:
transmitting, by the processing section of the master eyeglass portion, a first clock value of a clock signal of a second wireless module corresponding to the master eyeglass portion at a time when the first reference signal is received to the processing section of the slave eyeglass portion by using the second wireless module of the master eyeglass portion;
the processing unit of the slave glasses unit generates a second reference signal associated with the DSI clock of the slave glasses unit in cooperation with the first reference signal, compares a second clock value corresponding to the clock signal of the second wireless module of the slave glasses unit at the time of the second reference signal with the first clock value received via the second wireless module of the slave glasses unit, and calibrates the DSI clock of the DSI serial port of the slave glasses unit based on a difference between the first clock value and the second clock value so that the DSI clock of the DSI serial port of the slave glasses unit and the DSI clock of the DSI serial port of the master glasses unit are synchronized.
10. The method as claimed in claim 9, wherein the second wireless module is the same wireless module as the first wireless module.
11. A pair of wireless intelligent glasses comprises a left glasses part and a right glasses part, and is characterized in that each of the left glasses part and the right glasses part respectively comprises a processing part, a display screen and a first wireless module, wherein a frame image buffer is arranged in the display screen of each glasses part,
the first wireless module of each eyeglass portion is configured to: receiving video data of the intelligent equipment;
the processing portion of each of the eyeglass portions is configured to: writing the received video data into frame image buffers in respective display screens based on the cooperatively set synchronous reference signals;
the display screen of each of the eyeglass portions is configured to: the video data in the respective frame image buffers are synchronously displayed.
12. The wireless smart glasses according to claim 11, wherein one of the left and right glasses sections is a master glasses section, and the other is a slave glasses section,
the display screen of the primary eyewear portion is further configured to: triggering a first reference signal, and outputting the first reference signal to the processing portion of the primary eyewear portion;
the processing portion of the primary eyewear portion is further configured to: transmitting the first reference signal or a third reference signal triggered based on the first reference signal to a processing section of the slave glass section; writing the received video data to a frame image buffer in a display screen of a primary eyewear portion based on the first reference signal;
the processing portion of the slave eyeglass portion is further configured to: triggering a second reference signal based on the first reference signal or the third reference signal, and outputting the second reference signal or a fourth reference signal triggered based on the first reference signal or the third reference signal to a display screen of a slave glasses part; writing the received video data to a frame image buffer in a display screen of a slave eyeglass portion in synchronization with a processing portion of the master eyeglass portion based on the second reference signal;
the display screen of the slave eyeglass portion is further configured to: and displaying the video data in the frame image buffer in the display screen of the slave glasses part synchronously with the display screen of the master glasses part based on the second reference signal or the fourth reference signal.
13. The wireless smart glasses of claim 12, wherein in the case that the triggering time of the second reference signal is different from the triggering time of the first reference signal by a time length of T3,
the processing portion of the primary eyewear portion is further configured to: writing the received video data into a frame image buffer in the display screen of the main glasses part a duration T3 after the first reference signal is triggered.
14. The wireless smart glasses according to claim 12, wherein in case that the triggering time of the second reference signal is different from the triggering time of the first reference signal by T3,
the processing portion of the primary eyewear portion is further configured to: writing the received video data into a frame image buffer in the display screen of the primary eyewear portion a duration T2 after the first reference signal is triggered;

the processing portion of the slave eyewear portion is further configured to: writing the received video data into a frame image buffer in the display screen of the slave eyewear portion a duration T4 after the second reference signal is triggered;
where T2 is the sum of T4 and T3, so that the display panel of the master glasses section and the display panel of the slave glasses section display the video data in the respective frame image buffers in synchronization with each other.
15. The wireless smart eyewear of any of claims 12-14, wherein the display screen of the primary eyewear portion is further configured to: triggering the first reference signal a duration T1 before the current screen refresh is completed, and outputting the first reference signal to the processing part of the main glasses part.
16. The wireless smart eyewear according to any one of claims 12-14,
the processing portion of the primary eyewear portion is further configured to: the first reference signal, or a third reference signal triggered based on the first reference signal, is transmitted to the processing section of the slave eyeglass section using a wired connection.
17. The wireless intelligent glasses according to claim 16, wherein the processing portion of each of the left and right glasses is connected to the corresponding display screen or the display screen driving chip via a GPIO interface, and the processing portions of the glasses are connected to each other via the GPIO interface,
the display screen of the primary eyewear portion is further configured to: triggering a first reference signal by a display screen or a display screen driving chip of the main glasses part, and outputting the first reference signal to a processing part of the main glasses part through a GPIO (general purpose input/output) interface;
the processing portion of the primary eyewear portion is further configured to: triggering a third reference signal based on the first reference signal and outputting the third reference signal to a processing section of the slave glasses section via a GPIO interface;
the processing section of the slave eyeglass section is further configured to: and triggering a second reference signal based on the third reference signal, and outputting the second reference signal or the fourth reference signal to a display screen or a display screen driving chip of the slave glasses part through a GPIO (general purpose input/output) interface.
18. The wireless smart eyewear of any of claims 12-14, wherein the processing portion of each of the left and right eyewear portions is connected to the corresponding display screen via a DSI serial port having a DSI clock, and wherein the first reference signal is associated with the DSI clock of the master eyewear portion and the second reference signal is associated with the DSI clock of the slave eyewear portion;
each of the left glasses part and the right glasses part further comprises a second wireless module, wherein the second wireless module is a Bluetooth module or a WiFi module, and a clock signal of the second wireless module of the left glasses part and a clock signal of the second wireless module of the right glasses part are kept synchronous;
the processing portion of the primary eyewear portion is further configured to: transmitting a first clock value of a clock signal of a second wireless module corresponding to the master glasses part at the moment of receiving the first reference signal to a processing part of the slave glasses part by using the second wireless module of the master glasses part;
the processing portion of the slave eyewear portion is further configured to:
generating a second reference signal associated with a DSI clock of the slave glass section in cooperation with the first reference signal;
and comparing a second clock value, which is the value of the clock signal of the second wireless module of the slave glasses part at the time of the second reference signal, with the first clock value received via the second wireless module of the slave glasses part, and calibrating the DSI clock of the DSI serial port of the slave glasses part based on the difference between the first clock value and the second clock value, so that the DSI clock of the DSI serial port of the slave glasses part and the DSI clock of the DSI serial port of the master glasses part are kept synchronized.
19. The wireless smart eyewear of claim 18, wherein the second wireless module is the same wireless module as the first wireless module.
CN202210770749.9A 2022-06-30 2022-06-30 Display method for wireless intelligent glasses and wireless intelligent glasses Active CN115032797B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210770749.9A CN115032797B (en) 2022-06-30 2022-06-30 Display method for wireless intelligent glasses and wireless intelligent glasses

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210770749.9A CN115032797B (en) 2022-06-30 2022-06-30 Display method for wireless intelligent glasses and wireless intelligent glasses

Publications (2)

Publication Number Publication Date
CN115032797A true CN115032797A (en) 2022-09-09
CN115032797B CN115032797B (en) 2023-12-08

Family

ID=83128430

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210770749.9A Active CN115032797B (en) 2022-06-30 2022-06-30 Display method for wireless intelligent glasses and wireless intelligent glasses

Country Status (1)

Country Link
CN (1) CN115032797B (en)

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101415126A (en) * 2007-10-18 2009-04-22 深圳Tcl新技术有限公司 Method for generating three-dimensional image effect and digital video apparatus
CN101547371A (en) * 2008-03-24 2009-09-30 三星电子株式会社 Method for generating signal to display three-dimensional (3d) image and image display apparatus using the same
CN102055994A (en) * 2009-11-05 2011-05-11 乐金显示有限公司 Stereoscopic display device
CN201887901U (en) * 2010-12-21 2011-06-29 北京睿为视讯技术有限公司 Three-dimensional video playing system
CN102547313A (en) * 2010-12-21 2012-07-04 北京睿为视讯技术有限公司 Three-dimensional video play system and method thereof
CN102740087A (en) * 2011-04-06 2012-10-17 云南北方奥雷德光电科技股份有限公司 Active 3d glasses
CN102901703A (en) * 2012-10-10 2013-01-30 彩虹集团公司 Three-dimensional (3D) image displaying method for security inspection equipment
CN103327352A (en) * 2013-05-03 2013-09-25 四川虹视显示技术有限公司 Device and method for achieving double display screen 3D display by adoption of serial processing mode
CN109426474A (en) * 2017-08-28 2019-03-05 珠海全志科技股份有限公司 A kind of double-display screen synchronization system
CN110087054A (en) * 2019-06-06 2019-08-02 北京七鑫易维科技有限公司 The processing method of image, apparatus and system
CN110140353A (en) * 2016-04-01 2019-08-16 线性代数技术有限公司 System and method for being suitable for the head-mounted display of visual perception
CN112399167A (en) * 2020-12-08 2021-02-23 恒玄科技(北京)有限公司 A intelligent glasses for radio communication
CN113794871A (en) * 2021-09-16 2021-12-14 京东方科技集团股份有限公司 Delay device, delay method thereof and display system
CN115552451A (en) * 2020-05-08 2022-12-30 高通股份有限公司 Multi-layer reprojection techniques for augmented reality

Also Published As

Publication number Publication date
CN115032797B (en) 2023-12-08

Similar Documents

Publication Publication Date Title
JP6465946B2 (en) Distributed video display system, control device, and control method
CN102045157B (en) Methods and systems for updating a buffer
US8692838B2 (en) Methods and systems for updating a buffer
CN101982978B (en) System and method for controlling stereo glasses shutters
US8645585B2 (en) System and method for dynamically configuring a serial data link in a display device
WO2016091082A1 (en) Multi-screen joint display processing method and device
JP2016509425A (en) Synchronous signal processing method and apparatus for stereoscopic display of splice screen, splice screen
US20030016223A1 (en) Drawing apparatus
EP4343407A1 (en) Method for refreshing screen of head-mounted display device and head-mounted display device
CN111355861A (en) Multi-screen video synchronous splicing device and method
CN103019639B (en) A kind of multiprocessor splicing synchronous display system
WO2024051386A1 (en) Tiled display screen, and control system for tiled display screen
CN103838533B (en) The synchronous method of figure signal and sync card in computer cluster splice displaying system
CN115032797B (en) Display method for wireless intelligent glasses and wireless intelligent glasses
CN115151886A (en) Delaying DSI clock changes based on frame updates to provide a smoother user interface experience
CN111586454B (en) Large screen splicing synchronization method and system
CN112309311B (en) Display control method, device, display control card and computer readable medium
CN115167669A (en) Intelligent glasses, synchronous display method and medium
CN113099212A (en) 3D display method, device, computer equipment and storage medium
US20220244777A1 (en) Interface between host processor and wireless processor for artificial reality
TWI389511B (en) Methods and systems for updating a buffer
TWI376114B (en) Methods and systems for synchronous execution of commands across a communication link
CN117193691A (en) Synchronous display method, device and display system of pictures in display module
MX2007006198A (en) Methods and systems for updating a buffer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant