CN112558768A - Function interface proportion control method and system and AR glasses thereof - Google Patents

Info

Publication number: CN112558768A
Application number: CN202011445885.8A
Authority: CN (China)
Prior art keywords: glasses, interface, wearer, proportion, glasses wearer
Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 孙立, 陈婧, 刘晖
Current and original assignee: Shanghai Shadow Creator Information Technology Co Ltd
Application filed by: Shanghai Shadow Creator Information Technology Co Ltd
Priority number: CN202011445885.8A
Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture-based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a method and a system for controlling the proportion of functional interfaces, and AR glasses thereof. The method comprises the following steps. A proportion control interface display step: displaying a proportion control interface to the AR glasses wearer through the AR glasses. A gradual display step: on the basis of the displayed proportion control interface, presenting the functional interfaces to the wearer one by one through the AR glasses. A proportion control operation step: selecting the proportion control interface according to the wearer's gaze point, and performing proportion fixing and/or proportion unlocking on it using the wearer's breathing characteristics. The invention allows the AR glasses wearer to control the number of functional interfaces through breathing characteristics alone. Because gesture control is realized through breathing, the wearer can still operate the AR glasses when both hands carry heavy objects and waving a hand is inconvenient, and operation efficiency is improved when many functional interfaces are displayed.

Description

Function interface proportion control method and system and AR glasses thereof
Technical Field
The invention relates to the field of AR (augmented reality) glasses, and in particular to a method and a system for controlling the proportion of a functional interface, and AR glasses thereof.
Background
Patent document CN210720884U provides AR glasses comprising two AR lenses arranged at an interval and a spectacle frame fixing them. A cavity in the frame accommodates a microprocessor, and displays electrically connected to the microprocessor are arranged on the sides of the AR lenses facing away from the eyes. The frame carries a first camera and a second camera for collecting images within the wearer's visual range, and a third camera for collecting the wearer's gesture actions, each electrically connected to the microprocessor. That design is structurally simple: the two cameras capture a three-dimensional real scene and the third camera performs gesture recognition, detecting gesture actions in front of the glasses and generating corresponding control commands, so the wearer can interact with virtual objects in the real scene and experience the fusion of the real and virtual worlds.
The defect of this prior art is that hand motions must be performed as gestures in front of the glasses.
Disclosure of Invention
In view of the defects in the prior art, the object of the invention is to provide a method and a system for controlling the proportion of a functional interface, and AR glasses thereof.
The invention provides a method for controlling the proportion of functional interfaces, which comprises the following steps:
a proportion control interface display step: displaying a proportion control interface to the AR glasses wearer through the AR glasses;
a gradual display step: on the basis of the displayed proportion control interface, presenting the functional interfaces to the AR glasses wearer one by one through the AR glasses;
a proportion control operation step: selecting the proportion control interface according to the gaze point of the AR glasses wearer, and performing proportion fixing and/or proportion unlocking on the proportion control interface using the breathing characteristics of the AR glasses wearer; wherein proportion fixing is the operation of stopping changes to the number of functional interfaces, so that the picture proportion occupied by the functional interfaces in the picture the AR glasses display to the wearer is fixed; proportion unlocking is the operation of allowing the number of functional interfaces to change, so that this picture proportion may change.
Preferably, in the proportion control operation step,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, proportion fixing is performed on the proportion control interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in the first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, proportion unlocking is performed on the proportion control interface; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply.
Preferably, after proportion unlocking is performed on the proportion control interface, the following steps are triggered and allowed to execute:
a multifunctional interface display step: determining a plurality of functional interfaces that have been displayed to the AR glasses wearer through the AR glasses;
an interface selection step: selecting a functional interface according to the gaze point of the AR glasses wearer;
a selected interface processing step: processing the selected interface using the breathing characteristics of the AR glasses wearer;
an interface full-selection processing step: for the plurality of displayed functional interfaces, if the gaze point of the AR glasses wearer falls on no functional interface and the breathing characteristic gesture of the AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation and hiding all displayed functional interfaces; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply.
Preferably, the multifunctional interface displaying step includes:
scene presetting step: establishing a preset scene library, wherein the preset scene library comprises one or more preset scenes;
an image acquisition step: the AR glasses acquire real-time images of the real-time environment in front of the AR glasses through the camera to obtain real-time images;
scene matching: screening a preset scene matched with the real-time real image from a preset scene library, and recording the preset scene as a matched scene;
a wake-up step: displaying a virtual object corresponding to the matching scene to an AR glasses wearer through AR glasses, wherein the virtual object comprises a plurality of functional interfaces;
the selected interface processing step comprises:
a breathing characteristic gesture obtaining step: in a first time period after the functional interface is selected, if a breathing characteristic gesture of the AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation, and hiding the displayed functional interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
a breathing characteristic sound obtaining step: in a first time period after the functional interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, defining the real-time real image and the preset scene as a high matching relation, and keeping the displayed functional interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
The invention provides a function interface proportion control system, which comprises:
the proportion control interface display module: displays a proportion control interface to the AR glasses wearer through the AR glasses;
a gradual display module: on the basis of the displayed proportion control interface, presents the functional interfaces to the AR glasses wearer one by one through the AR glasses;
the proportion control operation module: selects the proportion control interface according to the gaze point of the AR glasses wearer, and performs proportion fixing and/or proportion unlocking on the proportion control interface using the breathing characteristics of the AR glasses wearer; wherein proportion fixing is the operation of stopping changes to the number of functional interfaces, so that the picture proportion occupied by the functional interfaces in the picture the AR glasses display to the wearer is fixed; proportion unlocking is the operation of allowing the number of functional interfaces to change, so that this picture proportion may change.
Preferably, in the proportion control operation module,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, proportion fixing is performed on the proportion control interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in the first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, proportion unlocking is performed on the proportion control interface; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply.
Preferably, after proportion unlocking is performed on the proportion control interface, the following modules are triggered and allowed to execute:
the multifunctional interface display module: determines a plurality of functional interfaces that have been displayed to the AR glasses wearer through the AR glasses;
an interface selection module: selects a functional interface according to the gaze point of the AR glasses wearer;
a selected interface processing module: processes the selected interface using the breathing characteristics of the AR glasses wearer;
an interface full-selection processing module: for the plurality of displayed functional interfaces, if the gaze point of the AR glasses wearer falls on no functional interface and the breathing characteristic gesture of the AR glasses wearer is detected, defines the real-time real image and the preset scene as a non-matching relation and hides all displayed functional interfaces; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply.
Preferably, the multifunctional interface display module includes:
a scene presetting module: establishing a preset scene library, wherein the preset scene library comprises one or more preset scenes;
an image acquisition module: the AR glasses acquire real-time images of the real-time environment in front of the AR glasses through the camera to obtain real-time images;
a scene matching module: screening a preset scene matched with the real-time real image from a preset scene library, and recording the preset scene as a matched scene;
a wake-up module: displaying a virtual object corresponding to the matching scene to an AR glasses wearer through AR glasses, wherein the virtual object comprises a plurality of functional interfaces;
the selected interface processing module comprises:
a breathing characteristic gesture acquisition module: in a first time period after the functional interface is selected, if a breathing characteristic gesture of the AR glasses wearer is detected, defines the real-time real image and the preset scene as a non-matching relation, and hides the displayed functional interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
breath characteristic sound acquisition module: in a first time period after the functional interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, defining the real-time real image and the preset scene as a high matching relation, and keeping the displayed functional interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
According to the present invention, there is provided a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the functional interface proportion control method.
According to the invention, the AR glasses comprise the function interface proportion control system, or the computer-readable storage medium storing the computer program.
Compared with the prior art, the invention has the following beneficial effects:
the invention can allow the AR glasses wearer to control the number of functional interfaces through the breathing characteristics. The invention realizes gesture control by utilizing the breathing characteristic, can still control the AR glasses by the invention when both hands of the AR glasses wearer carry heavy objects to be inconvenient for waving hands, and can improve the operation efficiency for a plurality of functional interfaces.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of the method steps of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but are not intended to limit the invention in any way. It should be noted that various changes and modifications can be made by those skilled in the art without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The invention provides a method for controlling the proportion of functional interfaces, which comprises the following steps:
A proportion control interface display step: displaying a proportion control interface to the AR glasses wearer through the AR glasses. Specifically, the proportion control interface is a human-computer interaction interface on which proportion fixing and proportion unlocking operations are allowed.
A gradual display step: on the basis of the displayed proportion control interface, the functional interfaces are presented to the AR glasses wearer one by one through the AR glasses. In particular, by the time the number of functional interfaces becomes large enough to matter (for example, when adding one more would make the functional interfaces occupy too large a proportion of the picture and hinder the wearer's view of the real environment), the wearer has had time to operate the proportion control interface. Controlling the number of functional interfaces thus controls their proportion; the relation between the number and the proportion may be linear or non-linear.
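The count-to-proportion relation and the one-by-one display described above can be sketched in Python. This is a minimal illustration assuming a linear mapping; the per-interface percentage, the cap, and the function names are assumptions, not values from the patent:

```python
# Hypothetical sketch: map the number of displayed functional interfaces to
# the percentage of the AR picture they occupy, and stop revealing interfaces
# once the next one would exceed a cap that keeps the real scene visible.

PER_INTERFACE_PERCENT = 8   # assumed picture percentage per interface
MAX_PERCENT = 40            # assumed cap on the total picture proportion

def proportion_for(count: int) -> int:
    """Linear mapping from interface count to picture proportion (percent)."""
    return count * PER_INTERFACE_PERCENT

def can_add_interface(count: int) -> bool:
    """True while one more interface still fits under the proportion cap."""
    return proportion_for(count + 1) <= MAX_PERCENT

def display_one_by_one(max_available: int) -> int:
    """Reveal interfaces one at a time until the proportion cap is reached."""
    shown = 0
    while shown < max_available and can_add_interface(shown):
        shown += 1
    return shown
```

With these assumed values, the glasses would stop at five interfaces (40% of the picture), leaving the wearer time to fix the proportion before the view of the real environment is affected.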
A proportion control operation step: the proportion control interface is selected according to the gaze point of the AR glasses wearer, and proportion fixing and/or proportion unlocking is performed on it using the wearer's breathing characteristics; proportion fixing is the operation of stopping changes to the number of functional interfaces, so that the picture proportion occupied by the functional interfaces in the picture the AR glasses display to the wearer is fixed; proportion unlocking is the operation of allowing the number of functional interfaces to change, so that this picture proportion may change. Specifically, for the implementation of a gaze point, reference is made at least to patent document CN111757090A, which provides a real-time VR image filtering method, system and storage medium based on gaze point information, including: capturing a human-eye image with a high-speed camera in a head-mounted display device (such as AR glasses) and extracting the eyeball's gaze point information from it; determining an observation angle of view from the head position information of the head-mounted display device and rendering the VR picture observed at that angle; and determining the gaze point area of the VR picture from the gaze point information, selecting the non-gaze-point area of the VR picture according to the gaze point area, and filtering the non-gaze-point area.
In the proportion control operation step,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, proportion fixing is performed on the proportion control interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in the first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, proportion unlocking is performed on the proportion control interface; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply.
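The fix-or-unlock decision in this step can be sketched as follows. The event encoding, the function name, and the 12-second window (three normal breaths of roughly four seconds each) are illustrative assumptions:

```python
# Hypothetical decision logic for the proportion control operation step.
# Within the first time period after the proportion control interface is
# selected, a breathing-characteristic gesture fixes the proportion; a
# breathing-characteristic sound without a gesture unlocks it.

FIRST_TIME_PERIOD_S = 3 * 4.0  # assumed: 3 normal breaths of ~4 s each

def proportion_action(events, selected_at):
    """Return 'fix', 'unlock', or None from (timestamp, kind) events.

    kind is 'gesture' (breathing-characteristic head movement detected)
    or 'sound' (breathing-characteristic sound detected)."""
    window = [kind for t, kind in events
              if selected_at <= t <= selected_at + FIRST_TIME_PERIOD_S]
    if 'gesture' in window:
        return 'fix'      # proportion fixing: interface count frozen
    if 'sound' in window:
        return 'unlock'   # proportion unlocking: count may change again
    return None           # no breathing event in the window: no operation
```

Note that the gesture takes priority, matching the description: unlocking requires the sound to appear while the gesture is absent.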
After proportion unlocking is performed on the proportion control interface, the function interface proportion control method triggers and allows the following steps:
a multifunctional interface display step: determining a plurality of functional interfaces that have been displayed to an AR glasses wearer through AR glasses; specifically, the plurality of functional interfaces are distributed without overlapping, and the functional interfaces are human-computer interaction interfaces.
An interface selection step: selecting a functional interface according to the gaze point of the AR glasses wearer; in the interface selection step, if the gaze point dwells on a functional interface for longer than a set time value, that functional interface is considered selected.
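The dwell-time selection rule above can be sketched as follows. The 0.8-second threshold, the rectangle representation of an interface, and the requirement that the dwell be continuous are assumptions for illustration:

```python
# Hypothetical gaze-dwell selection: a functional interface counts as
# selected once the wearer's gaze point stays inside its rectangle for
# longer than a set time value.

DWELL_THRESHOLD_S = 0.8  # assumed set time value for selection

def inside(rect, point):
    """rect = (x, y, w, h); point = (px, py)."""
    x, y, w, h = rect
    px, py = point
    return x <= px <= x + w and y <= py <= y + h

def selected_interface(interfaces, gaze_samples, dt):
    """interfaces: {name: rect}; gaze_samples: (px, py) points at dt spacing.
    Returns the first interface whose continuous dwell exceeds the threshold."""
    dwell = {name: 0.0 for name in interfaces}
    for point in gaze_samples:
        for name, rect in interfaces.items():
            if inside(rect, point):
                dwell[name] += dt
                if dwell[name] > DWELL_THRESHOLD_S:
                    return name
            else:
                dwell[name] = 0.0  # assumed: dwell must be uninterrupted
    return None
```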
A selected interface processing step: processing the selected interface using the breathing characteristics of the AR glasses wearer. In this way the AR glasses wearer can issue two different operation instructions by breathing alone, without any hand action: defining the real-time real image and the preset scene as a non-matching relation and hiding the displayed functional interface, or defining them as a high-matching relation and keeping the displayed functional interface.
An interface full-selection processing step: for the plurality of displayed functional interfaces, if the gaze point of the AR glasses wearer falls on no functional interface and the breathing characteristic gesture of the AR glasses wearer is detected, the real-time real image and the preset scene are defined as a non-matching relation and all displayed functional interfaces are hidden; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply. This realizes unified, quick operation on a plurality of functional interfaces at once.
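A minimal state sketch of this full-selection rule, assuming the display state is modeled as a plain dictionary; the field names are illustrative:

```python
# Hypothetical sketch of the interface full-selection step: when the gaze
# point lies on no functional interface and a breathing-characteristic
# gesture arrives, every displayed interface is hidden and the scene match
# is cleared (the non-matching relation).

def full_selection_step(state, gaze_on_interface, gesture_detected):
    """state: {'visible': [names], 'scene_match': bool}; returns new state."""
    if not gaze_on_interface and gesture_detected:
        return {'visible': [], 'scene_match': False}  # hide all, unmatch scene
    return state  # otherwise the display is left unchanged
```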
In a preferred embodiment, the step of displaying the multifunctional interface includes:
scene presetting step: establishing a preset scene library, wherein the preset scene library comprises one or more preset scenes; specifically, the preset scene includes a scene image, which may be a color rendering image, an edge feature image, or a gray histogram, that is, the scene image is an image capable of representing preset scene features and serves as a matching comparison object.
An image acquisition step: the AR glasses capture, through their camera, real-time images of the real-time environment in front of the glasses to obtain the real-time real image. The AR glasses are provided with a camera that shoots real-time pictures as the real-time real image, much as a smartphone's front or rear camera is used during a WeChat video chat. The camera's orientation is coaxial with, or parallel to, the straight-ahead direction of the AR glasses. The angle of view in front of the AR glasses depends on the camera's angle of view; for example, with a wide-angle lens it is larger than with a standard lens.
Scene matching: screening a preset scene matched with the real-time real image from a preset scene library, and recording the preset scene as a matched scene; specifically, matching is performed in an image contrast manner, the preset scene with the highest matching degree is recorded as a matching scene, for example, the real-time real image and the scene images of each preset scene are subjected to image contrast matching, so as to screen out the scene image with the highest matching degree, and the preset scene corresponding to the scene image with the highest matching degree is taken as the matching scene.
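One way to realise the image-contrast matching described here, assuming the gray-histogram form of the scene image mentioned in the scene presetting step, is histogram intersection. The bin count, the similarity measure, and the library contents are illustrative assumptions:

```python
# Hypothetical scene matching by grayscale-histogram comparison: the live
# image's histogram is compared against each preset scene image, and the
# preset scene with the highest similarity is recorded as the matched scene.

def histogram(pixels, bins=4, max_value=256):
    """Grayscale histogram, normalised so the bins sum to 1."""
    counts = [0] * bins
    for p in pixels:
        counts[p * bins // max_value] += 1
    total = len(pixels)
    return [c / total for c in counts]

def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

def match_scene(live_pixels, scene_library):
    """scene_library: {name: pixel list}. Returns (best name, similarity)."""
    live = histogram(live_pixels)
    best = max(scene_library,
               key=lambda name: similarity(live, histogram(scene_library[name])))
    return best, similarity(live, histogram(scene_library[best]))
```

A production system would use richer features (the description also mentions color rendering and edge-feature images), but the screen-and-record-the-best structure is the same.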
A wake-up step: and displaying a virtual object corresponding to the matching scene to an AR glasses wearer through AR glasses, wherein the virtual object comprises a plurality of functional interfaces. For example, when the wearer of the AR glasses stands at an intersection, the matching scene is the intersection of the road, and the functional interface is an interface with an electronic map navigation function.
The selected interface processing step comprises:
A breathing characteristic gesture obtaining step: in a first time period after the functional interface is selected, if a breathing characteristic gesture of the AR glasses wearer is detected, the real-time real image and the preset scene are defined as a non-matching relation, and the displayed functional interface is hidden; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply. Specifically, whenever the AR glasses wearer breathes, the rise and fall of the expanding and contracting chest moves the head; the fluctuation pattern of a sigh or deep breath differs from that of ordinary breathing, and so does the corresponding head movement. By detecting head movement, the corresponding fluctuation pattern can be recovered, revealing the wearer's sighs and deep breaths.
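A hedged sketch of detecting the breathing characteristic gesture from head movement, assuming vertical head-position samples (e.g. from an inertial sensor in the glasses) and an illustrative amplitude threshold separating deep breaths from ordinary breathing; a real detector would also model the fluctuation pattern over time:

```python
# Hypothetical detector for the breathing-characteristic gesture: a sigh or
# deep breath moves the chest, and with it the head and the glasses, with a
# larger amplitude than ordinary breathing. Here the vertical swing of the
# head over the detection window is checked against an assumed 4 mm threshold.

DEEP_BREATH_AMPLITUDE_M = 0.004  # assumed: ordinary breathing stays below this

def breathing_gesture_detected(head_heights):
    """head_heights: vertical head positions (metres) over the window."""
    if not head_heights:
        return False
    swing = max(head_heights) - min(head_heights)
    return swing > DEEP_BREATH_AMPLITUDE_M
```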
A breathing characteristic sound obtaining step: in a first time period after the functional interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture is not detected, the real-time real image and the preset scene are defined as a high-matching relation, and the displayed functional interface is kept; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply. Specifically, whenever the AR glasses wearer breathes, the chest's expansion, contraction, inhalation and exhalation make a sound, and the sound pattern of a sigh or deep breath differs from that of ordinary breathing; detecting the breathing sound therefore reveals the wearer's sighs and deep breaths. If the AR glasses detect the breathing characteristic sound while the breathing characteristic gesture is absent, the wearer is considered to be deliberately controlling the head (for example, holding it still), and this constitutes an operation instruction: the real-time real image and the preset scene are defined as a high-matching relation, and the displayed functional interface is kept.
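A corresponding sketch for the breathing characteristic sound, assuming normalised microphone samples and an illustrative loudness (RMS) threshold; as with the gesture, a real detector would also model the sound's temporal pattern rather than loudness alone:

```python
# Hypothetical detector for the breathing-characteristic sound: a sigh or
# deep exhalation is louder than ordinary breathing, so a short-term RMS
# energy threshold over microphone frames can flag it.

import math

SIGH_RMS_THRESHOLD = 0.2  # assumed, for samples normalised to [-1, 1]

def rms(samples):
    """Root-mean-square energy of one frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def breathing_sound_detected(samples, frame=4):
    """True if any frame's RMS energy exceeds the sigh threshold."""
    return any(rms(samples[i:i + frame]) > SIGH_RMS_THRESHOLD
               for i in range(0, len(samples) - frame + 1, frame))
```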
The first time period may be a time of 2-5 normal breaths, for example 3 normal breaths, by the wearer of the AR glasses.
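Under the assumption of a measured (or default) per-wearer breath period, the first time period can be derived directly; the 4-second default breath period is an assumption, while the 2-5 breath range comes from the description:

```python
# Hypothetical derivation of the first time period from the wearer's own
# breathing rate: the window spans n normal breaths (2-5 per the description,
# 3 by default).

def first_time_period(breath_period_s: float = 4.0, n_breaths: int = 3) -> float:
    """Length of the detection window, in seconds."""
    if not 2 <= n_breaths <= 5:
        raise ValueError("the description suggests 2-5 normal breaths")
    return n_breaths * breath_period_s
```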
Those skilled in the art can regard the function interface proportion control system as an embodiment of the function interface proportion control method; that is, the system can be realized by executing the step flow of the method.
The invention provides a function interface proportion control system, which comprises:
The proportion control interface display module: displays a proportion control interface to the AR glasses wearer through the AR glasses; specifically, the proportion control interface is a human-computer interaction interface on which proportion fixing and proportion unlocking operations are allowed.
A gradual display module: on the basis of the displayed proportion control interface, presents the functional interfaces to the AR glasses wearer one by one through the AR glasses. In particular, by the time the number of functional interfaces becomes large enough to matter (for example, when adding one more would make the functional interfaces occupy too large a proportion of the picture and hinder the wearer's view of the real environment), the wearer has had time to operate the proportion control interface. Controlling the number of functional interfaces thus controls their proportion; the relation between the number and the proportion may be linear or non-linear.
The proportion control operation module: selects the proportion control interface according to the gaze point of the AR glasses wearer, and performs proportion fixing and/or proportion unlocking on it using the wearer's breathing characteristics; proportion fixing is the operation of stopping changes to the number of functional interfaces, so that the picture proportion occupied by the functional interfaces in the picture the AR glasses display to the wearer is fixed; proportion unlocking is the operation of allowing the number of functional interfaces to change, so that this picture proportion may change. Specifically, for the implementation of a gaze point, reference is made at least to patent document CN111757090A, which provides a real-time VR image filtering method, system and storage medium based on gaze point information, including: capturing a human-eye image with a high-speed camera in a head-mounted display device (such as AR glasses) and extracting the eyeball's gaze point information from it; determining an observation angle of view from the head position information of the head-mounted display device and rendering the VR picture observed at that angle; and determining the gaze point area of the VR picture from the gaze point information, selecting the non-gaze-point area of the VR picture according to the gaze point area, and filtering the non-gaze-point area.
In the duty control operation module, the duty ratio control operation module,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, proportion fixing is performed on the proportion control interface; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in the first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer but do not detect the breathing characteristic gesture, proportion unlocking is performed on the proportion control interface; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply.
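The fix/unlock decision described above reduces to a small piece of logic over two detector outputs. The sketch below assumes boolean detector results for one observation window; how the gesture and sound are detected is covered later in the description.

```python
# Hedged sketch of the proportion fix/unlock decision: within the first
# time period after the proportion control interface is selected, a
# detected breathing gesture fixes the proportion, while a breathing
# sound without a gesture unlocks it.


def ratio_command(gesture_detected, sound_detected):
    """Return 'fix', 'unlock', or None for one observation window."""
    if gesture_detected:
        return "fix"     # sigh/deep breath moved the head -> fix proportion
    if sound_detected:
        return "unlock"  # breath heard but head deliberately held still
    return None          # no breathing characteristic observed
```

Note that the gesture takes priority: a breathing sound accompanied by a breathing gesture still yields "fix", matching the "sound detected and gesture not detected" wording of the unlock condition.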
After proportion unlocking is performed on the proportion control interface, the following modules included in the function interface proportion control system are triggered and allowed to execute:
the multifunctional interface display module: determines a plurality of function interfaces that have been displayed to the AR glasses wearer through the AR glasses; specifically, the plurality of function interfaces are distributed without overlapping, and each function interface is a human-computer interaction interface.
An interface selection module: selects a function interface according to the gaze point of the AR glasses wearer; in the interface selection module, if the dwell time of the gaze point on a function interface exceeds a set time value, that function interface is considered selected.
The selected interface processing module: processes the selected interface using the breathing characteristics of the AR glasses wearer. In this way, with no hand action required, the AR glasses wearer can issue two different operation instructions merely by breathing: either the real-time real image and the preset scene are defined as a non-matching relation and the displayed function interface is hidden, or they are defined as a high-matching relation and the displayed function interface is retained.
The interface full-selection processing module: for the plurality of displayed function interfaces, if the gaze point of the AR glasses wearer does not fall on any function interface and the breathing characteristic gesture of the wearer is detected, the real-time real image and the preset scene are defined as a non-matching relation and all displayed function interfaces are hidden; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply. In this way, unified and rapid operation of multiple function interfaces is achieved.
In a preferred embodiment, the multifunctional interface display module includes:
a scene presetting module: establishes a preset scene library, wherein the preset scene library comprises one or more preset scenes; specifically, each preset scene includes a scene image, which may be a color rendering image, an edge-feature image, or a gray-level histogram; that is, the scene image is an image capable of representing the features of the preset scene and serves as the object of matching comparison.
An image acquisition module: the AR glasses capture, through their camera, real-time images of the environment in front of the AR glasses to obtain the real-time real image; the AR glasses are provided with a camera, and the picture the camera shoots in real time serves as the real-time real image, much as a smartphone uses its front or rear camera during a WeChat video chat. The camera of the AR glasses is oriented coaxially with, or parallel to, the front direction of the AR glasses. The field of view in front of the AR glasses depends on the field of view of the camera; for example, if the camera lens is a wide-angle lens, the field of view in front of the AR glasses is larger than with a standard lens.
A scene matching module: screens out, from the preset scene library, a preset scene matching the real-time real image, and records it as the matching scene; specifically, matching is performed by image comparison, and the preset scene with the highest matching degree is recorded as the matching scene. For example, the real-time real image is compared for matching against the scene image of each preset scene so as to screen out the scene image with the highest matching degree, and the preset scene corresponding to that scene image is taken as the matching scene.
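The scene matching step can be illustrated with one of the representations mentioned above, the gray-level histogram. The sketch below is an assumption-laden toy: the similarity measure (histogram intersection), bin count, and function names are not from the patent, which only requires selecting the preset scene with the highest matching degree.

```python
# Illustrative sketch of scene matching via gray-level histograms.


def gray_histogram(pixels, bins=16):
    """Normalized histogram of 8-bit gray values (0-255)."""
    hist = [0] * bins
    for p in pixels:
        hist[p * bins // 256] += 1
    total = len(pixels) or 1
    return [c / total for c in hist]


def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical distributions, 0.0 for disjoint."""
    return sum(min(a, b) for a, b in zip(h1, h2))


def match_scene(live_pixels, scene_library):
    """scene_library: {scene_name: reference_pixels}; returns the best match."""
    live = gray_histogram(live_pixels)
    return max(
        scene_library,
        key=lambda name: similarity(live, gray_histogram(scene_library[name])),
    )
```

A production system would more likely use a library routine (e.g. an OpenCV-style histogram comparison) and combine several of the feature images, but the highest-matching-degree selection is the same `max` over the library.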
A wake-up module: displays, through the AR glasses, a virtual object corresponding to the matching scene to the AR glasses wearer, wherein the virtual object comprises a plurality of function interfaces. For example, when the AR glasses wearer stands at a road intersection, the matching scene is the road intersection, and the function interface is an interface with an electronic-map navigation function.
The selected interface processing module comprises:
a breathing characteristic gesture acquisition module: in a first time period after the function interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, the real-time real image and the preset scene are defined as a non-matching relation, and the displayed function interface is hidden; the breathing characteristic gesture refers to the change in the spatial position of the AR glasses caused by the change in the spatial position of the head when the AR glasses wearer sighs or breathes deeply. Specifically, whenever the AR glasses wearer breathes, the expansion and contraction of the chest cause the head to move slightly; the movement pattern during a sigh or deep breath differs from that of ordinary breathing, and the head accordingly moves differently. By detecting head movement, the corresponding movement pattern can be obtained, revealing the wearer's sigh or deep-breath action.
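One simple way to realize the head-movement detection described above is an amplitude test on the glasses' vertical displacement over a window of samples. The thresholds and sampling scheme below are assumptions; the patent only requires distinguishing the sigh/deep-breath movement pattern from that of ordinary breathing.

```python
# Hypothetical sketch of breathing-gesture detection from head motion:
# vertical displacement of the AR glasses is sampled over a window, and a
# sigh/deep breath is flagged when the peak-to-peak amplitude exceeds
# what ordinary breathing produces.

NORMAL_AMPLITUDE = 2.0  # mm; assumed peak-to-peak range of quiet breathing
DEEP_FACTOR = 2.5       # a deep breath is assumed to move the head much more


def breathing_gesture_detected(vertical_samples_mm):
    """True if the head displacement pattern suggests a sigh or deep breath."""
    if not vertical_samples_mm:
        return False
    amplitude = max(vertical_samples_mm) - min(vertical_samples_mm)
    return amplitude > NORMAL_AMPLITUDE * DEEP_FACTOR
```

A real detector would likely also examine the period of the motion (a sigh is slower than a normal breath) and use the glasses' IMU rather than raw position, but the amplitude contrast is the core of the distinction the patent relies on.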
A breathing characteristic sound acquisition module: in a first time period after the function interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer but do not detect the breathing characteristic gesture, the real-time real image and the preset scene are defined as a high-matching relation, and the displayed function interface is retained; the breathing characteristic sound is the sound emitted when the AR glasses wearer sighs or breathes deeply. Specifically, whenever the AR glasses wearer breathes, the expansion and contraction of the chest and the inhaling and exhaling of breath produce sound, and the sound pattern during a sigh or deep breath differs from that of ordinary breathing. By detecting breathing sounds, the wearer's sigh or deep-breath action can therefore be recognized. If the AR glasses detect the breathing characteristic sound but not the breathing characteristic gesture, the wearer is considered to be deliberately controlling head movement, for example keeping the head still; this constitutes an operation instruction indicating that the real-time real image and the preset scene are defined as a high-matching relation and the displayed function interface is retained.
The first time period may be the duration of 2-5 normal breaths by the AR glasses wearer, for example 3 normal breaths.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.

Claims (10)

1. A method for controlling the proportion of a functional interface is characterized by comprising the following steps:
a display step of an occupation ratio control interface: displaying a proportion control interface to an AR glasses wearer through AR glasses;
gradually displaying: on the basis of displaying the proportion control interface, displaying the function interfaces one by one to AR glasses wearers in a mode of appearing one by one through AR glasses;
the proportion control operation step: selecting a proportion control interface according to the fixation point of the AR glasses wearer; for the proportion control interface, utilizing the breathing characteristics of the AR glasses wearer to perform proportion fixation and/or proportion unlocking; wherein the fixed occupation ratio refers to stopping the operation of changing the number of the functional interfaces so as to fix the picture occupation ratio of the functional interfaces in the picture displayed by the AR glasses to the AR glasses wearer; the occupation unlocking refers to operation of allowing the number of the function interfaces to be changed so as to allow the picture occupation of the function interfaces in the pictures displayed by the AR glasses to the AR glasses wearer to be changed.
2. The method for controlling a percentage of functional interfaces according to claim 1, wherein, in the percentage control operation step,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, the proportion control interface is subjected to proportion fixing; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in a first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, proportion unlocking is carried out on the proportion control interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
3. The method for controlling the proportion of the functional interface according to claim 2, wherein after the proportion control interface is subjected to proportion unlocking, the following steps are triggered and allowed to be executed:
a multifunctional interface display step: determining a plurality of functional interfaces that have been displayed to an AR glasses wearer through AR glasses;
selecting an interface: selecting a functional interface according to the fixation point of the AR glasses wearer;
a selected interface processing step: processing the selected interface by using the breathing characteristics of the AR glasses wearer;
interface full selection processing steps: for a plurality of displayed functional interfaces, if the fixation point of the AR glasses wearer does not fall on any functional interface and the breathing characteristic gesture of the AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation and hiding all displayed functional interfaces; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply.
4. The method of claim 3,
the multifunctional interface displaying step comprises the following steps:
scene presetting step: establishing a preset scene library, wherein the preset scene library comprises one or more preset scenes;
an image acquisition step: the AR glasses acquire real-time images of the real-time environment in front of the AR glasses through the camera to obtain real-time images;
scene matching: screening a preset scene matched with the real-time real image from a preset scene library, and recording the preset scene as a matched scene;
a wake-up step: displaying a virtual object corresponding to the matching scene to an AR glasses wearer through AR glasses, wherein the virtual object comprises a plurality of functional interfaces;
the selected interface processing step comprises:
a breathing characteristic gesture obtaining step: in a first time period after the functional interface is selected, if a breathing characteristic gesture of an AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation, and hiding the displayed functional interface; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
a breathing characteristic sound obtaining step: in a first time period after the functional interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, defining the real-time real image and the preset scene as a high matching relation, and keeping the displayed functional interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
5. A functional interface proportion control system, comprising:
the proportion control interface display module: displaying a proportion control interface to an AR glasses wearer through AR glasses;
a gradual display module: on the basis of displaying the proportion control interface, displaying the function interfaces one by one to AR glasses wearers in a mode of appearing one by one through AR glasses;
the proportion control operation module: selecting a proportion control interface according to the fixation point of the AR glasses wearer; for the proportion control interface, utilizing the breathing characteristics of the AR glasses wearer to perform proportion fixation and/or proportion unlocking; wherein the fixed occupation ratio refers to stopping the operation of changing the number of the functional interfaces so as to fix the picture occupation ratio of the functional interfaces in the picture displayed by the AR glasses to the AR glasses wearer; the occupation unlocking refers to operation of allowing the number of the function interfaces to be changed so as to allow the picture occupation of the function interfaces in the pictures displayed by the AR glasses to the AR glasses wearer to be changed.
6. The functional interface proportion control system of claim 5, wherein, in said proportion control operation module,
in a first time period after the proportion control interface is selected, if the breathing characteristic gesture of the AR glasses wearer is detected, the proportion control interface is subjected to proportion fixing; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
in a first time period after the proportion control interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, proportion unlocking is carried out on the proportion control interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
7. The system according to claim 6, wherein after proportion unlocking is performed on the proportion control interface, the following modules are triggered and allowed to execute:
the multifunctional interface display module: determining a plurality of functional interfaces that have been displayed to an AR glasses wearer through AR glasses;
an interface selection module: selecting a functional interface according to the fixation point of the AR glasses wearer;
selecting an interface processing module: processing the selected interface by using the breathing characteristics of the AR glasses wearer;
the interface all-selection processing module: for a plurality of displayed functional interfaces, if the fixation point of the AR glasses wearer does not fall on any functional interface and the breathing characteristic gesture of the AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation and hiding all displayed functional interfaces; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply.
8. The functional interface proportion control system of claim 7,
the multifunctional interface display module comprises:
a scene presetting module: establishing a preset scene library, wherein the preset scene library comprises one or more preset scenes;
an image acquisition module: the AR glasses acquire real-time images of the real-time environment in front of the AR glasses through the camera to obtain real-time images;
a scene matching module: screening a preset scene matched with the real-time real image from a preset scene library, and recording the preset scene as a matched scene;
a wake-up module: displaying a virtual object corresponding to the matching scene to an AR glasses wearer through AR glasses, wherein the virtual object comprises a plurality of functional interfaces;
the selected interface processing module comprises:
a breathing characteristic gesture acquisition module: in a first time period after the functional interface is selected, if a breathing characteristic gesture of an AR glasses wearer is detected, defining the real-time real image and the preset scene as a non-matching relation, and hiding the displayed functional interface; the breathing characteristic gesture refers to the change of the spatial position of the AR glasses caused by the change of the spatial position of the head when the AR glasses wearer sighs or breathes deeply;
breath characteristic sound acquisition module: in a first time period after the functional interface is selected, if the AR glasses detect the breathing characteristic sound of the AR glasses wearer and the breathing characteristic gesture of the AR glasses wearer is not detected, defining the real-time real image and the preset scene as a high matching relation, and keeping the displayed functional interface; wherein the breathing characteristic sound is a sound emitted when the AR glasses wearer sighs or breathes deeply.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 4.
10. AR glasses comprising the functional interface proportion control system of any one of claims 5 to 8, or comprising the computer-readable storage medium of claim 9 having a computer program stored thereon.
CN202011445885.8A 2020-12-11 2020-12-11 Function interface proportion control method and system and AR glasses thereof Pending CN112558768A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011445885.8A CN112558768A (en) 2020-12-11 2020-12-11 Function interface proportion control method and system and AR glasses thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011445885.8A CN112558768A (en) 2020-12-11 2020-12-11 Function interface proportion control method and system and AR glasses thereof

Publications (1)

Publication Number Publication Date
CN112558768A true CN112558768A (en) 2021-03-26

Family

ID=75061153

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011445885.8A Pending CN112558768A (en) 2020-12-11 2020-12-11 Function interface proportion control method and system and AR glasses thereof

Country Status (1)

Country Link
CN (1) CN112558768A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8643951B1 (en) * 2012-03-15 2014-02-04 Google Inc. Graphical menu and interaction therewith through a viewing window
CN107480490A (en) * 2017-07-18 2017-12-15 歌尔科技有限公司 It is a kind of for the unlocking method of head-mounted display, device and head-mounted display
US20180350119A1 (en) * 2017-06-01 2018-12-06 Samsung Electronics Co., Ltd Systems and methods for window control in virtual reality environment
CN109145566A (en) * 2018-09-08 2019-01-04 太若科技(北京)有限公司 Method, apparatus and AR glasses based on blinkpunkt information unlock AR glasses

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8643951B1 (en) * 2012-03-15 2014-02-04 Google Inc. Graphical menu and interaction therewith through a viewing window
US20180350119A1 (en) * 2017-06-01 2018-12-06 Samsung Electronics Co., Ltd Systems and methods for window control in virtual reality environment
CN110692031A (en) * 2017-06-01 2020-01-14 三星电子株式会社 System and method for window control in a virtual reality environment
CN107480490A (en) * 2017-07-18 2017-12-15 歌尔科技有限公司 It is a kind of for the unlocking method of head-mounted display, device and head-mounted display
CN109145566A (en) * 2018-09-08 2019-01-04 太若科技(北京)有限公司 Method, apparatus and AR glasses based on blinkpunkt information unlock AR glasses
WO2020048535A1 (en) * 2018-09-08 2020-03-12 太若科技(北京)有限公司 Method and apparatus for unlocking head-mounted display device

Similar Documents

Publication Publication Date Title
US10666856B1 (en) Gaze-directed photography via augmented reality feedback
US10009542B2 (en) Systems and methods for environment content sharing
US20190331914A1 (en) Experience Sharing with Region-Of-Interest Selection
US10182720B2 (en) System and method for interacting with and analyzing media on a display using eye gaze tracking
US20200387226A9 (en) Systems and methods for monitoring a user's eye
KR101309176B1 (en) Apparatus and method for augmented reality
US9076033B1 (en) Hand-triggered head-mounted photography
JP2019092170A (en) System and method for generating 3-d plenoptic video images
US20150097826A1 (en) System and method for rendering dynamic three-dimensional appearing imagery on a two-dimensional user interface
KR20180101496A (en) Head-mounted display for virtual and mixed reality with inside-out location, user body and environment tracking
KR20180096434A (en) Method for displaying virtual image, storage medium and electronic device therefor
KR20170031733A (en) Technologies for adjusting a perspective of a captured image for display
JP2018018089A (en) Information processing device, information processing method, and program
JP6294054B2 (en) Video display device, video presentation method, and program
US20230334684A1 (en) Scene camera retargeting
US20210278671A1 (en) Head wearable device with adjustable image sensing modules and its system
CN112183200A (en) Eye movement tracking method and system based on video image
US11287881B2 (en) Presenting images on a display device
US11212502B2 (en) Method of modifying an image on a computational device
CN110796116A (en) Multi-panel display system, vehicle with multi-panel display system and display method
US11328187B2 (en) Information processing apparatus and information processing method
CN112558768A (en) Function interface proportion control method and system and AR glasses thereof
CN112558767A (en) Method and system for processing multiple functional interfaces and AR glasses thereof
CN111651043B (en) Augmented reality system supporting customized multi-channel interaction
CN112558766A (en) Method and system for waking up function interface in scene and AR glasses thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination