CN115079423A - Intelligent glasses and ultrasonic tactile feedback method thereof

Info

Publication number
CN115079423A
CN115079423A
Authority
CN
China
Prior art keywords: tactile feedback, virtual image, image, array ultrasonic, controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210996567.3A
Other languages
Chinese (zh)
Inventor
杨柳
于洋
任红恩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Inc
Original Assignee
Goertek Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Inc
Priority to CN202210996567.3A
Publication of CN115079423A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/017 - Head mounted
    • G02B27/0172 - Head mounted characterised by optical features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016 - Input arrangements with force or tactile feedback as computer generated output to the user
    • G - PHYSICS
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 - Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 - Head-up displays
    • G02B27/0101 - Head-up displays characterised by optical features
    • G02B2027/014 - Head-up displays characterised by optical features comprising information/image processing systems

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses smart glasses and an ultrasonic tactile feedback method thereof. The smart glasses comprise: an image acquisition device for acquiring a real image; a processor connected with the image acquisition device and used for performing algorithm processing on the real image to obtain the spatial position coordinates of a user's hand, generating corresponding hand spatial position coordinate information, processing a received virtual image, and outputting virtual image information to be displayed; an array ultrasonic emitting device; and a controller connected with the processor and the array ultrasonic emitting device respectively, the controller being used for determining a tactile feedback area according to the hand spatial position coordinate information and for controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area, according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback. The invention realizes tactile feedback on smart glasses.

Description

Intelligent glasses and ultrasonic tactile feedback method thereof
Technical Field
The invention relates to the technical field of electronic circuits, in particular to intelligent glasses and an ultrasonic tactile feedback method thereof.
Background
AR technology essentially combines a virtual world with the real world and then enables interaction between the human body and the augmented world. The interaction modes include vision, hearing and touch. Compared with conventional audio-visual interaction, touch-based interaction offers the user a more natural and intuitive channel: it can convey information that other senses cannot, such as the texture and material of an object, and thus brings a more intuitive experience to human-computer interaction. However, current augmented reality products are built on visual and auditory interaction alone; AR glasses with haptic interaction are not yet available.
Disclosure of Invention
The invention mainly aims to provide intelligent glasses and an ultrasonic tactile feedback method thereof, and aims to realize the tactile feedback of the intelligent glasses.
In order to achieve the above object, the present invention provides a pair of smart glasses, including:
an image acquisition device for acquiring a real image;
a processor connected with the image acquisition device, the processor being used for performing algorithm processing on the real image to obtain spatial position coordinates of a user's hand, generating corresponding hand spatial position coordinate information, processing a received virtual image, and outputting virtual image information to be displayed;
an array ultrasonic wave emitting device;
and a controller connected with the processor and the array ultrasonic emitting device respectively, the controller being used for determining a tactile feedback area according to the hand spatial position coordinate information of the user, and for controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
Optionally, the smart glasses further comprise:
an optical display system;
the controller is further used for driving the optical display system to work according to the virtual image information to be displayed, so as to display the corresponding virtual image.
Optionally, the processor is further configured to perform algorithm processing on the real image to obtain environment image information;
the controller is further used for driving the optical-mechanical module to work according to the environment image information so as to display the corresponding environment image.
Optionally, the smart glasses further comprise:
a frame;
a wearing part connected with the frame, the array ultrasonic emitting device being detachably mounted on the wearing part;
or the array ultrasonic emitting device is detachably mounted at an end of the frame away from the wearing part.
Optionally, the smart glasses further comprise:
a frame;
a wearing part connected with the frame; the array ultrasonic emitting device is movably connected to the frame.
Optionally, the optical display system comprises:
a display screen connected with the controller, the display screen being used for displaying the virtual image information to be displayed output by the controller.
Optionally, the controller is specifically configured to adjust a duty cycle and/or an amplitude of the delay driving signal output to the array ultrasonic wave emitting device according to the information of the virtual image to be displayed.
Optionally, the smart glasses further comprise:
a driving circuit connected with the controller, the driving circuit being used for driving the array ultrasonic emitting device, according to the delay driving signal output by the controller, to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area so as to form tactile feedback.
The invention also provides an ultrasonic tactile feedback method of the intelligent glasses, which comprises the following steps:
acquiring a real image, performing algorithm processing on the real image to obtain spatial position coordinates of a user's hand, and generating corresponding hand spatial position coordinate information;
processing a received virtual image and outputting virtual image information to be displayed;
and determining a tactile feedback area according to the hand spatial position coordinate information of the user, and controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
Optionally, the ultrasonic tactile feedback method of the smart glasses further comprises:
performing algorithm processing on the real image to obtain environment image information;
and driving an optical display system of the smart glasses according to the environment image information and the virtual image information to be displayed, so as to superimpose the virtual image on the environment image for imaging.
By integrating array ultrasonic feedback technology into smart glasses on their existing hardware platform, the invention discloses a new AR glasses solution with ultrasonic pressure tactile feedback. An image acquisition device acquires a real image and outputs it to the processor; the processor performs algorithm processing on the real image to obtain the spatial position coordinates of the user's hand and generates corresponding hand spatial position coordinate information, and it also processes a received virtual image and outputs virtual image information to be displayed to the controller. The controller then determines a tactile feedback area according to the hand spatial position coordinate information and, according to the virtual image information to be displayed, outputs corresponding delay driving signals to the array ultrasonic emitting device, which emits ultrasonic waves of corresponding physical quantities to the tactile feedback area to form tactile feedback. The invention can generate sound pressure at the finger touch area so that the user perceives touch there; the focus of the sound pressure follows changes in the user's gesture, and interaction through vision, hearing and touch is realized, so that the virtual world produces a tactile sensation similar to that brought by the real world.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly described below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic circuit diagram of an embodiment of smart glasses according to the present invention;
FIG. 2 is a schematic structural diagram of an embodiment of smart glasses according to the present invention;
FIG. 3 is a schematic view of a workflow of an embodiment of the smart glasses according to the present invention;
fig. 4 is a flowchart illustrating an embodiment of an ultrasonic tactile feedback method for smart glasses according to the present invention.
The reference numerals illustrate:
Reference numeral  Name                              Reference numeral  Name
10                 Image acquisition device          50                 Optical display system
20                 Processor                         51                 Display screen
31                 Controller                        52                 Optical-mechanical module
32                 Driving circuit                   60                 Frame
40                 Array ultrasonic emitting device  70                 Wearing part
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
It should be noted that, if directional indications (such as up, down, left, right, front, and back … …) are involved in the embodiment of the present invention, the directional indications are only used to explain the relative positional relationship between the components, the movement situation, and the like in a specific posture (as shown in the drawing), and if the specific posture is changed, the directional indications are changed accordingly.
In addition, if there is a description of "first", "second", etc. in an embodiment of the present invention, the description of "first", "second", etc. is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, technical solutions between the embodiments may be combined with each other, but must be based on the realization of the technical solutions by a person skilled in the art, and when the technical solutions are contradictory to each other or cannot be realized, such a combination should not be considered to exist, and is not within the protection scope of the present invention.
The term "and/or" herein is merely an association describing an associated object, meaning that three relationships may exist, e.g., a and/or B, may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
The present invention provides smart glasses, which may be AR glasses or VR glasses; the following embodiments are described taking AR glasses as an example. The smart glasses provided by the invention enable interaction through vision, hearing and touch, so that the virtual world produces a tactile sensation similar to that brought by the real world.
Referring to fig. 1 to 3, in an embodiment of the present invention, the smart glasses include:
an image acquisition device 10 for acquiring a real image;
a processor 20 connected with the image acquisition device 10, the processor 20 being used for performing algorithm processing on the real image to obtain spatial position coordinates of the user's hand, generating corresponding hand spatial position coordinate information, processing a received virtual image, and outputting virtual image information to be displayed;
an array ultrasonic wave transmitting device 40;
and a controller 31 connected with the processor 20 and the array ultrasonic emitting device 40 respectively, the controller 31 being configured to determine a tactile feedback area according to the hand spatial position coordinate information of the user, and to control the array ultrasonic emitting device 40 to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device 40 and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
In this embodiment, the image acquisition device 10 may be a camera disposed on the frame 60 of the smart glasses. In operation, with the user wearing the smart glasses, the image acquisition device 10 acquires a real image in the camera shooting area Z1 so as to determine the position of the user's fingers. In addition, the real image acquired by the image acquisition device 10 can be imaged on the optical-mechanical module of the smart glasses to combine the virtual image with the real image. In a specific embodiment, the camera of the AR glasses faces coaxially with, or parallel to, the front of the AR glasses. It can be understood that the field of view in front of the glasses depends on the field of view of the camera: for example, if the camera has a wide-angle lens, the field of view in front of the AR glasses is larger than with a standard lens.
The processor 20 may receive the real image data acquired by the image acquisition device 10 through a MIPI interface, analyze the data, establish coordinate axes according to the analysis result, locate the area where the user's hand is, and transmit the coordinate data to the controller 31 through a MIPI/DSI interface. As the central processor of the smart glasses, the processor 20 can exchange signals with the functional devices on the glasses: it receives information acquired by the sensor module, camera, microphone, voice recognition device and the like, and outputs control signals to them to control their working states. The processor 20 can handle the audio and video signals of the smart glasses, receive information acquired by the sensor module, such as sound, motion and infrared sensing, and control the operation of the smart glasses according to the acquired signals. In some embodiments, the processor 20 extracts overall hand features from the image data acquired by the camera and judges whether a hand exists in a preset area, that is, whether the user reaches out to touch; when a hand is determined to exist, it analyzes the image data and establishes coordinate axes according to the analysis result, so that sound pressure can be generated at the finger touch area, the user perceives touch there, and the focus of the sound pressure follows changes in the user's gesture. In some embodiments, the smart glasses further include a wireless communication module, an audio processing module and the like. The wireless communication module may be a Bluetooth chip, WiFi module or the like; it can connect to an external terminal such as a mobile phone or game console, receive audio/video signals, and output the audio/video information to be processed to the processor 20. The processor 20 may be connected to the audio processing module, which performs digital-to-analog conversion, filtering, noise reduction, amplification and the like on the signal and then outputs the audio signal to a speaker for playback. The smart glasses may further include a memory connected to the processor 20, and the processor 20 may control the optical-mechanical module 52 to project a corresponding image onto the optical waveguide lenses of the glasses according to image information pre-stored in the memory, so that the image is formed in the human eye.
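As an illustration of this hand-locating step, the sketch below extracts a normalised fingertip coordinate from a camera frame using the open-source MediaPipe Hands library; the patent does not name a specific detection algorithm, so the library choice and the index-fingertip focus here are assumptions.

```python
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
INDEX_FINGERTIP = 8  # MediaPipe's landmark id for the index fingertip

def fingertip_coordinate(bgr_frame):
    """Return a normalised (x, y, z) index-fingertip coordinate, or None.

    None corresponds to 'no hand in the preset area', in which case no
    haptic feedback needs to be generated for this frame.
    """
    result = hands.process(cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    tip = result.multi_hand_landmarks[0].landmark[INDEX_FINGERTIP]
    return (tip.x, tip.y, tip.z)  # x, y in [0, 1]; z is relative depth
```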
The controller 31 may be implemented by an FPGA, which can serve as a bridge chip between MIPI signals and QSPI signals. An FPGA offers abundant logic and I/O resources and, as a parallel logic device, conveniently implements processing and control of multi-channel signals. In operation, the FPGA converts the MIPI-protocol DSI image signal output by the processor 20 (AP) into a QSPI signal and transmits it to the optical-mechanical module of the smart glasses; it also converts the coordinate signal of the user's hand output by the AP into delay data for each unit of the ultrasonic array emitter and generates a corresponding driving signal for each unit. The array ultrasonic feedback system uses ultrasonic focusing to generate tactile feedback, and the system controls the multi-channel signals in parallel: the driving signal of each ultrasonic generator in the array ultrasonic emitting device 40 can be calculated and delayed independently.
The array ultrasonic emitting device 40 is provided with M x N ultrasonic generators; M and N may both be 4, forming a 4 x 4 array with units labeled A, B, ..., P in sequence. The M x N ultrasonic generators form an ultrasonic generator array in which the units are spaced apart and can be controlled independently, so as to generate sound pressure at the finger touch area, make the user perceive touch there, and let the focus of the sound pressure follow changes in the user's gesture. The ultrasonic array may take various shapes: circular, rectangular, oblong and so on. For example, the ultrasonic transducers may form a uniform (equally spaced) rectangular planar array or a uniform circular array. It can be understood that after the user puts on the smart glasses, the outstretched hand is close to the glasses, so even a small array ultrasonic emitting device 40 can meet the power requirement for generating tactile feedback. In practical applications, the controller 31 may drive the array ultrasonic emitting device 40 to generate tactile feedback only for a specific finger, such as the index finger, or for several fingers simultaneously, and the feedback for each finger may be the same or different. Alternatively, the controller 31 may drive the array to generate tactile feedback only on the palm, or on the palm and each finger simultaneously. The number of ultrasonic generators is adjusted according to requirements such as accuracy and power, with the software algorithm adapted accordingly; this is not limited here. In this embodiment, the modulation signal usable by the array ultrasonic emitting device 40 is generally between 100 Hz and 200 Hz; at a 200 Hz modulation rate the period is 5 ms, leaving the system an idle interval of at least 2.5 ms while driving the ultrasonic emitters. This interval is enough for the camera to acquire the next image and transmit it back to the processor 20 (AP) for processing, enabling the next round of tactile sound-pressure feedback and thus real-time positioning of the user's hand. The controller 31 controls each ultrasonic generator to feed back sound pressure of different strength according to the virtual image information to be displayed, so as to simulate the virtual image scene.
For example, different scenes are presented as the virtual image to be displayed changes; each scene may contain people, animals and objects, the material and hardness of the objects may change with the scene, and the tactile sensation may differ according to the distance between the hand and the object in the scene. The controller 31 can therefore control the array ultrasonic emitting device 40, according to the positional relationship between the device and the tactile feedback area and according to the virtual image information to be displayed, to emit ultrasonic waves of corresponding physical quantities, such as different intensities or frequencies, to the tactile feedback area, so as to form tactile feedback matched to the virtual image scene.
The array ultrasonic emitting device 40 may be disposed on the frame 60 or a temple of the smart glasses, or, when the smart glasses are configured as a head-mounted device, on the headband. When the user wears and uses the smart glasses, the relative position between the array ultrasonic emitting device 40 and the user's hand changes as the hand moves. Therefore, according to this relative position, that is, the positional relationship between the array ultrasonic emitting device 40 and the tactile feedback area, a plurality of corresponding delayed driving signals can be generated to drive each ultrasonic generator in the array with a corresponding delay, so that together they form tactile feedback on the tactile feedback area.
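These delayed driving signals follow from plain time-of-flight geometry: each unit fires early by the difference between the longest flight time in the array and its own, so that all wavefronts reach the focal point together. A minimal sketch, with the 4 x 4 layout, 10 mm pitch and 150 mm focal distance chosen only for illustration:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def focusing_delays(element_positions, focal_point):
    """Per-element emission delays for phased-array focusing.

    element_positions: (K, 3) transducer centre coordinates in metres
    focal_point:       (3,) coordinates of the tactile focus in metres
    Returns delays in seconds; the farthest element fires first (delay 0).
    """
    distances = np.linalg.norm(element_positions - focal_point, axis=1)
    tof = distances / SPEED_OF_SOUND   # time of flight per element
    return tof.max() - tof             # equalise the arrival times

# Illustrative 4 x 4 array, 10 mm pitch, focus 150 mm above the centre
xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
elements = np.stack([xs.ravel(), ys.ravel(), np.zeros(16)], axis=1)
delays = focusing_delays(elements, np.array([0.015, 0.015, 0.150]))
```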
By integrating array ultrasonic feedback technology into smart glasses on their existing hardware platform, the invention discloses a new AR glasses solution with ultrasonic pressure tactile feedback. The image acquisition device 10 acquires a real image and outputs it to the processor 20; the processor 20 performs algorithm processing on the real image to obtain the spatial position coordinates of the user's hand and generates corresponding hand spatial position coordinate information, and it also processes a received virtual image and outputs virtual image information to be displayed to the controller 31. The controller 31 then determines a tactile feedback area according to the hand spatial position coordinate information and, according to the positional relationship between the tactile feedback area and the array ultrasonic emitting device 40 and the virtual image information to be displayed, outputs corresponding delay driving signals to the array ultrasonic emitting device 40, which emits ultrasonic waves of corresponding physical quantities to the tactile feedback area according to those signals so as to form tactile feedback. The invention can generate sound pressure at the finger touch area so that the user perceives touch there; the focus of the sound pressure follows changes in the user's gesture, and interaction through vision, hearing and touch is realized, so that the virtual world produces a tactile sensation similar to that brought by the real world.
Referring to fig. 1, in an embodiment, the smart glasses further include:
an optical display system 50;
the controller 31 is further configured to drive the optical display system 50 to work according to the virtual image information to be displayed, so as to display the corresponding virtual image.
In this embodiment, the optical display system 50 performs the imaging work of the smart glasses. It is provided with an optical imaging element, through which a distant virtual image is formed and projected to the human eye. With the see-through function of the smart glasses, the user can see both the real external world and the virtual information; the optical display system 50 superimposes the virtual information on the real scene so that the two complement and enhance each other. The optical display system 50 may adopt an optical waveguide scheme in which an optical waveguide serves as the medium guiding the propagation of the light waves. The optical display system 50 of an AR device may consist of a display screen 51 and optical elements such as prisms, free-form surfaces, BirdBath optics or optical waveguides. The display screen 51 provides the display content for the device. It may be a self-luminous active device such as a micro-OLED or the now very popular micro-LED panel; a liquid crystal display requiring an external illumination source (e.g., transmissive LCD or reflective LCoS); or a digital micromirror device (DMD, the core of DLP) or laser beam scanner (LBS) based on micro-electro-mechanical system (MEMS) technology. The controller 31 may control the optical display system 50 according to the virtual image information to be displayed, and the optical display system 50 may use prism reflection, off-axis curved-surface reflection, a free-form-surface prism, a geometric waveguide or a holographic waveguide to image the virtual image at infinity or at a certain distance in front of the eye, thereby realizing virtual image display.
Referring to fig. 1, in an embodiment, the processor 20 is further configured to perform algorithm processing on the real image to obtain environment image information;
the controller 31 is further configured to drive the optical-mechanical module 52 to work according to the environment image information, so as to display a corresponding environment image.
It can be understood that an AR device has two main display modes: Video See-Through and Optical See-Through. With optical see-through, the smart glasses use semi-transparent lenses through which the human eye directly sees the external real scene. With video see-through, the smart glasses superimpose the virtual image information on the environment image acquired by the camera and display the result on the virtual image screen as a video or image stream; the image displayed on the screen does not coincide with the real scene seen directly by the eye, that is, virtual and real are not optically superposed. With optical see-through, the smart glasses use optical principles and devices to superimpose the augmented information directly onto the corresponding position of the real scene seen through the lens, and only the augmented information is displayed on the screen. This embodiment may select video see-through as the display mode: the camera on the smart glasses captures the real-time picture, that is, the environment image; the processor 20 performs image processing on the real projection environment, generates the environment image to be played, and outputs it to the controller 31; and the controller 31 controls the optical-mechanical module 52 to image the display content, projecting the virtual image information output by the processor 20 superimposed on the real image information output by the camera at a preset position. This makes the experience more lifelike and gives the user a better AR experience.
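As a sketch of the video see-through superposition described here, the snippet below alpha-blends a rendered virtual layer over the camera frame; the BGRA virtual layer and the simple per-pixel blend are assumptions, since the patent does not specify how the two streams are combined.

```python
import numpy as np

def composite(env_bgr, virtual_bgra):
    """Overlay a rendered virtual image (with alpha) on the camera frame.

    env_bgr:      (H, W, 3) uint8 camera frame
    virtual_bgra: (H, W, 4) uint8 rendered virtual image, alpha in [0, 255]
    """
    alpha = virtual_bgra[..., 3:4].astype(np.float32) / 255.0
    blended = (env_bgr.astype(np.float32) * (1.0 - alpha)
               + virtual_bgra[..., :3].astype(np.float32) * alpha)
    return blended.astype(np.uint8)
```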
Referring to fig. 2, in an embodiment, the smart glasses further include:
a frame 60;
a wearing part 70 connected to the frame 60, the array ultrasonic emitting device 40 being detachably attached to the wearing part 70;
alternatively, the array ultrasonic emitting device 40 may be detachably attached to an end of the frame 60 away from the wearing part 70.
In this embodiment, the wearing part 70 may be a headband or a temple, and the ultrasonic module may be detachable. When the wearing part 70 is a temple, the array ultrasonic emitting device 40 is movably connected to the frame 60 with a mounting portion disposed on the temple; alternatively, the mounting portion is disposed at the end of the frame 60 away from the temple. The array ultrasonic emitting device 40 may be fixed to the frame 60, or to the temples, by one or more of screws, snap fasteners, magnetic attraction and the like. The array ultrasonic emitting device 40 and the frame 60 or wearing part 70 may be provided with connectors and can be snapped together through a board-to-board connector, or joined through an Android-style (Micro-USB) interface or a mating Type-C interface, so as to realize a reliable connection between the interfaces. Alternatively, conductive contacts may be provided on either side, which may be press-type contact electrodes or magnetic contact electrodes, so as to improve the stability of conduction between the conductive contacts or between a conductive interface and a conductive contact. The position can be chosen as the middle of the glasses or the two sides of the temples, so long as the proximity sensor and the ambient light sensor are not affected.
Referring to fig. 2, in an embodiment, the smart glasses further include:
a frame 60;
a wearing part 70 connected to the frame 60; the array ultrasonic emitting device 40 is movably connected to an end of the frame 60 away from the wearing part 70.
In this embodiment, when the wearing part 70 is a temple, the array ultrasonic emitting device 40 is movably connected to a mounting portion of the frame 60.
The array ultrasonic emitting device 40 may also be foldably assembled to the frame 60, for example at the position corresponding to the bridge of the user's nose. The array ultrasonic emitting device 40 has an unfolded position, in which it may be in a working or standby state, and a folded position, in which it may be powered off. A connector may be provided between the array ultrasonic emitting device 40 and the frame 60: one end of the device is rotatably connected to the wearing part 70 through the connector, and the other end moves between the unfolded and folded positions. The connector may be a connecting shaft and a connecting rod; under their cooperation, the array ultrasonic emitting device 40 and the wearing part 70 rotate apart so that the device is unfolded at one end of the wearing part 70, or rotate together so that the device partly protrudes from one side of the wearing part 70 in the folded state. Alternatively, the array ultrasonic emitting device 40 and the wearing part 70 may rotate relative to each other so that the device partly protrudes from the wearing part 70 when unfolded and folds back to overlap it when closed. Considering the shielding of light, the ultrasonic emitting device may be placed as a whole on the part of the frame between the two eyes, electrically connected to the AR glasses system internally through a rigid-flex board, with the array ultrasonic generators mounted on the PCB of the rigid-flex board, thereby realizing the tactile sound-pressure feedback function.
Referring to fig. 1, in one embodiment, the optical display system 50 includes:
the display screen 51 is connected with the controller 31 and is used for displaying the virtual image information to be displayed output by the controller 31;
and the optical-mechanical module 52 is configured to image the virtual image at a preset position.
In this embodiment, the number of optical-mechanical modules 52 may be one or two. With one optical module mounted on the frame 60, the glasses realize monocular AR display. Two optical modules may also be configured, one for each side of the frame 60; after the two optical modules are mounted on their corresponding parts of the frame, controlling them together with the two optical waveguide lenses on the glasses body realizes binocular AR display with binocular fusion. The optical-mechanical module 52 may include two optical engines and two optical lenses. The front frame housing may define a receiving groove in which both optical engines are received, one optical engine being disposed at each end of the receiving groove, the two ends lying on either side of, and away from, the nose bridge. The optical lens may be an optical waveguide lens provided with an entrance aperture, and the exit end of the optical engine is aligned with the entrance aperture of the optical waveguide lens to guarantee the accuracy of the diffracted optical waveguide light path. After acquiring the image information to be played, the controller 31 controls the optical engine to display it: the controller 31 controls the light corresponding to the image emitted by the optical engine to be projected onto the optical waveguide lens through the entrance aperture, and after multi-layer diffraction in the optical waveguide lens the light is imaged to the imaging area Z3, that is, at an infinite distance. The human eye then sees a combined virtual-real image through the AR glasses.
Referring to fig. 1, in an embodiment, the controller 31 is specifically configured to adjust a duty cycle and/or an amplitude of a delay driving signal output to the array ultrasonic wave emitting device according to the to-be-displayed virtual image information, so as to generate a corresponding acoustic radiation pressure waveform.
In this embodiment, the virtual objects presented change as the virtual image changes, so when the user reaches out to touch them, the tactile sensations of different virtual objects must be simulated. This embodiment therefore adds array ultrasonic feedback technology to the AR glasses and, based on the existing AR hardware platform, proposes a new AR glasses solution with ultrasonic pressure tactile feedback. According to the different virtual objects presented by the virtual image information, the duty cycle of the delay driving signal output to the array ultrasonic emitting device, or its amplitude, or both, are adjusted to generate different acoustic radiation pressure waveforms that combine the user's vision and touch. Varying the waveform in this way adjusts the power output to the array ultrasonic emitting device 40, so that ultrasonic waves of corresponding physical quantities, for example different intensities or frequencies, are emitted to the touch area. This produces different tactile textures, meets different tactile demands, enhances the realism of the tactile feedback, and produces in the virtual world a tactile sensation similar to that brought by the real world.
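One way to picture this adjustment is as a low-frequency envelope gating the ultrasonic carrier, with duty cycle and amplitude as the two free parameters. A minimal sketch, assuming a 40 kHz square-wave carrier and a 200 Hz envelope (typical values in ultrasonic haptics; the patent itself only states the 100-200 Hz modulation range):

```python
import numpy as np

def drive_waveform(carrier_hz=40_000.0, mod_hz=200.0, duty=0.5,
                   amplitude=1.0, fs=1_000_000.0, seconds=0.02):
    """Duty-cycle- and amplitude-adjustable drive signal for one channel.

    A square-wave ultrasonic carrier is gated on and off by a low-frequency
    envelope; changing `duty` and `amplitude` changes the radiated power and
    hence the perceived acoustic radiation pressure.
    """
    t = np.arange(0.0, seconds, 1.0 / fs)
    carrier = (np.mod(t * carrier_hz, 1.0) < duty).astype(np.float32)
    envelope = (np.mod(t * mod_hz, 1.0) < 0.5).astype(np.float32)
    return amplitude * carrier * envelope

soft = drive_waveform(duty=0.3, amplitude=0.5)  # weaker, "softer" texture
hard = drive_waveform(duty=0.5, amplitude=1.0)  # stronger pressure
```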
Referring to fig. 1, in an embodiment, the smart glasses further include:
a driving circuit 32 connected with the controller 31, the driving circuit 32 being configured to drive the array ultrasonic emitting device 40, according to the delay driving signal, to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area so as to form tactile feedback.
In this embodiment, the controller 31 may convert the MIPI-protocol DSI image signal output by the processor 20 (AP) into a QSPI signal and transmit it to the display screen 51; it may also convert the finger coordinate signal output by the processor 20 (AP) into delay data for each unit of the ultrasonic array emitter and generate a corresponding driving signal for each unit to the driving circuit 32. The driving circuit 32 may be implemented with a driver IC, which amplifies the driving signals from the FPGA and transmits them to the ultrasonic generators. To save board space and power consumption, a multi-channel driver IC is used, which can drive several ultrasonic generators simultaneously, realize the delays accurately, and reduce the number of FPGA communication interfaces. Driven by the driver IC, each ultrasonic generator emits an ultrasonic signal, with its delay set according to the different time each unit needs to reach a given point in space, so that the ultrasonic signals focus on the target area touched by the fingers, namely the user's hand touch area Z2. It can be understood that because the position of each unit in the array differs, the ultrasonic waves emitted by the array ultrasonic emitting device 40 would reach the touch area at different times; a delay therefore needs to be applied to each array unit so that the waves from all units arrive at the touch area simultaneously, resonate there, form a focal point, and generate sound pressure that the finger can feel. To keep the AR glasses light and low-power, this technical solution uses a 16-channel signal amplification IC, which amplifies the driving signals from the FPGA to the amplitude required by the ultrasonic emitters and outputs them to the ultrasonic generators, each channel driving one generator.
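To make the per-channel delay concrete, the sketch below shifts one shared drive waveform (for instance the output of the drive_waveform sketch above) by each channel's delay, quantised to the sample clock, to produce the 16 drive signals; the channel count and sample rate are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

def delayed_channels(waveform, delays_s, fs=1_000_000.0):
    """Produce one delayed copy of `waveform` per channel.

    waveform: 1-D drive signal shared by all channels
    delays_s: per-channel delays in seconds (e.g. from focusing_delays())
    fs:       sample rate of the drive logic in Hz
    """
    channels = np.zeros((len(delays_s), len(waveform)), dtype=waveform.dtype)
    for ch, delay in enumerate(delays_s):
        offset = int(round(delay * fs))   # delay quantised to whole samples
        channels[ch, offset:] = waveform[:len(waveform) - offset]
    return channels
```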
Referring to fig. 3, in a specific embodiment, the smart glasses work as follows. The camera acquires external environment data and transmits the acquired data to the AP; the AP analyzes and processes the received image data and outputs it to the FPGA; the FPGA converts the signal into a QSPI signal suitable for the screen and transmits it to the screen; and the screen display part forms a virtual image at the preset position through the optics, realizing the combination of virtual and real images. The camera continues to acquire external environment data. If the user touches the image in front of the eyes with a finger, the acquired display image now includes the user's hand; the acquired data is sent to the processor (AP), which analyzes the received real image data, locates the finger touch area, and outputs the resulting coordinate signal to the FPGA together with the image data. The FPGA calculates the delays from the coordinate information and outputs 16 driving signals to the driver IC while outputting the image data to the screen; the driver IC amplifies the driving signals and outputs them to the ultrasonic generators; and the ultrasonic generators emit ultrasonic waves, which focus on the target area and generate sound pressure, producing a tactile sensation on the finger.
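Tying the steps together, the loop below sketches one iteration of this workflow. All of the object interfaces (camera, processor, fpga, driver_ic, array) are hypothetical stand-ins for the hardware blocks named above, not an API defined by the patent.

```python
def haptic_frame_iteration(camera, processor, fpga, driver_ic, array):
    """One pass of the capture -> display -> focus -> feedback chain."""
    frame = camera.capture()                      # external environment data
    processor.analyze(frame)                      # AP image analysis
    fpga.send_image_to_screen(processor.image())  # MIPI/DSI -> QSPI -> screen
    coord = processor.locate_fingertip(frame)     # finger touch coordinates
    if coord is not None:
        delays = fpga.compute_delays(coord)       # per-channel delay data
        signals = fpga.drive_signals(delays)      # 16 drive channels
        driver_ic.output(signals, array)          # amplify, focus, sound pressure
```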
The invention further provides an ultrasonic tactile feedback method of the smart glasses. Referring to fig. 4, the method includes:
Step S100, acquiring a real image, performing algorithm processing on the real image to obtain spatial position coordinates of a user's hand, and generating corresponding hand spatial position coordinate information;
Step S200, processing a received virtual image and outputting virtual image information to be displayed;
Step S300, determining a tactile feedback area according to the hand spatial position coordinate information of the user, and controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
In this embodiment, overall hand features are extracted from the image data acquired by the camera to judge whether a hand exists in the preset area, that is, whether the user reaches out to touch. When a hand exists, the image data is analyzed and coordinate axes are established from the analysis result, so that sound pressure is generated at the finger touch area, the user perceives touch there, and the focus of the sound pressure follows changes in the user's gesture. After the coordinate system is established, delay data for each unit of the ultrasonic array emitter can be generated from the hand spatial position coordinate information, together with a corresponding driving signal for each unit. The array ultrasonic feedback system uses ultrasonic focusing to generate tactile feedback and controls the multi-channel signals in parallel: the driving signal of each emitter in the array can be calculated and delayed independently. The array ultrasonic emitting device is provided with M x N ultrasonic generators forming an array in which the units are spaced apart and independently controllable, so as to generate sound pressure at the finger touch area, make the user perceive touch there, and let the focus of the sound pressure follow the user's gesture. The ultrasonic array may take various shapes: circular, rectangular, oblong and so on. In practical applications, the controller may drive the array to generate tactile feedback only for a specific finger, such as the index finger, or for several fingers simultaneously, with the same or different feedback per finger; or only on the palm; or on the palm and each finger simultaneously. The number of ultrasonic generators is adjusted according to requirements such as accuracy and power, with the software algorithm adapted accordingly; this is not limited here. In this embodiment, the modulation signal usable by the array ultrasonic emitting device 40 is generally between 100 Hz and 200 Hz, which leaves the system an idle interval of at least 2.5 ms while driving the ultrasonic emitters; this interval is enough for the camera to acquire the next image and transmit it back to the processor (AP) for processing, enabling the next round of tactile sound-pressure feedback and real-time positioning of the user's hand. Each ultrasonic generator is controlled to feed back sound pressure of different strength according to the virtual image information to be displayed, so as to simulate the virtual image scene.
For example, different scenes are presented as the virtual image to be displayed changes; each scene may contain people, animals and objects, whose material and hardness may change with the scene, and the tactile sensation may differ with the distance between the hand and the object in the scene. The controller 31 therefore controls the array ultrasonic emitting device, according to the positional relationship between the device and the tactile feedback area and according to the virtual image information to be displayed, to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area, so as to form tactile feedback matched to the virtual image scene.
The array ultrasonic emitting device may be disposed on the frame or temples of the smart glasses, or on the headband when the smart glasses are configured as a head-mounted device. When the user wears and uses the smart glasses, the relative position between the array ultrasonic emitting device and the user's hand changes as the hand moves.
The invention discloses a new AR (augmented reality) glasses solution with ultrasonic pressure tactile feedback by integrating array ultrasonic feedback technology into AR glasses on the existing AR hardware platform.
In one embodiment, the ultrasonic tactile feedback method of the smart glasses comprises the following steps:
performing algorithm processing on the real image to obtain environment image information;
and driving an optical display system of the smart glasses according to the environment image information and the virtual image information to be displayed, so as to superimpose the virtual image on the environment image for imaging.
In this embodiment, after image processing is performed on the real projection environment, the environment image to be played is generated and output to the controller of the smart glasses. The controller controls the optical-mechanical module to image the display content, projecting the virtual image information output by the processor superimposed on the real image information output by the camera at the preset position, which makes the experience more lifelike and gives the user a better AR experience.
The above description is only an optional embodiment of the present invention and is not intended to limit its scope of patent protection. Any equivalent transformation made using the contents of this specification and the accompanying drawings, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention.

Claims (10)

1. Smart glasses, comprising:
an image acquisition device for acquiring a real image;
a processor connected with the image acquisition device, the processor being used for performing algorithm processing on the real image to obtain spatial position coordinates of a user's hand, generating corresponding hand spatial position coordinate information, processing a received virtual image, and outputting virtual image information to be displayed;
an array ultrasonic wave emitting device;
and a controller connected with the processor and the array ultrasonic emitting device respectively, the controller being used for determining a tactile feedback area according to the hand spatial position coordinate information of the user, and for controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
2. The smart glasses of claim 1, further comprising:
an optical display system;
the controller is further used for driving the optical display system to work according to the virtual image information to be displayed, so as to display the corresponding virtual image.
3. The smart glasses of claim 2, wherein the processor is further configured to perform algorithm processing on the real image to obtain environment image information;
the controller is further used for driving the optical display system to work according to the environment image information so as to display the corresponding environment image.
4. The smart glasses of claim 2, further comprising:
a frame;
a wearing part connected with the frame, the array ultrasonic emitting device being detachably mounted on the wearing part;
or the array ultrasonic emitting device is detachably mounted at an end of the frame away from the wearing part.
5. The smart glasses of claim 2, further comprising:
a frame;
a wearing part connected with the frame; the array ultrasonic emitting device is movably connected to the frame.
6. The smart glasses of claim 2, wherein the optical display system comprises:
a display screen connected with the controller, the display screen being used for displaying the virtual image information to be displayed output by the controller.
7. The smart glasses of claim 1, wherein the controller is specifically configured to adjust a duty cycle and/or an amplitude of the delay driving signal output to the array ultrasonic emitting device according to the virtual image information to be displayed, so as to generate a corresponding acoustic radiation pressure waveform.
8. The smart glasses of any of claims 1 to 7, further comprising:
a driving circuit connected with the controller, the driving circuit being used for driving the array ultrasonic emitting device, according to the delay driving signal output by the controller, to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area so as to form tactile feedback.
9. An ultrasonic tactile feedback method of smart glasses, applied to smart glasses comprising an array ultrasonic emitting device, the method comprising the following steps:
acquiring a real image, performing algorithm processing on the real image to obtain spatial position coordinates of a user's hand, and generating corresponding hand spatial position coordinate information;
processing a received virtual image and outputting virtual image information to be displayed;
and determining a tactile feedback area according to the hand spatial position coordinate information of the user, and controlling the array ultrasonic emitting device to emit ultrasonic waves of corresponding physical quantities to the tactile feedback area according to the positional relationship between the array ultrasonic emitting device and the tactile feedback area and according to the virtual image information to be displayed, so as to form tactile feedback.
10. The ultrasonic tactile feedback method of smart glasses according to claim 9, further comprising:
performing algorithm processing on the real image to obtain environment image information;
and driving an optical display system of the smart glasses according to the environment image information and the virtual image information to be displayed, so as to superimpose the virtual image on the environment image for imaging.
CN202210996567.3A 2022-08-19 2022-08-19 Intelligent glasses and ultrasonic tactile feedback method thereof Pending CN115079423A

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210996567.3A CN115079423A 2022-08-19 2022-08-19 Intelligent glasses and ultrasonic tactile feedback method thereof


Publications (1)

Publication Number Publication Date
CN115079423A 2022-09-20

Family

ID=83245306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210996567.3A Pending CN115079423A 2022-08-19 2022-08-19 Intelligent glasses and ultrasonic tactile feedback method thereof

Country Status (1)

Country Link
CN (1) CN115079423A

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107908000A (en) * 2017-11-27 2018-04-13 西安交通大学 A kind of mixed reality system with ultrasonic virtual tactile
US20180232050A1 (en) * 2017-02-14 2018-08-16 Microsoft Technology Licensing, Llc Physical haptic feedback system with spatial warping
CN109085926A (en) * 2018-08-21 2018-12-25 华东师范大学 A kind of the augmented reality system and its application of multi-modality imaging and more perception blendings
CN111552412A (en) * 2019-02-08 2020-08-18 原子能与替代能源委员会 Virtual, augmented or mixed reality device
CN113110734A (en) * 2021-03-03 2021-07-13 中国运载火箭技术研究院 System for generating virtual shape perception based on focused ultrasonic waves



Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20220920)