US20190028793A1 - Wireless Earpiece and Smart Glasses System and Method

Info
- Publication number
- US20190028793A1 (U.S. application Ser. No. 16/138,206)
- Authority
- US
- United States
- Prior art keywords
- lens
- eyeglasses
- earpieces
- data
- display
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1041—Mechanical or electronic switches, or control elements
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R1/00—Details of transducers, loudspeakers or microphones
- H04R1/10—Earpieces; Attachments therefor ; Earphones; Monophonic headphones
- H04R1/1016—Earpieces of the intra-aural type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2420/00—Details of connection covered by H04R, not provided for in its groups
- H04R2420/07—Applications of wireless loudspeakers or wireless microphones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04R—LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
- H04R2460/00—Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
- H04R2460/13—Hearing devices using bone conduction transducers
Definitions
- the present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to wireless ear pieces and smart glasses.
- Another object, feature, or advantage is to provide enhanced visual fields based upon thin film polymer technology.
- Another object, feature, or advantage is to provide the ability to place thin film polymer technology onto both sides of the lens.
- Yet another object, feature, or advantage is to provide the ability of the smart glasses to interact with the smart earpieces of the user.
- a further object, feature, or advantage is to provide an enhanced ability to measure biometrics and display the biometrics to the user.
- a still further object, feature, or advantage is to provide enhanced feedback controls for visual based interfaces using the audio engine, as well as from the audio engine to display data on the visual based interfaces.
- Another object, feature, or advantage is to provide enhanced localization and quantitative and qualitative analysis of acoustic sources from the earpiece, data from which can be fed to the glasses to provide visual cues with information on the acoustic source.
- the present invention provides a smart linkage of completely wireless earpieces to smart glasses. Such linkages will highlight the strength of each platform.
- Data aggregated at the earpiece may be wirelessly transmitted to the screen(s) of the smart glasses for viewing by the user. Further, data aggregated by the smart glasses or auditory inputs from the smart glasses may be wirelessly transmitted to the smart earpieces for input via the auditory pathway.
- This has multiple advantages over any previously considered systems.
- the system may be streamlined so sensor arrays need not be duplicated. Power management may be enhanced due to the removal of the need for each device to have similar sensor arrays, e.g. accelerometers. Alternately, both devices may have duplicate sensors so compiled data could be processed for greater precision.
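The duplicate-sensor alternative above could be sketched, as an illustration only, by combining matching accelerometer samples from the earpiece and the glasses with an inverse-variance weighted average so the more trusted reading contributes more. The function name, the (x, y, z) tuple format, and the variance inputs are assumptions, not part of the disclosure.

```python
def fuse_readings(a, b, var_a=1.0, var_b=1.0):
    """Fuse two accelerometer samples (x, y, z) from the earpiece and
    the glasses using an inverse-variance weighted average.

    Lower-variance (more trusted) sensors receive more weight; with
    equal variances this reduces to a plain average."""
    wa = 1.0 / var_a
    wb = 1.0 / var_b
    return tuple((wa * ai + wb * bi) / (wa + wb) for ai, bi in zip(a, b))
```

With equal variances, two readings of 9.8 and 10.0 m/s² on the z axis fuse to 9.9 m/s²; weighting one sensor as noisier shifts the result toward the other.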
- Thin film polymer may be used on the smart glasses as a part of a display system.
- the thin film polymer may be positioned on the internal or external side of the lens(es), or the thin film polymer may be placed on both sides of the lens of the glasses to create a natural scenario where depth of image may be provided. This allows for filtering the image presented to the retina in another way. Alternately, the effect which would be detected by the user may be adjusted. Either alternately or in conjunction, images may be projected directly onto the retina. This would allow the user an unparalleled advantage in the augmented or assisted visual world. Recording or image capture may also be facilitated from the forward facing or, depending on the use case, rear facing cameras.
- the cameras of the smart glasses may interact with the world and provide visual based analysis of the user's environment.
- the glasses may then supply information to the smart earpieces to coordinate, highlight, augment or otherwise alert the user to the information in a timely and responsive manner.
- a forward facing camera detects the presence of an ambulance moving into the central field of vision from the peripheral field of vision of the user.
- the sensing mechanism may highlight the image of the emergency vehicle relative to the tracking position of the user.
- Auditory linkage to the smart earpiece may selectively highlight the position of the siren and modulate the environmental conditions to highlight the position acoustically for the user in three dimensional space.
- the complementary linking of the two principal technologies may therefore allow the siren to be localized when it is not yet visible, and visual cues given as to its location.
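One simple way the acoustic localization described above could work is a far-field time-difference-of-arrival estimate between the left and right earpiece microphones. This is a hedged sketch only: the function, the 0.18 m inter-ear spacing, and the sign convention are illustrative assumptions, not taken from the disclosure.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C


def bearing_from_tdoa(delta_t, ear_spacing=0.18):
    """Estimate the horizontal bearing of a sound source (e.g. a siren)
    from the inter-ear time difference of arrival (TDOA).

    delta_t: arrival time at the left ear minus the right ear, seconds;
             a positive value means the sound reached the right ear first.
    ear_spacing: distance between the two earpiece microphones, metres.
    Returns the bearing in degrees: 0 is straight ahead, positive is right.
    """
    # Far-field approximation: path difference = ear_spacing * sin(bearing)
    ratio = SPEED_OF_SOUND * delta_t / ear_spacing
    ratio = max(-1.0, min(1.0, ratio))  # clamp rounding noise into asin's domain
    return math.degrees(math.asin(ratio))
```

A zero time difference maps to straight ahead; the maximum possible difference (spacing divided by the speed of sound, about 0.5 ms here) maps to a source fully to one side.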
- this system may give rise to camera systems possibly monitoring the “peripheral visual field” of the user as well as the lateral and posterior fields. This would allow theoretical access to 360 degree visual fields. Such fields could be tied into the camera systems available from smart earpieces or other connected body worn devices.
- control systems may be enhanced both from the visual as well as the audio worlds. For example, by tracking the pupils, the smart glasses may be able to detect from the front facing cameras what the person is looking at with reasonable certainty. This information may be transmitted to the smart earpieces which may then process the request and perform a search, as but one example. Feedback to the user may be given acoustically, as well as visually. The choice may be made by the user as to which approach the user would prefer. Additionally, visually instituted actions may be able to coordinate with an audio mechanism for enhanced feedback. Such feedback may be optimized to provide for feedback from the audio system to the visual system or from the visual system to the audio system.
- a method includes providing a set of earpieces comprising a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver.
- the method further includes providing a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame.
- the method provides for communicating data between at least one of the set of earpieces and the set of eyeglasses.
- the communicating is wirelessly communicating.
- the method may further provide for displaying on the first lens or the second lens data from the one of the set of earpieces.
- the method may further include at least one sensor on the set of eye glasses and wherein the data is data from the set of eye glasses.
- the method may further include at least one sensor disposed within the left earpiece or the right earpiece and wherein the data is data from the at least one sensor.
- There may be a film on an external face of at least one of the first lens and the second lens configured to display imagery.
- There may be a film on an internal face of at least one of the first lens and the second lens configured to display imagery.
- the imagery may provide for augmented vision or assisted vision.
- a first film on an external face of at least one of the first lens and the second lens may be configured to display a first set of imagery, and
- a second film on an internal face of at least one of the first lens and the second lens may be configured to display a second set of imagery, wherein the first set of imagery and the second set of imagery combine to provide imagery having three-dimensional depth.
- a system includes a set of earpieces including a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver.
- the system further includes a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame and a thin film polymer layer placed on at least one of the first lens or the second lens. At least one of the first lens and the second lens may be a corrective lens.
- the thin film polymer layer may be positioned on an inside of at least one of the first lens and the second lens or may be positioned on an outside of at least one of the first lens and the second lens, or there may be thin film polymer on both the inside and the outside of one or both of the lenses.
- the first lens and the second lens may be integral with one another.
- the thin film polymer may form a portion of a display.
- At least one of the earpieces may include at least one sensor operatively connected to the processor and the at least one of the earpieces is configured to communicate data from the at least one sensor to the set of eyeglasses, the set of eyeglasses configured to display the data collected from the at least one sensor on the display.
- the set of earpieces may be configured to provide feedback to the set of eyeglasses and/or the set of eyeglasses is configured to provide feedback to the set of earpieces.
- a method for interacting with a user includes providing a set of earpieces comprising a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver, at least one sensor operatively connected to the processor.
- the method further includes providing a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame.
- the method further includes sensing earpiece sensor data at the set of earpieces using the at least one sensor of at least one of the left earpiece and the right earpiece, processing the earpiece sensor data at the set of earpieces to provide data to display, communicating the data to display from the set of earpieces to the set of eyeglasses, displaying a visual representation based on the data to display on at least one of the first lens and the second lens, sensing eyeglasses data at the set of eyeglasses, communicating the eyeglasses data from the set of eyeglasses to the set of earpieces, generating audio at the set of earpieces based on the eyeglasses data.
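The claimed round trip (sense at the earpiece, process into display data, communicate to the glasses, display, then sense at the glasses and return data for audio) might be sketched as the following pipeline. Every function name and message field here is hypothetical; the claim does not specify a message format.

```python
def earpiece_sense_and_process(samples):
    """Earpiece side: reduce raw heart-rate samples to data to display."""
    bpm = round(sum(samples) / len(samples))
    return {"type": "biometric", "label": "HR", "value": bpm, "unit": "bpm"}


def glasses_display(message):
    """Glasses side: turn a received message into an overlay string
    (standing in for rendering on the lens display)."""
    return f'{message["label"]}: {message["value"]} {message["unit"]}'


def glasses_sense(gaze_target):
    """Glasses side: package an eyeglasses event such as a gaze target."""
    return {"type": "gaze", "target": gaze_target}


def earpiece_audio(message):
    """Earpiece side: generate a spoken prompt from eyeglasses data."""
    return f'You are looking at {message["target"]}.'
```

In use, the earpiece would transmit the dictionary over its wireless transceiver and the glasses would render the returned string; the reverse path feeds gaze data back for an audio prompt.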
- FIG. 1 illustrates a set of earpieces including a left earpiece and a right earpiece.
- FIG. 2 illustrates one example of a set of earpieces in greater detail.
- FIG. 3 is a block diagram illustrating one embodiment of an earpiece.
- FIG. 4 is another block diagram illustrating another example of an earpiece.
- FIG. 5 illustrates examples of various devices.
- FIG. 6 illustrates another example of a set of glasses having a frame which includes side arms and a bridge area.
- FIG. 7 illustrates one example of assisted vision through illustrating a view from a view area of an eyeglass on the left and an assisted view on the right.
- FIG. 8 illustrates an example of an augmented visual display.
- FIG. 9 illustrates examples of augmented visual displays.
- FIG. 10 illustrates a cross-section of a lens assembly having a thin polymer layer on both the inside and the outside.
- FIG. 1 illustrates a set of earpieces 10 including a left earpiece 12 A and a right earpiece 12 B. Also shown in FIG. 1 is a set of eyeglasses 8 .
- the set of earpieces 10 and the set of eyeglasses together may form a system.
- the set of eyeglasses 8 may wirelessly communicate with one or both of the earpieces 12 A, 12 B.
- a computing device such as a mobile device such as a phone 2 with a display 4 may also be in wireless communication with one or both of the earpieces 12 A, 12 B as well as the set of glasses 8 .
- FIG. 2 illustrates one example of a set of earpieces in greater detail.
- the set of earpieces 10 includes a left earpiece 12 A and a right earpiece 12 B.
- Each of the earpieces 12 A, 12 B has an earpiece housing 14 A, 14 B.
- An external or outward facing microphone 70 A in the left earpiece 12 A and an outward facing microphone 70 B in the right earpiece 12 B are also shown.
- the external microphones may be used for sensing ambient or environmental sound.
- FIG. 3 is a block diagram illustrating one embodiment of an earpiece.
- the earpiece 12 has a housing 14 .
- One or more processors 30 are disposed within the earpiece housing 14 .
- One or more wireless transceivers 34 are operatively connected to the one or more processors 30 .
- One or more external microphones 70 are operatively connected to the one or more processors 30 .
- One or more internal microphones such as bone conduction microphones 71 are operatively connected to the processors 30 .
- a speaker 73 is also shown which is operatively connected to one or more of the processors 30 .
- FIG. 4 is another block diagram illustrating another example of an earpiece.
- the device may include one or more LEDs 20 electrically connected to an intelligent control system 30 .
- the intelligent control system 30 may include one or more processors, microcontrollers, application specific integrated circuits, or other types of integrated circuits.
- the intelligent control system 30 may also be electrically connected to one or more sensors 32 .
- the sensor(s) may include an inertial sensor 74 and another inertial sensor 76 .
- Each inertial sensor 74 , 76 may include an accelerometer, a gyro sensor or gyrometer, a magnetometer or other type of inertial sensor.
- the sensor(s) 32 may also include one or more bone conduction microphones 71 , one or more air conduction microphones 70 and/or other types of sensors. It is further contemplated where multiple earpieces are used, a first or left earpiece may include a first subset of the sensors 32 and a second or right earpiece may include a second subset of the sensors 32 .
- a gesture control interface 36 is also operatively connected to or integrated into the intelligent control system 30 .
- the gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures.
- the emitters may be of any number of types including infrared LEDs.
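As an illustration only, a gesture trace from such an emitter/detector pair might be classified by how long the reflected infrared signal stays elevated while a finger covers the sensor. The normalized sample format, the thresholds, and the tap/hold distinction are assumptions, not details from the disclosure.

```python
def classify_touch(samples, threshold=0.5, hold_len=10):
    """Classify a gesture from IR detector samples.

    samples: normalized reflected-intensity readings; a finger over the
    emitter/detector pair raises the reading above `threshold`.
    Returns 'tap' for a short contact, 'hold' for a sustained one,
    and None when the signal never rises above the threshold.
    """
    active_count = sum(1 for s in samples if s > threshold)
    if active_count == 0:
        return None
    return "hold" if active_count >= hold_len else "tap"
```

A short burst of elevated samples reads as a tap, a long one as a hold; swipes would need per-detector timing, which this sketch omits.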
- the device may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction.
- a short range or radio transceiver 34 using Bluetooth, BLE, UWB, or other means of radio communication may also be present.
- the intelligent control system 30 may be configured to convey different information using one or more of the LED(s) 20 based on context or mode of operation of the device.
- the various sensors 32 , the processor 30 , and other electronic components may be located on the printed circuit board of the device.
- One or more speakers 73 may also be operatively connected to the intelligent control system 30 .
- FIG. 5 illustrates examples of various devices.
- a set of earpieces 10 includes a left earpiece 12 A and a right earpiece 12 B.
- Another example of a wearable device is a watch 100 which includes a display 102 and a watch band or strap 104 .
- a set of glasses includes a first eye glass 106 A and a second eye glass 106 B. Each eye glass 106 A, 106 B has a display 108 A, 108 B, with a portion connected to the display being joined by a hinge 112 A, 112 B to a side arm 110 A, 110 B.
- a mobile device 2 which may be a smart phone or other mobile device having a display 4 .
- a wireless linkage between one or more wireless earpieces 10 and the smart glasses 106 A, 106 B may serve to highlight the strength of each platform.
- Data aggregated at the earpiece 12 A, 12 B may be wirelessly transmitted to the screen(s) of the smart glasses 106 A, 106 B for viewing by the user.
- data aggregated by the smart glasses 106 A, 106 B or auditory inputs from the smart glasses 106 A, 106 B may be wirelessly transmitted to the earpieces 12 A, 12 B for input via the auditory pathway.
- This has multiple advantages over any previously considered systems.
- the system may also be streamlined so sensor arrays would not be duplicated between the glasses and the earpieces. Power management may be enhanced due to the removal of the need for each device to have similar sensor arrays, e.g. accelerometers. Alternately, both devices may have duplicate sensors so compiled data could be processed for greater precision as may be appropriate for particular applications.
- FIG. 6 illustrates another example of a set of glasses 200 having a frame which includes side arms 202 A, 202 B and a bridge area 204 .
- Viewing areas 206 A, 206 B are mounted to the frame between the bridge area 204 and the side arms 202 A, 202 B.
- the viewing areas 206 A, 206 B may be traditional eyeglass lenses, corrective or non-corrective, formed from glass, plastic, polycarbonate, Trivex, or other materials.
- the lenses may be coated with a thin layer of film which allows for actively creating a display on the lenses.
- the thin layer of film may be positioned on an outer surface of the lenses, an inner surface of the lenses or both on an outer surface of the lenses and on an inner surface of the lenses.
- One or more cameras or other imaging sensors are positioned on the set of eyeglasses. This may include cameras on opposite sides of the glasses such as cameras 208 A, 208 B, a central camera 210 positioned at the bridge area 204 , cameras 212 A, 212 B proximate the central camera 210 and positioned at the bridge area 204 . Side facing cameras 214 A, 214 B may also be present.
- FIG. 7 illustrates one example of assisted vision by showing a view from a view area of an eyeglass on the left and an assisted view on the right. Note on the right the displayed image is larger than on the left. In assisted vision, the displayed image is magnified relative to the representation the individual sees or would otherwise see. Of course, other assisted vision features may be present which may include altering the size of images, altering colors of an image, or otherwise augmenting an image.
- FIG. 8 illustrates an example of augmented vision.
- the augmented vision provides for contextual augmentation.
- a scale has been applied to an image being viewed by an individual.
- a person wearing one or more eyeglasses is viewing another individual at left.
- augmented vision is provided by overlaying a scale on the person, in this instance the person's height as measured along a vertical axis which is also shown.
- the name of the person, “John Doe”, is also shown.
- other types of information may be presented.
- a person wearing one or more earpieces may also wear an eyeglass or a set of eyeglasses.
- the person may use the earpiece as a part of the user interface of the glasses.
- the earpiece may enhance the experience of using the eyeglass(es) in various ways.
- the earpiece may provide for additional context than what is only available visually. This may include ambient noise, sensor data, or other data which is available using an earpiece which may be used to provide additional contextual clues to enhance an experience of a user.
- Another way in which one or more earpieces may be used to enhance the experience of using the eyeglass(es) is to provide for voice prompts and voice commands.
- the person may give a voice command to the earpiece to control operation of the eyeglass(es).
- the person may say, “BRAGI, who is this?” to the earpiece.
- a determination is made as to who the person is.
- This determination may be performed in various ways. According to one aspect, the determination is made by acquiring one or more images of the person and using the images to perform a search against a database.
- the database may be local or remote.
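One hedged sketch of the "who is this?" lookup described above: compare a face embedding computed from the captured image against a local database by nearest-neighbor distance. The embedding values, the 0.6 match threshold, and the database layout are all hypothetical; the disclosure only says images are searched against a database.

```python
def identify(query_embedding, database):
    """Return the best-matching name from a database of known face
    embeddings, or None if nothing is close enough.

    database: mapping of name -> embedding vector (list of floats).
    The 0.6 Euclidean-distance threshold is an illustrative assumption.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    name, embedding = min(
        database.items(), key=lambda item: dist(query_embedding, item[1])
    )
    return name if dist(query_embedding, embedding) < 0.6 else None
```

A remote database would follow the same logic with the distance search performed server-side and only the name returned to the earpiece for an audio response.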
- a physical item may have a bar code or three-dimensional bar code on it.
- the person may say, “BRAGI, what is this?” to the earpiece.
- a determination is made as to which article the person wishes to identify. This determination may be made in any number of different ways, including based on which object is in the most central or direct view, which object is being held by or pointed to by the user, or otherwise.
- a bar code such as a two-dimensional bar code may be identified and interpreted to provide additional information about an object.
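Interpreting a scanned bar code typically includes verifying its check digit before looking the code up. As a minimal illustration (the camera-side decoding of bars into digits is out of scope here), a decoded EAN-13 digit string could be validated like this:

```python
def ean13_is_valid(digits):
    """Validate the check digit of a decoded 13-digit EAN bar code.

    digits: a string of 13 numeric characters. The first 12 digits are
    weighted alternately 1 and 3 from the left; the check digit makes
    the weighted sum a multiple of 10.
    """
    if len(digits) != 13 or not digits.isdigit():
        return False
    nums = [int(c) for c in digits]
    total = sum(n * (3 if i % 2 else 1) for i, n in enumerate(nums[:12]))
    return (10 - total % 10) % 10 == nums[12]
```

A valid code could then be passed to a product database for the additional information mentioned above; an invalid one would trigger a rescan.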
- where more than one individual is in view, voice prompts may be used to assist in identifying them.
- an augmented visual display may be presented emphasizing one of the individuals and a voice prompt may be provided to query, “Who is this?” The user may then respond such as by saying “Yes”, “No”, “No, to the left”, or other response.
- the user may respond non-verbally by shaking their head yes or no where the earpiece includes an inertial sensor suitable for tracking head movement.
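The non-verbal yes/no response could, for example, be classified from the earpiece's gyroscope trace: a nod is dominated by pitch rotation and a shake by yaw rotation. The (pitch_rate, yaw_rate) sample format and the 30 deg/s activity threshold are assumptions for illustration.

```python
def classify_head_gesture(gyro_samples):
    """Classify an inertial-sensor trace as a 'yes' nod or a 'no' shake.

    gyro_samples: list of (pitch_rate, yaw_rate) pairs in deg/s.
    Returns 'yes', 'no', or None when there is no decisive movement
    (average rotation below the assumed 30 deg/s threshold).
    """
    pitch_activity = sum(abs(p) for p, _ in gyro_samples)
    yaw_activity = sum(abs(y) for _, y in gyro_samples)
    if max(pitch_activity, yaw_activity) < 30.0 * len(gyro_samples):
        return None
    return "yes" if pitch_activity > yaw_activity else "no"
```

The earpiece could run such a classifier for a short window after the voice prompt and forward the result to the glasses in place of a spoken answer.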
- the size of the person may be enhanced, the color of the person may be altered, the image of the person may blink, or other type of emphasis may be placed.
- An alternative may be to add another shape such as an arrow to select a person or other object.
- the earpiece may assist in operation of an eyeglass or eyeglasses by providing for voice commands, voice prompts, and additional sensor data to help establish context.
- in a system which includes one or more eyeglasses and one or more earpieces, various other advantages may be achieved where there is operative communication between the devices. For example, data aggregated at or associated with the earpiece may be wirelessly communicated to the eyeglass(es) to be displayed on the display of the smart glasses.
- only reduced functionality need be present in the eye glass or eye glasses, which may be beneficial in reducing manufacturing cost of the device or allowing for extended battery life.
- a smart linkage of completely wireless earpieces may be made with smart glasses. Such linkages allow for highlighting the strength of each platform.
- Data aggregated at the earpiece may be wirelessly transmitted to the screen(s) of the smart glasses for viewing by the user. Further, data aggregated by the smart glasses or auditory inputs from the smart glasses may be wirelessly transmitted to the smart earpieces for input via the auditory pathway.
- This has multiple advantages.
- the system may be streamlined so sensor arrays need not be duplicated. Power management may be enhanced due to the removal of the need for each device to have similar sensor arrays, e.g. accelerometers. Alternately, both devices may have duplicate sensors so compiled data could be processed for greater precision.
- recording or image capture may be facilitated from the forward facing, or case dependently rear facing cameras.
- the cameras of the smart glasses may interact with the world and provide visual based analysis of the user's environment.
- the glasses may then supply information to the smart earpieces, possibly coordinating, highlighting, augmenting or otherwise alerting the user to the information in a timely and responsive manner.
- a forward facing camera detects the presence of an ambulance moving into the central field of vision from the peripheral field of vision of the user.
- the sensing mechanism may highlight the image of the emergency vehicle relative to the tracking position of the user.
- An auditory linkage to the smart earpiece may selectively highlight the position of the siren and modulate the environmental conditions to highlight the position acoustically for the user in three dimensional space.
- the complementary linking of the two principal technologies would therefore logically be the siren may be localized when it is not-yet visible, and visual cues given as to its location.
- this system allows for camera systems possibly monitoring the “peripheral visual field” of the user as well as the lateral and posterior fields. This allows theoretical access to 360 degree visual fields. Such fields may be tied into the camera systems available to them from smart earpieces or other connected body worn devices.
- “assistive vision” capabilities as well as “augmented vision” may be placed into the visual field of the user. This may nicely parallel with the smart ear-based system allowing a wide suite of biometric measurements as well as detailed position sensors.
- Control systems may be enhanced both from the visual as well as the audio worlds. For example, one may track the pupils using an inwardly facing camera, so the smart glasses may be able to detect from the front facing cameras what the person was looking at with reasonable certainty. This information may be transmitted to the smart earpieces which may process the request and do a search, as but one example.
- Feedback to the user may be given acoustically, as well as visually. The choice may be made by the user as to which approach the user would prefer. Additionally, visually instituted actions may be able to coordinate with an audio mechanism for enhanced feedback. Such feedback may be optimized to provide for feedback from the audio system to the visual system or from the visual system to the audio system.
- a thin film polymer may be placed on the lens to allow a screen to be created.
- a lens assembly 300 may have a lens 302 with a thin film polymer 304 , 306 on each side of the lens 302 .
- the thin film polymer need only be present on one side.
- a natural scenario may be created which allows for a depth of the image to be created.
- the effect detected by a user may be adjusted.
- images may be projected directly onto the retina. This provides an unparalleled advantage in the augmented or assisted visual world.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Optics & Photonics (AREA)
- Eyeglasses (AREA)
Abstract
A method includes providing a set of earpieces comprising a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver. The method further includes providing a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame. The method provides for communicating data between at least one of the set of earpieces and the set of eyeglasses.
Description
- This application is a continuation of U.S. patent application Ser. No. 15/682,986, filed on Aug. 22, 2017, which claims priority to U.S. Provisional Patent Application 62/379,534, filed on Aug. 25, 2016, all of which are titled “Wireless Earpiece and Smart Glasses System and Method”, all of which are hereby incorporated by reference in their entireties.
- The present invention relates to wearable devices. More particularly, but not exclusively, the present invention relates to wireless ear pieces and smart glasses.
- The possibilities of virtual reality and augmented reality present new and impressive use cases, as well as potential far exceeding those currently predicted. This new technology offers unparalleled potential for enhancements to our sensory input methods, as well as the possibility of extending such enhancements to assist those who have ongoing and persistent sensory deficits. Current technology is, however, somewhat limited. What is needed is a new and integrated method for sharing data between smart glasses and smart earpieces.
- Therefore, it is a primary object, feature, or advantage of the present invention to improve over the state of the art.
- It is a further object, feature, or advantage of the present invention to provide enhanced controls based upon thin film polymer technology.
- It is a still further object, feature, or advantage of the present invention to provide enhanced imaging based upon thin film polymer technology.
- Another object, feature, or advantage is to provide enhanced visual fields based upon thin film polymer technology.
- Another object, feature, or advantage is to provide the ability to place thin film polymer technology onto both sides of the lens.
- Yet another object, feature, or advantage is to provide the ability of the smart glasses to interact with the smart earpieces of the user.
- A further object, feature, or advantage is to provide an enhanced ability to measure biometrics and display the biometrics to the user.
- A still further object, feature, or advantage is to provide enhanced feedback controls for visual based interfaces using the audio engine, as well as from the audio engine to display data on the visual based interfaces.
- Another object, feature, or advantage is to provide enhanced localization and quantitative and qualitative analysis of acoustic sources from the earpiece, data from which can be fed to the glasses to provide visual cues with information on the acoustic source.
- One or more of these and/or other objects, features, or advantages of the present invention will become apparent from the specification and following claims. No single embodiment need provide each and every object, feature, or advantage. Different embodiments may have different objects, features, or advantages. Therefore, the present invention is not to be limited to or by any objects, features, or advantages stated herein.
- According to one aspect, the present invention provides a smart linkage of completely wireless earpieces to smart glasses. Such linkages will highlight the strength of each platform. Data aggregated at the earpiece may be wirelessly transmitted to the screen(s) of the smart glasses for viewing by the user. Further, data aggregated by the smart glasses or auditory inputs from the smart glasses may be wirelessly transmitted to the smart earpieces for input via the auditory pathway. This has multiple advantages over any previously considered systems. The system may be streamlined so sensor arrays need not be duplicated. Power management may be enhanced due to the removal of the need for each device to have similar sensor arrays, e.g. accelerometers. Alternately, both devices may have duplicate sensors so compiled data could be processed for greater precision.
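Where both devices carry duplicate sensors, the compiled data may be processed together for greater precision. As a minimal illustrative sketch (the function name and the simple averaging scheme are assumptions, not part of the disclosed system), duplicate accelerometer samples could be fused as follows:

```python
def fuse_readings(earpiece_xyz, glasses_xyz):
    """Fuse duplicate accelerometer samples from the earpiece and the
    eyeglasses by averaging each axis.

    Averaging two independent noisy measurements of the same quantity
    reduces the noise variance, giving the greater precision described
    above. Both inputs are assumed to be (x, y, z) tuples expressed in
    a common reference frame.
    """
    return tuple((a + b) / 2.0 for a, b in zip(earpiece_xyz, glasses_xyz))

# Two slightly disagreeing samples from the duplicated sensor arrays.
fused = fuse_readings((1.0, 2.0, 3.0), (3.0, 4.0, 5.0))  # (2.0, 3.0, 4.0)
```

More elaborate schemes (weighting each device's reading by its noise estimate, for example) would fit the same interface.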
- Thin film polymer may be used on the smart glasses as a part of a display system. The thin film polymer may be positioned on the internal or external side of the lens(es), or the thin film polymer may be placed on both sides of the lens of the glasses to create a natural scenario where depth of image may be provided. This allows for filtering the image presented to the retina in another way. Alternately, the effect which would be detected by the user may be adjusted. Either alternately or in conjunction, images may be projected directly onto the retina. This would allow the user an unparalleled advantage in the augmented or assisted visual world. Certainly, recording or image capture may be facilitated from the forward facing, or case dependently rear facing, cameras. The cameras of the smart glasses may interact with the world and provide visual based analysis of the user's environment. The glasses may then supply information to the smart earpieces to coordinate, highlight, augment or otherwise alert the user to the information in a timely and responsive measure. One example may be explained as follows: a forward facing camera detects the presence of an ambulance moving into the central field of vision from the peripheral field of vision of the user. The sensing mechanism may highlight the image of the emergency vehicle relative to the tracking position of the user. Auditory linkage to the smart earpiece may selectively highlight the position of the siren and modulate the environmental conditions to highlight the position acoustically for the user in three dimensional space. The complementary linking of the two principal technologies may therefore allow the siren to be localized when it is not yet visible, and visual cues to be given as to its location.
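The depth effect obtainable from display films on both sides of a lens can be pictured with simple pinhole geometry. The sketch below is illustrative only (layer distances, units, and the function name are assumptions, not disclosed values): it computes where a feature must be drawn on each film layer so that, seen from the eye, both copies line up with a virtual point at the desired depth.

```python
def draw_positions(x_virtual, depth, inner_layer_dist, outer_layer_dist):
    """Lateral draw positions on the inner and outer film layers so a
    feature appears at a virtual point (x_virtual, depth) as seen from
    an eye at the origin.

    By similar triangles, a layer at distance d from the eye must draw
    the feature at x_virtual * d / depth; the small offset between the
    two drawn copies is what conveys depth. All distances share one
    unit and are assumed example values.
    """
    return (x_virtual * inner_layer_dist / depth,
            x_virtual * outer_layer_dist / depth)

# A point 10 units to the side at depth 2, with film layers at distances 1 and 2.
inner_x, outer_x = draw_positions(10.0, 2.0, 1.0, 2.0)  # 5.0 and 10.0
```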
- Additionally, this system may give rise to camera systems possibly monitoring the “peripheral visual field” of the user as well as the lateral and posterior fields. This would allow theoretical access to 360 degree visual fields. Such fields could be tied into the camera systems available from smart earpieces or other connected body worn devices.
- Thus, according to one aspect, “assistive vision” capabilities as well as “augmented vision” may be constructed into the visual field of the user. This parallels the smart ear-based system, possibly allowing a wide suite of biometric measurements as well as detailed position sensors. Control systems may be enhanced both from the visual as well as the audio worlds. For example, pupil tracking may allow the smart glasses to detect from the front facing cameras what the person is looking at with reasonable certainty. This information may be transmitted to the smart earpieces which may then process the request and do a search, as but one example. Feedback to the user may be given acoustically, as well as visually. The choice may be made by the user as to which approach the user would prefer. Additionally, visually instituted actions may be able to coordinate with an audio mechanism for enhanced feedback. Such feedback may be optimized to provide for feedback from the audio system to the visual system or from the visual system to the audio system.
- According to one aspect, a method includes providing a set of earpieces comprising a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver. The method further includes providing a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame. The method provides for communicating data between at least one of the set of earpieces and the set of eyeglasses. The communicating is wirelessly communicating. The method may further provide for displaying on the first lens or the second lens data from the one of the set of earpieces. The method may further include at least one sensor on the set of eye glasses and wherein the data is data from the set of eye glasses. The method may further include at least one sensor disposed within the left earpiece or the right earpiece and wherein the data is data from the at least one sensor. There may be a film on an external face of at least one of the first lens and the second lens configured to display imagery. There may be a film on an internal face of at least one of the first lens and the second lens configured to display imagery. The imagery may provide for augmented vision or assisted vision. There may be a first film on an external face of at least one of the first lens and the second lens configured to display a first set of imagery and a second film on an internal face of at least one of the first lens and the second lens configured to display a second set of imagery and wherein the first set of imagery and the second set of imagery combine to provide imagery having three-dimensional depth.
- According to another aspect, a system includes a set of earpieces including a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver. The system further includes a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame and a thin film polymer layer placed on at least one of the first lens or the second lens. At least one of the first lens and the second lens may be a corrective lens. The thin film polymer layer may be positioned on an inside of at least one of the first lens and the second lens or may be positioned on an outside of at least one of the first lens and the second lens, or there may be thin film polymer on both the inside and the outside of one or both of the lenses. The first lens and the second lens may be integral with one another. The thin film polymer may form a portion of a display. At least one of the earpieces may include at least one sensor operatively connected to the processor and the at least one of the earpieces is configured to communicate data from the at least one sensor to the set of eyeglasses, the set of eyeglasses configured to display the data collected from the at least one sensor on the display. The set of earpieces may be configured to provide feedback to the set of eyeglasses and/or the set of eyeglasses is configured to provide feedback to the set of earpieces.
- According to another aspect, a method for interacting with a user is provided. The method includes providing a set of earpieces comprising a left ear piece and a right ear piece, each of the earpieces comprising an ear piece housing, a wireless transceiver disposed within the ear piece housing, a processor disposed within the housing and operatively connected to the wireless transceiver, at least one sensor operatively connected to the processor. The method further includes providing a set of eyeglasses comprising an eyeglass frame, a wireless transceiver disposed within the eyeglass frame, a processor disposed within the eyeglass frame, and a first lens and a second lens operatively connected to the eyeglass frame. The method further includes sensing earpiece sensor data at the set of earpieces using the at least one sensor of at least one of the left earpiece and the right earpiece, processing the earpiece sensor data at the set of earpieces to provide data to display, communicating the data to display from the set of earpieces to the set of eyeglasses, displaying a visual representation based on the data to display on at least one of the first lens and the second lens, sensing eyeglasses data at the set of eyeglasses, communicating the eyeglasses data from the set of eyeglasses to the set of earpieces, generating audio at the set of earpieces based on the eyeglasses data.
-
FIG. 1 illustrates a set of earpieces including a left earpiece and a right earpiece. -
FIG. 2 illustrates one example of a set of earpieces in greater detail. -
FIG. 3 is a block diagram illustrating one embodiment of an earpiece. -
FIG. 4 is another block diagram illustrating another example of an earpiece. -
FIG. 5 illustrates examples of various devices. -
FIG. 6 illustrates another example of a set of glasses having a frame which includes side arms and a bridge area. -
FIG. 7 illustrates one example of assisted vision through illustrating a view from a view area of an eyeglass on the left and an assisted view on the right. -
FIG. 8 illustrates an example of an augmented visual display. -
FIG. 9 illustrates examples of augmented visual displays. -
FIG. 10 illustrates a cross-section of a lens assembly having a thin polymer layer on both the inside and the outside. -
FIG. 1 illustrates a set of earpieces 10 including a left earpiece 12A and a right earpiece 12B. Also shown in FIG. 1 is a set of eyeglasses 8. The set of earpieces 10 and the set of eyeglasses together may form a system. The set of eyeglasses 8 may wirelessly communicate with one or both of the earpieces 12A, 12B. A phone 2 with a display 4 may also be in wireless communication with one or more of the earpieces 12A, 12B as well as the set of glasses 8. -
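The wireless communication among the earpieces 12A, 12B, the eyeglasses 8, and the phone 2 implies some framing of the exchanged messages. The disclosure does not specify a payload format, so the JSON framing below is purely an illustrative assumption:

```python
import json

def make_packet(source, kind, payload):
    """Frame a message for the wireless linkage between devices.

    `source` identifies the sending device (e.g. "left_earpiece"),
    `kind` names the data stream, and `payload` is the sensed value.
    The JSON framing here is an assumed stand-in for whatever the
    underlying radio protocol actually carries.
    """
    return json.dumps({"src": source, "kind": kind, "data": payload})

def parse_packet(raw):
    """Inverse of make_packet: recover (source, kind, payload)."""
    msg = json.loads(raw)
    return msg["src"], msg["kind"], msg["data"]

# A heart-rate sample aggregated at an earpiece, bound for the glasses' display.
raw = make_packet("left_earpiece", "heart_rate", 72)
```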
FIG. 2 illustrates one example of a set of earpieces in greater detail. The set of earpieces 10 includes a left earpiece 12A and a right earpiece 12B. Each of the earpieces 12A, 12B has an ear piece housing. An outward facing microphone 70A in the left earpiece 12A and an outward facing microphone 70B in the right earpiece 12B are also shown. The external microphones may be used for sensing ambient or environmental sound. -
FIG. 3 is a block diagram illustrating one embodiment of an earpiece. The earpiece 12 has a housing 14. One or more processors 30 are disposed within the earpiece housing 14. One or more wireless transceivers 34 are operatively connected to the one or more processors 30. One or more external microphones 70 are operatively connected to the one or more processors 30. One or more internal microphones such as bone conduction microphones 71 are operatively connected to the processors 30. A speaker 73 is also shown which is operatively connected to one or more of the processors 30. -
FIG. 4 is another block diagram illustrating another example of an earpiece. The device may include one or more LEDs 20 electrically connected to an intelligent control system 30. The intelligent control system 30 may include one or more processors, microcontrollers, application specific integrated circuits, or other types of integrated circuits. The intelligent control system 30 may also be electrically connected to one or more sensors 32. Where the device is an earpiece, the sensor(s) may include an inertial sensor 74 and another inertial sensor 76. The sensors 32 may also include one or more bone conduction microphones 71, one or more air conduction microphones 70 and/or other types of sensors. It is further contemplated where multiple earpieces are used, a first or left earpiece may include a first subset of the sensors 32 and a second or right earpiece may include a second subset of the sensors 32. - A gesture control interface 36 is also operatively connected to or integrated into the intelligent control system 30. The gesture control interface 36 may include one or more emitters 82 and one or more detectors 84 for sensing user gestures. The emitters may be of any number of types including infrared LEDs. The device may include a transceiver 35 which may allow for induction transmissions such as through near field magnetic induction. A short range or radio transceiver 34 using Bluetooth, BLE, UWB, or other means of radio communication may also be present. In operation, the intelligent control system 30 may be configured to convey different information using one or more of the LED(s) 20 based on context or mode of operation of the device. The various sensors 32, the processor 30, and other electronic components may be located on the printed circuit board of the device. One or more speakers 73 may also be operatively connected to the intelligent control system 30. -
FIG. 5 illustrates examples of various devices. A set of earpieces 10 includes a left earpiece 12A and a right earpiece 12B. Another example of a wearable device is a watch 100 which includes a display 102 and a watch band or strap 104. A set of glasses includes a first eye glass 106A and a second eye glass 106B. Each eye glass 106A, 106B may include a display, a hinge, and a side arm. Also shown in FIG. 5 is a mobile device 2 which may be a smart phone or other mobile device having a display 4. As shown, there may be a wireless linkage between one or more of the wireless earpieces 10 and the smart glasses so that data may be communicated between the earpieces and the smart glasses. -
FIG. 6 illustrates another example of a set of glasses 200 having a frame which includes side arms and a bridge area 204. Viewing areas are positioned between the bridge area 204 and the side arms. Above the viewing areas are cameras, including a central camera 210 positioned at the bridge area 204 and cameras on either side of the central camera 210 positioned at the bridge area 204. Side facing cameras may also be included. -
FIG. 7 illustrates one example of assisted vision through illustrating a view from a view area of an eyeglass on the left and an assisted view on the right. Note on the right the displayed image is larger than on the left. In assisted vision, the displayed image is magnified relative to the representation the individual would otherwise see. Of course, other assisted vision features may be present which may include altering the size of images, altering colors of an image, or otherwise augmenting an image. -
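Magnifying the displayed image, as in the assisted view of FIG. 7, can be sketched as a nearest-neighbour enlargement. The representation of the image as a list of pixel rows is an assumption made purely for illustration:

```python
def magnify(image, factor):
    """Nearest-neighbour magnification of a 2-D image.

    `image` is a list of rows of pixel values; each pixel is repeated
    `factor` times horizontally and each row `factor` times vertically,
    enlarging the displayed image as one simple form of assisted vision.
    """
    return [
        [row[x // factor] for x in range(len(row) * factor)]
        for row in image
        for _ in range(factor)
    ]

# A 2x2 image magnified two-fold becomes 4x4.
big = magnify([[1, 2], [3, 4]], 2)
```

The other assisted-vision features mentioned (color alteration, general augmentation) would be analogous per-pixel transforms.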
FIG. 8 illustrates an example of augmented vision. The augmented vision provides for contextual augmentation. Here, a scale has been applied to an image being viewed by an individual. Thus, in this instance a person wearing one or more eyeglasses is viewing another individual at left. At right, augmented vision is provided by overlaying a scale on the person, in this instance the person's height as measured along a vertical axis which is also shown. In addition, the name of the person, “John Doe”, is also shown. Of course, other types of information may be presented. - According to one aspect, a person wearing one or more earpieces may also wear an eyeglass or a set of eyeglasses. The person may use the earpiece as a part of the user interface of the glasses. The earpiece may enhance the experience of using the eyeglass(es) in various ways. For example, the earpiece may provide additional context beyond what is available only visually. This may include ambient noise, sensor data, or other data which is available using an earpiece and which may be used to provide additional contextual clues to enhance the experience of a user.
- Another way in which one or more earpieces may be used to enhance the experience of using the eyeglass(es) is to provide for voice prompts and voice commands. For example, the person may give a voice command to the earpiece to control operation of the eyeglass(es). For example, the person may say, “BRAGI, who is this?” to the earpiece. In response, a determination is made as to who the person is. This determination may be performed in various ways. According to one aspect, the determination is made by acquiring one or more images of the person and using the images to perform a search against a database. The database may be local or remote.
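When deciding which person (or object) in view the user means, one reasonable heuristic is the detection nearest the centre of the forward camera frame. The sketch below assumes a detection format of label plus bounding box; nothing about this format is prescribed by the disclosure:

```python
def most_central(detections, frame_width, frame_height):
    """Return the label of the detection closest to the frame centre.

    `detections` is a list of (label, (x_min, y_min, x_max, y_max))
    bounding boxes in pixel coordinates; the entry whose box centre is
    nearest the middle of the camera frame is taken as the person or
    object the user most likely means.
    """
    cx, cy = frame_width / 2.0, frame_height / 2.0

    def offset(det):
        _, (x0, y0, x1, y1) = det
        mx, my = (x0 + x1) / 2.0, (y0 + y1) / 2.0
        return (mx - cx) ** 2 + (my - cy) ** 2

    return min(detections, key=offset)[0]

# The book sits at the centre of a 640x480 frame; the mug is off in a corner.
label = most_central(
    [("mug", (10, 10, 30, 30)), ("book", (300, 200, 340, 280))],
    640, 480,
)  # "book"
```

When two candidates are near-equidistant from the centre, the voice-prompt disambiguation described in the text would take over.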
- By way of another example, a physical item may have a bar code or three-dimensional bar code on it. The person may say, “BRAGI, what is this?” to the earpiece. In response, a determination is made as to which article the person wishes to identify. This determination may be made in any number of different ways, including based on what object is in the most central or direct view, based on what object is being held by or pointed to by the user, or otherwise. In one embodiment, a bar code such as a two-dimensional bar code may be identified and interpreted to provide additional information about an object. In both above examples, if it is unclear as to which person or object the user wishes to identify, voice prompts may be used to assist in identifying them. For example, if two people are present and it is not clear which person the user wants to identify, then as shown in
FIG. 9 , an augmented visual display may be presented emphasizing one of the individuals and a voice prompt may be provided to query, “Who is this?” The user may then respond such as by saying “Yes”, “No”, “No, to the left”, or other response. Alternatively, the user may respond non-verbally by shaking their head yes or no where the earpiece includes an inertial sensor suitable for tracking head movement. To emphasize the person through augmentation, the size of the person may be enhanced, the color of the person may be altered, the image of the person may blink, or another type of emphasis may be placed. An alternative may be to add another shape such as an arrow to select a person or other object. - Thus, the earpiece may assist in operation of an eyeglass or eyeglasses by providing for voice commands, voice prompts, and additional sensor data to help establish context. In a system which includes one or more eyeglasses and one or more earpieces, various other advantages may be achieved where there is operative communication between the devices. For example, data aggregated at or associated with the earpiece may be wirelessly communicated to the eyeglass(es) to be displayed on the display of the smart glasses. In addition, where the eye glass or eye glasses are used in combination with one or more earpieces or other wearables, reduced functionality need be present in the eye glass or eye glasses, which may be beneficial in reducing manufacturing cost of the device or allowing for extended battery life.
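Recognizing a non-verbal yes/no from the earpiece's inertial sensor can be sketched as comparing pitch motion (nodding) against yaw motion (shaking). The angular-rate representation and the threshold value below are illustrative assumptions, not disclosed parameters:

```python
def classify_head_gesture(pitch_rates, yaw_rates, threshold=30.0):
    """Classify a head gesture from inertial angular rates (deg/s).

    A nod ("yes") is dominated by pitch motion, a shake ("no") by yaw
    motion; if neither axis exceeds the threshold, no gesture is
    reported. The threshold is an assumed tuning value.
    """
    pitch = max(abs(r) for r in pitch_rates)
    yaw = max(abs(r) for r in yaw_rates)
    if max(pitch, yaw) < threshold:
        return "none"
    return "yes" if pitch >= yaw else "no"

# Strong pitch swings with little yaw: the user nodded.
gesture = classify_head_gesture([5, 80, -75, 10], [3, 4, -2, 1])  # "yes"
```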
- Thus, a smart linkage of completely wireless earpieces may be made with smart glasses. Such linkages allow for highlighting the strength of each platform. Data aggregated at the earpiece may be wirelessly transmitted to the screen(s) of the smart glasses for viewing by the user. Further, data aggregated by the smart glasses or auditory inputs from the smart glasses may be wirelessly transmitted to the smart earpieces for input via the auditory pathway. This has multiple advantages. The system may be streamlined so sensor arrays need not be duplicated. Power management may be enhanced due to the removal of the need for each device to have similar sensor arrays, e.g. accelerometers. Alternately, both devices may have duplicate sensors so compiled data could be processed for greater precision.
- Certainly, recording or image capture may be facilitated from the forward facing, or case dependently rear facing, cameras. The cameras of the smart glasses may interact with the world and provide visual based analysis of the user's environment. The glasses may then supply information to the smart earpieces, possibly coordinating, highlighting, augmenting or otherwise alerting the user to the information in a timely and responsive measure. One example may be explained as follows: a forward facing camera detects the presence of an ambulance moving into the central field of vision from the peripheral field of vision of the user. The sensing mechanism may highlight the image of the emergency vehicle relative to the tracking position of the user. An auditory linkage to the smart earpiece may selectively highlight the position of the siren and modulate the environmental conditions to highlight the position acoustically for the user in three dimensional space. The complementary linking of the two principal technologies may therefore allow the siren to be localized when it is not yet visible, and visual cues to be given as to its location.
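Acoustically localizing the siren presupposes a direction estimate. One classical approach, sketched here under far-field assumptions (the constants and the function name are illustrative, not from the disclosure), uses the difference in arrival time between the two earpieces:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate in air at room temperature
EAR_SPACING = 0.18      # m, assumed spacing between the two earpieces

def siren_bearing(delta_t):
    """Estimate a sound source's bearing from the inter-ear time
    difference of arrival (positive delta_t = right earpiece first).

    Returns degrees from straight ahead, positive to the right,
    assuming a distant (plane-wave) source.
    """
    ratio = SPEED_OF_SOUND * delta_t / EAR_SPACING
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical overshoot
    return math.degrees(math.asin(ratio))

# Equal arrival times place the siren straight ahead.
bearing = siren_bearing(0.0)  # 0.0 degrees
```

The resulting bearing is exactly the information the glasses would need to place a visual cue before the ambulance enters the field of view.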
- Additionally, this system allows for camera systems possibly monitoring the “peripheral visual field” of the user as well as the lateral and posterior fields. This allows theoretical access to 360 degree visual fields. Such fields may be tied into the camera systems available from smart earpieces or other connected body worn devices.
- In addition, “assistive vision” capabilities as well as “augmented vision” may be placed into the visual field of the user. This may nicely parallel the smart ear-based system, allowing a wide suite of biometric measurements as well as detailed position sensors. Control systems may be enhanced both from the visual as well as the audio worlds. For example, the pupils may be tracked using an inwardly facing camera, so the smart glasses may be able to detect from the front facing cameras what the person is looking at with reasonable certainty. This information may be transmitted to the smart earpieces which may process the request and do a search, as but one example. Feedback to the user may be given acoustically, as well as visually. The choice may be made by the user as to which approach the user would prefer. Additionally, visually instituted actions may be able to coordinate with an audio mechanism for enhanced feedback. Such feedback may be optimized to provide for feedback from the audio system to the visual system or from the visual system to the audio system.
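Letting the user choose acoustic or visual feedback amounts to a small routing decision. The sketch below uses plain callables as stand-ins for the hardware; all names here are hypothetical:

```python
def route_feedback(result_text, preference, display, speak):
    """Deliver feedback over the channel the user prefers.

    `display` and `speak` are callables standing in for the eyeglasses'
    lens display and the earpiece speaker respectively; "visual" routes
    to the glasses, anything else defaults to the auditory pathway.
    """
    if preference == "visual":
        display(result_text)
    else:
        speak(result_text)

# Capture the routed output with plain lists standing in for hardware.
shown, spoken = [], []
route_feedback("John Doe, 1.8 m tall", "visual", shown.append, spoken.append)
route_feedback("Ambulance approaching from the left", "audio",
               shown.append, spoken.append)
```

A richer version could route to both channels at once for the coordinated audio-visual feedback described above.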
- According to another aspect, a thin film polymer may be placed on the lens to allow a screen to be created. As shown in
FIG. 10 , a lens assembly 300 may have a lens 302 with a thin film polymer 304, 306 on each side of the lens 302. Instead of placing the thin film polymer on both sides, the thin film polymer need only be present on one side. Where the thin film polymer is placed on both sides, a natural scenario may be created which allows for a depth of the image to be created. Thus, the effect detected by a user may be adjusted. Alternately or in conjunction with such effects, images may be projected directly onto the retina. This provides an unparalleled advantage in the augmented or assisted visual world. - Therefore, various methods, systems and apparatus have been shown and described. Although specific embodiments are included, the present invention contemplates numerous additions, options, and alterations.
Claims (20)
1. A method for interacting with a user using:
a set of earpieces comprising:
a left ear piece and a right ear piece, each of the earpieces comprising:
an ear piece housing,
a wireless transceiver disposed within the ear piece housing,
a processor disposed within the housing and operatively connected to the wireless transceiver, and
at least one sensor operatively connected to the processor;
a set of eyeglasses comprising:
an eyeglass frame,
a wireless transceiver disposed within the eyeglass frame,
a processor disposed within the eyeglass frame, and
a first lens and a second lens operatively connected to the eyeglass frame;
the method comprising:
sensing earpiece sensor data at the set of earpieces using at least one sensor of the set of earpieces;
processing the earpiece sensor data at the set of earpieces to provide data to display;
communicating the data to display from the set of earpieces to the set of eyeglasses;
displaying a visual representation based on the data to display on at least one of the first lens and the second lens of the set of eyeglasses;
sensing eyeglasses data at the set of eyeglasses;
communicating the eyeglasses data from the set of eyeglasses to the set of earpieces; and
generating audio at the set of earpieces.
2. The method of claim 1 , wherein the communicating the data to display from the set of earpieces to the set of eyeglasses comprises wirelessly communicating the data to display from the set of earpieces to the set of eyeglasses and wherein the communicating the eyeglasses data from the set of eyeglasses to the set of earpieces comprises wirelessly communicating the eyeglasses data from the set of eyeglasses to the set of earpieces.
3. The method of claim 2 wherein the set of eyeglasses further comprises a film on an external face of at least one of the first lens and the second lens configured to display imagery.
4. The method of claim 3 wherein the imagery is augmented vision imagery and wherein the data to display is augmented imagery data.
5. The method of claim 3 wherein the imagery is assisted vision imagery.
6. The method of claim 2 wherein the set of eyeglasses further comprises a film on an internal face of at least one of the first lens and the second lens configured to display imagery.
7. The method of claim 2 wherein the set of eyeglasses further comprises a first film on an external face of at least one of the first lens and the second lens configured to display a first set of imagery and a second film on an internal face of at least one of the first lens and the second lens configured to display a second set of imagery and wherein the first set of imagery and the second set of imagery combine to provide imagery perceived as having three-dimensional depth.
8. The method of claim 1 wherein the audio comprises an audio prompt requesting information from the user.
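The method of claims 1-8 describes a round-trip data flow: the earpieces sense and process data, the eyeglasses display it, and eyeglasses data travels back to drive an audio prompt. The following is a minimal, hypothetical sketch of that flow; all class and method names (`Earpieces`, `Eyeglasses`, `run_cycle`, and the example sensor values) are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class Eyeglasses:
    displayed: list = field(default_factory=list)

    def display(self, data):
        # Claim 1: display a visual representation based on the
        # data to display on at least one lens.
        self.displayed.append(data)

    def sense(self):
        # Claim 1: sense eyeglasses data (e.g., a camera frame).
        return {"camera": "forward-facing frame"}

@dataclass
class Earpieces:
    audio_out: list = field(default_factory=list)

    def sense(self):
        # Claim 1: sense earpiece sensor data at the earpieces.
        return {"heart_rate": 72}

    def process(self, sensor_data):
        # Claim 1: process the sensor data into data to display.
        return {"overlay": f"HR {sensor_data['heart_rate']} bpm"}

    def generate_audio(self, eyeglasses_data):
        # Claim 8: the audio may be a prompt requesting information
        # from the user.
        prompt = f"Can you identify what is in the {eyeglasses_data['camera']}?"
        self.audio_out.append(prompt)
        return prompt

def run_cycle(earpieces, eyeglasses):
    # Earpieces -> eyeglasses: communicate the data to display.
    eyeglasses.display(earpieces.process(earpieces.sense()))
    # Eyeglasses -> earpieces: communicate the eyeglasses data,
    # then generate audio at the earpieces.
    return earpieces.generate_audio(eyeglasses.sense())

earpieces, eyeglasses = Earpieces(), Eyeglasses()
prompt = run_cycle(earpieces, eyeglasses)
```

In an actual embodiment each hop would occur over the wireless link recited in claim 2; the sketch collapses that link into direct method calls.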
9. A system comprising:
a set of wireless earpieces comprising:
a left earpiece and a right earpiece, each of the wireless earpieces comprising:
an earpiece housing,
a wireless transceiver disposed within the earpiece housing,
a processor disposed within the earpiece housing and operatively connected to the wireless transceiver, and
at least one sensor operatively connected to the processor;
a set of eyeglasses comprising:
an eyeglass frame,
a wireless transceiver disposed within the eyeglass frame,
a processor disposed within the eyeglass frame, and
a first lens and a second lens operatively connected to the eyeglass frame; and
a thin film polymer layer placed on at least one of the first lens or the second lens;
wherein the set of wireless earpieces are wirelessly linked with the set of eyeglasses; and
wherein the set of wireless earpieces issue voice prompts to a user requesting assistance in providing context to an image viewed using the set of eyeglasses;
wherein eyeglasses data sensed from the set of eyeglasses is wirelessly transmitted to the set of wireless earpieces and the voice prompts are generated at the set of wireless earpieces based on the eyeglasses data; and
wherein the user may respond verbally or non-verbally to identify a person and/or object within the image viewed using the set of eyeglasses.
10. The system of claim 9, wherein the size of the person and/or the object may be enhanced to emphasize the person or the object through augmentation.
11. The system of claim 9 wherein the color of the person and/or the object may be altered to emphasize the person or the object through augmentation.
12. The system of claim 9, wherein the image of the person and/or the object may blink to emphasize the person or the object through augmentation.
13. The system of claim 9, wherein at least one of the first lens and the second lens is a corrective lens.
14. The system of claim 9 wherein the thin film polymer layer is positioned on an inside of at least one of the first lens and the second lens.
15. The system of claim 9 wherein the thin film polymer layer is positioned on an outside of at least one of the first lens and the second lens.
16. The system of claim 9 wherein the thin film polymer layer is a first thin film polymer layer and the system further comprises a second thin film polymer layer wherein the first thin film polymer layer is on an inside of at least one of the first lens and the second lens and wherein the second thin film polymer layer is on an outside of at least one of the first lens and the second lens.
17. The system of claim 9 wherein the first lens is integral with the second lens.
18. The system of claim 9 wherein the thin film polymer layer forms a portion of a display.
19. The system of claim 15 wherein the system is configured to display information from one or more of the wireless earpieces on the display.
20. The system of claim 9 wherein the set of eyeglasses further comprise at least one forward facing camera and at least one inward facing camera.
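Claims 10-12 recite three ways the system may emphasize an identified person or object through augmentation: enlarging it, altering its color, or making it blink. A minimal sketch of such an emphasis step follows; the function name, dictionary keys, and example values are hypothetical, not drawn from the patent.

```python
def emphasize(obj, mode):
    # Return a copy of the displayed object with one augmentation
    # applied, leaving the original untouched.
    out = dict(obj)
    if mode == "size":
        # Claim 10: enhance the size of the person or object.
        out["scale"] = obj.get("scale", 1.0) * 1.5
    elif mode == "color":
        # Claim 11: alter the color of the person or object.
        out["color"] = "highlight-yellow"
    elif mode == "blink":
        # Claim 12: make the image of the person or object blink.
        out["blink"] = True
    return out

person = {"label": "person", "scale": 1.0}
enlarged = emphasize(person, "size")
```

A rendering pipeline on the thin film polymer display would consume the returned attributes; the sketch only models the attribute change itself.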
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/138,206 US20190028793A1 (en) | 2016-08-25 | 2018-09-21 | Wireless Earpiece and Smart Glasses System and Method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662379534P | 2016-08-25 | 2016-08-25 | |
US15/682,986 US10104464B2 (en) | 2016-08-25 | 2017-08-22 | Wireless earpiece and smart glasses system and method |
US16/138,206 US20190028793A1 (en) | 2016-08-25 | 2018-09-21 | Wireless Earpiece and Smart Glasses System and Method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/682,986 Continuation US10104464B2 (en) | 2016-08-25 | 2017-08-22 | Wireless earpiece and smart glasses system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190028793A1 (en) | 2019-01-24 |
Family
ID=61244158
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/682,986 Active US10104464B2 (en) | 2016-08-25 | 2017-08-22 | Wireless earpiece and smart glasses system and method |
US16/138,206 Abandoned US20190028793A1 (en) | 2016-08-25 | 2018-09-21 | Wireless Earpiece and Smart Glasses System and Method |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/682,986 Active US10104464B2 (en) | 2016-08-25 | 2017-08-22 | Wireless earpiece and smart glasses system and method |
Country Status (1)
Country | Link |
---|---|
US (2) | US10104464B2 (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3238131A4 (en) * | 2014-12-22 | 2018-09-05 | Genie Enterprise Ltd. | Three-dimensional rotatably-readable encoding of data for optical machine-reading |
US20180124497A1 (en) * | 2016-10-31 | 2018-05-03 | Bragi GmbH | Augmented Reality Sharing for Wearable Devices |
US20180335773A1 (en) * | 2017-05-16 | 2018-11-22 | Yi Xie | Balancing board |
US10110999B1 (en) * | 2017-09-05 | 2018-10-23 | Motorola Solutions, Inc. | Associating a user voice query with head direction |
US11381949B2 (en) * | 2018-02-25 | 2022-07-05 | Carlos Eduardo Escobar K'David | Wireless wearable Push-To-Talk (PTT) device |
US20210038427A1 (en) * | 2018-03-14 | 2021-02-11 | Menicon Singapore Pte Ltd. | Wearable device for communication with an ophthalmic device |
Family Cites Families (276)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2325590A (en) | 1940-05-11 | 1943-08-03 | Sonotone Corp | Earphone |
US2430229A (en) | 1943-10-23 | 1947-11-04 | Zenith Radio Corp | Hearing aid earpiece |
US3047089A (en) | 1959-08-31 | 1962-07-31 | Univ Syracuse | Ear plugs |
US3586794A (en) | 1967-11-04 | 1971-06-22 | Sennheiser Electronic | Earphone having sound detour path |
US3934100A (en) | 1974-04-22 | 1976-01-20 | Seeburg Corporation | Acoustic coupler for use with auditory equipment |
US3983336A (en) | 1974-10-15 | 1976-09-28 | Hooshang Malek | Directional self containing ear mounted hearing aid |
US4150262A (en) | 1974-11-18 | 1979-04-17 | Hiroshi Ono | Piezoelectric bone conductive in ear voice sounds transmitting and receiving apparatus |
US4069400A (en) | 1977-01-31 | 1978-01-17 | United States Surgical Corporation | Modular in-the-ear hearing aid |
USD266271S (en) | 1979-01-29 | 1982-09-21 | Audivox, Inc. | Hearing aid |
JPS5850078B2 (en) | 1979-05-04 | 1983-11-08 | 株式会社 弦エンジニアリング | Vibration pickup type ear microphone transmitting device and transmitting/receiving device |
JPS56152395A (en) | 1980-04-24 | 1981-11-25 | Gen Eng:Kk | Ear microphone of simultaneous transmitting and receiving type |
US4375016A (en) | 1980-04-28 | 1983-02-22 | Qualitone Hearing Aids Inc. | Vented ear tip for hearing aid and adapter coupler therefore |
US4588867A (en) | 1982-04-27 | 1986-05-13 | Masao Konomi | Ear microphone |
JPS6068734U (en) | 1983-10-18 | 1985-05-15 | 株式会社岩田エレクトリツク | handset |
US4617429A (en) | 1985-02-04 | 1986-10-14 | Gaspare Bellafiore | Hearing aid |
US4682180A (en) | 1985-09-23 | 1987-07-21 | American Telephone And Telegraph Company At&T Bell Laboratories | Multidirectional feed and flush-mounted surface wave antenna |
US4852177A (en) | 1986-08-28 | 1989-07-25 | Sensesonics, Inc. | High fidelity earphone and hearing aid |
CA1274184A (en) | 1986-10-07 | 1990-09-18 | Edward S. Kroetsch | Modular hearing aid with lid hinged to faceplate |
US4791673A (en) | 1986-12-04 | 1988-12-13 | Schreiber Simeon B | Bone conduction audio listening device and method |
US5201008A (en) | 1987-01-27 | 1993-04-06 | Unitron Industries Ltd. | Modular hearing aid with lid hinged to faceplate |
US4865044A (en) | 1987-03-09 | 1989-09-12 | Wallace Thomas L | Temperature-sensing system for cattle |
DK157647C (en) | 1987-10-14 | 1990-07-09 | Gn Danavox As | PROTECTION ORGANIZATION FOR ALT-I-HEARED HEARING AND TOOL FOR USE IN REPLACEMENT OF IT |
US5201007A (en) | 1988-09-15 | 1993-04-06 | Epic Corporation | Apparatus and method for conveying amplified sound to ear |
US5185802A (en) | 1990-04-12 | 1993-02-09 | Beltone Electronics Corporation | Modular hearing aid system |
US5298692A (en) | 1990-11-09 | 1994-03-29 | Kabushiki Kaisha Pilot | Earpiece for insertion in an ear canal, and an earphone, microphone, and earphone/microphone combination comprising the same |
US5191602A (en) | 1991-01-09 | 1993-03-02 | Plantronics, Inc. | Cellular telephone headset |
USD340286S (en) | 1991-01-29 | 1993-10-12 | Jinseong Seo | Shell for hearing aid |
US5347584A (en) | 1991-05-31 | 1994-09-13 | Rion Kabushiki-Kaisha | Hearing aid |
US5295193A (en) | 1992-01-22 | 1994-03-15 | Hiroshi Ono | Device for picking up bone-conducted sound in external auditory meatus and communication device using the same |
US5343532A (en) | 1992-03-09 | 1994-08-30 | Shugart Iii M Wilbert | Hearing aid device |
US5280524A (en) | 1992-05-11 | 1994-01-18 | Jabra Corporation | Bone conductive ear microphone and method |
ATE211339T1 (en) | 1992-05-11 | 2002-01-15 | Jabra Corp | UNIDIRECTIONAL EAR MICROPHONE AND METHOD THEREOF |
US5497339A (en) | 1993-11-15 | 1996-03-05 | Ete, Inc. | Portable apparatus for providing multiple integrated communication media |
EP0683621B1 (en) | 1994-05-18 | 2002-03-27 | Nippon Telegraph And Telephone Corporation | Transmitter-receiver having ear-piece type acoustic transducing part |
US5749072A (en) | 1994-06-03 | 1998-05-05 | Motorola Inc. | Communications device responsive to spoken commands and methods of using same |
US5613222A (en) | 1994-06-06 | 1997-03-18 | The Creative Solutions Company | Cellular telephone headset for hand-free communication |
USD367113S (en) | 1994-08-01 | 1996-02-13 | Earcraft Technologies, Inc. | Air conduction hearing aid |
US5748743A (en) | 1994-08-01 | 1998-05-05 | Ear Craft Technologies | Air conduction hearing device |
DE19504478C2 (en) | 1995-02-10 | 1996-12-19 | Siemens Audiologische Technik | Ear canal insert for hearing aids |
US6339754B1 (en) | 1995-02-14 | 2002-01-15 | America Online, Inc. | System for automated translation of speech |
US5692059A (en) | 1995-02-24 | 1997-11-25 | Kruger; Frederick M. | Two active element in-the-ear microphone system |
KR19990014897A (en) | 1995-05-18 | 1999-02-25 | 프란시스 에이 월드만 | Near field communication system |
US5721783A (en) | 1995-06-07 | 1998-02-24 | Anderson; James C. | Hearing aid with wireless remote processor |
US5606621A (en) | 1995-06-14 | 1997-02-25 | Siemens Hearing Instruments, Inc. | Hybrid behind-the-ear and completely-in-canal hearing aid |
US6081724A (en) | 1996-01-31 | 2000-06-27 | Qualcomm Incorporated | Portable communication device and accessory system |
US7010137B1 (en) | 1997-03-12 | 2006-03-07 | Sarnoff Corporation | Hearing aid |
JP3815513B2 (en) | 1996-08-19 | 2006-08-30 | ソニー株式会社 | earphone |
US5802167A (en) | 1996-11-12 | 1998-09-01 | Hong; Chu-Chai | Hands-free device for use with a cellular telephone in a car to permit hands-free operation of the cellular telephone |
US6112103A (en) | 1996-12-03 | 2000-08-29 | Puthuff; Steven H. | Personal communication device |
IL119948A (en) | 1996-12-31 | 2004-09-27 | News Datacom Ltd | Voice activated communication system and program guide |
US6111569A (en) | 1997-02-21 | 2000-08-29 | Compaq Computer Corporation | Computer-based universal remote control system |
US5987146A (en) | 1997-04-03 | 1999-11-16 | Resound Corporation | Ear canal microphone |
US6021207A (en) | 1997-04-03 | 2000-02-01 | Resound Corporation | Wireless open ear canal earpiece |
US6181801B1 (en) | 1997-04-03 | 2001-01-30 | Resound Corporation | Wired open ear canal earpiece |
DE19721982C2 (en) | 1997-05-26 | 2001-08-02 | Siemens Audiologische Technik | Communication system for users of a portable hearing aid |
US5929774A (en) | 1997-06-13 | 1999-07-27 | Charlton; Norman J | Combination pager, organizer and radio |
USD397796S (en) | 1997-07-01 | 1998-09-01 | Citizen Tokei Kabushiki Kaisha | Hearing aid |
USD411200S (en) | 1997-08-15 | 1999-06-22 | Peltor Ab | Ear protection with radio |
US6167039A (en) | 1997-12-17 | 2000-12-26 | Telefonaktiebolget Lm Ericsson | Mobile station having plural antenna elements and interference suppression |
US6230029B1 (en) | 1998-01-07 | 2001-05-08 | Advanced Mobile Solutions, Inc. | Modular wireless headset system |
US6041130A (en) | 1998-06-23 | 2000-03-21 | Mci Communications Corporation | Headset with multiple connections |
US6054989A (en) | 1998-09-14 | 2000-04-25 | Microsoft Corporation | Methods, apparatus and data structures for providing a user interface, which exploits spatial memory in three-dimensions, to objects and which provides spatialized audio |
US6519448B1 (en) | 1998-09-30 | 2003-02-11 | William A. Dress | Personal, self-programming, short-range transceiver system |
US20030034874A1 (en) | 1998-10-29 | 2003-02-20 | W. Stephen G. Mann | System or architecture for secure mail transport and verifiable delivery, or apparatus for mail security |
US20020030637A1 (en) | 1998-10-29 | 2002-03-14 | Mann W. Stephen G. | Aremac-based means and apparatus for interaction with computer, or one or more other people, through a camera |
US6275789B1 (en) | 1998-12-18 | 2001-08-14 | Leo Moser | Method and apparatus for performing full bidirectional translation between a source language and a linked alternative language |
US20010005197A1 (en) | 1998-12-21 | 2001-06-28 | Animesh Mishra | Remotely controlling electronic devices |
US6424820B1 (en) | 1999-04-02 | 2002-07-23 | Interval Research Corporation | Inductively coupled wireless system and method |
DK1046943T3 (en) | 1999-04-20 | 2002-10-28 | Koechler Erika Fa | Hearing aid |
US7403629B1 (en) | 1999-05-05 | 2008-07-22 | Sarnoff Corporation | Disposable modular hearing aid |
US7113611B2 (en) | 1999-05-05 | 2006-09-26 | Sarnoff Corporation | Disposable modular hearing aid |
US6738485B1 (en) | 1999-05-10 | 2004-05-18 | Peter V. Boesen | Apparatus, method and system for ultra short range communication |
USD468299S1 (en) | 1999-05-10 | 2003-01-07 | Peter V. Boesen | Communication device |
US6823195B1 (en) | 2000-06-30 | 2004-11-23 | Peter V. Boesen | Ultra short range communication with sensing device and method |
US6560468B1 (en) | 1999-05-10 | 2003-05-06 | Peter V. Boesen | Cellular telephone, personal digital assistant, and pager unit with capability of short range radio frequency transmissions |
US6094492A (en) | 1999-05-10 | 2000-07-25 | Boesen; Peter V. | Bone conduction voice transmission apparatus and system |
US20020057810A1 (en) | 1999-05-10 | 2002-05-16 | Boesen Peter V. | Computer and voice communication unit with handsfree device |
US6542721B2 (en) | 1999-10-11 | 2003-04-01 | Peter V. Boesen | Cellular telephone, personal digital assistant and pager unit |
US6952483B2 (en) | 1999-05-10 | 2005-10-04 | Genisus Systems, Inc. | Voice transmission apparatus with UWB |
US6920229B2 (en) | 1999-05-10 | 2005-07-19 | Peter V. Boesen | Earpiece with an inertial sensor |
US6879698B2 (en) | 1999-05-10 | 2005-04-12 | Peter V. Boesen | Cellular telephone, personal digital assistant with voice communication unit |
US6084526A (en) | 1999-05-12 | 2000-07-04 | Time Warner Entertainment Co., L.P. | Container with means for displaying still and moving images |
US6208372B1 (en) | 1999-07-29 | 2001-03-27 | Netergy Networks, Inc. | Remote electromechanical control of a video communications system |
US6694180B1 (en) | 1999-10-11 | 2004-02-17 | Peter V. Boesen | Wireless biopotential sensing device and method with capability of short-range radio frequency transmission and reception |
US6852084B1 (en) | 2000-04-28 | 2005-02-08 | Peter V. Boesen | Wireless physiological pressure sensor and transmitter with capability of short range radio frequency transmissions |
US7508411B2 (en) | 1999-10-11 | 2009-03-24 | S.P. Technologies Llp | Personal communications device |
US6470893B1 (en) | 2000-05-15 | 2002-10-29 | Peter V. Boesen | Wireless biopotential sensing device and method with capability of short-range radio frequency transmission and reception |
AU2001245678A1 (en) | 2000-03-13 | 2001-09-24 | Sarnoff Corporation | Hearing aid with a flexible shell |
US8140357B1 (en) | 2000-04-26 | 2012-03-20 | Boesen Peter V | Point of service billing and records system |
US7047196B2 (en) | 2000-06-08 | 2006-05-16 | Agiletv Corporation | System and method of voice recognition near a wireline node of a network supporting cable television and/or video delivery |
JP2002083152A (en) | 2000-06-30 | 2002-03-22 | Victor Co Of Japan Ltd | Contents download system, portable terminal player, and contents provider |
KR100387918B1 (en) | 2000-07-11 | 2003-06-18 | 이수성 | Interpreter |
US6784873B1 (en) | 2000-08-04 | 2004-08-31 | Peter V. Boesen | Method and medium for computer readable keyboard display incapable of user termination |
JP4135307B2 (en) | 2000-10-17 | 2008-08-20 | 株式会社日立製作所 | Voice interpretation service method and voice interpretation server |
WO2002039600A2 (en) | 2000-11-07 | 2002-05-16 | Research In Motion Limited | Communication device with multiple detachable communication modules |
US20020076073A1 (en) | 2000-12-19 | 2002-06-20 | Taenzer Jon C. | Automatically switched hearing aid communications earpiece |
USD455835S1 (en) | 2001-04-03 | 2002-04-16 | Voice And Wireless Corporation | Wireless earpiece |
US6987986B2 (en) | 2001-06-21 | 2006-01-17 | Boesen Peter V | Cellular telephone, personal digital assistant with dual lines for simultaneous uses |
USD468300S1 (en) | 2001-06-26 | 2003-01-07 | Peter V. Boesen | Communication device |
USD464039S1 (en) | 2001-06-26 | 2002-10-08 | Peter V. Boesen | Communication device |
US20030065504A1 (en) | 2001-10-02 | 2003-04-03 | Jessica Kraemer | Instant verbal translator |
US6664713B2 (en) | 2001-12-04 | 2003-12-16 | Peter V. Boesen | Single chip device for voice communications |
US7539504B2 (en) | 2001-12-05 | 2009-05-26 | Espre Solutions, Inc. | Wireless telepresence collaboration system |
US8527280B2 (en) | 2001-12-13 | 2013-09-03 | Peter V. Boesen | Voice communication device with foreign language translation |
US20030218064A1 (en) | 2002-03-12 | 2003-11-27 | Storcard, Inc. | Multi-purpose personal portable electronic system |
US8436780B2 (en) | 2010-07-12 | 2013-05-07 | Q-Track Corporation | Planar loop antenna system |
US9153074B2 (en) | 2011-07-18 | 2015-10-06 | Dylan T X Zhou | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
US7030856B2 (en) | 2002-10-15 | 2006-04-18 | Sony Corporation | Method and system for controlling a display device |
US7107010B2 (en) | 2003-04-16 | 2006-09-12 | Nokia Corporation | Short-range radio terminal adapted for data streaming and real time services |
US20050017842A1 (en) | 2003-07-25 | 2005-01-27 | Bryan Dematteo | Adjustment apparatus for adjusting customizable vehicle components |
US7818036B2 (en) | 2003-09-19 | 2010-10-19 | Radeum, Inc. | Techniques for wirelessly controlling push-to-talk operation of half-duplex wireless device |
US20050094839A1 (en) | 2003-11-05 | 2005-05-05 | Gwee Lin K. | Earpiece set for the wireless communication apparatus |
US7136282B1 (en) | 2004-01-06 | 2006-11-14 | Carlton Rebeske | Tablet laptop and interactive conferencing station system |
US7558744B2 (en) | 2004-01-23 | 2009-07-07 | Razumov Sergey N | Multimedia terminal for product ordering |
US20050251455A1 (en) | 2004-05-10 | 2005-11-10 | Boesen Peter V | Method and system for purchasing access to a recording |
US20060074808A1 (en) | 2004-05-10 | 2006-04-06 | Boesen Peter V | Method and system for purchasing access to a recording |
ATE511298T1 (en) | 2004-06-14 | 2011-06-15 | Nokia Corp | AUTOMATED APPLICATION-SELECTIVE PROCESSING OF INFORMATION OBTAINED THROUGH WIRELESS DATA COMMUNICATIONS LINKS |
US7925506B2 (en) | 2004-10-05 | 2011-04-12 | Inago Corporation | Speech recognition accuracy via concept to keyword mapping |
USD532520S1 (en) | 2004-12-22 | 2006-11-21 | Siemens Aktiengesellschaft | Combined hearing aid and communication device |
US7558529B2 (en) | 2005-01-24 | 2009-07-07 | Broadcom Corporation | Earpiece/microphone (headset) servicing multiple incoming audio streams |
US8489151B2 (en) | 2005-01-24 | 2013-07-16 | Broadcom Corporation | Integrated and detachable wireless headset element for cellular/mobile/portable phones and audio playback devices |
US7183932B2 (en) | 2005-03-21 | 2007-02-27 | Toyota Technical Center Usa, Inc | Inter-vehicle drowsy driver advisory system |
US20060258412A1 (en) | 2005-05-16 | 2006-11-16 | Serina Liu | Mobile phone wireless earpiece |
US20100186051A1 (en) | 2005-05-17 | 2010-07-22 | Vondoenhoff Roger C | Wireless transmission of information between seats in a mobile platform using magnetic resonance energy |
US20140122116A1 (en) | 2005-07-06 | 2014-05-01 | Alan H. Smythe | System and method for providing audio data to assist in electronic medical records management |
USD554756S1 (en) | 2006-01-30 | 2007-11-06 | Songbird Hearing, Inc. | Hearing aid |
US20120057740A1 (en) | 2006-03-15 | 2012-03-08 | Mark Bryan Rosal | Security and protection device for an ear-mounted audio amplifier or telecommunication instrument |
US7965855B1 (en) | 2006-03-29 | 2011-06-21 | Plantronics, Inc. | Conformable ear tip with spout |
USD549222S1 (en) | 2006-07-10 | 2007-08-21 | Jetvox Acoustic Corp. | Earplug type earphone |
US20080076972A1 (en) | 2006-09-21 | 2008-03-27 | Apple Inc. | Integrated sensors for tracking performance metrics |
KR100842607B1 (en) | 2006-10-13 | 2008-07-01 | 삼성전자주식회사 | Charging cradle for head set device and speaker cover for head set device |
US8652040B2 (en) * | 2006-12-19 | 2014-02-18 | Valencell, Inc. | Telemetric apparatus for health and environmental monitoring |
GB0703275D0 (en) | 2007-02-20 | 2007-03-28 | Skype Ltd | Method of estimating noise levels in a communication system |
US8194865B2 (en) | 2007-02-22 | 2012-06-05 | Personics Holdings Inc. | Method and device for sound detection and audio control |
US8063769B2 (en) | 2007-03-30 | 2011-11-22 | Broadcom Corporation | Dual band antenna and methods for use therewith |
US20080255430A1 (en) | 2007-04-16 | 2008-10-16 | Sony Ericsson Mobile Communications Ab | Portable device with biometric sensor arrangement |
US20080298606A1 (en) * | 2007-06-01 | 2008-12-04 | Manifold Products, Llc | Wireless digital audio player |
US8068925B2 (en) | 2007-06-28 | 2011-11-29 | Apple Inc. | Dynamic routing of audio among multiple audio devices |
US8102275B2 (en) | 2007-07-02 | 2012-01-24 | Procter & Gamble | Package and merchandising system |
US20090008275A1 (en) | 2007-07-02 | 2009-01-08 | Ferrari Michael G | Package and merchandising system |
USD579006S1 (en) | 2007-07-05 | 2008-10-21 | Samsung Electronics Co., Ltd. | Wireless headset |
US20090017881A1 (en) | 2007-07-10 | 2009-01-15 | David Madrigal | Storage and activation of mobile phone components |
US8655004B2 (en) | 2007-10-16 | 2014-02-18 | Apple Inc. | Sports monitoring system for headphones, earbuds and/or headsets |
US20090105548A1 (en) | 2007-10-23 | 2009-04-23 | Bart Gary F | In-Ear Biometrics |
US7825626B2 (en) | 2007-10-29 | 2010-11-02 | Embarq Holdings Company Llc | Integrated charger and holder for one or more wireless devices |
US8108143B1 (en) | 2007-12-20 | 2012-01-31 | U-Blox Ag | Navigation system enabled wireless headset |
US20090191920A1 (en) | 2008-01-29 | 2009-07-30 | Paul Regen | Multi-Function Electronic Ear Piece |
US8199952B2 (en) | 2008-04-01 | 2012-06-12 | Siemens Hearing Instruments, Inc. | Method for adaptive construction of a small CIC hearing instrument |
US20090296968A1 (en) | 2008-05-28 | 2009-12-03 | Zounds, Inc. | Maintenance station for hearing aid |
EP2129088A1 (en) | 2008-05-30 | 2009-12-02 | Oticon A/S | A hearing aid system with a low power wireless link between a hearing instrument and a telephone |
US8319620B2 (en) | 2008-06-19 | 2012-11-27 | Personics Holdings Inc. | Ambient situation awareness system and method for vehicles |
CN101616350A (en) | 2008-06-27 | 2009-12-30 | 深圳富泰宏精密工业有限公司 | The portable electron device of bluetooth earphone and this bluetooth earphone of tool |
US8213862B2 (en) | 2009-02-06 | 2012-07-03 | Broadcom Corporation | Headset charge via short-range RF communication |
USD601134S1 (en) | 2009-02-10 | 2009-09-29 | Plantronics, Inc. | Earbud for a communications headset |
JP5245894B2 (en) | 2009-02-16 | 2013-07-24 | 富士通モバイルコミュニケーションズ株式会社 | Mobile communication device |
US8112066B2 (en) | 2009-06-22 | 2012-02-07 | Mourad Ben Ayed | System for NFC authentication based on BLUETOOTH proximity |
US8045961B2 (en) | 2009-06-22 | 2011-10-25 | Mourad Ben Ayed | Systems for wireless authentication based on bluetooth proximity |
DE102009030070A1 (en) | 2009-06-22 | 2010-12-23 | Sennheiser Electronic Gmbh & Co. Kg | Transport and / or storage containers for rechargeable wireless handset |
WO2011001433A2 (en) | 2009-07-02 | 2011-01-06 | Bone Tone Communications Ltd | A system and a method for providing sound signals |
US20110140844A1 (en) | 2009-12-15 | 2011-06-16 | Mcguire Kenneth Stephen | Packaged product having a reactive label and a method of its use |
US8446252B2 (en) | 2010-03-31 | 2013-05-21 | The Procter & Gamble Company | Interactive product package that forms a node of a product-centric communications network |
US20110286615A1 (en) | 2010-05-18 | 2011-11-24 | Robert Olodort | Wireless stereo headsets and methods |
USD647491S1 (en) | 2010-07-30 | 2011-10-25 | Everlight Electronics Co., Ltd. | Light emitting diode |
US8406448B2 (en) | 2010-10-19 | 2013-03-26 | Cheng Uei Precision Industry Co., Ltd. | Earphone with rotatable earphone cap |
US8774434B2 (en) | 2010-11-02 | 2014-07-08 | Yong D. Zhao | Self-adjustable and deforming hearing device |
US9880014B2 (en) | 2010-11-24 | 2018-01-30 | Telenav, Inc. | Navigation system with session transfer mechanism and method of operation thereof |
CN204468122U (en) | 2011-04-05 | 2015-07-15 | 蓝色齿轮有限责任公司 | Ear piece and comprise the system of this ear piece |
US20130065617A1 (en) | 2011-09-14 | 2013-03-14 | Mo-B-Safe Ltd. | System for Reducing Radiation for Cellular Users |
US8548532B1 (en) | 2011-09-27 | 2013-10-01 | Sprint Communications Company L.P. | Head unit to handset interface and integration |
USD666581S1 (en) | 2011-10-25 | 2012-09-04 | Nokia Corporation | Headset device |
JP5731362B2 (en) | 2011-11-28 | 2015-06-10 | 京セラ株式会社 | Electronics |
WO2013134956A1 (en) | 2012-03-16 | 2013-09-19 | Qoros Automotive Co., Ltd. | Navigation system and method for different mobility modes |
US9949205B2 (en) | 2012-05-26 | 2018-04-17 | Qualcomm Incorporated | Smart battery wear leveling for audio devices |
US9374448B2 (en) | 2012-05-27 | 2016-06-21 | Qualcomm Incorporated | Systems and methods for managing concurrent audio messages |
USD687021S1 (en) | 2012-06-18 | 2013-07-30 | Imego Infinity Limited | Pair of earphones |
US8467770B1 (en) | 2012-08-21 | 2013-06-18 | Mourad Ben Ayed | System for securing a mobile terminal |
US8929573B2 (en) | 2012-09-14 | 2015-01-06 | Bose Corporation | Powered headset accessory devices |
SE537958C2 (en) | 2012-09-24 | 2015-12-08 | Scania Cv Ab | Procedure, measuring device and control unit for adapting vehicle train control |
CN102868428B (en) | 2012-09-29 | 2014-11-19 | 裴维彩 | Ultra-low power consumption standby bluetooth device and implementation method thereof |
US10158391B2 (en) | 2012-10-15 | 2018-12-18 | Qualcomm Incorporated | Wireless area network enabled mobile device accessory |
GB2508226B (en) | 2012-11-26 | 2015-08-19 | Selex Es Ltd | Protective housing |
US20140163771A1 (en) | 2012-12-10 | 2014-06-12 | Ford Global Technologies, Llc | Occupant interaction with vehicle system using brought-in devices |
US9391580B2 (en) | 2012-12-31 | 2016-07-12 | Cellco Paternership | Ambient audio injection |
US10043535B2 (en) | 2013-01-15 | 2018-08-07 | Staton Techiya, Llc | Method and device for spectral expansion for an audio signal |
WO2014124100A1 (en) | 2013-02-07 | 2014-08-14 | Earmonics, Llc | Media playback system having wireless earbuds |
US20140222462A1 (en) | 2013-02-07 | 2014-08-07 | Ian Shakil | System and Method for Augmenting Healthcare Provider Performance |
US9301085B2 (en) | 2013-02-20 | 2016-03-29 | Kopin Corporation | Computer headset with detachable 4G radio |
US9516428B2 (en) | 2013-03-14 | 2016-12-06 | Infineon Technologies Ag | MEMS acoustic transducer, MEMS microphone, MEMS microspeaker, array of speakers and method for manufacturing an acoustic transducer |
US9210493B2 (en) | 2013-03-14 | 2015-12-08 | Cirrus Logic, Inc. | Wireless earpiece with local audio cache |
US10268276B2 (en) * | 2013-03-15 | 2019-04-23 | Eyecam, LLC | Autonomous computing and telecommunications head-up displays glasses |
DE102013204681B4 (en) | 2013-03-18 | 2019-10-02 | Sivantos Pte. Ltd. | Binaural hearing instrument and earpiece |
US20140335908A1 (en) | 2013-05-09 | 2014-11-13 | Bose Corporation | Management of conversation circles for short-range audio communication |
US9668041B2 (en) | 2013-05-22 | 2017-05-30 | Zonaar Corporation | Activity monitoring and directing system |
US9081944B2 (en) | 2013-06-21 | 2015-07-14 | General Motors Llc | Access control for personalized user information maintained by a telematics unit |
TWM469709U (en) | 2013-07-05 | 2014-01-01 | Jetvox Acoustic Corp | Tunable earphone |
WO2015011552A1 (en) | 2013-07-25 | 2015-01-29 | Bionym Inc. | Preauthorized wearable biometric device, system and method for use thereof |
JP6107596B2 (en) | 2013-10-23 | 2017-04-05 | 富士通株式会社 | Article conveying device |
US9279696B2 (en) | 2013-10-25 | 2016-03-08 | Qualcomm Incorporated | Automatic handover of positioning parameters from a navigation device to a mobile device |
KR101710317B1 (en) | 2013-11-22 | 2017-02-24 | 퀄컴 인코포레이티드 | System and method for configuring an interior of a vehicle based on preferences provided with multiple mobile computing devices within the vehicle |
USD733103S1 (en) | 2014-01-06 | 2015-06-30 | Google Technology Holdings LLC | Headset for a communication device |
CN106464996A (en) | 2014-01-24 | 2017-02-22 | 布拉吉有限公司 | Multifunctional headphone system for sports activities |
DE102014100824A1 (en) | 2014-01-24 | 2015-07-30 | Nikolaj Hviid | Independent multifunctional headphones for sports activities |
US9791336B2 (en) | 2014-02-13 | 2017-10-17 | Evigia Systems, Inc. | System and method for head acceleration measurement in helmeted activities |
US9148717B2 (en) | 2014-02-21 | 2015-09-29 | Alpha Audiotronics, Inc. | Earbud charging case |
US8891800B1 (en) | 2014-02-21 | 2014-11-18 | Jonathan Everett Shaffer | Earbud charging case for mobile device |
US9037125B1 (en) | 2014-04-07 | 2015-05-19 | Google Inc. | Detecting driving with a wearable computing device |
USD758385S1 (en) | 2014-04-15 | 2016-06-07 | Huawei Device Co., Ltd. | Display screen or portion thereof with animated graphical user interface |
USD728107S1 (en) | 2014-06-09 | 2015-04-28 | Actervis Gmbh | Hearing aid |
US9344543B2 (en) | 2014-07-15 | 2016-05-17 | Wistron Corporation | Utilizing telecoil compatibility on a mobile device for improving frequency range of multimedia playback |
US10024667B2 (en) | 2014-08-01 | 2018-07-17 | Toyota Motor Engineering & Manufacturing North America, Inc. | Wearable earpiece for providing social and environmental awareness |
JP6337199B2 (en) | 2014-08-26 | 2018-06-06 | トヨタ モーター セールス,ユー.エス.エー.,インコーポレイティド | Integrated wearables for interactive mobile control systems |
US9532128B2 (en) | 2014-09-05 | 2016-12-27 | Earin Ab | Charging of wireless earbuds |
US9779752B2 (en) | 2014-10-31 | 2017-10-03 | At&T Intellectual Property I, L.P. | Acoustic enhancement by leveraging metadata to mitigate the impact of noisy environments |
CN204244472U (en) | 2014-12-19 | 2015-04-01 | 中国长江三峡集团公司 | Vehicle-mounted road background sound collection and broadcast safety device |
US20160204839A1 (en) * | 2015-01-12 | 2016-07-14 | Futurewei Technologies, Inc. | Multi-band Antenna for Wearable Glasses |
CN104683519A (en) | 2015-03-16 | 2015-06-03 | 镇江博昊科技有限公司 | Mobile phone case with signal shielding function |
CN104837094A (en) | 2015-04-24 | 2015-08-12 | 成都迈奥信息技术有限公司 | Bluetooth earphone integrated with navigation function |
US9510159B1 (en) | 2015-05-15 | 2016-11-29 | Ford Global Technologies, Llc | Determining vehicle occupant location |
US10219062B2 (en) | 2015-06-05 | 2019-02-26 | Apple Inc. | Wireless audio output devices |
USD777710S1 (en) | 2015-07-22 | 2017-01-31 | Doppler Labs, Inc. | Ear piece |
US20170115742A1 (en) * | 2015-08-01 | 2017-04-27 | Zhou Tian Xing | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
USD773439S1 (en) | 2015-08-05 | 2016-12-06 | Harman International Industries, Incorporated | Ear bud adapter |
US9480096B1 (en) * | 2015-08-25 | 2016-10-25 | Motorola Solutions, Inc. | Method, device, and system for fast wireless accessory devices pairing |
US10234133B2 (en) | 2015-08-29 | 2019-03-19 | Bragi GmbH | System and method for prevention of LED light spillage |
US10194228B2 (en) | 2015-08-29 | 2019-01-29 | Bragi GmbH | Load balancing to maximize device function in a personal area network device system and method |
US10409394B2 (en) | 2015-08-29 | 2019-09-10 | Bragi GmbH | Gesture based control system based upon device orientation system and method |
US9866282B2 (en) | 2015-08-29 | 2018-01-09 | Bragi GmbH | Magnetic induction antenna for use in a wearable device |
US9905088B2 (en) | 2015-08-29 | 2018-02-27 | Bragi GmbH | Responsive visual communication system and method |
US9972895B2 (en) | 2015-08-29 | 2018-05-15 | Bragi GmbH | Antenna for use in a wearable device |
US10194232B2 (en) | 2015-08-29 | 2019-01-29 | Bragi GmbH | Responsive packaging system for managing display actions |
US10203773B2 (en) | 2015-08-29 | 2019-02-12 | Bragi GmbH | Interactive product packaging system and method |
US9949008B2 (en) | 2015-08-29 | 2018-04-17 | Bragi GmbH | Reproduction of ambient environmental sound for acoustic transparency of ear canal device system and method |
US9949013B2 (en) | 2015-08-29 | 2018-04-17 | Bragi GmbH | Near field gesture control system and method |
US9838775B2 (en) | 2015-09-16 | 2017-12-05 | Apple Inc. | Earbuds with biometric sensing |
US11609427B2 (en) * | 2015-10-16 | 2023-03-21 | Ostendo Technologies, Inc. | Dual-mode augmented/virtual reality (AR/VR) near-eye wearable displays |
US10506322B2 (en) | 2015-10-20 | 2019-12-10 | Bragi GmbH | Wearable device onboard applications system and method |
US20170109131A1 (en) | 2015-10-20 | 2017-04-20 | Bragi GmbH | Earpiece 3D Sound Localization Using Mixed Sensor Array for Virtual Reality System and Method |
US10206042B2 (en) | 2015-10-20 | 2019-02-12 | Bragi GmbH | 3D sound field using bilateral earpieces system and method |
US10453450B2 (en) | 2015-10-20 | 2019-10-22 | Bragi GmbH | Wearable earpiece voice command control system and method |
US20170110899A1 (en) | 2015-10-20 | 2017-04-20 | Bragi GmbH | Galvanic Charging and Data Transfer of Remote Devices in a Personal Area Network System and Method |
US10175753B2 (en) | 2015-10-20 | 2019-01-08 | Bragi GmbH | Second screen devices utilizing data from ear worn device system and method |
US9980189B2 (en) | 2015-10-20 | 2018-05-22 | Bragi GmbH | Diversity bluetooth system and method |
US10104458B2 (en) | 2015-10-20 | 2018-10-16 | Bragi GmbH | Enhanced biometric control systems for detection of emergency events system and method |
US20170111723A1 (en) | 2015-10-20 | 2017-04-20 | Bragi GmbH | Personal Area Network Devices System and Method |
US10635385B2 (en) | 2015-11-13 | 2020-04-28 | Bragi GmbH | Method and apparatus for interfacing with wireless earpieces |
US10040423B2 (en) | 2015-11-27 | 2018-08-07 | Bragi GmbH | Vehicle with wearable for identifying one or more vehicle occupants |
US20170153636A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Vehicle with wearable integration or communication |
US20170156000A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Vehicle with ear piece to provide audio safety |
US20170151959A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Autonomous vehicle with interactions with wearable devices |
US9944295B2 (en) | 2015-11-27 | 2018-04-17 | Bragi GmbH | Vehicle with wearable for identifying role of one or more users and adjustment of user settings |
US10099636B2 (en) | 2015-11-27 | 2018-10-16 | Bragi GmbH | System and method for determining a user role and user settings associated with a vehicle |
US20170151957A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Vehicle with interactions with wearable device to provide health or physical monitoring |
US20170155998A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Vehicle with display system for interacting with wearable device |
US20170153114A1 (en) | 2015-11-27 | 2017-06-01 | Bragi GmbH | Vehicle with interaction between vehicle navigation system and wearable devices |
US10104460B2 (en) | 2015-11-27 | 2018-10-16 | Bragi GmbH | Vehicle with interaction between entertainment systems and wearable devices |
US9978278B2 (en) | 2015-11-27 | 2018-05-22 | Bragi GmbH | Vehicle to vehicle communications using ear pieces |
US10542340B2 (en) | 2015-11-30 | 2020-01-21 | Bragi GmbH | Power management for wireless earpieces |
US20170155993A1 (en) | 2015-11-30 | 2017-06-01 | Bragi GmbH | Wireless Earpieces Utilizing Graphene Based Microphones and Speakers |
US20170151447A1 (en) | 2015-11-30 | 2017-06-01 | Bragi GmbH | Graphene Based Ultrasound Generation |
US20170155985A1 (en) | 2015-11-30 | 2017-06-01 | Bragi GmbH | Graphene Based Mesh for Use in Portable Electronic Devices |
US10099374B2 (en) | 2015-12-01 | 2018-10-16 | Bragi GmbH | Robotic safety using wearables |
US9939891B2 (en) | 2015-12-21 | 2018-04-10 | Bragi GmbH | Voice dictation systems using earpiece microphone system and method |
US9980033B2 (en) | 2015-12-21 | 2018-05-22 | Bragi GmbH | Microphone natural speech capture voice dictation system and method |
US10206052B2 (en) | 2015-12-22 | 2019-02-12 | Bragi GmbH | Analytical determination of remote battery temperature through distributed sensor array system and method |
US10575083B2 (en) | 2015-12-22 | 2020-02-25 | Bragi GmbH | Near field based earpiece data transfer system and method |
US10334345B2 (en) | 2015-12-29 | 2019-06-25 | Bragi GmbH | Notification and activation system utilizing onboard sensors of wireless earpieces |
US10154332B2 (en) | 2015-12-29 | 2018-12-11 | Bragi GmbH | Power management for wireless earpieces utilizing sensor measurements |
US20170195829A1 (en) | 2015-12-31 | 2017-07-06 | Bragi GmbH | Generalized Short Range Communications Device and Method |
USD788079S1 (en) | 2016-01-08 | 2017-05-30 | Samsung Electronics Co., Ltd. | Electronic device |
US10200790B2 (en) | 2016-01-15 | 2019-02-05 | Bragi GmbH | Earpiece with cellular connectivity |
US10129620B2 (en) | 2016-01-25 | 2018-11-13 | Bragi GmbH | Multilayer approach to hydrophobic and oleophobic system and method |
US10104486B2 (en) | 2016-01-25 | 2018-10-16 | Bragi GmbH | In-ear sensor calibration and detecting system and method |
US10085091B2 (en) | 2016-02-09 | 2018-09-25 | Bragi GmbH | Ambient volume modification through environmental microphone feedback loop system and method |
US10667033B2 (en) | 2016-03-02 | 2020-05-26 | Bragi GmbH | Multifactorial unlocking function for smart wearable device and method |
US10327082B2 (en) | 2016-03-02 | 2019-06-18 | Bragi GmbH | Location based tracking using a wireless earpiece device, system, and method |
US20170257694A1 (en) | 2016-03-02 | 2017-09-07 | Bragi GmbH | System and Method for Rapid Scan and Three Dimensional Print of External Ear Canal |
- 2017-08-22 US US15/682,986 patent/US10104464B2/en active Active
- 2018-09-21 US US16/138,206 patent/US20190028793A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
US10104464B2 (en) | 2018-10-16 |
US20180063625A1 (en) | 2018-03-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10104464B2 (en) | Wireless earpiece and smart glasses system and method | |
CN105431763B (en) | Tracking head movement when wearing a mobile device | |
Hu et al. | An overview of assistive devices for blind and visually impaired people | |
JP6030582B2 (en) | Optical device for individuals with visual impairment | |
CN108475120A (en) | Method and mixed reality system for object motion tracking with a remote device of the mixed reality system | |
EP4157474A1 (en) | Interactive augmented reality experiences using positional tracking | |
EP2984540A1 (en) | Personal holographic billboard | |
CN107533375A (en) | Scene image analysis module | |
US11366527B1 (en) | Systems and methods for sensing gestures via vibration-sensitive wearables donned by users of artificial reality systems | |
US9223451B1 (en) | Active capacitive sensing on an HMD | |
US11670157B2 (en) | Augmented reality system | |
CN115735174A (en) | Augmented reality experience using social distance preservation | |
CN115715177A (en) | Assistive glasses for the blind with geometric hazard detection | |
WO2022245642A1 (en) | Audio enhanced augmented reality | |
US20210303258A1 (en) | Information processing device, information processing method, and recording medium | |
US20220343534A1 (en) | Image based detection of display fit and ophthalmic fit measurements | |
US20230306781A1 (en) | Predicting display fit and ophthalmic fit measurements using a simulator | |
JP7078568B2 (en) | Display device, display control method, and display system | |
WO2023244932A1 (en) | Predicting sizing and/or fitting of head mounted wearable device | |
KR20180045644A (en) | Head mounted display apparatus and method for controlling thereof | |
US20220293241A1 (en) | Systems and methods for signaling cognitive-state transitions | |
EP4124073A1 (en) | Augmented reality device performing audio recognition and control method therefor | |
US20230046950A1 (en) | Image based detection of fit for a head mounted wearable computing device | |
US20230168522A1 (en) | Eyewear with direction of sound arrival detection | |
US20240069688A1 (en) | Head-Mounted Electronic Device with Magnification Tool |
Legal Events
Date | Code | Title | Description
---|---|---|---
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| AS | Assignment | Owner name: BRAGI GMBH, GERMANY; Free format text: EMPLOYMENT DOCUMENT; ASSIGNOR: BOESEN, PETER VINCENT; REEL/FRAME: 049672/0188; Effective date: 2019-06-03 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |