US9794699B2 - Hearing device considering external environment of user and control method of hearing device - Google Patents

Hearing device considering external environment of user and control method of hearing device

Info

Publication number
US9794699B2
Authority
US
United States
Prior art keywords
user
hearing device
external environment
question
hearing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US14/103,218
Other languages
English (en)
Other versions
US20140169574A1 (en
Inventor
Jong Min Choi
Yun Seo KU
Dong Wook Kim
Jong Jin Kim
Jun Il SOHN
Jun Whon UHM
Heun Chul Lee
Yoon Chae Cheong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CHEONG, YOON CHAE, CHOI, JONG MIN, KIM, DONG WOOK, KIM, JONG JIN, LEE, HEUN CHUL, SOHN, JUN IL, UHM, JUN WHON, KU, YUN SEO
Publication of US20140169574A1 publication Critical patent/US20140169574A1/en
Application granted granted Critical
Publication of US9794699B2 publication Critical patent/US9794699B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61FFILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F2/00Filters implantable into blood vessels; Prostheses, i.e. artificial substitutes or replacements for parts of the body; Appliances for connecting them with the body; Devices providing patency to, or preventing collapsing of, tubular structures of the body, e.g. stents
    • A61F2/02Prostheses implantable into the body
    • A61F2/18Internal ear or nose parts, e.g. ear-drums
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R25/43Electronic input selection or mixing based on input signal analysis, e.g. mixing or selection between microphone and telecoil or between microphones with different directivity characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R25/00Deaf-aid sets, i.e. electro-acoustic or electro-mechanical hearing aids; Electric tinnitus maskers providing an auditory perception
    • H04R25/50Customised settings for obtaining desired overall acoustical characteristics
    • H04R25/505Customised settings for obtaining desired overall acoustical characteristics using digital signal processing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2225/00Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R2225/41Detection or adaptation of hearing aid parameters or programs to listening situation, e.g. pub, forest
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04RLOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2225/00Details of deaf aids covered by H04R25/00, not provided for in any of its subgroups
    • H04R2225/61Aspects relating to mechanical or electronic switches or control elements, e.g. functioning

Definitions

  • the following description relates to a hearing device and a control method of the hearing device, and more particularly, to setting of parameters of a hearing device taking into consideration an external environment surrounding a user.
  • a hearing device amplifies a sound generated from an outside source and helps a user perceive the sound.
  • hearing devices are built in different shapes and sizes including a pocket type, an earring type, a concha type, and an eardrum type.
  • the hearing device may deliver a sound as desired by the user by varying one or more settings of the hearing device such as volume control and frequency control.
  • the hearing device measures hearing ability of the user and sets an optimal setting for the hearing device.
  • a settings change device may be complicated to operate or use, making it difficult for the user and often requiring a skilled engineer.
  • a hearing device including an external environment determination unit configured to determine an external environment surrounding a user wearing the hearing device; an inquiry unit configured to select a question corresponding to the external environment and inquire the question to the user; and a parameter set unit configured to set a parameter of the hearing device based on a response to the question and hearing loss information of the user.
  • the external environment may include at least one of oscillation frequency, frequency, and electronic waves generated in the external environment surrounding the user.
  • the external environment determination unit may be configured to determine the external environment surrounding the user using an audio sensor and a radio wave sensor included in the hearing device.
  • the inquiry unit may include a question list including questions corresponding to the external environment surrounding the user.
  • the inquiry unit may be configured to select the question considering sound distortions due to influences caused by at least one of a user gender, a user age, a time zone at which the external environment surrounding the user is determined, a season, weather, and an external temperature.
  • the inquiry unit may be configured to select a question to connect to an electronic device disposed in the external environment surrounding the user according to intensity of radio waves detected from the electronic device.
  • the inquiry unit may be configured to inquire the selected question to the user through a user terminal connected to the hearing device or through an audio sensor or a touch sensor included in the hearing device, and to receive the response to the question.
  • a method for a hearing device including determining an external environment surrounding a user wearing the hearing device; selecting a question corresponding to the external environment and inquiring the question to the user; and setting a parameter of the hearing device based on a response to the question and hearing loss information of the user.
  • the external environment may include at least one of oscillation frequency, frequency, and electronic waves generated in the external environment surrounding the user.
  • the determining of the external environment may include determining the external environment surrounding the user using an audio sensor and a radio wave sensor included in the hearing device.
  • the selecting and inquiring of the question may include selecting the question from a question list including questions corresponding to the external environment surrounding the user.
  • the selecting and inquiring of the question may include selecting the question considering sound distortions due to influences caused by at least one of a user gender, a user age, a time zone at which the external environment surrounding the user is determined, a season, weather, and an external temperature.
  • the selecting and inquiring of the question may include selecting a question to connect to an electronic device disposed in the external environment surrounding the user according to intensity of radio waves detected from the electronic device.
  • the selecting and inquiring of the question may include inquiring the selected question to the user through a user terminal connected to the hearing device or through an audio sensor or a touch sensor included in the hearing device, and receiving the response to the question.
  • a non-transitory computer readable medium configured to control a processor to perform the method described above.
  • a hearing device including an external environment determination unit configured to determine an external environment surrounding a user; an inquiry unit configured to search for a question corresponding to the external environment, and receive a response to the question from the user through a sensor included in the hearing device; and a parameter set unit configured to set a parameter related to the external environment and corresponding to the response received from the user.
  • the parameter may include at least one of volume control, equalizer control, control of a particular frequency band, noise control, reverberation control, wind noise control, acoustic feedback control and is set based on the response of the user and hearing loss information of the user.
  • the external environment determination unit may be configured to use at least one of a touch sensor, an audio sensor, an acceleration sensor, a temperature sensor, an optical sensor, and a radio wave sensor to determine the external environment surrounding the user.
  • the inquiry unit may be configured to store in a database a question list including the question corresponding to the external environment.
  • the sensor may include an audio sensor, an acceleration sensor, a radio wave sensor, a temperature sensor, and an optical sensor.
  • the inquiry unit may select the question corresponding to the external environment of the user, taking into consideration sound distortions due to at least one of time zone, weather, and season defining the external environment surrounding the user.
  • FIG. 1 is a diagram illustrating operation of a hearing device, in accord with an embodiment.
  • FIG. 2 is a diagram illustrating a detailed structure of the hearing device, in accord with an embodiment.
  • FIG. 3 is a diagram illustrating an example of a process to select questions from a predetermined question list according to an external environment surrounding a user, in accord with an embodiment.
  • FIG. 4 is a diagram illustrating an example of operation of responding using a touch sensor included in the hearing device, in accord with an embodiment.
  • FIG. 5 is a diagram illustrating an example of operation of responding using a voice of the user, in accord with an embodiment.
  • FIG. 6 is a diagram illustrating an example of operation of responding using a user terminal, in accord with an embodiment.
  • FIG. 7 is a diagram illustrating an example of operation of connecting with an electronic device according to an intensity of detected electronic waves, in accord with an embodiment.
  • FIG. 8 is a flowchart illustrating an example of an operational process of a hearing device, in accord with an embodiment.
  • FIG. 1 illustrates an operation of a hearing device 101 , in accord with an embodiment.
  • the hearing device 101 determines an external environment 102 of a user.
  • the external environment 102 of the user includes various environmental factors that may exist at a location of the user.
  • the hearing device 101 determines the external environment 102 through sound, acceleration, external temperature, light, humidity, radio waves, and vibrations.
  • the hearing device 101 may be a device adapted to transmit sound, such as a hearing aid, an earphone, a headset, and a microphone.
  • the hearing device 101 may be equipped with an artificial intelligence (AI) function that automatically analyzes and determines the external environment 102 of the user.
  • In response to determining the external environment 102 of the user, the hearing device 101 selects questions from a predefined question list corresponding to the external environment 102 .
  • the hearing device 101 receives a response from the user to the questions through a user terminal.
  • the user terminal may be integrated with the hearing device, or the response may be received through an audio sensor or a touch sensor included in the hearing device.
  • the user terminal may be an external device including all types of data processing devices such as a personal computer (PC), a notebook, a television (TV), an audio equipment, and mobile terminals such as a mobile phone, a tablet PC, and a personal digital assistant (PDA).
  • the hearing device 101 sets the parameters of the hearing device based on the received response from the user and hearing loss information associated with the user.
  • the hearing device 101 determines the external environment to set the parameters according to the external environment of the user, and sets, partly or entirely, the parameters of the hearing device based on the determined external environment, thus reducing the inconvenience for the user of setting the hearing device directly. Also, by properly coping with the frequently changing external environment, the hearing device 101 provides the user with convenience. In one illustrative example, the hearing device 101 triggers the determination of the external environment and the questions either dynamically, that is, automatically and without user intervention, or manually.
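  • As a minimal, non-authoritative sketch of the flow just described for FIG. 1 (determine the environment, ask a matching question, and set parameters from the response and hearing loss information), the following Python snippet uses illustrative names, thresholds, and questions that are not taken from the patent.

```python
# Minimal sketch (not the patented implementation) of the FIG. 1 flow.
# All names, thresholds, and questions below are illustrative assumptions.

HEARING_LOSS_INFO = {"low_freq_gain_db": 10, "high_freq_gain_db": 25}

QUESTION_LIST = {
    "construction_noise": "Remove noise of construction site and increase sound related to voice?",
    "quiet_evening": "Outside is quiet because it is evening. Increase sound volume?",
}

def determine_external_environment(sound_level_db):
    """Classify the surroundings from a simple sensor reading (illustrative threshold)."""
    return "construction_noise" if sound_level_db > 80 else "quiet_evening"

def set_parameters(environment, answered_yes, hearing_loss):
    """Derive parameter changes from the user's response and hearing loss information."""
    params = {"volume_db": 0, "noise_reduction": False}
    if answered_yes and environment == "construction_noise":
        params["noise_reduction"] = True
        params["volume_db"] = hearing_loss["high_freq_gain_db"]
    elif answered_yes:
        params["volume_db"] = hearing_loss["low_freq_gain_db"]
    return params

environment = determine_external_environment(sound_level_db=85)
print(QUESTION_LIST[environment])      # inquired through the audio sensor or a user terminal
response_is_yes = True                 # e.g., a tap on the touch sensor meaning "yes"
print(set_parameters(environment, response_is_yes, HEARING_LOSS_INFO))
```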
  • FIG. 2 illustrates a detailed structure of an example of a hearing device 201 , in accord with an illustrative example.
  • the hearing device 201 includes an external environment determination unit 202 , an inquiry unit 203 , and a parameter set unit 204 .
  • the external environment determination unit 202 determines the external environment of the user, including vehicle sounds, human voices, wind sounds, and footsteps generated at a user location.
  • the external environment determination unit 202 determines the external environment surrounding the user using the audio sensor and the temperature sensor. For instance, the external environment determination unit 202 determines the external environment by analyzing sounds generated around the user together with temperature information, such as morning and afternoon temperatures, wind, humidity, and sunlight.
  • the external environment determination unit 202 determines the external environment surrounding the user using the audio sensor and the optical sensor. For instance, the external environment determination unit 202 determines the external environment by analyzing sounds generated around the user together with light that varies between day and night and with differences in sunshine throughout a day.
  • the external environment determination unit 202 determines the external environment surrounding the user using the audio sensor, the temperature sensor, and the optical sensor.
  • the external environment determination unit 202 may determine the external environment surrounding the user using the audio sensor and the radio wave sensor. For instance, the external environment determination unit 202 determines the external environment by analyzing sounds generated around the user and radio waves detected from an electronic device disposed in the external environment surrounding the user. This case will be described in detail with reference to FIG. 7 .
  • the external environment determination unit 202 determines the external environment surrounding the user based on oscillation frequency, frequency, and radio waves detected by sensors included in the hearing device 201 . Furthermore, the external environment determination unit 202 may determine the external environment using light, temperature, and humidity, in addition to the sounds.
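  • The short sketch below, with assumed thresholds and tag names, illustrates how readings from an audio sensor, a temperature sensor, an optical sensor, and a radio wave sensor could be combined into environment labels in the spirit of the external environment determination unit 202; it is not the patented implementation.

```python
# Illustrative multi-sensor environment classification; thresholds and labels are assumptions.

def classify_environment(sound_db, temperature_c, light_lux, radio_dbm):
    tags = []
    if sound_db > 75:
        tags.append("noisy")
    if temperature_c > 30:
        tags.append("hot")
    if light_lux < 10:
        tags.append("night")
    if radio_dbm > -50:
        tags.append("electronic_device_nearby")
    return tags or ["quiet"]

print(classify_environment(sound_db=82, temperature_c=33, light_lux=5, radio_dbm=-45))
# ['noisy', 'hot', 'night', 'electronic_device_nearby']
```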
  • the questions in the question list may be deleted, modified, or added as circumstances require.
  • the questions may be added to the DB including the question list.
  • the questions may be deleted from the DB including the question list.
  • the DB may organize contents of the question list in an order such as a predefined relevant order, alphabetically or numerical order, or an order as defined by the user.
  • the inquiry unit 203 selects the questions, corresponding to or to be in accord with the external environment surrounding the user, from the predetermined question list.
  • the inquiry unit 203 selects questions corresponding to the ‘noise of construction site’ from the question list.
  • the question list may include questions such as ‘Reduce ambient sound?’, ‘Increase sound related to voice?’, ‘Remove noise of construction site and increase sound related to voice?’, and the like.
  • the inquiry unit 203 may select the questions corresponding to the user in consideration of a user gender, a user age, a time zone at which the external environment surrounding the user is determined, a season, weather, an external temperature, and humidity according to the external environment surrounding the user.
  • the inquiry unit 203 may select the questions corresponding to the external environment of the user, taking into consideration sound distortions due to influences caused by the time zone, the weather, the season, and the like that define the external environment surrounding the user.
  • the external environment determination unit 202 determines the external environment of the user as ‘hot weather and high humidity.’ According to the external environment determined, the external environment determination unit 202 also determines that a discomfort index of the user may be high. Accordingly, the inquiry unit 203 may select questions corresponding to the external environment from the question list. For example, the inquiry unit 203 may select a question, such as ‘Attenuate amplitude of hearing device?’
  • the inquiry unit 203 may set parameters corresponding to the external environment surrounding the user considering a sound perception level of the user as a function of the user gender, the user age, and the like. For example, in case of the user being seventy years old, the inquiry unit 203 may increase a sound volume of the hearing device, compared to a normal sound volume, when presenting the question to ensure that the user is able to hear and understand the question.
  • the inquiry unit 203 analyzes information including weather, such as 'fine weather', and the ambient sound of the external environment. Based on the analyzed information, the inquiry unit 203 may select questions for the user in consideration of the ambient sound and the weather, such as 'The weather is fine and therefore the ambient sound is relatively small. Increase the sound volume a bit?' and 'It is windy and wind noise is high. Control wind noise?'
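  • The snippet below is a hedged illustration of how an inquiry unit might pick a question from weather and environment tags and present it louder to an older user; the profile fields, thresholds, and gain offsets are assumptions, not values from this disclosure.

```python
# Illustrative question selection and presentation adaptation; all values are assumptions.

def select_question(environment_tags, weather):
    if "hot" in environment_tags and weather.get("humidity", 0) > 80:
        return "Attenuate amplitude of hearing device?"
    if weather.get("wind_mps", 0) > 8:
        return "It is windy and wind noise is high. Control wind noise?"
    return "The weather is fine and therefore the ambient sound is relatively small. Increase the sound volume a bit?"

def presentation_gain_db(user_age, base_gain_db=0):
    # Present the question louder than normal for older users so it is heard clearly.
    return base_gain_db + (6 if user_age >= 70 else 0)

question = select_question(["hot"], {"humidity": 85, "wind_mps": 3})
print(question, "| presentation gain:", presentation_gain_db(user_age=70), "dB")
```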
  • the inquiry unit 203 provides the selected questions to the user.
  • the inquiry unit 203 transmits the questions to the user through, for example, the audio sensor included in the hearing device 201 or a user terminal compatible with the hearing device 201 .
  • the audio sensor may be a dedicated audio sensor included in the hearing device to provide the questions.
  • alternatively, the audio sensor may be the same audio sensor included in the hearing device to provide an external sound.
  • the inquiry unit 203 provides the questions corresponding to the external environment surrounding the user using the audio sensor.
  • the user may feel comfortable with the hearing device 201 by receiving the questions through the audio sensor.
  • the hearing device 201 would enable the user to recognize a change of the external environment in real time and cope with the external environment through a state of the hearing device corresponding to the external environment.
  • the hearing device 201 receives a response to the questions provided through the touch sensor or the audio sensor included in the hearing device 201 .
  • the hearing device 201 pre-stores a particular frequency corresponding to the voice of the user. Therefore, the hearing device 201 recognizes the voice of the user by comparing a frequency of a sound received through the audio sensor with the pre-stored frequency. Additionally, the hearing device 201 may receive the response to the questions through a separate microphone connected to the hearing device 201 . It will be understood that when an element or layer is referred to as being “on” or “connected to” another element, it can be directly on, operatively connected to, or connected to the other element, or intervening elements may be present.
  • the inquiry unit 203 may provide the selected questions to the user through the user terminal connected to the hearing device 201 .
  • the user terminal may be a terminal that the user may freely carry and use.
  • the user terminal described herein may refer to a mobile device such as a cellular phone, a personal digital assistant (PDA), a digital camera, a portable game console, an MP3 player, a portable/personal multimedia player (PMP), a handheld e-book, or a portable laptop PC, or to devices such as a desktop PC, a high definition television (HDTV), an optical disc player, a set-top box, and the like capable of wireless communication or network communication consistent with that disclosed herein.
  • the user terminal may be a device configured to display the questions provided by the hearing device 201 . When provided with the questions from the hearing device 201 , the user terminal displays or audibly provides the questions to the user through a notification function.
  • the inquiry unit 203 transmits the same questions corresponding to the external environment of the user to the user terminal.
  • the user may recognize the change of the external environment by checking the user terminal and based on the questions. The user may respond to the questions through the user terminal.
  • the parameter set unit 204 sets parameters of the hearing device based on the response from the user to the questions and hearing loss information of the user. That is, the parameter set unit 204 sets the parameters of the hearing device, including volume control, equalizer control, volume control at a particular frequency band, noise control, reverberation control, acoustic feedback control, wind noise control, microphone power control, connection to an electronic device, and preset.
  • the parameter set unit 204 determines the external environment surrounding the user related to wind noise that may be generated when the user is near a window or between buildings, and, accordingly, controls or compensates for the wind noise of the hearing device corresponding to the external environment.
  • the parameter set unit 204 determines the external environment surrounding the user related to vibration generated when a sound leaking from a receiver of the hearing device returns to a microphone of the hearing device. Accordingly, the parameter set unit 204 controls or compensates for audio feedback of the hearing device 201 corresponding to the external environment.
  • the parameter set unit 204 minimizes user inconvenience of directly setting the parameters with respect to changes in the external environment, by setting the parameters of the hearing device 201 based on the response of the user.
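  • As an illustration only, the sketch below maps a confirmed response to the parameter kinds listed above (volume, equalizer, noise, reverberation, wind noise, acoustic feedback, microphone power); the particular adjustment values and environment labels are assumptions, not part of the disclosure.

```python
# Illustrative parameter setting keyed off the user's response; values are assumptions.

def set_parameters_for(environment, answered_yes, hearing_loss_gain_db):
    params = {
        "volume_db": hearing_loss_gain_db,   # baseline from hearing loss information
        "equalizer": "flat",
        "noise_reduction": False,
        "reverberation_control": False,
        "wind_noise_control": False,
        "acoustic_feedback_control": False,
        "external_microphone_on": True,
    }
    if not answered_yes:
        return params                        # keep the current configuration
    if environment == "wind noise":
        params["wind_noise_control"] = True
    elif environment == "echo sound":
        params["reverberation_control"] = True
    elif environment == "feedback":
        params["acoustic_feedback_control"] = True
    return params

print(set_parameters_for("wind noise", answered_yes=True, hearing_loss_gain_db=15))
```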
  • the external environment determination unit 202 , the inquiry unit 203 , and the parameter set unit 204 described herein may be implemented using hardware components.
  • the hardware components may include, for example, controllers, sensors, processors, generators, drivers, and other equivalent electronic components.
  • the hardware components may be implemented using one or more general-purpose or special purpose computers, such as, for example, a processor, a controller and an arithmetic logic unit, a digital signal processor, a microcomputer, a field programmable array, a programmable logic unit, a microprocessor or any other device capable of responding to and executing instructions in a defined manner.
  • the hardware components may run an operating system (OS) and one or more software applications that run on the OS.
  • the hardware components also may access, store, manipulate, process, and create data in response to execution of the software.
  • a processing device may include multiple processing elements and multiple types of processing elements.
  • a hardware component may include multiple processors or a processor and a controller.
  • different processing configurations are possible, such as parallel processors.
  • FIG. 3 illustrates an example process of selecting questions from a predetermined question list 303 according to an external environment surrounding a user being determined.
  • the question list 303 includes a plurality of questions corresponding to an external environment 301 .
  • the question list 303 may be a list of questions corresponding to the external environment of the user.
  • the question list 303 may include a plurality of questions such as ‘Reduce turbo volume?’, ‘Correct frequency band?’, and ‘Increase frequency?’
  • the question list 303 may include questions considering time, season, external temperature, and weather at a point in time at which the external environment of the user is determined.
  • the question list 303 may include questions considering time, season, external temperature, and weather, such as 'Cicadas are singing loudly because it is summer. Reduce cicada sound?'
  • the question list 303 may also include questions such as 'Outside is quiet because it is evening. Increase sound volume?'
  • an environment list 302 is a list of the external environments 301 that may occur in the surroundings of the user.
  • the environment list 302 may include external environments of the user, such as ‘turbo engine sound’, ‘quiet’, ‘wind noise’, and ‘echo sound’.
  • the environment list 302 is matched with the question list 303 , which includes questions that correspond to each of the external environments included in the environment list 302 . Contents of the environment list 302 may be added, deleted, or modified as circumstances require.
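  • A small in-memory sketch of the environment list 302 matched with the question list 303 is shown below; the entries and the add/delete/modify operations are illustrative, and a real device would persist such a DB.

```python
# Illustrative environment-to-question mapping; entry names are examples only.

question_db = {
    "turbo engine sound": ["Reduce turbo volume?", "Correct frequency band?"],
    "quiet":              ["Outside is quiet because it is evening. Increase sound volume?"],
    "wind noise":         ["Control wind noise?"],
    "echo sound":         ["Remove reverberations?"],
}

# Contents may be added, deleted, or modified as circumstances require.
question_db["cicadas"] = ["Cicadas are singing loudly because it is summer. Reduce cicada sound?"]
del question_db["turbo engine sound"]

for environment in sorted(question_db):   # e.g., alphabetical ordering of the list
    print(environment, "->", question_db[environment])
```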
  • FIG. 4 illustrates an example of operation of responding using a touch sensor included in a hearing device 401 , in accord with an embodiment.
  • the hearing device 401 may be attached or mounted to a body of a user.
  • the hearing device 401 processes and determines a change in an external environment surrounding the user from an external environment previously used to calibrate the hearing device 401 .
  • the hearing device 401 transmits to the user a question 402 corresponding to the external environment surrounding the user.
  • the hearing device 401 inquires or presents the question 402 to the user through a sound from the hearing device 401 using an audio sensor included in the hearing device 401 .
  • the hearing device 401 provides the user with the question 402 , such as ‘Correct frequency band?’, using the audio sensor included in the hearing device 401 . Because the question 402 corresponding to the external environment is provided through the audio sensor included in the hearing device 401 , the user may feel comfortable with the hearing device 401 .
  • the user recognizes the question 402 related to the external environment of the user from the hearing device 401 and responds to the hearing device 401 in relation to the question 402 .
  • the hearing device 401 receives the response to the question 402 from the user through the touch sensor included in the hearing device 401 .
  • the hearing device 401 receives the response to the question, for example, by pressing 404 the touch sensor included in the hearing device 401 for a predetermined time.
  • the hearing device 401 may receive the response to the question by tapping 404 the touch sensor for a predetermined time period.
  • the hearing device 401 may receive the response to the question by triggering the touch sensor wirelessly through a voice command from the user.
  • the voice command may be a pre-programmed or pre-set command or password that would trigger the touch sensor to be enabled to receive the response to the question.
  • the hearing device 401 may set parameters of the hearing device 401 and then provide a setting result to the user. For example, the hearing device 401 sets the parameters of the hearing device 401 related to a frequency band based on the response to the provided question 402 and hearing loss information of the user, and then provides the result such as ‘Correction is completed.’
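  • The following minimal sketch assumes that a press held for a predetermined time means 'yes' and a short tap means 'no'; the 1.0-second threshold is an assumption, not a value from the patent.

```python
# Illustrative interpretation of a touch-sensor response; the threshold is assumed.

PREDETERMINED_PRESS_SECONDS = 1.0

def interpret_touch(press_duration_seconds):
    return "yes" if press_duration_seconds >= PREDETERMINED_PRESS_SECONDS else "no"

print(interpret_touch(1.4))  # yes -> e.g., correct the frequency band, then report "Correction is completed."
print(interpret_touch(0.2))  # no  -> leave parameters unchanged
```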
  • FIG. 5 is a diagram illustrating an example of operation of responding using a voice of a user, in accord with an embodiment.
  • a hearing device 501 may be attached or mounted to a body of a user.
  • the hearing device 501 processes and determines a change in an external environment surrounding the user from an external environment previously used to calibrate the hearing device 501 .
  • the hearing device 501 transmits to the user a question 502 corresponding to an external environment of the user.
  • the hearing device 501 inquires or presents the question 502 to the user using an audio sensor included in the hearing device 501 .
  • the hearing device 501 may inquire or present the user with the question 502 such as ‘Remove reverberations?’ using the audio sensor included in the hearing device 501 .
  • Because the question 502 corresponding to the external environment is provided through the audio sensor included in the hearing device 501 , the user may feel comfortable with the hearing device 501 .
  • the user recognizes the question 502 transmitted from the hearing device 501 and responds to the hearing device 501 in relation to the question 502 using the audio sensor included in the hearing device 501 .
  • a particular frequency of the voice of the user may be pre-stored in the hearing device 501 .
  • the user may make a response such as ‘Yes, please remove.’ to the question 502 corresponding to the external environment.
  • the hearing device 501 receives the particular frequency from the user using the audio sensor.
  • the user responds to the hearing device 501 through various response methods including the foregoing example illustrated in FIG. 5 .
  • the hearing device 501 may receive the response to the question by triggering the touch sensor wirelessly through a voice command from the user.
  • the voice command may be a pre-programmed or pre-set command or password that would trigger the touch sensor to be enabled to receive the response to the question.
  • the hearing device 501 may set parameters of the hearing devices based on the response of the user and hearing loss information of the user. That is, the hearing device 501 may set the parameters such as volume control, equalizer control, control of a particular frequency band, noise control, reverberation control, wind noise control, acoustic feedback control, and the like, according to the response of the user.
  • the hearing device 501 may determine an external environment related to an uneven frequency, and adjust an equalizer of the hearing device 501 corresponding to the external environment.
  • the hearing device 501 determines an external environment related to sound reverberations generated from sound reflections off ambient objects, and controls or removes the reverberations corresponding to the external environment. That is, the hearing device 501 sets the parameters of the hearing device 501 related to oscillation frequency and frequency, corresponding to the external environment.
  • the hearing device 501 may set the parameters and then provide the setting result to the user.
  • the hearing device 501 may set parameters related to reverberations based on the response from the user and the hearing loss information of the user, and provide the result such as ‘Removed.’
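  • The snippet below is a hedged sketch of recognizing the user's voice by comparing the dominant frequency of the received sound with a pre-stored frequency, as described for FIG. 5; the stored frequency and tolerance are illustrative assumptions.

```python
# Illustrative voice-response handling via frequency comparison; values are assumptions.

STORED_USER_FREQUENCY_HZ = 180.0   # pre-stored particular frequency of the user's voice
TOLERANCE_HZ = 20.0

def is_user_voice(measured_frequency_hz):
    return abs(measured_frequency_hz - STORED_USER_FREQUENCY_HZ) <= TOLERANCE_HZ

def handle_response(measured_frequency_hz, said_yes):
    if not is_user_voice(measured_frequency_hz):
        return "ignored: not the registered user"
    return "Removed." if said_yes else "Reverberation control unchanged."

print(handle_response(175.0, said_yes=True))
```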
  • FIG. 6 is a diagram illustrating an example of operation of responding using a user terminal 601 , in accord with an embodiment.
  • the user terminal 601 connected to a hearing device receives a question corresponding to an external environment surrounding a user of the hearing device.
  • the user terminal 601 displays the question and the user may check the displayed question.
  • the user may check the question transmitted from the hearing device through a user terminal 602 . Also, the user may respond to the question using the user terminal 602 . For example, the user may check the transmitted question through the user terminal 602 and respond to the question from the hearing device by selecting or touching a response expressing an intention or response from the user. As another example, the user may check the question using the user terminal 601 and respond to the question using an audio sensor or a touch sensor included in the hearing device.
  • FIG. 7 is a diagram illustrating an example of operation of connecting with an electronic device 701 according to intensity of detected electronic waves, in accord with an embodiment.
  • a hearing device 702 detects radio waves from the electronic device 701 located in the external environment near the user.
  • the hearing device 702 may include a sensor to detect the radio waves.
  • the hearing device 702 detects various intensities of radio waves according to a distance between the user and the electronic device 701 . That is, when the hearing device 702 is adjacent to the electronic device 701 , the hearing device 702 detects radio waves with high intensity from the electronic device 701 .
  • the radio waves with high intensity refer to radio waves having a maximum value that the electronic device 701 may emit.
  • When the hearing device 702 and the electronic device 701 are not adjacent to each other, the hearing device 702 detects radio waves of low intensity from the electronic device 701 .
  • the radio waves of low intensity refer to radio waves having a minimum value that the electronic device 701 may emit.
  • the embodiments are not to be interpreted in a limiting manner because the intensity of the radio waves may be different according to a type of the electronic device 701 .
  • the hearing device 702 may select questions according to the intensity of the detected radio waves. For instance, when the hearing device 702 is adjacent to the electronic device 701 and detects radio waves of high intensity, the hearing device 702 may select questions to optimize the sound received from the electronic device 701 through the hearing device 702 . For example, the hearing device 702 may identify the electronic device 701 through feature information of the detected radio waves, and select questions corresponding to the electronic device 701 , such as 'TV is proximate. Connect with TV?', 'Connect with radio?', 'Connect with desktop?', and the like.
  • the hearing device 702 may provide the selected questions through the audio sensor included in the hearing device 702 or through a user terminal associated with the hearing device 702 .
  • the hearing device 702 may set parameters based on a response to the questions and hearing loss information of the user, and then connect with the electronic device 701 .
  • the hearing device 702 may include a separate sensor configured to receive a sound in connection with the electronic device 701 , besides the sensor configured to receive an external sound.
  • the hearing device 702 transmits the sound from the electronic device 701 connected with the hearing device 702 to the user using the separate sensor.
  • the hearing device 702 may interrupt the sound from a speaker of the electronic device 701 through the connection with the electronic device 701 .
  • the hearing device 702 enables the sound to be output from the electronic device 701 through the separate sensor. That is, the hearing device 702 may include an algorithm to control the sound of the electronic device 701 .
  • the hearing device 702 may be connected to a TV disposed in the external environment surrounding the user, and interrupt a TV sound or allow output of the TV sound through the separate sensor.
  • When connected with the electronic device 701 , the hearing device 702 controls power of the sensor configured to receive the external sound, such as the audio sensor or a microphone.
  • the hearing device 702 connected with a radio provides the user with questions for turning on and off the sensor receiving the external sound. Accordingly, the hearing device 702 may control power of the sensor receiving the external sound according to a response to the questions.
  • the questions related to power control of the sensor may be inquired in consideration of user convenience, for example, when the user is alone or when the user is with other people, but wants to concentrate on the electronic device.
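  • As an illustrative sketch of this FIG. 7 behavior, the code below selects a connection question from the intensity and type of detected radio waves and powers the external-sound microphone according to the response; the thresholds and device names are assumptions.

```python
# Illustrative radio-wave-based connection question and microphone power control.

HIGH_INTENSITY_DBM = -50

def connection_question(device_type, radio_dbm):
    if radio_dbm < HIGH_INTENSITY_DBM:
        return None                       # device not close enough to suggest a connection
    return {"tv": "TV is proximate. Connect with TV?",
            "radio": "Connect with radio?",
            "desktop": "Connect with desktop?"}.get(device_type)

def after_connection(user_wants_external_sound):
    # Control power of the sensor receiving the external sound according to the response.
    return {"external_microphone_on": user_wants_external_sound}

print(connection_question("tv", radio_dbm=-42))
print(after_connection(user_wants_external_sound=False))
```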
  • the hearing device 702 may be connected with another hearing device 702 disposed around the user.
  • the hearing device 702 may be positioned in one ear of the user and the other hearing device 702 may be positioned in another ear of the user.
  • the other hearing device 702 may be a hand held device.
  • the different hearing devices 702 may be correlated through a social network service (SNS).
  • unique frequencies for identification may be pre-stored in the different hearing devices 702 to be connected with each other.
  • the hearing device 702 may be connected with another hearing device 702 by correcting the unique frequency. Also, the hearing device 702 may be connected with another hearing device 702 through various connection methods such as streaming.
  • the hearing device 702 may receive the pre-stored unique frequency at a location of the user.
  • the hearing device 702 may check the pre-stored unique frequency, select questions about whether to connect with another hearing device 702 that corresponds to the unique frequency, and transmit the questions to the user.
  • the hearing device 702 may be connected to another hearing device 702 by correcting a frequency band corresponding to the frequency. Accordingly, the hearing device 702 may receive a sound from another hearing device 702 .
  • the function of connecting with another hearing device 702 may be useful when the user becomes separated from a group in a crowded place, for example, during a party or a trip.
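  • The following sketch, under assumed stored frequencies and tolerance, illustrates matching a received identification frequency to another hearing device and generating a connection question; the final streaming step is only noted in a comment.

```python
# Illustrative peer matching via pre-stored unique identification frequencies; all values assumed.

KNOWN_DEVICES_HZ = {"family_member_aid": 10_200.0, "friend_aid": 10_450.0}
TOLERANCE_HZ = 50.0

def find_peer(received_frequency_hz):
    for name, stored_hz in KNOWN_DEVICES_HZ.items():
        if abs(received_frequency_hz - stored_hz) <= TOLERANCE_HZ:
            return name
    return None

peer = find_peer(10_230.0)
if peer is not None:
    print(f"Another hearing device ({peer}) is nearby. Connect with it?")
    # on a "yes" response, tune to the matching frequency band and start streaming audio
```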
  • FIG. 8 is a flowchart illustrating an example of a process performed by a hearing device, in accord with an embodiment.
  • the process of the hearing device determines an external environment surrounding a user using an audio sensor.
  • the process of the hearing device may use an acceleration sensor, a temperature sensor, an optical sensor, and the like, besides the audio sensor, to determine the external environment of the user.
  • the process of the hearing device may search for questions corresponding to the external environment surrounding the user. That is, the process of the hearing device searches for the questions corresponding to the external environment surrounding the user from a DB including a predetermined question list.
  • the process of the hearing device finds questions corresponding to the external environment from the question list.
  • the questions may be found based on, as a function of, or considering factors influencing a sound of the user, for example, a user gender, a user age, a point of time at which the external environment of the user is determined, and a temperature. For example, when corresponding questions are absent in the question list, the process of the hearing device may add and select the corresponding questions.
  • the process of the hearing device inquires the selected questions to the user.
  • the process of the hearing device inquires the questions using the audio sensor included in the hearing device.
  • the hearing device may inquire the selected questions using a user terminal compatible with the hearing device.
  • the process of the hearing device receives a response to the questions from the user.
  • the process of the hearing device uses the audio sensor or a touch sensor included in the hearing device to receive the response from the user.
  • the hearing device may receive the response from the user through a user terminal connected with the hearing device.
  • the process of the hearing device determines the response received from the user. For example, the process of the hearing device receives the response expressing the user intention with respect to the questions.
  • the process of the hearing device may set parameters related to the external environment of the hearing device, corresponding to the response received from the user.
  • the process of the hearing device sets the parameters related to the external environment based on the response from the user to minimize user inconvenience of directly setting the parameters with respect to the external environment being changed.
  • the process of the hearing device sets the parameters related to the external environment based on the response from the user and hearing loss information of the user, and then provides the user with the setting result.
  • With respect to FIG. 8 , it is to be understood that in the embodiment of the present invention, the operations in FIG. 8 are performed in the sequence and manner as shown, although the order of some steps may be changed without departing from the spirit and scope of the present invention.
  • a computer program embodied on a non-transitory computer-readable medium may also be provided, encoding instructions to perform at least the process described in FIG. 8 .
  • Program instructions to perform a method described in FIG. 8 may be recorded, stored, or fixed in one or more non-transitory computer-readable storage media.
  • the program instructions may be implemented by a computer.
  • the computer may cause a processor to execute the program instructions.
  • the media may include, alone or in combination with the program instructions, data files, data structures, and the like.
  • Examples of computer-readable media include magnetic media, such as hard disks, floppy disks, and magnetic tape; optical media such as CD ROM disks and DVDs; magneto-optical media, such as optical disks; and hardware devices that are specially configured to store and perform program instructions, such as read-only memory (ROM), random access memory (RAM), flash memory, and the like.
  • Examples of program instructions include machine code, such as produced by a compiler, and files containing higher level code that may be executed by the computer using an interpreter.
  • the program instructions, that is, software, may be distributed over network-coupled computer systems so that the software is stored and executed in a distributed fashion.
  • the software and data may be stored by one or more computer readable recording mediums.
  • functional programs, codes, and code segments for accomplishing the example embodiments disclosed herein may be easily construed by programmers skilled in the art to which the embodiments pertain based on and using the flow diagrams and block diagrams of the figures and their corresponding descriptions as provided herein.

Landscapes

  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Engineering & Computer Science (AREA)
  • Neurosurgery (AREA)
  • Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Pulmonology (AREA)
  • Transplantation (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Vascular Medicine (AREA)
  • Cardiology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)
  • Circuit For Audible Band Transducer (AREA)
  • Telephone Function (AREA)
US14/103,218 2012-12-13 2013-12-11 Hearing device considering external environment of user and control method of hearing device Active 2034-07-18 US9794699B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020120145615A KR102051545B1 (ko) 2012-12-13 2012-12-13 사용자의 외부 환경을 고려한 청각 장치 및 방법
KR10-2012-0145615 2012-12-13

Publications (2)

Publication Number Publication Date
US20140169574A1 US20140169574A1 (en) 2014-06-19
US9794699B2 true US9794699B2 (en) 2017-10-17

Family

ID=50930907

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/103,218 Active 2034-07-18 US9794699B2 (en) 2012-12-13 2013-12-11 Hearing device considering external environment of user and control method of hearing device

Country Status (2)

Country Link
US (1) US9794699B2 (ko)
KR (1) KR102051545B1 (ko)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170127201A1 (en) * 2014-06-16 2017-05-04 Sonova Ag Method for evaluating an individual hearing benefit of a hearing device feature and for fitting a hearing device
US11729540B2 (en) 2021-12-16 2023-08-15 Starkey Laboratories, Inc. Water immune user-actuatable touch control for an ear-worn electronic device
US11974088B2 (en) 2020-06-25 2024-04-30 Starkey Laboratories, Inc. User-actuatable touch control for an ear-worn electronic device
US12014114B2 (en) 2021-06-17 2024-06-18 Samsung Electronics Co., Ltd. Electronic device for responding to user reaction and outside sound and operating method thereof

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3120578B2 (en) * 2014-03-19 2022-08-17 Bose Corporation Crowd sourced recommendations for hearing assistance devices
KR102411263B1 (ko) 2017-10-11 2022-06-22 삼성전자 주식회사 전자장치 및 그 제어방법
WO2021144033A1 (de) * 2020-01-17 2021-07-22 Sivantos Pte. Ltd. Verfahren zum betrieb eines hörhilfegerätes
JP2023541723A (ja) * 2020-09-09 2023-10-03 オリーブ ユニオン インコーポレイテッド 自然語または非自然語を区分するスマートヒアリングデバイス、人工知能ヒアリングシステム、およびその方法
KR20220168833A (ko) * 2021-06-17 2022-12-26 삼성전자주식회사 외부 소리 및 사용자 반응에 대응하는 전자 장치 및 이의 동작 방법
US11218817B1 (en) 2021-08-01 2022-01-04 Audiocare Technologies Ltd. System and method for personalized hearing aid adjustment
US11991502B2 (en) 2021-08-01 2024-05-21 Tuned Ltd. System and method for personalized hearing aid adjustment
US11425516B1 (en) 2021-12-06 2022-08-23 Audiocare Technologies Ltd. System and method for personalized fitting of hearing aids
WO2024052781A1 (en) * 2022-09-08 2024-03-14 Cochlear Limited Smooth switching between medical device settings
KR20240047064A (ko) * 2022-10-04 2024-04-12 올리브유니온(주) 이어폰 제어 방법, 컴퓨터 프로그램 및 컴퓨터 장치

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0918997A (ja) 1995-06-30 1997-01-17 Hitachi Ltd 音声処理装置
KR20010008008A (ko) 2000-11-02 2001-02-05 심윤주 보청기 자동 피팅방법
US6320969B1 (en) 1989-09-29 2001-11-20 Etymotic Research, Inc. Hearing aid with audible alarm
US20020044669A1 (en) * 2000-09-29 2002-04-18 Wolfram Meyer Method of operating a hearing aid and hearing-aid arrangement or hearing aid
KR20040019147A (ko) 2002-08-26 2004-03-05 세기스타 주식회사 보청기의 피팅방법
US20040052391A1 (en) * 2002-09-12 2004-03-18 Micro Ear Technology, Inc. System and method for selectively coupling hearing aids to electromagnetic signals
US20050141737A1 (en) * 2002-07-12 2005-06-30 Widex A/S Hearing aid and a method for enhancing speech intelligibility
US6978155B2 (en) 2000-02-18 2005-12-20 Phonak Ag Fitting-setup for hearing device
US20070230726A1 (en) * 2006-03-31 2007-10-04 Siemens Audiologische Technik Gmbh Hearing aid with adaptive start values for apparatus
JP2008042787A (ja) 2006-08-10 2008-02-21 Baisera:Kk 聴力適合化装置、聴力適合化方法
JP2010034949A (ja) 2008-07-30 2010-02-12 Panasonic Corp 補聴器ユニット
KR20100042370A (ko) 2008-10-16 2010-04-26 인하대학교 산학협력단 주파수 밴드 및 채널의 가변이 가능한 디지털 보청기 피팅 시스템
JP2010239603A (ja) 2009-03-09 2010-10-21 Panasonic Corp 補聴器
US20110051963A1 (en) * 2009-08-28 2011-03-03 Siemens Medical Instruments Pte. Ltd. Method for fine-tuning a hearing aid and hearing aid
KR20110079846A (ko) 2008-12-12 2011-07-08 비덱스 에이/에스 보청기를 미세 튜닝하는 방법
KR20110097530A (ko) 2010-02-25 2011-08-31 휴리아 주식회사 보청기 겸용 블루투스 헤드셋 및 그 제어방법
US20130022223A1 (en) * 2011-01-25 2013-01-24 The Board Of Regents Of The University Of Texas System Automated method of classifying and suppressing noise in hearing devices

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6320969B1 (en) 1989-09-29 2001-11-20 Etymotic Research, Inc. Hearing aid with audible alarm
JPH0918997A (ja) 1995-06-30 1997-01-17 Hitachi Ltd 音声処理装置
US6978155B2 (en) 2000-02-18 2005-12-20 Phonak Ag Fitting-setup for hearing device
US20020044669A1 (en) * 2000-09-29 2002-04-18 Wolfram Meyer Method of operating a hearing aid and hearing-aid arrangement or hearing aid
KR20010008008A (ko) 2000-11-02 2001-02-05 심윤주 보청기 자동 피팅방법
US20050141737A1 (en) * 2002-07-12 2005-06-30 Widex A/S Hearing aid and a method for enhancing speech intelligibility
KR20040019147A (ko) 2002-08-26 2004-03-05 세기스타 주식회사 보청기의 피팅방법
US20040052391A1 (en) * 2002-09-12 2004-03-18 Micro Ear Technology, Inc. System and method for selectively coupling hearing aids to electromagnetic signals
US20070230726A1 (en) * 2006-03-31 2007-10-04 Siemens Audiologische Technik Gmbh Hearing aid with adaptive start values for apparatus
JP2008042787A (ja) 2006-08-10 2008-02-21 Baisera:Kk 聴力適合化装置、聴力適合化方法
JP2010034949A (ja) 2008-07-30 2010-02-12 Panasonic Corp 補聴器ユニット
KR20100042370A (ko) 2008-10-16 2010-04-26 인하대학교 산학협력단 주파수 밴드 및 채널의 가변이 가능한 디지털 보청기 피팅 시스템
KR20110079846A (ko) 2008-12-12 2011-07-08 비덱스 에이/에스 보청기를 미세 튜닝하는 방법
US20110235835A1 (en) * 2008-12-12 2011-09-29 Widex A/S Method for fine tuning a hearing aid
JP2010239603A (ja) 2009-03-09 2010-10-21 Panasonic Corp 補聴器
US20110051963A1 (en) * 2009-08-28 2011-03-03 Siemens Medical Instruments Pte. Ltd. Method for fine-tuning a hearing aid and hearing aid
KR20110097530A (ko) 2010-02-25 2011-08-31 휴리아 주식회사 보청기 겸용 블루투스 헤드셋 및 그 제어방법
US20130022223A1 (en) * 2011-01-25 2013-01-24 The Board Of Regents Of The University Of Texas System Automated method of classifying and suppressing noise in hearing devices

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170127201A1 (en) * 2014-06-16 2017-05-04 Sonova Ag Method for evaluating an individual hearing benefit of a hearing device feature and for fitting a hearing device
US10231069B2 (en) * 2014-06-16 2019-03-12 Sonova Ag Method for evaluating an individual hearing benefit of a hearing device feature and for fitting a hearing device
US11974088B2 (en) 2020-06-25 2024-04-30 Starkey Laboratories, Inc. User-actuatable touch control for an ear-worn electronic device
US12014114B2 (en) 2021-06-17 2024-06-18 Samsung Electronics Co., Ltd. Electronic device for responding to user reaction and outside sound and operating method thereof
US11729540B2 (en) 2021-12-16 2023-08-15 Starkey Laboratories, Inc. Water immune user-actuatable touch control for an ear-worn electronic device

Also Published As

Publication number Publication date
KR102051545B1 (ko) 2019-12-04
KR20140084367A (ko) 2014-07-07
US20140169574A1 (en) 2014-06-19

Similar Documents

Publication Publication Date Title
US9794699B2 (en) Hearing device considering external environment of user and control method of hearing device
US11705878B2 (en) Intelligent audio output devices
KR102606789B1 (ko) 복수의 음성 인식 장치들을 제어하는 방법 및 그 방법을 지원하는 전자 장치
CN109684249B (zh) 用于使用电子附件连接的连接属性促进定位附件的主设备
EP3507797B1 (en) Accessing multiple virtual personal assistants (vpa) from a single device
US10410651B2 (en) De-reverberation control method and device of sound producing equipment
US9652532B2 (en) Methods for operating audio speaker systems
CN108540900B (zh) 音量调节方法及相关产品
KR102192361B1 (ko) 머리 움직임을 이용한 사용자 인터페이스 방법 및 장치
JP2021520141A (ja) マイクロフォンアレイ内のインテリジェントビームステアリング
CN108710486B (zh) 音频播放方法、装置、耳机及计算机可读存储介质
US20140370817A1 (en) Determining proximity for devices interacting with media devices
CN108668009B (zh) 输入操作控制方法、装置、终端、耳机及可读存储介质
KR20170076181A (ko) 전자 장치 및 전자 장치의 동작 제어 방법
KR20200015267A (ko) 음성 인식을 수행할 전자 장치를 결정하는 전자 장치 및 전자 장치의 동작 방법
JP2018509820A (ja) パーソナライズされたヘッドホン
US20200389740A1 (en) Contextual guidance for hearing aid
CN107911777B (zh) 一种耳返功能的处理方法、装置及移动终端
US11069332B2 (en) Interference generation
CN106126170B (zh) 一种终端的音效设置方法及终端
CN108769364B (zh) 通话控制方法、装置、移动终端及计算机可读介质
EP2887698B1 (en) Hearing aid for playing audible advertisement
CN113099347A (zh) 耳机控制方法、装置、无线耳机及存储介质
WO2019061292A1 (zh) 一种终端降噪方法及终端
US10506327B2 (en) Ambient environmental sound field manipulation based on user defined voice and audio recognition pattern analysis system and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHOI, JONG MIN;KU, YUN SEO;KIM, DONG WOOK;AND OTHERS;SIGNING DATES FROM 20131210 TO 20131211;REEL/FRAME:031760/0432

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4