EP3225047B1 - Method and apparatus for detecting that a device is immersed in a liquid - Google Patents

Method and apparatus for detecting that a device is immersed in a liquid Download PDF

Info

Publication number
EP3225047B1
EP3225047B1 (application EP15864102A / EP15864102.7A)
Authority
EP
European Patent Office
Prior art keywords
electronic device
signal
module
present disclosure
liquid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
EP15864102.7A
Other languages
German (de)
English (en)
French (fr)
Other versions
EP3225047A1 (en)
EP3225047A4 (en)
Inventor
Yong-Suk Lee
Tae-Ho Kang
Sung-Woo Choi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of EP3225047A1
Publication of EP3225047A4
Application granted
Publication of EP3225047B1
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W88/02 Terminal devices
    • H04W88/021 Terminal devices adapted for Wireless Local Loop operation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01H MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H15/00 Measuring mechanical or acoustic impedance
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00 Burglar, theft or intruder alarms
    • G08B13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678 User interface
    • G08B13/19682 Graphic User Interface [GUI] presenting system data to the user, e.g. information on a screen helping a user interacting with an alarm system
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B11/00 Transmission systems employing sonic, ultrasonic or infrasonic waves
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/02 Constructional features of telephone sets
    • H04M1/18 Telephone sets specially adapted for use in ships, mines, or other places exposed to adverse environment
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R29/00 Monitoring arrangements; Testing arrangements
    • H04R29/001 Monitoring arrangements; Testing arrangements for loudspeakers
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W24/00 Supervisory, monitoring or testing arrangements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01F MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F23/00 Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
    • G01F23/22 Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water
    • G01F23/28 Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm by measuring physical variables, other than linear dimensions, pressure or weight, dependent on the level to be measured, e.g. by difference of heat transfer of steam or water by measuring the variations of parameters of electromagnetic or acoustic waves applied directly to the liquid or fluent solid material
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01F MEASURING VOLUME, VOLUME FLOW, MASS FLOW OR LIQUID LEVEL; METERING BY VOLUME
    • G01F23/00 Indicating or measuring liquid level or level of fluent solid material, e.g. indicating in terms of volume or indicating by means of an alarm
    • G01F23/80 Arrangements for signal processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/44 Special adaptations for subaqueous use, e.g. for hydrophone

Definitions

  • the present disclosure relates to electronic devices, and more particularly to a method and apparatus for detecting that a device is immersed in a liquid.
  • Such electronic devices come with various functions that attract users' attention during leisure activities (e.g., mountain climbing, trekking, or swimming) as well as in their daily routines.
  • electronic devices are equipped with various functions that may be useful for users' outdoor activities, including dust-proof or waterproof capabilities that enable the devices to be used underwater.
  • JP 2012 119975 discloses a mobile device which detects whether it is immersed in water by monitoring the change of level of a received RF signal, and which applies a noise reduction algorithm which is adapted to whether the device is immersed in water or not.
  • US 2001/050613 discloses a personal alarm device which determines whether it is immersed in water by emitting light and detecting said light when reflected from a prism of the personal alarm device.
  • the user of an electronic device may face an unexpected situation when carrying and using the electronic device.
  • the electronic device may be dropped into the water while in use.
  • conventional non-waterproof electronic devices do not have any function or operation that may alert the user to the immersion.
  • Likewise, conventional waterproof electronic devices may lack any functions or operations that enable the electronic devices to be controlled underwater.
  • an electronic device and method for operating the same may alert the user to the electronic device dropping underwater.
  • an electronic device and method for operating the same may determine whether the electronic device is underwater.
  • an electronic device and method for operating the same may control the electronic device underwater.
  • the terms “have,” “may have,” “include,” or “may include” a feature (e.g., a number, function, operation, or a component such as a part) indicate the existence of the feature and do not exclude the existence of other features.
  • “A or B” may include all possible combinations of A and B.
  • “A or B,” “at least one of A and B,” “at least one of A or B” may indicate all of (1) including at least one A, (2) including at least one B, or (3) including at least one A and at least one B.
  • the terms “first” and “second” may modify various components regardless of importance and do not limit the components. These terms are only used to distinguish one component from another.
  • a first user device and a second user device may indicate different user devices from each other regardless of the order or importance of the devices.
  • a first component may be denoted a second component, and vice versa without departing from the scope of the present disclosure.
  • the terms “configured (or set) to” may be interchangeably used with the terms “suitable for,” “having the capacity to,” “designed to,” “adapted to,” “made to,” or “capable of” depending on circumstances.
  • the term “configured (or set) to” does not essentially mean “specifically designed in hardware.” Rather, the term “configured to” may mean that a device can perform an operation together with another device or parts.
  • the term “processor configured (or set) to perform A, B, and C” may mean a generic-purpose processor (e.g., a CPU or application processor) that may perform the operations by executing one or more software programs stored in a memory device or a dedicated processor (e.g., an embedded processor) for performing the operations.
  • examples of the electronic device may include at least one of a smartphone, a tablet personal computer (PC), a mobile phone, a video phone, an e-book reader, a desktop PC, a laptop computer, a netbook computer, a workstation, a PDA (personal digital assistant), a portable multimedia player (PMP), an MP3 player, a mobile medical device, a camera, or a wearable device (e.g., smart glasses, a head-mounted device (HMD), electronic clothes, an electronic bracelet, an electronic necklace, an electronic appcessory, an electronic tattoo, a smart mirror, or a smart watch).
  • the electronic device may be a smart home appliance.
  • the smart home appliance may include at least one of a television, a digital video disk (DVD) player, an audio player, a refrigerator, an air conditioner, a cleaner, an oven, a microwave oven, a washer, a drier, an air cleaner, a set-top box, a home automation control panel, a security control panel, a TV box (e.g., Samsung HomeSyncTM , Apple TVTM , or Google TVTM) , a gaming console (XboxTM, PlayStationTM), an electronic dictionary, an electronic key, a camcorder, or an electronic picture frame.
  • examples of the electronic device may include at least one of various medical devices (e.g., diverse portable medical measuring devices (a blood sugar measuring device, a heartbeat measuring device, or a body temperature measuring device), a magnetic resonance angiography (MRA) device, a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, an imaging device, or an ultrasonic device), a navigation device, a global positioning system (GPS) receiver, an event data recorder (EDR), a flight data recorder (FDR), an automotive infotainment device, a sailing electronic device (e.g., a sailing navigation device or a gyro compass), avionics, security devices, vehicular head units, industrial or home robots, automatic teller machines (ATMs), point of sales (POS) devices, or Internet of Things devices (e.g., a bulb, various sensors, an electric or gas meter, a sprinkler, a fire alarm, a thermostat, etc.).
  • the electronic device may be part of furniture or a building/structure, an electronic board, an electronic signature receiving device, a projector, or various measurement devices (e.g., devices for measuring water, electricity, gas, or electromagnetic waves).
  • the electronic device may be one or a combination of the above-listed devices.
  • the electronic device may be a flexible electronic device.
  • the electronic device disclosed herein is not limited to the above-listed devices, and may include new electronic devices depending on the development of technology.
  • the term "user” may denote a human or another device (e.g., an electronic device that possesses artificial intelligence) using the electronic device.
  • FIG. 1 is a diagram of an example of a network environment, according to an embodiment of the present disclosure.
  • an electronic device 101 may be included in a network environment 100.
  • the electronic device 101 may include a bus 110, a processor 120, a memory 130, an input/output interface 150, a display 160, and a communication interface 170.
  • the electronic device 101 may exclude at least one of the components or may add another component.
  • the bus 110 may include a circuit for connecting the components 110 to 170 with one another and transferring communications (e.g., control messages and/or data) between the components.
  • the processor 120 may include one or more processors. Any of the processors may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP), a general-purpose processor (e.g., ARM-based processors), a Digital Signal Processor (DSP), a Programmable Logic Device (PLD), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), and/or any other suitable type of processing circuitry.
  • the processor 120 may perform control on at least one of the other components of the electronic device 101, and/or perform an operation or data processing relating to communication.
  • the memory 130 may include any suitable type of volatile or non-volatile memory, such as Random Access Memory (RAM), Read-Only Memory (ROM), Network Accessible Storage (NAS), cloud storage, a Solid State Drive (SSD), etc.
  • the memory 130 may store commands or data related to at least one other component of the electronic device 101.
  • the memory 130 may store software and/or a program 140.
  • the program 140 may include, e.g., a kernel 141, middleware 143, an application programming interface (API) 145, and/or an application program (or "application”) 147. At least a portion of the kernel 141, middleware 143, or API 145 may be denoted an operating system (OS).
  • the kernel 141 may control or manage system resources (e.g., the bus 110, processor 120, or a memory 130) used to perform operations or functions implemented in other programs (e.g., the middleware 143, API 145, or application program 147).
  • the kernel 141 may provide an interface that allows the middleware 143, the API 145, or the application 147 to access the individual components of the electronic device 101 to control or manage the system resources.
  • the middleware 143 may function as a relay to allow the API 145 or the application 147 to communicate data with the kernel 141, for example.
  • a plurality of applications 147 may be provided.
  • the middleware 143 may control work requests received from the applications 147, e.g., by allocating the priority of using the system resources of the electronic device 101 (e.g., the bus 110, the processor 120, or the memory 130) to at least one of the plurality of applications 147.
  • the API 145 is an interface allowing the application 147 to control functions provided by the kernel 141 or the middleware 143.
  • the API 145 may include at least one interface or function (e.g., a command) for filing control, window control, image processing or text control.
  • the input/output interface 150 may serve as an interface that may, e.g., transfer commands or data input from a user or other external devices to other component(s) of the electronic device 101. Further, the input/output interface 150 may output commands or data received from other component(s) of the electronic device 101 to the user or the other external device.
  • the display 160 may include, e.g., a liquid crystal display (LCD), a light-emitting diode (LED) display, an organic light-emitting diode (OLED) display, a microelectromechanical systems (MEMS) display, or an electronic paper display.
  • the display 160 may display, e.g., various contents (e.g., text, images, videos, icons, or symbols) to the user.
  • the display 160 may include a touchscreen and may receive, e.g., a touch, gesture, proximity or hovering input using an electronic pen or a body portion of the user.
  • the communication interface 170 may set up communication between the electronic device 101 and an external electronic device (e.g., a first electronic device 102, a second electronic device 104, or a server 106).
  • the communication interface 170 may be connected to the network 162 through wireless or wired communication to communicate with the external electronic device.
  • the wireless communication may use at least one of, e.g., long-term evolution (LTE), long-term evolution- advanced (LTE-A), code division multiple access (CDMA), wideband code division multiple access (WCDMA), universal mobile telecommunication system (UMTS), wireless broadband (WiBro), or global system for mobile communication (GSM), as a cellular communication protocol.
  • the wired connection may include at least one of universal serial bus (USB), high definition multimedia interface (HDMI), recommended standard 232 (RS-232), or plain old telephone service (POTS).
  • the network 162 may include at least one of a telecommunication network, e.g., a computer network (e.g., a local area network (LAN) or a wide area network (WAN)), or the Internet.
  • the first and second external electronic devices 102 and 104 each may be a device of the same or different type than the electronic device 101.
  • the server 106 may include a group of one or more servers.
  • all or some of operations executed on the electronic device 101 may be executed on another or multiple other electronic devices (e.g., the electronic devices 102 and 104 or server 106).
  • when the electronic device 101 should perform some function or service automatically or upon request, the electronic device 101, instead of executing the function or service on its own or additionally, may request another device (e.g., the electronic devices 102 and 104 or the server 106) to perform at least some functions associated therewith.
  • the other electronic device may execute the requested functions or additional functions and transfer a result of the execution to the electronic device 101.
  • the electronic device 101 may provide a requested function or service by processing the received result as it is or additionally.
  • a cloud computing, distributed computing, or client-server computing technique may be used, for example.
  • FIG. 2 is a diagram of an example of an electronic device, according to an embodiment of the present disclosure.
  • the electronic device 20 may include a control module 200, a wireless communication module 210, a sound wave output module 220, a sound wave receiving module 230, an input module 240, a display module 250, a storage module 260, a sensor module 270, and a power module 280.
  • the control module 200 may include one or more processors. Any of the processors may include one or more of a central processing unit (CPU), an application processor (AP), or a communication processor (CP), a general-purpose processor (e.g., ARM-based processors), a Digital Signal Processor (DSP), a Programmable Logic Device (PLD), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), and/or any other suitable type of processing circuitry.
  • the control module 200 may execute operation or data processing regarding the control and/or communication of, e.g., at least one other components of the electronic device 20 (e.g., the wireless communication module 210, the sound wave output module 220, the sound wave receiving module 230, the input module 240, the display module 250, the storage module 260, the sensor module 270, and the power module 280).
  • the function(s) or operation(s) performed by the control module 200 may be executed by, e.g., the processor 120.
  • the wireless communication module 210 may receive radio frequency (RF) signals.
  • the RF signals may include at least one of, e.g., a wireless fidelity (Wi-Fi) signal, a Bluetooth (BT) signal, a near field communication (NFC) signal, a global positioning system (GPS) signal, an FM/AM radio signal, or a signal of cellular communication (e.g., long-term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications service (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM)).
  • the control module 200 may monitor the RF signals received by the wireless communication module 210. According to aspects of the present disclosure, such RF signal may be referred to as a first signal.
  • the sound wave output module 220 may perform control to output a sound wave having a predetermined frequency (e.g., an inaudible frequency of 10 kHz) depending on variations in the amplitude and/or frequency of the first signal.
  • the 10 kHz inaudible frequency is provided only as an example, and the sound wave may have various other frequencies.
  • the variations in the amplitude and/or frequency may come from, e.g., differences in the mediums through which the first signal is transmitted.
  • the sound wave output module 220 may include any suitable type of acoustic transducer, such as a speaker for example. According to aspects of the present disclosure, the sound wave may be referred to as a second signal.
  • the sound wave receiving module 230 may receive a sound wave output from the sound wave output module 220.
  • the sound wave receiving module 230 may include any suitable type of acoustic transducer, such as a microphone for example.
  • the input module 240 may include, e.g., a touch panel, a pen sensor, a key, or an ultrasonic input device.
  • the touch panel may recognize touch inputs in at least one of capacitive, resistive, infrared, or ultrasonic methods. With the capacitive method, physical contact or proximity detection may be possible.
  • the touch panel may further include a tactile layer. In this regard, the touch panel may provide the user with a tactile response.
  • the pen sensor may be implemented in a well-known fashion.
  • the key may include e.g., a physical button, optical key or keypad.
  • the ultrasonic input device may use an input tool that generates an ultrasonic signal and enable the electronic device 20 to identify data by sensing the ultrasonic signal using a microphone (e.g., a microphone 1788).
  • the display module 250 may display various types of information (e.g., multimedia data or text data) to the user.
  • the display module 250 may display a list of applications that may be run when the electronic device is underwater.
  • the function(s) or operation(s) performed by the display module 250 may be executed by, e.g., the display 160.
  • the storage module 260 may store commands or data received from the control module 200 or other components (e.g., the wireless communication module 210, the sound wave output module 220, the sound wave receiving module 230, the input module 240, the display module 250, the storage module 260, the sensor module 270, and the power module 280) or commands or data generated by the processor 120 or other components.
  • the storage module 260 may retain data regarding a reference signal to determine a variation in the first signal.
  • the sensor module 270 may measure a physical quantity or detect an operational stage of the electronic device 20, and the sensor module 270 may convert the measured or detected information into an electrical signal.
  • the sensor module 270 may include at least one of, e.g., a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, such as an RGB (Red, Green, and Blue) sensor, a biosensor, a temperature/ humidity sensor, an illumination sensor, or an Ultra Violet (UV) sensor.
  • the sensor module 270 may sense various events inputted to the electronic device 20 to control the electronic device when the electronic device is underwater.
  • the sensor module 270 may further include a control circuit for controlling one or more of the sensors included in the sensor module.
  • the power module 280 may be controlled to supply power to various components of the electronic device 20 (e.g., the control module 200, the wireless communication module 210, the sound wave output module 220, the sound wave receiving module 230, the input module 240, the display module 250, the storage module 260, and the sensor module 270).
  • the power module 280 may include, e.g., a battery, but is not limited thereto, and the power module 280 may include various modules that may supply power to various components of the electronic device 20.
  • FIG. 3 is a diagram illustrating the effect of water on signals transmitted and received by an electronic device when the electronic device is positioned underwater, according to an embodiment of the present disclosure.
  • the electronic device 300 may be positioned underwater 330. This may happen, e.g., when the user of the electronic device 300 accidentally drops the electronic device 300 in the water 330 or when the user unintentionally carries the electronic device 300 into the water.
  • the RF signals 310 and 320 may experience attenuation due to differences in the media across which the RF signals 310 and 320 travel.
  • the RF signal 310 traveling in the air may have a greater strength than the RF signal 320 traveling underwater.
  • the strength may be assessed using any suitable type of measure, such as dBm, which expresses a power level given in milliwatts (mW) on the decibel scale (a minimal numeric sketch of this conversion follows below).
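  • As a minimal numeric sketch of the dBm scale mentioned above (plain arithmetic, not part of the claimed method; the function names are illustrative), the conversion between milliwatts and dBm can be written as follows:

```python
import math

def mw_to_dbm(power_mw: float) -> float:
    """Express a power level given in milliwatts on the dB scale (dBm)."""
    return 10.0 * math.log10(power_mw)

def dbm_to_mw(power_dbm: float) -> float:
    """Convert a dBm value back to milliwatts."""
    return 10.0 ** (power_dbm / 10.0)

# Example: a first signal received at 0.0001 mW corresponds to -40 dBm;
# the same signal attenuated underwater to 0.000001 mW would read -60 dBm.
print(mw_to_dbm(0.0001), mw_to_dbm(0.000001))  # -40.0 -60.0
```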
  • FIG. 4 illustrates concrete examples of the behavior described with reference to FIG. 3.
  • FIG. 4a is a diagram of an example of a signal monitoring screen that can be displayed when the electronic device 400 is not immersed in water, according to an embodiment of the present disclosure.
  • the control module (e.g., the control module 200) of the electronic device 400 may monitor the first signal and may display the result of monitoring the first signal.
  • the monitoring result may be displayed as, e.g., a received signal code power (RSCP) value 410.
  • the RSCP shown in FIG. 4a is provided only as an example, and instead of RSCP, any other suitable type of parameter may be used that is related to the strength (RF signal level) of the first signal (e.g., reference signal received power (RSRP), received signal strength indicator (RSSI), etc.).
  • the monitoring result may be numerically displayed through various signal-to-noise-ratio-related parameters (e.g., Ec/Io, Eb/No).
  • FIG. 4b is a diagram of an example of a signal monitoring screen that can be displayed when the electronic device 400 is immersed in water, according to an embodiment of the present disclosure.
  • an attenuated first signal may be received due to differences in the media through which the signal travels.
  • a reduced RSCP value 420 may be measured and subsequently displayed by the electronic device 400.
  • FIG. 4c depicts a plot of signal strength values measured by the electronic device 400 over a given time period, according to an embodiment of the present disclosure.
  • the strength 430 of the first signal when the electronic device is not immersed in water is greater than the strength 440 of the first signal when the electronic device is underwater.
  • the second signal may be output at a triggering point 450, in response to the strength of the first signal falling below a predetermined threshold.
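  • The triggering behavior described for FIG. 4c can be sketched as a simple threshold check over successive strength readings. The threshold value and the function and parameter names below are illustrative assumptions, not values taken from the patent:

```python
RSCP_THRESHOLD_DBM = -100.0  # assumed trigger level, not specified in the patent

def find_triggering_point(rscp_samples, emit_second_signal):
    """Scan successive readings of the first signal's strength and fire the
    trigger (output of the second signal) once the strength drops below the
    threshold, as at point 450 in FIG. 4c."""
    for index, rscp in enumerate(rscp_samples):
        if rscp < RSCP_THRESHOLD_DBM:
            emit_second_signal(index)
            return index
    return None

# Usage: the readings drop sharply when the device enters the water.
samples = [-75.0, -76.0, -74.0, -110.0, -112.0]
find_triggering_point(samples, lambda i: print(f"emit 10 kHz tone at sample {i}"))
```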
  • FIG. 5a is a diagram illustrating the operation of an electronic device when the electronic device is located underwater, according to an embodiment of the present disclosure.
  • the sound wave output module (e.g., the speaker 510) may output a second signal (e.g., the sound wave 512) having a predetermined frequency (e.g., the inaudible frequency of 10 kHz) when a change in the strength of the first signal is sensed (i.e., at the triggering point).
  • the second signal may then be received by a sound wave receiving module (e.g., a microphone) of the electronic device 500.
  • FIGS. 5b and 5c are diagrams illustrating the attributes of a second signal that is output by an electronic device during a period in which the electronic device is situated above water (e.g., in the air) and a period in which the electronic device is located underwater, according to an embodiment of the present disclosure. More particularly, FIG. 5b illustrates the frequency and amplitude of the second signal as the second signal travels above water during period (1), and then underwater during period (2). As illustrated, when the frequency of the second signal is 10 kHz and the second signal is propagating underwater, the second signal exhibits a reduced amplitude 522a or a different frequency characteristic 522b as compared with the frequency and amplitude when the second signal is propagating through the air. That is, when the second signal propagates underwater, its high-frequency components are reduced while its low-frequency components are exaggerated as compared with when the second signal propagates through air.
  • FIG. 5c is a diagram illustrating the frequency and amplitude of the second signal as the second signal travels above water during period (1) and then under water during period (2).
  • when the frequency of the second signal is 20 kHz and the second signal is propagating underwater, the second signal exhibits an increased amplitude 522a or a different frequency characteristic 522b as compared with when the second signal is propagating through the air. That is, when the second signal having a frequency of 20 kHz propagates underwater, its high-frequency components are reduced while its low-frequency components are exaggerated in comparison to when it propagates through the air (a spectral sketch of this effect follows below).
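  • One way to read the behavior illustrated in FIGS. 5b and 5c (high-frequency components reduced and low-frequency components exaggerated underwater) is as a shift in spectral energy. The sketch below computes such a high-to-low band energy ratio; the split frequency and the synthetic test signals are assumptions made purely for illustration:

```python
import numpy as np

def band_energy_ratio(samples, sample_rate, split_hz=5_000.0):
    """Ratio of spectral energy above split_hz to the energy below it.
    A lower ratio is consistent with underwater propagation as described
    for FIGS. 5b-5c; the split frequency is an illustrative assumption."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    high = spectrum[freqs >= split_hz].sum()
    low = spectrum[freqs < split_hz].sum()
    return float(high / max(low, 1e-12))

# Synthetic stand-ins: a clean 10 kHz tone (in air) versus the same tone
# attenuated and mixed with exaggerated low-frequency content (underwater).
rate = 44_100.0
t = np.arange(0, 0.05, 1.0 / rate)
in_air = np.sin(2 * np.pi * 10_000 * t)
in_water = 0.2 * in_air + 0.8 * np.sin(2 * np.pi * 1_000 * t)
print(band_energy_ratio(in_air, rate) > band_energy_ratio(in_water, rate))  # True
```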
  • a storage module may retain data identifying various characteristics of a signal when the signal travels through a first medium (e.g., water) and through a second medium (e.g., air), and how those characteristics change when the signal transitions from one medium to the other.
  • the data may represent a waveform (hereinafter "reference waveform") and it may be used to determine whether the electronic device is underwater.
  • the reference waveform may include a waveform when the second signal having a predetermined frequency (e.g., 10 kHz) propagates underwater. Additionally or alternatively, the reference waveform may include a waveform when the second signal having the predetermined frequency propagates in the air.
  • the data may include various elements of the waveform (e.g., the reference waveform), such as data regarding the amplitude and/or frequency of the waveform.
  • the control module of the electronic device may compare the reference waveform with the waveform of the received second signal to determine whether the electronic device 500 is positioned underwater. Additionally or alternatively, the control module may compare another reference waveform with the waveform of the received second signal to determine whether the electronic device 500 is positioned underwater.
  • the first signal may include a radio frequency (RF) signal
  • the second signal may include an acoustic signal (e.g., a sound wave).
  • the sound wave output module (e.g., the sound wave output module 220) may include a speaker, and the sound wave receiving module (e.g., the sound wave receiving module 230) may include a microphone.
  • the RF signal may include any suitable type of signal, such as a Wi-Fi signal, a Bluetooth signal, an NFC signal, a GPS signal, an LTE signal, an LTE-A signal, a CDMA signal, a WCDMA signal, a UMTS signal, a WiBro signal, a GSM signal, or an FM/AM radio signal.
  • the control module may compare the reference waveform stored in the electronic device with the waveform of the received second signal.
  • the control module may determine that at least a portion of the electronic device is positioned underwater when the reference waveform matches the waveform of the received second signal. Alternatively, the control module may determine that at least a portion of the electronic device is positioned underwater when the reference waveform is different from the waveform of the received second signal.
  • the reference waveform may be considered to match the waveform of the measured signal when one or more characteristics of the reference waveform are identical to respective characteristics of the measured second signal. Additionally or alternatively, the reference waveform may be considered to match the waveform of the measured signal when one or more characteristics of the reference waveform are within a predetermined distance from respective characteristics of the measured second signal.
  • the term "reference waveform” may refer to any suitable type of data item and/or data set that identifies one or more characteristics of a reference wave, such as frequency, power, amplitude, etc.
  • comparing the reference waveform to the waveform of the second signal may include comparing a characteristic of the second signal (e.g., frequency, amplitude, etc.) to a corresponding reference value that is part of the reference waveform, as sketched below.
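  • A minimal sketch of this comparison follows, assuming the reference waveform is reduced to two characteristics (frequency and amplitude) and that "within a predetermined distance" means a fixed numeric tolerance; the tolerance values and type names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Waveform:
    """Reduced representation of a waveform: only the characteristics the
    description mentions (frequency and amplitude), not raw samples."""
    frequency_hz: float
    amplitude: float

FREQ_TOLERANCE_HZ = 500.0  # assumed "predetermined distance" for frequency
AMP_TOLERANCE = 0.1        # assumed "predetermined distance" for amplitude

def matches(reference: Waveform, measured: Waveform) -> bool:
    """True when each characteristic of the measured second signal lies
    within the predetermined distance of the reference waveform."""
    return (abs(reference.frequency_hz - measured.frequency_hz) <= FREQ_TOLERANCE_HZ
            and abs(reference.amplitude - measured.amplitude) <= AMP_TOLERANCE)

# Reference stored for the underwater case: an attenuated 10 kHz tone.
underwater_reference = Waveform(frequency_hz=10_000.0, amplitude=0.2)
received = Waveform(frequency_hz=9_800.0, amplitude=0.25)
print(matches(underwater_reference, received))  # True -> device likely underwater
```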
  • FIG. 6 is a flowchart of an example of a process, according to an embodiment of the present disclosure.
  • a first signal having a first frequency is received by an electronic device.
  • the first signal is received from an external electronic device through a wireless communication module (e.g., the wireless communication module 210).
  • the electronic device identifies at least one of the strength of the first signal or a signal-to-noise ratio of the first signal.
  • the electronic device outputs a second signal having a second frequency different from the first frequency based on at least one of the strength of the first signal or the signal-to-noise ratio.
  • the electronic device receives the second signal.
  • the electronic device detects whether it is at least partially immersed in a liquid. For example, the electronic device may detect whether at least a portion of the electronic device is positioned underwater based on a characteristic of the received second signal.
  • the first signal may include a radio frequency (RF) signal
  • the second signal may include a sound wave
  • the operation of outputting the second signal may include outputting a sound wave using a speaker included in the electronic device, and the operation of receiving at least a portion of the second signal may include an operation of receiving the outputted sound wave using a microphone included in the electronic device.
  • the RF signal may include any suitable type of signal, such as a Wi-Fi signal, a Bluetooth signal, an NFC signal, a GPS signal, an LTE signal, an LTE-A signal, a CDMA signal, a WCDMA signal, a UMTS signal, a WiBro signal, a GSM signal, or an FM/AM radio signal.
  • the operation of determining whether at least a portion of the electronic device is positioned underwater may include comparing a reference waveform stored in the electronic device with a waveform of the received second signal.
  • the operation of determining whether the electronic device is at least partially immersed in a liquid may include detecting that at least a portion of the electronic device is positioned underwater when the reference waveform matches the waveform of the received second signal.
  • the operation of determining whether at least a portion of the electronic device is positioned underwater may include detecting that at least a portion of the electronic device is positioned underwater when the reference waveform does not match the waveform of the received second signal.
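  • Putting the operations of FIG. 6 together, the overall decision can be sketched as below. The four callables are assumed platform hooks standing in for the wireless communication module, the speaker, the microphone, and the waveform comparison; none of these names comes from the patent:

```python
def detect_immersion(read_rf_strength_dbm, emit_tone, capture_waveform,
                     matches_reference, threshold_dbm=-100.0):
    """Sketch of the FIG. 6 sequence: monitor the first (RF) signal, and only
    when its strength suggests a change of medium, output and re-capture the
    second (acoustic) signal and compare it against the stored reference."""
    if read_rf_strength_dbm() >= threshold_dbm:
        return False  # first signal not attenuated: assume the device is in air
    emit_tone(10_000)  # second signal at the predetermined (e.g., 10 kHz) frequency
    return matches_reference(capture_waveform())

# Usage with stubbed hooks standing in for the device's radio and audio paths.
print(detect_immersion(lambda: -115.0,
                       lambda freq_hz: None,
                       lambda: "attenuated-10kHz-waveform",
                       lambda waveform: waveform == "attenuated-10kHz-waveform"))
```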
  • FIGS. 7a-7b are diagrams illustrating an example of the operation of an electronic device 700, according to an embodiment of the disclosure.
  • the electronic device 700 is waterproof and capable of operating underwater at a predetermined water pressure.
  • the electronic device 700 may be transitioned to a state in which the electronic device is configured to perform one or more predetermined operations in response to input that is considered suitable for underwater use.
  • the electronic device 700 may be transitioned into an "underwater mode.”
  • the electronic device may be transitioned into the underwater mode by a control module that is part of the electronic device 700, such as the control module 200.
  • the electronic device 700 may display a notification message 715 to indicate entry into the underwater mode when the electronic device is transitioned into the underwater mode. More particularly, a home screen 710 may be displayed on the background of a notification message 715.
  • the control module may turn off (or otherwise disable) the touch key 703 of the electronic device 700. Doing so may reduce malfunctions that may be caused by the water.
  • the electronic device 700 may hide the notification message 715 and activate an indicator 702 showing that the electronic device 700 currently operates in the underwater mode.
  • the indicator 702 may include various light-emitting devices (e.g., an LED module). The user of the electronic device 700 may identify that the electronic device 700 is currently in the underwater mode through the notification indicated via the indicator 702.
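  • The mode transition described for FIGS. 7a-7b can be sketched as follows; show_notification, disable_touch_key, and set_indicator are placeholder hooks rather than device APIs named in the patent:

```python
class UnderwaterMode:
    """Sketch of the underwater-mode transition described for FIGS. 7a-7b."""

    def __init__(self, show_notification, disable_touch_key, set_indicator):
        self.show_notification = show_notification
        self.disable_touch_key = disable_touch_key
        self.set_indicator = set_indicator
        self.active = False

    def enter(self):
        self.active = True
        self.show_notification("Entering underwater mode")  # cf. message 715
        self.disable_touch_key(True)  # reduce malfunctions caused by the water
        self.set_indicator(True)      # cf. light-emitting indicator 702

    def leave(self):
        self.active = False
        self.disable_touch_key(False)
        self.set_indicator(False)

mode = UnderwaterMode(print, lambda disabled: None, lambda on: None)
mode.enter()
```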
  • FIGS. 8a-11 are diagrams illustrating examples of the operation of an electronic device, according to various embodiments of the disclosure. More particularly, FIGS. 8a-11 illustrate function(s) or operation(s) performed by the electronic device when the electronic device is at least partially immersed in a liquid.
  • the electronic device may execute (e.g., launch) a predetermined application 820 as shown in FIG. 8b.
  • FIG. 8b illustrates an example in which a camera application is executed as the predetermined application.
  • the predetermined application 820 may be an application designated by the user (e.g., the user 860) prior to the entry into the underwater mode (i.e., when the electronic device 800 is underwater).
  • the user may designate at least one application predicted to be used frequently in the underwater mode as the predetermined application via a user interface that is available on the electronic device.
  • the control module may cause the execution of the predetermined application (and/or the display of its interface) to be toggled on and off by repeated input to the physical key 805.
  • in FIG. 9a, another example is shown in which a predetermined application is executed in the underwater mode.
  • an application corresponding to the motion gesture may be run.
  • the motion gesture may include a hovering action performed adjacently to the electronic device 900, rather than as a direct touch to the electronic device.
  • the motion gesture and the application corresponding to the motion gesture may be designated by the user, e.g., before entry into the underwater mode.
  • FIG. 9a illustrates an example in which the character "C” is designated as the motion gesture, and a motion defining the shape of the character "C” occurs adjacently to the electronic device 900 as the motion gesture.
  • the sensor module 904 may recognize the generated motion gesture, and the control module may perform control to run a predetermined application (e.g., the camera application 920) corresponding to the motion gesture, as shown in FIG. 9b.
  • the electronic device 1000 may use a grip of the user 1060 on the electronic device as a basis for controlling the electronic device.
  • the grip of the user 1060 may be sensed through any suitable type of sensor, such as a grip sensor included in the electronic device 1000.
  • the electronic device may execute a predetermined application as shown in FIG. 10b .
  • the predetermined application 1120 may be displayed in a split screen mode along with an object (e.g., the home screen 1110) that was displayed by the electronic device immediately before the entry into the underwater mode.
  • the user may control each of a plurality of screens displayed on the electronic device 1100 in the underwater mode.
  • FIGS. 12a-13d are diagrams illustrating examples of the operation of an electronic device, according to an embodiment of the present disclosure. More particularly, FIGS 12a-13d illustrate an example in which at least one application is executed by the electronic device when the electronic device is situated underwater.
  • the electronic device 1200 may receive a touch input through an input means 1270 (e.g., Wacom penTM).
  • the input means 1270 may include at least one of an electromagnetic resonance (EMR) type, a capacitive type, an infrared (IR) type, or an ultrasonic type.
  • upon reception of the touch input, the electronic device 1200 may execute an application (e.g., the camera application 1220) selected by the touch input, as shown in FIG. 12b.
  • the user may designate an application to be run in the underwater mode through a direct touch input on the electronic device 1200 via the input means 1270.
  • the electronic device 1300 may receive an input to the physical key 1305 (e.g., pressing the physical key 1305 for a predetermined time or more) of the electronic device 1300 while the electronic device 1300 is in the underwater mode.
  • the electronic device 1300 may display a list 1315 of applications executable in the underwater mode as shown in FIG. 13b .
  • the application list 1315 displayed may include only application(s) executable in the underwater mode and may omit applications that have not been selected for execution while the electronic device is in the underwater mode.
  • the application(s) included in the application list 1315 may be previously designated by the user (e.g., the user 1360) or specified when the electronic device 1300 is manufactured.
  • FIG. 13b depicts an example in which a "camera application,” “gallery application,” and “message application” are identified in the list as applications executable in the underwater mode. Furthermore, the user may add to the application list 1315 at least one application executable in the underwater mode by selecting the "user selection" option from application list 1315.
  • FIG. 13c illustrates an example in which the “camera application” is selected with the input means 1370. Afterwards, the electronic device 1300 may execute the “camera application” while underwater, as shown in FIG. 13d.
  • the control module may transition the electronic device to a state in which the electronic device is configured to perform one or more predetermined operations in response to input that is considered suitable for underwater use. More particularly, when the electronic device is in the underwater mode, detecting any such input may cause the electronic device to generate a control event that triggers the performance of a predetermined operation.
  • the control event may be generated in response to at least one of pressing a physical key (e.g., the physical key 1305) of the electronic device (e.g., the electronic device 1300), a motion gesture, a grip on the electronic device, and a touch through an input means electrically recognizable by the electronic device.
  • the control event may be generated only when the electronic device is in the underwater mode.
  • the control module may display the user interface of the selected application in a split screen mode along with the object or screen displayed immediately before the electronic device is positioned underwater.
  • FIG. 14 is a flowchart of an example of a process, according to an embodiment of the present disclosure.
  • the electronic device may enter the underwater mode (1400) and detect a control event while the electronic device is underwater (in other words, in the underwater mode) (1410). Afterwards, in response to the event, the electronic device may perform an operation associated with the event while the electronic device is in the underwater mode (1420).
  • the electronic device when the electronic device is positioned underwater, the electronic device may be transitioned to a state in which the electronic device is configured to perform one or more predetermined operations in response to input that is considered suitable for underwater use. More particularly, when the electronic device is in the underwater mode, detecting any such input may cause the electronic device to generate a control event that triggers the performance of a predetermined operation.
  • the control event may be generated in response to at least one of pressing a physical key of the electronic device, a motion gesture, a grip on the electronic device, and a touch on the electronic device that is performed through an input means electrically recognizable by the electronic device.
  • a list may be displayed of applications executable while the electronic device is underwater (e.g., the application list 1315).
  • the user interface of the selected application may be displayed along with the object (or screen) displayed on the electronic device immediately before the electronic device is immersed in the water.
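  • A minimal sketch of the FIG. 14 flow follows; the event names and application identifiers in the mapping are hypothetical and would, in practice, be designated by the user beforehand:

```python
# Hypothetical mapping from control events to previously designated applications.
UNDERWATER_APPS = {
    "physical_key": "camera",
    "gesture_C": "camera",
    "grip": "camera",
    "stylus_touch": "gallery",
}

def handle_control_event(event, underwater_mode, launch_app):
    """Honour a control event only while the device is in the underwater mode,
    and run the operation (here, an application) designated for that event."""
    if not underwater_mode:
        return False  # control events are generated only in the underwater mode
    app = UNDERWATER_APPS.get(event)
    if app is None:
        return False
    launch_app(app)  # e.g., displayed in a split screen with the prior screen
    return True

handle_control_event("gesture_C", underwater_mode=True,
                     launch_app=lambda name: print("launching", name))
```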
  • FIGS. 15a-15f are diagrams illustrating an example of the operation of an electronic device, according to an embodiment of the present disclosure. It is assumed that in the example of FIGS. 15a-15f, the electronic device lacks waterproofing and may sustain damage when it is immersed in water.
  • the electronic device 1500 may be unintentionally immersed in water.
  • FIG. 15a illustrates an example in which the electronic device is dropped in water.
  • the electronic device 1500 may request a wearable electronic device 1510 connected to the electronic device 1500 via wireless communications to notify the user of the immersion.
  • the wearable electronic device 1510 may display an immersion notification message.
  • the wearable electronic device 1510 may receive from the electronic device 1500 an indication of the time of immersion. The indication of the time of the immersion may be subsequently displayed on the wearable electronic device 1510.
  • the electronic device 1500 may identify the time when the electronic device 1500 was immersed and send the same to the wearable electronic device 1510.
  • the function or operation of displaying the immersion notification message 1512 on the wearable electronic device 1510 may be controlled by the control module (e.g., the control module 200) of the electronic device 1500 and/or by a separate control module (not shown) included in the wearable electronic device 1510.
  • the electronic device 1500 may request the wearable electronic device 1510 connected to the electronic device 1500 via wireless communications to provide guide information to the user when the electronic device 1500 is immersed.
  • the wearable electronic device 1510 may display a first aid guide selection menu 1514. The user may cause the first aid guide to be displayed by selecting the "identify" icon in the first aid guide selection menu 1514.
  • the electronic device 1500 may photograph its surroundings in response to detecting that the electronic device 1500 is at least partially immersed in the water and send the captured image data to the wearable electronic device 1510.
  • the image data may include a still image 1516 or video.
  • FIG. 15d illustrates an example in which a still image 1516 is displayed on the wearable electronic device 1510.
  • the wearable electronic device 1510 may receive the image data from the electronic device 1500 and display it on the wearable electronic device 1510.
  • the user of the electronic device 1500 may become aware of the exact position of the electronic device 1500 as a result of the image being transmitted from the electronic device 1500.
  • the electronic device 1500 may obtain location information in response to detecting that the electronic device 1500 is at least partially immersed in the water and send the obtained location information 1518 to the wearable electronic device 1510.
  • the location information 1518 may be acquired by using any suitable type of device, such as a GPS module.
  • the wearable electronic device 1510 may receive the location information 1518 from the electronic device 1500 and display it on the wearable electronic device 1510.
  • the user of the electronic device 1500 may become aware of the exact position of the electronic device 1500 as a result of the location information 1518 being transmitted from the electronic device 1500.
  • the electronic device 1500 may send a request to a predetermined designated electronic device (e.g., a third party's device 1520) to display an immersion notification message 1522 indicating that the electronic device 1500 is immersed in the water.
  • the request may be transmitted to the designated electronic device 1520 in response to detecting the electronic device 1500 is at least partially immersed in the water.
  • the designated electronic device 1520 may be an electronic device previously designated by the user of the electronic device 1500 before the electronic device 1500 is immersed.
  • the immersion notification message 1522 may be displayed on the designated electronic device 1520 in response to the request, as shown in FIG. 15f .
  • the notification message 1522 may be displayed together with a unique identifier corresponding to the electronic device 1500 (e.g., a phone number).
  • the electronic device 1520 may be designated, e.g., by means of its phone number, to receive the request in the event the electronic device 1500 is immersed in water.
  • the phone number is only one possible way to identify the designated electronic device 1520, and, therefore, any other suitable type of unique identifier may be used instead.
  • the electronic device 1500 may save (in other words, back up) various types of data stored in the electronic device 1500 on at least one of the wearable electronic device 1510, the designated electronic device 1520, and a backup server (e.g., the server 106).
  • the stored data may include any suitable type of data that is pre-designated by the user for back-up in the event that the electronic device is immersed in water.
  • the user may designate various types of data to be backed up when the electronic device 1500 is immersed, up to a predetermined data storage limit in anticipation of the immersion.
  • the designation of data may be performed by the operation of designating the path or address of the data, for example.
  • the data to be backed up may include data related to the immersion (e.g., location information at the time of immersion) generated or obtained by the electronic device 1500.
  • the electronic device 1500 may control the power module (e.g., the power module 280) to power off the electronic device 1500 when the data backup is finished or a predetermined time period has passed after the immersion event, regardless of whether the backup operation is finished.
  • the control module of the electronic device 1500 may store in its memory an indication of the time of immersion of the electronic device.
  • the wireless communication module may send a request for displaying information related to the immersion of the electronic device to the wearable electronic device 1510 connected to the electronic device 1500 via wireless communications. Additionally or alternatively, the request may be sent to any other suitable type of electronic device that is designated by the user, such as the electronic device 1520.
  • the immersion-related information may include at least one of an immersion warning message, an indication of when the electronic device was immersed in the water, an immersion guide running message, a video or still image captured by the electronic device when the immersion is sensed, or an indication of the location of the electronic device when the electronic device was immersed in the water.
  • the control module may back up data designated by the user onto at least one of the wearable electronic device 1510, the designated electronic device 1520, and a backup server (e.g., the server 106) that is connected to the electronic device.
  • the control module may control the power module to power off the electronic device 1500 when the backup is complete and/or a predetermined time period has passed after detecting that the electronic device is at least partially immersed in water.
  • FIG. 16 is a flowchart of an example of a process, according to an embodiment of the present disclosure.
  • the electronic device may detect that the electronic device (e.g., the electronic device 1500) is at least partially immersed in a liquid, such as water (1600).
  • the electronic device may send a request to display an immersion notification message to a wearable electronic device (e.g., the wearable electronic device 1510) connected to the electronic device, or a designated electronic device (e.g., the designated electronic device 1520) designated by the user of the electronic device (1610).
  • the electronic device may generate or obtain data related to the immersion (e.g., an indication of the location of the electronic device at the time of immersion, one or more images of the surrounding environment of the electronic device, etc.) and transmit the same to the wearable electronic device or the designated electronic device as part of the request.
  • the electronic device may store the immersion-related data in its own memory and/or on one of the wearable electronic device, the designated electronic device, and a server (e.g., the server 106).
  • the electronic device 1500 may back up any other data that is designated by the user on at least one of the wearable electronic device, the designated electronic device, and the server (1620). Afterwards, the electronic device may power itself off (1630).
  • when the electronic device detects that it is at least partially immersed in the liquid, the electronic device may record the current time and treat it as the time of immersion.
  • the electronic device may request a wearable electronic device connected to the electronic device via wireless communications, or a designated electronic device preset by the user of the electronic device, to display information related to the immersion of the electronic device.
  • the immersion-related information may include at least one of an immersion warning message, an indication of the time when the immersion was detected, an immersion guide running message, a video or still image captured by the electronic device when the immersion is sensed, or an indication of the location of the electronic device when the immersion is sensed.
  • when detecting that it is immersed in the liquid, the electronic device may back up selected data stored in the memory of the electronic device on at least one of the wearable electronic device, the designated electronic device, and a server connected to the electronic device via wireless communications.
  • the electronic device may power off when the data backup is complete and/or a predetermined time period expires after detecting that the electronic device is at least partially immersed in the liquid.
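  • Putting the steps of FIG. 16 together (detect immersion 1600, notify 1610, back up 1620, power off 1630), the overall sequence might be sketched as below. Every function here is a hypothetical placeholder; a real device would drive its sensors, radios, and power module instead of printing.

```python
from datetime import datetime
from typing import List


def immersion_detected() -> bool:
    """Placeholder for the sensing logic that decides the device is immersed."""
    return True


def request_notification(target: str, message: str) -> None:
    print(f"request {target} to display: {message}")        # step 1610


def backup_designated_data(targets: List[str]) -> None:
    print(f"backing up designated data to {targets}")        # step 1620


def power_off() -> None:
    print("powering off")                                     # step 1630


def handle_possible_immersion() -> None:
    if not immersion_detected():                              # step 1600
        return
    immersed_at = datetime.now()                              # record the time of immersion
    request_notification("wearable or designated device",
                         f"Device immersed at {immersed_at:%H:%M:%S}")
    backup_designated_data(["wearable", "designated device", "server"])
    power_off()


if __name__ == "__main__":
    handle_possible_immersion()
```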
  • FIG. 17 is a diagram of an example of an electronic device, according to an embodiment of the present disclosure.
  • FIG. 17 is a block diagram 1700 illustrating an electronic device 1701 according to an embodiment of the present disclosure.
  • the electronic device 1701 may include the whole or part of the configuration of, e.g., the electronic device 101 shown in FIG. 1 .
  • the electronic device 1701 may include one or more application processors (APs) 1710, a communication module 1720, an SIM (subscriber identification module) card 1724, a memory 1730, a sensor module 1740, an input device 1750, a display 1760, an interface 1770, an audio module 1780, a camera module 1791, a power management module 1795, a battery 1796, an indicator 1797, and a motor 1798.
  • the AP 1710 may control multiple hardware and software components connected to the AP 1710 by running, e.g., an operating system or application programs, and the AP 1710 may process and compute various data.
  • the AP 1710 may be implemented in, e.g., a system on chip (SoC).
  • the AP 1710 may further include a graphic processing unit (GPU) and/or an image signal processor.
  • the AP 1710 may include at least some (e.g., the cellular module 1721) of the components shown in FIG. 17 .
  • the AP 1710 may load a command or data received from at least one of other components (e.g., a non-volatile memory) in volatile memory, process the command or data, and store various data in the non-volatile memory.
  • the communication module 1720 may have the same or similar configuration to the communication interface 160 of FIG. 1 .
  • the communication module 1720 may include, e.g., a cellular module 1721, a wireless fidelity (Wi-Fi) module 1723, a Bluetooth (BT) module 1725, a global positioning system (GPS) module 1727, a near-field communication (NFC) module 1728, and a radio frequency (RF) module 1729.
  • the cellular module 1721 may provide voice call, video call, text, or Internet services through a communication network (e.g., a long-term evolution (LTE), LTE-advanced (LTE-A), code division multiple access (CDMA), wideband CDMA (WCDMA), universal mobile telecommunications system (UMTS), wireless broadband (WiBro), or global system for mobile communications (GSM) network).
  • the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, or the NFC module 1728 may include a processor for processing data communicated through the corresponding module, for example. At least some (e.g., two or more) of the cellular module 1721, the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, and the NFC module 1728 may be included in a single integrated circuit (IC) or an IC package.
  • the RF module 1729 may communicate data, e.g., communication signals (e.g., RF signals).
  • the RF module 1729 may include, e.g., a transceiver, a power amplifier module (PAM), a frequency filter, an LNA (low noise amplifier), or an antenna.
  • at least one of the cellular module 1721, the Wi-Fi module 1723, the BT module 1725, the GPS module 1727, or the NFC module 1728 may communicate RF signals through a separate RF module.
  • the SIM card 1724 may include, e.g., a card including a subscriber identification module and/or an embedded SIM, and may contain unique identification information (e.g., an integrated circuit card identifier (ICCID)) or subscriber information (e.g., an international mobile subscriber identity (IMSI)).
  • the memory 1730 may include, e.g., an internal memory 1732 or an external memory 1734.
  • the internal memory 1732 may include at least one of, e.g., a volatile memory (e.g., a dynamic random access memory (DRAM), a static RAM (SRAM), a synchronous dynamic RAM (SDRAM), etc.) or a non-volatile memory (e.g., a one-time programmable read only memory (OTPROM), a programmable ROM (PROM), an erasable and programmable ROM (EPROM), an electrically erasable and programmable ROM (EEPROM), a mask ROM, a flash ROM, a flash memory (e.g., a NAND flash or a NOR flash), a hard drive, or a solid state drive (SSD)).
  • the external memory 1734 may include a flash drive, e.g., a compact flash (CF) memory, a secure digital (SD) memory, a micro-SD memory, a mini-SD memory, an extreme digital (xD) memory, or a Memory Stick™.
  • the external memory 1734 may be functionally and/or physically connected to the electronic device 1701 via various interfaces.
  • the sensor module 1740 may measure a physical quantity or detect an operational state of the electronic device 1701, and the sensor module 1740 may convert the measured or detected information into an electrical signal.
  • the sensor module 1740 may include at least one of, e.g., a gesture sensor 1740A, a gyro sensor 1740B, an air pressure sensor 1740C, a magnetic sensor 1740D, an acceleration sensor 1740E, a grip sensor 1740F, a proximity sensor 1740G, a color sensor 1740H such as a red-green-blue (RGB) sensor, a biosensor 1740I, a temperature/humidity sensor 1740J, an illumination sensor 1740K, or an ultraviolet (UV) sensor 1740M.
  • the sensor module 1740 may include, e.g., an E-nose sensor, an electromyography (EMG) sensor, an electroencephalogram (EEG) sensor, an electrocardiogram (ECG) sensor, an infrared (IR) sensor, an iris sensor, or a fingerprint sensor.
  • the sensor module 1740 may further include a control circuit for controlling at least one or more of the sensors included in the sensor module.
  • the electronic device 1701 may further include a processor configured to control the sensor module 1740, either as part of the AP 1710 or separately from the AP 1710, so that the sensor module 1740 may be controlled while the AP 1710 is in a sleep mode.
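  • A separate, low-power processor controlling the sensor module while the AP sleeps behaves essentially like a background poller that only wakes the main processor on a relevant event. The sketch below models that idea with a daemon thread and a callback; the 0.5 s polling interval, the humidity threshold, and the function names are illustrative assumptions.

```python
import random
import threading
import time


def read_humidity_sensor() -> float:
    """Stand-in for reading the temperature/humidity sensor (1740J)."""
    return random.uniform(0.0, 100.0)


def sensor_hub(wake_ap, stop: threading.Event, threshold: float = 95.0,
               interval_s: float = 0.5) -> None:
    """Poll sensors on a low-power path; wake the AP only when needed."""
    while not stop.is_set():
        if read_humidity_sensor() > threshold:
            wake_ap("humidity above threshold")
            return
        time.sleep(interval_s)


if __name__ == "__main__":
    stop = threading.Event()
    hub = threading.Thread(target=sensor_hub,
                           args=(lambda reason: print("wake AP:", reason), stop),
                           daemon=True)
    hub.start()
    time.sleep(3.0)   # the "AP" sleeps here while the hub keeps polling
    stop.set()
```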
  • the input device 1750 may include, e.g., a touch panel 1752, a (digital) pen sensor 1754, a key 1756, or an ultrasonic input device 1758.
  • the touch panel 1752 may use at least one of capacitive, resistive, infrared, or ultrasonic methods.
  • the touch panel 1752 may further include a control circuit.
  • the touch panel 1752 may further include a tactile layer and may provide a user with a tactile reaction.
  • the (digital) pen sensor 1754 may include, e.g., a part of a touch panel or a separate sheet for recognition.
  • the key 1756 may include any suitable type of input device, such as a physical button, an optical key or a keypad.
  • the ultrasonic input device 1758 may use an input tool that generates an ultrasonic signal and enable the electronic device 1701 to identify data by sensing the ultrasonic signal with a microphone (e.g., a microphone 1788).
  • the display 1760 may include a panel 1762, a hologram device 1764, or a projector 1766.
  • the panel 1762 may have the same or similar configuration to the display 160 of FIG. 1 .
  • the panel 1762 may be implemented to be flexible, transparent, or wearable.
  • the panel 1762 may also be incorporated with the touch panel 1752 in a module.
  • the hologram device 1764 may make three-dimensional (3D) images (holograms) in the air by using light interference.
  • the projector 1766 may display an image by projecting light onto a screen.
  • the screen may be, for example, located inside or outside of the electronic device 1701.
  • the display 1760 may further include a control circuit to control the panel 1762, the hologram device 1764, or the projector 1766.
  • the interface 1770 may include, e.g., a high-definition multimedia interface (HDMI) 1772, a USB 1774, an optical interface 1776, or a D-subminiature (D-sub) 1778.
  • the interface 1770 may be part of the communication interface 160 shown in FIG. 1 .
  • the interface 1770 may include a Mobile High-definition Link (MHL) interface, a secure digital (SD) card/multimedia card (MMC) interface, or an infrared data association (IrDA) standard interface.
  • the audio module 1780 may convert a sound into an electric signal or vice versa, for example. At least a part of the audio module 1780 may be included in e.g., the electronic device 101 as shown in FIG. 1 .
  • the audio module 1780 may process sound information input or output via one or more of a speaker 1782, a receiver 1784, an earphone 1786, or a microphone 1788.
  • the camera module 1791 may be a device for capturing still images and videos, and may include, according to an embodiment of the present disclosure, one or more image sensors (e.g., front and back sensors), a lens, an Image Signal Processor (ISP), or a flash such as a light emitting diode (LED) or xenon lamp.
  • the power management module 1795 may manage the power supply of the electronic device 1701, for example.
  • the power management module 1795 may include, e.g., a power management integrated circuit (PMIC), a charger IC, or a battery or fuel gauge.
  • the PMIC may have a wired and/or wireless recharging scheme.
  • the wireless charging scheme may include any suitable type of wireless charging scheme, such as a magnetic resonance scheme, a magnetic induction scheme, or an electromagnetic wave based scheme, and an additional circuit, such as a coil loop, a resonance circuit, a rectifier, or the like may be added for wireless charging.
  • the battery gauge may measure an amount of remaining power of the battery 1796, a voltage, a current, or a temperature while the battery 1796 is being charged.
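  • For illustration only, the quantities the battery gauge reports while charging can be grouped as in the sketch below; the names, the safety check, and the sample values are assumptions, not readings or thresholds from any real gauge.

```python
from dataclasses import dataclass


@dataclass
class BatteryGaugeReading:
    remaining_pct: float   # remaining charge, percent
    voltage_v: float       # terminal voltage, volts
    current_ma: float      # charge current, milliamps (positive while charging)
    temperature_c: float   # pack temperature, degrees Celsius


def is_charging_safely(r: BatteryGaugeReading, max_temp_c: float = 45.0) -> bool:
    """Simple illustrative check a power management module might perform."""
    return r.current_ma > 0 and r.temperature_c < max_temp_c


if __name__ == "__main__":
    reading = BatteryGaugeReading(remaining_pct=76.0, voltage_v=4.12,
                                  current_ma=850.0, temperature_c=31.5)
    print(is_charging_safely(reading))
```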
  • the battery 1796 may include, e.g., a rechargeable battery or a solar battery.
  • the indicator 1797 may indicate a particular state of the electronic device 1701 or a part of the electronic device (e.g., the AP 1710), such as a booting state, a message state, or a recharging state.
  • the motor 1798 may convert an electric signal to a mechanical vibration and may generate a vibrational or haptic effect.
  • a processing unit for supporting mobile TV, such as a graphics processing unit (GPU), may be included in the electronic device 1701.
  • the processing unit for supporting mobile TV may process media data conforming to a standard for Digital Multimedia Broadcasting (DMB), Digital Video Broadcasting (DVB), or MediaFLOTM.
  • Each of the aforementioned components of the electronic device may include one or more parts, and a name of the part may vary with a type of the electronic device.
  • the electronic device in accordance with various embodiments of the present disclosure may include at least one of the aforementioned components, omit some of them, or include other additional component(s). Some of the components may be combined into a single entity that performs the same functions as the original components.
  • FIG. 18 is a diagram of an example of a program module, according to an embodiment of the present disclosure.
  • FIG. 18 is a block diagram 1800 illustrating a program module 1810 according to an embodiment of the present disclosure.
  • the program module 1810 may include an operating system (OS) controlling resources related to the electronic device (e.g., the electronic device 101) and/or various applications (e.g., the applications 147) executed on the operating system.
  • the operating system may include, e.g., Android, iOS, Windows, Symbian, Tizen, or Bada.
  • the program module 1810 may include, e.g., a kernel 1820, middleware 1830, an application programming interface (API) 1860, and/or an application 1870. At least a part of the program module 1810 may be preloaded on the electronic device or may be downloaded from a server (e.g., the server 106).
  • the kernel 1820 may include, e.g., a system resource manager 1821 or a device driver 1823.
  • the system resource manager 1821 may perform control, allocation, or recovery of system resources.
  • the system resource manager 1821 may include a process managing unit, a memory managing unit, or a file system managing unit.
  • the device driver 1823 may include, e.g., a display driver, a camera driver, a Bluetooth driver, a shared memory driver, a USB driver, a keypad driver, a Wi-Fi driver, an audio driver, or an inter-process communication (IPC) driver.
  • the middleware 1830 may provide various functions to the application 1870 through the API 1860 so that the application 1870 may efficiently use limited system resources in the electronic device or provide functions jointly required by applications 1870.
  • the middleware 1830 may include at least one of, e.g., a runtime library 1835, an application manager 1841, a window manager 1842, a multimedia manager 1843, a resource manager 1844, a power manager 1845, a database manager 1846, a package manager 1847, a connectivity manager 1848, a notification manager 1849, a location manager 1850, a graphic manager 1851, or a security manager 1852.
  • the runtime library 1835 may include a library module used by a compiler in order to add a new function via a programming language while, e.g., the application 1870 is being executed.
  • the runtime library 1835 may perform input/output management, memory management, or arithmetic functions.
  • the application manager 1841 may manage the life cycle of at least one application of, e.g., the applications 1870.
  • the window manager 1842 may manage GUI resources used on the screen.
  • the multimedia manager 1843 may identify the formats necessary to play various media files and use a codec appropriate for the format to encode or decode the media files.
  • the resource manager 1844 may manage resources, such as source code of at least one of the applications 1870, memory or storage space.
  • the power manager 1845 may operate together with, e.g., a basic input/output system (BIOS) to manage battery or power and provide power information necessary for operating the electronic device.
  • the database manager 1846 may generate, search, or vary a database to be used in at least one of the applications 1870.
  • the package manager 1847 may manage installation or update of an application that is distributed in the form of a package file.
  • the connectivity manager 1848 may manage wireless connectivity, such as Wi-Fi or Bluetooth.
  • the notification manager 1849 may display or notify the user of an event, such as an incoming message, an appointment, or a proximity notification, in a manner that does not disturb the user.
  • the location manager 1850 may manage locational information on the electronic device.
  • the graphic manager 1851 may manage graphic effects to be offered to the user and their related user interface.
  • the security manager 1852 may provide various security functions necessary for system security or user authentication. According to an embodiment of the present disclosure, when the electronic device (e.g., the electronic device 101) has telephony capability, the middleware 1830 may further include a telephony manager for managing voice call or video call functions of the electronic device.
  • the middleware 1830 may include a middleware module forming a combination of various functions of the above-described components.
  • the middleware 1830 may provide a specified module per type of the operating system in order to provide a differentiated function. Further, the middleware 1830 may dynamically omit some existing components or add new components.
  • the API 1860 may be a set of, e.g., API programming functions and may have different configurations depending on operating systems. For example, in the case of Android or iOS, one API set may be provided per platform, and in the case of Tizen, two or more API sets may be offered per platform.
  • the application 1870 may include one or more applications that may provide functions such as, e.g., a home 1871, a dialer 1872, a short message service (SMS)/multimedia messaging service (MMS) 1873, an instant message (IM) 1874, a browser 1875, a camera 1876, an alarm 1877, a contact 1878, a voice dial 1879, an email 1880, a calendar 1881, a media player 1882, an album 1883, a clock 1884, a health-care application (e.g., for measuring the degree of workout or blood sugar), or an application for providing environmental information (e.g., air pressure, moisture, or temperature information).
  • the application 1870 may include an application (hereinafter, an "information exchanging application" for convenience) supporting information exchange between the electronic device (e.g., the electronic device 101) and an external electronic device (e.g., the electronic devices 102 and 104).
  • the information exchange application may include, but is not limited to, a notification relay application for transferring specific information to the external electronic device, or a device management application for managing the external electronic device.
  • the notification relay application may include a function for relaying notification information generated from other applications of the electronic device (e.g., the SMS/MMS application, email application, health-care application, or environmental information application) to the external electronic device (e.g., the electronic devices 102 and 104). Further, the notification relay application may receive notification information from, e.g., the external electronic device and may provide the received notification information to the user.
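  • The relay behavior described above, forwarding a notification generated on the device to an external device and presenting notifications received from it to the user, can be sketched as below. The transport is abstracted to a plain callable; the function and field names are assumptions made for illustration.

```python
from typing import Callable, Dict

Notification = Dict[str, str]


def relay_notification(notification: Notification,
                       send_to_external: Callable[[Notification], None]) -> None:
    """Forward a locally generated notification (e.g., SMS, email) to an external device."""
    send_to_external(notification)


def on_external_notification(notification: Notification,
                             show_to_user: Callable[[str], None]) -> None:
    """Present a notification received from the external device to the user."""
    show_to_user(f"[{notification.get('source', 'external')}] {notification.get('text', '')}")


if __name__ == "__main__":
    relay_notification({"source": "sms", "text": "Meeting at 3 PM"},
                       send_to_external=lambda n: print("→ external device:", n))
    on_external_notification({"source": "wearable", "text": "Battery low"},
                             show_to_user=print)
```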
  • the device management application may control at least some functions of the external electronic device (e.g., the electronic device 104) communicating with the electronic device (for example, turning the external electronic device (or some of its components) on or off, or adjusting the brightness (or resolution) of its display), and the device management application may manage (e.g., install, delete, or update) an application operating on the external electronic device or a service (e.g., a call service or message service) provided by the external electronic device.
  • the application 1870 may include an application (e.g., a health-care application) designated according to an attribute of the external electronic device (e.g., the electronic devices 102 and 104), for example, when the external electronic device is a mobile medical device.
  • the application 1870 may include an application received from the external electronic device (e.g., the server 106 or electronic devices 102 and 104).
  • the application 1870 may include a preloaded application or a third party application downloadable from a server.
  • the names of the components of the program module 1810 according to the shown embodiment may be varied depending on the type of operating system.
  • At least a part of the program module 1810 may be implemented in software, firmware, hardware, or in a combination of two or more thereof. At least a part of the program module 1810 may be implemented (e.g., executed) by using a processor (e.g., the AP 210). At least a part of the program module 1810 may include a module, program, routine, set of instructions, process, or the like for performing one or more functions.
  • the term 'module' may refer to a unit including one of hardware, software, and firmware, or a combination thereof.
  • the term 'module' may be interchangeably used with a unit, logic, logical block, component, or circuit.
  • the module may be a minimum unit or part of an integrated component.
  • the module may be a minimum unit or part of performing one or more functions.
  • the module may be implemented mechanically or electronically.
  • the module may include at least one of Application Specific Integrated Circuit (ASIC) chips, Field Programmable Gate Arrays (FPGAs), or Programmable Logic Arrays (PLAs) that perform some operations, which have already been known or will be developed in the future.
  • At least a part of the device may be implemented as instructions stored in a computer-readable storage medium, e.g., in the form of a program module.
  • the instructions when executed by a processor (e.g., the processor 120), may enable the processor to carry out a corresponding function.
  • the computer-readable storage medium may be the memory 130, for example.
  • the computer-readable storage medium may include a hardware device, such as hard discs, floppy discs, and magnetic tapes, optical media such as compact disc read-only memories (CD-ROMs) and digital versatile discs (DVDs), magneto-optical media such as floptical disks, ROMs, random access memories (RAMs), flash memories, and/or the like.
  • Examples of the program instructions may include not only machine language codes but also high-level language codes which are executable by various computing means using an interpreter.
  • the aforementioned hardware devices may be configured to operate as one or more software modules to carry out exemplary embodiments of the present disclosure, and vice versa.
  • Modules or programming modules in accordance with various embodiments of the present disclosure may include at least one or more of the aforementioned components, omit some of them, or further include other additional components. Operations performed by modules, programming modules or other components in accordance with various embodiments of the present disclosure may be carried out sequentially, simultaneously, repeatedly, or heuristically. Furthermore, some of the operations may be performed in a different order, or omitted, or include other additional operation(s).
  • the embodiments disclosed herein are proposed for description and understanding of the disclosed technology and do not limit the scope of the present disclosure.
  • FIGS. 1-18 are provided as an example only. At least some of the operations discussed with respect to these figures can be performed concurrently, performed in a different order, and/or altogether omitted. It will be understood that the provision of the examples described herein, as well as clauses phrased as "such as," "e.g.," "including," "in some aspects," "in some implementations," and the like, should not be interpreted as limiting the claimed subject matter to the specific examples.
  • the above-described aspects of the present disclosure can be implemented in hardware, firmware or via the execution of software or computer code that can be stored in a recording medium such as a CD-ROM, a Digital Versatile Disc (DVD), a magnetic tape, a RAM, a floppy disk, a hard disk, or a magneto-optical disk or computer code downloaded over a network originally stored on a remote recording medium or a non-transitory machine-readable medium and to be stored on a local recording medium, so that the methods described herein can be rendered via such software that is stored on the recording medium using a general purpose computer, or a special processor or in programmable or dedicated hardware, such as an ASIC or FPGA.
  • the computer, the processor, the microprocessor controller, or the programmable hardware may include memory components, e.g., RAM, ROM, Flash, etc., that may store or receive software or computer code that, when accessed and executed by the computer, processor, or hardware, implements the processing methods described herein.
  • the execution of the code transforms the general purpose computer into a special purpose computer for executing the processing shown herein.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Otolaryngology (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)
  • Multimedia (AREA)
EP15864102.7A 2014-11-26 2015-11-26 Method and apparatus for detecting that a device is immersed in a liquid Active EP3225047B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020140166608A KR102319803B1 (ko) 2014-11-26 2014-11-26 전자 장치, 그 동작 방법 및 기록 매체
PCT/KR2015/012759 WO2016085265A1 (en) 2014-11-26 2015-11-26 Method and apparatus for detecting that a device is immersed in a liquid

Publications (3)

Publication Number Publication Date
EP3225047A1 EP3225047A1 (en) 2017-10-04
EP3225047A4 EP3225047A4 (en) 2018-08-22
EP3225047B1 true EP3225047B1 (en) 2019-07-31

Family

ID=56009987

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15864102.7A Active EP3225047B1 (en) 2014-11-26 2015-11-26 Method and apparatus for detecting that a device is immersed in a liquid

Country Status (5)

Country Link
US (2) US10088565B2 (zh)
EP (1) EP3225047B1 (zh)
KR (1) KR102319803B1 (zh)
CN (1) CN105628081B (zh)
WO (1) WO2016085265A1 (zh)

Families Citing this family (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11059550B2 (en) * 2013-03-11 2021-07-13 Suunto Oy Diving computer with coupled antenna and water contact assembly
US11050142B2 (en) 2013-03-11 2021-06-29 Suunto Oy Coupled antenna structure
JP6140217B2 (ja) * 2015-05-12 2017-05-31 京セラ株式会社 電子機器、制御方法及び制御プログラム
US9661195B2 (en) * 2015-07-02 2017-05-23 Gopro, Inc. Automatic microphone selection in a sports camera based on wet microphone determination
US11343413B2 (en) 2015-07-02 2022-05-24 Gopro, Inc. Automatically determining a wet microphone condition in a camera
JP6153570B2 (ja) * 2015-07-24 2017-06-28 京セラ株式会社 電子機器
JP6587918B2 (ja) * 2015-11-27 2019-10-09 京セラ株式会社 電子機器、電子機器の制御方法、電子機器の制御装置、制御プログラム及び電子機器システム
US10104223B2 (en) * 2016-01-11 2018-10-16 Motorola Mobility Llc Automatically applying modifications to a device in a liquid environment
CN105891806B (zh) * 2016-06-03 2018-02-09 高沿 一种飞行记录仪水下定位发信器
US9807501B1 (en) 2016-09-16 2017-10-31 Gopro, Inc. Generating an audio signal from multiple microphones based on a wet microphone condition
US10965142B2 (en) * 2016-09-23 2021-03-30 Apple Inc. Magneto-inductive charging and communication in electronic devices
KR20180047694A (ko) * 2016-11-01 2018-05-10 엘지전자 주식회사 이동 단말기
CN106357935B (zh) * 2016-11-29 2019-07-26 维沃移动通信有限公司 一种移动终端的模式切换方法及移动终端
US10528015B2 (en) * 2016-12-15 2020-01-07 Trane International Inc. Building automation system controller with real time software configuration and database backup
KR102658311B1 (ko) 2017-02-08 2024-04-18 삼성전자주식회사 스피커를 포함하는 전자 장치
US10666839B2 (en) * 2017-03-24 2020-05-26 Motorola Mobility Llc Correcting for optical distortion in a liquid environment
CN108939834A (zh) * 2017-05-17 2018-12-07 应用材料公司 射频发生器干净干燥空气吹净
US10852770B2 (en) * 2017-06-09 2020-12-01 Casio Computer Co., Ltd. Electronic device having a waterproof structure
KR101923464B1 (ko) * 2017-07-27 2019-02-28 한국생산기술연구원 해양 조난 신호 발생을 위한 침수 감지 장치 및 이를 이용한 해양 조난 신호 발생 방법
US10976278B2 (en) * 2017-08-31 2021-04-13 Apple Inc. Modifying functionality of an electronic device during a moisture exposure event
KR102544757B1 (ko) 2018-01-15 2023-06-16 삼성전자주식회사 발수 구조를 포함하는 전자 장치 및 그 동작 방법
TWI798344B (zh) 2018-02-08 2023-04-11 芬蘭商順妥公司 槽孔模式天線
CN108430025B (zh) * 2018-03-07 2020-06-19 维沃移动通信有限公司 一种检测方法及移动终端
CN108597186B (zh) * 2018-04-18 2020-04-28 广东小天才科技有限公司 一种基于用户行为的溺水报警方法及可穿戴设备
EP3594636A1 (en) * 2018-07-10 2020-01-15 Nxp B.V. Liquid immersion sensor
US10969941B2 (en) * 2018-09-28 2021-04-06 Apple Inc. Underwater user interface
CN109781207A (zh) * 2019-01-15 2019-05-21 江苏东方赛光电有限公司 一种基于ipc技术的液位监测***
CN113940088A (zh) 2019-03-24 2022-01-14 苹果公司 用于查看和访问电子设备上的内容的用户界面
CN111862534A (zh) * 2019-04-26 2020-10-30 北京奇虎科技有限公司 遇水检测方法、装置、计算机设备及存储介质
US11394819B2 (en) * 2019-09-04 2022-07-19 Qualcomm Incorporated Control of a user device under wet conditions
JP7308732B2 (ja) * 2019-11-27 2023-07-14 アズビル金門株式会社 水没後復帰装置及び水没後復帰方法
TWI724725B (zh) * 2019-12-31 2021-04-11 禾瑞亞科技股份有限公司 偵測元件是否處在導電液體當中的方法、電子裝置與其中央處理器模組
CN112285663B (zh) * 2020-11-18 2023-10-17 中国铁道科学研究院集团有限公司 利用地质雷达信号标定车载雷达振动程度的方法和装置
CN113686410A (zh) * 2021-08-16 2021-11-23 合肥联睿微电子科技有限公司 蓝牙水深测量设备
WO2023085686A1 (ko) * 2021-11-11 2023-05-19 삼성전자주식회사 침수 인식 방법 및 이를 수행하는 전자 장치
US12009873B2 (en) 2021-11-11 2024-06-11 Samsung Electronics Co., Ltd. Method of recognizing immersion and electronic device for performing the same

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4125021A (en) * 1976-06-01 1978-11-14 Doryokuro Kakunenryo Kaihatsu Jigyodan Apparatus for detecting conductive liquid level
US5025247A (en) * 1990-04-09 1991-06-18 Banks James C Portable emergency alert system
US6518889B2 (en) * 1998-07-06 2003-02-11 Dan Schlager Voice-activated personal alarm
JPH08338849A (ja) * 1995-04-11 1996-12-24 Precision Syst Sci Kk 液体の吸引判別方法およびこの方法により駆動制御される分注装置
FI112400B (fi) * 1999-05-12 2003-11-28 Nokia Corp Menetelmä tiedon osoittamiseksi ja osoitusväline
US6486777B2 (en) * 1999-08-16 2002-11-26 Albert M. Clark Personal monitoring apparatus and method
CN2414549Y (zh) * 2000-03-20 2001-01-10 曹增全 数字式红外线智能传感装置
AU2002255750B2 (en) * 2001-03-12 2005-09-15 Eureka Technologies Partners, Llc Article locator system
US20080157970A1 (en) 2006-03-23 2008-07-03 G2 Microsystems Pty. Ltd. Coarse and fine location for tagged items
JP2008283406A (ja) 2007-05-09 2008-11-20 Toshiba Corp 携帯機器
KR101403839B1 (ko) * 2007-08-16 2014-06-03 엘지전자 주식회사 터치 스크린을 구비한 이동통신 단말기 및 그 디스플레이제어방법
JP2012119975A (ja) 2010-12-01 2012-06-21 Nec Saitama Ltd ノイズ抑制装置、ノイズ抑制方法、及び携帯端末
EP2675138B1 (en) 2011-02-09 2016-01-13 NEC Corporation Electronic apparatus, water detection means control method, and electronic apparatus operation mode setting method
US8881269B2 (en) * 2012-03-31 2014-11-04 Apple Inc. Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader
US20130339304A1 (en) 2012-06-15 2013-12-19 Lg Electronics Inc. Mobile terminal and method for controlling mobile terminal
KR101919791B1 (ko) * 2012-08-13 2018-11-19 엘지전자 주식회사 이동 단말기 및 이동 단말기 제어방법
KR102020345B1 (ko) * 2012-08-22 2019-11-04 삼성전자 주식회사 터치스크린을 구비하는 단말에서 홈 화면의 구성 방법 및 장치
KR20140038854A (ko) * 2012-09-21 2014-03-31 삼성전자주식회사 모바일 디바이스 및 모바일 디바이스에서 사용자 인터페이스 방법
KR20150002301A (ko) * 2013-06-28 2015-01-07 삼성전자주식회사 부팅 시 정보 표시 방법, 이를 이용한 전자장치 및 휴대 단말기
CN103971406B (zh) * 2014-05-09 2017-12-08 青岛大学 基于线结构光的水下目标三维重建方法

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
US20180329055A1 (en) 2018-11-15
KR102319803B1 (ko) 2021-11-01
EP3225047A1 (en) 2017-10-04
US11644568B2 (en) 2023-05-09
CN105628081B (zh) 2020-11-24
KR20160063068A (ko) 2016-06-03
WO2016085265A1 (en) 2016-06-02
US20160146935A1 (en) 2016-05-26
US10088565B2 (en) 2018-10-02
CN105628081A (zh) 2016-06-01
EP3225047A4 (en) 2018-08-22

Similar Documents

Publication Publication Date Title
US11644568B2 (en) Method and apparatus for detecting that a device is immersed in a liquid
CN108241422B (zh) 电子设备和基于电子设备中的电池温度的热控制方法
EP3532925B1 (en) Electronic device and method for displaying history of executed application thereof
US10038486B2 (en) Method for selecting transmit antenna and electronic device for supporting the same
KR102344045B1 (ko) 화면을 표시하는 전자 장치 및 그 제어 방법
US20180107353A1 (en) Electronic device and method for playing multimedia content by electronic device
KR102503937B1 (ko) 전자 장치의 사용자 인터페이스 제공 방법 및 장치
US10949019B2 (en) Electronic device and method for determining touch coordinate thereof
US20170041769A1 (en) Apparatus and method for providing notification
US10652680B2 (en) Electronic device and method for controlling input and output by electronic device
US10254883B2 (en) Electronic device for sensing pressure of input and method for operating the electronic device
KR20180025710A (ko) 전자 장치 및 이를 이용한 그립 상태를 인식하는 방법
US20170185134A1 (en) Electronic device for managing power and method for controlling thereof
US10222269B2 (en) Method and apparatus for operating sensor of electronic device
KR102304694B1 (ko) 전자 장치 및 전자 장치의 방수 판단 방법
KR102252448B1 (ko) 제어 방법 및 그 방법을 처리하는 전자장치
KR20170019806A (ko) 위치 정보 제공 방법 및 장치
KR20170014407A (ko) 전자장치의 보안 장치 및 제어 방법
KR20180014446A (ko) 전자 장치 및 전자 장치의 터치 스크린 디스플레이 제어 방법
EP3569003B1 (en) Electronic device and method for controlling communication thereof
CN108885853B (zh) 电子装置和用于控制该电子装置的方法
US10261744B2 (en) Method and device for providing application using external electronic device
US10298733B2 (en) Method for executing function of electronic device using bio-signal and electronic device therefor
KR102515282B1 (ko) 전원 공급을 제어하기 위한 방법 및 전자 장치
KR20160138762A (ko) 복수의 충전 회로를 활용하는 전자 장치 및 방법

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170613

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
RIC1 Information provided on ipc code assigned before grant

Ipc: H04M 1/725 20060101ALI20180711BHEP

Ipc: G01H 15/00 20060101ALI20180711BHEP

Ipc: G01F 23/00 20060101ALI20180711BHEP

Ipc: G01S 15/02 20060101ALI20180711BHEP

Ipc: H04M 1/18 20060101AFI20180711BHEP

Ipc: H04R 1/44 20060101ALI20180711BHEP

Ipc: H04R 29/00 20060101ALI20180711BHEP

Ipc: G01F 23/28 20060101ALI20180711BHEP

A4 Supplementary search report drawn up and despatched

Effective date: 20180719

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602015035007

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: H04W0024080000

Ipc: H04M0001180000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: H04M 1/725 20060101ALI20190128BHEP

Ipc: H04R 1/44 20060101ALI20190128BHEP

Ipc: H04M 1/18 20060101AFI20190128BHEP

Ipc: G01S 15/02 20060101ALI20190128BHEP

Ipc: G01F 23/00 20060101ALI20190128BHEP

Ipc: G01F 23/28 20060101ALI20190128BHEP

Ipc: G01H 15/00 20060101ALI20190128BHEP

Ipc: H04R 29/00 20060101ALI20190128BHEP

INTG Intention to grant announced

Effective date: 20190227

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1162234

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602015035007

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1162234

Country of ref document: AT

Kind code of ref document: T

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191202

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191031

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191031

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191101

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191130

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20200224

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602015035007

Country of ref document: DE

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG2D Information on lapse in contracting state deleted

Ref country code: IS

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191126

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20191030

26N No opposition filed

Effective date: 20200603

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20191130

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20191130

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20201008

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20151126

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20190731

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20211201

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20211201

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20231023

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20231023

Year of fee payment: 9