KR101788105B1 - A wearable apparatus for detecting an object under water and a method for communicating thereof - Google Patents


Info

Publication number
KR101788105B1
KR101788105B1
Authority
KR
South Korea
Prior art keywords
unit
wearable device
ultrasonic wave
ultrasonic
information
Prior art date
Application number
KR1020150166762A
Other languages
Korean (ko)
Other versions
KR20170061549A (en)
Inventor
권성근
김해수
Original Assignee
경일대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 경일대학교산학협력단
Priority to KR1020150166762A
Publication of KR20170061549A
Application granted
Publication of KR101788105B1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/52: Details of systems according to group G01S15/00
    • G01S7/521: Constructional features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014: Hand-worn input/output arrangements, e.g. data gloves

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to a wearable device for detecting an underwater object and a communication method for the same. The device includes an ultrasonic unit; a hand-gesture sensor unit, mounted on an inner lining, for sensing the user's hand gestures; and a control unit. When the control unit senses through the gesture sensor unit that a predetermined finger has pointed in one direction for longer than a predetermined time, it switches the ultrasonic unit to a communication mode and transmits a connection request signal by radiating an ultrasonic wave of a predetermined frequency in that direction for at least the predetermined time. When an ultrasonic wave of the predetermined frequency is then received through the ultrasonic unit from the direction corresponding to the connection request signal for at least the predetermined time as a connection response signal, the control unit controls the ultrasonic unit to transmit ultrasonic waves carrying data in that direction.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0001] The present invention relates to a wearable device for detecting an underwater object, and more particularly, to a wearable device for detecting an underwater object and a method of communication between such wearable devices.

Sonar (sound navigation and ranging) measures the distance to an object by emitting short, intermittent pulses of ultrasound and timing how long the sound takes to reflect off the object and return. Direction is detected by rotating the transducer, in practice much like the PPI scope of a radar: on the display, distance is marked radially and azimuth around the periphery, and the scan line rotates together with the transducer. When an echo returns, the object appears as a bright spot on the screen. Echo sounders, fish finders, submarine and mine-detection sonar, and side-scan sonar for probing the structure of the seabed are all examples of sonar systems that emit sound waves.
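The echo-ranging principle described above reduces to a one-line formula: distance equals the speed of sound times half the round-trip time. The patent gives no code; the sketch below is illustrative only and assumes a nominal 1500 m/s speed of sound in seawater.

```python
# Illustrative sketch of sonar echo ranging (not from the patent).
SPEED_OF_SOUND_SEAWATER = 1500.0  # m/s, typical value; varies with temperature and salinity


def echo_distance(round_trip_s: float, speed: float = SPEED_OF_SOUND_SEAWATER) -> float:
    """Distance to a reflecting surface: the pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return speed * round_trip_s / 2.0


# An echo returning after 40 ms corresponds to a target about 30 m away.
print(echo_distance(0.040))  # 30.0
```

The same relation underlies both the detection mode (mapping terrain and objects) and any timing-based ranging between two devices.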

Korean Registered Patent No. 10-1050709, registered July 14, 2011 (title: SmartWare for underwater workers)

An object of the present invention is to provide a wearable device that detects a submerged object in the water and visually provides various kinds of information about the object, and a method of communication between such wearable devices.

According to an aspect of the present invention, there is provided a wearable apparatus comprising: a helmet that can be worn on the head of a user; an ultrasonic unit, attached to one side of the helmet, for radiating ultrasonic waves to the outside and receiving external ultrasonic waves; a hand-gesture sensor unit, mounted on the inner lining of a glove, for sensing the user's hand gestures; and a control unit. When the control unit senses through the hand-gesture sensor unit that a predetermined finger has pointed in one direction for longer than a predetermined time, it switches the ultrasonic unit to a communication mode and transmits a connection request signal by radiating an ultrasonic wave of a predetermined frequency in that direction for at least the predetermined time. When an ultrasonic wave of the predetermined frequency is received through the ultrasonic unit from the direction corresponding to the connection request signal as a connection response signal, the control unit controls the ultrasonic unit to transmit ultrasonic waves containing data in that direction.

The control unit transmits a connection end signal by radiating an ultrasonic wave of a predetermined frequency in the direction for at least a predetermined time through the ultrasonic unit, and, when an ultrasonic wave of the predetermined frequency is received through the ultrasonic unit for at least the predetermined time as an end response signal corresponding to the connection end signal, switches the ultrasonic unit to the detection mode.

According to another aspect of the present invention, there is provided a wearable apparatus comprising: a helmet that can be worn on the head of a user; an ultrasonic unit, attached to one side of the helmet, for radiating ultrasonic waves to the outside and receiving external ultrasonic waves; and a control unit that, when an ultrasonic wave of a predetermined frequency is received through the ultrasonic unit from a specific direction for at least a predetermined time, determines that it is a connection request signal and transmits an ultrasonic wave of the predetermined frequency in that direction as a connection response signal, and that, when ultrasonic waves containing data are subsequently received through the ultrasonic unit, extracts the data from the received ultrasonic waves.

When an ultrasonic wave of the predetermined frequency is received through the ultrasonic unit for at least the predetermined time as a connection end signal, the control unit transmits an end response signal in response and switches the ultrasonic unit to the detection mode.

According to another aspect of the present invention, there is provided a communication method for a wearable device, the method comprising: sensing the user's hand gestures; when it is sensed that a predetermined finger has pointed in one direction for longer than a predetermined time, transmitting a connection request signal by radiating an ultrasonic wave of a predetermined frequency in that direction for at least the predetermined time; and, when an ultrasonic wave of the predetermined frequency is received for at least the predetermined time as a connection response signal, transmitting ultrasonic waves containing data in that direction.

According to another aspect of the present invention, there is provided a communication method for a wearable device, the method comprising: when an ultrasonic wave of a predetermined frequency is received from a specific direction for at least a predetermined time, determining that it is a connection request signal; transmitting an ultrasonic wave of the predetermined frequency in that direction as a connection response signal in response to the connection request signal; and, when ultrasonic waves containing data are received, extracting the data from the received ultrasonic waves.
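The handshake described in these aspects (connection request, connection response, data transfer, connection end, end response) can be sketched as a small state machine. The patent specifies sustained tones, not message strings; the signal names and the ack reply below are illustrative assumptions.

```python
# Illustrative state machine for the ultrasonic handshake (signal names are
# assumptions; the patent uses sustained tones of a predetermined frequency).
class Link:
    def __init__(self):
        self.mode = "detect"  # default: sonar detection mode

    def handle(self, signal: str) -> str:
        # A sustained request tone switches the unit to communication mode.
        if self.mode == "detect" and signal == "connect_request":
            self.mode = "communicate"
            return "connect_response"
        # While connected, data-bearing ultrasound is accepted.
        if self.mode == "communicate" and signal == "data":
            return "ack"  # ack is illustrative; the patent just extracts the data
        # A sustained end tone returns the unit to detection mode.
        if self.mode == "communicate" and signal == "end_request":
            self.mode = "detect"
            return "end_response"
        return "ignore"


link = Link()
print(link.handle("connect_request"))  # connect_response
print(link.handle("end_request"))      # end_response; mode is back to "detect"
```

The design point is that both peers fall back to the detection mode whenever no connection is active, so the same transducer serves sonar and communication.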

According to the underwater communication method of the present invention, users underwater, that is, divers, can communicate with each other through simple hand gestures, and various kinds of information acquired through ultrasonic detection can be shared between them.

FIG. 1 is a perspective view illustrating a wearable device for underwater object detection according to an embodiment of the present invention.
FIG. 2 is a block diagram illustrating a wearable device for underwater object detection according to an embodiment of the present invention.
FIG. 3 is a view illustrating a method of projecting an image onto the glass of underwater goggles using the projection unit of FIG. 2.
FIG. 4 is a view illustrating the configuration of an underwater glove according to an embodiment of the present invention.
FIG. 5 is a flowchart illustrating a method of collecting location information of a wearable device according to an embodiment of the present invention.
FIG. 6 is a view illustrating a method of detecting underwater terrain or an underwater object with a wearable device according to an embodiment of the present invention.
FIG. 7 is a diagram illustrating a method of detecting an object using sound waves according to an embodiment of the present invention.
FIG. 8 is a view illustrating an image obtained by processing object information according to an embodiment of the present invention.
FIG. 9 is a view illustrating a method of providing information about an object detected by a wearable device as an image according to an embodiment of the present invention.
FIGS. 10A to 10C are views illustrating an image projected according to the user's line of sight according to an exemplary embodiment of the present invention.
FIG. 11 is a view illustrating an image projected according to the user's line of sight according to another embodiment of the present invention.
FIG. 12 is a view illustrating the initialization of an underwater glove according to an embodiment of the present invention.
FIG. 13 is a flowchart illustrating a method of controlling a wearable device according to an embodiment of the present invention.
FIGS. 14 to 18 are views illustrating a control method of a wearable device according to an embodiment of the present invention.
FIG. 19 is a flowchart illustrating a communication method between wearable devices according to an embodiment of the present invention.
FIG. 20 is a diagram illustrating a communication method between wearable devices according to an embodiment of the present invention.

In the following description, only the parts necessary for understanding the embodiments of the present invention are described; descriptions of other parts are omitted so as not to obscure the gist of the present invention.

The terms and words used in this specification and the claims should not be construed as limited to their ordinary or dictionary meanings. On the principle that an inventor may appropriately define terms in order to describe the invention in the best way, they should be interpreted with meanings and concepts consistent with the technical idea of the present invention. Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely preferred embodiments of the present invention and do not represent all of its technical ideas, so various equivalents and variations are possible.

Prior to the detailed description, the 'location information' according to the embodiment of the present invention is a value obtained from a GPS signal received from a GPS satellite, and may be provided as coordinates such as latitude, longitude, and altitude. The 'position correction information' according to the embodiment of the present invention provides the error range of the GPS signal as a value according to the DGPS (Differential GPS) technique, thereby enabling correction of the position information. More specifically, the information transmitted from GPS satellites to terrestrial GPS receivers contains errors. When two receivers are located close to each other, they experience similar errors, and the DGPS technique obtains more precise data by canceling the errors common to the two receivers. The position correction information according to an embodiment of the present invention includes information for canceling this error, typically a pseudorange correction (PRC). The reference station that generates the DGPS signal calculates the difference between the geometric distance to each GPS satellite and the pseudorange measured with the C/A code data; this difference becomes the pseudorange correction (PRC). The position correction information may further include a range rate correction (RRC). The RRC is the rate of change of the PRC: since the correction varies over time, the RRC is applied to the PRC at a given time, extending the validity of the PRC over time. Various other factors and parameters may be included in the position correction information; in principle, all factors and parameters according to the RTCM (Radio Technical Commission for Maritime Services) standard may be included.
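The PRC/RRC aging rule described above can be shown with a short calculation: the correction applied at time t is the PRC issued at t0 plus the RRC scaled by the elapsed time. The numeric values below are invented purely for illustration.

```python
def corrected_pseudorange(measured_m: float, prc_m: float,
                          rrc_m_per_s: float, dt_s: float) -> float:
    """Apply a DGPS pseudorange correction aged by dt seconds.

    PRC(t) = PRC(t0) + RRC * (t - t0), following the RTCM convention
    described in the text. All values in meters / meters-per-second.
    """
    return measured_m + prc_m + rrc_m_per_s * dt_s


# Illustrative numbers only: a 20,000 km-class pseudorange, a -4.2 m PRC
# issued 10 s ago, and an RRC of 0.05 m/s.
print(corrected_pseudorange(20_000_123.0, -4.2, 0.05, 10.0))  # 20000119.3
```

This is the per-satellite step; a receiver applies one such correction per tracked satellite before solving for position.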

First, a wearable device 10 for detecting an underwater object according to an embodiment of the present invention will be described. FIG. 1 is a perspective view illustrating a wearable device for underwater object detection according to an embodiment of the present invention. FIG. 2 is a block diagram illustrating a wearable device for underwater object detection according to an embodiment of the present invention. FIG. 3 is a view illustrating a method of projecting an image onto the glass of underwater goggles using the projection unit of FIG. 2.

Referring to FIGS. 1 to 3, a wearable device 10 for detecting an underwater object includes a helmet 100, a plurality of function modules 200, and underwater goggles 300. The user wears the underwater goggles 300 and the helmet 100 and can perform exploration activities in the water with the assistance of the plurality of function modules 200. The helmet 100 is worn on the head of the user. The underwater goggles 300 allow the user to keep the eyes open and observe underwater by preventing water from reaching the user's eyes. The underwater goggles 300 include a transparent glass 310 that provides the user with an underwater field of view, a waterproof frame that supports the glass at a predetermined distance from the user's eyes and seals around them, and a band that holds the underwater goggles 300 fixed to the user's head.

The plurality of function modules 200 are formed on the outside or inside of the helmet 100 or are embedded in the helmet 100. The plurality of function modules 200 include an ultrasonic unit 210, a GPS receiving unit 220, a communication unit 230, a sensor unit 240, a projection unit 250, an illumination unit 260, a storage unit 270, and a control unit 280. The plurality of function modules 200 further include a contact sensor unit 290.

The ultrasonic unit 210 operates in two modes. First, in the detection mode, the ultrasonic unit 210 can operate as an ultrasonic detector. That is, in the detection mode the ultrasonic unit 210 emits sound waves into the water, receives the echo reflected by an object or terrain in the water, and then transmits the received echo to the control unit 280. Thus, in the detection mode, the ultrasonic unit 210 can serve, for example, as a sonar detector. Next, in the communication mode, the ultrasonic unit 210 can operate as an ultrasonic communication apparatus. In the communication mode, the ultrasonic unit 210 may encode information or data onto ultrasonic waves for transmission, or may receive ultrasonic waves and extract information or data from them. As shown in the figure, the ultrasonic unit 210 may be installed on at least one side of the helmet 100, and a plurality of ultrasonic units 210 may be installed on the outer side of the helmet 100 so as to cover all directions simultaneously.

The GPS receiving unit 220 receives a GPS signal including position information from a GPS satellite. For example, the GPS receiving unit 220 can continuously receive position information through GPS signals received from GPS satellites. In particular, the GPS receiving unit 220 according to the embodiment of the present invention may receive the position correction information from the control unit 280, correct the position information using the received position correction information, and then transmit the corrected position information to the control unit 280. Alternatively, when the GPS receiving unit 220 receives position information, it transmits the position information to the control unit 280 so that the control unit 280 can correct it using the position correction information. Such location information may be coordinates such as latitude, longitude, and altitude.

The communication unit 230 transmits data to the outside of the wearable device 10 and receives data from the outside. The communication unit 230 may include a radio frequency (RF) transmitter for up-converting and amplifying the frequency of a transmitted signal, and an RF receiver for low-noise amplifying a received signal and down-converting its frequency. The communication unit 230 may communicate with a base station such as a BS, NodeB, or eNodeB, or through an AP (access point). In particular, the communication unit 230 may receive the DGPS signal from the reference station through the network, or may connect directly to a server (not shown) or reference station (not shown) that provides the DGPS signal and receive it. This DGPS signal includes the position correction information. When the DGPS signal is received, the communication unit 230 transmits it to the control unit 280.

The sensor unit 240 includes at least one sensor, through which it can sense movement of the helmet 100. Since the helmet 100 is worn on the head of the user, the direction of the user's gaze can be determined by sensing the movement of the helmet 100. In addition, by sensing movement of the helmet 100, the sensor unit 240 can sense the moving direction and moving distance of the helmet 100 from a specific reference position. Accordingly, the control unit 280 can calculate the current position based on the displacement from the specific reference position sensed through the sensor unit 240. The sensor unit 240 may include an acceleration sensor, a gyro sensor, a geomagnetic sensor, a digital compass, an altimeter, a depth meter, and the like. When the sensor unit 240 senses movement of the helmet 100, it provides the control unit 280 with information on that movement, for example, the moving distance and moving direction from the reference.

The projection unit 250 provides information on an underwater object according to an embodiment of the present invention as an image, which may include text. Such an image may include an image of the outline of the object in the water, the size of that outline, the distance between the object and the wearable device 10, and the similarity to the target object. As shown in FIG. 3, the projection unit 250 projects this image onto the glass 310 of the underwater goggles 300 under the control of the control unit 280. The user wearing the underwater goggles 300 sees the image projected by the projection unit 250 overlapping the view through the glass 310 of the underwater goggles 300.

The illumination unit 260 is installed on one side of the helmet 100 so that the illumination is directed toward the front. The illumination unit 260 may include a sensor for detecting the turbidity of the water, that is, the degree of cloudiness due to suspended substances, and may be turned on when the turbidity of the water is higher than a predetermined reference value.

The storage unit 270 stores programs and data necessary for the operation of the wearable device 10, and can be divided into a program area and a data area. The program area may store a program for controlling the overall operation of the wearable device 10 and an operating system (OS) for booting the wearable device 10, applications, and the like. The data area is an area in which user data generated according to use of the wearable device 10 is stored. Such data may include various kinds of information about an object according to an embodiment of the present invention. Each kind of data stored in the storage unit 270 can be deleted, changed, or added according to a user's operation.

The control unit 280 may control the overall operation of the wearable device 10 and the signal flow between its internal blocks, and may perform data processing functions. The control unit 280 may be a central processing unit (CPU), an application processor, or the like.

The control unit 280 obtains the position information through the GPS receiving unit 220 and obtains the position correction information through the communication unit 230. The control unit 280 can then obtain position information corrected with the position correction information. For convenience of explanation, position information corrected by the position correction information will be referred to as 'high-precision position information'. If high-precision position information cannot be obtained, the control unit 280 measures, through the sensor unit 240, the displacement of the wearable device 10 from the position indicated by the last acquired high-precision position information, and can thereby obtain the current position. The control unit 280 can continuously acquire information about an underwater object through the ultrasonic unit 210. This information includes the outline and size of the object, the distance from the wearable device 10 to the object, the similarity between the target object being searched for and the detected object, and the like. In addition, the control unit 280 generates an image containing the above-described information and controls the projection unit 250 to project the generated image onto the glass 310. The operation of the control unit 280 is described in more detail below.

FIG. 4 is a view illustrating an underwater glove 400 according to an embodiment of the present invention. Referring to FIG. 4, the underwater glove 400 includes, on its inner lining, a hand-movement sensor unit 410 that can detect the hand movements of a diver, and a contact portion 420. The hand-movement sensor unit 410 includes a plurality of sensors, such as geomagnetic sensors, gyro sensors, acceleration sensors, and speed sensors. The contact portion 420 sets the reference point from which the hand-movement sensor unit 410 detects displacement. The contact portion 420 may be included in the outer shell at the position corresponding to the fingertip of the underwater glove 400, and may be a magnetic body. The reference point may be the position of the contact sensor unit 290. When the user touches the contact portion 420 of the underwater glove 400 to the contact sensor unit 290, the contact sensor unit 290 senses the contact and reports it to the control unit 280. The control unit 280 then initializes the position of the underwater glove 400. Having sensed through the contact sensor unit 290 that the contact portion 420 is in contact with it, the control unit 280 of the wearable device 10 takes the position of the contact portion 420 of the underwater glove 400 to be the position of the contact sensor unit 290, and discriminates hand movements from the displacements received from the hand-movement sensor unit 410 relative to that reference point. The control unit 280 can also initialize the position of the helmet 100 for controlling the projected image when the contact portion 420 contacts the contact sensor unit 290.
Here, it should be noted that the position information of the wearable device 10 calculated based on the GPS signal and the position of the helmet 100 used for controlling the image are handled in different processes. A wire is included in the inner surface of the wetsuit, and the hand-movement sensor unit 410 and the control unit 280 are interconnected through this wire. The hand-movement sensor unit 410, with its plurality of sensors, measures the displacement of each part of the glove 400 from the reference point and transmits the measured displacements to the control unit 280. The control unit 280 can then derive the hand gesture from the displacement of each part of the glove 400 from the reference point.
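The glove-initialization scheme above (fix a reference point at the moment of contact, then interpret sensed displacements relative to it) can be sketched as follows. The class and method names are invented for illustration and are not from the patent.

```python
# Illustrative sketch of reference-point initialization for the glove
# (names invented; the patent describes the mechanism, not an API).
class GloveTracker:
    def __init__(self):
        self.reference = None  # unknown until the glove touches the contact sensor

    def initialize(self, contact_point):
        """Called when the glove's contact portion touches the helmet's
        contact sensor unit: that known position becomes the reference."""
        self.reference = contact_point

    def position(self, displacement):
        """Absolute fingertip position = reference point + sensed displacement."""
        if self.reference is None:
            raise RuntimeError("glove not initialized against the contact sensor")
        return tuple(r + d for r, d in zip(self.reference, displacement))


g = GloveTracker()
g.initialize((0.0, 0.0, 0.0))           # contact sensor taken as the origin
print(g.position((0.12, -0.05, 0.30)))  # fingertip 12 cm right, 5 cm down, 30 cm forward
```

The same pattern applies per sensor, so a gesture is a set of such positions evolving over time.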

The wearable device 10 continuously acquires current position information and detects underwater terrain or objects, in the sea, a river, or a lake, based on the acquired position information. First, a method of collecting position information of the wearable device 10 will be described. FIG. 5 is a flowchart illustrating a method of collecting location information of a wearable device according to an embodiment of the present invention.

The control unit 280 of the wearable device 10 starts the position information collection process in step S110 and repeats the collection steps until it determines in step S150 that position information collection is to be terminated, at which point the process ends in step S160.

The control unit 280 of the wearable device 10 can obtain position information in two ways: using GPS and using the sensors. This position information can be expressed as latitude, longitude, and altitude (or depth). First, the GPS method is as follows. The control unit 280 extracts the position information from the GPS signal received through the GPS receiving unit 220, and extracts the position correction information from the DGPS signal received through the communication unit 230. The control unit 280 can then obtain high-precision position information, that is, position information corrected using the position correction information. Second, the control unit 280 can measure, through the sensor unit 240, the displacement of the wearable device 10 from a reference position and thereby acquire the current position. Such a reference position can be the high-precision position information.
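The second method is dead reckoning: the current position is the last high-precision fix plus the accumulated displacement sensed while submerged. A minimal sketch, with an invented tuple representation:

```python
# Illustrative dead reckoning from the last high-precision GPS fix.
# Positions and displacements are (x, y, z) tuples in meters (an assumption;
# the patent expresses positions as latitude/longitude/altitude).
def current_position(last_fix, displacements):
    """Sum displacement vectors sensed by the sensor unit onto the
    last known high-precision position."""
    x, y, z = last_fix
    for dx, dy, dz in displacements:
        x, y, z = x + dx, y + dy, z + dz
    return (x, y, z)


# Two sensed movement steps after submerging from the fix (10, 5, 0).
print(current_position((10.0, 5.0, 0.0), [(1.0, 0.0, -2.0), (0.5, 0.5, -1.0)]))
# (11.5, 5.5, -3.0)
```

Since sensor drift accumulates, this estimate is only as good as the last fix plus the integration error, which is why the device prefers the GPS/DGPS path whenever signals are available.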

FIG. 5 assumes a situation in which the user, that is, a diver, wears the wearable device 10 and then submerges to find a specific object in the water. Since neither the GPS signal nor the DGPS signal can be received underwater, the control unit 280 of the wearable device 10 determines in step S120 whether both the GPS signal and the DGPS signal are received.

If it is determined in step S120 that both the GPS signal and the DGPS signal are received, the control unit 280 proceeds to step S130 and obtains the current position information through the high-precision position information as described above. On the other hand, if it is determined in step S120 that the GPS signal or the DGPS signal is not received, the control unit 280 proceeds to step S140 and obtains the current position information by sensing, through the sensor unit 240, the displacement, in distance and direction, of the wearable device 10 from the last acquired high-precision position information.

As described above, according to the embodiment of the present invention, the position information of the GPS system is corrected using the DGPS position correction information, yielding more precise, high-precision position information. In addition, precise position information can be obtained underwater from the sensed displacement relative to the high-precision position information.

Meanwhile, the wearable device 10 detects underwater terrain or objects, in the sea, a river, a lake, or the like, based on the acquired location information. The wearable device 10 continuously acquires position information through the method described above and detects terrain or objects in the water based on it. FIG. 6 is a view illustrating a method of detecting underwater terrain or an underwater object with a wearable device according to an embodiment of the present invention. FIG. 7 is a diagram illustrating a method of detecting an object using sound waves according to an embodiment of the present invention. FIG. 8 is a view illustrating an image obtained by processing object information according to an embodiment of the present invention.

Referring to FIG. 6, in step S210 the control unit 280 of the wearable device 10 receives information about the target object to be searched for in the water (hereinafter, 'target object information') from an arbitrary server or computer through the communication unit 230, and then stores the target object information in the storage unit 270 in step S220. The target object information includes basic information, describing the appearance of the target object in its basic posture, and deformation information, describing the appearance of the target object in changed postures. The basic information includes the measurable height, width, and the like of the target object in the basic posture. Whether the target is a living creature or an inanimate object, its posture can be changed underwater by buoyancy, and if the object has joints, the posture can change further at the joints. The deformation information predicts the postures into which the object may change, and includes the height, width, and the like of the contour measurable in each of the plurality of predicted changed postures. Although the target object information is described here as received and stored through the communication unit 230, it may also be input to the wearable device 10 through other means and stored in the storage unit 270. For example, the wearable device 10 may include an input device or an interface for direct communication, through which the target object information may be input and stored in the storage unit 270.

Referring to FIG. 7, the control unit 280 of the wearable device 10 emits a sound wave toward the object 30 through the ultrasonic unit 210 in step S230 and receives the echo produced when the emitted sound wave is reflected from the object 30. That is, under the control of the control unit 280, the ultrasonic unit 210 emits a sound wave, captures the reflected sound, and transmits the sensed echo to the control unit 280. Then, in step S240, the control unit 280 obtains object information, which specifies the position, shape, and size of the object 30, using the echo together with the current position information. Since the sound wave returns as an echo after being reflected from the surface of the object 30, the control unit 280 can calculate the distance from the position of the wearable device 10 to the surface of the object 30 based on the round-trip time (RTT) of the echo. As described above, since the current position of the wearable device 10 is continuously acquired and the distance to every reflecting surface can be found through the RTT, the control unit 280 can derive the position, shape, and size of the object 30.
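The ranging step can be sketched from the echo's round-trip time. This is a minimal illustration of the stated principle, assuming a nominal speed of sound in water of about 1500 m/s (the actual value varies with temperature, salinity, and depth); the patent describes the idea, not a formula.

```python
SPEED_OF_SOUND_WATER_M_S = 1500.0  # nominal; varies with conditions

def echo_distance_m(rtt_s: float) -> float:
    """Distance to the reflecting surface: the pulse travels out and back,
    so the one-way distance is half the round-trip path length."""
    return SPEED_OF_SOUND_WATER_M_S * rtt_s / 2.0

# A 20 ms round trip corresponds to roughly 15 m to the object's surface.
print(echo_distance_m(0.020))  # → 15.0
```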

In step S250, the control unit 280 may compare the target object information stored in step S220 with the detected object information to derive the similarity between the object 30 and the target object being searched for. Here, the similarity may be determined by comparing the object against the basic posture and each of the plurality of deformed postures, and taking the largest or the average of the resulting similarities.
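The comparison step above (score the object against the basic posture and every predicted deformed posture, then keep the largest or the average value) can be sketched as follows. The per-posture scoring function here is a hypothetical placeholder based on relative height/width error; the patent does not specify a metric.

```python
def dimension_similarity(detected, candidate):
    """Crude 0..1 score from relative height/width error (assumed metric).
    `detected` and `candidate` are (height_m, width_m) tuples."""
    dh = abs(detected[0] - candidate[0]) / max(candidate[0], 1e-9)
    dw = abs(detected[1] - candidate[1]) / max(candidate[1], 1e-9)
    return max(0.0, 1.0 - (dh + dw) / 2.0)

def overall_similarity(detected, basic, deformations, use_max=True):
    """Compare against basic + deformed postures; keep max or mean,
    as the text describes."""
    scores = [dimension_similarity(detected, c) for c in [basic] + deformations]
    return max(scores) if use_max else sum(scores) / len(scores)

score = overall_similarity((1.65, 0.55), (1.7, 0.5), [(1.2, 0.9), (0.9, 1.1)])
```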

In step S260, the control unit 280 processes the object information and the similarity into an image or text. For example, the control unit 280 may construct a wire image (ultrasound image) 41 representing the contour of the object 30 based on the object information. Such an example is shown in FIG. 8. As shown in the figure, the control unit 280 may construct the wire image (ultrasound image) 41 from the object information obtained from the reflections off the object 30. In addition, object information such as the similarity, the distance to the wearable device 10, and the height and width of the object 30 can be processed into text.

Detection using sound by the wearable device 10 as described above is performed continuously in the water, and the present invention provides the information of an acoustically detected object as an image to help the user in the water, that is, a diver. Such a method will now be described. FIG. 9 is a view for explaining a method of providing information about an object detected by a wearable device as an image according to an embodiment of the present invention. FIGS. 10A to 10C are views illustrating an image projected according to the line of sight according to an exemplary embodiment of the present invention. FIG. 11 is an example of a screen for explaining an image projected according to the line of sight according to another embodiment of the present invention.

Referring to FIG. 9, as described above, information about an object is continuously detected, and the processed image and text are stored. The control unit 280 senses the movement of the helmet 100 through the sensor unit 240 in step S310 and derives the user's gaze from the sensed movement of the helmet 100 in step S320. Then, in step S330, the control unit 280 projects an image including at least one of the processed image and text onto the glass 300 through the projection unit 250 according to the derived gaze.

For example, when projecting an image onto the glass 310, the control unit 280 senses the movement of the helmet 100 through the sensor unit 240, derives the user's gaze from that movement, and projects the wire image (ultrasound image) 41 through the projection unit 250 so that it appears to overlap the real object 30 seen along the gaze. Examples of such screens are shown in FIGS. 10A to 10C. FIG. 10A shows the projected wire image (ultrasound image) when the user's line of sight is to the left of the object 30. FIG. 10B shows the projected wire image (ultrasound image) when the user is gazing at the front of the object 30. FIG. 10C shows the projected wire image (ultrasound image) when the user's line of sight is to the right of the object 30. Visibility in water is not the same as on the ground, and in very turbid water it may be difficult for the user to see. Therefore, according to the embodiment of the present invention, the outline of the object is superimposed along the user's line of sight through the wire image (ultrasound image) 41 to help the user identify the object. This allows the user to search more smoothly.

In addition, the control unit 280 detects the motion of the helmet 100 through the sensor unit 240 and derives the user's gaze accordingly. If the gaze remains fixed on a specific object (for example, the object 30) for a predetermined time or more, the control unit 280 projects the wire image (ultrasound image) 41 through the projection unit 250 so that it appears superimposed on the object 30 seen along the user's gaze, and at the same time can project further information about the object. An example of such a screen is shown in FIG. 11. The control unit 280 sets a specific region 46 and senses the movement of the helmet 100 through the sensor unit 240; when the user's gaze stays on the object 30 seen through the specific region 46 for a predetermined time, it is judged that the user is staring at that object. Accordingly, the control unit 280 projects the wire image (ultrasound image) 41 so that it appears to overlap the real object 30 seen through the glass, and at the same time projects additional information. Such additional information includes size information 42 such as the height and width of the object 30, the similarity 43 between the object 30 and the target object, the distance 44 from the wearable device 10 to the object 30, and the position (latitude, longitude, depth (altitude)) 45 of the object 30. Since the user has to look at the real object through the glass, continuously displaying a large amount of text and imagery could clutter the user's view. Therefore, according to the embodiment of the present invention, the control unit 280 projects the additional information 42 to 45 about an object as an image only while the gaze is fixed on that object. Accordingly, the user can control the image through the line of sight alone, leaving both hands free for underwater exploration work.

As another example, when projecting an image onto the glass 310, the control unit 280 senses the moving direction of the helmet 100 and the direction of the user's gaze through the sensor unit 240. Here, the moving direction is the direction in which the helmet 100 has moved over a predetermined time, and the gaze direction is the direction in which the front of the helmet 100 is oriented. When the moving direction of the helmet 100 coincides with the gaze direction and the object 30 is gazed at for a predetermined time or longer, information about the object, including the similarity 43 and the distance 44, can be projected as text together with the wire image (ultrasound image). The user may look left and right while moving, for example, northward, and if the above-described information were projected at every glance, the user's view could become cluttered. Therefore, it is desirable to provide the additional information about an object only when the moving direction coincides with the gaze direction and the gaze rests on the specific object for a predetermined time or more.

As a further example, the control unit 280 senses the moving direction of the helmet 100 and the direction of the user's gaze through the sensor unit 240. Here, the moving direction is the direction in which the helmet 100 has moved over a predetermined time, and the gaze direction is the direction in which the front of the helmet 100 is oriented. In this case, even though the moving direction of the helmet 100 and the gaze direction do not coincide, when the object 30 is gazed at for a predetermined time or longer and the similarity between the object 30 and the target object is equal to or larger than a predetermined value, the information 42, 43, 44 about the object, including the similarity 43, can be projected as text together with the wire image (ultrasound image). As before, the user may look left and right while moving northward, and continuously projecting the above-described information could clutter the view. However, if the similarity between the observed object 30 and the target object being searched for is higher than the preset reference value, it is desirable to provide the additional information anyway. Therefore, even when the moving direction and the gaze direction do not coincide, additional information can be provided if the similarity is equal to or greater than the predetermined value.
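The projection policy of the last two paragraphs can be sketched as one predicate: additional text is shown when the moving and gaze directions agree and the gaze dwells long enough, or when they disagree but the object's similarity to the search target exceeds the preset reference value. The threshold constants are assumptions; the patent only says "predetermined".

```python
DWELL_S = 2.0        # assumed minimum gaze-fixation time (s)
SIM_THRESHOLD = 0.8  # assumed similarity reference value

def show_additional_info(move_dir, gaze_dir, dwell_s, similarity):
    """Decide whether to project the text info (42, 43, 44) for an object."""
    if dwell_s < DWELL_S:
        return False                     # gaze did not rest long enough
    if move_dir == gaze_dir:
        return True                      # aligned and dwelling: show info
    return similarity >= SIM_THRESHOLD   # misaligned, but a likely match

print(show_additional_info("north", "north", 3.0, 0.1))  # → True
print(show_additional_info("north", "east", 3.0, 0.95))  # → True
print(show_additional_info("north", "east", 3.0, 0.2))   # → False
```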

On the other hand, there is a limit to how many functions can be controlled by gazing alone. Therefore, according to the embodiment of the present invention, the wearable apparatus can also be controlled through hand gestures.

First, in order to control the wearable device 10 through hand gestures, initialization of the waterproof glove, i.e., a position initialization procedure, is required. The initialization of the waterproof glove 400 according to the embodiment of the present invention will now be described. FIG. 12 is a view for explaining the initialization of the waterproof glove according to an embodiment of the present invention.

According to the embodiment of the present invention, the hand movement sensor unit 410 of the waterproof glove 400 senses the user's hand motion. A hand motion is detected as the position and direction of each part of the hand, where each position and direction is measured from a single reference point. That is, the hand movement sensor unit 410 transmits the displacement of each part of the hand from the reference point to the control unit 280, and the control unit 280 can discriminate the hand motion from these displacements. This reference point is the position of the contact sensor unit 290. That is, when the user touches the contact portion 420 of the waterproof glove 400 to the contact sensor unit 290, the position of the waterproof glove 400 is initialized. The control unit 280 of the wearable device 10 detects the contact of the contact portion 420 with the contact sensor unit 290, takes the position of the contact portion 420 at that moment, i.e., the position of the contact sensor unit 290, as the reference point, and thereafter discriminates hand motions based on the displacements received from the hand movement sensor unit 410.
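The initialization step above can be sketched as follows: the touch event fixes the reference point, and every later hand-part position is resolved as a displacement from it. The vector representation and class names are illustrative assumptions.

```python
class HandTracker:
    """Minimal sketch of glove position initialization (hypothetical API)."""

    def __init__(self):
        self.reference = None  # unset until the contact event

    def initialize(self, contact_position):
        """Called when the contact portion touches the contact sensor unit:
        that position becomes the reference point."""
        self.reference = contact_position

    def absolute_position(self, displacement):
        """Resolve a displacement reported by the hand movement sensor unit
        into a position relative to the reference point."""
        if self.reference is None:
            raise RuntimeError("glove not initialized")
        return tuple(r + d for r, d in zip(self.reference, displacement))

tracker = HandTracker()
tracker.initialize((0.0, 0.0, 0.0))  # touch event at the contact sensor
print(tracker.absolute_position((0.1, -0.2, 0.05)))  # → (0.1, -0.2, 0.05)
```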

A method of controlling a wearable device according to an embodiment of the present invention will now be described. FIG. 13 is a flowchart for explaining a method of controlling a wearable device according to an embodiment of the present invention, and FIGS. 14 to 18 are views for explaining the method. The embodiments of FIGS. 13 to 18 assume the initialized state described with reference to FIG. 12.

Referring to FIG. 13, in step S410 the control unit 280 detects whether the position of the waterproof glove 400 is within the control range 60, which is a preset range. FIGS. 14 and 15 show examples of the control range 60. The control range 60 is preset to determine whether the position of the waterproof glove 400 is within the user's line of sight, and its position changes according to the user's gaze. Since the helmet 100 is worn on the user's head, the direction of the user's gaze can be specified by sensing the movement of the helmet 100. The control unit 280 can sense the displacement of the helmet 100 from the reference point through the sensor unit 240, and the control range 60 shifts as the direction of the line of sight changes. When the position of the waterproof glove 400 is within the preset control range 60, the control unit 280 detects the hand motion within the control range 60 through the hand movement sensor unit 410 in step S420. Then, in step S430, the control unit 280 can control the image projected on the glass 310 through the projection unit 250 according to the hand motion.

In summary, as described above, the control range 60 moves as the direction of the user's line of sight changes, and a hand gesture controls the projected image only when it is sensed within the control range 60. That is, the control unit 280 can derive the control range 60 from the reference point, i.e., from the displacement of the helmet obtained through the sensor unit 240 relative to the position at which the contact portion 420 touched the contact sensor unit 290. Likewise, the control unit 280 can obtain the hand gesture (the position of each part of the glove or hand relative to the reference point) through the hand movement sensor unit 410. Accordingly, the control unit 280 can determine that the glove 400 is within the control range 60, obtain the hand motion made within it, and control the image projected on the glass 310 through the projection unit 250 according to that motion.

For example, as shown in FIG. 16, the user can spread all of his or her fingers in front of the gaze and hold that state for a predetermined time or more. The control unit 280 then detects this hand gesture within the control range 60 through the hand movement sensor unit 410. As shown in FIG. 17, the control unit 280 can then project a plurality of menus 61 onto the glass 310 through the projection unit 250. Here, the hand gesture is the state in which all fingers are unfolded, and the operation corresponding to it is exemplified by the plurality of menus 61. As further examples, a scissors gesture may display an underwater map of the surrounding area, and a fist gesture may project the wire image (ultrasound image) described above. That is, when the control unit 280 detects a hand gesture within the control range 60 derived from the reference point, it may project onto the glass the image mapped in advance to the sensed gesture.
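The gesture-to-image mapping just described (open hand → menus, scissors → local underwater map, fist → wire image) amounts to a lookup gated on the control range. Gesture recognition itself is out of scope here, and the names are illustrative assumptions.

```python
# Assumed mapping from recognized gesture to projection action,
# following the examples in the text.
GESTURE_ACTIONS = {
    "open_hand": "project_menus",      # all fingers unfolded
    "scissors":  "project_local_map",  # underwater map of surroundings
    "fist":      "project_wire_image", # the ultrasound contour image
}

def action_for(gesture, in_control_range):
    """Gestures control the projection only when sensed inside the
    control range derived from the reference point."""
    if not in_control_range:
        return None
    return GESTURE_ACTIONS.get(gesture)

print(action_for("open_hand", True))  # → project_menus
print(action_for("fist", False))      # → None
```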

As shown in FIG. 18, suppose the control unit 280 has projected a plurality of menus 61 onto the glass 310 through the projection unit 250 and the user points at one of the menus 61. The control unit 280 can then detect that the hand motion is within the control range 60 and determine which of the plurality of menus 61 is selected. According to this determination, the control unit 280 can project an image corresponding to the selected menu onto the glass 310 through the projection unit 250.

The area occupied by each of the plurality of menus 61 (e.g., menus 1 to 4 in FIG. 18) can also be derived relative to the reference point. That is, the control unit 280 knows the area each projected menu occupies relative to the reference point, since it controls the projection unit 250 that projects them.

Accordingly, when the control unit 280 detects a hand motion in the control range 60 and detects that the fingertip lies within the area occupied by one of the plurality of menus 61, it can determine that the corresponding menu is selected and control the projection unit 250 to project an image according to that menu.
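The selection step above is a hit-test: each projected menu occupies a rectangle expressed in reference-point coordinates, and a menu is selected when the fingertip position (also referenced to that point) falls inside it. The coordinate layout below is a hypothetical sketch.

```python
def menu_hit(fingertip, menus):
    """Return the name of the menu whose rectangle contains the fingertip.
    menus: {name: (xmin, ymin, xmax, ymax)} in reference-point coordinates."""
    x, y = fingertip
    for name, (x0, y0, x1, y1) in menus.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None  # fingertip is outside every menu area

# Four menus arranged in a 2x2 grid, as in FIG. 18 (layout assumed).
menus = {"menu1": (0, 0, 1, 1), "menu2": (1, 0, 2, 1),
         "menu3": (0, 1, 1, 2), "menu4": (1, 1, 2, 2)}
print(menu_hit((1.5, 0.5), menus))  # → menu2
```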

According to the embodiment of the present invention, wearable devices 10 can perform mutual ultrasonic communication in the water. A communication method between wearable devices 10 will now be described. FIG. 19 is a flowchart for explaining a communication method between wearable devices according to an embodiment of the present invention. FIG. 20 is a diagram for explaining a communication method between wearable devices according to an embodiment of the present invention.

Referring to FIGS. 19 and 20, there are a first wearable device 11 and a second wearable device 12. The first wearable device 11 is worn by a first user and the second wearable device 12 by a second user. The embodiment of FIG. 19 assumes a situation in which the first user tries to transmit an ultrasound image he or she has detected to the second user.

Referring to FIG. 19, the first user may perform a hand gesture to start communication in order to transmit data to the second user. This hand gesture may be the first user pointing at the second user by hand, as shown in FIG. 20. That is, the gesture for communication can be a predetermined finger pointing in one direction (for example, toward the second user) for a predetermined time or more. In step S510, the control unit 280 of the first wearable device 11 detects this gesture through the hand movement sensor unit 410 of the waterproof glove 400. The gesture for communication is independent of the control range 60 described above, but is still determined relative to the reference point set at the initialization described with reference to FIG. 12. The control unit 280 of the first wearable device 11 switches from the detection mode to the communication mode in step S520. Then, in step S530, it emits a connection request signal in the direction of the finger. That is, the connection request signal is ultrasonic waves of a predetermined frequency radiated continuously in the direction of the finger.

When ultrasonic waves of the predetermined frequency are received continuously for a predetermined time or more, the control unit 280 of the second wearable device 12 determines that there is a connection request and switches the ultrasonic unit 210 to the communication mode in step S540. Then, in step S550, the control unit 280 of the second wearable device 12 transmits a connection response signal to the first wearable device 11. Here, the connection response signal is ultrasonic waves of a predetermined frequency radiated for a predetermined time or longer in the direction from which the connection request signal was received. That is, the control unit 280 of the second wearable device 12 emits ultrasonic waves of the predetermined frequency through the ultrasonic unit 210, toward the source of the connection request signal, over the predetermined period.

The control unit 280 of the first wearable device 11, having received the connection response signal, transmits the data through ultrasonic waves in step S560. Then, the control unit 280 of the second wearable device 12 can extract the data from the received ultrasonic waves in step S570. For example, the data may be an ultrasound image.

When the data transmission is completed, the control unit 280 of the first wearable device 11 transmits a connection termination signal to terminate the ultrasonic communication connection in step S580. The connection termination signal may be ultrasonic waves of a predetermined frequency radiated continuously toward the second wearable device 12.

When ultrasonic waves of the predetermined frequency are received continuously for a predetermined time or more, the control unit 280 of the second wearable device 12 determines that they constitute the connection termination signal and transmits a termination response signal to the first wearable device 11 in step S590. Here, the termination response signal is ultrasonic waves of a predetermined frequency radiated toward the first wearable device 11 over a predetermined period. That is, the control unit 280 of the second wearable device 12 emits ultrasonic waves of the predetermined frequency through the ultrasonic unit 210, toward the source of the connection termination signal, for the predetermined time or more.

Having transmitted the termination response signal, the control unit 280 of the second wearable device 12 switches its mode from the communication mode to the detection mode in step S600. Likewise, upon receiving the termination response signal, the control unit 280 of the first wearable device 11 switches from the communication mode to the detection mode in step S610.
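The exchange in steps S510 to S610 can be sketched as a small state machine on the responding device: a connection request moves it from detection to communication mode, data is accepted while connected, and a termination signal returns it to detection mode. Signal names and the message-passing model are illustrative; the patent specifies continuous tones of predetermined frequency and duration, not a concrete encoding.

```python
class UltrasonicSession:
    """Sketch of the second wearable device's side of the protocol."""

    def __init__(self):
        self.mode = "detection"  # devices start in detection mode

    def handle(self, signal):
        """React to a decoded ultrasonic signal; return the response, if any."""
        if self.mode == "detection" and signal == "CONNECT_REQUEST":
            self.mode = "communication"     # step S540
            return "CONNECT_RESPONSE"       # step S550
        if self.mode == "communication" and signal == "DATA":
            return "DATA_RECEIVED"          # step S570: e.g. an ultrasound image
        if self.mode == "communication" and signal == "END_REQUEST":
            self.mode = "detection"         # step S600
            return "END_RESPONSE"           # step S590
        return None                         # ignore out-of-sequence signals

peer = UltrasonicSession()
print(peer.handle("CONNECT_REQUEST"))  # → CONNECT_RESPONSE
print(peer.handle("DATA"))             # → DATA_RECEIVED
print(peer.handle("END_REQUEST"))      # → END_RESPONSE
print(peer.mode)                       # → detection
```

Note the variants described below (omitting the connection response or the termination response) simply drop the return values without changing the mode transitions.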

According to the embodiment of FIG. 19, when the first wearable device 11 transmits the connection request signal in steps S530 to S550, the second wearable device 12 switches to the communication mode and responds with the connection response signal in step S550. However, according to another embodiment of the present invention, the connection response signal of step S550 may be omitted. In this case, the first wearable device 11 transmits the connection request signal for a predetermined time and then transmits the data as in step S560.

In addition, according to the embodiment of FIG. 19, the second wearable device 12 receives the connection termination signal in step S580 and then transmits the termination response signal to the first wearable device 11 in step S590. However, according to another embodiment of the present invention, the termination response signal of step S590 may be omitted. In this case, the first wearable device 11 terminates the communication and switches to the detection mode after transmitting the connection termination signal, and the second wearable device 12 switches to the detection mode after receiving the connection termination signal.

According to the underwater communication method of the present invention described above, users in the water, i.e., divers, can communicate through simple hand gestures and transmit to each other the various pieces of information obtained through ultrasonic detection.

The method according to the present invention may be implemented as software readable by various computer means and recorded on a computer-readable recording medium. Here, the recording medium may include program commands, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the present invention or those known and available to those skilled in the art of computer software. Examples of the recording medium include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as compact disc read-only memory (CD-ROM) and digital versatile discs (DVD); magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, random access memory (RAM), flash memory, solid state disks (SSD), and hard disk drives (HDD). Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that may be executed by a computer using an interpreter. Such hardware devices may be configured to operate as one or more software modules to perform the operations of the present invention, and vice versa.

While the present invention has been described with reference to several preferred embodiments, these embodiments are illustrative and not restrictive. It will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit of the invention and the scope of the appended claims.

10: wearable device 100: helmet
200: Function module 210: Ultrasonic part
220: GPS receiving unit 230: communication unit
240: Sensor part 250: Projection part
260: illumination unit 270: storage unit
280: control unit 290: contact sensor unit
300: Underwater goggles 400: Underwater gloves
410: hand motion sensor part 420: contact part

Claims (6)

1. A wearable device for underwater communication between divers submerged in water, the wearable device comprising:
a helmet which can be worn on the user's head;
an ultrasonic unit attached to one side of the helmet to emit ultrasonic waves to the outside and receive external ultrasonic waves;
a contact sensor unit formed on the outside of the helmet;
a contact portion formed inside the outer shell of a waterproof glove;
a hand movement sensor unit mounted on the inner skin of the waterproof glove, which transmits displacements from a reference point, the reference point being the position at which the contact portion touched the contact sensor unit; and
a control unit which, upon sensing through the displacements transmitted from the hand movement sensor unit that a predetermined finger has pointed in one direction in the water for a predetermined time or more, switches the ultrasonic unit to a communication mode and controls the ultrasonic unit to radiate ultrasonic waves of a predetermined frequency in the pointed direction for a predetermined time or more as a connection request signal, and which, upon receiving through the ultrasonic unit, as a connection response signal corresponding to the connection request signal, ultrasonic waves of a predetermined frequency from the pointed direction for a predetermined time or more, controls the ultrasonic unit to transmit ultrasonic waves containing data in the pointed direction.
2. The wearable device according to claim 1, wherein the control unit controls the ultrasonic unit to transmit a connection termination signal by radiating ultrasonic waves of a predetermined frequency in the direction pointed by the finger for a predetermined time or more, and switches the ultrasonic unit to a detection mode when ultrasonic waves of a predetermined frequency are received for a predetermined time or more as a termination response signal corresponding to the connection termination signal.
delete
delete
5. A communication method of a wearable device comprising a helmet and a waterproof glove, for underwater communication between divers submerged in water, the method comprising:
sensing, in the water, the position at which a contact portion formed inside the outer shell of the waterproof glove touches a contact sensor unit formed on the outside of the helmet as a reference point, and thereafter sensing displacements of each part of the hand from the reference point;
switching to a communication mode upon detecting, through the displacements from the reference point, that a predetermined finger points in one direction for a predetermined time or more;
transmitting a connection request signal by radiating ultrasonic waves of a predetermined frequency for a predetermined time or more in the direction pointed by the finger; and
transmitting ultrasonic waves containing data in the direction pointed by the finger upon receiving, as a connection response signal corresponding to the connection request signal, ultrasonic waves of a predetermined frequency from the pointed direction for a predetermined time or more.
delete
KR1020150166762A 2015-11-26 2015-11-26 A wearable apparatus for detecting an object under water and a method for communicating thereof KR101788105B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150166762A KR101788105B1 (en) 2015-11-26 2015-11-26 A wearable apparatus for detecting an object under water and a method for communicating thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150166762A KR101788105B1 (en) 2015-11-26 2015-11-26 A wearable apparatus for detecting an object under water and a method for communicating thereof

Publications (2)

Publication Number Publication Date
KR20170061549A KR20170061549A (en) 2017-06-05
KR101788105B1 (en) 2017-10-19

Family

ID=59223138

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150166762A KR101788105B1 (en) 2015-11-26 2015-11-26 A wearable apparatus for detecting an object under water and a method for communicating thereof

Country Status (1)

Country Link
KR (1) KR101788105B1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3080088U (en) 2001-03-07 2001-09-14 株式会社ケーカンパニー Dive helmet
JP2005246578A (en) * 2004-03-08 2005-09-15 Mitsui Eng & Shipbuild Co Ltd Underwater robot steering method and underwater robot steering system
KR101050709B1 (en) 2010-11-03 2011-07-20 (주)텔레콤랜드 Smart helmet for diver
KR101079343B1 (en) 2010-11-03 2011-11-04 (주)티엘씨테크놀로지 Smart monitorring control system
KR101282669B1 (en) 2012-11-12 2013-07-12 (주)티엘씨테크놀로지 Smart ware for preventing an accident on workshop
KR101356605B1 (en) * 2012-08-21 2014-02-04 강릉원주대학교산학협력단 System for exploring of underwater

Also Published As

Publication number Publication date
KR20170061549A (en) 2017-06-05

Similar Documents

Publication Publication Date Title
AU2022263451B2 (en) Systems and methods for controlling operations of marine vessels
US10107907B2 (en) Bobber field acoustic detection system
JP6297768B2 (en) Satellite signal multipath mitigation in GNSS devices
US10114470B2 (en) Using motion sensing for controlling a display
US20190219693A1 (en) 3-D US Volume From 2-D Images From Freehand Rotation and/or Translation of Ultrasound Probe
US20160245915A1 (en) Forward and Rear Scanning Sonar
CN110727282B (en) AUV docking method and device and underwater docking system
JP2014115250A (en) Underwater searching apparatus, and target indicating method
US10473781B2 (en) Determining a boundary enclosing a region of interest for a body of water
KR102654617B1 (en) Method for receiving satellite signal adjusting resonant frequency according to external medium of electronic device and electronic device for supporting the same
US20190293747A1 (en) Base station for marine display
KR101643195B1 (en) Helmet apparatus for detecting object under water and method thereof
US9972110B2 (en) Thermocline display
KR101768972B1 (en) A wearable apparatus for detecting an object under water and a method for controlling thereof
KR101788105B1 (en) A wearable apparatus for detecting an object under water and a method for communicating thereof
US9832609B2 (en) Sensor remote control system, remote control device, sensor device and method of remotely controlling sensor
US20190209130A1 (en) Real-Time Sagittal Plane Navigation in Ultrasound Imaging
JP2009098126A (en) Automatic tracking scanning sonar
JP6673699B2 (en) Terrain display system
JP7275472B2 (en) Velocity measurement system
Viswanathan et al. Blind navigation proposal using SONAR
KR101580956B1 (en) Sonar image emulator and method for sonar image forecast using the same
JP5547889B2 (en) Scanning sonar device and tracking method
US20240212293A1 (en) Information processing apparatus, information processing method, and program
US20240149992A1 (en) Navigational information displaying device, navigational information displaying method, and a non-transitory computer-readable medium

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
AMND Amendment
X701 Decision to grant (after re-examination)