WO2019240070A1 - Action verification system - Google Patents

Action verification system

Info

Publication number
WO2019240070A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
terminal
notification
information
check system
Prior art date
Application number
PCT/JP2019/022897
Other languages
English (en)
Japanese (ja)
Inventor
Daisuke Kiryu
Original Assignee
Honda Motor Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honda Motor Co., Ltd.
Priority to JP2020525549A (patent JP7523347B2)
Publication of WO2019240070A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 21/00: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00
    • G01C 21/26: Navigation; Navigational instruments not provided for in groups G01C 1/00 - G01C 19/00 specially adapted for navigation in a road network
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01: Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium
    • G08B 25/04: Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium using a single signalling line, e.g. in a closed loop
    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 1/00: Traffic control systems for road vehicles
    • G08G 1/005: Traffic control systems for road vehicles including pedestrian guidance indicator
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 11/00: Telephonic communication systems specially adapted for combination with other electrical systems

Definitions

  • The present invention relates to an action verification system.
  • This application claims priority based on Japanese Patent Application No. 2018-111419 filed in Japan on June 11, 2018 and Japanese Patent Application No. 2018-146316 filed in Japan on August 2, 2018, the contents of which are incorporated herein.
  • In a known system, a camera attached to a carrying bag captures an image of the area in front of a child, and the image obtained by the imaging is transmitted via a communication network to a terminal device operated by the child's guardian or the like. In addition, the voice of the guardian or the like, collected by the microphone of the terminal device, is transmitted via the communication network and output from a speaker on the back of the bag.
  • With a camera attached to the carrying bag, however, it is difficult to confirm the child's movement at dangerous locations, such as when crossing an intersection or a road.
  • The present invention has been made in view of the above problems, and an object of the present invention is to provide an action verification system capable of confirming a predetermined motion of a mobile person.
  • An action verification system includes: a motion acquisition unit that is provided in a terminal carried by a mobile person and acquires the mobile person's motion; and a motion determination unit that is provided in the terminal or in a server communicable with the terminal and determines, based on the motion acquired by the motion acquisition unit, that a predetermined motion has been performed.
  • In the above system, the motion acquisition unit may acquire the forward movement of the mobile person, and the motion determination unit may determine, as the predetermined motion, that the mobile person has stopped moving forward.
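Purely as an illustration, and not part of the patent, the stop determination described above could be sketched in Python as follows. The function name, speed threshold, and sample count are assumptions chosen for the sketch:

```python
def detect_stop(speeds, threshold=0.3, min_samples=3):
    """Determine that forward movement has stopped: after the carrier
    has been observed moving, the speed (m/s) stays below `threshold`
    for at least `min_samples` consecutive readings."""
    moving = False
    still = 0
    for v in speeds:
        if v > threshold:
            moving = True   # forward movement observed
            still = 0
        elif moving:
            still += 1
            if still >= min_samples:
                return True  # the predetermined motion (stopping) was performed
    return False
```

A real implementation would feed this with speeds derived from the position or acceleration sensors described later in the embodiment.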
  • The motion acquisition unit may include a position acquisition unit that acquires the current position of the terminal, and the motion determination unit may make the determination based on the current position acquired by the position acquisition unit.
  • The position acquisition unit may perform positioning using electromagnetic waves received from outside the terminal.
  • The position acquisition unit may instead perform positioning by matching detection results of surrounding objects, obtained via electromagnetic waves, against stored map information.
  • The motion acquisition unit may include a speed acquisition unit that acquires the moving speed of the terminal, and the motion determination unit may make the determination based on the moving speed acquired by the speed acquisition unit.
  • The speed acquisition unit may be a detection device that is provided in the terminal and detects a physical quantity.
  • The motion acquisition unit may acquire the motion of the mobile person's head, and the motion determination unit may determine, as the predetermined motion, that the head has been swung to the left and right with respect to the mobile person's direction of movement.
  • The motion acquisition unit may acquire the orientation of the mobile person's face, and the motion determination unit may determine, as the predetermined motion, that the face has been turned to the left and right with respect to the direction of movement.
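As an illustrative sketch only (not from the patent), the left-and-right head or face check could be determined from a sequence of yaw angles; the function name and the 30-degree threshold are assumptions:

```python
def checked_both_sides(yaw_degrees, min_turn=30.0):
    """yaw_degrees: head (or face) yaw relative to the direction of
    travel, negative to the left and positive to the right. The
    left-and-right confirmation counts as performed when the head
    swung at least `min_turn` degrees to each side."""
    looked_left = any(y <= -min_turn for y in yaw_degrees)
    looked_right = any(y >= min_turn for y in yaw_degrees)
    return looked_left and looked_right
```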
  • The motion acquisition unit may acquire at least one of the position and movement of the mobile person's hand, and the motion determination unit may determine that the hand is located in a predetermined area, namely an area higher than the mobile person's shoulder.
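A minimal sketch of the hand-above-shoulder test, again not part of the patent; the coordinate convention and margin are assumptions:

```python
def hand_raised(hand_y, shoulder_y, margin=0.05):
    """Image coordinates with y increasing downward (a common camera
    convention): the hand lies in the predetermined area when it is at
    least `margin` above the shoulder."""
    return hand_y < shoulder_y - margin
```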
  • The motion acquisition unit may include a remote detection unit that remotely detects at least one of the position and movement of an object outside the terminal.
  • The remote detection unit may include a reception unit that receives electromagnetic waves from the object, and the terminal may be arranged so that a hand located in the predetermined area falls within the detection range of the reception unit.
  • The terminal may further be arranged so that the head of the mobile person is also within the detection range of the reception unit.
  • The terminal or the server may include a storage unit that stores a set position, which is a geographical position set in advance as a position where the predetermined motion should be performed.
  • The set position may be set to a specific position, on a road along which the mobile person passes, where there is a possibility of danger to the mobile person.
  • The road may include an intersection where roads cross, and the set position may be set at the intersection.
  • The road may include a roadway for vehicles and a sidewalk partitioned from it, and the set position may be set on the sidewalk at the intersection.
  • The set position may be set on a route between the movement start point and the movement end point of the mobile person, based on input information related to those points.
  • The terminal or the server may include a position setting unit that sets the set position on the route based on the movement start point and the movement end point.
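One way such a position setting unit could pick set positions is to select the intersections lying near the route. This is an illustrative sketch under assumed planar coordinates, not the patent's method:

```python
def set_positions_on_route(route, intersections, tolerance=10.0):
    """route: (x, y) points from the movement start point to the end
    point; intersections: candidate intersection centres. Returns the
    intersections lying within `tolerance` metres of any route point,
    i.e. the automatically chosen set positions."""
    def near(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5 <= tolerance
    return [c for c in intersections if any(near(c, p) for p in route)]
```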
  • The set position may be set based on input information related to position designation entered on the terminal or on another terminal different from the terminal.
  • The input information may indicate a position on a map including a road region and a non-road region, and a position correction unit may be provided that corrects the input information when it is located in the non-road region.
  • The road region may include a roadway region and a sidewalk region, and when the input information is located in the non-road region or the roadway region, the position correction unit may correct the input information to the adjacent sidewalk region.
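The sidewalk correction could be sketched over a labelled grid of map cells. The grid representation and labels are assumptions for illustration, not the patent's data model:

```python
def correct_to_sidewalk(point, regions):
    """regions: dict mapping (x, y) grid cells to 'sidewalk', 'roadway'
    or 'non_road'. A designated position that is not on a sidewalk is
    corrected to the nearest sidewalk cell; a sidewalk position is
    returned unchanged."""
    if regions.get(point) == 'sidewalk':
        return point
    sidewalks = [c for c, kind in regions.items() if kind == 'sidewalk']
    return min(sidewalks,
               key=lambda c: (c[0] - point[0]) ** 2 + (c[1] - point[1]) ** 2)
```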
  • The set position, a geographical position set in advance so that the predetermined motion is performed there by the mobile person, may be a first set position, with a second set position provided that differs from the first set position.
  • The terminal or the server may include an evaluation unit that compositely evaluates the movement of the mobile person based on a first determination result made by the motion determination unit at the first set position and a second determination result made by the motion determination unit at the second set position.
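A composite evaluation over several set positions could be as simple as the fraction of positions at which the motion was determined. This scoring rule is an assumption for illustration only:

```python
def composite_score(results):
    """results: mapping of set-position id to the determination result
    (True when the predetermined motion was performed there). Returns
    the fraction of set positions passed, as a score from 0 to 100."""
    if not results:
        return 0.0
    return 100.0 * sum(results.values()) / len(results)
```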
  • The terminal or the server may include an output unit that outputs the evaluation to the outside of the terminal, or to another terminal provided so as to be communicable with the terminal or the server.
  • The output unit may output the evaluation outside the terminal, but not in the vicinity of the set position.
  • The output unit may output the evaluation to the other terminal, outputting it sequentially at each set position.
  • The motion determination unit may be provided in the terminal, and the evaluation unit may be provided in the server.
  • The terminal or the server may include a storage unit that stores a set position, which is a preset geographical position, and a notification unit that notifies of the terminal's approach to the set position, starting from a remote position a predetermined distance away from the set position.
  • The notification unit may increase the notification amount as the terminal approaches the set position from the remote position.
  • The notification unit may decrease the notification amount as the terminal moves away from the set position after the notification has started.
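The distance-dependent notification amount above can be sketched as a linear ramp. The 50 m start distance and 10-level scale are assumed values, not from the patent:

```python
def notification_level(distance_m, start_m=50.0, max_level=10):
    """Notification amount that grows linearly from 0 at `start_m`
    metres out to `max_level` at the set position itself, and shrinks
    again by the same rule as the terminal moves away."""
    if distance_m >= start_m:
        return 0
    return round(max_level * (1.0 - distance_m / start_m))
```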
  • The notification unit may stop the notification when, after the notification has started, the motion determination unit determines that the predetermined motion has been performed.
  • The notification unit may stop the notification when, after the notification has started, the terminal has moved a predetermined distance away from the set position.
  • The terminal or the server may include a position acquisition unit that acquires the current position of the terminal, and a position determination unit that determines, based on the current position, that the terminal is located within a notification area closer to the set position than the remote position is.
  • The position determination unit may determine that the terminal is located in the notification area based on the straight-line distance between the current position and the set position.
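For geographic coordinates, the straight-line distance test is commonly done with the haversine great-circle formula. A self-contained sketch (radius and threshold are assumptions):

```python
import math

def in_notification_area(current, set_pos, radius_m=50.0):
    """current, set_pos: (latitude, longitude) in degrees. Computes the
    haversine great-circle distance and reports whether the terminal
    lies inside the notification area around the set position."""
    r = 6371000.0  # mean Earth radius in metres
    lat1, lon1, lat2, lon2 = map(math.radians, (*current, *set_pos))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```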
  • The terminal or the server may include a speed acquisition unit that acquires the moving speed of the terminal, and a notification correction unit that corrects the notification based on the moving speed.
  • The notification correction unit may correct the notification to start earlier when the moving speed is high than when it is low.
  • The notification correction unit may correct the notification amount so that it increases as the moving speed increases.
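One plausible speed correction, sketched here under assumed constants (the lead time and floor distance are not from the patent), is to start notifying a fixed number of seconds before predicted arrival:

```python
def notification_start_distance(speed_mps, lead_time_s=10.0, minimum_m=20.0):
    """Begin notifying `lead_time_s` seconds before predicted arrival
    at the set position, so a faster terminal is notified from farther
    away, but never start closer than `minimum_m` metres."""
    return max(speed_mps * lead_time_s, minimum_m)
```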
  • The terminal or the server may include a speed acquisition unit that acquires the moving speed of the terminal, and a position determination unit that determines, based on the moving speed, that the remote position has been reached.
  • The position determination unit may determine that the remote position has been reached based on the arrival time to the set position calculated from the moving speed.
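The arrival-time variant can be sketched directly; the 30-second threshold is an assumed value for illustration:

```python
def reached_remote_position(distance_m, speed_mps, arrival_threshold_s=30.0):
    """Treat the remote position as reached when the predicted arrival
    time at the set position (distance / speed) drops to the threshold
    or below; a stationary terminal never reaches it."""
    if speed_mps <= 0:
        return False
    return distance_m / speed_mps <= arrival_threshold_s
```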
  • The mobile person may be a pedestrian.
  • The pedestrian may be a child.
  • The terminal may be arranged in an item carried by the mobile person.
  • The carried item may be a bag worn on the back.
  • The bag worn on the back may be a school backpack (randoseru).
  • With the motion acquisition unit and the motion determination unit, the result can be easily used for evaluating whether or not the predetermined motion has been performed.
  • The results of motion acquisition and determination can be easily used for evaluating whether or not stopping of forward movement has been performed as the predetermined motion.
  • The accuracy of the terminal's current position can be improved by positioning based on electromagnetic waves received from, for example, satellites, base stations, and various facilities.
  • The current position of the terminal can be acquired by estimation that combines map information with sensing by a range sensor, LiDAR, a camera, and the like.
  • the moving speed of the terminal can be detected stably and directly with high accuracy, for example, without depending on information received from the outside.
  • the determination result of the left and right movement of the head can be easily used for the operation evaluation of whether or not the left and right confirmation operation or the like has been performed as the predetermined operation.
  • the determination result of the left and right face movements can be easily used for the operation evaluation of whether or not the left / right confirmation operation has been performed as the predetermined operation.
  • the determination result of the position and movement of the hand can be easily used for operation evaluation or the like as to whether or not a hand raising operation or the like has been performed as a predetermined operation.
  • the results of motion acquisition and motion determination can be easily utilized for motion evaluation or the like of whether or not a predetermined motion corresponding to the set position has been performed.
  • the results of motion acquisition and motion determination can be easily used for motion evaluation or the like as to whether or not a predetermined motion corresponding to a set position with a risk on a traffic road has been performed.
  • The results of motion acquisition and motion determination can be easily used for evaluating whether or not a predetermined motion corresponding to a set position with a traffic risk on the road has been performed.
  • the results of motion acquisition and motion determination can be easily used for motion evaluation or the like as to whether or not a predetermined motion corresponding to the set position on the sidewalk has been performed.
  • the results of motion acquisition and motion determination can be easily used for motion evaluation or the like as to whether or not a predetermined motion corresponding to the set position on the movement route has been performed.
  • setting of the setting position by the user or the like can be easily supported.
  • direct designation of the setting position by the user or the like can be easily supported.
  • the results of motion acquisition and motion determination can be easily utilized for motion evaluation or the like of whether or not a predetermined motion corresponding to a set position on the road has been performed.
  • the results of motion acquisition and motion determination can be easily used for motion evaluation or the like of whether or not a predetermined motion corresponding to the set position on the sidewalk has been performed.
  • the movement of the moving person can be evaluated in a complex and accurate manner.
  • confirmation of operation evaluation by the user or the like can be easily supported.
  • confirmation of real-time operation evaluation at another terminal by a user or the like can be easily supported.
  • the evaluation accuracy can be improved by performing the behavioral evaluation on the server.
  • the movement of the pedestrian can be acquired and determined.
  • In the case of (41) above, the behavior of a child can be acquired and determined.
  • In the case of (42) above, the mobile person can be prevented from forgetting to carry the terminal.
  • the accuracy of the acquisition and determination of the operation can be improved by arranging the terminal around the shoulder of the moving person.
  • By arranging the terminal anywhere from the shoulder to the back of the mobile person, the terminal can be kept from interfering with the mobile person's various motions, and the accuracy of motion acquisition and determination can be improved.
  • FIG. 1 is a diagram illustrating a configuration example of an information communication system 1 according to the present embodiment.
  • the information communication system 1 includes a first mobile terminal 2 (first terminal), a second mobile terminal 3 (second terminal), a data center 4 (center), and a server 5.
  • the first mobile terminal 2, the second mobile terminal 3, the data center 4, and the server 5 are connected to each other via a network 6.
  • the network 6 is a mobile communication network, for example.
  • the mobile communication network is, for example, a fourth generation mobile communication network or a fifth generation mobile communication network.
  • the network 6 includes a base station and the like.
  • The first portable terminal 2 is a terminal carried by the person being watched over (for example, a child such as a schoolchild).
  • the first portable terminal 2 is a robot having a communication function.
  • the first mobile terminal 2 transmits at least one of position information, the same group pedestrian information, peripheral pedestrian information, and vehicle information to the data center 4 via the network 6.
  • the first mobile terminal 2 receives at least one of traffic information, a message, and emergency information transmitted from the data center 4 via the network 6.
  • the configuration of the first mobile terminal 2 will be described later.
  • The same group pedestrian information is information indicating pedestrians in the same group, for example children who attend or leave school together.
  • the peripheral pedestrian information is information indicating a pedestrian walking around the carrier.
  • The vehicle information is information on vehicles on the road along which the carrier walks.
  • The second mobile terminal 3 is a terminal carried by a guardian of the person being watched over (for example, a child such as a schoolchild).
  • The second portable terminal 3 is, for example, a smartphone, a tablet terminal, or a smartwatch.
  • the second portable terminal 3 receives the first portable terminal position information regarding the position of the first portable terminal 2 transmitted by the data center 4 via the network 6. Note that the first portable terminal position information is image information. The configuration of the second mobile terminal 3 will be described later.
  • the data center 4 includes a communication unit 41, a control unit 42, a model storage unit 43, and a storage unit 44.
  • the communication unit 41 outputs position information, the same group pedestrian information, surrounding pedestrian information, vehicle information, and the like received from the first mobile terminal 2 via the network 6 to the control unit 42.
  • the communication unit 41 outputs disaster information, suspicious person information, emergency information, and the like received from the server 5 via the network 6 to the control unit 42.
  • the communication unit 41 transmits the danger notification output by the control unit 42 to the first mobile terminal 2 via the network 6.
  • the communication unit 41 transmits traffic information, messages, emergency information, and the like output from the control unit 42 to the first mobile terminal 2 via the network 6.
  • the communication unit 41 transmits the first mobile terminal position information, the emergency notification, the voice message, and the like output from the control unit 42 to the second mobile terminal 3 via the network 6. Further, the communication unit 41 receives a signal of a detection value of the acceleration sensor 28 from the first portable terminal 2 and outputs the received signal to the control unit 42.
  • the control unit 42 stores the position information received from the first mobile terminal 2 in the storage unit 44.
  • the control unit 42 stores the suspicious person information received from the server 5 in the storage unit 44.
  • the suspicious person information includes the position information where the suspicious person appears, the time when the suspicious person appears, and the like.
  • the control unit 42 generates a movement trajectory of the person who carries the first mobile terminal 2 based on the position information and map information stored in the storage unit 44.
  • The control unit 42 generates first mobile terminal position information based on the generated movement trajectory and the map information.
  • the control unit 42 outputs the generated first mobile terminal position information to the communication unit 41.
  • The control unit 42 sets a dangerous location based on the position information and vehicle information received from the first mobile terminal 2, the emergency information, disaster information, and suspicious person information received from the server 5, the model information stored in the model storage unit 43, and the map information and position information stored in the storage unit 44, and stores information indicating the set dangerous location in the storage unit 44.
  • a dangerous location is a specific location where danger may occur, such as a location where a disaster has occurred, a location where there are many traffic accidents, or a location where a suspicious person appears.
  • the control unit 42 generates a danger notification based on the map information stored in the storage unit 44 and information indicating the dangerous location and the received position information, and outputs the generated danger notification to the communication unit 41.
  • the control unit 42 includes position information and vehicle information received from the first portable terminal 2, emergency information received from the server 5, model information stored in the model storage unit 43, and map information and position stored in the storage unit 44. Based on the information, at least one of traffic information, a message, and emergency information is generated.
  • the message is an audio signal, which is text information converted into an audio signal by a well-known method using a language model or a dictionary.
  • the control unit 42 outputs the generated traffic information, message, and emergency information to the communication unit 41.
  • the control unit 42 outputs the generated message as a voice message, and outputs emergency information to the communication unit 41 as an emergency notification.
  • The control unit 42 associates the first portable terminal 2 carried by the carrier with the second portable terminal 3 carried by the guardian, based on the telephone numbers or identification information of the portable terminals, and stores the association in the storage unit 44. As a result, the data center 4 can perform processing for a plurality of pairs of first and second mobile terminals.
  • the model storage unit 43 stores a statistical regional traffic model, a traffic prediction model, and the like as model information.
  • The statistical regional traffic model statistically indicates the traffic conditions in each area; the data center 4 obtains data relating to map information or traffic information from mobile terminals and the like, and the model represents the traffic information in each area statistically.
  • the traffic prediction model is a model for predicting the traffic volume in the corresponding region in the future from the traffic situation in each region accumulated in the past, and may be determined based on the data from the statistical regional traffic model.
  • The traffic prediction model may represent, for example, the traffic volume at school commuting times along the school route of the carrier of the first mobile terminal 2.
  • the storage unit 44 stores map information.
  • The map information includes, for example, a map covering the school route from home to school, the location of stores the carrier may visit (such as supermarkets, bookstores, and stationery stores), and the location of a cram school or the like visited before returning home.
  • The map information also includes information such as sidewalks and the road network (junctions and branches).
  • the storage unit 44 stores position information of the first mobile terminal 2.
  • the storage unit 44 stores information indicating a dangerous place and suspicious person information.
  • The storage unit 44 stores the association between the first portable terminal 2 and the second portable terminal 3.
  • The server 5 is installed in, for example, a school or a ward office.
  • the server 5 transmits disaster information related to natural disasters such as earthquakes and typhoons and suspicious person information to the data center 4 via the network 6.
  • the data center 4 may acquire such information through communication with the server 5.
  • FIG. 2 is a diagram illustrating an example of a usage scene of the information communication system 1 according to the present embodiment.
  • the first mobile terminal 2 is used by being attached to a shoulder belt of a school bag, for example.
  • the first mobile terminal 2 includes a microphone, a radar, a speaker, a position detection unit, and the like.
  • the first mobile terminal 2 acquires and transmits a voice spoken by a child carrying the first mobile terminal 2.
  • the first mobile terminal 2 transmits a radar detection result signal to the data center 4.
  • the first mobile terminal 2 receives information and messages transmitted from the data center 4 and reproduces and notifies them.
  • A school backpack is an example of a carried item and of a bag worn on the back.
  • The data center 4 transmits, to the first mobile terminal 2, the traffic information, messages, and emergency information generated based on the information received from the first mobile terminal 2 and from the server 5. It also transmits, to the second mobile terminal 3, the first mobile terminal position information, emergency notifications, and voice messages generated based on that information. Note that when other children carry other first mobile terminals 2 at school commuting times, the data center 4 may mutually use the real-time information captured by the plurality of first mobile terminals 2 (for example, current positions, same group pedestrian information, surrounding pedestrian information, and vehicle information) to generate each piece of information.
  • The second portable terminal 3 receives from the data center 4 the first portable terminal position information, which includes map information covering the route along which the child carrying the first portable terminal 2 is walking, and displays the received information as status information.
  • the second mobile terminal 3 receives the emergency notification and voice message transmitted from the data center 4, and displays or reproduces the received emergency notification or voice message. The image displayed on the second mobile terminal 3 will be described later.
  • the server 5 transmits, for example, broadcast contents by the disaster prevention administrative radio to the data center 4 via the network 6.
  • the usage scene shown in FIG. 2 is an example, and the present invention is not limited to this.
  • FIG. 3 is a block diagram illustrating a configuration example of the first mobile terminal 2 according to the present embodiment.
  • the first mobile terminal 2 includes a communication unit 21, a radar 22, a microphone 23, a position detection unit 24, a vibrator 25, a speaker 26, a display unit 27, an acceleration sensor 28, and a control unit 29.
  • the first mobile terminal 2 may include an operation unit.
  • the communication unit 21 includes, for example, a communication IC card, a communication antenna, and a communication module.
  • the communication unit 21 transmits information output by the control unit 29 (for example, position information, same group pedestrian information, peripheral pedestrian information, and vehicle information) to the data center 4 via the network 6.
  • the communication unit 21 receives information (for example, traffic information, messages, and emergency information) transmitted by the data center 4, and outputs the received information (for example, traffic information, messages, and emergency information) to the control unit 29.
  • the communication unit 21 receives the danger information transmitted by the data center 4 and outputs the received danger information to the control unit 29.
  • The radar 22 includes a front radar that detects objects in front of the first portable terminal 2 and a rear radar that detects objects behind it.
  • the front radar and the rear radar are, for example, millimeter wave radars.
  • the radar 22 outputs a detection signal related to the distance, relative speed, relative position, direction, and the like to an external object to the control unit 29.
  • the microphone 23 outputs a signal (for example, an audio signal) obtained by collecting sound to the control unit 29.
  • the microphone 23 may be a microphone array including a plurality of microphones.
  • the position detection unit 24 includes a positioning signal receiving antenna that receives a positioning signal from a satellite or the like. The position detection unit 24 outputs the received positioning signal to the control unit 29.
  • The first portable terminal 2 may also acquire information indicating its current position based on, for example, communication with a base station.
  • The vibrator 25 generates vibration according to the notification information output from the control unit 29.
  • the speaker 26 converts a signal (for example, an audio signal) output from the control unit 29 into sound.
  • the display unit 27 operates according to notification information output from the control unit 29.
  • the display unit 27 includes, for example, LEDs (light emitting diodes) that emit three colors of red, green, and blue.
  • the display unit 27 may be, for example, a liquid crystal display device, an organic EL (electroluminescence) display device, or an electronic ink display device.
  • the acceleration sensor 28 detects the acceleration in the three directions of the X axis, the Y axis, and the Z axis of the first mobile terminal 2 and outputs the detected acceleration signal to the control unit 29.
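As an aside not found in the patent text, a moving speed can in principle be estimated from such accelerometer readings by integration; the sample rate and function name below are assumptions for the sketch:

```python
def integrate_speed(accels, dt=0.1, v0=0.0):
    """Estimate speed along one axis by integrating accelerometer
    samples (m/s^2) over time step dt (s). Such dead reckoning drifts
    in practice, so a real system would fuse it with the positioning
    described above."""
    v = v0
    speeds = []
    for a in accels:
        v += a * dt
        speeds.append(v)
    return speeds
```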
  • the control unit 29 acquires a detection signal output from the radar 22.
  • the control unit 29 acquires an audio signal output from the microphone 23.
  • the control unit 29 acquires the positioning information output from the position detection unit 24.
  • the control unit 29 acquires danger information output from the communication unit 21.
  • the control unit 29 outputs position information based on the positioning information acquired from the position detection unit 24 to the communication unit 21. For example, the control unit 29 determines whether or not a person is included using a known recognition method from the detection signal acquired from the radar 22. When a person is included in the detection signal, the control unit 29 sets the detection signal as the same group pedestrian information or the surrounding pedestrian information, and outputs the same group pedestrian information or the surrounding pedestrian information to the communication unit 21.
  • The control unit 29 discriminates, for example, whether a person included in the detection signal is a pedestrian in the same group as the carrier or a surrounding pedestrian.
  • the control unit 29 acquires information (for example, traffic information, messages, and emergency information) output from the communication unit 21.
  • When the information from the communication unit 21 is an audio signal, the control unit 29 outputs the audio signal to the speaker 26.
  • Otherwise, the control unit 29 outputs the information to at least one of the display unit 27 and the vibrator 25.
  • The control unit 29 controls, for example, the emitted color or the blinking state of the display unit 27 according to the information.
  • the control unit 29 controls the vibration interval or the strength of vibration according to the information, for example.
  • the control unit 29 generates a voice message based on the danger information, and outputs the generated voice message to the speaker 26. Further, the control unit 29 generates notification information based on the danger information, and outputs the generated notification information to the vibrator 25.
  • the control unit 29 acquires the detection value output from the acceleration sensor 28 and transmits the acquired detection value to the data center 4 through the communication unit 21.
  • FIG. 4 is a block diagram illustrating a configuration example of the second mobile terminal 3 according to the present embodiment.
  • the second mobile terminal 3 includes a communication unit 31, a microphone 32, an operation unit 33, a display unit 34, a speaker 35, a control unit 36, and a vibrator 37.
  • the communication unit 31 receives the first mobile terminal position information transmitted from the data center 4 and outputs the received first mobile terminal position information to the control unit 36.
  • the communication unit 31 receives the emergency notification or voice message transmitted from the data center 4 and outputs the received emergency notification or voice message to the control unit 36.
  • the microphone 32 outputs a signal (for example, an audio signal) obtained by collecting sound to the control unit 36.
  • the microphone 32 may be a microphone array including a plurality of microphones.
  • the operation unit 33 is a touch panel sensor provided on the display unit 34, for example.
  • the operation unit 33 outputs a signal generated by a user operation to the control unit 36.
  • the display unit 34 displays the first portable terminal position information output from the control unit 36.
  • the display unit 34 is, for example, a liquid crystal display device, an organic EL display device, or an electronic ink display device.
  • the speaker 35 converts a signal (for example, an audio signal) output from the control unit 36 into sound.
  • The vibrator 37 generates vibration according to the notification information output from the control unit 36.
  • the control unit 36 acquires the first portable terminal position information output from the communication unit 31.
  • the control unit 36 acquires an emergency notification or a voice message output from the communication unit 31.
  • the control unit 36 outputs the acquired first portable terminal position information to the display unit 34.
  • When the emergency notification is image information, the control unit 36 outputs the emergency notification to the display unit 34.
  • the control unit 36 outputs the acquired voice message to the speaker 35.
  • the control unit 36 acquires danger information output from the communication unit 31.
  • the control unit 36 generates a voice message based on the danger information, and outputs the generated voice message to the speaker 35.
  • the control unit 36 generates notification information based on the danger information, and outputs the generated notification information to the vibrator 37.
  • FIG. 5 is a flowchart of a processing procedure example in which the data center 4 transmits the first mobile terminal position information according to the present embodiment to the second mobile terminal 3.
  • Step S11 The control unit 42 receives position information from the first mobile terminal 2 by the communication unit 41 via the network 6.
  • the positional information includes received time information.
  • Step S12 The control unit 42 stores the received position information in the storage unit 44.
  • Step S13 The control unit 42 generates the first portable terminal position information using the position information and map information stored in the storage unit 44. Note that the control unit 42 connects the time information included in the position information in chronological order, and generates a movement trajectory of the person who carries the first portable terminal 2. Subsequently, the control unit 42 transmits the first portable terminal position information to the second portable terminal 3 by the communication unit 41 via the network 6.
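The trajectory generation of steps S11 to S13 can be sketched in Python. This is a minimal illustration only: the record layout (a timestamp plus latitude/longitude) is an assumed stand-in for the position information actually stored in the storage unit 44.

```python
from dataclasses import dataclass

@dataclass
class PositionRecord:
    # Hypothetical record layout standing in for the position information
    # stored in the storage unit 44: a timestamp plus latitude/longitude.
    time: float
    lat: float
    lon: float

def build_trajectory(records):
    """Connect position records in chronological order (steps S11-S13,
    simplified) to form the carrier's movement trajectory."""
    ordered = sorted(records, key=lambda r: r.time)
    return [(r.lat, r.lon) for r in ordered]

records = [
    PositionRecord(time=10.0, lat=35.001, lon=139.002),
    PositionRecord(time=5.0, lat=35.000, lon=139.000),
    PositionRecord(time=15.0, lat=35.002, lon=139.004),
]
trajectory = build_trajectory(records)
# trajectory[0] is the earliest position; trajectory[-1] is the current one.
```

Overlaying the resulting ordered point list on map information would yield the movement trajectory g11 and current position g12 shown in FIG. 6.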
  • When the data center 4 receives a transmission request for the first mobile terminal position information from the second mobile terminal 3, the data center 4 performs the processing illustrated in FIG. 5.
  • the transmission request for the first mobile terminal position information is transmitted from the control unit 36 via the communication unit 31 in response to an operation on the operation unit 33 of the second mobile terminal 3 by the guardian.
  • the data center 4 may execute the processing shown in FIG. 5 at a preset time, for example.
  • The preset time is, for example, the school attendance time or the school leaving time of the carrier, set in accordance with the guardian's operation on the operation unit 33 of the second mobile terminal 3.
  • FIG. 6 is a diagram illustrating an example of first mobile terminal position information displayed on the display unit 34 of the second mobile terminal 3 according to the present embodiment.
  • The first mobile terminal position information shown in FIG. 6 is, for example, the first mobile terminal position information when the carrier leaves school.
  • the first portable terminal position information includes information on the movement locus g11 and the current position g12 of the person who carries the first portable terminal 2.
  • FIG. 7 is a flowchart of a processing procedure example in which the data center 4 according to the present embodiment transmits a danger notice to the first mobile terminal 2 and the second mobile terminal 3.
  • Step S21 The control unit 42 of the data center 4 accesses the server 5.
  • Step S22 The control unit 42 acquires information indicating a dangerous location and suspicious person information from the server 5 by the communication unit 41.
  • the control unit 42 causes the storage unit 44 to store the acquired information indicating the dangerous location and the suspicious person information.
  • The dangerous location includes, for example, a predetermined range around the position (latitude and longitude) where suspicious person information was obtained, or a range such as the address where suspicious person information was obtained. Note that the range may be set by the control unit 42.
  • Step S23 The control unit 42 acquires position information from the first mobile terminal 2.
  • the location information may be transmitted by the first mobile terminal 2 based on a request from the data center 4 or may be transmitted by the first mobile terminal 2 at a predetermined cycle.
  • the control unit 42 stores the acquired position information in the storage unit 44.
  • Step S24 The control unit 42 calculates the distance between the dangerous location and the current position of the carrier based on the map information stored in the storage unit 44, the information indicating the dangerous location, and the location information.
  • Step S25 The control unit 42 determines whether the calculated distance is less than a predetermined value stored in the storage unit 44. When it is determined that the distance is less than the predetermined value (step S25; YES), the control unit 42 proceeds to the process of step S26. When it is determined that the distance is equal to or greater than the predetermined value (step S25; NO), the control unit 42 returns to the process of step S21.
  • Step S26 The control unit 42 generates danger information.
  • the control unit 42 notifies the danger by transmitting the generated danger information to the second portable terminal 3 through the communication unit 41.
  • When the danger information is a notification sound, the control unit 42 increases the volume of the notification sound as the first mobile terminal 2 approaches the dangerous location.
  • Step S27 The control unit 42 notifies the danger by transmitting the generated danger information to the first mobile terminal 2 through the communication unit 41.
  • When the danger information is a notification sound, the control unit 42 increases the volume of the notification sound as the first mobile terminal 2 approaches the dangerous location.
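The distance check and notification of steps S21 to S27 can be sketched as follows. The 200 m radius, the planar distance approximation, and the linear volume scaling are illustrative assumptions; the embodiment only specifies that a notification is issued when the distance falls below a predetermined value and that the sound grows as the terminal approaches the dangerous location.

```python
import math

# Assumed predetermined value (the embodiment reads it from the storage unit 44).
DANGER_RADIUS_M = 200.0

def distance_m(a, b):
    """Rough planar distance in metres between two (lat, lon) points."""
    dlat = (a[0] - b[0]) * 111_000.0                     # ~metres per degree latitude
    dlon = (a[1] - b[1]) * 111_000.0 * math.cos(math.radians(a[0]))
    return math.hypot(dlat, dlon)

def check_danger(current_pos, dangerous_pos):
    """Return danger information (here just a volume level) when the
    carrier is within the predetermined distance, otherwise None."""
    d = distance_m(current_pos, dangerous_pos)
    if d >= DANGER_RADIUS_M:
        return None                                      # step S25; NO
    # Louder as the terminal approaches the dangerous location (steps S26/S27).
    volume = 1.0 - d / DANGER_RADIUS_M                   # 0.0 at the edge .. 1.0 on the spot
    return {"distance_m": d, "volume": volume}
```

A production system would use a proper geodesic distance (e.g. the haversine formula); the planar approximation keeps the sketch short.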
  • The guardian may set the dangerous location by operating the operation unit 33 of the second mobile terminal 3, and the control unit 42 of the data center 4 may acquire the set information.
  • FIG. 8 is a flowchart of a processing procedure example when the first portable terminal 2 according to the present embodiment receives the danger information.
  • Step S31 The control unit 29 of the first mobile terminal 2 receives, by the communication unit 21, the danger information transmitted by the data center 4.
  • Step S32 The control unit 29 generates a voice message announcing the danger based on the danger information, and reproduces the generated voice message from the speaker 26. Moreover, the control unit 29 generates notification information based on the danger information.
  • For example, the control unit 29 outputs a predetermined voltage signal to the speaker 26 based on the danger information, thereby reproducing the voice message. The control unit 29 increases the volume as the distance between the dangerous location and the current position of the carrier decreases.
  • Step S33 The control unit 29 determines whether or not the vibrator mode, in which sound output from the speaker 26 is prohibited, is set.
  • The control unit 29 may set the vibrator mode in accordance with, for example, an operation on an operation unit (not shown) provided on the first mobile terminal 2, or in response to an instruction from the second mobile terminal 3.
  • When it is determined that the vibrator mode is set (step S33; YES), the control unit 29 proceeds to the process of step S34.
  • When it is determined that the vibrator mode is not set (step S33; NO), the control unit 29 returns the process to step S31. Note that, in the vibrator mode, the control unit 29 may skip the process of step S32, or may blink the display unit 27 instead of the process of step S32.
  • Step S34 The control unit 29 drives the vibrator 25 based on the danger information.
  • The control unit 29 may increase the vibration amplitude of the vibrator 25, or may shorten the vibration interval, as the distance between the dangerous location and the current position of the carrier decreases.
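The vibrator control of step S34 can be sketched as a simple mapping from distance to a vibration pattern. The numeric ranges below are assumptions chosen for illustration; the embodiment only states that the amplitude grows and the interval shrinks as the distance decreases.

```python
# Assumed distance beyond which the minimum pattern is used.
MAX_DISTANCE_M = 200.0

def vibration_pattern(distance_m):
    """Map the distance to the dangerous location onto an
    (amplitude, interval) pair for the vibrator (step S34, simplified)."""
    closeness = max(0.0, 1.0 - distance_m / MAX_DISTANCE_M)
    amplitude = 0.2 + 0.8 * closeness      # stronger vibration when closer
    interval_s = 1.0 - 0.8 * closeness     # shorter pauses between pulses when closer
    return amplitude, interval_s
```

The same mapping would apply to the vibrator 37 of the second mobile terminal 3 in step S44.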
  • FIG. 9 is a flowchart of a processing procedure example when the second portable terminal 3 according to the present embodiment receives the danger information.
  • Step S41 The control unit 36 of the second mobile terminal 3 receives, by the communication unit 31, the danger information transmitted from the data center 4.
  • Step S42 The control unit 36 generates a voice message announcing the danger based on the danger information, and reproduces the generated voice message from the speaker 35. Alternatively, the control unit 36 generates notification information based on the danger information.
  • For example, the control unit 36 outputs a predetermined voltage signal to the speaker 35 based on the danger information, thereby reproducing the voice message. The control unit 36 increases the volume as the distance between the dangerous location and the current position of the carrier decreases.
  • Step S43 The control unit 36 determines whether or not the vibrator mode is set.
  • the control unit 36 sets the vibrator mode according to the result of operating the operation unit 33 of the second portable terminal 3.
  • When it is determined that the vibrator mode is set (step S43; YES), the control unit 36 proceeds to the process of step S44.
  • When it is determined that the vibrator mode is not set (step S43; NO), the control unit 36 proceeds to the process of step S45. Note that, in the vibrator mode, the control unit 36 may skip the process of step S42, or may blink the display unit 34 instead of the process of step S42.
  • Step S44 The control unit 36 drives the vibrator 37 based on the danger information.
  • The control unit 36 may increase the vibration amplitude of the vibrator 37, or may shorten the vibration interval, as the distance between the dangerous location and the current position of the carrier decreases.
  • Step S45 The control unit 36 transmits a transmission request for the first mobile terminal position information to the data center 4 through the communication unit 31 based on the guardian's operation of the operation unit 33. Based on the received transmission request, the control unit 42 of the data center 4 attaches a warning image announcing the danger to the first mobile terminal position information and transmits it to the second mobile terminal 3 by the communication unit 41. The control unit 36 receives, by the communication unit 31, the first mobile terminal position information transmitted from the data center 4, and causes the display unit 34 to display the received first mobile terminal position information.
  • FIG. 10 is a diagram illustrating an example of first mobile terminal position information including a warning image displayed on the display unit 34 of the second mobile terminal 3 according to the present embodiment.
  • The first mobile terminal position information shown in FIG. 10 is, for example, the first mobile terminal position information when the carrier leaves school.
  • the first portable terminal position information includes information on the movement locus g11 and the current position g12 of the person carrying the first portable terminal 2, an image g21 indicating a dangerous place, and a warning image g22.
  • In the example of FIG. 10, the current position of the first mobile terminal 2 possessed by the carrier is within the dangerous area, so the warning image g22 contains a predetermined wording (for example, "You are in the dangerous area. Please be careful!").
  • The warning image g22 is displayed superimposed on or combined with the other information.
  • the warning image may be generated based on the warning information received by the control unit 36.
  • the control unit 29 of the first portable terminal 2 may transmit the detection signal acquired from the radar 22 to the data center 4 by the communication unit 21 as the same group pedestrian information, surrounding pedestrian information, or vehicle information.
  • The control unit 42 of the data center 4 may transmit traffic information to the first mobile terminal 2 by the communication unit 41 when, based on, for example, information received from the server 5, the carrier of the first mobile terminal 2 approaches a location with heavy traffic.
  • the traffic information may be an audio signal.
  • The control unit 29 of the first mobile terminal 2 reproduces the received traffic information. This allows the carrier of the first mobile terminal 2 to know that he or she has approached a location with heavy traffic.
  • The control unit 42 of the data center 4 may transmit emergency information to the first mobile terminal 2 and the second mobile terminal 3 based on, for example, information received from the server 5.
  • the emergency information is information calling for evacuation when, for example, a disaster or accident occurs.
  • As described above, in the present embodiment, the position information acquired by the first mobile terminal 2, which is carried by the carrier or attached to a bag or the like, is collected in the data center 4 and made available to the second mobile terminal 3 possessed by the guardian.
  • suspicious person information is collected from the administrative or school server 5 to the data center 4.
  • When the carrier of the first mobile terminal 2 approaches a dangerous location, the data center 4 transmits the danger information to the guardian who owns the second mobile terminal 3.
  • In this way, a danger notification can be sent both to the carrier of the first mobile terminal 2 (for example, a child) and to the owner of the second mobile terminal 3 (for example, a guardian).
  • The alarm sound or the like is increased as the first mobile terminal 2 approaches a dangerous location. Thereby, according to the present embodiment, it is possible to notify more reliably that the carrier of the first mobile terminal 2 is approaching a dangerous location.
  • The second mobile terminal 3 acquires the movement trajectory of the first mobile terminal 2. Thereby, according to the present embodiment, the holder of the second mobile terminal 3 can grasp the trajectory (for example, a return route from school to home) along which the carrier moved.
  • In the first modification, the data center 4A determines, based on the information received from the first mobile terminal 2, whether or not a predetermined operation that should be performed at a predetermined position, such as on a school route, has been performed, and transmits the determination result to the second mobile terminal 3.
  • the predetermined position may be information on a position including the above-described dangerous place.
  • FIG. 11 is a diagram illustrating a main operation example of the information communication system 1A according to the first modification of the embodiment.
  • Symbol g100 is a diagram illustrating an example of an interaction between the first mobile terminal 2 and the carrier (for example, a child) when returning home.
  • In the diagram g100, reference sign g101 denotes the child's utterance, and reference sign g102 denotes the utterance of the first mobile terminal 2.
  • Symbol g110 is a diagram showing an example of dialogue between the child and the guardian when returning home.
  • In the diagram g110, reference sign g111 denotes the guardian's utterance, reference sign g112 denotes the child's utterance, and reference sign g113 denotes the utterance of the first mobile terminal 2.
  • the symbol g120 is a diagram illustrating an example of an image displayed on the second portable terminal 3.
  • the displayed image is an image received from the data center 4.
  • a symbol g121 is an image received from the data center 4.
  • The image g121 includes a score image g122, an image of the child's movement trajectory, and operation confirmation result images g123 and g124 at the confirmation locations.
  • FIG. 12 is a diagram illustrating a configuration example of an information communication system 1A according to the first modification of the embodiment.
  • The information communication system 1A includes a first mobile terminal 2, a second mobile terminal 3, a data center 4A (center), and a server 5.
  • The first mobile terminal 2, the second mobile terminal 3, the data center 4A, and the server 5 are connected to each other via a network 6.
  • Description of the components common to the information communication system 1 is omitted.
  • The difference between the information communication system 1 and the modified information communication system 1A is the data center 4A.
  • the data center 4A includes a communication unit 41, a control unit 42A, a model storage unit 43, a storage unit 44A, a voice recognition model storage unit 45, a voice recognition unit 46, and a scenario storage unit 47.
  • the voice recognition model storage unit 45 stores, for example, an acoustic model, a language model, a word dictionary, and the like.
  • the acoustic model is a model based on the feature amount of sound
  • the language model is a model of information on words and how to arrange them.
  • the word dictionary is a dictionary with a large number of vocabularies, for example, a large vocabulary word dictionary.
  • The speech recognition unit 46 detects utterance sections in the audio signal output from the control unit 42A. For the detection of an utterance section, for example, a portion of the audio signal at or above a predetermined threshold is detected as the utterance section. Note that the speech recognition unit 46 may detect utterance sections using another known method. The speech recognition unit 46 performs speech recognition on the detected audio signal of the utterance section with reference to the speech recognition model storage unit 45 using a known method, for example, the technique disclosed in JP-A-2015-64554. The speech recognition unit 46 converts the recognition result into text with reference to the speech recognition model storage unit 45.
  • the speech recognition unit 46 performs morphological analysis and dependency analysis on the text information with reference to the speech recognition model storage unit 45.
  • For the dependency analysis, for example, SVM (Support Vector Machines) is used with the shift-reduce method, the spanning tree method, or the cascaded chunking method.
  • the voice recognition unit 46 outputs the analysis result to the control unit 42A.
  • the scenario storage unit 47 stores a dialogue performed in an environment where the first mobile terminal 2 is used, for example, in a text format.
  • the control unit 42A performs the following processing in addition to the processing performed by the control unit 42.
  • The control unit 42A receives, by the communication unit 41, the detection signal of the radar 22 and the detection value of the acceleration sensor 28 transmitted from the first mobile terminal 2.
  • the control unit 42A determines whether or not a predetermined operation is performed at the set position based on at least one of the received detection signal of the radar 22 and the detection value of the acceleration sensor 28.
  • the control unit 42A obtains, for example, a score when returning home based on the determined result.
  • the control unit 42A stores the obtained score in the storage unit 44A.
  • the control unit 42A generates an image including a score image and an image indicating the determined result in the image of the child's movement trajectory.
  • The control unit 42A transmits the generated image to the second mobile terminal 3.
  • a method for determining whether or not a predetermined operation has been performed and how to obtain a score will be described later.
  • the control unit 42A acquires the analysis result output by the voice recognition unit 46. Based on the analysis result, the control unit 42A refers to the scenario storage unit 47 and selects a response to the audio signal received from the first portable terminal 2.
  • the control unit 42A converts the selected response into an audio signal by, for example, a formant synthesis method, and transmits the converted audio signal to the first portable terminal 2 by the communication unit 41.
  • The storage unit 44A stores the predetermined positions, the predetermined operations to be performed there, and their scores in association with one another.
  • the storage unit 44A stores, for example, the total score when returning home.
  • The storage unit 44A stores a threshold value used when determining whether or not a predetermined operation has been performed.
  • The storage unit 44A stores face model data and arm model data used for processing the detection signal of the radar 22.
  • the guardian may operate the operation unit 33 of the second portable terminal 3 to set the school route in advance.
  • the control unit 36 of the second portable terminal 3 may transmit information indicating the school route to the data center 4A by the communication unit 31.
  • the guardian may operate the operation unit 33 of the second portable terminal 3 to set a predetermined position where the predetermined operation should be performed.
  • the control unit 36 of the second portable terminal 3 may transmit information indicating the predetermined position to the data center 4A by the communication unit 31.
  • the guardian may set the predetermined operation by operating the operation unit 33 of the second portable terminal 3.
  • control unit 36 of the second portable terminal 3 may transmit information indicating a predetermined operation performed at a predetermined position to the data center 4A by the communication unit 31.
  • the data center 4A may acquire a dangerous spot from the server 5 and set the acquired dangerous spot at a predetermined position. Then, a predetermined operation may be set according to the type of the predetermined position.
  • FIG. 13 is a diagram illustrating an example of the predetermined position, the predetermined operation, and the score according to the first modification example of the embodiment.
  • The example shown in FIG. 13 shows part of the predetermined positions, predetermined operations, and scores on the return route (the route home), with the total score on the route set to 100 points.
  • the predetermined locations are the A intersection, the B intersection, and the C intersection.
  • the predetermined operation is, for example, temporary stop and left / right confirmation at the A intersection.
  • the score is 10 for the temporary stop and 10 for the left / right confirmation at the A intersection, for example.
  • The control unit 42A sets the score to 10 when the pause operation is performed at the A intersection, and to 0 when it is not performed. The control unit 42A then obtains the total score from school to home, and stores the obtained total score in the storage unit 44A. Note that the control unit 42A may instead obtain the total score by subtracting the scores of the predetermined operations that were not performed from 100 points.
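The scoring of FIG. 13 and step S54 can be sketched as follows. The intersection names and 10-point values follow the example in the description; the table layout and the `performed` mapping are assumptions made for illustration.

```python
# Assumed score table following the FIG. 13 example: (location, operation) -> points.
SCORE_TABLE = {
    ("A intersection", "temporary stop"): 10,
    ("A intersection", "left/right confirmation"): 10,
}

def total_score(performed):
    """Add the score of every predetermined operation that was performed.

    `performed` maps (location, operation) -> bool, i.e. the result of
    the operation analysis."""
    return sum(score for key, score in SCORE_TABLE.items() if performed.get(key))

def total_score_by_deduction(performed, full_marks=100):
    """Alternative scheme from the description: subtract the scores of the
    predetermined operations that were not performed from full marks."""
    missed = sum(score for key, score in SCORE_TABLE.items() if not performed.get(key))
    return full_marks - missed
```

With a full table covering all intersections on the route, the two schemes give the same ranking; the deduction variant simply anchors the maximum at 100 points.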
  • FIG. 14 is a flowchart of a processing procedure example performed by the data center 4A according to the first modification of the embodiment.
  • The control unit 42A determines whether or not a predetermined operation has been performed based on the detection signal of the radar 22.
  • Step S51 The control unit 42A of the data center 4A receives the position information transmitted by the first mobile terminal 2.
  • Step S52 The control unit 42A receives the detection signal of the radar 22 transmitted from the first mobile terminal 2.
  • The first mobile terminal 2 transmits the position information and the detection signal to the data center 4A at predetermined intervals.
  • Step S53 When the position is a predetermined location, the control unit 42A performs an operation analysis by determining whether or not a predetermined operation has been performed based on the received detection signal.
  • Step S54 The control unit 42A performs a scoring process based on the result of the operation analysis and the information stored in the storage unit 44A.
  • the scoring process is, for example, a point addition or deduction process.
  • Step S55 The control unit 42A determines whether a score transmission request is received from the second portable terminal 3. When it is determined that the request has been received from the second portable terminal 3 (step S55; YES), the control unit 42A advances the process to step S56. On the other hand, when it is determined that the request is not received from the second portable terminal 3 (step S55; NO), the control unit 42A returns the process to step S51.
  • Step S56 The control unit 42A generates an image including the image of the score and the image indicating the determination result in the image of the child's movement trajectory.
  • The control unit 42A transmits the generated image to the second mobile terminal 3.
  • The example has been described in which the data center 4A transmits the information including the score when it determines that a request has been received from the second mobile terminal 3, but the present invention is not limited thereto.
  • the data center 4A may transmit information including a score at a time set by a guardian or every predetermined period.
  • FIG. 15 is a flowchart of a processing procedure example performed by the second mobile terminal 3 according to the first modification of the embodiment.
  • the control unit 36 of the second portable terminal 3 receives information including the score transmitted by the data center 4A.
  • the control unit 36 causes the display unit 34 to display information including the received score.
  • The process shown in FIG. 15 may be executed when a request is transmitted from the second mobile terminal 3, at a time set by the guardian, or at a predetermined cycle.
  • FIG. 16 is an image example of information including a score displayed on the display unit 34 of the second mobile terminal 3 according to the first modification of the embodiment.
  • the image g200 is an image displayed on the display unit 34.
  • a symbol g201 indicates a school.
  • a symbol g202 indicates a home.
  • Reference numeral g203 is an image showing the movement trajectory of the child.
  • Reference numerals g211, g212, and g213 indicate predetermined locations.
  • a symbol g221 represents a determination result of the predetermined operation in the symbol g211.
  • the circles indicate that a predetermined operation has been performed.
  • a symbol g222 represents a determination result of the predetermined operation in the symbol g212.
  • a symbol g223 represents a determination result of the predetermined operation in the symbol g213. Note that a cross indicates that a predetermined operation is not performed.
  • a symbol g230 is an image representing the total score.
  • the symbols g221 and g222 when the predetermined operation is performed may be displayed in blue, for example, and the symbol g223 when the predetermined operation is not performed may be displayed in red, for example.
  • Such an image is generated by the control unit 42A of the data center 4A.
  • FIG. 17 is a flowchart of a processing procedure example of an operation analysis method for pause at a predetermined position according to the first modification of the embodiment.
  • the control unit 42A starts the following process after initializing the counter value T to zero. Further, the control unit 42A performs the following processing at a predetermined position.
  • Step S71 The control unit 42A of the data center 4A receives the position information from the first mobile terminal 2.
  • The control unit 42A stores the received position information in the storage unit 44A in association with the received time information.
  • Step S72 The control unit 42A compares the previous position (the position information closest to the current time) stored in the storage unit 44A with the received position information, and determines whether or not the position is the same as the previous one. When the control unit 42A determines that the position is the same as the previous position (step S72; YES), the process proceeds to step S73. When the control unit 42A determines that the position is not the same as the previous position (step S72; NO), the process proceeds to step S74.
  • Step S73 The control unit 42A adds 1 to the counter value T, and the process proceeds to step S75.
  • Step S74 The control unit 42A assigns 0 to the counter value T, and returns the process to step S71.
  • Step S75 The control unit 42A determines whether or not the counter value T is larger than a threshold value used when determining whether or not a predetermined operation has been performed.
  • When the control unit 42A determines that the counter value T is larger than the threshold value (step S75; YES), the process proceeds to step S76. When the control unit 42A determines that the counter value T is equal to or less than the threshold value (step S75; NO), the process returns to step S71.
  • Step S76 The control unit 42A determines that the child carrying the first mobile terminal 2 has paused at a predetermined position.
  • the control unit 42A may perform the operation analysis of the pause based on the detection value of the acceleration sensor 28 of the first portable terminal 2. In this case, in step S71, the control unit 42A receives the detection value of the acceleration sensor 28. In step S72, the control unit 42A may determine that the position is the same when the difference between the previous detection value and the current detection value is within a predetermined range including zero. The control unit 42A may detect whether or not the child has performed a predetermined operation based on at least one of the detection value of the acceleration sensor 28 and other information (for example, a detection signal output by the radar 22).
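  • The counting procedure of steps S71 to S76 above can be sketched as follows. This is an illustrative sketch only, assuming that positions arrive as (x, y) tuples; the function name and the threshold value are assumptions and do not appear in the specification.

```python
# Illustrative sketch of steps S71-S76: detect a pause from a stream of
# received positions using the counter value T. All names are assumptions.
def detect_pause(positions, threshold):
    """Return True as soon as the same position repeats long enough (step S76)."""
    t = 0                                 # counter value T, initialized to zero
    previous = None
    for position in positions:            # step S71: receive position information
        if previous is not None and position == previous:
            t += 1                        # step S73: same position as last time
        else:
            t = 0                         # step S74: position changed, reset T
        previous = position
        if t > threshold:                 # step S75: T exceeds the threshold
            return True                   # step S76: the child has paused
    return False
```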
  • FIG. 18 is a flowchart of a processing procedure example of the motion analysis method for checking the left and right at a predetermined position according to the first modification of the embodiment. Note that the control unit 42A performs the following processing at a predetermined position.
  • Step S81 The control unit 42A of the data center 4A receives the detection signal output from the radar 22 from the first portable terminal 2.
  • Step S82 The control unit 42A detects the face portion of the person who carries the first portable terminal 2 from the received detection signal by a known method. Subsequently, the control unit 42A stores the detected face detection signal in the storage unit 44A in association with the acquired time information.
  • Step S83 The control unit 42A compares the acquired face detection signal with the previous face detection signal based on the face detection signal stored in the storage unit 44A. As a result of the comparison, the control unit 42A determines whether or not the face portion is moving left and right. When the control unit 42A determines that the face part is moving left and right (step S83; YES), the process proceeds to step S84. On the other hand, when the control unit 42A determines that the face portion has not moved left and right (step S83; NO), the control unit 42A advances the process to step S85.
  • Step S84 The control unit 42A determines that the carrier of the first mobile terminal 2 has confirmed left and right (left/right confirmation OK).
  • Step S85 The control unit 42A determines that the carrier of the first mobile terminal 2 has not confirmed left and right (left/right confirmation NG).
  • the control unit 42A may determine that the face has moved to the left or right when, based on the detection value of the acceleration sensor 28, the left/right component of the detection value changes by a predetermined value or more. Note that the control unit 42A may detect whether or not the user has performed a predetermined operation based on at least one of the detection value of the acceleration sensor 28 and other information (for example, a detection signal output by the radar 22).
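  • The left/right confirmation check of steps S81 to S85 can be sketched as follows, assuming that the face position extracted from successive detection signals is reduced to a single horizontal coordinate; the function name and the movement threshold are illustrative assumptions.

```python
# Illustrative sketch of steps S81-S85: the face must swing both leftward and
# rightward between successive detection signals. All names are assumptions.
def checked_left_and_right(face_positions, movement_threshold):
    moved_left = moved_right = False
    for prev, curr in zip(face_positions, face_positions[1:]):
        delta = curr - prev               # step S83: compare with the previous signal
        if delta <= -movement_threshold:
            moved_left = True             # the face moved to the left
        elif delta >= movement_threshold:
            moved_right = True            # the face moved to the right
    # step S84 (left/right confirmation OK) or step S85 (NG)
    return moved_left and moved_right
```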
  • FIG. 19 is a flowchart of an example of a processing procedure of an operation analysis method for a hand raising operation at a predetermined position according to the first modification of the embodiment. Note that the control unit 42A performs the following processing at a predetermined position.
  • Step S91 The control unit 42A of the data center 4A receives the detection signal output from the radar 22 from the first portable terminal 2.
  • Step S92 The control unit 42A detects the arm portion of the holder of the first portable terminal 2 from the received detection signal by a known method.
  • the control unit 42A stores the detected arm detection signal in the storage unit 44A in association with the acquired time information.
  • Step S93 The control unit 42A determines whether or not the arm is raised depending on whether or not the arm portion has been detected. When it is determined that the arm is raised (step S93; YES), the control unit 42A advances the process to step S94. On the other hand, when it is determined that the arm is not raised (the arm cannot be detected) (step S93; NO), the control unit 42A advances the process to step S95.
  • Step S94 The control unit 42A determines that the holder of the first portable terminal 2 has performed the hand raising operation (hand raising OK).
  • Step S95 The control unit 42A determines that the holder of the first portable terminal 2 has not performed the hand raising operation (hand raising NG).
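  • The hand-raising check of steps S91 to S95 reduces to whether an arm portion appears in the detection signal, and can be sketched as follows; the dictionary representation of a detection signal is an assumption for illustration.

```python
# Illustrative sketch of steps S91-S95: the arm is judged raised if an arm
# portion is detected in the signal (step S93). The signal format is assumed.
def hand_raised(detection_signal):
    # step S94 (hand raising OK) if True, step S95 (hand raising NG) otherwise
    return "arm" in detection_signal["detected_parts"]
```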
  • in this way, the guardian can grasp the skill of the child. According to the first modification of the embodiment, the guardian and the child can communicate with each other while viewing the information displayed on the second mobile terminal 3, which increases the opportunities for communication between the guardian and the child and improves safety awareness.
  • the control unit 29 of the first portable terminal 2 transmits this audio signal picked up by the microphone 23 to the data center 4A through the communication unit 21.
  • the voice recognition unit 46 of the data center 4A performs voice recognition processing on the received voice signal.
  • the control unit 42A generates a reply voice signal based on the result of the voice recognition process, and transmits the generated voice signal to the first portable terminal 2 by the communication unit 41.
  • the control unit 29 of the first portable terminal 2 utters a response by reproducing the audio signal "Welcome back!" received from the data center 4A.
  • the data center 4A receives the voice signal of the child uttering "What is today's score?", and the control unit 42A generates a "90 points" audio signal as a response based on the voice recognition result and the information stored in the scenario storage unit 47 and the storage unit 44.
  • the control unit 42A transmits an audio signal as a response to the first portable terminal 2 through the communication unit 41. As a result, the control unit 29 of the first mobile terminal 2 utters “90 points” received from the data center 4A.
  • the audio signal collected by the first portable terminal 2 is transmitted to the data center 4A, and the response of the audio signal received by the data center 4A is transmitted to the first portable terminal 2.
  • the audio signal transmitted from the data center 4A to the first mobile terminal 2 is, for example, "You did well!" when the score is 70 or more, and "You did not pause today. Let's pause tomorrow." when the score is 50 or less.
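  • The score-dependent reply described above can be sketched as follows. The two phrases and the thresholds of 70 and 50 points follow the example in the text; the function name and the reply for the middle range are assumptions.

```python
# Illustrative sketch of selecting a reply audio message from the score.
def response_for_score(score):
    if score >= 70:
        return "You did well!"          # reply for a score of 70 or more
    if score <= 50:
        return "You did not pause today. Let's pause tomorrow."  # 50 or less
    return "Keep it up!"                # assumed reply for the middle range
```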
  • the guardian and the child have a conversation to confirm the traffic safety point while viewing the image displayed on the second portable terminal 3.
  • the first portable terminal 2 or the second portable terminal 3 transmits the collected audio signal to the data center 4A.
  • the data center 4A transmits an audio signal of advice to the first mobile terminal 2 based on the result of the voice recognition. As a result, the first portable terminal 2 reproduces an advice audio signal such as "Only the pause was not performed, so please pause!".
  • the data center 4A may also transmit such advice audio signals to the second portable terminal 3.
  • the voice signal picked up by the first portable terminal 2 or the second portable terminal 3 is transmitted to the data center 4A, and the response of the voice signal received by the data center 4A is transmitted to the first portable terminal 2.
  • communication can be performed among the guardian, the child, and the robot that is the first portable terminal 2, and traffic safety education can be continuously provided to the child carrying the first portable terminal 2.
  • although an example in which a child carries the first portable terminal 2 has been described, another mobile person such as a pedestrian or an elderly person may carry the first portable terminal 2.
  • the geographical position may be information on a position including the above-described dangerous place and a predetermined position.
  • FIG. 20 is a diagram illustrating an example of a configuration of an information communication system 1B according to a second modification of the embodiment.
  • the information communication system 1B includes a first mobile terminal 2, a second mobile terminal 3, a data center 4B, and a server 5.
  • the data center 4B includes a communication unit 41, a control unit 42B, a model storage unit 43, a storage unit 44B, a voice recognition model storage unit 45, a voice recognition unit 46, and a scenario storage unit 47.
  • the storage unit 44B includes, for example, an HDD, a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), a ROM (Read Only Memory), or a RAM (Random Access Memory).
  • the storage unit 44B stores, for example, map information, danger information, suspicious person information, and geographical position information.
  • the suspicious person information includes a suspicious person pattern in addition to the position information and time at which the suspicious person appears.
  • the suspicious person pattern is information indicating the characteristics of the behavior and actions of the suspicious person. Details of the geographical location information will be described later.
  • the control unit 42B includes, for example, a notification unit 421 as a functional unit.
  • the control unit 42B is realized by, for example, a hardware processor such as a CPU (Central Processing Unit) executing a program (software) stored in the storage unit 44B.
  • Some or all of these components may be realized by hardware (including a circuit unit) such as an LSI (Large Scale Integration), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a GPU (Graphics Processing Unit), or may be realized by cooperation of software and hardware.
  • the notification unit 421 notifies the carrier who holds the first mobile terminal 2 (for example, a child) of the approach of the geographical position.
  • the geographical position information is information indicating the geographical position.
  • the geographical position is a position on a route through which the carrier of the first mobile terminal 2 passes, for example, a position where there is a possibility of danger to children.
  • the geographical position includes, for example, an intersection on a route, a position on a sidewalk separated from a roadway, and the vicinity of a dangerous structure.
  • the geographical position includes a point (for example, a safety confirmation point) where execution of a predetermined operation such as a stop operation, a left / right confirmation operation, and a hand raising operation is required.
  • the geographical position information includes a geographical position on a route where a child walks when going to and from school.
  • the first portable terminal 2 identifies the position of the first portable terminal 2 (that is, the position of the child) based on the positioning signal received from a satellite or the like by the position detection unit 24, and transmits position information indicating the identified position to the data center 4B.
  • the information communication system 1B may adopt a method other than the position detection unit 24 as long as it can identify the position of the child.
  • the first mobile terminal 2 may specify the position of the first mobile terminal 2 by a method such as BLE (Bluetooth (registered trademark) Low Energy), PDR (Pedestrian Dead Reckoning), IMES (Indoor Messaging System), or SLAM (Simultaneous Localization and Mapping).
  • the first portable terminal 2 may specify the position of the first portable terminal 2 based on the peripheral information detected by the radar 22.
  • the information communication system 1B may acquire the position of the first mobile terminal 2 from another system that specifies the position of the first mobile terminal 2, for example.
  • for example, the other system may include a detection device installed at the geographical position, and the detection device may specify the position of the first mobile terminal 2 by detecting the first mobile terminal 2 present in the vicinity of the detection device.
  • when determining that the first mobile terminal 2 has approached the geographical position, the notification unit 421 controls a notification function provided in the first mobile terminal 2 to notify the child of the approach of the geographical position.
  • the notification function provided in the first portable terminal 2 is, for example, the vibrator 25, the speaker 26, or the like.
  • the notification unit 421 of the data center 4B transmits a control signal for the vibrator 25 to the first portable terminal 2, and the first portable terminal 2 operates (vibrates) the vibrator 25 based on the control signal.
  • in the following description, the operation in which the notification unit 421 transmits a control signal for the vibrator 25 to the first mobile terminal 2 and the first mobile terminal 2 vibrates the vibrator 25 based on the control signal may also be described simply as the notification unit 421 vibrating the vibrator 25.
  • FIG. 21 is a diagram schematically illustrating the start timing of notification according to the second modification of the embodiment.
  • the child C1 who is the carrier of the first mobile terminal 2 goes straight in the direction of the intersection and walks along a route that turns right at the intersection (route rt shown in the drawing).
  • the geographical position PT1 is set in advance at the position of the intersection.
  • the geographical position PT1 is preferably set on the route rt on which the child C1 walks rather than at the center of the intersection. For example, on a road in which the roadway and the sidewalk are separated, the route rt is the sidewalk, and the geographical position PT1 is set on the sidewalk (more specifically, at the boundary between the sidewalk and the roadway).
  • the notification unit 421 operates the vibrator 25 when the child C1 approaches a position (that is, a predetermined range ARP illustrated) that is a predetermined distance (a first predetermined distance dt1 illustrated) from the geographical position PT1.
  • the first predetermined distance dt1 is a distance of about several meters, for example.
  • the notification unit 421 may change the strength of notification (for example, the strength of vibration) according to the distance to the geographical position PT1.
  • the child C1 walks while paying attention to the surroundings by being notified of the geographical position PT1 (that is, the intersection) by the vibrator 25. For this reason, the information communication system 1B can suppress contact between the child C1 and the vehicle m traveling in the traveling direction of the child C1.
  • FIG. 22 is a flowchart illustrating an example of processing of the notification unit 421 according to the second modification of the embodiment.
  • the notification unit 421 executes the process shown in FIG. 22 at predetermined time intervals.
  • the notification unit 421 acquires position information indicating the position of the child C1 from the first portable terminal 2 (step S100).
  • the notification unit 421 determines whether or not the distance from the position of the child C1 to the geographical position PT1 is less than the first predetermined distance dt1 (step S102). When the distance from the position of the child C1 to the geographical position PT1 is not less than the first predetermined distance dt1, the notification unit 421 ends the process.
  • the notification unit 421 advances the process to step S104.
  • the notification unit 421 performs a geographical position approach notification process for notifying the child C1 of the approach of the geographical position PT1 (step S104), and ends the process.
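  • The trigger condition of steps S100 to S104 can be sketched as follows, assuming (x, y) positions in meters and a straight-line distance; the function name and the Euclidean metric are assumptions, and the distance may instead be measured along the route rt.

```python
import math

# Illustrative sketch of step S102: notification starts when the distance from
# the child C1 to the geographical position PT1 is less than dt1.
def should_notify(child_position, geographical_position, first_predetermined_distance):
    dx = child_position[0] - geographical_position[0]
    dy = child_position[1] - geographical_position[1]
    return math.hypot(dx, dy) < first_predetermined_distance
```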
  • FIG. 23 is a flowchart showing details of the geographical location approach notification process in step S104 shown in FIG.
  • the notification unit 421 starts notification (step S200).
  • the notification unit 421 starts the vibration of the vibrator 25.
  • the notification unit 421 acquires the position of the child C1 using the position detection unit 24 (step S202).
  • the notification unit 421 determines whether or not the child C1 is approaching the geographical position PT1 based on the position of the child C1 acquired in step S100 and the position of the child C1 acquired in step S202 (step S204).
  • the notification unit 421 determines that the child C1 is approaching the geographical position PT1 (YES in step S204)
  • the notification unit 421 advances the process to step S206.
  • the notification unit 421 determines that the child C1 has not approached the geographical position PT1 (NO side of step S204)
  • the notification unit 421 advances the processing to step S208.
  • the notification unit 421 increases the notification amount (step S206).
  • increasing the notification amount means, for example, at least one of increasing the amplitude of the vibration of the vibrator 25 and increasing the frequency of the vibration.
  • the notification unit 421 determines whether or not the child C1 is moving away from the geographical position PT1 (step S208). When the notification unit 421 determines that the child C1 is moving away from the geographical position PT1 (YES in step S208), the notification unit 421 advances the process to step S210. On the other hand, when the notification unit 421 determines that the child C1 has neither approached nor moved away from the geographical position PT1 (NO in step S208), the notification unit 421 advances the process to step S212 without changing the notification amount of the vibrator 25. The notification unit 421 decreases the notification amount (step S210).
  • the notification unit 421 determines whether or not the distance from the position of the child C1 to the geographical position PT1 is equal to or greater than the first predetermined distance dt1 (step S212). While the notification unit 421 determines that the distance from the position of the child C1 to the geographical position PT1 is less than the first predetermined distance dt1 (NO in step S212, that is, until the child C1 goes out of the predetermined range ARP), the notification unit 421 returns the process to step S204. The notification unit 421 adjusts the notification amount by repeating the processing of steps S204 to S210.
  • in step S212, when the notification unit 421 determines that the distance from the position of the child C1 to the geographical position PT1 is equal to or greater than the first predetermined distance dt1 (YES side of step S212), the process proceeds to step S214. The notification unit 421 stops the notification (step S214).
  • the distance from the position of the child C1 to the geographical position PT1 may be a distance along the route rt of the child C1.
  • by performing notification based on the distance to the geographical position PT1 along the route rt of the child C1, the notification unit 421 suppresses prolonged notification and can thereby suppress the child C1 becoming accustomed to the notification.
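  • The adjustment of the notification amount in steps S204 to S210 can be sketched as follows: the amount rises while the child approaches the geographical position and falls while the child moves away. The step size, the starting amount, and the position representation are assumptions for illustration.

```python
import math

# Illustrative sketch of steps S204-S210: adjust the notification amount from
# successive positions of the child C1. All names and values are assumptions.
def adjust_notification(positions, geographical_position, start_amount=1, step=1):
    amount = start_amount
    history = []
    for prev, curr in zip(positions, positions[1:]):
        d_prev = math.dist(prev, geographical_position)
        d_curr = math.dist(curr, geographical_position)
        if d_curr < d_prev:
            amount += step                     # step S206: approaching, increase
        elif d_curr > d_prev:
            amount = max(0, amount - step)     # step S210: moving away, decrease
        history.append(amount)                 # unchanged otherwise (NO in S208)
    return history
```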
  • the information communication system 1B according to the second modification of the embodiment can more appropriately notify the child C1 of the approach of the geographical position.
  • FIG. 24 is a diagram illustrating an example of the configuration of the first mobile terminal 2B according to the third modification of the embodiment.
  • the first mobile terminal 2B includes a communication unit 21, a radar 22, a microphone 23, a position detection unit 24, a vibrator 25, a speaker 26, a display unit 27, an acceleration sensor 28, and a control unit 29B.
  • the control unit 29B includes, for example, an approach determination unit 291 as a functional unit in addition to the configuration included in the control unit 29.
  • the control unit 29B is realized, for example, by a hardware processor such as a CPU executing a program (software) stored in a storage unit (not shown). Some or all of these components may be realized by hardware (including a circuit unit) such as an LSI, an ASIC, an FPGA, or a GPU, or may be realized by cooperation of software and hardware.
  • the approach determination unit 291 determines, for example, whether or not an object existing around the mobile person has approached the mobile person based on the detection result of the detection function provided in the first mobile terminal 2B.
  • hereinafter, a case where the detection function provided in the first mobile terminal 2B is the radar 22 will be described.
  • FIG. 25 is a diagram schematically illustrating the start timing of notification according to the third modification of the embodiment.
  • the radar 22 included in the first mobile terminal 2B is configured to be able to detect the surroundings of the child C1, for example.
  • the radar 22 is configured by, for example, a plurality of (two in the drawing) millimeter wave radars, which are provided facing different directions so that the surroundings of the first portable terminal 2B (that is, the surroundings of the child C1) can be detected.
  • the front radar and the rear radar included in the radar 22 detect different ranges (a front detection range AF and a rear detection range AR shown in the drawing).
  • when the radar 22 is set to be able to detect the entire area around the child C1 (for example, a range of 360 degrees around the vertical axis), the front detection range AF and the rear detection range AR may overlap each other.
  • the radar 22 detects the surroundings of the user at predetermined time intervals and generates a detection signal.
  • the radar 22 may include one or a plurality of millimeter wave radars as long as it can detect the surroundings of the child C1.
  • the first mobile terminal 2B may include a device other than the radar 22 as a configuration for detecting the surrounding environment of the user.
  • the first portable terminal 2B may include a detection device that performs remote detection, such as a camera, a LIDAR (Light Detection and Ranging or Laser Imaging Detection and Ranging), or an infrared sensor, instead of the radar 22.
  • for example, the first mobile terminal 2B may include one or a plurality of cameras instead of the radar 22.
  • the three cameras provided in the first portable terminal 2B capture images in different ranges (imaging ranges AR1 to AR3 shown).
  • the imaging ranges AR1 to AR3 of the three cameras may overlap each other.
  • the three cameras may capture the periphery of the user at predetermined time intervals and generate captured images (for example, still images or moving images).
  • the approach determination unit 291 determines that an object has approached when the object moves to a position within a predetermined distance (a second predetermined distance dt2 shown in the drawing) from the position of the child C1 (that is, into a predetermined range ARC shown in the drawing). The second predetermined distance dt2 is, for example, a distance of about several meters.
  • the notification unit 421 of the data center 4B controls the notification function provided in the first mobile terminal 2B when an object is approaching the child C1, based on the detection signal of the radar 22 received from the first mobile terminal 2B. Inform child C1 of the approach of the object.
  • the notification unit 421 notifies the child C1 of the approach of the vehicle m and the approach of the suspicious person among the objects approaching the child C1.
  • FIG. 26 is a flowchart illustrating an example of processing of the approach determination unit 291 and the notification unit 421 according to the third modification example of the embodiment.
  • the approach determination unit 291 acquires the detection signal output by the radar 22 (step S300).
  • the approach determination unit 291 determines whether or not the distance from the child C1 to the detected object is less than the second predetermined distance dt2 based on the acquired detection signal (step S302).
  • when the approach determination unit 291 determines that the distance from the child C1 to the detected object is not less than the second predetermined distance dt2 (NO side of step S302), the approach determination unit 291 returns the process to step S300 and waits.
  • when the approach determination unit 291 determines that the distance from the child C1 to the detected object is less than the second predetermined distance dt2 (YES side of step S302), the process proceeds to step S304. The notification unit 421 performs an object approach notification process for notifying the child C1 of the approach of the object (step S304).
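  • The waiting loop of steps S300 to S304 can be sketched as follows, assuming that each detection signal is reduced to a distance in meters; the function name and the signal format are assumptions.

```python
# Illustrative sketch of steps S300-S304: wait until a detected object comes
# within the second predetermined distance dt2.
def first_approach(detected_distances, second_predetermined_distance):
    for i, distance in enumerate(detected_distances):   # step S300: acquire signal
        if distance < second_predetermined_distance:    # step S302: YES
            return i                                    # proceed to step S304
    return None                                         # step S302: NO, keep waiting
```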
  • FIG. 27 is a flowchart showing details of the object approach notification process in step S304 shown in FIG.
  • the notification unit 421 determines whether or not the size of the object indicated in the detection signal acquired in step S300 is greater than or equal to a predetermined size (step S400).
  • the predetermined size is, for example, a size comparable to that of a typical vehicle.
  • the notification unit 421 determines that the size of the object indicated by the detection signal is equal to or larger than the predetermined size (YES side of step S400)
  • the process proceeds to step S402.
  • when the notification unit 421 determines that the size of the object is not equal to or larger than the predetermined size (NO side of step S400), the notification unit 421 advances the process to step S410.
  • the notification unit 421 determines that the object is a vehicle m (step S402).
  • the notification unit 421 acquires a detection signal output after the detection signal acquired in step S300 (step S404).
  • the notification unit 421 determines whether or not the speed of the vehicle m is equal to or higher than a predetermined speed based on the detection signal acquired in step S300 and the detection signal acquired in step S404 (step S406). ).
  • the predetermined speed is, for example, about several to several tens km / h.
  • when the notification unit 421 determines that the speed of the vehicle m is less than the predetermined speed (NO side of step S406), the notification unit 421 advances the process to step S408.
  • when the notification unit 421 determines that the speed of the vehicle m is equal to or higher than the predetermined speed (YES side of step S406), the process proceeds to step S412.
  • the notification unit 421 determines whether or not the vehicle m has finished passing the vicinity of the child C1 (step S408).
  • when the notification unit 421 determines that the vehicle m has finished passing the vicinity of the child C1 (YES side of step S408), the notification unit 421 estimates that there is no danger to the child C1 and ends the process without starting notification of the approach of the vehicle m.
  • the notification unit 421 determines that the vehicle m has not passed through the vicinity of the child C1 (NO side of step S408), the notification unit 421 returns the process to step S404.
  • that is, when the speed of the vehicle m is low, the notification unit 421 estimates that there is no danger to the child C1 even if the vehicle m approaches, and does not start notification.
  • the notification unit 421 determines whether or not the movement pattern of the object matches the suspicious person's pattern (step S410).
  • the suspicious person's pattern is, for example, a pattern in which a specific object is encountered a predetermined number of times or more during a unit period related to the movement of the child C1 (for example, while attending or leaving school on a given day).
  • the suspicious person's pattern is a pattern in which an object reciprocates along a path that matches the movement path of the child C1 during a unit period related to the movement of the child C1, for example.
  • the suspicious person's pattern is, for example, a pattern in which the feature of the object matches the suspicious person's feature of the suspicious person information.
  • when the notification unit 421 determines that the movement pattern of the object does not match the pattern of the suspicious person (NO side of step S410), the notification unit 421 estimates that the object is not a suspicious person and ends the process without starting notification of the approach of the object to the child C1.
  • when the notification unit 421 determines that the movement pattern of the object matches the suspicious person's pattern (YES in step S410, that is, the object is a suspicious person), the notification unit 421 advances the process to step S412. The notification unit 421 starts notification (step S412).
  • the notification unit 421 acquires a detection signal output after the detection signal acquired in step S300 (step S414).
  • the notification unit 421 determines whether or not the vehicle m or the suspicious person is approaching the child C1 based on the detection signal acquired in step S300 and the detection signal acquired in step S414 (step S416). When the vehicle m or the suspicious person is approaching the child C1 (YES side of step S416), the notification unit 421 advances the process to step S418. On the other hand, when the vehicle m or the suspicious person is not approaching the child C1 (NO side of step S416), the notification unit 421 advances the process to step S420.
  • the notification unit 421 increases the notification amount (step S418) and returns the process to step S416. Further, the notification unit 421 determines whether or not the vehicle m or the suspicious person has passed near the child C1 (step S420). When the notification unit 421 determines that the vehicle m or the suspicious person has not passed near the child C1 (NO side of step S420), the notification unit 421 returns the process to step S416 and repeats the processes of steps S416 to S418. On the other hand, when the vehicle m or the suspicious person has passed the vicinity of the child C1 (YES side of step S420), the notification unit 421 advances the process to step S422. The notification unit 421 stops the notification (step S422).
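  • The classification in steps S400 to S412 can be sketched as follows. The thresholds, the pattern flag, and the returned labels are assumptions; in the specification the size, speed, and movement pattern are derived from successive detection signals.

```python
# Illustrative sketch of steps S400-S412: decide which notification, if any,
# should start for a detected object. All thresholds and names are assumptions.
def classify_object(size, speed, matches_suspicious_pattern,
                    predetermined_size=2.0, predetermined_speed=10.0):
    if size >= predetermined_size:            # step S400: vehicle-sized object
        if speed >= predetermined_speed:      # step S406: fast vehicle
            return "notify_vehicle"           # step S412: start notification
        return "wait_for_passing"             # step S408: slow vehicle, no danger
    if matches_suspicious_pattern:            # step S410: suspicious pattern
        return "notify_suspicious"            # step S412: start notification
    return "no_notification"                  # neither a vehicle nor a suspicious person
```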
  • the information communication system 1B can more appropriately notify the child C1 of the approach of the vehicle or the suspicious person.
  • FIG. 28 is a diagram illustrating an example of the configuration of the first mobile terminal 2C according to the fourth modification example of the embodiment.
  • the information communication system 1B according to the fourth modification of the embodiment includes a first mobile terminal 2C instead of the first mobile terminal 2B or in addition to the first mobile terminal 2B.
  • the first mobile terminal 2C includes a communication unit 21, a radar 22, a microphone 23, a position detection unit 24, a vibrator 25, a speaker 26, a display unit 27, an acceleration sensor 28, and a control unit 29C.
  • the control unit 29C includes, for example, an approach determination unit 291 and an action recognition unit 292 as functional units in addition to the configuration included in the control unit 29.
  • the motion recognition unit 292 recognizes whether or not the wearer has performed a predetermined motion based on the detection result of the detection function (radar 22 in this example) included in the first mobile terminal 2C.
  • the predetermined operation is an operation that is preferably performed by the wearer at the geographical location.
  • the predetermined operation is, for example, a stop operation that stops at a geographical position.
  • the predetermined operation is, for example, a left / right confirmation operation for confirming left / right at a geographical position.
  • the predetermined operation is, for example, a hand raising operation at a geographical position.
  • the detection range of the radar 22 includes the movable range of the portable person's head and the movable range of the portable person's hand.
  • the motion recognition unit 292 recognizes a predetermined motion performed by the carrier based on the detection signal output by the radar 22, and transmits information indicating the recognized predetermined motion to the data center 4B.
  • FIG. 29 is a diagram schematically illustrating timing for performing recognition processing of a predetermined action according to the fourth modification example of the embodiment.
  • the first mobile terminal 2C starts notification by the vibrator 25 at a position where the distance to the geographical position PT1 is less than the first predetermined distance dt1 (the position of "child C1 at time t" shown in the figure).
  • the child C1 performs a stop operation that stops at the geographical position PT1.
  • processing when the motion recognition unit 292 recognizes various predetermined motions will be described.
  • the motion recognition unit 292 transmits the position information acquired by the position detection unit 24 to the data center 4B.
  • the data center 4B refers to the geographical position information stored in the storage unit 44B, and specifies the geographical position PT1 existing in the vicinity of the position of the child C1 indicated by the position information.
  • the data center 4B transmits information indicating the identified geographical position PT1 to the first mobile terminal 2C.
  • the motion recognition unit 292 determines whether or not the position of the child C1 specified by the position detection unit 24 is the geographical position PT1. Further, based on the detection signal output by the radar 22, the motion recognition unit 292 recognizes that the child C1 has stopped when the surrounding environment obtained from the detection signal has not changed for a predetermined time.
  • the action recognition unit 292 integrates the above-described determinations, and recognizes that the stop action has been performed when the child C1 is stopped at the geographical position PT1.
  • the motion recognition unit 292 may perform the process of recognizing the stop motion over the interval from a remote position separated from the geographical position PT1 by a predetermined distance (the third predetermined distance dt3 in the illustrated example) until the geographical position PT1 is reached.
  • in this case, the motion recognition unit 292 specifies a remote position that is the third predetermined distance dt3 short of the geographical position PT1 in the vicinity of the current position acquired from the data center 4B, and may perform the process of recognizing the stop motion from the specified remote position until the child C1 arrives at the geographical position PT1.
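The stop-motion recognition above (being at PT1 while the radar-observed environment stays unchanged for a predetermined time) can be sketched as follows; the sample format and the variance threshold are assumptions, not values taken from the specification:

```python
def is_stop_motion(at_pt1, range_samples, threshold_m=0.2):
    """Sketch: the carrier is judged to have stopped when the terminal is at
    the geographical position PT1 and the radar range readings collected over
    the predetermined window vary by less than threshold_m (an assumed
    'environment unchanged' criterion)."""
    if not at_pt1 or not range_samples:
        return False
    return max(range_samples) - min(range_samples) < threshold_m
```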
  • the motion recognition unit 292 recognizes that a left/right confirmation operation has been performed when the head (for example, the face) of the child C1 turns leftward and rightward relative to the child C1.
  • the motion recognition unit 292 recognizes that the hand raising operation has been performed when the hand portion (for example, the palm) of the child C1 is positioned upward with respect to the child C1.
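The left/right confirmation and hand-raising recognitions can likewise be sketched as simple geometric tests; the head-yaw angles and height values are hypothetical stand-ins for whatever quantities the radar 22 actually yields:

```python
def left_right_checked(yaw_series_deg, min_turn_deg=45.0):
    """Judged performed when the head yaw (0 = straight ahead, negative =
    left, positive = right; conventions assumed) exceeds min_turn_deg in
    both directions within the observation window."""
    looked_left = any(y <= -min_turn_deg for y in yaw_series_deg)
    looked_right = any(y >= min_turn_deg for y in yaw_series_deg)
    return looked_left and looked_right

def hand_raised(palm_height_m, head_height_m):
    """Judged performed when the palm is positioned above the carrier."""
    return palm_height_m > head_height_m
```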
  • the motion recognition unit 292 transmits information indicating that the predetermined operation has been recognized to the data center 4B at the timing when it recognizes that the predetermined operation has been performed.
  • the notification unit 421 stops the notification when, after starting the notification indicating the approach of the geographical position to the child C1, it receives from the motion recognition unit 292 the information indicating that the predetermined operation has been performed.
  • the motion recognition unit 292 may be able to recognize all of the predetermined operations (for example, the stop operation, the left/right confirmation operation, and the hand-raising operation), or only some of them (for example, only the left/right confirmation operation and the hand-raising operation).
  • FIG. 30 is a flowchart illustrating an example of notification stop processing of the notification unit 421 according to the fourth modification of the embodiment.
  • the notification unit 421 determines whether notification has been started (step S500). When notification has not been started (NO in step S500), the notification unit 421 repeats the process of step S500 and waits until notification is started. When notification has been started (YES in step S500), the notification unit 421 advances the process to step S502. The notification unit 421 then determines whether it has received information indicating that the predetermined operation has been performed (step S502).
  • when the information has not been received (NO in step S502), the notification unit 421 repeats the process of step S502 and waits until it receives the information. When the notification unit 421 receives the information indicating that the predetermined operation has been performed (YES in step S502), it advances the process to step S504 and stops the notification (step S504).
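The flow of FIG. 30 (steps S500, S502, S504) can be sketched as a small state machine over incoming messages; the event names are illustrative only:

```python
def notification_stop_flow(events):
    """events: iterable of message strings arriving at the notification unit.
    Mirrors FIG. 30: wait for the notification to start (S500), then wait for
    the 'predetermined operation performed' message (S502), then stop the
    notification (S504)."""
    started = False
    for ev in events:
        if not started:
            if ev == "notify_started":      # S500: YES
                started = True
        elif ev == "motion_performed":      # S502: YES
            return "notification_stopped"   # S504
    return "waiting"
```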
  • with this configuration, the information communication system 1B estimates that an action necessary for traffic safety has been performed when the child C1 performs the predetermined operation at the geographical position, and can prevent an unnecessary notification from being continued.
  • although the notification unit 421 notifies the child C1 by vibrating the vibrator 25 in the second to fourth modifications described above, the present invention is not limited to this.
  • the notification may be performed, for example, by outputting sound from the speaker 26.
  • in that case, increasing the notification amount means, for example, at least one of increasing the sound volume and increasing the frequency of sound output.
  • although the motion recognition unit 292 recognizes the moving speed of the child C1 based on the detection signal output by the radar 22, the present invention is not limited to this.
  • the moving speed of the child C1 may be recognized based on the output of the acceleration sensor 28, for example.
  • in this case, the motion recognition unit 292 derives the moving speed of the child C1 by integrating the output of the acceleration sensor 28.
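The derivation of moving speed by integrating the acceleration sensor output can be sketched numerically; this simple rectangular integration ignores sensor bias and drift, which a real implementation would have to correct:

```python
def speed_from_accel(accel_samples_mps2, dt_s, v0_mps=0.0):
    """Integrate forward acceleration (m/s^2) sampled every dt_s seconds to
    obtain the moving speed in m/s. Bias/drift correction and direction
    handling are deliberately omitted from this sketch."""
    v = v0_mps
    for a in accel_samples_mps2:
        v += a * dt_s
    return v
```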
  • the notification unit 421 may perform notification based on the moving speed of the child C1. For example, when the moving speed of the child C1 is high, the notification unit 421 may start the notification earlier or increase the notification amount compared to when the moving speed is low. Thereby, the information communication system 1B can promptly notify a child C1 who is running and therefore likely to dash out into the road.
  • although the notification unit 421 determines the approach to the geographical position PT1 based on the position information of the child C1 acquired by the position detection unit 24 and the geographical position information, the present invention is not limited to this.
  • the notification unit 421 may derive the predicted arrival time at the geographical position PT1 based on the moving speed of the child C1, and determine whether or not the child C1 has approached the geographical position PT1 based on the derived predicted arrival time.
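The predicted-arrival-time variant of the approach determination can be sketched as follows; the ETA threshold is a hypothetical parameter, and the effect is that a fast-moving child triggers notification at a greater distance than a slow-moving one:

```python
def approaching_by_eta(distance_m, speed_mps, eta_threshold_s=10.0):
    """Declares an approach when the predicted arrival time at PT1 (distance
    divided by current speed) falls below an assumed threshold."""
    if speed_mps <= 0.0:
        return False  # stationary: no ETA-based approach can be derived
    return distance_m / speed_mps < eta_threshold_s
```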
  • the geographical position PT1 is determined in advance and stored as the geographical position information in the storage unit 44B.
  • the geographical location PT1 may be specified from, for example, the second mobile terminal 3 possessed by the guardian of the child C1.
  • the second portable terminal 3 executes a program (software) for setting the geographical position PT1, which is stored in a storage unit (not shown) of the second portable terminal 3.
  • a map image is displayed on the display unit 34 by executing this program (software).
  • the guardian inputs the start position and the end position of the route on which the child C1 walks when going to and from school to a reception unit (not shown) provided in the second mobile terminal 3.
  • the reception unit is, for example, an operating device such as a button, a keyboard, or a mouse.
  • alternatively, the reception unit may be a touch panel formed integrally with the display unit 34.
  • the second portable terminal 3 superimposes, on the map image, a plurality of candidates for the route rt that the child C1 is estimated to walk, based on the start position and the end position received by the reception unit, and presents them on the display unit 34.
  • the guardian selects a route rt on which the child C1 actually walks from among the plurality of presented route rt candidates.
  • the accepting unit accepts the guardian's operation of designating, on the selected (designated) route rt, the geographical position PT1 at which the guardian wants the child C1 to perform a predetermined action.
  • the information on the geographical position PT1 in the storage unit 44B includes information indicating the geographical position PT1 designated by the guardian. Thereby, the information communication system 1B can notify the child C1 of the approach to the geographical position PT1 designated by the guardian.
  • the accepting unit may allow the guardian to designate the route rt on which the child C1 actually walks by inputting the position of the route rt on the touch panel over the displayed map image.
  • the control unit 36 may correct the designated position to a position on the route rt or on the sidewalk in the vicinity of the designated position.
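The position correction described above can be sketched as snapping the guardian's designated point to the nearest point of the route rt; this toy version snaps to route vertices only, whereas a real implementation would presumably project onto route segments and sidewalk geometry:

```python
def snap_to_route(designated, route_vertices):
    """Moves a point designated off the route rt to the nearest route vertex
    (squared-distance comparison, so no square root is needed)."""
    px, py = designated
    return min(route_vertices, key=lambda q: (q[0] - px) ** 2 + (q[1] - py) ** 2)
```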
  • although the case where the notification unit 421 performs notification indicating the approach of the geographical position PT1 has been described, the present invention is not limited to this.
  • the notification unit 421 may perform notification that prompts the predetermined operation at the geographical position PT1. In this case, for example, when the approach determination unit 291 determines that the child C1 has approached the geographical position PT1, the notification unit 421 causes the speaker 26 to output a voice such as "Stop at the intersection and check left and right! Then raise your hand and cross the pedestrian crossing!"
  • the notification unit 421 may notify the child C1 of the evaluation result of the predetermined operation.
  • the notification unit 421 may perform notification that praises the child C1.
  • the notification that praises the child C1 is, for example, notification in which the vibrator 25 vibrates in a predetermined rhythm, or in which the speaker 26 outputs a voice praising the child C1, such as "You checked left and right properly!"
  • the notification unit 421 may also perform notification that cautions the child C1.
  • the notification that cautions the child C1 is, for example, notification in which the vibrator 25 vibrates in a predetermined rhythm, or in which the speaker 26 outputs a voice warning the child C1, such as "It is dangerous if you do not stop properly."
  • the notification unit 421 may notify the evaluation result of the predetermined operation every time the predetermined operation is performed, or whenever the distance from the child C1 to the geographical position PT1 becomes less than the first predetermined distance dt1.
  • the notification unit 421 does not need to notify the evaluation result every time.
  • for example, the notification unit 421 may refrain from notifying the evaluation result while the child C1 is within the first predetermined distance dt1 of the geographical position PT1. Thereby, the notification unit 421 can prevent the evaluation-result notification from distracting the child C1 in the vicinity of the geographical position PT1.
  • the control unit 42B may be configured to score the predetermined action performed by the child C1.
  • the control unit 42B may score the predetermined action based on whether or not the child C1 performed it at the geographical position PT1, and transmit information indicating the scoring result to the second portable terminal 3 of the guardian. Further, the control unit 42B may transmit the scoring result of the predetermined operation to the second portable terminal 3 every time the child C1 approaches the geographical position PT1.
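The scoring itself is not detailed in the specification; one conceivable minimal rule, shown purely for illustration, is the percentage of prescribed actions actually performed:

```python
def score_actions(results):
    """results: list of (geographical_position, performed) pairs, one per
    approach to a set position. Returns the percentage of prescribed actions
    actually performed, as one possible scoring rule (an assumption)."""
    if not results:
        return 0.0
    performed = sum(1 for _, ok in results if ok)
    return 100.0 * performed / len(results)
```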
  • the allocation of functions between the first mobile terminal 2, 2B, or 2C and the data center 4B is not limited to that described above.
  • a configuration provided in the first mobile terminal 2, 2B, or 2C may instead be provided in the data center 4B, and a configuration provided in the data center 4B may instead be provided in the first mobile terminal 2, 2B, or 2C.
  • the geographical position information may be stored in the first mobile terminals 2, 2B, and 2C rather than in the storage unit 44B.
  • although the notification unit 421 of the data center 4B determines the approach between the first mobile terminal 2 and the geographical position PT1 based on the geographical position information stored in the storage unit 44B and the position information acquired by the position detection unit 24 of the first mobile terminal 2, the present invention is not limited to this.
  • the approach determination unit 291 of each of the first mobile terminals 2B and 2C shown in the third and fourth modifications described above may receive the geographical position information from the data center 4B and determine the approach between the first mobile terminal and the geographical position PT1.
  • each of the first mobile terminals 2, 2B, and 2C includes the radar 22 composed of a front radar and a rear radar.
  • however, the present invention is not limited to this, and a camera, a LIDAR (Light Detection and Ranging, or Laser Imaging Detection and Ranging), or an infrared sensor may be provided instead of the radar 22.
  • a program for realizing all or part of the functions of the data centers 4, 4A, and 4B in the present invention may be recorded on a computer-readable recording medium, and all or part of the processing performed by each of the data centers 4, 4A, and 4B may be performed by causing a computer system to read and execute the program recorded on the recording medium.
  • the “computer system” includes an OS and hardware such as peripheral devices.
  • the “computer system” includes a WWW system having a homepage providing environment (or display environment).
  • the “computer-readable recording medium” means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into the computer system.
  • the “computer-readable recording medium” also includes a medium that holds a program for a certain period of time, such as a volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
  • the program may be transmitted from a computer system that stores the program in a storage device or the like to another computer system via a transmission medium, or by transmission waves in the transmission medium.
  • the “transmission medium” for transmitting the program refers to a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
  • the program may be for realizing a part of the functions described above.
  • furthermore, the program may be a so-called difference file (difference program) that realizes the above-described functions in combination with a program already recorded in the computer system.
  • according to the present invention, it is possible to provide an operation confirmation system capable of confirming a predetermined operation of a moving person.
  • 34 ... Display unit (position setting unit), 35 ... Speaker, 36 ... Control unit (notification unit, position setting unit, position correction unit), 41 ... Communication unit, 42, 42A, 42B ... Control unit (position acquisition unit, position determination unit, notification unit, operation acquisition unit, operation determination unit, evaluation unit), 43 ... Model storage unit, 44, 44A, 44B ... Storage unit, 45 ... Voice recognition model storage unit, 46 ... Voice recognition unit, 47 ... Scenario storage unit, 292 ... Motion recognition unit (motion determination unit, speed acquisition unit, output unit), 421 ... Notification unit (notification correction unit, output unit), ARP ... Predetermined range (separation position, notification region), C1 ... Child (moving person, pedestrian, child), PT1 ... Geographical position (first set position)

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Signal Processing (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Automation & Control Theory (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physiology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Alarm Systems (AREA)
  • Purses, Travelling Bags, Baskets, Or Suitcases (AREA)
  • Emergency Alarm Devices (AREA)
  • Traffic Control Systems (AREA)
  • Telephonic Communication Services (AREA)
  • Navigation (AREA)

Abstract

An information communication system (1) includes a first terminal (2) and a control unit (42). The first mobile terminal (2) is carried by a moving person. The control unit (42) is located in a data center (4) capable of communicating with the first mobile terminal (2). The control unit (42) acquires actions of the moving person and, on the basis of those actions, determines that a prescribed action has been performed.
PCT/JP2019/022897 2018-06-11 2019-06-10 Système de vérification d'action WO2019240070A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2020525549A JP7523347B2 (ja) 2018-06-11 2019-06-10 動作確認システム

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2018111419 2018-06-11
JP2018-111419 2018-06-11
JP2018-146316 2018-08-02
JP2018146316 2018-08-02

Publications (1)

Publication Number Publication Date
WO2019240070A1 true WO2019240070A1 (fr) 2019-12-19

Family

ID=68842253

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2019/022888 WO2019240065A1 (fr) 2018-06-11 2019-06-10 Système d'évaluation d'action
PCT/JP2019/022897 WO2019240070A1 (fr) 2018-06-11 2019-06-10 Système de vérification d'action

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/022888 WO2019240065A1 (fr) 2018-06-11 2019-06-10 Système d'évaluation d'action

Country Status (2)

Country Link
JP (1) JP7424974B2 (fr)
WO (2) WO2019240065A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7494825B2 (ja) 2021-10-05 2024-06-04 Toyota Motor Corporation Learning support device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023203972A1 (fr) * 2022-04-20 2023-10-26 Sony Group Corporation Information processing system, server, and information processing method

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05232873A (ja) * 1992-02-19 1993-09-10 Honda Motor Co Ltd Vehicle travel guidance device
JP2005106742A (ja) * 2003-10-01 2005-04-21 Clarion Co Ltd Navigation device, method, and program
JP2009009281A (ja) * 2007-06-27 2009-01-15 Ntt Docomo Inc Traffic accident prevention system, server device, and traffic accident prevention method
JP2009123105A (ja) * 2007-11-16 2009-06-04 Nissan Motor Co Ltd Information notification system, portable terminal device, in-vehicle device, and information transmission method
JP2010041281A (ja) * 2008-08-04 2010-02-18 Yupiteru Corp Vehicle alarm device
WO2010109836A1 (fr) * 2009-03-24 2010-09-30 Sharp Corporation Notification device, notification system, notification device control method, control program, and computer-readable recording medium storing the program
JP2011209794A (ja) * 2010-03-29 2011-10-20 Hiromitsu Hama Object recognition system, and monitoring system and watching system using the same
JP2014009984A (ja) * 2012-06-28 2014-01-20 Zenrin Datacom Co Ltd Route search system, portable terminal, and route guidance method
JP2015184975A (ja) * 2014-03-25 2015-10-22 Fujitsu Ten Ltd Vehicle device and vehicle control method
WO2016111067A1 (fr) * 2015-01-05 2016-07-14 Sony Corporation Information processing device, information processing method, and program
WO2018008314A1 (fr) * 2016-07-07 2018-01-11 Denso Corporation Pedestrian detection device and pedestrian detection method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005275529A (ja) 2004-03-23 2005-10-06 Fujitsu Ltd 情報管理方法および情報管理装置



Also Published As

Publication number Publication date
JPWO2019240070A1 (ja) 2021-07-29
JP7424974B2 (ja) 2024-01-30
WO2019240065A1 (fr) 2019-12-19
JPWO2019240065A1 (ja) 2021-06-24

Similar Documents

Publication Publication Date Title
US11663516B2 (en) Artificial intelligence apparatus and method for updating artificial intelligence model
US11455792B2 (en) Robot capable of detecting dangerous situation using artificial intelligence and method of operating the same
US11302311B2 (en) Artificial intelligence apparatus for recognizing speech of user using personalized language model and method for the same
US11495214B2 (en) Artificial intelligence device for providing voice recognition service and method of operating the same
US11164586B2 (en) Artificial intelligence apparatus and method for recognizing utterance voice of user
KR20190083317A (ko) Artificial intelligence apparatus for providing notification related to lane change of a vehicle, and method therefor
JP2024028516A (ja) Notification system
US20210173614A1 (en) Artificial intelligence device and method for operating the same
US11755033B2 (en) Artificial intelligence device installed in vehicle and method therefor
US11398222B2 (en) Artificial intelligence apparatus and method for recognizing speech of user in consideration of user's application usage log
US11182922B2 (en) AI apparatus and method for determining location of user
US11977384B2 (en) Control system for controlling a plurality of robots using artificial intelligence
WO2019240070A1 (fr) Action verification system
EP3893162A1 (fr) Appareil d'intelligence artificielle à l'aide d'une pluralité de couches de sortie et son procédé
US11449074B2 (en) Robot for providing guidance service using artificial intelligence and method of operating the same
US11322134B2 (en) Artificial intelligence device and operating method thereof
US11074814B2 (en) Portable apparatus for providing notification
US11810575B2 (en) Artificial intelligence robot for providing voice recognition function and method of operating the same
JP7523347B2 (ja) Operation confirmation system
WO2020203605A1 (fr) Notification system, notification method, program, and storage medium
JP7289618B2 (ja) Notification system
WO2020203691A1 (fr) Display system, display method, program, and recording medium
WO2021085583A1 (fr) Electronic device
JP7442295B2 (ja) Electronic device
JP2020166507A (ja) Electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19819319

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2020525549

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19819319

Country of ref document: EP

Kind code of ref document: A1