WO2016098457A1 - Information processing apparatus, information processing method, and program - Google Patents
Information processing apparatus, information processing method, and program Download PDF Info
- Publication number
- WO2016098457A1 (PCT/JP2015/080290; JP 2015080290 W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor data
- information
- sensor
- information processing
- feature
- Prior art date
Links
Images
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3605—Destination input or retrieval
- G01C21/362—Destination input or retrieval received from an external device or application, e.g. PDA, mobile phone or calendar application
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/0252—Radio frequency fingerprinting
- G01S5/02521—Radio frequency fingerprinting using a radio-map
- G01S5/02522—The radio-map containing measured values of non-radio values
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/0257—Hybrid positioning
- G01S5/0263—Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems
- G01S5/0264—Hybrid positioning by combining or switching between positions derived from two or more separate positioning systems at least one of the systems being a non-radio wave positioning system
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S2205/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S2205/01—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations specially adapted for specific applications
- G01S2205/02—Indoor
Definitions
- This disclosure relates to an information processing apparatus, an information processing method, and a program.
- GNSS: Global Navigation Satellite System
- GPS: Global Positioning System
- Although the autonomous positioning technique described in, for example, Patent Document 1 can be applied in a wide range of cases, there is a limit to how far it can eliminate the effects of errors caused by individual differences in how the terminal device is worn or carried and in the user's movement. Further, since the positioning is relative, the influence of errors may accumulate. Therefore, when positioning using GNSS or access points as described above is difficult to use, there is a need for a technique for accurately estimating the position of the user based on an absolute reference.
- According to the present disclosure, there is provided an information processing apparatus including: a feature extraction unit that extracts features of first sensor data provided by a sensor carried or worn by a user; a matching unit that matches the features of the first sensor data against features of second sensor data that correspond to the first sensor data and are associated with given position information; and a position estimation unit that estimates the position of the user based on a result of the matching.
- Further, according to the present disclosure, there is provided an information processing method including: extracting features of first sensor data provided by a sensor carried or worn by a user; matching the features of the first sensor data against features of second sensor data that correspond to the first sensor data and are associated with given position information; and estimating the position of the user based on a result of the matching.
- Further, according to the present disclosure, there is provided a program for causing a processing circuit to realize: a function of extracting features of first sensor data provided by a sensor carried or worn by a user; a function of matching the features of the first sensor data against features of second sensor data that correspond to the first sensor data and are associated with given position information; and a function of estimating the position of the user based on a result of the matching.
- FIG. 1 is a block diagram illustrating an example of an overall configuration of an embodiment of the present disclosure.
- FIG. 2A is a block diagram illustrating another example of the overall configuration of an embodiment of the present disclosure.
- FIG. 2B is a block diagram illustrating another example of the overall configuration of an embodiment of the present disclosure.
- FIG. 3 is a schematic block diagram illustrating a first example of functional configurations of an input unit, a processing unit, and an output unit according to an embodiment of the present disclosure.
- FIG. 4 is a schematic block diagram illustrating a second example of functional configurations of an input unit, a processing unit, and an output unit according to an embodiment of the present disclosure.
- FIG. 5 is a diagram for describing an overview of map learning and position estimation according to an embodiment of the present disclosure.
- FIG. 6 is a block diagram illustrating a first example of a system configuration according to an embodiment of the present disclosure.
- FIG. 7 is a block diagram illustrating a second example of a system configuration according to an embodiment of the present disclosure.
- FIG. 8 is a block diagram illustrating a third example of a system configuration according to an embodiment of the present disclosure.
- FIG. 9 is a block diagram illustrating a fourth example of a system configuration according to an embodiment of the present disclosure.
- FIG. 10 is a block diagram illustrating a hardware configuration example of an information processing apparatus according to an embodiment of the present disclosure.
- FIG. 1 is a block diagram illustrating an example of the overall configuration of an embodiment of the present disclosure.
- the system 10 includes an input unit 100, a processing unit 200, and an output unit 300.
- the input unit 100, the processing unit 200, and the output unit 300 are realized by one or a plurality of information processing apparatuses, as shown in a configuration example of the system 10 described later.
- the input unit 100 includes, for example, an operation input device, a sensor, or software that acquires information from an external service, and receives input of various information from the user, the surrounding environment, or other services.
- the operation input device includes, for example, hardware buttons, a keyboard, a mouse, a touch panel, a touch sensor, a proximity sensor, an acceleration sensor, a gyro sensor, a temperature sensor, and the like, and receives an operation input by a user.
- the operation input device may include a camera (imaging device), a microphone, or the like that receives an operation input expressed by a user's gesture or voice.
- the input unit 100 may include a processor or a processing circuit that converts a signal or data acquired by the operation input device into an operation command.
- the input unit 100 may output a signal or data acquired by the operation input device to the interface 150 without converting it into an operation command.
- the signal or data acquired by the operation input device is converted into an operation command by the processing unit 200, for example.
- the sensor includes an acceleration sensor, a gyro sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, and the like, and detects acceleration, angular velocity, direction, illuminance, temperature, atmospheric pressure, and the like applied to the apparatus.
- the various sensors described above can detect various information as information related to the user, for example, information indicating the user's movement and orientation.
- the sensor may include a sensor that detects user's biological information such as pulse, sweat, brain wave, touch, smell, and taste.
- The input unit 100 may include a processing circuit that acquires information indicating the user's emotion by analyzing the information detected by these sensors and/or image or sound data acquired by a camera or microphone described later. Alternatively, the above information and/or data may be output to the interface 150 without being analyzed, and the analysis may be executed by, for example, the processing unit 200.
- the sensor may acquire an image or sound near the user or the device as data using a camera, a microphone, the various sensors described above, or the like.
- the sensor may include position detection means for detecting an indoor or outdoor position.
- the position detection means may include a GNSS (Global Navigation Satellite System) receiver and / or a communication device.
- the GNSS can include, for example, GPS (Global Positioning System), GLONASS (Global Navigation Satellite System), BDS (BeiDou Navigation Satellite System), QZSS (Quasi-Zenith Satellite Systems), or Galileo.
- The communication device detects the position using a technique such as Wi-Fi, MIMO (Multi-Input Multi-Output), cellular communication (for example, position detection using a mobile base station or a femtocell), or short-range wireless communication (for example, BLE (Bluetooth Low Energy) or Bluetooth (registered trademark)).
- When a sensor as described above detects the user's position and situation (including biological information), the device including the sensor is, for example, carried or worn by the user. Alternatively, even when the device including the sensor is installed in the user's living environment, it may be possible to detect the user's position and situation (including biological information). For example, the user's pulse can be detected by analyzing an image including the user's face acquired by a camera fixed in a room or the like.
- Further, the input unit 100 may include a processor or a processing circuit for converting a signal or data acquired by the sensor into a predetermined format (for example, converting an analog signal into a digital signal, or encoding image or audio data).
- Alternatively, the input unit 100 may output the acquired signal or data to the interface 150 without converting it into the predetermined format. In that case, the signal or data acquired by the sensor is converted into the predetermined format by, for example, the processing unit 200.
- Software that obtains information from an external service obtains various types of information provided by the external service using, for example, an API (Application Program Interface) of the external service.
- the software may acquire information from an external service server, or may acquire information from application software of a service executed on the client device.
- information such as text and images posted by users or other users to external services such as social media can be acquired.
- the acquired information does not necessarily have to be intentionally posted by the user or other users, and may be, for example, a log of operations performed by the user or other users.
- The acquired information is not limited to the personal information of the user or other users; it may be, for example, information distributed to an unspecified number of users, such as news, weather forecasts, traffic information, POI (Point Of Interest) information, or advertisements.
- Further, like the information acquired by the various sensors described above, the information acquired from an external service may include information that is detected by a sensor included in another system cooperating with the external service and generated by being posted to the external service, such as acceleration, angular velocity, azimuth, illuminance, temperature, atmospheric pressure, biological information such as pulse, sweating, brain waves, tactile sensation, smell, and taste, emotion, and position information.
- the interface 150 is an interface between the input unit 100 and the processing unit 200.
- the interface 150 may include a wired or wireless communication interface.
- the Internet may be interposed between the input unit 100 and the processing unit 200.
- More specifically, the wired or wireless communication interface includes, for example, cellular communication such as 3G/LTE, Wi-Fi, Bluetooth (registered trademark), NFC (Near Field Communication), Ethernet (registered trademark), HDMI (registered trademark) (High-Definition Multimedia Interface), USB (Universal Serial Bus), and the like.
- When the input unit 100 and the processing unit 200 are realized in a single device, the interface 150 may include a bus in the device, data reference in a program module, and the like (hereinafter also referred to as an interface within the device). Further, when the input unit 100 is realized by being distributed to a plurality of devices, the interface 150 may include different types of interfaces for the respective devices. For example, the interface 150 may include both a communication interface and an interface within the device.
- The processing unit 200 executes various processes based on the information acquired by the input unit 100. More specifically, for example, the processing unit 200 includes a processor or processing circuit such as a central processing unit (CPU), a digital signal processor (DSP), an application specific integrated circuit (ASIC), or a field-programmable gate array (FPGA).
- the processing unit 200 may include a memory or a storage device that temporarily or permanently stores a program executed in the processor or the processing circuit and data read / written in the processing.
- The processing unit 200 may be realized by a single processor or processing circuit in a single device, or may be realized by being distributed to a plurality of devices, or to a plurality of processors or processing circuits in the same device.
- When the processing unit 200 is distributed in this way, an interface 250 is interposed between the divided portions of the processing unit 200, as in the example illustrated in FIGS. 2A and 2B.
- the interface 250 may include a communication interface or an interface in the apparatus, similar to the interface 150 described above.
- In the drawings, individual functional blocks constituting the processing unit 200 are illustrated, but the interface 250 may be interposed between any of the functional blocks. That is, when the processing unit 200 is realized by being distributed to a plurality of devices, or to a plurality of processors or processing circuits, how the functional blocks are allocated to each device, processor, or processing circuit is arbitrary unless otherwise specified.
- The output unit 300 outputs the information provided from the processing unit 200 to a user (who may be the same user as the user of the input unit 100 or a different user), an external device, or another service.
- the output unit 300 may include an output device, a control device, or software that provides information to an external service.
- The output device outputs the information provided from the processing unit 200 in a form perceived by a sense of the user (who may be the same user as the user of the input unit 100 or a different user), such as sight, hearing, touch, smell, or taste.
- the output device is a display and outputs information as an image.
- The display is not limited to a reflective or self-luminous display such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display; it also includes a combination of a light source and a light guide member that guides image display light to the user's eye, as used in wearable devices.
- the output device may include a speaker and output information by voice.
- the output device may include a projector, a vibrator, and the like.
- the control device controls the device based on the information provided from the processing unit 200.
- the controlled device may be included in a device that implements the output unit 300 or may be an external device. More specifically, for example, the control device includes a processor or a processing circuit that generates a control command.
- the output unit 300 may further include a communication device that transmits a control command to the external device.
- the control device controls a printer that outputs information provided from the processing unit 200 as a printed matter.
- the control device may include a driver that controls writing of information provided from the processing unit 200 to a storage device or a removable recording medium.
- the control device may control a device other than the device that outputs or records the information provided from the processing unit 200.
- For example, the control device may control a lighting device to turn on the illumination, control a television to turn off an image, control an audio device to adjust the volume, or control a robot to control its movement.
- the software that provides information to the external service provides the information provided from the processing unit 200 to the external service by using, for example, an API of the external service.
- the software may provide information to a server of an external service, or may provide information to application software of a service executed on the client device.
- the provided information does not necessarily have to be immediately reflected in the external service, and may be provided as a candidate for a user to post or transmit to the external service, for example.
- the software may provide text used as a candidate for a search keyword or URL (Uniform Resource Locator) input by the user in browser software executed on the client device.
- the software may post text, images, videos, sounds, and the like on an external service such as social media on behalf of the user.
- the interface 350 is an interface between the processing unit 200 and the output unit 300.
- the interface 350 may include a wired or wireless communication interface.
- the interface 350 may include an interface in the above-described device.
- the interface 350 may include different types of interfaces for the respective devices.
- interface 350 may include both a communication interface and an interface within the device.
- FIG. 3 is a schematic block diagram illustrating a functional configuration example of the input unit, the processing unit, and the output unit at the time of position estimation according to an embodiment of the present disclosure.
- With reference to FIG. 3, a functional configuration example at the time of position estimation of the input unit 100, the processing unit 200, and the output unit 300 included in the system 10 according to the present embodiment will be described.
- the input unit 100 includes an acceleration sensor 101, a gyro sensor 103, a geomagnetic sensor 105, an atmospheric pressure sensor 107, and / or a Wi-Fi communication device 109 as sensors.
- The Wi-Fi communication device 109 is originally a communication device, but in this embodiment it is used as a sensor for detecting the reception state of radio waves.
- The Wi-Fi communication device 109 may be used for its original communication function at the same time as being used as a sensor for detecting the reception state of radio waves.
- These sensors are carried or worn by the user, for example. More specifically, for example, the user carries or wears a terminal device on which these sensors are mounted.
- Measured values of acceleration, angular velocity, geomagnetism, and / or atmospheric pressure provided by the sensor as described above are provided to the processing unit 200 as sensor data.
- Since the sensor data is used for matching with position information as described later, it is not necessarily limited to data that can directly indicate the user's behavior or position. Therefore, the input unit 100 may include other types of sensors, and conversely, some of the sensors exemplified above may not be included in the input unit 100.
- the Wi-Fi communication device 109 used as a position sensor communicates with one or a plurality of Wi-Fi base stations (access points) installed in a space where the user can move.
- the installation position of each access point does not necessarily need to be specified.
- the Wi-Fi communication device 109 provides the processing unit 200 with information including which access point is communicable and the radio wave intensity from the communicable access point as sensor data.
- the operation input device 111 acquires, for example, an operation input indicating a user instruction regarding generation of position related information described later.
- the input unit 100 may further include a processor or a processing circuit for converting or analyzing data acquired by these sensors and the operation input device.
- the processing unit 200 may include a Wi-Fi feature amount extraction unit 201, a sensor data feature extraction unit 203, a matching / position estimation unit 205, a position related information generation unit 207, and a sensor map 209.
- These functional configurations are realized by, for example, a server processor or processing circuit that communicates with a terminal device, and a memory or storage. Further, some of these functional configurations may be realized by a processor or processing circuit of the same terminal device as the sensor or operation input device included in the input unit 100. A specific example of such a configuration will be described later. Hereinafter, each functional configuration will be further described.
- the Wi-Fi feature amount extraction unit 201 extracts a feature amount related to Wi-Fi communication from the sensor data provided by the Wi-Fi communication device 109 of the input unit 100. For example, the Wi-Fi feature amount extraction unit 201 extracts a Wi-Fi feature amount by hashing an access point capable of communication and the radio wave intensity from the access point. More specifically, the Wi-Fi feature amount extraction unit 201 weights and adds a random number vector uniquely assigned to an access point arranged in the user's movement space according to the radio wave intensity from each access point. By doing so, the Wi-Fi feature value may be extracted.
- Here, the Wi-Fi feature value is not intended to directly indicate position information; it is a hash of the pattern of communicable access points and the radio wave intensities from those access points. Therefore, for example, when Wi-Fi feature values (vectors) extracted from sensor data at different times are close to each other, the user's positions at those times are likely to be close, although which positions they are need not be known at this point. Further, in the present embodiment, Wi-Fi feature quantities that do not include individual access point IDs themselves or access point location information are extracted. Therefore, even when an access point is added, removed, or moved, there is no need to change the Wi-Fi feature extraction procedure or its setting values; a map as described later may simply be regenerated for the changed access point arrangement.
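- As an illustrative sketch only (not the actual implementation of this disclosure), the weighting-and-adding of per-access-point random vectors described above can be written as follows; the use of BSSID strings as access point identifiers and the RSSI-to-weight mapping are assumptions made for the example:

```python
import hashlib
import math
import random

def ap_random_vector(bssid: str, dim: int = 16) -> list[float]:
    """Deterministic pseudorandom unit vector uniquely assigned to an access point."""
    seed = int.from_bytes(hashlib.sha256(bssid.encode()).digest()[:8], "big")
    rng = random.Random(seed)
    v = [rng.gauss(0.0, 1.0) for _ in range(dim)]
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v]

def wifi_feature(scan: dict[str, float], dim: int = 16) -> list[float]:
    """Sum each AP's random vector weighted by its radio wave intensity.

    `scan` maps BSSID -> RSSI in dBm; the RSSI-to-weight mapping below is an
    illustrative choice, not taken from the disclosure."""
    feature = [0.0] * dim
    for bssid, rssi_dbm in scan.items():
        weight = max(0.0, (rssi_dbm + 100.0) / 70.0)  # ~0 at -100 dBm, ~1 at -30 dBm
        vec = ap_random_vector(bssid, dim)
        for i in range(dim):
            feature[i] += weight * vec[i]
    return feature
```

Because each access point contributes a near-orthogonal random direction, two scans that see the same access points at similar intensities yield nearby feature vectors, while scans of disjoint access point sets yield distant ones, without the feature ever encoding an access point ID or location.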
- the sensor data feature extraction unit 203 extracts various features from the sensor data provided by the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, and / or the atmospheric pressure sensor 107 of the input unit 100.
- The extracted features may include features expressed as feature amounts, and may also include features that are not necessarily quantified, such as the action labels described later. More specifically, for example, the sensor data feature extraction unit 203 may extract the user's moving speed, the gravity component, and/or the acceleration component other than gravity from the detected acceleration values provided by the acceleration sensor 101.
- the sensor data feature extraction unit 203 may extract the angular velocity around the vertical axis from the detected angular velocity value provided by the gyro sensor 103. Further, for example, the sensor data feature extraction unit 203 may extract the azimuth from the detected value of geomagnetism provided by the geomagnetic sensor 105.
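- One common way to separate the gravity component from the acceleration component other than gravity, shown here only as an illustrative sketch (the exponential smoothing factor is an assumption, not a value from this disclosure), is a low-pass filter over the accelerometer samples:

```python
def split_gravity(samples, alpha=0.9):
    """Separate 3-axis accelerometer samples (m/s^2) into gravity and linear acceleration.

    `alpha` is an illustrative smoothing factor: higher values make the gravity
    estimate change more slowly. Returns (gravity_series, linear_series)."""
    gravity = [0.0, 0.0, 0.0]
    gravity_series, linear_series = [], []
    for ax, ay, az in samples:
        # Low-pass filter tracks the slowly varying gravity direction.
        gravity = [alpha * g + (1.0 - alpha) * a for g, a in zip(gravity, (ax, ay, az))]
        # Whatever remains is the acceleration other than gravity.
        linear = [a - g for a, g in zip((ax, ay, az), gravity)]
        gravity_series.append(tuple(gravity))
        linear_series.append(tuple(linear))
    return gravity_series, linear_series
```

For a device at rest the gravity estimate converges to the constant 9.8 m/s^2 axis, and the linear component converges to zero.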
- the sensor data feature extraction unit 203 may perform behavior recognition based on the sensor data, and use the behavior label of the user specified by the behavior recognition as the feature of the sensor data. That is, the sensor data feature extraction unit 203 may include an action recognition unit.
- action recognition for example, action labels such as stay, walk, run, jump, stairs, elevator, escalator, bicycle, bus, train, car, ship or airplane can be recognized. Since the action recognition method is described in many documents such as Japanese Patent Application Laid-Open No. 2012-8771, for example, detailed description thereof is omitted.
- the behavior recognition unit can employ any configuration of known behavior recognition technology.
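- As a toy illustration of how an action label might be derived from sensor data (the thresholds below are invented for the example and are not part of this disclosure or of the cited action recognition literature), a window of accelerometer magnitudes can be classified by its variability:

```python
import statistics

def action_label(accel_magnitudes):
    """Toy action recognizer: label a window of accelerometer magnitudes (m/s^2)
    as 'stay', 'walk', or 'run' by the standard deviation of the signal.

    The thresholds are illustrative assumptions only."""
    sd = statistics.pstdev(accel_magnitudes)
    if sd < 0.5:
        return "stay"
    if sd < 3.0:
        return "walk"
    return "run"
```

A practical recognizer would use many more features (frequency content, gyro and barometer data) and a trained classifier, but the output shape is the same: a discrete label per window that can serve as a feature for matching.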
- The matching/position estimation unit 205 matches the sensor data features extracted by the Wi-Fi feature amount extraction unit 201 and the sensor data feature extraction unit 203 (hereinafter sometimes collectively referred to as the feature extraction unit) against the features of the sensor data associated with given position information in the sensor map 209.
- the feature of the sensor data extracted by the feature extraction unit and the feature of the sensor data associated with the position information in the sensor map 209 correspond to each other. More specifically, the characteristics of each sensor data may include common types of characteristics among the sensor data characteristics described above.
- The matching/position estimation unit 205 estimates the position of the user based on the matching result. That is, when the features of the first sensor data extracted by the feature extraction unit match the features of second sensor data defined in the sensor map 209, the matching/position estimation unit 205 estimates the position corresponding to the position information associated with the second sensor data as the position of the user.
- the position estimation by the matching / position estimation unit 205 can be performed based on a snapshot of sensor data provided by the sensor at a single time.
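- A minimal sketch of such snapshot matching, assuming for illustration that the sensor map is simply a list of (feature vector, position) pairs and that Euclidean distance is the similarity measure (the actual map format and metric are implementation details not specified here), is:

```python
import math

def estimate_position(feature, sensor_map):
    """Nearest-neighbor matching of an extracted feature vector against a sensor map.

    `sensor_map` is assumed to be a list of (feature_vector, position) pairs."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    # The map entry whose stored features best match the observed features
    # determines the estimated position.
    best_feature, best_position = min(sensor_map, key=lambda entry: dist(feature, entry[0]))
    return best_position
```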
- the matching / position estimation unit 205 may perform position estimation based on time-series sensor data, that is, sensor data provided by the sensor over a continuous series of times.
- In this case, the matching/position estimation unit 205 matches the features of the first sensor data constituting the time series, extracted by the feature extraction unit, against the features of second sensor data associated with a sequence of position information constituting a route, for example, of mutually adjacent positions. Thus, even when similar sensor data features appear at a plurality of different positions, performing matching on time-series sensor data makes it possible to estimate positions more accurately.
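- A minimal sketch of such time-series matching, assuming for illustration that each candidate route in the sensor map is a sequence of (feature vector, position) pairs aligned step-for-step with the observed feature series (a simplification; a real system might allow time warping), is:

```python
import math

def estimate_route(feature_series, candidate_routes):
    """Match a time series of observed features against candidate routes.

    Each route is assumed to be a sequence of (feature_vector, position) pairs
    of the same length as `feature_series`; the route with the smallest total
    feature distance wins, which disambiguates positions whose individual
    snapshots look alike."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    def cost(route):
        return sum(dist(f, map_f) for f, (map_f, _pos) in zip(feature_series, route))
    best = min(candidate_routes, key=cost)
    return [pos for _f, pos in best]
```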
- The position related information generation unit 207 generates the information to be output from the output unit 300 to the user, based on the information provided from the matching/position estimation unit 205. More specifically, for example, the position related information generation unit 207 may generate information in which the result of the action recognition performed in the sensor data feature extraction unit 203 is arranged on a map generated based on the user position estimated by the matching/position estimation unit 205.
- the output unit 300 can include a display 301, a speaker 303, and a vibrator 305.
- the display 301, the speaker 303, and the vibrator 305 are mounted on, for example, a terminal device that is carried or worn by the user.
- The display 301 outputs information as an image, the speaker 303 outputs information as sound, and the vibrator 305 outputs information as vibration.
- the output information may include information generated by the position related information generation unit 207.
- the display 301, the speaker 303, or the vibrator 305 may be mounted on the same terminal device as the sensor of the input unit 100.
- the display 301, the speaker 303, or the vibrator 305 may be mounted on the same terminal device as the operation input device 111 of the input unit 100.
- the display 301, the speaker 303, or the vibrator 305 may be mounted on a terminal device that is different from the components of the input unit 100.
- a more specific configuration example of the terminal device and the server that realizes the input unit 100, the processing unit 200, and the output unit 300 will be described later.
- FIG. 4 is a schematic block diagram illustrating a functional configuration example of the input unit, the processing unit, and the output unit during map learning according to an embodiment of the present disclosure.
- the output unit 300 may, for example, output information indicating the progress of map learning, the generated map, and the like to the user who is executing map learning; however, since such output is not essential to map learning itself, the illustration and description of the output unit 300 are omitted in the example of map learning.
- the input unit 100 includes an acceleration sensor 101, a gyro sensor 103, a geomagnetic sensor 105, an atmospheric pressure sensor 107, and / or a Wi-Fi communication device 109 as sensors.
- the sensor included in the input unit 100 may be the same as that at the time of position estimation.
- the input unit 100 includes a positioning device / input device 113.
- of the configuration of the input unit 100, the positioning device / input device 113, which differs from the above-described example at the time of position estimation, will be further described.
- the positioning device / input device 113 is used to acquire position information in parallel with acquisition of sensor data.
- the position information acquired by the positioning device / input device 113 is handled as accurate position information.
- for example, accurate position information can be acquired by Visual SLAM (Simultaneous Localization and Mapping) using images acquired by a camera carried or worn by the user while moving around the space in which the user can move.
- the positioning device / input device 113 includes a camera that acquires an image.
- the calculation for Visual SLAM may be executed on the input unit 100 side or may be executed on the processing unit 200 side.
- Visual SLAM is a technique for performing self-position estimation and environment structure mapping in parallel, and is described in, for example, Japanese Patent Application Laid-Open No. 2007-156016.
- Visual SLAM means SLAM executed using an image.
- an image may be acquired with a stereo camera (two or more cameras), or an image may be acquired by moving one camera.
- the accurate position information may be absolute coordinates in the space input by the user himself (or an accompanying person).
- the positioning device / input device 113 is realized by an input device that accepts input of absolute coordinates, for example.
- the absolute coordinates may be input in real time, for example, when the user is moving around in the space, or may be input with reference to the user's video afterwards.
- the processing unit 200 may include a Wi-Fi feature amount extraction unit 201, a sensor data feature extraction unit 203, a position information acquisition unit 213, and a sensor map learning unit 215.
- the process in which the Wi-Fi feature amount extraction unit 201 and the sensor data feature extraction unit 203 (feature extraction units) extract features from the sensor data provided by the sensors of the input unit 100 is the same as in the above-described position estimation example. However, at the time of map learning, the extracted sensor data features are input to the sensor map learning unit 215.
- the sensor map learning unit 215 generates the sensor map 209 by associating the feature amount of the extracted sensor data with the accurate position information acquired by the position information acquisition unit 213.
- the sensor map learning unit 215 associates the feature of the sensor data extracted by the feature extraction unit with the accurate position information acquired by the position information acquisition unit 213 according to, for example, a probability model.
- the sensor map 209 represents the observation probability of the feature of the sensor data in a state defined by accurate position information.
- in this case, at the time of position estimation, the position corresponding to the state whose observation probability is most consistent with the feature extracted from sensor data acquired at a single time can be estimated as the user position.
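The single-time estimation described above can be sketched as follows. This is a minimal illustration assuming a sensor map in which each state stores a diagonal-Gaussian observation model (per-dimension mean and variance) and the position it represents; the state names and all numbers are hypothetical, not taken from the patent.

```python
import math

# Hypothetical sensor map: each state holds a diagonal-Gaussian observation
# model over a 2-dimensional sensor feature, plus the position it represents.
SENSOR_MAP = {
    "state_a": {"mean": [0.0, 1.0], "var": [1.0, 1.0], "pos": (0.0, 0.0)},
    "state_b": {"mean": [3.0, -1.0], "var": [0.5, 2.0], "pos": (5.0, 2.0)},
}

def log_likelihood(feature, mean, var):
    """Log observation probability of a feature under a diagonal Gaussian."""
    return sum(
        -0.5 * (math.log(2.0 * math.pi * v) + (x - m) ** 2 / v)
        for x, m, v in zip(feature, mean, var)
    )

def estimate_position(feature):
    """Pick the state whose observation probability is most consistent with
    the feature observed at a single time, and return its position."""
    state = max(
        SENSOR_MAP,
        key=lambda s: log_likelihood(feature, SENSOR_MAP[s]["mean"], SENSOR_MAP[s]["var"]),
    )
    return state, SENSOR_MAP[state]["pos"]
```

For instance, `estimate_position([2.8, -0.9])` selects the state whose Gaussian best explains the observed feature and reports that state's position.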
- the sensor map learning unit 215 may calculate a transition probability between states defined by accurate position information.
- in this case, the sensor map 209 can express both the observation probability of the sensor data features in each state defined by accurate position information and the transition probabilities between the states.
- then, for a series of features extracted at the time of position estimation from sensor data constituting a time series, the series of positions corresponding to the series of states whose per-state observation probabilities and inter-state transition probabilities are most consistent with the features can be estimated as the latest movement history of the user.
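Estimating the most consistent state series in this way amounts to decoding with the Viterbi algorithm. The sketch below shows this over a toy two-state model; the model structure and every probability are hypothetical stand-ins for the states, observation probabilities, and transition probabilities of the sensor map.

```python
import math

def viterbi(obs_loglik, log_trans, log_init):
    """Decode the state sequence whose observation probabilities in each state
    and transition probabilities between states best fit the feature series.

    obs_loglik: list over time of {state: log P(feature_t | state)}
    log_trans:  {(prev_state, state): log transition probability}
    log_init:   {state: log initial probability}
    """
    states = list(log_init)
    score = {s: log_init[s] + obs_loglik[0][s] for s in states}
    back = []
    for ll in obs_loglik[1:]:
        prev, score, ptr = score, {}, {}
        for s in states:
            p = max(states, key=lambda q: prev[q] + log_trans[(q, s)])
            score[s] = prev[p] + log_trans[(p, s)] + ll[s]
            ptr[s] = p
        back.append(ptr)
    last = max(states, key=score.get)
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]
```

With a sticky transition model, a momentary feature that resembles another state is outvoted by the transition probabilities, which is why time-series matching is more robust than single-time matching.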
- FIG. 5 is a diagram for describing an overview of map learning and position estimation according to an embodiment of the present disclosure.
- FIG. 5 conceptually shows the relationship between processing and information in map learning and position estimation performed in the system 10 as described above with reference to FIGS.
- during map learning, performed as advance preparation, features are extracted by the feature extraction units 201 and 203 from the sensor data provided by the sensors (for example, the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, the atmospheric pressure sensor 107, and / or the Wi-Fi communication device 109).
- the feature extraction here is performed in order to remove the influence of redundant portions and noise included in the sensor data and facilitate matching at the time of position measurement.
- here, sensor behavior on a smaller scale than the user's movement by walking or the like, for example, sensor data changes caused by fine body shakes, can be regarded as noise.
- the sensor map learning unit 215 generates the sensor map 209 by learning, associating the sensor data features extracted by the feature extraction units 201 and 203 as described above with separately acquired accurate position information, for example, absolute coordinates.
- for this learning, a probability model such as an IHMM (Incremental Hidden Markov Model) may be used. That is, in the sensor map, the features of the sensor data may be associated with the position information according to the probability model.
- the features are extracted by the feature extraction units 201 and 203 from the sensor data provided by the sensor as in the case of map learning.
- the extracted features are input to the matching / position estimation unit 205, and position information is estimated by matching with the features defined in the sensor map 209.
- FIG. 6 is a diagram for describing an example of a probability model used in an embodiment of the present disclosure.
- IHMM is described as an example of a model used for generating the sensor map 209 in the present embodiment.
- FIG. 6 shows arbitrary time-series data as the model input.
- the arbitrary time series data may be a continuous value signal or a discrete signal.
- the continuous value signal includes a pseudo continuous value signal provided as a digital signal.
- the orientation extracted from the detected value of geomagnetism can constitute a continuous value signal.
- the Wi-Fi feature amount can constitute a discrete signal.
- IHMM is a technology that learns, as a state transition model (HMM), a law hidden behind time-series data that is input sequentially (incrementally).
- the state transition model shown as an output in FIG. 6 is expressed by a plurality of states, an observation model for each state, and a transition probability between states.
- the sensor map 209 defines a state including features extracted from sensor data and accurate position information (absolute coordinates) acquired in parallel with the sensor data.
- when a state is defined in the IHMM or a transition probability between states is calculated, only the position information in the time-series data may be used. This is because the position information is the most accurate information in the map learning process of this embodiment, so it is appropriate to define the same state whenever the position information is common, even if the features extracted from the sensor data differ.
- Such processing can be realized, for example, by setting the weight for learning position information (absolute coordinates) to 1 and setting the weight of other observation states to 0 in the IHMM library.
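The effect of such weighting can be illustrated with a small sketch. The stream names and likelihood values are hypothetical, and actual IHMM libraries differ in how observation weights are specified:

```python
def weighted_loglik(stream_logliks, weights):
    """Combine per-observation-stream log likelihoods with per-stream weights.

    Giving the position stream weight 1 and every other stream weight 0 makes
    state definition and transition learning depend only on position."""
    return sum(weights[name] * ll for name, ll in stream_logliks.items())

# Hypothetical per-stream log likelihoods for one candidate state:
streams = {"position": -1.2, "wifi": -3.4, "accel": -0.7}
position_only = {"position": 1.0, "wifi": 0.0, "accel": 0.0}
score = weighted_loglik(streams, position_only)  # identical to the position term alone
```

With the position-only weights, the Wi-Fi and acceleration terms drop out of the state score, so two observations sharing a position collapse into the same state regardless of their other features.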
- FIG. 7 is a diagram illustrating an example of a sensor map generated in an embodiment of the present disclosure.
- the state ST defined in the sensor map 209 is indicated by a circle or an ellipse.
- a state observation probability OP is defined for each state ST.
- the observation probability OP is expressed by the average and variance of the characteristics of the sensor data in each state.
- the center of the circle or ellipse indicated as the state ST indicates the average of the X coordinate and the Y coordinate in the observation probability OP.
- the diameter of the circle or ellipse (in the case of an ellipse, the major axis and the minor axis) indicates the variance of the X coordinate and the Y coordinate in the observation probability OP.
- a line connecting the circles or ellipses shown as the states ST indicates that the transition probability between the states ST is larger than zero.
- in this example, a triaxial acceleration sensor, a triaxial gyro sensor, and a triaxial geomagnetic sensor are used as the acceleration sensor 101, the gyro sensor 103, and the geomagnetic sensor 105 included in the input unit 100.
- the sampling frequency is 50 Hz in all cases.
- the Wi-Fi communication device 109 outputs the ID of the access point with which communication was possible and the radio wave intensity from the access point.
- the Wi-Fi feature amount extraction unit 201 assigns a 64-dimensional Gaussian random number vector to each access point and adds the random number vectors weighted according to the radio wave intensity from each access point, thereby extracting a Wi-Fi feature amount as a 64-dimensional real-valued vector.
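This construction can be sketched as follows. The RSSI-to-weight mapping and the hash-based seeding are illustrative assumptions: the text only specifies a fixed Gaussian random vector per access point and weighting by radio wave intensity.

```python
import hashlib
import random

DIM = 64

def ap_vector(ap_id: str):
    """Fixed 64-dimensional Gaussian random vector per access point ID,
    seeded by a hash so the same AP always maps to the same vector."""
    rng = random.Random(int(hashlib.sha256(ap_id.encode()).hexdigest(), 16))
    return [rng.gauss(0.0, 1.0) for _ in range(DIM)]

def wifi_feature(scan):
    """scan: {ap_id: rssi_dbm}. Weight each AP's vector by a signal-strength
    weight and sum, yielding one 64-dimensional real-valued feature vector."""
    feat = [0.0] * DIM
    for ap_id, rssi in scan.items():
        # Hypothetical weighting: map RSSI (about -90..-30 dBm) into [0, 1].
        w = min(max((rssi + 90.0) / 60.0, 0.0), 1.0)
        for i, x in enumerate(ap_vector(ap_id)):
            feat[i] += w * x
    return feat
```

Because each AP's vector is fixed, the same set of visible access points with similar signal strengths always produces a similar feature vector, which is what makes the feature usable for matching against the sensor map.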
- the sensor data feature extraction unit 203 performs action recognition based on the detected values of acceleration, angular velocity, and geomagnetism, and identifies eight action labels: stationary, walking, left turn, right turn, stairs up, stairs down, escalator up, and escalator down.
- the sensor data feature extraction unit 203 extracts the following feature amounts from the detected values of acceleration, angular velocity, and geomagnetism.
- gravity is obtained by inputting the acceleration detection values of the three axes (X axis, Y axis, and Z axis) to a low-pass filter and extracting signals in the forward, lateral, and vertical directions.
- the acceleration (other than gravity) is obtained by subtracting the above gravity value from the detected acceleration value of each axis and extracting signals in the forward, lateral, and vertical directions.
- the geomagnetism is obtained by extracting forward, lateral, and vertical signals from the detected value of geomagnetism. Further, for the angular velocity, the offset is estimated and removed while the user is stationary, as estimated from the acceleration.
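The gravity / acceleration separation described above can be sketched with a simple first-order low-pass filter. The smoothing constant is a hypothetical value, and a real implementation would additionally rotate the result into the forward, lateral, and vertical directions:

```python
def split_gravity(samples, alpha=0.98):
    """Split a 3-axis accelerometer stream into gravity (the low-pass output)
    and linear acceleration (the raw value minus gravity), per the text above.

    samples: list of (ax, ay, az) tuples; alpha: low-pass smoothing factor.
    """
    gravity = list(samples[0])  # initialize the filter with the first sample
    gravities, linear = [], []
    for sample in samples:
        gravity = [alpha * g + (1.0 - alpha) * a for g, a in zip(gravity, sample)]
        gravities.append(tuple(gravity))
        linear.append(tuple(a - g for a, g in zip(sample, gravity)))
    return gravities, linear
```

For a device at rest, the low-pass output converges to the constant gravity vector while the linear-acceleration residual goes to zero, which is exactly the separation the feature extraction relies on.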
- the Wi-Fi feature amount extraction unit 201 and the sensor data feature extraction unit 203 extract the sensor data features for each second of the time stamp of the sensor data.
- the accuracy of position estimation is improved by using a plurality of sensor data (for example, compared to the case where only the Wi-Fi feature amount is used). In addition, when a plurality of sensor data are used, position estimation accuracy improves when matching is performed with the features of a plurality of sensor data constituting a time series, compared to matching with sensor data features at a single time. When using the features of a plurality of sensor data constituting a time series, the longer the time series, the better the position estimation accuracy.
- as described above, in the present embodiment, by matching the features of sensor data provided by one or more sensors carried or worn by a user against sensor data features associated with given position information, the position of the user can be estimated with good accuracy.
- the position estimation according to the present embodiment is less susceptible to error accumulation compared to autonomous positioning performed using, for example, acceleration, angular velocity, geomagnetism, or the like.
- since the features of the sensor data are used for matching, there are few restrictions on the content of the sensor data.
- for example, in autonomous positioning, sensor data such as acceleration, angular velocity, and geomagnetism are often essential, but in the present embodiment, any of these may be missing, either temporarily or from the beginning (as long as sufficient other sensor data are available).
- since the information regarding Wi-Fi communication is not used to estimate the position of the user from the position of the access point, it is only necessary that each access point can be identified as described above.
- various data can be used in addition to the data exemplified above or in place of the data exemplified above.
- for example, the reception state of radio waves from a beacon installed in the space where the user can move, or GNSS positioning data whose accuracy is degraded indoors, among buildings, or the like, may be used (if accurate GNSS positioning data were available, position estimation itself would not be necessary).
- these data can also be used as sensor data because, like the Wi-Fi feature amount, they are considered to change with some relationship to the user's position.
- in the above example, the user's action label specified by action recognition based on the sensor data is used as a feature of the sensor data, but this is not always necessary.
- the user's action label is not used as a feature of the sensor data, it is not always necessary for the user to move around in the space and collect the sensor data during map learning.
- for example, sensor data may be collected by having a robot equipped with a terminal device move around the space.
- the position estimation result in the present embodiment can be used for generating information to be output to the user by the position related information generation unit 207.
- for example, the result can be used to predict the user's destination and turn on the lighting of a room or passage in advance, to appropriately switch access points such as Wi-Fi, or to notify other users at the destination of the user's arrival.
- the result of position estimation may be used as a history of position information of the terminal device on which the sensor is mounted, for example, without being limited to user movement prediction. For example, when the user loses his / her smartphone, the location of the smartphone can be estimated if the latest position estimation result when the user is carrying the smartphone can be used.
- as described above, the system 10 includes the input unit 100, the processing unit 200, and the output unit 300, and these components are realized by one or a plurality of information processing apparatuses. Hereinafter, combinations of information processing apparatuses that realize the system 10 will be described with more specific examples.
- FIG. 8 is a block diagram illustrating a first example of a system configuration according to an embodiment of the present disclosure.
- the system 10 includes information processing apparatuses 11 and 13.
- the input unit 100 and the output unit 300 are realized in the information processing apparatus 11.
- the processing unit 200 is realized in the information processing apparatus 13.
- the information processing apparatus 11 and the information processing apparatus 13 communicate via a network in order to realize the function according to the embodiment of the present disclosure.
- the interface 150b between the input unit 100 and the processing unit 200 and the interface 350b between the processing unit 200 and the output unit 300 can both be communication interfaces between apparatuses.
- the information processing apparatus 11 may be a terminal device, for example.
- the input unit 100 may include an input device, a sensor, software that acquires information from an external service, and the like.
- software that acquires information from an external service acquires data from application software of a service that is executed in the terminal device.
- the output unit 300 may include an output device, a control device, software that provides information to an external service, and the like.
- the software that provides information to an external service can provide information to application software of a service that is executed by a terminal device, for example.
- the information processing apparatus 13 can be a server.
- the processing unit 200 is realized by a processor or a processing circuit included in the information processing device 13 operating according to a program stored in a memory or a storage device.
- the information processing apparatus 13 may be, for example, a dedicated server device. In this case, the information processing apparatus 13 may be installed in a data center or the like, or may be installed in a home. Alternatively, the information processing apparatus 13 may be a device that can be used as a terminal device for other functions but does not realize the input unit 100 and the output unit 300 for the functions according to the embodiment of the present disclosure.
- FIG. 9 is a block diagram illustrating a second example of the system configuration according to the embodiment of the present disclosure.
- the system 10 includes information processing apparatuses 11a, 11b, and 13.
- the input unit 100 is realized by being divided into input units 100a and 100b.
- the input unit 100a is realized in the information processing apparatus 11a.
- the input unit 100a can include, for example, the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, the atmospheric pressure sensor 107, and / or the Wi-Fi communication device 109 described above.
- the input unit 100b and the output unit 300 are realized in the information processing apparatus 11b.
- the input unit 100b can include, for example, the operation input device 111 described above.
- the processing unit 200 is realized in the information processing apparatus 13.
- the information processing apparatuses 11a and 11b and the information processing apparatus 13 communicate with each other via a network in order to realize the functions according to the embodiment of the present disclosure.
- the interfaces 150b1 and 150b2 between the input unit 100 and the processing unit 200 and the interface 350b between the processing unit 200 and the output unit 300 can be communication interfaces between apparatuses.
- the interface 150b1, the interface 150b2, and the interface 350b can include different types of interfaces.
- the information processing devices 11a and 11b may be terminal devices, for example.
- the information processing apparatus 11a is carried or worn by a user and senses the user.
- the information processing device 11b outputs information generated in the information processing device 13 based on the sensing result to the user.
- the information processing apparatus 11b receives a user operation input regarding the output information. Therefore, the information processing apparatus 11b does not necessarily have to be carried or worn by the user.
- the information processing apparatus 13 can be a server or a terminal device, as in the first example.
- the processing unit 200 is realized by a processor or a processing circuit included in the information processing device 13 operating according to a program stored in a memory or a storage device.
- FIG. 10 is a block diagram illustrating a third example of the system configuration according to the embodiment of the present disclosure.
- the system 10 includes information processing apparatuses 11 and 13.
- the input unit 100 and the output unit 300 are realized in the information processing apparatus 11.
- the processing unit 200 is realized by being distributed to the information processing apparatus 11 and the information processing apparatus 13.
- the information processing apparatus 11 and the information processing apparatus 13 communicate via a network in order to realize the function according to the embodiment of the present disclosure.
- the processing unit 200 is realized by being distributed between the information processing apparatus 11 and the information processing apparatus 13. More specifically, the processing unit 200 includes processing units 200a and 200c realized by the information processing apparatus 11 and a processing unit 200b realized by the information processing apparatus 13.
- the processing unit 200a executes processing based on information provided from the input unit 100 via the interface 150a, and provides the processing result to the processing unit 200b.
- the processing unit 200a can include, for example, the Wi-Fi feature amount extraction unit 201 and the sensor data feature extraction unit 203 described above.
- the processing unit 200c executes processing based on the information provided from the processing unit 200b, and provides the processing result to the output unit 300 via the interface 350a.
- the processing unit 200c may include, for example, the position related information generation unit 207 described above.
- both the processing unit 200a and the processing unit 200c are shown, but only one of them may actually exist. That is, the information processing apparatus 11 implements the processing unit 200a, but does not implement the processing unit 200c, and the information provided from the processing unit 200b may be provided to the output unit 300 as it is. Similarly, the information processing apparatus 11 implements the processing unit 200c, but may not implement the processing unit 200a.
- An interface 250b is interposed between the processing unit 200a and the processing unit 200b and between the processing unit 200b and the processing unit 200c.
- the interface 250b is a communication interface between apparatuses.
- the interface 150a is an interface in the apparatus.
- the interface 350a is an interface in the apparatus.
- when the processing unit 200c includes the position related information generation unit 207, a part of the information from the input unit 100, for example, the information from the operation input device 111, is provided directly to the processing unit 200c via the interface 150a.
- the third example described above is the same as the first example described above, except that one or both of the processing unit 200a and the processing unit 200c are realized by a processor or a processing circuit included in the information processing apparatus 11. That is, the information processing apparatus 11 can be a terminal device, and the information processing apparatus 13 can be a server.
- FIG. 11 is a block diagram illustrating a fourth example of the system configuration according to the embodiment of the present disclosure.
- the system 10 includes information processing apparatuses 11a, 11b, and 13.
- the input unit 100 is realized by being divided into input units 100a and 100b.
- the input unit 100a is realized in the information processing apparatus 11a.
- the input unit 100a can include, for example, the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, the atmospheric pressure sensor 107, and / or the Wi-Fi communication device 109 described above.
- the input unit 100b and the output unit 300 are realized in the information processing apparatus 11b.
- the input unit 100b can include, for example, the operation input device 111 described above.
- the processing unit 200 is realized by being distributed to the information processing apparatuses 11a and 11b and the information processing apparatus 13.
- the information processing apparatuses 11a and 11b and the information processing apparatus 13 communicate with each other via a network in order to realize the functions according to the embodiment of the present disclosure.
- the processing unit 200 is realized by being distributed between the information processing apparatuses 11a and 11b and the information processing apparatus 13. More specifically, the processing unit 200 includes a processing unit 200a realized by the information processing apparatus 11a, a processing unit 200b realized by the information processing apparatus 13, and a processing unit 200c realized by the information processing apparatus 11b. Such distribution of the processing unit 200 is the same as in the third example. However, in the fourth example, since the information processing apparatus 11a and the information processing apparatus 11b are separate devices, the interfaces 250b1 and 250b2 can include different types of interfaces. When the processing unit 200c includes the position related information generation unit 207, information from the input unit 100b, for example, information from the operation input device 111, is provided directly to the processing unit 200c via the interface 150a2.
- the fourth example is the same as the second example described above, except that one or both of the processing unit 200a and the processing unit 200c are realized by a processor or a processing circuit included in the information processing apparatus 11a or the information processing apparatus 11b.
- the information processing devices 11a and 11b can be terminal devices.
- the information processing apparatus 13 can be a server.
- FIG. 12 is a block diagram illustrating a hardware configuration example of the information processing apparatus according to the embodiment of the present disclosure.
- the information processing apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the information processing apparatus 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the information processing apparatus 900 may include an imaging device 933 and a sensor 935 as necessary.
- the information processing apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the information processing device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device that uses infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the information processing device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the information processing device 900.
- the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
- the output device 917 outputs the result obtained by the processing of the information processing apparatus 900 as video such as text or an image, as audio such as voice or sound, or as vibration.
- the storage device 919 is a data storage device configured as an example of a storage unit of the information processing device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or externally attached to the information processing apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- the drive 921 also writes records to the attached removable recording medium 927.
- the connection port 923 is a port for connecting a device to the information processing apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
- the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various communication.
- the communication device 925 transmits and receives signals and the like using a predetermined protocol such as TCP / IP with the Internet and other communication devices, for example.
- the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- the imaging device 933 is a device that images real space and generates a captured image using various members, such as an imaging element, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the imaging element.
- the imaging device 933 may capture a still image or may capture a moving image.
- the sensor 935 is various sensors such as an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, an illuminance sensor, a temperature sensor, an atmospheric pressure sensor, or a sound sensor (microphone).
- the sensor 935 acquires, for example, information about the state of the information processing apparatus 900 itself, such as the attitude of the information processing apparatus 900, and information about the surrounding environment of the information processing apparatus 900, such as the brightness and noise around the information processing apparatus 900.
- the sensor 935 may include a GPS receiver that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the device.
- Each of the components described above may be configured using general-purpose members, or may be configured by hardware specialized for the function of that component. This configuration can be changed as appropriate according to the technical level at the time of implementation.
- Embodiments of the present disclosure may include, for example, the information processing apparatus or system described above, an information processing method executed by the information processing apparatus or system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
- (1) An information processing apparatus including: a feature extraction unit that extracts a feature of first sensor data provided by a sensor carried or worn by a user; a matching unit that matches the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and a position estimation unit that estimates the position of the user based on a result of the matching.
- (2) The information processing apparatus according to (1), wherein the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route.
- (3) The information processing apparatus according to (1), wherein the feature of the second sensor data is associated with the position information according to a probability model.
- (4) The information processing apparatus according to (3), wherein the position information defines states in the probability model, the probability model includes observation probabilities of features of the second sensor data in the states, and the matching unit matches the feature of the first sensor data against the feature of the second sensor data based on the observation probabilities.
- (5) The information processing apparatus according to (4), wherein the probability model includes transition probabilities between the states defined by a time series of the position information, the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route, based on the observation probabilities and the transition probabilities.
- (6) The information processing apparatus according to any one of (3) to (5), wherein the probability model includes an HMM.
- (7) The information processing apparatus according to any one of (1) to (5), wherein the first sensor data includes data indicating a radio wave reception state.
- (8) The information processing apparatus according to any one of (1) to (6), wherein the first sensor data includes acceleration, angular velocity, or geomagnetism.
- (9) The information processing apparatus according to (8), wherein the feature of the first sensor data includes an action recognition result based on the first sensor data.
- (10) An information processing method including: extracting a feature of first sensor data provided by a sensor carried or worn by a user; matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and estimating the position of the user based on a result of the matching.
- (11) A program for causing a processing circuit to realize: a function of extracting a feature of first sensor data provided by a sensor carried or worn by a user; a function of matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and a function of estimating the position of the user based on a result of the matching.
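Items (1) to (3) can be illustrated with a small sketch. The following Python is a minimal illustration under assumptions of my own, not the patented implementation: per-window statistics stand in for the "feature of the first sensor data", and a nearest-neighbor lookup against a pre-built sensor map stands in for the matching unit. All position labels and values are hypothetical.

```python
import numpy as np

def extract_features(samples, window=50):
    """Split a 1-D sensor stream into consecutive windows and use simple
    statistics (mean, standard deviation) as the feature of each window."""
    feats = []
    for i in range(0, len(samples) - window + 1, window):
        w = samples[i:i + window]
        feats.append((float(np.mean(w)), float(np.std(w))))
    return feats

def match_position(feature, sensor_map):
    """Return the position whose stored feature is closest (Euclidean
    distance) to the observed feature; sensor_map maps position -> feature."""
    return min(sensor_map, key=lambda pos: np.hypot(feature[0] - sensor_map[pos][0],
                                                    feature[1] - sensor_map[pos][1]))

# Hypothetical pre-learned sensor map: (mean, std) of a motion feature per position.
sensor_map = {"entrance": (0.1, 0.02), "hallway": (0.5, 0.30), "stairs": (1.2, 0.90)}

still = np.zeros(50)  # a stationary stretch: low mean, low variance
print(match_position(extract_features(still)[0], sensor_map))  # → entrance
```

A probabilistic matcher over a series of such features, as in items (3) to (6), refines this nearest-neighbor idea.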
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Navigation (AREA)
- User Interface Of Digital Computer (AREA)
- Position Fixing By Use Of Radio Waves (AREA)
Abstract
Description
1. Overall configuration
1-1. Input unit
1-2. Processing unit
1-3. Output unit
2. Functional configuration examples
2-1. Position estimation
2-2. Map learning
3. Principles of map learning and position estimation
4. Implementation example
5. System configuration
6. Hardware configuration
7. Supplement
FIG. 1 is a block diagram illustrating an example of the overall configuration of an embodiment of the present disclosure. Referring to FIG. 1, the system 10 includes an input unit 100, a processing unit 200, and an output unit 300. The input unit 100, the processing unit 200, and the output unit 300 are realized by one or more information processing apparatuses, as shown in the configuration examples of the system 10 described later.
The input unit 100 includes, for example, an operation input device, a sensor, or software that acquires information from an external service, and accepts input of various kinds of information from the user, the surrounding environment, or other services.
The processing unit 200 executes various processes based on the information acquired by the input unit 100. More specifically, the processing unit 200 includes, for example, a processor or processing circuit such as a CPU (Central Processing Unit), DSP (Digital Signal Processor), ASIC (Application Specific Integrated Circuit), or FPGA (Field-Programmable Gate Array). The processing unit 200 may also include a memory or storage device that temporarily or permanently stores the program executed by the processor or processing circuit and the data read and written during processing.
The output unit 300 outputs the information provided by the processing unit 200 to a user (who may be the same as or different from the user of the input unit 100), an external device, or another service. For example, the output unit 300 may include an output device, a control device, or software that provides information to an external service.
(2-1. Position estimation)
FIG. 3 is a schematic block diagram illustrating an example of the functional configuration of the input unit, the processing unit, and the output unit during position estimation according to an embodiment of the present disclosure. An example of the functional configuration, during position estimation, of the input unit 100, the processing unit 200, and the output unit 300 included in the system 10 according to the present embodiment will be described below with reference to FIG. 3.
FIG. 4 is a schematic block diagram illustrating an example of the functional configuration of the input unit, the processing unit, and the output unit during map learning according to an embodiment of the present disclosure. An example of the functional configuration, during map learning, of the input unit 100 and the processing unit 200 included in the system 10 according to the present embodiment will be described below with reference to FIG. 4. The output unit 300 may, for example, output information indicating the progress of map learning or the generated map to the user performing the map learning; however, since that is not in itself the aim of the present embodiment, illustration and description of the output unit 300 are omitted in the map-learning example.
FIG. 5 is a diagram for describing an overview of map learning and position estimation in an embodiment of the present disclosure. FIG. 5 conceptually shows the relationship between the processes and information involved in the map learning and position estimation performed in the system 10 described above with reference to FIGS. 3 and 4.
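The learning/estimation relationship sketched around FIG. 5 can be illustrated roughly as follows: during learning, observed features are accumulated per position label; during estimation, a new feature is scored against each position by Gaussian log-likelihood. This is a simplified sketch with hypothetical labels and scalar features, not the disclosed implementation.

```python
import math
from collections import defaultdict

class SensorMap:
    """Minimal sketch: learn, for each position label, the statistics of an
    observed scalar feature; then estimate a position for a new feature by
    maximizing a Gaussian log-likelihood over the learned positions."""

    def __init__(self):
        self._obs = defaultdict(list)

    def learn(self, position, feature):
        """Map-learning phase: record a feature observed at a known position."""
        self._obs[position].append(feature)

    def log_likelihood(self, position, feature):
        xs = self._obs[position]
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs) + 1e-6  # avoid zero variance
        return -0.5 * (math.log(2 * math.pi * var) + (feature - mu) ** 2 / var)

    def estimate(self, feature):
        """Position-estimation phase: pick the most likely learned position."""
        return max(self._obs, key=lambda p: self.log_likelihood(p, feature))

m = SensorMap()
for x in (0.1, 0.12, 0.09):
    m.learn("desk", x)
for x in (0.9, 1.0, 1.1):
    m.learn("corridor", x)
print(m.estimate(0.95))  # → corridor
```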
Next, an implementation example of the present disclosure will be described. Note that this implementation example is merely a more specific example provided to aid understanding of an embodiment of the present disclosure, and is not intended to limit the embodiments of the present disclosure to the scope of this implementation example.
- Gravity (forward, lateral, vertical) (m/s²)
- Acceleration (other than gravity; forward, lateral, vertical) (m/s²)
- Angular velocity (forward, lateral, vertical) (rad/s)
- Azimuth (0 at north, positive clockwise) (deg)
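Assuming the channels above are sampled as arrays, a per-window feature vector could be assembled as in the following sketch. The sampling rate, window length, and chosen statistics are illustrative assumptions of mine, not taken from the implementation example.

```python
import numpy as np

def window_features(gravity, accel, gyro, azimuth, fs=50, win_s=1.0):
    """Concatenate per-window statistics of each channel group into one
    feature vector per window. Shapes: gravity/accel/gyro are (N, 3) arrays,
    azimuth is (N,) in degrees. fs and win_s are illustrative assumptions."""
    n = int(fs * win_s)
    frames = []
    for start in range(0, len(azimuth) - n + 1, n):
        sl = slice(start, start + n)
        parts = []
        for ch in (gravity[sl], accel[sl], gyro[sl]):
            parts.append(ch.mean(axis=0))  # mean of each axis in the window
            parts.append(ch.std(axis=0))   # variability of each axis
        az = np.deg2rad(azimuth[sl])
        # circular mean of azimuth, so that 359 deg and 1 deg average near 0 deg
        parts.append([np.arctan2(np.sin(az).mean(), np.cos(az).mean())])
        frames.append(np.concatenate(parts))
    return np.array(frames)

rng = np.random.default_rng(1)
N = 100
feats = window_features(rng.normal(size=(N, 3)), rng.normal(size=(N, 3)),
                        rng.normal(size=(N, 3)), rng.uniform(0, 360, N))
print(feats.shape)  # → (2, 19): two 1-second windows, 3x(3+3)+1 features each
```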
An embodiment of the present disclosure has been described above. As noted, the system 10 according to the present embodiment includes the input unit 100, the processing unit 200, and the output unit 300, and these components are realized by one or more information processing apparatuses. Examples of combinations of information processing apparatuses that realize the system 10 will be described below with more specific examples.
FIG. 8 is a block diagram illustrating a first example of a system configuration according to an embodiment of the present disclosure. Referring to FIG. 8, the system 10 includes information processing apparatuses 11 and 13. The input unit 100 and the output unit 300 are realized in the information processing apparatus 11, while the processing unit 200 is realized in the information processing apparatus 13. The information processing apparatus 11 and the information processing apparatus 13 communicate via a network in order to realize the functions according to the embodiment of the present disclosure. The interface 150b between the input unit 100 and the processing unit 200 and the interface 350b between the processing unit 200 and the output unit 300 can both be communication interfaces between the apparatuses.
FIG. 9 is a block diagram illustrating a second example of a system configuration according to an embodiment of the present disclosure. Referring to FIG. 9, the system 10 includes information processing apparatuses 11a, 11b, and 13. The input unit 100 is realized separately as input units 100a and 100b. The input unit 100a is realized in the information processing apparatus 11a and may include, for example, the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, the barometric pressure sensor 107, and/or the Wi-Fi communication device 109 described above.
FIG. 10 is a block diagram illustrating a third example of a system configuration according to an embodiment of the present disclosure. Referring to FIG. 10, the system 10 includes information processing apparatuses 11 and 13. In the third example, the input unit 100 and the output unit 300 are realized in the information processing apparatus 11, while the processing unit 200 is realized in a distributed manner across the information processing apparatus 11 and the information processing apparatus 13. The information processing apparatus 11 and the information processing apparatus 13 communicate via a network in order to realize the functions according to the embodiment of the present disclosure.
FIG. 11 is a block diagram illustrating a fourth example of a system configuration according to an embodiment of the present disclosure. Referring to FIG. 11, the system 10 includes information processing apparatuses 11a, 11b, and 13. The input unit 100 is realized separately as input units 100a and 100b. The input unit 100a is realized in the information processing apparatus 11a and may include, for example, the acceleration sensor 101, the gyro sensor 103, the geomagnetic sensor 105, the barometric pressure sensor 107, and/or the Wi-Fi communication device 109 described above.
Next, the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure will be described with reference to FIG. 12. FIG. 12 is a block diagram illustrating an example of the hardware configuration of the information processing apparatus according to an embodiment of the present disclosure.
Embodiments of the present disclosure may include, for example, the information processing apparatus or system described above, an information processing method executed by the information processing apparatus or system, a program for causing the information processing apparatus to function, and a non-transitory tangible medium on which the program is recorded.
(1) An information processing apparatus including:
a feature extraction unit that extracts a feature of first sensor data provided by a sensor carried or worn by a user;
a matching unit that matches the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and
a position estimation unit that estimates the position of the user based on a result of the matching.
(2) The information processing apparatus according to (1), wherein the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route.
(3) The information processing apparatus according to (1), wherein the feature of the second sensor data is associated with the position information according to a probability model.
(4) The information processing apparatus according to (3), wherein the position information defines states in the probability model, the probability model includes observation probabilities of features of the second sensor data in the states, and the matching unit matches the feature of the first sensor data against the feature of the second sensor data based on the observation probabilities.
(5) The information processing apparatus according to (4), wherein the probability model includes transition probabilities between the states defined by a time series of the position information, the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route, based on the observation probabilities and the transition probabilities.
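The matching described in (4) and (5), which combines per-state observation probabilities with transition probabilities along a route, is what Viterbi decoding over an HMM computes. Below is a minimal log-domain sketch with a hypothetical two-state corridor model (state 0 = "room", state 1 = "hallway"); it illustrates the general technique, not the disclosed implementation.

```python
import math

def viterbi(obs_loglik, log_trans, log_init):
    """Most-likely state (position) sequence given per-step observation
    log-likelihoods obs_loglik[t][s], transition log-probabilities
    log_trans[prev][next], and initial log-probabilities log_init[s]."""
    n_states = len(log_init)
    score = [log_init[s] + obs_loglik[0][s] for s in range(n_states)]
    back = []
    for t in range(1, len(obs_loglik)):
        prev, score, ptr = score, [], []
        for s in range(n_states):
            best = max(range(n_states), key=lambda p: prev[p] + log_trans[p][s])
            score.append(prev[best] + log_trans[best][s] + obs_loglik[t][s])
            ptr.append(best)
        back.append(ptr)
    # backtrack from the best final state
    path = [max(range(n_states), key=lambda s: score[s])]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return path[::-1]

log_init = [math.log(0.5)] * 2                                # uniform start
log_trans = [[math.log(0.8), math.log(0.2)],                  # sticky states:
             [math.log(0.2), math.log(0.8)]]                  # staying is likely
obs = [[math.log(0.9), math.log(0.1)],                        # looks like "room"
       [math.log(0.8), math.log(0.2)],                        # still "room"
       [math.log(0.1), math.log(0.9)]]                        # now "hallway"
print(viterbi(obs, log_trans, log_init))  # → [0, 0, 1]
```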
(6) The information processing apparatus according to any one of (3) to (5), wherein the probability model includes an HMM.
(7) The information processing apparatus according to any one of (1) to (5), wherein the first sensor data includes data indicating a radio wave reception state.
(8) The information processing apparatus according to any one of (1) to (6), wherein the first sensor data includes acceleration, angular velocity, or geomagnetism.
(9) The information processing apparatus according to (8), wherein the feature of the first sensor data includes an action recognition result based on the first sensor data.
(10) An information processing method including:
extracting a feature of first sensor data provided by a sensor carried or worn by a user;
matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and
estimating the position of the user based on a result of the matching.
(11) A program for causing a processing circuit to realize:
a function of extracting a feature of first sensor data provided by a sensor carried or worn by a user;
a function of matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and
a function of estimating the position of the user based on a result of the matching.
11, 13 Information processing apparatus
100 Input unit
101 Acceleration sensor
103 Gyro sensor
105 Geomagnetic sensor
107 Barometric pressure sensor
109 Wi-Fi communication unit
111 Operation input device
113 Positioning device / input device
150, 250, 350 Interface
200 Processing unit
201 Wi-Fi feature extraction unit
203 Sensor data feature extraction unit
205 Matching / position estimation unit
207 Position-related information generation unit
209 Sensor map
213 Sensor map learning unit
215 Position information acquisition unit
300 Output unit
301 Display
303 Speaker
305 Vibrator
Claims (11)
- An information processing apparatus comprising: a feature extraction unit that extracts a feature of first sensor data provided by a sensor carried or worn by a user; a matching unit that matches the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and a position estimation unit that estimates the position of the user based on a result of the matching.
- The information processing apparatus according to claim 1, wherein the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route.
- The information processing apparatus according to claim 1, wherein the feature of the second sensor data is associated with the position information according to a probability model.
- The information processing apparatus according to claim 3, wherein the position information defines states in the probability model, the probability model includes observation probabilities of features of the second sensor data in the states, and the matching unit matches the feature of the first sensor data against the feature of the second sensor data based on the observation probabilities.
- The information processing apparatus according to claim 4, wherein the probability model includes transition probabilities between the states defined by a time series of the position information, the feature extraction unit extracts features of the first sensor data in time series, and the matching unit matches the features of the first sensor data constituting the time series against features of the second sensor data each associated with a series of the position information constituting a route, based on the observation probabilities and the transition probabilities.
- The information processing apparatus according to claim 3, wherein the probability model includes an HMM.
- The information processing apparatus according to claim 1, wherein the first sensor data includes data indicating a radio wave reception state.
- The information processing apparatus according to claim 1, wherein the first sensor data includes acceleration, angular velocity, or geomagnetism.
- The information processing apparatus according to claim 8, wherein the feature of the first sensor data includes an action recognition result based on the first sensor data.
- An information processing method including: extracting a feature of first sensor data provided by a sensor carried or worn by a user; matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and estimating the position of the user based on a result of the matching.
- A program for causing a processing circuit to realize: a function of extracting a feature of first sensor data provided by a sensor carried or worn by a user; a function of matching the feature of the first sensor data against a feature of second sensor data corresponding to the first sensor data and associated with given position information; and a function of estimating the position of the user based on a result of the matching.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016564724A JPWO2016098457A1 (ja) | 2014-12-17 | 2015-10-27 | Information processing apparatus, information processing method, and program |
CN201580067236.XA CN107003382A (zh) | 2014-12-17 | 2015-10-27 | 信息处理设备、信息处理方法及程序 |
US15/518,327 US20170307393A1 (en) | 2014-12-17 | 2015-10-27 | Information processing apparatus, information processing method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-255037 | 2014-12-17 | ||
JP2014255037 | 2014-12-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016098457A1 true WO2016098457A1 (ja) | 2016-06-23 |
Family
ID=56126359
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/080290 WO2016098457A1 (ja) | 2014-12-17 | 2015-10-27 | Information processing apparatus, information processing method, and program |
Country Status (4)
Country | Link |
---|---|
US (1) | US20170307393A1 (ja) |
JP (1) | JPWO2016098457A1 (ja) |
CN (1) | CN107003382A (ja) |
WO (1) | WO2016098457A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017227594A (ja) * | 2016-06-24 | 2017-12-28 | トヨタ自動車株式会社 | Position estimation device for a mobile body |
JP2018013851A (ja) * | 2016-07-19 | 2018-01-25 | 日本電信電話株式会社 | Action recognition device and action recognition method |
JP2018013855A (ja) * | 2016-07-19 | 2018-01-25 | 日本電信電話株式会社 | Action recognition device and action recognition method |
JP2018194537A (ja) * | 2017-05-15 | 2018-12-06 | 富士ゼロックス株式会社 | Method, program, and system for position determination and tracking |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7005946B2 (ja) * | 2017-06-07 | 2022-01-24 | セイコーエプソン株式会社 | Wearable device and method for controlling a wearable device |
EP3462338A1 (en) * | 2017-09-28 | 2019-04-03 | Siemens Aktiengesellschaft | Data processing device, data analyzing device, data processing system and method for processing data |
JP7173024B2 (ja) * | 2017-09-29 | 2022-11-16 | ソニーグループ株式会社 | Information processing apparatus, information processing method, and program |
JP7176563B2 (ja) * | 2018-04-17 | 2022-11-22 | ソニーグループ株式会社 | Program, information processing apparatus, and information processing method |
CN109084768B (zh) * | 2018-06-27 | 2021-11-26 | 仲恺农业工程学院 | Human body positioning method based on a smart floor mat |
KR101948728B1 (ko) * | 2018-09-28 | 2019-02-15 | 네이버랩스 주식회사 | Data collection method and system |
CN110074797B (zh) * | 2019-04-17 | 2022-08-23 | 重庆大学 | Spatio-temporal and psychological analysis method based on fusion of EEG and spatio-temporal data |
KR102277974B1 (ko) * | 2019-05-23 | 2021-07-15 | 주식회사 다비오 | Image-based indoor positioning service system and method |
CN110781256B (zh) * | 2019-08-30 | 2024-02-23 | 腾讯大地通途(北京)科技有限公司 | Method and apparatus for determining a POI matching Wi-Fi based on transmitted position data |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005532560A (ja) * | 2002-07-10 | 2005-10-27 | エカハウ オーイー | Positioning technique |
JP2007093433A (ja) * | 2005-09-29 | 2007-04-12 | Hitachi Ltd | Pedestrian movement detection apparatus |
JP2009103633A (ja) * | 2007-10-25 | 2009-05-14 | Internatl Business Mach Corp <Ibm> | Position estimation system, method, and program |
JP2012058248A (ja) * | 2010-09-13 | 2012-03-22 | Ricoh Co Ltd | Motion tracking technique for RFID tags |
JP2012532319A (ja) * | 2009-06-30 | 2012-12-13 | クゥアルコム・インコーポレイテッド | Trajectory-based location determination |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6839027B2 (en) * | 2002-11-15 | 2005-01-04 | Microsoft Corporation | Location measurement process for radio-frequency badges employing path constraints |
DE102010029589A1 * | 2010-06-01 | 2011-12-01 | Bayerische Motoren Werke Aktiengesellschaft | Method for determining the ego position of a motor vehicle |
US8543135B2 (en) * | 2011-05-12 | 2013-09-24 | Amit Goyal | Contextually aware mobile device |
US9194949B2 (en) * | 2011-10-20 | 2015-11-24 | Robert Bosch Gmbh | Methods and systems for precise vehicle localization using radar maps |
US8588810B2 (en) * | 2011-11-30 | 2013-11-19 | International Business Machines Corporation | Energy efficient location tracking on smart phones |
KR20130066354A (ko) * | 2011-12-12 | 2013-06-20 | 현대엠엔소프트 주식회사 | Map matching method and apparatus for a user terminal |
KR101919366B1 (ko) * | 2011-12-22 | 2019-02-11 | 한국전자통신연구원 | Apparatus and method for recognizing vehicle position using an in-vehicle network and an image sensor |
JP2013205171A (ja) * | 2012-03-28 | 2013-10-07 | Sony Corp | Information processing apparatus, information processing method, and program |
US20150177359A1 (en) * | 2012-07-02 | 2015-06-25 | Locoslab Gmbh | Method for using and generating a map |
US10041798B2 (en) * | 2012-12-06 | 2018-08-07 | Qualcomm Incorporated | Determination of position, velocity and/or heading by simultaneous use of on-device and on-vehicle information |
US8934921B2 (en) * | 2012-12-14 | 2015-01-13 | Apple Inc. | Location determination using fingerprint data |
US9544740B2 (en) * | 2013-01-18 | 2017-01-10 | Nokia Technologies Oy | Method, apparatus and computer program product for orienting a smartphone display and estimating direction of travel of a pedestrian |
JP6143474B2 (ja) * | 2013-01-24 | 2017-06-07 | クラリオン株式会社 | Position detection apparatus and program |
CN103338509A (zh) * | 2013-04-10 | 2013-10-02 | 南昌航空大学 | WSN indoor positioning method based on a hidden Markov model |
DE102013104727A1 (de) * | 2013-05-07 | 2014-11-13 | Deutsche Telekom Ag | Method and devices for determining the position of a mobile communication device |
CN104185270B (zh) * | 2013-05-28 | 2017-11-28 | 中国电信股份有限公司 | Indoor positioning method, system, and positioning platform |
KR101493817B1 (ko) * | 2013-06-14 | 2015-03-02 | 현대엠엔소프트 주식회사 | Map matching method for a user terminal |
GB201500411D0 (en) * | 2014-09-15 | 2015-02-25 | Isis Innovation | Determining the position of a mobile device in a geographical area |
US20160146616A1 (en) * | 2014-11-21 | 2016-05-26 | Alpine Electronics, Inc. | Vehicle positioning by map matching as feedback for ins/gps navigation system during gps signal loss |
2015
- 2015-10-27 US US15/518,327 patent/US20170307393A1/en not_active Abandoned
- 2015-10-27 WO PCT/JP2015/080290 patent/WO2016098457A1/ja active Application Filing
- 2015-10-27 CN CN201580067236.XA patent/CN107003382A/zh active Pending
- 2015-10-27 JP JP2016564724A patent/JPWO2016098457A1/ja active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005532560A (ja) * | 2002-07-10 | 2005-10-27 | エカハウ オーイー | Positioning technique |
JP2007093433A (ja) * | 2005-09-29 | 2007-04-12 | Hitachi Ltd | Pedestrian movement detection apparatus |
JP2009103633A (ja) * | 2007-10-25 | 2009-05-14 | Internatl Business Mach Corp <Ibm> | Position estimation system, method, and program |
JP2012532319A (ja) * | 2009-06-30 | 2012-12-13 | クゥアルコム・インコーポレイテッド | Trajectory-based location determination |
JP2012058248A (ja) * | 2010-09-13 | 2012-03-22 | Ricoh Co Ltd | Motion tracking technique for RFID tags |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017227594A (ja) * | 2016-06-24 | 2017-12-28 | トヨタ自動車株式会社 | Position estimation device for a mobile body |
JP2018013851A (ja) * | 2016-07-19 | 2018-01-25 | 日本電信電話株式会社 | Action recognition device and action recognition method |
JP2018013855A (ja) * | 2016-07-19 | 2018-01-25 | 日本電信電話株式会社 | Action recognition device and action recognition method |
JP2018194537A (ja) * | 2017-05-15 | 2018-12-06 | 富士ゼロックス株式会社 | Method, program, and system for position determination and tracking |
JP7077598B2 (ja) | 2017-05-15 | 2022-05-31 | 富士フイルムビジネスイノベーション株式会社 | Method, program, and system for position determination and tracking |
Also Published As
Publication number | Publication date |
---|---|
CN107003382A (zh) | 2017-08-01 |
US20170307393A1 (en) | 2017-10-26 |
JPWO2016098457A1 (ja) | 2017-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2016098457A1 (ja) | Information processing apparatus, information processing method, and program | |
US20190383620A1 (en) | Information processing apparatus, information processing method, and program | |
CN107339990B (zh) | 多模式融合定位***及方法 | |
US10719983B2 (en) | Three dimensional map generation based on crowdsourced positioning readings | |
JP6311478B2 (ja) | Information processing apparatus, information processing method, and program | |
US8588464B2 (en) | Assisting a vision-impaired user with navigation based on a 3D captured image stream | |
Sunny et al. | Applications and challenges of human activity recognition using sensors in a smart environment | |
US11181376B2 (en) | Information processing device and information processing method | |
US11143507B2 (en) | Information processing apparatus and information processing method | |
Capurso et al. | A survey on key fields of context awareness for mobile devices | |
JPWO2017047063A1 (ja) | Information processing apparatus, evaluation method, and computer program | |
Wang et al. | Indoor PDR Positioning Assisted by Acoustic Source Localization, and Pedestrian Movement Behavior Recognition, Using a Dual‐Microphone Smartphone | |
Zaib et al. | Smartphone based indoor navigation for blind persons using user profile and simplified building information model | |
KR102578119B1 (ko) | Method of operating smart glasses interworking with a mobile device | |
WO2017056774A1 (ja) | Information processing apparatus, information processing method, and computer program | |
WO2015194270A1 (ja) | Information processing apparatus, information processing method, and program | |
Mahida et al. | Indoor positioning framework for visually impaired people using Internet of Things | |
JP2023131905A (ja) | Action estimation system, action estimation method, and program | |
Gil et al. | inContexto: A fusion architecture to obtain mobile context | |
US20190205580A1 (en) | Information processing apparatus, information processing method, and computer program | |
Gong et al. | Building smart transportation hubs with internet of things to improve services to people with disabilities | |
Shoushtari et al. | Data-Driven Inertial Navigation assisted by 5G UL-TDoA Positioning | |
WO2015194269A1 (ja) | Information processing apparatus, information processing method, and program | |
WO2022029894A1 (ja) | Information processing apparatus, information processing system, information processing method, and program | |
JP2024058567A (ja) | Training data generation system, estimation system, training data generation method, estimation method, and program |
Legal Events
Code | Title | Description
---|---|---
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 15869665; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2016564724; Country of ref document: JP; Kind code of ref document: A
WWE | Wipo information: entry into national phase | Ref document number: 15518327; Country of ref document: US
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: pct application non-entry in european phase | Ref document number: 15869665; Country of ref document: EP; Kind code of ref document: A1