WO2011007386A1 - Navigation device - Google Patents

Navigation device Download PDF

Info

Publication number
WO2011007386A1
WO2011007386A1 · PCT/JP2009/003299
Authority
WO
WIPO (PCT)
Prior art keywords
user
information
destination
index value
preference
Prior art date
Application number
PCT/JP2009/003299
Other languages
French (fr)
Japanese (ja)
Inventor
福原英樹
Original Assignee
三菱電機株式会社 (Mitsubishi Electric Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社 filed Critical 三菱電機株式会社
Priority to PCT/JP2009/003299 priority Critical patent/WO2011007386A1/en
Publication of WO2011007386A1 publication Critical patent/WO2011007386A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3605 Destination input or retrieval
    • G01C21/3617 Destination input or retrieval using user history, behaviour, conditions or preferences, e.g. predicted or inferred from previous use or current movement

Definitions

  • This invention relates to a navigation device that presents destination candidates and changes routes according to the biological information and preferences of vehicle occupants.
  • In Patent Document 1, although the in-vehicle situation is used to generate the search condition for narrowing down destinations, user operations such as starting the destination search still remain. Consequently, when the driver is the only occupant, the destination search cannot be started while driving, and the destination cannot be searched for at the timing the user desires.
  • Moreover, the user preference described in Patent Document 1 relates only to smoking. The eating and drinking preferences of the driver and passengers, for example, are not considered, so the destination the user desires cannot be presented accurately.
  • The present invention has been made to solve the above problems, and an object thereof is to obtain a navigation device capable of presenting destination candidates and routes desired by the user at an appropriate timing according to the user's current state, based on the biological information and preferences of the vehicle occupants.
  • A navigation device according to the present invention includes a biological information detection unit that detects biological information of a user aboard a moving body, and an index calculation unit that calculates an index value indicating the state of the user from the biological information detected by the biological information detection unit.
  • the destination candidate and route desired by the user can be determined at an appropriate timing according to the current state of the user based on the biological information and taste of the passenger. There is an effect that presentation can be performed.
  • FIG. 1 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 1 of the present invention. FIG. 2 is a block diagram showing the configuration of the navigation device according to Embodiment 1. FIG. 3 is a block diagram showing the configurations of the various sensors and the database according to Embodiment 1. FIG. 4 is a diagram showing the flow of information until a presentation route is determined in the car navigation system according to Embodiment 1. FIG. 5 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention. FIG. 6 is a block diagram showing the configuration of the navigation device according to Embodiment 2.
  • FIG. 1 is a diagram schematically showing a configuration of a car navigation system using a navigation apparatus according to Embodiment 1 of the present invention.
  • the car navigation system according to the first embodiment includes a navigation device 10, various sensors 20, and a storage device that stores a database 30, and these are connected by an in-vehicle network 90 and can communicate with each other.
  • the navigation device 10 is a device that guides the route from the current position to the destination.
  • the various sensors 20 are sensor groups that acquire biological information such as a user's blood pressure and body temperature.
  • In the database 30, preference information such as the users' eating and drinking preferences, and destination information such as destination addresses, are registered.
  • The in-vehicle network 90 is a wireless or wired communication line that connects the navigation device 10, the various sensors 20, and the storage device that stores the database 30.
  • FIG. 2 is a block diagram showing a configuration of the navigation device according to the first embodiment.
  • The navigation device 10 includes a preference information acquisition unit 11, a biological information storage unit 12, a user state determination unit 13, a user state storage unit 13a, a destination information acquisition unit 14, a time information acquisition unit 15, a time information storage unit 15a, a current location information acquisition unit 16, a device control unit 17, a display device control unit 18, and a display unit 19.
  • the preference information acquisition unit 11 is a component that acquires the user preference information 100.
  • For example, it is realized as an HMI (Human Machine Interface) for inputting the user preference information 100: an input screen is displayed on the display unit 19, and the user preference information 100 is entered based on this input screen.
  • As the input means, an input device mounted on the navigation device 10, such as a key operation unit or a voice input device, can be used.
  • the biological information storage unit 12 is a storage unit that stores the biological information 101 detected by the various sensors 20.
  • When the biological information 101 is detected by the various sensors 20, the biological information storage unit 12 acquires the detected biological information 101 via the in-vehicle network 90 and stores it.
  • the biometric information 101 stored in the biometric information storage unit 12 is read by the user state determination unit 13 at a predetermined cycle.
  • the user state determination unit 13 is a configuration unit that determines user state information 102 such as a user's hunger degree and fatigue degree based on the user preference information 100 and the biological information 101. A method for determining the user status information 102 will be described later.
  • The user state storage unit 13a is a storage unit that stores the user state information 102 obtained by the user state determination unit 13.
  • the destination information acquisition unit 14 is a component that acquires the destination information 103.
  • For example, a destination setting screen is displayed on the display unit 19, and the destination information 103 is set with an input device normally mounted on the navigation device 10 based on this setting screen; that is, the unit is realized as an HMI for setting the destination information 103.
  • the destination information 103 acquired by the destination information acquisition unit 14 is registered in the database 30 via the in-vehicle network 90. Further, the destination information acquisition unit 14 may acquire the destination information 103 registered in the past from the database 30.
  • the time information acquisition unit 15 is a configuration unit that acquires time information 104 such as the current time and driving time. For example, it is time measuring means using a timer mounted on a computer that functions as the navigation device 10.
  • The time information storage unit 15a is a storage unit that stores the time information 104 obtained by the time information acquisition unit 15.
  • the current location information acquisition unit 16 is a component that acquires current location information 105 such as the current travel point of the host vehicle. For example, position information is acquired from GPS (Global Positioning System) or an acceleration sensor, and the vehicle position is measured.
  • The device control unit 17 is a component that uses the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105 to determine the presentation destination information 110, such as the destination to present to the user and the route to that destination.
  • the display device control unit 18 is a component that controls the presentation operation of the presentation destination information 110 in accordance with a control signal from the device control unit 17.
  • the display unit 19 is a component that presents the presentation destination information 110 to the user side.
  • the display unit 19 includes an audio output device such as a speaker in addition to a display device such as an LCD (Liquid Crystal Display) mounted on the navigation device.
  • the map information and the guidance voice information indicating the destination set in the presentation destination information 110 and the route to the destination are displayed on the display screen of the display unit 19 and are output as voice from a speaker.
  • FIG. 3 is a block diagram showing configurations of various sensors and a database used in the car navigation system according to the first embodiment.
  • FIG. 3A shows the configuration of various sensors
  • FIG. 3B shows the configuration of the database 30.
  • the various sensors 20 are provided with a biological information detection unit 21 that detects biological information 101.
  • The biological information detection unit 21 includes, for example, a camera that captures the user, a blood pressure sensor that measures the user's blood pressure, and a thermometer sensor that detects body temperature, as well as processing means that converts their output signals into an information form that can be processed on the navigation device 10 side, thereby generating the biological information 101.
  • the database 30 is provided with a preference information storage unit 31 that stores user preference information 100 and a destination information storage unit 32 that stores destination information 103.
  • The storage device storing the database 30 may be an external storage device connected via the in-vehicle network 90, or may be constructed in a storage area of a hard disk device built into the navigation device 10.
  • FIG. 4 is a diagram showing a flow of information until a presentation route is determined in the car navigation system according to the first embodiment.
  • the main operations of each part of the car navigation system will be described with reference to the configuration shown in FIGS.
  • For example, a route including a restaurant is presented as the destination candidate and route associated in advance with the user's state.
  • the preference information acquisition unit 11 acquires user preference information 100 of a driver and a passenger.
  • the user preference information 100 includes the gender, age, favorite food genre, time of eating a normal meal (morning, noon, night), the amount of money per meal when eating out, and the like.
  • the preference information acquisition unit 11 displays an input screen having these pieces of information as setting items on the display unit 19. The user sets information corresponding to the setting item by terminal key input or voice input.
  • the user preference information 100 input in this way is held in the preference information storage unit 31 of the database 30 from the preference information acquisition unit 11 via the in-vehicle network 90.
  • the destination information 103 is also registered in the destination information storage unit 32 of the database 30 from the destination information acquisition unit 14 via the in-vehicle network 90.
  • the biological information detection unit 21 detects the biological information 101 of the driver and passengers. Based on the biometric information 101 acquired from the user in the vehicle by the biometric information detection unit 21, an appropriate presentation timing in accordance with the current state of the driving user is determined.
  • The biological information detection unit 21 is provided with a camera that captures the interior of the vehicle, a sound-collecting microphone that picks up in-vehicle sound, a blood pressure sensor, an acceleration sensor, a temperature sensor, a thermography device, and the like, together with an information processing unit that processes the information acquired by these devices to generate the biological information 101.
  • For example, the biological information detection unit 21 detects the number of passengers and the direction each passenger's eyes are facing from images captured by the camera, and measures the gaze time as the biological information 101. It may also extract stomach-growling sounds from the in-vehicle sound (speech and stomach growling) detected by the sound-collecting microphone and generate biological information 101 indicating the hunger state of the driver or a passenger. Further, a blood pressure sensor may be embedded in a seat belt or the like so that the biological information detection unit 21 measures the blood pressure of the driver or a passenger as the biological information 101.
  • If an acceleration sensor is embedded in a seat or the like, the vibration produced when the seated person taps out a rhythm can be detected. That is, the vibration detection information acquired by the acceleration sensor can be used as biological information 101 for estimating the psychological state of the driver or passengers.
  • the body temperature near the head of the passenger is measured as the biological information 101 by the temperature sensor and the thermography device.
  • body temperature information is utilized as the biometric information 101 for estimating the psychological state of a driver and a passenger.
  • The biological information 101 is acquired from the driver and passengers by the biological information detection unit 21, constantly or at a fixed cycle while the host vehicle travels, and is sent from the biological information detection unit 21 via the in-vehicle network 90 to the biological information storage unit 12 of the navigation device 10, where it is held. In addition, the time information acquisition unit 15 acquires the current time from a clock installed in the vehicle and the boarding time from a timer or the like, and holds the time information 104 in the time information storage unit 15a.
  • The user state determination unit 13 reads the various biological information 101 from the biological information storage unit 12, reads the time information 104 from the time information storage unit 15a, and receives as input the user preference information 100 acquired by the preference information acquisition unit 11; it then calculates the user state information 102 of the driver and each passenger by performing predetermined calculations on the information 100, 101, and 104.
  • For example, by scoring the biological information 101, the time information 104, and the user preference information 100 according to predetermined rules, indices indicating the driver's or a passenger's hunger or fatigue can be obtained.
  • For example, the fatigue degree B of the driver or a passenger is calculated from the gaze time of the driver's or passenger's line of sight, acquired as the biological information 101, and the boarding time indicated by the time information 104, using the following equation (1). Here, α is a coefficient that differs for each driver or passenger.
  • Fatigue degree B = α × {(gaze time) / (boarding time)} (1)
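As an illustration, the calculation in equation (1) can be sketched in Python. The coefficient value and the input readings below are hypothetical placeholders, not values from the patent.

```python
def fatigue_degree(gaze_time_s, boarding_time_s, coeff=1.0):
    """Equation (1): fatigue degree B = coeff * (gaze time / boarding time).

    coeff is the per-occupant coefficient; its value here is a placeholder.
    """
    if boarding_time_s <= 0:
        raise ValueError("boarding time must be positive")
    return coeff * (gaze_time_s / boarding_time_s)

# Hypothetical example: 1800 s of fixed gaze during a 3600 s ride.
print(fatigue_degree(1800.0, 3600.0))  # 0.5
```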
  • Similarly, the hunger degree H of the driver or a passenger can be calculated using the following equations. Here, index 1 is an index value indicating the degree of irritation due to hunger, and index 2 is an index value indicating how appropriate the timing is for eating.
  • a1 is a coefficient for the blood pressure value, a2 is a coefficient for the vibration value described above, and a3 is a coefficient for the body temperature near the head. b1 is a coefficient for the volume level of the stomach-growling sound. b2 is a coefficient for an index value indicating the result of matching keywords that indicate hunger, extracted from the collected speech; this index value is “1” if a match is obtained and “0” otherwise. b3 is a coefficient for the degree of correlation between the current time and the user's normal meal times; this correlation degree has a maximum value of “1” and a minimum value of “0”. Note that β1, β2, a1 to a3, and b1 to b3 are coefficients that differ for each driver or passenger.
  • Hunger degree H = (β1 × index 1) + (β2 × index 2) (2)
  • Index 1 = {a1 × (blood pressure)} + {a2 × (vibration)} + {a3 × (body temperature near the head)} (3)
  • Index 2 = {b1 × (stomach-growl volume)} + {b2 × (keyword-matching index value)} + {b3 × (meal-time correlation degree)} (4)
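The hunger-degree scoring can likewise be sketched as follows. All coefficient values, sensor readings, and the scaling of each input are hypothetical; the patent only states that the coefficients differ per occupant.

```python
def index_1(a1, a2, a3, blood_pressure, vibration, head_temp):
    """Equation (3): irritation-due-to-hunger index."""
    return a1 * blood_pressure + a2 * vibration + a3 * head_temp

def index_2(b1, b2, b3, growl_volume, keyword_match, mealtime_corr):
    """Equation (4): eating-timing index. keyword_match is 0 or 1;
    mealtime_corr lies in [0, 1]."""
    return b1 * growl_volume + b2 * keyword_match + b3 * mealtime_corr

def hunger_degree(beta1, beta2, i1, i2):
    """Equation (2): H = beta1 * index 1 + beta2 * index 2."""
    return beta1 * i1 + beta2 * i2

# Hypothetical occupant: coefficients and readings chosen for illustration.
i1 = index_1(0.01, 0.5, 0.02, blood_pressure=120, vibration=0.2, head_temp=36.5)
i2 = index_2(0.02, 1.0, 1.0, growl_volume=30, keyword_match=1, mealtime_corr=0.8)
print(hunger_degree(0.5, 0.5, i1, i2))
```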
  • In this way, a scoring rule is set for each of the biological information 101, the time information 104, and the user preference information 100, and the calculated values become the user state information 102.
  • the user status information 102 for each passenger obtained in this way is held in the user status storage unit 13a.
  • the current location information acquisition unit 16 acquires the current location information 105 of the vehicle based on GPS and acceleration sensor data.
  • The device control unit 17 determines the presentation destination information 110 at regular intervals using the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105, and controls the display device control unit 18 based on the determination result. In this way, the presentation destination information 110 is determined at an appropriate presentation timing according to the user's current state, based on the biological information 101. Note that the device control unit 17 also controls the operation of each component in the navigation device 10.
  • Specifically, the device control unit 17 reads the user state information 102 from the user state storage unit 13a, compares the index values constituting the user state information 102 with thresholds set in advance for those index values, and determines the contents of the presentation destination information 110 accordingly.
  • a route including a restaurant is presented as a destination candidate and a route previously associated with the user state.
  • For example, the number of occupants whose hunger degree H or fatigue degree B in the user state information 102 exceeds its threshold is counted. If more occupants have a high hunger degree H, the destination genre of the presentation candidate 106 is set to an eating establishment such as a restaurant; if more occupants have a high fatigue degree B, it is set to a resting establishment such as a coffee shop; if the counts are equal, it is set to an eating establishment such as a restaurant. Further, the user with the highest hunger degree H in the user state information 102 is set as the destination target user of the presentation candidate 106.
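A minimal sketch of this threshold-counting decision, assuming a simple dictionary for each occupant's state (the threshold values are hypothetical):

```python
def choose_genre(occupants, h_threshold, b_threshold):
    """Count occupants above each threshold; more high-H picks an eating
    place, more high-B picks a coffee shop, and ties fall to an eating place."""
    hungry = sum(1 for o in occupants if o["H"] > h_threshold)
    tired = sum(1 for o in occupants if o["B"] > b_threshold)
    return "coffee shop" if tired > hungry else "restaurant"

def target_user(occupants):
    """The occupant with the highest hunger degree H becomes the
    destination target user of presentation candidate 106."""
    return max(occupants, key=lambda o: o["H"])

states = [{"name": "driver", "H": 2.5, "B": 0.3},
          {"name": "passenger", "H": 1.0, "B": 0.9}]
print(choose_genre(states, 2.0, 0.8))   # restaurant (one hungry, one tired)
print(target_user(states)["name"])      # driver
```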
  • Next, the device control unit 17 acquires, from the destination information 103 stored in the destination information storage unit 32, the entries matching the destination genre of the presentation candidate 106, and treats this destination information 103 as the destination content analysis result 107.
  • The device control unit 17 also acquires the user preference information 100 matching the destination target user of the presentation candidate 106 and treats it as the user preference content analysis result 108.
  • Next, the device control unit 17 performs a route search from the current location to the destination using the current location information 105, the address in the destination content analysis result 107, and map information acquired from a map database (not shown), and obtains the route search result 109. Thereafter, the device control unit 17 determines the presentation destination information 110 from the current time in the time information 104, the destination content analysis result 107, the user preference content analysis result 108, and the route search result 109, by scoring with threshold determination.
  • Here, a case where the presentation destination information 110 has a restaurant as the destination is described. A restaurant matching the user's taste is determined from the current time, the cuisine genre, target age range, target gender, business hours, and average price specified in the destination content analysis result 107, the age, gender, favorite food genre, and eating-out budget set in the user preference content analysis result 108, and the required time to the destination in the route search result 109.
  • The presentation determination value A is calculated using the following equations (5) to (7). Here, index 3 is an index value indicating the preference match of the store, and index 4 is an index value indicating the general suitability of the store. γ1 is a coefficient for index 3, and γ2 is a coefficient for index 4. c1 is a coefficient for the degree of correlation between the store's cuisine genre and the user's favorite food genre. c2 is a coefficient for an index value that is “1” when the user's eating-out budget covers the store's average price, and “0” otherwise. d1 is a coefficient for an index value that is “1” when the user's age is within the store's target age range, and “0” otherwise. d2 is a coefficient for an index value that is “1” when the user's gender is included in the store's target genders, and “0” otherwise. d3 is a coefficient for an index value that is “1” when the sum of the current time and the required time falls within the store's business hours, and “0” otherwise. d4 is a coefficient for an index value that is “1” when the required time is within a threshold time, and “0” otherwise. Note that γ1, γ2, c1 to c2, and d1 to d4 are coefficients that differ for each driver or passenger.
  • Presentation determination value A = (γ1 × index 3) + (γ2 × index 4) (5)
  • Index 3 = {c1 × (cuisine-genre correlation degree)} + {c2 × (budget index value)} (6)
  • Index 4 = {d1 × (age index value)} + {d2 × (gender index value)} + {d3 × (business-hours index value)} + {d4 × (required-time index value)} (7)
  • The device control unit 17 calculates the presentation determination value A for each destination matching the destination genre of the presentation candidate 106 by the above procedure, and determines the stores to present to the user. The maximum number of stores to present can be set arbitrarily.
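One way to render the value-A scoring as code. The structure below (a weighted preference index plus a weighted general-suitability index, with the 0/1 component indices described for c1, c2 and d1 to d4) is an inference from the coefficient descriptions; the continuous genre-correlation degree is simplified to an exact-match 0/1, and all weight values are placeholders.

```python
def presentation_value_a(store, user, current_h, travel_h,
                         g=(1.0, 1.0), c=(1.0, 1.0),
                         d=(1.0, 1.0, 1.0, 1.0), time_limit_h=1.0):
    """Score one candidate store. index3 reflects preference match
    (genre, budget); index4 reflects general suitability (age, gender,
    business hours, travel time). All weights are hypothetical."""
    genre_corr = 1.0 if store["genre"] == user["favorite_genre"] else 0.0
    in_budget = 1 if store["avg_price"] <= user["budget"] else 0
    index3 = c[0] * genre_corr + c[1] * in_budget

    age_ok = 1 if store["age_min"] <= user["age"] <= store["age_max"] else 0
    gender_ok = 1 if user["gender"] in store["genders"] else 0
    arrival = current_h + travel_h
    open_ok = 1 if store["open_h"] <= arrival <= store["close_h"] else 0
    time_ok = 1 if travel_h <= time_limit_h else 0
    index4 = d[0] * age_ok + d[1] * gender_ok + d[2] * open_ok + d[3] * time_ok
    return g[0] * index3 + g[1] * index4

store = {"genre": "italian", "avg_price": 1500, "age_min": 18, "age_max": 99,
         "genders": {"M", "F"}, "open_h": 11, "close_h": 22}
user = {"favorite_genre": "italian", "budget": 2000, "age": 30, "gender": "F"}
print(presentation_value_a(store, user, current_h=12, travel_h=0.5))  # 6.0
```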
  • The display device control unit 18 controls the display unit 19 so that, based on the presentation destination information 110 determined by the device control unit 17, the images set in the presentation destination information 110 and the voice explaining the reason for each presentation are presented in descending order of the presentation determination value A of each destination. For example, a store becomes a presentation target if its presentation determination value A is equal to or greater than a predetermined threshold.
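The ranking and cutoff described above can be sketched as follows (the threshold and the cap are the arbitrary settings the text mentions; the values here are hypothetical):

```python
def stores_to_present(scored, threshold, max_count):
    """Sort candidates by presentation determination value A, descending;
    keep those at or above the threshold, capped at the arbitrary maximum."""
    ranked = sorted(scored, key=lambda s: s["A"], reverse=True)
    return [s["name"] for s in ranked if s["A"] >= threshold][:max_count]

candidates = [{"name": "cafe", "A": 3.0},
              {"name": "trattoria", "A": 7.0},
              {"name": "diner", "A": 5.0}]
print(stores_to_present(candidates, threshold=4.0, max_count=2))
# ['trattoria', 'diner']
```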
  • The user confirms (or cancels) a destination from the presented destinations by key input or voice input, and route guidance is started (or not performed).
  • When the user cancels, the navigation device 10 ends any route guidance to that destination that has already been started.
  • As described above, according to Embodiment 1, the index values of the user state information 102 calculated from the biological information 101 of the users aboard the vehicle are compared with predetermined thresholds, and when it is determined that a change has occurred in the user's current state, the destination candidates and routes associated in advance with that state are presented to the user. With this configuration, the navigation device 10 actively presents destination candidates and route changes without requiring user operation, which saves the user effort and enables guidance to a destination matching the user's preferences. In addition, since the user preference information 100 and the biological information 101 are used, destination candidates matching the user's preferences can be presented, and the route changed, at an appropriate presentation timing according to the user's current state.
  • Although Embodiment 1 uses the user preference information 100, the present invention is not limited to this configuration.
  • For example, the preference information acquisition unit 11 may be omitted, and the device control unit 17 may determine the presentation timing based on the biological information 101 sequentially detected from the user while driving.
  • In this case, the device control unit 17 determines the presentation destination information 110 from the current time in the time information 104, the destination content analysis result 107, and the route search result 109, and controls the display device control unit 18 to present it to the user.
  • FIG. 5 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention.
  • the car navigation system according to the second embodiment includes an out-of-vehicle server 70 and another navigation device 40 in addition to the configuration shown in the first embodiment.
  • the navigation device 10, the out-of-vehicle server 70, and the other navigation device 40 are connected by an out-of-vehicle network 91 and can communicate with each other.
  • the off-vehicle server 70 is a server device that manages the destination additional information 111.
  • The destination additional information 111 is information indicating, for example, in-store images related to the destination information 103 and services the store is currently offering.
  • the outside-vehicle network 91 is a single wireless communication line that connects the navigation device 10, the outside-server 70, and the other navigation device 40.
  • the other navigation device 40 is a navigation device that is installed in a vehicle different from the navigation device 10 and is managed by another user.
  • the other navigation device 40 notifies the other user evaluation information 112 regarding the destination information 103 to the navigation device 10 in the wireless communication area via the outside-vehicle network 91.
  • the other user evaluation information 112 is evaluation information of other users indicating, for example, a degree of satisfaction with the store with respect to the destination information 103.
  • FIG. 6 is a block diagram showing the configuration of the navigation device according to the second embodiment.
  • the navigation device 10 according to the second embodiment includes a communication control unit 20 in addition to the configuration described with reference to FIG. 2 in the first embodiment.
  • the communication control unit 20 is a component that communicates with the outside server 70 and the other navigation device 40 via the outside network 91 and exchanges information with them.
  • the route including the restaurant is presented as the presentation destination information 110 at the timing when the driver or passenger is determined to be hungry or tired based on the biometric information 101.
  • For example, the communication control unit 20 acquires, from the out-of-vehicle server 70 by wireless communication, destination additional information 111 such as in-store images relating to the destination information 103 and services the store is currently offering.
  • the communication control unit 20 similarly acquires other user evaluation information 112 such as satisfaction with the store with respect to the destination information 103 from the other navigation device 40.
  • The other navigation device 40 to be communicated with must be able to communicate with the navigation device 10 (be within the wireless communication area) via the out-of-vehicle network 91.
  • In addition, the target other user must manage the other navigation device 40 described above, and the user of the navigation device 10 must have permitted the communication. That is, the target other user is a user whose preferences correlate highly with the user preference content analysis result 108.
  • the destination additional information 111 and the other user evaluation information 112 acquired by the communication control unit 20 are stored in a memory (not shown) in the device control unit 17.
  • the destination additional information 111 is acquired by the communication control unit 20 from the out-of-vehicle server 70 as information on the determined store when the store to be presented to the user is determined by the device control unit 17. This destination additional information 111 is added to an image, sound, or the like presented on the display unit 19.
  • the out-of-vehicle server 70 creates and manages the destination additional information 111 from information provided from the outside.
  • the information content that can be provided to the outside server 70 may be set according to, for example, an advertisement fee paid to the administrator of the outside server 70.
  • The other user evaluation information 112 is acquired from the other navigation device 40 when the device control unit 17 determines the stores to present to the user, as evaluation information from other users who have the same age (within an age-difference threshold), gender, and favorite food genre as those set in the user preference content analysis result 108. The other user evaluation information 112 is used in the calculation of the presentation determination value by the device control unit 17.
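The matching of evaluation sources can be sketched as follows, assuming each user's profile is a simple dictionary (the age-difference threshold value is hypothetical):

```python
def matching_evaluators(others, me, age_diff_threshold):
    """Keep other users whose age is within the threshold of the target
    user's and whose gender and favorite food genre match exactly."""
    return [o for o in others
            if abs(o["age"] - me["age"]) <= age_diff_threshold
            and o["gender"] == me["gender"]
            and o["favorite_genre"] == me["favorite_genre"]]

me = {"age": 30, "gender": "F", "favorite_genre": "italian"}
others = [{"age": 32, "gender": "F", "favorite_genre": "italian"},
          {"age": 55, "gender": "F", "favorite_genre": "italian"},
          {"age": 31, "gender": "M", "favorite_genre": "italian"}]
print(len(matching_evaluators(others, me, age_diff_threshold=5)))  # 1
```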
  • As described above, the information presented when a destination (candidate place) is presented or a route is changed includes information provided by the destination itself, acquired from the out-of-vehicle server 70 by out-of-vehicle communication.
  • The evaluation of the destination by other users with similar tastes, acquired from the other navigation device 40 by out-of-vehicle communication, is also used.
  • Note that the information 111 and 112 acquired from the out-of-vehicle server 70 and the other navigation device 40 may be provided under a charging system, and the contents that can be presented may differ according to the amount paid.
  • When the presentation destination information 110 with a restaurant as the destination is determined, the restaurant matching the user's preference is selected from: the current time; the cooking genre, target age range, target gender, business hours, and average fee specified in the destination content analysis result 107; the age, gender, favorite food genre, and eating-out budget set in the user preference content analysis result 108; the required travel time to the destination in the route search result 109; and the satisfaction with the store in the other user evaluation information 112.
  • For this purpose, the presentation determination value B is calculated using the following equations (8) to (10).
  • Index 5 is an index value indicating how well the store matches the user's preferences, and index 6 is an index value indicating the general suitability of the store.
  • γ1 is a coefficient for the index value indicating preference for the store, and γ2 is a coefficient for the index value indicating the general suitability of the store.
  • e1 is a coefficient for the degree of correlation between the cooking genre and the favorite food genre.
  • e2 is a coefficient for an index value that is "1" when the user's eating-out budget falls within the average fee, and "0" otherwise.
  • f1 is a coefficient for an index value that is "1" when the user's age falls within the store's target age range, and "0" otherwise.
  • f2 is a coefficient for an index value that is "1" when the user's gender is included in the store's target genders, and "0" otherwise.
  • f3 is a coefficient for an index value that is "1" when the sum of the current time and the required travel time falls within business hours, and "0" otherwise.
  • f4 is a coefficient for an index value that is "1" when the required travel time is within the threshold time, and "0" when it exceeds the threshold time.
  • γ1 and γ2 are coefficients whose values vary in proportion to the index value of satisfaction with the store indicated in the other user evaluation information 112.
  • e1, e2, and f1 to f4 are coefficients that differ between the driver and passengers.
  • Presentation determination value B = (γ1 × index 5) + (γ2 × index 6)   (8)
  • Index 5 = {e1 × (degree of correlation between cooking genre and favorite food genre)} + {e2 × (index value: eating-out budget within the average fee)}   (9)
  • Index 6 = {f1 × (index value: age within the target age range)} + {f2 × (index value: gender within the target genders)} + {f3 × (index value: current time + required time within business hours)} + {f4 × (index value: required time within the threshold time)}   (10)
  • The device control unit 17 calculates the presentation determination value B for each destination matching the target genre of the presentation candidate 106 by the above procedure, and determines the stores to be presented to the user.
  • The display device control unit 18 controls the display unit 19 so as to display, from the presentation destination information 110 determined by the device control unit 17, the images set in the presentation destination information 110 in descending order of the presentation determination value B of each destination, and to present voice explaining the reason each destination is presented.
  • The user enters user evaluation information 113, a graded rating of satisfaction with the destination, by terminal key input or voice input; this information is input to the destination information acquisition unit 14.
  • This user evaluation information 113 is stored from the destination information acquisition unit 14 via the in-vehicle network 90 into the destination information storage unit 32 of the database 30.
  • This user evaluation information 113 is used by the other navigation device 40.
  • The other navigation device 40 has the same configuration and functions as the navigation device 10, and by acquiring the user evaluation information 113 from the navigation device 10, it can use that information as its own other user evaluation information 112.
  • As described above, the navigation device communicates with the out-of-vehicle server 70, which manages the destination additional information 111 on destinations, and with the other navigation device 40, which is managed by another user and transmits the other user evaluation information 112 on destinations. When the device control unit 17 determines that a change has occurred in the user state information 102, it selects, from the destination candidates and routes previously associated with that state, the candidate and route that match the preferences indicated by the user's preference information and the other user evaluation information 112 received by the communication control unit 20, and presents them to the user of the host vehicle together with the destination additional information 111 on the candidate received by the communication control unit 20 from the out-of-vehicle server 70.
  • Since the other user evaluation information 112, the evaluations of destinations by other users with the same tastes, can be used in determining the stores to be presented, destinations that more closely match the user's tastes can be presented.
  • As described above, the navigation device according to the present invention can present destination candidates and routes desired by the user at an appropriate presentation timing, in accordance with the user's current state, based on the biological information and preferences of the vehicle's occupants, and is therefore useful as a vehicle navigation device.
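The store-scoring procedure described above can be sketched as follows. This is a minimal illustration of equations (8)–(10) and the coefficient definitions only; all field names, coefficient values, and the satisfaction-scaling of γ1 and γ2 are hypothetical, since the patent specifies the structure of the calculation but not concrete values.

```python
# Sketch of the presentation determination value B (equations (8)-(10)).
# All coefficient values and the store/user record fields are hypothetical
# examples; the patent text defines only the form of the calculation.

def presentation_value(store, user, other_user_satisfaction,
                       gamma1=1.0, gamma2=1.0,      # weights for index 5 / index 6
                       e1=1.0, e2=0.5,              # preference coefficients
                       f1=0.3, f2=0.2, f3=0.4, f4=0.1,  # suitability coefficients
                       time_threshold_min=30):
    # Index 5 (eq. (9)): how well the store matches the user's preferences.
    genre_corr = 1.0 if store["genre"] == user["favorite_genre"] else 0.0
    fee_ok = 1.0 if store["avg_fee_min"] <= user["budget"] <= store["avg_fee_max"] else 0.0
    index5 = e1 * genre_corr + e2 * fee_ok

    # Index 6 (eq. (10)): general suitability of the store.
    age_ok = 1.0 if store["age_min"] <= user["age"] <= store["age_max"] else 0.0
    gender_ok = 1.0 if user["gender"] in store["genders"] else 0.0
    arrival = user["current_time_min"] + store["travel_time_min"]
    hours_ok = 1.0 if store["open_min"] <= arrival <= store["close_min"] else 0.0
    near_ok = 1.0 if store["travel_time_min"] <= time_threshold_min else 0.0
    index6 = f1 * age_ok + f2 * gender_ok + f3 * hours_ok + f4 * near_ok

    # gamma1 and gamma2 scale with the other users' satisfaction with the store.
    g1 = gamma1 * other_user_satisfaction
    g2 = gamma2 * other_user_satisfaction
    return g1 * index5 + g2 * index6  # eq. (8)
```

Candidate stores would then be ranked in descending order of B, matching the display order described above.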


Abstract

A navigation device compares an index value with a predetermined threshold, the index value indicating the user's state and being calculated from biological information of a user aboard a mobile object. When the navigation device determines that a change has occurred in the user's state, it presents to the user a destination candidate and a route previously associated with that state.

Description

Navigation device
This invention relates to a navigation device that presents destination candidates and changes routes according to the biological information and preferences of a vehicle's occupants.
In the navigation device described in Patent Document 1, in order to easily and quickly search for the destination information the user needs according to in-vehicle conditions such as occupant composition, destination search conditions are generated based on in-vehicle conditions such as the number of passengers and occupant preferences. This reduces the burden of search operations and shortens the search time until the user finds the desired destination.
However, in Patent Document 1, since in-vehicle conditions are used only when generating the search conditions for narrowing down the destination, user operations such as starting a destination search still remain. For this reason, when the driver is the only occupant, the destination search cannot be started while driving, so the destination cannot be searched at the timing the user desires.
In addition, the user preference described in Patent Document 1 relates only to smoking; preferences regarding, for example, the eating and drinking of the driver and passengers are not considered, so the destination desired by the user cannot be presented accurately.
The present invention has been made to solve the above problems, and an object of the invention is to provide a navigation device that can present destination candidates and routes desired by the user at an appropriate timing in accordance with the user's current state, based on the biological information and preferences of the vehicle's occupants.
Japanese Patent Laid-Open No. 2005-10035
A navigation device according to the present invention includes: a biological information detection unit that detects biological information of a user aboard a moving body; an index value calculation unit that calculates, from the biological information detected by the biological information detection unit, an index value indicating the user's state; and a control unit that, when it compares the index value calculated by the index value calculation unit with a predetermined threshold and determines that a change has occurred in the user's state, presents to the user the destination candidates and routes previously associated with that state.
According to this invention, the index value indicating the user's state, calculated from the biological information of the user aboard the moving body, is compared with a predetermined threshold, and when it is determined that a change has occurred in the user's state, the destination candidates and routes previously associated with that state are presented to the user. As a result, destination candidates and routes desired by the user can be presented at an appropriate timing in accordance with the user's current state, based on the occupants' biological information and preferences.
FIG. 1 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 1 of the present invention. FIG. 2 is a block diagram showing the configuration of the navigation device according to Embodiment 1. FIG. 3 is a block diagram showing the configurations of the various sensors and the database according to Embodiment 1. FIG. 4 is a diagram showing the flow of information until a presentation route is determined in the car navigation system according to Embodiment 1. FIG. 5 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention. FIG. 6 is a block diagram showing the configuration of the navigation device according to Embodiment 2.
Hereinafter, in order to describe the present invention in more detail, the best mode for carrying out the invention will be described with reference to the accompanying drawings.

Embodiment 1.

FIG. 1 is a diagram schematically showing the configuration of a car navigation system using a navigation device according to Embodiment 1 of the present invention. In FIG. 1, the car navigation system of Embodiment 1 includes the navigation device 10, the various sensors 20, and a storage device storing the database 30; these are connected by the in-vehicle network 90 and can communicate with one another.
The navigation device 10 guides the user along a route from the current position to the destination. The various sensors 20 are a group of sensors that acquire biological information such as the user's blood pressure and body temperature. In the database 30, preference information, such as the user's food and drink preferences, and destination information, such as destination addresses, are registered. The in-vehicle network 90 is a single wireless or wired communication line that connects the navigation device 10, the various sensors 20, and the storage device storing the database 30.
FIG. 2 is a block diagram showing the configuration of the navigation device according to Embodiment 1. In FIG. 2, the navigation device 10 includes a preference information acquisition unit 11, a biological information storage unit 12, a user state determination unit 13, a user state storage unit 13a, a destination information acquisition unit 14, a time information acquisition unit 15, a time information storage unit 15a, a current location information acquisition unit 16, a device control unit 17, a display device control unit 18, and a display unit 19.
The preference information acquisition unit 11 acquires the user preference information 100. For example, it is an HMI (Human Machine Interface) for entering the user preference information 100: an input screen is displayed on the display unit 19, and the user preference information 100 is entered via this screen. An input device mounted on the navigation device 10, such as a key operation unit or a voice input device, can be used to enter the user preference information 100.
The biological information storage unit 12 stores the biological information 101 detected by the various sensors 20. When the various sensors 20 detect biological information 101, it is acquired via the in-vehicle network 90 and stored in the biological information storage unit 12. The biological information 101 stored in the biological information storage unit 12 is read out by the user state determination unit 13 at a predetermined cycle. The user state determination unit 13 determines user state information 102, such as the user's degree of hunger and degree of fatigue, based on the user preference information 100 and the biological information 101. The method of determining the user state information 102 will be described later. The user state storage unit 13a stores the user state information 102 obtained by the user state determination unit 13.
The destination information acquisition unit 14 acquires the destination information 103. For example, it is realized as an HMI for setting the destination information 103: a destination setting screen is displayed on the display unit 19, and the destination information 103 can be set with an input device normally mounted on the navigation device 10. The destination information 103 acquired by the destination information acquisition unit 14 is registered in the database 30 via the in-vehicle network 90. The destination information acquisition unit 14 may also acquire destination information 103 registered in the past from the database 30.
The time information acquisition unit 15 acquires time information 104 such as the current time and driving time; for example, it is a timekeeping means using a timer mounted on the computer that functions as the navigation device 10. The time information storage unit 15a stores the time information 104 obtained by the time information acquisition unit 15.
The current location information acquisition unit 16 acquires current location information 105, such as the current travel point of the host vehicle; for example, it acquires position information from GPS (Global Positioning System) or an acceleration sensor and determines the vehicle's position. The device control unit 17 determines, based on the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105, the presentation destination information 110, such as the destination to be presented to the user and the route to that destination.
The display device control unit 18 controls the presentation of the presentation destination information 110 in accordance with control signals from the device control unit 17. The display unit 19 presents the presentation destination information 110 to the user. In addition to a display device such as an LCD (Liquid Crystal Display) mounted on the navigation device, the display unit 19 includes an audio output device such as a speaker. For example, map information and guidance voice information indicating the destination and the route set in the presentation destination information 110 are displayed on the display screen of the display unit 19 and output as audio from the speaker.
FIG. 3 is a block diagram showing the configurations of the various sensors and the database used in the car navigation system according to Embodiment 1. FIG. 3(a) shows the configuration of the various sensors, and FIG. 3(b) shows the configuration of the database 30. As shown in FIG. 3(a), the various sensors 20 include a biological information detection unit 21 that detects the biological information 101.
The biological information detection unit 21 includes, for example, a camera that photographs the user, a blood pressure sensor that measures the user's blood pressure, and a temperature sensor that detects body temperature, together with processing means that converts their output signals into a form that the navigation device 10 can handle, thereby generating the biological information 101.
The database 30 includes a preference information storage unit 31 that stores the user preference information 100 and a destination information storage unit 32 that stores the destination information 103. The storage device holding the database 30 may be an external storage device connected via the in-vehicle network 90, or it may be constructed in a storage area of a hard disk device built into the navigation device 10.
Next, the operation will be described.
FIG. 4 is a diagram showing the flow of information until a presentation route is determined in the car navigation system according to Embodiment 1. The main operations of each part of the car navigation system will be described with reference to the configurations shown in FIGS. 2 and 3 and to FIG. 4. In the following, the case where the driver or a passenger is hungry or fatigued, and a route including a restaurant is presented as the destination candidate and route associated in advance with that user state, is taken as an example.
First, as preprocessing before driving, the preference information acquisition unit 11 acquires the user preference information 100 of the driver and each passenger. The user preference information 100 includes the gender, age, and favorite food genres of the driver and passengers, the times at which they usually take meals (morning, noon, evening), the amount they spend per meal when eating out, and so on. The preference information acquisition unit 11 displays an input screen with these items as setting fields on the display unit 19, and the user sets the information corresponding to each item by terminal key input or voice input.
The user preference information 100 entered in this way is transferred from the preference information acquisition unit 11 via the in-vehicle network 90 and held in the preference information storage unit 31 of the database 30. Similarly, the destination information 103 is registered from the destination information acquisition unit 14 via the in-vehicle network 90 into the destination information storage unit 32 of the database 30.
Next, during driving, the biological information detection unit 21 detects the biological information 101 of the driver and passengers.
Based on the biological information 101 acquired from the users in the vehicle by the biological information detection unit 21, an appropriate presentation timing matching the current state of the driving user is determined. Here, in order to determine a presentation timing suitable for when the driver or a passenger is hungry or fatigued, the biological information detection unit 21 is provided with a camera that photographs the vehicle interior, a sound-collecting microphone that picks up in-vehicle sounds, a blood pressure sensor, an acceleration sensor, a temperature sensor, a thermography device, and the like, together with an information processing unit that processes the acquired information to generate the biological information 101.
For example, the biological information detection unit 21 detects the number of passengers from images captured by the camera, or measures gaze time as biological information 101 based on the detected direction in which each occupant's eyes are facing. It may also extract stomach-rumbling sounds from the in-vehicle sounds (voices, stomach sounds) detected by the sound-collecting microphone and generate biological information 101 indicating the hunger state of the driver or a passenger. Furthermore, with a blood pressure sensor embedded in a seat belt or the like, the biological information detection unit 21 measures the blood pressure of the driver or passengers as biological information 101.
When a person's psychological state becomes unstable due to hunger or fatigue, the person may distract himself or herself by tapping out a steady rhythm. If an acceleration sensor is embedded in a seat or the like, the resulting vibration can be detected when a seated occupant taps such a rhythm. That is, the vibration information acquired by the acceleration sensor can be used as biological information 101 for estimating the psychological state of the driver or passengers.
Furthermore, the body temperature near the occupant's head is measured as biological information 101 by the temperature sensor and the thermography device. In general, when skin blood flow varies with psychological state, the skin temperature distribution changes accordingly. For this reason, body temperature information is used as biological information 101 for estimating the psychological state of the driver and passengers.
The biological information 101 is acquired from the driver and passengers by the biological information detection unit 21, constantly or at a fixed cycle, while the host vehicle is traveling, and is sent via the in-vehicle network 90 to the biological information storage unit 12 of the navigation device 10, where it is held. The time information acquisition unit 15 acquires the current time from a clock installed in the vehicle, acquires the riding time from a timer or the like, and holds these as time information 104 in the time information storage unit 15a.
The user state determination unit 13 reads the various pieces of biological information 101 from the biological information storage unit 12, reads the time information 104 from the time information storage unit 15a, and receives the user preference information 100 acquired by the preference information acquisition unit 11; by applying predetermined calculations to this information 100, 101, and 104, it calculates the user state information 102 for the driver and for each passenger.
Here, by scoring the biological information 101, the time information 104, and the user preference information 100 according to predetermined rules, indices indicating the degree of hunger and degree of fatigue of the driver or a passenger can be obtained.
For example, the fatigue degree B of the driver or a passenger is calculated from the gaze time of the driver's or passenger's line of sight, acquired as biological information 101, and the riding time indicated by the time information 104, using the following equation (1), where α is a coefficient that differs between the driver and passengers.
Fatigue degree B = α × {(gaze time) / (riding time)}   (1)
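Equation (1) is a simple ratio scaled by a per-occupant coefficient. A minimal sketch follows; the function name, the example times, and the value of α are all hypothetical, since the patent does not fix concrete values:

```python
# Sketch of the fatigue degree B of equation (1).
# alpha and the example times below are hypothetical values.

def fatigue_degree(gaze_time_s, riding_time_s, alpha):
    # Longer sustained gazes relative to total riding time suggest fatigue.
    return alpha * (gaze_time_s / riding_time_s)

# e.g. 15 minutes of gaze over a 2-hour ride with alpha = 8.0
driver_fatigue = fatigue_degree(gaze_time_s=900, riding_time_s=7200, alpha=8.0)
```

A separate α would be used for the driver and for each passenger, as the text states.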
Further, from the in-vehicle sound, blood pressure, vibration, and body temperature near the head acquired as biological information 101, the current time indicated by the time information 104, and the usual meal times set in the user preference information 100, the hunger degree H of the driver or a passenger can be calculated using the following equations (2) to (4).
Here, index 1 is an index value indicating the degree of irritation due to hunger, and index 2 is an index value indicating how appropriate the present moment is for taking a meal.
a1 is a coefficient for the blood pressure value, a2 is a coefficient for the above-described vibration value, and a3 is a coefficient for the body temperature near the head. b1 is a coefficient for the volume level of stomach-rumbling sounds.
b2 is a coefficient for an index value indicating the result of matching keywords referring to hunger extracted from the collected speech; this index value is "1" if a match is found and "0" otherwise.
b3 is a coefficient for the degree of correlation between the current time and the usual meal time; this correlation takes a maximum value of "1" and a minimum value of "0".
β1, β2, a1 to a3, and b1 to b3 are coefficients that differ between the driver and passengers.
Hunger degree H = (β1 × index 1) + (β2 × index 2)   (2)
Index 1 = {a1 × (blood pressure)} + {a2 × (vibration)} + {a3 × (body temperature near the head)}   (3)
Index 2 = {b1 × (volume level of stomach-rumbling sounds)} + {b2 × (index value: "1" if the speech contains a keyword referring to hunger)} + {b3 × (degree of correlation between the current time and the usual meal time)}   (4)
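Equations (2)–(4) can be sketched as below. All coefficient values are hypothetical, and the linear 120-minute window used for the meal-time correlation is an assumption of this sketch; the patent only requires that the correlation range from "0" to "1":

```python
# Sketch of the hunger degree H of equations (2)-(4).
# All coefficient values, field names, and the shape of the meal-time
# correlation are hypothetical; the patent fixes only the linear form.

def hunger_degree(bio, pref, now_min,
                  beta1=1.0, beta2=1.0,
                  a1=0.01, a2=0.5, a3=0.02,
                  b1=0.02, b2=1.0, b3=1.0):
    # Index 1 (eq. (3)): irritation due to hunger.
    index1 = (a1 * bio["blood_pressure"]
              + a2 * bio["vibration"]
              + a3 * bio["head_temperature"])

    # Index 2 (eq. (4)): appropriateness of the timing for a meal.
    keyword = 1.0 if bio["hunger_keyword_matched"] else 0.0
    # Correlation with the usual meal time: 1 at the meal time itself,
    # falling linearly to 0 over a (hypothetical) 120-minute window.
    diff = abs(now_min - pref["usual_meal_time_min"])
    correlation = max(0.0, 1.0 - diff / 120.0)
    index2 = (b1 * bio["stomach_sound_level"]
              + b2 * keyword
              + b3 * correlation)

    return beta1 * index1 + beta2 * index2  # eq. (2)
```

As with the fatigue degree, a separate coefficient set would be held for the driver and for each passenger.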
As described above, scoring rules are set for each piece of biological information 101, time information 104, and user preference information 100, and the calculated results constitute the user state information 102. The user state information 102 obtained in this way for each occupant is held in the user state storage unit 13a.
The current location information acquisition unit 16 acquires the current location information 105 of the vehicle based on GPS and acceleration sensor data.
At a fixed interval, the device control unit 17 determines the presentation destination information 110 using the user preference information 100, the user state information 102, the destination information 103, the time information 104, and the current location information 105, and controls the display device control unit 18 based on this determination result. In this way, the presentation destination information 110 is determined based on the biological information 101 at a presentation timing appropriate to the user's current state. In addition to this operation, the device control unit 17 also controls the operation of each component in the navigation device 10.
Next, the method by which the device control unit 17 determines the presentation destination information 110 is described in detail.
First, the device control unit 17 reads the user state information 102 from the user state storage unit 13a and compares each index value constituting the user state information 102 with a threshold value set in advance for that index value; the content of the presentation destination information 110 is determined according to the result of this threshold comparison. Here, when the driver or a passenger is hungry or fatigued, the destination candidates associated in advance with that user state, and a route including a restaurant, are presented.
For example, the number of passengers whose hunger degree H or fatigue degree B in the user state information 102 exceeds the threshold is counted. If more passengers have a high hunger degree H, the target genre of the presentation candidate 106 is set to restaurants; if more passengers have a high fatigue degree B, it is set to coffee shops or similar establishments; if the numbers are equal, it is set to restaurants. Furthermore, the user with the highest hunger degree H in the user state information 102 is set as the destination target user of the presentation candidate 106.
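This majority decision can be sketched as follows; the threshold values and the dictionary-based passenger records are illustrative assumptions, not structures prescribed by the patent:

```python
def choose_genre(states, hunger_threshold=0.5, fatigue_threshold=0.5):
    """Count passengers above each threshold: coffee shop if fatigue
    dominates, restaurant if hunger dominates or on a tie."""
    hungry = sum(1 for s in states if s["H"] > hunger_threshold)
    fatigued = sum(1 for s in states if s["B"] > fatigue_threshold)
    return "coffee shop" if fatigued > hungry else "restaurant"

def destination_target_user(states):
    """The passenger with the highest hunger degree H becomes the
    destination target user of the presentation candidate 106."""
    return max(states, key=lambda s: s["H"])["user"]

states = [{"user": "driver", "H": 0.8, "B": 0.2},
          {"user": "front passenger", "H": 0.3, "B": 0.9}]
```

With these example states the counts tie (one hungry, one fatigued), so the genre defaults to restaurants and the driver, having the highest H, becomes the target user.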
Subsequently, the device control unit 17 acquires, from the destination information 103 stored in the destination information storage unit 32, the destination information 103 that matches the target genre of the presentation candidate 106, and treats it as the destination content analysis result 107. The device control unit 17 also acquires the user preference information 100 that matches the destination target user of the presentation candidate 106 and treats it as the user preference content analysis result 108.
The device control unit 17 also performs a route search from the current location to the destination using the current location information 105, the address in the destination content analysis result 107, and map information acquired from a map database (not shown), to obtain the route search result 109.
After this, the device control unit 17 determines the presentation destination information 110 from the current time in the time information 104, the destination content analysis result 107, the user preference content analysis result 108, and the route search result 109, by scoring according to threshold comparisons.
Here, the case where the presentation destination information 110 has a restaurant as the destination is described.
For example, a restaurant matching the user's preferences is determined from the current time; the cuisine genre, customer age range, gender, business hours, and average price specified in the destination content analysis result 107; the age, gender, favorite food genres, and eating-out budget set in the user preference content analysis result 108; and the travel time to the destination in the route search result 109.
Specifically, the presentation determination value A is calculated using the following equations (5) to (7).
Here, index 3 is an index value indicating how well the store matches the user's preferences, and index 4 is an index value indicating the general suitability of the store.
γ1 is a coefficient for index 3 (store preference), and γ2 is a coefficient for index 4 (store generality).
c1 is a coefficient for the degree of correlation between the cuisine genre and the user's favorite food genres.
c2 is a coefficient for an index value that is “1” when the user's eating-out budget covers the average price, and “0” otherwise.
d1 is a coefficient for an index value that is “1” when the user's age falls within the store's customer age range, and “0” otherwise.
d2 is a coefficient for an index value that is “1” when the user's gender is included in the store's customer gender, and “0” otherwise.
d3 is a coefficient for an index value that is “1” when the sum of the current time and the travel time falls within business hours, and “0” otherwise.
d4 is a coefficient for an index value that is “1” when the travel time is within a threshold time, and “0” otherwise.
Note that γ1, γ2, c1 to c2, and d1 to d4 are coefficients that differ for the driver and each passenger.
Presentation determination value A = (γ1 × index 3) + (γ2 × index 4)   (5)
Index 3 = {c1 × (degree of correlation between the cuisine genre and the favorite food genres)} + {c2 × (1 if the eating-out budget covers the average price, else 0)}   (6)
Index 4 = {d1 × (1 if the age is within the customer age range, else 0)} + {d2 × (1 if the gender is included in the customer gender, else 0)} + {d3 × (1 if the current time plus the travel time is within business hours, else 0)} + {d4 × (1 if the travel time is within the threshold time, else 0)}   (7)
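A minimal Python sketch of equations (5) to (7); the coefficient values in the example call are invented for illustration and carry no meaning from the patent:

```python
def presentation_value_A(gamma1, gamma2, c, d, genre_correlation,
                         within_budget, age_ok, gender_ok,
                         open_on_arrival, travel_time_ok):
    """Equations (5)-(7): index 3 scores store preference, index 4 scores
    store generality; the boolean flags stand for the 0/1 index values."""
    index3 = c[0] * genre_correlation + c[1] * int(within_budget)          # eq. (6)
    index4 = (d[0] * int(age_ok) + d[1] * int(gender_ok)
              + d[2] * int(open_on_arrival) + d[3] * int(travel_time_ok))  # eq. (7)
    return gamma1 * index3 + gamma2 * index4                               # eq. (5)

# Illustrative coefficients for one passenger:
A = presentation_value_A(0.7, 0.3, (0.8, 0.2), (0.25, 0.25, 0.25, 0.25),
                         genre_correlation=0.9, within_budget=True,
                         age_ok=True, gender_ok=True,
                         open_on_arrival=True, travel_time_ok=False)
```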
Following the above procedure, the device control unit 17 calculates the presentation determination value A for each destination that matches the target genre of the presentation candidate 106, and decides which stores to present to the user. The maximum number of stores to present can be set arbitrarily.
From the presentation destination information 110 determined by the device control unit 17, the display device control unit 18 controls the display device unit 19 so as to present, in descending order of each destination's presentation determination value A, the images set in the presentation destination information 110 and audio explaining why each destination is presented. For example, a store becomes a presentation target if its presentation determination value A is equal to or greater than a predetermined threshold.
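The ranking and threshold filtering step might look like the following sketch; the store names, scores, and threshold are illustrative placeholders:

```python
def stores_to_present(candidates, threshold, max_count):
    """Sort candidate stores by presentation determination value A in
    descending order, keep those meeting the threshold, and cap the count."""
    ranked = sorted(candidates, key=lambda s: s["A"], reverse=True)
    return [s["name"] for s in ranked if s["A"] >= threshold][:max_count]

candidates = [{"name": "soba shop", "A": 0.91},
              {"name": "coffee shop", "A": 0.42},
              {"name": "curry house", "A": 0.77}]
```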
After this, the user decides on (or cancels) a destination from the presented destinations by key input or voice input, and route guidance is started (or not carried out). When a new destination is decided, the navigation device 10 ends any route guidance already in progress to the previous destination.
As described above, according to the first embodiment, when an index value representing the user state information 102, calculated from the biological information 101 of a user aboard the vehicle, is compared with a predetermined threshold and a change in the user's current state is detected, destination candidates and a route associated in advance with that state are presented to the user. With this configuration, the navigation device 10 actively proposes destination candidates and route changes without requiring user operation, eliminating operational effort and enabling guidance to destinations that match the user's preferences.
In addition, because both the user preference information 100 and the biological information 101 are used, destination candidates matching the user's preferences can be presented, and routes changed, at a presentation timing appropriate to the user's current state.
In the first embodiment above, the destination determination and route change are based on the user preference information 100 and the biological information 101, but the invention is not limited to this configuration.
For example, the preference information acquisition unit 11 may be omitted, with the device control unit 17 determining the presentation timing based on the biological information 101 sequentially detected from the user while driving. At this presentation timing, the device control unit 17 determines the presentation destination information 110 from the current time in the time information 104, the destination content analysis result 107, and the route search result 109, and controls the display device control unit 18 to present it to the user. Even with this configuration, the destination and route can be changed at a presentation timing appropriate to the user's current state.
Embodiment 2.
FIG. 5 schematically shows the configuration of a car navigation system using a navigation device according to Embodiment 2 of the present invention. In FIG. 5, the car navigation system of Embodiment 2 includes an out-of-vehicle server 70 and another navigation device 40 in addition to the configuration shown in Embodiment 1. The navigation device 10, the out-of-vehicle server 70, and the other navigation device 40 are connected by an out-of-vehicle network 91 and can communicate with one another.
The out-of-vehicle server 70 is a server device that manages destination additional information 111. The destination additional information 111 is, for example, information indicating in-store images related to the destination information 103 and promotions the store is currently running. The out-of-vehicle network 91 is a single wireless communication line that connects the navigation device 10, the out-of-vehicle server 70, and the other navigation device 40.
The other navigation device 40 is a navigation device mounted in a vehicle different from the navigation device 10 and managed by another user. The other navigation device 40 notifies the navigation device 10, within its wireless communication area via the out-of-vehicle network 91, of other-user evaluation information 112 related to the destination information 103. The other-user evaluation information 112 is, for example, another user's evaluation, such as a satisfaction rating for a store in the destination information 103.
FIG. 6 is a block diagram showing the configuration of the navigation device according to Embodiment 2. In FIG. 6, the navigation device 10 of Embodiment 2 includes a communication control unit 20 in addition to the configuration described with reference to FIG. 2 in Embodiment 1. The communication control unit 20 is a component that communicates with the out-of-vehicle server 70 and the other navigation device 40 via the out-of-vehicle network 91 and exchanges information with them.
Next, the operation is described.
The main operations of each part of the car navigation system are described with reference to FIGS. 5 and 6.
As in Embodiment 1, the example taken up is the case where a route including a restaurant is presented as the presentation destination information 110 at the timing when the driver or a passenger is determined, based on the biological information 101, to be hungry or fatigued.
When the driver or a passenger is hungry or fatigued, the communication control unit 20 acquires, by wireless communication from the out-of-vehicle server 70, destination additional information 111 such as in-store images related to the destination information 103 and promotions the store is currently running.
The communication control unit 20 likewise acquires the other-user evaluation information 112, such as satisfaction ratings for stores in the destination information 103, from the other navigation device 40.
At this time, the other navigation device 40 to be communicated with must be able to communicate with the navigation device 10 (be within the communication area) via the out-of-vehicle network 91.
The target other user must manage the other navigation device 40 described above, and the user of the navigation device 10 must have permitted communication with that user. In other words, the target other user is a user whose preferences are highly correlated with the user preference content analysis result 108.
The destination additional information 111 and the other-user evaluation information 112 acquired by the communication control unit 20 are stored in a memory (not shown) in the device control unit 17.
Once the device control unit 17 has decided which stores to present to the user, the communication control unit 20 acquires the destination additional information 111 for the decided stores from the out-of-vehicle server 70. This destination additional information 111 is added to the images, audio, and other content presented on the display device unit 19.
The out-of-vehicle server 70 creates and manages the destination additional information 111 from information provided externally. The content of the information that can be provided to the out-of-vehicle server 70 may be set, for example, according to the advertising fee paid to the administrator of the server.
The other-user evaluation information 112 is acquired by the communication control unit 20 from the other navigation device 40, triggered by the device control unit 17 deciding which stores to present, as the evaluation information of other users who share the same age group (age difference within a threshold), gender, and favorite food genres as the user preference content analysis result 108. This other-user evaluation information 112 is used by the device control unit 17 in calculating the presentation determination value.
In this way, in the navigation device 10 according to Embodiment 2, the information presented with the destination (candidate) and route change includes information provided for that destination (candidate) that was acquired from the out-of-vehicle server 70 by communication outside the vehicle. In addition, when presenting the destination (candidate) and route change, evaluations of that destination by other users with the same preferences, acquired from the other navigation device 40 by communication outside the vehicle, are used. Note that the information 111 and 112 acquired from the out-of-vehicle server 70 and the other navigation device 40 may be provided on a fee basis, with the presentable content varying according to the amount paid.
Next, the case where the presentation destination information 110 has a restaurant as the destination is described.
For example, a restaurant suited to the user's preferences is determined from the current time; the cuisine genre, customer age range, gender, business hours, and average price specified in the destination content analysis result 107; the age, gender, favorite food genres, and eating-out budget set in the user preference content analysis result 108; the travel time to the destination in the route search result 109; and the store satisfaction ratings in the other-user evaluation information 112.
Specifically, the presentation determination value B is calculated using the following equations (8) to (10).
Here, index 5 is an index value indicating how well the store matches the user's preferences, and index 6 is an index value indicating the general suitability of the store.
δ1 is a coefficient for the index value indicating store preference, and δ2 is a coefficient for the index value indicating store generality.
e1 is a coefficient for the degree of correlation between the cuisine genre and the user's favorite food genres.
e2 is a coefficient for an index value that is “1” when the user's eating-out budget covers the average price, and “0” otherwise.
f1 is a coefficient for an index value that is “1” when the user's age falls within the store's customer age range, and “0” otherwise.
f2 is a coefficient for an index value that is “1” when the user's gender is included in the store's customer gender, and “0” otherwise.
f3 is a coefficient for an index value that is “1” when the sum of the current time and the travel time falls within business hours, and “0” otherwise.
f4 is a coefficient for an index value that is “1” when the travel time is within a threshold time, and “0” otherwise.
Note that δ1 and δ2 each take different values in proportion to the satisfaction rating for the store; they are coefficients proportional to the store-satisfaction index value indicated in the other-user evaluation information 112.
e1, e2, and f1 to f4 are coefficients that differ for the driver and each passenger.
Presentation determination value B = (δ1 × index 5) + (δ2 × index 6)   (8)
Index 5 = {e1 × (degree of correlation between the cuisine genre and the favorite food genres)} + {e2 × (1 if the eating-out budget covers the average price, else 0)}   (9)
Index 6 = {f1 × (1 if the age is within the customer age range, else 0)} + {f2 × (1 if the gender is included in the customer gender, else 0)} + {f3 × (1 if the current time plus the travel time is within business hours, else 0)} + {f4 × (1 if the travel time is within the threshold time, else 0)}   (10)
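Equations (8) to (10) differ from (5) to (7) mainly in that δ1 and δ2 scale with the other users' satisfaction rating. The sketch below assumes a simple linear scaling, which is one possible reading of "proportional"; all numeric values are invented for illustration:

```python
def presentation_value_B(delta1_base, delta2_base, satisfaction, e, f,
                         genre_correlation, within_budget, age_ok,
                         gender_ok, open_on_arrival, travel_time_ok):
    """Equations (8)-(10): like value A, but the weights are scaled by the
    store-satisfaction index (0..1) from other-user evaluation info 112."""
    delta1 = delta1_base * satisfaction
    delta2 = delta2_base * satisfaction
    index5 = e[0] * genre_correlation + e[1] * int(within_budget)          # eq. (9)
    index6 = (f[0] * int(age_ok) + f[1] * int(gender_ok)
              + f[2] * int(open_on_arrival) + f[3] * int(travel_time_ok))  # eq. (10)
    return delta1 * index5 + delta2 * index6                               # eq. (8)

# A store well rated by similar users outranks an otherwise identical store:
args = dict(e=(0.8, 0.2), f=(0.25, 0.25, 0.25, 0.25), genre_correlation=0.9,
            within_budget=True, age_ok=True, gender_ok=True,
            open_on_arrival=True, travel_time_ok=False)
b_high = presentation_value_B(0.7, 0.3, satisfaction=1.0, **args)
b_low = presentation_value_B(0.7, 0.3, satisfaction=0.4, **args)
```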
Following the above procedure, the device control unit 17 calculates the presentation determination value B for each destination that matches the target genre of the presentation candidate 106, and decides which stores to present to the user. From the presentation destination information 110 determined by the device control unit 17, the display device control unit 18 controls the display device unit 19 so as to present, in descending order of each destination's presentation determination value B, the images set in the presentation destination information 110 and audio explaining why each destination is presented.
When the route guidance to the destination has ended and the user's business at that destination is complete, the user inputs user evaluation information 113, such as a graded satisfaction rating for the destination, to the destination information acquisition unit 14 by key input or voice input. This user evaluation information 113 is stored from the destination information acquisition unit 14 into the destination information storage unit 32 of the database 30 via the in-vehicle network 90. This user evaluation information 113 is then used by the other navigation device 40.
For example, if the other navigation device 40 is a device with the same configuration and functions as the navigation device 10, it can use the user evaluation information 113 acquired from the navigation device 10 as its own other-user evaluation information 112.
As described above, according to Embodiment 2, the navigation device includes the communication control unit 20, which communicates with the out-of-vehicle server 70 managing the destination additional information 111 and with the other navigation device 40 managed by another user who transmits the other-user evaluation information 112 for destinations. When the device control unit 17 determines that a change has occurred in the user state information 102, it selects, from the destination candidates and routes associated in advance with that state, those that match the preferences indicated by the user's preference information and that are chosen according to the other-user evaluation information 112 received by the communication control unit 20, and presents them to the user of the host vehicle together with the destination additional information 111 for those candidates received by the communication control unit 20 from the out-of-vehicle server 70. With this configuration, the other-user evaluation information 112, which is the evaluation of destinations by other users with the same preferences, can be used in deciding which stores to present, so destinations that better match the user's preferences can be presented.
The navigation device according to the present invention can present destination candidates and routes desired by the user, at a presentation timing appropriate to the user's current state, based on the biological information and preferences of the vehicle occupants, and is therefore useful as a highly convenient navigation device.

Claims (3)

1.  A navigation device that is mounted on a mobile body and guides a route to a destination, comprising:
    a biological information detection unit that detects biological information of a user aboard the mobile body;
    an index value calculation unit that calculates an index value indicating a state of the user from the biological information detected by the biological information detection unit; and
    a control unit that, upon comparing the index value calculated by the index value calculation unit with a predetermined threshold and determining that a change has occurred in the state of the user, presents to the user a destination candidate and a route associated in advance with that state.
2.  The navigation device according to claim 1, further comprising a preference information acquisition unit that accepts input of preference information indicating the user's preferences, wherein
    the index value calculation unit calculates a second index value indicating the state of the user from the biological information detected by the biological information detection unit and the preference information acquired by the preference information acquisition unit, and
    the control unit, upon comparing the second index value calculated by the index value calculation unit with a predetermined threshold and determining that a change has occurred in the state of the user, selects, from the destination candidates and routes associated in advance with that state, a destination candidate and a route that match the preferences indicated by the user's preference information, and presents them to the user.
3.  The navigation device according to claim 2, further comprising a communication processing unit that communicates with a server device managing information about destinations and with a navigation device managed by another user that transmits evaluation information for the destinations, wherein
    the control unit, upon determining that a change has occurred in the state of the user, selects, from the destination candidates and routes associated in advance with that state, a destination candidate and a route that match the preferences indicated by the user's preference information and that are selected according to the evaluation information received by the communication processing unit, and presents them to the user together with the information about that destination candidate received from the server device by the communication processing unit.
PCT/JP2009/003299 2009-07-14 2009-07-14 Navigation device WO2011007386A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2009/003299 WO2011007386A1 (en) 2009-07-14 2009-07-14 Navigation device


Publications (1)

Publication Number Publication Date
WO2011007386A1 true WO2011007386A1 (en) 2011-01-20

Family

ID=43449010

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2009/003299 WO2011007386A1 (en) 2009-07-14 2009-07-14 Navigation device

Country Status (1)

Country Link
WO (1) WO2011007386A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005293587A (en) * 1997-07-22 2005-10-20 Equos Research Co Ltd Agent device
JP2005345325A (en) * 2004-06-04 2005-12-15 Kenwood Corp Car navigation system, car navigation method, and program
JP2007212421A (en) * 2006-02-13 2007-08-23 Denso Corp Entertainment information providing system for automobile
JP2009042891A (en) * 2007-08-07 2009-02-26 Denso Corp Facility retrieval device

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013165310A (en) * 2012-02-09 2013-08-22 Nikon Corp Electronic apparatus
JP2015212125A (en) * 2014-05-07 2015-11-26 株式会社小糸製作所 Vehicle interior light device
EP3246870A4 (en) * 2015-01-14 2018-07-11 Sony Corporation Navigation system, client terminal device, control method, and storage medium
US10408629B2 (en) 2015-01-14 2019-09-10 Sony Corporation Navigation system, client terminal device, control method, and storage medium
DE102018100373A1 (en) 2018-01-09 2019-07-11 Motherson Innovations Company Limited Method for vertical keystone correction in projection systems for head-up displays
DE102018100373A9 (en) 2018-01-09 2020-01-02 Motherson Innovations Company Limited Method for vertical keystone correction in projection systems for head-up displays
JP2020165694A (en) * 2019-03-28 2020-10-08 本田技研工業株式会社 Controller, method for control, and program
CN111762147A (en) * 2019-03-28 2020-10-13 本田技研工业株式会社 Control device, control method, and storage medium storing program
JP7190952B2 (en) 2019-03-28 2022-12-16 本田技研工業株式会社 Control device, control method and program

Similar Documents

Publication Publication Date Title
US10302444B2 (en) Information processing system and control method
US8655740B2 (en) Information providing apparatus and system
US20180172464A1 (en) In-vehicle device and route information presentation system
US20090318777A1 (en) Apparatus for providing information for vehicle
WO2011007386A1 (en) Navigation device
JP2007122579A (en) Vehicle controller
JP4807625B2 (en) Information provision device
JP2010008268A (en) Travel supporting system
JP2012112853A (en) Information processor, in-vehicle navigation device and information processing method
JP2014052518A (en) Advertisement distribution system and advertisement distribution method
WO2014203463A1 (en) Information processing system, method and non-transitory computer-readable medium
JP4604597B2 (en) State estimating device, state estimating method, information providing device using the same, information providing method
JP6552548B2 (en) Point proposing device and point proposing method
JP2022001870A (en) Route processing program, route processing apparatus, and route processing method
CN111189463A (en) Information processing apparatus and information processing program
WO2007135855A1 (en) Information presentation device, information presentation method, information presentation program, and computer readable recording medium
CN113496193A (en) Recommendation guidance system, recommendation guidance method, and storage medium
CN114119293A (en) Information processing device, information processing system, program, and vehicle
JP2014203357A (en) Information presentation system
CN110285824B (en) Information providing apparatus and control method thereof
US20220306124A1 (en) Information providing apparatus
US20160052523A1 (en) Apparatus and method of use for an alcohol test unit
JP2016114427A (en) Information presentation device and information presentation method
JP2020091777A (en) Information processing system
JP7159987B2 (en) INFORMATION DECISION DEVICE AND INFORMATION DECISION METHOD

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 09847286

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 09847286

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP