AU2020103034A4 - A System for Displaying Turn-By-Turn Navigation Details from a Mobile Computing Device on a Connected Computing Device Independent of Vendor Support - Google Patents
- Publication number
- AU2020103034A4
- Authority
- AU
- Australia
- Prior art keywords
- computing device
- turn
- navigation
- notification
- details
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3661—Guidance output on an external device, e.g. car radio
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/365—Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3688—Systems comprising multiple parts or multiple output devices (not client-server), e.g. detachable faceplates, key fobs or multiple output screens
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Navigation (AREA)
Abstract
The present invention allows to render turn-by-turn navigation details from a mobile
computing device on a connected computing device. In contrast to prior art, this
system may allow this independently of support for a specific computing device from
the navigation software vendor. It achieves this through analysing and processing the
notifications of the navigation application on the sender device. Consequently, it may
support a plurality of computing devices and navigation software products.
DRAWINGS
[Sheet 1/10 — Figure 1: high-level diagram with sender device 100, system 110 and receiver device 130; the display shows "350 m" and "Pine St".]
Description
[0001] Mobile computing devices like smartphones may offer maps and navigation applications. Additionally, another computing device like a smartwatch, a vehicle computer or a head-mounted display may be connected to the smartphone. When turn-by-turn navigation is started on the smartphone, it depends on the navigation software vendor to offer capabilities so that the navigation details may be displayed on a specific smartwatch, vehicle computer or head-mounted display model. Therefore, it is desirable to have a system that allows navigation information to be shown on a smartwatch, vehicle computer or head-mounted display independently of support by the navigation software vendor.
[0002] Navigation applications on smartphones are widely used. Based on the Global Positioning System, internet access and a built-in screen, speaker and vibration, they enable the user to navigate to a destination.
[0003] Devices connected to this smartphone (e.g. wearable personal computing devices or an in-vehicle computer system) may feature a screen, sensors, vibration and speaker. They may be connected to the smartphone to display incoming texts and track body sensors.
[0004] Prior art systems are traditionally applications that already feature integration with a specific smartwatch, vehicle computer or head-mounted display model. The navigation software vendors only support specific smartwatches, vehicle computers and head-mounted displays to work with specific smartphones. For example, smartphone navigation software by company A may only support smartwatches by company B to render navigation information from a running turn-by-turn navigation on the phone. A customer with a smartwatch by company C cannot use the navigation software by company A with their watch. Consequently, there exists a plurality of device and navigation software combinations which cannot be used in conjunction.
[0005] It is, therefore, an object of the present invention to overcome the deficiencies of the prior art and to provide an improved system capable of enabling a connected computing device to work with a plurality of navigation software products and mobile computing devices independently of support by the navigation software vendor. Such a system should be easily and accurately reproducible.
[0006] The foregoing needs are met by the present invention. In one embodiment of the present invention, a system is provided which may render navigation details from a navigation application on a mobile computing device (also referred to herein as "sender") on a connected computing device (also referred to herein as "receiver") without explicit support through application programming interfaces ("APIs") by the navigation software vendor.
[0007] There have thus been outlined certain embodiments of the invention in order that the detailed description of the invention herein may be better understood. There are, of course, additional embodiments of the invention that will be described below and which will form the subject matter of the claims appended hereto.
[0008] The presented system consists of two components. Firstly, the system 110 on the sender device, which processes notifications of a turn-by-turn navigation application running on the sender device through utilisation of a notification listener service. Secondly, a companion system 580 on the connected receiver device, which receives navigation data from the system 110 and may render it on the device, e.g. by, but not limited to, displaying it on a screen, playing it through a speaker or executing vibrations.
[0009] The present invention also is capable of other and different embodiments, and its several details may be modified in various respects. Accordingly, the drawings and descriptions are to be regarded as illustrative in nature, and not as restrictive.
[0010] The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate exemplary embodiments of the invention, and, together with the general description given above and the detailed description given below, serve to explain the features of the invention.
FIG. 1 is a high-level diagram of the position of the system in between a sender device, a network and a receiver device.
FIG. 2 illustrates an example of a turn-by-turn navigation screen on a mobile computing device acting as a sender device.
FIG. 3 is an example of the notification list on a computing device acting as a sender system while a turn-by-turn navigation is active.
FIG. 4 illustrates three different styles of navigation notifications on a computing device acting as a sender device.
FIG. 5 is a flowchart illustrating a method of communicating between the navigation application on a sender system and a wearable personal computing device.
FIG. 6 is a flowchart showing how the data class of an incoming sender device application notification is transformed into a data class containing navigation details.
FIG. 7 is a flowchart illustrating the order of events between the devices and the system from the start until the end of a turn-by-turn navigation on the sender device.
FIG. 8 illustrates an example of the preferred embodiment of the system featuring a smartphone and a wearable personal computing device.
FIG. 9 illustrates an example of the preferred embodiment of the system featuring a smartphone and an in-vehicle computer system.
FIG. 10 illustrates an example of the preferred embodiment of the system featuring a smartphone and an optical head-mounted display.
[0011] The invention will now be described with reference to the drawing figures, in which like reference numerals refer to like parts throughout. The following detailed description is of example embodiments of the presently claimed invention with references to the accompanying drawings. Such description is intended to be illustrative and not limiting with respect to the scope of the present invention. Such embodiments are described in sufficient detail to enable one of ordinary skill in the art to practice the subject invention, and it will be understood that other embodiments may be practiced with some variations without departing from the spirit or scope of the subject invention.
[0012] Figure (FIG.) 1 is a high-level diagram of the system 110 which communicates between a sender device (e.g., a smartphone) 100 and a receiver device (e.g., a smartwatch, an in-vehicle computer system or an optical head-mounted display) 130 over a network 120, in accordance with some embodiments. The functions performed by the various entities of FIG. 1 may vary in different embodiments. The system 110 coordinates communication between the sender device 100 and the receiver device 130. When a turn-by-turn navigation is started on the sender device 100, the system 110 receives and analyses the application's notification to extract the substantial navigation data. This data is passed through the network 120 and displayed on the receiver device 130.
[0013] Illustrated in FIG. 2 is a turn-by-turn navigation screen 200 on a computing device acting as a sender system 100. The graphical user interface 200 may contain navigation instruction details 210 such as, but not restricted to, the distance to the next turn 220, the name of the upcoming street 230, instructions (e.g., turn left, turn right, continue) as well as a graphical representation of the next step 240 (e.g., left arrow, right arrow, straight arrow). The layout and functions performed by the various display elements of FIG. 2 may vary in different embodiments. This information cannot be extracted without an application programming interface provided by the navigation software vendor.
[0014] FIG. 3 is an example of the notification list on a computing device acting as a sender system 100. It may be displayed by dragging the status bar downward. The notification list may show a plurality of messages that the device has received and may contain, but is not restricted to, mails 310, messages 320, news and application details. When a turn-by-turn navigation is active on the device, it may display a notification 300 with navigation details. Those may, but are not restricted to, contain a title 330 (here "Navigation"), a distance 340 (here "350 m"), a next street 350 (here "Pine St") and a graphical representation of an arrow 360 (here pointing right). In the example in FIG. 3, there are two messages 310 and 320 and one ongoing navigation 300. This information may be extracted and used subsequently through the embodied system. The content of the notifications of FIG. 3 may vary in different embodiments.
[0015] FIG. 4 illustrates, but is not limited to, three different styles of navigation notifications on a computing device acting as a sender system. The content of the three notifications of FIG. 4 may vary in different embodiments. The layout, colours, amount of details, font style, size, type may all depend on the operating system, the language, the underlying navigation software and version. In the first variant 300 the notification shows a title 330 (here "Navigation"), a distance 340 (here "350 m"), a next street 350 (here "Pine St") and a graphical representation of an arrow 360 (here pointing right). In the second variant 400 the notification shows a distance 340 (here "350 m"), a next street 350 (here "Pine St"), the time remaining 420 (here "4 min"), the expected arrival time 430 (here "ETA 9:51") and a graphical representation of an arrow 460 (here pointing right). In the third variant 410 the notification shows a title 330 (here "Navigation"), a distance 340 (here "350 m"), a next street 350 (here "Pine St"), an instruction 440 (here "Turn Right") and a graphical representation of an arrow 460 (here pointing right).
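Because the notification layout varies across the styles described above, a parser must try several candidate patterns in order of specificity. The patent does not disclose concrete patterns; the following Python sketch is purely illustrative, with the dash-separated text layout, the field names and the regular expressions all being assumptions:

```python
import re

# Candidate patterns for the notification text, most specific first; the
# real patterns would depend on OS version, navigation app and locale.
PATTERNS = [
    # Variant 400: "350 m - Pine St - 4 min - ETA 9:51"
    re.compile(r"(?P<distance>\d+\s?(?:m|km))\s*-\s*(?P<street>[^-]+?)\s*-\s*"
               r"(?P<remaining>\d+\s?min)\s*-\s*ETA\s*(?P<eta>[\d:]+)"),
    # Variant 410: "350 m - Pine St - Turn Right"
    re.compile(r"(?P<distance>\d+\s?(?:m|km))\s*-\s*(?P<street>[^-]+?)\s*-\s*"
               r"(?P<instruction>.+)"),
    # Variant 300: "350 m - Pine St"
    re.compile(r"(?P<distance>\d+\s?(?:m|km))\s*-\s*(?P<street>.+)"),
]

def parse_notification_text(text):
    """Try each known layout in turn; return the matched fields or None."""
    for pattern in PATTERNS:
        match = pattern.fullmatch(text.strip())
        if match:
            return match.groupdict()
    return None
```

A real implementation would maintain per-application, per-locale pattern tables and fall back gracefully when no pattern matches.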
[0016] FIG. 5 is a flowchart illustrating a method of communicating between the navigation application 500 on a sender device 100 and a connected computing device 130 that acts as receiver. A notification listener service 530 may receive an event containing the details of a notification when it is posted or removed by an application. This notification listener service 530 implementation is device specific and may be, but is not limited to, a "Notification Listener Service" on Android, an "Apple Notification Center Service" on Apple platforms, or various other implementations in different embodiments. Furthermore, it may also receive notification data through a "notification posted" event 550 when it was posted 510 by a turn-by-turn navigation application 500. When the navigation application 500 removes the navigation notification 520, the notification listener service 530 may also receive a corresponding event 540. The notification data comprises details such as, but not limited to, the distance to the next turn 340, the name of the upcoming street 350, the estimated arrival time 430, as well as a graphical representation of the next step 360 (e.g., left arrow, right arrow, straight arrow). The data may also include data indicating an instruction (e.g. "turn right"), the distance to destination or the expected arrival time 430. The system may extract these details through a parser 560 using application specific methods. See FIG. 6 for more details on the parser. The resulting data is put into a message 570 and forwarded over a network 120. The network 120 may be, but is not limited to, a wireless, Wi-Fi, Internet, Bluetooth or cable connection. Through the network, the receiver device 130 receives the message and hands it to a companion system 580. This companion system 580 is purpose-built to render the navigation instructions from the system 110 on the receiver device 130. Here it may be rendered 590, e.g. displayed on an integrated screen, played through a speaker or used to execute vibrations. The functions performed by the various entities of FIG. 5 may vary in different embodiments.
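The posted/removed event handling of FIG. 5 can be summarised as follows. This is a minimal sketch, not the actual Android `NotificationListenerService` or Apple Notification Center Service API; the class name, callback names, application identifier and message shapes are all illustrative assumptions:

```python
import json

# Assumed set of navigation application identifiers (640); in practice this
# would be configurable per platform and per installed navigation app.
NAVIGATION_APP_IDS = {"com.example.navigation"}

class NotificationForwarder:
    """Illustrative stand-in for the sender-side system built around a
    notification listener service 530."""

    def __init__(self, parser, send):
        self.parser = parser  # callable: notification dict -> details dict or None
        self.send = send      # callable: serialised message 570 -> network 120

    def on_notification_posted(self, notification):
        """Handle a "notification posted" event 550."""
        if notification.get("app_id") not in NAVIGATION_APP_IDS:
            return  # ignore mails, messages and other applications
        details = self.parser(notification)
        if details is not None:
            # Package the navigation details 620 into a message 570.
            self.send(json.dumps({"type": "navigation", "details": details}))

    def on_notification_removed(self, notification):
        """Handle a "notification removed" event 540."""
        if notification.get("app_id") in NAVIGATION_APP_IDS:
            self.send(json.dumps({"type": "close"}))
```

The key design point matching the specification is that filtering happens on the application identifier first, so unrelated notifications never reach the parser or the network.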
[0017] FIG. 6 is a flowchart containing the data class of an incoming sender device application notification 610, a parser 560 which transforms the data, and the data class of the navigation details 620. The notification class may contain a wide variety of methods and variables. The data classes of FIG. 6 may vary in different embodiments. FIG. 6 focuses on, but is not limited to, the time the notification was posted 630, the application identifier of the posting application 640, the textual title of the notification 650, its text 660, its sub text 670 and its icon 680, which is represented as graphical data. The parser checks if the application identifier 640 is from a navigation application. If so, it transforms textual data such as, but not limited to, the title of the notification 650, its text 660 and its sub text 670 to extract the distance 440, the next street 450 and the arrival time 480 through string operations, and stores the result in variables, in a navigation data class 620 or through other means. The functions performed by the parser may vary in different embodiments. The icon 680 may be extracted by the parser 560, may be edited and may subsequently be stored in a variable, in the graphic field 460 of a navigation data class 620 or through other means. The string operations of the parser 560 may depend on the operating system and its version, the navigation application and the language and character set of the device 100.
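The 610 → 620 transformation can be illustrated with two minimal data classes. The field names, the assumed text layout ("<distance> - <street>" in the text 660, "ETA …" in the sub text 670) and the application identifier convention are hypothetical, not taken from the specification or any real navigation application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AppNotification:
    """Abridged stand-in for the incoming notification class 610."""
    posted_at: float   # time posted 630
    app_id: str        # application identifier 640
    title: str         # textual title 650
    text: str          # text 660
    sub_text: str      # sub text 670

@dataclass
class NavigationDetails:
    """Stand-in for the navigation data class 620."""
    distance: str
    next_street: str
    eta: Optional[str] = None

def parse(notification: AppNotification) -> Optional[NavigationDetails]:
    """Transform 610 into 620 via string operations, as the parser 560 does."""
    if not notification.app_id.endswith(".navigation"):
        return None  # not posted by a navigation application
    distance, _, street = (p.strip() for p in notification.text.partition("-"))
    eta = notification.sub_text.removeprefix("ETA").strip() or None
    return NavigationDetails(distance=distance, next_street=street, eta=eta)
```

As the specification notes, the concrete string operations would have to be selected per operating system, navigation application and locale.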
[0018] FIG. 7 is a flowchart illustrating the order of events after starting a navigation 700 on the sender device 100 (a mobile computing device) and receiver device 130 (a connected computing device). In the first step, the user of the sender device 100 starts a turn-by-turn navigation 200. The navigation application posts a notification 300 and the system gets notified by the notification listener service 530 that a notification was posted 550. After checking that the application identifier 640 indicates that a navigation application posted the notification, the system can be sure it detected a newly started navigation 710. As a result, the system 110 may send an activation signal 720 over the network 120 to the companion system 580 on the receiver device 130. In this case, the companion system 580 launches 730 on the receiver device 130. This is an optional step; alternatively, a user may start the companion system 580 manually. Next, after a notification was posted 550, a parser 560 may analyse this instance of the notification class 610, then generate a message 570 with the navigation details 620. This message may be sent 740 over the network 120 and may be received 750 on the receiver device 130 by the companion system. The companion system may now render the content 590 of the message 570, i.e. the received navigation details 620, on the receiver device 130. This may be executed by, but is not limited to, rendering the details on a screen. Playing the details through audio or conveying information through vibration may be further possibilities. After the turn-by-turn navigation 200 exits, e.g. after arriving at the destination or being closed by the user, the system 110 receives a "notification removed" event 540 and may send a close signal 760 over the network 120 to the companion system 580 on the receiver device 130. In this case, the companion system 580 closes 770 on the receiver device 130. This is an optional step; alternatively, a user may close the companion system manually. The functions performed by the various entities of FIG. 7 may vary in different embodiments.
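On the receiver side, the FIG. 7 event order (activation signal 720, navigation messages 740, close signal 760) amounts to a small state machine. A hedged Python sketch follows; the message shapes and attribute names are assumptions made for illustration only:

```python
class CompanionLifecycle:
    """Illustrative receiver-side state machine for the companion system 580."""

    def __init__(self):
        self.running = False      # whether the companion system is launched 730
        self.last_details = None  # most recently rendered navigation details 620

    def handle(self, message):
        """Process one message received over the network 120."""
        if message["type"] == "activate":
            # Activation signal 720: launch 730 the companion system.
            self.running = True
        elif message["type"] == "navigation" and self.running:
            # Navigation message 740: render 590 the contained details 620.
            self.last_details = message["details"]
        elif message["type"] == "close":
            # Close signal 760: close 770 the companion system.
            self.running = False
            self.last_details = None
```

Because launch and close are optional in the specification, a real companion system would also accept navigation messages after being started manually by the user.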
[0019] FIG. 8 illustrates, but is not limited to, a first of three example embodiments of the system. The layout, colours, amount of details, font style, size and type may all depend on the device, the operating system, the underlying navigation software and its version. On the left, the sender device 100 is executing a turn-by-turn navigation 200. The system 110 is active on the sender device 100. It parses the notification data, as outlined earlier, and sends the navigation details over the network 120. In this embodiment, the receiver device 130 is a wearable personal computing device (e.g. a smartwatch). The companion system 580 is active on the receiver device. It receives the message from the system 110 and may use a graphical user interface to display it on the built-in screen 800, play it through a speaker or use it to execute vibrations. In this particular example, the distance 820 (here "350 m"), a next street 830 (here "Pine St") and a graphical representation of an arrow 810 (here pointing right) are being displayed. The graphical user interface may contain a plurality of details such as, but not restricted to, the distance to the next turn, the name of the upcoming street, instructions (e.g., turn left, turn right, continue) as well as a graphical representation of the next step (e.g., left arrow, right arrow, straight arrow), the distance to destination, the name of the destination, the expected arrival time, the remaining time, a map or any combination of these. The layout, content and functionality of the smartphone 100 and the wearable personal computing device 130 may vary in different embodiments.
[0020] FIG. 9 illustrates, but is not limited to, a second example embodiment of the system. The layout, colours, amount of details, font style, size and type may all depend on the device, the operating system, the underlying navigation software and its version. The functionality of the sender device 100, the navigation 200, the system 110 and the network 120 is the same as in FIG. 8. In this embodiment, the receiver device 130 is an in-vehicle computer system (e.g. in a car, motorbike or truck). The companion system 580 is active on the receiver device. It receives the message from the system 110 and uses a graphical user interface to display it on the built-in screen 900, play it through a speaker or use it to execute vibrations. In this particular example, the distance 920 (here "350 m"), a next street 930 (here "Pine St") and a graphical representation of an arrow 910 (here pointing right) are being displayed. The graphical user interface may contain a plurality of details such as, but not restricted to, the distance to the next turn, the name of the upcoming street, instructions (e.g., turn left, turn right, continue) as well as a graphical representation of the next step (e.g., left arrow, right arrow, straight arrow), the distance to destination, the name of the destination, the expected arrival time, the remaining time, a map or any combination of these. The layout, content and functionality of the smartphone 100 and the in-vehicle computer system 130 may vary in different embodiments.
[0021] FIG. 10 illustrates, but is not limited to, a third example embodiment of the system. The layout, colours, amount of details, font style, size and type may all depend on the device, the operating system, the underlying navigation software and its version. The functionality of the sender device 100, the navigation 200, the system 110 and the network 120 is the same as in FIG. 8. In this embodiment, the receiver device 130 is an optical head-mounted display (e.g. augmented or virtual reality glasses). The companion system 580 is active on the receiver device. It receives the message from the system 110 and uses a graphical user interface to display it on the built-in screen 1000, play it through a speaker or use it to execute vibrations. In this particular example, the distance 1020 (here "350 m"), a next street 1030 (here "Pine St") and a graphical representation of an arrow 1010 (here pointing right) are being displayed. The graphical user interface may contain a plurality of details such as, but not restricted to, the distance to the next turn, the name of the upcoming street, instructions (e.g., turn left, turn right, continue) as well as a graphical representation of the next step (e.g., left arrow, right arrow, straight arrow), the distance to destination, the name of the destination, the expected arrival time, the remaining time, a map or any combination of these. The layout, content and functionality of the smartphone 100 and the optical head-mounted display 130 may vary in different embodiments.
[0022] Furthermore, it should be noted that all examples in the present disclosure, while illustrating many embodiments of the present invention, are provided as non-limiting examples and are, therefore, not to be taken as limiting the various aspects so illustrated.
[0023] While the present invention has been disclosed with references to certain embodiments, numerous modifications, alterations, and changes to the described embodiments are possible without departing from the spirit and scope of the present invention, as defined in the appended claims. Accordingly, it is intended that the present invention not be limited to the described embodiments, but that it has the full scope defined by the language of the following claims, and equivalents thereof.
Claims (5)
1. A computer-implemented system, comprising: a notification listener service to extract navigation details from an ongoing turn-by-turn navigation notification on a mobile computing device acting as sender; and sending these navigation details over a network to a connected computing device acting as receiver which may render these navigation details.
2. The data processing system of claim 1, wherein the connected computing device has a built-in screen.
3. The data processing system of claim 1, wherein the connected computing device is a wearable personal computing device.
4. The data processing system of claim 1, wherein the connected computing device is an in-vehicle computer system.
5. The data processing system of claim 1, wherein the connected computing device is an optical head-mounted display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2020103034A AU2020103034A4 (en) | 2020-10-27 | 2020-10-27 | A System for Displaying Turn-By-Turn Navigation Details from a Mobile Computing Device on a Connected Computing Device Independent of Vendor Support |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2020103034A AU2020103034A4 (en) | 2020-10-27 | 2020-10-27 | A System for Displaying Turn-By-Turn Navigation Details from a Mobile Computing Device on a Connected Computing Device Independent of Vendor Support |
Publications (1)
Publication Number | Publication Date |
---|---|
AU2020103034A4 true AU2020103034A4 (en) | 2020-12-24 |
Family
ID=73838780
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
AU2020103034A Active AU2020103034A4 (en) | 2020-10-27 | 2020-10-27 | A System for Displaying Turn-By-Turn Navigation Details from a Mobile Computing Device on a Connected Computing Device Independent of Vendor Support |
Country Status (1)
Country | Link |
---|---|
AU (1) | AU2020103034A4 (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FGI | Letters patent sealed or granted (innovation patent) |