KR101708676B1 - Driver assistance apparatus and control method for the same - Google Patents
- Publication number
- KR101708676B1 (application KR1020150067500A)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- user
- processor
- voice
- information
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
- B60R21/0134—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B60W2050/08—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention relates to a driver assistance apparatus and a control method thereof. A driver assistance apparatus according to an embodiment of the present invention includes a display unit; a first camera for generating a traveling image of the area ahead of the vehicle; a second camera for generating an indoor image of the vehicle; an audio input unit for receiving a user's voice; and a processor that, when the vehicle enters the driver assist mode, selects at least one object appearing in the traveling image based on the indoor image and the user's voice, and controls the display unit to display information about the selected object.
Description
BACKGROUND OF THE INVENTION
A vehicle is a device that travels on a road or track by driving wheels for the purpose of transporting people or cargo. For example, two-wheeled vehicles such as motorcycles, four-wheeled vehicles such as sedans, and even trains are vehicles.
In recent years, various types of displays have been mounted on vehicles owing to the remarkable development of display technology. Currently, the TFT-LCD (Thin Film Transistor Liquid Crystal Display) is mainly used as a vehicle display, and new types of displays that improve safety and driving convenience, such as the Head Up Display (HUD), have also become commercially available.
Meanwhile, the display mounted on a vehicle shows various kinds of driving-related information such as vehicle speed, fuel efficiency, and remaining fuel, but its usefulness is limited because the information stays within a predetermined, limited category.
In addition, users often wish to be provided with information about other vehicles or facilities while driving, but it is cumbersome for the user to divert his or her eyes from the road or go through complicated input procedures several times.
Therefore, there is a need for a new technique that allows a user to select an object based on the user's motion and voice, provides information related to the selected object in an intuitive form, and reflects the resulting control in the vehicle.
An object of the present invention is to provide a driver assistance device, and a control method thereof, that can provide a user with information about various objects existing in the vicinity of a vehicle on the basis of at least one of a traveling image, an indoor image, and a user's voice.
Further, the present invention aims to provide a driver assistance device, and a control method thereof, that controls an operation of the vehicle (for example, deceleration, acceleration, steering, or changing of the driving mode) based on the information about the various objects existing around the vehicle.
The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.
According to an aspect of the present invention, there is provided a driver assistance device comprising: a display unit; a first camera for generating a traveling image of the area ahead of the vehicle; a second camera for generating an indoor image of the vehicle; an audio input unit for receiving a user's voice; and a processor that, when the vehicle enters the driver assist mode, selects at least one object appearing in the traveling image based on the indoor image and the user's voice, and controls the display unit to display information about the selected object. The details of other embodiments are included in the detailed description and drawings.
Effects of the driver assistance apparatus and the control method according to the present invention are as follows.
According to at least one of the embodiments of the present invention, by providing the user with information about the various objects existing in the vicinity of the vehicle based on at least one of the traveling image, the indoor image, and the user's voice, the user can selectively receive the information he or she desires.
Further, according to at least one embodiment of the present invention, by setting the area in which to detect an object on the basis of the user's motion, the amount of computation is reduced compared with the case where objects are detected over the entire area around the vehicle.
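The computation saving mentioned above — running the detector only on the gaze-derived region rather than the full frame — can be sketched as follows. This is a hypothetical illustration: the image representation, the region tuple layout, and the `detector` callback are assumptions, not part of the patent.

```python
def detect_in_region(image, region, detector):
    """image: 2-D list of pixel values; region: (left, top, right, bottom).
    Run the detector only on the cropped region instead of the whole frame,
    so the work scales with the region size, not the image size."""
    left, top, right, bottom = region
    crop = [row[left:right] for row in image[top:bottom]]  # crop to the region
    return detector(crop)
```

Any object detector can stand in for `detector`; here it only needs to accept the cropped 2-D list.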
Further, according to at least one embodiment of the present invention, based on the length of the user's voice, it is possible to accurately select only the object intended by the user among a plurality of objects located around the vehicle, which increases user convenience.
According to at least one of the embodiments of the present invention, based on the result of converting the user's voice into text, it is likewise possible to accurately select only the object intended by the user among the plurality of objects located in the vicinity of the vehicle, which increases user convenience.
Also, according to at least one embodiment of the present invention, it is possible to eliminate the hassle of the input process required for a conventional route search by providing a route to a specific object appearing in the traveling image.
Further, according to at least one embodiment of the present invention, by executing the autonomous driving function along the route to the specific object, the vehicle can carry the driver safely and quickly to the specific object even if the driver does not intervene in the driving of the vehicle.
The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.
Figures 1A and 1B schematically illustrate a vehicle in which a driver assistance device according to embodiments of the present invention may be provided.
FIG. 2 is a functional diagram of the vehicle shown in FIG. 1.
FIG. 3 shows a block diagram of a driver assistance device according to an embodiment of the present invention.
FIG. 4 shows an example of a block diagram of the processor shown in FIG. 3.
FIG. 5 is a diagram showing a concept of displaying information when the display unit of the driver assistance apparatus is a HUD (Head Up Display).
FIG. 6 is a flowchart illustrating a method of controlling a driver assistance device according to an exemplary embodiment of the present invention.
FIG. 7 illustrates a situation in which the driver assistance device according to the embodiment of the present invention enters the driver assistance mode.
FIGS. 8A and 8B are diagrams illustrating an operation in which the driver assistance device according to the embodiment of the present invention sets a selection region on a traveling image based on a user's movement.
FIG. 9 is a view showing an example of an operation of the driver assistance device according to the embodiment of the present invention to select an object displayed on the traveling image based on the user's voice.
FIGS. 10A and 10B are views showing another example of an operation of the driver assistance device according to the embodiment of the present invention to select an object appearing on the traveling image based on the user's voice.
FIGS. 11A and 11B are views showing another example of an operation of the driver assistance device according to the embodiment of the present invention to select an object appearing on the traveling image based on the user's voice.
FIGS. 12A and 12B are views showing another example of an operation in which the driver assistance device according to the embodiment of the present invention selects an object appearing on the traveling image based on the user's voice.
FIGS. 13A and 13B are diagrams illustrating another example of an operation in which the driver assistance device according to the embodiment of the present invention selects an object displayed on the traveling image based on the user's voice.
FIGS. 14A and 14B are views showing an example of an operation of providing information about a selected object according to an embodiment of the present invention.
FIG. 15 is a view showing another example of an operation of providing information about a selected object according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for the components used in the following description are given or used interchangeably only for ease of writing the specification, and do not by themselves carry distinct meanings or roles. In the following description, a detailed description of related arts will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed in this specification is not limited by the accompanying drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. On the other hand, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. It is also to be understood that when one element is described as controlling another element, it may control the other element directly, or may do so through the intermediation of a third element.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, the terms "comprises", "having", and the like are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
Hereinafter, a driver assistance apparatus 100 according to an embodiment of the present invention will be described in detail with reference to a vehicle equipped with the driver assistance apparatus 100.
Figures 1A and 1B schematically illustrate a
1A, a
The wheel 11 includes the
The window 12 may include a
The pillar 13 is a column connecting the vehicle body and the roof, and adds the strength of the
An external background is illuminated on the side mirrors 14 so that the user can check the situation behind the left and right sides of the
In addition, the
In addition, a
In addition, the
Referring to FIG. 1B, a
In addition, the
On the other hand, the
Of course, as mentioned above, the
FIG. 2 is a functional diagram of the vehicle shown in FIG. 1.
2, the
The
The
The
Also, the
Also, the
The
The
In addition, the
The
The
The driving
The
The
The
The power
The air
The
The
The
The
Also, the
The
The
The
The
On the other hand, the
Meanwhile, some of the components included in the
The driver assistance device 100 according to the embodiments of the present invention can display the information displayed on the
FIG. 3 shows a block diagram of a driver assistance device 100 according to an embodiment of the present invention.
Referring to FIG. 3, the driver assistance device 100 may include a
The
The
The
The indoor camera image generated by the
The
The
The
In particular, the
In addition, the
For example, when the detected object is another vehicle running in the vicinity of the
In addition, the
The
Specifically, the
The
Specifically, the
Here, the selection area is a part of the entire traveling image: an area including the point corresponding to the direction indicated by at least one of the user's line of sight and gesture. By setting the selection area, the range within which an object is searched for can be narrowed, and the information the user wants can be provided promptly.
At this time, the selection region may be a point included in the running image, or may be a region having a predetermined area. For example, the
When the selection area is set, the
Meanwhile, the
Further, when the
Specifically, the
The
On the other hand, two or more objects corresponding to the text acquired from the user's voice may appear in the selection area. For example, with two other vehicles appearing in the selection area, the text acquired from the user's voice may simply be "vehicle". In this case, the
Specifically, the
At this time, the
Alternatively, the
Specifically, when the text corresponding to the additional input user voice is related to color, the
In addition, when the text corresponding to the additional input user voice is related to the direction, the
In addition, when the text corresponding to the additional input user voice is about the size, the
In addition, when the
In addition, the
When the text obtained from the additional input user voice is the same word as the text obtained from the user voice inputted by the
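The repeated-keyword selection described here (and illustrated in FIGS. 13A and 13B) can be sketched as follows. This is a hypothetical illustration: the nearest-first ordering of candidates and the clamping behavior past the end of the list are assumptions, not details stated in the patent.

```python
def select_by_repetition(ordered_candidates, repetitions):
    """ordered_candidates: candidate objects sorted, e.g., nearest first (an
    assumption). Each repetition of the keyword (e.g., "next") steps one object
    further down the list; counts past the end clamp to the last object."""
    index = min(repetitions, len(ordered_candidates) - 1)
    return ordered_candidates[index]
```

So saying the keyword once moves from the default (nearest) object to the second one, saying it twice moves to the third, and so on.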
In addition, the
The information about the selected object may include, for example, the type of the selected object (e.g., pedestrian, lane, other vehicle, building, etc.), its speed (e.g., absolute speed, relative speed), and the distance from the
Further, when the selected object is a facility such as a building, detailed information such as the number of floors, area, utilization time, parking lot position, remaining distance to the facility, and the like can be provided.
The
In addition, the
The
The
FIG. 4 shows an example of a block diagram of the processor shown in FIG. 3.
Referring to FIG. 4, the
The
In detail, the
The
The
Specifically, the
The
An
On the other hand, the
For example, the
An
3, the
FIG. 5 is a diagram illustrating a concept of displaying information when the display unit of the driver assistance apparatus 100 is a HUD (Head Up Display).
Referring to FIG. 5, the
The
When the light reflected by the
On the other hand, the operation principle of the HUD (Head Up Display) shown in FIG. 5 is an exemplary one, and information can be displayed on the
The
Hereinafter, it is assumed that the
FIG. 6 is a flowchart illustrating a control method of the driver assistance device 100 according to an embodiment of the present invention.
Referring to FIG. 6, the
Specifically, the
Next, the
Subsequently, the
Next, based on at least one of the indoor image received through step S620 and the user voice received through step S630, the
Then, the
Next, the
Hereinafter, the operation of the driver assistance device 100 according to the embodiment of the present invention will be described in more detail with reference to FIGS. 7 to 15.
FIG. 7 illustrates a situation in which the driver assistance apparatus 100 according to the embodiment of the present invention enters the driver assistance mode.
First, referring to FIG. 7A, when the
Referring to FIG. 7B, the
Referring to FIG. 7C, the
FIGS. 8A and 8B are diagrams illustrating an operation of the driver assistance device 100 according to the embodiment of the present invention to set a selection region on a traveling image based on a movement of a user.
Referring to FIG. 8A, the
The
The
Next, referring to FIG. 8B, the
8A, the
In other words, the
On the other hand, when the setting of the selection areas S1 and S2 is completed, the
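The gaze-based setting of the selection regions S1 and S2 described above can be sketched as follows: the detected gaze point is expanded into a box of predetermined size, clamped to the image bounds. The image resolution and region size used here are illustrative assumptions, not values from the patent.

```python
IMG_W, IMG_H = 1280, 720       # traveling-image resolution (assumed)
REGION_W, REGION_H = 320, 240  # predetermined selection-region size (assumed)

def selection_region(gaze_x, gaze_y):
    """Return a (left, top, right, bottom) box of fixed size centered on the
    point of the traveling image the user's gaze or gesture indicates,
    clamped so the box never leaves the image."""
    left = min(max(gaze_x - REGION_W // 2, 0), IMG_W - REGION_W)
    top = min(max(gaze_y - REGION_H // 2, 0), IMG_H - REGION_H)
    return (left, top, left + REGION_W, top + REGION_H)
```

Object detection and the subsequent voice-based selection then only need to consider the contents of this box.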
FIG. 9 is a view showing an example of an operation of the driver assistance apparatus 100 according to the embodiment of the present invention to select an object appearing on the traveling image based on the user's voice. For convenience of explanation, it is assumed that an object is selected on the basis of a user's voice input after the selection area S1 shown in FIG. 8A is set.
Referring to FIG. 9, the
On the other hand, when the object corresponding to the text appears in the selection region, the
As shown in the figure, when the text "building" is obtained as the recognition result of the user's voice, the
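The matching step described for FIG. 9 — keeping only the objects in the selection region whose recognized type equals the spoken word — can be sketched as follows. The type labels and id values are illustrative assumptions.

```python
def select_objects_by_text(detected, text):
    """detected: list of (object_type, object_id) pairs found in the selection
    region. Return the ids of all objects whose type matches the text obtained
    from the user's voice."""
    return [obj_id for obj_type, obj_id in detected if obj_type == text]
```

If the result contains more than one id, the disambiguation strategies described below (voice length, follow-up attributes, repetition) apply.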
FIGS. 10A and 10B are diagrams showing another example of an operation of the driver assistance apparatus 100 according to the embodiment of the present invention to select any one of the objects displayed on the traveling image based on the user's voice. For convenience of explanation, it is assumed that an object is selected on the basis of a user's voice input after the selection area S1 shown in FIG. 8A is set.
First, according to FIGS. 10A and 10B, the
If the number of objects corresponding to the word "vehicle" in the user's voice is two or more (1001, 1002) as shown in the figure, the
In FIGS. 10A and 10B, the text corresponding to the user's voice is the same, "vehicle", but the user's voice in FIG. 10A is relatively shorter than in FIG. 10B. If the length of the user's voice and the corresponding distance are in a proportional relationship, the
Referring to FIG. 10A, assuming that the
Referring to FIG. 10B, when it is assumed that the length of the user voice is longer than the reference time, the
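The voice-length disambiguation of FIGS. 10A and 10B can be sketched as follows: an utterance shorter than a reference time picks the nearest matching object, a drawn-out one the farthest. The reference-time value and the candidate representation are illustrative assumptions.

```python
REFERENCE_TIME_S = 0.8  # reference utterance length in seconds (assumed value)

def select_by_voice_length(candidates, voice_len_s):
    """candidates: list of (object_id, distance_m) that match the spoken word.
    A short utterance selects the nearest candidate; a drawn-out utterance
    (e.g., "vehicle~") selects the farthest one."""
    chooser = min if voice_len_s <= REFERENCE_TIME_S else max
    return chooser(candidates, key=lambda c: c[1])[0]
```

This mirrors the proportional relationship described above between utterance length and intended distance.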
FIGS. 11A and 11B are diagrams showing another example of an operation in which the driver assistance apparatus 100 according to the embodiment of the present invention selects an object appearing on a traveling image based on a user's voice. For convenience of explanation, it is assumed that two
The
Referring to FIG. 11A, the
Referring to FIG. 11B, when the
Of course, the
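The narrowing-down by a follow-up utterance about color, direction, size, or distance (as in FIGS. 11A and 11B) can be sketched as a simple attribute filter. The attribute names and candidate fields here are illustrative assumptions, not terms from the patent.

```python
def filter_by_attribute(candidates, attribute, value):
    """candidates: list of dicts such as {"id": ..., "color": ..., "size": ...}.
    Keep only the candidates whose named attribute matches the value extracted
    from the additionally input user voice."""
    return [c for c in candidates if c.get(attribute) == value]
```

For example, with two trucks selected and a follow-up utterance of "white", only the white truck survives the filter.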
FIGS. 12A and 12B are diagrams showing another example of an operation in which the driver assistance apparatus 100 according to the embodiment of the present invention selects an object appearing on a traveling image based on a user's voice. For convenience of explanation, it is assumed that an object is selected on the basis of a user's voice input after the selection area S1 shown in FIG. 8A is set.
The
The
Referring to FIG. 12A, if the user's voice is "close", the
Referring to FIG. 12B, when the
FIGS. 13A and 13B are diagrams showing another example of an operation in which the driver assistance apparatus 100 according to the embodiment of the present invention selects an object appearing on the traveling image based on the user's voice. For convenience of explanation, it is assumed that an object is selected on the basis of a user's voice input after the selection area S1 shown in FIG. 8A is set.
The
Referring to FIG. 13A, when "Next" is pronounced once to the user voice, the
Referring to FIG. 13B, when the "next" is repeated twice for the user voice as shown in the result of voice recognition of the user voice inputted through the
In this case, the
FIGS. 14A and 14B are views showing an example of an operation of providing information about a selected object according to an embodiment of the present invention.
FIG. 14A illustrates information provided when the
As shown in the figure, the
On the other hand, FIG. 14B illustrates information provided when the
As shown, the
In FIGS. 14A and 14B, information on the selected object is shown as being visually provided through the
FIG. 15 is a view showing another example of an operation of providing information about a selected object according to an embodiment of the present invention. For convenience of explanation, the
Referring to FIG. 15, the
Specifically, the
In addition, the
Meanwhile, the driver assistance device 100 and the control method thereof according to the present invention can be implemented as a code that the
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, the invention is not limited to the disclosed embodiments; on the contrary, it should be understood that various modifications may be made by those skilled in the art without departing from the spirit and scope of the present invention.
1: vehicle
100: Driver assistance device
Claims (15)
A first camera for generating a running image for the front of the vehicle;
A second camera for generating an indoor image of the vehicle;
An audio input unit for receiving a user voice; And
Upon entry into driver assistance mode,
Detecting at least one of a user's gaze and a gesture from the indoor image,
Setting a selection region in the traveling image based on at least any one of a sight line and a gesture of the user,
At least one user's voice is received through the audio input unit,
Selecting at least one object appearing on the traveling image based on the at least one user voice,
A processor for controlling the display unit to display information about the selected object;
The processor comprising:
Converts the user voice into text,
Measuring a length of the user's voice to select an object corresponding to the length of the text and the user's voice among at least one or more objects included in the selected area,
And selects an object corresponding to the text among at least one or more objects included in the selection area based on at least one of the speed and the number of times the predetermined character string is repeated among the texts.
The processor comprising:
Selects a plurality of objects appearing on the traveling image based on the user voice,
And selects at least one object among the plurality of selected objects based on the user voice to be additionally input.
The processor comprising:
If the further input user voice is related to at least one of a color, a direction, a size, and a distance to the vehicle,
And selects at least one object corresponding to the further input user voice from among the plurality of selected objects.
Wherein the selection region includes:
Is an area including a point corresponding to a line of sight or a gesture of the user among the entire area of the traveling image.
The display unit includes:
And a HUD (Head Up Display) for projecting information on the selected object to a window of the vehicle.
The processor comprising:
Converts the user voice into text,
And selects an object corresponding to the text among at least one or more objects included in the selection region.
Wherein the selected object is
a peripheral facility included in the selection area and corresponding to the text, among the peripheral facility information received by the communication unit of the vehicle.
The processor comprising:
And measures the length of the user voice only when the text is a predetermined character string.
The processor comprising:
And controls the display unit to display the selected object among the objects appearing in the running image, from the remaining objects.
The information about the selected object may be,
And a path from the position of the vehicle to the selected object, the type of the selected object, the speed, the direction, the distance from the vehicle, and the path from the position of the vehicle to the selected object.
The processor comprising:
And controls the display unit to display a graphic object that guides a route from the position of the vehicle to the selected object.
Wherein the processor:
generates a control signal for entering an autonomous driving mode along the path from the position of the vehicle to the selected object.
Wherein the processor:
enters the driver assistance mode upon receiving a predetermined input.
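The final clauses (display a route to the selected object, then hand that path to the autonomous driving mode) suggest roughly the control flow below. This is a sketch under stated assumptions: the control-signal structure, mode name, and straight-line waypoint planner are all hypothetical; a real implementation would plan along the road network.

```python
from dataclasses import dataclass, field

@dataclass
class ControlSignal:
    mode: str
    path: list = field(default_factory=list)

def plan_route(vehicle_pos, object_pos, steps=4):
    """Placeholder route: evenly spaced straight-line waypoints from
    the vehicle position to the selected object."""
    return [
        (
            vehicle_pos[0] + (object_pos[0] - vehicle_pos[0]) * i / steps,
            vehicle_pos[1] + (object_pos[1] - vehicle_pos[1]) * i / steps,
        )
        for i in range(steps + 1)
    ]

def enter_autonomous_mode(vehicle_pos, object_pos):
    """Generate the control signal that switches the vehicle to
    autonomous driving along the planned path to the selected object."""
    path = plan_route(vehicle_pos, object_pos)
    return ControlSignal(mode="AUTONOMOUS", path=path)

signal = enter_autonomous_mode((0.0, 0.0), (100.0, 40.0))
```

The same path that is rendered as the guiding graphic object on the HUD can be reused as the reference trajectory handed to the autonomous driving controller.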
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150067500A KR101708676B1 (en) | 2015-05-14 | 2015-05-14 | Driver assistance apparatus and control method for the same |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160134075A (en) | 2016-11-23 |
KR101708676B1 (en) | 2017-03-08 |
Family
ID=57541311
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150067500A KR101708676B1 (en) | 2015-05-14 | 2015-05-14 | Driver assistance apparatus and control method for the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101708676B1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11318961B2 (en) | 2018-07-20 | 2022-05-03 | Lg Electronics Inc. | Robot for vehicle and control method thereof |
US20210333869A1 (en) * | 2018-11-30 | 2021-10-28 | Lg Electronics Inc. | Vehicle control device and vehicle control method |
CN111311948B (en) * | 2020-02-19 | 2021-07-13 | Guangzhou Pony.ai Technology Co., Ltd. | Control method and device for automatic driving vehicle, storage medium and vehicle |
KR102551283B1 (en) * | 2020-09-08 | 2023-07-06 | Electronics and Telecommunications Research Institute | Metacognition-based autonomous driving correction device and method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005138755A (en) * | 2003-11-07 | 2005-06-02 | Denso Corp | Device and program for displaying virtual images |
JP2014120112A (en) * | 2012-12-19 | 2014-06-30 | Aisin Aw Co Ltd | Travel support system, travel support method, and computer program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101703144B1 (en) * | 2012-02-09 | 2017-02-06 | Electronics and Telecommunications Research Institute | Apparatus and method for autonomous driving |
KR20140070861A (en) * | 2012-11-28 | 2014-06-11 | Electronics and Telecommunications Research Institute | Apparatus and method for controlling multi modal human-machine interface |
2015
- 2015-05-14: KR application KR1020150067500A granted as patent KR101708676B1 (active IP Right Grant)
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20190033975A (en) * | 2017-09-22 | 2019-04-01 | LG Electronics Inc. | Method for controlling the driving system of a vehicle |
KR102064222B1 (en) * | 2017-09-22 | 2020-03-02 | LG Electronics Inc. | Method for controlling the driving system of a vehicle |
US10705522B2 (en) | 2017-09-22 | 2020-07-07 | Lg Electronics Inc. | Method for controlling operation system of a vehicle |
US11628860B2 (en) * | 2019-01-16 | 2023-04-18 | Toyota Jidosha Kabushiki Kaisha | Autonomous driving system that can eliminate a system distrust state of the driver |
Also Published As
Publication number | Publication date |
---|---|
KR20160134075A (en) | 2016-11-23 |
Similar Documents
Publication | Title |
---|---|
KR101708657B1 (en) | Vehicle and control method for the same | |
KR101730321B1 (en) | Driver assistance apparatus and control method for the same | |
KR102674974B1 (en) | Method and apparatus for passenger recognition and boarding support of autonomous vehicle | |
KR101708676B1 (en) | Driver assistance apparatus and control method for the same | |
US10351060B2 (en) | Parking assistance apparatus and vehicle having the same | |
KR101750876B1 (en) | Display apparatus for vehicle and Vehicle | |
KR101750178B1 (en) | Warning Method Outside Vehicle, Driver Assistance Apparatus For Executing Method Thereof and Vehicle Having The Same | |
KR101895485B1 (en) | Drive assistance appratus and method for controlling the same | |
US10748428B2 (en) | Vehicle and control method therefor | |
KR20170004715A (en) | Driver Assistance Apparatus and Vehicle Having The Same | |
KR20160144829A (en) | Driver assistance apparatus and control method for the same | |
KR20170003133A (en) | Advanced Driver Assistance System, Display apparatus for vehicle and Vehicle | |
KR101762805B1 (en) | Vehicle and control method for the same | |
KR101832224B1 (en) | Appratus and method for assisting a driver based on difficulty level of parking | |
KR101691800B1 (en) | Display control apparatus and operating method for the same | |
KR101859044B1 (en) | Vehicle and control method for the same | |
KR20170035238A (en) | Vehicle and control method for the same | |
KR101850857B1 (en) | Display Apparatus and Vehicle Having The Same | |
US20220397895A1 (en) | Vehicle remote control device, vehicle remote control system, vehicle remote control method, and vehicle remote control program | |
KR101752798B1 (en) | Vehicle and control method for the same | |
KR101985496B1 (en) | Driving assistance apparatus and vehicle having the same | |
KR102192146B1 (en) | Vehicle control device and vehicle control method | |
WO2022201892A1 (en) | Information processing apparatus, information processing method, and program | |
US20230141584A1 (en) | Apparatus for displaying at least one virtual lane line based on environmental condition and method of controlling same | |
KR101929300B1 (en) | Parking Assistance Apparatus and Vehicle Having The Same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) |