CN221162527U - Height-adjusting system and vehicle


Info

Publication number
CN221162527U
CN221162527U
Authority
CN
China
Prior art keywords
height
seat
display device
adjusting
user
Prior art date
Legal status
Active
Application number
CN202322595097.2U
Other languages
Chinese (zh)
Inventor
周伟明
高荣琦
林皓
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202322595097.2U
Application granted
Publication of CN221162527U


Landscapes

  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The application provides a height-adjusting system, which can be used in a cabin virtual image display system of a vehicle. The height-adjusting system provided by the application can automatically match the height of the display device to the user according to user information, ensuring that the user can view the image at a comfortable height and improving the user experience. The system comprises a first seat, a second seat, a display device, a processing device, an adjusting device and an acquisition device. The display device is installed in the first seat, and the user sits on the second seat to view the image displayed by the display device. The acquisition device is used for acquiring the user information and sending the user information to the processing device. The processing device determines a relative height between the height of the display device and the height of the human eye according to the user information, generates an adjustment amount based on the relative height, and sends the adjustment amount to the adjusting device. The adjusting device is used for adjusting the height of the first seat, the height of the display device or the height of the second seat according to the adjustment amount.

Description

Height-adjusting system and vehicle
Technical Field
The embodiment of the application relates to the technical field of display and intelligent automobile driving, in particular to a system and a vehicle for adjusting height.
Background
The automobile has become an indispensable means of transportation in people's daily life. By installing a vehicle-mounted display device in the car, passengers in the car can obtain information or engage in activities such as entertainment and video viewing through the vehicle-mounted display device, which enables intelligent application scenarios for the cabin space.
How to provide different users with suitable viewing heights and improve user comfort is a problem to be solved.
Disclosure of utility model
The application provides a height-adjusting system and a vehicle, which can automatically match the display device to the user's height according to user information, ensuring that the user can view images at a comfortable height and improving the user experience.
In a first aspect, embodiments of the present application provide a height-adjusting system for a cabin virtual image display system. The system comprises a first seat, a second seat, a display device, a processing device, an adjusting device and an acquisition device. The display device is installed in the first seat, and a user sits on the second seat to view an image displayed by the display device; the adjusting device is connected with at least one of the first seat, the second seat and the display device, the acquisition device is connected with the processing device, and the adjusting device is connected with the processing device. The acquisition device is used for acquiring user information and sending the user information to the processing device. The processing device is used for determining the relative height between the height of the display device and the eye height of the user according to the user information, generating an adjustment amount based on the relative height, and sending the adjustment amount to the adjusting device, wherein the height of the display device is the height of the center of a window unit of the display device relative to the cabin floor, the eye height of the user is the height of the user's eyes relative to the cabin floor, and the relative height between the height of the display device and the eye height is such that the user is within the viewing angle range when viewing a virtual image of the display device. The adjusting device is used for adjusting the height of the first seat, the height of the display device or the height of the second seat according to the adjustment amount.
It should be noted that, in the present application, the positions where the display device may be mounted include, but are not limited to, a seat back, a seat headrest, a secondary console, and the like. A partial structure of the first seat may be the seat back, the headrest, the seat cushion, or the like of the first seat.
Based on this scheme, the height-adjusting system provided by the application ensures that the relative height between the display device and the user's eyes lies within the viewing angle range when the user views the virtual image of the display device, thereby improving the user experience.
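To make the interaction between the acquisition device, the processing device and the adjusting device concrete, the following Python sketch wires the three roles together. It is only an illustration: the class and field names are hypothetical, and the target used here (aligning the window-unit center with the eye height) is a placeholder; the actual target heights follow from the viewing-angle range and the correspondences described below.

```python
# Minimal sketch of the acquisition -> processing -> adjustment flow.
# All names (UserInfo, ProcessingDevice, AdjustingDevice) are hypothetical,
# and "align the window-unit center with the eye height" is only a
# placeholder target, not the rule defined by the application.
from dataclasses import dataclass


@dataclass
class UserInfo:
    eye_height_mm: float  # user's eye height above the cabin floor


class ProcessingDevice:
    def __init__(self, display_height_mm: float):
        # height of the window-unit center above the cabin floor
        self.display_height_mm = display_height_mm

    def compute_adjustment(self, info: UserInfo) -> float:
        # Relative height between the display device and the user's eyes.
        relative = self.display_height_mm - info.eye_height_mm
        # Placeholder target: remove the offset entirely.
        return -relative


class AdjustingDevice:
    def apply(self, adjustment_mm: float) -> None:
        print(f"adjust height by {adjustment_mm:+.1f} mm")


processing = ProcessingDevice(display_height_mm=740.0)
adjusting = AdjustingDevice()
# The acquisition device would supply UserInfo; here it is constructed directly.
adjusting.apply(processing.compute_adjustment(UserInfo(eye_height_mm=758.4)))
```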
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to determine, according to the user information and the relative height, an adjustment amount ΔH1 of the height of the first seat, or an adjustment amount ΔH2 of the height of the display device, or an adjustment amount ΔH3 of the height of the second seat, and send any one of the adjustment amount ΔH1, the adjustment amount ΔH2 and the adjustment amount ΔH3 to the adjusting device; the adjusting device is specifically configured to adjust the height of the first seat according to the adjustment amount ΔH1, or adjust the height of the display device according to the adjustment amount ΔH2, or adjust the height of the second seat according to the adjustment amount ΔH3.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a first height H1 of the first seat, determine a first target height based on the user information, and determine the adjustment amount ΔH1 based on the first height H1 and the first target height.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: determine the first target height based on a first correspondence between the user information and the first target height.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a distance D between the vertical plane in which the user's eyes are located and the vertical plane in which the window unit is located, and determine the first target height based on the distance D, the viewing angle θ of the human eye and the user information.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a second height H2 of the display device, determine a second target height based on the user information, and determine the adjustment amount ΔH2 based on the second height H2 and the second target height.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: determine the second target height based on the user information and a second correspondence.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a distance D between the vertical plane in which the user's eyes are located and the vertical plane in which the window unit is located, and determine the second target height based on the distance D, the viewing angle θ of the human eye and the user information.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a third height H3 of the second seat, determine a third target height based on the user information, and determine the adjustment amount ΔH3 based on the third height H3 and the third target height.
The first, second, and third correspondence may be a correspondence between specific values of the user information and target heights (including the first, second, and third target heights), respectively. The first correspondence may be a correspondence between a specific value of the user information and the first target height, or may be a correspondence between a range corresponding to the user information and the first target height. For example, when the user information is the user eye height, the first correspondence may be a correspondence between a specific value of the eye height and the first target height. Or may be a correspondence between the range of eye heights in which the specific value of the human eye height is located and the first target height.
Based on this scheme, determining the adjustment amounts (including ΔH1, ΔH2 and ΔH3) from the user information can improve the adjustment accuracy of the embodiments of the application and ensure the user experience. The seat or the display device can be adjusted quickly through the first, second and third correspondences, which improves the responsiveness of a display system applying this scheme and thereby improves the user experience.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: determine the third target height based on a third correspondence between the user information and the third target height.
With reference to the first aspect, in certain implementations of the first aspect, the processing device is specifically configured to: acquire a distance D between the vertical plane in which the user's eyes are located and the vertical plane in which the window unit is located, and determine the third target height based on the distance D, the viewing angle θ of the human eye and the user information.
With reference to the first aspect, in certain implementations of the first aspect, the viewing angle θ includes a lower viewing angle θ1 or an upper viewing angle θ2, where the lower viewing angle θ1 is the viewing angle of the human eye below the horizontal plane in which the eye is located when the eye looks straight ahead, the upper viewing angle θ2 is the viewing angle of the human eye above the horizontal plane in which the eye is located when the eye looks straight ahead, the value range of the lower viewing angle θ1 is -1.5° ≤ θ1 ≤ 0°, and the value range of the upper viewing angle θ2 is 0° ≤ θ2 ≤ 9°.
Based on this scheme, the adjustment amount is calculated from the viewing angle, the distance D and the user information (specifically, the user's height or eye height), so that the method provided by the application fully considers the comfort of the human eyes when determining the adjustment amount, thereby improving the user experience.
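As a concrete reading of the viewing-angle constraint, the sketch below checks whether the window-unit center lies within the comfortable cone bounded by the lower viewing angle θ1 and the upper viewing angle θ2 at a distance D; the helper name and the sign convention (positive angle = display above the level line of sight) are assumptions made for this example.

```python
import math


def within_viewing_range(display_height_mm: float, eye_height_mm: float,
                         distance_d_mm: float,
                         theta1_deg: float = -1.5,
                         theta2_deg: float = 9.0) -> bool:
    """Hypothetical check: is the window-unit center inside the comfortable
    viewing cone [theta1, theta2] around the user's level line of sight?

    Sign convention assumed here: a positive angle means the window-unit
    center lies above the eyes, a negative angle means it lies below.
    """
    relative_mm = display_height_mm - eye_height_mm
    angle_deg = math.degrees(math.atan2(relative_mm, distance_d_mm))
    return theta1_deg <= angle_deg <= theta2_deg


# Example: display 20 mm below the eyes, 600 mm away -> about -1.9 degrees.
print(within_viewing_range(740.0, 760.0, 600.0))  # False, slightly too low
```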
With reference to the first aspect, in certain implementations of the first aspect, the user information includes at least one of: the height of the user's eyes, the height of the user, the position of the eyes in the image, and the weight of the user.
With reference to the first aspect, in certain implementations of the first aspect, the display device includes an image generation unit, a window unit and an image amplifying unit. The image generation unit is configured to emit image light to the window unit; the window unit is configured to reflect the image light from the image generation unit to the image amplifying unit and to transmit the image light from the image amplifying unit, so that the human eye can view, through the window unit, a virtual image formed by the image light; the image amplifying unit is configured to reflect the image light from the window unit back to the window unit.
In a second aspect, an embodiment of the present application provides a processing device, configured for use in a system provided by any one of the foregoing first aspect and any one of the implementation manners of the first aspect. The processing device may include one or more units and/or modules. The processing device may include an input/output interface. Alternatively, the input/output interface may be an input/output circuit.
Or the processing device may be a chip, a system-on-a-chip or a processor, a processing circuit or logic circuit, etc.
Unless otherwise stated, or unless contradicted by the actual function or the inherent logic of the relevant description, operations such as acquiring that relate to the processing device may be understood as operations such as receiving or inputting performed by a processor; the application is not limited in this regard.
In a third aspect, an embodiment of the present application provides a chip. The chip comprises the processing device according to the second aspect and a communication interface, and the processing device obtains the user information through the communication interface.
In a fourth aspect, embodiments of the present application provide a cabin system comprising the system provided by any one of the implementations of the first aspect and the second aspect.
In a fifth aspect, an embodiment of the present application provides a vehicle, including a system provided by any implementation manner of the first aspect and the second aspect.
With reference to the fifth aspect, in certain implementations of the fifth aspect, the display device is disposed on at least one of: a headrest of a seat of the vehicle, a seat back of the seat of the vehicle, and a secondary console of the vehicle.
The advantages of the second to fifth aspects may be specifically referred to the description of the advantages of the first aspect, and are not repeated here.
Drawings
Fig. 1 is a schematic diagram of an application scenario of an intelligent cockpit display system 100 according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a method 200 for adjusting height according to an embodiment of the present application.
Fig. 3 is a schematic flow chart of a method 300 for adjusting height according to an embodiment of the present application.
Fig. 4 is a schematic flow chart of a method 400 for adjusting a height of a first seat to a first target height according to an embodiment of the present application.
Fig. 5 is a schematic flow chart of a method 500 for adjusting a height of a first seat to a first target height according to an embodiment of the present application.
Fig. 6 is a schematic diagram of determining a first target height based on a distance D, a viewing angle θ of a human eye, and user information according to an embodiment of the present application.
Fig. 7 is a schematic flow chart of a method 700 for adjusting a height of a display device according to an embodiment of the present application.
Fig. 8 is a schematic flow chart of a method 800 for adjusting a height of a display device to a second target height according to an embodiment of the application.
Fig. 9 is a schematic flow chart of a method 900 for adjusting height according to an embodiment of the present application.
Fig. 10 is a schematic flow chart of a method 1000 for adjusting the height of a second seat to a third target height according to an embodiment of the present application.
Fig. 11 is a schematic flowchart of a method 1100 for adjusting an angle of a display device according to an embodiment of the present application.
Fig. 12 is a schematic diagram of an angle adjustment display device according to an embodiment of the present application.
Fig. 13 is a schematic structural diagram of a display device 1300 according to an embodiment of the application.
Fig. 14 is a schematic structural diagram of a processing apparatus 1400 according to an embodiment of the present application.
Fig. 15 is a schematic diagram of a height-adjustable display system 1500 according to an embodiment of the application.
Fig. 16 is a schematic circuit diagram of a display device according to an embodiment of the application.
Fig. 17 is a schematic diagram of a possible functional framework of a vehicle according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings.
The following description is made in order to facilitate understanding of embodiments of the present application.
The words "first", "second", etc. and various numerical numbers in the first, the text description of the embodiments of the application shown below or in the drawings are merely for descriptive convenience and are not necessarily for describing particular sequences or successes and are not intended to limit the scope of the embodiments of the application. For example, to distinguish between different heights, etc.
The terms "comprises," "comprising," and "having," in the context of the second, following illustrated embodiment of the present application, are intended to cover a non-exclusive inclusion, such that a system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements expressly listed but may include other elements not expressly listed or inherent to such article or apparatus.
Third, in embodiments of the application, the words "exemplary" or "such as" are used to mean examples, illustrations, or descriptions, and embodiments or designs described as "exemplary" or "such as" should not be construed as being preferred or advantageous over other embodiments or designs. The use of the word "exemplary" or "such as" is intended to present the relevant concepts in a concrete fashion to facilitate understanding.
Fourth, in the embodiment of the present application, image light refers to light carrying an image (or image information) for generating an image.
Fifth, in the drawings of the present application, the thickness, size, and shape of each optical element have been slightly exaggerated for convenience of explanation. In particular, the optical element shapes shown in the drawings are shown by way of example and are merely examples and are not drawn to scale.
Sixth, unless otherwise defined, all terms (including technical and scientific terms) used in the present application have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
With the rapid development of intelligent automobiles, automobiles play an ever-greater role in people's lives, and the demand for vehicle-mounted displays is growing, for example to provide entertainment services such as games and video for passengers on long journeys, or to provide a private office display environment for office workers, etc.
In order to provide a vehicle-mounted display function, a common solution is to mount a display screen directly; for example, a liquid crystal display (LCD) may be hung from the vehicle roof or placed on the back of a seat for people in the vehicle to use. However, the position of such a display is usually fixed and cannot adapt to users of different heights, so the comfort is poor.
In view of this, the present application provides a method for adjusting the height, which can automatically implement the height matching between the user and the display device according to the user information, so as to ensure that the user can watch the image at a comfortable height, and improve the user experience.
Fig. 1 is a schematic diagram of an application scenario of an intelligent cockpit display system 100 according to an embodiment of the present application. As shown in fig. 1, the intelligent cockpit display system 100 includes at least one display device 101 disposed on the back of a seat 102 and at least one seat 102; fig. 1 illustrates one display device and one seat. The display device 101 can generate a distant, enlarged virtual image from an external video signal input (also referred to as a signal source) and provide the viewer with a large, far-away viewing experience, meeting the requirements of application scenarios such as passenger leisure and entertainment, business office work, and the like. The display device 101 may be mounted on the back of the seat 102 before leaving the factory, or mounted on the back of the seat 102 by retrofitting the seat 102 after leaving the factory.
It is understood that the intelligent cockpit display system 100 may be applied to vehicles including, but not limited to: cars, trucks, buses, boats, planes, helicopters, recreational vehicles, trains, and the like. The display device 101 may be mounted on a headrest of a seat before shipment or may be mounted on a headrest of a seat after shipment by retrofitting the seat.
It should be noted that the method for adjusting the height provided by the present application may be applied to the intelligent cabin display system 100 shown in fig. 1, but the embodiment of the present application is not limited thereto, and may be other similar systems including the intelligent cabin display system 100 shown in fig. 1. Hereinafter, the method for adjusting the height provided by the present application will be described in detail.
Fig. 2 is a schematic flow chart of a method 200 for adjusting height according to an embodiment of the present application. As shown in fig. 2, method 200 includes the following steps.
S201, adjusting a relative height between the display device and a human eye height of the user in response to the user information. The relative height is related to the height of the display device and the height of the eyes of the user, wherein the height of the display device is the height of the center of the window unit of the display device relative to the ground of the cabin, and the height of the eyes of the user is the height of the eyes of the user relative to the ground of the cabin.
In the scheme of the application, the relative height between the display device and the human eye of the user can be adjusted by adjusting the height of the display device, or the relative height between the display device and the human eye of the user can be adjusted by adjusting the height of the human eye.
In the present application, the user information includes, but is not limited to, at least one of the following: the eye height of the user, the height of the user, the eye position in an image, and the weight of the user. The eye height of the user is the height of the eyes (or of the line of sight, the pupil center, etc.) relative to the cabin floor when the user looks straight ahead.
Optionally, before S201, the method further includes S202.
S202, user information is acquired.
In the solution of the application, the user information can be acquired from sensing devices in the cabin. When the acquired user information is the eye height or the height, in one possible implementation, a camera or similar device that records the state of the passengers in the cabin in real time may capture an image of a passenger in a sitting posture, and the eye height or sitting height of the passenger relative to the cabin floor may be estimated from the image. Or, in another implementation, an image sensor provided in the display device, for example a camera used to capture and track the human eye, may capture an image of the eyes while the passenger is in a sitting posture, and the eye height, angle or height may be estimated from that image. When the acquired user information is the weight, the weight of the user can be obtained through a weight detector on a cabin pedal or on the seat.
Fig. 3 is a schematic flow chart of a method 300 for adjusting height according to an embodiment of the present application. It is to be appreciated that the method 300 may be performed by the first seat, or by a processing device in the first seat, or by a processing chip in the first seat, or the like, or by a cabin domain controller for adjusting the first seat and sending an adjustment message to a means for adjusting the first seat, as the application is not limited. The method 300 provided by the present application is described below with the first seat as the execution subject, and as shown in fig. 3, the method 300 includes the following steps.
S301, in response to the user information, adjust the first seat to a first target height, the first target height being the height of a partial structure of the first seat relative to the cabin floor, the first seat being the seat in which the display device is installed.
In the scheme of the application, the first target height is used for meeting the requirement that a user is in a viewing angle range when viewing a virtual image of the display device. I.e. when the first seat is adjusted to the first target height, the user can be enabled to view the complete image displayed by the display device.
When the display device is installed in the seatback of the first seat, ways to adjust the first seat to the first target height in response to the user information include, but are not limited to: the seat cushion of the first seat is lifted or lowered according to the user information, and meanwhile, the seat back of the first seat is driven to be lifted or lowered to the first target height. At this time, the first target height is the height of the horizontal plane of the seat cushion of the first seat relative to the cabin floor. Or the first seat raises or lowers the seat back of the first seat to a first target height, etc., according to the user information. At this time, the first target height may be a height of a horizontal plane where a back center of the first seat is located with respect to the cabin floor, or the first target height may be a height of a horizontal plane where a back top end of the first seat is located with respect to the cabin floor, or the like.
Ways of adjusting the first seat to the first target height in response to the user information when the display device is installed in the headrest of the first seat include, but are not limited to: the first seat drives the headrest of the first seat to rise or fall to the first target height at the same time when the seat cushion of the first seat is raised or lowered according to the user information. At this time, the first target height is the height of the horizontal plane of the seat cushion of the first seat relative to the cabin floor. Or the first seat drives the headrest of the first seat to rise or fall simultaneously while the seat back of the first seat rises or falls according to the user information. At this time, the first target height may be a height of a horizontal plane where a back center of the first seat is located with respect to the cabin floor, or the first target height may be a height of a horizontal plane where a back top end of the first seat is located with respect to the cabin floor, or the like. Or the first seat raises or lowers the headrest of the first seat, or the like, according to the user information. At this time, the first target height may be a height of a horizontal plane in which a headrest center of the first seat is located with respect to the cabin floor, or the first target height may be a height of a horizontal plane in which a headrest tip of the first seat is located with respect to the cabin floor, or the like.
Optionally, before S301, the method further includes S302.
S302, user information is acquired.
Specifically, the manner in which the first seat obtains the user information may refer to the description of obtaining the user information in S202, which is not described herein.
Fig. 4 is a schematic flow chart of a method 400 for adjusting a height of a first seat to a first target height according to an embodiment of the present application. It is to be appreciated that the method 400 may be performed by a processing device of the first seat, which may be a processor, or a processing module in a processing device, such as a chip or the like, or by a cabin controller or the like, as the application is not limited. The processing apparatus of the first seat will be specifically described below as an example. As shown in fig. 4, method 400 includes the following steps.
S401, acquiring a first height H1 of the first seat.
Specifically, the processing device of the first seat acquires the current height H1 of the first seat (i.e., the height before this adjustment). The processing device of the first seat may, for example, obtain the current height H1 of the first seat from the seat height information stored in the storage unit of the cabin. The seat height information may be the height value of the first seat after the previous adjustment, or the adjustment value of the last adjustment, etc., which is not limited by the present application. When the seat height information acquired by the processing device of the first seat from the storage unit of the cabin is the height value after the previous adjustment, the processing device of the first seat takes that height value as the current height H1 of the first seat. When the seat height information acquired by the processing device of the first seat from the storage unit of the cabin is the adjustment value of the last adjustment, the processing device of the first seat can calculate the current height H1 from a preset initial value of the first seat and the adjustment value. The last adjustment value of the seat can be positive, negative or 0: a positive adjustment value indicates that the seat was last adjusted upwards relative to the initial position; a negative adjustment value indicates that the seat was last adjusted downwards relative to the initial position; an adjustment value of 0 indicates that the seat was not adjusted relative to the initial position. For example, when the adjustment value is -6 mm and the initial value of the first seat is 764.4 mm, the processing device of the first seat calculates the current height H1 as 758.4 mm. It should be noted that when the processing device of the first seat calculates the height H1 from a preset initial value of the first seat, the preset initial value may be stored in the processor in advance.
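The two forms of stored seat-height information described above can be summarised in a small sketch; the numbers reproduce the -6 mm / 764.4 mm example from the text, and the function name is illustrative only.

```python
def current_seat_height(stored_value_mm: float, *, is_adjustment: bool,
                        initial_height_mm: float = 764.4) -> float:
    """Return the current height H1 of the first seat.

    If the storage unit holds the absolute height after the previous
    adjustment, return it directly; if it holds the last adjustment value
    (positive = raised, negative = lowered, 0 = unchanged), add it to the
    preset initial value. 764.4 mm is only the example value from the text.
    """
    if is_adjustment:
        return initial_height_mm + stored_value_mm
    return stored_value_mm


print(round(current_seat_height(-6.0, is_adjustment=True), 1))    # 758.4
print(round(current_seat_height(758.4, is_adjustment=False), 1))  # 758.4
```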
S402, determining a first target height of the first seat based on user information.
Specifically, the processing device of the first seat determines a first target height of the first seat based on the user information.
S403, an adjustment amount ΔH1 of the height of the first seat is determined based on the first height H1 and the first target height.
Specifically, after the processing device of the first seat acquires the first height H1 and the first target height, the adjustment amount ΔH1 of the height of the first seat is determined based on the first height H1 and the first target height.
S404, the height is adjusted to the first target height according to the adjustment amount ΔH1.
Specifically, after the processing device of the first seat determines the adjustment amount ΔH1 of the height of the first seat, the height is adjusted to the first target height according to the adjustment amount ΔH1. For example, the processing device of the first seat may send the adjustment amount ΔH1 to a control device or an adjusting device (for example by means of an adjustment message), with which the height of the first seat is adjusted to the first target height.
It should be noted that, the above-mentioned S401 and S402 are not strictly sequential, and S401 may be executed first and S402 may be executed second; or S402 is performed first and S401 is performed next; or S401 and S402 are performed simultaneously.
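Steps S401 to S404 then reduce to the following sketch, in which the adjustment amount ΔH1 is the difference between the first target height and the current height H1 and is handed to the adjusting device; the function names and the sign convention (positive = raise the seat) are assumptions for illustration.

```python
def determine_adjustment(current_h1_mm: float, target_h1_mm: float) -> float:
    # S403: adjustment amount ΔH1 (positive = raise the seat, negative = lower it)
    return target_h1_mm - current_h1_mm


def adjust_first_seat(current_h1_mm: float, target_h1_mm: float,
                      send_to_adjusting_device) -> None:
    # S401/S402 are assumed to have already produced current_h1_mm and
    # target_h1_mm, in either order, as noted in the text.
    delta_h1 = determine_adjustment(current_h1_mm, target_h1_mm)  # S403
    send_to_adjusting_device(delta_h1)                            # S404


adjust_first_seat(758.4, 770.0, lambda d: print(f"ΔH1 = {d:+.1f} mm"))
```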
Next, a method for determining a first target height by the processing apparatus for a first seat according to an embodiment of the present application based on user information will be described in detail with reference to tables 1 to 6 and fig. 5.
In one implementation, the processing device of the first seat may determine the first target height from a first correspondence of user information and the first target height.
Optionally, the first correspondence is a correspondence between a human eye height value and a first target height, as shown in table 1 below; or the first correspondence is a correspondence between the height value and the first target height, as shown in table 2 below; or the first corresponding relation is the corresponding relation between the weight value and the first target height; or the first correspondence is a correspondence between the human eye height range and the first target height, as shown in table 4 below; or the first correspondence is a correspondence of the height range and the first target height, as shown in table 5 below; or the first correspondence is a correspondence between the weight range and the first target height.
For example, when the user information is a human eye height value, table 1 shows a correspondence relationship between a human eye height value and a first target height provided in the embodiment of the present application.
TABLE 1
In the present application, the eye height value refers to the height of the eye of the user relative to the floor of the cabin when the user sits on the rear seat.
For example, when the user information is a height value, table 2 shows a correspondence between a height value and a first target height provided by the embodiment of the present application.
TABLE 2
In the present application, the height refers to the height when the user stands vertically on the ground.
For example, when the user information is a weight value, table 3 shows a correspondence relationship between the weight value and the first target height provided in the embodiment of the present application.
TABLE 3
For example, when the user information is a human eye height range, table 4 shows a correspondence between a human eye height range and a first target height provided in the embodiment of the present application.
TABLE 4
For example, when the user information is a height range, table 5 shows a correspondence between a height range and a first target height provided by the embodiment of the present application.
TABLE 5
For example, when the user information is a weight range, table 6 shows a correspondence between a weight range and a first target height provided in the embodiment of the present application.
TABLE 6
It should be noted that, the above tables 1 to 6 are only examples and not limiting, and any basic modifications of the above tables 1 to 6 should be within the scope of the present application, for example, the correspondence between the sitting posture height and the first target height, or the range division of different human eye heights, height and weight, etc., and the present application is not limited thereto. It should be further understood that the data in tables 1 to 6 do not limit the protection scope of the present application, and the division of the scope may be uniform or non-uniform, which is within the protection scope of the present application.
In addition, the first correspondence relationship may be preset in the processing device or the storage device of the first seat, and the present application is not limited.
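A first correspondence of the range type can be represented as a simple lookup, as sketched below; since the contents of tables 1 to 6 are not reproduced in this text, all boundary and target values in this sketch are hypothetical placeholders.

```python
# Hypothetical first correspondence of the "range" type: eye-height ranges
# (mm above the cabin floor) mapped to first target heights (mm). The
# boundary and target values are placeholders only, not taken from the tables.
FIRST_CORRESPONDENCE = [
    ((0.0, 700.0), 740.0),
    ((700.0, 760.0), 760.0),
    ((760.0, float("inf")), 780.0),
]


def lookup_first_target_height(eye_height_mm: float) -> float:
    for (low, high), target_mm in FIRST_CORRESPONDENCE:
        if low <= eye_height_mm < high:
            return target_mm
    raise ValueError("eye height outside the configured ranges")


print(lookup_first_target_height(722.0))  # 760.0
```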
In another implementation, the processing device of the first seat may determine the first target height by the method 500 shown in fig. 5, taking into account the comfort of human eye viewing and in order to further enhance the user experience. As shown in fig. 5, method 500 includes the following steps.
S501, obtaining the distance D between the vertical plane where the eyes of the user are located and the vertical plane where the window unit is located.
Specifically, the processing device of the first seat may obtain the distance D between the vertical plane in which the user's eyes are located and the vertical plane in which the window unit is located in a plurality of ways, which is not limited by the present application. For example, the processing device of the first seat may obtain the distance D by receiving data from a distance detector in the cabin, for example a camera in the cabin. Or the processing device of the first seat may obtain the distance D by receiving data from a distance detector of the display device, for example a camera of the display device. Or the processing device of the first seat may obtain a preset distance D from the storage device, etc.
S502, determining a first target height based on the distance D, the viewing angle theta of human eyes and user information.
Specifically, as shown in fig. 6, after the processing device of the first seat obtains the distance D between the vertical plane in which the human eye is located and the vertical plane in which the window unit is located, the first target height h may be calculated based on the following formula (1).
h = H′ − D·tanθ    (1)
In the above formula, H′ is the sitting-posture eye height of the user. In the solution of the application, the user information may be the eye height, the height, the weight, or the like; therefore, when the user information acquired by the processing device of the first seat is not the sitting-posture eye height, it can be converted into the sitting-posture eye height H′. Illustratively, when the user information is the height H, since the sitting-posture eye height is generally about 47% of the height, and considering that the seat-back reclining angle γ lies in the range 22.3° ≤ γ ≤ 38.3°, the height can be converted into the sitting-posture eye height H′ by the following formula (2).
H′ = 0.47 × H·cosγ    (2)
In the present embodiment, the viewing angle θ includes a lower viewing angle θ1 or an upper viewing angle θ2. As shown in fig. 1, the lower viewing angle θ1 is the optimal viewing angle of the eye's line of sight relative to the eye's horizontal plane when the eye looks downward, and the upper viewing angle θ2 is the optimal viewing angle of the eye's line of sight relative to the eye's horizontal plane when the eye looks upward. The value range of the lower viewing angle θ1 is -1.5° ≤ θ1 ≤ 0°, and the value range of the upper viewing angle θ2 is 0° ≤ θ2 ≤ 9°. In the present application, a negative viewing angle value means that the line of sight lies below the horizontal line of sight of the human eye, a positive viewing angle value means that the line of sight lies above the horizontal line of sight of the human eye, and a viewing angle value of 0 means the horizontal line of sight of the human eye.
It should be noted that, in order to enhance the comfort of the human eyes when viewing the displayed image, the present application provides optimal comfortable ranges for the lower viewing angle θ1 and the upper viewing angle θ2; the optimal comfortable viewing angle is determined by the key factors listed in table 7 below.
TABLE 7
As can be seen from table 7, the factors affecting the optimal comfortable viewing angle of the human eyes include the downward viewing angle α of the eyes and the head back-tilt angle β. Based on the collection and analysis of a large amount of user data, the head back-tilt angle β generally lies in the range 0° ≤ β ≤ 16.1°, and the downward viewing angle α of the human eyes generally lies in the range 0° ≤ α ≤ 19.7°.
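Formulas (1) and (2) translate directly into the following sketch of step S502; it assumes angles are given in degrees and lengths in millimetres, and the example values at the end are illustrative only.

```python
import math


def sitting_eye_height(stature_h_mm: float, gamma_deg: float) -> float:
    """Formula (2): H' = 0.47 x H x cos(gamma), with 22.3° <= gamma <= 38.3°."""
    return 0.47 * stature_h_mm * math.cos(math.radians(gamma_deg))


def target_height_from_angle(h_prime_mm: float, distance_d_mm: float,
                             theta_deg: float) -> float:
    """Formula (1): h = H' - D x tan(theta); theta is the lower viewing angle
    theta1 (-1.5° to 0°) or the upper viewing angle theta2 (0° to 9°)."""
    return h_prime_mm - distance_d_mm * math.tan(math.radians(theta_deg))


# Illustrative numbers only: a 1700 mm user, seat back reclined 30 degrees,
# window unit 600 mm in front of the eyes, viewed at an upper angle of 5 degrees.
h_prime = sitting_eye_height(1700.0, 30.0)
print(round(target_height_from_angle(h_prime, 600.0, 5.0), 1))
```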
Fig. 7 is a schematic flow chart of a method 700 for adjusting height according to an embodiment of the present application. It is to be appreciated that the method 700 may be performed by a processing device of a display apparatus, or by a processing chip of a display apparatus, or the like, or by a cabin domain controller for adjusting the display apparatus or sending an adjustment message to an apparatus for adjusting the display apparatus, as the application is not limited. The method 700 provided by the present application will be described below with a display device as an execution subject, and as shown in fig. 7, the method 700 includes the following steps.
S701, in response to the user information, adjust the display device to a second target height, the second target height being the height of the center of the window unit of the display device relative to the cabin floor.
The second target height is used for meeting the requirement that a user is in a viewing angle range when watching a virtual image of the display device. I.e. when the display device has adjusted the height to the second target height, the user can be enabled to view the complete image displayed by the display device.
When the display device is mounted in the back of the seat or in the headrest of the seat, the manner of adjusting the display device to the second target height in response to the user information may be: the display device is lifted or lowered according to the user information, so that the height between the center of the window unit of the display device and the ground of the cabin is the second target height.
It is to be understood that the user information may refer to the description in S201, and will not be described herein.
Optionally, the method further comprises S702 before S701.
S702, user information is acquired.
Specifically, the manner in which the display device obtains the user information may refer to the description of obtaining the user information in S202, which is not described herein.
Fig. 8 is a schematic flow chart of a method 800 for adjusting a height of a display device to a second target height according to an embodiment of the application. It is to be appreciated that the method 800 may be performed by a processing device of the display apparatus, which may be a processor, or a processing module in a processing device, such as a chip or the like, or by a cockpit area controller or the like, as the application is not limited. The processing device of the display device will be specifically described below as an example. As shown in fig. 8, method 800 includes the following steps.
S801, a second height H2 of the display device is acquired.
Specifically, the manner in which the display device obtains the second height H2 may refer to the description of obtaining the first height H1 by the first seat in S401, which is not described herein again.
S802, determining a second target height of the display device based on the user information.
Specifically, reference may be made to the description related to S402, which is not repeated here.
S803, an adjustment amount ΔH2 of the height of the display device is determined based on the second height H2 and the second target height.
Specifically, after the processing device of the display device acquires the second height H2 and the second target height, the adjustment amount ΔH2 of the height of the display device is determined based on the second height H2 and the second target height.
S804, the height is adjusted to the second target height according to the adjustment amount ΔH2.
Specifically, after the processing device of the display device determines the adjustment amount ΔH2 of the height of the display device, the height is adjusted to the second target height according to the adjustment amount ΔH2. For example, the processing device of the display device may send the adjustment amount ΔH2 to a control device or an adjusting device, with which the height of the display device is adjusted to the second target height.
Likewise, the steps S801 and S802 are not strictly sequential, and may be performed first S801 and then S802. Or S802 is executed first and S801 is executed next; or S801 and S802 are performed simultaneously.
It will be appreciated that in one possible manner, the processor of the display device may likewise determine the second target height based on a second correspondence of user information to the second target height. For example, the second correspondence is a correspondence between the human eye height value and the second target height; or the second corresponding relation is the corresponding relation between the height value and the second target height; or the second corresponding relation is the corresponding relation between the weight value and the second target height; or the second corresponding relation is the corresponding relation between the human eye height range and the second target height; or the second corresponding relation is the corresponding relation between the height range and the second target height; or the second correspondence is a correspondence between the weight range and the second target height. The form of the second correspondence may refer to the forms of the correspondence of the foregoing tables 1 to 6, and will not be described herein.
It is further understood that the second correspondence may be preset in a processing device or a storage device of the display apparatus, which is not limited by the present application.
In another possible manner, to further enhance the user experience, the viewing angle may also be considered when determining the second target height, and in particular, the processing device of the display apparatus may determine the second target height by the method 500 shown in fig. 5 described above, i.e. the processing device of the display apparatus determines the second target height based on the distance D, the viewing angle θ of the human eye, and the user information. After the processing device of the display device obtains the distance D between the vertical plane where the human eyes of the user are located and the vertical plane where the window unit is located through S501, the second target height may be determined through S502.
Fig. 9 is a schematic flow chart of a method 900 for adjusting height according to an embodiment of the present application. It is to be appreciated that the method 900 may be performed by a processing device of the second seat, or by a processing chip of the second seat, etc., or by a cabin controller for adjusting the second seat or sending an adjustment message to a device for adjusting the second seat, as the application is not limited. The method 900 provided by the present application is described below with the second seat as the execution subject, and as shown in fig. 9, the method 900 includes the following steps.
S901, in response to the user information, adjusting the second seat to a third target height, the third target height being a height of a seat cushion plane of the second seat relative to a cabin floor, the second seat being a seat on which the user sits.
The third target height is used for meeting the requirement that a user is in a viewing angle range when viewing a virtual image of the display device. I.e. when the second seat is adjusted to a third target height, the user can be enabled to view the complete image displayed by the display device.
When the display device is mounted on the back of the seat, or on the headrest of the seat, or in the secondary console, the manner in which the second seat is adjusted to the third target height in response to the user information may be: the second seat raises or lowers the seat cushion of the second seat according to the user information so that the seat cushion plane of the second seat is at a third target height with respect to the cabin floor.
It is to be understood that the user information may refer to the description in S201, and will not be described herein.
Optionally, before S901, the method further includes S902.
S902, acquiring user information.
Specifically, the manner in which the second seat obtains the user information may refer to the description of obtaining the user information in S202, which is not described herein.
Fig. 10 is a schematic flow chart of a method 1000 for adjusting the height of a second seat to a third target height according to an embodiment of the present application. It is to be appreciated that the method 1000 may be performed by a processing device of the second seat, which may be a processor, or a processing module in a processing device, such as a chip or the like, or by a cabin controller or the like, as the application is not limited. The processing apparatus of the second seat will be specifically described below as an example. As shown in fig. 10, method 1000 includes the following steps.
S1001, a third height H3 of the second seat is acquired.
Specifically, the manner in which the second seat obtains the third height H3 may refer to the description in S401 that the first seat obtains the first height H1, which is not described herein.
S1002, determining a third target height of the second seat based on the user information.
Specifically, reference may be made to the description related to S402, which is not repeated here.
S1003, an adjustment amount ΔH3 of the height of the second seat is determined based on the third height H3 and the third target height.
Specifically, after the processing device of the second seat acquires the third height H3 and the third target height, the adjustment amount ΔH3 of the height of the second seat is determined based on the third height H3 and the third target height.
S1004, the height is adjusted to the third target height according to the adjustment amount ΔH3.
Specifically, after the processing device of the second seat determines the adjustment amount ΔH3 of the height of the second seat, the height is adjusted to the third target height according to the adjustment amount ΔH3. For example, the processing device of the second seat may send the adjustment amount ΔH3 to a control device, with which the height of the second seat is adjusted to the third target height.
Likewise, the steps S1001 and S1002 are not strictly sequential, and S1001 may be executed first and S1002 may be executed second; or S1002 is performed first and S1001 is performed next; or S1001 and S1002 are performed simultaneously.
It will be appreciated that in one possible manner, the processor of the second seat may likewise determine a third target height from a third correspondence of user information and the third target height. For example, the third correspondence is a correspondence between the human eye height value and a third target height; or the third corresponding relation is the corresponding relation between the height value and the third target height; or the third corresponding relation is the corresponding relation between the weight value and the third target height; or the third corresponding relation is the corresponding relation between the human eye height range and the third target height; or the third corresponding relation is the corresponding relation between the height range and the third target height; or the third correspondence is a correspondence between the weight range and a third target height. The form of the third correspondence may refer to the forms of the correspondence of the foregoing tables 1 to 6, and will not be described herein.
It is further understood that the third correspondence may be preset in the processing device or the storage device of the second seat, and the present application is not limited thereto.
In another possible manner, to further enhance the user experience, the viewing angle may also be considered when determining the third target height, and in particular, the processing device of the second seat may determine the third target height by the method 500 shown in fig. 5 described above, i.e. the processing device of the second seat determines the third target height based on the distance D, the viewing angle θ of the human eye, and the user information. After the processing device of the second seat obtains the distance D between the vertical plane where the eyes of the user are located and the vertical plane where the window unit is located through S501, the third target height may be determined through S502.
Fig. 11 is a schematic flowchart of a method 1100 for adjusting an angle of a display device according to an embodiment of the present application. The method can be applied to the cabin virtual image display system shown in fig. 1. The method 1100 may be performed by a display device, or by a processing apparatus in a display device, or by a processing chip in a display device, etc., and the application is not limited thereto. The method 1100 provided by the present application is described below with the display device as the execution subject, and as shown in fig. 11, the method 1100 includes the following steps.
S1101, acquiring the human eye position and the ideal human eye position in the image.
In the solution of the application, the display device may obtain the human eye position and the ideal human eye position in the image by receiving data from a sensing device in the cabin. In one possible way, a camera or similar device that records the state of the passengers in the cabin in real time may capture an image of a passenger in a sitting posture, and the human eye position in the image and the ideal human eye position in the image can be determined from that image. Or, in another realizable way, an image sensor provided in the display device, for example a camera used to capture and track the human eye, may capture an image of the eyes while the passenger is in a sitting posture, and the human eye position in the image and the ideal human eye position in the image can be determined from that image.
S1102, an adjustment amount Δθ of the angle of the display device is determined based on the human eye position in the image and the ideal human eye position.
After acquiring the human eye position and the ideal human eye position in the image, the display device determines, through digital image processing, a distance d between the human eye position and the ideal human eye position, and generates the adjustment amount Δθ of the angle of the display device according to a correspondence between the distance d and the adjustment amount.
S1103, adjusting the angle of the display device according to the adjustment amount Δθ.
Specifically, after the adjustment amount Δθ is determined by the display device, the current angle is adjusted to an ideal position according to the adjustment amount Δθ, so that the human eye can be at the ideal position when viewing the image.
In the embodiment of the present application, the angle adjustment amount Δθ of the display device may be positive or negative, or may be 0. For example, when the adjustment amount Δθ is a positive number, it may indicate that the display device needs to be rotated upward with respect to the horizontal plane, when the adjustment amount Δθ is a negative number, it may indicate that the display device needs to be rotated downward with respect to the horizontal plane, and when the adjustment amount Δθ is 0, it may indicate that the display device does not need to be adjusted.
For example, as shown in (a) of fig. 12, when the human eye position in the image acquired by the display device is located above the ideal human eye position, the display device may determine the distance d between the human eye position and the ideal human eye position by an image processing algorithm, as shown in (b) of fig. 12, and generate the adjustment amount Δθ of the angle of the display device according to the distance d. Subsequently, the display device performs the angle adjustment according to the generated adjustment amount Δθ. It will be appreciated that when the display device completes the angle adjustment, as in (c) of fig. 12, the eye position in the image obtained by the display device after the angle adjustment substantially coincides with the ideal eye position.
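A minimal sketch of how the distance d could be mapped to Δθ is given below; the gain, the dead band and the sign convention are assumptions, since the embodiment only states that a correspondence between d and the adjustment amount exists.

```python
# Illustrative sketch of S1102 and S1103 (hypothetical calibration constants).

DEG_PER_PIXEL = 0.02    # hypothetical gain: degrees of tilt per pixel of offset
DEADBAND_PIXELS = 5     # hypothetical: offsets this small need no adjustment

def angle_adjustment(eye_row_px, ideal_row_px):
    """Positive: rotate the display upward; negative: downward; 0: no change."""
    d = eye_row_px - ideal_row_px        # image rows grow downward
    if abs(d) <= DEADBAND_PIXELS:
        return 0.0
    # Assumption: an eye detected above the ideal position (smaller row index,
    # d < 0) calls for an upward rotation, i.e. a positive adjustment.
    return -d * DEG_PER_PIXEL
```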
It should be noted that the angle adjustment method for the display device according to the embodiment of the present application may be used in combination with any one or more of the height adjustment methods described above with reference to fig. 2 to 10. For example, when further adjusting the relative height between the display device and the user's eye height can no longer improve eye comfort and similar aspects of the experience, the user's requirements can still be met by adjusting the angle of the display device.
The method for adjusting the height according to the embodiment of the present application is described in detail above with reference to fig. 2 to 10, and the method for adjusting the angle according to the embodiment of the present application is described with reference to fig. 11 and 12. The display device and the processing apparatus provided by the embodiment of the application will be described in detail below with reference to fig. 13 and 14.
Fig. 13 is a schematic structural diagram of a display device 1300 according to an embodiment of the present application, where the display device 1300 can be applied to the cabin virtual image display system shown in fig. 1. As shown in fig. 13, the display device 1300 comprises, arranged in order along the transmission direction of the image light, an image generation unit 1301, a window unit 1302, and an image magnification unit 1303. The image generation unit 1301 is configured to emit image light to the window unit 1302. The window unit 1302 is configured to reflect the image light from the image generation unit 1301 to the image magnification unit 1303 and to transmit the image light from the image magnification unit 1303; the window unit 1302 also allows the human eye to view, through the window unit 1302, the virtual image formed by the image light. The image magnification unit 1303 is configured to reflect the image light from the window unit 1302 back to the window unit 1302.
Optionally, in this embodiment, the image magnification unit 1303 is a free-form surface mirror.
Specifically, when a passenger views a video image using the display device 1300, the image generation unit 1301 emits image light to the window unit 1302; the image light is reflected by the window unit 1302 onto the image magnification unit 1303, reflected again by the image magnification unit 1303 back onto the window unit 1302, and finally transmitted through the window unit 1302. At this point the passenger can see, through the window unit 1302, a large video image located at a distance.
Fig. 14 is a schematic structural diagram of a processing apparatus 1400 according to an embodiment of the present application, where the processing apparatus 1400 can be applied to the cabin virtual image display system shown in fig. 1. As shown in fig. 14, the processing apparatus 1400 includes an acquisition unit 1410 and a processing unit 1420.
An obtaining unit 1410, configured to obtain user information, where the user information includes at least one of the following: the height of the user's eyes, the height of the user, and the weight of the user.
A processing unit 1420 for adjusting the height of the first seat to a first target height in response to user information; or for adjusting the height of the display device to a second target height in response to the user information; or for adjusting the height of the second seat to a third target height in response to user information. The first target height is the height of a part of the structure of the first seat relative to the ground of the cabin, and the first seat is a seat for installing the display device; the second target height is the height of the center of the window unit of the display device relative to the ground of the cabin; the third target height is the height of the seat cushion plane of the second seat relative to the ground of the cabin, and the second seat is the seat for the user to sit on.
The first target height, the second target height and the third target height are used for meeting the requirement that a user is in a viewing angle range when viewing a virtual image of the display device.
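Functionally, the selection among the three target heights can be sketched as below; the mapping functions and coefficients are hypothetical stand-ins for the correspondences of the method embodiments, not values given in this application.

```python
from dataclasses import dataclass

# Illustrative sketch of processing unit 1420 (hypothetical mappings).

@dataclass
class UserInfo:
    eye_height_mm: float
    height_mm: float
    weight_kg: float

def target_height(user: UserInfo, element: str) -> float:
    if element == "first_seat":    # height of part of the first-seat structure
        return 0.25 * user.height_mm       # hypothetical mapping
    if element == "display":       # height of the window-unit centre
        return user.eye_height_mm          # assumption: roughly align with the eyes
    if element == "second_seat":   # height of the seat-cushion plane
        return 0.20 * user.height_mm       # hypothetical mapping
    raise ValueError(f"unknown element: {element}")
```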
Optionally, the processing device 1400 further comprises a storage unit 1430 for storing computer programs or instructions and/or data, the processing unit 1420 being configured to execute the computer programs or instructions stored in the storage unit 1430 or to read the data stored in the storage unit 1430 for performing the methods in the method embodiments above.
Optionally, there are one or more storage units 1430.
Optionally, the storage unit 1430 may be integrated with the processing unit 1420 or may be provided separately.
It should be appreciated that the processing device 1400 herein is embodied in the form of functional units. The term "unit" herein may refer to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (e.g., a shared, dedicated, or group processor, etc.) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that support the described functionality. In an alternative example, it will be understood by those skilled in the art that the processing device 1400 may be specifically a processing device of the first seat, or a processing device of the display device, or a processing device of the second seat in the foregoing embodiment, and the processing device 1400 may be configured to execute each flow and/or step corresponding to each device in the foregoing method embodiment, which is not described herein for avoiding repetition.
When the processing device 1400 is a chip, the acquisition unit 1410 may be an input/output circuit or a communication interface, and the processing unit 1420 may be a processor, a microprocessor or an integrated circuit integrated on the chip.
It should be appreciated that, in embodiments of the present application, the processor of the apparatus described above may be a central processing unit (Central Processing Unit, CPU), and may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor, etc.
In implementation, the steps of the above methods may be performed by integrated logic circuits of hardware in a processor or by instructions in the form of software. The steps of the methods disclosed in connection with the embodiments of the present application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software elements in the processor. The software elements may be located in a storage medium well known in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in a memory, and the processor executes the instructions in the memory to complete, in combination with its hardware, the steps of the methods described above. To avoid repetition, a detailed description is not provided herein.
Embodiments of the present application also provide a computer-readable storage medium having stored thereon a computer program or instructions which, when executed, cause a computer to perform the steps/methods as previously performed by the respective processing devices.
The embodiment of the application also provides a computer program product, which comprises: computer program code which, when run on a computer, causes the computer to perform the steps/methods performed by the respective processing devices described above.
The embodiment of the application also provides a display system, which comprises the display device and the processing equipment in the embodiment.
The embodiment of the application also provides a vehicle, and the vehicle is provided with the display system. It is to be understood that the present application may be applied to vehicles including, but not limited to, automobiles, airplanes, trains, or ships.
Fig. 15 is a schematic diagram of a height-adjustable display system 1500 according to an embodiment of the application. As shown in fig. 15, the display system 1500 includes a first seat 1510, a display device 1520, a second seat 1530, an acquisition device 1540, a processing device 1550, and an adjustment device 1560. The display device 1520 is mounted on the back of the first seat 1510, and the user sits on the second seat 1530. In the cabin display system, the first seat 1510 is a front seat, for example the seat on which the driver sits or the front passenger seat, and the second seat 1530 is a seat located behind the first seat 1510. The display device 1520 may be the display device shown in fig. 13 described above. The acquisition device 1540 may be a sensor within the cabin, such as an image sensor or a pressure sensor. When the acquisition device 1540 is an image sensor, it may be a camera located at the top of the cabin or near the window unit of the display device 1520 (as shown in fig. 15); when the acquisition device 1540 is a pressure sensor, it may be located in the second seat 1530. The processing device 1550 may be the processing device that performs the operations of the first seat 1510 in the above-described embodiments, for performing the methods shown in fig. 3 to 5; or the processing device that performs the operations of the display device 1520, for performing the methods shown in fig. 7 and 8; or the processing device that performs the operations of the second seat 1530, for performing the methods shown in fig. 9 and 10. The adjustment device 1560 may be a rotating shaft, a sliding rail, or the like. The processing device 1550 is connected to the acquisition device 1540 and exchanges information with it over the connecting link; the processing device 1550 is likewise connected to the adjustment device 1560, and the two exchange information over the connecting link. The adjustment device 1560 is coupled to at least one of the first seat 1510, the second seat 1530, and the display device 1520. It will be appreciated that each connection may be a wired link, a wireless link, or a direct or indirect connection through other network devices, controllers, etc.
Specifically, when the user sits on the second seat 1530, the acquisition device 1540 acquires the user information and transmits it to the processing device 1550. Based on the user information and the relative height, the processing device 1550 determines the adjustment amount ΔH1 of the height of the first seat, or the adjustment amount ΔH2 of the height of the display device, or the adjustment amount ΔH3 of the height of the second seat, and sends an adjustment message to the adjustment device 1560. For example, the adjustment message includes a specific adjustment amount, such as at least one of the adjustment amounts ΔH1, ΔH2 and ΔH3. Subsequently, the adjustment device 1560 adjusts the height of the first seat according to the adjustment amount ΔH1, or adjusts the height of the display device according to the adjustment amount ΔH2, or adjusts the height of the second seat according to the adjustment amount ΔH3.
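The cycle just described could be expressed in software roughly as follows; the choice of the first seat as the adjusted element, the helper names and the message format are assumptions made purely for illustration.

```python
# Illustrative sketch of one adjustment cycle in display system 1500.

def adjustment_amount(current_height_mm, target_height_mm):
    """ΔH may be positive (raise), negative (lower) or zero (no change)."""
    return target_height_mm - current_height_mm

def run_once(acquire_user_info, compute_first_target_height,
             current_first_seat_height_mm, send_adjustment):
    user_info = acquire_user_info()                      # acquisition device 1540
    target = compute_first_target_height(user_info)      # processing device 1550
    delta_h1 = adjustment_amount(current_first_seat_height_mm, target)
    send_adjustment({"element": "first_seat", "delta_mm": delta_h1})  # to device 1560
```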
It is appreciated that the display system 1500 may also include a plurality of first seats 1510, display devices 1520, second seats 1530, acquisition devices 1540, processing devices 1550, and adjustment devices 1560.
It should be noted that, in the display system 1500 provided in the embodiment of the present application, the relative height between the height of the display device and the height of the human eye satisfies that the user is in the viewing angle range when viewing the virtual image of the display device. The height of the display device and the height of the human eyes can be referred to the description in the above embodiments, and will not be repeated here.
Fig. 16 is a schematic circuit diagram of a display device according to an embodiment of the application. As shown in fig. 16, the circuits in the display device mainly include a main processor (host CPU) 1201, an external memory interface 1202, an internal memory 1203, an audio module 1204, a video module 1205, a power supply module 1206, a wireless communication module 1207, an I/O interface 1208, a video interface 1209, a display circuit 1210, a modulator 1212, and the like. The main processor 1201 and its peripheral components, such as the external memory interface 1202, the internal memory 1203, the audio module 1204, the video module 1205, the power module 1206, the wireless communication module 1207, the I/O interface 1208, the video interface 1209, and the display circuit 1210, may be connected via a bus. The main processor 1201 may be referred to as a front-end processor.
In addition, the circuit diagram illustrated in the embodiment of the present application does not constitute a specific limitation of the display device. In other embodiments of the application, the display device may include more or less components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The main processor 1201 includes one or more processing units. For example, the main processor 1201 may include an application processor (Application Processor, AP), a modem processor, a graphics processing unit (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor, and/or a neural network processing unit (Neural-Network Processing Unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
A memory may also be provided in the main processor 1201 for storing instructions and data. In some embodiments, the memory in the main processor 1201 is a cache memory. The memory may hold instructions or data that is just used or recycled by the main processor 1201. If the main processor 1201 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided, reducing the latency of the main processor 1201, and thus improving the efficiency of the system.
In some embodiments, the display device may also include a plurality of input/output (I/O) interfaces 1208 connected to the main processor 1201. The I/O interface 1208 can include an inter-integrated circuit (Inter-Integrated Circuit, I2C) interface, an inter-integrated circuit sound (Inter-Integrated Circuit Sound, I2S) interface, a pulse code modulation (Pulse Code Modulation, PCM) interface, a universal asynchronous receiver/transmitter (Universal Asynchronous Receiver/Transmitter, UART) interface, a mobile industry processor interface (Mobile Industry Processor Interface, MIPI), a general-purpose input/output (General-Purpose Input/Output, GPIO) interface, a subscriber identity module (Subscriber Identity Module, SIM) interface, and/or a universal serial bus (Universal Serial Bus, USB) interface, among others. The I/O interface 1208 may be connected to a mouse, a touch pad, a keyboard, a camera, a speaker/horn, a microphone, or a physical key (e.g., a volume key, a brightness adjustment key, an on/off key, etc.) on the display device.
The external memory interface 1202 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the display device. The external memory card communicates with the main processor 1201 through the external memory interface 1202 to realize a data storage function.
The internal memory 1203 may be used to store computer executable program code that includes instructions. The internal memory 1203 may include a stored program area and a stored data area. The storage program area may store an operating system, an application program (such as a call function, a time setting function, etc.) required for at least one function, and the like. The storage data area may store data created during use of the display device (e.g., phone book, universal time, etc.), etc. In addition, the internal memory 1203 may include a high speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (Universal Flash Storage, UFS), and the like. The main processor 1201 performs various functional applications of the display apparatus and data processing by executing instructions stored in the internal memory 1203 and/or instructions stored in a memory provided in the main processor 1201.
The display device may implement audio functions through the audio module 1204, an application processor, and the like. Such as music playing, talking, etc.
The audio module 1204 is used to convert digital audio information into an analog audio signal output, and also to convert an analog audio input into a digital audio signal. The audio module 1204 may also be used to encode and decode audio signals, such as for playback or recording. In some embodiments, the audio module 1204 may be provided in the main processor 1201, or some of the functional modules of the audio module 1204 may be provided in the main processor 1201.
The video interface 1209 may receive an externally input audio/video signal and may specifically be a high-definition multimedia interface (High Definition Multimedia Interface, HDMI), a digital visual interface (Digital Visual Interface, DVI), a video graphics array (Video Graphics Array, VGA), a display port (Display Port, DP), etc.; the video interface 1209 may also output video. When the display device is used as an in-vehicle display, the video interface 1209 may receive a speed signal and an electric quantity signal input by a peripheral device, and may also receive a VR video signal input from the outside. When the display device is used, the video interface 1209 may receive a video signal input from an external computer or a terminal device.
The video module 1205 may decode the video input through the video interface 1209, for example perform H.264 decoding. The video module may also encode the video collected by the display device, for example perform H.264 encoding on the video collected by an external camera. In addition, the main processor 1201 may decode the video input from the video interface 1209 and output the decoded image signal to the display circuit 1210.
The display circuit 1210 and modulator 1212 are used to display a corresponding image. In this embodiment, the video interface 1209 receives an externally input video source signal, and the video module 1205 decodes and/or digitizes the video source signal to output one or more image signals to the display circuit 1210, and the display circuit 1210 drives the modulator 1212 to image the incident polarized light according to the input image signal, so as to output image light. In addition, the main processor 1201 may output one or more image signals to the display circuit 1210.
In this embodiment, the display circuit 1210 and the modulator 1212 belong to the electronic element in the image generation unit described above, and the display circuit 1210 may be referred to as a driving circuit.
The power module 1206 is configured to provide power to the main processor 1201 and the light source 1200 based on input power (e.g., direct current), and the power module 1206 may include a rechargeable battery therein, which may provide power to the main processor 1201 and the light source 1200. Light from light source 1200 may be transmitted to modulator 1212 for imaging to form an image light signal.
The wireless communication module 1207 may enable the display device to communicate wirelessly with the outside world, and may provide solutions for wireless communication such as wireless local area network (Wireless Local Area Networks, WLAN) (e.g., wireless fidelity (Wireless Fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field communication (Near Field Communication, NFC), infrared (IR), etc. The wireless communication module 1207 may be one or more devices that integrate at least one communication processing module. The wireless communication module 1207 receives electromagnetic waves via an antenna, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the main processor 1201. The wireless communication module 1207 may also receive a signal to be transmitted from the main processor 1201, frequency modulate and amplify the signal, and convert it into electromagnetic waves to be radiated through the antenna.
In addition, the video data decoded by the video module 1205 may be received wirelessly by the wireless communication module 1207 or read from an external memory, for example, the display device may receive video data from a terminal device or an in-vehicle entertainment system through a wireless lan in the vehicle, and the display device may read audio/video data stored in the external memory, in addition to the video data input through the video interface 1209.
The display device may be mounted on a vehicle, please refer to fig. 17, fig. 17 is a schematic diagram of a possible functional frame of a vehicle according to an embodiment of the present application.
As shown in FIG. 17, various subsystems may be included in the functional framework of the vehicle, such as a sensor system 12, a control system 14, one or more peripheral devices 16 (one shown in the illustration), a power supply 18, a computer system 20, and an on-board display system 22 in the illustration. Alternatively, the vehicle may include other functional systems, such as an engine system to power the vehicle, etc., as the application is not limited herein.
The sensor system 12 may include a plurality of sensing devices that sense the measured information and convert the sensed information to an electrical signal or other desired form of information output according to a certain rule. As shown, these detection devices may include, but are not limited to, a global positioning system (global positioning system, GPS), a vehicle speed sensor, an inertial measurement unit (inertial measurement unit, IMU), a radar unit, a laser rangefinder, an imaging device, a wheel speed sensor, a steering sensor, a gear sensor, or other elements for automatic detection, and so forth.
The control system 14 may include several elements such as a steering unit, a braking unit, a lighting system, an autopilot system, a map navigation system, a network timing system, and an obstacle avoidance system as shown. Optionally, control system 14 may also include elements such as throttle controls and engine controls for controlling the speed of travel of the vehicle, as the application is not limited.
Peripheral device 16 may include several elements such as the communication system in the illustration, a touch screen, a user interface, a microphone, and a speaker, among others. Wherein the communication system is used for realizing network communication between the vehicle and other devices except the vehicle. In practical applications, the communication system may employ wireless communication technology or wired communication technology to enable network communication between the vehicle and other devices. The wired communication technology may refer to communication between the vehicle and other devices through a network cable or an optical fiber, etc.
The power source 18 represents a system that provides power or energy to the vehicle, which may include, but is not limited to, a rechargeable lithium battery or lead acid battery, or the like. In practical applications, one or more battery packs in the power supply are used to provide electrical energy or power for vehicle start-up, the type and materials of the power supply are not limiting of the application.
Several functions of the vehicle are performed under the control of the computer system 20. The computer system 20 may include one or more processors 2001 (one processor is shown in the figure) and a memory 2002 (which may also be referred to as a storage device). In practical applications, the memory 2002 may be internal to the computer system 20 or external to it, for example serving as a cache in the vehicle, and the application is not limited thereto.
The processor 2001 may include one or more general-purpose processors, such as a graphics processing unit (Graphics Processing Unit, GPU). The processor 2001 may be used to execute the related programs, or instructions corresponding to the programs, stored in the memory 2002 to implement the corresponding functions of the vehicle.
The memory 2002 may include volatile memory, such as a random access memory (RAM); it may also include non-volatile memory, such as a read-only memory (ROM), a flash memory, a hard disk drive (HDD) or a solid state drive (SSD); the memory 2002 may also include a combination of the above types of memory. The memory 2002 may be used to store a set of program codes, or instructions corresponding to the program codes, so that the processor 2001 can invoke the program codes or instructions stored in the memory 2002 to implement the corresponding functions of the vehicle. In the present application, the memory 2002 may store a set of program codes for vehicle control, and the processor 2001 may call the program codes to control the safe running of the vehicle, and how the safe running of the vehicle is achieved will be described in detail below.
Alternatively, the memory 2002 may store information such as road maps, driving routes, sensor data, and the like, in addition to program codes or instructions. The computer system 20 may implement the relevant functions of the vehicle in combination with other elements in the functional framework schematic of the vehicle, such as sensors in the sensor system, GPS, etc. For example, the computer system 20 may control the direction of travel or speed of travel of the vehicle, etc., based on data input from the sensor system 12, and the application is not limited.
In-vehicle display system 22 may include several elements, such as a controller and an in-vehicle display. The controller 222 is configured to generate an image (e.g., an image of VR content) according to a user instruction, and send the image to the in-vehicle display for display; the in-vehicle display may include an image generating unit, a window unit through which a passenger can view a target image presented by the in-vehicle display, and an image enlarging unit. The functions of some elements in the vehicle display system may also be implemented by other subsystems of the vehicle, for example, the controller may also be an element in the control system.
In this regard, FIG. 17 of the present application is shown to include four subsystems, sensor system 12, control system 14, computer system 20, and in-vehicle display system 22, by way of example only, and not by way of limitation. In practical applications, the vehicle may combine several elements in the vehicle according to different functions, thereby obtaining subsystems with corresponding different functions. In practice, the vehicle may include more or fewer systems or elements, and the application is not limited.
The vehicle may be a car, truck, bus, ship, airplane, helicopter, recreational vehicle, train, etc., and the embodiment of the application is not particularly limited.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs.
The above embodiments are only examples of the present application, and are not intended to limit the present application, and any modifications, equivalent substitutions, improvements, etc. made on the basis of the present application should be included in the scope of the present application.

Claims (10)

1. A system for adjusting a height, the system comprising: a first seat, a second seat, a display device, a processing device, an acquisition device and an adjusting device, wherein the display device is arranged in the first seat, a user sits on the second seat to watch an image displayed by the display device, the adjusting device is connected with at least one of the first seat, the second seat and the display device, the acquisition device is connected with the processing device, and the adjusting device is connected with the processing device, wherein
The acquisition device is used for acquiring user information and sending the user information to the processing device;
The processing device is used for determining the relative height between the height of the display device and the human eye height according to the user information, generating an adjustment quantity based on the relative height, and sending the adjustment quantity to the adjustment device, wherein the height of the display device is the height of the center of a window unit of the display device relative to the ground of a cabin, the human eye height of the user is the height of the human eye of the user relative to the ground of the cabin, and the relative height between the height of the display device and the human eye height meets the requirement that the user is in a viewing angle range when viewing a virtual image of the display device;
the adjusting device is used for adjusting the height of the first seat, the height of the display device or the height of the second seat according to the adjusting quantity.
2. The system of claim 1, wherein
The processing device is specifically configured to determine an adjustment amount ΔH1 of the height of the first seat, or an adjustment amount ΔH2 of the height of the display device, or an adjustment amount ΔH3 of the height of the second seat, according to the user information and the relative height, and send any one of the adjustment amount ΔH1, the adjustment amount ΔH2, and the adjustment amount ΔH3 to the adjusting device;
The adjusting device is specifically configured to adjust the height of the first seat according to the adjustment amount ΔH1, or adjust the height of the display device according to the adjustment amount ΔH2, or adjust the height of the second seat according to the adjustment amount ΔH3.
3. The system of claim 2, wherein
The processing device is specifically configured to: acquire a first height H1 of the first seat, determine a first target height based on the user information, and determine the adjustment amount ΔH1 based on the first height H1 and the first target height.
4. The system according to claim 3, wherein the processing device is specifically configured to: determine the first target height based on a correspondence between the user information and the first target height.
5. The system according to claim 3, wherein the processing device is specifically configured to: acquire a distance D between a vertical plane where the human eyes of the user are located and a vertical plane where the window unit is located, and determine the first target height based on the distance D, the viewing angle θ of the human eyes, and the user information.
6. The system of claim 5, wherein the viewing angle θ comprises a lower viewing angle θ1 or an upper viewing angle θ2, the lower viewing angle θ1 being the angle, relative to the horizontal plane in which the human eye is located, at which the human eye looks downward, and the upper viewing angle θ2 being the angle, relative to the horizontal plane in which the human eye is located, at which the human eye looks upward, wherein the lower viewing angle θ1 is in the range -1.5° ≤ θ1 ≤ 0°, and the upper viewing angle θ2 is in the range 0° ≤ θ2 ≤ 9°.
7. The system of claim 1, wherein the user information comprises at least one of: the height of the user's eyes, the height of the user, the position of the eyes in the image, and the weight of the user.
8. The system of claim 1, wherein the display device comprises: an image generation unit, the window unit and an image magnification unit, wherein,
The image generating unit is used for emitting image light to the window unit;
The window unit is used for reflecting the image light from the image generation unit to the image magnification unit and transmitting the image light from the image magnification unit, and is also used for allowing human eyes to view, through the window unit, a virtual image formed by the image light;
The image magnification unit is used for reflecting the image light from the window unit back to the window unit.
9. A vehicle comprising a height adjustment system according to claim 1.
10. The vehicle of claim 9, wherein the display device is disposed on at least one of a headrest of a seat of the vehicle, a seatback of a seat of the vehicle, and a secondary console of the vehicle.
CN202322595097.2U 2023-09-22 2023-09-22 Height-adjusting system and vehicle Active CN221162527U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202322595097.2U CN221162527U (en) 2023-09-22 2023-09-22 Height-adjusting system and vehicle

Publications (1)

Publication Number Publication Date
CN221162527U true CN221162527U (en) 2024-06-18

Family

ID=91435727

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202322595097.2U Active CN221162527U (en) 2023-09-22 2023-09-22 Height-adjusting system and vehicle

Country Status (1)

Country Link
CN (1) CN221162527U (en)

Legal Events

Date Code Title Description
GR01 Patent grant