CN115457918A - Processing method and device - Google Patents

Processing method and device

Info

Publication number
CN115457918A
CN115457918A (application CN202211214444.6A)
Authority
CN
China
Prior art keywords
information
determining
target object
viewing effect
ambient light
Prior art date
2022-09-30
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211214444.6A
Other languages
Chinese (zh)
Inventor
王煜坤
董文泉
刘振东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2022-09-30
Filing date
2022-09-30
Publication date
2022-12-09
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202211214444.6A
Publication of CN115457918A
Legal status: Pending

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0626 Adjustment of display parameters for control of overall brightness
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/066 Adjustment of display parameters for control of contrast
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00 Control of display operating conditions
    • G09G2320/06 Adjustment of display parameters
    • G09G2320/0666 Adjustment of display parameters for control of colour parameters, e.g. colour temperature

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The embodiments of the present application provide a processing method and a device. The processing method includes the following steps: obtaining environment information, where the environment information includes ambient light information and target object information of the environment where the display device is located; determining a first viewing effect of the display device according to the ambient light information and the target object information; and, in a case where the first viewing effect satisfies a trigger condition, adjusting the state of the display device according to the first viewing effect so that the display device has a second viewing effect.

Description

Processing method and device
Technical Field
The embodiments of the present application relate to a processing method and device.
Background
If there is a strong light source in the surrounding environment, the screen is prone to specular reflection, which affects the user's viewing of the screen.
Disclosure of Invention
An embodiment of the present application provides a processing method, including the following steps:
obtaining environment information, where the environment information includes ambient light information and target object information of the environment where the display device is located;
determining a first viewing effect of the display device according to the ambient light information and the target object information;
and, in a case where the first viewing effect satisfies a trigger condition, adjusting the state of the display device according to the first viewing effect so that the display device has a second viewing effect.
As an optional embodiment, the target object information includes target object position information and a target object orientation, and the method further includes:
determining an activity area of the target object according to the position information and the orientation information of the target object;
determining the state adjustment range of the display device according to the target object activity area;
and adjusting the state of the display device according to the state adjustment range and the first viewing effect so that the display device has a second viewing effect.
As an alternative embodiment, the ambient light information includes an incident angle of ambient light, and the method further includes:
determining reflection information of the ambient light according to the incident information of the ambient light;
and determining a first viewing effect of the display device according to the reflection information of the ambient light and the target object information.
As an alternative embodiment, the method further comprises: determining a reflection path of the ambient light based on the reflection information of the ambient light, where determining that the first viewing effect satisfies the trigger condition includes at least one of:
determining a target object position based on the target object information, and if the target object position and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
if the target object activity area and the reflection path satisfy the intersection condition, determining that the first viewing effect satisfies the trigger condition.
As an alternative embodiment, the method further comprises the following steps:
obtaining an environment image;
determining brightness information of the environment image;
and determining the ambient light information according to the brightness information of the ambient image.
As an optional embodiment, the method further comprises at least one of:
determining an area with brightness information exceeding a preset value in the environment image as a target area, and determining a first viewing effect of the display device according to the target area and the target object information; and
determining an area with brightness information exceeding a preset value in the environment image as a target area, determining an adjusting parameter value required by the target area to be adjusted to a target state, and determining a first viewing effect of the display device according to the adjusting parameter value and the target object information.
Another embodiment of the present application provides a processing apparatus, including:
an obtaining module, configured to obtain environment information, where the environment information includes ambient light information and target object information of the environment where the display device is located;
a first determining module, configured to determine a first viewing effect of the display device according to the ambient light information and the target object information;
and an adjusting module, configured to adjust the state of the display device according to the first viewing effect in a case where the first viewing effect satisfies a trigger condition, so that the display device has a second viewing effect.
As an optional embodiment, the target object information includes target object position information and a target object orientation, and the processing apparatus further includes:
a second determining module, configured to determine the activity area of the target object according to the position information and the orientation information of the target object;
a third determining module, configured to determine the state adjustment range of the display device according to the target object activity area;
the adjusting module is further configured to adjust the state of the display device according to the state adjustment range and the first viewing effect, so that the display device has a second viewing effect.
As an alternative embodiment, the ambient light information includes an incident angle of an ambient light ray, and the processing device further includes:
the fourth determining module is used for determining the reflection information of the ambient light according to the incident information of the ambient light;
and the fifth determining module is used for determining the first viewing effect of the display equipment according to the reflection information of the ambient light and the target object information.
As an optional embodiment, the processing apparatus further comprises:
a sixth determining module, configured to determine a reflection path of the ambient light according to reflection information of the ambient light;
the determining that the first viewing effect satisfies a trigger condition comprises at least one of:
determining a target object position based on the target object information, and if the target object position and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
if the target object activity area and the reflection path satisfy the intersection condition, determining that the first viewing effect satisfies the trigger condition.
Drawings
Fig. 1 is a flowchart of a processing method in the embodiment of the present application.
FIG. 2 is a flow chart of a processing method in another embodiment of the present application.
Fig. 3 is a flow chart of a processing method in another embodiment of the present application.
Fig. 4 is a graph used in the processing method in the embodiment of the present application.
Fig. 5 is a block diagram of a processing device in the embodiment of the present application.
Detailed Description
Specific embodiments of the present application will be described in detail below with reference to the accompanying drawings, but the present application is not limited thereto.
It will be understood that various modifications may be made to the embodiments disclosed herein. The following description is, therefore, not to be taken in a limiting sense, but is made merely as an exemplification of embodiments. Other modifications will occur to those skilled in the art within the scope and spirit of the disclosure.
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and, together with a general description of the disclosure given above, and the detailed description of the embodiments given below, serve to explain the principles of the disclosure.
These and other characteristics of the present application will become apparent from the following description of a preferred form of embodiment, given as a non-limiting example, with reference to the attached drawings.
It should also be understood that, although the present application has been described with reference to some specific examples, a person of skill in the art shall certainly be able to achieve many other equivalent forms of application, having the characteristics as set forth in the claims and hence all coming within the field of protection defined thereby.
The above and other aspects, features and advantages of the present disclosure will become more apparent in view of the following detailed description when taken in conjunction with the accompanying drawings.
Specific embodiments of the present disclosure are described hereinafter with reference to the drawings; however, it is to be understood that the disclosed embodiments are merely examples of the disclosure that may be embodied in various forms. Well-known and/or repeated functions and structures have not been described in detail so as not to obscure the present disclosure with unnecessary detail. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a basis for the claims and as a representative basis for teaching one skilled in the art to variously employ the present disclosure in virtually any appropriately detailed structure.
The specification may use the phrases "in one embodiment," "in another embodiment," "in yet another embodiment," or "in other embodiments," which may each refer to one or more of the same or different embodiments in accordance with the disclosure.
Hereinafter, embodiments of the present application will be described in detail with reference to the accompanying drawings.
As shown in fig. 1, an embodiment of the present application provides a processing method, including:
s100, obtaining environment information, wherein the environment information comprises environment light information and target object information in the environment where the display equipment is located;
s200, determining a first viewing effect of the display device according to the ambient light information and the target object information;
and S300, under the condition that the first viewing effect meets the trigger condition, adjusting the state of the display device according to the first viewing effect so that the display device has a second viewing effect.
For example, an image containing the environment information may be obtained by shooting with a camera, and the ambient light information and the target object information may be extracted from that image. The ambient light information may be information about the light incident on the display device. The target object information may be information about a user, or information about the camera itself (such as its spatial orientation), or position information relative to the display device, such as the position of the camera relative to the display device or the position of the user relative to the display device. Alternatively, the ambient light information may be obtained through a light-sensing device installed on the display device, and the target object information may be obtained through a device such as a radar; for example, when the target object is a user, the information of the user relative to the display device can be obtained based on radar. The manner of obtaining the environment information is not unique, and the information acquisition can be completed by various functional devices. After obtaining the environment information, the electronic device can determine, according to the ambient light information and the target object information, a first viewing effect of the user with respect to the display device, for example, whether the ambient light interferes with the display of the screen so that reflection or glare occurs when the user views the displayed content and normal viewing is affected, or whether parameters such as the brightness and contrast of the displayed content are affected by the ambient light so that the user cannot see the displayed content clearly. The second viewing effect is better than the first viewing effect: it remedies the problems present in the first viewing effect, satisfies the user's viewing requirement, and brings the viewing effect up to standard.
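Purely as an illustration of how steps S100 to S300 fit together, the sketch below wires the three steps into one flow in Python; the type name, its fields and the brightness threshold are hypothetical and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class EnvironmentInfo:
    ambient_light_dir: Tuple[float, float, float]   # incident-light direction (vector L)
    ambient_brightness: float                       # e.g. mean luminance of an environment image
    target_position: Tuple[float, float, float]     # user position relative to the display
    target_orientation: Tuple[float, float, float]  # user facing direction

def determine_first_viewing_effect(env: EnvironmentInfo) -> dict:
    """Estimate how the ambient light affects viewing (glare/reflection, washed-out contrast)."""
    glare = env.ambient_brightness > 200.0          # hypothetical threshold
    return {"glare": glare, "brightness": env.ambient_brightness}

def satisfies_trigger(effect: dict) -> bool:
    return effect["glare"]

def adjust_display_state(effect: dict) -> None:
    # Placeholder: rotate the display and/or change brightness, contrast and color parameters.
    print("adjusting display state to obtain the second viewing effect")

def process(env: EnvironmentInfo) -> None:
    effect = determine_first_viewing_effect(env)    # S200
    if satisfies_trigger(effect):                   # S300
        adjust_display_state(effect)

env = EnvironmentInfo((0.2, -0.3, -1.0), 230.0, (0.0, 0.1, 0.6), (0.0, 0.0, -1.0))  # S100 result
process(env)
```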
Based on the disclosure of the foregoing embodiment, the beneficial effects of this embodiment are as follows: environment information including the ambient light information and the target object information of the environment where the display device is located is obtained, a first viewing effect of the display device is determined based on the environment information, and, when the first viewing effect is determined to satisfy the trigger condition, the state of the display device is adjusted according to the first viewing effect so that the user has a second viewing effect with respect to the display device. With the method in this embodiment, the orientation state, the display state and the like of the display device can be adjusted flexibly and effectively according to the actual environment, so that the user has a second viewing effect with respect to the display device; since the second viewing effect is better than the first viewing effect, the user's viewing experience is improved.
Further, the target object information in this embodiment includes target object position information and a target object orientation. Taking the target object as a user as an example, the target object information includes position information of the user, such as the position relative to the display device, and orientation information of the user, such as whether the user faces the display device, whether the user is looking at the display device, or whether the user is in a position and orientation from which the display device can be viewed. In application, the user can be monitored with a lidar to determine the position of the user relative to the display device and the distance between the user and the display device, and the facing direction of the user relative to the display device can be determined by shooting with a camera. In addition, the camera can be combined with a gaze-tracking method to track and photograph the user to obtain the user's gaze information, for example, gaze information of the user looking at the display device in different directions; based on the gaze information, an activity area of the user's gaze on the screen of the display device can be determined, that is, an area in which the user's gaze is concentrated. The manner of obtaining the gaze, position and orientation information is not limited to the methods proposed in this embodiment, and any method capable of achieving the effect can be applied.
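As one possible illustration of building such an activity area from gaze-tracking samples, the sketch below encloses sampled 3D eye/gaze positions in an axis-aligned box with a margin; the box representation and the margin value are assumptions, since the embodiment does not prescribe a particular data structure.

```python
import numpy as np

def gaze_activity_area(gaze_points: np.ndarray, margin: float = 0.05):
    """Axis-aligned bounding box (with a margin, in meters) around sampled 3D gaze/eye positions."""
    lo = gaze_points.min(axis=0) - margin
    hi = gaze_points.max(axis=0) + margin
    return lo, hi

samples = np.array([[0.05, 0.10, 0.60], [0.12, 0.08, 0.62], [0.02, 0.15, 0.58]])
print(gaze_activity_area(samples))
```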
As shown in fig. 2, the processing method in this embodiment further includes:
s400, determining an active area of the target object according to the position information and the orientation information of the target object;
s500, determining the state adjustment range of the display equipment according to the target object activity area;
and S600, adjusting the state of the display device according to the state adjustment range and the first viewing effect so that the display device has a second viewing effect.
Specifically, the target object in this embodiment is a user. After the position information and the orientation information of the user are obtained by the foregoing method, if it is determined that the user is located in front of the device and can view, or is currently viewing, the display device, a method such as gaze tracking may be started to track and locate the user's gaze and to determine the activity area of the user's gaze on the screen of the display device. The activity area may also be a spatial activity area, that is, the spatial region enclosed between the user's eyes and the range of the activity area on the screen, which contains the gaze information of the user looking at the display device in various directions. The activity area in this embodiment is not fixed; it may change as the orientation of the user and the orientation of the display device change. After the activity area is determined, the device may determine, by matching against the user's activity area, a corresponding state adjustment range, such as an adjustment range of the orientation of the display device, which is adapted to the orientation information of the current user. For instance, if the user is driving a vehicle and the display device is a vehicle-mounted display, the current position of the user can be regarded as fixed, so after the position information, the orientation information and the activity area of the user are determined, the device can directly determine the adjustment range of its orientation based on this information, and adjustment within this range does not interfere with the user viewing the display device. In addition, when determining the state adjustment range, the device may also use historical adjustment information (including records of which adjustments were made in which environments) or captured images to determine the current actual environment, determine the spatial range within which the environment allows the display device to adjust its orientation, and finally determine the state adjustment range by combining the adjustment space allowed by the environment with the user's activity area. Alternatively, the state adjustment range may be determined by further considering the adjustment range supported by the display device and the material of the display screen, for example, whether the screen has a privacy (anti-peep) effect, or whether the material of the screen prevents normal display for the user at certain angles. Moreover, if the state adjustment range includes adjustment of display parameters, it is also necessary to determine, based on the material properties of the screen and the display characteristics of the display device under different environmental conditions, which brightness range, contrast range and so on can be displayed normally for the user, and to determine the adjustment range of the display parameters on that basis.
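A minimal sketch of combining those constraints, assuming each one can be expressed as a tilt-angle interval in degrees (the interval representation is an assumption made for illustration):

```python
def state_adjustment_range(device_range, env_range, activity_range):
    """Intersect candidate tilt-angle intervals (lo, hi) in degrees: the range the hinge
    supports, the range the physical environment allows, and the range compatible with
    the user's activity area.  Returns None if the intervals do not overlap."""
    lo = max(device_range[0], env_range[0], activity_range[0])
    hi = min(device_range[1], env_range[1], activity_range[1])
    return (lo, hi) if lo <= hi else None

# Example: the hinge supports 0-135 degrees, the environment allows 0-110, the activity area needs 80-120.
print(state_adjustment_range((0, 135), (0, 110), (80, 120)))  # -> (80, 110)
```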
After the state adjustment range is determined, the electronic device can adjust the state of the display device according to the state adjustment range and the first viewing effect, for example, adjust the display orientation of the display device relative to the user, or the actual display parameters such as brightness, contrast and color, so that the display device has a second viewing effect that satisfies the user's current viewing requirement; for example, under the second viewing effect the picture is clearer and brighter, and the screen has no reflection or glare.
The adjustment of the orientation of the display device may be determined according to the structural relationship between the display device and the electronic device. If the display device is independent of the other components of the electronic device and can rotate and move on its own, an adjustment policy may be calculated from the above information and the rotation and/or movement then carried out based on that policy. If the display device is integrated in the electronic device or rigidly connected to its other components, for example when the electronic device is a notebook computer and the display device is its display end, the electronic device can control the display end to rotate relative to the host end to adjust its orientation, so that the user obtains the second viewing effect.
Further, the ambient light information in this embodiment includes incident angle information of ambient light with respect to the display device, which may be determined by a photosensitive device on the display device or by image capturing by a camera. As shown in fig. 3, the processing method further includes:
s700, determining reflection information of the ambient light according to the incident information of the ambient light;
and S800, determining a first viewing effect of the display device according to the reflection information of the ambient light and the target object information.
For example, the reflection information is determined based on the incident information of the ambient light, and it includes at least the reflection information of the ambient light directed at the display device. After the reflection information is determined, the device can determine the first viewing effect of the user with respect to the display device according to the reflection information, the position information and the orientation information of the user, and the light-intensity information of the reflected light formed when the ambient light is reflected by the screen of the display device. For example, the first viewing effect may be a state in which glare or reflection occurs at one or more portions of the screen, or another state in which the user cannot normally view all of the displayed content.
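The reflection direction can be obtained from the incident direction and the screen normal with the standard reflection formula m = L - 2(L·n)n, which is also what the derivation later in this description uses. A short NumPy sketch (vector values chosen only for illustration):

```python
import numpy as np

def reflect(incident: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Reflect an incident-light direction vector about a surface normal: m = L - 2(L.n)n."""
    n = normal / np.linalg.norm(normal)
    return incident - 2.0 * np.dot(incident, n) * n

# For a screen lying in the z = 0 plane (normal (0, 0, 1)), reflection simply flips the
# z-component of the incident direction, as used in the derivation below.
L = np.array([0.2, -0.3, -1.0])                      # hypothetical incident direction
print(reflect(L, np.array([0.0, 0.0, 1.0])))         # -> [0.2, -0.3, 1.0]
```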
Further, the processing method in this embodiment further includes: the reflection path of the ambient light is determined based on the reflection information of the ambient light, such as determining the reflection information of the ambient light incident on the display device, and determining the reflection path based on the reflection information.
When the first viewing effect is determined to meet the trigger condition, at least one of the following is included:
determining the position of the target object based on the target object information, and if the position of the target object and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
and if the target object activity area and the reflection path meet the intersection condition, determining that the first viewing effect meets the trigger condition.
Specifically, the reflection path of the reflected light is determined in advance from the reflection information of the ambient light, and the position of the user is then determined based on the information recorded in the target object information; preferably, the position of the area where the user's eyes or face is located is determined. The position of this area may have a certain threshold, and its size may also have a certain threshold, as long as the area where the user's eyes, face or head is located is roughly determined. The device may then determine, according to the reflection path, whether the reflected light is incident on the determined area of the eyes, face or head; if it is, that is, if the intersection condition is satisfied, it can be determined that the first viewing effect satisfies the trigger condition.
Alternatively, based on the foregoing, the spatial activity area of the user's gaze is determined, and it is then determined, based on the determined path of the reflected light, whether the reflected light will be incident into the activity area, that is, whether the activity area and the reflection path satisfy the intersection condition. If the reflected light is incident into the activity area, it indicates that the reflected light will interfere with the user viewing the display screen, that is, the user's viewing effect is affected, and the trigger condition is therefore satisfied.
Optionally, the method in this embodiment further includes:
s110, obtaining an environment image;
s120, determining brightness information of the environment image;
and S130, determining the ambient light information according to the brightness information of the ambient image.
For example, the device may obtain an environment image by shooting the environment with a camera and determine at least the brightness information of the ambient light from that image, the brightness information of the image corresponding to the brightness information of the ambient light. Alternatively, when determining the ambient light information, the brightness information of the ambient light may be determined by a light-sensing device provided on the display device. Of course, other ways of determining the brightness information of the ambient light are also possible.
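A hedged sketch of deriving brightness information from an environment image: the snippet below computes a mean luminance with Rec. 709 weights. The weighting and the use of a single mean value are assumptions; the embodiment leaves the exact measure open.

```python
import numpy as np

def image_luminance(rgb: np.ndarray) -> float:
    """Mean luminance of an H x W x 3 RGB image (values 0-255), using Rec. 709 weights."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return float(np.mean(0.2126 * r + 0.7152 * g + 0.0722 * b))

# Hypothetical use: a camera frame of the environment; its (per-region or overall) luminance
# serves as the brightness information from which the ambient light information is derived.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
print(image_luminance(frame))
```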
Further, the method in this embodiment further includes at least one of:
determining an area of the environment image, of which the brightness information exceeds a preset value, as a target area, and determining a first viewing effect of the display device according to the target area and the target object information; and
determining an area of the environment image, in which the brightness information exceeds a preset value, as a target area, determining an adjusting parameter value required by the target area to be adjusted to a target state, and determining a first viewing effect of the display device according to the adjusting parameter value and the target object information.
Specifically, when the display device is in a state in which its orientation has not been adjusted and is irradiated by ambient light, if the brightness of the ambient light is high, light-reflection areas of various sizes may appear on the screen of the display device. When it is determined, based on the environment image or in any other manner, that the reflected light intensity of a light-reflection area exceeds the preset value and that the area is located within the activity area, that light-reflection area can be directly determined as the target area. If the brightness of the ambient light is low and the preset value is not reached, the reflection is weak and does not disturb the human eye, so the area is not classified as a target area even if it lies within the activity area; if this is true for the whole screen, the first viewing effect can be considered to meet the viewing standard or viewing requirement, and the display device does not need to be adjusted. If it is determined that the display device has a target area, the target area will certainly interfere with the user's viewing of the display device; that is, based on the target area and the orientation information of the user, or directly based on the activity area, it can be determined that the first viewing effect is an effect that interferes with the user's viewing, for example an effect that produces glare interference.
Alternatively, after the target area is determined by the above method, the adjustment parameter value required for adjusting the target area to a target state may be determined, where the target state is the critical value at which the target area just stops causing viewing interference to the user: above the critical value the target area causes viewing interference, and below it the target area does not. In application, the target state may be a group of RGB parameter values. When calculating the adjustment parameter value, the value required to adjust the current parameter state of the target area to the target state is calculated, and the current first viewing effect of the display device for the user is then determined based on that adjustment parameter value. If the adjustment parameter value does not exceed a preset standard, the target area will not cause viewing interference to the user; if it exceeds the preset standard, the target area will cause viewing interference, for example glare interference, in which case it can be directly determined that the trigger condition is satisfied and the state of the display device needs to be adjusted.
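A minimal sketch of the two alternatives above: thresholding the environment image to find the target area, and computing an adjustment parameter value towards a target RGB state. The preset threshold, the example target state and the mean-absolute-difference metric are assumptions; the text only states that the target state is a group of RGB parameter values.

```python
import numpy as np

def find_target_area(gray: np.ndarray, preset: float = 230.0) -> np.ndarray:
    """Boolean mask of the image region whose brightness exceeds the preset value."""
    return gray > preset

def adjustment_value(rgb: np.ndarray, mask: np.ndarray, target_state=(200, 200, 200)) -> float:
    """Mean per-channel change needed to bring the masked pixels to the target RGB state."""
    if not mask.any():
        return 0.0
    target = np.asarray(target_state, dtype=float)
    return float(np.mean(np.abs(rgb[mask].astype(float) - target)))

frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
gray = frame.mean(axis=-1)
mask = find_target_area(gray)
print(adjustment_value(frame, mask))   # compare against a preset standard to decide the trigger
```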
Further, in a practical application, the specific processing procedure may be as follows:
A rectangular spatial coordinate system, shown in Fig. 4, is established by taking the center of the screen as the origin O(0, 0, 0), the axis passing through O and parallel to the horizontal direction of the screen as the x-axis, the axis passing through O and parallel to the vertical direction of the screen as the y-axis, and the axis passing through O and perpendicular to the screen as the z-axis. Let the coordinates of the four vertices A, B, C and D of the screen be A(x_A, y_A, z_A), B(x_B, y_B, z_B), C(x_C, y_C, z_C) and D(x_D, y_D, z_D). A LiDAR camera (radar camera) is set at the upper edge of the screen, at position M; the human-eye activity area detected by the LiDAR is E, and the coordinates of any point in it are H(x_H, y_H, z_H). The direction vector of the incident ray detected by the LiDAR camera is L = (x_L, y_L, z_L). Let x_A = x_D = x_0, x_B = x_C = -x_0, y_A = y_B = y_0, y_C = y_D = -y_0, and z_A = z_B = z_C = z_D = 0.
The equation of the screen region ABCD is then:
z = 0, with -x_0 ≤ x ≤ x_0 and -y_0 ≤ y ≤ y_0,
and the normal vector n of this plane is:
n = (0, 0, 1).
Taking the coordinates of any point on the plane ABCD as (x', y', 0), the equation of the incident ray (point-direction form) is:
(x - x')/x_L = (y - y')/y_L = z/z_L,
where x' ∈ [-x_0, x_0] and y' ∈ [-y_0, y_0].
According to the vector solution of reflected light, the corresponding reflected-ray direction vector m is:
m = L - 2(L·n)n = (x_L, y_L, -z_L),
and the equation of the reflected ray (point-direction form) is:
(x - x')/x_L = (y - y')/y_L = z/(-z_L),
where x' ∈ [-x_0, x_0] and y' ∈ [-y_0, y_0].
If the reflected light is not to enter the human eye while the person moves within the activity area, the point H must not satisfy the reflected-ray equation; that is, for every H(x_H, y_H, z_H) ∈ E and every x' ∈ [-x_0, x_0], y' ∈ [-y_0, y_0], at least one of the following holds:
(x_H - x')/x_L ≠ (y_H - y')/y_L
or
(y_H - y')/y_L ≠ z_H/(-z_L).
Simplifying (the reflection point (x', y', 0) whose reflected ray passes through H satisfies x' = x_H + x_L·z_H/z_L and y' = y_H + y_L·z_H/z_L), the condition becomes that this reflection point lies outside the screen, that is:
x_H + x_L·z_H/z_L < -x_0,
or
x_H + x_L·z_H/z_L > x_0,
or
y_H + y_L·z_H/z_L < -y_0,
or
y_H + y_L·z_H/z_L > y_0.
If these conditions are satisfied for every point of the activity area, the reflected light cannot enter the human eye while the person moves within the activity area; otherwise, the reflected light may enter the human eye or be directed towards the user's facial area.
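The condition derived above can be checked directly: the reflection point on the screen that would send light to a point H of the activity area is (x_H + x_L·z_H/z_L, y_H + y_L·z_H/z_L, 0), and H is missed exactly when that point falls outside the screen. A sketch with hypothetical screen dimensions:

```python
def reflection_misses_point(H, L, x0, y0):
    """True if no reflection off the screen (z = 0 plane, |x| <= x0, |y| <= y0) reaches H.

    H = (xH, yH, zH) is a point of the eye-activity region, L = (xL, yL, zL) the incident
    direction with zL != 0.
    """
    xH, yH, zH = H
    xL, yL, zL = L
    xr = xH + xL * zH / zL          # reflection point that would hit H
    yr = yH + yL * zH / zL
    return not (-x0 <= xr <= x0 and -y0 <= yr <= y0)

def reflection_misses_region(region_points, L, x0, y0):
    """True if no sampled point of the activity region receives reflected light."""
    return all(reflection_misses_point(H, L, x0, y0) for H in region_points)

# Hypothetical numbers: a 0.6 m x 0.4 m screen (x0 = 0.3, y0 = 0.2) and one sampled eye position.
print(reflection_misses_point((0.1, 0.1, 0.6), (0.2, -0.3, -1.0), 0.3, 0.2))
```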
Reflected light entering the human eye does not necessarily affect the user's viewing of the screen significantly; if the ambient light is weak, only very slight reflection is produced, and the orientation of the display device may not need to be adjusted at that point. To make this judgment easier for the device, a technician or the user can, when the device leaves the factory or during use, adjust the brightness of a light source of the electronic device, according to the specific hardware and the hardware parameters that may affect the reflected light intensity (such as screen reflectivity and the choice of camera), until the reflected light just reaches the condition at which it begins to affect the user's viewing experience, i.e. the critical condition. At that moment the camera captures the current environment picture S_0 as a reference photo for later light-intensity comparison.
Specifically, during use, if the algorithm determines that reflected light enters the human eye, a photo S_1 of the scene in front of the screen is captured at that moment. A recognition algorithm can then be used to identify the brightest area in each of S_0 and S_1; the RGB curves of each image are adjusted so that all pixel points in the corresponding area become white (255, 255, 255), which is equivalent to the parameter values of the target state, and the respective adjustment amplitudes r_0 and r_1 are recorded. If r_1 ≥ r_0, the angle of the display device is adjusted within the preset range, or the adjustment range allowed by the current environment is determined from the captured image; if r_1 < r_0, it can be determined that the reflection is not obvious, and the angle of the display device does not need to be adjusted.
If the orientation of the display device needs to be adjusted according to the above determination, assume that the screen ABCD is rotated by an angle θ about the CD axis to avoid glare (θ > 0 is taken as an example below; θ < 0 is handled in the same way). The coordinates after rotation are A'(x_0, 2y_0·cosθ - y_0, 2y_0·sinθ), B'(-x_0, 2y_0·cosθ - y_0, 2y_0·sinθ), C(-x_0, -y_0, 0) and D(x_0, -y_0, 0).
The equation of the rotated screen A'B'CD is then readily found to be:
z = (y + y_0)·tanθ, with -x_0 ≤ x ≤ x_0 and -y_0 ≤ y ≤ 2y_0·cosθ - y_0,
and its normal vector n' is:
n' = (0, -tanθ, 1).
Taking the coordinates of any point on the plane A'B'CD as (x', y', z'), the equation of the incident ray (point-direction form) is:
(x - x')/x_L = (y - y')/y_L = (z - z')/z_L,
where x' ∈ [-x_0, x_0], y' ∈ [-y_0, 2y_0·cosθ - y_0] and z' ∈ [0, 2y_0·sinθ].
The corresponding reflected-ray direction vector m' is:
m' = (x_L, y_L·cos2θ + z_L·sin2θ, y_L·sin2θ - z_L·cos2θ),
and the equation of the reflected ray (point-direction form) is:
(x - x')/x_L = (y - y')/(y_L·cos2θ + z_L·sin2θ) = (z - z')/(y_L·sin2θ - z_L·cos2θ),
where x' ∈ [-x_0, x_0], y' ∈ [-y_0, 2y_0·cosθ - y_0] and z' ∈ [0, 2y_0·sinθ].
If the reflected light is not to enter the human eye while the person moves within the activity area, the point H must again not satisfy the reflected-ray equation; that is, for every H(x_H, y_H, z_H) ∈ E and every reflection point (x', y', z') on the screen A'B'CD, with x' ∈ [-x_0, x_0], y' ∈ [-y_0, 2y_0·cosθ - y_0], z' ∈ [0, 2y_0·sinθ] and z' = (y' + y_0)·tanθ, at least one of the following holds:
(x_H - x')/x_L ≠ (y_H - y')/(y_L·cos2θ + z_L·sin2θ)
or
(y_H - y')/(y_L·cos2θ + z_L·sin2θ) ≠ (z_H - z')/(y_L·sin2θ - z_L·cos2θ).
Since the region E in which the point H lies and the vector L are known, and the ranges of x', y' and z' are known, the ranges of the expressions on both sides of these inequalities are also known, so the inequalities can be solved for θ using the monotonicity of the trigonometric functions. If θ has no solution, the screen cannot be adjusted to a suitable angle, so the device does not adjust the angle of the display device and instead prompts the user to change the position of the device. If θ has solutions θ_1, θ_2, ..., θ_n with θ_1 < θ_2 < ... < θ_n, the angle of the display device is automatically adjusted to θ_1, and it is judged whether reflection can still occur at this position; if not, or if the reflection does not affect viewing, the adjustment stops. Otherwise, the adjustment continues according to the calculation result for the new position, until the user finally has the second viewing effect with respect to the display device.
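Rather than solving the trigonometric inequalities in closed form, the same search can also be carried out numerically: sweep candidate rotation angles and take the smallest one for which no reflected ray reaches the activity region, which plays the role of θ_1 above. The sketch below does that; the sampling densities, the distance tolerance and the angle step are assumptions made for illustration, not the patent's own procedure.

```python
import numpy as np

def reflected_dir(L, theta):
    """Reflection of incident direction L about the screen rotated by theta around the CD axis
    (unit normal n' = (0, -sin(theta), cos(theta)))."""
    n = np.array([0.0, -np.sin(theta), np.cos(theta)])
    L = np.asarray(L, dtype=float)
    return L - 2.0 * np.dot(L, n) * n

def hits_region(theta, L, x0, y0, region_points, samples=20, tol=0.02):
    """Sample reflection points on the rotated screen and test whether any reflected ray
    passes within `tol` of a sampled point of the eye-activity region."""
    m = reflected_dir(L, theta)
    for u in np.linspace(-x0, x0, samples):
        for v in np.linspace(0.0, 2.0 * y0, samples):      # distance along the screen from the CD edge
            p = np.array([u, v * np.cos(theta) - y0, v * np.sin(theta)])  # point on A'B'CD
            for H in region_points:
                t = np.dot(np.asarray(H) - p, m) / np.dot(m, m)           # closest approach along the ray
                if t > 0 and np.linalg.norm(p + t * m - H) < tol:
                    return True
    return False

def first_safe_angle(L, x0, y0, region_points, max_theta=np.radians(40), step=np.radians(1)):
    """Smallest rotation angle (the analogue of theta_1) for which no reflection reaches the region."""
    for theta in np.arange(0.0, max_theta, step):
        if not hits_region(theta, L, x0, y0, region_points):
            return theta
    return None   # no admissible angle: prompt the user to move the device instead

print(first_safe_angle((0.2, -0.3, -1.0), 0.3, 0.2, [(0.1, 0.1, 0.6)]))
```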
As shown in fig. 5, another embodiment of the present application also provides a processing apparatus (100), including:
an obtaining module 1, configured to obtain environment information, where the environment information includes ambient light information and target object information of the environment where the display device is located;
a first determining module 2, configured to determine a first viewing effect of the display device according to the ambient light information and the target object information;
and an adjusting module 3, configured to adjust the state of the display device according to the first viewing effect in a case where the first viewing effect satisfies a trigger condition, so that the display device has a second viewing effect.
As an optional embodiment, the target object information includes target object position information and a target object orientation, and the processing apparatus further includes:
a second determining module, configured to determine the activity area of the target object according to the position information and the orientation information of the target object;
a third determining module, configured to determine the state adjustment range of the display device according to the target object activity area;
the adjusting module is further configured to adjust the state of the display device according to the state adjustment range and the first viewing effect, so that the display device has a second viewing effect.
As an alternative embodiment, the ambient light information includes an incident angle of an ambient light ray, and the processing device further includes:
a fourth determining module, configured to determine the reflection information of the ambient light according to the incident information of the ambient light;
and a fifth determining module, configured to determine the first viewing effect of the display device according to the reflection information of the ambient light and the target object information.
As an optional embodiment, the processing apparatus further comprises:
a sixth determining module, configured to determine a reflection path of the ambient light according to reflection information of the ambient light;
the determining that the first viewing effect satisfies a trigger condition comprises at least one of:
determining a target object position based on the target object information, and if the target object position and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
and if the target object activity area and the reflection path meet the intersection condition, determining that the first viewing effect meets the trigger condition.
As an alternative embodiment, the processing apparatus further includes:
an obtaining module for obtaining an environment image;
a seventh determining module, configured to determine brightness information of the environment image;
an eighth determining module, configured to determine the ambient light information according to brightness information of the ambient image.
As an optional embodiment, the processing apparatus is further configured to perform at least one of:
determining an area of the environment image, of which the brightness information exceeds a preset value, as a target area, and determining a first viewing effect of the display device according to the target area and the target object information; and
determining an area of the environment image, in which the brightness information exceeds a preset value, as a target area, determining an adjusting parameter value required by the target area to adjust to a target state, and determining a first viewing effect of the display device according to the adjusting parameter value and the target object information.
Another embodiment of the present application further provides an electronic device, including:
one or more processors;
a memory configured to store one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the processing methods described above.
An embodiment of the present application further provides a storage medium, on which a computer program is stored, which when executed by a processor implements the processing method as described above. It should be understood that each scheme in this embodiment has a corresponding technical effect in the foregoing method embodiment, and details are not described here.
Embodiments of the present application also provide a computer program product tangibly stored on a computer-readable medium and comprising computer-executable instructions that, when executed, cause at least one processor to perform a processing method such as the embodiments described above. It should be understood that each scheme in this embodiment has a corresponding technical effect in the foregoing method embodiment, and details are not described here.
Note that the computer storage medium of the present application may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable medium may be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer-readable signal medium, by contrast, may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electromagnetic, optical, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
It should be understood that although the present application is described in terms of various embodiments, not every embodiment contains only a single technical solution; this manner of description is adopted merely for clarity, and those skilled in the art should treat the specification as a whole. The technical solutions in the embodiments may also be suitably combined to form other embodiments that can be understood by those skilled in the art.
The above embodiments are only exemplary embodiments of the present application, and are not intended to limit the present application, and the protection scope of the present application is defined by the claims. Various modifications and equivalents may be made by those skilled in the art within the spirit and scope of the present application and such modifications and equivalents should also be considered to be within the scope of the present application.

Claims (10)

1. A processing method, comprising:
obtaining environment information, wherein the environment information comprises ambient light information and target object information of the environment where the display device is located;
determining a first viewing effect of the display device according to the ambient light information and the target object information;
and, in a case where the first viewing effect satisfies a trigger condition, adjusting the state of the display device according to the first viewing effect so that the display device has a second viewing effect.
2. The method of claim 1, wherein the target object information comprises target object position information and a target object orientation, the method further comprising:
determining an active area of the target object according to the position information and the orientation information of the target object;
determining the state adjustment range of the display equipment according to the target object activity area;
and adjusting the state of the display device according to the state adjustment range and the first viewing effect so that the display device has a second viewing effect.
3. The method of claim 1 or 2, the ambient light information comprising an angle of incidence of an ambient light ray, the method further comprising:
determining reflection information of the ambient light according to the incident information of the ambient light;
and determining a first viewing effect of the display device according to the reflection information of the ambient light and the target object information.
4. The method of claim 3, further comprising: determining a reflection path of the ambient light based on reflection information of the ambient light, the determining that the first viewing effect satisfies a trigger condition, comprising at least one of:
determining a target object position based on the target object information, and if the target object position and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
and if the target object activity area and the reflection path meet the intersection condition, determining that the first viewing effect meets the trigger condition.
5. The method of claim 1, further comprising:
obtaining an environment image;
determining brightness information of the environment image;
and determining the ambient light information according to the brightness information of the ambient image.
6. The method of claim 5, further comprising at least one of:
determining an area with brightness information exceeding a preset value in the environment image as a target area, and determining a first viewing effect of the display device according to the target area and the target object information; and
determining an area of the environment image, in which the brightness information exceeds a preset value, as a target area, determining an adjusting parameter value required by the target area to adjust to a target state, and determining a first viewing effect of the display device according to the adjusting parameter value and the target object information.
7. A processing apparatus, comprising:
an obtaining module, configured to obtain environment information, wherein the environment information comprises ambient light information and target object information of the environment where the display device is located;
a first determining module, configured to determine a first viewing effect of the display device according to the ambient light information and the target object information;
and an adjusting module, configured to adjust the state of the display device according to the first viewing effect in a case where the first viewing effect satisfies a trigger condition, so that the display device has a second viewing effect.
8. The processing apparatus according to claim 7, wherein the target object information comprises target object position information and a target object orientation, the processing apparatus further comprising:
a second determining module, configured to determine the activity area of the target object according to the position information and the orientation information of the target object;
a third determining module, configured to determine the state adjustment range of the display device according to the target object activity area;
the adjusting module is further configured to adjust the state of the display device according to the state adjustment range and the first viewing effect, so that the display device has a second viewing effect.
9. The processing device of claim 7 or 8, the ambient light information comprising an angle of incidence of an ambient light ray, the processing device further comprising:
a fourth determining module, configured to determine the reflection information of the ambient light according to the incident information of the ambient light;
and a fifth determining module, configured to determine the first viewing effect of the display device according to the reflection information of the ambient light and the target object information.
10. The processing device of claim 9, further comprising:
a sixth determining module, configured to determine a reflection path of the ambient light according to reflection information of the ambient light;
the determining that the first viewing effect satisfies a trigger condition comprises at least one of:
determining a target object position based on the target object information, and if the target object position and the reflection path satisfy an intersection condition, determining that the first viewing effect satisfies the trigger condition; and
and if the target object activity area and the reflection path meet the intersection condition, determining that the first viewing effect meets the trigger condition.
CN202211214444.6A, priority date 2022-09-30, filing date 2022-09-30: Processing method and device (Pending), published as CN115457918A

Priority Applications (1)

CN202211214444.6A (priority date 2022-09-30, filing date 2022-09-30): Processing method and device

Publications (1)

CN115457918A, published 2022-12-09

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination