CN107492144B - Light and shadow processing method and electronic equipment - Google Patents


Info

Publication number
CN107492144B
Authority
CN
China
Prior art keywords
virtual display
display object
information
light
electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710568069.8A
Other languages
Chinese (zh)
Other versions
CN107492144A (en)
Inventor
杨冠辰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201710568069.8A
Publication of CN107492144A
Application granted
Publication of CN107492144B
Active legal-status: Current
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a light and shadow processing method, including: displaying at least one virtual display object, wherein the at least one virtual display object is displayed by being projected to the eyes of a user; acquiring environment information to determine the ambient light information of the environment in which the at least one virtual display object is located; and adjusting a light and shadow effect of the at least one virtual display object based on the ambient light information. The present disclosure also provides an electronic device.

Description

Light and shadow processing method and electronic equipment
Technical Field
The disclosure relates to a light and shadow processing method and an electronic device.
Background
Virtual Reality (VR) is computer simulation technology that creates, and lets users experience, a virtual world. It uses a computer to generate a simulated environment, fusing multi-source information into interactive three-dimensional dynamic views and simulated physical behaviors so as to immerse the user in that environment.
Augmented Reality (AR) is a technology that seamlessly integrates real-world information and virtual-world information. Entity information that would otherwise be difficult to experience within a certain region of time and space in the real world (visual, auditory, gustatory, olfactory, tactile information, and the like) is simulated by computer and other technologies and then superimposed: the virtual information is applied to the real world and perceived by human senses, achieving a sensory experience beyond reality. The real environment and the virtual objects are superimposed onto the same picture or space in real time and exist simultaneously.
Currently, in the related art, AR/VR designers still add light and shadow to the AR interfaces in a scene by globally baking lightmaps. However, in the course of implementing the present disclosure, the inventors found that this approach has at least the following drawback: the light and shadow baked into a lightmap are fixed once baking completes, so when the light and shadow of the AR interface are inconsistent with the light and shadow in the real space, realism is reduced, and the user's eyes may even become fatigued after long use.
Disclosure of Invention
One aspect of the present disclosure provides a light and shadow processing method, including: displaying at least one virtual display object, wherein the at least one virtual display object is displayed by projecting to the eyes of a user; acquiring environment information to determine the environment light information of the environment where the at least one virtual display object is located; and adjusting the light and shadow effect of the at least one virtual display object based on the ambient light information.
Optionally, the method further includes: acquiring position information to determine weather information of an actual geographic position corresponding to the at least one virtual display object; and determining ambient light information around a location where the at least one virtual display object is located based on the determined weather information.
Optionally, obtaining the location information to determine the weather information of the actual geographic location corresponding to the at least one virtual display object includes: acquiring relative position information of the virtual display object with respect to the electronic device; and determining the weather information of the actual geographic location corresponding to the at least one virtual display object based on the location information and the relative position information.
Optionally, the method further includes: acquiring relative position information of the at least one virtual display object with respect to the electronic device; and collecting, by a sensor, the ambient light information of the actual geographic location corresponding to the at least one virtual display object based on the relative position information.
Optionally, before adjusting the light and shadow effect of the at least one virtual display object, the method further includes: determining an orientation of the at least one virtual display object; and determining an environmental state of an actual geographic location corresponding to the at least one virtual display object based on the determined orientation, wherein adjusting the light and shadow effect of the at least one virtual display object based on the ambient light information comprises: adjusting the light and shadow effect of the at least one virtual display object based on the ambient light information and the environmental state.
Another aspect of the present disclosure provides an electronic device including: a display device for displaying at least one virtual display object, wherein the at least one virtual display object is displayed by being projected to the eyes of a user; and a processor for acquiring environment information to determine ambient light information of the environment in which the at least one virtual display object is located, and adjusting the light and shadow effect of the at least one virtual display object based on the ambient light information.
Optionally, the electronic device further includes: a first sensor for acquiring location information to determine weather information of an actual geographic location corresponding to the at least one virtual display object; and the processor is further configured to determine ambient light information around the position where the at least one virtual display object is located based on the determined weather information.
Optionally, the processor is further configured to: acquire relative position information of the virtual display object with respect to the electronic device; and determine weather information of the actual geographic location corresponding to the at least one virtual display object based on the location information and the relative position information.
Optionally, the processor is further configured to obtain relative position information of the at least one virtual display object with respect to the electronic device; and the electronic device further includes a second sensor for collecting the ambient light information of the actual geographic location corresponding to the at least one virtual display object based on the relative position information.
Optionally, the processor is further configured to determine an orientation of the at least one virtual display object before adjusting the light and shadow effect of the at least one virtual display object; the electronic device further includes a third sensor for determining an environmental state of the actual geographic location corresponding to the at least one virtual display object based on the determined orientation; and the processor is further configured to adjust the light and shadow effect of the at least one virtual display object based on the ambient light information and the environmental state.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1A schematically illustrates an application scenario suitable for a shadow processing method and an electronic device according to an embodiment of the present disclosure;
fig. 1B schematically illustrates an application scenario suitable for a shadow processing method and an electronic device according to another embodiment of the present disclosure;
fig. 1C schematically illustrates an application scenario suitable for a shadow processing method and an electronic device according to another embodiment of the present disclosure;
FIG. 2 schematically illustrates a flow diagram of a method of shadow processing according to an embodiment of the present disclosure;
FIG. 3A schematically illustrates a flow chart for obtaining environmental information to determine ambient light information for an environment in which at least one virtual display object is located, according to an embodiment of the disclosure;
FIG. 3B schematically illustrates a flow chart of weather information for obtaining location information to determine an actual geographic location corresponding to at least one virtual display object, in accordance with an embodiment of the present disclosure;
FIG. 3C schematically illustrates a flow chart for obtaining environment information to determine ambient light information of an environment in which at least one virtual display object is located, according to another embodiment of the present disclosure;
FIG. 3D schematically illustrates a flow diagram of a method of shadow processing according to another embodiment of the present disclosure;
FIG. 4 schematically shows a block diagram of an electronic device 400 according to an embodiment of the disclosure;
FIG. 5 schematically shows a block diagram of an electronic device 400 according to another embodiment of the present disclosure; and
FIG. 6 schematically illustrates a block diagram of a computer system that can implement a shadow processing method according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The singular forms "a", "an" and "the" as used herein are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, the terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks.
Accordingly, the techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable medium having instructions stored thereon for use by or in connection with an instruction execution system. In the context of this disclosure, a computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, the computer readable medium can include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the computer readable medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
The embodiment of the disclosure provides a light and shadow processing method and electronic equipment, wherein the light and shadow processing method comprises the following steps: the method comprises the steps of displaying at least one virtual display object, wherein the at least one virtual display object is displayed in a mode of being projected to eyes of a user, obtaining environment information to determine environment light information of the environment where the at least one virtual display object is located, and adjusting the light and shadow effect of the at least one virtual display object based on the environment light information.
It should be noted that the embodiments of the present disclosure may be applied in one or more application scenarios, which are not limited herein. For example, the light and shadow processing method can be used for light and shadow processing of a VR/AR display interface.
As shown in fig. 1A to 1C, in these application scenarios, it is assumed that the user 102 wears an AR device and is walking on a road. At this time, the display interface of the AR device may show a virtual puppy 101 (a virtual display object) placed in the environment in which the user 102 is located (for example, an environment containing the road, a tree 103, the sun 104, etc.). Due to the influence of factors such as time and space (hereinafter referred to as space-time factors), the light and shadow of the displayed objects, including the puppy 101 and the tree 103, are not constant.
For example, under the influence of spatial factors, from the perspective of the user 102: when the puppy 101 is in the position shown in fig. 1A, its shadow takes one form; when the puppy 101 runs onto the road shown in fig. 1B (i.e., comes up beside the user 102), its shadow takes another form; when the puppy 101 runs to the side of the tree 103 as shown in fig. 1C, its light and shadow take yet another form; and when the puppy 101 runs under the tree 103, its shadow is submerged in the shade of the tree.
Based on this, the light and shadow processing scheme provided by the embodiment of the present disclosure may be used to continuously adjust the light and shadow of each virtual display object in the display interface according to factors such as time and space. Fig. 2 schematically illustrates a flow chart of a light and shadow processing method according to an embodiment of the present disclosure.
As shown in fig. 2, the light and shadow processing method may include operations S201 to S203, in which:
in operation S201, at least one virtual display object is displayed. Wherein the at least one virtual display object is displayed by projection to the eyes of the user.
In the embodiment of the present disclosure, when at least one virtual display object is displayed by projecting to the eyes of the user, the virtual display object may be displayed by using various electronic devices, which is not limited herein. For example, by means of an AR/VR device. One or more virtual display objects may be displayed in the display interface of these devices. The virtual display object(s) may be any object, and is not limited herein. For example, as shown in fig. 1A to 1C, a puppy 101, a tree 103, a sun 104, and a road and other characters (not shown) can be used as virtual display objects.
In operation S202, environment information is acquired to determine environment light information of an environment in which at least one virtual display object is located.
In the embodiment of the present disclosure, the means for acquiring the environment information may include various means, and is not limited herein.
For example, the light condition of the surrounding environment can be sensed by the light sensor; or the actual geographic position corresponding to the virtual display object can be determined first, then the server is accessed to obtain the weather information at that time, and then the ambient light information of the environment where the virtual display object is located is determined according to the weather information.
Taking AR as an example, when a user wearing the AR device moves, or a virtual display object in the AR display interface moves, the surrounding environment in which the virtual display object is located may change under the influence of space-time factors; correspondingly, the light conditions in that environment may change as well, which in turn influences the light and shadow of each virtual display object in the AR display interface. Therefore, the above environment information is preferably acquired in real time, so that the light and shadow effects of the virtual display objects can be adjusted in real time based on it.
In operation S203, a light and shadow effect of at least one virtual display object is adjusted based on the ambient light information.
Specifically, the light and shadow display effect of each virtual display object can be determined according to the irradiation intensity and the irradiation angle of the light in the environment where each virtual display object is located.
As shown in fig. 1A to 1C, taking the puppy 101 as an example and under the influence of space-time factors: when the puppy 101 is at the position shown in fig. 1A, its light and shadow can be adjusted to the A state; when the puppy 101 runs to the position shown in fig. 1B, its light and shadow can be adjusted to the B state; and when the puppy 101 moves to the position shown in fig. 1C, its light and shadow can be adjusted to the C state.
Through the embodiments of the present disclosure, the light and shadow effect of a virtual display object in the display interface can be adjusted at any time according to the ambient light information, overcoming the drawback in the related art that the light and shadow of a virtual display object never change once set. This enhances the realism of the virtual display object, and the user's eyes do not become fatigued even after long use.
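As a rough sketch of how operation S203 might work — this is illustrative Python, not code from the patent; the `AmbientLight` fields and the 0.8 opacity ceiling are assumed values — the light's elevation fixes the shadow's length, its azimuth fixes the shadow's direction, and its intensity fixes the shadow's darkness:

```python
import math
from dataclasses import dataclass

@dataclass
class AmbientLight:
    azimuth_deg: float    # direction the light comes from, in the horizontal plane
    elevation_deg: float  # angle of the light source above the horizon
    intensity: float      # 0.0 (dark) .. 1.0 (full sun)

def shadow_for_object(height: float, light: AmbientLight):
    """Project a vertical object of `height` metres onto the ground plane.

    Returns (length, azimuth_deg, opacity) of the cast shadow.
    """
    elev = math.radians(max(light.elevation_deg, 1.0))  # clamp to avoid near-infinite shadows
    length = height / math.tan(elev)                    # lower sun -> longer shadow
    azimuth = (light.azimuth_deg + 180.0) % 360.0       # shadow falls away from the light
    opacity = 0.8 * light.intensity                     # dimmer light -> softer shadow
    return length, azimuth, opacity

# A 0.5 m virtual puppy under a sun at 45 degrees elevation:
length, azimuth, opacity = shadow_for_object(0.5, AmbientLight(180.0, 45.0, 1.0))
```

Lower elevation angles lengthen the shadow, matching the intuition that shadows stretch toward sunset; the same parameters can be re-evaluated whenever the ambient light information changes.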
The method shown in fig. 2 is further described with reference to fig. 3A-3D in conjunction with specific embodiments.
Fig. 3A schematically illustrates a flow chart for obtaining environment information to determine ambient light information of an environment in which at least one virtual display object is located, according to an embodiment of the present disclosure.
In this embodiment, the light and shadow processing method may include operations S301 and S302 (i.e., operation S202 may include operations S301 and S302) in addition to operations S201 and S203 described above with reference to fig. 2. The description of operations S201 and S203 is omitted here for the sake of brevity of description. As shown in fig. 3A, wherein:
in operation S301, location information is acquired to determine weather information of an actual geographic location corresponding to at least one virtual display object.
In operation S302, ambient light information around a location where at least one virtual display object is located is determined based on the determined weather information.
In this embodiment of the present disclosure, acquiring the environment information to determine the ambient light information of the environment in which the at least one virtual display object is located may include operations S301 and S302.
In the embodiments of the present disclosure, the location information may be obtained in various manners, which are not limited herein. For example, positioning may be performed by GPS; specifically, the actual geographic position of the virtual display object may be determined according to how the virtual display object is displayed in the display interface; or a reference object around the virtual display object may be positioned first and the virtual display object then positioned according to their relative positional relationship; or the user of the electronic device (such as an AR device) may be positioned first and the virtual display object then positioned according to their relative positional relationship.
In the embodiment of the present disclosure, after the position information is acquired, the corresponding weather information may be acquired in various forms, which is not limited herein. For example, the weather information may be obtained by accessing a weather information server.
In the embodiments of the present disclosure, after the weather information of the actual geographic location corresponding to the at least one virtual display object is determined, the weather information may be analyzed to determine the ambient light information around the location of the virtual display object(s).
Through the embodiments of the present disclosure, the ambient light information around the position of a virtual display object can be determined from the weather information of the actual geographic location corresponding to that object, and the light and shadow effect of the virtual display object can be adjusted based on that ambient light information, thereby enhancing the realism of the virtual display object and improving the user experience.
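One way to picture operations S301 and S302 is a lookup from a weather report to light intensity and diffuseness, scaled by time of day. This is a hypothetical sketch: the weather codes, the table values, and the daylight curve are all invented for illustration, not specified by the patent.

```python
import math

# Hypothetical weather-to-light table; codes and values are illustrative only.
WEATHER_LIGHT = {
    "clear":  {"intensity": 1.0, "diffuse": 0.1},
    "cloudy": {"intensity": 0.5, "diffuse": 0.8},
    "rain":   {"intensity": 0.3, "diffuse": 0.9},
}

def ambient_light_from_weather(condition: str, hour: int) -> dict:
    """Derive ambient light from a weather report and the local hour."""
    base = WEATHER_LIGHT.get(condition, WEATHER_LIGHT["cloudy"])  # unknown -> overcast
    # Crude daylight curve: zero outside 6:00-18:00, peaking at noon.
    daylight = math.sin(math.pi * (hour - 6) / 12) if 6 <= hour <= 18 else 0.0
    return {"intensity": base["intensity"] * daylight, "diffuse": base["diffuse"]}

noon_clear = ambient_light_from_weather("clear", 12)
```

A real system would replace the table with data from a weather service and the daylight curve with an actual solar-position model, but the shape of the mapping — weather plus time in, intensity and diffuseness out — is the same.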
Fig. 3B schematically illustrates a flow chart of weather information for obtaining location information to determine an actual geographic location corresponding to at least one virtual display object according to an embodiment of the disclosure.
In this embodiment, the above operation S301 may include operations S401 and S402 with reference to the corresponding operations described in fig. 2 and 3A. For the sake of simplicity of description, descriptions of other corresponding operations described in fig. 2 and 3A are omitted here. As shown in fig. 3B, wherein:
in operation S401, relative position information of a virtual display object with respect to an electronic device is acquired.
In operation S402, weather information of an actual geographic location corresponding to at least one virtual display object is determined based on the location information and the relative location information.
In the embodiments of the present disclosure, taking an AR device as the electronic device as an example: since the AR device is generally worn by the user, the actual geographic location of the electronic device can be determined in advance by positioning the user (for example, via GPS). After the relative position between the virtual display object and the electronic device is determined, the actual geographic location corresponding to the virtual display object is further determined from the location of the electronic device and that relative position, and finally the corresponding weather information is determined.
Through the embodiment of the disclosure, the weather information of the actual geographic position corresponding to the virtual display object can be accurately determined by determining the relative position information of the virtual display object relative to the electronic equipment.
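A minimal sketch of the position arithmetic behind operations S401 and S402 (hypothetical Python; the function name and the small-offset approximation are mine, not the patent's): the device's GPS fix plus the object's relative east/north offset gives the object's geographic coordinates, which can then be used to query weather information.

```python
import math

EARTH_RADIUS_M = 6_371_000

def object_geo_position(device_lat: float, device_lon: float,
                        east_m: float, north_m: float) -> tuple:
    """Offset the device's GPS fix by the object's relative position in metres.

    Small-offset approximation; adequate for the few metres an AR
    object typically sits away from the wearer.
    """
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(east_m / (EARTH_RADIUS_M * math.cos(math.radians(device_lat))))
    return device_lat + dlat, device_lon + dlon

# A virtual object 3 m east and 4 m north of a device at (39.9, 116.4):
obj_lat, obj_lon = object_geo_position(39.9, 116.4, 3.0, 4.0)
```

For weather lookups the offset is negligible, but the same arithmetic lets the object and the device fall on opposite sides of a locally relevant boundary (e.g. indoors versus outdoors) when offsets grow larger.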
Fig. 3C schematically illustrates a flowchart of acquiring environment information to determine ambient light information of an environment in which at least one virtual display object is located, according to another embodiment of the present disclosure.
In this embodiment, the light and shadow processing method may include operations S501 and S502 (i.e., operation S202 may include operations S501 and S502) in addition to operations S201 and S203 described above with reference to fig. 2. The description of operations S201 and S203 is omitted here for the sake of brevity of description. As shown in fig. 3C, wherein:
in operation S501, relative position information of at least one virtual display object with respect to an electronic device is acquired.
In operation S502, ambient light information of an actual geographic location corresponding to at least one virtual display object is collected by a sensor based on the relative location information.
In the embodiment of the present disclosure, operation S501 is the same as operation S401 in the above embodiment, and is not described herein again.
In an embodiment of the present disclosure, the sensor that collects the ambient light information of the actual geographic location corresponding to the virtual display object may be a light sensor. There may be one or more light sensors, and they may be placed at the actual geographic position corresponding to the virtual display object and at positions around it.
Through this embodiment of the present disclosure, another way of obtaining the ambient light information is provided, diversifying how the ambient light information can be acquired and improving the user experience.
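When several light sensors are available, their readings for operation S502 have to be combined somehow. The inverse-distance weighting below is an assumption chosen for illustration — the patent does not specify a fusion rule:

```python
def fuse_light_readings(readings):
    """Inverse-distance weighted average of (lux, distance_m) sensor samples.

    Sensors nearer the object's position count for more. Distances are
    clamped so a sensor sitting exactly at the object cannot divide by zero.
    """
    weights = [1.0 / max(dist, 0.1) for _, dist in readings]
    total = sum(weights)
    return sum(lux * w for (lux, _), w in zip(readings, weights)) / total

# Two equally distant sensors average straightforwardly:
fused = fuse_light_readings([(100.0, 1.0), (200.0, 1.0)])
```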
Fig. 3D schematically illustrates a flow chart of a light and shadow processing method according to another embodiment of the present disclosure.
In this embodiment, the light and shadow processing method may include operations S601 to S603 in addition to operations S201 and S202 described above with reference to fig. 2, where operation S203 in fig. 2 may be implemented by operation S603 in this embodiment, in other words, operation S203 in fig. 2 may be replaced with operation S603 in this embodiment. The description of operations S201 and S202 is omitted here for the sake of brevity of description. As shown in fig. 3D, wherein:
in operation S601, before adjusting the light and shadow effect of the at least one virtual display object, an orientation of the at least one virtual display object is determined.
In operation S602, an environmental state of an actual geographic location corresponding to at least one virtual display object is determined based on the determined orientation.
In operation S603, a light and shadow effect of at least one virtual display object is adjusted based on the ambient light information and the ambient state.
In particular, determining the orientation of the virtual display object may include a variety of ways, which are not limited herein. For example, the orientation of the virtual display object may be determined with reference to an electronic device (e.g., an AR device worn by the user).
Since the virtual display object is displayed by projecting to the eyes of the user, the orientation of the virtual display object relative to the user and the environmental state of the actual geographic location corresponding to the virtual display object affect the light and shadow effect of the virtual display object to different degrees.
As shown in fig. 1A to 1C, taking the puppy 101 as an example and under the influence of the orientation factor and the environmental-state factor: when the puppy 101 is at the position shown in fig. 1A, its light and shadow can be adjusted to the A state; when the puppy 101 runs to the position shown in fig. 1B, its light and shadow can be adjusted to the B state; and when the puppy 101 moves to the position shown in fig. 1C, its light and shadow can be adjusted to the C state.
Through this embodiment of the present disclosure, when the light and shadow effect of the virtual display object is adjusted, the influence of the orientation of the virtual display object and of the environmental state in which it is located is fully considered, so that the light and shadow effect is designed more appropriately and accurately, further enhancing the realism of the virtual display object and improving the user experience.
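A small sketch of operation S603 under stated assumptions (Lambert-style shading; the single `occlusion` number standing in for the environmental state is my simplification): the object's orientation relative to the light scales the direct lighting, and the environmental state attenuates it.

```python
import math

def lit_brightness(face_azimuth_deg: float, light_azimuth_deg: float,
                   intensity: float, occlusion: float) -> float:
    """Lambert-style direct-light factor for one face of the virtual object.

    `occlusion` in [0, 1] stands in for the environmental state
    (e.g. close to 1.0 when the object is under the tree's shade).
    """
    angle = math.radians(face_azimuth_deg - light_azimuth_deg)
    facing = max(0.0, math.cos(angle))   # faces turned away receive no direct light
    return intensity * facing * (1.0 - occlusion)

# A face pointing straight at the light, out in the open:
b = lit_brightness(90.0, 90.0, 1.0, 0.0)
```

Feeding this the ambient light information from operation S202 and the environmental state from operation S602 gives a per-face brightness that tracks both the light and the surroundings, which is the combined adjustment operation S603 describes.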
Fig. 4 schematically shows a block diagram of an electronic device 400 according to an embodiment of the disclosure.
In this embodiment, the electronic device 400 may include a display device 410 and a processor 420. The electronic device 400 may perform the methods described above with reference to fig. 2, 3A, 3B, 3C, and 3D. As shown in fig. 4: the display device 410 is used for displaying at least one virtual display object, wherein the at least one virtual display object is displayed by being projected to the eyes of the user; and the processor 420 is configured to obtain environment information to determine ambient light information of the environment in which the at least one virtual display object is located, and to adjust the light and shadow effect of the at least one virtual display object based on the ambient light information.
In the embodiment of the present disclosure, the electronic device 400 may include various types, which are not limited herein. For example, it may be an AR/VR device, or the like.
According to the light and shadow processing method provided by the embodiments of the present disclosure, the light and shadow effect of the virtual display object in the display interface can be adjusted at any time according to the ambient light information. When the environment in which the virtual display object is located changes, the light intensity can be obtained from the current environment and the shade of the light and shadow can be changed accordingly. This overcomes the defect in the related art that the light and shadow of a virtual display object remain unchanged once set, enhances the sense of reality of the virtual display object in the AR interface within the real environment, and prevents the user's eyes from becoming fatigued even after prolonged use of the AR interface.
Fig. 5 schematically shows a block diagram of an electronic device 400 according to another embodiment of the present disclosure.
In this embodiment, in addition to the display device 410 and the processor 420 described above with reference to fig. 4, the electronic device 400 may further include a first sensor 510. The functions implemented by the display device 410 are as described in the embodiment shown in fig. 4, and its description is omitted here for brevity. As shown in fig. 5: the first sensor 510 is configured to obtain position information to determine weather information of an actual geographic location corresponding to the at least one virtual display object, and the processor 420 is further configured to determine ambient light information around the position of the at least one virtual display object based on the determined weather information.
Through the embodiment of the disclosure, the ambient light information around the position of the virtual display object can be determined according to the weather information of the actual geographic position corresponding to the virtual display object, and then the light and shadow effect of the virtual display object is adjusted based on the ambient light information, so that the sense of reality of the virtual display object is enhanced, and the technical effect of user experience is improved.
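One way such a weather-to-light mapping might work is sketched below. The condition names and intensity values are hypothetical assumptions for illustration; the disclosure does not enumerate them.

```python
# Hypothetical mapping from weather information to a normalized
# ambient-light intensity (0.0 = dark, 1.0 = full daylight).
WEATHER_LIGHT = {
    "clear": 1.0,
    "partly_cloudy": 0.7,
    "overcast": 0.4,
    "rain": 0.3,
    "night": 0.05,
}

def ambient_light_from_weather(condition: str, default: float = 0.5) -> float:
    """Return an ambient-light intensity for a weather condition,
    falling back to a neutral default for unrecognized conditions."""
    return WEATHER_LIGHT.get(condition, default)
```

The resulting intensity could then feed the shadow adjustment, so that, for example, an overcast sky at the object's actual geographic location yields a fainter shadow than a clear one.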
According to an embodiment of the present disclosure, the processor 420 is further configured to obtain relative position information of the virtual display object with respect to the electronic device 400, and determine weather information of an actual geographic location corresponding to at least one virtual display object based on the position information and the relative position information.
Through the embodiment of the disclosure, the weather information of the actual geographic position corresponding to the virtual display object can be accurately determined by determining the relative position information of the virtual display object with respect to the electronic device 400.
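Combining the device's position with the object's relative position might be done as in the sketch below, which uses a local flat-earth approximation adequate for the few-metre offsets typical of an AR scene. The function name and the east/north offset representation are assumptions for illustration.

```python
import math

def object_geo_position(device_lat: float, device_lon: float,
                        east_m: float, north_m: float) -> tuple:
    """Approximate the geographic position of a virtual object from the
    device's latitude/longitude and the object's offset in metres.

    One degree of latitude spans roughly 111,320 m; a degree of
    longitude shrinks with the cosine of the latitude.
    """
    dlat = north_m / 111_320.0
    dlon = east_m / (111_320.0 * math.cos(math.radians(device_lat)))
    return device_lat + dlat, device_lon + dlon
```

The returned coordinates could then be used to query weather information for the object's actual geographic position rather than the device's.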
According to an embodiment of the present disclosure, the processor 420 is further configured to obtain relative position information of the at least one virtual display object with respect to the electronic device 400, and the electronic device 400 may further include a second sensor configured to collect, based on the relative position information, ambient light information of an actual geographic position corresponding to the at least one virtual display object.
Through the embodiment of the disclosure, another mode for acquiring the ambient light information is provided, so that the mode for acquiring the ambient light information is more diversified, and the user experience can be improved.
According to an embodiment of the present disclosure, the processor 420 is further configured to determine an orientation of the at least one virtual display object before adjusting its light and shadow effect. The electronic device 400 may further include a third sensor configured to determine, based on the determined orientation, an environmental state of the actual geographic location corresponding to the at least one virtual display object, and the processor is further configured to adjust the light and shadow effect of the at least one virtual display object based on the ambient light information and the environmental state.
It should be noted that the first sensor, the second sensor and the third sensor may be the same sensor.
Through the embodiments of the present disclosure, when the light and shadow effect of the virtual display object is adjusted, the influence of the position of the virtual display object and of the environmental state in which it is located is fully considered, so that the light and shadow effect is designed more appropriately and accurately, the sense of reality of the virtual display object is further enhanced, and the user experience is improved.
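The combined adjustment based on ambient light and environmental state might, for instance, attenuate the shadow according to the state determined from the object's orientation. The following sketch is purely illustrative: the `in_shade`/`indoors` states and their attenuation factors are assumptions, not part of the disclosure.

```python
def adjust_for_environment(base_opacity: float,
                           in_shade: bool, indoors: bool) -> float:
    """Attenuate a shadow opacity by (assumed) environmental-state factors."""
    opacity = base_opacity
    if in_shade:
        opacity *= 0.5  # object sits within a real object's shadow
    if indoors:
        opacity *= 0.7  # diffuse indoor lighting softens shadows
    return opacity
```

Under this sketch, a virtual object whose orientation places it in the shade of a real object would cast a visibly fainter shadow, consistent with the states illustrated in figs. 1A to 1C.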
According to embodiments of the present disclosure, the processor 420 may be implemented as a hardware circuit, such as a field programmable gate array (FPGA), a programmable logic array (PLA), a system on a chip, a system on a substrate, a system in a package, or an application-specific integrated circuit (ASIC), or by any other reasonable means of integrating or packaging circuits in hardware or firmware, or as any suitable combination of software, hardware, and firmware implementations.
FIG. 6 schematically shows a block diagram of a computer system according to another embodiment of the disclosure.
As shown in fig. 6, computer system 600 includes a processor 420 and a memory 620. The computer system 600 may perform the methods described above with reference to fig. 2, 3A, 3B, 3C, and 3D.
In particular, processor 420 may include, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), and/or the like. The processor 420 may also include on-board memory for caching purposes. Processor 420 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure described with reference to fig. 2, 3A, 3B, 3C and 3D.
The memory 620, for example, can be any medium that can contain, store, communicate, propagate, or transport the instructions. For example, a readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium. Specific examples of the readable storage medium include: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and/or wired/wireless communication links.
Memory 620 may include a computer program 621, which computer program 621 may include code/computer-executable instructions that, when executed by processor 420, cause processor 420 to perform a method flow, such as described above in connection with fig. 2, 3A, 3B, 3C, and 3D, and any variations thereof.
The computer program 621 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code in computer program 621 may include one or more program modules, e.g., modules 621A, 621B, etc. It should be noted that the division and number of modules are not fixed; those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, which, when executed by the processor 420, enable the processor 420 to perform, for example, the method flows described above in connection with figs. 2, 3A, 3B, 3C and 3D, and any variations thereof.
Those skilled in the art will appreciate that the features recited in the various embodiments and/or claims of the present disclosure can be combined and/or sub-combined in various ways, even if such combinations are not expressly recited in the present disclosure. In particular, such combinations and/or sub-combinations may be made without departing from the spirit and teaching of the present disclosure. All such combinations and/or sub-combinations fall within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (8)

1. A method of shadow processing, comprising:
displaying at least one virtual display object, wherein the at least one virtual display object is displayed by projection to the eyes of a user;
acquiring environment information to determine the ambient light information of the environment where the at least one virtual display object is located;
determining the orientation of the at least one virtual display object, wherein the orientation of each virtual display object comprises one or more of the following: a relative position between the each virtual display object and other virtual display objects, a relative position between the each virtual display object and a real object, an orientation of the each virtual display object relative to a user;
determining an environmental state of an actual location corresponding to the at least one virtual display object based on the determined orientation; and
adjusting a light and shadow effect of the at least one virtual display object based on the ambient light information, the determined orientation, and the ambient state.
2. The method of claim 1, wherein the method further comprises:
acquiring position information to determine weather information of an actual geographic position corresponding to the at least one virtual display object; and
determining ambient light information surrounding a location where the at least one virtual display object is located based on the determined weather information.
3. The method of claim 2, wherein obtaining location information to determine weather information for an actual geographic location to which the at least one virtual display object corresponds comprises:
acquiring relative position information of the virtual display object relative to the electronic equipment; and
determining weather information of an actual geographic position corresponding to the at least one virtual display object based on the position information and the relative position information.
4. The method of claim 1, wherein the method further comprises:
acquiring relative position information of the at least one virtual display object relative to the electronic equipment; and
acquiring, through a sensor and based on the relative position information, the ambient light information of the actual geographic position corresponding to the at least one virtual display object.
5. An electronic device, comprising:
the display device is used for displaying at least one virtual display object, wherein the at least one virtual display object is displayed in a manner of being projected to eyes of a user;
a processor for obtaining environmental information to determine ambient light information of an environment in which the at least one virtual display object is located; determining the orientation of the at least one virtual display object, wherein the orientation of each virtual display object comprises one or more of the following: a relative position between the each virtual display object and other virtual display objects, a relative position between the each virtual display object and a real object; determining an environmental state of an actual location corresponding to the at least one virtual display object based on the determined orientation; and adjusting a light and shadow effect of the at least one virtual display object based on the ambient light information, the determined orientation, and the ambient state.
6. The electronic device of claim 5, wherein:
the electronic device further includes: the first sensor is used for acquiring position information so as to determine weather information of an actual geographic position corresponding to the at least one virtual display object; and
the processor is further configured to: determining ambient light information surrounding a location where the at least one virtual display object is located based on the determined weather information.
7. The electronic device of claim 6, wherein the processor is further configured to:
acquiring relative position information of the virtual display object relative to the electronic equipment; and
determining weather information of an actual geographic position corresponding to the at least one virtual display object based on the position information and the relative position information.
8. The electronic device of claim 5, wherein:
the processor is further configured to obtain relative position information of the at least one virtual display object with respect to the electronic device;
the electronic device further includes: a second sensor configured to collect, based on the relative position information, the ambient light information of the actual geographic position corresponding to the at least one virtual display object.
CN201710568069.8A 2017-07-12 2017-07-12 Light and shadow processing method and electronic equipment Active CN107492144B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710568069.8A CN107492144B (en) 2017-07-12 2017-07-12 Light and shadow processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710568069.8A CN107492144B (en) 2017-07-12 2017-07-12 Light and shadow processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN107492144A CN107492144A (en) 2017-12-19
CN107492144B true CN107492144B (en) 2020-07-24

Family

ID=60643736

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710568069.8A Active CN107492144B (en) 2017-07-12 2017-07-12 Light and shadow processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN107492144B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734754B (en) * 2018-05-28 2022-05-06 北京小米移动软件有限公司 Image processing method and device
CN111462295B (en) * 2020-03-27 2023-09-05 咪咕文化科技有限公司 Shadow processing method, device and storage medium in augmented reality shooting
CN112562051B (en) * 2020-11-30 2023-06-27 腾讯科技(深圳)有限公司 Virtual object display method, device, equipment and storage medium
CN113223139B (en) * 2021-05-26 2024-06-07 深圳市商汤科技有限公司 Augmented reality shadow estimation method, device and computer storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101520904A (en) * 2009-03-24 2009-09-02 上海水晶石信息技术有限公司 Reality augmenting method with real environment estimation and reality augmenting system
CN102696057A (en) * 2010-03-25 2012-09-26 比兹摩德莱恩有限公司 Augmented reality systems
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system
CN106530406A (en) * 2016-11-29 2017-03-22 东洋有限公司 Light field source orientation method for augmented and virtual reality and front-end equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140071163A1 (en) * 2012-09-11 2014-03-13 Peter Tobias Kinnebrew Augmented reality information detail


Also Published As

Publication number Publication date
CN107492144A (en) 2017-12-19

Similar Documents

Publication Publication Date Title
US11120628B2 (en) Systems and methods for augmented reality representations of networks
CN107492144B (en) Light and shadow processing method and electronic equipment
US10708704B2 (en) Spatial audio for three-dimensional data sets
US11024014B2 (en) Sharp text rendering with reprojection
US8872853B2 (en) Virtual light in augmented reality
US11887258B2 (en) Dynamic integration of a virtual environment with a physical environment
US9224243B2 (en) Image enhancement using a multi-dimensional model
US11417052B2 (en) Generating ground truth datasets for virtual reality experiences
US9164723B2 (en) Virtual lens-rendering for augmented reality lens
US9454848B2 (en) Image enhancement using a multi-dimensional model
US20120236029A1 (en) System and method for embedding and viewing media files within a virtual and augmented reality scene
US20110043522A1 (en) Image-based lighting simulation for objects
CN112424832A (en) System and method for detecting 3D association of objects
CN112148116A (en) Method and apparatus for projecting augmented reality augmentation to a real object in response to user gestures detected in a real environment
US10088678B1 (en) Holographic illustration of weather
US20200074725A1 (en) Systems and method for realistic augmented reality (ar) lighting effects
CN111815783A (en) Virtual scene presenting method and device, electronic equipment and storage medium
Stødle et al. High-performance visualisation of UAV sensor and image data with raster maps and topography in 3D
Lissa et al. Augmented Reality
US11625857B1 (en) Enhanced content positioning
Arbaoui Applying Augmented Reality to Stimulate User Awareness in Urban Environments
US20240161390A1 (en) Method, apparatus, electronic device and storage medium for control based on extended reality
KR20180112478A (en) Method and apparatus for processing 3-dimension image
CN117197223A (en) Space calibration method, device, equipment, medium and program
JP2012234017A (en) Image processing device, map display device, image processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant