CN114727077A - Projection method, apparatus, device and storage medium - Google Patents
Projection method, apparatus, device and storage medium Download PDFInfo
- Publication number: CN114727077A
- Application number: CN202210209625.3A
- Authority
- CN
- China
- Prior art keywords
- projection
- user
- area
- projection area
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
Abstract
Embodiments of the present disclosure provide a projection method, apparatus, device, and storage medium. The method comprises: acquiring the position and viewing angle of a user in real time; determining a projection area according to the user's position and viewing angle and the indoor environment; adjusting the image to be projected according to the projection area; and projecting the adjusted image to the projection area. In this way, the projection area follows the user's position and viewing angle, and the adjusted image is projected into that area, improving the projection effect and giving the user an immersive experience.
Description
Technical Field
The present disclosure relates to the field of projection technologies, and in particular to a projection method, apparatus, device, and storage medium.
Background
Watching videos is currently a popular way for people to relax, and because a projected picture is large and clear and offers a good entertainment experience, projection viewing has gradually become a common form of entertainment.
Conventional projection is generally fixed: for example, the walls and ceiling of a room display corresponding images. As a result, the user gets the best visual effect only at a specific position; once the user moves, or watches the projected picture from outside that optimal position, the picture appears distorted to varying degrees, the visual effect is poor, and it is difficult for the user to feel immersed while watching.
Disclosure of Invention
The present disclosure provides a projection method, apparatus, device, and storage medium that can improve the projection effect and bring an immersive experience to users.
In a first aspect, an embodiment of the present disclosure provides a projection method, where the method includes:
acquiring the position and viewing angle of a user in real time;
determining a projection area according to the user's position and viewing angle and the indoor environment;
adjusting an image to be projected according to the projection area; and
projecting the adjusted image to the projection area.
In some implementations of the first aspect, acquiring the position and viewing angle of the user in real time includes:
performing pose recognition on the user to obtain the user's position and viewing angle.
In some implementations of the first aspect, determining the projection area according to the user's position and viewing angle and the indoor environment includes:
determining the user's field of view according to the user's position and viewing angle; and
selecting, based on the user's field of view and the indoor environment, a region within the field of view as the projection area according to a preset rule.
In some implementations of the first aspect, selecting a region within the user's field of view as the projection area according to a preset rule includes:
identifying one or more planar regions of the indoor environment that lie within the user's field of view; and
determining the projection area from those planar regions according to their attribute information.
In some implementations of the first aspect, the attribute information includes at least one of region flatness, region color, region size, and region shape;
determining the projection area from the planar regions within the user's field of view according to their attribute information includes:
determining, from the planar regions within the user's field of view, candidate projection regions whose attribute information satisfies a projection attribute condition;
calculating the distance and/or angle between the user and each candidate projection region; and
selecting, from the candidate projection regions whose attribute information satisfies the projection attribute condition, a candidate whose distance satisfies a projection distance condition and/or whose angle satisfies a projection angle condition as the projection area.
In some implementations of the first aspect, adjusting the image to be projected according to the projection area includes:
determining the relative positional relationship between the projection device and the projection area; and
updating parameters of the projection device according to the attribute information of the projection area and the relative positional relationship, so as to adjust the image to be projected.
In a second aspect, an embodiment of the present disclosure provides a projection apparatus, including:
an acquisition module, configured to acquire the position and viewing angle of a user in real time;
a determination module, configured to determine a projection area according to the user's position and viewing angle and the indoor environment;
an adjustment module, configured to adjust an image to be projected according to the projection area; and
a projection module, configured to project the adjusted image to the projection area.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method as described above.
In a fourth aspect, the disclosed embodiments provide a non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method as described above.
According to embodiments of the present disclosure, the position and viewing angle of the user can be acquired in real time, a projection area can be determined according to the user's position and viewing angle and the indoor environment, the image to be projected can be adjusted according to the projection area, and the adjusted image can be projected to the projection area. The projection area thus follows the user's position and viewing angle, the adjusted image is projected into that area, the projection effect is improved, and the user enjoys an immersive experience.
It should be understood that the statements herein reciting aspects are not intended to limit the critical or essential features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. The drawings are included to provide a further understanding of the present disclosure and do not limit it; throughout, the same or similar reference numerals indicate the same or similar elements:
FIG. 1 illustrates a schematic diagram of an exemplary operating environment in which embodiments of the present disclosure can be implemented;
FIG. 2 is a flow chart illustrating a projection method provided by an embodiment of the present disclosure;
fig. 3 is a block diagram of a projection apparatus provided in an embodiment of the present disclosure;
FIG. 4 sets forth a block diagram of an exemplary electronic device capable of implementing embodiments of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions are described below clearly and completely with reference to the drawings. The described embodiments are plainly some, but not all, of the embodiments of the present disclosure. All other embodiments that a person skilled in the art can derive from the disclosed embodiments without creative effort fall within the protection scope of the present disclosure.
In addition, the term "and/or" herein merely describes an association between objects and covers three possible relationships; for example, "A and/or B" may mean: A alone, both A and B, or B alone. The character "/" herein generally indicates an "or" relationship between the preceding and following objects.
In view of the problems described in the background, embodiments of the present disclosure provide a projection method, apparatus, device, and storage medium. Specifically, the position and viewing angle of the user are acquired in real time, a projection area is determined according to the user's position and viewing angle and the indoor environment, the image to be projected is adjusted according to the projection area, and the adjusted image is projected to the projection area.
The projection area thus follows the user's position and viewing angle, and projecting the adjusted image into it improves the projection effect and gives the user an immersive experience.
The projection method, apparatus, device and storage medium provided by the embodiments of the present disclosure are described in detail by specific embodiments with reference to the accompanying drawings.
FIG. 1 illustrates a schematic diagram of an exemplary operating environment in which embodiments of the present disclosure can be implemented. As shown in FIG. 1, operating environment 100 may include a projection device and a user.
The projection device may be a projector, an unmanned aerial vehicle, or another device with a projection function. It is generally installed indoors, for example on the ceiling or on a wall-mounted track so that it can move. The user is a person watching the projection indoors.
As an example, the projection device may acquire the user's position and viewing angle in real time, determine a currently suitable projection area according to that position and viewing angle and the indoor environment, change the projection area as the user moves, adjust the image to be projected according to the determined area, and project the adjusted image into it, improving the projection effect and giving the user an immersive experience.
The projection method provided by the embodiments of the present disclosure is described in detail below; it may be performed by the projection device shown in FIG. 1.
Fig. 2 shows a flowchart of a projection method provided by an embodiment of the present disclosure, and as shown in fig. 2, the projection method 200 may include the following steps:
s210, acquiring the position and the visual angle of the user in real time.
In some embodiments, gesture recognition may be performed on the user to quickly and accurately obtain the user's position and perspective.
For example, the indoor environment image may be obtained, that is, the user watches the projected image indoors, for example, the indoor environment may be photographed by a camera built in the projection device or an external camera, so as to obtain the indoor environment image. And then, carrying out target detection on the indoor environment image to obtain the position and the head image of the user, and identifying the visual angle, namely the head direction, of the user according to the head image.
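The patent does not fix an implementation for this step. The following is a minimal sketch, assuming a hypothetical detector output (a normalized bounding-box centre, the camera's horizontal field of view, a depth estimate, and a head yaw — all field names are invented here, not from the patent) that is converted into a room-frame user pose:

```python
import math
from dataclasses import dataclass

@dataclass
class UserPose:
    position: tuple   # (x, y) room coordinates, metres
    yaw_deg: float    # head direction in the horizontal plane

def estimate_pose(detection):
    """Convert a hypothetical detector output into a room-frame pose.
    The detector itself (target detection + head-orientation
    recognition) is outside this sketch."""
    # Bearing of the user off the camera's optical axis, from the
    # normalized horizontal position of the bounding-box centre.
    cx = detection["bbox_cx_norm"]           # 0..1 across the image
    fov_h = detection["camera_fov_deg"]      # horizontal field of view
    bearing = (cx - 0.5) * fov_h
    # Project into the room using the measured depth.
    d = detection["depth_m"]
    x = d * math.sin(math.radians(bearing))
    y = d * math.cos(math.radians(bearing))
    return UserPose(position=(round(x, 3), round(y, 3)),
                    yaw_deg=detection["head_yaw_deg"])

pose = estimate_pose({"bbox_cx_norm": 0.5, "camera_fov_deg": 60,
                      "depth_m": 2.0, "head_yaw_deg": 15.0})
```

A user centred in the image at 2 m depth lands 2 m straight ahead of the camera, with the viewing angle taken directly from the recognized head yaw.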
S220: determine a projection area according to the user's position and viewing angle and the indoor environment.
In some embodiments, the user's field of view, i.e., the user's visual range, may be determined from the user's position and viewing angle; then, based on that field of view and the indoor environment, a suitable region within it is selected as the projection area according to a preset rule, improving the subsequent projection effect.
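A simplified 2-D sketch of the field-of-view test implied above, assuming the viewing angle is a horizontal yaw and using an illustrative 120° field of view (the patent specifies neither the geometry nor the angular extent):

```python
import math

def in_field_of_view(user_pos, gaze_deg, point, half_fov_deg=60.0):
    """Return True if `point` lies within the user's horizontal
    field of view, modelled as a cone of 2 * half_fov_deg around
    the gaze direction. 0 degrees points along the +y axis."""
    dx = point[0] - user_pos[0]
    dy = point[1] - user_pos[1]
    bearing = math.degrees(math.atan2(dx, dy))
    # Wrap the angular difference into [-180, 180).
    diff = (bearing - gaze_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_fov_deg
```

A wall point straight ahead of a user looking along +y is inside the field of view; a point directly behind the user is not.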
For example, one or more planar regions within the user's field of view may be identified in the indoor environment, yielding attribute information for each planar region. The attribute information may include at least one of region flatness, region color, region size, and region shape, without limitation.
The projection area can then be determined quickly from the planar regions within the user's field of view according to their attribute information.
As an example, candidate projection regions whose attribute information satisfies a projection attribute condition may first be determined from the planar regions within the user's field of view.
For example, if the attribute information includes region flatness and region color, planar regions whose flatness exceeds a preset threshold and whose color is white may be chosen as candidate projection regions.
As another example, if the attribute information includes region flatness, region color, and region shape, planar regions whose flatness exceeds a preset threshold, whose color is white, and whose shape is rectangular may be chosen as candidates.
Next, the distance and/or angle between the user and each candidate projection region is calculated, the angle being the included angle between the user's line of sight and the region's normal vector; both can be computed from the user's position.
Then, from the candidates whose attribute information satisfies the projection attribute condition, the one whose distance satisfies a projection distance condition and/or whose angle satisfies a projection angle condition is selected as the projection area, improving the subsequent projection effect.
For example: from the candidates whose attribute information satisfies the projection attribute condition, select the one with the smallest distance as the projection area.
As another example: select the candidate with the smallest angle as the projection area.
As another example: first keep the candidates whose distance is below a preset threshold, then, among those, select the one with the smallest angle as the projection area.
As another example: first keep the candidates whose angle is below a preset threshold, then, among those, select the one with the smallest distance as the projection area.
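The two-stage selection described above — filter by a condition, then rank by the remaining criterion — can be sketched as follows; the field names and the 5 m distance threshold are illustrative assumptions, not values from the patent:

```python
import math

def choose_projection_area(user_pos, candidates, max_dist=5.0):
    """Keep candidates whose distance to the user satisfies the
    projection distance condition, then take the one with the
    smallest viewing angle (angle between the user's line of sight
    and the region's normal vector), which gives the
    least-distorted view. Returns None if nothing qualifies."""
    def dist(c):
        return math.dist(user_pos, c["centre"])
    near = [c for c in candidates if dist(c) < max_dist]
    if not near:
        return None
    return min(near, key=lambda c: c["view_angle_deg"])

areas = [
    {"name": "wall_A", "centre": (0.0, 3.0), "view_angle_deg": 10.0},
    {"name": "wall_B", "centre": (4.0, 0.0), "view_angle_deg": 5.0},
    {"name": "ceiling", "centre": (0.0, 8.0), "view_angle_deg": 2.0},
]
best = choose_projection_area((0.0, 0.0), areas)
```

Here the ceiling has the smallest angle but fails the distance condition, so the nearer of the two walls with the smaller angle wins.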
It should be understood that the position and viewing angle at the current moment may be compared with those at the previous moment to compute position-change information (e.g., distance moved) and viewing-angle-change information (e.g., the angle turned). Only if the position change and/or the viewing-angle change is greater than or equal to a preset threshold is the projection area re-determined from the current position, viewing angle, and indoor environment and adopted as the currently available projection area.
In this way, the projection area changes only when the user moves substantially, keeping it relatively stable.
Alternatively, the attribute information of the projection area at the current moment may be compared with that at the previous moment; only if the resulting difference satisfies a preset difference condition does the current projection area replace the previous one as the currently available projection area.
In this way, the projection area changes only when the difference between areas is large, again keeping it relatively stable.
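The stability logic above amounts to a hysteresis check: re-run projection-area selection only when the user has moved or turned enough. A sketch with illustrative thresholds (the patent leaves the preset thresholds open):

```python
import math

def should_switch(prev_pose, cur_pose, move_thresh_m=0.5,
                  angle_thresh_deg=20.0):
    """Return True when the position change and/or viewing-angle
    change reaches its preset threshold, i.e., when the projection
    area should be re-determined."""
    moved = math.dist(prev_pose["pos"], cur_pose["pos"])
    # Wrap the yaw difference into [-180, 180) before comparing.
    turned = abs((cur_pose["yaw"] - prev_pose["yaw"] + 180.0)
                 % 360.0 - 180.0)
    return moved >= move_thresh_m or turned >= angle_thresh_deg

prev = {"pos": (0.0, 0.0), "yaw": 0.0}
```

A small shuffle and glance keep the current area; a large turn of the head triggers re-selection.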
S230: adjust the image to be projected according to the projection area.
In some embodiments, the relative positional relationship, i.e., distance and angle, between the projection device and the projection area may be determined, and parameters of the projection device (such as focal length, picture frame, and distortion parameters) may be updated according to the attribute information of the projection area and the relative positional relationship, so that the image to be projected is adjusted to match the current projection area, facilitating subsequent projection.
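A rough sketch of the parameter update in S230, assuming focus scales with throw distance and a simple cosine pre-compression corrects keystone distortion for an off-axis angle; both formulas are illustrative assumptions, not taken from the patent:

```python
import math

def update_projector_params(distance_m, angle_deg, base_focal=1.0):
    """Derive updated projector parameters from the relative
    positional relationship (distance and angle) between the
    projection device and the projection area."""
    # Keep the image in focus at the new throw distance.
    focal = base_focal * distance_m
    # Pre-compress the image along the tilted axis so it appears
    # undistorted on an off-axis surface (keystone correction).
    keystone_scale = math.cos(math.radians(angle_deg))
    return {"focal": round(focal, 3),
            "keystone_scale": round(keystone_scale, 3)}

params = update_projector_params(distance_m=2.0, angle_deg=0.0)
```

Head-on projection needs no pre-warp; at a 60° off-axis angle the sketch compresses the tilted axis by half.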
S240: project the adjusted image to the projection area.
According to embodiments of the present disclosure, a suitable projection area can be determined from the user's current position and viewing angle and the indoor environment, the projection area can follow the user, the image to be projected can be adjusted to the determined area, and the adjusted image can be projected into it, improving the projection effect and giving the user an immersive experience.
In some embodiments, the user's behaviour may be recognised, the projection content corresponding to that behaviour may be retrieved from a projection database as the image to be projected, and then projected, further improving the user experience.
For example: when the user is studying, a clip related to the study material is projected onto the desk; when the user is eating, a food clip is projected; when the user is dancing, disco-ball lighting is projected.
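The behaviour-to-content lookup described above can be sketched as a simple mapping; the entries and the fallback value are stand-ins for the patent's "projection database", invented here for illustration:

```python
def content_for_behavior(behavior, database=None):
    """Return the projection content for a recognised user
    behaviour, falling back to a default when the behaviour has
    no entry in the projection database."""
    database = database or {
        "studying": "study_companion_clip",
        "eating": "food_documentary_clip",
        "dancing": "disco_ball_lights",
    }
    return database.get(behavior, "default_ambient")

clip = content_for_behavior("dancing")
```

An unrecognised behaviour simply keeps a default ambient projection rather than failing.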
The projection method provided by the present disclosure is described below with reference to a specific embodiment, as follows:
Step 1: perform pose recognition on the user to obtain the user's current position and viewing angle.
Step 2: determine the current user field of view from the user's position and viewing angle.
Step 3: identify one or more planar regions of the indoor environment within the current field of view and obtain the attribute information of each.
Step 4: from those planar regions, determine candidate projection regions whose attribute information satisfies the projection attribute condition.
Step 5: calculate the distance and/or angle between the user and each candidate projection region.
Step 6: from the qualifying candidates, select the one whose distance satisfies the projection distance condition and/or whose angle satisfies the projection angle condition as the current projection area.
Step 7: determine the relative positional relationship between the projection device and the current projection area, and update the projection device's parameters according to the area's attribute information and that relationship, so that the image to be projected matches the current projection area.
Step 8: project the adjusted image to the current projection area.
In this way, the projection device automatically follows the user, achieving full-space directed projection and an immersive viewing experience.
It is noted that, while for simplicity the foregoing method embodiments are described as a series of acts, those skilled in the art will appreciate that the present disclosure is not limited by the order of the acts, as some steps may occur in other orders or concurrently. Further, the embodiments described in the specification are exemplary, and the acts and modules involved are not necessarily required by the disclosure.
The above is a description of embodiments of the method, and the embodiments of the apparatus are further described below.
Fig. 3 shows a structural diagram of a projection apparatus provided according to an embodiment of the present disclosure. As shown in fig. 3, the projection apparatus 300 may include:
an acquisition module 310, configured to acquire the position and viewing angle of a user in real time;
a determination module 320, configured to determine a projection area according to the user's position and viewing angle and the indoor environment;
an adjustment module 330, configured to adjust an image to be projected according to the projection area; and
a projection module 340, configured to project the adjusted image to the projection area.
In some embodiments, the acquisition module 310 is specifically configured to:
perform pose recognition on the user to obtain the user's position and viewing angle.
In some embodiments, the determination module 320 is specifically configured to:
determine the user's field of view according to the user's position and viewing angle; and
select, based on the user's field of view and the indoor environment, a region within the field of view as the projection area according to a preset rule.
In some embodiments, the determination module 320 is specifically configured to:
identify one or more planar regions of the indoor environment within the user's field of view; and
determine the projection area from those planar regions according to their attribute information.
In some embodiments, the attribute information includes at least one of region flatness, region color, region size, and region shape.
The determination module 320 is specifically configured to:
determine, from the planar regions within the user's field of view, candidate projection regions whose attribute information satisfies the projection attribute condition;
calculate the distance and/or angle between the user and each candidate projection region; and
select, from the qualifying candidates, a candidate whose distance satisfies the projection distance condition and/or whose angle satisfies the projection angle condition as the projection area.
In some embodiments, the adjustment module 330 is specifically configured to:
determine the relative positional relationship between the projection device and the projection area; and
update parameters of the projection device according to the attribute information of the projection area and the relative positional relationship, so as to adjust the image to be projected.
It can be understood that each module/unit of the projection apparatus 300 shown in fig. 3 implements the corresponding step of the projection method 200 provided by the embodiments of the disclosure and achieves the corresponding technical effect; for brevity, details are not repeated here.
FIG. 4 illustrates a block diagram of an electronic device that may be used to implement embodiments of the present disclosure. Electronic device 400 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic device 400 may also represent various forms of mobile devices, such as personal digital processors, cellular telephones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 4, the electronic device 400 may include a computing unit 401 that may perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 402 or loaded from a storage unit 408 into a random-access memory (RAM) 403. The RAM 403 may also store various programs and data required for the operation of the electronic device 400. The computing unit 401, the ROM 402, and the RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to the bus 404.
A number of components in the electronic device 400 are connected to the I/O interface 405, including: an input unit 406 such as a keyboard, a mouse, or the like; an output unit 407 such as various types of displays, speakers, and the like; a storage unit 408 such as a magnetic disk, optical disk, or the like; and a communication unit 409 such as a network card, modem, wireless communication transceiver, etc. The communication unit 409 allows the electronic device 400 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The various embodiments described herein above may be implemented in digital electronic circuitry, integrated circuits, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special- or general-purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a computer-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a computer-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
It should be noted that the present disclosure also provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the method 200 and achieve the corresponding technical effects of the embodiments of the present disclosure; for brevity, a detailed description is omitted here.
Additionally, the present disclosure also provides a computer program product comprising a computer program which, when executed by a processor, implements the method 200.
To provide for interaction with a user, the above-described embodiments may be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The embodiments described above may be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user may interact with an implementation of the systems and techniques described herein), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Claims (9)
1. A method of projection, the method comprising:
acquiring a position and a viewing angle of a user in real time;
determining a projection area according to the position and the viewing angle of the user and an indoor environment;
adjusting an image to be projected according to the projection area; and
projecting the adjusted image to be projected onto the projection area.
2. The method of claim 1, wherein acquiring the position and the viewing angle of the user in real time comprises:
performing pose recognition on the user to obtain the position and the viewing angle of the user.
3. The method of claim 1, wherein determining the projection area according to the position and the viewing angle of the user and the indoor environment comprises:
determining a field of view of the user according to the position and the viewing angle of the user; and
selecting, based on the user's field of view and the indoor environment, a region within the user's field of view as the projection area according to a preset rule.
4. The method of claim 3, wherein selecting, based on the user's field of view and the indoor environment, a region within the user's field of view as the projection area according to the preset rule comprises:
identifying one or more planar regions of the indoor environment within the user's field of view; and
determining the projection area from the planar regions within the user's field of view according to attribute information of those planar regions.
5. The method of claim 4, wherein the attribute information comprises at least one of region flatness, region color, region size, and region shape; and
wherein determining the projection area from the planar regions within the user's field of view according to the attribute information comprises:
determining, from the planar regions within the user's field of view, candidate projection regions whose attribute information meets a projection attribute condition;
calculating a distance and/or an angle between the user and each candidate projection region; and
selecting, from the candidate projection regions whose attribute information meets the projection attribute condition, a candidate projection region whose distance meets a projection distance condition and/or whose angle meets a projection angle condition as the projection area.
6. The method of claim 1, wherein adjusting the image to be projected according to the projection area comprises:
determining a relative positional relationship between a projection device and the projection area; and
updating parameters of the projection device according to attribute information of the projection area and the relative positional relationship, so as to adjust the image to be projected.
7. A projection apparatus, the apparatus comprising:
an acquisition module configured to acquire a position and a viewing angle of a user in real time;
a determining module configured to determine a projection area according to the position and the viewing angle of the user and an indoor environment;
an adjusting module configured to adjust an image to be projected according to the projection area; and
a projection module configured to project the adjusted image to be projected onto the projection area.
8. An electronic device, characterized in that the electronic device comprises:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-6.
9. A non-transitory computer readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-6.
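As an illustrative sketch only, claims 3 through 5 describe a two-stage selection: first filter the planar regions in the user's field of view by attribute conditions (e.g. flatness, size), then filter the survivors by distance and angle relative to the user. The sketch below shows one plausible reading of that filter chain; every name, threshold, and data structure here is an assumption for illustration and does not appear in the patent text.

```python
from dataclasses import dataclass
import math

@dataclass
class PlanarRegion:
    """A candidate planar surface detected in the indoor environment.
    Attribute names and units are illustrative assumptions."""
    center: tuple     # (x, y, z) in metres, room coordinates
    flatness: float   # 0.0 (very uneven) .. 1.0 (perfectly flat)
    area_m2: float    # usable surface area

def select_projection_region(user_pos, user_dir, regions,
                             min_flatness=0.8, min_area=0.5,
                             max_distance=5.0, max_angle_deg=30.0):
    """Two-stage filter in the spirit of claim 5:
    1) keep regions whose attributes meet the projection attribute
       conditions (flatness and size thresholds here);
    2) among those, keep regions whose distance and angle to the user
       meet the projection distance/angle conditions, returning the
       nearest survivor, or None if no region qualifies."""
    # Stage 1: attribute conditions.
    candidates = [r for r in regions
                  if r.flatness >= min_flatness and r.area_m2 >= min_area]
    best = None
    for r in candidates:
        # Stage 2: distance condition.
        dx = [r.center[i] - user_pos[i] for i in range(3)]
        dist = math.sqrt(sum(d * d for d in dx))
        if dist == 0 or dist > max_distance:
            continue
        # Angle between the user's viewing direction and the direction
        # from the user to the region's centre.
        dot = sum(user_dir[i] * dx[i] for i in range(3))
        norm = math.sqrt(sum(d * d for d in user_dir)) * dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        if angle > max_angle_deg:
            continue
        if best is None or dist < best[0]:
            best = (dist, r)
    return None if best is None else best[1]

# Example: two surfaces in front of the user; the flat, nearby wall
# passes both stages, while the uneven one is rejected at stage 1.
user_pos = (0.0, 0.0, 1.6)   # standing user, eye height 1.6 m
user_dir = (1.0, 0.0, 0.0)   # looking along +x
regions = [
    PlanarRegion(center=(3.0, 0.2, 1.5), flatness=0.95, area_m2=2.0),
    PlanarRegion(center=(4.5, 0.1, 1.5), flatness=0.60, area_m2=3.0),
]
chosen = select_projection_region(user_pos, user_dir, regions)
print(chosen.center)  # → (3.0, 0.2, 1.5)
```

Claim 6's adjustment step would then use the relative position of the projector and the chosen region (not modelled here) to update projection parameters such as keystone correction.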
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210209625.3A CN114727077A (en) | 2022-03-04 | 2022-03-04 | Projection method, apparatus, device and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114727077A true CN114727077A (en) | 2022-07-08 |
Family
ID=82236564
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210209625.3A Pending CN114727077A (en) | 2022-03-04 | 2022-03-04 | Projection method, apparatus, device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114727077A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116489326A (en) * | 2023-04-07 | 2023-07-25 | 深圳市臻火科技有限公司 | Automatic following projection method and device and electronic equipment |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104869372A (en) * | 2014-02-25 | 2015-08-26 | 联想(北京)有限公司 | Projection method and electronic equipment |
CN106647821A (en) * | 2016-12-27 | 2017-05-10 | Tcl数码科技(深圳)有限责任公司 | Indoor projection following control method and system |
CN107977082A (en) * | 2017-12-19 | 2018-05-01 | 亮风台(上海)信息科技有限公司 | A kind of method and system for being used to AR information be presented |
CN109815939A (en) * | 2019-03-01 | 2019-05-28 | 北京当红齐天国际文化发展集团有限公司 | Projection display system and method based on human eye tracking |
CN109857246A (en) * | 2018-12-28 | 2019-06-07 | 努比亚技术有限公司 | Terminal and its 3D display control method and computer readable storage medium |
CN109960401A (en) * | 2017-12-26 | 2019-07-02 | 广景视睿科技(深圳)有限公司 | A kind of trend projecting method, device and its system based on face tracking |
CN110290368A (en) * | 2019-08-02 | 2019-09-27 | 北京小狗智能机器人技术有限公司 | A kind of control method and device of projection type equipment |
US20210329203A1 (en) * | 2017-12-13 | 2021-10-21 | Goertek Inc. | Projection method and projection device |
CN113612978A (en) * | 2021-07-01 | 2021-11-05 | 江西科骏实业有限公司 | Geometric distortion correction method, device, system and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB03 | Change of inventor or designer information | ||
Inventors after change: Shi Xuan; Kang Hua. Inventors before change: Shi Xuan; Wang Hongguang.