CN110610545B - Image display method, terminal, storage medium and processor - Google Patents


Publication number: CN110610545B
Authority: CN (China)
Legal status: Active
Application number: CN201810618191.6A
Original language: Chinese (zh)
Other versions: CN110610545A
Inventor: 杨辰晖
Current and original assignee: Zhejiang Tmall Technology Co Ltd
Application filed by Zhejiang Tmall Technology Co Ltd
Priority application: CN201810618191.6A
Published as CN110610545A (application); granted as CN110610545B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality


Abstract

The invention discloses an image display method, a terminal, a storage medium and a processor. The method comprises the following steps: acquiring feature data of an object to be displayed, where the feature data are data acquired according to the preferences of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the feature data and the initial image; and displaying the imaging data. The invention solves the technical problem that existing AR devices cannot effectively display an augmented-scene image according to the personal preferences of the user.

Description

Image display method, terminal, storage medium and processor
Technical Field
The invention relates to the technical field of augmented reality, in particular to an image display method, a terminal, a storage medium and a processor.
Background
Augmented Reality (AR) is an emerging technology developed on the basis of Virtual Reality (VR) that enhances the user's understanding of, and experience in, the real environment by integrating computer-generated virtual objects with that environment; it has been one of the development hot spots in recent years.
However, in the era of big data, how to combine big data with the emerging AR technology to improve the user experience of an AR device in various application scenarios remains a technical problem to be solved. For example, in a shopping scenario, how to combine the personal preferences of a user wearing or using an AR device with the user's personal image, and display the resulting image in the current scene to the user, has no effective solution in the prior art.
For the problem that existing AR devices cannot effectively display an augmented-scene image according to the user's personal preferences, no effective solution has yet been proposed.
Disclosure of Invention
The embodiments of the invention provide an image display method, a terminal, a storage medium and a processor, so as to at least solve the technical problem that existing AR devices cannot effectively display an augmented-scene image according to the personal preferences of the user.
According to one aspect of the embodiments of the present invention, an image display method is provided, including: acquiring feature data of an object to be displayed, where the feature data are data acquired according to the preferences of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the feature data and the initial image; and displaying the imaging data.
According to another aspect of the embodiments of the present invention, an image display method is also provided, including: acquiring preference data of an object to be displayed; acquiring an initial image of the object to be displayed, where the initial image is an augmented reality (AR) image displayed on the terminal; determining imaging data of the object to be displayed according to the preference data and the initial image; and displaying the imaging data.
According to another aspect of the embodiments of the present invention, a terminal is also provided, including: a processor, configured to acquire feature data of an object to be displayed, where the feature data are data acquired according to the preferences of the object to be displayed; an image acquisition device, configured to acquire an initial image of the object to be displayed; and a display device, configured to acquire imaging data of the object to be displayed according to the feature data and the initial image, and to display the imaging data.
According to another aspect of the embodiments of the present invention, a storage medium is also provided, including a stored program, where, when the program runs, a device where the storage medium is located is controlled to perform the following functions: acquiring feature data of an object to be displayed, where the feature data are data acquired according to the preferences of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the feature data and the initial image; and displaying the imaging data.
According to another aspect of the embodiments of the present invention, a processor is also provided, configured to run a program, where the program, when running, performs the following functions: acquiring feature data of an object to be displayed, where the feature data are data acquired according to the preferences of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the feature data and the initial image; and displaying the imaging data.
In the embodiments of the invention, big data are combined with augmented reality technology: the feature data of the object to be displayed are acquired, where the feature data are obtained according to the preferences of the object to be displayed; an initial image of the object to be displayed is acquired; imaging data of the object to be displayed are acquired according to the feature data and the initial image; and the imaging data are displayed. This achieves the purpose of displaying an augmented-scene image according to the user's personal preferences, attains the technical effect of improving the user's experience in the current augmented scene, and thereby solves the technical problem that existing AR devices cannot effectively display an augmented-scene image according to the user's personal preferences.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an image display method according to an embodiment of the present application;
FIG. 2 is a schematic view of a scenario of an alternative image display method according to an embodiment of the present application;
FIG. 3 is a schematic view of a scenario of another alternative image display method according to an embodiment of the present application;
FIG. 4 is a schematic view of a scenario of yet another alternative image display method according to an embodiment of the present application;
FIG. 5 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 6 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 7 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 8 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 9 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 10 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 11 is a flow chart of an alternative image display method according to an embodiment of the present application;
FIG. 12 is a schematic structural view of a terminal according to an embodiment of the present application;
FIG. 13 is a flow chart of another alternative image display method according to an embodiment of the present application;
FIG. 14 is a schematic structural view of an alternative image display device according to an embodiment of the present application;
FIG. 15 is a schematic view of an alternative image display device according to an embodiment of the present application; and
FIG. 16 is a hardware configuration block diagram of a computer terminal (or mobile device) for implementing an image display method according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art on the basis of the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
First, some of the terms appearing in the description of the embodiments of the present application are explained as follows:
Augmented reality (AR): a technology that calculates the position and angle of the camera image in real time and superimposes corresponding images, videos, and 3D models, so that real-world information and virtual-world information are integrated seamlessly and the real and virtual worlds can interact on the screen.
Preference: a basic concept in microeconomic value theory; it refers to a consumer's ranking of alternative commodity bundles according to his or her own will, and is a relative concept.
Example 1
Before describing further details of the embodiments of the present application, one suitable image display method embodiment that may be used to implement the principles of the present application will be described with reference to FIG. 1.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, for example as a set of computer-executable instructions, and that, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from that given here.
The image display method embodiment provided in Embodiment 1 of the present application can be widely applied on the Internet. With the advent of the big-data era and the rapid development of AR technology, how to combine big data with the emerging AR technology to improve the user experience of AR devices in various application scenarios remains a technical problem to be solved. For example, in a shopping scenario, how to combine the personal preferences of a user wearing or using an AR device with the user's personal image, and display the resulting image of the current scene to the user, has no effective solution in the prior art.
In the above operating environment, the present application provides an embodiment of an image display method as shown in fig. 1. Fig. 1 is a flowchart of an image display method according to an embodiment of the present application; as shown in fig. 1, the image display method includes the following steps:
step S102, obtaining characteristic data of an object to be displayed.
In the step S102, the object to be displayed may be a user wearing an augmented reality AR device (AR device) or using an augmented reality AR device, and the AR device may be in the following form: AR helmets, AR glasses, AR eye shields, and the like.
Specifically, the feature data may be data obtained according to the preference of the object to be displayed; as an alternative embodiment, the behavior and habit of the user may be observed through the augmented reality device carried by the user, and when the user stays for a long time in a part of the content of the augmented reality, the AR device understands the part of the content as the point of interest of the user, so that the preference of the user may be determined.
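The dwell-time heuristic just described can be sketched as follows. This is an illustrative Python sketch only, not the patent's implementation: the `DwellTracker` class, its method names, and the 3-second threshold are all assumptions.

```python
import time
from collections import defaultdict

class DwellTracker:
    """Infer user interest from how long attention rests on AR content.

    Illustrative sketch of the dwell-time heuristic: content gazed at
    for at least `interest_threshold_s` seconds counts as a preference.
    """

    def __init__(self, interest_threshold_s=3.0):
        self.interest_threshold_s = interest_threshold_s
        self.dwell = defaultdict(float)  # content id -> accumulated seconds
        self._current = None
        self._since = 0.0

    def on_gaze(self, content_id, now=None):
        """Record that the user's gaze moved to `content_id`."""
        now = time.monotonic() if now is None else now
        if self._current is not None:
            # Credit the dwell time to the content just left.
            self.dwell[self._current] += now - self._since
        self._current, self._since = content_id, now

    def interests(self):
        """Content the user dwelt on long enough to count as a preference."""
        return {c for c, t in self.dwell.items()
                if t >= self.interest_threshold_s}
```

Feature data for step S102 could then be derived from whatever `interests()` returns.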
Step S104, acquiring the initial image of the object to be displayed.
In step S104, the initial image of the object to be displayed may be, but is not limited to, an augmented reality (AR) image displayed on the screen of the AR device, specifically an original character image acquired by an image acquisition device (e.g., a front or rear camera) on the AR device.
In an optional embodiment, an image of the object to be displayed that is stored in advance locally in the AR device, or an image stored in a network-side server, may be obtained as the initial image; alternatively, the current image of the object to be displayed may be captured by an image acquisition device on the AR device to obtain the initial image.
It should be noted that the order of step S102 and step S104 may be interchanged, and the steps shown or described may be performed in an order different from that given here; for example, step S104 may be performed first to acquire the initial image of the object to be displayed, and step S102 may then be performed to acquire the feature data of the object to be displayed.
Step S106, acquiring imaging data of the object to be displayed according to the feature data and the initial image.
Step S108, displaying the imaging data.
In steps S106 to S108, the imaging data of the object to be displayed may be acquired by, but not limited to, a display device (e.g., a display screen) on the AR device according to the feature data and the initial image, and displayed by the display device.
The display device may, but is not limited to, acquire the imaging data of the object to be displayed by: acquiring at least one garment corresponding to the feature data; and combining the at least one garment with the corresponding body part in the initial image to obtain the imaging data.
As can be seen from the above, the present application acquires the feature data of the object to be displayed, where the feature data are obtained according to the preferences of the object to be displayed; acquires an initial image of the object to be displayed; acquires imaging data of the object to be displayed according to the feature data and the initial image; and displays the imaging data.
It is easy to see that, by combining big data with augmented reality technology, the present application obtains the feature data and the initial image of the object to be displayed through the AR device, where the feature data are obtained according to the preferences of the object to be displayed; acquires the imaging data of the object to be displayed according to the feature data and the initial image; and displays the augmented imaging data on the AR device. This achieves the purpose of displaying an augmented-scene image according to the user's personal preferences, attains the technical effect of improving the user's experience in the current augmented scene, and solves the technical problem that existing AR devices cannot effectively display an augmented-scene image according to the user's personal preferences.
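The four steps S102 to S108 can be summarized in a short end-to-end sketch. The `ARDeviceStub` class and all of its method names are purely illustrative assumptions standing in for real device APIs, which the patent does not specify.

```python
class ARDeviceStub:
    """Stand-in for an AR terminal; all method names are assumptions."""

    def get_feature_data(self):        # S102: preference-derived features
        return ["hat"]

    def capture_initial_image(self):   # S104: e.g. from the front camera
        return {"head": "bare"}

    def render(self, features, image): # S106: overlay preferred items
        out = dict(image)
        if "hat" in features:
            out["head"] = "hat"
        return out

    def display(self, imaging):        # S108: show on the device screen
        self.shown = imaging


def run_image_display(device):
    """Execute steps S102-S108 in order and return the imaging data."""
    features = device.get_feature_data()
    initial = device.capture_initial_image()
    imaging = device.render(features, initial)
    device.display(imaging)
    return imaging
```

On a real device, `render` would composite garment models onto the captured image rather than update a dictionary.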
In an alternative embodiment, fig. 2 is a schematic view of a scenario of an alternative image display method according to an embodiment of the present application. As shown in fig. 2, the AR device may obtain the preferences of the object to be displayed in advance, obtain the feature data of the object to be displayed according to those preferences, and acquire the initial image of the object to be displayed. After the initial image and the feature data are obtained, the imaging data of the object to be displayed can be acquired through a display device on the AR device according to the feature data and the initial image. In addition, to better integrate virtual-world information with real-world information, in the embodiment of the present application, after the display device acquires the imaging data of the object to be displayed, the imaging data are matched with the initial image of the object to be displayed, and the successfully matched imaging data are displayed to the user using or wearing the AR device.
In another alternative embodiment, fig. 3 is a schematic view of a scenario of another alternative image display method according to an embodiment of the present application. As shown in fig. 3, the AR device may be an AR head-mounted device such as AR glasses. Personal preference data of the object to be displayed, for example personal wearing preferences, can be obtained through the AR glasses carried by the user. The personal preference data may be data stored in advance in the AR glasses, or data obtained by the AR glasses in real time through the image acquisition device. For example, the AR glasses may observe the user's behavior and habits; when the user's attention stays for a long time on part of the augmented reality content, the AR glasses interpret that content as a preference of the user. The AR glasses can therefore obtain the preferences of the object to be displayed in advance and obtain the feature data of the object to be displayed accordingly. As shown in fig. 3, when the AR glasses learn that the object to be displayed prefers hats as wearing accessories, wearing a hat may be used as the feature data of the object to be displayed.
In the above embodiment, as shown in fig. 3, the initial image of the object to be displayed may be obtained by the image acquisition device in the AR glasses; for example, an image of the object to be displayed stored in advance in the AR glasses may be used as the initial image, or the current image of the object to be displayed may be captured in real time by the image acquisition device in the AR glasses to obtain the initial image. After the initial image and the feature data are obtained, the imaging data of the object to be displayed can be acquired through the display screen of the AR glasses according to the feature data and the initial image, and displayed to the user wearing the AR glasses.
In yet another optional embodiment, fig. 4 is a schematic view of a scenario of still another alternative image display method according to an embodiment of the present application. As shown in fig. 4, the AR device may also be a terminal device on the user side, for example a smart phone configured or installed with software capable of implementing AR technology. The personal preference data of the object to be displayed may be data stored in advance in the smart phone, or data obtained by the smart phone in real time through the image acquisition device. For example, the smart phone may observe the user's behavior and habits through its image acquisition device (front and rear cameras); when the user dwells longer on part of the augmented reality content, the smart phone interprets that content as a preference of the user. The smart phone can therefore obtain the preferences of the object to be displayed in advance and obtain the feature data of the object to be displayed accordingly. As shown in fig. 4, when the smart phone learns that the object to be displayed prefers gloves as wearing accessories, wearing gloves may be used as the feature data of the object to be displayed.
In the above embodiment, as shown in fig. 4, the initial image of the object to be displayed may be obtained by the image acquisition device in the smart phone; for example, an image of the object to be displayed stored in advance in the smart phone may be used as the initial image, or the current image of the object to be displayed may be captured in real time by the image acquisition device to obtain the initial image. After the initial image and the feature data are obtained, the imaging data of the object to be displayed can be acquired through the display screen of the smart phone according to the feature data and the initial image, and displayed to the user of the smart phone.
In an alternative embodiment, fig. 5 is a flowchart of an alternative image display method according to an embodiment of the present application. As shown in fig. 5, before step S102, that is, before the feature data of the object to be displayed are acquired, the method further includes:
step S202, obtaining the identity information of the object to be displayed.
Specifically, in step S202, the identity information may be an account corresponding to the object to be displayed, for example a Taobao, JD.com (Jingdong), Suning.com, or Alipay account registered by the object to be displayed.
Step S204, determining the feature data according to the identity information.
Based on the alternative embodiments provided in steps S202 to S204, before the feature data of the object to be displayed are acquired, the feature data may be determined according to the identity information of the object to be displayed; for example, preference data may be determined from the purchase records of the object to be displayed, and the feature data may be determined from the preference data. This achieves the purpose of quickly and accurately determining the feature data of the object to be displayed based on its identity information.
In an alternative embodiment, the identity information includes an account corresponding to the object to be displayed. Fig. 6 is a flowchart of an alternative image display method according to an embodiment of the present application; as shown in fig. 6, acquiring the feature data of the object to be displayed in step S102 specifically includes the following steps:
Step S302, acquiring a history operation record under the account from a server.
In step S302, the account may be a Taobao, JD.com (Jingdong), Suning.com, or Alipay account registered by the object to be displayed; the history operation record may be historical information under that account, such as shopping and consumption records, payment records, delivery records, and return records.
When the identity information of the object to be displayed is an account corresponding to the object to be displayed, the history operation record under that account can be obtained from a network-side server, which achieves the purpose of accurately determining the feature data of the object to be displayed in real time.
Step S304, acquiring the target object corresponding to the maximum number of operations in the history operation record, and determining the feature data of the object to be displayed based on the target object.
In step S304, the target object may be the clothing, shoes, accessories, etc. purchased by the object to be displayed; the preference data and feature data of the object to be displayed may be determined based on the items that the object to be displayed has purchased most often.
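Selecting the target object with the maximum operation count, as in step S304, amounts to a frequency count over the history records. The record shape used below, a list of `(action, item)` tuples, is an assumption made for illustration; the patent does not specify the record format.

```python
from collections import Counter

def most_operated_item(history):
    """Return the item with the highest operation count in the account's
    history operation record (purchases, payments, returns, ...), or
    None if the record is empty.
    """
    counts = Counter(item for _action, item in history)
    if not counts:
        return None
    target, _count = counts.most_common(1)[0]
    return target
```

The returned item (or its image and attribute information) would then serve as the basis for the feature data.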
In an alternative embodiment, fig. 7 is a flowchart of an alternative image display method according to an embodiment of the present application. As shown in fig. 7, determining the feature data of the object to be displayed based on the target object includes:
step S402, obtaining at least one of the following information of the target object: image information of the target object and attribute information of the target object;
step S404, the at least one piece of information of the target object is used as the characteristic data of the object to be displayed.
Based on the optional embodiments provided in the above steps S402 to S404, after the target object corresponding to the maximum operation number in the above history of operation records is acquired, at least one of the following information of the target object may be acquired: image information of the target object, and attribute information of the target object.
Taking the above target object as the jean jacket as an example, image information (picture or video image) of the jean jacket and attribute information of the jean jacket can be obtained: the dress-coat-women jeans coat can further take the image information of the target object or the attribute information of the target object as the characteristic data of the object to be displayed.
In another alternative embodiment, the identity information includes user information stored locally on the terminal used by the object to be displayed. Fig. 8 is a flowchart of an alternative image display method according to an embodiment of the present application; as shown in fig. 8, acquiring the feature data of the object to be displayed in step S102 further includes the following steps:
Step S502, acquiring the user information stored locally on the terminal used by the object to be displayed.
In step S502, the terminal may be an AR device (an AR helmet, AR glasses, an AR eye mask, etc.), or a terminal device capable of implementing AR functions (a smart phone, a tablet computer, a wearable device, a computer device, etc.).
As an alternative embodiment, the user information includes: the history operation record of the object to be displayed on the terminal, and the area information of the area where the terminal is located.
It should be noted that the history operation record of the object to be displayed on the terminal may be a record of the object to be displayed browsing shopping websites, or of searching for or purchasing items on a shopping website. The area information of the area where the terminal is located can be obtained through the GPS positioning function of the terminal; clearly, whether the terminal is located in a first-tier city or in a remote small town can affect the preferences of the object to be displayed to a certain extent.
Step S504, obtaining the feature data according to the user information.
In step S504, the feature data of the object to be displayed may be obtained according to the history operation record of the object to be displayed on the terminal and the area information of the area where the terminal is located.
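Steps S502 to S504 can be sketched as combining the local browsing record with the region signal. The tier-based bias below is a hypothetical heuristic for illustration only; the patent does not specify how region information influences the feature data.

```python
from collections import Counter

def feature_data_from_local_info(browse_history, region_tier):
    """Derive feature data from locally stored user info: the most-browsed
    items plus a coarse region-based bias (hypothetical heuristic).
    """
    counts = Counter(browse_history)
    top_items = [item for item, _ in counts.most_common(3)]
    # Region can plausibly shift preferences, e.g. first-tier cities
    # toward trend items; this mapping is an illustrative assumption.
    region_bias = {"tier1": ["trend"], "remote": ["basic"]}.get(region_tier, [])
    return top_items + region_bias
```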
In another optional embodiment, fig. 9 is a flowchart of an alternative image display method according to an embodiment of the present application. As shown in fig. 9, before the imaging data of the object to be displayed are acquired according to the feature data and the initial image, the method further includes the following steps:
step S602, displaying the prompt information and the characteristic data.
In step S602, the AR device may display, through a display screen, a prompt message and feature data, where the prompt message is used to prompt a user whether to acquire the imaging data according to the feature data or the initial image.
Step S604, receiving a selection instruction corresponding to the prompt information.
In the step S604, the user may send the selection instruction to the AR device by clicking a key, a touch screen, or voice control provided on the AR device.
Step S606, determining to enable or disable the acquisition of the imaging data of the object to be displayed according to the characteristic data and the initial image according to the selection instruction.
Based on the optional embodiments provided in the steps S602 to S606, by displaying the prompt information and the acquired feature data to the user, it is able to confirm to the user whether to allow the imaging data of the object to be displayed to be acquired according to the feature data and the initial image before acquiring the imaging data of the object to be displayed, so as to effectively enhance the scene experience feeling of the user in augmented reality.
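The confirm-before-render flow of steps S602 to S606 can be sketched as below; `prompt_user` and `render` are hypothetical callables standing in for the device's display prompt and imaging pipeline, neither of which is specified by the patent.

```python
def confirm_and_acquire(prompt_user, feature_data, initial_image, render):
    """Show the prompt and feature data (S602), receive the selection
    instruction (S604), and enable or disable imaging-data acquisition
    accordingly (S606).
    """
    allowed = prompt_user(
        "Acquire imaging data from these preferences?", feature_data)
    if not allowed:
        # Acquisition disabled: fall back to the unmodified initial image.
        return initial_image
    return render(feature_data, initial_image)
```

On a real device, `prompt_user` would block on a key press, touch, or voice command rather than return immediately.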
In this embodiment of the present application, there is also an optional implementation. Fig. 10 is a flowchart of an alternative image display method according to an embodiment of the present application; as shown in fig. 10, acquiring the imaging data of the object to be displayed according to the feature data and the initial image in step S106 specifically includes the following steps:
step S702, at least one piece of clothing corresponding to the characteristic data is acquired.
In step S702, after acquiring the feature data of the object to be displayed, the AR device may further determine at least one corresponding garment, such as a glove, a hat, a coat, or trousers, according to the feature data.
And step S704, combining the at least one piece of clothing on the corresponding body part in the initial image to obtain the imaging data.
In step S704, if the at least one garment is a glove, the glove may be combined at the hand position of the initial figure to obtain the imaging data; if the at least one garment is a coat, the coat may be combined at the torso position of the initial figure to obtain the imaging data.
In addition, at least one hairstyle corresponding to the feature data may also be acquired and combined with the head of the initial figure to obtain the imaging data.
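Steps S702 to S704 amount to mapping each garment (or hairstyle) to a body part of the initial figure and attaching it there. The following sketch shows the idea under the assumption that the figure and the imaging data are represented as dictionaries keyed by body part; the mapping table and function name are illustrative, not from the patent.

```python
# Hypothetical mapping from garment type to the body part it is combined with.
GARMENT_TO_BODY_PART = {
    "glove": "hand",
    "hat": "head",
    "coat": "torso",
    "trousers": "legs",
    "hairstyle": "head",
}


def combine(initial_figure: dict, garments: list) -> dict:
    """Return imaging data: the initial figure with each garment attached
    to its corresponding body part (step S704)."""
    # Copy so the initial figure is left untouched.
    imaging = {part: list(items) for part, items in initial_figure.items()}
    for garment in garments:
        part = GARMENT_TO_BODY_PART.get(garment)
        if part is not None:
            imaging.setdefault(part, []).append(garment)
    return imaging


figure = {"hand": [], "head": []}
result = combine(figure, ["glove", "hat"])
# The glove lands on the hand and the hat on the head of the figure.
```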
As an alternative embodiment, fig. 11 is a flowchart of an alternative image display method according to an embodiment of the present application. As shown in fig. 11, acquiring the initial image of the object to be displayed in step S104 specifically includes the following steps:
Step S802, acquiring a pre-stored image of the object to be displayed to obtain the initial image; or,
step S804, collecting the current image of the object to be displayed through the terminal to obtain the initial image.
In the alternative embodiments provided in the steps S802 to S804, the image of the object to be displayed may be an image stored locally in the terminal or an image stored in a server on the network side, and the source of the image of the object to be displayed is not particularly limited in this application.
In an alternative embodiment, the AR device may, but is not limited to, determine the initial figure of the object to be displayed by acquiring an image of the object to be displayed stored locally in the terminal of the AR device or stored in a server on the network side.
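The two sources in steps S802 and S804 compose naturally as a lookup with a capture fallback. A minimal sketch, assuming a dictionary stands in for local or server-side storage and a callback stands in for the terminal's camera (both names are hypothetical):

```python
def get_initial_image(stored_images: dict, object_id: str, capture):
    """Return the initial image of the object to be displayed.

    Prefer a pre-stored image (step S802); otherwise capture the current
    image through the terminal's image acquisition device (step S804).
    """
    image = stored_images.get(object_id)  # S802: pre-stored, local or server-side
    if image is None:
        image = capture()                 # S804: capture via front/rear camera
    return image


# Pre-stored image wins when present; the camera is the fallback.
assert get_initial_image({"user1": "stored.png"}, "user1", lambda: "live.png") == "stored.png"
assert get_initial_image({}, "user1", lambda: "live.png") == "live.png"
```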
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
From the description of the above embodiments, it will be clear to a person skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, though in many cases the former is preferred. Based on such an understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods of the various embodiments of the present invention.
Example 2
In accordance with an embodiment of the present invention, there is further provided an embodiment of a terminal. Fig. 12 is a schematic structural diagram of a terminal according to an embodiment of the present application; the depicted structure is merely an example of a suitable environment for descriptive purposes and is not intended to suggest any limitation on the scope of use or functionality of the application. Nor should the terminal be construed as having any dependency on or requirement relating to any one or combination of the components illustrated in fig. 12.
As shown in fig. 12, the terminal provided in the embodiment of the present application includes: a processor 100, an image acquisition device 120, and a display device 140, wherein,
a processor 100, configured to obtain feature data of an object to be displayed, where the feature data is data obtained according to a preference of the object to be displayed; the image acquisition device 120 is configured to acquire an initial image of the object to be displayed; a display device 140 for acquiring imaging data of the object to be displayed according to the feature data and the initial image; and displaying the imaging data.
In an alternative embodiment, the terminal is a smart device, which may be, but is not limited to, a smart phone, a tablet, a computer, or a wearable device, and may in particular be an augmented reality (AR) head-mounted device (AR device); the AR head-mounted device may take the following forms: a helmet, glasses, an eye mask, and the like.
In addition, the image capturing device 120 may be a front or rear camera of the terminal, and the display device 140 may be a display screen of the terminal. It should be noted that, in the alternative embodiment provided in the present application, the image capturing device 120 and the display device 140 may be implemented integrally.
In another alternative embodiment, the object to be displayed may be a user carrying or using an AR device, and the feature data may be personal image data of the object to be displayed, for example, an appearance feature, a physical feature, and the like; specifically, the feature data may be obtained according to the preference of the object to be displayed.
As an optional embodiment, the present application may further observe the behavior and habits of the user through the AR device carried or used by the user; when the user dwells on part of the augmented reality content for a long time, the AR device interprets that content as a point of interest of the user, so as to determine the preference of the object to be displayed.
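The dwell-time heuristic described above can be sketched as follows. The threshold value, the log format, and the function name are assumptions for illustration only; the patent does not specify them.

```python
# Assumed threshold: content dwelled on for at least this long counts as
# a point of interest of the user.
DWELL_THRESHOLD_S = 3.0


def points_of_interest(dwell_log: dict) -> list:
    """dwell_log maps a content identifier to the seconds the user dwelled
    on that content; return the identifiers interpreted as points of
    interest, from which the preference data can be determined."""
    return [cid for cid, secs in dwell_log.items() if secs >= DWELL_THRESHOLD_S]


log = {"red_coat_ad": 5.2, "menu": 0.4, "hat_demo": 3.1}
# Only the two items above the threshold are treated as points of interest.
```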
As can be seen from the above, the processor in the embodiment of the present application is configured to obtain feature data of an object to be displayed, where the feature data is data obtained according to preferences of the object to be displayed; the image acquisition device is used for acquiring the initial image of the object to be displayed; the display device is used for acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; and displaying the imaging data.
It is easy to notice that, in the present application, big data is combined with augmented reality technology: the feature data and the initial image of the object to be displayed are acquired through the AR device, where the feature data is data acquired according to the preference of the object to be displayed; the imaging data of the object to be displayed is acquired according to the feature data and the initial image; and the imaging data with the enhancement effect is displayed on the AR device. This achieves the purpose of displaying an augmented-scene figure according to the user's personal preference, achieves the technical effect of improving the user's experience in the current augmented scene, and solves the technical problem that existing AR devices cannot effectively display an augmented-scene figure according to the user's personal preference.
It should be noted that, before the processor 100 obtains the feature data of the object to be displayed, the identity information of the object to be displayed may also be obtained; and determining characteristic data according to the identity information of the object to be displayed.
As an alternative embodiment, the identity information includes: an account corresponding to the object to be displayed; the processor 100 may obtain a history of operations under the account from a server; the processor 100 may obtain at least one of the following information of the target object corresponding to the maximum operation number in the history operation record: image information of the target object and attribute information of the target object; and the at least one piece of information of the target object can be used as the characteristic data of the object to be displayed.
As another alternative embodiment, the identity information includes: user information locally stored by a terminal used by the object to be displayed. The processor 100 may acquire the user information locally stored in the terminal used by the object to be displayed, and acquire the feature data according to the user information.
Wherein the user information includes: a history operation record of the object to be displayed on the terminal, and area information of the area where the terminal is located. In an optional embodiment provided in the present application, the area information of the area where the terminal is located may be determined according to a GPS positioning function in the terminal device.
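The account-based route described above (acquire the history operation record, pick the target object with the maximum operation count, and use it as the basis of the feature data) can be sketched as follows. The record format — a flat list of operated target-object identifiers — and the function name are assumptions for illustration.

```python
from collections import Counter


def feature_from_history(history: list) -> str:
    """Return the identifier of the target object with the maximum number
    of operations in the history operation record; this target object is
    then used to determine the feature data of the object to be displayed."""
    counts = Counter(history)
    target, _count = counts.most_common(1)[0]
    return target


history = ["red_coat", "hat", "red_coat", "red_coat", "trousers"]
# "red_coat" was operated on three times, the maximum in this record.
```

In practice the image information or attribute information of the chosen target object, as described in the embodiment, would then serve as the feature data.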
In an alternative embodiment, the image capturing device 120 may acquire the initial image of the object to be displayed by: acquiring a pre-stored image of the object to be displayed, and obtaining the initial image; or collecting the current image of the object to be displayed through the terminal to obtain the initial image.
It should be noted that, the image of the object to be displayed may be an image stored locally in the terminal or an image stored in the network side server, and the source of the image of the object to be displayed is not particularly limited in this application.
In an alternative embodiment, before the display device 140 acquires the imaging data of the object to be displayed according to the feature data and the initial image, the display device 140 may further display prompt information and the feature data; receive a selection instruction corresponding to the prompt information; and determine, according to the selection instruction, whether to allow or prohibit acquiring the imaging data of the object to be displayed according to the feature data and the initial image.
It should be noted that the prompt information is used to prompt the user whether to acquire the imaging data according to the feature data and the initial image.
In this embodiment of the present application, as an optional embodiment, the display device 140 may acquire the imaging data of the object to be displayed by: acquiring at least one piece of clothing corresponding to the characteristic data; and combining the at least one garment on the corresponding body part in the initial image to obtain the imaging data.
Example 3
There is further provided, according to an embodiment of the present invention, an embodiment of another image display method. Fig. 13 is a flowchart of another alternative image display method according to an embodiment of the present application; as shown in fig. 13, the image display method includes the following steps:
in step S902, preference data of an object to be displayed is obtained.
In the step S902, the object to be displayed may be a user wearing an augmented reality AR device (AR device) or using an augmented reality AR device, and the AR device may be in the following form: AR helmets, AR glasses, AR eye shields, and the like.
Specifically, the behavior and habits of the user can be observed through the augmented reality device carried by the user; when the user dwells on part of the augmented reality content for a long time, the AR device interprets that content as a point of interest of the user, so that the preference data of the user can be determined.
Step S904, obtaining an initial image of the object to be displayed, wherein the initial image is an Augmented Reality (AR) image displayed on the terminal.
In the step S904, the initial image of the object to be displayed may be, but not limited to, an augmented reality AR image displayed on a screen of the AR device, specifically, an original character image, which is acquired by an image acquisition device (e.g., a front or rear camera) on the AR device.
In an optional embodiment, an image of the object to be displayed, which is stored in advance in the AR device locally, or an image stored in a network side server may be obtained, so as to obtain the initial image; or, acquiring the current image of the object to be displayed through an image acquisition device on the AR equipment so as to obtain the initial image.
It should be noted that the order of steps S902 and S904 may be interchanged, and the steps shown or described may be performed in an order different from that described herein; for example, step S904 may be performed first and then step S902.
Step S906, determining imaging data of the object to be displayed according to the preference data and the initial image;
step S908, displaying the imaging data.
In steps S906 to S908, the imaging data of the object to be displayed may be acquired, but not limited to, according to the preference data and the initial image, and displayed by a display device (e.g., a display screen) on the AR device.
The display device may, but is not limited to, acquire imaging data of the object to be displayed by: acquiring at least one piece of clothing corresponding to the characteristic data; and combining the at least one garment on the corresponding body part in the initial image to obtain the imaging data.
From the above, the present application obtains preference data of the object to be displayed; acquiring an initial image of the object to be displayed, wherein the initial image is an Augmented Reality (AR) image displayed on the terminal; determining imaging data of the object to be displayed according to the preference data and the initial image; the imaging data are displayed.
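The end-to-end flow summarized above (steps S902 to S908) can be sketched as one short function. Everything here is an illustrative assumption — the dictionary representation of the initial image, the `display` callback standing in for the AR device's display screen, and the function name are not from the patent:

```python
def show_figure(preference: str, initial_image: dict, display) -> dict:
    """Determine and display imaging data of the object to be displayed.

    `preference` and `initial_image` are the inputs of steps S902/S904;
    step S906 determines the imaging data from them; step S908 displays it.
    """
    imaging = dict(initial_image)                           # keep the input intact
    imaging.setdefault("overlays", []).append(preference)   # S906: determine imaging data
    display(imaging)                                        # S908: display the imaging data
    return imaging


shown = []
result = show_figure("red_coat", {"source": "camera"}, shown.append)
# The displayed imaging data carries the preference as an overlay.
```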
It is easy to notice that, in the present application, big data is combined with augmented reality technology: the preference data and the initial image of the object to be displayed are acquired through the AR device, the imaging data of the object to be displayed is acquired according to the preference data and the initial image, and the imaging data with the enhancement effect is displayed on the AR device. This achieves the purpose of displaying an augmented-scene figure according to the user's personal preference, achieves the technical effect of improving the user's experience in the current augmented scene, and solves the technical problem that existing AR devices cannot effectively display an augmented-scene figure according to the user's personal preference.
Example 4
There is further provided, according to an embodiment of the present invention, an embodiment of an image display apparatus for implementing the image display method of the above embodiment 1. Fig. 14 is a schematic structural diagram of an alternative image display apparatus according to an embodiment of the present application; as shown in fig. 14, the apparatus includes the following modules: a first acquisition module 400, a second acquisition module 410, a third acquisition module 420, and a display module 430,
a first obtaining module 400, configured to obtain feature data of an object to be displayed, where the feature data is data obtained according to a preference of the object to be displayed; a second obtaining module 410, configured to obtain an initial image of the object to be displayed; a third obtaining module 420, configured to obtain imaging data of the object to be displayed according to the feature data and the initial image; and a display module 430, configured to display the imaging data.
It should be noted that the first obtaining module 400, the second obtaining module 410, the third obtaining module 420, and the display module 430 correspond to steps S102 to S108 in embodiment 1; the four modules are the same as the corresponding steps in the examples and application scenarios implemented, but are not limited to what is disclosed in embodiment 1. It should also be noted that the above modules may operate as part of the apparatus in the computer terminal 10 provided in embodiment 6.
In addition, it should also be noted that for optional or preferred implementations of this embodiment, reference may be made to the related descriptions in embodiments 1 and 2, which will not be repeated here.
Example 5
There is further provided, according to an embodiment of the present invention, an embodiment of an image display apparatus for implementing the image display method of the above embodiment 3. Fig. 15 is a schematic structural diagram of another alternative image display apparatus according to an embodiment of the present application; as shown in fig. 15, the apparatus includes the following units: a first acquisition unit 500, a second acquisition unit 510, a third acquisition unit 520, and a display unit 530,
a first obtaining unit 500, configured to obtain preference data of an object to be displayed; the second obtaining unit 510 is configured to obtain an initial image of the object to be displayed, where the initial image is an augmented reality AR image displayed on the terminal; a third acquiring unit 520 for determining imaging data of the object to be displayed according to the preference data and the initial image; and a display unit 530 for displaying the imaging data.
Here, it should be noted that the first acquiring unit 500, the second acquiring unit 510, the third acquiring unit 520, and the display unit 530 correspond to steps S902 to S908 in embodiment 3; the four units are the same as the corresponding steps in the examples and application scenarios implemented, but are not limited to what is disclosed in embodiment 3. It should also be noted that the above units may operate as part of the apparatus in the computer terminal 10 provided in embodiment 6.
In addition, it should also be noted that for optional or preferred implementations of this embodiment, reference may be made to the related descriptions in embodiments 1, 2, and 3, which will not be repeated here.
Example 6
According to an embodiment of the present invention, there is also provided an embodiment of a computer terminal, which may be any one of a group of computer terminals. Alternatively, in the present embodiment, the above-described computer terminal may be replaced with a terminal device such as a mobile terminal.
Alternatively, in this embodiment, the above-mentioned computer terminal may be located in at least one network device among a plurality of network devices of the computer network.
In this embodiment, the computer terminal may execute the program code of the following steps in the image displaying method of the application program: acquiring characteristic data of an object to be displayed, wherein the characteristic data are data acquired according to the preference of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; the imaging data are displayed.
The method embodiment provided in the first embodiment of the present application may be executed in a mobile terminal, a computer terminal, or a similar computing device. Fig. 16 is a block diagram showing a hardware configuration of a computer terminal (or mobile device) for implementing the image display method. As shown in fig. 16, the computer terminal 10 (or mobile device 10) may include one or more processors 102 (shown as 102a, 102b, … …, 102n), which may include but are not limited to a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA), a memory 104 for storing data, and a transmission module 106 for communication functions. In addition, it may further include: a display, an input/output interface (I/O interface), a universal serial bus (USB) port (which may be included as one of the ports of the I/O interface), a network interface, a power supply, and/or a camera. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 16 is merely illustrative and is not intended to limit the configuration of the electronic device described above. For example, the computer terminal 10 may also include more or fewer components than shown in fig. 16, or have a different configuration than shown in fig. 16.
It should be noted that the one or more processors 102 and/or other data processing circuits described above may be referred to generally herein as a "data processing circuit". The data processing circuit may be embodied in whole or in part in software, hardware, firmware, or any other combination. Furthermore, the data processing circuit may be a single stand-alone processing module, or incorporated, in whole or in part, into any of the other elements in the computer terminal 10 (or mobile device). As referred to in the embodiments of the present application, the data processing circuit acts as a kind of processor control (e.g., the selection of a variable resistance terminal path connected to an interface).
The memory 104 may be used to store software programs and modules of application software, such as program instructions/data storage devices corresponding to the image display method in the embodiment of the present invention, and the processor 102 executes the software programs and modules stored in the memory 104 to perform various functional applications and data processing, that is, implement the image display method of the application program. Memory 104 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the computer terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission means 106 is arranged to receive or transmit data via a network. The specific examples of the network described above may include a wireless network provided by a communication provider of the computer terminal 10. In one example, the transmission device 106 includes a network adapter (Network Interface Controller, NIC) that can connect to other network devices through a base station to communicate with the internet. In one example, the transmission device 106 may be a Radio Frequency (RF) module for communicating with the internet wirelessly.
The display may be, for example, a touch screen type Liquid Crystal Display (LCD) that may enable a user to interact with a user interface of the computer terminal 10 (or mobile device).
It should be noted herein that in some alternative embodiments, the computer terminal 10 illustrated in fig. 16 described above may include hardware elements (including circuitry), software elements (including computer code stored on a computer-readable medium), or a combination of both hardware and software elements. It should be noted that fig. 16 is only one example of a specific example, and is intended to show the types of components that may be present in the above-described computer terminal.
The processor may call the information and the application program stored in the memory through the transmission device to perform the following steps: acquiring characteristic data of an object to be displayed, wherein the characteristic data are data acquired according to the preference of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; the imaging data are displayed.
Optionally, the above processor may further execute program code for: acquiring identity information of the object to be displayed; and determining the characteristic data according to the identity information.
Optionally, the above processor may further execute program code for: acquiring a history operation record under the account from a server; and acquiring a target object corresponding to the maximum operation times in the historical operation record, and determining the characteristic data of the object to be displayed based on the target object.
Optionally, the above processor may further execute program code for: acquiring at least one of the following information of the target object: image information of the target object and attribute information of the target object; and taking the at least one piece of information of the target object as the characteristic data of the object to be displayed.
Optionally, the above processor may further execute program code for: acquiring user information locally stored by a terminal used by the object to be displayed; and acquiring the characteristic data according to the user information.
Optionally, the above processor may further execute program code for: displaying prompt information and the feature data, wherein the prompt information is used to prompt the user whether to acquire the imaging data according to the feature data and the initial image; receiving a selection instruction corresponding to the prompt information; and determining, according to the selection instruction, whether to allow or prohibit acquiring the imaging data of the object to be displayed according to the feature data and the initial image.
Optionally, the above processor may further execute program code for: acquiring at least one piece of clothing corresponding to the characteristic data; and combining the at least one garment on the corresponding body part in the initial image to obtain the imaging data.
Optionally, the above processor may further execute program code for: acquiring a pre-stored image of the object to be displayed, and obtaining the initial image; or collecting the current image of the object to be displayed through the terminal to obtain the initial image.
By adopting the embodiment of the invention, a scheme for displaying the image is provided. The characteristic data of the object to be displayed is obtained, wherein the characteristic data are obtained according to the preference of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; the imaging data are displayed, so that the purpose of enhancing the scene image display according to the personal preference of the user is achieved, and the technical problem that the conventional AR equipment cannot effectively enhance the scene image display according to the personal preference of the user is solved.
The processor may also call the information and the application program stored in the memory through the transmission device to execute the following steps: acquiring preference data of an object to be displayed; acquiring an initial image of the object to be displayed, wherein the initial image is an Augmented Reality (AR) image displayed on the terminal; determining imaging data of the object to be displayed according to the preference data and the initial image; the imaging data are displayed.
It will be appreciated by those skilled in the art that the configuration shown in fig. 16 is only illustrative, and the computer terminal may be a smart phone (such as an Android phone, an iOS phone, etc.), a tablet computer, a palmtop computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, etc. Fig. 16 does not limit the structure of the electronic device. For example, the computer terminal 10 may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in fig. 16, or have a different configuration than shown in fig. 16.
Those of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments may be implemented by a program instructing hardware associated with a terminal device; the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
Example 7
According to an embodiment of the present invention, there is also provided an embodiment of a storage medium. Optionally, in the present embodiment, the above storage medium may be used to store the program code executed by the image display methods provided in the above embodiments 1 and 3.
Alternatively, in this embodiment, the storage medium may be located in any one of the computer terminals in the computer terminal group in the computer network, or in any one of the mobile terminals in the mobile terminal group.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring characteristic data of an object to be displayed, wherein the characteristic data are data acquired according to the preference of the object to be displayed; acquiring an initial image of the object to be displayed; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; the imaging data are displayed.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring identity information of the object to be displayed; and determining the characteristic data according to the identity information.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring a history operation record under the account from a server; and acquiring a target object corresponding to the maximum operation times in the historical operation record, and determining the characteristic data of the object to be displayed based on the target object.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring at least one of the following information of the target object: image information of the target object and attribute information of the target object; and taking the at least one piece of information of the target object as the characteristic data of the object to be displayed.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring user information locally stored by a terminal used by the object to be displayed; and acquiring the characteristic data according to the user information.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: displaying prompt information and the feature data, wherein the prompt information is used to prompt the user whether to acquire the imaging data according to the feature data and the initial image; receiving a selection instruction corresponding to the prompt information; and determining, according to the selection instruction, whether to allow or prohibit acquiring the imaging data of the object to be displayed according to the feature data and the initial image.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring at least one piece of clothing corresponding to the characteristic data; and combining the at least one garment on the corresponding body part in the initial image to obtain the imaging data.
Alternatively, in the present embodiment, the storage medium is configured to store program code for performing the steps of: acquiring a pre-stored image of the object to be displayed, and obtaining the initial image; or collecting the current image of the object to be displayed through the terminal to obtain the initial image.
Alternatively, in the present embodiment, the storage medium may be further configured to store program code for performing the steps of: acquiring preference data of an object to be displayed; acquiring an initial image of the object to be displayed, wherein the initial image is an Augmented Reality (AR) image displayed on the terminal; determining imaging data of the object to be displayed according to the preference data and the initial image; and displaying the imaging data.
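The four steps above can be tied together as an end-to-end sketch. All three callables are stand-ins for the terminal's recommendation, AR camera, and display subsystems, and the dict used for the imaging data is an assumption for illustration.

```python
def display_pipeline(get_preference_data, capture_initial_image, render):
    # Step 1: acquire preference data of the object to be displayed.
    preference_data = get_preference_data()
    # Step 2: acquire the initial (AR) image shown on the terminal.
    initial_image = capture_initial_image()
    # Step 3: determine imaging data from the preference data and image,
    # modeled here as pairing the AR frame with preference-derived styling.
    imaging_data = {"base": initial_image, "styling": preference_data}
    # Step 4: display the imaging data.
    return render(imaging_data)
```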
The foregoing embodiment numbers of the present invention are merely for description and do not represent the relative merits of the embodiments.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for portions not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The above-described apparatus embodiments are merely illustrative. For example, the division of the units is merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, units, or modules, and may be in electrical or other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make several improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.

Claims (13)

1. An image display method, comprising:
acquiring characteristic data of an object to be displayed, wherein the characteristic data is data acquired according to preferences of the object to be displayed, and the characteristic data comprises user behaviors and habits observed through an image acquisition device;
acquiring an initial image of the object to be displayed through an image acquisition device;
acquiring imaging data of the object to be displayed according to the characteristic data and the initial image;
and displaying the imaging data.
2. The method of claim 1, wherein prior to obtaining the characteristic data of the object to be displayed, the method further comprises:
acquiring identity information of the object to be displayed;
and determining the characteristic data according to the identity information.
3. The method of claim 2, wherein the identity information comprises: an account corresponding to the object to be displayed; and acquiring the characteristic data of the object to be displayed comprises:
acquiring historical operation records under the account from a server;
and acquiring, from the historical operation records, the target object operated on the greatest number of times, and determining the characteristic data of the object to be displayed based on the target object.
4. The method according to claim 3, wherein determining the characteristic data of the object to be displayed based on the target object comprises:
acquiring at least one of the following information of the target object: image information of the target object and attribute information of the target object;
and taking the at least one piece of information of the target object as the characteristic data of the object to be displayed.
5. The method of claim 2, wherein the identity information comprises: user information locally stored by a terminal used by the object to be displayed; and acquiring the characteristic data of the object to be displayed comprises:
acquiring user information locally stored by a terminal used by the object to be displayed;
and acquiring the characteristic data according to the user information.
6. The method of claim 5, wherein the user information comprises: a historical operation record of the object to be displayed on the terminal, and area information of the area where the terminal is located.
7. The method of claim 1, wherein before acquiring the imaging data of the object to be displayed according to the characteristic data and the initial image, the method further comprises:
displaying prompt information and the characteristic data, wherein the prompt information is used for prompting a user whether to acquire the imaging data according to the characteristic data and the initial image;
receiving a selection instruction corresponding to the prompt information; and
determining, according to the selection instruction, whether to allow or prohibit acquiring the imaging data of the object to be displayed according to the characteristic data and the initial image.
8. The method of claim 1, wherein acquiring the imaging data of the object to be displayed according to the characteristic data and the initial image comprises:
acquiring at least one piece of clothing corresponding to the characteristic data;
and combining the at least one garment with the corresponding body part in the initial image to obtain the imaging data.
9. The method according to any one of claims 1 to 8, wherein acquiring the initial image of the object to be displayed comprises:
acquiring a pre-stored image of the object to be displayed to obtain the initial image; or
acquiring a current image of the object to be displayed through a terminal to obtain the initial image.
10. An image display method, comprising:
acquiring preference data of an object to be displayed;
acquiring an initial image of the object to be displayed through an image acquisition device, wherein the initial image is an Augmented Reality (AR) image displayed on a terminal;
determining imaging data of the object to be displayed according to the preference data and the initial image;
and displaying the imaging data.
11. A terminal, comprising:
a processor, configured to acquire characteristic data of an object to be displayed, wherein the characteristic data is data acquired according to preferences of the object to be displayed, and the characteristic data comprises user behaviors and habits observed through an image acquisition device;
the image acquisition device, configured to acquire an initial image of the object to be displayed; and
a display device, configured to acquire imaging data of the object to be displayed according to the characteristic data and the initial image, and to display the imaging data.
12. A storage medium comprising a stored program, wherein the program, when run, controls a device on which the storage medium resides to perform the following functions: acquiring characteristic data of an object to be displayed, wherein the characteristic data is data acquired according to preferences of the object to be displayed, and the characteristic data comprises user behaviors and habits observed through an image acquisition device; acquiring an initial image of the object to be displayed through the image acquisition device; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; and displaying the imaging data.
13. A processor for running a program, wherein the program, when run, performs the following functions: acquiring characteristic data of an object to be displayed, wherein the characteristic data is data acquired according to preferences of the object to be displayed, and the characteristic data comprises user behaviors and habits observed through an image acquisition device; acquiring an initial image of the object to be displayed through the image acquisition device; acquiring imaging data of the object to be displayed according to the characteristic data and the initial image; and displaying the imaging data.
CN201810618191.6A 2018-06-15 2018-06-15 Image display method, terminal, storage medium and processor Active CN110610545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810618191.6A CN110610545B (en) 2018-06-15 2018-06-15 Image display method, terminal, storage medium and processor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810618191.6A CN110610545B (en) 2018-06-15 2018-06-15 Image display method, terminal, storage medium and processor

Publications (2)

Publication Number Publication Date
CN110610545A CN110610545A (en) 2019-12-24
CN110610545B true CN110610545B (en) 2023-07-14

Family

ID=68887887

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810618191.6A Active CN110610545B (en) 2018-06-15 2018-06-15 Image display method, terminal, storage medium and processor

Country Status (1)

Country Link
CN (1) CN110610545B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105354869A (en) * 2015-10-23 2016-02-24 广东小天才科技有限公司 Method and system for embodying real head features of user on virtual head portrait
CN106462825A (en) * 2014-03-25 2017-02-22 电子湾有限公司 Data mesh platform
CN107213642A (en) * 2017-05-12 2017-09-29 北京小米移动软件有限公司 Virtual portrait outward appearance change method and device
CN107358493A (en) * 2017-05-10 2017-11-17 应凯 Intelligent dressing system with adaptive image design function

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10068547B2 (en) * 2012-06-29 2018-09-04 Disney Enterprises, Inc. Augmented reality surface painting


Also Published As

Publication number Publication date
CN110610545A (en) 2019-12-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40019512

Country of ref document: HK

TA01 Transfer of patent application right

Effective date of registration: 20230625

Address after: Room 507, floor 5, building 3, No. 969, Wenyi West Road, Wuchang Street, Yuhang District, Hangzhou City, Zhejiang Province

Applicant after: ZHEJIANG TMALL TECHNOLOGY Co.,Ltd.

Address before: Box 847, four, Grand Cayman capital, Cayman Islands, UK

Applicant before: ALIBABA GROUP HOLDING Ltd.

GR01 Patent grant