CN105807952B - information processing method and electronic equipment - Google Patents


Info

Publication number
CN105807952B
CN105807952B (application CN201610127890.1A)
Authority
CN
China
Prior art keywords
image
parameter
electronic device
determining
tilt
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610127890.1A
Other languages
Chinese (zh)
Other versions
CN105807952A (en
Inventor
陈文辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201610127890.1A priority Critical patent/CN105807952B/en
Publication of CN105807952A publication Critical patent/CN105807952A/en
Application granted granted Critical
Publication of CN105807952B publication Critical patent/CN105807952B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an information processing method and an electronic device, comprising the steps of: obtaining a first image; determining a first object related to the first image; obtaining a first parameter for representing the current motion state of the electronic device; and controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.

Description

information processing method and electronic equipment
Technical Field
The present invention relates to the field of electronic technologies, and in particular, to information processing methods and electronic devices.
Background
With the rapid development of communication technologies, electronic devices such as smart phones and tablet computers have moved into thousands of households. As the functions of electronic devices develop and improve, display technologies such as augmented reality and virtual reality have been proposed and applied to electronic devices. These technologies synthesize real-world information and virtual-world information, so that users can perceive the real environment and virtual objects through a display at the same time, improving user experience.
In the process of implementing the technical solution in the embodiments of the present application, the inventor found that in the prior art, when a virtual image is displayed on the screen of an electronic device, the virtual image is static, which may appear monotonous and not vivid. Even when the virtual image can move, it generally moves in place at a fixed position or along a fixed displacement route, which is relatively mechanical and inflexible.
Disclosure of Invention
The invention provides an information processing method and an electronic device, which are used to solve the prior-art technical problem of inconvenient operation when the display state of the display content of an electronic device is changed, so as to achieve the technical effect of simple, convenient and quick operation when that display state is changed.
In one aspect, an embodiment of the present application provides an information processing method, which comprises the following steps:
acquiring a first image;
determining a first object related to the first image;
acquiring a first parameter used for representing the current motion state of the electronic device;
controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.
Optionally, the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, and the controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter includes:
determining a translation direction and/or a translation speed of the electronic device according to the translational motion parameter;
controlling the first object to present in the first image a display effect corresponding to the translation direction and/or the translation speed.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter includes:
determining a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter;
controlling the first object to present in the first image a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed.
Optionally, after the determining the first object related to the first image, the method further comprises:
acquiring depth information of the first image;
and the controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter comprises:
controlling, based on the first parameter and the depth information, the first object to present in the first image a display effect corresponding to the first parameter and the depth information.
Optionally, the first parameter is a translational motion parameter used for characterizing a translational motion of the electronic device, and the controlling, based on the first parameter and the depth information, the first object to present in the first image a display effect corresponding to the first parameter and the depth information includes:
determining a translation direction and/or a translation speed of the electronic device according to the translational motion parameter;
controlling the first object to present in the first image a display effect corresponding to the translation direction and/or the translation speed, and the depth information.
Optionally, the first parameter is a tilt motion parameter used for characterizing a tilt motion of the electronic device, and the controlling, based on the first parameter and the depth information, the first object to present in the first image a display effect corresponding to the first parameter and the depth information includes:
determining a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter;
controlling the first object to present in the first image a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information.
Optionally, the determining the first object related to the first image comprises:
identifying the first image, obtaining at least one object included in the first image, and determining the first object from the at least one object; or
identifying the first image, obtaining at least one object included in the first image, and obtaining a first object related to the at least one object based on the at least one object.
Optionally, the acquiring the first image includes:
determining the first image from a video or images obtained by an image acquisition unit of the electronic device in an acquisition state; or
determining the first image from a video or images being played through a display unit of the electronic device.
In another aspect, an embodiment of the present application further provides an electronic device, including:
a housing;
an image acquisition unit arranged on the housing;
a processor arranged in the housing and connected with the image acquisition unit, the processor being configured to acquire a first image, determine a first object related to the first image, acquire a first parameter for representing the current motion state of the electronic device, and control, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.
Optionally, the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, and the processor is configured to:
determine a translation direction and/or a translation speed of the electronic device according to the translational motion parameter; and
control the first object to present in the first image a display effect corresponding to the translation direction and/or the translation speed.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor is configured to:
determine a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter; and
control the first object to present in the first image a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed.
Optionally, after determining the first object related to the first image, the processor is further configured to:
acquire depth information of the first image;
and the processor is configured to:
control, based on the first parameter and the depth information, the first object to present in the first image a display effect corresponding to the first parameter and the depth information.
Optionally, the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, and the processor is configured to:
determine a translation direction and/or a translation speed of the electronic device according to the translational motion parameter; and
control the first object to present in the first image a display effect corresponding to the translation direction and/or the translation speed, and the depth information.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor is configured to:
determine a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter; and
control the first object to present in the first image a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information.
Optionally, the processor is configured to:
identify the first image, obtain at least one object included in the first image, and determine the first object from the at least one object; or
identify the first image, obtain at least one object included in the first image, and obtain a first object related to the at least one object based on the at least one object.
Optionally, the processor is configured to:
determine the first image from a video or images obtained by the image acquisition unit in an acquisition state; or
determine the first image from a video or images being played through a display unit of the electronic device.
In another aspect, an embodiment of the present application further provides an electronic device, including:
a first acquiring unit, for acquiring a first image;
a first determining unit, for determining a first object related to the first image;
a second acquiring unit, for acquiring a first parameter for representing the current motion state of the electronic device; and
a first control unit, for controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.
Through one or more of the above embodiments of the present invention, at least the following technical effects can be achieved:
First, according to the technical solution of the embodiments of the present application, a first image is obtained; a first object related to the first image is determined; a first parameter representing the current motion state of the electronic device is obtained; and the first object is controlled, based on the first parameter, to present in the first image a display effect corresponding to the first parameter. Unlike the prior art, in which a virtual image is static, moves in place at a fixed position, or moves along a fixed displacement route, and is therefore relatively mechanical and inflexible, the technical solution of the present application controls the display effect of the first object according to the first parameter of the current motion state of the electronic device, and the operation process is relatively simple and convenient. The prior-art technical problem of inconvenient operation when the display state of the display content of the electronic device is changed can thus be effectively solved, achieving the technical effect of changing that display state easily and quickly.
Second, the technical solutions in the embodiments of the present application provide various ways of controlling the display state of the first object: according to the panning direction and/or panning speed of the electronic device, according to the tilt direction and/or tilt angle and/or tilt speed of the electronic device, or according to the depth information of the first image. The way the user controls the display state of the first object is therefore no longer single; the user can change the display state of the first object by selecting any of these ways according to need or habit, achieving the technical effect of improving user experience.
Third, according to the technical solution of the embodiments of the present application, the first image is recognized, at least one object included in the first image is obtained, and the first object is determined from the at least one object; or the first image is recognized, at least one object included in the first image is obtained, and a first object related to the at least one object is obtained based on the at least one object.
Drawings
FIG. 1 is a flowchart of an implementation of the information processing method according to the first embodiment of the present application;
FIG. 2 is a diagram illustrating a first display state of the first object in the information processing method according to the first embodiment of the present application;
FIG. 3 is a diagram illustrating a second display state of the first object in the information processing method according to the first embodiment of the present application;
FIG. 4 is a flowchart of a first implementation manner of step S104 in the information processing method provided in the first embodiment of the present application;
FIG. 5 is a flowchart of a second implementation manner of step S104 in the information processing method provided in the first embodiment of the present application;
FIG. 6 is a diagram illustrating a third display state of the first object in the information processing method according to the first embodiment of the present application;
FIG. 7 shows an electronic device according to a second embodiment of the present application;
FIG. 8 shows an electronic device according to a third embodiment of the present application.
Detailed Description
The technical solution provided by the embodiments of the present application is used to solve the prior-art technical problem of inconvenient operation when the display state of the display content of an electronic device is changed, and to achieve the technical effect of simple, convenient and quick operation when that display state is changed.
To solve the above technical problem, the general idea of the embodiments of the present application is as follows:
acquiring a first image;
determining a first object related to the first image;
acquiring a first parameter used for representing the current motion state of the electronic device;
controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.
In this technical solution, a first image is acquired; a first object related to the first image is determined; a first parameter used for representing the current motion state of the electronic device is acquired; and the first object is controlled, based on the first parameter, to present in the first image a display effect corresponding to the first parameter. That is, unlike the prior art, the display effect of the first object is controlled according to the first parameter of the current motion state of the electronic device, and the operation process is simple and convenient. The prior-art technical problem of inconvenient operation when the display state of the display content of the electronic device is changed can thus be effectively solved, achieving the technical effect of convenient and quick operation when that display state is changed.
To more clearly illustrate the embodiments of the present application and the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present application; for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
First Embodiment
Referring to FIG. 1, the information processing method provided in the first embodiment of the present application includes:
S101, acquiring a first image;
S102, determining a first object related to the first image;
S103, acquiring a first parameter for representing the current motion state of the electronic device;
S104, controlling, based on the first parameter, the first object to present in the first image a display effect corresponding to the first parameter.
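As a rough illustration (not part of the patent text), steps S101 to S104 can be sketched as a single control loop. The callables `detect_object`, `read_motion_state`, and `render` are hypothetical stand-ins for the image-recognition, sensor, and display components described later in this embodiment; the `image` argument plays the role of the first image from step S101.

```python
from dataclasses import dataclass
from typing import Callable, Tuple

@dataclass
class MotionState:
    """First parameter: the device's current motion state (assumed shape)."""
    kind: str                        # "translation", "tilt", or "static"
    direction: Tuple[float, float]   # unit vector of the motion
    speed: float                     # m/s for translation, deg/s for tilt

def process_frame(image,
                  detect_object: Callable,
                  read_motion_state: Callable,
                  render: Callable):
    # S101: `image` is the acquired first image (camera capture or playback)
    obj = detect_object(image)        # S102: determine the first object
    state = read_motion_state()       # S103: acquire the first parameter
    return render(image, obj, state)  # S104: apply the matching display effect
```

In use, `detect_object` would wrap the recognition algorithm of step S102 and `read_motion_state` the sensor fusion of step S103; here any callables with those signatures will do.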
In the embodiment of the present application, step S101 of acquiring the first image is first performed.
In a specific implementation process, step S101 may be implemented as follows:
determining the first image from a video or images obtained by an image acquisition unit of the electronic device in an acquisition state; or
determining the first image from a video or images being played through a display unit of the electronic device.
The information processing method provided by the embodiment of the present application can be applied to electronic devices such as smart phones, tablet computers, notebook computers, or other electronic devices capable of playing videos or images, which are not enumerated here one by one.
In the embodiment of the present application, the first image may be a frame of a video, or a static image such as a photo or a picture. In a specific implementation, the first image may be a two-dimensional image, a three-dimensional image, or an image of another dimensionality, which is not specifically limited in the embodiment of the present application.
In the embodiment of the application, there are two specific implementation manners for acquiring the first image. The first implementation manner is to acquire one or more segments of video or a plurality of images in real time through the image acquisition unit of the electronic device at the current time, and then determine the first image from the acquired video segments or images.
The second implementation manner applies to a video or images that were acquired and stored, in the electronic device or in an electronic device on the network side, before the current time and are now being played; the first image is determined from this played content.
In this embodiment, the first image may be determined based on the user clicking on one of a plurality of videos or images, or determined directly based on a preset condition. For example, the first image may be determined according to the current environment of the electronic device: when the current environment is outdoor, a video of flowers and grass may be determined; when the current environment is indoor, an image of a landscape may be determined. The determination may also be made by other means, which is not specifically limited in this embodiment of the present application.
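A minimal sketch of this selection logic, under the assumption that an explicit user choice takes precedence over the environment-based preset; the content names are purely illustrative:

```python
def choose_first_image(environment: str, user_choice=None):
    """Determine the first image: an explicit user click wins;
    otherwise a preset condition keyed on the current environment
    of the electronic device applies."""
    if user_choice is not None:
        return user_choice
    presets = {
        "outdoor": "flowers_and_grass_video",  # illustrative names
        "indoor": "landscape_image",
    }
    return presets.get(environment)  # None when no preset applies
```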
After step S101 is performed, step S102 of determining the first object related to the first image is performed.
In the embodiment of the present application, the specific implementation process of step S102 includes:
identifying the first image, obtaining at least one object included in the first image, and determining the first object from the at least one object; or
identifying the first image, obtaining at least one object included in the first image, and obtaining a first object related to the at least one object based on the at least one object.
In this embodiment, after the first image is obtained, the first object related to the first image is determined. In the implementation process, at least one object is first determined from the first image. In this embodiment, the first image is taken to be an image of bees collecting honey on flowers; after the first image is obtained, at least one object, such as a flower, a bee, or grass, is identified in the first image according to an image recognition algorithm such as a genetic algorithm, a neural network algorithm, or another image recognition algorithm.
After the at least one object included in the first image is identified, the first object related to the at least one object needs to be determined. In this embodiment, the first object may be determined from the at least one object, or may be an object related to the at least one object; these two cases are described below.
In the first case, the first object is determined from the at least one identified object. There are two specific implementation manners in the specific implementation process:
(1) The first object is determined by an operation performed by the user on one of the at least one object in the first image, the operation indicating that the user desires to use that object as the first object. In a specific implementation process, the operation may be a pressing operation, a clicking operation, a voice operation, or the like, which is not particularly limited in this embodiment of the application. Specifically, when the user clicks a bee in the first image, the bee is used as the first object; when the user clicks a flower in the first image, the flower is used as the first object; and so on.
(2) The first object is automatically determined from the at least one object through a preset condition. In the concrete implementation process, if the preset condition is "plants", the determined first objects are the flowers and the grass; if the preset condition is "animals", the determined first object is the bee. If the first object cannot be determined according to the preset condition, prompt information is output to prompt the user to manually determine the first object from the at least one object.
In the second case, a first object related to the at least one object is determined. In this embodiment, the first object related to the at least one object can be determined through association rules, such as a river corresponding to fish, a flower corresponding to a butterfly, a tree corresponding to birds, grass corresponding to a flock of sheep, a boy corresponding to a basketball, a girl corresponding to a hair clip, and the like, or other association rules, which are not specifically limited in this embodiment. In the specific implementation process, when the at least one object is determined to be the bee, the flower, and the grass, the first object is determined to be a butterfly or a flock of sheep according to the corresponding association rules.
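The association rules above amount to a simple lookup table. A sketch follows; the fallback behavior (return `None` so the caller can prompt the user, mirroring the manual-selection path of the first case) is an assumption, not something the patent specifies:

```python
# Association rules from the embodiment: recognized object -> related first object.
ASSOCIATION_RULES = {
    "fish": "river",
    "flower": "butterfly",
    "tree": "bird",
    "grass": "flock of sheep",
    "boy": "basketball",
    "girl": "hair clip",
}

def related_first_object(recognized_objects):
    """Return a first object related to the recognized objects,
    or None when no association rule applies."""
    for obj in recognized_objects:
        if obj in ASSOCIATION_RULES:
            return ASSOCIATION_RULES[obj]
    return None
```

With the embodiment's example objects (bee, flower, grass), the first matching rule yields the butterfly.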
In the embodiment of the application, after the first object is determined, for example a bee, a butterfly, or a flock of sheep, the first object is generated. When the first object is an object originally present in the first image, the first object may be obtained from a gallery stored in the electronic device, or the first object in the first image may be directly copied or generated. When the first object is an object not present in the first image, the first object needs to be obtained from the gallery of the electronic device or generated directly. Any of the above implementation manners can be used, and a person skilled in the art can select one according to actual needs; this is not particularly limited in the embodiment of the application.
After step S102 is executed, step S103 is executed to obtain the first parameter for characterizing the current motion state of the electronic device.
In this embodiment of the present application, the first parameter for characterizing the current motion state of the electronic device may be obtained through a sensor installed in the electronic device. In a specific implementation process, the sensors installed in the electronic device may be:
(1) a direction sensor, used for measuring the angles of the electronic device on the three axes x, y and z during movement;
(2) an acceleration sensor, used for measuring the acceleration values of the electronic device on the three axes x, y and z during movement;
(3) a distance sensor, used for measuring the distance of the electronic device relative to a reference plane during movement;
(4) a gravity sensor, a gyroscope, and the like.
The current motion state of the electronic device can be determined based on any one or a combination of the above sensors. Specifically, whether the electronic device is currently performing a translational motion, a tilting motion, or both can be determined based on the direction sensor, the acceleration sensor, and the distance sensor.
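The decision just described (translation, tilt, both, or neither) can be sketched from two sensor readings. The thresholds below are illustrative assumptions, not values from the patent:

```python
TRANSLATION_EPS = 0.01   # metres; smaller displacements treated as noise
TILT_EPS = 1.0           # degrees; smaller orientation changes treated as noise

def classify_motion(displacement_m: float, orientation_change_deg: float) -> str:
    """Combine a distance-sensor reading (displacement relative to the
    reference plane) and a direction-sensor reading (orientation change)
    into a coarse motion state of the electronic device."""
    translating = abs(displacement_m) > TRANSLATION_EPS
    tilting = abs(orientation_change_deg) > TILT_EPS
    if translating and tilting:
        return "translation+tilt"
    if translating:
        return "translation"
    if tilting:
        return "tilt"
    return "static"
```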
After the first parameter is obtained by any of the above methods, step S104 is further performed, in which the first object is controlled, based on the first parameter, to present in the first image a display effect corresponding to the first parameter.
In step S104, when the first parameter is a different parameter, the first object has a different display effect in the first image. Two implementation manners of step S104 are described below.
In the first implementation manner, when the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, step S104 specifically includes the following steps:
determining a translation direction and/or a translation speed of the electronic device according to the translational motion parameter;
controlling the first object to present in the first image a display effect corresponding to the translation direction and/or the translation speed.
In the specific implementation process, assume the first image shows bees collecting honey on flowers and the determined first object is, for example, a bee. When the obtained first parameter is a translational motion parameter for characterizing the translational motion of the electronic device, the translation direction and/or the translation speed of the electronic device are first determined according to the translational motion parameter.
In a specific implementation process, when the sensing devices in the electronic device determine that the current motion state of the electronic device is a translational motion, the translation direction and/or the translation speed of the electronic device are determined based on the translational motion parameter. In this embodiment of the application, the plane on which the user of the electronic device currently stands is taken as a reference plane, and the direction facing away from the user is taken as a first direction. For example, after the motion of the electronic device ends, the distance sensor in the electronic device acquires that the electronic device has moved 10 centimeters relative to the reference plane, the direction sensor acquires that the motion direction of the electronic device is from the reference plane toward the front, and the acceleration sensor acquires the acceleration of the motion of the electronic device.
Thus, it can be determined based on the first parameter that the translation direction of the electronic device is the first direction and the translation speed of the electronic device is 0.1 m/s. After the translation direction and/or the translation speed of the electronic device are acquired, the first object, the bee in the first image, is controlled to present a display effect matching the translation direction and/or the translation speed.
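As a sketch of how the bee's on-screen position could mirror the device's translation, the object is advanced along the device's translation direction at the device's translation speed each frame. The 2-D coordinate convention and per-frame update are assumptions for illustration:

```python
def translate_object(position, direction, speed_mps, dt_s):
    """Advance the first object along the device's translation direction
    at the device's translation speed.

    position:   (x, y) of the object in scene coordinates, metres
    direction:  unit vector (dx, dy) of the device's translation
    speed_mps:  translation speed of the device, metres per second
    dt_s:       elapsed time since the previous frame, seconds
    """
    dx, dy = direction
    step = speed_mps * dt_s
    return (position[0] + dx * step, position[1] + dy * step)
```

Called once per frame, this reproduces the embodiment's example: at 0.1 m/s the bee covers 10 centimetres in one second, matching the device's own displacement.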
In a specific implementation process, when the electronic device translates to the th direction, the th object bee in the th image is controlled to translate from the initial position to the th direction, which may perform a uniform translation, or a translation at an acceleration of , or a translation at a uniform speed and then at an acceleration of , or another translation, which is not specifically limited in the embodiment of the present application.
When the th electronic device is translated towards the th direction at a speed of 0.1 meter per second, the th object is controlled to make translation movement towards the th direction from the original position at the same speed.
When the electronic device moves toward the th direction at a speed of 0.1 m/s and then moves toward the th direction at a speed of 0.1 m/s, the th object is controlled to move from the original position toward the th direction at a speed of 0.1 m/s and then move in a translational motion toward the th direction at a speed of 0.1 m/s, as shown in fig. 2.
In the embodiment of the application, when the electronic device moves to the th direction at a speed of 0.1 m/s, the object is controlled to move to the th direction at the same speed, in the specific implementation process, the object bee in the image can be controlled to move to the th direction at a speed of 0.1 m/s, and thus after the bee performs translational motion, the initial position where the honey is located is a blank area, and then the blank area can be filled through the image stored in the electronic device, or through the image acquired from the network side, or other objects similar to the object, such as butterflies, parrots, magpie and the like, to fill the blank area so as to provide better visual effect for the user;
in a specific implementation process, second bees which are the same as the object bees can be generated, the second bees are controlled to move from the position of the bee in the image to the direction at a speed of 0.1 m/s, and the bee in the image remains stationary, so that a white area in the image can be avoided, and a good visual effect is provided for a user.
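The two strategies just described — moving the first object itself at the device's translation speed, or generating a second, identical object that moves while the original stays put — can be sketched as follows. This is a minimal illustration; the class, method names, and 2-D coordinate convention are assumptions, not part of the patent.

```python
# Illustrative sketch of the two translation strategies described above.
# All names are assumptions; "direction" is a 2-D unit vector and
# "speed" is in metres per second, matching the 0.1 m/s example.
class SceneObject:
    def __init__(self, name, position):
        self.name = name
        self.position = list(position)  # [x, y] in image coordinates

    def translate(self, direction, speed, seconds):
        # Move at the same speed as the device (first strategy).
        self.position[0] += direction[0] * speed * seconds
        self.position[1] += direction[1] * speed * seconds


def apply_device_translation(obj, direction, speed, seconds, clone=False):
    """First strategy: move obj itself. Second strategy (clone=True):
    keep obj still and move a newly generated copy, so no blank area
    appears at obj's initial position."""
    if clone:
        second = SceneObject(obj.name, obj.position)  # the "second bee"
        second.translate(direction, speed, seconds)
        return second
    obj.translate(direction, speed, seconds)
    return obj
```

With `clone=True` the original bee keeps its position, matching the second strategy in which the first bee in the first image remains stationary.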
In a second implementation manner, when the first parameter is a tilt motion parameter used for characterizing a tilt motion of the electronic device, step S104 specifically includes the following steps:
determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
controlling the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the first image.
In a specific implementation process, taking as an example that the first image shows a bee collecting honey on a flower and the determined first object is the bee, when the acquired first parameter is a tilt motion parameter for characterizing the tilt motion of the electronic device, the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device are first determined according to the tilt motion parameter.
In a specific implementation process, when determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device based on the tilt motion parameter, take the standing surface where the user of the electronic device is currently located as the reference plane.
When the electronic device is tilted toward the front of the reference plane, the first object (the bee) in the first image is controlled to tilt toward the front of the reference plane. The tilt angle may be 30 degrees, 35 degrees, or 45 degrees, and the tilt speed may be 2 degrees/second, 3 degrees/second, or 4 degrees/second, or another tilt angle or tilt speed; the embodiments of the present invention are not specifically limited in this respect.
When the electronic device is tilted toward the front of the reference plane at a tilt speed of 2 degrees/second, the first object (the bee) in the first image is controlled to tilt toward the front of the reference plane at 2 degrees/second; the tilt angle may be 30 degrees, 35 degrees, 45 degrees, or another angle, which is not limited here.
When the electronic device is tilted 30 degrees toward the front of the reference plane at a tilt speed of 2 degrees/second, the first object (the bee) in the first image is controlled to tilt 30 degrees forward at 2 degrees/second, so that the bee presents a display effect corresponding to the tilt direction, the tilt angle, and the tilt speed, as shown in fig. 3.
In the embodiments of the present application, when the electronic device is tilted 30 degrees toward the front of the reference plane at a tilt speed of 2 degrees/second, the first object is controlled to tilt 30 degrees toward the front of the reference plane at the same tilt speed. In a specific implementation process, after the first object in the first image is tilted 30 degrees forward at 2 degrees/second, the bee's initial position becomes a blank area. The blank area can then be filled with an image stored in the electronic device, with an image acquired from the network side, or with other objects similar to the first object, such as butterflies, parrots, or magpies, so as to bring a better visual effect to the user;
in a specific implementation process, a bee identical to the first object (the bee) can alternatively be generated, and the newly generated bee is controlled to tilt 30 degrees forward from the position of the first bee in the first image at a speed of 2 degrees/second, while the first bee in the first image remains still, thereby bringing a better experience to the user.
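The tilt behaviour above — the object follows the device at the same angular speed (2 degrees/second in the example) until the target angle (30 degrees) is reached — can be sketched as a single function; the function name and signature are assumptions.

```python
# Hedged sketch: angle of the first object after a given time of
# tilting, following the device at the same tilt speed until the
# target angle is reached, then holding it.
def object_tilt_angle(elapsed_s: float,
                      tilt_speed_deg_s: float = 2.0,
                      target_deg: float = 30.0) -> float:
    """Return the object's current tilt angle in degrees."""
    return min(tilt_speed_deg_s * elapsed_s, target_deg)
```

At 2 degrees/second the bee reaches the 30-degree target after 15 seconds and then holds that angle.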
In the embodiments of the present application, in addition to adjusting the display state of the first object according to the translational motion parameter of the electronic device or according to the tilt motion parameter of the electronic device, the display state of the first object can be adjusted according to the translational motion parameter and the tilt motion parameter together. Specifically, when the electronic device is translated in the first direction and tilted 30 degrees toward the front of the reference plane, the first object in the first image is controlled to translate from its original position in the first direction and to tilt 30 degrees toward the front of the reference plane, so that a richer visual effect is provided for the user and the technical effect of improving the user experience is achieved.
In the embodiments of the present application, besides controlling the first object in the first image to present a display effect matching the translational motion parameter and/or the tilt motion parameter of the electronic device, the display mode of the first object in the first image can also be adjusted according to the motion state of the electronic device following a preset strategy. In a specific implementation process, for example, when the electronic device tilts toward the front or the back of the reference plane, the display size of the first object is controlled: when the electronic device tilts toward the front of the reference plane, the first object is controlled to shrink from large to small, and when the electronic device tilts toward the back of the reference plane, the first object is controlled to grow from small to large, providing the user with an experience in which near objects appear larger and far objects appear smaller, thereby improving the user experience.
In a specific implementation process, when the electronic device is shifted to the left of the reference plane, the first object is controlled to shift to the left from its original position; or, when the electronic device is shifted to the right of the reference plane, the first object is controlled to shift to the right from its original position; or the display state of the first object is adjusted according to another preset strategy, which is not specifically limited in the embodiments of the present application.
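The preset strategies in the two paragraphs above (tilt forward → shrink, tilt backward → grow, shift left/right → move left/right) amount to a lookup from device motion to a display action. The motion and action names below are illustrative assumptions, not identifiers from the patent.

```python
# Hypothetical preset-strategy table mapping the device's motion state
# to a display action for the first object, per the paragraphs above.
PRESET_STRATEGY = {
    "tilt_forward": "shrink",   # large-to-small, as if receding
    "tilt_backward": "grow",    # small-to-large, as if approaching
    "shift_left": "move_left",
    "shift_right": "move_right",
}


def display_action(motion_state: str) -> str:
    # Unlisted motions leave the first object's display state unchanged.
    return PRESET_STRATEGY.get(motion_state, "keep")
```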
In an embodiment of the application, after determining the first object related to the first image, the method further comprises:
acquiring depth information of the first image;
correspondingly, after the depth information of the first image is acquired, step S104 specifically includes the following step:
controlling, based on the first parameter and the depth information, the first object to present a display effect corresponding to the first parameter and the depth information in the first image.
In the embodiments of the present application, in order to make the display effect of the determined first object more realistic and bring a more realistic experience to the user, the depth information of the first image can be used to control the display size of the first object so that it matches the size of the objects related to the first object in the first image and follows the variation trend of the scene in the first image. Here, depth refers to the range of distances, measured along the imaging axis, between the camera lens (or other imager) and the photographed objects within which a sharp image can be obtained.
In a specific implementation, after the depth information of the first image is acquired, there are two further implementation manners for step S104.
The first implementation manner, please refer to fig. 4, specifically includes the following steps:
S401: determining the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
S402: controlling the first object to present a display effect corresponding to the translation direction and/or the translation speed, and the depth information, in the first image.
In a specific implementation process, taking as an example that the first image shows bees collecting honey on flowers and the determined first object is a bee, when the acquired first parameter is a translational motion parameter for characterizing the translational motion of the electronic device, the translation direction and/or the translation speed of the electronic device is first determined according to the translational motion parameter.
In a specific implementation process, when it is determined through a sensing device in the electronic device that the current motion state of the electronic device is a translational motion, the translation direction and/or the translation speed of the electronic device is determined based on the translational motion parameter. In this embodiment of the application, the standing surface where the user of the electronic device is currently located is taken as the reference plane, and the direction facing away from the user is taken as the first direction. For example, after the motion of the electronic device ends, a distance sensor in the electronic device acquires that the electronic device has moved 10 centimeters relative to the reference plane, a direction sensor acquires that the motion direction of the electronic device is from the reference plane toward the front, and an acceleration sensor acquires that the acceleration of the motion is 0 m/s².
In this way, it can be determined based on the first parameter that the translation direction of the electronic device is the first direction and that its translation speed is 0.1 m/s. After the translation direction and/or the translation speed of the electronic device is acquired, not only is the first object in the first image controlled to translate in the first direction at 0.1 m/s, but the size of the first object is also determined based on the depth information of the first image. In the specific implementation, the size of the first object is described by taking its proportion of the first image as an example: if the acquired depth of the first image is 0.1 m, the proportion of the first object is controlled to one-twentieth, and if the depth of the first image is 0.45 m, the proportion of the first object is controlled to one-fiftieth; the control trend is that the greater the depth, the smaller the first object is displayed.
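The depth-to-size mapping described above can be sketched as a small helper. The function name and the linear interpolation between the two example points (0.1 m → one-twentieth, 0.45 m → one-fiftieth) are assumptions; the text only gives the two endpoints and the trend.

```python
# Hypothetical sketch: map the depth of the first image to the display
# proportion of the first object, using the two example points from the
# text and linear interpolation between them (an assumed scheme).
def object_ratio_for_depth(depth_m: float) -> float:
    d0, r0 = 0.10, 1 / 20   # shallow depth -> larger object
    d1, r1 = 0.45, 1 / 50   # deeper scene  -> smaller object
    # Clamp outside the example range, interpolate inside it.
    if depth_m <= d0:
        return r0
    if depth_m >= d1:
        return r1
    t = (depth_m - d0) / (d1 - d0)
    return r0 + t * (r1 - r0)
```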
Referring to fig. 5, the second implementation manner specifically includes the following steps:
S501: determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
S502: controlling the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information, in the first image.
In a specific implementation process, taking as an example that the first image shows a bee collecting honey on a flower and the determined first object is the bee, when the acquired first parameter is a tilt motion parameter for characterizing the tilt motion of the electronic device, the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device are first determined according to the tilt motion parameter.
In a specific implementation process, when determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device based on the tilt motion parameter, take the standing surface where the user of the electronic device is currently located as the reference plane.
When the electronic device is tilted toward the front of the reference plane at a tilt speed of 2 degrees/second, not only is the first object (the bee) in the first image controlled to tilt forward at a tilt speed of 2 degrees/second, but the size of the first object (the bee) is also controlled to change with a small-to-large trend according to the depth information of the first image.
In the specific implementation, taking the size of the first object as its proportion of the first image: when the depth of the first image is 0.1 m, the first image has a smaller depth of field and the corresponding size-change interval of the first object is also smaller, for example, up to one-twentieth of the first image; when the depth of the first image is 0.5 m, the first image has a larger depth of field and the corresponding size-change interval is also larger, for example, from one-twentieth to one-tenth of the first image; please refer to fig. 6 for details.
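The depth-of-field rule above can likewise be sketched as a function returning the size-change interval of the first object as fractions of the first image. The wider interval (one-twentieth to one-tenth at 0.5 m) comes from the text; the lower bound of the narrow interval and the 0.1 m threshold are assumptions.

```python
# Hedged sketch: a deeper scene permits a wider size-change interval
# for the first object. Fractions are proportions of the first image.
def size_change_interval(depth_m: float) -> tuple:
    if depth_m <= 0.1:               # small depth of field
        return (1 / 50, 1 / 20)      # narrow interval (lower bound assumed)
    return (1 / 20, 1 / 10)          # wider interval, per the 0.5 m example
```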
Example Two
An embodiment of the present application further provides an electronic device; please refer to fig. 7, including:
a housing 70;
an image acquisition unit 71 disposed on the housing;
a processor 72, arranged in the housing and connected with the image acquisition unit 71, configured to acquire a first image, determine a first object related to the first image, acquire a first parameter for characterizing the current motion state of the electronic device, and control, based on the first parameter, the first object to present a display effect corresponding to the first parameter in the first image.
Optionally, the first parameter is a translational motion parameter for characterizing translational motion of the electronic device, and the processor 72 is configured to:
determine the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
control the first object to present a display effect corresponding to the translation direction and/or the translation speed in the first image.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor 72 is configured to:
determine the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the first image.
Optionally, after determining the first object related to the first image, the processor 72 is further configured to:
acquire depth information of the first image;
correspondingly, the processor 72 is configured to:
control, based on the first parameter and the depth information, the first object to present a display effect corresponding to the first parameter and the depth information in the first image.
Optionally, the first parameter is a translational motion parameter for characterizing translational motion of the electronic device, and the processor 72 is configured to:
determine the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
control the first object to present a display effect corresponding to the translation direction and/or the translation speed, and the depth information, in the first image.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor 72 is configured to:
determine the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information, in the first image.
Optionally, the processor 72 is configured to:
identify the first image, obtain at least one object included in the first image, and determine the first object from the at least one object; or
identify the first image, obtain at least one object included in the first image, and obtain, based on the at least one object, a first object related to the at least one object.
Optionally, the processor 72 is configured to:
determine the first image from a video or a plurality of images obtained by the image acquisition unit 71 in an acquisition state; or
determine the first image from a video or a plurality of images being played through a display unit of the electronic device.
Example Three
An embodiment of the present application further provides an electronic device; please refer to fig. 8, including:
a first acquiring unit 80, configured to acquire a first image;
a first determining unit 81, configured to determine a first object related to the first image;
a second acquiring unit 82, configured to acquire a first parameter for characterizing the current motion state of the electronic device;
a first control unit 83, configured to control, based on the first parameter, the first object to present a display effect corresponding to the first parameter in the first image.
Optionally, the first parameter is a translational motion parameter for characterizing translational motion of the electronic device, and the first control unit 83 includes:
a first determining module, configured to determine the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
a first control module, configured to control the first object to present a display effect corresponding to the translation direction and/or the translation speed in the first image.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the first control unit 83 includes:
a second determining module, configured to determine the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
a second control module, configured to control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the first image.
Optionally, after the first object related to the first image is determined, the electronic device further comprises:
a third acquiring unit 84, configured to acquire depth information of the first image;
the first control unit 83 includes:
a third control module, configured to control, based on the first parameter and the depth information, the first object to present a display effect corresponding to the first parameter and the depth information in the first image.
Optionally, the first parameter is a translational motion parameter for characterizing translational motion of the electronic device, and the third control module includes:
a first determining sub-module, configured to determine the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
a first control sub-module, configured to control the first object to present a display effect corresponding to the translation direction and/or the translation speed, and the depth information, in the first image.
Optionally, the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the third control module includes:
a second determining sub-module, configured to determine the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
a second control sub-module, configured to control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information, in the first image.
Optionally, the first determining unit 81 includes:
a first obtaining module, configured to identify the first image, obtain at least one object included in the first image, and determine the first object from the at least one object; or
a second obtaining module, configured to identify the first image, obtain at least one object included in the first image, and obtain, based on the at least one object, a first object related to the at least one object.
Optionally, the first acquiring unit 80 includes:
a third determining module, configured to determine the first image from a video or a plurality of images obtained by an image acquisition unit in the electronic device in an acquisition state; or
a fourth determining module, configured to determine the first image from a video or a plurality of images being played through a display unit of the electronic device.
The one or more technical solutions in the embodiments of the present application have at least the following one or more technical effects:
First, according to the technical solutions of the embodiments of the present application, a first image is acquired, a first object related to the first image is determined, a first parameter for characterizing the current motion state of the electronic device is acquired, and the first object is controlled, based on the first parameter, to present a display effect corresponding to the first parameter in the first image. That is, unlike the prior art, in which a virtual image is static, moves in place at a fixed position, or moves along a fixed displacement route and is therefore relatively mechanical and inflexible, the technical solution in the present application can control the display effect of the first object according to the parameter of the current motion state of the electronic device, and the operation process is relatively simple and convenient. Therefore, the technical problem in the prior art that it is inconvenient to change the display state of the display content of the electronic device can be effectively solved, and the technical effect of changing the display state of the display content easily and quickly can be achieved.
Second, the technical solutions in the embodiments of the present application provide various methods for controlling the display state of the first object: for example, controlling it according to the translation direction and/or the translation speed of the electronic device, according to the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device, or according to the depth information of the first image. The mode by which the user controls the display state of the first object is therefore no longer single; the user can change the display state of the first object by selecting any one of these modes according to his or her needs or habits, so as to achieve the technical effect of improving the user experience.
Third, according to the technical solutions of the embodiments of the present application, the first image is recognized, at least one object included in the first image is obtained, and the first object is determined from the at least one object; or the first image is recognized, at least one object included in the first image is obtained, and a first object related to the at least one object is obtained based on the at least one object.
Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein.
It is to be understood that each flow and/or block in the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions which can be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flow diagram flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Specifically, the computer program instructions corresponding to the information processing method in the embodiments of the present application may be stored on a storage medium such as an optical disc, a hard disc, or a USB flash drive, and when the computer program instructions corresponding to the information processing method in the storage medium are read or executed by an electronic device, the method includes the following steps:
acquiring a first image;
determining a first object related to the first image;
acquiring a first parameter used for characterizing the current motion state of the electronic device;
controlling, based on the first parameter, the first object to present a display effect corresponding to the first parameter in the first image.
Optionally, the computer instructions corresponding to the step of controlling the first object to present a display effect corresponding to the first parameter in the first image based on the first parameter, when executed, further include:
determining the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
controlling the first object to present a display effect corresponding to the translation direction and/or the translation speed in the first image.
Optionally, the computer instructions corresponding to the step of controlling the first object to present a display effect corresponding to the first parameter in the first image based on the first parameter, when executed, further include:
determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
controlling the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the first image.
Optionally, the storage medium further stores additional computer instructions, which are executed after the computer instructions corresponding to the step of determining the first object related to the first image are executed and, in a specific execution process, include the following steps:
acquiring depth information of the first image;
correspondingly, the controlling, based on the first parameter, the first object to present a display effect corresponding to the first parameter in the first image comprises:
controlling, based on the first parameter and the depth information, the first object to present a display effect corresponding to the first parameter and the depth information in the first image.
Optionally, the computer instructions stored in the storage medium corresponding to the step of controlling the first object to present a display effect corresponding to the first parameter and the depth information in the first image based on the first parameter and the depth information, in a specific execution process, include the following steps:
determining the translation direction and/or the translation speed of the electronic device according to the translational motion parameter;
controlling the first object to present a display effect corresponding to the translation direction and/or the translation speed, and the depth information, in the first image.
Optionally, the computer instructions stored in the storage medium corresponding to the step of controlling the first object to present a display effect corresponding to the first parameter and the depth information in the first image based on the first parameter and the depth information, in a specific execution process, include the following steps:
determining the tilt direction and/or the tilt angle and/or the tilt speed of the electronic device according to the tilt motion parameter;
controlling the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the depth information, in the first image.
Optionally, the computer instructions stored in the storage medium corresponding to the step of determining the first object related to the first image, when executed, include the steps of:
identifying the first image, obtaining at least one object included in the first image, and determining the first object from the at least one object; or
identifying the first image, obtaining at least one object included in the first image, and obtaining, based on the at least one object, a first object related to the at least one object.
Optionally, the computer instructions stored in the storage medium corresponding to the step of acquiring the first image, in a specific execution process, include the steps of:
determining the first image from a video or a plurality of images obtained by an image acquisition unit in the electronic device in an acquisition state; or
determining the first image from a video or a plurality of images being played through a display unit of the electronic device.
Having described preferred embodiments of the invention, those skilled in the art, once apprised of the basic inventive concepts, may make further alterations and modifications to these embodiments.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (15)

1, an information processing method, comprising:
acquiring an th image;
determining objects related to the th image;
th parameter used for representing the current motion state of the electronic equipment is obtained;
acquiring depth information of the th image;
determining the size of the th object according to the depth information, and
based on the th parameter and the size, controlling the th object to present a display effect corresponding to the th parameter and the size in the th image.
2. The method of claim 1, wherein the th parameter is a translational motion parameter for characterizing translational motion of the electronic device, the controlling the th object to present a display effect corresponding to the th parameter in the th image based on the th parameter comprises:
determining a translation direction and/or a translation speed of the electronic equipment according to the translation motion parameters;
controlling the th object to present a display effect corresponding to the panning direction and/or the panning speed in the th image.
3. The method of claim 1, wherein the th parameter is a tilt motion parameter used to characterize a tilt motion of the electronic device, the controlling the th object to present a display effect in the th image corresponding to the th parameter based on the th parameter comprises:
determining the inclination direction and/or the inclination angle and/or the inclination speed of the electronic equipment according to the inclination motion parameters;
controlling the th object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the th image.
4. The method as claimed in claim 1, wherein the parameter is a translational motion parameter for characterizing translational motion of the electronic device, the controlling the object to present a display effect corresponding to the parameter and the size in the image based on the parameter and the size comprises:
determining a translation direction and/or a translation speed of the electronic equipment according to the translation motion parameters;
controlling the th object to present a display effect corresponding to the panning direction and/or the panning speed, and the size in the th image.
5. The method of claim 1, wherein the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the controlling the first object to present a display effect corresponding to the first parameter and the size in the first image based on the first parameter and the size comprises:
determining a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter; and
controlling the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the size, in the first image.
6. The method of any one of claims 1-5, wherein the determining the first object associated with the first image comprises:
identifying the first image, obtaining at least one object included in the first image, and determining the first object from the at least one object; or
identifying the first image, obtaining at least one object included in the first image, and obtaining the first object related to the at least one object based on the at least one object.
7. The method of any one of claims 1-5, wherein the obtaining the first image comprises:
determining the first image from a video or images obtained by an image acquisition unit of the electronic device in an acquisition state; or
determining the first image from a video or images being played through a display unit of the electronic device.
8. An electronic device, comprising:
a housing;
an image acquisition unit disposed on the housing; and
a processor disposed in the housing and connected with the image acquisition unit, wherein the processor is configured to: obtain a first image; determine a first object associated with the first image; obtain a first parameter for characterizing a current motion state of the electronic device; obtain depth information of the first image; determine a size of the first object according to the depth information; and control the first object to present, in the first image, a display effect corresponding to the first parameter and the size based on the first parameter and the size.
9. The electronic device of claim 8, wherein the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, and the processor is configured to:
determine a translation direction and/or a translation speed of the electronic device according to the translational motion parameter; and
control the first object to present a display effect corresponding to the translation direction and/or the translation speed in the first image.
10. The electronic device of claim 8, wherein the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor is configured to:
determine a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter; and
control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed in the first image.
11. The electronic device of claim 8, wherein the first parameter is a translational motion parameter for characterizing a translational motion of the electronic device, and the processor is configured to:
determine a translation direction and/or a translation speed of the electronic device according to the translational motion parameter; and
control the first object to present a display effect corresponding to the translation direction and/or the translation speed, and the size, in the first image.
12. The electronic device of claim 8, wherein the first parameter is a tilt motion parameter for characterizing a tilt motion of the electronic device, and the processor is configured to:
determine a tilt direction and/or a tilt angle and/or a tilt speed of the electronic device according to the tilt motion parameter; and
control the first object to present a display effect corresponding to the tilt direction and/or the tilt angle and/or the tilt speed, and the size, in the first image.
13. The electronic device of any one of claims 8-12, wherein the processor is configured to:
identify the first image, obtain at least one object included in the first image, and determine the first object from the at least one object; or
identify the first image, obtain at least one object included in the first image, and obtain the first object related to the at least one object based on the at least one object.
14. The electronic device of any one of claims 8-12, wherein the processor is configured to:
determine the first image from a video or images obtained by the image acquisition unit in an acquisition state; or
determine the first image from a video or images being played through a display unit of the electronic device.
15. An electronic device, comprising:
a first acquiring unit for acquiring a first image;
a first determining unit for determining a first object associated with the first image;
a second acquiring unit for acquiring a first parameter for characterizing a current motion state of the electronic device;
a third acquiring unit for acquiring depth information of the first image;
a second determining unit for determining a size of the first object according to the depth information; and
a first control unit for controlling the first object to present a display effect corresponding to the first parameter and the size in the first image based on the first parameter and the size.
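To make the claimed method concrete: the claims describe determining a display size for an identified object from depth information, and combining that size with a translational or tilt motion parameter of the device to produce a display effect. The sketch below is an illustrative reading only, not the patented implementation; all names (`MotionParameter`, `display_effect`, the simple 1/z scaling) are hypothetical choices for this example.

```python
from dataclasses import dataclass

@dataclass
class MotionParameter:
    """A hypothetical 'first parameter' characterizing device motion.

    kind:      "translation" or "tilt"
    direction: unit vector (dx, dy) for translation; axis sign for tilt
    speed:     translation speed or tilt speed
    angle:     tilt angle in degrees (unused for translation)
    """
    kind: str
    direction: tuple
    speed: float
    angle: float = 0.0

def object_scale_from_depth(depth_m, reference_depth_m=1.0):
    """Determine the object's displayed size from depth information:
    nearer objects render larger, using simple pinhole-style 1/z scaling
    (one possible reading of the depth-to-size step in the claims)."""
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return reference_depth_m / depth_m

def display_effect(param, depth_m):
    """Combine the motion parameter and the depth-derived size into a
    display effect: an offset/rotation/scale triple for the overlay object."""
    scale = object_scale_from_depth(depth_m)
    if param.kind == "translation":
        # Display effect follows the translation direction and speed.
        dx, dy = param.direction
        return {"offset": (dx * param.speed, dy * param.speed),
                "rotation": 0.0, "scale": scale}
    if param.kind == "tilt":
        # Display effect follows the tilt direction and angle.
        sign = 1.0 if param.direction[0] >= 0 else -1.0
        return {"offset": (0.0, 0.0),
                "rotation": sign * param.angle, "scale": scale}
    raise ValueError("unknown motion kind")
```

For example, a device panning right at speed 2.0 over an object 2 m away yields `display_effect(MotionParameter("translation", (1, 0), 2.0), 2.0)`, i.e. an offset of `(2.0, 0.0)` and a half-size object, matching the claimed coupling of motion parameter and depth-derived size.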
CN201610127890.1A 2016-03-07 2016-03-07 information processing method and electronic equipment Active CN105807952B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610127890.1A CN105807952B (en) 2016-03-07 2016-03-07 information processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610127890.1A CN105807952B (en) 2016-03-07 2016-03-07 information processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN105807952A CN105807952A (en) 2016-07-27
CN105807952B true CN105807952B (en) 2020-01-31

Family

ID=56467681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610127890.1A Active CN105807952B (en) 2016-03-07 2016-03-07 information processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN105807952B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107783669B * 2016-08-23 2021-04-16 Chicony Electronics Co., Ltd. Cursor generation system, method and computer program product
CN106534590B * 2016-12-27 2019-08-20 Nubia Technology Co., Ltd. Photo processing method, device and terminal
CN109359204A * 2018-08-27 2019-02-19 China Agricultural University Flower recognition method and device based on augmented reality

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1379871A (en) * 1999-10-12 2002-11-13 Myorigo Ltd. Operation method of user interface of hand-held device
CN101213509A (en) * 2005-07-08 2008-07-02 Mitsubishi Electric Corp. Touch panel display device and portable apparatus
CN102541440A (en) * 2010-12-23 2012-07-04 LG Electronics Inc. Mobile terminal and controlling method thereof
CN102870065A (en) * 2011-05-04 2013-01-09 Research In Motion Ltd. Methods for adjusting presentation of graphical data displayed on graphical user interface
CN103049184A (en) * 2012-12-11 2013-04-17 ZTE Corp. Method and device for adjusting display region of displayed contents in screen
CN103383626A (en) * 2012-05-02 2013-11-06 Samsung Electronics Co., Ltd. Method and apparatus for moving an object
CN104866080A (en) * 2014-02-24 2015-08-26 Tencent Technology (Shenzhen) Co., Ltd. Screen content display method and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9582049B2 * 2008-04-17 2017-02-28 LG Electronics Inc. Method and device for controlling user interface based on user's gesture
CN102158721B * 2011-04-06 2012-12-05 Qingdao Hisense Electronics Co., Ltd. Method and device for adjusting three-dimensional image and television
KR20140010823A * 2012-07-17 2014-01-27 Samsung Electronics Co., Ltd. Image data scaling method and image display apparatus
CN102917232B * 2012-10-23 2014-12-24 Shenzhen Skyworth-RGB Electronic Co., Ltd. Face recognition based 3D (three dimension) display self-adaptive adjusting method and face recognition based 3D display self-adaptive adjusting device
TWI602144B * 2013-10-02 2017-10-11 National Cheng Kung University Method, device and system for packing color frame and original depth frame
CN103533333B * 2013-10-28 2016-02-10 Qingdao Hisense Electronics Co., Ltd. Depth of field control method, depth of field adjusting device and display unit


Also Published As

Publication number Publication date
CN105807952A (en) 2016-07-27

Similar Documents

Publication Publication Date Title
US11381758B2 (en) System and method for acquiring virtual and augmented reality scenes by a user
US10733801B2 (en) Markerless image analysis for augmented reality
CN107636534B (en) Method and system for image processing
US10587864B2 (en) Image processing device and method
US9214137B2 (en) Methods and systems for realistic rendering of digital objects in augmented reality
US11880956B2 (en) Image processing method and apparatus, and computer storage medium
US20110273369A1 (en) Adjustment of imaging property in view-dependent rendering
US20140181630A1 (en) Method and apparatus for adding annotations to an image
CN105807952B (en) Information processing method and electronic equipment
CN103188434A (en) Method and device of image collection
CN107944420A Illumination processing method and apparatus for a face image
EP2936442A1 (en) Method and apparatus for adding annotations to a plenoptic light field
CN108846899B Method and system for improving a user's perception of the area of each functional space in a housing listing
CN113965773A (en) Live broadcast display method and device, storage medium and electronic equipment
WO2017041740A1 (en) Methods and systems for light field augmented reality/virtual reality on mobile devices
CN110737326A (en) Virtual object display method and device, terminal equipment and storage medium
CN109978945B (en) Augmented reality information processing method and device
US20240100425A1 (en) Method for Displaying Skill Effect in Game
CN112511815B (en) Image or video generation method and device
CN105046740A 3D graphics processing method and device based on OpenGL ES
KR20180070082A VR contents generating system
CN113194329B (en) Live interaction method, device, terminal and storage medium
CN113485547A (en) Interaction method and device applied to holographic sand table
CN108781251A (en) Image processing apparatus, image processing method and image processing system
CN104754201A (en) Electronic device and information processing method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant