CN110543275B - Interaction method based on mobile terminal photographing interface and mobile terminal - Google Patents


Info

Publication number
CN110543275B
CN110543275B (application CN201910817998.7A)
Authority
CN
China
Prior art keywords
mobile terminal
option
function
photo
display area
Prior art date
Legal status
Active
Application number
CN201910817998.7A
Other languages
Chinese (zh)
Other versions
CN110543275A (en)
Inventor
何超
刘涧
Current Assignee
Hisense Mobile Communications Technology Co Ltd
Original Assignee
Hisense Mobile Communications Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Mobile Communications Technology Co Ltd
Priority to CN201910817998.7A
Publication of CN110543275A
Application granted
Publication of CN110543275B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces adapting the functionality of the device according to context-related or environment-related conditions
    • H04M2250/00 Details of telephonic subscriber devices
    • H04M2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Environmental & Geological Engineering (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides an interaction method based on a mobile terminal photographing interface, and a mobile terminal. In response to a photographing instruction received by the mobile terminal, the mobile terminal takes a photo and displays the photo, together with at least one function option associated with it, in the photographing interface; in response to one of the at least one function option being triggered, the mobile terminal executes the function corresponding to the triggered option.

Description

Interaction method based on mobile terminal photographing interface and mobile terminal
Technical Field
The present application relates to display technologies, and in particular, to an interaction method based on a mobile terminal photographing interface and a mobile terminal.
Background
After a user uses a mobile terminal to take a picture, the user often needs to perform subsequent processing on the taken picture or perform other processing related to the picture, for example, edit the picture, or use the picture in an e-commerce platform to perform shopping, or use the picture in a knowledge platform to perform information search, or use the picture in a social platform to share, etc.
At present, however, after a user takes a photo with a mobile terminal, any subsequent processing requires the user to first select the photo from the photo library and then select the corresponding function option.
This process involves complex user operations; it satisfies neither the user's need to use a photo immediately after shooting it, nor the need to perform subsequent processing quickly and conveniently after taking a photo with the mobile terminal.
Disclosure of Invention
The embodiment of the application provides an interaction method based on a mobile terminal photographing interface and a mobile terminal.
In a first aspect, an interaction method based on a mobile terminal photographing interface is provided, including: taking a photo in response to a photographing instruction received by the mobile terminal, and displaying the photo and at least one function option associated with the photo in a photographing interface; and, in response to one of the at least one function option being triggered, executing the function corresponding to the triggered function option.
Optionally, the taking a photo in response to the photographing instruction received by the mobile terminal includes: taking a photo in response to a first gesture applied to the photographing preview interface of the mobile terminal.
Optionally, the photographing interface includes a photograph display area for displaying the photograph, and a function option area for displaying the at least one function option; the at least one function option is distributed around the photo display area, and one of the at least one function option is triggered, including: acquiring detection data of an angular velocity sensor of the mobile terminal to obtain a motion state of the mobile terminal; and determining that the function option matched with the motion state of the mobile terminal is triggered in the at least one function option according to the motion state of the mobile terminal.
Optionally, the method further comprises: determining the deflection distance of the photo display area according to the motion state of the mobile terminal; and updating and displaying the photographing interface according to the deflection distance of the photo display area, so that the photo display area deflects in the photographing interface according to the deflection distance.
Optionally, one of the at least one function option is triggered, including: acquiring a second gesture acting on the photo display area, and determining that a first function option is triggered according to the second gesture; wherein the first function option is one of the at least one function option.
Optionally, a first function option of the at least one function option includes at least a first level option and a second level option; the taking a picture in response to a photographing instruction and displaying the picture and at least one function option associated with the picture comprises: responding to a photographing instruction to obtain a photo, displaying the photo, and displaying a first-level option of the first function option; the method further comprises the following steps: and in response to the first-level option of the first functional option being triggered, displaying a second-level option of the first functional option.
Optionally, the photographing interface includes a photograph display area for displaying the photograph, and a function option area for displaying the at least one function option, wherein the at least one function option in the function option area is arranged around the photograph display area.
In a second aspect, a mobile terminal is provided, including: a controller, a memory and a display screen; the controller is configured to read computer program instructions from the memory and execute the method according to any of the above first aspects.
In a third aspect, there is provided a computer readable storage medium having stored thereon computer instructions which, when run on a computer, cause the computer to perform the method according to any of the first aspect above.
According to the embodiments of the application, a photo is taken in response to a photographing instruction received by the mobile terminal, the photo and at least one function option associated with it are displayed, and, in response to one of the at least one function option being triggered, the function corresponding to the triggered option is executed, so that the user can process a photo immediately after taking it.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 schematically shows a flow chart of an interaction method based on a mobile terminal photographing interface provided in an embodiment of the present application;
FIG. 2a is a schematic diagram illustrating a user interface of a conventional camera application;
fig. 2b schematically shows a photograph operation in the embodiment of the present application;
FIG. 3 is a schematic diagram illustrating a photographing interface in an embodiment of the application;
FIG. 4 is a schematic diagram illustrating another photographing interface in an embodiment of the present application;
FIG. 5 is a diagram illustrating secondary function options in an embodiment of the present application;
FIG. 6 is a diagram illustrating an offset of a photo display area;
fig. 7 is a diagram illustrating a hardware architecture of a mobile terminal in an embodiment of the present application;
fig. 8 illustrates a schematic structural diagram of a system of a mobile terminal 700 according to an exemplary embodiment;
fig. 9 is a schematic flowchart illustrating processing based on detection data of the angular velocity sensor.
Detailed Description
To make the objects, technical solutions and advantages of the exemplary embodiments of the present application clearer, the technical solutions in the exemplary embodiments of the present application will be clearly and completely described below with reference to the drawings in the exemplary embodiments of the present application, and it is obvious that the described exemplary embodiments are only a part of the embodiments of the present application, but not all the embodiments.
After a user uses the mobile terminal to take a picture, the user needs to quickly and conveniently perform processing related to the picture aiming at the taken picture, for example, edit the picture, or directly use the picture for an e-commerce platform to purchase, or directly use the picture for a knowledge platform to perform information search, or directly use the picture for a social platform to share, and the like.
In order to meet the above requirements of users, embodiments of the present application provide an interaction method based on a mobile terminal photographing interface and a mobile terminal. With the embodiments of the application, function options associated with a photo are displayed automatically and immediately after the user takes it, so that the user can perform the corresponding photo-based processing by triggering the relevant option, meeting the user's need to use a photo immediately after shooting it.
The concept to which the present application relates will be first explained below with reference to the drawings. It should be noted that the following descriptions of the concepts are only for the purpose of facilitating understanding of the contents of the present application, and do not represent limitations on the scope of the present application.
In the embodiment of the present application, the mobile terminal is a device for providing voice and/or data connectivity to a user, and may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices with wireless communication functions, or other processing devices connected to a wireless modem, for example, the mobile terminal may be a smart phone or a tablet computer, etc. The embodiment of the present application does not limit this.
The term "module," as used in various embodiments of the present application, may refer to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "gesture" used in the embodiments of the present application refers to a user behavior used to express an intended idea, action, purpose or result through a change in hand shape or an action such as hand movement.
The term "hardware system" used in the embodiments of the present application may refer to a physical component having computing, controlling, storing, inputting and outputting functions, which is formed by a mechanical, optical, electrical and magnetic device such as an Integrated Circuit (IC), a Printed Circuit Board (PCB) and the like. In various embodiments of the present application, a hardware system may also be referred to as a motherboard (or chip).
The term "middleware" used in the embodiments of the present application refers to independent system software or a service program through which a multi-system architecture can share resources or transfer information between different systems. Middleware is software that connects two independent systems: even if the connected systems have different interfaces, they can still exchange information through the middleware.
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Fig. 1 exemplarily shows a flow diagram of an interaction method based on a mobile terminal photographing interface provided in an embodiment of the present application, and as shown in the drawing, the flow may include:
s101: the mobile terminal receives the photographing instruction, obtains a photo by responding to the photographing instruction, and displays the photo and at least one function option associated with the photo in a photographing interface.
S102: based on the interaction of the user and the mobile terminal, one of the at least one function option is triggered, and the mobile terminal responds to the triggered function option to execute the function corresponding to the triggered function option.
Based on the flow shown in fig. 1, in some embodiments of the present application, in S101, a conventional method may be used to take a picture with the mobile terminal. Taking a mobile phone as an example, the user may open a camera application in the mobile phone; its user interface may be as shown in fig. 2a, where the user interface 100 includes a photographing preview area 201 and a functional area 202, and the functional area 202 includes a photographing button 203. When the user clicks the photographing button 203, a photographing instruction is generated, and in response to the photographing instruction the mobile terminal takes a photo of the image displayed in the current photographing preview area.
In order to implement the photographing function more quickly, in some other embodiments of the present application, in S101, the user may implement the photographing function by performing a gesture operation on the photographing preview area. Optionally, the gesture operation may be a touch operation of long pressing with a finger, or other types of touch operations, which is not limited in this application. For example, as shown in fig. 2b, the user may press the photographing preview area for a long time to trigger generation of a photographing instruction, so that the mobile terminal takes a photo in response to the photographing instruction. In this case, a photograph button or similar function option may not be displayed in the user interface.
In a specific implementation, a gesture (such as a "long-press" touch gesture) for realizing quick photographing may be registered in the system, so that when the user performs the gesture in the photographing preview area, a photographing operation can be performed in response to the gesture.
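The registered long-press gesture can be sketched as a small state machine that fires a capture callback when a press outlasts a hold threshold. This is a minimal illustration, not the patent's implementation; the class name, the `on_capture` callback, and the 0.5 s threshold are all assumptions:

```python
class LongPressCapture:
    # Illustrative sketch of the "register a long-press gesture in the
    # preview area, then shoot when it fires" idea described above.
    # All names and the 0.5 s threshold are assumptions.
    def __init__(self, on_capture, hold_threshold_s=0.5):
        self.on_capture = on_capture            # invoked as the "photographing instruction"
        self.hold_threshold_s = hold_threshold_s
        self._down_at = None                    # timestamp of the current press, if any

    def touch_down(self, t):
        """Finger lands in the photographing preview area at time t (seconds)."""
        self._down_at = t

    def touch_up(self, t):
        """Finger lifts at time t; a press held past the threshold shoots."""
        if self._down_at is not None and t - self._down_at >= self.hold_threshold_s:
            self.on_capture()
        self._down_at = None
```

For example, a press from t = 0.0 s to t = 0.8 s fires the capture callback, while a 0.1 s tap is ignored.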
Of course, in the embodiment of the application, the user may also implement the photographing function through the voice instruction.
In this embodiment, in S101, after the photo is taken, the photo and at least one function option associated with the photo may be displayed on the user interface. For example, when the user long-presses the photographing preview area and then lifts the finger to take a photo quickly, the taken photo and at least one function option associated with it can be displayed on the user interface. The at least one function option may be arranged in the user interface in any manner.
In some embodiments, when displaying the taken photo, the size of the photo may be smaller than the size of the display area of the mobile terminal's screen, leaving a certain space between the edges of the photo and the border of the screen display area; the at least one function option can then be arranged around the photo.
Fig. 3 schematically illustrates a photographing interface. As shown, the user interface 300 includes a photograph display area 301 and a function option area 302. The function option area 302 includes 4 function options. The photo display area and the 4 function options may be implemented by using a display control or in other manners, which is not limited in this embodiment of the application. The 4 function options may be arranged around the photo area, such as the 4 function options arranged above, below, to the left, and to the right of the photo area 301, respectively, as shown in fig. 3. Wherein, these 4 function options are respectively:
shopping: a function for realizing shopping based on a photograph currently displayed in the photograph display area 301;
sharing: the function of sharing the currently displayed photo in the photo display area 301 on the social platform is realized;
encyclopedic: a function for implementing information search on the knowledge platform based on the photo currently displayed in the photo display area 301;
and (4) continuing to take the picture: the function of continuously taking the picture is realized.
It should be noted that the types and the numbers of the function options shown in fig. 3 are only examples, and the embodiments of the present application do not limit the types and the numbers of the function options.
For example, more or fewer function options than the number of function options shown in FIG. 3 may be displayed in the user interface. Fig. 4 exemplarily shows a user interface 400 with 8 function options. As shown in fig. 4, in the user interface 400, a photograph display area 401 and a function option area are included. The function option area includes 8 function options (the numbers of the function options are indicated by numbers in the figure), and the 8 function options are distributed around the photograph display area 401. As shown in the figure, function option 1 is displayed on the upper side of photograph display area 401, function options 2 to 4 are displayed on the right side of photograph display area 401, function option 5 is displayed on the lower side of photograph display area 401, and function options 6 to 8 are displayed on the left side of photograph display area 401.
Optionally, in this embodiment of the application, all or some of the function options displayed in the user interface may be one-level function options, or may include at least two levels of options. If a function option is a one-level option, triggering it directly executes the corresponding function; if a function option includes two levels, triggering it displays its next-level (second-level) options, and triggering or selecting one of those second-level options executes the function corresponding to that second-level option.
Taking the user interface 300 shown in fig. 3 as an example, the "shopping" function option is a two-level function option, and on the basis of the user interface 300, if the "shopping" function option is triggered, a second-level option thereof is displayed, and the second-level option thereof is shown in fig. 5 and may include "shopping platform a", "shopping platform B", and "shopping platform C" (identified as a, B, and C in the figure). When the function option of the shopping platform A is triggered, the related application or function of the shopping platform A can be called, and shopping processing operation is carried out on the shopping platform A based on the photo in the current photo display area; when the function option of the shopping platform B is triggered, the related application or function of the shopping platform B can be called, and shopping processing operation is carried out on the shopping platform B based on the photo in the current photo display area; when the function option of the shopping platform C is triggered, the related application or function of the shopping platform C can be called, and the shopping processing operation is carried out on the shopping platform C based on the photo in the current photo display area.
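The one-level versus two-level behaviour can be sketched with a simple option tree. The dict layout and the `trigger()` helper below are illustrative assumptions, not the patent's implementation; the labels follow the Fig. 3 and Fig. 5 example:

```python
# Illustrative sketch of one-level vs. two-level function options,
# per the "Shopping" example of Fig. 3 and Fig. 5.

def trigger(option):
    """A one-level option executes directly; a two-level option instead
    returns its second-level options so the UI can display them."""
    if "children" in option:
        return {"show_options": [c["label"] for c in option["children"]]}
    return {"executed": option["label"]}

shopping = {
    "label": "Shopping",
    "children": [
        {"label": "Shopping platform A"},
        {"label": "Shopping platform B"},
        {"label": "Shopping platform C"},
    ],
}
share = {"label": "Share"}  # a plain one-level option
```

Triggering `shopping` yields its three second-level platform options for display; triggering `share` executes immediately.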
Based on the flow shown in fig. 1, in S102, when one function option in the user interface is triggered, the mobile terminal may execute the function corresponding to the triggered function option.
In some embodiments, the user may trigger a function option by a gesture. The gesture may be one of several touch operations, such as a single click, a double click, or a sliding operation.
Taking a single click as an example, when a user clicks one of the function options in the function option area of the user interface, the function option is triggered, so that the mobile terminal executes the function corresponding to the function option.
Taking the sliding operation as an example, in some embodiments, the user can slide from a function option toward the photo display area, and when the finger reaches the edge of, or enters, the photo display area and is lifted, the sliding operation triggers that function option. In other embodiments, the user can slide from a position inside the photo display area to the display position of a function option, and lifting the finger at that position triggers the function option.
In other embodiments, the function options may be triggered in other manners, for example, a sensor built in the mobile terminal may be used to determine which function option is triggered by detecting a change in a motion state of the mobile terminal. The sensor may include, but is not limited to, one of an acceleration sensor, a direction sensor, an angular velocity sensor, or a combination of the above. Taking an angular velocity sensor as an example, the angular velocity sensor is also called a gyroscope, and can measure the rotation angular velocity of the mobile terminal during the yaw and tilt.
In an example of detecting the motion state of the mobile terminal by using the angular velocity sensor to determine the triggered function option, in order to enable the motion state of the mobile terminal to correspond to different function options, that is, to trigger different function options through different motions of the mobile terminal, the function options may be arranged in different directions or angles around the photo display area as much as possible, such as above, below, left side, right side, or the like of the photo display area, or arranged in an upper left corner, an upper right corner, a lower left corner, a lower right corner, or the like of the photo display area.
Specifically, after the mobile terminal takes a picture according to the photographing instruction, the mobile terminal may obtain detection data of an angular velocity sensor built in the mobile terminal, so as to obtain a motion state of the mobile terminal according to the detection data, where the motion state of the mobile terminal may include a moving direction, a moving distance, a deflecting direction, a deflecting angle, an inclined direction, an inclined angle, and the like. The mobile terminal may determine which function option is triggered or selected according to its motion state.
For example, taking the user interface 300 shown in fig. 3 as an example, after the mobile terminal takes a picture according to a photographing instruction, the mobile terminal may obtain the yaw angular velocity θ currently detected by the angular velocity sensor, that is, the angle rotated per unit time, calculate its component θ_xn on the x-axis (horizontal axis) and its component θ_yn on the y-axis (vertical axis), and divide by the time interval t_n between the current detection and the previous detection to obtain the change value of the angle in that period:
Δθ_xn = θ_xn / t_n
Δθ_yn = θ_yn / t_n
where Δθ_xn is the change value of the x-axis component of the yaw angular velocity obtained by the current (n-th) detection, and Δθ_yn is the change value of the y-axis component of the yaw angular velocity obtained by the current (n-th) detection.
Taking the time when the picture is taken to the current time and the number of times of detection is n as an example, the change values of the angle in the x-axis direction of each detection time in the period of time are accumulated to obtain
Figure BDA0002186813910000091
And accumulating the change values of the angles in the x-axis direction at each detection moment in the period of time to obtain:
Figure BDA0002186813910000092
If δy is greater than a set threshold (a positive value) while δx remains unchanged or below the set threshold, it can be determined that the mobile terminal has deflected toward the right side of the photo display area, and the function option 'encyclopedia' on the left side of the photo display area is triggered or selected. If δy is less than a set threshold (a negative value) while δx remains unchanged or below the set threshold, it can be determined that the mobile terminal has deflected toward the left side of the photo display area, and the function option 'share' on the right side of the photo display area is triggered or selected. By the same principle, if δx is greater than a set threshold (a positive value) while δy remains unchanged or below the set threshold, it can be determined that the mobile terminal has deflected toward the upper side of the photo display area, and the function option 'continue taking photos' on the lower side of the photo display area is triggered or selected. If δx is less than a set threshold (a negative value) while δy remains unchanged or below the set threshold, it can be determined that the mobile terminal has deflected toward the lower side of the photo display area, and the function option 'shopping' on the upper side of the photo display area is triggered or selected.
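The threshold comparisons above can be sketched as a small decision function. This is an illustrative Python sketch: the mapping of accumulated y-axis rotation to left/right deflection (and x-axis rotation to up/down) and the sign convention are assumptions for demonstration, and the option names are the ones used in the document's example interface.

```python
def triggered_option(delta_x, delta_y, threshold):
    """Decide which function option around the photo display area is
    triggered, given the accumulated deflection about the x axis
    (delta_x, assumed up/down) and y axis (delta_y, assumed left/right).
    Returns None when no single-axis deflection clearly dominates."""
    if delta_y > threshold and abs(delta_x) <= threshold:
        return "encyclopedia"            # deflect right -> left-side option
    if delta_y < -threshold and abs(delta_x) <= threshold:
        return "share"                   # deflect left -> right-side option
    if delta_x > threshold and abs(delta_y) <= threshold:
        return "continue taking photos"  # deflect up -> lower-side option
    if delta_x < -threshold and abs(delta_y) <= threshold:
        return "shopping"                # deflect down -> upper-side option
    return None

print(triggered_option(0, 12, 10))  # encyclopedia
```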
Optionally, to improve the user experience so that the user can see intuitively which function option in the user interface is about to be triggered, in some embodiments of the present application, when the user deflects, tilts or moves the mobile terminal, the position of the photo display area in the user interface may be updated according to the motion state; the function option toward which the photo display area moves is the one that is about to be triggered.
In a specific implementation, the mobile terminal can acquire the angle change value δn currently detected by the angular velocity sensor, divide the angle change value δn by a set maximum rotation angle δmax, and multiply the quotient by the width and the height of the photo display area respectively (since the photo display area can be realized by a display control, the quotient may equally be multiplied by the width and height of the display control), so as to obtain the distances by which the photo display area should be shifted in the x direction and the y direction during the period from the time the photo is taken to the current time:

Δx = δn / δmax × Pwidth

Δy = δn / δmax × Pheight

wherein, Δx indicates the distance by which the photo display area should be shifted in the x direction, Δy indicates the distance by which the photo display area should be shifted in the y direction, Pwidth indicates the width of the photo display area, and Pheight indicates the height of the photo display area.
The mobile terminal then shifts the position of the photo display area in the user interface by the calculated distances in the corresponding directions.
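The offset calculation can be sketched directly from the formulas Δx = δn/δmax × Pwidth and Δy = δn/δmax × Pheight. This is an illustrative Python sketch; the function and parameter names are invented for demonstration.

```python
def display_area_offset(delta_n, delta_max, p_width, p_height):
    """Compute how far the photo display area should be shifted:
    the detected angle change delta_n is divided by the set maximum
    rotation angle delta_max, and the quotient is multiplied by the
    width and height of the photo display area respectively."""
    ratio = delta_n / delta_max
    return ratio * p_width, ratio * p_height

# A 15-degree change with a 60-degree maximum on a 400x300 display area.
print(display_area_offset(15, 60, 400, 300))  # (100.0, 75.0)
```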
For example, fig. 6 exemplarily shows a schematic diagram of the photo display area being offset, in which (a) shows the initial state of the mobile terminal (the state at the moment the photo is taken is referred to herein as the initial state). The user then deflects the mobile terminal to the left, for example about the central axis of the mobile terminal in the y direction, and the current deflection angle is used to calculate, according to the above formulas, the distances by which the photo display area should currently be shifted in the x and y directions. Diagram (b) shows a schematic diagram after the photo display area has been shifted. It can be seen that the photo display area is shifted to the right; if the shifted distance reaches a set length (for example, the right border of the photo display area overlaps the border of the right function option display control, or the distance between the two borders is less than a set threshold), the function option on the right side of the photo display area is triggered.
In the embodiment of the application, the types and the number of the function options displayed on the user interface can be set according to requirements. After the function option is triggered, the function corresponding to the function option may be executed, for example, the function corresponding to the function option may be implemented by calling a system application program or a third-party application program. The following types of function options are taken as examples to describe the operations performed after the function options are triggered:
Shopping-type function option: when a shopping-type function option is triggered, a shopping platform application program can be called, that is, started or switched to. The shopping platform application program uses the photo displayed in the current photo display area as a search key picture to search, among the commodities provided by the shopping platform, for commodities matching the items in the photo, and displays the search results so that the user can select commodities to purchase. The shopping platform application program may be a system application program or a third-party application program.
Social type of function options: when the social type function option is triggered, a social platform application program can be called, namely, the social platform application program is started or switched to, and the social platform application program can share the photos displayed in the current photo display area in the social platform. The social platform application program may be a system application program or a third-party application program.
Function options of information search type: when the function option of the information search type is triggered, an information search platform application program can be called, namely, the information search platform application program is started or switched to, the information search platform application program uses the picture displayed in the current picture display area as a search key picture to search information matched with the scene or the article in the picture, and the search result is displayed so that a user can select to view the information. The information search platform application program may be a system application program or a third party application program.
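The dispatch described for these three option types can be sketched loosely as a lookup table. This is an illustrative Python sketch only: the handler strings below merely stand in for calling (starting or switching to) the corresponding platform application program, and all names are invented for demonstration.

```python
def handle_option(option_type, photo):
    """Dispatch a triggered function option to the platform application
    that handles it, using the photo in the current display area as the
    search key picture or content to share."""
    handlers = {
        "shopping": lambda p: f"search goods matching items in {p}",
        "social": lambda p: f"share {p} on the social platform",
        "info_search": lambda p: f"search information about {p}",
    }
    handler = handlers.get(option_type)
    if handler is None:
        raise ValueError(f"unknown function option type: {option_type}")
    return handler(photo)

print(handle_option("shopping", "photo.jpg"))
```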
Fig. 7 illustrates a hardware architecture of a mobile terminal in an embodiment of the present application, and the mobile terminal may implement the functions described in the above embodiments.
A block diagram of a configuration of a mobile terminal 700 according to an exemplary embodiment is illustrated in fig. 7. As shown in fig. 7, the mobile terminal 700 includes a controller 710, a communicator 730, a user input/output interface 740, a memory 790, a power supply 780, and a display 791.
The mobile terminal 700 may receive an operation instruction input by a user and perform a corresponding processing operation in response to the operation instruction.
The controller 710 includes a processor 712, a RAM 713, a ROM 714, a communication interface, and a communication bus. The controller 710 is used for controlling the operation of the mobile terminal 700, for communication and coordination among the internal components of the mobile terminal, and for external and internal data processing functions.
The communicator 730 enables communication of control signals and data signals with other devices under the control of the controller 710, for example, sending a received user input signal to a display device. The communicator 730 may include at least one of a WiFi module 731, a Bluetooth module 732, a near field communication (NFC) module 733, and the like.
A user input/output interface 740, wherein the input interface includes at least one of a microphone 741, a touch panel 742, a sensor 743 (such as an angular velocity sensor), a key 744, and the like. For example, the user can input user instructions through voice, touch, gesture, pressing and other actions; the input interface converts a received analog signal into a digital signal and converts the digital signal into a corresponding instruction signal to be sent to other devices.
The output interface includes an interface that transmits a received user instruction to other devices. In some embodiments, it may be an infrared interface or a radio frequency interface. For example, when an infrared signal interface is used, a user input instruction needs to be converted into an infrared control signal according to an infrared control protocol and sent to other devices through an infrared sending module. As another example, when a radio frequency signal interface is used, a user input instruction needs to be converted into a digital signal, modulated according to a radio frequency control signal modulation protocol, and then sent to other devices through a radio frequency sending terminal.
In some embodiments, the mobile terminal 700 includes at least one of the communicator 730 and the output interface. When the mobile terminal 700 is configured with the communicator 730, for example with WiFi, Bluetooth and NFC modules, it can transmit user input instructions to other devices through the WiFi protocol, the Bluetooth protocol or the NFC protocol.
A memory 790 for storing various operating programs, data, and applications that drive and control the mobile terminal 700 under the control of the controller 710. The memory 790 may store various control signal commands inputted by the user.
A power supply 780 for providing operational power support to the various elements of the mobile terminal 700 under the control of the controller 710; the power supply 780 may include a battery and associated control circuitry.
Fig. 8 illustrates a schematic structural diagram of a system of a mobile terminal 700 according to an exemplary embodiment.
As shown in fig. 8, taking the Android system as an example, the system architecture is a four-layer structure comprising, from the upper layer to the lower layer, an application layer, an application framework layer, a system runtime library layer and a Linux kernel layer.
The application program layer includes various application programs, where the application programs related to the embodiments of the present application include a "processing module" and a "sensing and control module," and the "sensing and control module" may execute a function corresponding to a triggered function option when detecting that the function option in the user interface is triggered.
In the application framework layer, the modules related to the embodiments of the present application may include middleware, an "event response distribution module" and the like. The event response distribution module is used for sending the event information to a corresponding application program in the application layer when the registered event information is monitored.
The system runtime library layer comprises a Native system library and the Android runtime, wherein the Native system library comprises a plurality of system services.
In the Linux kernel layer, core system services such as security, memory management, process management, network protocols and driver models all rely on the Linux kernel.
In connection with the system architecture shown in fig. 8, fig. 9 is a schematic flow chart illustrating processing according to the detection data of the angular velocity sensor, and as shown in the figure, the flow chart may include:
after the application program related to the embodiment of the application is started, an initialization process is performed. During initialization, the "sensing and control module" registers the listening events of the angular velocity sensors so that the data detected by the angular velocity sensors can be obtained (801-803).
In the running process of the application program, the angular velocity sensor can send the detected state information (such as the deflection angle and the deflection angular velocity) of the mobile terminal to the sensing and control module. The sensing and control module calculates the offset distance of the picture display area in the user interface according to the information and sends the offset distance to the processing module, so that the processing module can trigger the display to refresh the user interface (804-807).
After the 'sensing and control module' obtains the state information (such as the deflection angle and the deflection angular velocity) of the mobile terminal detected by the angular velocity sensor, if, after calculating the offset distance of the photo display area in the user interface from this information, it judges that a certain function option in the user interface is triggered, it sends a message to the 'event response distribution module' to report the triggered function option (808).
After the event response distribution module obtains the notification, the corresponding application program in the application layer can be triggered to execute the function corresponding to the function option.
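The message flow of fig. 9 can be modelled, very loosely, in Python. The class and method names below are invented for illustration; a real implementation would register a listener with Android's sensor framework, and only the y-axis (left/right) path is sketched here, with the sign convention assumed.

```python
class EventResponseDistributionModule:
    """Stands in for the framework module that forwards trigger
    notifications to the registered application."""
    def __init__(self):
        self.events = []

    def notify(self, event):
        self.events.append(event)


class SensingAndControlModule:
    """Accumulates angular-velocity detections and reports a trigger
    once the accumulated deflection crosses the set threshold."""
    def __init__(self, threshold, dispatcher):
        self.threshold = threshold
        self.dispatcher = dispatcher
        self.delta_y = 0.0

    def on_sensor_data(self, dy):
        self.delta_y += dy
        if abs(self.delta_y) > self.threshold:
            # Assumed convention: positive accumulated deflection
            # triggers the left-side option, negative the right-side.
            side = "left" if self.delta_y > 0 else "right"
            self.dispatcher.notify(side)
            self.delta_y = 0.0          # re-arm after reporting


dispatcher = EventResponseDistributionModule()
module = SensingAndControlModule(threshold=10.0, dispatcher=dispatcher)
for dy in (4.0, 4.0, 4.0):              # three detections; sum exceeds 10
    module.on_sensor_data(dy)
print(dispatcher.events)                # ['left']
```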
The present embodiments also provide a computer-readable storage medium storing computer instructions that, when executed on a computer, cause the computer to perform a method implemented as a combination of one or more of the foregoing embodiments.
All other embodiments, which can be derived by a person skilled in the art from the exemplary embodiments shown in the present application without inventive effort, shall fall within the scope of protection of the present application. Moreover, while the disclosure herein has been presented in terms of one or more exemplary examples, it is to be understood that each aspect of the disclosure can be utilized independently and separately from the other aspects to provide a complete disclosure.
It should be understood that the terms "first," "second," "third," and the like in the description, in the claims and in the drawings of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that data so used are interchangeable under appropriate circumstances, so that, for example, the embodiments of the application can be implemented in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (8)

1. An interaction method based on a mobile terminal photographing interface is characterized by comprising the following steps:
the method comprises the steps of responding to a photographing instruction received by a mobile terminal to obtain a photo, and displaying the photo and at least one function option related to the photo in a photographing interface; the photographing interface comprises a photo display area and a function option area, wherein the photo display area is used for displaying the photos, and the function option area is used for displaying the at least one function option, and the at least one function option is distributed around the photo display area;
acquiring an angle change value of an angular velocity sensor of the mobile terminal, and multiplying a division result of the angle change value and a set maximum rotation angle by the width and the height of the photo display area respectively to obtain a horizontal offset distance and a vertical offset distance of the mobile terminal in a set time period;
and determining the deflection direction of the mobile terminal according to the obtained horizontal deflection distance and the obtained vertical deflection distance, determining that the function option matched with the motion state of the mobile terminal in the at least one function option corresponding to the deflection direction is triggered according to the deflection direction of the mobile terminal, and executing the function corresponding to the triggered function option.
2. The method of claim 1, wherein taking the picture in response to the photographing instruction received by the mobile terminal comprises:
and responding to a first gesture acted on a mobile terminal photographing preview interface, and photographing to obtain a picture.
3. The method of claim 1, further comprising:
determining the deflection distance of the photo display area according to the motion state of the mobile terminal;
and updating and displaying the photographing interface according to the deflection distance of the photo display area, so that the photo display area deflects in the photographing interface according to the deflection distance.
4. The method of claim 1, wherein one of the at least one function option is triggered, comprising:
acquiring a second gesture acting on the photo display area, and determining that a first function option is triggered according to the second gesture; wherein the first function option is one of the at least one function option.
5. The method of claim 1, wherein a first one of the at least one functionality option comprises at least a first level option and a second level option;
the taking a picture in response to a photographing instruction and displaying the picture and at least one function option associated with the picture comprises:
responding to a photographing instruction to obtain a photo, displaying the photo, and displaying a first-level option of the first function option;
the method further comprises the following steps:
and in response to the first-level option of the first functional option being triggered, displaying a second-level option of the first functional option.
6. The method of claim 1, wherein the photographing interface comprises a photograph display area for displaying the photograph and a function options area for displaying the at least one function option, the at least one function option in the function options area being disposed around the photograph display area.
7. A mobile terminal, comprising: a controller, a memory and a display screen;
the controller, reading computer program instructions in the memory, performing the method of any of claims 1-6.
8. A computer-readable storage medium having stored thereon computer instructions which, when executed on a computer, cause the computer to perform the method of any one of claims 1-6.
CN201910817998.7A 2019-08-30 2019-08-30 Interaction method based on mobile terminal photographing interface and mobile terminal Active CN110543275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910817998.7A CN110543275B (en) 2019-08-30 2019-08-30 Interaction method based on mobile terminal photographing interface and mobile terminal


Publications (2)

Publication Number Publication Date
CN110543275A CN110543275A (en) 2019-12-06
CN110543275B true CN110543275B (en) 2021-12-14

Family

ID=68711148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910817998.7A Active CN110543275B (en) 2019-08-30 2019-08-30 Interaction method based on mobile terminal photographing interface and mobile terminal

Country Status (1)

Country Link
CN (1) CN110543275B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114968042B (en) * 2022-05-26 2023-05-05 重庆长安汽车股份有限公司 Image editing system and method based on android system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102265242A (en) * 2008-10-29 2011-11-30 因文森斯公司 Controlling and accessing content using motion processing on mobile devices
CN103677609A (en) * 2012-09-24 2014-03-26 华为技术有限公司 Picture processing method and terminal
CN107483833A (en) * 2017-09-22 2017-12-15 维沃移动通信有限公司 The display methods and mobile terminal of a kind of camera function
CN108596095A (en) * 2018-04-24 2018-09-28 维沃移动通信有限公司 A kind of information processing method and mobile terminal
CN109085967A (en) * 2018-06-27 2018-12-25 阿里巴巴集团控股有限公司 A kind of call method of function of application, device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6852368B2 (en) * 2016-11-30 2021-03-31 株式会社リコー Shooting device


Also Published As

Publication number Publication date
CN110543275A (en) 2019-12-06

Similar Documents

Publication Publication Date Title
US20200125144A1 (en) Foldable electronic device for controlling user interface and operating method thereof
US11943530B2 (en) Electronic device and method for adjusting camera magnification
EP3537258A1 (en) Electronic device with flexible display and method for operating same
CN110891144B (en) Image display method and electronic equipment
EP3163404B1 (en) Method and device for preventing accidental touch of terminal with touch screen
EP4047922A1 (en) Object tracking method and electronic device
CN109862267B (en) Shooting method and terminal equipment
CN108495029B (en) Photographing method and mobile terminal
EP3232299A2 (en) Physical key component, terminal, and touch response method and device
US11755186B2 (en) Screen capturing method and terminal device
EP3276301B1 (en) Mobile terminal and method for calculating a bending angle
EP2887648B1 (en) Method of performing previewing and electronic device for implementing the same
CN109933252B (en) Icon moving method and terminal equipment
CN109117635B (en) Virus detection method and device for application program, computer equipment and storage medium
US20170118402A1 (en) Electronic device and camera control method therefor
US20150091824A1 (en) Information processing apparatus, information processing method, and computer program
CN111026464A (en) Identification method and electronic equipment
EP3182256B1 (en) Touch control button, touch control panel and touch control terminal
CN111742543A (en) Electronic device and recording method thereof
CN114546235A (en) Electronic device for providing multiple windows by using an expandable display
CN110543275B (en) Interaction method based on mobile terminal photographing interface and mobile terminal
CN113918258B (en) Page scrolling processing method, device, terminal and storage medium
CN108833791B (en) Shooting method and device
CN111064896A (en) Device control method and electronic device
JP7413546B2 (en) Photography method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 266071 Shandong city of Qingdao province Jiangxi City Road No. 11

Patentee after: Qingdao Hisense Mobile Communication Technology Co.,Ltd.

Address before: 266071 Shandong city of Qingdao province Jiangxi City Road No. 11

Patentee before: HISENSE MOBILE COMMUNICATIONS TECHNOLOGY Co.,Ltd.