CN107992816B - Photographing search method and device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN107992816B
Authority
CN
China
Prior art keywords
preset
image
eye
camera
shooting interval
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201711219422.8A
Other languages
Chinese (zh)
Other versions
CN107992816A (en)
Inventor
梁金辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Genius Technology Co Ltd
Original Assignee
Guangdong Genius Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Genius Technology Co Ltd filed Critical Guangdong Genius Technology Co Ltd
Priority to CN201711219422.8A
Publication of CN107992816A
Application granted
Publication of CN107992816B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168: Feature extraction; Face representation
    • G06V 40/171: Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/583: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content
    • G06F 16/5838: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content using colour
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/61: Control of cameras or camera modules based on recognised objects
    • H04N 23/611: Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H04N 23/80: Camera processing pipelines; Components thereof

Landscapes

  • Engineering & Computer Science (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Library & Information Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention is applicable to the technical field of intelligent terminals and provides a photographing search method, a photographing search device and electronic equipment. The photographing search method comprises the following steps: acquiring a face image and extracting eye feature information from the face image; verifying the extracted eye feature information; if the eye feature information meets a preset condition, triggering a camera to shoot an image of a preset shooting interval within the current framing range; and searching content according to the shot image of the preset shooting interval. By this method, automatic framing is realized and the photographing search efficiency is improved.

Description

Photographing search method and device, electronic equipment and computer readable storage medium
Technical Field
The invention belongs to the technical field of electronic terminals, and particularly relates to a photographing search method and device and electronic equipment.
Background
With the rapid development of intelligent terminal technology, electronic education has gradually developed, and the function of photographing and searching questions with an intelligent terminal is more and more widely used. The user shoots a question by controlling the intelligent terminal. After the shooting is completed and the photo is displayed on the display screen of the intelligent terminal, the user manipulates a selection frame displayed on the screen to frame the question; the intelligent terminal then intercepts the question whose answer is to be searched, analyzes the corresponding question content and uploads it to the server to obtain the answer, thereby completing the photographing question-search function.
However, in conventional photographing search, frame selection of the content to be searched generally requires the user to adjust the size of the selection frame through the touch screen of the intelligent terminal and to move the selection frame to a proper position until it just surrounds the content to be searched before the screenshot can be uploaded for searching. This operation is cumbersome and lowers the efficiency of photographing search.
Disclosure of Invention
In view of this, embodiments of the present invention provide a photo search method, a photo search device, and an electronic device, so as to solve the problem in the prior art that the photo search efficiency is low due to complicated operation for selecting a content to be searched.
The first aspect of the present invention provides a photograph search method, including:
acquiring a face image, and extracting eye feature information in the face image;
verifying the extracted eye characteristic information;
if the eye characteristic information meets a preset condition, triggering a camera to shoot an image of a preset shooting interval in the current framing range;
and searching the content according to the shot image of the preset shooting interval.
With reference to the first aspect, in a first possible implementation manner of the first aspect, if the eye feature identified by the feature identification meets a preset condition, the step of triggering the camera to shoot an image of a preset shooting interval within the current viewing range includes:
tracking an eye movement trajectory over a first specified time;
calculating the matching degree of the eye movement track and a preset eye movement track;
and if the calculated matching degree is not lower than the preset matching degree, triggering the camera to shoot the image of the preset shooting interval in the current view finding range.
With reference to the first aspect, in a second possible implementation manner of the first aspect, if the eye feature identified by the feature identification meets a preset condition, the step of triggering the camera to shoot an image of a preset shooting interval within the current viewing range includes:
and when the blink frequency detected in the second designated time meets the preset blink frequency, triggering the camera to shoot the image of the preset shooting interval in the current framing range.
With reference to the first aspect, in a third possible implementation manner of the first aspect, if the eye feature identified by the feature identification meets a preset condition, the step of triggering the camera to shoot an image of a preset shooting interval within the current viewing range includes:
and when the continuous eye closing time of the eyes is detected to meet the preset eye closing time within the third appointed time, triggering the camera to shoot the image of the preset shooting interval within the current framing range.
With reference to the first aspect, or with reference to the first possible implementation manner of the first aspect, or with reference to the second possible implementation manner of the first aspect, or with reference to the third possible implementation manner of the first aspect, in a fourth possible implementation manner of the first aspect, if the eye feature identified by the feature identification meets a preset condition, the step of triggering the camera to shoot an image of a preset shooting interval within the current viewing range includes:
if the eye features identified by the feature identification meet the preset conditions, rotating the orientation of a camera so as to enable the camera to turn to the direction of the content to be searched;
and when the camera adjusting instruction is not detected within the preset time, shooting an image of a preset shooting interval within the current framing range of the camera.
A second aspect of the present invention provides a photograph search apparatus including:
the characteristic information acquisition unit is used for acquiring a face image and extracting eye characteristic information in the face image;
the characteristic information verification unit is used for verifying the extracted eye characteristic information;
the image shooting unit is used for triggering the camera to shoot an image of a preset shooting interval in the current framing range if the eye feature information meets a preset condition;
and the content searching unit is used for searching the content according to the shot image of the preset shooting interval.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the image capturing unit includes:
the track tracking module is used for tracking the eye movement track in a first designated time;
the first matching degree calculation module is used for calculating the matching degree of the eye movement track and a preset eye movement track;
and the first image shooting module is used for triggering the camera to shoot the image of a preset shooting interval in the current framing range if the matching degree obtained by calculation is not lower than the preset matching degree.
With reference to the second aspect, in a second possible implementation manner of the second aspect, the image capturing unit includes:
and the second image shooting module is used for triggering the camera to shoot the image of the preset shooting interval in the current framing range when the blink frequency detected in the second designated time meets the preset blink frequency.
With reference to the second aspect, in a third possible implementation manner of the second aspect, the image capturing unit includes:
and the third image shooting module is used for triggering the camera to shoot the image of a preset shooting interval in the current framing range when the continuous eye closing time of the detected eyes in the third specified time meets the preset eye closing time.
With reference to the second aspect, or with reference to the first possible implementation manner of the second aspect, or with reference to the second possible implementation manner of the second aspect, or with reference to the third possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the image capturing unit includes:
the shooting adjusting module is used for rotating the orientation of the camera if the eye features identified by the feature identification meet the preset conditions so as to enable the camera to turn to the direction of the content to be searched;
and the fourth image shooting module is used for shooting the image of a preset shooting interval in the current framing range of the camera when the camera adjusting instruction is not detected within the preset time.
A third aspect of the present invention provides an electronic device comprising: a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the photo search method according to any one of the above first aspect when executing the computer program.
A fourth aspect of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the photo search method according to any one of the first aspects above.
Compared with the prior art, the embodiments of the invention have the following beneficial effects. A face image is obtained, eye feature information in the face image is extracted, and the extracted eye feature information is verified; if the eye feature information meets the preset condition, the camera is triggered to shoot an image of the preset shooting interval within the current framing range, and content search is finally performed according to the shot image of the preset shooting interval. Because the intelligent terminal extracts the eye feature information from the face image and, after verification, triggers the camera to shoot the image of the preset shooting interval, the image of the content to be searched is obtained automatically and the content to be searched can be obtained without manual frame selection by the user, which facilitates user operation and improves the photographing search efficiency and the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a flowchart illustrating an implementation of a photo search method according to an embodiment of the present invention;
fig. 2 is a flowchart of implementing step S103 when the eye feature information is an eye movement track according to an embodiment of the present invention;
fig. 3 is a flowchart of a step S103 when a camera of an intelligent terminal is rotatable according to an embodiment of the present invention;
fig. 4 is a block diagram of a photographing search apparatus according to an embodiment of the present invention;
fig. 5 is a schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the invention. It will be apparent, however, to one skilled in the art that the present invention may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present invention with unnecessary detail.
The embodiments of the invention provide a photographing search method, a photographing search device and electronic equipment, which make it convenient for a user to perform photographing search with an intelligent terminal and improve the photographing search efficiency. In order to specifically describe the above photographing search method, device and electronic equipment, specific embodiments are described below.
Example one
Fig. 1 shows a flowchart of a photo search method according to an embodiment of the present invention, which is detailed as follows:
step S101, a face image is obtained, and eye feature information in the face image is extracted.
Specifically, after the user starts a photographing search application of the intelligent terminal, a face is detected through the camera, a face image is obtained, and eye feature information is extracted from the obtained face image. Further, before extracting eye feature information, image preprocessing is performed on the face image: illumination compensation is applied to the image, the image is converted into a grayscale image, the grayscale image is denoised, and human eye detection is performed through a human eye detection algorithm to obtain an eye region image. Eye feature information is then extracted from the eye region image.
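As a concrete illustration, the following is a minimal sketch of such a preprocessing pipeline, assuming OpenCV and its bundled Haar cascade as the human eye detection algorithm; the patent does not name a specific algorithm, so these choices are illustrative stand-ins.

```python
import cv2

def extract_eye_regions(face_image_bgr):
    """Preprocess a face image and return cropped eye-region images.

    Histogram equalization stands in for illumination compensation,
    Gaussian blur for denoising, and a Haar cascade for the
    unspecified human eye detection algorithm.
    """
    # Convert to a grayscale image and denoise it.
    gray = cv2.cvtColor(face_image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)
    # Simple illumination compensation.
    gray = cv2.equalizeHist(gray)
    # Detect eye regions and crop them out.
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [gray[y:y + h, x:x + w] for (x, y, w, h) in eyes]
```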
Optionally, before step S101, the method further includes: acquiring a camera adjusting instruction, and adjusting the shooting parameters of the camera according to the camera adjusting instruction. The shooting parameters include the shooting angle of the camera and the optical parameters of the camera. Specifically, the positional relationship between the camera and the content to be searched is adjusted by adjusting the shooting parameters of the camera. In an embodiment of the present invention, the optical parameters of the camera include depth of field, field angle, or focal length. The longer the focal length, the smaller the field angle and the smaller the shooting range; conversely, the shorter the focal length, the larger the field angle and the larger the shooting range. The shorter the focal length, the deeper the depth of field; the longer the focal length, the shallower the depth of field. The camera adjusting instruction is used to adjust the positional relationship between the camera and the content to be searched, thereby improving the photographing search efficiency.
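For illustration, the stated focal-length/field-angle relationship follows from the standard thin-lens approximation; the sensor width used below is an assumed example value, not a parameter given in the patent.

```python
import math

def horizontal_field_angle_deg(focal_length_mm, sensor_width_mm=4.8):
    """Horizontal field angle under the thin-lens approximation.

    A longer focal length gives a smaller field angle and thus a
    smaller shooting range, matching the relationship described above.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))
```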
And step S102, verifying the extracted eye feature information.
Specifically, the extracted eye feature information is compared and matched with eye feature information pre-stored by a user.
In the embodiment of the present invention, the eye feature information refers to eye state image information captured in a specific state, rather than merely a feature image of the eyes in the natural state of a human face. The eye feature information includes, but is not limited to, the eye movement track, the number of blinks, and the eye closing time.
And step S103, if the eye feature information meets a preset condition, triggering a camera to shoot an image of a preset shooting interval in the current framing range.
Optionally, as shown in fig. 2, when the eye feature information is an eye movement trajectory, the step S103 specifically includes:
and A1, tracking the eye movement track in the first designated time.
And A2, calculating the matching degree of the eye movement track and a preset eye movement track.
And A3, if the calculated matching degree is not lower than the preset matching degree, triggering the camera to shoot the image of the preset shooting interval in the current view finding range.
The eye movement track includes a moving direction and a moving distance of the eyeball, for example, the eyeball moving a certain distance to the left, or the eyeball moving back and forth left and right. Specifically, if, within the first specified time, the moving direction of the eyeball is the same as the moving direction of the preset eye movement track and the difference between the moving distance of the eyeball and the moving distance in the preset eye movement track is within a preset range, it is determined that the matching degree between the eye movement track and the preset eye movement track is not lower than the preset matching degree, and at this time the camera is triggered to shoot an image of the preset shooting interval within the current framing range.
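A minimal sketch of this matching decision, assuming the trajectory has already been reduced to a dominant moving direction and a total moving distance in pixels; the representation and the tolerance are illustrative assumptions, not values from the patent.

```python
def trajectory_matches(observed, preset, distance_tolerance_px=15.0):
    """Decide whether an observed eye movement matches the preset one.

    `observed` and `preset` are (direction, distance) pairs, e.g.
    ("left", 40.0); the match requires the same direction and a
    distance difference within the preset range.
    """
    same_direction = observed[0] == preset[0]
    distance_in_range = abs(observed[1] - preset[1]) <= distance_tolerance_px
    return same_direction and distance_in_range
```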
Further, in order to accurately acquire the eye feature information, the step a1 specifically includes:
and A11, acquiring N frames of face images in the first appointed time. Wherein N is an integer not less than 2.
And A12, extracting the eye movement track of the eyes according to the N frames of face images in the first appointed time.
In the embodiment of the invention, the N frames of face images are multiple consecutive frames within the first specified time, or multiple non-consecutive frames. For eye tracking, the more frames of face images acquired per unit time and the faster the images are processed, the more accurate the eye tracking.
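One way to obtain the per-frame data for such tracking, sketched under the assumption that the pupil can be approximated as the darkest blob in each eye-region frame; the patent does not prescribe a tracking method, so this is only an illustration.

```python
import cv2
import numpy as np

def eye_trajectory(eye_frames):
    """Estimate pupil centers across N grayscale eye-region frames.

    The pupil center is approximated by the centroid of the darkest
    20% of pixels in each frame; the returned list of (x, y) centers,
    in frame order, forms the eye movement trajectory.
    """
    centers = []
    for frame in eye_frames:
        threshold = np.percentile(frame, 20)
        pupil_mask = (frame <= threshold).astype(np.uint8)
        m = cv2.moments(pupil_mask, binaryImage=True)
        if m["m00"] > 0:
            centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centers
```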
Optionally, when the eye feature information is the number of blinks, the step S103 includes:
and when the blink frequency detected in the second designated time meets the preset blink frequency, triggering the camera to shoot the image of the preset shooting interval in the current framing range.
Specifically, N frames of face images within the second designated time are acquired, where N is an integer not less than 2, and the number of blinks is determined from these N frames. When the number of blinks detected within the second designated time meets the preset number of blinks, the camera is triggered to shoot an image of the preset shooting interval within the current framing range. In this embodiment, the N frames of face images are consecutive frames.
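As one possible realization, the sketch below counts blinks from per-frame eye aspect ratio (EAR) values; both the EAR representation and the 0.2 threshold are assumptions outside the patent text.

```python
def count_blinks(ear_per_frame, closed_threshold=0.2):
    """Count blinks from a sequence of per-frame eye aspect ratios.

    A blink is registered each time the eye closes (EAR drops below
    the threshold) and subsequently reopens.
    """
    blinks = 0
    eye_closed = False
    for ear in ear_per_frame:
        if ear < closed_threshold:
            eye_closed = True
        elif eye_closed:
            blinks += 1
            eye_closed = False
    return blinks
```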
Optionally, when the eye feature information is eye closing time, the step S103 includes:
and when the continuous eye closing time of the eyes is detected to meet the preset eye closing time within the third appointed time, triggering the camera to shoot the image of the preset shooting interval within the current framing range.
Specifically, N frames of face images within the third specified time are acquired, where N is an integer not less than 2, and the continuous eye closing time is extracted from these N frames. In this embodiment, the N frames of face images are consecutive frames. When the detected continuous eye closing time is not less than the preset eye closing time, the camera is triggered to shoot an image of the preset shooting interval within the current framing range.
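Under the same assumed EAR representation, a minimal sketch of the continuous eye-closure check, with per-frame timestamps in seconds; the threshold remains an illustrative assumption.

```python
def longest_closed_duration(ear_per_frame, timestamps, closed_threshold=0.2):
    """Return the longest continuous eye-closure duration in seconds.

    `ear_per_frame` and `timestamps` are parallel sequences of EAR
    values and capture times for the N frames.
    """
    longest = 0.0
    closed_since = None
    for ear, t in zip(ear_per_frame, timestamps):
        if ear < closed_threshold:
            if closed_since is None:
                closed_since = t
            longest = max(longest, t - closed_since)
        else:
            closed_since = None
    return longest
```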
Optionally, in the embodiment of the present invention, when the intelligent terminal includes a first camera and a second camera, a face image is obtained through the first camera; the intelligent terminal extracts eye feature information from the face image and verifies it, and if the eye feature information meets the preset condition, the second camera is triggered to shoot an image of the preset shooting interval within the current framing range.
optionally, when the camera of the intelligent terminal is rotatable, as shown in fig. 3, the step S103 includes:
And B1, if the eye features identified by the feature identification meet the preset conditions, rotating the orientation of the camera so that the camera turns to the direction of the content to be searched.
And B2, when the camera adjusting instruction is not detected within the preset time, shooting an image of a preset shooting interval within the current view range of the camera.
In the embodiment of the invention, after the photographing search instruction is detected, the camera of the intelligent terminal faces the user to acquire the facial feature image of the user. If the identified facial feature image meets the preset feature condition, the orientation of the camera is immediately and automatically rotated so that the camera turns toward the content to be searched; the angle through which the camera rotates is a preset rotation angle, for example 180°. When no camera adjusting instruction is detected within the preset time, an image of the preset shooting interval within the current framing range of the camera is shot. That is, in the embodiment of the present invention, the facial feature image of the user and the image of the content to be searched are acquired by the same camera. The image of the preset shooting interval refers to a local image within the current framing range of the camera; for example, the current framing image is proportionally cut into upper, middle and lower parts, and the image of the preset shooting interval is the upper, middle or lower area of the camera's current framing image. Further, the range of the image of the preset shooting interval may be defined by a fixed finder frame.
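The proportional upper/middle/lower split described above can be sketched as follows; the three equal horizontal bands are one illustrative way to define the preset shooting interval.

```python
def crop_preset_interval(frame, interval="middle"):
    """Crop the upper, middle, or lower third of the framed image.

    `frame` is an H x W (x C) NumPy array from the camera; the band
    names and the equal-thirds split are illustrative assumptions.
    """
    height = frame.shape[0]
    bands = {
        "upper": (0, height // 3),
        "middle": (height // 3, 2 * height // 3),
        "lower": (2 * height // 3, height),
    }
    top, bottom = bands[interval]
    return frame[top:bottom]
```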
And step S104, searching contents according to the shot images in the preset shooting interval.
Specifically, text recognition is performed on the shot image of the preset shooting interval, and may be performed through Optical Character Recognition (OCR). Content search is then performed according to the character information obtained by text recognition. In the embodiment of the present invention, the text recognition method may follow an existing text recognition method and is not described here again.
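A hedged sketch of this step, assuming the pytesseract OCR binding and a hypothetical question-search endpoint; neither the library nor the endpoint is named in the patent.

```python
import pytesseract
import requests

def search_by_image(cropped_image, search_url="https://example.com/question-search"):
    """OCR the cropped question image and query a search service.

    `search_url` is a placeholder endpoint, and lang="chi_sim+eng"
    assumes Chinese and English question text.
    """
    text = pytesseract.image_to_string(cropped_image, lang="chi_sim+eng")
    response = requests.post(search_url, json={"query": text.strip()}, timeout=10)
    response.raise_for_status()
    return response.json()
```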
In the first embodiment of the invention, a face image is obtained, and eye feature information in the face image, such as the eye movement track, the number of blinks, or the eye closing time, is extracted. The extracted eye feature information is verified; if it meets the preset condition, the camera is triggered to shoot an image of the preset shooting interval within the current framing range, and content search is finally performed according to the shot image of the preset shooting interval. Because the intelligent terminal extracts the eye feature information from the face image and, after verification, triggers the camera to shoot the image of the preset shooting interval, the image of the content to be searched is obtained automatically and the content to be searched can be obtained without manual frame selection by the user, which facilitates user operation and improves the photographing search efficiency and the user experience.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example two
Corresponding to the photographing search method described in the above embodiment, fig. 4 shows a block diagram of a photographing search apparatus provided in an embodiment of the present invention, which is applicable to electronic devices such as intelligent terminals. Such devices may include user equipment that communicates with one or more core networks via a radio access network (RAN), such as a mobile phone (or "cellular" phone) or a computer with mobile equipment; the user equipment may also be, for example, a portable, pocket, hand-held, computer-embedded or vehicle-mounted mobile apparatus that exchanges voice and/or data with the radio access network. The mobile device may include, for example, a smartphone, a tablet computer, a personal digital assistant (PDA), or a vehicle-mounted computer. For convenience of explanation, only the portions related to the embodiments of the present invention are shown.
Referring to fig. 4, the photographing search apparatus includes a characteristic information acquisition unit, a characteristic information verification unit, an image shooting unit and a content searching unit, wherein:
the characteristic information acquisition unit is used for acquiring a face image and extracting eye characteristic information in the face image;
the characteristic information verification unit is used for verifying the extracted eye characteristic information;
the image shooting unit is used for triggering the camera to shoot an image of a preset shooting interval in the current framing range if the eye feature information meets a preset condition;
and the content searching unit is used for searching the content according to the shot image of the preset shooting interval.
Optionally, the image capturing unit includes:
the track tracking module is used for tracking the eye movement track in a first designated time;
the first matching degree calculation module is used for calculating the matching degree of the eye movement track and a preset eye movement track;
and the first image shooting module is used for triggering the camera to shoot the image of a preset shooting interval in the current framing range if the matching degree obtained by calculation is not lower than the preset matching degree.
Optionally, the trajectory tracking module comprises:
and the first facial image acquisition sub-module is used for acquiring N frames of facial images within a first specified time. Wherein N is an integer not less than 2.
And the track extraction submodule is used for extracting the eye movement track of the eyes according to the N frames of face images within the first designated time.
Optionally, the image capturing unit includes:
and the second image shooting module is used for triggering the camera to shoot the image of the preset shooting interval in the current framing range when the blink frequency detected in the second designated time meets the preset blink frequency.
Optionally, the image capturing unit includes:
and the third image shooting module is used for triggering the camera to shoot the image of a preset shooting interval in the current framing range when the continuous eye closing time of the detected eyes in the third specified time meets the preset eye closing time.
Optionally, the image capturing unit includes:
the shooting adjusting module is used for rotating the orientation of the camera if the eye features identified by the feature identification meet the preset conditions so as to enable the camera to turn to the direction of the content to be searched;
and the fourth image shooting module is used for shooting the image of a preset shooting interval in the current framing range of the camera when the camera adjusting instruction is not detected within the preset time.
In the second embodiment of the present invention, a face image is obtained, and eye feature information in the face image, such as the eye movement trajectory, the number of blinks, or the eye closing time, is extracted. The extracted eye feature information is verified; if it meets the preset condition, the camera is triggered to shoot an image of the preset shooting interval within the current framing range, and content search is finally performed according to the shot image of the preset shooting interval. Because the intelligent terminal extracts the eye feature information from the face image and, after verification, triggers the camera to shoot the image of the preset shooting interval, the image of the content to be searched is obtained automatically and the content to be searched can be obtained without manual frame selection by the user, which facilitates user operation and improves the photographing search efficiency and the user experience.
Example three:
fig. 5 is a schematic diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 5, the electronic device 3 of this embodiment includes: a processor 30, a memory 31 and a computer program 32, such as a photographing search program, stored in the memory 31 and executable on the processor 30. When executing the computer program 32, the processor 30 implements the steps in the photographing search method embodiments described above, such as steps S101 to S104 shown in fig. 1. Alternatively, when executing the computer program 32, the processor 30 implements the functions of the modules/units in the above device embodiments, such as the functions of the units 21 to 24 shown in fig. 4.
Illustratively, the computer program 32 may be partitioned into one or more modules/units that are stored in the memory 31 and executed by the processor 30 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 32 in the electronic device 3. For example, the computer program 32 may be divided into a feature information acquisition unit, a feature information verification unit, an image capturing unit, and a content search unit, and each unit functions as follows:
the characteristic information acquisition unit is used for acquiring a face image and extracting eye characteristic information in the face image;
the characteristic information verification unit is used for verifying the extracted eye characteristic information;
the image shooting unit is used for triggering the camera to shoot an image of a preset shooting interval in the current framing range if the eye feature information meets a preset condition;
and the content searching unit is used for searching the content according to the shot image of the preset shooting interval.
The electronic device 3 may be a computing device such as a smart phone or a tablet computer. The electronic device 3 may include, but is not limited to, a processor 30, a memory 31. It will be appreciated by those skilled in the art that fig. 5 is merely an example of the electronic device 3, and does not constitute a limitation of the electronic device 3, and may include more or less components than those shown, or combine certain components, or different components, for example, the electronic device may also include input output devices, network access devices, buses, etc.
The processor 30 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 31 may be an internal storage unit of the electronic device 3, such as a hard disk or a memory of the electronic device 3. The memory 31 may also be an external storage device of the electronic device 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the electronic device 3. Further, the memory 31 may also include both an internal storage unit and an external storage device of the electronic device 3. The memory 31 is used for storing the computer program and other programs and data required by the electronic device. The memory 31 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments are implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, a software distribution medium, etc. It should be noted that the content carried by the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals, in accordance with legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (6)

1. A photographing search method is characterized by comprising the following steps:
after a photographing search application is started, a face image is obtained, and eye feature information in the face image is extracted;
verifying the extracted eye characteristic information, wherein the eye characteristic information comprises an eye movement track, blinking times and eye closing time;
if the eye characteristic information meets the preset condition, triggering a camera to shoot an image of a preset shooting interval in the current framing range, which specifically comprises:
if the eye features meet the preset conditions, rotating the orientation of the camera so as to enable the camera to turn to the direction of the content to be searched,
when the camera adjusting instruction is not detected within the preset time, shooting an image of a preset shooting interval within the current framing range of the camera, wherein the preset shooting interval is limited by a fixed framing frame;
searching contents according to the shot image of the preset shooting interval;
when the eye feature information is an eye movement track, if the eye feature meets a preset condition, the step of triggering the camera to shoot an image of a preset shooting interval in the current framing range includes:
tracking an eye movement trajectory over a first specified time;
calculating the matching degree of the eye movement track and a preset eye movement track;
if the calculated matching degree is not lower than the preset matching degree, triggering the camera to shoot an image of the preset shooting interval in the current framing range, which comprises: when, within the first specified time, the moving direction of the eyeball is the same as the moving direction of the preset eye movement track and the difference value between the moving distance of the eyeball and the moving distance in the preset eye movement track is within a preset range, judging that the matching degree between the eye movement track and the preset eye movement track is not lower than the preset matching degree.
2. The photo search method of claim 1, wherein when the eye feature information is a blink count, if the eye feature satisfies a predetermined condition, the step of triggering the camera to capture an image of a predetermined capture zone within a current viewing range comprises:
and when the blink frequency detected in the second designated time meets the preset blink frequency, triggering the camera to shoot the image of the preset shooting interval in the current framing range.
3. The photographing search method according to claim 1, wherein when the eye feature information is eye closing time, if the eye feature satisfies a preset condition, the step of triggering the camera to photograph an image of a preset photographing section within a current viewing range includes:
and when the continuous eye closing time of the eyes is detected to meet the preset eye closing time within the third appointed time, triggering the camera to shoot the image of the preset shooting interval within the current framing range.
4. A photograph search apparatus, characterized in that the photograph search apparatus comprises:
the system comprises a characteristic information acquisition unit, a storage unit and a processing unit, wherein the characteristic information acquisition unit is used for acquiring a face image and extracting eye characteristic information in the face image after starting a photographing search application;
the characteristic information verification unit is used for verifying the extracted eye characteristic information, wherein the eye characteristic information comprises an eye movement track, blinking times and eye closing time;
the image shooting unit is used for triggering the camera to shoot an image of a preset shooting interval in the current framing range if the eye feature information meets a preset condition;
the content searching unit is used for searching contents according to the shot images of the preset shooting interval;
wherein the image capturing unit includes:
the shooting adjustment module is used for rotating the orientation of the camera if the eye features meet preset conditions so as to enable the camera to turn to the direction of the content to be searched;
the fourth image shooting module is used for shooting an image of a preset shooting interval in the current framing range of the camera when the camera adjusting instruction is not detected within the preset time, wherein the preset shooting interval is limited by the fixed framing frame;
when the eye feature information is an eye movement trajectory, the image capturing unit includes:
the track tracking module is used for tracking the eye movement track in a first designated time;
the first matching degree calculation module is used for calculating the matching degree of the eye movement track and a preset eye movement track;
the first image shooting module is used for triggering the camera to shoot an image of the preset shooting interval in the current framing range if the matching degree obtained by calculation is not lower than the preset matching degree, and is specifically used for: when, within the first designated time, the moving direction of the eyeball is the same as the moving direction of the preset eye movement track and the difference value between the moving distance of the eyeball and the moving distance in the preset eye movement track is within a preset range, judging that the matching degree between the eye movement track and the preset eye movement track is not lower than the preset matching degree.
5. An electronic device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the steps of the photo search method according to any one of claims 1 to 3 are implemented when the computer program is executed by the processor.
6. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the photo search method according to any one of claims 1 to 3.
CN201711219422.8A 2017-11-28 2017-11-28 Photographing search method and device, electronic equipment and computer readable storage medium Expired - Fee Related CN107992816B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711219422.8A CN107992816B (en) 2017-11-28 2017-11-28 Photographing search method and device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711219422.8A CN107992816B (en) 2017-11-28 2017-11-28 Photographing search method and device, electronic equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN107992816A CN107992816A (en) 2018-05-04
CN107992816B true CN107992816B (en) 2020-09-04

Family

ID=62033770

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711219422.8A Expired - Fee Related CN107992816B (en) 2017-11-28 2017-11-28 Photographing search method and device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN107992816B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109358320B (en) * 2018-08-30 2021-08-31 江苏慧光电子科技有限公司 Data generating method and system and storage medium thereof
CN109284081B (en) * 2018-09-20 2022-06-24 维沃移动通信有限公司 Audio output method and device and audio equipment
CN110266947B (en) * 2019-06-26 2021-05-04 Oppo广东移动通信有限公司 Photographing method and related device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102316219A (en) * 2011-09-21 2012-01-11 深圳市同洲电子股份有限公司 Method for controlling network terminal equipment, apparatus and system
CN104158980B (en) * 2014-08-28 2016-03-30 西安交通大学 A kind of smart machine unlock method based on human eye movement's feature
CN104469152B (en) * 2014-12-02 2017-11-24 广东欧珀移动通信有限公司 The automatic camera method and system of Wearable
CN105892685B (en) * 2016-04-29 2019-02-15 广东小天才科技有限公司 Question searching method and device of intelligent equipment
CN106101562A (en) * 2016-08-16 2016-11-09 重庆交通大学 A kind of camera installation taken pictures according to eye motion
CN106354777B (en) * 2016-08-22 2019-09-17 广东小天才科技有限公司 Question searching method and device applied to electronic terminal
CN106777363A (en) * 2017-01-22 2017-05-31 广东小天才科技有限公司 Photographing search method and device for mobile terminal
CN107168536A (en) * 2017-05-19 2017-09-15 广东小天才科技有限公司 Test question searching method, test question searching device and electronic terminal

Also Published As

Publication number Publication date
CN107992816A (en) 2018-05-04

Similar Documents

Publication Publication Date Title
US11423695B2 (en) Face location tracking method, apparatus, and electronic device
RU2762142C1 (en) Method and apparatus for determining the key point of the face, computer apparatus, and data storage
CN107886032B (en) Terminal device, smart phone, authentication method and system based on face recognition
CN110163053B (en) Method and device for generating negative sample for face recognition and computer equipment
CN108833784B (en) Self-adaptive composition method, mobile terminal and computer readable storage medium
CN109120854B (en) Image processing method, image processing device, electronic equipment and storage medium
CN106295638A (en) Certificate image sloped correcting method and device
CN109002796B (en) Image acquisition method, device and system and electronic equipment
CN107992816B (en) Photographing search method and device, electronic equipment and computer readable storage medium
CN105654033A (en) Face image verification method and device
CN105678242B (en) Focusing method and device under hand-held certificate mode
CN109040605A (en) Shoot bootstrap technique, device and mobile terminal and storage medium
CN108174082B (en) Image shooting method and mobile terminal
CN109815823B (en) Data processing method and related product
CN110956679A (en) Image processing method and device, electronic equipment and computer readable storage medium
CN112215084A (en) Identification object determination method, device, equipment and storage medium
CN111353364A (en) Dynamic face identification method and device and electronic equipment
CN113411498A (en) Image shooting method, mobile terminal and storage medium
CN112036209A (en) Portrait photo processing method and terminal
CN108289176B (en) Photographing question searching method, question searching device and terminal equipment
CN109788199B (en) Focusing method suitable for terminal with double cameras
CN109842791B (en) Image processing method and device
CN109598195B (en) Method and device for processing clear face image based on monitoring video
CN111091089A (en) Face image processing method and device, electronic equipment and storage medium
CN113259734B (en) Intelligent broadcasting guide method, device, terminal and storage medium for interactive scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20200904

Termination date: 20211128